Friday, January 1, 2010


They’re baaaaaaaaaack… proving once again that Americans, and most of the rest of the world, who on one hand claim to embrace the Gregorian calendar, can’t perform basic math on the other. Just about all the media, including TV and radio stations, newspapers and magazines, have been offering not just year-end reviews but calendar-based end-of-the-decade stories about significant happenings and personalities. You’d figure that after the new-millennium debacle of 2000, and the after-the-fact sheepish admissions that 2000 wasn’t really the start of the new millennium, all newspapers including the Dallas Morning News (DMN) would understand the concept of ten (10). They’re in good company, which includes Andy Serwer's article, The '00s: Goodbye (at Last) to the Decade from Hell, and a business section which on December 27, 2009 featured a financial post by William Hanley, Good Riddance to a Debacle of a Decade.

It certainly appears that they still don’t get it, or just feel it financially expedient to ignore reality, fess up later (wink & nod) and then repeat/update the story a year later (CaChing!). I guess you can sell more fireworks with that faulty rationale. With that in mind, and with tensed jaw – and compassion for all – trying to work through this season of joy and goodwill, we thought that a math primer was in order, again, for these yahoos who keep invoking a calendar-based decade. Now, for you folks who are just arbitrarily referring to the last ten years without regard to which calendar years are encompassed, our apologies.

This post is for all you folks who think you understand the concept of ten and the Gregorian calendar but erroneously and arbitrarily assume that the last calendar decade ran from 2000 to 2009. In fact, it didn’t, just as the year 2000 wasn’t the start of the new millennium but rather the end of the last one.

Even the iconic and courageous Dick Clark, in his 2009 New Year’s Eve Times Square countdown, talked about the “end of the decade.” The usually right-on Jacquielynn Floyd of the DMN today offers palaver about the last calendar decade.

Valley 101’s Clay Thompson, columnist for the Arizona Republic, tried to offer an explanation of this faulty math and only succeeded in mucking the whole thing up and proving that he is more than just “slightly skewed”. A comment from one of his readers, however, nails this issue once and for all time.

“First of all there was no year "zero"… thus the last year of the first decade was 10 and the first year of the second decade was 11. The first year of every decade, century and millennium ends in the digit 1. Therefore, 2011 will be the first year of the new decade; just as 2001 was the first year of the current decade, century and millennium.” So, the first decade of this century started with the new millennium on 1/1/2001, and will end on 12/31/2010.

So 2001 to 2010 equals 10 years. You HAVE TO HAVE ten years before you can call it a decade and that includes all the time through that tenth year.

I like the example that a baby is not ONE year old until he or she has lived that first year. The child’s first decade of life ends on the 10th birthday. Then and only then can we say that the child has lived a decade.

So, moving from the baby to the calendar, which likewise starts with the year one (1): its first full decade contained the years 1 to 10, the second decade the years 11 to 20, and so on. As Wikipedia points out, “The interval from the year 2001 to 2010 could thus be called the 201st decade…” Another dictionary defines decade as, “Officially a ten-year period beginning with the year 1, as 1921-1930, 1931-1940, etc.”

Lastly, for you folks who haven’t quite mastered math: first finger 2001, second finger 2002, third finger 2003, fourth finger 2004, fifth finger 2005, sixth finger 2006, seventh finger 2007, eighth finger 2008, ninth finger 2009 and (sigh) tenth finger 2010. Decades have 10 years in them (that’s what the word means). 2010 is, thusly, the end of the first decade of the new millennium.
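For anyone who would rather trust a computer than their fingers, the same count can be sketched in a few lines of Python (a minimal illustration added here, not anything from the original post):

```python
# A decade that starts in year N runs from N through N + 9, inclusive.
# Python's range() is half-open, so range(2001, 2011) yields 2001..2010.
decade = list(range(2001, 2011))

print(decade)       # the ten calendar years of the decade
print(len(decade))  # 10 -- only after 12/31/2010 has a full decade elapsed
```

Note that a "decade" starting at 2000 and ending at 2009 also contains ten years; the point of the post is that the calendar's first year was 1, not 0, so calendar decades end in years divisible by ten.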

If we let our guard down and get creative with the calendar, we tend to develop the same capacity to settle for less than the best in the rest of our lives, or at the least try to define any entity based on convenience or ignorance. Ultimately the truth will lose, and we will soon be winking and nodding all the time. Gang, this is not debatable. This is a simple (yes) mathematical fact.

Think I’ll go out and buy zero tomatoes for my salad tonight. Now, was that nine or ten lords a-leaping?


Ned Buxton
