A lot of this runaround is happening because people get hung up on the fact that the "AD" era began as AD 1. But that year is not magic--it doesn't even correspond to the year of Jesus's birth or death. So let's just start the AD era a year earlier and call that year "AD 0". It can even overlap with BC 1: BC 1 is the same as AD 0. Fine, we can handle that, right? Then the 00s are [0, 100), the 100s are [100, 200), etc. Zero problem, and we can start calling them the 1700s etc., guilt-free.
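If you like half-open intervals, the bucketing is a one-liner with floor division. A minimal sketch in Python (the function name is mine, not from the thread), assuming astronomical year numbering where BC 1 is year 0:

    def century_label(year: int) -> str:
        # Label a year by its half-open 100-year bucket: [0, 100) -> "the 0s"
        start = (year // 100) * 100
        return f"the {start}s"

    assert century_label(0) == "the 0s"        # AD 0, a.k.a. BC 1
    assert century_label(99) == "the 0s"
    assert century_label(1776) == "the 1700s"
    assert century_label(1800) == "the 1800s"

Floor division even does something sensible for negative years: -1 lands in the [-100, 0) bucket, i.e. the century before AD 0.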
I would also accept that the 1st century has one fewer year than later centuries. Everyone said Jan 1, 2000 was "the new millennium" and "the 21st century". It didn't bother anyone except Lua programmers, I'm pretty sure.
Things like "17th century", "1600s", or "1990s" are rarely exact dates, and almost always fuzzy. It really doesn't matter what the exact start and end days are. If you need exact dates, then use exact dates.
A calendar change like this is a non-starter. A lot of disruption for no real purpose other than pleasing some pedants.
Exactly. Historians often talk about things like "the long 18th century" running from 1688 (Britain's "Glorious Revolution") to 1815 (the defeat of Napoleon), because it makes sense culturally to have periods that don't exactly fit 100-year chunks.
I thought the article was going to argue against chunking ideas into centuries because it's an arbitrary, artificial construct superimposed on fluid human culture. I could get behind that, generally, while acknowledging that many academic pursuits need arbitrary bins other people understand for context. I did not expect to see arguments for stamping out the ambiguities in labelling these arbitrary time chunks. Nerdy pub trivia aside, I don't see the utility of instantly recalling the absolute timeline of the American Revolution in relation to the Enlightenment. The 'why's--the relationships among the ideas--hold the answers. The 'when's just help with context. To my eye, the century-count labels suit their purpose for colloquial usage, and the precise years work fine for more specific things. Not everything has to be good at everything to be useful enough for something.
This reminds me that centuries such as "the third century BC" are even harder to translate into date ranges. That one's 201 BC to 300 BC, inclusive, backward. Or you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750. [Edit: no it doesn't.]
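The arithmetic is at least mechanical, even if the direction is confusing. A minimal sketch in Python (the function name is mine): the nth century BC runs from 100*n BC down to 100*(n-1)+1 BC.

    def century_bc_range(n: int) -> tuple[int, int]:
        # (earliest, latest) years of the nth century BC, as BC year numbers
        return 100 * n, 100 * (n - 1) + 1

    assert century_bc_range(3) == (300, 201)  # third century BC: 300-201 BC
    assert century_bc_range(1) == (100, 1)    # first century BC: 100-1 BC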
In fact archaeologists have taken to writing "CE" and "BCE" these days, but despite that flexibility I've never seen anybody write a date range like "the 1200s BCE". But they should.
Some people have proposed resetting year 1 to 10,000 years earlier. The current year would be 12024. This way you get pretty much all of recorded human history in positive dates, while still remaining mostly compatible with the current system. It would certainly be convenient, but I don't expect significant uptake any time soon.
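The conversion is mostly just adding 10,000, with the usual no-year-zero wrinkle on the BC side. A minimal sketch in Python (the function name is mine), assuming the variant of the proposal where 10,000 BC becomes year 1:

    def to_shifted_era(year: int, bc: bool = False) -> int:
        # CE year y -> y + 10000; BC year y -> 10001 - y (there is no year 0)
        return (10001 - year) if bc else (year + 10000)

    assert to_shifted_era(2024) == 12024
    assert to_shifted_era(10000, bc=True) == 1   # 10,000 BC -> year 1
    assert to_shifted_era(1, bc=True) == 10000   # 1 BC and AD 1 stay adjacent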
For earlier dates "n years ago" is usually easier, e.g. "The first humans migrated to Australia approximately 50,000 years ago".
> you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750.
From comparing some online answers (see links), I'd conclude that even though the numbers are ordered backward, "first"/"last"/"early"/"late" would more commonly be understood to reference the years' relative position in a timeline. That is, "minus 2000 to about minus 1750" would be the first quarter of the second millennium BC.
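Under that timeline-position reading the quarters fall out mechanically. A minimal sketch in Python (names are mine), counting quarters from the most ancient end of the millennium:

    def millennium_bc_quarter(n: int, quarter: int) -> tuple[int, int]:
        # (earliest, latest) BC years of quarter 1-4 of the nth millennium BC
        earliest = 1000 * n - 250 * (quarter - 1)
        return earliest, earliest - 249

    assert millennium_bc_quarter(2, 1) == (2000, 1751)  # "first quarter"
    assert millennium_bc_quarter(2, 4) == (1250, 1001)  # "last quarter"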
Imagine how much code is being rewritten all the time for a variety of reasons. Code is live and must be maintained. Adding one more reason isn't much of a stretch.