In article <201501251019290550.005C05BC@smtp.24cl.home> you write:
> I've always wondered why this is such a big issue, and why it's done as it is.
A lot of people don't think the current approach is so great.
> In UNIX, for instance, time is measured as the number of seconds since the UNIX epoch. imo, the counting of the number of seconds should not be "adjusted", unless there's a time warp of some sort. The leap second adjustment should be in the display of the time, i.e., similar to how time zones are handled.
It shares with time zones the problem that you cannot tell what the UNIX timestamp will be for a particular future time. If you want something to happen at, say, July 2, 2025 at 12:00 UTC, you can guess what the timestamp will be, but if there's another leap second or two before then, you'll be wrong. Life would be a lot easier for everyone except a handful of astronomers if we forgot about leap seconds and adjusted by a full minute every couple of centuries.
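
To put numbers on it, here's a rough Python sketch (my own arithmetic, nothing official; the leap-second tally is just my count of the ones inserted so far). The POSIX number is knowable today precisely because POSIX defines every day to be exactly 86400 seconds, while a raw count of elapsed SI seconds since the epoch depends on future leap seconds that nobody can predict:

# Rough sketch of the arithmetic; the leap second counts are my own tally.
import calendar

# What POSIX says the timestamp for 2025-07-02 12:00:00 UTC will be.
# timegm() treats every day as exactly 86400 seconds, i.e. it ignores
# leap seconds entirely, so this number is already fixed today.
posix_guess = calendar.timegm((2025, 7, 2, 12, 0, 0, 0, 0, 0))
print(posix_guess)          # 1751457600

# If the counter instead ticked real SI seconds since the epoch (the
# "don't adjust the count" scheme), you'd also need the leap seconds:
leaps_so_far = 25           # inserted 1972 through mid-2012, if I'm counting right
future_leaps = 0            # 0? 1? 2? nobody knows yet

raw_count_guess = posix_guess + leaps_so_far + future_leaps
print(raw_count_guess)      # off by one for every future leap second

Leap seconds are only announced about six months in advance, so anything further out than that is a guess.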