We live in an age of warnings and apparent portents of impending doom. For example, the year 2000 CE, incorrectly deemed the millennium by the maths illiterate (or foolish) and the religiously fanatical (who are not necessarily synonymous, despite superficial similarities), was supposed to be a year of catastrophic (in the maths sense) change of a religious nature, and it was an actual near catastrophe of the information technology type due to a set of problems collectively called Y2K.
At issue was that, thanks to some shortsighted decisions made without adequate forethought – the product of corporate oligarch managers of conspicuously absent technical knowledge – years had been stored in computer code as two digits, so the means of tracking dates would crash and burn when 99 rolled over to 00 in 2000 CE. The response of civilization to this was varied. Those with some grasp of what was at stake, including the Yankee republic, executed what amounted to a war-like effort to ensure that we did not come to work after New Year’s that year and find useless computers on our desks. (Note that this presumes that they were not useless prior to that date.) Others, who had no grasp of the matter, due to either ignorance or managerial apathy, sailed blithely along.
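To make the trap concrete, here is a minimal sketch of the two-digit-year habit that caused the trouble. The function name is my own invention, not from any particular program:

```python
def naive_year(yy: int) -> int:
    """Pre-Y2K habit: store only the last two digits and assume 19xx."""
    return 1900 + yy

# Fine in 1999:
print(naive_year(99) - naive_year(65))   # 34 -- a plausible age

# After the rollover, "00" still maps to 1900, and the arithmetic goes negative:
print(naive_year(0) - naive_year(65))    # -65
```

The bug is not in any one line; it is in the assumption baked into the storage format, which is why "fixing" a program often meant rewriting every place a date touched disk.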
This gave rise to three types of responses. The first was to fix the problem, either directly or, for purchased software, by intimidation of the originator. The second was to do nothing, the product of apathy or ignorance. And the third was often a blind, religious persecution of the users of unfixable programs.
The latter perhaps deserves some explanation. As a scientist I tend to use a lot of software that is not in common use. Some of it I write myself; some I purchase. And because the vendors of scientific software have a small market base, they are often going out of business. Hence, as Y2K approached I had numerous pieces of software that had not been “fixed”. These fell into two categories: programs I had written myself, which I knew would work correctly if the OS were fixed; and programs I had bought whose originators were no longer in business. The latter could not be fixed, but in many cases I was relatively sure that they too would still work properly if, again, the OS were fixed.
To the Yankee government’s software gestapo, none of this was relevant. There could be only two types of software: that which had been fixed, and that which had not and was accordingly banned. Professional judgment was irrelevant. For purchased software, unless one had a certificate from the originating organization attesting that it had been fixed, that software was banned.
So my Y2K challenge was finding new software that had “fixed” date handling and would do what my old software had done. And figuring out how to pay for this new software. And figuring out how to make up the time lost finding new software instead of getting work done.
For many people who ignored the problem, however, including most home users, who had little idea of how even to approach it, Y2K dawned and some of their software did not work properly. In some cases their machines didn’t work at all; in others, some of their work was corrupted.
The reason I mention this is that I read this morning [Link] that something like 85% of the Internet Protocol (IP) addresses have been allocated. The IP address is the designation of your “identity” on the ‘net. Most ISPs and large organization networks have been using dynamic assignment of these addresses for some time now to stretch the supply, but there are limits to how far that can go.
An IP address right now is a set of four eight-bit numbers (one byte each). Since 2^8 = 256, the maximum number of people who can be on the ‘net with unique addresses is 256^4 ≈ 4.3E+09. The actual number is somewhat less in practice, because IP addresses are assigned in blocks defined by the first two or three numbers. Hence if your organization has 10 users and officially “owns” the block of addresses sharing a given first three numbers, then 246 of that block’s 256 addresses are not used but are also not available to anyone else.
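The arithmetic above can be checked directly; nothing here is assumed beyond what the text states:

```python
BITS_PER_OCTET = 8
octet_values = 2 ** BITS_PER_OCTET     # 256 values per eight-bit number

ipv4_space = octet_values ** 4         # four octets
print(ipv4_space)                      # 4294967296, i.e. ~4.3E+09

# A block defined by the first three numbers leaves one octet free:
# 256 addresses for the whole organization, however small.
block_size = octet_values
users = 10
print(block_size - users)              # 246 addresses stranded
```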
Now when all of these addresses are assigned, no more assignments can be made and BOOM! the ‘net becomes static. To avoid this, the plan is to go to a new IP address scheme, IPv6, with sixteen eight-bit numbers: 256^16 = 2^128 ≈ 3.4E+38. Now this may seem like a lot of growth room, and it is, but it is also the bow wave of a catastrophe, and that is why folks have been so slow to get around to implementing it.
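Python’s standard ipaddress module will do the same comparison for the old and new schemes; the two network strings below are just the conventional ways of denoting the entire address space of each:

```python
import ipaddress

# The whole old (IPv4) address space versus the whole new (IPv6) one.
old = ipaddress.ip_network("0.0.0.0/0").num_addresses   # 2**32
new = ipaddress.ip_network("::/0").num_addresses        # 2**128

print(old)          # 4294967296
print(new // old)   # how many entire old Internets fit in the new one: 2**96
```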
The first thing that has to be done is that every OS has to be changed or patched to work with this form of IP address. In a large organization this means every computer has to be touched by the IT apparat. And while they are doing this they can’t be doing their usual stuff of rescuing damsels and kittens. In the home it means every computer has to be touched by its user, which means the user has to be educated – or the process made foolproof.
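As a rough illustration of what “the OS working with the new addresses” means at the level of code, Python’s socket module exposes both a capability flag and name resolution across both address families. Using "localhost" here is purely an example; any hostname would do:

```python
import socket

# Was this Python built with IPv6 support on this OS at all?
print(socket.has_ipv6)

# Resolving a name can yield old-style and new-style addresses side by side;
# a patched OS and application must cope with either kind.
for family, *_, sockaddr in socket.getaddrinfo("localhost", None):
    if family == socket.AF_INET:
        print("IPv4:", sockaddr[0])
    elif family == socket.AF_INET6:
        print("IPv6:", sockaddr[0])
```

Every program that stores, parses, or displays an address has to be audited the same way the Y2K date fields were, which is the real cost hiding behind “just patch the OS”.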
Then the network routers and switches that have or use IP addresses have to be fixed, changed, or replaced. In the large organization, this is another big job for the IT apparat. In the home, it means doing things to routers that the average home user doesn’t do, and that in practice the company that sold the router doesn’t want to worry about – though it will happily sell a new router instead.
A lot of home folks are not going to understand this, or understand how to fix it, or do anything about it until they get up one morning and find they are stuck: they can’t access the ‘net to get help, contact anyone, whatever. And everyone who has VOIP phone service suddenly doesn’t have phone service either.
I shan’t even mention the expected time to this catastrophe. But it will be sooner than the Mayan date for the end of the universe.