
On the stroke of midnight on New Year's Eve 1999, aircraft will fall out of the sky, lifts will stop mid-floor, power stations will cease generating, nuclear reactors will go critical and missiles will fire themselves at random.[1] Broadcasting will stop, vehicles will grind to a halt so food supplies will not get through, money will disappear as the banks' computers fail and civil disorder will break out. Those were only some of the disasters that were predicted.

Some people, particularly in America, took it all so seriously that they headed for the hills with enough equipment to make them self-sufficient.[2] Even in Britain, families headed for farmhouses in remote Scotland, burdened with supplies.[3] Some airlines decided that they would have no planes in the air over this critical time, just in case.[4] In Britain, the government took this sufficiently seriously to set up Taskforce 2000 as early as 1996 to warn of the threat of the Millennium Bug.[5]

These and many more were reported in The Times, hardly the most sensational of newspapers, so what was all the fuss about? The Millennium Bug, or Y2K, problem was real, but the difficulty was that no one really knew how serious, or otherwise, it was. In these circumstances there are always people who fear the worst and act accordingly. It wasn't helped by those, who perhaps should have known better, whipping things up and frightening susceptible people (Fig. 29.1).

Fig. 29.1 The Millennium Bug logo as used by the campaign to deal with it. Source: http://www.bbc.co.uk/news/magazine-30576670

The difficulty had been created many years before, when computer resources were scarce and it seemed reasonable to represent the year by just two digits; with the first two assumed to be 19, 1987 would be stored simply as 87. This worked fine, and nobody thought about the implications at the end of the century and what would happen when the year 2000 arrived. In most cases, the computer would interpret the year 00 as 1900.

For many applications this wouldn't really matter, but if the calculation was for the number of days between dates falling either side of the millennium, then a completely wrong answer would be obtained.[6] If the calculation was to find the amount of interest on a loan, for example, then there was a real problem. In a stock control program, items entered after the end of the century would be deemed so old that they would probably be up for scrapping, and certainly wouldn't be seen as newer than items that were actually older.
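
The arithmetic failure is easy to reproduce. The following minimal sketch (in modern Python, purely illustrative and not drawn from any actual affected system) shows a loan-period calculation going wrong when the two-digit year 00 is expanded to 1900:

```python
from datetime import date

def expand_two_digit_year(yy):
    # The flawed convention: assume every two-digit year belongs to the 1900s.
    return 1900 + yy

# A loan running from mid-1999 to the start of 2000 should accrue
# 184 days of interest, but the two-digit year 00 becomes 1900.
start = date(expand_two_digit_year(99), 7, 1)  # 1999-07-01
end = date(expand_two_digit_year(0), 1, 1)     # intended 2000-01-01, actually 1900-01-01

print((end - start).days)  # -36340 instead of 184
```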

There were dangers that critical information would be seen as very old rather than new, judged unimportant and hence deleted. On top of these were more subtle problems: many programmers were unaware that the year 2000 was an exception to the normal rule that century years are not leap years, since a century year divisible by 400 is a leap year after all. Thus date calculations beyond the end of February 2000 would be in error.
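
For illustration, here is the full Gregorian rule alongside the truncated version many programmers carried in their heads (a hypothetical sketch, not code from any real system):

```python
def is_leap_half_remembered(year):
    # The truncated rule: century years are never leap years.
    return year % 4 == 0 and year % 100 != 0

def is_leap_gregorian(year):
    # The full rule: a century year divisible by 400 IS a leap year.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_half_remembered(2000))  # False: 29 February 2000 wrongly dropped
print(is_leap_gregorian(2000))        # True
```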

In practice, it took a great deal of time for programmers to examine all this code and either fix the problems or declare the application free of them. It was a bonanza for computer consultants, and there were those, usually wise after the event, who thought the whole thing had been a giant con.[7] They were just as wrong as the doomsayers, because there were some actual problems that needed to be investigated.

In the event, the sky didn't fall in and nothing really catastrophic happened. Those who had real problems had largely dealt with them, life carried on as normal, and the survivalists who had headed for the hills looked rather foolish. Some actual incidents still got through, for example documents printed with 1900 dates, but people are quite good at overcoming such difficulties when they know what it is all about.

What the whole thing demonstrated was how dependent everyone had become on computer systems, and how a simple technical error could threaten everyday life. The affair would have baffled anyone living at the beginning of the twentieth century: it was unimaginable that the world would become so dependent on something that wouldn't be invented for another half century that it could infect all aspects of people's lives. Even though many of the purported dangers were overblown, there were some real risks associated with the Millennium Bug.

Looking back, it can be seen how this castle was built; how one development enabled another and each took the technology a little further until the whole thing was constructed. This doesn’t mean that progress is linear. One development enables others, but the route taken is often determined by other factors. We are dealing with human beings and their infinite variability. However, a general outline can be observed.

First came the vacuum tube, that child of the light bulb. Though wireless was invented first, it remained in a blind alley until electronics enabled amplification, rectification and many other signal-processing functions. Then it could bring mass entertainment in the form of radio and ultimately television, but also radar and navigation systems. These devices, though, could only take the revolution so far.

It was the exploitation of the peculiar properties of, first, germanium and, later, silicon that brought so much to the second half of the century. These materials gave us the various forms of transistor, which looked puny and uninspiring at first, but gradually their potential began to be realized. It wasn't just the size of products that could be shrunk: their lower power and greater reliability made possible things that had not been before.

However, the simple idea of packing many of them onto a single piece of silicon (an integrated circuit or 'chip') would have unforeseen consequences. Gordon Moore proposed his extraordinary 'law', suggesting that the density of these devices would double every year or two. It didn't seem possible that this rule could go on holding for half a century or more.
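
The compounding involved is worth spelling out; taking, purely for illustration, a doubling period of two years:

```python
# Doubling every two years for 50 years means 25 doublings.
doublings = 50 // 2
print(2 ** doublings)  # 33554432: a density increase of over 33 million times
```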

Another child of electronics was the computer. While Charles Babbage had laid out the principles a century before, it was only with the coming of electronics that such machines could sensibly be built. Only gradually did the enormous power of a fixed device that could be programmed to undertake different tasks begin to be realized. These sets of instructions, or 'software', could be varied in innumerable ways while still running on the same physical device.

Developments in electronics brought down the size and cost of these machines, but it was the marriage of the computer and the integrated circuit, with its potential for huge numbers of transistors, that would transform the beast into something quite different. Developing a chip is a complex and expensive process, so the manufacturer wants to sell as many as possible. The conversion of the computer into a microprocessor, all on one chip, produced the ultimate device for the industry.

It was soon found that for anything requiring sequencing, control or computation, it was simpler to use a microprocessor and define its properties in software. Gradually, microprocessors crept in everywhere: into cars, washing machines and televisions, as well as the more obvious computers. They turn up in surprising places: even the humble computer 'mouse' contains a small microcontroller.

There was another strand in communications. It had begun with the telegraph, but that was starting to be superseded by the telephone at the start of the twentieth century. At first, this was purely an electrical device, but electronics crept in and enabled interconnection and, ultimately, dialling to anywhere in the world. Electronics had brought 'instant' communication and the ability to get information across the world without the enormous delays of physically carrying the message.

When telecommunication and computing started to merge, more things became possible. It became practicable to separate the telephone from its wire and produce the 'personal communicator' so beloved of science fiction. The mobile phone has become universal; most people now regard it as a necessity and can't imagine life without it. The coming together of radio communications, computing, integrated circuits and display technology, to name only the main parts, has made this possible.

Largely the same bundle of technologies spun off another child: the Internet. It brought a further form of communication in electronic mail, without which so much of modern life could not function. It also spawned the World Wide Web. This was first seen as a super-library that everyone could access, but it has gone far beyond that basic idea. The ability to reach vast amounts of information with a few simple clicks has changed so many fields that users cannot now imagine being without it.

Thus these various strands combine to produce new possibilities, which in turn open up further generations. The myriad devices that inhabit modern homes are the result of all these intermingling lines of development. Each new generation produces opportunities for more. The greater the number of strands, the larger the number of possibilities there are to exploit, and so it has proved: progress has been exponential.

Obviously it has not been possible to include everything in one volume, and some subjects, such as avionics, electron microscopes and computer games, have been omitted. The selection has been of the things felt to have had the greatest impact on everyday life. It is a set of personal choices and so will not please everybody, but hopefully it achieves its objective of tracing the main outlines, showing how these technologies have piled on each other and the impact they have had on people's lives.

A look into the future might be expected at this point. This is, however, an area beset with difficulties, as many futurologists have found to their cost. The factor that is often forgotten is the lifetime of the items that already exist. It is no good suggesting that everyone will be living in plastic pod houses in 30 years when houses typically last much longer than that: most of the housing stock will still be there, so only a small minority of homes could be different.

With many more technological items, such as mobile phones, the physical, or fashion, lifetime is often much shorter, so some change can be expected. However, change does not always run in a straight line: different factors come into play and progress heads off in unexpected ways. For example, the trend with mobile phones was for them to become smaller and smaller. This has since reversed, and they are becoming larger so that the bigger displays needed to accommodate the additional features can be included.

Despite this, the trend is generally towards smaller, lighter and, particularly, lower-power devices. This is partly due to the economic pressure to make items ever cheaper, an area where the electronics industry has an extraordinary record. Additionally, there is the pressure to minimize the impact on the Earth, both in the amount and type of raw materials consumed and in the waste and by-products such as carbon dioxide.

We can also be confident that the convergence of computers, the Internet, mobile phones, cameras and televisions will continue. Now that these are all digital platforms, it is easy for information files to be swapped between devices. Because they can all share the same computing power and other resources, such as displays, it is easy to pile even more features onto devices. The only constraint is that the complexity can become impossible for the user.

One thing we can be certain of is that electronics, and its offshoots of computing and telecommunications, will have an ever greater impact on how we live and work. As automation takes a greater role in manufacturing and computing in services, people will be left with the human interactions and with attending to the machinery when it fails to perform correctly. The other remaining area is craft skills, for things too specialized or produced in too small quantities to make automation practical.

As the technology has clearly had such an impact, the question is often raised as to whether technology drives history. From Karl Marx onwards there have been proponents of the concept of technological determinism. If a particular technological development occurs, is the result a foregone conclusion? Marx is quoted as saying: 'The hand mill gives you society with a feudal lord; the steam mill, society with the industrial capitalist.'[8]

One does wonder whether some of these commentators have ever shaped metal with a file or wielded a soldering iron or, more particularly, were ever involved in producing a new technological product. If they had been, they would know that nothing is certain about the result. Everything can be there that seems to be exactly what is required, and the product is a complete flop; on the other hand, something that seems quite improbable can be a roaring success. Why did it take around a century for a quarter of homes to have a telephone, while the mobile phone took off so much more rapidly? The answers to questions such as this don't lie in technological developments, but in the way people respond to them.

A technological development does not dictate an outcome; it is an enabler, setting up opportunities. There is no certainty about which of these will be taken. Sometimes an immense length of time passes before interest grows, maybe because the products aren't quite right or simply because the idea's time hasn't come. Microwave ovens, for example, took some 30 years from their invention to become a staple of the ordinary kitchen. Yet there are now so many things that no one can imagine living without.

The idea of making your own entertainment has gone; amusement is now dominated by television and other electronic devices, and has become more individual and home-based. This has had the effect of separating people, yet perversely they can also 'come together' in spirit for some great event, whether national or worldwide. On the other hand, there has been the rise of the music festival and the huge pop concert, where the young gather in vast numbers.

The nature of work has changed. Automation has increasingly replaced manual work, decreasing the number of repetitive jobs and largely eliminating the dangerous ones. In the office, work is mostly based around computers, which again have taken out much of the drudgery, leaving people to deal with the human contact. For good or ill, this has spawned the call center, which depends absolutely on communications and computing power. The type of jobs that people do has almost completely changed over the course of the twentieth century.

So much of this depends in some way or another on the development of electronics and its offshoots of telecommunications and computing. It is these that have shaped so many of the changes in our lives in the course of the twentieth century. This is the electronics revolution.

Notes

1. Copps, A. Millennium mayhem or hype? The Times, 30 June 1999.

2. Wheelwright, G. Doomsayers of Y2K head for the hills. The Times, 19 August 1998.

3. Harris, G. Y2K: The end of the world as we know it. The Times, 16 January 1999.

4. Keenan, S. Hoping to avoid disruption. The Times, 27 February 1999.

5. Elliott, V. Doom merchant or a voice crying in the wilderness? The Times, 6 November 1998.

6. Ainsworth, P. (1996) The millennium bug. IEE Review, 42:4, 140–142.

7. Kaletsky, A. The bug that never was. The Times, 6 January 2000.

8. Smith, M. R. and Marx, L. (1994) Does Technology Drive History?: The Dilemma of Technological Determinism. Cambridge, MA: MIT Press, p. 123.