Technology

Any sufficiently advanced technology
is indistinguishable from magic.
Arthur C. Clarke

The scope of modern technology is enormous, such that any practical discussion has to be rationalised to some shorter list, one that here simply reflects my interests in computing. In this context, the application of computers, both in terms of processing power and the ability to distribute information, now seems central to much of what is happening in the world. Today, the ubiquity of computers has spread into almost every aspect of what we might describe as a global society, i.e. its automated control and communications. Sixty years ago, most people could have counted the number of electric motors in their home on one hand; thirty years ago, this was equally true of computers. Today, it is probably true to say that most people do not know how many computer chips are in their homes, but more importantly, they may not yet appreciate just how dependent their lives have already become on AI and computer-aided applications. These applications are predicated not only on the development of computer hardware, but also on the ability of software to program that hardware to communicate on a global basis. The list below reflects the development of some initial sections:

While reference might be made to an earlier technology timeline, it might be argued that the electronic computer age was triggered by the invention of the transistor in 1947, which was quickly followed by the first commercially available computer, the UNIVAC I, in 1951. While all computer hardware operates on the basis of binary switching, the earliest computers were invariably programmed in a low-level language, often referred to as machine code. In 1953, IBM began work on the first widely adopted high-level language, Fortran, released in 1957, and it would be IBM who would eventually pioneer the first wave of home PCs in 1981. However, during the first 30 years or so, most computers took the form of a large central mainframe, the cost of which restricted ownership to large corporations and institutions.

While the cost of mainframe computers was a restriction, it possibly helped to facilitate the initial development of communication protocols, which then allowed remote access to a centralised system. By 1971, this innovation had led to the first generation of the Internet, in the form of the ARPANET, although it was not originally known as such; even so, it was adequate for the earliest implementations of email to start to spread. Over the next 20 years, the growing sophistication of hardware and software increased exponentially, in line with Moore's Law, and would ultimately lead to the modern concept of the Internet and the Web. This consolidation was based on a hypertext language known as HTML, initially developed by Tim Berners-Lee, operating over the HTTP protocol on top of TCP/IP. The rest, as they say, is history, but one of the first goals of this section will be to continue to detail the design of this website and all its inherent problems. However, see Technology Evolution for a wider discussion of potential technology developments.
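The exponential growth attributed to Moore's Law above can be sketched in a few lines of Python. This is a rule-of-thumb model only (a doubling of transistor counts roughly every two years), not a claim from the text; the Intel 4004 figure is used purely as a well-known illustrative starting point:

```python
def moores_law(start_count, start_year, end_year, doubling_years=2):
    """Project a transistor count forward, assuming it doubles
    every `doubling_years` years (the common reading of Moore's Law)."""
    elapsed = end_year - start_year
    return start_count * 2 ** (elapsed / doubling_years)

# Illustration: the Intel 4004 (1971) held about 2,300 transistors.
# Projecting 20 years forward gives ten doublings:
print(round(moores_law(2300, 1971, 1991)))  # → 2355200, about 2.4 million
```

Ten doublings over twenty years multiply the count by 1,024, which is why chip complexity outran most people's intuition within a single generation.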