The Future of Computing
December 19, 2014
by J.A. Young
An example of a computer chip produced by D-Wave Systems Inc. that utilizes quantum phenomena. Via Wikimedia Commons


The world of computing is in a very interesting place right now, and as we move into the first months of 2015, humanity is drawing nearer to some very big technological boundaries. This is because the fundamental nature of computers, the machines that have built the world around us, is likely to change in the coming years. That may mean not only changes in speed and price, but qualitative changes as well.

Essentially, the computers we all use today belong to the same lineage of machines first envisioned by mathematician John Von Neumann in 1945. Von Neumann, whose intellectual legacy has left fingerprints on everything from nuclear physics to science fiction, was the man who gave us the idea of a CPU, or central processing unit. His model of a computational machine, which featured a memory unit linked to a logic and control unit, has made much of the modern technological world possible. Every modern computer in the world today owes its existence to Von Neumann's pioneering work.

As improvements have been made to this basic structure and computer chips have become smaller, the power of computers has followed an exponential curve of improving speed known as Moore's law. Gordon E. Moore, a co-founder of tech giant Intel, was the first person to notice that the number of transistors that can be fit onto a computer chip doubles roughly every two years. This pattern of growth is the reason computers have become so fast and so cheap in such a short amount of time. The power of exponential growth behind this paradigm means that the computer chip used to power a speaking Santa Claus greeting card is more powerful than the computer used on the early Apollo space missions.
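To get a feel for how quickly that doubling compounds, consider a back-of-the-envelope projection. The short Python sketch below simply applies the two-year doubling rule, starting from the roughly 2,300 transistors of Intel's first microprocessor in 1971; the output is a caricature of the trend itself, not measured chip data.

```python
# A toy projection of Moore's law: transistor counts double every two years.
# Starting point: Intel's 4004 microprocessor (1971), roughly 2,300 transistors.
# These numbers illustrate the doubling rule, not real historical chip data.

def projected_transistors(year, base_year=1971, base_count=2300):
    """Apply one doubling for every two years elapsed since the base year."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1985, 2000, 2015):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors per chip")
```

Run forward to 2015, the rule predicts chips with billions of transistors, which is in fact roughly where commercial processors ended up.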

In the words of former Microsoft executive Nathan Myhrvold, "The way Moore's Law occurs in computing is really unprecedented in other walks of life. If the Boeing 747 obeyed Moore's Law, it would travel a million miles an hour, it would be shrunken down in size, and a trip to New York would cost about five dollars." [1] The most interesting thing about computers in 2015 is that we are starting to see some radical developments in the five decades of progress under Moore's law. Speculation abounds in the computer science field that we may be nearing the end of Moore's law, and that in a few years we will have hit the limit of how many transistors can be crammed onto a computer chip. That would mean the exponential growth of computing power would cease, unless we enter a new trend of improvement. As you might expect, the question of what this new paradigm of computing might be is a very, very big one. Not only is it the key question in computer science, but the answer will tell us a great deal about what the world of 2050 will look like. Cellphones made the transition from cumbersome burdens to the pocket-sized computers and watches of today in a remarkably short time. The next series of improvements in computing could make those changes seem insignificant.

One essential part of Moore's law is that as computer components get smaller and closer together, computers become faster because signals have less distance to travel, which means more calculations can be done more quickly. This drive toward miniaturizing computer circuits has led engineers to look for new ways to do computation at minute scales. That leads us to the idea of quantum computing, which has the potential to change everything about the world in a fundamental way, just as Von Neumann's architecture did. Today there are scientists all over the world trying to make quantum computers work in the real world, because right now the field is stuck in a sort of theoretical limbo: the ideas behind the technology have been proven, but only in laboratory settings.
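To see why distance matters at these speeds, here is a toy calculation. It assumes signals travel at the speed of light, which is an optimistic upper bound (real signals in silicon are slower and face other delays), and the distances are illustrative, not measurements of any particular processor.

```python
# Toy calculation: how long a signal takes to cross a given distance at the
# speed of light, compared with one tick of a 3 GHz processor clock.
# Light speed is an upper bound; real on-chip signals are slower still.

SPEED_OF_LIGHT = 3.0e8   # metres per second
CLOCK_HZ = 3.0e9         # a 3 GHz clock ticks once every ~0.33 nanoseconds

for label, metres in [("30 cm motherboard trace", 0.30),
                      ("1 cm across a chip", 0.01),
                      ("1 mm across a core", 0.001)]:
    seconds = metres / SPEED_OF_LIGHT
    print(f"{label}: {seconds * 1e9:.3f} ns "
          f"= {seconds * CLOCK_HZ:.2f} clock ticks at 3 GHz")
```

Even at light speed, a signal crossing a full-size circuit board costs several clock ticks, while the same trip across a single chip costs a small fraction of one. That gap is the payoff of shrinking.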

Quantum computing aims to use the physical information of individual atoms, the states of particles and how they change, as the building blocks of computers. Even a technological layman understands the idea that computers are based on single units of information called bits, which use electricity to represent 1s and 0s and make up what we generally consider "code." This is the language of computers; all you need is a way to transfer information. Quantum computers would use information at the atomic level to create quantum bits, or qubits, which can hold a blend of 0 and 1 at once, and which would let transistors be unfathomably small and astonishingly fast. Whether the field moves in the direction of quantum computing or we continue to follow Moore's law for some time longer, it is unlikely that we will recognize the computers of today 20 years from now. They will seem to us as arcane and strange as the ticker-tape Turing machine that kicked off the computer age seems today. That is because if we can build computers using quantum technology, it would be simple to have a computer the size of a coin that dwarfs today's supercomputers.
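To make the bit/qubit contrast concrete, here is a minimal Python sketch of a single qubit as the mathematics describes it: two amplitudes whose squared magnitudes give the odds of reading a 0 or a 1. This is a toy model of that arithmetic, not a simulation of real quantum hardware.

```python
# A toy model of a single qubit. A classical bit is exactly 0 or 1; a qubit
# is described by two amplitudes, and measuring it yields 0 or 1 with
# probability equal to the squared magnitude of the matching amplitude.

import math
import random

class ToyQubit:
    def __init__(self, amp0, amp1):
        # Normalise so the two measurement probabilities sum to 1.
        norm = math.sqrt(abs(amp0) ** 2 + abs(amp1) ** 2)
        self.amp0, self.amp1 = amp0 / norm, amp1 / norm

    def measure(self):
        """Return 0 or 1 at random, weighted by the amplitudes."""
        return 0 if random.random() < abs(self.amp0) ** 2 else 1

# An equal superposition: each fresh measurement reads 0 about half the time.
results = [ToyQubit(1, 1).measure() for _ in range(10000)]
print(f"fraction of zeros: {results.count(0) / len(results):.2f}")  # ~0.50
```

The power of a real quantum computer comes from chaining many such qubits together, where n qubits can hold a superposition of 2**n states at once, something no register of classical bits can do.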


Just recently the scientific community took a big step in the direction of quantum computing. On December 11, 2014, a team of Danish scientists nailed down a key problem in mathematical physics that had remained unsolved for over 80 years. [2] Nicolaj Thomas Zinner, associate professor at the Department of Physics and Astronomy at Aarhus University, described the issue this way: "The problem has been to calculate when atoms do one thing or another in the real world. We have been able to calculate this in theory, but when we experiment and insert data into existing models, they fall apart." You see, not only are atoms unbelievably small, they are also tricky subjects for observation in the lab. For a long time this made many people assume they would be unsuitable for building computers, because they tend to change and cannot be measured closely enough to transmit information reliably. Zinner goes on to say, "We have finally solved that problem." [2] This is just one of a thousand steps that have to be taken inside the laboratory and the classroom before you or I can hope to wield massively powerful computers at our fingertips, but it is a step nonetheless. Whatever the future of computers may hold, one thing is certain: it will change so drastically, in such a short amount of time, that the world of tomorrow will lie far beyond the realm of science fiction.

[1] http://archive.wired.com/wired/archive/3.09/myhrvold_pr.html
[2] http://sciencenordic.com/physicists-solve-decade-old-quantum-mechanics-problem