The Future of Computers

AI, Quantum Computing, and Faster Processors

Nick McClure
May 6, 2021

Introduction

Computers smaller than a large room were once the stuff of science fiction; today, most people carry more computing power in their pockets than presidents had access to just 50 years ago. Computers have become a central part of society. People now rely on them, including smartphones, for social interaction, their jobs, and entertainment. In developed countries, it is nearly unheard of to lack access to a computer or even the internet. Electronic devices have come a long way in the past century and have advanced exponentially in the past few decades, so it is worth contemplating their future. From their inception as crude circuit boards to the modern computer, the future of computing will include artificial intelligence, quantum physics, and lightning-fast speeds driven by innovation in processor efficiency and size.

Background

Surface-Mounted Integrated Circuits from Apollo

A major change in computers over the last few decades has been in their speed and size. As stated previously, computers used to fill whole rooms and were not consumer products; hardly anyone had the space, money, or need for one at the time. They could also only perform simple calculations. When calculators became small enough to bring to school, people were in awe, and calculator watches grew popular in the late 1970s and 1980s because it was striking to see an advanced piece of technology fit on the wrist the way a simple clock does. Shrinking size did not limit function: computers became faster with every new generation of products and innovations. The circuits of the Apollo Guidance Computer, which steered the 1969 moon landing, were so simple that each individual bit of its memory could be seen with the naked eye. In contrast, modern computers comprise billions or even trillions of bits; a one-terabyte hard drive can be purchased for less than $50 and holds eight trillion bits. Computer scientists long ago observed that the number of transistors in computers doubles about every two years. This observation, commonly known as Moore's Law, still roughly holds today, though many speculate that it will soon cease to be true.
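The doubling described by Moore's Law is simple compound growth, and the numbers above can be sanity-checked with a few lines of code. This is a hedged sketch: the function name and the starting figures are illustrative round numbers chosen for the example, not measured data.

```python
# Illustrative projection of Moore's law: transistor counts doubling
# roughly every two years. Starting values are round example numbers.

def projected_transistors(start_count: int, start_year: int, year: int) -> int:
    """Project a transistor count forward, assuming a doubling every 2 years."""
    doublings = (year - start_year) / 2
    return int(start_count * 2 ** doublings)

# A desktop CPU of around 2001 held on the order of 42 million
# transistors; ten doublings later lands in the tens of billions,
# matching the counts quoted for modern chips.
print(projected_transistors(42_000_000, 2001, 2021))   # ~43 billion
```

Twenty years is ten doubling periods, so the count multiplies by 2^10 = 1024, which is why transistor counts climbed from millions to tens of billions within a career's span.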

The Apollo Guidance Computer

The Future of Moore’s Law

Computers today are faster than they have ever been. Moore's law still tracks transistor counts, though, as stated previously, it may end soon: circuits can only get so small, and only so many transistors fit in a processor. Even so, according to 2019 data from IC Insights, modern chips contained on the order of 10–100 billion transistors; these included memory chips, graphics processors, and central processing units. Those numbers are enormous and should continue to grow if Moore's law holds true. In recent years, artificial intelligence (AI) and machine learning have also become more prominent. Once a frightening science-fiction topic, they are now technologies used in industrial settings. Machine learning is a way for an AI to learn on its own: it examines many scenarios or images and makes its own decisions based on the data it has gathered. An article from The Verge put it best: "Well, the biggest advantage of this method is the most obvious: you never have to actually program it" (Vincent). Machine learning, in effect, programs itself, which has allowed it to be used in settings where programming every individual process by hand would be too labor-intensive for a human to be practical.

New Processors

5nm Chip

The future of computers is very bright, with new innovations seemingly arriving every day. The computer industry brings in a lot of money with very high profit margins, so developing new products and innovating is encouraged and incentivized. Chips such as central processing units (CPUs) are described by their manufacturing process in nanometers. The 10nm process was the standard for many years; 7nm parts debuted in 2018, with 5nm following in 2020. These figures nominally refer to the size of transistor features, though modern node names are largely marketing labels. Due to the constraints of the natural world and the laws of physics, new manufacturing practices can only shrink these dimensions so far. Moore's law is also expected to end soon, and manufacturers can only put so many transistors in a chip, so new innovations are needed to keep increasing the speed of electronics every year. According to Conner Forrest from TechRepublic, a website publishing articles from tech journalists, industry analysts, and real-world IT professionals, "Despite the number of transistors on a wafer not being able to increase, the report said that manufacturers will look to new transistor designs and geometries in order to keep improving performance" (Forrest). This quote refers to a study of the current technology industry and transistor innovations. New transistor designs are needed so computer chips can be made smaller and faster, and new geometries could make transistors more compact, fitting more of them onto each chip and ultimately increasing speed.

Increasing Speeds

Computers now may contain many transistors, but the future will continue to improve upon this and increase the number of processors. Speeds and efficiency must rise for new sales to be made, and money is the biggest motivator, so processor counts will increase. Programming difficulty will increase with them: new software will be needed for these fast machines, and programmers will need to work with systems containing millions of processors rather than the few in today's average computer. An article in Software World, a publication that highlights useful software for businesses and individuals, quotes project leader Professor Kevin Hammond of the University of St Andrews: "Future computers will consist of thousands or even millions of processors, which poses a real problem to traditional programmers not used to thinking in parallel" ("Tackling the fastest…"). This "thinking in parallel" marks an important break from programming today, which mostly involves solving problems with logic-based programs that run on one processor. In the future, however, computers will hold millions of individual processors, and programs will need to utilize all that power; working in parallel is the method to do so. Many tasks can then be completed simultaneously, so the finished result arrives blazingly fast. The same source describes a tool called ParaPhrase, which lets programmers assemble their programs from units called patterns, which can then be recombined in optimal ways without structurally changing the program's function. The different patterns can be assigned to separate processors, so the whole program runs in a fraction of the time it would take right now.
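The idea of "thinking in parallel" can be sketched with the standard library alone: split one workload into independent chunks and hand each chunk to a separate worker. This is a minimal illustration, not the ParaPhrase tool the article describes; all function names here are invented for the example. A thread pool is used for simplicity, though CPU-bound Python code would typically swap in `ProcessPoolExecutor` so chunks run on separate cores.

```python
# A minimal sketch of parallel decomposition: the same sum split into
# chunks that independent workers compute simultaneously.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """Sum the squares over one slice of the range."""
    lo, hi = bounds
    return sum(n * n for n in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    """Split [0, n) into chunks and hand each chunk to a worker."""
    step = max(1, n // workers)
    chunks = [(lo, min(lo + step, n)) for lo in range(0, n, step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum_of_squares(100))   # 328350, same as the serial sum
```

The key property is that each chunk is independent of the others, so adding more workers (up to the number of chunks) shortens the wall-clock time without changing the answer.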

Artificial Intelligence

Another branch in the future of computers is AI and machine learning, which are already used today for many things. Recently there have been advancements in which AI aids computing itself. NVIDIA, a popular computer graphics hardware and software company that makes graphics cards, has been working in this field to improve graphics and gaming performance with the help of AI. Its software can upscale images or games from a lower resolution to the native resolution of a user's screen. This lets the computer spend less processing power rendering only a lower-resolution image, which the AI then upscales with little additional load on the processor, improving performance significantly. NVIDIA also has software that decreases latency within games, sometimes by as much as 50%, with the help of AI. Finally, NVIDIA has released software that, when a microphone is in use, blocks out all background noise that is not the speaker's voice. This is very impressive: someone can vacuum in the background and listeners still hear the voice clearly, with little sound degradation and no audible vacuum.
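The AI upscalers described above (such as NVIDIA's DLSS) use trained neural networks, which is far beyond a short example. The underlying idea of rendering few pixels and filling in the rest can, however, be illustrated with the simplest possible stand-in: nearest-neighbor upscaling, sketched here on a tiny grid of pixel values. The function name is invented for this example.

```python
# Nearest-neighbor upscaling: every low-resolution pixel is repeated
# `factor` times horizontally and vertically. Real AI upscalers infer
# plausible detail instead of repeating pixels; this only shows the
# render-small, display-large idea.

def upscale_nearest(image, factor):
    """Upscale a 2D grid of pixel values by repeating each pixel."""
    out = []
    for row in image:
        wide_row = [px for px in row for _ in range(factor)]
        out.extend([wide_row[:] for _ in range(factor)])
    return out

low_res = [[1, 2],
           [3, 4]]
print(upscale_nearest(low_res, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

The performance win is that the renderer computed only 4 pixels while the display shows 16; a learned upscaler aims to keep that saving while producing output that looks like a native 16-pixel render.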

Recently, artificial intelligence has been tested in competitive settings against the human mind. Chess bots, or chess computers, have been around for a while, calculating moves that will help them win. A modern AI engine such as AlphaZero, however, does not just calculate the best immediate move. It takes things a step further, making moves that purposefully hurt it in the short term but yield an advantage later in the game. This risky gameplay is unique to the AI because it evaluates positions in a learned, almost intuitive way, rather than just calculating the best move. In an article about AI and chess titled "AI Is Now the Undisputed Champion of Computer Chess" in Popular Mechanics magazine, William Herkewitz writes, "In 100 games, AlphaZero never lost. The AI engine won the match with dazzling sacrifices, risky moves, and a beautiful style that was completely new to the world of computer chess" (Herkewitz). This new style of computer chess is an interesting look into the future of the science. AI is currently very advanced, and it will only continue to become more applicable and useful in our everyday lives.
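The "just calculate the best move" approach of classical chess engines is game-tree search, and its core, minimax, fits in a few lines when applied to a toy game instead of chess. This sketch uses a subtraction game (players alternately remove 1 or 2 objects from a pile; taking the last object wins); the function name is invented for the example, and nothing here resembles how AlphaZero's learned evaluation works.

```python
# Minimax on a toy subtraction game: a position is losing for the
# player to move exactly when every move leaves the opponent a
# winning position. Classical chess engines apply the same recursion
# to an astronomically larger game tree.

def best_move(pile):
    """Return (move, can_win) for the player facing `pile` objects."""
    for move in (1, 2):
        if move == pile:
            return move, True            # taking the rest wins outright
        if move < pile and not best_move(pile - move)[1]:
            return move, True            # leave the opponent a losing pile
    return 1, False                      # every move loses against best play

print(best_move(5))   # (2, True): leave a multiple of 3
print(best_move(6))   # (1, False): multiples of 3 are losing positions
```

The contrast with AlphaZero is the point: minimax derives its play purely by exhaustive calculation, whereas a self-trained engine substitutes a learned evaluation for most of that search, which is what produces the "risky", human-looking sacrifices the article describes.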

Quantum Computing

Quantum Computer from Rigetti

A major aspect of the future of computers is the field of quantum computing. Quantum mechanics is the science that studies matter at the atomic and subatomic level; the word quantum refers to the discrete units in which quantities such as energy exist at that scale. Classical computers are based on ones and zeroes, on and off, usually in the form of transistors switched by electricity. In quantum computing, the role of the transistor is played by a quantum state, such as the direction of an electron's spin. This allows the basic unit of information to be much smaller, since it is smaller than a whole atom. An article from the website Space Daily, a content aggregator carrying news from multiple international agencies about future technology and space, explains: "When the quantum dots are cooled down to liquid helium temperatures and optically excited, a single electron can be trapped in each of the quantum dots. The spin states of the electrons can then be used as information stores. Laser pulses can read and alter the states optically from outside. This makes the system ideal as a building block for future quantum computers" ("Quantum Computer…"). Using this as a building block would allow future quantum computers to become much smaller and faster. Research is ongoing in other areas as well, and roughly fifteen working quantum computers existed at the time of writing. Many barriers remain for scientists to face, but more of them are overcome every year with new research. Another Space Daily article, "Unconventional Superconductor May Be Used to Create Quantum Computers of the Future," reports that "After an intensive period of analyses the research team was able to establish that they had probably succeeded in creating a topological superconductor" ("Unconventional Superconductor…"). The topological superconductor was a large barrier to building a functional quantum computer; it would allow electricity to be conducted in the way such a system requires.
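The "spin states as information stores" idea can be made slightly more concrete with a classical simulation of a single qubit. A qubit's state is described by two complex amplitudes, and measuring it yields 0 or 1 with probability equal to each amplitude's squared magnitude. This is a hedged sketch on ordinary hardware; the function name is invented here, and real quantum devices are not programmed this way.

```python
# Simulating one qubit classically: the state alpha|0> + beta|1>
# collapses to 0 with probability |alpha|^2 and to 1 with
# probability |beta|^2.
import math
import random

def measure(alpha, beta, rng):
    """Collapse the state alpha|0> + beta|1> to a classical bit."""
    p0 = abs(alpha) ** 2
    assert math.isclose(p0 + abs(beta) ** 2, 1.0), "state must be normalized"
    return 0 if rng.random() < p0 else 1

# An equal superposition gives roughly half zeros and half ones
# over many measurements.
rng = random.Random(0)
amp = 1 / math.sqrt(2)
samples = [measure(amp, amp, rng) for _ in range(10_000)]
print(sum(samples) / len(samples))   # close to 0.5
```

The power of real quantum computers comes from entangling many such qubits, which this single-qubit sketch cannot show: n entangled qubits require 2^n amplitudes to simulate classically, which is exactly why the hardware barriers described above are worth overcoming.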

Conclusion

Computers have become one of the most influential parts of most individuals' everyday lives. They have created new jobs and made existing jobs much easier to do. Right now, computers are being used more than ever by people working at home during the COVID-19 pandemic; no one would have predicted this drastic change in computer use even a year ago. Large, world-affecting events and major breakthroughs in computing are what lead, and will lead, to the future described in this essay. Innovation is and always will be needed to further this field. AI, machine learning, more efficient production processes, and quantum mechanics are all parts of the industry that need further innovation, but as more of their issues are worked out, computing will be as fast, useful, and influential as it has ever been.

Works Cited

“Dream Machines Of The Future: Computers Based On Cutting-Edge Technology.” Electronics For You, 3 May 2017. Gale General OneFile, link.gale.com/apps/doc/A491053375/GPS?u=napl44696&sid=GPS&xid=1e7e7ab8. Accessed 6 Apr. 2021.

Forrest, Conner. "Moore's Law Dead in 2021: Here's What the Next Revolution Will Mean." TechRepublic, 26 July 2016, www.techrepublic.com/article/moores-law-dead-in-2021-heres-what-the-next-revolution-will-mean/.

Herkewitz, William. “AI Is Now the Undisputed Champion of Computer Chess.” Popular Mechanics, vol. 197, no. 1, Mar.-Apr. 2020, p. 18+. Gale In Context: High School, link.gale.com/apps/doc/A618291432/GPS?u=napl44696&sid=GPS&xid=ea64247f. Accessed 8 Apr. 2021.

“New quantum liquid crystals may play role in future of computers.” Space Daily, 25 Apr. 2017. Gale General OneFile, link.gale.com/apps/doc/A490571307/GPS?u=napl44696&sid=GPS&xid=6a04ee41. Accessed 5 Apr. 2021.

Szondy, David. “Apollo’s Brain: The Computer That Guided Man to the Moon.” New Atlas, 26 July 2019, newatlas.com/apollo-11-guidance-computer/59766/.

“Tackling the Fastest and Most Powerful Computing Systems on the Planet: Future Computers with Millions of Processors.” Software World, vol. 46, no. 3, May 2015, p. 14. Gale In Context: High School, link.gale.com/apps/doc/A416116570/GPS?u=napl44696&sid=GPS&xid=75abed05. Accessed 6 Apr. 2021.

“Transistor Count Trends Continue to Track with Moore’s Law.” IC Insights, 5 Mar. 2020, www.icinsights.com/news/bulletins/Transistor-Count-Trends-Continue-To-Track-With-Moores-Law/.

“Unconventional superconductor may be used to create quantum computers of the future.” Space Daily, 6 Mar. 2018. Gale General OneFile, link.gale.com/apps/doc/A529947060/GPS?u=napl44696&sid=GPS&xid=776e61cd. Accessed 1 Apr. 2021.

Vincent, James. “The State of AI in 2019.” The Verge, 28 Jan. 2019, www.theverge.com/2019/1/28/18197520/ai-artificial-intelligence-machine-learning-computational-science.

“Quantum computer made of standard semiconductor materials.” Space Daily, 12 Dec. 2015. Gale General OneFile, link.gale.com/apps/doc/A437221427/GPS?u=napl44696&sid=GPS&xid=3d6f6233. Accessed 31 Mar. 2021.
