What technological innovations and challenges made the smartphone, a computer in the palm of our hands, possible?


This article explains how smartphones became an essential tool in our daily lives, traces how they evolved into miniaturized computers through advances in semiconductor technology, integrated circuits, and Moore’s Law, and looks at what the future holds.


In 2024, our day begins with a smartphone: we search for news or check social media messages. During the workday, we use our smartphones to work, play games, chat, and unwind. When we get home, our phones are still with us, and we fall asleep with them by our sides. Around the world, people can hardly do without their smartphones. This deep integration into daily life has made the smartphone more than just a communication device; it has become an essential tool for accessing information, social networking, entertainment, and shopping. Especially since the COVID-19 pandemic, as remote work and online learning have become more common, our dependence on smartphones has only grown. While smartphones have made our lives vastly more convenient, they have also brought less face-to-face communication and information overload.
The main reason smartphones have become so popular is that they can do most of what a computer can do on a small device. It is like having a computer in the palm of your hand. Consider that the world’s first general-purpose electronic computer, ENIAC, built in 1946, weighed around 30 tons and was big enough to fill an entire laboratory room, and you get an idea of how far we have come in nearly 80 years. Many technological innovations lay behind the evolution from those room-sized machines to computers the size of today’s smartphones. What, then, are the technologies that made it possible to build “small computers” like smartphones and put them in the hands of so many people?
First, to realize a computer, you need circuits that can store and process information. This requires semiconductor devices, which are mainly made of silicon. Semiconductors sit between conductors, which conduct electricity well, and insulators, which do not, and their electrical properties change greatly with the surrounding conditions. By connecting devices whose behavior changes with conditions, a circuit can be built, and from circuits, a computer that performs different tasks under different conditions. The aforementioned ENIAC, however, was built not with semiconductors but with vacuum tubes as its switching elements, and because vacuum tubes were so bulky, the computer had to be enormous. Then, in 1947, a semiconductor device called the transistor was invented, and it revolutionized circuit design. Before the transistor, circuits were built by connecting discrete devices, including semiconductor devices, with copper wires; after it, many devices could be combined inseparably on a single small substrate. The introduction of the transistor paved the way for computers to become smaller and more efficient.
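To make the idea of “different behavior under different conditions” concrete, here is a minimal sketch, not from the article, that models a transistor as an ideal on/off switch in Python and wires two such switches into a NAND gate, one of the universal building blocks of digital logic. All names and simplifications here are illustrative:

```python
# A minimal sketch: each transistor modeled as an ideal on/off switch.
# Real transistors are analog and imperfect; this only shows the principle.

def nmos_conducts(gate: bool) -> bool:
    """Idealized n-type switch: conducts (closes) when its gate is high."""
    return gate

def nand(a: bool, b: bool) -> bool:
    """NAND gate: two switches in series pull the output low only when
    both conduct; otherwise the output stays high."""
    pulled_low = nmos_conducts(a) and nmos_conducts(b)
    return not pulled_low

# NAND is universal: NOT, AND, and OR can all be built from it alone.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

for a in (False, True):
    for b in (False, True):
        print(f"NAND({a}, {b}) = {nand(a, b)}")
```

Real transistors behave far less cleanly than this idealized switch, but the switch view is essentially how digital designers reason about them when composing gates into larger circuits.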
Circuits built this way, with devices combined inseparably on one substrate, are called integrated circuits. Integrated circuits improved on traditional circuits in reliability and longevity, and research on them continued. Progress in semiconductor technology was especially rapid in the 1960s. The MOSFET, invented at Bell Labs in 1959, dramatically changed the way integrated circuits were designed. Although MOSFETs can be fabricated as stand-alone devices, their main advantage is that they are especially well suited to integrated circuit design. MOSFETs are made by depositing metal and other layers onto a silicon substrate, the main material of the semiconductor, and etching away the parts that are not needed. Drawing the regions of the substrate that will become MOSFETs, laying down the metal interconnections, and etching away the rest yields a circuit of MOSFETs connected by metal. This process grew into modern microfabrication technology, which made far more complex and advanced circuits possible. Because the circuit is drawn onto a substrate and then processed into existence, this stage of circuit design is often called layout design. Its great advantage is that many copies of the same circuit can be made at once, just as a drawing can be printed in multiple copies: mass production became possible.
We cannot talk about the development of integrated circuits without mentioning Moore’s Law. In 1965, Gordon Moore, co-founder of Intel, predicted that the number of MOSFET devices on an integrated circuit, its density, would double roughly every 18 months. This is equivalent to the area occupied by each MOSFET halving every 18 months. Engineers who develop semiconductor devices read Moore’s Law in terms of shrinking device size, while circuit designers read it in terms of growing integration. This audacious prediction held up for some 50 years, and integrated circuits advanced enormously. Doubling the density means that 18 months later, a circuit of the same size can do twice as much; equivalently, a circuit that does the same work can be implemented in half the area. This law was the main driver of technological progress in the semiconductor industry into the early 21st century, delivering greater performance and smaller devices at the same time. Computers gradually shrank in size, and in the late 2000s the modern smartphone was born.
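To see how quickly that compounding adds up, here is a short illustrative Python sketch. Both the starting point of about 2,300 transistors (the Intel 4004 of 1971) and the strict 18-month cadence are simplifying assumptions for the example, not data from the article:

```python
# Illustrative compounding under Moore's Law: density doubles every 18 months.
# Assumed starting point: ~2,300 transistors (Intel 4004, 1971). Purely a sketch.

def projected_transistors(year, base_year=1971, base_count=2300):
    """Project a transistor count assuming one doubling per 18 months."""
    doublings = (year - base_year) * 12 / 18
    return base_count * 2 ** doublings

for year in (1971, 1980, 1990, 2000, 2010, 2020):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Run naively, this extrapolation overshoots real chips by the 2010s, which is one reason the 18-month figure is best read as a rough historical trend (Moore’s own 1975 revision was a doubling every two years) rather than a precise law.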
The miniaturization of MOSFETs, the rising density of integrated circuits, and advances in process technology enabled mass production, putting smartphones within everyone’s reach. In recent years, however, engineers have argued that MOSFET devices are already so small that further miniaturization is difficult, and others are skeptical because packing ever more devices together drives up power consumption. Yet the engineers who have been developing integrated circuits for decades are pursuing progress on many fronts. For example, researchers are modeling the neurons of the human brain, which perform vast numbers of computations while consuming very little energy, and trying to build those principles into integrated circuits. This field, neuromorphic engineering, has potential applications not only in smartphones but also in artificial intelligence (AI), autonomous driving, and healthcare. If this research continues, we may see wearable computers, or even computers implanted in the body, things considered impossible today. Such innovations are expected to transform not just our daily lives but society as a whole. Who would have imagined 50 years ago that we would carry a computer in our hands?
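As a rough illustration of what “modeling a neuron” can mean, here is a minimal leaky integrate-and-fire neuron in Python; the constants are arbitrary illustrative values, not parameters of any actual neuromorphic chip. The key idea is that the neuron stays silent until accumulated input crosses a threshold:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, a common abstraction in
# neuromorphic designs. All constants are illustrative, not hardware values.

def simulate_lif(inputs, leak=0.9, threshold=1.0):
    """Return the spike train produced by a stream of input currents."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # integrate input, with leakage
        if potential >= threshold:
            spikes.append(1)   # fire a spike...
            potential = 0.0    # ...and reset the membrane potential
        else:
            spikes.append(0)
    return spikes

print(simulate_lif([0.3, 0.3, 0.3, 0.0, 0.6, 0.6]))  # -> [0, 0, 0, 0, 1, 0]
```

This event-driven, fire-only-when-needed behavior is part of why neuromorphic hardware promises heavy computation at very low power.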


About the author

Blogger

Hello! Welcome to Polyglottist. This blog is for anyone who loves Korean culture, whether it's K-pop, Korean movies, dramas, travel, or anything else. Let's explore and enjoy Korean culture together!