How deep learning has led to a new leap in AI


The next leap in artificial intelligence was made possible by the rise of deep learning and big data. Deep learning solves complex problems, and AlphaGo beat Lee Sedol by learning from vast amounts of Go game-record data. The South Korean government is planning to invest heavily in advancing AI and big data technologies.


There is a third spring in AI. The first spring came with the coining of the term artificial intelligence and with perceptron theory, the first implementation of artificial neural networks. However, the first AI winter arrived when the limitations of the perceptron were demonstrated and the computing power of the era could not overcome them. The second spring came when expert systems were introduced in industry, enabling companies to realize significant cost savings. However, as personal computers became more widespread, the value of these expert systems declined, and AI entered its second winter. So what brought us the third spring?
The driving force behind the third spring of AI is deep learning. Deep learning is the evolution of artificial neural networks, which were first discussed in the 1950s. It has become the most prominent recent development in artificial intelligence because computing power has grown and multilayer perceptrons can now solve problems that a single perceptron could not. A prime example of deep learning is AlphaGo, which is what most people think of when they think of artificial intelligence. Around this time last year, Google DeepMind’s AlphaGo defeated professional Go player Lee Sedol. Go was the last bastion of board games, the untouchable realm that machines dared not cross even after chess had fallen to AI. This is because Go has more possible board positions than there are atoms in the observable universe, so it seems to require not only computation but also human intuition. However, AlphaGo used deep learning to study roughly 30 million board positions drawn from about 160,000 expert games, and then improved further by playing millions of games against itself. By analyzing this data, it learned to narrow its search to a small set of promising moves and pick the best one, rather than exhaustively counting every possibility. But without that Go game-record data, AlphaGo could not have defeated Lee Sedol, no matter how sophisticated an AI it was. In the process of analyzing and learning from data, which is the core function of deep learning, the existence of ‘data’, or ‘big data’, is indispensable.
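The claim above, that a multilayer perceptron can solve problems a single perceptron cannot, is worth seeing concretely. The classic case is XOR, which is not linearly separable and therefore impossible for a single perceptron, yet trivial once one hidden layer is added. The sketch below uses hand-picked weights (not learned ones, which is an assumption for brevity) purely to illustrate the structure:

```python
# Minimal sketch: a single perceptron cannot compute XOR, but a two-layer
# network can. Weights here are hand-picked for illustration, not trained.

def step(x):
    """Classic perceptron threshold activation."""
    return 1 if x > 0 else 0

def perceptron(inputs, weights, bias):
    """One perceptron unit: weighted sum plus bias, then threshold."""
    return step(sum(i * w for i, w in zip(inputs, weights)) + bias)

def xor_mlp(x1, x2):
    """XOR built from three perceptrons in two layers."""
    h1 = perceptron((x1, x2), (1, 1), -0.5)    # hidden unit 1: OR
    h2 = perceptron((x1, x2), (-1, -1), 1.5)   # hidden unit 2: NAND
    return perceptron((h1, h2), (1, 1), -1.5)  # output unit: AND

for a in (0, 1):
    for b in (0, 1):
        print(f"XOR({a}, {b}) = {xor_mlp(a, b)}")
```

Each hidden unit draws one straight line through the input space; only by combining the two lines does the network carve out the XOR pattern, which is exactly the kind of added expressive power that stacking layers provides.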
Big data is an important concept in this context. The amount of data has increased exponentially as PCs, smartphones, and the Internet have become widespread. This large, diverse, and rapidly accumulating unstructured data is called big data, and the technology that analyzes it and turns it into meaningful information is called big data technology. With its advent, machines can make sense of unstructured data and extract meaningful information from it, and with that information AI can perform more successful computations. For example, suppose you have an AI that translates Korean into English naturally. As a natural language, Korean is unstructured data with no rigid structure or rules, so the system can only learn to translate by analyzing enormous volumes of real sentences, which is precisely the kind of processing big data technology makes possible. In other words, deep learning is the oar that moves the ship of artificial intelligence, but it is big data that pulls the oar. I mentioned earlier that deep learning developed by overcoming the limitations of 1950s perceptron theory. However, given that big data technology, a field that has been developing rapidly since the 1980s, plays a large role in data processing within deep learning, it is fair to say that the sudden rise of deep learning is closely tied to the emergence of big data.
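To make the idea of "turning unstructured data into meaningful information" concrete, here is a toy sketch: raw sentences go in, and a structured, ranked word-frequency table comes out. This is a vastly simplified stand-in for real big data pipelines, and the sentences and counts below are illustrative examples, not real data:

```python
# Toy sketch: unstructured text in, structured information out.
# Real big data systems do this at enormous scale; the principle is the same.
from collections import Counter

raw_text = [
    "deep learning needs data",
    "big data feeds deep learning",
    "data is everywhere",
]

# Flatten the unstructured sentences into tokens, then count them.
words = [w for sentence in raw_text for w in sentence.split()]
freq = Counter(words)

# The ranked table is structured, meaningful information: here it would
# reveal that "data" is the dominant theme of the corpus.
print(freq.most_common(3))
```

A translation AI of the kind described above relies on the same principle at scale: statistical structure extracted from huge corpora of sentences, rather than hand-written grammar rules.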
Of course, the current state of artificial intelligence, and deep learning in particular, is the result of a confluence of technologies: better computer hardware, more advanced algorithms, and large-scale data processing. Without any one of them, the development of deep learning might have been postponed to a later date. However, while computer hardware and algorithms developed as subfields of AI, big data developed independently, and it is a broad and important field whose applications extend well beyond AI.
Shortly after AlphaGo took South Korea by storm in 2016, the South Korean government announced an investment of 3.5 trillion won (roughly $3 billion) in AI over five years, with the goal of attracting more than 2.5 trillion won in private investment. However, in order for AI to perform more accurate computations, it is equally important to develop the underlying big data technology. Currently, Korea’s big data technology is about 3.3 years behind that of developed countries, and its market accounts for only about 1% of the global market (as of 2024). While it is important to invest in AI, it is also important to invest in the big data analytics technology that feeds AI, so that the two can interact and grow together and accelerate the development of Korea’s AI sector.


About the author

Blogger

Hello! Welcome to Polyglottist. This blog is for anyone who loves Korean culture, whether it's K-pop, Korean movies, dramas, travel, or anything else. Let's explore and enjoy Korean culture together!