What ethical responsibilities do humans have in the face of technological advancement, and how can we fulfill them through reflection?


 

This article examines the nature and history of technology, along with the ethical issues that arise as humans and technology develop together, and discusses the ethical reflection and responsibility required for humans to use technology appropriately. It emphasizes that human ethics and responsibility become ever more important as technology advances, and argues for ethics education suited to that change.

 

Introduction

The quest to understand the nature of humanity has been ongoing since we were first able to find leisure in our lives. Humans have sorted life on Earth into broad groups such as animals, plants, and microorganisms, and divided the animals further into classes such as mammals, reptiles, and amphibians; humans belong to the mammals, among the primates. Yet humans continue to try to distinguish themselves from other animals. We can identify six main traits said to set humans apart: we are rational, instrumental, playful, social, cultural, and ethical beings. Of these, being rational and ethical makes the biggest difference. Humans can act on their own, make judgments, and choose what they consider most valuable; by reflecting on those choices, they can judge and choose better the next time. This kind of rational thinking is what allows humans to form and maintain complex societies. Humans are also instrumental beings, able to make and use tools for different purposes. Tools appeared with humans themselves, so they are an integral part of what it means to be human. They have allowed humans to pursue goals beyond mere survival and to make original achievements in fields such as art, philosophy, and science.
Humans use technology to invent and improve tools, so technology is also an integral part of the human story. From the first stone tools, technology progressed at an unstoppable pace through the Middle Ages and into the era of the World Wars. The first self-propelled automobile appeared in 1769, and it took over a century before Panhard et Levassor built a car with the engine mounted at the front in 1891; since then, the pace of technological progress has only accelerated. After the world’s first handheld cell phone was developed by Motorola’s Dr. Martin Cooper and his team in 1973, it took little more than 30 years for smartphones to reach the mass market with the unveiling of the iPhone in 2007. The internet, touchscreens, and fingerprint recognition, groundbreaking inventions in their day, are now ubiquitous and integral to the development of new products. Nanotechnology, biotechnology, ubiquitous computing, and other technologies of the 21st century still feel unfamiliar, but within a generation, or even just a few years, they too will be everywhere in our society.
Technology and society develop at roughly the same pace. As technology advances, society goes through a series of revolutions and transforms into something new. After the agricultural revolution, the industrial revolution, and the information age, we are entering the ubiquitous age, and we do not know how much further society will change. With each change comes a corresponding change in humanity’s inner life, both spiritual and ethical. A changing society creates situations we have never faced before: new forms of life, new ways of thinking, and new crimes. Dilemmas will emerge that are difficult to resolve with the ethical standards of the past, and these changes will demand ethical reflection not only from individuals but from society as a whole. That is why we need to think about technological progress and human ethics together.

 

What is the meaning of technology?

First, let us start with what we mean by technology. The term comes from the Greek word techne, which originally referred to the craft of producing something external to the human mind; nowadays, “technology” more commonly refers to the production of material goods. Three aspects make up technology. First, technology is an artifact: a stone is just a natural object, abundant in nature, but once it is worked into a weapon or tool, it becomes an artifact and thus a technology. Second, technology is knowledge: the specific logic and know-how required to create and use an artifact. Third, technology is activity: it depends on the activities of those who create it and those who use it. Even an artifact built with engineering knowledge ceases to be technology if no one uses it. Technology, then, is not limited to material creations; it is deeply connected to the intellectual and social activities of the people who create and employ it.

 

History of technology

Technology appeared with humans. It is a bit of a stretch to say “with humans,” since humans were not born with a torch in one hand and a sword in the other, but it is clear that humans are responsible for the emergence of technology. In the Paleolithic era, humans lived as hunter-gatherers. Picking berries and hunting animals with bare hands was difficult, so they chipped stones into shapes that were easier to hold, and made spears, arrows, traps, and knives to aid the hunt. Along the way, technology became essential for human survival and grew increasingly sophisticated as societies developed.

When humans mastered fire and began to gather around it, they formed communities, settled down, and underwent an agricultural revolution as they entered the Neolithic Age. Their technology changed accordingly: ground-stone farming tools such as polished axes, bladed hoes, and sickles, along with pottery to store food. Bronze was not a material everyone could work, so by the Bronze Age a class of specialized craftsmen had formed to produce and shape the metal. In antiquity, iron tools appeared, and machines such as the catapults and cranes devised by Archimedes were used as weapons of war. Beyond this, however, there were few major technological advances in antiquity, owing to the prevailing idea that nature should not be interfered with.

In the Middle Ages, by contrast, technology took a big step forward. The period is often called the Dark Ages of science, but while science stagnated, technology could advance because the Christian worldview that dominated the era recognized the value of harnessing nature. Agriculture, the military, and power generation all saw major advances during this time.
Gunpowder, invented in China, reached Europe centuries later and made the cannon possible, but it was not originally developed for military purposes. The most accepted account is that Chinese alchemists first produced it, and that it was initially used as a medicine, for instance to treat boils and fevers. Neither Archimedes’ crane nor Chinese gunpowder was created for warfare; yet the crane became an ancient weapon of war, and gunpowder remains a key component of artillery shells. We should recognize that it was human intervention that changed the purpose of these technologies. The initial purpose of a technology is often to survive and improve life, but that purpose can easily be altered by human intentions.

Many of the technologies we enjoy today were developed during World War II. Technologies not originally intended for military use were pressed into service, and many others were devised and tested expressly for war. Airplanes, for example, began as a means of carrying mail over long distances, but in World War II they became weapons. Physics was used to determine the most effective shell shapes, and chemistry to develop new explosives. Nuclear weapons, the most common example of technology’s dark side, were also first built during this period. The internet, now an indispensable technology, was likewise initially developed for military purposes.

The internet and computing have since evolved so fast that computers are now invisibly connected to one another. Mark Weiser felt that using computers was too complicated, so he envisioned a “calm technology” that would let computers work without the user noticing: ubiquitous computing, which aims to converge physical and electronic space.
It effectively embeds computers into real-world objects and environments, allowing information to flow between objects, people, and computers. Current trends in technology development show that we are already moving toward the era of ubiquitous computing, if in a piecemeal way: internet-connected home appliances (washing machines, refrigerators, microwaves), automatic metering and control systems for water, electricity, gas, and lighting, remote-controlled robots that clean and guard homes, intelligent concept cars, and wristwatches, mobile phones, and even accessories such as earrings and rings with built-in communication. While these advances have improved the quality of human life, they have also created new problems.

 

What’s wrong with technology?

World War II saw the greatest technological advances, but also the greatest problems, above all nuclear weapons. The war ended when the United States dropped atomic bombs on the Japanese cities of Hiroshima and Nagasaki, forcing Japan to surrender. The power of an atomic bomb is so great that a single bomb can virtually obliterate a city. A 15-kiloton bomb produces a fireball with a radius of roughly 500 meters and thermal radiation reaching about 3.5 kilometers from the detonation point. Within a second of the explosion, anything inside the 500-meter fireball is vaporized. Between 500 meters and 1 kilometer, a person has perhaps a 30 percent chance of survival if shielded or inside a building. Out to 3.5 kilometers, instantaneous thermal radiation at temperatures on the order of 2,000 degrees chars the skin on direct exposure and causes third-degree burns on indirect exposure. The bombs dropped on Hiroshima and Nagasaki had yields of roughly 16 and 21 kilotons respectively, and the damage they caused is still being felt nearly 70 years later. Their destructive power is a stark reminder of the dangers that man-made technology can pose to nature and to ourselves.
In December 1938, the German scientists Otto Hahn and Fritz Strassmann discovered nuclear fission: when uranium-235 absorbs a neutron, the atom splits in two, releasing a great deal of energy. That energy can be harnessed peacefully, to boil water and drive steam turbines; it was the will of its users that turned fission into a weapon of mass destruction. Such examples remind us how easily technology can be corrupted in human hands.
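To give a rough sense of the scale involved (these are standard textbook figures, not numbers from the account above), each fission of a uranium-235 nucleus releases roughly 200 MeV, so even a single kilogram of fully fissioned material yields an enormous amount of energy:

```latex
% A typical fission event:
^{235}\mathrm{U} + n \;\longrightarrow\; \text{two fission fragments} + 2\text{--}3\,n + \,\approx 200\,\mathrm{MeV}

% Number of nuclei in 1 kg of U-235:
N \approx \frac{1000\,\mathrm{g}}{235\,\mathrm{g/mol}} \times 6.022\times10^{23}\,\mathrm{mol}^{-1}
  \approx 2.6\times10^{24}

% Energy released if all of them fission (1 MeV = 1.602\times10^{-13} J):
E \approx 2.6\times10^{24} \times 200\,\mathrm{MeV}
  \approx 8\times10^{13}\,\mathrm{J}
  \approx 20\ \text{kilotons of TNT}
\quad (1\,\mathrm{kt\ TNT} = 4.184\times10^{12}\,\mathrm{J})
```

On this scale, a yield of about 15 kilotons corresponds to fissioning on the order of one kilogram of uranium; the same energy, released slowly and under control in a reactor, simply heats water for turbines.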
Perhaps the most pervasive problems with technology today are cybercrime and privacy violations: spam, computer viruses, and leaks of personal information. In 2017, many people fell victim to ransomware, malicious software that spreads to a victim’s computer, often through email, locks them out of all their documents, and demands a large payment. Because no antivirus protects against it perfectly, the damage is all the greater. In the coming ubiquitous society, people and things will be connected through computers, so we must be wary of the security threats that will arise. A stolen or lost device could expose an entire network, and attacks such as battery draining and signal jamming could render ubiquitous computing unusable. These risks mean that technological advances can seriously threaten human safety and privacy.
Virus infections and hacking through email are specialized technological crimes, but there is a more frequent and serious cybercrime: malicious comments. In Korea this is called akpeul, a compound of the words for “evil” and “reply,” literally a hateful comment. It is often directed at public figures such as celebrities, and rather than criticizing a person or their behavior, it crosses the line into abuse. The problem drew national attention when Choi Jin-sil, a famous South Korean actress, took her own life because of the abuse she received, yet it remains so unresolved that an internet real-name system is still being debated. Moreover, this year alone, two singers have already lost their lives over malicious comments. Crimes like these, including rumor-mongering and cyberbullying, require no specialized skills to commit; anyone can be a perpetrator. At this point we must reflect on the human ethics that have not kept pace with the rapid development of technology. Using technology well requires more than proficiency; the ethical use of technology, and responsibility for it, will only grow more important.

 

What are the responsibilities of users of technology?

Since the rise of AI, some people have worried about the day machines take over the human race, as depicted in popular movies and novels. Perhaps our greatest fear about technological development is the moment we lose control of it. The philosopher Jacques Ellul, often regarded as a founder of the philosophy of technology, believed that modern technology would lead to the loss of human freedom. For Ellul, freedom is the ability to make choices without having to justify them; under modern technology, choices become automatic, and technologies inevitably combine and build on one another. Once one technology is developed, another must be developed to maintain it.
Much of this rings true. Humans created the first technology, and humans remain its sole developers and users, which is precisely what makes our responsibility for its use inescapable.
In Technology, Medicine, and Ethics: On the Practice of the Principle of Responsibility, Hans Jonas argued that technology permeates everything concerning human beings and functions as a form of human power. Since all human behavior is subject to moral scrutiny, technology, as a form of behavior, is subject to it as well. Jonas also spoke of an “ethical vacuum”: the gap between the advance of science and technology and the ethics that have failed to keep pace. From this argument it follows that users cannot escape their responsibility.
Ethics is a code of human conduct, a statement of what we ought to do or observe as human beings. The development of nuclear weapons, the social problems spawned by the spread of the internet, and the threats posed by ubiquitous technology are all problems caused by a lack of human ethics. Why did people 60 years ago turn research that could have generated more energy into weapons of mass destruction? Why do people use the internet’s speed of communication to plant viruses on other people’s computers, invade someone’s privacy, and hurt one another by spreading rumors behind the cover of anonymity? These questions demand fundamental reflection on how we use technology. As Jonas pointed out, an ethical vacuum persists in our society, and it will widen as technology advances ever faster. Technology is a tool: not inherently good or evil, it can become a weapon or an instrument for the betterment of society depending on the will of those who use it. That is why we must take responsibility for how we use technology, and why ethical judgment and reflection matter more than ever.

 

Conclusion

In conclusion, when we use technology we need to reflect not only on the technology itself but also on human ethics. From the beginning of formal education, students are taught basic morals and ethics. Even so, crime has not disappeared from society, and as technology advances, new and different kinds of crime have appeared. To address this, ethics education must be adapted to the changed society. That process is slow and probably always ongoing: educators and students are at least a generation apart, roughly the time it takes a student to finish their studies and become an educator, and within even part of that span technology keeps evolving. This is why educators must keep up with changes in society and advances in technology. Nor is this only about the ethics of technologists: researchers who develop technology need to learn ethics, and users who enjoy technology throughout their lives need to learn about user responsibility. Much of this is already taught; what needs to change is the method. Ethics should not be something students cram just to pass a test. From their school years onward, they need to learn firmly that violating ethics and morals is wrong, even in the smallest ways.
This proposal may seem vague, abstract, and unrealistic. But human ethics is not something that can be engineered in any tangible way: ethical consciousness is shaped by individual will and social environment, and there are limits to how fully it can be regulated or institutionalized. What we, as users of technology, can do is take responsibility for our actions and refuse to use technology for harm. With a little effort from each individual, over time the term “ethical vacuum” will fade into history, and we will be humans who use technology wisely, not humans controlled by it.

 

About the author

Blogger

Hello! Welcome to Polyglottist. This blog is for anyone who loves Korean culture, whether it's K-pop, Korean movies, dramas, travel, or anything else. Let's explore and enjoy Korean culture together!
