Every day, we voluntarily hand over information about ourselves to machines. It happens when we accept cookies online or use a search engine.
We rarely worry about how our data is sold and used. Before clicking “Agree”, we just want to get to the page, even though we understand the data will be used to target us as customers, nudging us to buy some service we do not need.
But what if machines used that data to decide who should be identified as an enemy, who needs to be killed?
A group of UN officials and private organizations is concerned that this scenario is now closer to reality than ever. They argue that international controls on lethal autonomous weapons are urgently needed.
Large-scale drone warfare continues in Ukraine
For days, Ukraine’s Kherson region has been under frequent attack by armed drones operated by the Russian army, attacks that mainly target civilians.
According to official sources, more than 150 civilians have been killed and several hundred injured. An independent human rights inquiry mandated by the United Nations has concluded that these attacks amount to a crime against humanity.
The Ukrainian army also relies on drones, and plans a “drone wall”: a protective line of armed unmanned aerial vehicles (UAVs) to shield the most vulnerable parts of the country.
At one time, only rich countries could afford the most high-tech and expensive UAV capabilities. But Ukraine has shown that, with a little ingenuity, low-cost drones can be modified to devastating effect. As this shift is mirrored in conflicts around the world, the rules of modern warfare are being rewritten.
‘Digital dehumanization’ on the rise
However catastrophic this modern form of warfare may prove, the growing use of unmanned drones and other autonomous weapons is fuelling anxiety about ‘killer robots’ that rain down from the sky and decide for themselves whom to attack.
“The Secretary-General has always said that the use of machines with fully delegated power to make decisions to take human life is morally repugnant,” says Izumi Nakamitsu, head of the UN Office for Disarmament Affairs.
“It should not be allowed. In fact, it should be banned under international law. That is the UN position.”
Human Rights Watch (HRW), an international non-governmental human rights organization, says the use of autonomous weapons would be the latest, most serious example of “digital dehumanization”, in which artificial intelligence (AI) makes many important, life-altering decisions in areas that affect people, such as policing, law enforcement and border control.
“Many countries are investing in artificial intelligence and related technologies to develop land- and sea-based autonomous weapons systems,” said Mary Wareham, advocacy director of the Arms Division at Human Rights Watch.
“The United States is leading the way, but other major countries such as Russia, China, Israel and South Korea are also investing heavily in autonomous weapons systems.”

The proponents’ arguments
Advocates of AI-operated warfare often point to human limitations to justify its expansion. Soldiers can make errors of judgement, act on emotion, need rest and, of course, demand pay. Machines, they argue, are getting better every day at identifying threats based on behaviour and movement patterns.
Some supporters have even suggested that, at a later stage, autonomous systems should be left to decide when the trigger is pulled.
There are two main objections to handing control to machines on the battlefield: first, the technology is far from fail-safe; second, the UN and many other organizations consider the use of such weapons immoral.
“Machines can easily misidentify human targets,” said Mary Wareham of Human Rights Watch. “People with disabilities are especially at risk because of the way they move; their wheelchairs can be mistaken for weapons. There is also concern that facial recognition and other biometric technologies are unable to correctly identify people with different skin tones.
“AI is still flawed, and it carries the biases of the people who built those systems.”
As for the moral objections, Nicole van Rooijen, executive director of Stop Killer Robots, a coalition campaigning for a new international law on autonomy in weapons systems, says autonomous weapons would make it very difficult to assign responsibility for war crimes and other atrocities.
“Who is responsible? Is it the manufacturer? Or the person who programmed the algorithm?” she says. “That raises a whole range of issues and concerns, and their widespread use would be a moral failure.”

A ban by 2026?
The technology is advancing fast, and AI-enabled targeting systems are already being used on the battlefield, which makes the creation of international rules to control the technology ever more urgent.
In May, informal discussions were held at UN Headquarters, where Secretary-General António Guterres called for a legally binding agreement to regulate and ban the use of autonomous weapons by 2026.
Attempts to regulate and ban such weapons are not new. In fact, the United Nations held its first meeting of diplomats on the subject in Geneva in 2014, describing it as “a challenging emerging subject on the disarmament agenda”.
At that time, no autonomous weapons system had yet been used in war.
Now, 11 years later, discussions continue, but there is still no consensus even on a definition of autonomous weapons, let alone agreement on legally binding controls over their use.
Nevertheless, non-governmental organizations and the UN remain optimistic that the international community is slowly moving towards a common understanding of the key issues.