Musk, Killer Robots, Trump, the Eclipse

Wednesday, August 23rd, 2017

By Bob Gaydos

Donald Trump looking at the solar eclipse.

Elon Musk and Donald Trump made significant scientific statements this week. Digest that sentence for a second. …

OK, it’s not as strange as it sounds because each man was true to himself. That is, neither message was surprising, considering the source, but each was important, also considering the source.

Monday, Musk and 115 other prominent scientists in the field of robotics and artificial intelligence attending a conference in Melbourne, Australia, delivered a letter to the United Nations urging a ban on development and use of killer robots. This is not science fiction.

Responding to previous urging by members of the group of AI and robotics specialists, the UN had recently voted to hold formal discussions on so-called autonomous weapons. With their open letter, Musk and the others, coming from 26 countries, wanted the UN to be clear about their position — these are uniquely dangerous weapons and not so far off in the future.

Also on Monday, on the other side of the planet, as millions of Americans, equipped with special glasses or cardboard box viewers, marveled at the rare sight of a solar eclipse, Trump, accompanied by his wife, Melania, and their son, Barron, walked out onto a balcony at the White House and stared directly at the sun. No glasses. No cardboard box. No problem. I’m Trump. Watch me give the middle finger to science.

Of course, the only reason Trump shows up in the same sentence as Musk in a scientific discussion is that the man with the orange hair holds the title of president of the United States and, as such, has the power to decide what kind of weapons this nation employs and when to use them. Also, the president — any president — has the power, through words and actions, to exert profound influence on the beliefs, attitudes and opinions of people used to looking to the holder of the office to set an example. Hey, if it’s good enough for the president, it’s good enough for me. This is science fiction.

Please, fellow Americans, don’t stare at the sun during the next eclipse.

Trump’s disdain for science (for knowledge of any kind, really) and his apparently pathological need to do the opposite of what more knowledgeable people recommend, regardless of the topic, are a dangerous combination. When you’re talking about killer robots, it’s a potentially deadly one.

The U.S. Army Crusher robotic weapon.

How deadly? Here’s a quote from the letter the AI specialists wrote: “Once developed, lethal autonomous weapons will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.

“We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”

In fact, that box is already open. On the Korean peninsula — brimming with diplomatic tension, the rattling of nuclear weapons by the North Koreans and the corresponding threats of “fire and fury” from Trump — a fixed-place sentry gun, reportedly capable of firing autonomously, is in place along the South Korean side of the Demilitarized Zone.

Developed by Samsung for South Korea, the gun reportedly has an autonomous system capable of surveillance up to two miles, voice recognition, and tracking and firing with a mounted machine gun or grenade launcher. There is disagreement over whether the weapon is actually deployed to operate on its own, but it can. Currently, this gun and other autonomous weapons being developed by the U.S., Russia, Germany, China, the United Kingdom and others require a human to approve their actions, but usually in a split-second decision. There is little time to weigh the consequences, and the human operator will likely assume the robot is correct rather than risk the consequences of an incorrect second-guess.

But it is precisely the removal of the human element from warfare that Musk and the other AI developers are worried about. Removing the calculation of deaths on “our side” makes deciding to use a killer robot against humans on the other side much easier. Too easy perhaps. And robots that can actually make that decision remove the human factor entirely. A machine will not agonize over causing the deaths of thousands of “enemies.”

And make no mistake, the robots will be used to kill humans as well as destroy enemy machines. Imagine a commander-in-chief who talks cavalierly about using nuclear weapons against a nation also being able to deploy robots that will think for themselves about who and what to attack. No second-guessing generals.

Musk, a pioneer in the AI field, has also been consistent in his respect for the potential danger posed to humans by machines that think for themselves, or by intelligences — artificial or otherwise — that are vastly superior to ours. The Tesla CEO has regularly spoken out, for example, against earthlings sending messages into space to try to contact other societies, lest they deploy their technology to destroy us. One may take issue with him on solar energy, space exploration or driverless cars, but one dismisses his warnings on killer robots at one’s own risk. He knows whereof he speaks.

Trump is another matter. His showboating stunt of a brief look at the sun, sans glasses, will probably not harm his eyes. But the image lingers, and the warnings, including one from his own daughter, Ivanka, were explicit: Staring directly at the sun during the eclipse can damage your retina and impair your vision. Considering the blind faith some of his followers place in his words and actions, it was yet another incredibly irresponsible display of ego and another insult to science.

Artificial intelligence is not going away. It has the potential for enormous benefit. If you want an example of its effect on daily life, just look at the impact autonomous computer programs have on the financial markets. Weapons that can think for themselves may also sound like a good idea, especially when a commander-in-chief displays erratic judgment, but their own creators — and several human rights groups — urge the U.N. to ban their use, in the same way chemical weapons and land mines are banned.

It may be one of the few remaining autonomous decisions humans can make in this area, and the most important one. We dare not wait until the next eclipse to make it.

rjgaydos@gmail.com