When Arnold Schwarzenegger's T-800 character says "I'll be back" in 1984's sci-fi hit "The Terminator," few, if any, imagined the line would prove prescient.

Fast-forward to 2018, and humanity is trying to figure out how to define "killer robots," what to do about them, and what role they will inevitably play.

Experts from several countries are meeting at the Geneva offices of the United Nations this week to focus on lethal autonomous weapons systems and explore ways of possibly regulating them, among other issues.

Fully autonomous, computer-controlled weapons do not yet exist, UN officials say. The debate is still in its infancy, and the experts have at times grappled with basic definitions. The United States has argued that it's premature to establish a definition of such systems, much less regulate them.

Some advocacy groups say governments and militaries should be prevented from developing such systems, which have sparked fears and led some critics to envisage harrowing scenarios about their use.

As the meeting opened Monday, Amnesty International urged countries to work toward a ban.

Rasha Abdul Rahim, an Amnesty researcher on artificial intelligence, said killer robots are "no longer the stuff of science fiction," warning that technological advances are outpacing international law.

The meeting follows a new report from researchers saying autonomous weapons would breach international law and that there is a "moral imperative" to ban them.

The report, published by Human Rights Watch and Harvard Law School's International Human Rights Clinic, claims that such autonomous weapons would violate the Martens Clause, a widely accepted provision of international humanitarian law.

It requires emerging technologies to be judged by the “principles of humanity” and the “dictates of public conscience” when they are not already covered by other treaty provisions.

“The idea of delegating life and death decisions to cold, compassionless machines without empathy or understanding cannot comply with the Martens Clause, and it makes my blood run cold,” Noel Sharkey, a roboticist who wrote about the reality of robot war as far back as 2007, told The Guardian.

Research firm IDC expects that global spending on robotics and drones will reach $201.3 billion by 2022, up from an estimated $95.9 billion in 2018.

Over the years, several luminaries, including Elon Musk, legendary theoretical physicist Stephen Hawking and a host of others have warned against the rise of artificial intelligence.

In September 2017, Musk tweeted that he thought AI could play a direct role in causing World War III. Musk's thoughts were in response to comments made by Russian President Vladimir Putin, who said that the country "who becomes the leader in this sphere [artificial intelligence] will be the ruler of the world."

In November, prior to his death, Hawking theorized that AI could eventually "destroy" humanity if we are not careful with it.

The AP and Fox News' Christopher Carbone contributed to this report. Follow Chris Ciaccia on Twitter @Chris_Ciaccia