Disturbing video depicts near-future ubiquitous lethal autonomous weapons

http://img.youtube.com/vi/9CO6M2HsoIA/0.jpg


Campaign to Stop Killer Robots | Slaughterbots

In response to growing concerns about autonomous weapons, the Campaign to Stop Killer Robots, a coalition of AI researchers and advocacy organizations, has released a fictional video that depicts a disturbing future in which lethal autonomous weapons have become cheap and ubiquitous worldwide.

UC Berkeley AI researcher Stuart Russell presented the video earlier this week at an event hosted by the Campaign to Stop Killer Robots at the United Nations Convention on Certain Conventional Weapons in Geneva. In an appearance at the end of the video, Russell warns that the technology it depicts already exists* and that the window to act is closing fast.

Support for a ban on autonomous weapons has been mounting. On Nov. 2, more than 200 Canadian scientists and more than 100 Australian scientists in academia and industry penned open letters urging Prime Ministers Justin Trudeau and Malcolm Turnbull to support a ban.

Earlier this summer, more than 130 leaders of AI companies signed a letter in support of this week’s discussions. These letters follow a 2015 open letter released by the Future of Life Institute and signed by more than 20,000 AI/robotics researchers and others, including Elon Musk and Stephen Hawking.

“Many of the world’s leading AI researchers worry that if these autonomous weapons are ever developed, they could dramatically lower the threshold for armed conflict, ease and cheapen the taking of human life, empower terrorists, and create global instability,” according to an article published by the Future of Life Institute, which funded the video. “The U.S. and other nations have used drones and semi-automated systems to carry out attacks for several years now, but fully removing a human from the loop is at odds with international humanitarian and human rights law.”

“The Campaign to Stop Killer Robots is not trying to stifle innovation in artificial intelligence and robotics, and it does not wish to ban autonomous systems in the civilian or military world,” explained Noel Sharkey of the International Committee for Robot Arms Control. “Rather, we see an urgent need to prevent automation of the critical functions for selecting targets and applying violent force without human deliberation, and to ensure meaningful human control for every attack.”

For more information about autonomous weapons:

* As suggested in this U.S. Department of Defense video:


Perdix Drone Swarm – Fighters Release Hive-mind-controlled Weapon UAVs in Air | U.S. Naval Air Systems Command

from KurzweilAI http://ift.tt/2zgtwfe


ICIST 2018 : ACM-8th International Conference on Information Systems and Technologies

ACM – ICIST’2018 (Istanbul, Turkey) is a forum for researchers, developers, and industry practitioners in the information systems field, and the continuation of the following events: ICIST’11 (Tebessa, Algeria), ICIST’12 (Sousse, Tunisia), ICIST’13 (Tangier, Morocco), ICIST’14 (Valencia, Spain), ICIST’15 (Istanbul, Turkey), ICIST’16 (Barcelona, Spain), and ICIST’2017 (Dubai, UAE). It reports progress and development of methodologies, technologies, planning and implementation, tools, and standards in information systems. The conference also examines socio-economic aspects, impacts, and success factors of information systems.

Important Dates

Abstract submission: December 5th, 2017

Camera Ready and Early Registration: January 25, 2018

Conference: March 16 – 18, 2018

Email: icist.educ@gmail.com

from CFPs on Artificial Intelligence : WikiCFP http://ift.tt/2hMgG0S

Question on Image classifiers.

If you are creating an image classifier and one of your training images belongs to more than one category, what would be the correct thing to do: create a new category labelled A+B, or put the image into both categories individually?
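For context, the standard alternative to both options in the question is multi-label classification: instead of forcing each image into exactly one class (or inventing a combined "A+B" class), give every image a multi-hot target vector so it can belong to several categories at once. This is a minimal illustrative sketch, not from the post; the class names are hypothetical placeholders.

```python
# Hypothetical class list for illustration only.
CLASSES = ["cat", "dog", "outdoor"]

def multi_hot(labels, classes=CLASSES):
    """Encode a set of category labels as a 0/1 vector over all classes,
    so an image can carry several labels simultaneously."""
    return [1 if c in labels else 0 for c in classes]

# An image tagged with both "cat" and "outdoor" keeps both labels:
print(multi_hot({"cat", "outdoor"}))  # [1, 0, 1]
```

At training time, each output unit would then typically get its own sigmoid and binary cross-entropy loss, so the categories are predicted independently rather than competing in a single softmax.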

submitted by /u/chubbyostrich

from Artificial Intelligence http://ift.tt/2itWdLA

Current AI student: should I drop out?

I am currently enrolled in a Master level AI program, and as classes go by, I am starting to have increasingly more serious doubts about whether I should go through with it, so I’d love some feedback from the people that are already in the industry. The current situation is:

  1. I have a previous degree in electrical engineering, but I graduated about 5 years ago and was not a top student. Since then, I have worked in testing.

  2. My math skills are pretty limited. Somehow the math in the program is not that complex, but I really worry I will need to greatly improve to keep up.

  3. The program is hard for me; even studying almost full time, I struggle. That is not necessarily a huge problem on its own, but combined with the next point it feels like one.

  4. My programming skills are very very meh. I know some Python and have a good foundation in algorithms, but I have never coded in a professional setting or written anything longer than 500 lines of code for that matter.

So, all in all, it looks to me like my chances of finding a job in the industry upon graduation are pretty slim. I wouldn’t mind doing the program as a thing in itself and then learning to code, but I’d like to adjust my expectations in that case. As I’ve said, any thoughts/input are really appreciated.

submitted by /u/randomsongname

from Artificial Intelligence http://ift.tt/2zfDNsc