Disarmament and International Security Committee (DISEC)
SOUTHMUNC IV
Meet the Chair
Hello DISEC delegates,
It’s a delight to have you all in this DISEC committee at SouthMUNC IV. My name is
Mahish Kewalramani, and I am genuinely excited to be your head chair. When coming up with
this committee’s topic, I thought about my past experience with General Assembly committees
and how I had always felt they were so repetitive: solutions from one conference could be
copied and pasted with a few word changes, and no one would blink an eye. That is why I
decided this committee’s topic should be robotics for national security. Not only is this a fun
topic to discuss, but it is also important to talk about now, as this is where the future of
warfare and internal security is headed.
Outside of Model UN, I love to watch and play soccer. I have followed the sport for as
long as I can remember and have played on the school’s varsity team. I have also done a lot of
work in STEM fields, such as working with machine learning systems this past summer, which
showed me the power technology has today and gave me the idea for this committee’s topic.
Finally, I have participated in Model UN since I was a freshman, and I treasure the experiences
and knowledge the club has given me over the past four years. Good luck researching,
everyone; I’m looking forward to seeing all of you in committee.
Warm Regards,
Mahish Kewalramani
Tenets of DISEC
DISEC is the principal global forum for countries to address issues of war, armed conflict,
and armaments. It deals with some of the same issues as the Security Council but works more
broadly to set global disarmament priorities, policies, and goals. DISEC resolutions are
politically and morally binding, but not legally binding: the committee can request and appeal
for state action, but unlike the Security Council it cannot demand it. As the technology of war
evolves, DISEC directs global attention to the weapons and policies it considers most
dangerous and destabilizing. Some governments are convinced that nuclear disarmament is the
overwhelming priority and must come before any other action; others want to focus on more
immediate killers like landmines and cluster munitions. Some believe only threats from states
are the business of the UN system; others think terrorism is equally important.
Regulation and Use of Robots for National Security
Background
Something big is happening in war today, and perhaps in the history of humanity itself.
According to a three-star Air Force general, the United States is headed very soon toward tens
of thousands of robots operating in its conflicts. These
numbers are huge, considering that the robots sent onto the battlefield will not be today’s
robots but tomorrow’s. If Moore’s law holds true for robotics, in 25 years these machines will
be close to a billion times more powerful in their computing than the robots of today. The
kinds of questions once reserved for science fiction conventions like Comic-Con will have to
be debated in committees much like this one. When historians look back at this period, they
will conclude that war underwent a revolution, much as the invention of the atomic bomb
revolutionized fighting. Yet robots will change far more than an army’s firepower: they will
change, at a fundamental level, who fights these wars. Soon we will go to war with soldiers
whose hardware was made in China and whose software was written in India. The future of
war will feature a new type of warrior, the cubicle warrior, who will redefine the experience of
going to war. One Predator drone pilot described fighting in the Iraq War without ever leaving
Nevada: “You’re going to war for 12 hours, shooting weapons at targets, directing kills on
enemy combatants, and then you get in the car and drive home, and within 20 minutes you’re
sitting at the dinner table talking to your kids about their homework.” The capabilities of
robotics in war are vast, and we have seen only glimpses of their power, mostly through
drones. Drones have made war much safer for the countries using them; however, their use has
already created serious problems, as there are countless cases in which civilians have been
killed by drone strikes in other countries. Drones, moreover, are only a stepping stone, as they
are still weapons controlled by people, even if from a remote location. Fully autonomous
weapons, however, are no fantasy, as the capability of producing prototypes of
these kinds of weapons has already been achieved. Robotics in the military is an amazing
human feat, but it presents many problems that must be solved before these weapons can truly
be considered lifesavers for the soldiers who risk their lives each day. Current laws have not
even fully caught up to drones, which are already common on the battlefield, so laws for
emerging technologies are virtually nonexistent. It will be this committee’s job to come up
with a set of guidelines to help guide countries in their international and national policies on
the use of military robots.
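The “billion times” figure in the background above is a compound-doubling claim, and the arithmetic is easy to check. As a rough sketch (the doubling periods below are illustrative assumptions, not figures from this guide), computing power that doubles roughly every ten months for 25 years grows about a billion-fold, while the classic two-year doubling period yields only a few-thousand-fold increase:

```python
# Compound growth implied by Moore's-law-style doubling.
# The doubling periods are assumptions for illustration only.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Total multiplier after `years` of repeated doubling."""
    return 2 ** (years / doubling_period_years)

# Doubling every ~10 months (10/12 of a year) over 25 years -> 2^30,
# on the order of a billion:
print(f"{growth_factor(25, 10 / 12):,.0f}")

# The classic two-year doubling period gives a far smaller multiplier,
# only a few thousand:
print(f"{growth_factor(25, 2):,.0f}")
```

The point for delegates is that the forecast is extremely sensitive to the assumed doubling rate, which is one reason regulating 25 years out is so hard.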
Potential Issues
Weapons Getting into the Wrong Hands
Much like software, warfare has gone open-source. Unlike an aircraft carrier or an atomic
bomb, robots do not require a massive manufacturing system to build. For example, for about a
thousand dollars you can build yourself a drone equivalent to the Raven drones soldiers use in
Iraq. So while good guys might tinker with these drones as a hobby, so could bad guys. The
intersection of robotics with things like terrorism will be both disturbing and fascinating, and
we have already seen the start of it. During the war between Israel and Hezbollah, Hezbollah,
a non-state actor, flew four different drones against Israel. There is already a jihadi website
where you can remotely detonate an IED
(Improvised Explosive Device) while sitting at your home computer. Two main trends will
follow: the power of individuals against governments will be reinforced, and the realm of
terrorism will be expanded. Another way to think about it is that you do not have to convince a
robot that its family will receive something in exchange for bombing a place. Keeping these
weapons out of terrorist hands has to be a priority for all countries, as the world will be a safer
place if terrorists cannot access them. Steps much like those the United Nations has taken in
the past to keep weapons such as nuclear bombs out of terrorist hands will be needed, but new
policies must also account for how easy these weapons are to make. For example, when the
US drone crashed into central Israel, the technology inside it was easily copied to produce an
imitation drone; if a terrorist organization recovers a crashed robotic weapon, it could copy
that weapon just as easily. Both where these robots can be used and how much information
about their construction is made public must be regulated in order to prevent this.
Psychological Problems for Soldiers
A common misconception is that because drone operators do not physically go to war, they do
not suffer the trauma and psychological problems commonly associated with returning
soldiers. In fact, a study of the U.S. war in Iraq has shown that these soldiers have higher rates
of PTSD than many of the units that fight in theater directly. Many people also worry that the
disconnection drone pilots feel from the war, while still being able to affect its outcome,
makes the contemplation of
war crimes far easier at that distance. “It’s like a video game” is how one young pilot
described taking out enemy troops from afar. Anyone who has played Grand Theft Auto
knows that we do things in the video game world that we wouldn’t do face to face. Counseling
and psychological support will need to be put in place to protect these soldiers.
Controlling Robots
We all have “oops” moments in our lives, like procrastinating until no time is left to finish our
work. But what do “oops” moments mean for autonomous robots in war? Sometimes they are
funny, like the scene from the Eddie Murphy movie “Best Defense” playing out in reality:
during the demonstration of a machine-gun-armed robot, the robot started spinning in a circle
and pointed its machine gun at the audience of the product trial. Fortunately the weapon was
not loaded and no one was hurt, but other times “oops” moments are tragic. In South Africa,
an anti-aircraft cannon with a software glitch actually did turn on people and fire, killing nine
soldiers. This presents new problems for the laws of war and for accountability. Whom do we
reprimand for unmanned slaughter, and what counts as unmanned slaughter in the first place?
There have already been three cases in which a Predator drone fired on a place where the US
believed a terrorist was staying, only for the target to turn out to be a civilian, who was killed.
We are not yet in an era of autonomous robots, and we are already seeing problems like these.
How do we limit the chance that a lethal weapon’s “software glitch” causes an international
incident or kills a country’s own citizens? Do we restrict the types of weapons that can be used
so that no autonomous weapon ever sees the battlefield (at least
if countries follow this regulation), or do we create some test that robots must pass before they
are allowed to fight on the battlefield?
Regulating These Weapons
One of the final main problems with these robotic weapons is how to regulate them on an
international scale. Much like nuclear weapons, these robots are too powerful to go
unrestricted, which raises the question of what those regulations should be. Should robots be
regulated according to their destructive capability (which could be measured on a scale based
on a robot’s specifications)? Or should certain types of robots, such as autonomous robots, be
banned from the battlefield universally? These decisions will be up to you, delegates.
However, there is a definitive need for new regulations, as the current laws of war are so old
they could qualify for Medicare.
Current Status
Currently, robots appear in warfare mainly as drones: robots controlled by humans that give us
the ability to affect the war from back home. They have saved numerous soldiers’ lives, both
by taking over missions that would otherwise require soldiers and by serving as scouts to
search for threats such as land mines. We have also seen them used in national security
matters, as in 2016, when a lethal robot was used to kill the Dallas shooter. The use of these
robots will only grow as time goes on, and prototypes of many lethal robots are being made that
the military may start using within the next five years. To avoid a disaster like the one seen
with nuclear weapons at Hiroshima and Nagasaki, where atomic bombs were dropped before
any regulations on them existed, regulation must come before deployment. The hardest part of
crafting such a policy is that we do not know where these robots will be in 25 years, though
we have a reasonable forecast of where they will be in five. The policy I expect to see should
be based on the role robots will play in the future, not just where they are now (although the
present is a good place to start, since even now we have very few regulations).
Case Study: Dallas Shooter
July 7, 2016. Micah Xavier Johnson, an Army Reserve veteran of the war in Afghanistan,
ambushes a group of police officers and civilians in Dallas, Texas. A trained marksman known
to hold extreme prejudice against white individuals, Johnson opens fire in an attempt to kill
every white person in sight. Police and Johnson engage in a 45-minute gun battle followed by
two hours of negotiation. With officers already dead, Police Chief David Brown has his back
against the wall. Does he risk the lives of his men by putting them in the line of fire to attempt
a takedown of this trained soldier? Chief Brown finally makes a call and carries out a plan
never seen before: use a robot carrying more than a pound of C-4 to take out the sniper. After
some brief planning, the Dallas Police Department sends in the robot and kills Micah Xavier
Johnson, in the first instance of police using a lethal robot. Thanks to Chief Brown’s quick
thinking, Johnson lies dead and the remaining officers get to go home. However, this is not
where our story ends. A host of new questions emerged after the killing of Johnson,
highlighting obvious problems with the regulation and use of these robots. The first problem lies
in EOD (Explosive Ordnance Disposal) robots as a whole. The model used in Dallas, the
Remotec Andros Mark V-A1, priced at around $170,000 (a small price to pay given its uses),
accepts a variety of mounts beyond a block of C-4, including a modified 12-gauge shotgun, a
gas-can dispenser, a window breaker, drills and saws, X-ray machines, and mounts for 37mm
or 40mm launcher weapons. The capabilities of this EOD robot obviously surpass what the
entire world witnessed in Dallas.
Accountability of Current Developments
A number of robots funded by DARPA (the Defense Advanced Research Projects Agency) fall
under the aforementioned category of being able to use lethal force; however, because these
robots are not directly tied to a weapons system, they are given the green light with little to no
regulation. For example, one researcher working under a Navy contract built a robot that could
play baseball, but being able to track and intercept a fly ball is analogous to tracking and
intercepting a missile. In another instance, a robot was built that could drive a car, climb a
ladder, and even operate a jackhammer. Peter Singer, an expert in this field, explains that these
types of robots “can manipulate an AK-47…[those robots] can manipulate the controls of all
conventional military machines as well.” As these robots become more capable and adept, it is
clear that we will soon cross a point at which robots, not humans, decide to kill. At this
point, if a robot like this made a mistake, who is to blame? Is it the programmer? The
manufacturer? The commander who launched it on its mission? The fact of the matter is that
as robots become more accessible, more intelligent, more skilled, and more prominent in
today’s society, regulations must be put in place to ensure that any malpractice is corrected
and any ambiguity is eliminated.
Guiding Questions
• Are there different classifications of robots and, if so, would different classes of robots
prompt different regulations?
• How will we determine the destructive capability a robot has?
• How should nations be held accountable for the actions of their robots? Should “software
glitches” be given a pass because they are not the fault of a specific country, or should nations
be reprimanded for them?
• How will we keep these robots out of the hands of terrorist organizations?
• How can we come up with a system of guidelines to regulate the robots of tomorrow?
• Should we allow countries to utilize robots within their own borders?
• How can we make sure soldiers do not lose touch with reality when utilizing things like drones?
• Who within a nation should be held accountable when a robot does wrong?
Works Cited
Horowitz, Michael C., and Paul Scharre. "The Morality of Warfare." The New York Times.
N.p., 26 May 2015. Web. 16 Oct. 2016.
Lin, Patrick. "Robots, Ethics & War." Center for Internet and Society. Stanford, 15 Dec.
2010. Web. 16 Oct. 2016.
Sidner, Sarah, and Mallory Simon. "How Robot, Explosives Took out Dallas Sniper." CNN.
Cable News Network, 12 July 2016. Web. 16 Oct. 2016.
Singer, Peter Warren. "The Future of War Will Be Robotic." CNN. Cable News Network, 23
Feb. 2015. Web. 16 Oct. 2016.
Singer, P.W. "Military Robots and the Future of War." TED. N.p., Feb. 2009. Web. 16 Oct.
2016.