
Page 1: 2014 Karles Invitational Conference

The difference between “Winning the War”

and “Winning the Peace.”

Mr. John S. Canning

Combat Systems Engineer, G82

Naval Surface Warfare Center

Dahlgren Division

(540) 653-5275

[email protected]

14 January 2014

2014 Karles Invitational Conference at NRL

Washington, D.C.

Distribution Statement A, Approved for Public Release, Distribution is Unlimited

Autonomy: Legal, Ethical, & Policy Issues

Drive a New Paradigm in Warfare

Page 2:

Brief Structure

• Background

• Legal issues

• Ethical issues

• Policy issues

• The New

Paradigm

Page 3:

Background

Page 4:

Prelude

• The Dahlgren Division of the Naval Surface Warfare

Center (NSWC DD) is the Navy’s premier laboratory for

the development of weapon systems and shipboard combat

systems.

• In late 2002, NSWC DD began to look at the use of

weapons by unmanned systems, specifically, at the

autonomous use of weapons.

• We wanted to determine how these machines could be

allowed to figure out for themselves when to pull the

trigger.

With apologies to BASF:

We don’t make your robot.

We make your robot better by hanging weapons on it!

Page 5:

Meet the Lawyers

• Meeting at Dahlgren with both the Navy JAG, and

the OSD Office of General Counsel.

– 23 September 2003

• Discussed the arming of robotic systems.

• Working relationship with Navy JAG has been

maintained.

• Key insight:

“The ultimate goal in warfare is not to kill the enemy,

but to bring hostilities to a complete and lasting close

as quickly, and as humanely, as possible.”

– W. Hays Parks, OSD Office of General Counsel (Ret)

Page 6:

Meet the Philosopher

• Dr. Rob Sparrow, Monash University

(Australia) School of Philosophical,

Historical and International Studies

– Dahlgren visit, February 2008

• Sparrow’s robotic interests:

– Building safe systems

– Designing for the Law of Armed Conflict

– The machine and the mission

– Not making ourselves redundant

• Widely-published in this area.

Page 7:

Meet the Computer Scientists

• Dr. Ron Arkin – Regents' Professor, Associate Dean for Research and Space Planning

School of Interactive Computing, College of Computing, Georgia Tech

• Dahlgren visit, September 2007

• Key insight:

– Develop an “ethics module” & autonomously target people

• Prof Noel Sharkey – University of Sheffield (UK) Professor of Artificial Intelligence and

Robotics

– Judge from the TV program “Robot Wars”

• AUVSI North American Denver event, August 2010

• Key insight:

• Should not be weaponized at all

Note that they are on opposite ends of the spectrum.

Page 8:

Meet the Ethicist

• Dr. George Lucas – Professor of Ethics & Public Policy (NPS)

– Distinguished Chair in Ethics, VADM James Stockdale Center for

Ethics, U.S. Naval Academy.

• September 2009 meeting at the Naval Academy

– Asked Mr. Canning to come brief his then-current crop

of Stockdale Fellows on our approach to arming robots.

• Key insight:

– Supports Dahlgren’s approach to weaponizing robots

Page 9:

Meet the “Policy Wonk”

• Paul Scharre

– OSD (Policy)

• April 2010, OSD Working Group

• Spearheaded development of DoD Directive

3000.09, “Autonomy in Weapon Systems.”

• Key insight:

– “No matter how autonomous, people are

ultimately in-charge.”

Page 10:

International Committee for Robot Arms

Control (ICRAC)

• Stood up in December 2011

• Desires ban on armed robots

– Both Sparrow and Sharkey are ICRAC founding

members (Sharkey is Chair)

• Closely coupled with Christof Heyns, UN Special

Rapporteur on extrajudicial, summary or arbitrary

executions (Human Rights Council)

• Key insight:

– Stop “Killer Robots”

Note that their focus is on “Killer Robots.” They don’t differentiate

between semi-autonomous and autonomous systems.

Page 11:

Why Do We Mention These People and

Organizations?

• Each brings their own perspective to the autonomy issue

for weapons use.

• An effective design for an autonomous system must

consider ALL of them.

– We may choose to disagree with, or ignore, some of them, but we

had better have a good reason when we do.

• Without doing this, we would be wasting our time,

designing something that would likely never get used.

Arming robots is not just a narrow

engineering problem.

Page 12:

Legal Issues

Page 13:

Hey, they’re lighting their arrows…can they do that?

This is all about what is, and isn’t, allowed under

the Law of Armed Conflict (LOAC).

Page 14:

From the beginning of human history, man has been

targeting his enemies with his weapons.

How many millions have

died, or been injured?

Civil War dead

WWII Battle of the Bulge

Remembering the dead from Iraqi Freedom

Page 15:

Under the Napoleonic Theory of War (everything is fair

game), we have opted for the “bigger bang,” causing

potential for incidental injury to civilians and collateral

damage to civilian property to increase.

An atomic blast

The atomic dome in Hiroshima,

located directly under Ground Zero.

Safety of innocent civilians

wasn’t the greatest concern.

Page 16:

Lessons from WWII: Destruction beyond that necessary

to accomplish the military objective can prolong the war,

and can make securing a lasting peace more difficult.

WWII bomb damage in the

German city of Dresden

German civilians in Halberstad

following 8 APR 1945 bombing

Page 17:

TV brought the Vietnam war to the nation’s living rooms,

put a human “face” on the war and contributed to civil and

political unrest at home.

Vietnam War protest in Washington, D.C.

Siege at Khe Sanh – 500lb

bombs falling on NVA trenches

“The Wall”

Page 18:

Despite man’s history of violence, there have long been

restrictions on the use of force during war. Today, treaties

as well as the Law of Armed Conflict or LOAC regulate the

use of force during armed conflict.

• Now, all weapons and weapon systems, from small arms and ammunition to cruise missiles are subjected to a legal review to ensure compliance with the Law of Armed Conflict (LOAC) and applicable treaties.

• Additionally, once declared legal, the employment of these weapons may be further controlled by Rules of Engagement and the Discriminate Use of Force.

Page 19:

Legal Review of Weapons

• DoD policy* requires that a legal review be conducted of all weapons and weapon systems acquired to meet a military requirement of the US.

• Primarily this review requires an analysis of three factors: (1) whether the weapon causes suffering that is needless, superfluous, or

disproportionate to the military advantage reasonably expected from the use of the weapon. It cannot be declared unlawful merely because it may cause severe suffering or injury;

(2) whether the weapon is capable of being controlled so as to be directed against a lawful target, (i.e., it can discriminate between lawful and unlawful targets);

(3) whether there is a specific treaty provision or domestic law prohibiting the weapon’s acquisition or use.

• These three factors are analyzed in relation to the weapon’s intended method of employment, not in relation to any possible use, as any lawful weapon can be used illegally.

*See DODD 5000.1, subparagraph E1.15, and SECNAVINST 5000.2C, paragraph 2.6.

With regard to Armed Autonomous Systems, the critical issue

is the weapon’s ability to discriminate a legal target.
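The three review factors can be read as a simple checklist. The sketch below is illustrative only: the class and field names are hypothetical, not part of any DoD review tooling.

```python
# Illustrative sketch only: the three-factor legal review described above,
# expressed as a checklist. Class and field names are hypothetical, not
# part of any DoD review process or tooling.
from dataclasses import dataclass

@dataclass
class WeaponReview:
    causes_superfluous_suffering: bool  # factor (1)
    can_discriminate_targets: bool      # factor (2)
    prohibited_by_treaty_or_law: bool   # factor (3)

    def is_lawful(self) -> bool:
        """Passes review only if none of the three factors disqualifies it,
        judged against the weapon's intended method of employment."""
        return (not self.causes_superfluous_suffering
                and self.can_discriminate_targets
                and not self.prohibited_by_treaty_or_law)

# Example: a notional over-the-horizon missile with onboard discrimination sensors.
print(WeaponReview(False, True, False).is_lawful())  # True
```

Note that a single failed factor is disqualifying: the factors are conjunctive, not weighed against one another.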

Page 20:

Rules of Engagement Defined

• Directives issued by competent

authority which delineate the

circumstances and limitations under

which U.S. forces will initiate and/or

continue combat engagement with

other forces encountered.

Joint Pub 1-02

• ROE are based on the LOAC as well as

political and military factors and can be

utilized to guide the military use of

force during a particular operation.

• ROE can restrict the employment

of certain weapons depending on

the tactical, strategic or political

situation.

Page 21:

Discriminate Use of Force (DUF)

• “Our concept of DUF strongly aligns with much of the current thinking about effects-based operations (EBO). The coming of age of these concepts is influenced both by opportunity and need.

• DUF brings new concepts for collaboration and massing of effects, which are joint in character and integrated among joint force echelons and components. It is enabled by new weapons; improved intelligence, surveillance, and reconnaissance; shared situation understanding; improved individual and collaborative training; greater agility; smaller footprints; and other emerging capabilities of the U.S. military that allow more timely and precise use of force than heretofore possible.

• The need is driven by the nature of current military campaigns. A striking feature of these campaigns is the tension among multiple strategic and operational objectives: cause regime change, destroy a terrorist organization, decapitate leadership, but preserve infrastructure, don’t wage war on a people, do hold an international coalition together, etc.”

“Report of the Defense Science Board Task Force on Discriminate Use of Force,” JUL 2003

Driven by new technology yielding better discrimination,

which leads to demand for even better technology

Page 22:

The Issue

• Using today’s paradigm of warfare, there is a requirement to maintain

an operator in the “weapons release”-loop to avoid the possibility of

accidentally killing someone.

• An operator is effectively “welded” to each armed unmanned system

for this purpose.

• This is a “performance- and cost-killer” when considering the

employment of large numbers of armed unmanned systems.

How can we effectively employ autonomous armed

unmanned systems, while avoiding this problem?
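The scaling problem is easy to see with arithmetic. The numbers below are made up purely for illustration; the slide itself gives no figures.

```python
# Back-of-the-envelope sketch, with made-up numbers, of why "welding" one
# operator to each armed unmanned system becomes a cost-killer at scale.
def operators_needed(robots: int, robots_per_operator: int) -> int:
    # Ceiling division: each operator can supervise at most N systems.
    return -(-robots // robots_per_operator)

fleet = 1000
print(operators_needed(fleet, 1))   # man-in-the-loop for every shot: 1000 operators
print(operators_needed(fleet, 50))  # supervisory autonomy: 20 operators
```

Under man-in-the-loop control, operator headcount grows linearly with the fleet; supervisory autonomy divides it by however many systems one person can oversee.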

Page 23:

Ethical Issues

Page 24:

Asimov’s Three Laws of Robotics:

1. A robot may not injure a human being or, through

inaction, allow a human being to come to harm.

2. A robot must obey orders given it by human beings

except where such orders would conflict with the First

Law.

3. A robot must protect its own existence as long as such

protection does not conflict with the First or Second Law.

Originally from Asimov’s 1942 short story “Runaround”

While labeled as “laws,” the real thrust behind

these is the ethics of protecting human life

Page 25:

“The Terminator”

Hollywood went to the far ethical extreme from Asimov with these films

Page 26:

Where’s Reality?

• For a long time, Asimov defined everyone’s

“reality” when it came to robo-ethics.

• “The Terminator” movie series took

everyone in a completely different ethical

direction.

• The issue is that the ethics for both of them

are rooted in fiction.

Neither represents reality

Page 27:

Policy Issues

Page 28:

New DoD Directive

• DoD has recently released a new Directive, 3000.09, “Autonomy in Weapon

Systems”

– Directive was the result of an 18-month effort that included the Office of the

Secretary of Defense (OSD), the Joint Staff, and the Military Services

– Mr. Canning was the sole representative on the Working Group from any of the

Service Laboratories (to include Air Force and Army).

• This Directive establishes policy for the development and use of autonomy in

weapon systems, including:

– Autonomous and semi-autonomous functions

– Manned and unmanned platforms

• The purpose is to establish guidelines to minimize the probability and

consequences of failures that could lead to unintended engagements without

unduly inhibiting progress toward the development of new capabilities.

This new DoD Directive closely mirrors Dahlgren’s approach.

Page 29:

The New Paradigm

Page 30:

Battlefield Economics

Today’s Soldier | Today’s Robots | Tomorrow’s Desired Robot Force

There is a known cost associated with putting the man on the battlefield

with his weapon. This is “man-in-the-loop” fire control. Today we are placing

an expensive machine between the man and his weapon, raising cost. It is too

expensive to keep complete “man-in-the-loop” fire control for very large numbers

of robots: we must move to autonomous robots that can decide for themselves

when to pull the trigger.

Page 31:

Maybe we can!

This is your worst nightmare!

It is a safety issue concerning the innocents of war.

Page 32:

Target Discrimination:

How do you tell the difference?

Between a cruise ship…

…and a war ship?

Between people who are just mad at you…

…and a determined enemy?

Page 33:

Today’s Valid Targets from a Legal Standpoint

(Quadrant chart; axes: whether the people present, and the things present, are military objectives.)

• Neither people nor things are military objectives: Can’t Target

• People are a valid military objective: Target People

• Things are a valid military objective: Target Things

• Both are valid military objectives: Target All

“We can target objects when they are military objectives and we can target people when they are military

objectives. If people or property isn't a military objective, we don't target it. It might be destroyed as collateral

damage, but we don't target it. Thus in many situations, we could target the individual holding the gun and/or the

gun and legally there's no difference.” – MAJ R. Craig Burton, USAF, Judge Advocate General's Legal Center

and School

Page 34:

Tomorrow’s Target Subset for Autonomous Systems

(Quadrant chart; axes: whether the people present, and the things present, are military objectives.)

• Neither people nor things are military objectives: Can’t Target

• People are a valid military objective: Won’t Target People

• Things are a valid military objective: Target Things

• Both are valid military objectives: Target Things, but Not People

For autonomous systems, we are purposefully

restricting the target set to “things.”
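Read as a decision rule, the restricted target subset above amounts to a lookup table. The sketch below is purely illustrative; the target classes, flags, and policy strings are hypothetical, not from any fielded system.

```python
# Illustrative lookup-table sketch of the targeting subsets above.
# Target classes, flags, and policy strings are hypothetical, not from
# any fielded system or DoD document.
RULES = {
    # (target_class, is_valid_military_objective, autonomous_mode): decision
    ("person", False, False): "can't target",
    ("person", False, True):  "can't target",
    ("person", True,  False): "target (human decision)",
    ("person", True,  True):  "won't target",   # autonomy restricted to things
    ("thing",  False, False): "can't target",
    ("thing",  False, True):  "can't target",
    ("thing",  True,  False): "target",
    ("thing",  True,  True):  "target",
}

def engagement_rule(target_class: str, valid_objective: bool, autonomous: bool) -> str:
    """Look up the engagement decision for one detected target."""
    return RULES[(target_class, valid_objective, autonomous)]

print(engagement_rule("person", True, True))  # won't target
```

The only row that differs between today's legal baseline and the proposed autonomous subset is the person-as-valid-objective row: legally targetable, but deliberately off-limits to the machine.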

Page 35:

A Proposed Paradigm for Autonomous Use

of Weapons

• Let the machines target other machines

– Specifically, let’s design our armed unmanned systems to automatically ID, target, and neutralize or destroy the weapons used by our enemies – not the people using the weapons.

– This gives us the possibility of disarming a threat force without the need for killing them.

– We can equip our machines with non-lethal technologies for the purpose of convincing the enemy to abandon their weapons prior to our machines destroying the weapons, and lethal weapons to kill their weapons.

• Let men target men

– In those instances where we find it necessary to target the human (e.g., to disable the command structure), the armed unmanned systems can be remotely controlled by human operators who are “in-the-weapons-control-loop”

• Provide a “Dial-a-Level” of autonomy to switch from one to the other mode.

This may overcome some of the political objections and legal

ramifications of the use of Armed Autonomous Systems
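A minimal sketch of how the “Dial-a-Level” switch might gate weapons release follows. The mode names and interface are hypothetical; the point is the invariant that a human is always in the weapons-control loop whenever a person is the target.

```python
# Minimal sketch of a "Dial-a-Level" autonomy switch. Mode names and the
# interface are hypothetical assumptions, not from any fielded system.
from enum import Enum

class Mode(Enum):
    MACHINES_TARGET_MACHINES = 1  # autonomous engagement of materiel only
    MEN_TARGET_MEN = 2            # remote human operator in the loop

def may_fire(mode: Mode, target_is_human: bool, operator_consent: bool) -> bool:
    if target_is_human:
        # People are only ever engaged with an operator's explicit consent.
        return mode is Mode.MEN_TARGET_MEN and operator_consent
    # Materiel may be engaged autonomously in machine-vs-machine mode,
    # or with operator consent in either mode.
    return mode is Mode.MACHINES_TARGET_MACHINES or operator_consent

print(may_fire(Mode.MACHINES_TARGET_MACHINES, target_is_human=True, operator_consent=False))  # False
```

Whatever the dial setting, the human-target branch cannot be satisfied without operator consent; the dial only changes what may happen to things.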

Page 36:

Legal Precedent Established

• TOMAHAWK Anti-Ship Missile

– Passive Identification/Direction-Finding Equipment

• CAPTOR Mine

– “Mousetrap that chases the mouse”

• AEGIS Ships – “Auto-Special” Engagement Mode

• Close-In Weapon System

– Automatic Cruise Missile Defense

• Patriot Missile System – Automated air defense

Each of these directly targets either the bow or the arrow,

but not the archer. People may still die, but as a

secondary consequence of going after the weapon of war.

Page 37:

Tomahawk Anti-Ship Missile

1983–1995

The missile is launched in the general direction of the target. At some distance

from the expected target position, it enters a serpentine flight pattern, searching

with both active radar, to lock onto a detected target, and passive identification

equipment, to scan enemy emissions.

PI/DE Capability

From “The Commander’s Handbook on the Law of

Naval Operations,” NWP 1-14M

9.9 OVER-THE-HORIZON WEAPONS SYSTEMS

Missiles and projectiles with over-the-horizon or beyond-

visual-range capabilities are lawful, provided they are

equipped with sensors, or are employed in conjunction with

external sources of targeting data, that are sufficient to

ensure effective target discrimination.

Page 38:

CAPTOR Mine System

1979–2000

The mousetrap that chases the mouse

CAPTOR acoustically detects

submarines while ignoring

surface ships. Upon detection of a

target, the mine launches an

acoustic homing Torpedo Mk 46

Mod 6.

Page 39:

AEGIS Auto-Special Doctrine

1973-Present

AEGIS Auto-Special Doctrine allows “hands-off” engagement of AAW

threats completely from initial detection to kill assessment, and the decision

to re-engage, if necessary.

Page 40:

Close-In Weapon System

1980–Present

The MK 15 Phalanx Close-In Weapons System is a fast-reaction, rapid-fire 20-millimeter

gun system that provides US Navy ships with a terminal defense against anti-ship

missiles that have penetrated other fleet defenses. Designed to engage anti-ship cruise

missiles and fixed-wing aircraft at short range, Phalanx automatically performs functions

usually handled by separate, independent systems, such as search, detection, threat

evaluation, acquisition, track, firing, target destruction, kill assessment, and cease fire.

Page 41:

Patriot Missile System

1984–Present

“An incoming missile could be 50 miles

(80.5 kilometers) away when the Patriot's

radar locks onto it. At that distance, the

incoming missile would not even be visible

to a human being, much less identifiable. It is

even possible for the Patriot missile system

to operate in a completely automatic mode

with no human intervention at all. An

incoming missile flying at Mach 5 is

traveling approximately one mile every

second. There just isn't a lot of time to react

and respond once the missile is detected,

making automatic detection and launching an

important feature.”

http://science.howstuffworks.com/patriot-missile.htm

Page 42:

A Relevant Dichotomy

Anti-Tank Landmines Anti-Personnel Landmines

There is a huge international debate over the continuing use of Anti-Personnel Landmines,

with most of the world abandoning their use. The essence of the problem is that

conventional Anti-Personnel Landmines are designed to persist, remaining lethal for

decades after they are emplaced. This then becomes a long-term issue for civilian populations

living in the areas that were mined. There is not the same level of debate over the use of Anti-

Tank Landmines.

This highlights the issue of targeting the archer, as opposed to his bow or arrow.

Page 43:

Enabling Technologies

• Sensors

• Artificial Intelligence

• Communications

• Protection

• Stabilized weapons

• Data recording

Page 44:

How Do You Tell a Lethal Machine from

One that Just Wants Your Rifle?

Answer: You can’t! Your machine has to be able to present a credible lethal threat to the enemy. We can keep the enemy guessing as to the lethality of the machine he is facing since

he won’t know if a human controller is on the other side of the interface, or just a computer. (The Turing Test.)

Page 45:

Striking a Balance

Source: IEEE Technology and Society magazine, Spring 2009

Our approach to arming

autonomous unmanned systems

strikes a balance between the

legal, ethical, and policy issues

involved, while also considering

the financial costs of conducting

war.

It presents us with a truly new

paradigm for the conduct of

warfare in which we will be

trying to disarm a foe, as opposed

to killing him, but maintains the

ability to kill him, if necessary.

Page 46:

A Parting Shot

Page 47:

Backups

Page 48:

The Leading Worldwide Ethical & Philosophical

Positions on the Use of Armed Robots

• International ban on all armed unmanned systems. – Noel Sharkey,

University of Sheffield, founding member of the International Committee for Robotic Arms Control (ICRAC)

• Develop an “ethics” module for armed robots that will allow them to process the Rules Of Engagement more even-handedly than human troops, and allow them to target and kill humans. – Ron Arkin, Georgia Tech

• Design armed robots that target and engage a foe’s implements of war – not the human enemy – and do this with “weapons” that may not be traditional. – John Canning, NSWC DD

Which one is the most ethical?

Page 49:

[Bar chart: “Comparison of Ethics and Technology for the Primary International

Positions on Autonomous, Armed Unmanned Systems.” The three positions compared:

completely ban armed unmanned systems; autonomously target the “bow,” or the

“arrow” – not the human “archer”; autonomously target all valid military targets.

Two series are plotted on a relative “value” scale (0–120): relative improvement

over today’s battlefield ethics, and relative level of technology improvement needed.]

Page 50:

Why is Policy Important?

• Legal issues define what we must do to meet the requirements of both

domestic law and international agreements.

– That leaves a lot of “gray area” open to interpretation.

• Ethical issues define what we should do from a moral standpoint.

– They don’t necessarily state what we must do.

• According to Webster, policy is defined as “…a definite course or

method of action selected from among alternatives and in light of

given conditions to guide and determine present and future decisions.”

– This tackles the gray areas not covered well by the legal issues, and provides

firmer direction than ethics do.

Policy provides more definitive direction on what

we can and cannot do in our robotic designs.

Page 51:

Policy for an Emerging Technology

• This Directive was motivated by a desire to get ahead

of increasing autonomy in both unmanned and manned

systems and establish policy for an emerging technology

– For a number of years, thought leaders inside and outside DoD have

raised questions about the appropriate role of autonomy in military

systems, including weapon systems

– Prior to this effort, DoD lacked formal policies on appropriate degrees of

autonomy and human control in the use of force

• This Directive establishes a flexible and responsible

framework upon which to further refine and adapt

policy as technology continues to evolve

Our efforts were part of a “forcing function”

that caused OSD to move on this policy.

Page 52:

Policy Established in Directive

• Codifies existing practices with respect to autonomous and

semi-autonomous functions:

– Semi-autonomous systems using non-lethal or lethal force

• Systems that use autonomous functions to engage specific targets that have

been selected by a human operator

– Defensive, human supervised systems employing lethal force for local

defense of manned vehicles or installations

• For example: Patriot, Aegis, active protection systems

– Fully autonomous systems using non-lethal, non-kinetic, anti-materiel

force, such as some forms of electronic attack

• Establishes a process (and criteria) for reviewing potential

future autonomous weapon systems

– Provides responsible oversight for development of autonomous

systems, without tying the hands of developers

– Mandates appropriate level of verification & validation (V&V) and test

& evaluation (T&E) to ensure safe and reliable functioning

Page 53:

Sensors

• “DC to Daylight”

– Broad spectrum coverage

– Detect the presence of weapons

• Radar

– Imaging

– Robust

– Enable target discrimination

• Distributed Imaging Radar Technology (DIRT)

• Optical

– IR

– Low Light Level

– “All-weather” capability

• Other

– ?

• No single “Silver Bullet” sensor

– Likely will need a combination of sensors

Imaging Radar

Night Vision

IR Image

Page 54:

Artificial Intelligence

• Situational Awareness

– Sensor fusion

• Efficient battlefield search for weapons

• ID weapons as weapons

– Automatic Target Recognition

– Share information about new weapons with others

• Communicate to enemy that his weapon is being targeted

– Give him the opportunity to abandon his weapon

• “Dial-a-Level” of autonomy

• Select correct weapon(s) for use

• Target/track enemy weapons

• Engage enemy weapons

• Swarm behavior

– Self-coordinating

A cartoon for AI

Linguistic Geometry
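The warn-then-engage sequence in the bullets above (identify the weapon, warn its holder, give him the opportunity to abandon it, then engage the weapon itself) could be sketched as a tiny state machine. This is purely illustrative; state names are invented.

```python
# Purely illustrative state-machine sketch of the sequence described above:
# identify the weapon, warn its holder, give him a chance to abandon it,
# then engage the weapon itself (not the person). State names are invented.
def engagement_sequence(weapon_identified: bool, abandoned_after_warning: bool):
    if not weapon_identified:
        return ["SEARCH"]                 # keep searching the battlefield
    states = ["SEARCH", "ID_WEAPON", "WARN_HOLDER"]
    if abandoned_after_warning:
        states.append("SECURE_WEAPON")    # non-lethal outcome: weapon abandoned
    else:
        states.append("ENGAGE_WEAPON")    # destroy the weapon, not the holder
    return states

print(engagement_sequence(True, False))  # ['SEARCH', 'ID_WEAPON', 'WARN_HOLDER', 'ENGAGE_WEAPON']
```

Either branch ends with the weapon neutralized; only the warning step determines whether any force is applied at all.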

Page 55:

Communications

• Provide Common Relevant Operational Picture (CROP) input to the Command Structure

• Local coordinating communications among other unmanned systems

• “Skip echelon” capability

• Secure

– LPI/LPD

– Encryption

• High bandwidth

– HDTV

• Communicate with the enemy

Long-Range Acoustic Device

Navy Combat Information Center

Page 56:

Protection

• Expect to draw fire

– Remember, we will be using COTS gear

– Be prepared for it

• Armor

– Passive (e.g., Kevlar)

– Active (e.g., explosive reactive armor)

• Use redundant & dispersed components

• Active defenses

– Take out the source of incoming fire

• Hostile intent is already established

• Kill the source

– Take out the incoming fire itself

• Electronic Attack Systems

• Counter-RPG systems

– Self-repairing materials

Ye olde armor

Electronic Attack System

TROPHY counter-RPG System

Page 57:

Stabilized Weapons

• Shoot faster and straighter than a human

• Target the enemy’s weapons

• Stay inside the enemy’s OODA loop

• Non-lethals needed to separate human from his

weapons

– Active Denial technology

• Lethals needed to destroy weapons

– Lethal to weapons

– Traditional lethals

• Guns

• Missiles

– Unconventional lethals

• Directed Energy Weapons

Active Denial ACTD

Ship-mounted stabilized guns

The “weapon of

choice” may not be

a traditional gun or

missile.

It could be a

diamond-tipped

saw!

Page 58:

Data Recording

• What happens if the enemy spoofs our armed unmanned systems, and causes them to kill when they shouldn’t?

– Political support can disappear virtually instantaneously

• Law enforcement departments equip today’s police cruisers with video cameras and recorders to provide evidence of what happens during routine traffic stops.

• Need to record, and download, sensor data from our unmanned systems leading up to, and encompassing, engagements so that we have a record of any attempts at spoofing.

• Supplies direct evidence of enemy guilt

From a police video of a traffic stop
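The dash-cam analogy above suggests a fixed-size ring buffer that always holds the last N frames before an event, frozen at the moment of engagement. The sketch below is hypothetical; the class and its interface are invented for illustration.

```python
# Sketch of the data-recording idea above as a fixed-size ring buffer,
# like a dash-cam that always keeps the last N frames before an event.
# The class and its interface are hypothetical, not from any real system.
from collections import deque

class EngagementRecorder:
    def __init__(self, prelude_frames: int):
        self.buffer = deque(maxlen=prelude_frames)  # rolling pre-engagement window
        self.archive = []                           # frozen windows, one per engagement

    def record(self, frame):
        self.buffer.append(frame)

    def on_engagement(self):
        # Freeze the rolling window so the lead-up to the trigger pull survives.
        self.archive.append(list(self.buffer))

rec = EngagementRecorder(prelude_frames=3)
for frame in ["f1", "f2", "f3", "f4"]:
    rec.record(frame)
rec.on_engagement()
print(rec.archive[0])  # ['f2', 'f3', 'f4']
```

The rolling window bounds storage while still capturing any spoofing attempt in the moments before an engagement.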