Spectral Lines | IEEE Spectrum | October 2013

Rules for the Digital Panopticon
The technologies of persistent surveillance can protect us only if certain boundaries are respected


For centuries, we humans have lacked the all-knowing, all-seeing mechanisms to credibly predict and prevent bad actions by others. Now these very powers of preemption are perhaps within our grasp, thanks to a confluence of technologies.

In the foreseeable future, governments, and perhaps some for-profit corporations and civil-society groups, will design, construct, and deploy surveillance systems that aim to predict and prevent bad actions—and to identify, track, and neutralize people who commit them. And when contemplating these systems, let’s broadly agree that we should prevent the slaughter of children at school and the abduction, rape, and imprisonment of women. And let’s also agree that we should thwart lethal attacks against lawful government.

Of late, the U.S. government gets most of the attention in this arena, and for good reason. The National Security Agency, through its vast capacity to track virtually every phone call, e-mail, and text message, promises new forms of preemption through a system security experts call persistent surveillance.

The Boston Marathon bombing, in April, reinforced the impression that guaranteed prevention against unwanted harm is elusive, if not impossible. Yet the mere chance of stopping the next mass shooting or terror attack persuades many people of the benefits of creating a high-tech version of the omniscient surveillance construct that, in 1787, the British philosopher Jeremy Bentham conceived as a panopticon: a prison with a central viewing station for watching all the inmates at once.

Some activists complain about the potential of such a system to violate basic freedoms, including the right to privacy. But others will be seduced by the lure of techno fixes. For example, how could anyone object to a digital net that protects a school from abusive predators?

Ad hoc surveillance will inevitably proliferate. Dropcam and other cheap surveillance programs, already popular among the tech-savvy, will spread widely. DIY and vigilante panopticons will complicate matters. Imagine someone like George Zimmerman, the Florida neighborhood watchman, equipped not with a gun but with a digital surveillance net, allowing him to track pretty much anything—on his smartphone.

With data multiplying exponentially and technology inexorably advancing, the question is not whether all-encompassing surveillance systems will be deployed. The question is how, when, and how many.

In the absence of settled laws and norms, the role of engineers looms large. They will shoulder much of the burden of designing systems in ways that limit the damage to innocents while maximizing the pressures brought to bear on bad guys.

But where do the responsibilities of engineers begin and end?

It is too early to answer conclusively, but engineers would do well to keep a few fundamental principles in mind (the first and third are sketched in code after the list):

1. Keep humans in the loop, but insist they follow the “rules of the road.” Compiling and analyzing data can be done by machines. But it would be best to design these surveillance systems so that a human reviews and ponders the data before any irreversible actions are taken. If citizens want to spy on one another, as they inevitably will, impose binding rules on how they do so.

2. Design self-correcting systems that eject tainted or wrong information fast and inexpensively. Create a professional ethos and explicit standards of behavior for engineers, code writers, and designers who contribute significantly to the creation of panopticon-like systems.

3. Delete the old stuff routinely. Systems should mainly contain real-time data. They should not become archives tracing the lives of innocents.
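To make the first and third of these principles concrete, here is a minimal sketch in Python. It is not drawn from the column and describes no real product; the names and the 24-hour retention window (Observation, Store, RETENTION, purge_stale) are assumptions chosen purely for illustration. The point is only that records expire on a fixed schedule and that no record can feed any action until a named human reviewer signs off.

```python
# Hypothetical sketch of principles 1 and 3: observations expire after a fixed
# retention window, and nothing becomes "actionable" until a named human
# reviewer has signed off. All names here are illustrative, not a real system.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from typing import List, Optional

RETENTION = timedelta(hours=24)  # principle 3: keep only recent, near-real-time data


@dataclass
class Observation:
    source: str
    note: str
    seen_at: datetime
    reviewed_by: Optional[str] = None  # principle 1: set only after human review


@dataclass
class Store:
    records: List[Observation] = field(default_factory=list)

    def add(self, obs: Observation) -> None:
        self.records.append(obs)

    def purge_stale(self, now: datetime) -> None:
        # Delete the old stuff routinely; the system never becomes an archive.
        self.records = [r for r in self.records if now - r.seen_at <= RETENTION]

    def approve(self, obs: Observation, reviewer: str) -> None:
        # A human reviews and ponders the data before anything irreversible happens.
        obs.reviewed_by = reviewer

    def actionable(self) -> List[Observation]:
        # Only reviewed, still-fresh records may feed any downstream action.
        return [r for r in self.records if r.reviewed_by is not None]


if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    store = Store()
    store.add(Observation("camera-03", "unattended bag at gate", seen_at=now))
    store.add(Observation("camera-07", "old clip", seen_at=now - timedelta(days=3)))
    store.purge_stale(now)                         # the three-day-old record is dropped
    store.approve(store.records[0], reviewer="on-duty operator")
    print([o.source for o in store.actionable()])  # ['camera-03']
```

In practice the retention window and the review requirement would be set by policy or law rather than hard-coded, but the shape of the design is the same: a review gate in front of any action, and deletion by default.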

Engineers acting responsibly are no guarantee that panopticons will not come to control us. But they can be part of getting this brave new world right.

—G. Pascal Zachary

G. Pascal Zachary is the author of Endless Frontier: Vannevar Bush, Engineer of the American Century (Free Press, 1997). He teaches at Arizona State University.

Illustration: Colin Anderson/Getty Images
