
Accountability in the Digital Age

Social, Technological, Political, and Commercial Implications

Angelina Bishman

Submitted to the 2016 Robert Davies Commemorative Essay Award
Skoll Centre for Social Entrepreneurship
University of Oxford

A consideration of the challenges of digital accountability and a theorisation of potential solutions that combine the public and private sectors.

TABLE OF CONTENTS

Executive Summary

Introduction

Chapter 1) Accountability in the digital age
  1.1) The three properties of accountability: transparency, standards and liability

Chapter 2) Analysing accountability in four domains of the digital age
  2.1) Social domain: personal privacy
  2.2) Technological domain: the regulatory gap
  2.3) Political domain: cyberterrorism
  2.4) Commercial domain: monopolisation

Chapter 3) Potential solutions and policy implications
  3.1) Extrapolating patterns
    3.1.1) Accountability in the digital age is a human, not a technological issue
    3.1.2) Fostering accountability in the digital age requires the respective and combined efforts of the public and private sector
  3.2) The "Alpha-Beta-Gamma" model for evaluating accountability decisions in the digital age and their ethical implications

Conclusion

List of Works Cited

EXECUTIVE SUMMARY

In this essay, I will challenge the popular assumption that increasing accountability, irrespective of the context, always proves beneficial. Specifically, I will argue that accountability in the digital age is a concept that involves trade-offs and implications for different stakeholders. My argument is structured into three chapters. First, I define accountability as a relational concept that must satisfy three properties: transparency, standards and liability. I proceed to transpose this definition of accountability to the digital age such that each property reflects distinctive demands of the digital age. Second, I evaluate the current climate of accountability in four domains of the digital age: social, technological, political and commercial. I largely restrict my analysis of each domain to an explicit issue, chosen on the basis of its contemporary resonance. Third, I extrapolate two patterns about accountability in the digital age that can influence policy prescriptions. I conclude the third and final chapter by introducing the "Alpha-Beta-Gamma" model, which is borne out of my findings. I propose that the model helps to facilitate analysis of accountability decisions in the digital age and their ethical implications.

INTRODUCTION

In recent decades, accountability has transformed from a "culturally innocuous term" to a "cultural keyword" (Dubnick, 2014: 23). Evidence of this transformation is prominent. Forgotten are the days when uttering "accountability" conjured up banal images of a financial audit in which companies' ledgers were examined for ill-considered postings according to objective criteria. Today, accountability broaches many contexts, from healthcare reform to business ethics. This essay seeks to transpose accountability to yet another context: the digital age. When defining accountability in the digital age, it is insufficient to invoke a de-contextualised dictionary definition of accountability or trace its etymological roots. Rather, a plausible definition should heed how the concept is currently being used.

When one uses accountability, it is typically in conjunction with modifiers: one speaks of political or financial accountability, where accountability is "surrendered to its contextualized meanings" (Dubnick, 2014: 27). Whatever substantive meaning accountability might have is therefore "overwhelmed and subordinated to the demands" of the digital age's specific task environments (ibid). At a cursory glance, there are two demands that accountability in the digital age necessarily considers: practical and ethical. Both demands imply the importance, if not the urgency, of formulating a well-considered and operational definition of accountability in the digital age.

Practically, defining accountability in the digital age can provide us with a useful instrument "to be pulled" from the policy toolbox to manage specific problems (Dubnick, 2014: 25). Though we have already felt the immense impact of the digital revolution, we are only just beginning to understand its implications. This is exacerbated by the inherent difficulty of keeping our current regulatory systems attuned to the rapid pace of technological development. Defining accountability in the digital age could advance discussions about how we can be both proactive and reactive in the face of novel technologies and their risks.

Ethically, defining accountability in the digital age is closely tied to efforts to bring about change and reflects the view that current governance of the digital age is not sufficiently ethically robust. Today, accountability has "become narratively intertwined" with the promises of justice, efficiency and greater administrative performance, signifying the way that the term has become "discursively associated with what are perceived to be higher public values" (ibid: 33). Defining accountability in the digital age could stimulate discussions about whether or not fostering accountability in this area makes our relationship with technology more (or less) sensitive to public values and ethical commitments.

 

I will now define accountability in the digital age by taking a fundamental definition of accountability and adapting it to the demands of the digital age. There is broad agreement that accountability is a relational concept that satisfies three properties: transparency, standards and liability. Accountability is organised around a relationship between an agent, who owes an account, and a principal, to whom the account is owed; the properties of transparency, standards and liability function to sustain that relationship. Accountability's main objects are individuals, businesses, national governments, public-private partnerships, non-state actors and international institutions. Importantly, accountability does not have to be a relationship between two objects exclusively: a threefold relationship is possible when national governments and international institutions collaborate to hold a certain agent accountable. Moreover, individual objects are not confined to being either an agent or a principal; they can in different circumstances be both.

CHAPTER 1: ACCOUNTABILITY IN THE DIGITAL AGE

As previously defined, accountability is a relation that satisfies three properties: transparency, standards and liability. I will now outline each property in two ways: first, as it has been traditionally conceived, and second, as it is adapted to the demands of the digital age.

The Three Properties of Accountability

Property One: Transparency

Agent-principal relationships intrinsically involve an asymmetry of information: the agent knows more about its own behaviour than anyone else. Increased transparency reduces this asymmetry, enabling the principal to examine the agent's behaviour and hold it accountable.

Transparency can be adapted to the digital age in two ways. First, the digital age sees data being more easily acquired and shared; intuitively, therefore, the digital age is a more transparent one. However, the deluge of data has also made discerning relevant information more difficult. Second, technological innovations have allowed for data encryption, which reduces transparency by rendering information inaccessible.

Property Two: Standards

Accountability differs from coercion in that it appeals to previously established standards, formal and informal. These standards offer predictable measures designed to mitigate potential harm or promote compliance.

Accountability standards can be adapted to the digital age in one way: devising appropriate standards involves considering important trade-offs between control and innovation, since overbearing standards can potentially stifle technological progress.

Property Three: Liability

Without liability, accountability processes are empty because they are unenforceable. To be accountable is also to be liable for one's actions. The type of standards that an agent breaches determines the agent's liability: breaching mandatory standards typically has more severe repercussions than breaching voluntary ones.

Liability is adapted to the digital age by considering the fact that various industries have introduced new and unpredictable risks. The prospect of being liable or punishable can incentivise agents to harness the opportunities of the digital age more responsibly.

CHAPTER TWO: EVALUATING ACCOUNTABILITY IN FOUR DOMAINS OF THE DIGITAL AGE

"The digital age" is a broad term referring to all the phenomena that characterise the historical shift from traditional industry to information computerisation (Schell, 2007). To achieve a manageable yet comprehensive analysis of the digital age, I will compartmentalise it into four explicit areas: its social, technological, political and commercial implications. Each area will be explored via a key issue, followed by an assessment of its current climate of accountability.

The Four Domains of the Digital Age and Their Key Issues

The Social Domain

Issue Area: Personal Privacy

In the digital age, governments have amassed private information on their citizens with unprecedented efficiency. Businesses have behaved similarly, acquiring information on consumer "spending habits, magazine subscriptions, web-surfing activity and credit history" (Solove & Rotenberg & Schwartz, 2006: 163). Often, consumers are unwittingly complicit in their own privacy breaches, signing away rights in terms and conditions. The threat to privacy worsens as governments and businesses increasingly share information with each other.

 

Appraising the current climate of accountability

Transparency: We often know little about the use of our personal information. Roberts (2006: 18) highlights that "disclosure laws have been carefully tailored" to ensure that certain government enclaves, like the security sector, operate in secrecy. Whilst there are pragmatic reasons for limited governmental transparency here, the right to privacy, once eroded, is difficult to recover. We require that the knowledge gap be bridged by an effective and 'tech-savvy' media that informs the public as to how their data is being used.

 

Standards: Imposing transparency standards on governments through democratic expectation is not always rational, especially where such standards would render the intelligence services ineffective. More rigorous standards regulating the private sector's use of customer data would, however, be welcome. A requirement for more accessible terms and conditions would be a useful first step to bridge the knowledge gap between consumers and data-monopolists. Furthermore, standards should be updated to better acknowledge the fact that governments and businesses now partake in information sharing.

 

Liability: Trust-based accountability can possibly be relied upon when consumers hold companies accountable to established norms for data collection. This is enabled by the possibility of a consumer boycott, as long as the knowledge gap is bridged by an effective and 'tech-savvy' media. However, the fact that companies can seek to excuse and, perhaps, even cover up their misuse of consumer data means that a deterrent through sanctions may be necessary. This is particularly true in complex domains which are not easily understood by the public. While the government cannot be kept completely accountable by statute, the public should continue to pressure governments not to overstep their boundaries.

 

 

The Technological Domain

Issue Area: The Regulatory Gap

It is widely acknowledged that the "tech industry" is the dominant entrepreneurial force of our age. Whilst software advancement reaches incredible heights, our legislative systems lag behind. This 'regulatory gap' can lead to impotent governance and moral hazards, where technologies are used with no expectation of potential punishment.

 

Appraising the current climate of accountability

Transparency: The explorative nature of technological progress means that the tasks performed by these companies are very difficult to understand. However, the difficulty of understanding how a company is harnessing new technologies has no bearing on the importance of companies upholding transparency in their use of these technologies. For instance, the use of artificial intelligence raises ethical questions and is also highly complex (Bostrom and Yudkowsky, 2011).

 

Standards: Using standards to close the regulatory gap will stifle innovation to some degree. The alternative is companies catering to the whims of consumers whilst regulators are rendered impotent, as agents use new technology to evade accountability by exploiting the regulatory gap. However, an early imposition of light regulation followed by gradual increments in standards is likely to be the best approach. This requires those forming the standards to collaborate with 'thought leaders' in the relevant technological fields and to establish the likely risk areas and how best to deal with them. The alternative is a knee-jerk legislative response which harms technological progress over the longer term, as companies refrain from innovation.

 

Liability: There are certainly cases where sanctions-based accountability should be preferred to trust-based accountability. Artificial intelligence (AI) is an area which needs to be dealt with carefully, particularly when it is required to answer questions of morality (Bostrom and Yudkowsky, 2011). Areas such as this should involve the potential for sanctions, as the costs of not doing so may be too great.

 

 

The Political Domain

Issue Area: Cyberterrorism

The emergence of cyberterrorism, "politically motivated attacks on computer systems for achieving violent outcomes", attests to the fact that the digital age is "altering the nature of conflict" (Arquilla & Ronfeldt, 2001: 1). Nearly every institution and individual is tied to a virtual network to some extent, making us vulnerable to cyberterrorism in various forms. Moreover, while cyberterror tactics cannot replace conventional terrorist tactics, they do have the distinct advantage of serving as a "force multiplier": they can create greater effect when executed in concert with other traditional terrorist activities (CEDAT, 2008: 73).

 

Appraising the current climate of accountability

Transparency: Whilst cyberterrorism operates on the basis of opacity, there are measures to create more transparency. At the international level, this occurs via intelligence-sharing amongst countries. At the national level, governments can use their own technological prowess (aided by the private sector) to infiltrate cyberterrorist networks. Private firms, such as Palantir Technologies, have also aided governments in their efforts to identify perpetrators amongst the mass of online data.

 

Standards: Cyberterrorism can arise "anywhere in the world", making "investigation, producing evidence and taking the offenders to court an immense task that can only be achieved through international cooperation" (Cottim). However, there are currently significant deficiencies in the international standards for combating cyberterrorism. As laws are instituted or revised according to new trends in cyberterrorism, more discussion is necessary to make clear what the national and international procedures should be for handling cyberterrorism.

Given that technology companies' products are being used by cyberterrorists, companies are now also being held responsible for destructive use of their products. The capitulation of MasterCard and PayPal to political pressures in the WikiLeaks saga demonstrated how private sector 'inaction' is considered increasingly indefensible (Greenberg, 2011).

 

Liability: Given the detrimental effects of cyberterrorism, the notion that governments should leave internet communities to self-regulate is quickly abandoned in favour of instituting legal sanctions against cyberterrorists. Scholars maintain, however, that legal measures are insufficient for containing cyberterrorism and that liability needs to be cultivated elsewhere. This has already occurred via public and private sector advocacy of responsible use of virtual platforms.

 

 

The Commercial Domain

Issue Area: Monopolisation

There is much talk of the "digital divide": the inequalities that arise from differences in the access to, use of, or benefit from information and communication technology. It is also apparent in the commercial realm, where certain businesses' superior use of technology leads to them achieving monopoly status. Silicon Valley provides a formidable hub for entrepreneurship but breeds monopolies which justifiably concern antitrust regulators (Fairless & Drozdiak, 2015). Overregulation of these firms may stifle innovation, but the concentration of economic power within these firms can have detrimental effects.

 

Appraising the current climate of accountability

Transparency: Many commercial initiatives rely on monopolisation and, therefore, privatisation of data in order to make profit. As such, it is often difficult to obtain the relevant data which would allow the public and government to establish whether or not a company is engaging in anticompetitive practice.

 

Standards: Determining appropriate standards of accountability practice among businesses with monopolistic intentions involves delicate trade-offs. Pro-competition regulation should, in terms of free-market theory, encourage the efficient provision of any given good. However, Schumpeterian economics indicates that stripping companies of their monopolistic power will stifle innovation (Magnusson, 1994).

 

Liability: Antitrust law already exists, but the emerging regulatory gap means that there is a strong likelihood of government increasing sanctions on monopolies.

CHAPTER THREE: POTENTIAL SOLUTIONS AND POLICY IMPLICATIONS

Extrapolating Patterns

Upon evaluating four climates of accountability, two patterns can be extrapolated about accountability in the digital age, with important policy implications.

1. Accountability in the digital age is a human, not a technological issue

 

The speed of technological developments complicates efforts to operationalise accountability in the digital age. Although it is important to acknowledge that the tools at our disposal may be insufficient for regulating technological developments, we should be cautious about consigning ourselves to 'technological determinism': a view of the unintended consequences of the digital age in which technology constitutes an autonomous, exogenous force shaping society (Smith & Marx, 1994). The major problem with technological determinism is that it absolves us of responsibility for our "making and use of technology" (Wyatt, 2008). Such a view should be displaced by the more plausible view of "interplay" between society and technology, where each "responds dynamically to the moves of the other" (Cranor and Wildman, 2003: introduction).

 

By recognising that there is a dialectical relationship between society and technology, we better understand the fundamental role of human agency, willpower and innovation in rendering the digital age a more successful, equitable and benevolent project. Instead of fixating on the speed of technological development, we should be more attentive to how the 'stickiness' of our institutions has also contributed to various regulatory gaps, where existing legislation proves defunct in regulating novel technologies. Just as humans have designed technologies with far-reaching capabilities, humans can also effectively reform the institutions in which these technologies are utilised. More thought is needed, however, about reforming our institutions strategically. We should begin by questioning various established views about what constitutes strategic reform; in this essay, for example, we have analysed whether increasing accountability in the digital age always proves to be beneficial, as is popularly assumed.

 

In Chapter Two, we saw how operationalising accountability in the digital age involves conducting difficult trade-offs: between control and technological innovation, and between personal privacy and security. Just as it can be misleading and dangerous to equate technological change with progress (Wyatt, 2008), it can also be misleading and dangerous to conceive of accountability in the digital age as only conferring benefits on society. Accountability is a value-laden term with different implications for its diverse stakeholders. We can recall that, vis-à-vis the issue of personal privacy, increasing accountability could result in increased symmetry of information between the individuals whose data are being collected and the data collectors. However, increasing accountability could also undercut national security and potentially foster unhelpful distrust between the two parties.

 

Awareness that accountability is not a one-dimensional phenomenon can lead to strategies for reforming our institutions that pay greater attention to the benefits of incremental change. Given that citizens have an inherent difficulty managing their affairs in the absence of stability and predictability in law, piecemeal and ad hoc institutional change could prove more beneficial than reactive, hurried forms of legislative change. Most importantly, incremental change could give more room to public debate, moral reasoning and education for ascertaining the costs and benefits of increasing accountability in the digital age on a case-by-case basis.

 

2. Fostering accountability in the digital age requires the respective and combined efforts of the public and private sector

 

The public and private sectors can serve to counterbalance each other in accountability terms. We have seen how governments can exploit technological advances to amass information on their citizens more effectively. Even though some businesses have collaborated with governments in information-sharing, other companies have sought to counteract these processes: the market for privacy has flourished. This market paradoxically complements the growing market for personal information, as there is a counter-demand from individuals wanting to keep that information private.

 

Conversely, the public and private sectors can also act as mutual aids in upholding accountability. For example, the fight against cyberterrorism has seen an outsourcing from the public to the private sector. This is exemplified by the establishment of Cyber Fast Track (CFT) by U.S. Homeland Security, which awarded short-term contracts to private entities for targeted network-security projects. CFT was part of a government-led shift towards "democratized, crowd-sourced innovation" (Schmidt & Cohen, 2014: 168).

 

 

 

The "Alpha-Beta-Gamma" Model

The "Alpha-Beta-Gamma" model below is used as a basis for answering and evaluating the following questions:

 

1. Would a change in the level of accountability in the principal-agent relationship be accepted by the principal?

2. Would a change in the level of accountability be ethically permissible?

 

It must be stated, however, that this model is not empirically grounded; rather, it is used to consolidate the analysis performed thus far into a single construct. It is therefore borne out of the earlier analysis, rather than formulated independently and then applied. The assumptions upon which it rests prevent it from being wholly realistic, but it is nonetheless a useful analytical device for considering these questions.

 

The diagram shown below illustrates a single possible principal-agent relationship within the context of the model. The notation within the diagram, however, applies to all potential accountability structures.

[Diagram not reproduced in this extraction: "New Ethical Codes for the Digital Age", showing a principal-agent relationship and its α, β and γ accountability links.]

 

Key

As explained below, α, β and γ are calculated in the same manner but are distinguished for the purposes of this model as follows:

- "α" relates to the primary principal-agent relationship.
- "β" relates to all other accountability relationships which both are affected by and affect the Principal. This is illustrated within the diagram by existing as part of a closed loop of accountability relationships, of which "α" is also a part.
- "γ" relates to all accountability relationships which do not affect but are affected by the Principal, i.e. those which do not satisfy the criteria to be an "α" or "β".

For any party $I$, the value of each variable $\alpha_I$, $\beta_{Ii}$ and $\gamma_{Ii}$ is the net benefit of the level of accountability of that relationship to party $I$; the subscript $i$ differentiates between different betas and gammas.

N.B. Accountability is itself a function of transparency, standards and liability, as previously defined.

1. Because the principal is self-interested, its desire for increased accountability is governed by the following function (with the subscript $P$ representing the principal):

$$\Delta f(\alpha_P, \beta_P) = \Delta\alpha_P + \sum_{i} \Delta\beta_{Pi}$$

where $\Delta x$ represents the change in $x$.

Decision rule: the principal accepts the increase in accountability where $\Delta f(\alpha_P, \beta_P) > 0$.

2. To account for the effect of a change in accountability in ethical terms, the effect is expressed in relation to all existing parties (with $A$ representing all parties affected by a change in the primary accountability relationship):

$$\Delta f(\alpha_A, \beta_A, \gamma_A) = \Delta\alpha_A + \sum_{i} \Delta\beta_{Ai} + \sum_{i} \Delta\gamma_{Ai}$$

Decision rule: the change in accountability is ethically permissible where $\Delta f(\alpha_A, \beta_A, \gamma_A) > 0$.
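To make the decision rules concrete, the following is a minimal sketch in Python of how the model could be evaluated. It is an illustration only: the names (AccountabilityChange, principal_accepts, ethically_permissible) and the representation of each relationship's net benefit as a single number are assumptions introduced here, not part of the essay.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class AccountabilityChange:
    """Deltas in net benefit arising from one proposed change in accountability.

    Each number is the change in the net benefit that a relationship's level of
    accountability confers on the relevant party. Accountability itself is
    treated as already aggregated from its three properties (transparency,
    standards and liability).
    """
    d_alpha_P: float          # Δα_P: primary relationship, as valued by the principal
    d_betas_P: List[float]    # Δβ_Pi: relationships that affect and are affected by the principal
    d_alpha_A: float          # Δα_A: primary relationship, valued across all affected parties
    d_betas_A: List[float]    # Δβ_Ai: β relationships, valued across all affected parties
    d_gammas_A: List[float]   # Δγ_Ai: relationships affected by, but not affecting, the principal


def principal_accepts(c: AccountabilityChange) -> bool:
    """Decision rule 1: accept where Δf(α_P, β_P) = Δα_P + Σ Δβ_Pi > 0."""
    return c.d_alpha_P + sum(c.d_betas_P) > 0


def ethically_permissible(c: AccountabilityChange) -> bool:
    """Decision rule 2: permit where Δf(α_A, β_A, γ_A) = Δα_A + Σ Δβ_Ai + Σ Δγ_Ai > 0."""
    return c.d_alpha_A + sum(c.d_betas_A) + sum(c.d_gammas_A) > 0
```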


The model is developed from the perspective of the principal because the principal is considered to be the primary driver of the level of accountability. This is presupposed because, based on the examples discussed, the principal typically wields the ability to alter the accountability relationship. The principal is assumed to be self-interested and rational, but there are different incarnations of self-interest. If the principal is a government, self-interest translates into political self-preservation and pleasing the electorate. If, however, the principal is a company, then self-interest is restricted to the success of the company and satisficing key stakeholders.

 

The decision rule for the first question encompasses the primary principal-agent relationship and any relationships which, if affected, would in turn affect the principal. The "γ" relationships do not affect the principal and are therefore not included in the decision rule. Take an example where the principal is the government and the agent is a "tech monopolist" providing a popular service to consumers. Increased accountability benefitting the government may be accompanied by a backlash from consumers if it restricts the ability of the company to provide the service. In this case, the change in accountability may result in a localised net gain ($\Delta\alpha_P > 0$) but an overall net loss ($\Delta f(\alpha_P, \beta_P) < 0$) for the principal, so the change would be rejected.
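Continuing the sketch above, this rejection case can be expressed with purely illustrative numbers (the magnitudes are invented for the example):

```python
# Worked example: government (principal) tightens accountability on a tech
# monopolist (agent). The primary relationship improves for the government,
# but a consumer backlash feeds back through a β relationship.
change = AccountabilityChange(
    d_alpha_P=2.0,            # localised net gain: Δα_P > 0
    d_betas_P=[-5.0],         # consumer backlash: Δf(α_P, β_P) = 2 - 5 < 0
    d_alpha_A=2.0,
    d_betas_A=[-5.0],
    d_gammas_A=[-1.0],        # indirect parties also lose slightly
)
print(principal_accepts(change))      # False: the principal rejects the change
print(ethically_permissible(change))  # False: a net loss across all parties
```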

   

The decision rule for the second question incorporates all of the human effects of the change in accountability. Thus, we are assessing the effect of the decision on all direct and indirect stakeholders. The decision rule therefore qualifies as an ethical one by considering the welfare of all parties. As such, the model produces both an incentive-based and an ethics-based outcome. The use of ethics here is particular to the digital age because it incorporates the interconnectedness which defines the digital age.

 

There are, however, clear limitations to the model. It makes the assumption that no "γ" relationship affects the principal, which in practice may not be the case. There is also no explicit acknowledgement of the interrelation of the accountability links, which are seen instead as a series of single bilateral interactions. Furthermore, the ethics-based decision rule is fairly simplistic, though there is scope for nuance. One option would be to require Pareto optimality, whereby a gain to society cannot be accompanied by a net loss for any individual group. This could perhaps be softened, for the sake of pragmatism, to allow small net costs for some parties in order to benefit society as a whole. In any case, this model hopes to enable principals to incorporate an ethical calculation into accountability decision-making processes.
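As a sketch of that stricter rule under the same illustrative representation, the ethical check could be replaced with a Pareto-style condition; the tolerance parameter is an assumption added here to capture the "softened" variant described above:

```python
def pareto_permissible(c: AccountabilityChange, tolerance: float = 0.0) -> bool:
    """Stricter ethical rule: the total must be a net gain AND no single
    relationship may lose more than `tolerance`. A tolerance of 0.0 gives the
    strict Pareto-style condition; a small positive value gives the 'softened'
    pragmatic variant."""
    parts = [c.d_alpha_A, *c.d_betas_A, *c.d_gammas_A]
    return sum(parts) > 0 and all(p >= -tolerance for p in parts)
```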

CONCLUSION

I began this essay by defining accountability in the digital age as a relational concept based on three properties: transparency, standards and liability. I employed this definition to evaluate the current climates of accountability in the social, technological, political and commercial domains of the digital age. Various striking findings ensued. First, it was apparent that accountability in the digital age is a human, not a technological issue. Specifically, within each of the four domains of the digital age, accountability is clearly a value-laden term, with different meanings and implications for the different stakeholders involved, whether they be cyberterrorists, businesses or public-private partnerships. Second, it was apparent that fostering accountability in the digital age involves the respective and combined efforts of the public and private sector. This is because they can act as beneficial counterweights or mutual aids in accountability terms. In order to understand the policy implications of my findings, I proceeded to use the "Alpha-Beta-Gamma" model to evaluate accountability decisions in the digital age and their ethical implications.

Word Count: 3,999 (excluding title pages, table of contents and list of works cited)

LIST OF WORKS CITED

Akdeniz, Y. & Walker, C., (2000), The Internet, Law and Society, Harlow.

Beniger, J., (1986), The Control Revolution: Technological and Economic Origins of the Information Society, Harvard University Press.

Bimber, B., (1994), "The Faces of Technological Determinism", MIT Press.

Borowiak, C., (2011), Accountability and Democracy: the Pitfalls and Promise of Popular Control, Oxford University Press.

Bostrom, N. & Yudkowsky, E., (2011), "The Ethics of Artificial Intelligence", Cambridge Handbook of Artificial Intelligence, Ed. William Ramsey & Keith Frankish, Cambridge University Press, 1-20. Print.

Bovens, M. & Goodin, R. & Schillemans, T., (2014), The Oxford Handbook of Accountability, Oxford University Press.

Cavelty, D. & Mauer, V. & Krishna-Hensel, S., (2007), Power and Security in the Information Age: Investigating the Role of the State in Cyberspace, Ashgate.

Colarik, A., (2006), Cyber Terrorism: Political and Economic Implications, Idea Group Pub.

Compaine, B., (2011), The Digital Divide: Facing a Crisis or Creating a Myth?, MIT Press.

Cukier, K., (2010), "Data, Data Everywhere", The Economist, February 25, 2010: http://www.economist.com/node/15557443 (accessed February 4 2016).

Davenport, T. & Harris, J., (2007), "The Dark Side of Consumer Analytics", Harvard Business Review, May 7 2007: https://hbr.org/2007/05/the-dark-side-of-customer-analytics (accessed February 4 2016).

Dubnick, M., (2014), "Accountability as a Cultural Keyword", The Oxford Handbook of Accountability, Ed. Mark Bovens & Robert Goodin & Thomas Schillemans, Oxford University Press, 23-35. Print.

Egede, T., (2004), An Annotated Bibliography of the Accountability of Multinational Corporations: A Review of International Human Rights Law, Centre for Business Relationships, Accountability, Sustainability & Society.

Fairless, T. & Drozdiak, N., (2015), "Google Owner Accuses EU of Antitrust About-Face", The Wall Street Journal, November 2 2015: http://www.wsj.com/articles/google-troubled-by-ambiguity-in-eu-antitrust-case-1446485853 (accessed February 7 2016).

Gailmard, S., (2014), "Accountability and Principal-Agent Theory", The Oxford Handbook of Accountability, Ed. Mark Bovens & Robert Goodin & Thomas Schillemans, Oxford University Press, 89-101. Print.

Graham, M. & Dutton, W., (2014), Society and the Internet: How Networks of Information and Communication Are Changing Our Lives, Oxford University Press.

Greenberg, A., (2011), "On The Anniversary of Cutting Off WikiLeaks, PayPal Slaps Christmas Charity Project", Forbes, December 6 2011: http://www.forbes.com/sites/andygreenberg/2011/12/06/on-the-anniversary-of-cutting-off-wikileaks-paypal-slaps-christmas-charity/#66bfbd317b2e (accessed February 2 2016).

Hackett, E., Amsterdamska, O., Lynch, M. & Wajcman, J., (2007), The Handbook of Science and Technology Studies, MIT Press.

Heickero, R., (2013), The Dark Sides of the Internet: On Cyber Threats and Information Warfare, Peter Lang.

Lach, E., (2010), "YouTube Now Lets Users Flag Videos That "Promote Terrorism"", Business Insider, December 14 2010: http://www.businessinsider.com/youtube-now-lets-users-flag-videos-that-promote-terrorism-2010-12?IR=T (accessed January 29 2016).

Leckart, S., (2015), "The Hackathon Fast Track, From Campus to Silicon Valley", The New York Times, April 6, 2015: http://www.nytimes.com/2015/04/12/education/edlife/the-hackathon-fast-track-from-campus-to-silicon-valley.html (accessed February 3 2016).

Lindberg, S., (2009), Accountability: The Core Concept and Its Subtypes, Overseas Development Institute.

Lohr, S., (2015), "As Tech Booms, Workers Turn to Coding for Career Change", The New York Times, July 28, 2015: http://www.nytimes.com/2015/07/29/technology/code-academy-as-career-game-changer.html (accessed February 3 2016).

Locke, R., (2013), The Promise and Limits of Private Power, Cambridge University Press.

Lucker, J., (2014), "Leveraging Consumer Analytics", Harvard Business Review, January 15 2014: http://www.economist.com/node/15557443 (accessed February 4 2016).

Magnusson, L., (1994), Evolutionary and Neo-Schumpeterian Approaches to Economics, Kluwer Academic Publishers.

Mansbridge, J., (2014), "A Contingency Theory of Accountability", The Oxford Handbook of Accountability, Ed. Mark Bovens & Robert Goodin & Thomas Schillemans, Oxford University Press, 55-66. Print.

May, C., (2002), The Information Society: A Skeptical View, Cambridge Polity Press.

Melendez, S., (2015), "The Office is Watching You", Fast Company, May 22, 2010: http://www.economist.com/node/15557443 (accessed February 4 2016).

Mossberger, K. & Tolbert, C. & Stansbury, M., (2003), Virtual Inequality: Beyond the Digital Divide, Georgetown University Press.

Mossberger, K., (2008), Digital Citizenship: The Internet, Society, and Participation, MIT Press.

Neyland, D., (2006), Privacy, Surveillance and Public Trust, Palgrave Macmillan.

Reich, P. & Gelbstein, E., (2012), Law, Policy, and Technology: Cyberterrorism, Information Warfare and Internet Immobilization, Information Science Reference.

Royal Academy of Engineering, (2007), Dilemmas of Privacy and Surveillance: Challenges of Technological Change, Royal Academy of Engineering.

Rubinstein, I., (2013), "Big Data: The End of Privacy or a New Beginning?", International Data Privacy Law, vol. 3, no. 2, pp. 74-87.

Shapiro, A., (1999), The Control Revolution: How the Internet is Putting Individuals in Charge and Changing the World We Know, Public Affairs.

Schell, B., (2007), The Internet and Society: A Reference Handbook, ABC-CLIO Press.

Schmidt, E. & Cohen, J., (2014), The New Digital Age: Reshaping the Future of People, Nations and Businesses, John Murray.

Schwartz, A., (2012), "A real internet of things for the developing world", Fast Company, 1 August 2012: http://www.fastcoexist.com/1680223/a-real-internet-of-things-for-the-developing-world-and-burning-man (accessed 8 February 2015).

Senker, C., (2011), Privacy and Surveillance, Wayland.

Smith, M. & Marx, L., (1994), Does Technology Drive History? The Dilemma of Technological Determinism, MIT Press.

Steinhauer, J., (2015), "From Cybersecurity to Trade Deals, Bills are Expected to Start Moving", The New York Times, April 13, 2015: http://www.nytimes.com/2015/04/14/us/politics/from-cybersecurity-to-trade-deal-bills-are-expected-to-start-moving-in-congress.html (accessed February 3 2016).

Taddeo, M., (2014), "Information Warfare: The Ontological and Regulatory Gap", APA Newsletter on Philosophy and Computers.

Warren, M., (2014), "Accountability and Democracy", The Oxford Handbook of Accountability, Ed. Mark Bovens & Robert Goodin & Thomas Schillemans, Oxford University Press, 39-52. Print.

Warschauer, M., (2003), Technology and Social Inclusion: Rethinking the Digital Divide, MIT Press.

Whincop, M., (2001), Bridging the Entrepreneurial Financing Gap: Linking Governance with Regulatory Policy, Ashgate.

Zeller, T., (2006), "A Slippery Slope of Censorship at YouTube", The New York Times, October 9 2006: http://www.nytimes.com/2006/10/09/technology/09link.html?_r=0 (accessed February 1 2016).

Zureik, E., (2010), Surveillance, Privacy, and the Globalisation of Personal Information: International Comparisons, McGill-Queen's University Press.