James & Friends’ Systems How To: A Guide to Systems & Applications Research

James Landay, Short-Dooley Professor, Computer Science & Engineering, University of Washington
2012 NSF SoCS PI Meeting, University of Michigan, June 19, 2012


DESCRIPTION

Overview of what makes good systems research, for the 2012 NSF Social Computing Systems (SoCS) PI Meeting held at the University of Michigan, Ann Arbor (Jun 17-19, 2012)

TRANSCRIPT

Page 1: Systems research-socspi-2012-06-19

James Landay
Short-Dooley Professor, Computer Science & Engineering, University of Washington
2012 NSF SoCS PI Meeting, University of Michigan, June 19, 2012

James & Friends’ Systems How To: A Guide to Systems & Applications Research

Page 2

What Type of Researcher Are You?

A - Discoverer   B - Questioner   C - Maker

Page 3

“With a Little Help From My UIST Friends”

Page 4

Questions Answered

What are the key attributes of strong systems work?

What are the best techniques to evaluate systems & when do they make sense to use?

Which HCI techniques do not make sense in systems research?

How do you distinguish good research from bad?

What are your favorite systems research projects & why?

What makes a good social computing systems research project & what are your favorites?

Page 5

Key Attributes of Strong Systems Research

Compelling Target

• “Solves a concrete, compelling problem with demonstrated need”
  Strong motivation for the problem, with need based in users, costs, or tech issues

• “Solves a compelling set of problems using a unifying set of principles”
  The principles tie the set of problems together

• “Explores how people will interact with computers in the future”
  Takes into account technical & usage trends

Page 6

Key Attributes of Strong Systems Research

Technical Challenge

• “Goes beyond routine software engineering”
  Requires novel, non-trivial algorithms or configuration of components

Deployed When Possible

• “System is deployed & intended benefits & unexpected outcomes documented”
  Not required, but the gold standard for most systems work

Page 7

“Everybody’s Got Something To Evaluate Except Me And My Monkey”

Page 8

Evaluation Methods for Systems Research

“It depends upon the contribution”

“Match the type of evaluation with how you expect the system to be used”

“Multitude of metrics to give you a holistic view”

Page 9

Idea Evaluation
Overall value of system or application

• If extremely novel, the fact that it works & a logical argument to explore the “boundaries of value”
• Real-world deployment (expensive in time & effort)

Page 10

Technical Evaluation
Measure key aspects from a technical perspective

1) Toolkit → expressiveness (“Can I build it?”), efficiency (“How long will it take?”), accessibility (“Do I know how?”)

2) Performance improvement → benchmark (error, scale, efficiency, …)

3) Novel component → controlled lab study*

* May not generalize to real-world conditions

Page 11

Effectiveness Evaluation

1) Usability improvement → controlled lab study*

2) Conceptual understanding → case studies with a few real external users

Page 12

“Honey Don’t Use That Technique”

Page 13

HCI Techniques That Don’t Make Sense

• Usability tests & A/B tests
  “can’t tell much about complex systems”

• Contextual inquiry
  “good for today, but can’t predict tomorrow”

• Traditional controlled empirical studies
  “not meaningful to isolate small number of variables”

Page 14

“I Want You”

Page 15

How Do You Tell Good From Bad?

Good
• “Combines a lot of existing ideas together in new ways … it really is a case of the sum being greater than the parts”
• “Potential for impact”
• “Tries to solve an important problem using novel technology. It is creative & raises new possibilities for human-computer interaction.”

Bad
• “Fails to justify the problem it addresses, uses off-the-shelf technology, or does not teach anything new about how people interact with computers.”
• “Too many concepts—true insight has a simplicity to it”
• “A feature, but not a product or a business”

Page 16

“I Want You”

Page 17

HydroSense (Froehlich, Larson, Fogarty, Patel)

+ Crucial problems; surprising how well it can do with few sensors

Page 18

Prefab (Dixon & Fogarty)

+ “compelling, but not obvious best way … pushes as far as can”

Page 19

Whyline (Ko & Myers)

+ “based on studies of how people debug today”
+ “insight that almost all questions are in the form of ‘why’ or ‘why not’”

Page 20

$100 Interactive Whiteboard (Johnny Lee)

+ “repurposes current tools in a creative way to solve a problem that no one would have imagined possible before he did it”

Page 21

What Makes a Good Social Computing System?

• “Criteria above + involves social interaction as a main feature … facilitates new or enhanced forms of collaborative participation”
• “Combines good theory with good systems building”
• “Finds new ways of combining the best of people and computers together”
• “Good answers to why people will participate at scale”
• “A model of individual user behavior; a model of aggregated social behavior; use that model to build a novel system”
• “Make the system work in the face of malicious behavior”

Page 22

Soylent (Bernstein et al.)

+ “innovative applications for a growing trend (crowdsourcing)”
+ “led to new ideas for how to organize people & computers”
+ “contributed a general design pattern (Find-Fix-Verify)”
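Find-Fix-Verify splits one open-ended crowd edit into three narrow tasks, so no single worker's lazy or erroneous answer dominates the output. A minimal sketch of the pattern's control flow, assuming hypothetical `find`, `fix`, and `verify` callables that stand in for posted crowd tasks (this illustrates the pattern only, not Soylent's actual implementation):

```python
def find_fix_verify(paragraph, find, fix, verify, agree=2):
    """Three-stage crowd pattern: flag spans (Find), propose rewrites
    (Fix), and vote on rewrites (Verify)."""
    # Find: several workers independently flag problem spans; keep
    # only spans flagged by at least `agree` workers.
    votes = {}
    for _worker in range(3):
        for span in find(paragraph):
            votes[span] = votes.get(span, 0) + 1
    spans = [s for s, v in votes.items() if v >= agree]

    # Fix: a second set of workers proposes rewrites for each span.
    candidates = {span: [fix(span) for _ in range(3)] for span in spans}

    # Verify: a third set of workers scores each rewrite; keep the
    # highest-scoring rewrite for each flagged span.
    result = paragraph
    for span, fixes in candidates.items():
        best = max(fixes, key=lambda f: verify(span, f))
        result = result.replace(span, best)
    return result
```

Independent voting in the Find stage filters out spurious flags, and separating Fix from Verify keeps workers from rubber-stamping their own rewrites, which is the separation of concerns the slide credits as the pattern's general contribution.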

Page 23

GroupLens / MovieLens (Riedl, Herlocker, Lam, et al.)

+ “built their own community & used it to develop a long list of compelling research results”
+ “incorporates lots of social science ideas, led to innovations in collaborative filtering, and has actual deployment & lots of use”
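GroupLens popularized neighborhood-based collaborative filtering: predict a user's rating of an unseen item as a similarity-weighted average of ratings from like-minded users. A minimal user-based sketch with an illustrative data layout (a dict of per-user rating dicts); real deployments add mean-centering, significance weighting, and more:

```python
from math import sqrt

def similarity(a, b):
    """Cosine similarity over the items both users rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[i] * b[i] for i in common)
    norm_a = sqrt(sum(a[i] ** 2 for i in common))
    norm_b = sqrt(sum(b[i] ** 2 for i in common))
    return dot / (norm_a * norm_b)

def predict(ratings, user, item):
    """Similarity-weighted average of other users' ratings for `item`.

    Returns None when no other user has rated the item.
    """
    num = den = 0.0
    for other, their in ratings.items():
        if other == user or item not in their:
            continue
        w = similarity(ratings[user], their)
        num += w * their[item]
        den += abs(w)
    return num / den if den else None
```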

Page 24

Many Eyes (Heer, Viégas, Wattenberg)

+ “recognized the social nature of people’s relationships to data visualizations & provided a platform for disseminating”
+ “significant real-world impact in introducing larger audiences to a variety of visualization techniques”

Page 25

Thanks to Contributors

Ben Bederson, University of Maryland
Ed H. Chi, Google Research
Saul Greenberg, University of Calgary
François Guimbretière, Cornell University
Jeffrey Heer, Stanford University
Jason Hong, Carnegie Mellon University
Tessa Lau, IBM Research
Dan Olsen, Brigham Young University