
TRANSCRIPT

Page 1: Bridging the Motivation Gap for Individual Annotators: What Can We Learn From Photo Annotation Systems?

Tabin Hasan
Dept. of Computer Science, University of Trento

Anthony Jameson
FBK-irst, Trento

1st Workshop on Incentives for the Semantic Web
ISWC 2008, Karlsruhe, October 26th, 2008

Contents:
1. The Two Motivation Gaps
2. Bridging the … [See the title]
3. Implications for the Other Workshop Papers (with audience participation)

Page 2: The Community Motivation Gap (common view)

Creating metadata for yourself → Creating metadata for a community

Common strategies:
• Spread the work around
• Provide individual as well as community benefit
• Provide quick confirmation and benefit
• Make incremental contributions usable
• Show what contributions are needed
• Show what the current user is especially qualified to provide
• Provide a fun game
• Minimize privacy concerns
• Publicize contributions

Page 3: The Individual Motivation Gap

Creating metadata for yourself → Exploiting your own metadata

(Compare the community gap: Creating metadata for yourself → Creating metadata for a community)

Page 4: A Photo Annotation Interface

[Screenshot: "Sri Lanka dusk photos" detected by PhotoCompas using contextual metadata. From Naaman et al. (2004).]

Page 5: [Framework diagram]

Metadata: Quantity and Quality

Algorithms
· Clustering of similar objects
· Classifying into known classes
· Improvement of clustering or classification (via learning)

User Interface
· Facilitation of batch annotation
· Recommendation of metadata
· Enjoyable actions and feedback
· Minimization of tedious actions
· Easy checking
· Final judgment by user

User Input
· Annotation: sorting, labeling
· Naturally occurring: relevance feedback while searching, annotating while sharing

External Resources
· Existing related text
· Already annotated objects
· Information about contexts
· Persons who can provide judgments

Affordances of Situations
· Data currently clustered or classified?
· Time and attention available for annotation?
· Naturally occurring user actions that yield metadata?
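The boxes "Recommendation of metadata", "Already annotated objects", and "Final judgment by user" can interact in a very simple way. The Python sketch below (illustrative only, not any of the systems cited here) ranks candidate tags for an event cluster from the tags already attached to its other photos:

    from collections import Counter
    from typing import Dict, List

    def suggest_tags(cluster_photo_ids: List[str],
                     existing_tags: Dict[str, List[str]],
                     top_n: int = 5) -> List[str]:
        """Recommend tags for a cluster by ranking the tags already attached
        to other photos in the same cluster.  The user keeps the final
        judgment: suggestions are offered, never applied automatically."""
        counts = Counter(tag
                         for pid in cluster_photo_ids
                         for tag in existing_tags.get(pid, []))
        return [tag for tag, _ in counts.most_common(top_n)]

    # p1 and p2 are already annotated; p3 is new but in the same cluster.
    tags = {"p1": ["sri lanka", "dusk", "beach"], "p2": ["sri lanka", "dusk"]}
    print(suggest_tags(["p1", "p2", "p3"], tags))   # ['sri lanka', 'dusk', 'beach']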

Page 6: External Resources

[Screenshot: "Sri Lanka dusk photos" detected by PhotoCompas using contextual metadata. From Naaman et al. (2004).]

Page 7: [Framework diagram, repeated from Page 5]

Page 8: Algorithms, User Interface

[Screenshot: SAPHARI, from Suh and Bederson (2007).]

Page 9: [Framework diagram, repeated from Page 5]

Page 10: User Input, Affordances of Situations

[Screenshot: ARIA; cf. Lieberman, Rosenzweig, and Singh (2001).]

Page 11: Tag Learning

[Framework diagram, repeated from Page 5]

• Learning tags from examples
• Geographic and event data sources
• Suggesting annotations of new photos
• Correcting incorrect suggested annotations
• Best when the user is uploading many photos related to a given place or event
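One plausible reading of these points (a sketch, not the actual Tag Learning proposal) is a learner keyed on place or event identifiers that suggests the tags seen most often for that key and adjusts its counts when the user corrects a wrong suggestion:

    from collections import defaultdict
    from typing import Dict, List

    class TagLearner:
        """Toy tag learner: associates a place/event key (e.g. a geocoded
        place name or event id) with the tags seen so far, suggests the most
        frequent ones for new photos, and updates its counts when the user
        corrects a suggestion."""

        def __init__(self) -> None:
            self._counts: Dict[str, Dict[str, int]] = defaultdict(lambda: defaultdict(int))

        def observe(self, key: str, tags: List[str]) -> None:
            """Learn from an already-annotated photo uploaded for this key."""
            for tag in tags:
                self._counts[key][tag] += 1

        def suggest(self, key: str, top_n: int = 5) -> List[str]:
            """Suggest annotations for a new photo related to this key."""
            counts = self._counts.get(key, {})
            return sorted(counts, key=counts.get, reverse=True)[:top_n]

        def correct(self, key: str, rejected: str, accepted: str) -> None:
            """The user rejected one suggested tag and supplied another."""
            if self._counts[key][rejected] > 0:
                self._counts[key][rejected] -= 1
            self._counts[key][accepted] += 1

    learner = TagLearner()
    learner.observe("Karlsruhe/ISWC2008", ["conference", "karlsruhe"])
    print(learner.suggest("Karlsruhe/ISWC2008"))   # ['conference', 'karlsruhe']

The payoff matches the last bullet: when many photos share one place or event key, a handful of confirmations lets all remaining photos be annotated in one batch.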

Page 12: SOMNet

[Framework diagram, repeated from Page 5]

• Better support for choice of terms
• Encourage case entry when the doctor has recently been dealing with the relevant information
• Use a KB of previous cases to support autocompletion
• Enable "scratching one's own itch" while browsing
• Intelligently use relevant data on doctors' own computers
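"Use a KB of previous cases to support autocompletion" could be realized with nothing more than prefix matching over terms harvested from earlier case entries. A minimal Python sketch (names and data hypothetical, not SOMNet's actual interface):

    import bisect
    from typing import List

    class CaseTermIndex:
        """Autocompletion over terms harvested from previously entered cases:
        keeps a sorted, deduplicated term list and returns the terms that
        start with the prefix typed so far."""

        def __init__(self, previous_case_terms: List[str]) -> None:
            self._terms = sorted({t.lower() for t in previous_case_terms})

        def complete(self, prefix: str, limit: int = 10) -> List[str]:
            prefix = prefix.lower()
            start = bisect.bisect_left(self._terms, prefix)
            matches: List[str] = []
            for term in self._terms[start:]:
                if not term.startswith(prefix) or len(matches) == limit:
                    break
                matches.append(term)
            return matches

    index = CaseTermIndex(["myocardial infarction", "migraine", "meningitis"])
    print(index.complete("mi"))   # ['migraine']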

Page 13: Collaborative IR Augmentation

[Framework diagram, repeated from Page 5]

• Suggest mappings on the basis of previous learning
• Facilitate contribution once a gap has been noticed
• Use machine learning to exploit the KB of existing keyword queries and formal representations
• Support testing and debugging so that minimal user effort is required
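"Suggest mappings on the basis of previous learning" might, for example, be bootstrapped from the KB of keyword queries that have already been mapped by hand. The Python sketch below (purely illustrative, with invented names) scores candidate formal concepts by token overlap with those past queries:

    from collections import defaultdict
    from typing import Dict, List, Tuple

    def suggest_concepts(query: str,
                         mapped_queries: List[Tuple[str, str]],
                         top_n: int = 3) -> List[str]:
        """Rank formal concepts for a new keyword query by the token overlap
        between that query and previously mapped queries (the existing KB of
        keyword queries and formal representations)."""
        q_tokens = set(query.lower().split())
        scores: Dict[str, int] = defaultdict(int)
        for past_query, concept in mapped_queries:
            overlap = len(q_tokens & set(past_query.lower().split()))
            if overlap:
                scores[concept] += overlap
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

    kb = [("cheap flights rome", "ex:FlightOffer"), ("rome hotels", "ex:Accommodation")]
    print(suggest_concepts("hotels near rome airport", kb))
    # ['ex:Accommodation', 'ex:FlightOffer']

A real system would replace token overlap with a trained model, but even this naive ranking shows how the existing KB lets each new contribution start from a suggestion rather than from scratch.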

Page 14: Constitution-Based Game

[Framework diagram, repeated from Page 5]

• Make sure that the game is actually fun!

Page 15: Inverse Search

[Framework diagram, repeated from Page 5]

• KB of aggregated information needs
• Support batch export of subsets of private data
• Algorithms for recommending information to be made public
• Facilitate export when private data is originally added
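"Algorithms for recommending information to be made public" could start from something as simple as matching a user's private items against the KB of aggregated, so far unmet information needs. A sketch under that assumption, with invented data:

    from typing import Dict, List, Set, Tuple

    def recommend_for_export(private_items: Dict[str, Set[str]],
                             unmet_needs: Dict[str, int]) -> List[Tuple[str, int]]:
        """Score each private item by how much aggregated demand (counts of
        information needs the community could not yet satisfy) its keywords
        would cover, and list candidates for export, highest demand first."""
        scored: List[Tuple[str, int]] = []
        for item_id, keywords in private_items.items():
            demand = sum(count for need, count in unmet_needs.items() if need in keywords)
            if demand:
                scored.append((item_id, demand))
        return sorted(scored, key=lambda pair: pair[1], reverse=True)

    items = {"doc1": {"ontology", "alignment"}, "doc2": {"holiday", "photos"}}
    needs = {"alignment": 12, "reasoner": 7}
    print(recommend_for_export(items, needs))   # [('doc1', 12)]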

Page 16: Community Motivation Gap

• Spread the work around so that each person needs to do only a bit
  – Collaborative IR augmentation
    » "collaborative"
• Provide individual benefit as well as community benefit
  – Collaborative IR augmentation
    » "immediate"

Page 17: Community Motivation Gap

• Provide quick confirmation of contribution and quick benefit
  – Collaborative IR augmentation
    » "immediate"
  – SOMNet
    » People want to know their contributions are being taken seriously

Page 18: Community Motivation Gap

• Ensure that even incremental contributions can be utilized
  – Collaborative IR augmentation
    » "incremental", "partial"
• Show what contributions are needed (and which ones the current user is especially qualified to provide)
  – Inverse search

Page 19: Community Motivation Gap

• Provide a fun game in which people do the desired work
  – Constitution-based game
• Protect privacy
  – SOMNet
    » People wanted to avoid revealing gaps in knowledge
  – Inverse search
    » Encourage people to move knowledge from private to public when it's needed
• Publicize (and maybe publicly evaluate) contributions