Interfacing with Virtual Worlds

TRANSCRIPT

1. Interfacing with Virtual Worlds: An Introduction to MPEG-V
Christian Timmerer, Klagenfurt University (UNIKLU), Faculty of Technical Sciences (TEWI), Department of Information Technology (ITEC), Multimedia Communication (MMC)
http://research.timmerer.com | http://blog.timmerer.com | mailto:christian.timmerer@itec.uni-klu.ac.at
Authors: Christian Timmerer, Jean Gelissen, Markus Waltl, and Hermann Hellwagner
Slides available at http://www.slideshare.net/christian.timmerer

2. Outline

  • Introduction
  • Part 1: System Architecture
  • Overview of MPEG-V Parts 2 and 4
  • Part 3: Sensory Information
    • Concept
    • Sensory Effect Description Language
    • Sensory Effect Vocabulary + Usage Examples (cf. paper)
  • Conclusions
  • (Demo Video)

3. Introduction

  • Multi-user online virtual worlds (NVEs, MMOGs) have reached mainstream popularity
    • e.g., World of Warcraft, Second Life, Lineage
  • Boost the real-world economy by connecting the virtual and the real world? Not only gaming:
    • Entertainment, education, training, getting information, social interaction, work, virtual tourism, etc.
  • For fast adoption of virtual worlds, we need a better understanding of their internal economics, rules, and regulations

4. Introduction (cont'd)

  • Finally, interoperability is achieved through standardization
  • MPEG-V (ISO/IEC 23005) = system architecture + associated information representations
  • Interoperability between virtual worlds
    • e.g., digital content providers of a virtual world: (serious) gaming, simulation, DVD
  • And between virtual worlds and the real world
    • e.g., sensors, actuators, vision and rendering, robotics (e.g., for revalidation), (support for) independent living, social and welfare systems, banking, insurance, travel, real estate, rights management

5. MPEG-V System Architecture

[Architecture diagram: "Media context and control," comprising Pt. 1: Architecture, Pt. 2: Control Information, Pt. 3: Sensory Information, Pt. 4: Avatar Information]

6. Part 2: Control Information

  • Sensory Device Capabilities as an extension of dia:TerminalCapability (a hedged sketch follows after this list)
    • unit, max/minIntensity, numOfLevels, delay, position
    • light (color, flash), heating, cooling, wind, vibration
    • scent, fog, water sprayer, color correction
    • kinesthetic, tactile
  • User Sensory Preferences as an extension of dia:UserCharacteristics
    • adaptability, max/minIntensity
    • light (color, flash), heating, cooling, wind, vibration
    • scent, fog, water sprayer, color correction
    • kinesthetic, tactile
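To make this concrete, here is a minimal sketch of what a capability description for a fan (wind device) might look like as an extension of MPEG-21 DIA's dia:TerminalCapability; the type name and attribute set are illustrative assumptions, not the normative MPEG-V Part 2 schema:

    <!-- Hypothetical capability description for a fan; the type name
         and attribute set are illustrative, not normative. -->
    <TerminalCapability xsi:type="WindCapabilityType"
                        unit="m/s"
                        minIntensity="0" maxIntensity="10"
                        numOfLevels="5" delay="100"
                        position="urn:mpeg:mpeg-v:01-SI-PositionCS-NS:center:*:front"/>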

Fundamental input to any control device (aka adaptation engine).

7. Part 4: Avatar Characteristics

  • Appearance
    • Contains the high-level description of the appearance and may refer to media containing the exact geometry and texture
  • Haptic Properties
    • Contains the high-level description of the haptic properties
  • Animation
    • Contains the description of a set of animation sequences the avatar is able to perform and may refer to several media resources containing the exact animation (geometric transformation) parameters
  • Communication Skills
    • Contains a set of descriptors providing information on the different modalities an avatar is able to use to communicate
  • Personality
    • Contains a set of descriptors defining the personality of the avatar
  • Control
    • Contains a set of descriptors defining possible place-holders for sensors on the body skeleton and face feature points

(A hypothetical skeleton of such a description follows below.)
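As a sketch, one element per descriptor category above; the element names are illustrative assumptions, not the normative MPEG-V Part 4 schema:

    <!-- Hypothetical avatar description skeleton; element names mirror
         the six descriptor categories and are illustrative only. -->
    <Avatar>
      <Appearance/>           <!-- high-level appearance; may reference geometry/texture media -->
      <HapticProperties/>     <!-- high-level haptic properties -->
      <Animation/>            <!-- animation sequences the avatar can perform -->
      <CommunicationSkills/>  <!-- modalities the avatar can communicate with -->
      <Personality/>          <!-- personality descriptors -->
      <Control/>              <!-- sensor place-holders on body skeleton and face feature points -->
    </Avatar>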

8. Part 3: Sensory Information

  • Universal Multimedia Access (UMA)
    • Anywhere, anytime, on any device + technically feasible
    • Main focus on device and network connectivity issues
  • Universal Multimedia Experience (UME)
    • Takes the user into account
  • Multimedia Adaptation and Quality Models/Metrics
    • Single modality (i.e., audio, image, or video only) or a simple combination of two modalities (i.e., audio and video)
  • Triple user characterization model
    • Sensorial, e.g., sharpness, brightness
    • Perceptual, e.g., what/where is the content
    • Emotional, e.g., feeling, sensation
  • Ambient Intelligence
    • Additional light effects are highly appreciated for both audio and visual content
    • Calls for a scientific framework to capture, measure, quantify, judge, and explain the user experience

References:
  • F. Pereira, "A triple user characterization model for video adaptation and quality of experience evaluation," Proc. of the 7th Workshop on Multimedia Signal Processing, Shanghai, China, October 2005, pp. 1-4.
  • B. de Ruyter, E. Aarts, "Ambient intelligence: visualizing the future," Proceedings of the Working Conference on Advanced Visual Interfaces, New York, NY, USA, 2004, pp. 203-208.
  • E. Aarts, B. de Ruyter, "New research perspectives on Ambient Intelligence," Journal of Ambient Intelligence and Smart Environments, IOS Press, vol. 1, no. 1, 2009, pp. 5-14.

9. Concept of MPEG-V Sensory Information

  • Consumption of multimedia content may stimulate senses beyond vision and audition
    • Olfaction, mechanoreception, equilibrioception, thermoception, ...
  • Annotation with metadata providing so-called sensory effects that steer appropriate devices capable of rendering these effects
Giving her/him the sensation of being part of the particular media leads to a worthwhile, informative user experience.

10. Sensory Effect Description Language (SEDL)

  • XML Schema-based language for describing sensory effects
    • Basic building blocks to describe, e.g., light, wind, fog, vibration, and scent effects
    • MPEG-V Part 3, Sensory Information
    • Adopted MPEG-21 DIA tools for adding time information (synchronization)
  • Actual effects are not part of SEDL but defined within the Sensory Effect Vocabulary (SEV)
    • Extensibility: additional effects can be added easily without affecting SEDL
    • Flexibility: each application domain may define its own sensory effects
  • A description conforming to SEDL = Sensory Effect Metadata (SEM); a hedged skeleton follows after this list
    • May be associated with any kind of multimedia content (e.g., movies, music, Web sites, games)
    • Steers sensory devices like fans, vibration chairs, and lamps via an appropriate mediation device
  • Enhances the experience of the user
  • Worthwhile, informative user experience
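As a hedged illustration of a complete SEM document (the structure is summarized by the grammar on the next slide), a description could look roughly as follows; the root element name and the namespace URIs are assumptions, not taken verbatim from the standard:

    <!-- Hypothetical SEM document skeleton; root element name and
         namespace URIs are assumptions. -->
    <sedl:SEM xmlns:sedl="urn:mpeg:mpeg-v:2010:01-SEDL-NS"
              xmlns:sev="urn:mpeg:mpeg-v:2010:01-SEV-NS"
              xmlns:si="urn:mpeg:mpeg21:2003:01-DIA-XSI-NS"
              xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <sedl:DescriptionMetadata/>  <!-- optional metadata about the description -->
      <sedl:Effect xsi:type="sev:LightType" si:pts="0" intensity="0.5"/>  <!-- one timed effect -->
    </sedl:SEM>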

11. Sensory Effect Description Language (cont'd)

    SEM              ::= [DescriptionMetadata] (Declarations | GroupOfEffects | Effect | ReferenceEffect)+
    Declarations     ::= (GroupOfEffects | Effect | Parameter)+
    GroupOfEffects   ::= timestamp EffectDefinition EffectDefinition (EffectDefinition)*
    Effect           ::= timestamp EffectDefinition
    EffectDefinition ::= [activate] [duration] [fade] [alt] [priority] [intensity] [position] [adaptability]

12. Example

    <sedl:GroupOfEffects si:pts="3240000" duration="100" fade="15"
        position="urn:mpeg:mpeg-v:01-SI-PositionCS-NS:center:*:front">
      <sedl:Effect xsi:type="sev:WindType" intensity="0.0769"/>
      <sedl:Effect xsi:type="sev:VibrationType" intensity="0.56"/>
      <sedl:Effect xsi:type="sev:LightType" intensity="0.0000077"/>
    </sedl:GroupOfEffects>

The three effects (wind, vibration, light) start together at the group's presentation timestamp (si:pts) and share its duration, fade, and position attributes.

13. Conclusions

  • MPEG-V: Media Context and Control
    • Information exchange between virtual worlds
    • Information exchange between virtual and real worlds
    • Currently comprises four parts (more to come, e.g., reference software, conformance)
  • MPEG-V Part 3: Sensory Information
    • Annotation with metadata providing so-called sensory effects that steer appropriate devices capable of rendering these effects
    • Enhanced, worthwhile, and informative user experience, giving the user the sensation of being part of the actual media
  • Future work
    • Standardization: currently at CD level and going to FCD in October 2009
    • Research & development:
      • Optimized and efficient delivery framework for MPEG-V enabled content
      • New Quality of Service/Experience metrics
      • Mechanisms for (semi-)automatic generation of MPEG-V metadata
      • End-to-end reference implementation of MPEG-V

14. References

  • M. Waltl, C. Timmerer, and H. Hellwagner, "A Test-Bed for Quality of Multimedia Experience Evaluation of Sensory Effects," Proceedings of the First International Workshop on Quality of Multimedia Experience (QoMEX 2009), San Diego, USA, July 29-31, 2009.
  • C. Timmerer, J. Gelissen, M. Waltl, and H. Hellwagner, "Interfacing with Virtual Worlds," accepted for publication in the Proceedings of the 2009 NEM Summit, Saint-Malo, France, September 28-30, 2009.
  • C. Timmerer, "MPEG-V: Media Context and Control," 89th ISO/IEC JTC 1/SC 29/WG 11 (MPEG) Meeting, London, UK, June 2009. https://www-itec.uni-klu.ac.at/mmc/blog/2009/07/08/mpeg-v-media-context-and-control/
  • MPEG-V: http://www.chiariglione.org/mpeg/working_documents.htm#MPEG-V
  • MPEG-V reflector: http://lists.uni-klu.ac.at/mailman/listinfo/metaverse

15. Demo & Video

16. Thank you for your attention ... questions, comments, etc. are welcome
Ass.-Prof. Dipl.-Ing. Dr. Christian Timmerer
Klagenfurt University, Department of Information Technology (ITEC)
Universitätsstrasse 65-67, A-9020 Klagenfurt, AUSTRIA
[email_address]
http://research.timmerer.com/
Tel: +43/463/2700 3621
Fax: +43/463/2700 3699
Copyright: Christian Timmerer
