TRANSCRIPT
Interactive Explainers in 10 years
ABC R+D: DEMO
R+D 2016
CARS EXPLAINERS IN 10 YEARS
In this rapid prototype, we show how a user takes a more ‘organic’ learning journey with something they’re interested in, via their virtual assistant.
Using voice commands, they can ask any question, and have a more natural conversation with their assistant to gain deeper understanding. Their highly advanced, intelligent assistant feels more like a friend or companion.
Car Context:
Imagine a driver on a morning commute wants to find out more about a particular story or issue. They engage in a seamless conversation with their assistant, who effortlessly provides summaries and snippets from content in shorter, more conversational, bite-sized chunks. In this future, assistants will be able to serve up much more dynamically generated content - e.g. audio segments and story summaries could be automatically created.
Relevance for ABC content makers
> when a user can ask any question and have a discussion with a virtual assistant, how will ABC content be surfaced and incorporated into the conversation? And how will this sound and feel?
> how might ABC content makers from across divisions work together to facilitate this kind of explainer experience?
> advanced voice navigation and interaction - shows how virtual assistants will facilitate ‘conversations’, as opposed to being limited to voice commands. Their listening and learning capabilities will be very powerful.
> the ABC could eventually simulate conversations with ABC personalities thanks to voice donation technology - e.g. ask Costa from Gardening Australia about gardening!
> this kind of interaction model works well for more complex stories and issues, and offers a new way to showcase the breadth of the ABC’s content.
What is this?
Timeline: 2011 · 2016 · 2017 · 2018 · 2021 · 2026
Learning to listen: we allow people to talk to us - basic commands and simple questions.
Speech recognition improves: coupled with natural language understanding, communication opens up from basic commands.
Building understanding and a relationship: we begin to learn about user context and preferences, starting to recognise patterns.
Anticipating audience needs: we can facilitate highly personal journeys and more conversational interactions with our content.
Seamless, natural conversations: you can converse with your virtual assistant, who now feels more like a friend or companion.
Prototypes and projects along the timeline:
NIGEL: Q&A with a podcast via a virtual assistant
SHORT & CURLY: interactive podcast with voice comments
SOUND DEMO: voice-activated news headlines
IN UR FACE: voice and emotion detection
ABC Newsbot: Facebook trial
Interactive audio explainers: IN FIVE YEARS
Interactive audio explainers: IN TEN YEARS (this demo)
Potential user scenarios in 10 years' time
The virtual assistant and audience aren't concerned about knowing the source for each answer
i.e. they're happy to just have a conversation and hear what their assistant tells them. In this scenario, ABC content might not come up at all; it depends on how easy it is for the assistant to find and interpret it, and/or whether it knows the user likes ABC content
The assistant already knows preferred content sources
e.g. "Always give me ABC content first"; "give me ABC for local and Australian stories"
It is easy for the audience to request a particular source for all answers
e.g. "What can the ABC tell me about the recent shark attacks in NSW?"; "What can ABC Sydney tell me about…"
Audience members can ask for perspectives from multiple sources
e.g. "What does the ABC have on this?"; "How is the Guardian covering this? What about the BBC… NYT?"
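The scenarios above amount to a source-resolution step inside the assistant. As a minimal sketch, assuming a purely hypothetical preference format (no real assistant API is implied), the logic could look like:

```python
# A minimal sketch of how an assistant might resolve content sources for a
# query, based on the scenarios above. All names and rules are invented.

def resolve_sources(query, preferences, requested_sources=None):
    """Return the ordered list of sources to consult for a query."""
    # Scenario: the audience explicitly names one or more sources
    # ("what can the ABC tell me about...", "how is the Guardian covering this?")
    if requested_sources:
        return list(requested_sources)

    # Scenario: the assistant already knows preferred sources,
    # e.g. "give me ABC for local and Australian stories".
    for rule in preferences.get("topic_rules", []):
        if rule["topic"] in query.lower():
            return rule["sources"]

    # Scenario: no stated preference - fall back to a default ranking
    # (whatever the assistant finds easiest to locate and interpret).
    return preferences.get("default", ["any"])


prefs = {
    "topic_rules": [{"topic": "local", "sources": ["ABC Sydney", "ABC"]}],
    "default": ["ABC"],
}

print(resolve_sources("any local news on shark attacks?", prefs))
print(resolve_sources("shark attacks", prefs, requested_sources=["Guardian", "BBC"]))
```

The point of the sketch is that an explicit request always wins, a standing preference applies next, and only then does the assistant fall back to its own defaults - which is exactly where ABC content may or may not surface.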
How do virtual assistants start anticipating our needs?
As virtual assistants become more intelligent, they will learn more and more about the context of their users.
How much time do you have? = might just quickly want top-level information, or alternatively have time for more detail
What are you doing? e.g. driving in frantic traffic, stuck in slow traffic = might have limited attention, or be bored and want to fill time
The kind of information you're seeking = looking for straight facts versus analysis and opinion; local and Australian stories versus international stories
The kind of attachment you have to media brands like the ABC = in general, to a particular program (e.g. RN Breakfast) or a particular network (e.g. ABC 702)
How much you trust virtual assistants and tech companies
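The context signals listed above could feed a simple response planner. This is an illustrative sketch only; the field names and thresholds are invented for the example, not drawn from any real assistant:

```python
# A sketch of the context signals listed above, and how an assistant might
# use them to shape a response. Field names and thresholds are invented.

from dataclasses import dataclass

@dataclass
class UserContext:
    minutes_available: int   # how much time do you have?
    activity: str            # e.g. "driving_frantic", "stuck_in_traffic"
    seeking: str             # "facts" versus "analysis"
    brand_attachment: float  # 0..1 attachment to e.g. the ABC
    trusts_assistant: bool   # trust in assistants / tech companies

def plan_response(ctx: UserContext) -> dict:
    # Limited attention (frantic traffic) or little time -> top-level summary;
    # stuck in slow traffic with time to fill -> longer, more detailed audio.
    brief = ctx.minutes_available < 5 or ctx.activity == "driving_frantic"
    return {
        "depth": "summary" if brief else "detailed",
        "style": "straight_facts" if ctx.seeking == "facts" else "analysis",
        "prefer_abc": ctx.brand_attachment > 0.5,
    }

plan = plan_response(UserContext(3, "driving_frantic", "facts", 0.8, True))
```

Even this toy model shows the trade-off: the richer the context the assistant holds, the more personal the journey it can offer, but the more the audience's trust in the assistant matters.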
What if… the Assistant doesn’t mention sources
Content example: Shark attacks v1 = Assistant doesn’t mention sources
Where and how might the ABC fit in this scenario?
Nearly all of the answers in the example user journey above have been sourced from ABC content - we have a lot of excellent information about shark attacks and shark behaviour!
What would it take for virtual assistants to be able to interpret ABC content and then select the best snippets to answer user questions? How will we point assistants in the right direction?
How will virtual assistants know if a user likes ABC content? Will there be anything the ABC needs to do to help? (e.g. provide data on what we already know about the behaviour and preferences of our users?) What kinds of partnerships do we need to consider with tech companies so that ABC content is prioritised?
What if… the Assistant knows the user prefers ABC
Content example: Shark attacks v2 = Assistant knows the user prefers ABC
Even if virtual assistants know when users prefer the ABC, how will we ensure they are serving up the very best, most relevant content we have?
Today we are focusing on segmented audio output and we know we need better metadata - but how else can we ready ourselves for this kind of future?
How do we want the ABC to sound in this kind of experience? How will it feel combining a mixture of ABC segments with various tones and styles in a single explainer?
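"Better metadata" for segmented audio could look something like the record below. The schema, field names, and matching rule are purely illustrative assumptions, not an existing ABC format - the point is only that per-segment metadata is what would let an assistant find and stitch relevant snippets:

```python
# An illustrative metadata record for one audio segment, of the kind an
# assistant could use to find and combine relevant snippets. The schema
# and all field names are invented for this sketch.

segment = {
    "id": "rn-breakfast-seg03",
    "program": "RN Breakfast",
    "network": "RN",
    "topics": ["shark attacks", "shark behaviour"],
    "location": "NSW",
    "duration_seconds": 45,
    "tone": "news",          # versus "conversational" or "analysis"
    "summary": "Expert explains recent shark activity off the NSW coast.",
    "transcript_url": None,  # would point at a full transcript
}

def matches(seg, topic, max_seconds):
    """Would this segment suit a short, conversational answer on a topic?"""
    return topic in seg["topics"] and seg["duration_seconds"] <= max_seconds

print(matches(segment, "shark attacks", 60))  # a 45-second segment fits
```

Fields like "tone" speak directly to the question above about how a mixture of ABC segments with various tones and styles would feel in a single explainer.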
What if… the Assistant uses ABC Local content as the first hook
Content example: Shark attacks v3 = ABC Local content as the first hook
How might the ABC help virtual assistants generate explainers with local stories and community sentiment as the first hook into our broader coverage of a topic or issue?
How will content makers across the ABC work together to facilitate this kind of experience?
How will this kind of experience sound and feel?
How else might hyperlocal, location-based content experiences be different with advanced virtual assistants and conversational interactions?
For ABC, I think the trick is in anticipating the questions that are most appropriate to the particular Australian context. Also, what about answering the obscure questions that explore the most far-fetched scenarios that no one else will answer?
Cam, 43
AUDIENCE FEEDBACK
It's not always going to be questions. We're talking about an interaction... When that woman (in the Short & Curly Podcast) was saying 'my pets are chickens', and you think 'my pets are chickens too', you'd want to be able to say that to your assistant. Sometimes it's just going to be comments and if we're talking to each other we’ll all go 'that's nice you've got chickens' and so what, that's what conversation is! It's just an affirmation, a statement where you say ‘okay cool’.
Caroline, Presenter/Producer
ABC CONTENT MAKER COMMENT
Questions or comments?
Astrid Scott
Priscilla Davies
Nicolaas Earnshaw
Jo Szczepanska
Amy Nelson
Janelle Herrera
Anne Lin
Charlie Szasz
Email ABC R&D to get in touch.