How I hacked the Google Daydream Controller


Rome, December 11th 2016

ABOUT

I am Matteo Pisani, CTO and co-founder of Remoria VR (www.remoriavr.com), a startup committed to developing input devices for mobile virtual reality.

I am a creative, curious and inspired software developer with a hacking attitude and a strong disposition toward reverse-engineering. I have gained several years of experience in IoT and embedded systems development, and in bridging the gap between the physical reality and the digital world.

E-mail: [email protected]

LinkedIn: https://it.linkedin.com/in/matteopisani

www.remoriavr.com


ABSTRACT

Mobile virtual reality is growing rapidly. The Google Daydream platform was launched just last month, and its launch suggested that compelling VR experiences might become widely accessible to consumers sooner than expected. Today, solutions like smartphone + headset + Bluetooth controller are highly appreciated by developers, media and entertainment companies, but... there's one problem: compatibility. As announced, the Daydream controller binds only with a handful of Daydream-ready smartphones running Android 7.0 Nougat. Moreover, as reported by Clay Bavor (VP of Virtual Reality at Google), Google Daydream "is not currently compatible with iOS and won't be for several years, probably."

Since I like challenges, I decided to hack the Google Daydream controller using code, reverse-engineering skills and some math, to extend its compatibility to Apple iOS devices: it was a success.

The Google Daydream controller works via Bluetooth LE (Low Energy), but I wasn't able to discover it in the Bluetooth settings of my iPhone 5, so I used the BlueCap app (github.com/troystribling/BlueCap), which allows you to easily implement Central and Peripheral applications, serialize and deserialize messages exchanged with Bluetooth devices, and define reusable GATT profile definitions.

ANALYSIS

I had a look at the data available for each Service: there were known Services like Device Information and Battery, but I also found something interesting inside an unknown one, FE55:

As soon as I explored the first Characteristic, with UUID 00000001-1000-1000-8000-00805f9b34fb, and turned on Notifications, BlueCap started showing BLE packets. Waving the Daydream controller in the air, I could see the incoming data changing in real time. The same thing happened when touching the pad on top or randomly pressing the buttons.

According to the Bluetooth LE standard, each packet should weigh 20 bytes:

7be85b3ff13b48003bf1ffa00000000000000070

The packets' anatomy revealed that they were encoded and represented in hexadecimal notation. Behind the masked data lay the whole status of the controller, including accelerometer, gyroscope, magnetometer, touchpad, buttons and more.
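For reference, the notification payload arrives as raw bytes; a minimal JavaScript sketch (not from the original write-up) of turning such a 20-byte packet into the 40-character hexadecimal string shown above could look like this:

// Sketch: convert a 20-byte BLE notification payload (e.g. an ArrayBuffer
// delivered by a BLE library) into the 40-char hex string used in this write-up.
function bufferToHex(buffer) {
  var bytes = new Uint8Array(buffer), hex = "";
  for (var i = 0; i < bytes.length; i++) {
    // zero-pad each byte to two hex digits
    hex += ("0" + bytes[i].toString(16)).slice(-2);
  }
  return hex;
}

// Example with a hand-crafted (hypothetical) buffer:
var demo = new Uint8Array([0x7b, 0xe8, 0x5b, 0x3f, 0xf1]).buffer;
console.log(bufferToHex(demo)); // "7be85b3ff1"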

ENVIRONMENT

The first step was to set up a testing environment to facilitate all the debugging processes. I decided to start from scratch: I developed a sandbox with Apple Xcode (working on a MacBook Pro) and an iOS app (with some Objective-C) that included the CoreBluetooth/CoreBluetooth.h framework (developer.apple.com/reference/corebluetooth). Thanks to this, I could establish and manage communications and data flows over the Bluetooth GATT protocol.

After choosing the Service FE55 and requesting notifications for the Characteristic 00000001-1000-1000-8000-00805f9b34fb, I was able to get the data output flowing through the console:

Once the data was collected and properly decoded, I decided to represent it in a 3D view. So I migrated all the iOS native code to a hybrid environment, wrapping it all into a Cordova plugin: thanks to this process, I was able to save time and perform several optimizations.
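The write-up doesn't show the JavaScript side of that Cordova plugin; as a rough illustration only (the object, service and action names below are hypothetical, not the actual plugin), such a bridge usually reduces to a thin wrapper around cordova.exec:

// Hypothetical JavaScript bridge for a Cordova plugin wrapping the native
// CoreBluetooth code; service and action names are illustrative only.
var DaydreamPlugin = {
  startNotifications: function (onPacket, onError) {
    cordova.exec(
      onPacket,               // success callback, invoked once per BLE packet
      onError,                // error callback
      'DaydreamController',   // native service name (assumed)
      'startNotifications',   // native action name (assumed)
      []
    );
  }
};

// Usage: each packet (as a hex string) feeds the decoding steps shown below.
DaydreamPlugin.startNotifications(function (rawdata) {
  console.log('packet:', rawdata);
}, function (err) {
  console.error(err);
});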

The use of JavaScript reduced the overall complexity, sped up the experiments, and allowed me to improve the data visualization, also thanks to embedding the amazing A-Frame WebGL framework (aframe.io) inside an HTML5+CSS3 view.

(the environment stack)


With the use of Blender, the open-source 3D creation suite, I was able to edit a bulky Google Daydream controller model found on the internet, making it suitable for my purpose. After the editing, I exported it to an A-Frame compliant format (*.obj).

In a few lines of code, I was able to finish the whole setup, and this was the result:

REVERSING

Now for the hardest part: understanding the raw data. Starting from an average knowledge of hexadecimal-to-decimal conversion, I split up the 40 characters into 20 chunks of 2 characters, then converted them to binary:

7b e8 5b 3f f1 3b 48 00 3b f1 ff a0 00 00 00 00 00 00 00 70

I just wanted to give it a try, so I tested an online hexadecimal-to-decimal converter, and this was the output.

Later, I also tried the Decimal to Binary converter.

Bringing everything to JavaScript:

var rawdata= "7be85b3ff13b48003bf1ffa00000000000000070", bitchain = "";for(var i = 2; i <= 40; i+=2) bitchain += parseInt(rawdata.slice(i-2,i),16).toString(2);console.log(bitchain,'length: ' + bitchain.length);

The expected output was a 160-bit-long chain (8 bits * 20 chunks) for each packet:

1111011111010001011011111111111100011110111001000011101111110001111111111010000000000001110000 length: 94

I got only 94 bits instead of the expected 160, so I realized that something was wrong. After digging into the issue, I found that the hexadecimal values converted to bits sometimes produced results shorter than 8 digits; in other words, they were not stuffed into groups of 8: zero-padding to 8 solved all the problems.

Once I added the zeropad method and changed the code to:


function zeropad(n, width, z) {
  z = z || '0';
  n = n + '';
  return n.length >= width ? n : new Array(width - n.length + 1).join(z) + n;
}

var rawdata= "7be85b3ff13b48003bf1ffa00000000000000070", bitchain = "";for(var i = 2; i <= 40; i+=2) bitchain += zeropad(parseInt(rawdata.slice(i-2,i),16).toString(2),8);console.log(bitchain,'length: ' + bitchain.length);

this time the result was as expected:

0111101111101000010110110011111111110001001110110100100000000000001110111111000111111111101000000000000000000000000000000000000000000000000000000000000001110000 length: 160

After a couple of sleepless nights, I started to give a shape to this mesmerizing bitchain: a comprehensive knowledge of IMU (Inertial Measurement Unit) and MEMS (Micro-Electro-Mechanical Systems) sensors, paired with great patience and good observation skills, helped me figure out the sense of what was happening.

The crucial points were:

- observe all the oscillating bits (see the sketch below);
- play a little more with the offsets.
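As an illustration of the first point, a small sketch (not from the original write-up) that compares two consecutive bitchains makes the oscillating positions visible:

// Sketch: mark which bit positions changed between two consecutive packets.
// Bits that flicker while only one sensor is being excited (e.g. waving the
// controller vs. touching the pad) hint at where that sensor's field lives.
function diffBits(prev, next) {
  var changed = [];
  for (var i = 0; i < Math.min(prev.length, next.length); i++) {
    if (prev[i] !== next[i]) changed.push(i);
  }
  return changed;
}

var previousBitchain = null;
function onBitchain(bitchain) {
  if (previousBitchain) {
    console.log('changed bit offsets:', diffBits(previousBitchain, bitchain));
  }
  previousBitchain = bitchain;
}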

This allowed me to recognize, extract and categorize the values. I reported all of them below:

SENSORS (12 bits for the value + 1 bit for the sign)

// Note: the slice offsets refer to the 160-bit "bitchain" string built above,
// not to the 40-character hex string.
var gyroscope = {
  x: parseInt(bitchain.slice(14, 27), 2),
  y: parseInt(bitchain.slice(27, 40), 2),
  z: parseInt(bitchain.slice(40, 53), 2)
};

var magnetometer = {
  x: parseInt(bitchain.slice(53, 66), 2),
  y: parseInt(bitchain.slice(66, 79), 2),
  z: parseInt(bitchain.slice(79, 92), 2)
};

var accelerometer = {
  x: parseInt(bitchain.slice(92, 105), 2),
  y: parseInt(bitchain.slice(105, 118), 2),
  z: parseInt(bitchain.slice(118, 131), 2)
};

TOUCHPAD (8 bits for the value)

var touchpad = {
  x: bitchain.slice(131, 139),
  y: bitchain.slice(139, 147)
};

BUTTONS (1 bit for the value)

var buttons = {
  app: bitchain.slice(147, 148),
  home: bitchain.slice(148, 149),
  volumeUp: bitchain.slice(149, 150),
  volumeDown: bitchain.slice(150, 151),
  touchClick: bitchain.slice(151, 152)
};
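One detail the listing above glosses over is the sign bit: parseInt on the raw 13-bit slice yields an unsigned number. Assuming the sensor fields are two's complement (an assumption, not stated in the original), a signed read of the same fields could look like this:

// Sketch: read a 13-bit field as a signed value, assuming two's complement
// (the write-up only says "12 bits for the value + 1 bit for the sign").
function readSigned(bitchain, start, end) {
  var width = end - start;
  var value = parseInt(bitchain.slice(start, end), 2);
  // If the top (sign) bit is set, subtract 2^width to get the negative value.
  return value >= (1 << (width - 1)) ? value - (1 << width) : value;
}

// e.g. re-reading the gyroscope fields with the sign handled:
var gyroscopeSigned = {
  x: readSigned(bitchain, 14, 27),
  y: readSigned(bitchain, 27, 40),
  z: readSigned(bitchain, 40, 53)
};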

Once I achieved this goal, I tried to manipulate all this data to give a coherent orientation to the 3D Google Daydream controller model through the A-Frame canvas: unfortunately, the output on the screen was a tilting controller with meaningless movements.

<a-scene>
  <a-camera id="camera" position="0 0 10"></a-camera>
  <a-sky color="#4E4E4E"></a-sky>
  <a-entity id="daydream"
            obj-loader="src: url(./DayDream_Controller.obj); mtl: url(./DayDream_Controller.mtl);"
            position="0 0 0" rotation="0 0 0" scale="0.1 0.1 0.1">
  </a-entity>
</a-scene>

<script>
  document.querySelector('a-entity[id=daydream]')
    .setAttribute('rotation',
      GoogleDayDreamController.getZ() + ' ' +
      GoogleDayDreamController.getY() + ' ' +
      GoogleDayDreamController.getX());
</script>

By reversing some of the *.apk files of the Google VR Services (found inside the Google Pixel OS, and which allow native communication with the Google Daydream controller via BLE), I was able to get my hands on useful information.

Through reverse-engineering of the Android Java apps, using apktool, dex2jar and jd-gui to convert the *.apk files to .java, it was possible to:

- understand how a particular UI in an app is constructed;
- read AndroidManifest.xml, permissions, activities, intents etc. in the app;
- discover native libraries and images used in that app;
- find obfuscated code (the Android SDK, by default, uses the ProGuard tool, which shrinks, optimizes and obfuscates the code by removing unused code and renaming classes, fields and methods with semantically obscure names).

TOOLS

The tools I used:

ApkTool (from http://code.google.com/p/android-apktool/)

To extract AndroidManifest.xml and everything in the res folder (layout XML files, images, HTML files used in webviews, etc.), run the following command:

apktool d sampleAndroidApp.apk


It also extracts the .smali files of all the .class files, which are, however, difficult to read.

Dex2jar (from http://code.google.com/p/dex2jar/)

To generate a .jar file from the *.apk file (we then need JD-GUI to view the source code from this .jar), run the following command:

dex2jar sampleAndroidApp.apk

JD-GUI (from http://java.decompiler.free.fr/?q=jdgui)

It decompiles the .class files (obfuscated in the case of an Android app, while readable original code is obtained in the case of other .jar files); i.e., we get the .java back from the application. Just run the jd-gui executable for your OS, then File -> Open to view the Java code from the .jar or .class file.

In particular, I found interesting information inside:

- com.google.android.vr.home.apk
- com.google.vr.vrcore.apk

Collecting all my developer thoughts and making them fit together, I realized that the best solution was to use the AHRS (Attitude and Heading Reference System) calculation for JavaScript (npmjs.com/package/ahrs).

This calculates the attitude and heading for a device with all of the following sensors: magnetometer, gyroscope and accelerometer. The Madgwick or Mahony algorithms can be used to filter the data from these sensors in real time, obtaining great accuracy.

<script>
  var AHRS = require('ahrs');
  var madgwick = new AHRS({
    /*
     * The sample interval, in Hz.
     */
    sampleInterval: 60,
    /*
     * Choose from the `Madgwick` or `Mahony` filter.
     */
    algorithm: 'Madgwick',
    /*
     * The filter noise value, smaller values have
     * smoother estimates, but have higher latency.
     * This only works for the `Madgwick` filter.
     */
    beta: 0.4,
    /*
     * The filter noise values for the `Mahony` filter.
     */
    kp: 0.5,
    ki: 0
  });

  madgwick.update(
    gyroscope.x, gyroscope.y, gyroscope.z,
    accelerometer.x, accelerometer.y, accelerometer.z,
    magnetometer.x, magnetometer.y, magnetometer.z
  );
  var euler = madgwick.getEulerAnglesDegrees();
</script>

The getEulerAnglesDegrees method returns an object with the Euler angles (heading/yaw, pitch, roll), in degrees.

The returned object contains:

- heading is from north, going west (about the z-axis);
- pitch is from vertical, going forward (about the y-axis);
- roll is from vertical, going right (about the x-axis).

Finally, it was possible for me to set the model orientation to the right coordinates:

<script>
  document.querySelector('a-entity[id=daydream]')
    .setAttribute('rotation',
      euler.heading + ' ' +
      euler.pitch + ' ' +
      euler.roll);
</script>

"A picture is worth a thousand words" (think of a video!).

The result was brilliant, as you can see in the YouTube video below, which I recorded to show the potential of the entire hack:

www.youtube.com/watch?v=QKNWqBFlR1M

The responsiveness is extremely fluid: in terms of the PPS (packets per second) parameter, ~60 packets per second are enough to cover a VR game or a 3D experience as well.
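As a side note, the PPS figure is easy to verify with a simple counter (a sketch, not part of the original setup):

// Sketch: count how many controller packets arrive per second.
var packetCount = 0;

function onControllerPacket(rawdata) {
  packetCount++;
  // ...decode rawdata as described above...
}

setInterval(function () {
  console.log('PPS: ' + packetCount);
  packetCount = 0;
}, 1000);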


CONCLUSION

The scenarios that this hack opens up are various. Now that the secret sauce has been exposed and the compatibility extended to iOS devices, it is possible to replicate the job to include all the desktop platforms. This would help developers debug their own software in a desktop environment, without having to deploy an app on the smartphone every time. On the Android side, this hack will unleash the whole potential of the Daydream controller, as it would no longer be restricted to Android 7.0 Nougat.

In this perspective, it is possible to see the Daydream controller working with older versions of the Android OS. On the other hand, binding this controller to open-source platforms like Raspberry Pi or Arduino will extend the horizons of makers and creatives. Can you imagine using the Daydream controller to pilot your drone or your RC car, play a virtual drum set or maybe make some sounds with a virtual synth?
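As a pointer for the desktop scenario mentioned above, the same discovery step could be reproduced from Node.js; a sketch using the noble BLE library (an assumption for illustration only, this is not what the original hack used) might look like this:

// Sketch only: subscribing to the controller's FE55 characteristic from a
// desktop with the Node.js `noble` library (not used in the original hack).
// noble expects UUIDs as lowercase hex without dashes.
var noble = require('noble');

noble.on('stateChange', function (state) {
  if (state === 'poweredOn') noble.startScanning(['fe55'], false);
});

noble.on('discover', function (peripheral) {
  noble.stopScanning();
  peripheral.connect(function () {
    peripheral.discoverSomeServicesAndCharacteristics(
      ['fe55'],
      ['0000000110001000800000805f9b34fb'],
      function (err, services, characteristics) {
        var characteristic = characteristics[0];
        characteristic.subscribe();
        characteristic.on('data', function (data) {
          // data is a 20-byte Buffer: the same hex packets seen on iOS
          console.log(data.toString('hex'));
        });
      }
    );
  });
});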

www.remoriavr.com
