
COSC 4730

Basics of Google Glass development.

Basic idea

From https://developers.google.com/glass/develop/overview

Basic Idea (2)

• So we use everything we have learned so far, except it is only API 19.
– There is a note that the sensor APIs were frozen at 17 and that any changes in 18+ were not added.

Humor

Creating a glass project

• Create a new project
– Minimum and Target SDK Versions: 19
• There is only one Glass version, so the minimum and target SDK are the same.

– Compile with: Glass Development Kit Developer Preview [Android 4.4.2]

– Theme: None

Creating a glass project (2)

• AndroidManifest.xml file:
– You chose no theme, but ADT and Android Studio assign a theme automatically even if you specify none, so remove the android:theme property from your manifest after creating the project.
– Add for voice:
  <uses-permission android:name="com.google.android.glass.permission.DEVELOPMENT"/>
– Change the intent filter. Launcher is not used and we need voice:
  <intent-filter>
      <action android:name="com.google.android.glass.action.VOICE_TRIGGER" />
  </intent-filter>
  <meta-data android:name="com.google.android.glass.VoiceTrigger"
      android:resource="@xml/voice_trigger" />

Voice Trigger

• Pick a couple of words
– The Hello World project uses “Hello World”

• The full command would be “Ok Glass Hello World” and it will launch the hello world app.

– voice_trigger.xml file:
  <trigger keyword="@string/voice_trigger" />
– Since we are doing this correctly, the keyword itself lives in the strings.xml file (only a snippet of the file shown):

<!-- The voice trigger used for launching general demos. -->

<string name="voice_trigger">Hello World</string>

Creating a glass project (3)

• Remove the android-support-v4.jar
– Eclipse automatically includes it.
– You can then code any layout needed and it will display.
• helloGlass
– Remember though, you don’t touch widgets and that sort of thing. We use voice and the hardware sensors.

Example: HelloGlass
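The demo code itself is not reproduced in these notes. As a minimal sketch (the package, class, and layout names here are placeholders, not the actual helloGlass source), the activity is just a normal Android activity that sets a layout sized for the 640 × 360 Glass screen:

  // Hypothetical hello activity for Glass; only the manifest/voice-trigger setup above differs from a phone app.
  package edu.example.helloglass;              // placeholder package

  import android.app.Activity;
  import android.os.Bundle;

  public class HelloGlassActivity extends Activity {
      @Override
      protected void onCreate(Bundle savedInstanceState) {
          super.onCreate(savedInstanceState);
          // R.layout.activity_hello is assumed to be a simple layout (e.g. a TextView saying hello).
          setContentView(R.layout.activity_hello);
      }
  }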

Live Cards

• Live Card apps show up on the Glass timeline.
• They are updated (low or high frequency) via a service render.
– We also use menus. When the live card is no longer needed, the service is ended (or destroyed) via the stop menu.

https://developers.google.com/glass/develop/gdk/live-cards

Immersion

• Immersion apps display outside the timeline.
• The Hello Glass app is basically an immersion app and what we are used to.
– These are “great for experiences that require prolonged user attention.”
• Design items to keep in mind:
– Design your UIs for a 640 × 360 pixel screen, with a layout, custom 2D view, or OpenGL rendering.
– Don't rely on complex touch gestures or UI patterns. Instead use menus, voice, or simple touch events (tap, slide forward, slide backward, slide up).
• You can use more than one finger touch as well, but those tend to be complex.

– Create a 50 × 50 pixel icon and specify it for the android:icon attribute of the <activity> element in your Android manifest. Also specify text for android:label. This allows a voice or touch menu item that is associated with multiple Glassware to show your Glassware's name and icon as an option.

– Specify android:immersive="true" for your <activity> element to give immersions focus after a screen sleeps and wakes up.

https://developers.google.com/glass/develop/gdk/immersions

Immersion and custom view

• Set up the project for Glass.
– Pretty much the same as normal.
– It is demoed with the AndGame code, which is called GlassInvaders.
• There were some issues with gestures, which are now handled in MainActivity instead of the view as before.
– Uses slide forward and backward to move.
– Tap to fire.
– Also the voice system can be used.

Glass Invaders

(Screenshots: full size, 200 px smaller, bigger font.)

Gestures

• Swipe forward, back, tap, and double tap.
– Declare a GestureDetector and create it in onCreate:
  GestureDetector gestureDetector = new GestureDetector(context);
  // Create a base listener for generic gestures
  gestureDetector.setBaseListener(new GestureDetector.BaseListener() {
      @Override
      public boolean onGesture(Gesture gesture) {
          // check for gesture == Gesture.TAP, Gesture.TWO_TAP,
          // Gesture.SWIPE_RIGHT, or Gesture.SWIPE_LEFT
          return false;
      }
  });

Gestures (2)

gestureDetector.setFingerListener(new GestureDetector.FingerListener() {
    @Override
    public void onFingerCountChanged(int previousCount, int currentCount) {
        // do something on finger count changes
    }
});

gestureDetector.setScrollListener(new GestureDetector.ScrollListener() {
    @Override
    public boolean onScroll(float displacement, float delta, float velocity) {
        // do something on scrolling
        return false;
    }
});
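One piece the snippets above leave out is how the detector ever receives touchpad events. A rough sketch of the full wiring in an activity (assuming the GDK pattern of forwarding generic motion events; everything except the com.google.android.glass.touchpad classes is a placeholder):

  import android.app.Activity;
  import android.os.Bundle;
  import android.view.MotionEvent;
  import com.google.android.glass.touchpad.Gesture;
  import com.google.android.glass.touchpad.GestureDetector;

  public class GestureActivity extends Activity {
      private GestureDetector gestureDetector;

      @Override
      protected void onCreate(Bundle savedInstanceState) {
          super.onCreate(savedInstanceState);
          gestureDetector = new GestureDetector(this);
          gestureDetector.setBaseListener(new GestureDetector.BaseListener() {
              @Override
              public boolean onGesture(Gesture gesture) {
                  if (gesture == Gesture.TAP) {
                      // fire, select, etc.
                      return true;
                  }
                  return false;
              }
          });
      }

      // Forward touchpad events to the detector; otherwise none of the listeners are called.
      @Override
      public boolean onGenericMotionEvent(MotionEvent event) {
          return gestureDetector != null && gestureDetector.onMotionEvent(event);
      }
  }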

Voice

• In the app you can also respond to voice pretty easily.
• NOTE: The system requires the user to say “OK Glass” before every command.
– This is not free-form voice recognition, which you can still do just like in any Android app.
• In onCreate and before setContentView:
  getWindow().requestFeature(WindowUtils.FEATURE_VOICE_COMMANDS);

Voice (2)

• Now it works off the same functions/idea as a menu does:
  public boolean onMenuItemSelected(int featureId, MenuItem item) {
      if (featureId == WindowUtils.FEATURE_VOICE_COMMANDS) {
          if (item.getItemId() == R.id.menuitemid) {
              // do something
          }
          return true;
      }
      return super.onMenuItemSelected(featureId, item);
  }
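The other half of the wiring is supplying the menu resource for the voice feature. A sketch, assuming the usual onCreatePanelMenu pattern, added to the same activity that requested FEATURE_VOICE_COMMANDS (R.menu.voice_menu is a placeholder for whatever menu file holds the voice commands):

  @Override
  public boolean onCreatePanelMenu(int featureId, Menu menu) {
      if (featureId == WindowUtils.FEATURE_VOICE_COMMANDS) {
          getMenuInflater().inflate(R.menu.voice_menu, menu);  // placeholder resource name
          return true;
      }
      return super.onCreatePanelMenu(featureId, menu);
  }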

Immersion and openGL

• Set up the project for Glass.
• Then use OpenGL just like before.
• See the openGL demo for the example.
• There is also a LiveCardOpenGL version.

OpenGL picture

Menu

• Almost all of the menu system is just like Android, with a couple of exceptions.
• First, the menu system is not automatic. We have to start it up when the user taps.
– So we have to include this code in the activity we want a menu in:
  @Override
  public boolean onKeyDown(int keyCode, KeyEvent event) {
      if (keyCode == KeyEvent.KEYCODE_DPAD_CENTER) {
          openOptionsMenu();
          return true;
      }
      return false;
  }
– Now the onCreateOptionsMenu and onOptionsItemSelected methods will be called.

Menu (2)

• For each menu item, provide a 50 × 50 pixel menu item icon. The menu icon must be white on a transparent background.
– Download the Glass menu icons for use or as an example.

• Use a short phrase that describes the action and is in sentence case. An imperative verb works well (for example, "Share" or "Reply all").

<item android:id="@+id/exit_menu"
    android:icon="@drawable/ic_done_50"
    android:title="@string/exit" />

https://developers.google.com/glass/develop/gdk/immersions#creating_and_displaying_a_menu
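Once openOptionsMenu() has been called, the standard Android callbacks take over. A brief sketch (R.menu.main is a placeholder for the menu resource containing the exit item above):

  @Override
  public boolean onCreateOptionsMenu(Menu menu) {
      getMenuInflater().inflate(R.menu.main, menu);
      return true;
  }

  @Override
  public boolean onOptionsItemSelected(MenuItem item) {
      if (item.getItemId() == R.id.exit_menu) {
          finish();   // leave the immersion
          return true;
      }
      return super.onOptionsItemSelected(item);
  }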

Card View

• The Card class creates a well-formed card given a set of properties, and it can be used anywhere you need a view (i.e. in a layout).
• We can also use a CardScrollView (and adapter), so it works like a ListView.
– But the user “scrolls” back and forth between the cards.

Card View (2)

• The card can have text, a footnote, and pictures:
  card.setText(...)
  card.setFootnote(...)
  card.setImageLayout(Card.ImageLayout.FULL)   // or Card.ImageLayout.LEFT
  card.addImage(R.drawable.X)
• Add the number of images needed.
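A minimal sketch of building one card and showing it from an activity (assuming the Developer Preview Card class and its getView() method; the strings and drawable are placeholders):

  Card card = new Card(this);
  card.setText("Hello card");
  card.setFootnote("cosc 4730");
  card.setImageLayout(Card.ImageLayout.LEFT);
  card.addImage(R.drawable.ic_launcher);
  setContentView(card.getView());   // getView() renders the card as an ordinary View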

Card View (3)

• A click (tap) listener for the CardScrollView:
  mCardScrollView.setOnItemClickListener(new AdapterView.OnItemClickListener() {
      @Override
      public void onItemClick(AdapterView<?> parent, View view, int position, long id) {
          // position is which card in the CardScrollView (and AdapterView parent)
      }
  });
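The scroll view needs an adapter to supply the cards. A sketch, assuming the GDK CardScrollAdapter from com.google.android.glass.widget (mCards, buildCards(), and mCardScrollView are placeholders):

  private class CardAdapter extends CardScrollAdapter {
      private final List<Card> mCards;

      CardAdapter(List<Card> cards) { mCards = cards; }

      @Override
      public int getCount() { return mCards.size(); }

      @Override
      public Object getItem(int position) { return mCards.get(position); }

      @Override
      public View getView(int position, View convertView, ViewGroup parent) {
          return mCards.get(position).getView();
      }

      @Override
      public int getPosition(Object item) { return mCards.indexOf(item); }
  }

  // Usage in onCreate:
  mCardScrollView.setAdapter(new CardAdapter(buildCards()));
  mCardScrollView.activate();
  setContentView(mCardScrollView);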

CardView example

Back to Live Cards.

• Low-Frequency Rendering
– It is a small set of views that don't need to be updated every second:
• FrameLayout, LinearLayout, RelativeLayout, GridLayout, AdapterViewFlipper, AnalogClock, Button, Chronometer, GridView, ImageButton, ImageView, ListView, ProgressBar, StackView, TextView, and ViewFlipper
• High-Frequency Rendering
– Custom 2D view (i.e. drawing) and OpenGL

Live Cards (2)

• A LiveCard app will have the following parts:
– MenuActivity (required)
• Used to stop the LiveCard at the very minimum.
• Provides a Stop menu item and other menu items as needed.
– Service
• This is actually what is called when the app is started.
• It provides the view to be displayed, either via a Renderer or a RemoteViews (low frequency).
– Renderer (high frequency only)
• OpenGL code and/or custom view.

– Other Java classes as needed.

Live Cards (3)

• Some notes
– Put all images in the drawable directory (not drawable-XXX).
– Live Cards are long-running services. You may have to manually uninstall the app to get it to stop if there are problems.
• Even if it crashes, sometimes the device restarts it.

Creating the LiveCard

• It is created in the service, in onStartCommand(...):
– mLiveCard = new LiveCard(this, LIVE_CARD_TAG);
– where private static final String LIVE_CARD_TAG = "GameStats";
– Inflate the view (as a RemoteViews):
  mLiveCardView = new RemoteViews(getPackageName(), R.layout.main_layout);
  // now you work on the layout via the variable, for example:
  mLiveCardView.setTextViewText(R.id.home_team, "UW Cowboys");
• You have to include a pending intent for the menu and then publish the card (otherwise it won't show):
  Intent menuIntent = new Intent(this, MenuActivity.class);
  menuIntent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK | Intent.FLAG_ACTIVITY_CLEAR_TASK);
  mLiveCard.setAction(PendingIntent.getActivity(this, 0, menuIntent, 0));
  mLiveCard.publish(PublishMode.REVEAL);
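Pulled together, the pieces above end up in a service roughly like this. This is only a sketch (the class, layout, and id names follow the snippets above; the onDestroy cleanup is an assumption about good practice, not the demo code):

  public class GameStatsService extends Service {
      private static final String LIVE_CARD_TAG = "GameStats";
      private LiveCard mLiveCard;
      private RemoteViews mLiveCardView;

      @Override
      public int onStartCommand(Intent intent, int flags, int startId) {
          if (mLiveCard == null) {
              mLiveCard = new LiveCard(this, LIVE_CARD_TAG);
              mLiveCardView = new RemoteViews(getPackageName(), R.layout.main_layout);
              mLiveCardView.setTextViewText(R.id.home_team, "UW Cowboys");
              mLiveCard.setViews(mLiveCardView);

              // Menu intent so the user can stop the card, then publish it.
              Intent menuIntent = new Intent(this, MenuActivity.class);
              menuIntent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK | Intent.FLAG_ACTIVITY_CLEAR_TASK);
              mLiveCard.setAction(PendingIntent.getActivity(this, 0, menuIntent, 0));
              mLiveCard.publish(LiveCard.PublishMode.REVEAL);
          }
          return START_STICKY;
      }

      @Override
      public void onDestroy() {
          // Unpublish so the card leaves the timeline when the service ends.
          if (mLiveCard != null && mLiveCard.isPublished()) {
              mLiveCard.unpublish();
              mLiveCard = null;
          }
          super.onDestroy();
      }

      @Override
      public IBinder onBind(Intent intent) {
          return null;   // not bound in this sketch
      }
  }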

Example

Creating the LiveCard (2)

• Also you would use a thread or AsyncTask to update the card with new/updated data.
– Again via the mLiveCardView variable.
– Then set the views, so the card will redraw:
  mLiveCard.setViews(mLiveCardView);
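For example, a periodic update could run on a Handler owned by the service (a sketch; the one-second interval, score view id, and mScore variable are placeholders):

  private final Handler mHandler = new Handler();
  private final Runnable mUpdateRunnable = new Runnable() {
      @Override
      public void run() {
          mLiveCardView.setTextViewText(R.id.home_score, String.valueOf(mScore));
          mLiveCard.setViews(mLiveCardView);   // re-push the RemoteViews so the card redraws
          mHandler.postDelayed(this, 1000);
      }
  };
  // start it from onStartCommand: mHandler.post(mUpdateRunnable);
  // stop it in onDestroy: mHandler.removeCallbacks(mUpdateRunnable);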

MenuActivity

• For any LiveCard, you are REQUIRED to have a menu with a Stop item.
– I.e. it is the only way to end the service.
– See the liveCardGameStats demo code for the MenuActivity and how to make it work correctly. A rough sketch follows.
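This is not the demo code itself, only a sketch of the usual pattern (open the menu as soon as the activity is attached, stop the service on the Stop item, and finish when the menu closes; the resource and id names are placeholders):

  public class MenuActivity extends Activity {

      @Override
      public void onAttachedToWindow() {
          super.onAttachedToWindow();
          openOptionsMenu();   // the activity only exists to host the menu
      }

      @Override
      public boolean onCreateOptionsMenu(Menu menu) {
          getMenuInflater().inflate(R.menu.live_card_menu, menu);
          return true;
      }

      @Override
      public boolean onOptionsItemSelected(MenuItem item) {
          if (item.getItemId() == R.id.stop_menu) {
              stopService(new Intent(this, GameStatsService.class));  // ends the live card's service
              return true;
          }
          return super.onOptionsItemSelected(item);
      }

      @Override
      public void onOptionsMenuClosed(Menu menu) {
          finish();   // nothing to show once the menu is dismissed
      }
  }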

TextToSpeech and Binder

• In LiveCards you have a service which holds all the running data.
– So if you want to use “read aloud” functions, then you must use Android's IPC (i.e. Binder) methods.
• Shown in the code and here.
– LiveCardStats2
– Use the normal TextToSpeech just like in normal Android. A sketch of the binder wiring is shown below.
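Not the LiveCardStats2 code, but a sketch of the binder half of such a live-card service (the onStartCommand/LiveCard code from the earlier sketch is omitted; class and method names are placeholders, and the TextToSpeech calls are the normal Android ones):

  public class GameStatsService extends Service {
      private TextToSpeech mSpeech;
      private final IBinder mBinder = new GameBinder();

      public class GameBinder extends Binder {
          public void readAloud(String text) {
              // Normal Android TextToSpeech, owned by the service.
              mSpeech.speak(text, TextToSpeech.QUEUE_FLUSH, null);
          }
      }

      @Override
      public void onCreate() {
          super.onCreate();
          mSpeech = new TextToSpeech(this, new TextToSpeech.OnInitListener() {
              @Override
              public void onInit(int status) { /* ready when status == TextToSpeech.SUCCESS */ }
          });
      }

      @Override
      public IBinder onBind(Intent intent) {
          return mBinder;   // a bound activity casts to GameBinder and calls readAloud()
      }
  }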

LiveCard and OpenGL

• So like before, use onStartCommand to start the renderer.
– And the MenuActivity and publish.
• The renderer, though, uses
  import com.google.android.glass.timeline.GlRenderer;  // which is a GLES 2.0 version (not 1.1)
• Otherwise, just like normal OpenGL.

Example

Sensors

• Glass supports the following sensors (used just like on Android):
– TYPE_ACCELEROMETER, TYPE_GRAVITY, TYPE_GYROSCOPE, TYPE_LIGHT, TYPE_LINEAR_ACCELERATION, TYPE_MAGNETIC_FIELD, TYPE_ORIENTATION (deprecated), TYPE_ROTATION_VECTOR
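Because the sensor API is stock Android, registration looks exactly as it does on a phone. For example:

  SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
  Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
  sensorManager.registerListener(new SensorEventListener() {
      @Override
      public void onSensorChanged(SensorEvent event) {
          // event.values[0..2] hold the x, y, z acceleration
      }
      @Override
      public void onAccuracyChanged(Sensor sensor, int accuracy) { }
  }, accel, SensorManager.SENSOR_DELAY_UI);
  // remember to unregisterListener(...) in onPause()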

Sensors (2)

• Location (GPS) is the same.
• The camera is also the same, but there is a camera button built into the top of the Glass.
– You can override it as needed:
  public boolean onKeyDown(int keyCode, KeyEvent event) {
      if (keyCode == KeyEvent.KEYCODE_CAMERA) {
          // return true here instead if you handled it yourself (e.g. taking video)
          return false; // let Glass do the work
      }
      return super.onKeyDown(keyCode, event);
  }

• Taking video and pictures is just like in Android.
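A sketch of taking a picture with the standard capture intent (the request code is arbitrary; check the GDK camera docs for the extras Glass returns with the result):

  private static final int TAKE_PICTURE_REQUEST = 1;

  private void takePicture() {
      Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
      startActivityForResult(intent, TAKE_PICTURE_REQUEST);
  }

  @Override
  protected void onActivityResult(int requestCode, int resultCode, Intent data) {
      if (requestCode == TAKE_PICTURE_REQUEST && resultCode == RESULT_OK) {
          // On Glass the picture's file path comes back in the intent extras
          // (see com.google.android.glass.media.Intents in the GDK docs).
      }
      super.onActivityResult(requestCode, resultCode, data);
  }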

References

• Using glass: https://support.google.com/glass/?hl=en#topic=4363345

• Using an older API, but somewhat useful:
– https://github.com/harrywye/gdkdemo

• Glass docs:
– https://developers.google.com/glass/develop/overview
– https://developers.google.com/glass/develop/patterns/index
– https://developers.google.com/glass/develop/gdk/index

Q&A