Tensorflow in production with AWS Lambda

Fabian Dubois
Tensorflow Tokyo - 2016-09-15


Page 1: Tensorflow in production with AWS Lambda

Tensorflow in production with AWS Lambda

Tensorflow Tokyo - 2016-09-15

Page 2: Tensorflow in production with AWS Lambda

THIS IS NOT A MACHINE LEARNING TALK

Page 3: Tensorflow in production with AWS Lambda

What Will You Learn?

▸ What you can do with your trained model: MLOps

▸ Why AWS Lambda can be a solution

▸ AWS Lambda with TensorFlow: how it works

Page 4: Tensorflow in production with AWS Lambda

About Me

▸ Freelance Data Products Developer and Consultant (data visualization, machine learning)

▸ Formerly at Orange Labs and Locarise (connected sensor data processing and visualization)

▸ Current side project: denryoku.io, an API for electric grid power demand and capacity prediction.

Page 5: Tensorflow in production with AWS Lambda

So you have trained a model? Now what?

Page 6: Tensorflow in production with AWS Lambda

It is a product, not an ad-hoc analysis

(Diagram: historical data feeds model selection and training to produce a trained model; the deployed model turns live data into predictions in production.)

▸ Needs to run on live data

Page 7: Tensorflow in production with AWS Lambda

Many things may need to be done in production

▸ Batch processing

▸ Stream / event processing

▸ A prediction API

▸ Update and maintain the model

Page 8: Tensorflow in production with AWS Lambda

This needs to be scalable, resilient

And also:

▸ maintainable

▸ versioned

▸ easy to integrate

ML + DevOps = MLOps

Page 9: Tensorflow in production with AWS Lambda

Why AWS Lambda may help.

Page 10: Tensorflow in production with AWS Lambda

Some deployment solutions

▸ TensorFlow Serving:

▸ forces you to create dedicated code if you have more than a pure TensorFlow model

▸ doesn't solve scalability issues

▸ forces you to manage servers

▸ Google CloudML

▸ Private Beta

▸ Likely limitations

Page 11: Tensorflow in production with AWS Lambda

Serverless architectures with AWS Lambda

▸ Serverless offering by AWS

▸ No lifecycle or shared state to manage => resilient

▸ Auto-scaling

▸ Pay only for actual running time: low cost

▸ No server or infrastructure management: reduced dev/devops cost

(Diagram: events trigger a lambda function, which produces output.)

Page 12: Tensorflow in production with AWS Lambda

Creating a function

Page 13: Tensorflow in production with AWS Lambda

Creating a function
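For reference, a Lambda function is just a Python module exposing a handler that receives an event and a context; a minimal sketch (the handler name is the AWS convention, the payload field is illustrative, not from the slides):

def lambda_handler(event, context):
    # 'event' carries the trigger payload, 'context' carries runtime information
    name = event.get('name', 'world')   # illustrative payload field
    return {'message': 'hello ' + name}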

Page 14: Tensorflow in production with AWS Lambda

Creating an “architecture” with triggers

Page 15: Tensorflow in production with AWS Lambda

Event / microbatch processing

▸ event-based: DB/stream update, new file on S3, webhook

▸ classify the incoming data or update your prediction
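A hedged sketch of the "new file on S3" case: the event layout is the standard S3 notification format, and classify() is a hypothetical helper wrapping the TF model:

import os
import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    results = []
    # S3 put notifications list the affected objects under 'Records'
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        local_path = '/tmp/' + os.path.basename(key)
        s3.download_file(bucket, key, local_path)
        # classify() is a hypothetical helper that restores the model and scores the file
        results.append(classify(local_path))
    return {'results': results}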

Page 16: Tensorflow in production with AWS Lambda

Batch processing

▸ cron scheduling

▸ let your function fetch some data and process it at regular intervals
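A minimal sketch of a scheduled handler; fetch_recent_data(), run_model() and store_results() are hypothetical helpers standing in for your data source, the TF model and your datastore:

def lambda_handler(event, context):
    # a cron/scheduled event carries no useful payload: the function pulls its own data
    data = fetch_recent_data()       # hypothetical: read from a DB, S3 or an API
    predictions = run_model(data)    # hypothetical: restore the TF model and predict
    store_results(predictions)       # hypothetical: write the results back
    return {'processed': len(data)}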

Page 17: Tensorflow in production with AWS Lambda

An API

▸ triggered on API call

▸ the returned response is your function's return value

▸ manage API keys, rate limits, etc. on AWS API Gateway
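With API Gateway's Lambda proxy integration (one possible setup, not necessarily the one used in the talk), the handler parses the request body and returns an HTTP-shaped response; predict() is a hypothetical helper wrapping the model:

import json

def lambda_handler(event, context):
    # with the proxy integration the request body arrives as a JSON string
    payload = json.loads(event.get('body') or '{}')
    predictions = predict(payload.get('data', []))   # hypothetical helper calling the TF model
    return {
        'statusCode': 200,
        'headers': {'Content-Type': 'application/json'},
        'body': json.dumps({'result': predictions}),
    }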

Page 18: Tensorflow in production with AWS Lambda

Tensorflow and AWS Lambda in practice.

Page 19: Tensorflow in production with AWS Lambda

How to save a TF model

▸ Use a saver object.

▸ It will save on disk:

▸ the graph model (‘filename.meta’)

▸ the variable values (‘filename’)

▸ Need to identify the tensors (input placeholders and outputs) that will be accessed later

import tensorflow as tf

saver = tf.train.Saver()

# ... do the training ...

# tag the tensors that will be needed at prediction time
tf.add_to_collection('output', pred)  # pred: the prediction op
tf.add_to_collection('input', x)      # x: the input placeholder
save_path = saver.save(sess, "model-name.ckpt")

Page 20: Tensorflow in production with AWS Lambda

How to restore a TF model

▸ Restore the graph and variable values with a saver object

import tensorflow as tf

filename = 'model-name.ckpt'

# rebuild the graph structure from the .meta file
saver = tf.train.import_meta_graph(filename + '.meta')
with tf.Session() as sess:
    # Restore variables from disk.
    saver.restore(sess, filename)
    pred = tf.get_collection('output')[0]
    x = tf.get_collection('input')[0]
    print("Model restored.")
    # Do some work with the model
    prediction = pred.eval({x: test_data})

Page 21: Tensorflow in production with AWS Lambda

Setting up AWS Lambda for Tensorflow

TensorFlow needs to be packaged for the Lambda execution environment (Amazon Linux):

1. Launch an EC2 instance and connect to it

2. Install TensorFlow in a virtualenv

3. Zip the installed libraries

# install compilation environment
sudo yum -y update
sudo yum -y upgrade
sudo yum groupinstall "Development Tools"

# create and activate virtual env
virtualenv tfenv
source tfenv/bin/activate

# install tensorflow
export TF_BINARY_URL=https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.10.0-cp27-none-linux_x86_64.whl
pip install --upgrade $TF_BINARY_URL

# zip the environment content
touch ~/tfenv/lib/python2.7/site-packages/google/__init__.py
cd ~/tfenv/lib/python2.7/site-packages/
zip -r ~/tf-env.zip . --exclude \*.pyc
cd ~/tfenv/lib64/python2.7/site-packages/
# (the lib64 site-packages are added to the same archive in the same way)

Page 22: Tensorflow in production with AWS Lambda

A Lambda function that calls TensorFlow

▸ Accepts a list of input vectors: multiple predictions

▸ Returns a list of predictions

import tensorflow as tf

filename = 'model-name.ckpt'

def lambda_handler(event, context):
    saver = tf.train.import_meta_graph(filename + '.meta')
    inputData = event['data']
    with tf.Session() as sess:
        # Restore variables from disk.
        saver.restore(sess, filename)
        # collection names must match the ones used when saving ('output' / 'input')
        pred = tf.get_collection('output')[0]
        x = tf.get_collection('input')[0]
        # Apply the model to the input data
        predictions = pred.eval({x: inputData})
    return {'result': predictions.tolist()}
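A quick way to exercise the handler before packaging is to call it directly with a hand-built event (the input values are illustrative):

# local smoke test of the handler above
event = {'data': [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]}
print(lambda_handler(event, None))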

Page 23: Tensorflow in production with AWS Lambda

Upload and test

▸ add your lambda function code and TF model to the environment zip.

▸ upload your function
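Both steps can be scripted with boto3; a hedged sketch, with illustrative bucket and function names, that pushes the zip through S3 (direct upload has a size limit) and then invokes the function once as a test:

import json
import boto3

s3 = boto3.client('s3')
lam = boto3.client('lambda')

# illustrative names: deployment bucket, zip key and Lambda function name
s3.upload_file('tf-env.zip', 'my-deploy-bucket', 'tf-env.zip')
lam.update_function_code(
    FunctionName='tf-predict',
    S3Bucket='my-deploy-bucket',
    S3Key='tf-env.zip',
)

# quick test: invoke with a small payload and print the prediction
response = lam.invoke(
    FunctionName='tf-predict',
    Payload=json.dumps({'data': [[0.1, 0.2, 0.3]]}),
)
print(json.loads(response['Payload'].read()))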

Page 24: Tensorflow in production with AWS Lambda

Where to put the model?

▸ with the function: easy, in particular when testing

▸ on S3: eases updates and allows multiple models to be used in parallel

▸ the function could be called with a model reference as an argument (sketched below)

"…lambda function

tensor flowlive dataprediction

$model
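A hedged sketch of the "model on S3" variant: the handler downloads the checkpoint files into /tmp (Lambda's writable scratch space) before restoring them; the bucket and key names are illustrative, and the checkpoint layout follows the save code shown earlier:

import os
import boto3
import tensorflow as tf

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # the model reference can come from the event, allowing several models in parallel
    bucket = event.get('model_bucket', 'my-model-bucket')   # illustrative default
    key = event.get('model_key', 'model-name.ckpt')
    local = '/tmp/' + os.path.basename(key)
    # fetch the checkpoint and its graph definition (as produced by the saver earlier)
    for suffix in ('', '.meta'):
        s3.download_file(bucket, key + suffix, local + suffix)
    saver = tf.train.import_meta_graph(local + '.meta')
    with tf.Session() as sess:
        saver.restore(sess, local)
        pred = tf.get_collection('output')[0]
        x = tf.get_collection('input')[0]
        predictions = pred.eval({x: event['data']})
    return {'result': predictions.tolist()}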

Page 25: Tensorflow in production with AWS Lambda

Caveats

▸ No GPU support at the moment

▸ model loading time: better to increase the function's RAM allocation (which also scales CPU) for fast API response times

▸ Python 2.7 only (Python 3 doable with more work)

▸ request a limit increase from AWS for more than 100 concurrent executions

Page 26: Tensorflow in production with AWS Lambda

Your turn!

Page 28: Tensorflow in production with AWS Lambda

References:

▸ AWS Lambda supported versions: http://docs.aws.amazon.com/lambda/latest/dg/current-supported-versions.html

▸ TensorFlow package URL: https://www.tensorflow.org/versions/r0.10/get_started/os_setup.html

"%……$