fpaupier / tensorflow-serving_sidecar

Serve machine learning models using tensorflow serving

License: MIT License

tensorflow tensorflow-tutorials tensorflow-serving object-detection object-detection-pipelines coco python docker

tensorflow-serving_sidecar's Introduction

Note: This repo is now archived and won't see further development

How to serve your machine learning model with tensorflow-serving

This repo offers a detailed how-to on serving your tensorflow models with tensorflow-serving. It can be followed as a step-by-step tutorial to serve your own tensorflow model, or other models with some adjustments.

Scope of the project

I do not cover training or exporting models here; the focus is solely on serving a model for inference. (Figure: pipeline overview)

What's in the box?

Two tutorials:

  1. A basic tutorial to quickly get you serving an object detection model with tensorflow-serving. Expect to get such an awesome object detection up and running in less than 10 minutes. (Figure: horse labelled with Faster R-CNN ResNet)

  2. An advanced tutorial to deploy your tensorflow-serving docker image on Google Cloud Platform. Unleash the power of GCP to build a scalable machine learning server running in a kubernetes cluster.

The proposed object detection model is here to get you started quickly; feel free to use your own for more fun!

The reader is expected to be familiar with tensorflow; knowing how to export a model for inference will be helpful (tensorflow-serving works with the SavedModel format). Knowing docker or kubernetes is not a prerequisite, since the commands used are simple and explained when needed.

These tutorials are highly inspired by the official tensorflow-serving documentation, with some extra tips and more detail on the installation process.

0. Install the project

The installation process is thoroughly described in docs/setup.md. It covers everything you need to do prior to serving a model with tensorflow-serving.
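In outline, and assuming you follow the Docker route described in docs/setup.md, the core prerequisite boils down to having the official tensorflow/serving image available locally. A minimal sketch (the untagged pull below grabs `latest`; in practice pin a version that matches your model):

```shell
# Pull the official TensorFlow Serving image from Docker Hub
docker pull tensorflow/serving

# Sanity check: the image should now appear in your local image list
docker images tensorflow/serving
```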

1. Serve your first model and perform inference with tensorflow-serving

Play Time!

To make sure the installation went smoothly, get your first inference result from the object detection model served by tensorflow-serving. Follow the first tutorial, docs/tf_server_local.md, to serve an object detection model on your machine.
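Schematically, local serving amounts to mounting your exported model into a serving container and hitting its REST API. The model name and host path below are placeholders for illustration; the tutorial's actual paths and the full client are in docs/tf_server_local.md.

```shell
# Start a serving container; the REST API listens on 8501 by default.
# The host path and MODEL_NAME are illustrative placeholders.
docker run -d --rm -p 8501:8501 \
  -v "$(pwd)/faster_rcnn_resnet:/models/faster_rcnn_resnet" \
  -e MODEL_NAME=faster_rcnn_resnet \
  tensorflow/serving

# Check that the model loaded and is AVAILABLE
curl http://localhost:8501/v1/models/faster_rcnn_resnet

# Run a prediction; the payload shape depends on the model's signature
# (the tutorial's client.py builds this JSON for you)
curl -X POST http://localhost:8501/v1/models/faster_rcnn_resnet:predict \
  -d '{"instances": [ ... ]}'
```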

2. Deploy your model on the cloud for high availability

Now that we have made sure our inference server works well locally, let's deploy it on the cloud at production scale. Dive into docs/tf_server_k8s.md.
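In broad strokes, the cloud deployment pushes the serving image to a container registry and runs it in a kubernetes cluster behind a load balancer. `PROJECT_ID` and the image tag below are pure placeholders; the real, complete steps are in docs/tf_server_k8s.md.

```shell
# Tag and push the serving image to Google Container Registry
# (replace PROJECT_ID with your GCP project)
docker tag tensorflow/serving gcr.io/PROJECT_ID/tf-serving:v1
docker push gcr.io/PROJECT_ID/tf-serving:v1

# Create a deployment and expose the REST port behind a load balancer
kubectl create deployment tf-serving --image=gcr.io/PROJECT_ID/tf-serving:v1
kubectl expose deployment tf-serving --type=LoadBalancer --port=8501
```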


Resources

Additional information you may find useful

Google proposes a managed solution, Google Cloud ML Engine, to serve your saved_model.pb models. I do not focus on it here, since the Google documentation is a more comprehensive source of information.

Pros

  • Deploy new models easily

Cons

  • Less flexibility
  • As of today, your saved_model.pb file is limited to 250 MB in size (that may change)

Credits

The object_detection directory comes from the tensorflow-models repository. It offers useful utility functions to tag the images returned by the model.

Feel free to investigate the models in the tensorflow-models repo: they are well documented and often come with useful tutorials.

tensorflow-serving_sidecar's People

Contributors: dependabot[bot], fai555, fpaupier

tensorflow-serving_sidecar's Issues

variables empty!

I have a question:

  1. Downloaded faster_rcnn_resnet101_coco_2018_01_28 from the model zoo, as you mentioned
  2. Renamed the directory to 001
  3. Ran TF Serving successfully!

But why do they need to rewrite exporter.py?
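For context, tensorflow-serving expects each model directory to contain numeric version subdirectories (hence the rename to 001), each holding a saved_model.pb and a non-empty variables/ folder; an empty variables/ directory usually means the export step did not write the checkpoint weights alongside the graph. A minimal expected layout, with illustrative names:

```
faster_rcnn_resnet/
└── 001/                      # numeric version directory
    ├── saved_model.pb        # graph definition and signatures
    └── variables/            # must not be empty
        ├── variables.data-00000-of-00001
        └── variables.index
```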

cannot import name 'plot_util'

While running the client.py script, I am getting the error: cannot import name 'plot_util'.
I do not see a plot_util.py file under object_detection/utils.

Let me know your views.
Thanks
