
Comments (6)

emacski commented on May 31, 2024

The name "model" is used in a couple of default options to the model server startup command.

If your run command looks similar to this:

docker run --rm --init -v my_saved_model:/models/my_saved_model emacski/tensorflow-serving:1.14.0

Then you should be able to override the defaults like so:

docker run --rm --init -v my_saved_model:/models/my_saved_model emacski/tensorflow-serving:1.14.0 --model_name=my_saved_model --model_base_path=/models/my_saved_model

from tensorflow-serving-arm.

NickHauptvogel commented on May 31, 2024

If I run it like this:
docker run --rm --init -v my_saved_model:/models/my_saved_model emacski/tensorflow-serving:1.14.0 --model_name=my_saved_model --model_base_path=/models/my_saved_model

I only get an error similar to the first one:
FileSystemStoragePathSource encountered a filesystem access error: Could not find base path /models/my_saved_model for servable my_saved_model

Which path do I have to add after -v? Is it the path on my local machine or inside the container? If inside, how do I get my model there in the first place (otherwise I get the error: No servable found)?

Regards


NickHauptvogel commented on May 31, 2024

The only way it works so far is the following, but that mounts the model from the host; I need to get it inside the container instead:

docker run -p 8501:8501 --mount type=bind,source=/tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu,target=/models/model -e MODEL_NAME=model -t emacski/tensorflow-serving:latest-linux_arm &


NickHauptvogel commented on May 31, 2024

To be precise, I only need to copy the model inside the container instead of mounting it via a local path. How do I achieve this?


emacski commented on May 31, 2024

To me, the easiest and most straightforward way is to bake the model into a new image using the default paths, with emacski/tensorflow-serving as the parent image.

(I opt for stateless and ephemeral containers whenever possible which is why I focus on images)

Create a Dockerfile with the following contents. (Note that COPY sources must live inside the build context, so first place the saved model directory, e.g. /tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu, next to the Dockerfile.)

FROM emacski/tensorflow-serving
COPY saved_model_half_plus_two_cpu /models/model

Then run something like this in the same directory as the Dockerfile:

docker build -t halfplus2cpu .

Now you can use your halfplus2cpu image just like the emacski/tensorflow-serving image, without any additional volume mapping.
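With the model baked in, the container can be started without any -v or --mount flags. A sketch, assuming you want to publish the REST API port 8501 as elsewhere in this thread (the halfplus2cpu image name comes from the build step above):

```shell
# run the self-contained image; the model was copied in at build time
# under the default path /models/model, so no volume mapping is needed
docker run --rm --init -p 8501:8501 halfplus2cpu
```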

Other Approaches / Examples

Custom Naming in Image

Same steps as above, but with the following contents in the Dockerfile (again, the saved model directory must sit inside the build context):

FROM emacski/tensorflow-serving
COPY saved_model_half_plus_two_cpu /saved_model_half_plus_two_cpu
CMD ["--model_name=saved_model_half_plus_two_cpu", "--model_base_path=/saved_model_half_plus_two_cpu"]
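Because this image overrides the default model name via CMD, a matching build-and-run pair might look like the following (a sketch; the halfplus2 tag is an arbitrary example name, not from the original thread):

```shell
# build the custom-named image from the Dockerfile above,
# then run it -- the CMD baked into the image supplies the
# --model_name and --model_base_path flags automatically
docker build -t halfplus2 .
docker run --rm --init -p 8501:8501 halfplus2
```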

Copy to Existing Container

You can actually copy files and folders into an existing container if this is what you really want. Just note that these steps must essentially be repeated every time a new container is created, since the filesystem modification doesn't exist in the image itself.

docker create --name tfserve -p 8501:8501 --init emacski/tensorflow-serving --model_name=saved_model_half_plus_two_cpu --model_base_path=/saved_model_half_plus_two_cpu
docker cp /tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu tfserve:/saved_model_half_plus_two_cpu
# start in background
docker start tfserve
# then tail logs
docker logs tfserve

# start in foreground
docker start -a tfserve
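Once the container is up, the served model can be sanity-checked over the REST API. A sketch assuming port 8501 is published as in the create command above, using the model name from that command; the half_plus_two test model computes y = 0.5*x + 2:

```shell
# POST a predict request to the model's versioned REST endpoint
curl -X POST http://localhost:8501/v1/models/saved_model_half_plus_two_cpu:predict \
  -d '{"instances": [1.0, 2.0, 5.0]}'
# per the half_plus_two example, the response should be
# {"predictions": [2.5, 3.0, 4.5]}
```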


NickHauptvogel commented on May 31, 2024

Hi,

docker cp did work for me, although not with the /models/ folder, as it does not exist unless created at startup. As a quick workaround I therefore stored my model under /home/. A custom Dockerfile would certainly solve this!

Thank you and regards,

Nick

