
amy-tabb / docker-containers


This project is forked from fastai/docker-containers.


Docker images for fastai

Home Page: https://hub.docker.com/u/fastai

License: Apache License 2.0




Docker Containers For fast.ai

This repository builds various Docker images relevant to fast.ai projects, on a recurring schedule defined in this repo's workflow files. You must install Docker before using this project.
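
A quick way to confirm that Docker is installed and working is to check the client version and run the standard hello-world test image (hello-world is a generic Docker test image, not part of this project):

    docker --version
    docker run hello-world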

These Docker containers are useful for production, testing, and online services, or for scenarios that require reproducibility and portability. Some familiarity with Docker is assumed before using these containers. For a gentle introduction to Docker, see this blog post.

For students taking the fast.ai courses, or for prototyping and development, fast.ai recommends using Anaconda in your home directory, instead of these containers. Please consult the appropriate repository for installation instructions.

Table of Contents

  • Tags
  • Projects
    • fastai
    • nbdev
    • nbdev-docs
  • Miscellaneous Resources & Tips

Tags

The following tags are available for all images:

  • latest: the most current build.

    example: docker pull fastdotai/fastai:latest

  • version: corresponds to the version of the project when the container was built.

    example: docker pull fastdotai/fastai:0.0.22

  • YYYY-MM-DD: corresponds to the date the container was built.

    example: docker pull fastdotai/fastai:2020-07-31

You can filter the available tags by navigating to the Tags tab of the appropriate DockerHub repository for the container you are using.

Projects

fastai


Docker images for fastai/fastai. These images are built on top of the latest pytorch image. You must install Nvidia Docker to enable GPU compatibility with these containers. The definition of this image can be found in fastai-build/Dockerfile.

fastai Images

  • fastdotai/fastai: fastai and fastcore, with all of their dependencies.

    Pull this image:

    docker pull fastdotai/fastai:latest

  • fastdotai/fastai-dev: has an editable install of fastai and fastcore.

    Pull this image: docker pull fastdotai/fastai-dev:latest

  • fastdotai/fastai-course: is the same as fastdotai/fastai but additionally has the version 4 course notebooks and fastbook cloned into the container.

    Pull this image: docker pull fastdotai/fastai-course:latest

fastai Usage

If you have an Nvidia GPU that is compatible with CUDA 10 or higher, you should install Nvidia Docker. Afterwards, you will need to use the --gpus flag when running the container. See the usage section for more details on the various arguments available.
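
As a sanity check before using the fastai images on GPU, you can confirm that the NVIDIA runtime is working. This is a sketch, assuming the NVIDIA driver and Nvidia Docker are installed, in which case the runtime makes nvidia-smi available inside the container:

    # on the host
    nvidia-smi

    # inside the container, via the --gpus flag
    docker run --rm --gpus all fastdotai/fastai nvidia-smi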

These images have the default user as root. However, for production use cases you may not want to run your containers as root. We leave it to the end user to configure their environment to suit their needs. You can change to a non-root user with the flag --user 9000, which is illustrated in the examples below.

fastai Examples

Note: the script run_jupyter.sh is a convenience script located in the home directory of these containers; it allows you to quickly start a Jupyter server. The script runs the command jupyter notebook --ip=0.0.0.0 --port=8888 --allow-root --no-browser.
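
For reference, a minimal sketch of what run_jupyter.sh could look like, based on the command above (the actual script shipped in the images may differ):

    #!/usr/bin/env bash
    # Start a Jupyter notebook server that is reachable from outside the container.
    jupyter notebook --ip=0.0.0.0 --port=8888 --allow-root --no-browser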

  • Run an interactive shell on CPU (for example, on your laptop) with the latest version of fastai:

    docker run -it fastdotai/fastai bash

  • Run an interactive shell with fastdotai/fastai-dev and mount the current directory from your host file system to /home/fastai-user in the container (the -v flag), and make this the working directory (the -w flag) in the container.

    docker run -it -v $PWD:/home/fastai-user -w /home/fastai-user fastdotai/fastai-dev bash

  • Run a Jupyter server on CPU with an editable install, on port 8888:

    docker run -p 8888:8888 fastdotai/fastai-dev ./run_jupyter.sh

  • Test that your GPUs are visible to pytorch from within the docker container:

    docker run --gpus 1 fastdotai/fastai python -c "import torch;print(torch.cuda.is_available())"

  • Run the same command as above as a non-root user:

    docker run --gpus 1 --user 9000 fastdotai/fastai python -c "import torch;print(torch.cuda.is_available())"

  • Run a Jupyter server with all GPUs:

    docker run --gpus all -p 8888:8888 fastdotai/fastai ./run_jupyter.sh

  • Run a Jupyter server with 2 GPUs and an editable install:

    docker run --gpus 2 -p 8888:8888 fastdotai/fastai-dev ./run_jupyter.sh

  • Run a Jupyter server with 2 GPUs and an editable install of fastai version 0.0.22:

    docker run --gpus 2 -p 8888:8888 fastdotai/fastai-dev:0.0.22 ./run_jupyter.sh

nbdev


Docker images for nbdev. These containers are built using repo2docker and come bundled with working Jupyter and JupyterLab installations. You can see how these images are built in .github/workflows/nbdev.yaml. The default entrypoint for these containers is a Jupyter server running on port 8888 as a user named runner, with a working directory of /home/runner inside the container.

nbdev Images

  • fastdotai/nbdev: an install of nbdev from the latest available version on GitHub. Pull this image:

    docker pull fastdotai/nbdev:latest

  • fastdotai/nbdev-dev: has an editable install of nbdev. Pull this image:

    docker pull fastdotai/nbdev-dev:latest

nbdev Usage

There are two common ways to use these containers:

  1. Serve a Jupyter, JupyterLab, or nteract development environment. After connecting to your running Jupyter instance with your browser, you can:

    • Access JupyterLab: append /lab to the end of the URL like so:

      http(s)://<server:port>/lab

    • Switch back to the classic notebook: add /tree to the URL like so:

      http(s)://<server:port>/tree

    • Launch nteract from within a user session by replacing /tree with /nteract at the end of a notebook server’s URL like so:

      http(s)://<server:port>/nteract

  2. Run nbdev utilities as part of your CI such as nbdev_read_nbs, nbdev_clean_nbs, nbdev_diff_nbs, nbdev_test_nbs.

nbdev Examples

  • Run a Jupyter server with an editable install of nbdev locally and mount your local directory into the container. We also bind the container port 8888 to 8888 on localhost so you can reach the Jupyter server:

    docker run -p 8888:8888 -v $PWD:/home/runner/my_data fastdotai/nbdev-dev

  • Run nbdev_test_nbs (in this case we are passing the -h flag to see help)

    docker run fastdotai/nbdev nbdev_test_nbs -h

  • Run nbdev_test_nbs on notebooks in your local directory that you mount into the container. We set the working directory in the container to /home/runner/my_data where your local files are located.

    docker run -w /home/runner/my_data -v $PWD:/home/runner/my_data fastdotai/nbdev nbdev_test_nbs
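
  • Run several of the nbdev CI utilities listed above in a single container invocation (a sketch: it assumes bash is available in the image and that your notebooks are in the mounted directory):

    docker run -w /home/runner/my_data -v $PWD:/home/runner/my_data fastdotai/nbdev bash -c "nbdev_read_nbs && nbdev_clean_nbs && nbdev_diff_nbs && nbdev_test_nbs"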

nbdev-docs


These are Jekyll-based images for building the docs associated with nbdev. They are used by fastai to build docs for various projects and have all necessary gems installed.

Pull this image: docker pull fastdotai/nbdev-docs:latest

This image can be found on DockerHub: fastdotai/nbdev-docs
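
A hypothetical invocation for building docs locally with this image; the local docs path, the mount point inside the container, and the assumption that jekyll is on the PATH are all illustrative, not documented defaults:

    docker run -v $PWD/docs:/docs -w /docs fastdotai/nbdev-docs jekyll build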


Miscellaneous Resources & Tips

  • Save the state of a running container by first finding the container ID of your running container with docker ps. After you have located the relevant ID, you can use docker commit to save the state of the container for later use, as sketched in the example after this list.

  • Mount a local directory into your Docker container with the -v flag so that files generated inside the container are still accessible after the container exits.

  • Read this blog post.

  • Read this book to dive deeper into Docker.
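
For example, saving the state of a running container (a sketch; <container-id> and the image name are placeholders):

    docker ps
    docker commit <container-id> my-user/fastai-snapshot:saved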
