aws-stepfunctions-sagemakerprocessingjob

Jupyter notebooks are widely used by data scientists for data transformation workloads. There are scenarios where notebooks need to be scheduled, run at regular intervals, and productionized.

This repo provides a framework for data scientists to productionize their notebook-based workloads dynamically using AWS Step Functions with service integration to SageMaker Processing Jobs.

As a data scientist or developer, you concentrate on developing the notebooks. The framework helps productionize them by using AWS Step Functions as an orchestrator to execute your notebooks in SageMaker Processing Jobs.

The repo is a template: replace the sample notebooks with your own and update the simple configuration file config.yml. Based on the config.yml details, a Step Functions state machine with a SageMaker Processing Job state for each notebook under src/notebooks is created dynamically and programmatically.

Pre-requisites:

  1. An AWS account with access keys set up. The access key ID, secret access key and region must be set as the environment variables AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION respectively.
  2. AWS CLI v2 installed.
  3. Docker (needed to build the container image). A quick verification sketch follows this list.
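
A quick way to confirm the prerequisites from a terminal (a sketch; the exported values are placeholders you must replace with your own credentials and region):

    # Placeholder credentials and region - replace with your own values
    export AWS_ACCESS_KEY_ID=<your access key id>
    export AWS_SECRET_ACCESS_KEY=<your secret access key>
    export AWS_DEFAULT_REGION=us-west-1

    aws --version                  # expect AWS CLI v2
    docker --version               # Docker is required for the image build
    aws sts get-caller-identity    # confirms the credentials resolve to an account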

Architecture

(Architecture diagram)

Usage

To try out the framework and see it in action, do the following (a consolidated command sketch follows the list):

  1. Clone the repo. git clone https://github.com/aws-samples/aws-stepfunction-sagemakerprocessingjob-deploynotebooks.git
  2. cd aws-stepfunction-sagemakerprocessingjob-deploynotebooks
  3. Update the config.yml accordingly.
  4. Create a virtual environment using the command python3 -m venv venv
  5. Activate the venv using the command source venv/bin/activate
  6. Install the required Python modules using the command pip install -r venv_requirement.txt
  7. Set the environment variables. Example: export AWS_DEFAULT_REGION=us-west-1, export AWS_ACCESS_KEY_ID=<your access key id>, export AWS_SECRET_ACCESS_KEY=<your secret access key>
  8. Make the deploy script executable: chmod +x deploy.sh
  9. Execute the script: ./deploy.sh
  10. Log in to the AWS Console and go to Step Functions > State machines. You should see a state machine created dynamically based on the information from config.yml.
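
For convenience, steps 1 to 9 above can be run roughly as the following shell session (a sketch; replace the placeholder credential values with your own and edit config.yml before running deploy.sh):

    git clone https://github.com/aws-samples/aws-stepfunction-sagemakerprocessingjob-deploynotebooks.git
    cd aws-stepfunction-sagemakerprocessingjob-deploynotebooks
    # update config.yml for your notebooks before deploying

    python3 -m venv venv
    source venv/bin/activate
    pip install -r venv_requirement.txt

    export AWS_DEFAULT_REGION=us-west-1
    export AWS_ACCESS_KEY_ID=<your access key id>
    export AWS_SECRET_ACCESS_KEY=<your secret access key>

    chmod +x deploy.sh
    ./deploy.sh

    # Optional: list the state machines from the CLI instead of the console
    aws stepfunctions list-state-machines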

User-Experience

Users who want to onboard this framework to productionize their notebooks need to do the following:

  • Clone the repo
  • Place your Jupyter notebooks under src/notebooks. Remove the sample notebook from src/notebooks and the CSV file from src/data. If there are multiple notebooks to be executed sequentially, name the notebooks in alphabetical order. Example: a-nb.ipynb, b-nb.ipynb.
  • If your notebooks require any third-party Python modules, specify them in requirements.txt.
  • Update the config.yml file accordingly (an illustrative sketch follows this list).
    • Specify the basic items required, such as the instance type, volume size (in GB) and the maximum run time, in the keys instance_type, volume_size and max_runtime.
    • If you want the SageMaker Processing Job to spin up the instance in a VPC, provide the subnet IDs and security groups in the keys subnets and security_groups.
    • In addition to the above, the framework supports other keys that include:
      • container_endpoint - If you want to execute your own script (it can be any custom Python script), provide the command here. Example: container_endpoint: "python3 deepprofiler --root=/opt/ml/processing/input --config=config.json train"
      • s3_input - If your script needs data copied from S3 into the container that the SageMaker Processing Job runs, provide the S3 path. Example: s3_input: "s3://my-input-sm-bucket"
      • container_input - The path inside the container to which the s3_input data is copied. Example: container_input: "/opt/ml/processing/input"
      • container_output - The path inside the container from which the data produced by the script given in container_endpoint is copied to s3_output. Example: container_output: "/opt/ml/processing/input/results_from_execution"
      • s3_output - The S3 path to which the contents of the container_output path are copied. Example: s3_output: "s3://my-output-sm-bucket"
  • Follow steps 4 to 10 from the Usage section.
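
Putting the config keys above together, a minimal config.yml might look like the following sketch (written here as a shell heredoc; every value is an illustrative placeholder rather than a repo default, the list shape of subnets and security_groups is an assumption, and the actual file may use additional keys):

    cat > config.yml <<'EOF'
    # Illustrative values only - adjust for your workload
    instance_type: ml.m5.xlarge
    volume_size: 30                 # in GB
    max_runtime: 3600               # maximum run time for the processing job
    # Optional: run the processing job inside a VPC
    subnets:
      - subnet-0123456789abcdef0
    security_groups:
      - sg-0123456789abcdef0
    # Optional: custom entry point and data movement
    container_endpoint: "python3 deepprofiler --root=/opt/ml/processing/input --config=config.json train"
    s3_input: "s3://my-input-sm-bucket"
    container_input: "/opt/ml/processing/input"
    container_output: "/opt/ml/processing/input/results_from_execution"
    s3_output: "s3://my-output-sm-bucket"
    EOF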

Note

The dataset (US states' 2010 and 2020 populations) used in this repo was obtained from Wikipedia: https://en.wikipedia.org/wiki/List_of_U.S._states_and_territories_by_population

Security

See CONTRIBUTING for more information.

License

This library is licensed under the MIT-0 License. See the LICENSE file.
