rje4242 / cortex

This project forked from cortexlabs/cortex

Home Page: https://cortex.dev

License: Apache License 2.0

Languages: Go 88.45%, Python 6.60%, Shell 3.60%, Dockerfile 0.98%, Makefile 0.37%

cortex's Introduction

Machine learning model serving infrastructure


install · docs · examples · we're hiring · chat with us

Demo


Key features

  • Multi framework: deploy TensorFlow, PyTorch, scikit-learn, and other models.
  • Autoscaling: automatically scale APIs to handle production workloads.
  • ML instances: run inference on G4, P2, M5, C5 and other AWS instance types.
  • Spot instances: save money with spot instances.
  • Multi-model endpoints: deploy multiple models in a single API (see the sketch after this list).
  • Rolling updates: update deployed APIs with no downtime.
  • Log streaming: stream logs from deployed models to your CLI.
  • Prediction monitoring: monitor API performance and prediction results.
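
As an illustrative sketch of the multi-model endpoints feature (the Hugging Face pipelines here are an assumption, not part of the original example), a single predictor (using the interface shown under "Implement your predictor" below) can load several models and route each request based on a field in the request body:

# multi_model_predictor.py (hypothetical example, not from the original README)
# One API serving several models; the "model" field in the request body selects
# which one runs. Assumes the transformers package is listed in your dependencies.

from transformers import pipeline

class PythonPredictor:
    def __init__(self, config):
        # load every model once, when the API is deployed
        self.models = {
            "sentiment": pipeline(task="sentiment-analysis"),
            "summarizer": pipeline(task="summarization"),
        }

    def predict(self, payload):
        # e.g. payload == {"model": "sentiment", "text": "..."}
        return self.models[payload["model"]](payload["text"])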

Deploying a model

Install the CLI

$ bash -c "$(curl -sS https://raw.githubusercontent.com/cortexlabs/cortex/0.17/get-cli.sh)"

Implement your predictor

# predictor.py

class PythonPredictor:
    def __init__(self, config):
        # runs once when the API is deployed; load (or download) the model here
        self.model = download_model()  # placeholder for your model-loading code

    def predict(self, payload):
        # payload is the parsed JSON request body
        return self.model.predict(payload["text"])
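
The download_model() call above is a stand-in for whatever code loads your trained model. As a concrete, illustrative version (an assumption, not part of the original example), a predictor built on a Hugging Face transformers pipeline would return labels such as "positive" or "negative":

# predictor.py (illustrative alternative; assumes transformers is in your dependencies)

from transformers import pipeline

class PythonPredictor:
    def __init__(self, config):
        # downloads and caches a pretrained sentiment model when the API is deployed
        self.model = pipeline(task="sentiment-analysis")

    def predict(self, payload):
        # returns "positive" or "negative" for the submitted text
        return self.model(payload["text"])[0]["label"].lower()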

Configure your deployment

# cortex.yaml

- name: sentiment-classifier
  predictor:
    type: python
    path: predictor.py
  compute:
    gpu: 1
    mem: 4G

Deploy your model

$ cortex deploy

creating sentiment-classifier

Serve predictions

$ curl http://localhost:8888 \
    -X POST -H "Content-Type: application/json" \
    -d '{"text": "serving models locally is cool!"}'

positive
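
Equivalently, you can request predictions from Python; here is a minimal sketch using the requests library, mirroring the curl call above:

# client.py (illustrative)
import requests

response = requests.post(
    "http://localhost:8888",
    json={"text": "serving models locally is cool!"},
)
print(response.text)  # positive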

Deploying models at scale

Spin up a cluster

Cortex clusters are designed to be self-hosted on any AWS account:

$ cortex cluster up

aws region: us-east-1
aws instance type: g4dn.xlarge
spot instances: yes
min instances: 0
max instances: 5

your cluster will cost $0.19 - $2.85 per hour based on cluster size and spot instance pricing/availability

○ spinning up your cluster ...

your cluster is ready!

Deploy to your cluster with the same code and configuration

$ cortex deploy --env aws

creating sentiment-classifier

Serve predictions at scale

$ curl http://***.amazonaws.com/sentiment-classifier \
    -X POST -H "Content-Type: application/json" \
    -d '{"text": "serving models at scale is really cool!"}'

positive

Monitor your deployment

$ cortex get sentiment-classifier

status   up-to-date   requested   last update   avg request   2XX
live     1            1           8s            24ms          12

class     count
positive  8
negative  4

How it works

The CLI sends configuration and code to the cluster every time you run cortex deploy. Each model is loaded into a Docker container, along with any Python packages and request handling code. The model is exposed as a web service using a Network Load Balancer (NLB) and FastAPI / TensorFlow Serving / ONNX Runtime (depending on the model type). The containers are orchestrated on Elastic Kubernetes Service (EKS) while logs and metrics are streamed to CloudWatch.

Cortex manages its own Kubernetes cluster so that end-to-end functionality like request-based autoscaling, GPU support, and spot instance management can work out of the box without any additional DevOps work.
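
To make the request path concrete, the sketch below shows the general pattern (it is not Cortex's actual source): the predictor is instantiated once when the container starts and is wrapped in a FastAPI route, here with a toy stand-in for the model.

# serving_sketch.py (illustrative only; not Cortex's implementation)
from fastapi import FastAPI
from pydantic import BaseModel

class Payload(BaseModel):
    text: str

class PythonPredictor:
    # toy stand-in for the user-provided predictor in predictor.py
    def __init__(self, config):
        self.positive_words = {"cool", "great", "nice"}

    def predict(self, payload):
        words = set(payload["text"].lower().replace("!", "").split())
        return "positive" if words & self.positive_words else "negative"

app = FastAPI()
predictor = PythonPredictor(config={})  # created once, when the container starts

@app.post("/")
def predict(payload: Payload):
    # each JSON request body is parsed and passed to the user's predict() method
    return predictor.predict(payload.dict())

Running this with a command like uvicorn serving_sketch:app approximates the local endpoint shown earlier.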


What is Cortex similar to?

Cortex is an open source alternative to SageMaker for model serving, and to building your own model deployment platform on top of AWS services like Elastic Kubernetes Service (EKS), Lambda, or Fargate and open source projects like Docker, Kubernetes, TensorFlow Serving, and TorchServe.


Examples

cortex's People

Contributors

1vn, caleb-kaiser, deliahu, dsuess, hassenio, howjmay, ismaelc, ospillinger, rbromley10, robertlucian, tthebst, vishalbollu, wingkwong, ynng, zouyee
