
kleveross / ormb

447 stars, 18 watchers, 61 forks, 160.17 MB

Docker for Your ML/DL Models Based on OCI Artifacts

License: Apache License 2.0

Languages: Go 82.74%, Python 9.48%, Makefile 6.73%, Shell 0.65%, Dockerfile 0.21%, Scheme 0.19%
Topics: model-management, model-versioning, machine-learning, docker, opencontainers, oci, oci-artifacts, harbor, docker-registry, image-registry


ormb's Issues

[feature] Support tag subcommand

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Make exporting metadata optional when exporting the model

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

Currently, the metadata is pulled and stored as ormbfile.yaml when we export the model. This should be optional, perhaps controlled by a flag.
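A minimal sketch of what this could look like with cobra (the --export-metadata flag name and the command wiring are assumptions, not the current CLI):

    // Sketch only: the flag name and helpers here are not part of the real CLI.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"

        "github.com/spf13/cobra"
    )

    func main() {
        var exportMetadata bool

        cmd := &cobra.Command{
            Use:  "export <ref> <dst>",
            Args: cobra.ExactArgs(2),
            RunE: func(cmd *cobra.Command, args []string) error {
                ref, dst := args[0], args[1]
                // ... export the model payload from the local cache here ...
                if exportMetadata {
                    // Only materialize ormbfile.yaml when the flag is set.
                    meta := fmt.Sprintf("# metadata for %s\n", ref)
                    return os.WriteFile(filepath.Join(dst, "ormbfile.yaml"), []byte(meta), 0o644)
                }
                return nil
            },
        }
        // Default true keeps today's behavior; --export-metadata=false skips the file.
        cmd.Flags().BoolVar(&exportMetadata, "export-metadata", true,
            "also export ormbfile.yaml next to the model")

        if err := cmd.Execute(); err != nil {
            os.Exit(1)
        }
    }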

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Use goreleaser to build and publish Docker images

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[discussion] Design signature field carefully

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

We need to design the signature field carefully to support existing features.

/cc @simon-cj Can you show your ideas here?

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[upstream contribution] Support ormb in Seldon Core and KFServing

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

ormb should be usable in KFServing and Seldon Core to pull/push models.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Improve the performance when saving/exporting the model

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

Using config file: /home/gaocegege/.ormb/config.json
Using /home/gaocegege/.ormb as the root path
ref:       localhost:5000/machinelearning/onnx-model:v1
digest:    a575cb4d62801c295caea441b2e431867148a82718c53551425c8beac8fcaf99
size:      28.9 MiB
format:    ONNX
v1: saved

Currently, saving a 30 MB ONNX example model takes about 30s.

Showing nodes accounting for 43.26s, 96.37% of 44.89s total
Dropped 93 nodes (cum <= 0.22s)
Showing top 10 nodes out of 50
      flat  flat%   sum%        cum   cum%
    28.96s 64.51% 64.51%     28.96s 64.51%  runtime._ExternalCode
     4.61s 10.27% 74.78%      4.61s 10.27%  racecall
     3.90s  8.69% 83.47%     13.36s 29.76%  compress/flate.(*compressor).deflate
     2.04s  4.54% 88.02%      2.04s  4.54%  racecalladdr
     1.33s  2.96% 90.98%      1.33s  2.96%  runtime.raceread
     0.73s  1.63% 92.60%      1.85s  4.12%  compress/flate.(*huffmanBitWriter).writeCode
     0.71s  1.58% 94.19%      1.03s  2.29%  compress/flate.(*compressor).findMatch
     0.35s  0.78% 94.97%      0.71s  1.58%  compress/flate.hash4 (inline)
     0.34s  0.76% 95.72%      0.34s  0.76%  runtime.racewrite
     0.29s  0.65% 96.37%      0.29s  0.65%  crypto/sha256.block

We should optimize it.
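Most of the time above is spent in DEFLATE while packing the layer. One direction worth trying (a sketch only, not ormb's actual save path) is to compress at gzip.BestSpeed instead of the default level, since model weights are large and mostly high-entropy:

    // Sketch: pack the model with gzip.BestSpeed instead of the default level.
    // Paths and the packing flow are illustrative, not ormb's real save path.
    package main

    import (
        "archive/tar"
        "compress/gzip"
        "io"
        "log"
        "os"
    )

    func main() {
        out, err := os.Create("model.tar.gz")
        if err != nil {
            log.Fatal(err)
        }
        defer out.Close()

        // BestSpeed trades a slightly larger layer for far less CPU time,
        // which usually pays off for large, high-entropy model weights.
        gz, err := gzip.NewWriterLevel(out, gzip.BestSpeed)
        if err != nil {
            log.Fatal(err)
        }
        defer gz.Close()

        tw := tar.NewWriter(gz)
        defer tw.Close()

        f, err := os.Open("model.onnx")
        if err != nil {
            log.Fatal(err)
        }
        defer f.Close()

        info, err := f.Stat()
        if err != nil {
            log.Fatal(err)
        }
        hdr, err := tar.FileInfoHeader(info, "")
        if err != nil {
            log.Fatal(err)
        }
        if err := tw.WriteHeader(hdr); err != nil {
            log.Fatal(err)
        }
        if _, err := io.Copy(tw, f); err != nil {
            log.Fatal(err)
        }
    }

Benchmarking with compression disabled for formats that are already compressed would also be worth doing.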

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Release Python SDK on Python Package Index

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Support STDOUT/STDERR stream redirection in Python SDK

Is this a BUG REPORT or FEATURE REQUEST?:

/kind feature

What happened:

Currently, the STDOUT/STDERR of ormb is not shown when we use the Python SDK in Jupyter:

Screenshot from 2020-06-15 11-37-46

The output appears in the Jupyter server's log instead:

[I 11:32:18.356 LabApp] Saving file at /examples/SavedModel-fashion/training-serving.ipynb
Using config file: /home/gaocegege/.ormb/config.json
Using /home/gaocegege/.ormb as the root path
Error: open /tmp/ormb_73pss1e/1/ormbfile.yaml: no such file or directory
Usage:
  ormb save [flags]

Flags:
  -h, --help   help for save

Global Flags:
      --config string   config file (default is $HOME/.ormb/config.yaml)

open /tmp/ormb_73pss1e/1/ormbfile.yaml: no such file or directory
Using config file: /home/gaocegege/.ormb/config.json
Using /home/gaocegege/.ormb as the root path
Error: open /tmp/ormb_73pss1e/1/ormbfile.yaml: no such file or directory
Usage:
  ormb save [flags]

Flags:
  -h, --help   help for save

Global Flags:
      --config string   config file (default is $HOME/.ormb/config.yaml)

open /tmp/ormb_73pss1e/1/ormbfile.yaml: no such file or directory
Using config file: /home/gaocegege/.ormb/config.json
Using /home/gaocegege/.ormb as the root path
Error: open /tmp/ormb_73pss1e/1/ormbfile.yaml: no such file or directory
Usage:
  ormb save [flags]

Flags:
  -h, --help   help for save

Global Flags:
      --config string   config file (default is $HOME/.ormb/config.yaml)

open /tmp/ormb_73pss1e/1/ormbfile.yaml: no such file or directory
Using config file: /home/gaocegege/.ormb/config.json
Using /home/gaocegege/.ormb as the root path
The push refers to repository [demo.goharbor.io/tensorflow/fashion_model]
ref:       demo.goharbor.io/tensorflow/fashion_model:v1
digest:    13eb538942a70c699637cfa1bb9c7d72cd8430af50a992055b7bfd3220042e94
size:      162.1 KiB
format:    SavedModel
Error: unexpected response: 401 Unauthorized
Usage:
  ormb push [flags]

Flags:
  -h, --help         help for push
      --plain-http   use plain http and not https

Global Flags:
      --config string   config file (default is $HOME/.ormb/config.yaml)

unexpected response: 401 Unauthorized
[I 11:34:18.493 LabApp] Saving file at /examples/SavedModel-fashion/training-serving.ipynb

What you expected to happen:

We expect the STDOUT/STDERR of ormb to be redirected to the cell output.

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Export metadata with the model

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

Currently we only export the model from the local cache. We should also support exporting ormbfile.yaml from the local cache.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

/priority p0

[feature] Support Model Conversion/Compression via CRD

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

We should support model conversion and compression. We can have an init container pull the model from the registry to the local filesystem, then use MMdnn or other tools to convert/compress the model, and finally push the new model with a unique or desired tag.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

https://github.com/gaocegege/ormb/tree/modeljob

[feature] Investigate how to support Nvidia TensorRT format

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

NVIDIA Triton Inference Server has its own model repository layout. Since we have decided to use Triton as the default model inference server, we should investigate how to support that layout well.

/cc @simon-cj
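For reference, the layout Triton expects (per NVIDIA's documentation) is roughly one directory per model, with a config.pbtxt and numbered version subdirectories:

    <model-repository>/
      <model-name>/
        config.pbtxt
        1/
          model.plan    (TensorRT engine; other backends use model.onnx, model.savedmodel/, ...)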

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feat] Support signature in metadata data structure

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

We do not currently support a signature in the metadata; we should add it.

/cc @simon-cj
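A possible shape for the field, as a sketch only (the structs and YAML keys below are illustrative, not a finalized ormbfile.yaml schema):

    // Sketch of a possible metadata structure with a signature section.
    // The field names are illustrative, not the finalized ormbfile.yaml schema.
    package main

    import (
        "fmt"

        "gopkg.in/yaml.v2"
    )

    // Tensor describes one named input or output of the model.
    type Tensor struct {
        Name  string `yaml:"name"`
        DType string `yaml:"dtype"`
        Size  []int  `yaml:"size"`
    }

    // Signature groups the model's inputs and outputs.
    type Signature struct {
        Inputs  []Tensor `yaml:"inputs"`
        Outputs []Tensor `yaml:"outputs"`
    }

    // Metadata is a trimmed-down ormbfile with the proposed signature field.
    type Metadata struct {
        Format    string     `yaml:"format"`
        Signature *Signature `yaml:"signature,omitempty"`
    }

    func main() {
        m := Metadata{
            Format: "SavedModel",
            Signature: &Signature{
                Inputs:  []Tensor{{Name: "input_1", DType: "float32", Size: []int{-1, 28, 28}}},
                Outputs: []Tensor{{Name: "output_1", DType: "float32", Size: []int{-1, 10}}},
            },
        }
        out, err := yaml.Marshal(m)
        if err != nil {
            panic(err)
        }
        fmt.Print(string(out))
    }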

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Add version info in goreleaser

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:
Currently, the released binary does not report version info when running ormb version.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Investigate if we need to compile ormb to dynamic lib for Python SDK

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

Currently, the Python SDK installs the latest release binary of ormb and invokes it. We should investigate whether we need to compile ormb into a dynamic library for the Python SDK instead.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Provide a unified offline batch inference interface

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

Investigate whether we can use https://github.com/uber/neuropod to provide a unified offline batch inference interface for users. They could use the ormb Python SDK to download the model first, then use Neuropod to run offline inference.

Thanks to @terrytangyuan for introducing the project.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[enhancement] Support HTTP login

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

We currently support --insecure, but it does not work when the host is a hostname instead of an IP address.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[docs] Add docs to note that we do not support a version dir for SavedModel

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

#67 removed the version directory; we should document this somewhere.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Support TLS verification skip flag

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

We should support skipping TLS verification for pull/push.

		resolver, err := client.authorizer.Resolver(
			context.Background(),
			&http.Client{
				// TODO(gaocegege): Make it optional.
				Transport: &http.Transport{
					TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
				},
			}, client.plainHTTP)
		if err != nil {
			return nil, err
		}
		client.resolver = &Resolver{
			Resolver: resolver,
		}
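
A minimal sketch of making the skip configurable (the flag name and the newHTTPClient helper are hypothetical, not the real client's constructor):

    // Sketch: drive InsecureSkipVerify from a flag instead of hard-coding it.
    // The flag name and the newHTTPClient helper are hypothetical.
    package main

    import (
        "crypto/tls"
        "flag"
        "fmt"
        "net/http"
    )

    func newHTTPClient(insecureSkipVerify bool) *http.Client {
        return &http.Client{
            Transport: &http.Transport{
                // Only disable certificate verification when explicitly requested.
                TLSClientConfig: &tls.Config{InsecureSkipVerify: insecureSkipVerify},
            },
        }
    }

    func main() {
        insecure := flag.Bool("insecure-skip-tls-verify", false,
            "skip TLS certificate verification when talking to the registry")
        flag.Parse()

        // In the real client this *http.Client would be handed to authorizer.Resolver.
        httpClient := newHTTPClient(*insecure)
        fmt.Printf("TLS verification skipped: %v\n",
            httpClient.Transport.(*http.Transport).TLSClientConfig.InsecureSkipVerify)
    }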

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Add README addition layer for model

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

Just like a Helm chart, each model should have a README to illustrate what the model is and how to use it. We should have a separate layer to store this info and show it in the UI.

We can use Harbor's Additions mechanism to support it.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Documentation for ormbfile.yaml

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

We need documentation for ormbfile.yaml.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[enhancement] Add DEBUG level for ormb

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

ORMB uses oras to communicate with the OCI registry. We should have a flag to show the debug info for oras/registry-resolver.

Maybe something like:

logrus.SetLevel(logrus.DebugLevel)

This would work since docker/registry already uses logrus.
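
A sketch of how the flag could be wired in (the --debug flag name is an assumption, not an existing ormb flag):

    // Sketch: a persistent --debug flag that raises the logrus level.
    // The flag name is an assumption, not an existing ormb flag.
    package main

    import (
        "os"

        "github.com/sirupsen/logrus"
        "github.com/spf13/cobra"
    )

    func main() {
        var debug bool

        root := &cobra.Command{
            Use: "ormb",
            PersistentPreRun: func(cmd *cobra.Command, args []string) {
                if debug {
                    // oras and docker/distribution log through logrus, so this
                    // surfaces their resolver/registry debug output as well.
                    logrus.SetLevel(logrus.DebugLevel)
                }
            },
            Run: func(cmd *cobra.Command, args []string) {
                logrus.Debug("debug logging enabled")
            },
        }
        root.PersistentFlags().BoolVar(&debug, "debug", false, "enable debug logging")

        if err := root.Execute(); err != nil {
            os.Exit(1)
        }
    }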

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feasibility research] Investigate whether we can get the signature without a model server

Is this a BUG REPORT or FEATURE REQUEST?:

/kind feature

What happened:

When we run model conversion jobs, we have to set up a real model inference server first, which may not be necessary. We should investigate whether we can get the signature directly, similar to savedmodel_cli or other tools.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[chore] Name for the Clever community version

Is this a BUG REPORT or FEATURE REQUEST?:

/kind feature

What happened:

We cannot currently move ormb and the model registry to the third-party organization, so we need a new repository for our open-source model registry.

The name can be os-model-registry since it is temporary. WDYT @ddysher @simon-cj @codeflitting?

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Show the directory structure in the ormb UI

Is this a BUG REPORT or FEATURE REQUEST?:

/kind feature

What happened:

Visualize the model directory structure in the UI.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Add CI/CD

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feasibility research] Investigate if we can store the model artifact with layers

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

Now we store the model in one layer.

What you expected to happen:

We should investigate whether we can store it as multiple layers, reusing a cached layer when it has not changed.

/cc @zw0610
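
A sketch of the bookkeeping this would need, assuming one layer per file keyed by its content digest (none of these helpers exist in ormb today):

    // Sketch: map each file in the model directory to a content digest so that
    // unchanged files could be pushed as reusable layers. Nothing here exists
    // in ormb today; it only illustrates the bookkeeping.
    package main

    import (
        "crypto/sha256"
        "fmt"
        "io"
        "io/fs"
        "log"
        "os"
        "path/filepath"
    )

    // fileDigests walks root and returns relative path -> sha256 digest.
    func fileDigests(root string) (map[string]string, error) {
        digests := map[string]string{}
        err := filepath.WalkDir(root, func(path string, d fs.DirEntry, err error) error {
            if err != nil || d.IsDir() {
                return err
            }
            f, err := os.Open(path)
            if err != nil {
                return err
            }
            defer f.Close()

            h := sha256.New()
            if _, err := io.Copy(h, f); err != nil {
                return err
            }
            rel, err := filepath.Rel(root, path)
            if err != nil {
                return err
            }
            digests[rel] = fmt.Sprintf("sha256:%x", h.Sum(nil))
            return nil
        })
        return digests, err
    }

    func main() {
        digests, err := fileDigests("./model")
        if err != nil {
            log.Fatal(err)
        }
        // Files whose digest matches a layer already in the registry
        // would not need to be uploaded again.
        for rel, d := range digests {
            fmt.Printf("%s  %s\n", d, rel)
        }
    }

Unchanged files keep the same digest across versions, so the registry could deduplicate those layers.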

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[chore] Move to kleveross org

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

We have a new public GitHub organization, https://github.com/kleveross, for ormb and some other projects. We should have a plan for transferring them to that org.

Things we need to deal with:

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Dashboard UI for ormb

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

ORMB is a CLI to manage your ML models. We need a dashboard like Harbor Portal to show the model list and the artifact details for a given model.

Besides this, we need to support creating model inference services in the UI. Maybe we should get in touch with the Seldon Core community.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[discussion] How to orchestrate model conversion and extraction

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:
Should the extract Python script be included in the convert Python script, or should we run two pods in order, one running the convert script and then another running the extract script?
@gaocegege WDYT

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Provide Python SDK

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

We need to provide a Python SDK like https://docker-py.readthedocs.io/en/stable/ to support CRUD operations directly from Jupyter Notebook or Python code.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Support Python SDK on MacOS

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

/cc @ZhuYuJin

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[discussion] Should We Have the ormbfile.yaml?

Currently we keep ormbfile.yaml in the model directory. This has some problems:

  • Some model servers will return errors when serving the model
  • Additional maintenance cost

We should have a final decision before the first release.

/priority p0

[chore] Replace caicloud image in Makefile

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

Currently we are using Docker images with the caicloud prefix, which cannot be pulled from outside. We should use Docker Hub instead.

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

[feature] Generate ormbfile.yaml for users

Is this a BUG REPORT or FEATURE REQUEST?:

/kind feature

What happened:

Users of the Python SDK may not want to write such a configuration file. We may need to:

  • make it optional, or
  • support some parameters to generate it (a sketch follows this list), or
  • analyze the parameters from the model automatically.
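
A sketch of the second option, generating a minimal file from a couple of parameters (the generateOrmbfile helper and the exact fields it writes are hypothetical):

    // Sketch: generate a minimal ormbfile.yaml from a couple of parameters.
    // The helper and the exact fields it writes are hypothetical.
    package main

    import (
        "log"
        "os"
        "path/filepath"

        "gopkg.in/yaml.v2"
    )

    func generateOrmbfile(modelDir, format, author string) error {
        meta := map[string]string{
            "format": format, // e.g. SavedModel, ONNX, PMML
            "author": author,
        }
        out, err := yaml.Marshal(meta)
        if err != nil {
            return err
        }
        return os.WriteFile(filepath.Join(modelDir, "ormbfile.yaml"), out, 0o644)
    }

    func main() {
        if err := generateOrmbfile("./model", "SavedModel", "someone@example.com"); err != nil {
            log.Fatal(err)
        }
    }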

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:
