borchero / pyblaze

Large-Scale Machine and Deep Learning in PyTorch.

Home Page: https://pyblaze.borchero.com

License: MIT License

Languages: Python 99.88%, Dockerfile 0.12%
Topics: pytorch, deep-learning, machine-learning

pyblaze's Introduction

PyBlaze


PyBlaze is an unobtrusive, high-level library for large-scale machine and deep learning in PyTorch. It is engineered to cut redundant boilerplate code while preserving the flexibility of PyTorch to create just about any deep learning model.

Quickstart

Plenty of tutorials are available in the official documentation. The most basic tutorial builds a classifier for CIFAR10.
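As a taste of what the engine-based API looks like, the sketch below trains a simple CIFAR10 classifier. The exact parameter names of engine.train (epochs, optimizer, loss) as well as the xnn import alias are assumptions here; the CIFAR10 tutorial in the documentation is the authoritative reference.

import torch
import torch.nn as nn
from torchvision import datasets, transforms
import pyblaze.nn as xnn  # import alias is an assumption

# Plain PyTorch data pipeline and model -- nothing PyBlaze-specific here.
train_loader = torch.utils.data.DataLoader(
    datasets.CIFAR10("./data", train=True, download=True, transform=transforms.ToTensor()),
    batch_size=128, shuffle=True,
)
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# The engine encapsulates the training loop; the loss is supplied explicitly.
engine = xnn.MLEEngine(model)
# NOTE: the parameter names below are assumptions -- consult the tutorial for the exact signature.
engine.train(
    train_loader,
    epochs=10,
    optimizer=torch.optim.Adam(model.parameters()),
    loss=nn.CrossEntropyLoss(),
)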

Installation

PyBlaze is available on PyPI and can be installed as follows:

pip install pyblaze

Library Design

PyBlaze revolves around the concept of an engine. An engine is a powerful abstraction that combines a model's definition with the algorithm required to optimize its parameters according to some data. Engines provided by PyBlaze are designed for generality: while the engine encapsulates the optimization algorithm, the user explicitly defines the optimization objective (usually the loss function).

Engines go far beyond implementing the optimization algorithm, however. Specifically, they also provide the following features:

  • Evaluation: During training, validation data can be used to evaluate the generalization performance of the model at regular intervals. Arbitrary metrics may be computed as well.

  • Callbacks: During training and model evaluation, callbacks serve as hooks that are called at specific events in the process. This makes it easy to plug in a tracking framework, perform early stopping, or dynamically adjust parameters over the course of training. Custom callbacks are easy to create; see the sketch after this list.

  • GPU Support: Training and model evaluation are automatically performed on all available GPUs. The same code that works on the CPU works on a GPU ... and also on multiple GPUs.
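To illustrate the callback mechanism mentioned above, here is a minimal sketch of a custom callback. The class name, hook name, and hook arguments are assumptions made purely for illustration; consult the pyblaze.nn documentation for the actual callback interface.

# NOTE: hook name and arguments are assumed for illustration only.
class PrintLossCallback:
    """Prints the training loss after every epoch."""

    def after_epoch(self, metrics):
        # `metrics` is assumed to be a dict-like collection of the epoch's metrics.
        print(f"epoch finished, loss={metrics.get('loss', float('nan')):.4f}")

# A callback instance would then be passed to the engine when training, e.g.
# engine.train(..., callbacks=[PrintLossCallback()])  # parameter name assumed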

Available Engines

Engines are currently implemented for the following training procedures:

  • pyblaze.nn.MLEEngine: This is the most central engine as it enables both supervised and unsupervised learning. It therefore adapts to many different problems (classification, regression, (variational) autoencoders, ...), depending only on the loss. To simplify initialization (as configuration requires toggling some settings), there exist specialized MLE engines; currently, the only one is pyblaze.nn.AutoencoderEngine (see the sketch after this list).

  • pyblaze.nn.WGANEngine: This engine is specifically designed for training Wasserstein GANs. A separate class is required because the generator and the critic are trained independently.
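As a small illustration of the specialized engine, the sketch below wraps a toy autoencoder in pyblaze.nn.AutoencoderEngine. Any constructor or train(...) arguments beyond the model itself are assumptions, just as in the quickstart sketch above.

import torch
import torch.nn as nn
import pyblaze.nn as xnn  # import alias is an assumption

# A toy autoencoder -- plain PyTorch.
autoencoder = nn.Sequential(
    nn.Linear(784, 32),   # encoder
    nn.ReLU(),
    nn.Linear(32, 784),   # decoder
)

# The specialized engine spares the MLEEngine configuration toggles.
engine = xnn.AutoencoderEngine(autoencoder)

# Training would then use a reconstruction loss (parameter names assumed):
# engine.train(train_loader, epochs=5, loss=nn.MSELoss(),
#              optimizer=torch.optim.Adam(autoencoder.parameters()))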

Implementing a custom engine is rarely necessary for common problems. However, it may be worthwhile when working on highly customized machine learning models. Usually, it is sufficient to implement the train_batch and eval_batch methods to specify how training and evaluation, respectively, are performed for a single batch of data; a skeleton is sketched below. Consult the documentation of pyblaze.nn.Engine to read about all methods available for overriding.
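A skeleton for such a custom engine might look as follows. The signatures of train_batch and eval_batch, as well as the engine attributes used inside them, are assumptions; check pyblaze.nn.Engine's documentation before relying on them.

import pyblaze.nn as xnn  # import alias is an assumption

class MyCustomEngine(xnn.Engine):
    """Sketch of a custom engine; method signatures and attributes are assumed."""

    def train_batch(self, data):
        # `data` is assumed to be a single batch as yielded by the data loader.
        x, y = data
        self.optimizer.zero_grad()          # attribute assumed
        loss = self.loss(self.model(x), y)  # attributes assumed
        loss.backward()
        self.optimizer.step()
        return loss.item()

    def eval_batch(self, data):
        x, y = data
        return self.loss(self.model(x), y).item()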

License

PyBlaze is licensed under the MIT License.

pyblaze's People

Contributors: borchero, stadlmax


pyblaze's Issues

Bump scikit-learn dep version to include 0.24.0

Hi,
I'm super impressed with this work and very keen to use pycave for some real-time GMM fitting.

I note that the pyblaze requirements.txt lists scikit-learn>=0.23.0,<0.24.0. I'm running this project on an ARM64 Nvidia Xavier, and scikit-learn does not have ARM wheels until v0.24 (compiling scikit-learn on such a small device is untenable). Is this strict dependency pin necessary? I have been unable to run the tests, but it would be great if we could get this working with the latest version of scikit-learn.

Thanks!
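For reference, the current pin and the relaxation being requested would look roughly like this in requirements.txt (the widened upper bound is only a suggestion and has not been verified against pyblaze's test suite):

scikit-learn>=0.23.0,<0.24.0   # current pin
scikit-learn>=0.23.0,<0.25.0   # proposed: allows 0.24.x, which ships ARM64 wheels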

restriction of using to_device in the engine

Documentation
Custom engines usually just require implementing train_batch, eval_batch, and predict_batch. However, this breaks down when using custom data types that are not compatible with the to_device implementation, since it relies on the item being moved having the attribute is_sparse and/or the method is_contiguous():

item = to_device(self.device, item)

if x.is_sparse or x.is_contiguous():

Possible Solution
The engine implements a method to_device that calls utils.torch.to_device as the default implementation. Engines for custom data types can then simply override engine.to_device to support those types; see the sketch below.
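A sketch of that proposal might look as follows. The hook name and its arguments, the import path of the library's to_device helper, and the custom batch type are all assumptions made for illustration.

import pyblaze.nn as xnn
from pyblaze.utils.torch import to_device  # import path assumed from utils.torch.to_device

class MyCustomBatch:
    """Hypothetical custom batch type, used only for illustration."""
    def __init__(self, tensors):
        self.tensors = tensors
    def move_to(self, device):
        return MyCustomBatch([t.to(device) for t in self.tensors])

class MySparseDataEngine(xnn.Engine):
    """Engine for a custom data type; hook name and signature below are assumed."""

    def to_device(self, device, item):
        # Handle the custom data type explicitly ...
        if isinstance(item, MyCustomBatch):
            return item.move_to(device)
        # ... and fall back to the library default for everything else.
        return to_device(device, item)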

Complication
_prediction_worker_func also uses utils.torch.to_device; together with its use in the default implementation of predict, this would require further adaptations.
