
This project forked from vitruvianscience/opendeep


Modular & extensible deep learning framework built on Theano.

Home Page: http://www.opendeep.org

License: Apache License 2.0

Language: Python (100.00%)


OpenDeep

OpenDeep: a fully modular & extensible deep learning framework

Developer hub: http://www.opendeep.org/

OpenDeep is a deep learning framework for Python, built from the ground up in Theano, with a focus on flexibility and ease of use for both industry data scientists and cutting-edge researchers. Its modular, easily extensible design lets you construct nearly any neural network architecture to solve your problem.

Use OpenDeep to:

  • Quickly prototype complex networks using fully modular parts and Torch-style containers.
  • Configure and train existing state-of-the-art models.
  • Write your own models from scratch in Theano and plug into OpenDeep for easy training and dataset integration.
  • Use visualization and debugging tools to see exactly what is happening with your neural net architecture.
  • Plug into your existing Numpy/Scipy/Pandas/Scikit-learn pipeline.
  • Run on the CPU or GPU.

This library is undergoing rapid development and is currently in its alpha stage.

Quick example usage

Train and evaluate a multilayer perceptron (MLP), the generic feedforward neural network for classification, on the MNIST handwritten digit dataset:

from opendeep.models.container import Prototype
from opendeep.models.single_layer.basic import BasicLayer, SoftmaxLayer
from opendeep.optimization.adadelta import AdaDelta
from opendeep.data.standard_datasets.image.mnist import MNIST, datasets

# stack layers in a Prototype container: 784 -> 512 -> 512 -> 10
mlp = Prototype()
mlp.add(BasicLayer(input_size=28*28, output_size=512, activation='rectifier', noise='dropout'))
mlp.add(BasicLayer(output_size=512, activation='rectifier', noise='dropout'))
mlp.add(SoftmaxLayer(output_size=10))

# train on MNIST with the AdaDelta optimizer for 10 epochs
data = MNIST()
trainer = AdaDelta(model=mlp, dataset=data, n_epoch=10)
trainer.train()

# evaluate the trained model on the held-out test set
test_data, test_labels = data.getSubset(datasets.TEST)
predictions = mlp.run(test_data.eval())

labels = test_labels.eval()
print("Accuracy:", float(sum(predictions == labels)) / len(labels))
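The accuracy line above works by comparing the predicted and true labels elementwise. A minimal sketch with plain NumPy and hypothetical label values:

```python
import numpy as np

# hypothetical predictions and true labels for five test digits
predictions = np.array([7, 2, 1, 0, 4])
labels = np.array([7, 2, 1, 0, 9])

# elementwise comparison yields a boolean array; the fraction of True
# entries is the classification accuracy (4 of 5 correct here)
accuracy = float(np.sum(predictions == labels)) / len(labels)
print(accuracy)  # 0.8
```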

Installation

Because OpenDeep is still in alpha, you have to install via setup.py. Also, please make sure you have these dependencies installed first.

Dependencies

Install from source

  1. Navigate to your desired installation directory and clone the GitHub repository:

    git clone https://github.com/vitruvianscience/opendeep.git
    
  2. Navigate into the top-level folder (named opendeep, containing the file setup.py) and run setup.py in develop mode:

    cd opendeep
    python setup.py develop
    

Using python setup.py develop instead of the normal python setup.py install lets you update the package simply by pulling the latest files from git: no reinstall is needed when the repository changes.

That's it! Now you should be able to import opendeep into python modules.

Quick Start

To get up to speed on deep learning, check out the Deep Learning 101 blog post. You can also work through the tutorials on OpenDeep's documentation site: http://www.opendeep.org/

Let's say you want to train a Denoising Autoencoder on the MNIST handwritten digit dataset. You can get started in just a few lines of code:

# standard libraries
import logging
# third-party imports
from opendeep.log.logger import config_root_logger
import opendeep.data.dataset as datasets
from opendeep.data.standard_datasets.image.mnist import MNIST
from opendeep.models.single_layer.autoencoder import DenoisingAutoencoder
from opendeep.optimization.adadelta import AdaDelta

# grab the logger to record our progress
log = logging.getLogger(__name__)
# set up the logging to display to stdout and to files.
config_root_logger()
log.info("Creating a new Denoising Autoencoder")

# create the MNIST dataset
mnist = MNIST()

# define some model configuration parameters (this could have come from json!)
config = {
    "input_size": 28*28, # dimensions of the MNIST images
    "hidden_size": 1500  # number of hidden units - generally bigger than input size
}
# create the denoising autoencoder
dae = DenoisingAutoencoder(**config)

# create the optimizer to train the denoising autoencoder
# AdaDelta is normally a good generic optimizer
optimizer = AdaDelta(dae, mnist)
optimizer.train()

# test the trained model and save some reconstruction images
n_examples = 100
# grab 100 test examples
test_xs, _ = mnist.getSubset(datasets.TEST)
test_xs = test_xs[:n_examples].eval()
# test and save the images
dae.create_reconstruction_image(test_xs)
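The comment on the config dict above notes that the model configuration could have come from JSON. A minimal sketch using only the standard library, with a JSON string whose keys mirror the dict in the example:

```python
import json

# the same configuration as the inline dict, serialized as JSON
# (in practice this could be read from a config.json file instead)
config_text = '{"input_size": 784, "hidden_size": 1500}'
config = json.loads(config_text)
print(config["input_size"], config["hidden_size"])  # 784 1500
```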

Congrats, you just:

  • set up a dataset (MNIST)
  • instantiated a denoising autoencoder model with some configurations
  • trained it with an AdaDelta optimizer
  • and predicted some outputs given inputs (and saved them as an image)!
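For intuition about what the denoising autoencoder in the steps above computes, here is a minimal NumPy sketch of its forward pass. This is an illustration of the technique with made-up weights, not OpenDeep's implementation; the 30% corruption level and tied weights are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# dimensions matching the example: 784 visible units, 1500 hidden units
input_size, hidden_size = 784, 1500
W = rng.normal(0.0, 0.01, size=(input_size, hidden_size))  # weight matrix
b_h = np.zeros(hidden_size)  # hidden bias
b_v = np.zeros(input_size)   # visible bias

x = rng.random(input_size)                    # a (fake) clean input image
x_tilde = x * (rng.random(input_size) > 0.3)  # corrupt: zero ~30% of pixels
h = sigmoid(x_tilde @ W + b_h)                # encode the corrupted input
x_hat = sigmoid(h @ W.T + b_v)                # decode with tied weights

# training would minimize reconstruction error between x_hat and the
# ORIGINAL clean x, forcing the model to learn to undo the corruption
print(x_hat.shape)  # (784,)
```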

Working example!

More Information

Source code: https://github.com/vitruvianscience/opendeep

Documentation and tutorials: http://www.opendeep.org/

User group: opendeep-users

Developer group: opendeep-dev

Twitter: @opendeep

We would love all help to make this the best library possible! Feel free to fork the repository and join the Google groups!

Why OpenDeep?

  • Modularity. Much recent deep learning progress has come from combining multiple models. Existing libraries are often too confusing, or not extensible enough, to support both novel research and quick deployment of existing algorithms at scale. This need for transparency and modularity is the main motivation behind OpenDeep, which aims to make both novel research and industry use easy to implement.
  • Ease of use. Many libraries require deep familiarity with machine learning or with their specific package structure. OpenDeep's goal is to be the best-documented deep learning library, with defaults smart enough that someone without a background can start training models, while experienced practitioners can easily create and customize their own algorithms.
  • State of the art. As a side effect of modularity and ease of use, OpenDeep aims to maintain state-of-the-art performance as new algorithms and papers are published. As a research library, citing and crediting the authors of the algorithms and code it uses is very important.

opendeep's People

Contributors: mbeissinger, vitruvianscience, jimmycallin, erogol, gregretkowski

Watchers: James Cloos, Chris
