
DeepHyper: Scalable Asynchronous Neural Architecture and Hyperparameter Search for Deep Neural Networks


What is DeepHyper?

DeepHyper is an automated machine learning (AutoML) package for deep neural networks. It comprises two components: 1) neural architecture search, an approach for automatically finding high-performing deep neural network architectures, and 2) hyperparameter search, an approach for automatically finding high-performing hyperparameters for a given deep neural network. DeepHyper provides an infrastructure that targets experimental research in neural architecture and hyperparameter search methods, scalability, and portability across HPC systems. It comprises three modules: benchmarks, a collection of extensible and diverse benchmark problems; search, a set of search algorithms for neural architecture search and hyperparameter search; and evaluators, a common interface for evaluating hyperparameter configurations on HPC platforms.
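To make the evaluator interface concrete, here is a minimal sketch of the kind of run function an evaluator calls for each candidate configuration. It is modeled loosely on the bundled polynome2 benchmark used in the Quickstart below; the hyperparameter name x is purely illustrative, not part of the API.

# Sketch of the callable an evaluator invokes for each candidate
# configuration (modeled loosely on the polynome2 benchmark).
def run(config: dict) -> float:
    # DeepHyper passes a dict mapping hyperparameter names to the
    # values chosen by the search and maximizes the returned scalar.
    x = config["x"]   # "x" is an illustrative hyperparameter name
    return -x ** 2    # objective is maximized, so the optimum is at x = 0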

Documentation

DeepHyper documentation is on ReadTheDocs: https://deephyper.readthedocs.io

Install instructions

From pip:

pip install deephyper

From GitHub:

git clone https://github.com/deephyper/deephyper.git
cd deephyper/
pip install -e .

If you want to install DeepHyper with the test and documentation packages:

# From PyPI
pip install 'deephyper[tests,docs]'

# From GitHub
git clone https://github.com/deephyper/deephyper.git
cd deephyper/
pip install -e '.[tests,docs]'

Directory structure

benchmark/
    A set of problems for hyperparameter or neural architecture search that users can run to compare the different search algorithms, or use as examples for building their own problems (a sketch of such a problem follows this listing).
evaluator/
    A set of objects that help run searches on different systems and for different use cases, such as quick, lightweight experiments or long, heavy runs.
search/
    A set of algorithms for hyperparameter and neural architecture search, together with a modular way to define new search algorithms and dedicated submodules for each kind of search.
    hps/
        Hyperparameter search applications.
    nas/
        Neural architecture search applications.
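
As a rough illustration of how a benchmark problem declares its search space, the sketch below uses DeepHyper's HpProblem class to define a single continuous hyperparameter. The exact import path and method names vary between releases (older versions expose add_dim, newer ones add_hyperparameter), so treat this as a sketch and refer to the bundled deephyper.benchmark.hps.polynome2 module for an authoritative example.

# Sketch of a problem definition; API details (import path, method
# names) vary across DeepHyper releases, so check your version's docs.
from deephyper.problem import HpProblem  # older releases: deephyper.benchmark

Problem = HpProblem()
Problem.add_hyperparameter((-10.0, 10.0), "x")  # continuous range for "x"

if __name__ == "__main__":
    print(Problem)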

How do I learn more?

Quickstart

Hyperparameter Search (HPS)

deephyper hps ambs --evaluator ray --problem deephyper.benchmark.hps.polynome2.Problem --run deephyper.benchmark.hps.polynome2.run --n-jobs 1
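
Here --problem and --run are dotted Python import paths: the first resolves to a problem instance such as the Problem object sketched in the directory overview above, and the second to a run callable like the one sketched earlier. A user-defined module can therefore be substituted for the bundled polynome2 benchmark by pointing both flags at its own import path. The --evaluator ray option dispatches the evaluations through the Ray backend.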

Neural Architecture Search (NAS)

deephyper nas ambs --evaluator ray --problem deephyper.benchmark.nas.polynome2Reg.Problem --n-jobs 1

Who is responsible?

Currently, the core DeepHyper team is at Argonne National Laboratory:

Modules, patches (code, documentation, etc.) contributed by:

Citing DeepHyper

If you are referencing DeepHyper in a publication, please cite the following papers:

  • P. Balaprakash, M. Salim, T. Uram, V. Vishwanath, and S. M. Wild. DeepHyper: Asynchronous Hyperparameter Search for Deep Neural Networks. In 25th IEEE International Conference on High Performance Computing, Data, and Analytics. IEEE, 2018.

  • P. Balaprakash, R. Egele, M. Salim, S. Wild, V. Vishwanath, F. Xia, T. Brettin, and R. Stevens. Scalable Reinforcement-Learning-Based Neural Architecture Search for Cancer Deep Learning Research. In SC '19: IEEE/ACM International Conference on High Performance Computing, Networking, Storage and Analysis, 2019.

How can I participate?

Questions, comments, feature requests, bug reports, etc. can be directed to:

Patches to the software itself as well as to the documentation are much appreciated. Optionally, include a credit for yourself in the list above as part of your first patch.

The DeepHyper team uses git-flow to organize development (see the Git-Flow cheatsheet). Tests are written with Pytest.

Acknowledgements

  • Scalable Data-Efficient Learning for Scientific Domains, U.S. Department of Energy 2018 Early Career Award funded by the Advanced Scientific Computing Research program within the DOE Office of Science (2018--Present)
  • Argonne Leadership Computing Facility: This research used resources of the Argonne Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.
  • SLIK-D: Scalable Machine Learning Infrastructures for Knowledge Discovery, Argonne Computing, Environment and Life Sciences (CELS) Laboratory Directed Research and Development (LDRD) Program (2016--2018)

Copyright and license

Copyright © 2019, UChicago Argonne, LLC

DeepHyper is distributed under the terms of the BSD License. See LICENSE.

Argonne Patent & Intellectual Property File Number: SF-19-007
