muammar / ml4chem

ML4Chem: Machine Learning for Chemistry and Materials

Home Page: https://ml4chem.dev

License: Other

Python 100.00%
machine-learning chemistry physics materials-science kernel-methods deeplearning kernel

ml4chem's Introduction



About


ML4Chem is a package to deploy machine learning for chemistry and materials science. It is written in Python 3 and aims to offer modern, rich features for machine learning (ML) workflows in chemical physics.

A list of features and ML algorithms is shown below.

  • PyTorch backend.
  • Completely modular. You can use any part of this package in your project.
  • Free software <3. No secrets! Pull requests and additions are more than welcome!
  • Documentation (work in progress).
  • Explicit and idiomatic: ml4chem.get_me_a_coffee().
  • Distributed training in a data-parallel paradigm (mini-batches).
  • Scalability and distributed computations are powered by Dask.
  • Real-time tools to track the status of your computations.
  • Easy scaling up/down.
  • Easy access to intermediate quantities: NeuralNetwork.get_activations(X, numpy=True) or VAE.get_latent_space(X).
  • Messagepack serialization.

Notes

This package is under heavy development and may break at times until it stabilizes. It is still in its infancy, so if you find an error, please report it so that it can be fixed. Pull requests are also very welcome if you think any part of ML4Chem could be improved. That would be very nice.

Citing

If you find this software useful, please use this BibTeX entry to cite it:

@article{El_Khatib2020,
  author = "Muammar El Khatib and Wibe de Jong",
  title  = "{ML4Chem: A Machine Learning Package for Chemistry and Materials Science}",
  year   = "2020",
  month  = "3",
  url    = "https://chemrxiv.org/articles/ML4Chem_A_Machine_Learning_Package_for_Chemistry_and_Materials_Science/11952516",
  doi    = "10.26434/chemrxiv.11952516.v1"
}

Documentation

To get started, read the documentation at https://ml4chem.dev. It is arranged so that you can go through the theory as well as code snippets to understand how to use this software. Additionally, you can browse the module index for more information about the different classes and functions of ML4Chem. If you think the documentation needs improvement, do not hesitate to say so in a bug report and help out if you feel like it.

Visualizations

Copyright

License: BSD 3-clause "New" or "Revised" License.

ML4Chem: Machine Learning for Chemistry and Materials (ML4Chem) Copyright (c)
2019, The Regents of the University of California, through Lawrence Berkeley
National Laboratory (subject to receipt of any required approvals from the U.S.
Dept. of Energy).  All rights reserved.

If you have questions about your rights to use or distribute this software,
please contact Berkeley Lab's Intellectual Property Office at
[email protected].

NOTICE.  This Software was developed under funding from the U.S. Department
of Energy and the U.S. Government consequently retains certain rights.  As
such, the U.S. Government has been granted for itself and others acting on
its behalf a paid-up, nonexclusive, irrevocable, worldwide license in the
Software to reproduce, distribute copies to the public, prepare derivative
works, and perform publicly and display publicly, and to permit others to do
so.

ml4chem's People

Contributors

jacklyngee, muammar


ml4chem's Issues

Transfer Learning

  1. Load in pre-trained weights from a network trained on a large dataset
  2. Freeze all the weights in the lower (convolutional) layers: the layers to freeze are chosen depending on the similarity of the new task to the original dataset.
  3. Replace the upper layers of the network with a custom classifier: the number of outputs must be set equal to the number of classes.
  4. Train only the custom classifier layers for the new task, thereby optimizing the model for the smaller dataset.

Loading in a pre-trained model in PyTorch is simple:

from torchvision import models

# Load VGG16 with ImageNet pre-trained weights.
model = models.vgg16(pretrained=True)

Source: https://towardsdatascience.com/transfer-learning-with-convolutional-neural-networks-in-pytorch-dd09190245ce

I think the pretrained keyword argument is specific to the vgg16 model, but this should not be complicated to implement; a rough sketch of the full recipe is given below.
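As a rough sketch of the four steps above, assuming a torchvision VGG16 backbone and a placeholder num_classes for the new task (neither of which is part of ML4Chem), the freezing and classifier replacement could look like this:

import torch
import torch.nn as nn
from torchvision import models

num_classes = 10  # placeholder: number of classes in the new, smaller dataset

# 1. Load pre-trained weights.
model = models.vgg16(pretrained=True)

# 2. Freeze the convolutional feature extractor.
for param in model.features.parameters():
    param.requires_grad = False

# 3. Replace the last classifier layer so it outputs num_classes values.
in_features = model.classifier[-1].in_features
model.classifier[-1] = nn.Linear(in_features, num_classes)

# 4. Train only the parameters that still require gradients.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-4)

Only the replaced classifier layers receive gradient updates, which is what makes training feasible on a small dataset.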

Create a custom loss function to train energies

The reason is that we only have the energy of an image, while within the atom-centered model we have to guess atomic energies that, when summed, return the target (in this case the total energy). The same loss function can then be adjusted to train against forces if desired.
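A minimal sketch of such a loss, assuming the model returns a flat tensor of per-atom energy guesses and that the number of atoms per image is known (all names below are placeholders, not the actual ML4Chem API):

import torch

def atomic_energy_loss(atomic_energies, total_energies, atoms_per_image):
    """Sum per-atom energy guesses into per-image totals and compare to targets.

    atomic_energies: 1-D tensor with one energy guess per atom, all images concatenated.
    total_energies: 1-D tensor with one target total energy per image.
    atoms_per_image: list with the number of atoms in each image.
    """
    per_image = [chunk.sum() for chunk in torch.split(atomic_energies, atoms_per_image)]
    predicted_totals = torch.stack(per_image)
    return torch.nn.functional.mse_loss(predicted_totals, total_energies)

# Example: two images with 3 and 2 atoms, respectively.
atomic = torch.randn(5, requires_grad=True)
loss = atomic_energy_loss(atomic, torch.tensor([-10.0, -6.0]), [3, 2])
loss.backward()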

Add Kernel Ridge Regression

I already did this before. I have to add this model and modify the data class so that the code is not slow. This is related to issue #2.

  • Build kernel matrix per atom.
  • Train model with Cholesky (a sketch of this solve is given after this list).
  • Compute just triangular kernel matrix to save computational time.
  • Logging different parts of the tasks.
  • Save/load parameters to file.
  • Predictions.
  • Clean up the code.
  • Parallelization.
  • Make it possible to do a calc.get_potential_energy(atoms) in memory.
  • Support sigma per atom.
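A bare-bones sketch of the Cholesky-based training step in plain NumPy/SciPy, independent of the eventual ML4Chem implementation; the Gaussian kernel and the sigma and regularization defaults here are assumptions:

import numpy as np
from scipy.linalg import cho_factor, cho_solve

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between feature sets X and Y."""
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-sq_dists / (2.0 * sigma**2))

def train_krr(X, y, sigma=1.0, lamda=1e-8):
    """Solve (K + lamda * I) alpha = y with a Cholesky factorization."""
    K = rbf_kernel(X, X, sigma=sigma)
    K[np.diag_indices_from(K)] += lamda
    factor = cho_factor(K, lower=True)
    return cho_solve(factor, y)

def predict_krr(X_train, X_new, alpha, sigma=1.0):
    """Predictions are kernel-weighted sums of the training coefficients."""
    return rbf_kernel(X_new, X_train, sigma=sigma) @ alpha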

Create some rule to initialize models

The biases are not correctly initialized. I should create some kind of trick to have them start at values that make sense for the problem in question; one possibility is sketched below.
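One common trick, sketched here under the assumption that the network's output layer is a plain torch.nn.Linear, is to start the output bias at the mean target value (for example the mean atomic energy) so that initial predictions are already in the right range:

import torch
import torch.nn as nn

# Placeholder: the mean target energy per atom would come from the training data.
mean_atomic_energy = -3.7

output_layer = nn.Linear(10, 1)  # stand-in for the network's last layer
with torch.no_grad():
    output_layer.bias.fill_(mean_atomic_energy)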

Add ModelMerger class

  • Model merger.
  • Loss function: weighted sum vs. independent losses.
  • Printing of training evolution.
  • Save model (I think code already available in ml4chem could be used for that).
  • Set convergence stop mechanism.
  • Parallelization.
  • Add more compatibility with different available components.
  • Check for consistency of the class (independent vs dependent loss).

That class would help to merge models; a rough sketch of the weighted-sum loss idea is shown below.

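A toy sketch of the weighted-sum versus independent loss choice, with placeholder sub-models and MSE losses (none of these names are the actual ML4Chem API):

import torch.nn as nn

class ModelMergerSketch(nn.Module):
    """Toy merger: run two sub-models and combine their losses."""

    def __init__(self, model_a, model_b, weights=(0.5, 0.5)):
        super().__init__()
        self.model_a = model_a
        self.model_b = model_b
        self.weights = weights

    def forward(self, inputs):
        return self.model_a(inputs), self.model_b(inputs)

    def loss(self, outputs, targets, independent=False):
        loss_a = nn.functional.mse_loss(outputs[0], targets[0])
        loss_b = nn.functional.mse_loss(outputs[1], targets[1])
        if independent:
            # Keep the losses separate so each sub-model can be optimized on its own.
            return loss_a, loss_b
        # Weighted sum: a single scalar objective for joint training.
        return self.weights[0] * loss_a + self.weights[1] * loss_b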

Make the Data class universal

Data structures are quite important in machine learning, and in fact they can become a bottleneck because one has to iterate over the data points.

The Data class in mlchemistry, at least I hope, has to be written in a way that can interoperate with other backends, and it has to be designed to build data for more than neural networks (e.g. support vector machines). A minimal sketch of such an interface is shown below.
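Purely as an illustration of that interoperability goal (this is not the existing Data class), a backend-agnostic container could expose the same inputs and targets to both kernel methods and the PyTorch backend:

import numpy as np
import torch

class UniversalData:
    """Illustrative container: hold inputs and targets once, expose them to any backend."""

    def __init__(self, images, targets):
        self.images = images    # e.g. a list of ASE Atoms objects
        self.targets = targets  # e.g. one total energy per image

    def to_numpy(self):
        """Targets as a NumPy array, for kernel methods such as KRR or SVMs."""
        return self.images, np.asarray(self.targets, dtype=float)

    def to_torch(self):
        """Targets as a tensor, for the PyTorch neural-network backend."""
        return self.images, torch.tensor(self.targets, dtype=torch.float64)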
