Deep Learning Koans

A collection of Deep Learning Koans written for Flux.jl and Julia Programming language.

Colab Link

Run these koans on Colab

What is a Koan?

A koan is a short puzzle designed to build understanding through its completion. Collections of koans are a format of interactive programming exercises, designed to be run locally, that teach through a series of challenges of increasing difficulty. That is exactly what this project is!
Thus, a koan critically consists of three components:

  • A text introduction to a concept and problem
  • Section of code that applies the concept
  • A test that is not passed until the concept is applied correctly

This format borrows a lot from the pedagogical approach of Daniel P. Friedman and his book, The Little Schemer.
A list of koans projects for a variety of platforms and languages can be found here on github.
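As a hypothetical illustration (not an actual koan from this collection), a koan combining these three components might look like the following, where the learner edits the marked line until the test passes:

```julia
using Test

# Concept: Julia's dot syntax broadcasts a function or operator elementwise
# over a collection.
# The learner replaces `missing` with an expression until the test passes.
answer = [1, 2, 3] .+ 1   # originally presented as: answer = missing

@test answer == [2, 3, 4]
```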

So what's this project trying to do?

This project attempts to take the Koans approach to teaching programming, and apply that to the subject matter of deep learning in Julia.
To do this, we will focus first on giving a skippable overview of Julia, then move on to demonstrating the library Flux.jl.
Technically, this project is implemented as a series of IJulia notebooks that you generate from source code, host locally with IJulia (the extension that runs Julia in a Jupyter notebook), and then interactively modify until the tests pass. Links and additional resources are scattered throughout the material.

Why Julia?

First, Julia offers a couple of advantages over Python and R that make it superior for some projects: its gradual typing improves the developer experience and code reliability, and its compilation to LLVM makes Julia code faster than R/MATLAB/Python. However, Julia is young, and it lacks the depth and breadth of machine learning and web development packages that make Python a safe production choice, as well as the depth of statistical packages and academic work that makes R so revered.
Nonetheless, Julia's LLVM compilation offers a huge advantage for deep learning in research and practice: we can program in a single language, instead of calling into an underlying C++ library as TensorFlow and PyTorch do. This requires a language that supports differentiable programming, which is currently being studied and integrated into Julia through the Zygote project.
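As a sketch of what differentiable programming looks like in practice (assuming the Zygote package is installed; the exact API may vary across versions):

```julia
using Zygote   # source-to-source automatic differentiation for Julia

# Differentiate an ordinary Julia function; no separate graph-building
# API is required, unlike the TensorFlow 1.x style.
f(x) = 3x^2 + 2x
df = gradient(f, 2.0)   # f'(x) = 6x + 2, so f'(2.0) = 14.0
```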

Why Flux?

Flux is a Julia-only neural network library with basic differentiable programming capability, a library of optimizers, layers, and helper functions that facilitate deep learning, and first-class support for GPU allocation. Solutions like Julia/Flux, which use a single language compiled to fast LLVM bytecode and leverage GPUs, are a promising direction for deep learning: they are conceptually simpler than current solutions like PyTorch/TensorFlow while offering the same performance. However, as you'll see very soon through the koans, the API maturity is not quite there yet!
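For a taste of what this looks like, here is a minimal Flux model sketch (Flux's layer constructors have changed across versions, so treat this as illustrative and consult the documentation for your installed release):

```julia
using Flux

# A two-layer fully connected network:
# 10 inputs -> 5 hidden units (relu) -> 2 outputs.
model = Chain(Dense(10, 5, relu), Dense(5, 2))

x = rand(Float32, 10)   # a single dummy input vector
y = model(x)            # forward pass; y has length 2
```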

Chapters/Contents

The chapter outline is as follows:

Attribution Note

Many examples of these koans are borrowed and modified from sources in the Flux source code, or examples in the Flux Documentation. The Flux source code can be found here: Flux.jl.

How to Run

To run this project, you will need Julia installed locally, which is available here. Additionally, you will need NVIDIA drivers installed on your machine for the GPU-based examples in Chapter 7.
Clone the repository, or download it directly from GitHub.

git clone https://github.com/adamwespiser/deep-learning-koans.git

Run julia in the deep-learning-koans directory

$ cd deep-learning-koans/
$ julia --project='.'

Instantiate the environment, which will download all the needed packages

julia> using Pkg; Pkg.instantiate();

Build the notebooks

julia> include("deps/build.jl")

Open the Julia Notebooks

julia> using IJulia
julia> notebook(dir=".")    

Dev

The project is built using the script found in deps/build.jl.

Adding a package to Project.toml, Manifest.toml

For pkg X, we can get the UUID with the following:

julia> Pkg.METADATA_compatible_uuid("X")    

Then, add an entry to [deps] in Project.toml.
Finally, we will want to re-populate the Manifest.toml, which we can do as follows:

$ julia --project="."    
julia> Pkg.resolve()    
julia> Pkg.instantiate()    
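Alternatively, on recent Julia versions `Pkg.add` performs all of these steps in one call, recording the package's UUID in Project.toml and resolving Manifest.toml automatically (`"Example"` below is just a stand-in package name, not a dependency of this project):

```julia
using Pkg
Pkg.activate(".")    # work inside this project's environment
Pkg.add("Example")   # adds an entry to [deps] in Project.toml
                     # and updates Manifest.toml in one step
```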

Test

Run all the notebooks at once.

$ julia --project="." test/run_koan_source.jl
