
dl_book's Introduction

dl_book github

https://tensorchiefs.github.io/dl_book/

Book

The book is available at https://www.manning.com/books/probabilistic-deep-learning-with-python

Note on the code

All examples in the book, except nb_06_05, are tested with version 2.0 of TensorFlow (TF) and version 0.8 of TensorFlow Probability (TFP). The notebooks nb_ch03_03 and nb_ch03_04, which describe the computation graph, are easier to understand in version 1 of TF; for these we include both versions. Notebook nb_06_05 only works with TF 1.x (it needs weights that are only provided for TF 1.0).
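To check that a local setup matches the tested versions, a quick sanity check along these lines helps (a minimal sketch; only the major/minor versions matter):

import tensorflow as tf
import tensorflow_probability as tfp

# The notebooks were tested with TF 2.0 and TFP 0.8
print("TF version: ", tf.__version__)   # expect 2.0.x
print("TFP version:", tfp.__version__)  # expect 0.8.x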

You can execute the notebooks in Google Colab or locally. Colab is great: you simply click a link and can play with the code in the cloud, with no installation beyond a browser. We definitely suggest you go this way. But TensorFlow is still moving fast, and we cannot guarantee that the code will still run in several years. We therefore provide a Docker container (https://github.com/oduerr/dl_book_docker/) in which all notebooks run, except nb_06_05 and the TF 1.0 versions of nb_ch03_03 and nb_ch03_04. This Docker container is also the way to go if you want to use the notebooks locally.

dl_book's People

Contributors

bsick, emurina, oduerr


dl_book's Issues

nb_ch05_02.ipynb

TFP Version 0.16.0
TF Version 2.8.0

AttributeError: 'Mixture' object has no attribute '_prob'

# Workaround: evaluate log_prob on a grid of count values 0..19, then exponentiate
probs = model_p(X_test).log_prob(np.arange(0, 20, 1).reshape(20, 1)).numpy()
probs = np.transpose(probs, [1, 0])  # -> shape (n_samples, 20)
probs = np.exp(probs)                # back from log-probabilities to probabilities
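Calling model_p(X_test).prob(...) directly is what raises the AttributeError: in TFP 0.16, Mixture appears to implement _log_prob but not _prob, so evaluating log_prob and exponentiating, as above, is a workable substitute.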

How to understand the plot of linear regression with Keras in dl_book/chapter_05/nb_ch05_02.ipynb

Recently I have been reading Probabilistic Deep Learning with Python, which is a really interesting book. I am quite new to deep learning, and after reading the code and results in nb_ch05_02.ipynb I have some questions about the following plot. Is it typical in this type of regression analysis for the training loss to be much larger than the validation loss, and can we trust a model with such training results?
I got a similar plot in my own research (the validation loss curve lies below the training curve, with a large gap), and I am not sure how to explain it or how to evaluate the model. I would greatly appreciate it if someone could resolve my puzzlement.

[figure: training and validation loss curves]

Some comments in the notebook nb_ch06_03.ipynb in German

Hello dear authors.

In the notebook nb_ch06_03.ipynb there are some comments in German.

It's not a big deal, but if it is possible to correct them, it would be good to maintain the standard.

Thanks.

Also, I think the book is excellent.

[screenshots of the German comments in the notebook]

Plot in nb_ch02_01.ipynb has incorrect legend

In cell #8 of nb_ch02_01.ipynb, in the plot that visualizes the data in a 2D feature space:
the legend sequence should be ("real", "faked"), since "real" banknotes are indicated by triangles and "faked" ones by circles.

Alternatively, the symbols in the plot could be swapped.
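A minimal sketch of the first fix, with dummy arrays standing in for the notebook's data (variable names are assumptions):

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical stand-ins for the notebook's feature arrays
x_real = np.random.randn(50, 2)
x_fake = np.random.randn(50, 2) + 2.0

plt.scatter(x_real[:, 0], x_real[:, 1], marker='^')  # real banknotes: triangles
plt.scatter(x_fake[:, 0], x_fake[:, 1], marker='o')  # faked banknotes: circles
plt.legend(("real", "faked"))  # legend order now matches the plotting order
plt.show()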

Colab Import Dependency Errors

When running the https://colab.research.google.com/github/tensorchiefs/dl_book/blob/master/chapter_05/nb_ch05_02.ipynb notebook in Colab, the 4th cell raises an import error:

ImportError                               Traceback (most recent call last)

<ipython-input-19-4f70da3f4e95> in <module>()
      2 import numpy as np
      3 from sklearn.model_selection import train_test_split
----> 4 import tensorflow_probability as tfp
      5 
      6 get_ipython().magic('matplotlib inline')


/usr/local/lib/python3.6/dist-packages/tensorflow_probability/__init__.py in <module>()
     73 
     74 # from tensorflow_probability.google import staging  # DisableOnExport
---> 75 from tensorflow_probability.python import *  # pylint: disable=wildcard-import
     76 from tensorflow_probability.python.version import __version__
     77 # pylint: enable=g-import-not-at-top

/usr/local/lib/python3.6/dist-packages/tensorflow_probability/python/__init__.py in <module>()
     22 from tensorflow_probability.python import debugging
     23 from tensorflow_probability.python import distributions
---> 24 from tensorflow_probability.python import experimental
     25 from tensorflow_probability.python import glm
     26 from tensorflow_probability.python import layers

/usr/local/lib/python3.6/dist-packages/tensorflow_probability/python/experimental/__init__.py in <module>()
     32 from __future__ import print_function
     33 
---> 34 from tensorflow_probability.python.experimental import auto_batching
     35 from tensorflow_probability.python.experimental import edward2
     36 from tensorflow_probability.python.experimental import mcmc

/usr/local/lib/python3.6/dist-packages/tensorflow_probability/python/experimental/auto_batching/__init__.py in <module>()
     22 from tensorflow_probability.python.experimental.auto_batching import allocation_strategy
     23 from tensorflow_probability.python.experimental.auto_batching import dsl
---> 24 from tensorflow_probability.python.experimental.auto_batching import frontend
     25 from tensorflow_probability.python.experimental.auto_batching import instructions
     26 from tensorflow_probability.python.experimental.auto_batching import liveness

/usr/local/lib/python3.6/dist-packages/tensorflow_probability/python/experimental/auto_batching/frontend.py in <module>()
     42 from tensorflow.python.autograph.converters import return_statements
     43 from tensorflow.python.autograph.core import converter
---> 44 from tensorflow.python.autograph.core import naming
     45 from tensorflow.python.autograph.pyct import anno
     46 from tensorflow.python.autograph.pyct import compiler

ImportError: cannot import name 'naming'



Typos and errors in nb_ch02_01.ipynb

  • linar --> linear
  • curtosis --> kurtosis
  • exactely --> exactly
  • performace --> performance
  • featues --> features
  • Here we use extract the two features and the label of the dataset --> # Here we extract the two features and the label of the dataset

  • seperation --> separation
  • seperate --> separate
  • training of the model using the training data stored in X and Y for 4100 epochs --> # training of the model using the training data stored in X and Y

  • this values can vary --> these values can vary
  • descriped --> described
  • Definition of the network with two hidden layers --> Definition of the network with two layers (note: there is only 1 hidden layer)
  • paramters --> parameters
  • alot --> a lot
  • leraning --> learning

advice on a regression problem similar to the one in chapter 8 section 3

Hi, first of all I would like to thank you for writing the book; I am reading it at the moment and I find it really interesting. I am a student learning how to use TensorFlow on a scientific data-augmentation problem.

I modified the code presented in chapter 8.3 (regression with MC dropout) to create a Bayesian neural network that recreates the trend of a function based on a set of points (x, y) and provides an estimate of the error associated with each prediction. The network works well and is quite accurate, but the problem is that, just like in the example in the book, I assumed that the measurements are independent of each other, when in fact my data have a covariance matrix with small but non-null terms outside the diagonal.

I was wondering whether there is a way in tensorflow_probability to modify the loss function to use a multivariate Gaussian while leaving the output layer unchanged (I don't want to predict a whole new covariance matrix, just the mean and the associated error estimate). Thanks for any advice.
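A minimal sketch of the kind of loss the question describes, assuming the off-diagonal correlation structure is known and fixed, and that the network still outputs only a per-point mean and standard deviation (all names are hypothetical, and this treats one full batch of correlated measurements as a single multivariate sample):

import tensorflow as tf
import tensorflow_probability as tfp
tfd = tfp.distributions

def make_mvn_nll(corr):
    # corr: fixed (n, n) correlation matrix of the n measurements (assumed known)
    corr = tf.constant(corr, dtype=tf.float32)
    def nll(y_true, y_pred):
        mu = y_pred[:, 0]     # predicted means, shape (n,)
        sigma = y_pred[:, 1]  # predicted standard deviations, shape (n,)
        # covariance = fixed correlation scaled by the predicted sigmas
        cov = corr * (sigma[:, None] * sigma[None, :])
        dist = tfd.MultivariateNormalTriL(
            loc=mu, scale_tril=tf.linalg.cholesky(cov))
        return -dist.log_prob(tf.reshape(y_true, [-1]))
    return nll

Note that this only makes sense if each batch contains the full set of correlated measurements in a fixed order; with shuffled mini-batches the fixed correlation matrix would no longer line up with the samples.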

nb_ch06_01.ipynb

TFP Version 0.14.0
TF Version 2.8.0

AttributeError: module 'tensorflow_probability.python.bijectors' has no attribute 'AffineScalar'

# Fix: the removed tfb.AffineScalar is replaced by tfb.Shift
logis = tfd.TransformedDistribution(distribution=tfd.Logistic(loc=1, scale=0.25),
                                    bijector=tfb.Shift(shift=-0.5))
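More generally, a bijector that both shifts and scales can, to our understanding, replace the removed tfb.AffineScalar as follows:

import tensorflow_probability as tfp
tfb = tfp.bijectors

# tfb.AffineScalar(shift=s, scale=m) computed m * x + s; tfb.Chain applies
# its bijectors right to left, so the Scale runs first, then the Shift.
bijector = tfb.Chain([tfb.Shift(shift=-0.5), tfb.Scale(scale=1.0)])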

nb_ch06_04.ipynb

TFP Version 0.16.0
TF Version 2.8.0

Model: "real_nvp"


Layer (type) Output Shape Param #

=================================================================
Total params: 0
Trainable params: 0
Non-trainable params: 0

self.net = tfb.real_nvp_default_template([h, h])
for i in range(num_blocks):
    net = tfb.real_nvp_default_template([h, h])  # BUG: created per block but never used
    bijectors.append(
        tfb.RealNVP(shift_and_log_scale_fn=self.net,  # every block shares self.net
                    num_masked=num_masked))  # E
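A sketch of the presumed fix, giving each block its own trainable network (num_blocks, h, and num_masked as in the notebook):

bijectors = []
for i in range(num_blocks):
    net = tfb.real_nvp_default_template([h, h])  # fresh template per block
    bijectors.append(
        tfb.RealNVP(shift_and_log_scale_fn=net,  # use the per-block network
                    num_masked=num_masked))

With a separate shift-and-log-scale network per block, the flow should report a non-zero number of trainable parameters.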

nb_ch05_01.ipynb

TFP Version 0.14.0
TF Version 2.8.0

model_mean_sd_5 = Model(inputs=inputs, outputs=dist.mean())

AttributeError: 'UserRegisteredTypeKerasTensor' object has no attribute 'mean'

Modified:

preds = model_sd_1(x_pred)   # call the fitted model eagerly on concrete data
mean = preds.mean()          # the returned distribution object exposes mean()
sigma = preds.stddev()       # ... and stddev()
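The underlying issue seems to be that newer TF versions turn functional-API outputs into symbolic KerasTensors that no longer carry the distribution's methods, so dist.mean() cannot be used directly as a model output; calling the fitted model on concrete data, as above, returns an actual TFP distribution whose mean() and stddev() work as expected.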
