jttoivon / data-analysis-with-python-spring-2019
Sources of materials for the course Data Analysis with Python - Spring 2019. A newer course instance is available at https://csmastersuh.github.io/data_analysis_with_python_2020/

Home Page: https://jttoivon.github.io/data-analysis-with-python-spring-2019/

Makefile 0.13% Python 3.18% Jupyter Notebook 96.69%

data-analysis-with-python-spring-2019's Introduction

Data analysis with Python - Spring 2019

Note

A newer course instance is available from Data analysis with Python - 2020.

Authors

Materials created by Jarkko Toivonen.

License

The course material is licensed under a Creative Commons BY-NC-SA 4.0 license.

Usage

Building html pages locally

The libraries needed in these Jupyter notebooks are listed in the file requirements.txt. In addition, the following libraries are needed to compile the *.ipynb and *.rst files to html pages (a configuration sketch follows the list):

  • nbsphinx (conda install -c conda-forge nbsphinx)
  • sphinx_rtd_theme (conda install sphinx_rtd_theme)
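For orientation, below is a minimal sketch of the Sphinx conf.py settings that these extra libraries plug into. The repository ships its own conf.py; the project name and exclude patterns here are illustrative assumptions, not its actual values.

# conf.py -- minimal sketch, not the repository's actual configuration.
# Enable nbsphinx so Sphinx can render the *.ipynb notebooks, and use the
# Read the Docs theme provided by sphinx_rtd_theme.
extensions = [
    "nbsphinx",
]

html_theme = "sphinx_rtd_theme"

# Illustrative assumptions -- the real conf.py defines its own values.
project = "Data analysis with Python - Spring 2019"
exclude_patterns = ["_build", "**.ipynb_checkpoints"]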

Then you can compile the html pages locally with:

make html

The html pages will be stored under the _build/html/ folder.

Automatic deployment of html pages to GitHub Pages by Travis

Create a new branch named gh-pages in your repository. This is where the html pages will be stored; they will be visible at https://<account>.github.io/data-analysis-with-python-spring-2019/

GitHub can be instructed to notify Travis CI every time something is pushed to the git repository. Travis will then pull the most recent versions of notebooks from GitHub, convert them to html, and then push them to the gh-pages branch of your GitHub repository. They will then be visible at github.io.

To set up this automation, follow these instructions (a sketch of the resulting Travis configuration is shown after them):

In GitHub, choose Settings -> Developer settings -> Personal access tokens, generate an access token, and copy it.

In Travis CI (travis-ci.com), select the correct repository and add an environment variable with the key GITHUB_TOKEN and, as its value, the secret token you got from GitHub. This allows Travis to push the html pages to the gh-pages branch of the GitHub repository.

You may at some point need to install the Travis CI application on GitHub.
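For orientation, the steps above typically come together in a .travis.yml roughly like the sketch below. This is an assumption about the shape of such a file, not the repository's actual configuration; the Python version, install commands, and branch names are illustrative. Note also that nbsphinx generally needs pandoc available on the build machine to convert Markdown cells.

# .travis.yml -- illustrative sketch, not the repository's actual file
language: python
python:
  - "3.6"                        # assumed Python version
install:
  - pip install -r requirements.txt
  - pip install nbsphinx sphinx_rtd_theme
script:
  - make html                    # builds the pages into _build/html/
deploy:
  provider: pages                # Travis CI's GitHub Pages deployment
  skip_cleanup: true
  github_token: $GITHUB_TOKEN    # the environment variable set in Travis CI
  local_dir: _build/html
  target_branch: gh-pages
  on:
    branch: master               # assumed default branch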

data-analysis-with-python-spring-2019's People

Contributors

jttoivon, luupanu, neodyymi


data-analysis-with-python-spring-2019's Issues

error running test with 'tmc test'

Hi,
I just started the data-analysis-with-python-spring-2019/2020 course.
For the first three exercises (hello world, compliment, multiplication) I could run the tests and submit.
But for the next three exercises I get an error, for example:
"src.two_dice.main does not exist!"

What could be the reason?
Note: I have tested my code in Jupyter/Python Tutor and it works fine.

Dinesh
(Screenshot attached.)
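This error usually means that the server-side checker imports src/two_dice.py and looks for a function named main in it, so the solution has to live in that file (not only in a notebook) and define main. Below is a minimal sketch of the expected layout, assuming the template follows the same pattern as the earlier exercises; the dice logic is only a placeholder.

# src/two_dice.py -- sketch of the structure the tests look for; the
# exercise logic below is only a placeholder assumption
def main():
    # Placeholder: print all pairs of two dice whose sum is 5.
    for i in range(1, 7):
        for j in range(1, 7):
            if i + j == 5:
                print((i, j))

if __name__ == "__main__":
    main()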

part04-e11, tmc test passes, tmc submit fails

The results are the same both in Windows cmd and in WSL.

>tmc test  part04-e11_below_zero
Testing: part04-e11_below_zero
Test results: 3/3 tests passed
100%
All tests passed! Submit to server with 'tmc submit'

>tmc submit part04-e11_below_zero
Submitting: part04-e11_below_zero
Failed: test.test_below_zero.BelowZero.test_called
        [Errno 2] File b'src\\kumpula-weather-2017.csv' does not exist: b'src\\kumpula-weather-2017.csv'
Failed: test.test_below_zero.BelowZero.test_output
        [Errno 2] File b'src\\kumpula-weather-2017.csv' does not exist: b'src\\kumpula-weather-2017.csv'
Failed: test.test_below_zero.BelowZero.test_value
        [Errno 2] File b'src\\kumpula-weather-2017.csv' does not exist: b'src\\kumpula-weather-2017.csv'
Test results: 0/3 tests passed
  0%
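The pattern above (tests pass locally on Windows, while the server reports that src\kumpula-weather-2017.csv does not exist) points to a hard-coded Windows-style path: backslashes and a path relative to the current working directory resolve on the local machine but not on the Linux test server. A common workaround is sketched below; it assumes the CSV sits next to the solution file in src/ and has an "Air temperature (degC)" column, which are assumptions about the exercise, not confirmed details.

# src/below_zero.py -- sketch of platform-independent path handling; the
# column name, return value, and printed text are assumptions
import os
import pandas as pd

def below_zero():
    # Resolve the CSV relative to this file rather than the current working
    # directory, and let os.path.join choose the path separator.
    path = os.path.join(os.path.dirname(os.path.abspath(__file__)),
                        "kumpula-weather-2017.csv")
    df = pd.read_csv(path)
    # Count the days whose temperature is below zero (assumed task).
    return int((df["Air temperature (degC)"] < 0).sum())

def main():
    print("Number of days below zero:", below_zero())

if __name__ == "__main__":
    main()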

part04-e16_split_date; submit: ImportError: No module named 'pandas'

I'm on Windows; until now, submits have been successful.

>tmc test part04-e16_split_date
Testing: part04-e16_split_date
Test results: 5/5 tests passed
100%
All tests passed! Submit to server with 'tmc submit'

>tmc submit part04-e16_split_date
Submitting: part04-e16_split_date
Failed: unittest.loader._FailedTest.test.test_split_date
        Failed to import test module: test.test_split_date
Traceback (most recent call last):
  File "/usr/lib/python3.5/unittest/loader.py", line 428, in _find_test_path
    module = self._get_module_from_name(name)
  File "/usr/lib/python3.5/unittest/loader.py", line 369, in _get_module_from_name
    __import__(name)
  File "/tmc/test/test_split_date.py", line 6, in <module>
    import pandas as pd
ImportError: No module named 'pandas'


Test results: 0/1 tests passed
  0%
