
pyinform's People

Contributors

colemathis, dglmoore, jakehanson


pyinform's Issues

Provide Convenient Imports

At present the pyinform module provides no functions on import. Only submodules are provided by pyinform. It would be great if from pyinform import * brought in all available functions.

But what about name conflicts?

As more functions are added and naming conflicts begin to arise, we can revisit this decision. Until then forward ho!
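
A minimal sketch of what the convenient imports could look like in pyinform/__init__.py, assuming the submodules listed below (this is an illustration, not the actual file):

# Hypothetical re-exports in pyinform/__init__.py (a sketch)
from pyinform.activeinfo import active_info
from pyinform.blockentropy import block_entropy
from pyinform.entropyrate import entropy_rate
from pyinform.mutualinfo import mutual_info
from pyinform.transferentropy import transfer_entropy

__all__ = ["active_info", "block_entropy", "entropy_rate", "mutual_info", "transfer_entropy"]

With re-exports like these, from pyinform import * would bring the listed functions into scope, and future name conflicts could be managed by curating __all__.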

Segmentation fault while using the Transfer Entropy

Hello, 

I am using two time series to compute transfer entropy with the line below.

pyinform.transferentropy.transfer_entropy(x_timeseries,y_timeseries,1)

But I get a segmentation fault on Linux (and a tcmalloc error in Colab) when I interchange the time series (y to x) using the line below.

pyinform.transferentropy.transfer_entropy(y_timeseries,x_timeseries,1)

The interesting part is that I get this error only under one condition.

To compute y_timeseries, I need to read nearly 100 GB of data (different .npz files of 3D data: time, lat, lon). When this is done in the same script, there is no error. But when I save the prepared y_timeseries to a separate .npz file (about 500 MB) and read that file back, I get the error above. I do not understand why reading such a small file causes an error when reading the much larger data sets does not.

Could someone please help me with this?

Thanks 
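
A diagnostic sketch (not a confirmed fix) for narrowing this down: after reloading, check that the series has the expected shape, dtype, and value range, and pass it to transfer_entropy as a contiguous, non-negative integer array. The file and variable names below are placeholders, not from the issue.

import numpy as np
from pyinform.transferentropy import transfer_entropy

# Placeholder file/array names standing in for the user's data.
with np.load("y_timeseries.npz") as data:
    y_timeseries = np.asarray(data[data.files[0]])
x_timeseries = np.load("x_timeseries.npy")

# Inspect what was actually reloaded before calling into the C backend.
print(y_timeseries.shape, y_timeseries.dtype, y_timeseries.min(), y_timeseries.max())

# Force both series into 1-D, contiguous, non-negative 32-bit integer states.
x_timeseries = np.ascontiguousarray(np.ravel(x_timeseries), dtype=np.int32)
y_timeseries = np.ascontiguousarray(np.ravel(y_timeseries), dtype=np.int32)
assert x_timeseries.min() >= 0 and y_timeseries.min() >= 0

te_xy = transfer_entropy(x_timeseries, y_timeseries, k=1)
te_yx = transfer_entropy(y_timeseries, x_timeseries, k=1)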

Time Series Measures grammatical error

Under the section "Subtle Details - The Base: States and Logarithms", there is a sentence that reads "It two uses this convention..."; the "two" should be "too".

Add Continuous Integration Support

Continuous integration (CI) makes building and testing on multiple platforms a snap. When commits are pushed or merge requests are made (or merged), a CI platform can pull the repository and run automated tests. This ensures that regressions are caught as early as possible. Integration of CI platforms with GitHub ensures that the status of builds and tests is visible within the GitHub UI, so assignees know the build status before merging pull requests.

The two CI platforms used by inform, Travis CI and AppVeyor, have proven invaluable in ensuring that the builds stay in tip-top shape. Travis CI provides Linux and OS X support while AppVeyor handles testing on Windows.

To ensure that development of PyInform goes as smoothly as possible, support for Travis-CI and AppVeyor should be added with the v0.0.5 release.

Relative Entropy

Inform added relative entropy in the v0.0.5 release. The next release of PyInform should include a wrapper for relative entropy.

Proposed API

def relative_entropy(xs, ys, b=0, base=2.0, local=False):

Example Usage

from pyinform.relativeentropy import relative_entropy

xs = [0,0,1,1,1,1,0,0,0]
ys = [1,0,0,1,0,0,1,0,0]

relative_entropy(xs, ys, b=2, base=2.0) # == 0.038330
relative_entropy(ys, xs, b=2, base=2.0) # == 0.037010

relative_entropy(xs, ys, b=2, base=2.0, local=True)
# == [-0.263, 0.415]
relative_entropy(ys, xs, b=2, base=2.0, local=True)
# == [0.263, -0.415]

Mistakes/typos in the documentation of PyInform

Documentation of relative entropy:
• b (double) – the base of the time series
• b – the logarithmic base

Should instead be:
• b – the base of the time series
• base (double) – the logarithmic base

Documentation of entropy rate:

  • There is "a the" instead of "as the" in the entropy rate documentation.

Block entropy documentation:

  • block_entropy([0,0,1,1,1,1,0,0,0], k=2, local=True) should have the parameter b=4; otherwise the results differ from those shown in the PDF.

Lib not found libc.so.6 when building for aarch64/Android

I am trying to run a script that uses transfer_entropy with Python on Android (aarch64), and I always get this error:

OSError: dlopen failed: library "libm.so.6" not found: needed by /data/data/com.example.a_example/files/chaquopy/AssetFinder/requirements/pyinform/inform-1.0.0/lib/linux-x86_64/libinform.so.1.0.0 in namespace clns-6

Broken Link in README.md

Hi Doug,

It appears the link to the Google Drive file containing the documentation is broken. It is on the home page of the Git repo, in the section describing documentation.

Best,
Jake

Continuous values for Transfer Entropy

Thanks for the cool library!

I see that the inputs to transfer entropy are binary values, such as:

xs = [0,1,1,1,1,0,0,0,0]
ys = [0,0,1,1,1,1,0,0,0]

Is it possible to use continuous values instead of binary values, such as:

xs = [0.5,1.7,2.5,0.1,1.5,0.9,0.56,0.71,2.10]
ys = [0.3,1.0,3.1,1.1,0.1,1.8,0.5,1.2,3.5]
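
The time-series measures operate on discrete, non-negative integer states, so continuous values would need to be discretized first. A minimal sketch of one way to do that (the two-state, equal-width binning below is only an illustration; pyinform.utils also provides binning helpers such as bin_series):

import numpy as np
from pyinform.transferentropy import transfer_entropy

xs = np.array([0.5, 1.7, 2.5, 0.1, 1.5, 0.9, 0.56, 0.71, 2.10])
ys = np.array([0.3, 1.0, 3.1, 1.1, 0.1, 1.8, 0.5, 1.2, 3.5])

def discretize(series, n_bins=2):
    # map a continuous series onto integer states 0..n_bins-1 using equal-width bins
    edges = np.linspace(series.min(), series.max(), n_bins + 1)[1:-1]
    return np.digitize(series, edges)

te = transfer_entropy(discretize(xs), discretize(ys), k=1)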

`block_entropy()` does not match `scipy.stats.entropy()` when k = 1, as expected

From documentation:

  • N-gram entropy should reduce to Shannon entropy for k = 1
  • "...take as an argument the base of the state and use that as the base of the logarithm...when k is 1"

Taking the example given in the documentation:

import pyinform
import scipy.stats

a = [0,0,1,1,1,1,0,0,0]
base_a = len(set(a))
pyinform.block_entropy(a, k=1)
# 0.9910760598382222
scipy.stats.entropy(a, base=base_a)
# 2.0

What have I got wrong?
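
For reference, a sketch of the comparison applied to the empirical state distribution rather than the raw samples (scipy.stats.entropy interprets its first argument as a probability distribution, whereas block_entropy estimates the distribution from the time series itself):

import numpy as np
import scipy.stats
import pyinform

a = [0, 0, 1, 1, 1, 1, 0, 0, 0]
counts = np.bincount(a)              # occurrences of each state: [5, 4]
probs = counts / counts.sum()        # empirical state distribution

print(pyinform.block_entropy(a, k=1))        # 0.9910760598382222
print(scipy.stats.entropy(probs, base=2))    # 0.9910760598382222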

Fully Document

The biggest issue at the moment is the startling lack of documentation. The only "documentation" so far provided is in the form of minimalist docstrings. This travesty must be remedied before the v0.0.5 release.

All Hail the Sphinx

Sphinx is the de facto standard for Python documentation. The expectation is that there will be a doc directory containing the Sphinx configuration files and generated targets. The targets are not to be kept under version control.

Documentation Formats

Which documentation formats should be targeted? I think for now HTML should be the only output format by default. If the user wants other formats, they can enable them and generate the files for themselves.

Hosting

One solid option for hosting the documentation is the wiki.

Bug in calculating local transfer entropy

Hi there,

Thanks for the great library!
I discovered the following bug.

In the documentation, transfer entropy treats X as the source and Y as the target: https://elife-asu.github.io/PyInform/timeseries.html#notation

In the source code, transfer entropy is called as transfer_entropy(source, target, k, condition=None, local=False), but during preprocessing the source is assigned to ys = np.ascontiguousarray(source, np.int32) (it should be assigned to xs) and the target to xs = np.ascontiguousarray(target, np.int32) (it should be ys).

As a result, _local_transfer_entropy(ys, xs) appears to return the information flow in the reverse direction, from target to source, rather than from source to target.
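
A quick directional sanity check (a sketch, not from the issue report): construct ys as a one-step-lagged copy of a random xs, so information flows from xs to ys; if the argument order is handled as documented, the source-to-target value should be close to one bit and the reverse close to zero.

import numpy as np
from pyinform import transfer_entropy

rng = np.random.default_rng(0)
xs = rng.integers(0, 2, size=10000)
ys = np.roll(xs, 1)                    # ys(t) = xs(t-1), so xs drives ys

print(transfer_entropy(xs, ys, k=1))   # expected near 1.0 (source -> target)
print(transfer_entropy(ys, xs, k=1))   # expected near 0.0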

pyinform.error.InformError: an inform error occurred - "negative state in timeseries"

I have two simple time series, xs and ys, having 5000 samples each. I attempted to compute the transfer entropy via

T=transfer_entropy(xs, ys, k)

using various values of k. Each attempt yielded the following error message:

Traceback (most recent call last):
  File "/Users/fishbacp/Desktop/Python2022/transfer_entropy.py", line 16, in <module>
    T=transfer_entropy(x_source,x_target,1)
  File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/pyinform/transferentropy.py", line 222, in transfer_entropy
    error_guard(e)
  File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/pyinform/error.py", line 63, in error_guard
    raise InformError(e, func)
pyinform.error.InformError: an inform error occurred - "negative state in timeseries"

Any insights as to the error source? Should I be adjusting other keyword arguments?
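
The "negative state in timeseries" error generally indicates that the series contain values below zero; the measures expect non-negative integer states. A sketch of one way to remap integer-valued series before the call, assuming coalesce_series from pyinform.utils (the data below are placeholders, not from the issue):

import numpy as np
from pyinform import transfer_entropy
from pyinform.utils import coalesce_series

# Placeholder series with negative integer states, standing in for xs and ys.
xs = np.tile([-1, 0, 1, 1, -1], 1000)
ys = np.roll(xs, 1)

# coalesce_series remaps the values onto contiguous states 0..n-1 and
# returns the remapped series along with the number of states.
xs, b = coalesce_series(xs)
ys, b = coalesce_series(ys)

T = transfer_entropy(xs, ys, k=1)

If the series are continuous-valued, they would need to be discretized (binned) into integer states first rather than coalesced directly.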

Conditional Entropy

Inform added conditional entropy in the v0.0.5 release. The next release of PyInform should include a wrapper for conditional entropy.

Proposed API

def conditional_entropy(xs, ys, bx=0, by=0, base=2.0, local=False):

Example Usage

from pyinform.conditionalentropy import conditional_entropy

xs = [0,0,1,1,1,1,0,0,0]
ys = [1,0,0,1,0,0,1,0,0]

conditional_entropy(xs, ys) # == 0.899985
conditional_entropy(ys, xs) # == 0.972765

conditional_entropy(xs, ys, local=True)
# == [1.322, 0.737, 0.415, 2.000, 0.415, 0.415, 1.322, 0.737, 0.737]
conditional_entropy(ys, xs, local=True)
# == [0.585, 1.000, 1.000, 1.585, 1.000, 1.000, 0.585, 1.000, 1.000]

Concept of Time Series

Hi,

It is not clear to me whether a sequence passed to a method that requires a probability distribution automatically has its empirical distribution estimated from the input data. For example, mutual information requires two np.arrays, but I could not find where the empirical distribution is estimated. I also investigated the C backend, but again I did not find anything useful. Could you provide some information about this process?
I am asking because I am experiencing InformError: an inform error occurred - "negative state in timeseries". I read a previous issue about this error and the answer was to use coalesce_series, but I am not sure whether it is correct to apply it to continuous time series.
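
For concreteness, a sketch of what the empirical estimation amounts to: the measures count how often each (joint) state occurs in the series and use the relative frequencies as probabilities. The manual computation below, assuming binary series, should agree with mutual_info up to floating point:

import numpy as np
from pyinform import mutual_info

xs = np.array([0, 0, 1, 1, 1, 1, 0, 0, 0])
ys = np.array([1, 0, 0, 1, 0, 0, 1, 0, 0])

# empirical joint and marginal distributions estimated from the series
joint = np.histogram2d(xs, ys, bins=[2, 2])[0] / len(xs)
px, py = joint.sum(axis=1), joint.sum(axis=0)

mi_manual = sum(
    joint[i, j] * np.log2(joint[i, j] / (px[i] * py[j]))
    for i in range(2) for j in range(2) if joint[i, j] > 0
)

print(mi_manual)            # mutual information estimated by hand, in bits
print(mutual_info(xs, ys))  # should agree up to floating point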

running with k>2 raise "memory allocation failed" error

Any value of k above 2 for the transfer_entropy method raises the following error (for k = 1 or 2 it works):

test = pyinform.transfer_entropy(x,y,k=3)

Traceback (most recent call last):
File "C:/Users/user/PycharmProjects/JerusalemProject/JerusalemProject/ActionActorAnalysis.py", line 279, in
temp = pyinform.transfer_entropy(x,y,k=3)
File "C:\Users\user\Anaconda2\envs\Python35\lib\site-packages\pyinform\transferentropy.py", line 179, in transfer_entropy
error_guard(e)
File "C:\Users\user\Anaconda2\envs\Python35\lib\site-packages\pyinform\error.py", line 57, in error_guard
raise InformError(e,func)
pyinform.error.InformError: an inform error occurred - "memory allocation failed"
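
A rough sketch of why larger k values can exhaust memory (an assumption about the cause, not a confirmed diagnosis): the backend builds histograms over the joint state space, whose size grows roughly exponentially in k with the number of distinct states b as the base, so a series with many distinct values can make k = 3 far more expensive than k = 2.

# illustrative only: approximate growth of the joint state space with k,
# assuming b distinct states and a histogram of roughly b ** (k + 2) bins
b = 50
for k in (1, 2, 3):
    print(k, b ** (k + 2))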

Conditional Transfer Entropy

In Inform, inform_transfer_entropy supports conditional TE through its third argument (int const *back), which is simply set to None in PyInform. It should be rather straightforward to implement.

Proposed API

transfer_entropy(source, target, k, background=None, b=0, local=False)

Example Usage

From https://elife-asu.github.io/Inform/#transfer-entropy

xs = [0,1,1,1,1,0,0,0,0]
ys = [0,0,1,1,1,1,0,0,0]
ws = [0,1,1,1,1,0,1,1,1]

transfer_entropy(xs, ys, background=ws, b=2, k=2)
# 0.285714

ws = [[1,0,1,0,1,1,1,1,1],
      [1,1,0,1,0,1,1,1,1]]

transfer_entropy(xs, ys, background=ws, b=2, k=2)
# 0.0

OS X Support

Add support for OS X binaries releases of inform.

Proposed Changes

Support for OS X should be as simple as checking the platform and setting the shared library path in pyinform/__init__.py. Simply adding a branch for "Darwin" should do the trick:

if system() == 'Linux':
    return "{}/lib/libinform.so.{}.{}.{}".format(libdir, major, minor, revision)
elif system() == 'Darwin':
    return "{}/lib/libinform.{}.{}.{}.dylib".format(libdir, major, minor, revision)
elif system() == 'Windows':
    return "{}/lib/inform.dll".format(libdir)
else:
    raise RuntimeError("unsupported platform - \"{}\"".format(system()))

Add information flow

Hi,
It would be great to have information flow for Python. Do you plan to integrate it?
Thank you.
