
pyentropy's Introduction

Hi there πŸ‘‹

Welcome! I'm Nikolay Donets. I'm passionate about Computer Science and Software Engineering, and I bring a user-focused, hands-on approach to my work. I've worked on a variety of projects, always eager to learn and grow. My greatest joy is personal and professional growth while delivering value to users, and I thrive on expanding my skills to make a positive impact.

Blog

Apps

πŸŽπŸ“± Calculator 42 with History PRO

GPTs

🐞 Buggy – an expert at crafting bug reports


pyentropy's Issues

I am getting RuntimeWarning: divide by zero encountered in log

Greetings. I am trying to use sample entropy on EEG data, but I am getting the following warnings:
/opt/conda/lib/python3.6/site-packages/pyentrp/entropy.py:178: RuntimeWarning: divide by zero encountered in log
sampen = - np.log(Ntemp[1:] / Ntemp[:-1])
/opt/conda/lib/python3.6/site-packages/pyentrp/entropy.py:178: RuntimeWarning: invalid value encountered in true_divide
sampen = - np.log(Ntemp[1:] / Ntemp[:-1])
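
For context, a minimal sketch (plain NumPy, not pyentrp's internals) of how the quoted line produces both warnings. Treating Ntemp as a vector of template-match counts is an assumption based on the variable name in the traceback: if the count drops to zero at some template length, the ratio Ntemp[1:] / Ntemp[:-1] contains 0 or 0/0, and the log turns those into inf or nan.

import numpy as np

# Hypothetical match counts per template length; the trailing zeros mean
# no matches were found at the longest template lengths.
Ntemp = np.array([10.0, 4.0, 0.0, 0.0])

# Reproduces both warnings from the traceback: 0/4 -> log(0) = -inf
# ("divide by zero"), and 0/0 -> nan ("invalid value").
sampen = -np.log(Ntemp[1:] / Ntemp[:-1])
print(sampen)  # [0.916...  inf  nan]

In practice this usually means the tolerance is too tight, or the series too short, for any matches at the longer template lengths.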

Multiscale sample entropy: Docstring has dead link

Hi! I would love to use your multiscale_entropy function :) Just as a sanity check: the link in your docstring leads to a dead page. I guess it refers to the same reference as sample_entropy, i.e. you are using the implementation from Costa et al.? That should mean your function gives the same output as neurokit2.entropy_multiscale? I would prefer pyEntropy over neurokit2, since the latter seems to downgrade my pandas version and your package seems more lightweight :)
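
For reference, the coarse-graining step of Costa et al.'s multiscale entropy is usually defined as averaging consecutive non-overlapping windows. A minimal sketch follows, assuming that is the construction the docstring intended; whether pyentrp matches it exactly is precisely what this issue asks.

import numpy as np

def coarse_grain(ts, scale):
    # Costa-style coarse-graining: average consecutive, non-overlapping
    # windows of length `scale`; sample entropy is then computed on the
    # coarse-grained series at each scale.
    ts = np.asarray(ts, dtype=float)
    n = len(ts) // scale
    return ts[:n * scale].reshape(n, scale).mean(axis=1)

print(coarse_grain(range(10), 2))  # [0.5 2.5 4.5 6.5 8.5]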

issue about the example

When I installed it and tested it with your example code, I got a warning: RuntimeWarning: invalid value encountered in true_divide. Does this mean the example data is wrong, or did I make a mistake?

Please provide some references.

Dear developers,
I wonder if you could use some data (possibly simulated) to illustrate a few examples. I would also like to know which research work inspired you; please post and cite those references, and indicate whose formulation your calculations are based on. Thanks

Doubt

Can I know the output format of the sample_entropy function? I am getting an array as output.
Can you please explain what the array indicates?
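
Judging by the small-test-set issue further down, sample_entropy returns one estimate per template length, from 1 up to the sample_length argument. A short illustration, with the indexing convention stated as an assumption:

from pyentrp import entropy as ent

ts = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
# Returns an array with one sample-entropy estimate per template length;
# the assumption here is that index i corresponds to template length i + 1.
se = ent.sample_entropy(ts, 3, 0.2)
print(len(se))  # 3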

[Permutation entropy] Unexpected behavior

I'm running permutation_entropy(my_array, m=3, delay=1) on the following array and am getting a value of -0.0 as output.

array([ 8.6354885,  7.847    ,  9.786279 , 10.811804 , 11.368661 ,
       10.298161 ,  9.584094 , 10.228781 , 10.675296 , 10.790827 ,
       10.666389 , 10.177469 , 10.89863  , 11.041763 ,  8.989045 ,
       11.213311 , 10.382978 ,  8.943591 ,  9.770281 , 10.339998 ,
        9.418209 ,  9.841557 , 10.484714 ,  9.990864 , 10.560351 ,
       10.241841 , 10.830489 , 10.062455 ,  9.75937  ,  9.950198 ,
       10.7275305,  9.284445 ,  9.037022 ,  7.691061 , 10.351711 ,
        8.412907 , 10.735292 ,  5.889793 ,  9.477356 ,  4.858265 ,
        9.646169 ,  8.169383 , 10.117668 , 10.4823475,  7.4111977,
       10.287312 ,  7.277527 , 10.397271 ,  7.3295283, 10.846437 ,
        6.8738704,  9.159628 ,  6.409748 ,  8.958366 ,  6.473379 ,
       10.166828 ,  8.030985 , 10.2656975,  9.349428 ,  9.349428 ,
        5.9747562,  9.571418 ,  6.412549 , 10.038413 ,  7.3413515,
        9.748129 ,  7.903518 , 10.491926 ,  8.761297 ,  9.51908  ,
        7.7680426, 10.160261 ,  6.89564  , 10.132029 ,  7.983659 ,
        9.381678 ,  8.631905 , 10.81286  ,  8.44598  , 10.472592 ,
        7.6191735, 10.0790825,  7.5667286,  9.558374 ,  7.1744604,
        9.184171 ,  8.056332 ,  8.731763 ,  8.138747 ,  9.808532 ,
        7.086683 ,  9.662312 ,  6.749556 , 10.758138 , 10.854101 ,
        7.7829103, 11.219701 ,  8.178098 ], dtype=float32)

Visually, the array looks like this:

[Figure: line plot of the array above]

Differences in sampling rate aside, I'm inclined to believe that the hashing might not be working correctly.

The permutation distro is:
[95, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]

I understand permutation entropy as follows:

  • A<B<C<D is a distinct permutation, as is B<A<C<D (for m = 3, i.e. windows of length 4).
  • A total of (m+1)! permutations are possible; this is also the size/range of my distribution.
  • Using a window of size m+1, I move across the time series with a stride of delay = 1, and the permutation bin that matches the pattern within the window is incremented by one.
  • After traversing the entire time series, a permutation distribution is left over.
  • Shannon entropy (-Σ p log p) is computed over the permutation distribution (see the sketch after this list).
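
A minimal sketch of permutation entropy as described in the list above. This mirrors the poster's reading, not necessarily pyentrp's implementation; the rank-encoding via argsort is one common choice.

import numpy as np
from itertools import permutations

def permutation_entropy_sketch(ts, m, delay=1):
    # Slide a window of length m + 1 over the series with stride `delay`,
    # rank-encode each window with argsort, count each ordinal pattern,
    # then take Shannon entropy of the resulting distribution.
    ts = np.asarray(ts, dtype=float)
    order = m + 1
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(ts) - (order - 1) * delay):
        window = ts[i:i + order * delay:delay]
        counts[tuple(map(int, np.argsort(window)))] += 1
    p = np.array([c for c in counts.values() if c > 0], dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log(p))

On the 98-point array above this yields 98 - 3 = 95 windows, which matches the total of the reported distribution; a distribution with a single nonzero bin gives exactly -0.0, so the puzzle is why every window maps to the same pattern.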

With that said, the permutation distro:

[95, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]

is contradicted by the first 5 terms:
8.6354885, 7.847, 9.786279, 10.811804, 11.368661

B A C D for the first 4
A B C D for the second 4

I'm looking for help in either debugging my understanding of permutation entropy or debugging the code that produces these unexpected results. Thanks in advance! I'm glad someone has written open-source code for this. If the index/hashing approach works, it could be loads faster than how I would have solved it.

Entropy for RR intervals

Hey, I'm trying to apply your function to an ECG signal, and more specifically to RR intervals.

I have an array of RR intervals (rri = array([ 702, 743, 1052, ..., 812, 798, 710])) of length 382, derived from a 5-minute recording sampled at 1000 Hz.

I've tried the following:

sample_entropy(rri, len(rri))

But it returns:

__main__:45: RuntimeWarning: invalid value encountered in true_divide
__main__:46: RuntimeWarning: divide by zero encountered in log
Out[11]: 
array([ 2.9060357 ,  2.63176262,  2.7080502 , ...,         nan,
               nan,         nan])

Am I doing something wrong? Also, what's a good default for the tolerance parameter in multiscale entropy? Thanks
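
For what it's worth, the second argument appears to be the template length (sample_length), not the series length; passing len(rri) asks for templates up to 382 samples long, and long templates find no matches, which produces the nan entries. A common convention in the sample-entropy literature (an assumption here, not something pyentrp documents) is m = 2 with tolerance r = 0.2 × the standard deviation of the series. A sketch under that assumption:

import numpy as np
from pyentrp import entropy as ent

rri = np.array([702.0, 743.0, 1052.0, 812.0, 798.0, 710.0])  # toy RR data
r = 0.2 * np.std(rri)               # conventional tolerance: 0.2 * std
se = ent.sample_entropy(rri, 2, r)  # template lengths m = 1 and m = 2
print(se)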

issue about small test set

Thank you all for providing this pyEntropy project.
I am trying to run sample_entropy as follows:

from pyentrp import entropy as ent

ts = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
sample_entropy = ent.sample_entropy(ts, 3, 0.2)    # situation 1: m+1=3, tolerance=0.2
sample_entropy1 = ent.sample_entropy(ts, 3, 0.21)  # situation 2: m+1=3, tolerance=0.21
print(sample_entropy)
print(sample_entropy1)

The result is:
[1.38629436 0.15415068 0.18232156]  # correct
[ 1.13497993 -0.         -0.        ]  # -0. values, which is a problem

However, the result that I calculated by hand is 0.18232156 and 0.20067069 (situations 1 and 2).
The algorithm I applied is described here:
https://en.wikipedia.org/wiki/Sample_entropy
which is your reference too.

In addition, I ran the same test set through the code provided on the Wikipedia page;
the results are 0.17185025692665928 and 0.10536051565782628 (situations 1 and 2).
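
For cross-checking, here is a sketch of sample entropy following the Wikipedia definition cited above: SampEn = -ln(A/B), where B counts pairs of length-m templates within Chebyshev distance r and A does the same for length m + 1, self-matches excluded. Note that borderline tolerances, like r = 0.2 on this 0.1-spaced ramp, are sensitive to whether the match test uses < or <=, which may account for part of the disagreement between the three results.

import numpy as np

def sampen(ts, m, r):
    # Sample entropy per the cited Wikipedia definition: -ln(A/B), where
    # B counts template pairs of length m within Chebyshev distance r
    # and A counts pairs of length m + 1 (self-matches excluded).
    ts = np.asarray(ts, dtype=float)

    def pairs_within_r(length):
        templates = np.array([ts[i:i + length]
                              for i in range(len(ts) - length + 1)])
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if np.max(np.abs(templates[i] - templates[j])) <= r:
                    count += 1
        return count

    return -np.log(pairs_within_r(m + 1) / pairs_within_r(m))

ts = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
print(sampen(ts, 2, 0.2), sampen(ts, 2, 0.21))  # compare with the hand calculation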
