
Entropy for RR intervals · pyentropy (closed, 10 comments)

nikdon commented on May 25, 2024
Entropy for RR intervals

from pyentropy.

Comments (10)

nikdon commented on May 25, 2024

I marked this issue to remember to change the output value of the function


DominiqueMakowski commented on May 25, 2024

Cool :)

On PhysioNet it says "The outputs are the sample entropies of the input, for all epoch lengths of 1 to a specified maximum length, m.".

But I don't understand whether they refer to the tuples or to the values within each tuple. Anyway, that's way too many values for me 😅! I've taken a look at that paper, but it didn't help me much 😢

Then I ran your function with different values of m:

sample_entropy(rri, 2, 0.1*np.std(rri))
sample_entropy(rri, 1, 0.1*np.std(rri))

This returned:

sample_entropy(rri, 1, 0.1*np.std(rri))
Out[9]: array([ 2.84102392])

sample_entropy(rri, 2, 0.1*np.std(rri))
Out[10]: array([ 2.84102392,  2.28018737])

So I believe, indeed, that the "appropriate" value is the one at index m-1 of the list, right?

Anyway, I've added your multiscale entropy function to my package (it is of particular interest for ECG signals). As you can see, at line 328 I added code to take only the value corresponding to emb_dim. I also used nolds' computation of sampen... However, I have no idea whether this is correct, as I don't know the difference between the 'chebychev' and the 'euler' parameters... What do you suggest?

Please do not hesitate to make modifications to this packaged function. As you can see, it is integrated into a higher-level complexity() function with full documentation.


nikdon commented on May 25, 2024

Hey, it seems like the second parameter you pass is wrong. Sample entropy is the negative natural logarithm of an estimate of the conditional probability that subseries (epochs) of length m that match pointwise within a tolerance r also match at the next point. Sometimes m is called the embedding dimension or sample vector length, so by definition it should be smaller than the length of the analysed signal. Its default value is 2. The default tolerance is usually 0.1 to 0.2 * std(time_series).
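The definition above can be sketched in a few lines of NumPy. This is a minimal illustrative implementation (not the pyEntropy or PhysioNet code, and the name sampen_sketch is made up): it counts length-m template pairs that match within tolerance r under the Chebyshev distance, then the fraction of those pairs that still match at length m + 1:

```python
import numpy as np

def sampen_sketch(ts, m=2, r=None):
    """Sample entropy: -ln(A/B), where B counts template pairs of
    length m matching within tolerance r (Chebyshev distance), and
    A counts those pairs that still match at length m + 1."""
    ts = np.asarray(ts, dtype=float)
    if r is None:
        r = 0.2 * np.std(ts)  # common default: 0.1-0.2 * std
    n = len(ts)

    def pair_count(length):
        # All overlapping templates of the given length; using the
        # same number of templates (n - m) for both lengths keeps
        # the two counts comparable.
        templates = np.array([ts[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance: largest componentwise difference.
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    b = pair_count(m)
    a = pair_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```

A perfectly regular series (e.g. a repeating 1, 2, 1, 2, ...) gives an entropy of 0, while a noisy series gives a positive value, which matches the intuition that sample entropy measures irregularity.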


DominiqueMakowski commented on May 25, 2024

It works if I put sample_length=2. However, it returns a vector of length == sample_length. Is that right? Which one is the actual entropy value (or should I compute a mean or something)?

Moreover, if I run sample_entropy(rri, 2), it returns array([ 2.9060357 , 2.63176262]). Neither of these values matches the sample entropy computed by the nolds package (nolds.sampen(rri, 2, 0.1*np.std(rri))), which returns 2.708050201. Are you aware of any differences?

Thanks a lot 😄


nikdon commented on May 25, 2024

The first value is the sample entropy. I think both results are correct; they differ because nolds uses the Chebyshev distance by default, while pyEntropy evaluates the 'euler' (Euclidean) distance. You can try calling sampen(rri, 2, 0.1*np.std(rri), dist="euler") and compare the results. Hope they will converge :)
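To make the difference concrete, here is a tiny sketch (values chosen for illustration) showing how the two metrics can disagree on whether a pair of templates matches within a given tolerance r:

```python
import numpy as np

a = np.array([0.0, 0.0])
b = np.array([0.3, 0.4])

chebyshev = np.max(np.abs(a - b))  # largest componentwise gap: 0.4
euclidean = np.linalg.norm(a - b)  # straight-line distance: 0.5

r = 0.45
matches_chebyshev = chebyshev <= r  # True  (0.4 <= 0.45)
matches_euclidean = euclidean <= r  # False (0.5 >  0.45)
```

Because the Euclidean distance is always greater than or equal to the Chebyshev distance, a Euclidean-based implementation finds fewer (or at most as many) matching templates for the same r, which by itself can shift the resulting entropy value.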


DominiqueMakowski commented on May 25, 2024

First I tried to fix the nolds package's "euler" option (see this commit). But when comparing the two, it still returns different values 😕. I made a minimal example here if you want to see for yourself.

For the multiscale entropy, should I also use the first value?

Finally, would you be OK with me adding some of your functions to my package, for example shannon_entropy (see here)? It would be more convenient to use within a package... I tried to credit you as much as I could 😸


nikdon commented on May 25, 2024

Hey, I've checked the results. sample_entropy is actually ported from SampEn from PhysioNet. The results are:

SampEn: [(0, 1.3217558399823195, 0.093227453170680125), (1, 1.2992829841302609, 0.13428162652290843), (2, -0.0, 0.0)]
pyEntropy: [ 1.32175584  1.29928298]

As you can see, values are appended to the list as the function moves toward the other end of the signal. Moreover, the result is not a single integral value. So I am not sure where the error is, or whether it is necessary to take only the first value of the list. It looks like a windowed function. For multiscale_entropy, I could use the same approach.
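For reference, the usual multiscale entropy procedure first coarse-grains the signal at each scale and then takes the sample entropy of every coarse-grained series. A minimal coarse-graining sketch (illustrative only, not the pyEntropy code; the function name is made up):

```python
import numpy as np

def coarse_grain(ts, scale):
    """Average consecutive non-overlapping windows of length `scale`.
    Multiscale entropy is the sample entropy of this series for
    scale = 1, 2, 3, ..."""
    ts = np.asarray(ts, dtype=float)
    n = (len(ts) // scale) * scale  # drop the incomplete tail window
    return ts[:n].reshape(-1, scale).mean(axis=1)
```

For example, coarse_grain([1, 2, 3, 4, 5], 2) gives [1.5, 3.5]: the windows (1, 2) and (3, 4) are averaged, and the trailing 5 is dropped. At scale 1 the series is unchanged, so the scale-1 entry of a multiscale entropy curve is just the ordinary sample entropy.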

Also, yes, please use whatever you want wherever you want :)


nikdon commented on May 25, 2024

In case you want only one value, yes. Regarding chebychev and euler (or Euclidean): they are just distance functions. Depending on your purposes, you can use other distance functions as well. Which one to use you can decide based on prior knowledge or on simulations of appropriate cases. As for which implementation to use, I'm not sure; at the moment I don't have time to check it, but the results from pyEntropy equal those from SampEn.


nikdon commented on May 25, 2024

OK, I'm closing the issue since all questions have been clarified. Also, I've prepared a package for easy installation via pip install pyentrp.


martinmocko commented on May 25, 2024

Just wanted to share my two cents: I also wanted to use some form of entropy calculation for my data, and while your solutions seem to work, I had a hard time understanding what the output of sample entropy and multiscale entropy really meant. I suggest writing a clear explanation of what the output means somewhere, ideally with some example inputs/outputs, to make it easier to grasp and faster to use correctly. In the end, this issue helped me with my problem, but I think the information should be easier to find.

