Comments (9)
Can you make x an np.array([]) too?
from npeet.
Yes, I think so!
It could be a question of which quantities it expects to be lists of vectors, and which not. I'd try modifying where the brackets are, e.g. this:
ee.micd(cont.iloc[:, [1]].values.tolist(), disc.iloc[:, 1].values.tolist())
That way, the continuous one is an (n_samples, 1) array and the discrete one is just (n_samples,). I think that's right. If it doesn't work, print out the dimensions of cont and disc, and I'll think about it a little more.
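The bracket difference above can be checked with plain numpy indexing, which behaves the same way as pandas' .iloc here: selecting a column with a list of indices keeps a 2-D (n_samples, 1) shape, while a scalar index flattens to (n_samples,). A small sketch (the array values are made up):

```python
import numpy as np

a = np.array([[1.3, 5.0],
              [3.7, 3.0],
              [5.1, 5.0]])

col_2d = a[:, [1]]   # list index -> shape (3, 1), "list of vectors"
col_1d = a[:, 1]     # scalar index -> shape (3,), flat list of values
```

The same distinction holds for `df.iloc[:, [1]]` versus `df.iloc[:, 1]` on a DataFrame.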
(I'm pretty sure discrete expects just a single discrete quantity, not a vector, but continuous does expect a vector.)
Yes! It works! Thank you!
Can you please provide a working example for micd?
I can use one for mi:
x = [[1.3], [3.7], [5.1], [2.4], [3.4]]
y = [[1.5], [3.32], [5.3], [2.3], [3.3]]
ee.mi(x, y)
0.16831442143704642
Now, according to your recommendations:
x = [[1.3], [3.7], [5.1], [2.4], [3.4]]
y = [5, 3, 5, 2, 3]
ee.micd(x, y)
C:\ProgramData\Anaconda3\lib\site-packages\npeet\entropy_estimators.py in micd(x, y, k, base, warning)
223 entropy_x_given_y = 0.0
224 for yval, py in zip(y_unique, y_proba):
--> 225 x_given_y = x[(y == yval).all(axis=1)]
226 if k <= len(x_given_y) - 1:
227 entropy_x_given_y += py * entropy(x_given_y, k, base)
C:\ProgramData\Anaconda3\lib\site-packages\numpy\core\_methods.py in _all(a, axis, dtype, out, keepdims, where)
     62     # Parsing keyword arguments is currently fairly slow, so avoid it for now
     63     if where is True:
---> 64         return umr_all(a, axis, dtype, out, keepdims)
     65     return umr_all(a, axis, dtype, out, keepdims, where=where)
     66
AxisError: axis 1 is out of bounds for array of dimension 1
Oh, how annoying! The way you called it seems more natural, but at a glance it seems like it also expects "vectors" for the discrete values y = np.array([[5], [3]...]).
Also it seems like it would only work if y is a numpy array. I can't believe I didn't just put in a check, as y = np.asarray(y) would be an efficient way to avoid problems like this.
Let me know if this works.
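A numpy-only sketch of the suggested np.asarray fix, reproducing the indexing line from micd that appears in the traceback (the variable names mirror the thread's example; nothing here calls npeet itself):

```python
import numpy as np

# Inputs as plain Python lists, the way micd was first called.
x = [[1.3], [3.7], [5.1], [2.4], [3.4]]
y = [[5], [3], [5], [2], [3]]

# The proposed defensive conversion: harmless if already arrays.
x_arr = np.asarray(x)   # shape (5, 1)
y_arr = np.asarray(y)   # shape (5, 1)

# The line inside micd: select rows of x where y equals one value.
mask = (y_arr == 5).all(axis=1)   # boolean mask, shape (5,)
x_given_y = x_arr[mask]           # rows where y == 5

# Without the conversion this breaks twice: (y == 5) on a list is not
# elementwise, and x[mask] on a list raises the TypeError seen below
# ("only integer scalar arrays can be converted to a scalar index").
```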
Hmm, then I get:
x = [[1.3], [3.7], [5.1], [2.4], [3.4]]
y = np.array([[5], [3], [5], [2], [3]])
ee.micd(x, y)
TypeError Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_19412\3742637936.py in
1 x = [[1.3], [3.7], [5.1], [2.4], [3.4]]
2 y =np.array([[5], [3], [5], [2], [3]])
----> 3 ee.micd(x, y)
C:\ProgramData\Anaconda3\lib\site-packages\npeet\entropy_estimators.py in micd(x, y, k, base, warning)
223 entropy_x_given_y = 0.0
224 for yval, py in zip(y_unique, y_proba):
--> 225 x_given_y = x[(y == yval).all(axis=1)]
226 if k <= len(x_given_y) - 1:
227 entropy_x_given_y += py * entropy(x_given_y, k, base)
TypeError: only integer scalar arrays can be converted to a scalar index
Oh wait, it's actually x that has to be a numpy array. Now it works:
x = np.array([[1.3], [3.7], [5.1], [2.4], [3.4]])
y = np.array([[5], [3], [5], [2], [3]])
ee.micd(x, y)
0.0
One more question, if possible:
Does the fact that we are using lists of lists imply that npeet functions accept m-dimensional arrays for x and y?
That is, can we estimate the MI between a 3-dimensional x and a 2-dimensional y, and will that be supported?
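If so, the list-of-lists convention would mean each row is one joint sample and each column one dimension. A shape-only sketch of what a 3-dimensional x paired with a 2-dimensional y would look like (numpy only; whether `ee.mi` accepts these shapes is exactly the open question above, so this just illustrates the layout):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

x = rng.normal(size=(n, 3))             # 3-dimensional continuous variable
y = x[:, :2] + rng.normal(size=(n, 2))  # 2-dimensional, correlated with x

# Under the row-per-sample convention, ee.mi(x, y) would receive
# n joint samples of a (3 + 2)-dimensional pair.
```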