
jacobi's People

Contributors

hdembinski, pre-commit-ci[bot]


jacobi's Issues

`Indexable` type generates linter warning with numpy arrays

When passing the covariance as a numpy array, the Indexable type generates linter warnings like:

Argument of type "NDArray[float64]" cannot be assigned to parameter "cov" of type "float | Indexable[float] | Indexable[Indexable[float]]" in function "propagate"
  Type "NDArray[float64]" cannot be assigned to type "float | Indexable[float] | Indexable[Indexable[float]]"
    "NDArray[float64]" is incompatible with "float"
    "NDArray[float64]" is incompatible with "Indexable[float]"
    "NDArray[float64]" is incompatible with "Indexable[Indexable[float]]"

Pylance [reportGeneralTypeIssues](https://github.com/microsoft/pyright/blob/main/docs/configuration.md#reportGeneralTypeIssues)

All Jacobian matrix values are equal to zero

It depends on the function. For example, I have two functions:

def fun1(x):
    # some calculations that estimate res (res is a np.ndarray)
    return res

def fun2(x):
    return res

I don't want to call fun1 again, because that would repeat the calculation (I already called fun1 earlier in my code), so I store res and define fun2.
Why does the result depend on the calculations inside fun1? The res values are equal in fun1 and fun2.

jac_mat = jacobi(fun1, x)[0]  # gives me a matrix with non-zero values
jac_mat = jacobi(fun2, x)[0]  # gives me a matrix with zero values
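A self-contained sketch of the situation described above (np.sin stands in for the real calculation). fun2 ignores its argument and returns the cached array, so its output does not change when x is varied, and the estimated Jacobian is zero.

import numpy as np
from jacobi import jacobi

def fun1(x):
    # placeholder for the real calculation; the output depends on x
    return np.sin(x)

x = np.array([0.5, 1.0, 2.0])
res = fun1(x)  # cache the result of the earlier call

def fun2(x):
    # ignores x and returns the cached array
    return res

jac1 = jacobi(fun1, x)[0]  # non-zero diagonal entries
jac2 = jacobi(fun2, x)[0]  # all entries are zero, since the output does not vary with x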

Support masked arrays

Test and improve support for Numpy masked arrays.

Jacobi raises a ValueError if a numpy.ma.core.MaskedConstant is encountered as an input.
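A minimal reproducer sketch (np.square stands in for an arbitrary function); per the report above, this is expected to raise a ValueError.

import numpy as np
from jacobi import jacobi

x = np.ma.masked_array([1.0, 2.0, 3.0], mask=[False, True, False])

# the masked element is a numpy.ma.core.MaskedConstant; per the report above,
# jacobi raises a ValueError when it encounters it
jac, err = jacobi(np.square, x)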

propagate: support functions with multiple independent arguments

Often, one has a function that takes independent arguments. Propagating uncertainties through such a function is currently a bit cumbersome: one has to combine the arguments into a single vector and the covariances into a joint block covariance matrix (a sketch of this workaround follows the options below). propagate could be made more flexible to make this easier. I am not sure what the signature should look like, though.

def func(a, b):
    ...

a = [...]
b = [...]
cov_a = [...]
cov_b = [...]

# option 1
propagate(func, a, b, cov_a, cov_b)

# option 2
propagate(func, a, cov_a, b, cov_b)

# option 3
propagate(func, (a, cov_a), (b, cov_b))

Options 1 and 2 are natural extensions of the current calling convention; only the order of values and covariances differs. I am not sure which is most natural. I am leaning towards option 2.

Option 3 requires more typing and is not really much safer.
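For comparison, a sketch of the current workaround mentioned above: combine the arguments into one vector and build a joint block covariance matrix (the function body and the numbers are placeholders; a and b are assumed independent).

import numpy as np
from jacobi import propagate

def func(a, b):
    return a * b  # placeholder body

a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0])
cov_a = np.diag([0.1, 0.1])
cov_b = np.diag([0.2, 0.2])

def wrapped(z):
    # split the combined vector back into the original arguments
    return func(z[:len(a)], z[len(a):])

z = np.concatenate([a, b])
cov_z = np.zeros((len(z), len(z)))
cov_z[:len(a), :len(a)] = cov_a  # block for a
cov_z[len(a):, len(a):] = cov_b  # block for b (no cross terms: a and b independent)

y, ycov = propagate(wrapped, z, cov_z)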

New test case (hard)

A challenging test case is the function y = np.log(1 + a1 * x + a2 * x**2) with $a_1 = 2.0 \pm 0.2$, $a_2 = 1.0 \pm 0.1$, and correlation $\rho = -0.8$. Jacobi performs poorly in the divergent region near $x = -1$.
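A sketch of how this test case might be set up with propagate. The x grid below is benign; moving it toward x = -1, where the argument of the logarithm goes to zero, is what makes the case hard.

import numpy as np
from jacobi import propagate

x = np.linspace(0.0, 2.0, 5)  # shift this grid toward x = -1 to probe the hard region

def fn(a):
    # y = log(1 + a1 * x + a2 * x**2), evaluated on the fixed grid x
    return np.log(1 + a[0] * x + a[1] * x ** 2)

a = np.array([2.0, 1.0])
sa = np.array([0.2, 0.1])
rho = -0.8
cov = np.array([
    [sa[0] ** 2, rho * sa[0] * sa[1]],
    [rho * sa[0] * sa[1], sa[1] ** 2],
])

y, ycov = propagate(fn, a, cov)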

Make `diagonal=True` smarter?

From discussion with @mdhaber

It looks like step[0] is a relative (to the value of x) initial step size and step[1] is a reduction factor for each iteration. However, when diagonal=True, step[0] effectively becomes an absolute step due to the way the wrapper function works.
@HDembinski (Jun 26, 2023):

Yes, that's a speed trade-off. I am assuming here that the x-values are roughly of equal scale and don't vary a lot in magnitude. If they do, then diagonal=True should not be used. This needs to be properly documented at the very least.

The ideal solution in my view would be an algorithm that first groups x-values of similar magnitude into blocks and then does the calculation using the same absolute step for each block. The speed of jacobi comes from doing array calculations as much as possible. Such an algorithm would give the same result as diagonal=True in the ideal case and would fall back to the slow "one x-value at a time" worst case if necessary.
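For context, a minimal usage sketch, assuming the diagonal keyword discussed in this thread and x-values of comparable magnitude as stated above.

import numpy as np
from jacobi import jacobi

def f(x):
    # element-wise function, so the off-diagonal entries of the Jacobian vanish
    return np.exp(x)

x = np.array([0.5, 1.0, 1.5])  # roughly equal scale, as assumed above
jac, err = jacobi(f, x, diagonal=True)  # computes only the diagonal entries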

Calculation of Jacobian matrix

Hi. First, thank you for your nice library! I have questions about the Jacobian matrix calculation.

I solve a minimization problem with scipy.optimize.minimize (Nelder-Mead); my residual function is

np.mean(np.power(y_fact - y_predicted, 2)),

and I want to calculate the Jacobian matrix in order to compute a confidence interval for each optimal parameter (params):

y_predicted = y_predicted(params, x).

Your library is based on DERIVEST, and there is an example in the code:
[screenshot of the example]

When I do something like this:
[screenshot of the user's code]
I get not a matrix but a vector with shape (2,).

Is it possible to get a Jacobian matrix like the one below, where f(P, x) corresponds to y_predicted(params, x) and a stands for each parameter in params?

[image of the desired Jacobian matrix]
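One possible sketch, assuming a hypothetical y_predicted: fix x and differentiate with respect to params only, which yields a matrix with one row per data point and one column per parameter.

import numpy as np
from jacobi import jacobi

def y_predicted(params, x):
    # hypothetical model, standing in for the user's prediction function
    return params[0] * np.exp(-params[1] * x)

x = np.linspace(0.0, 1.0, 5)
params = np.array([1.0, 2.0])

# differentiate with respect to params only, keeping x fixed
jac, jac_err = jacobi(lambda p: y_predicted(p, x), params)
# jac has one row per data point and one column per parameter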

Implement efficiency error calculation

Hello Hans,

Thank you for this project. We have needed something like this for a long time and had tried to implement it ourselves privately, but given our limited programming skills our implementations were far from what you have done.

However, I have a suggestion. Many of us are trying to get away from ROOT, but there are things like:

https://root.cern.ch/doc/master/classTEfficiency.html

which are constantly needed to calculate efficiency errors. I wonder if Jacobi is a good place to add this feature; I did not see it implemented in the documentation.

Cheers.
