
Comments (11)

mfkasim1 commented on June 1, 2024

Hi @bsun0802, thank you for your issue!
I think you can use xitorch.linalg.symeig with A -> B and M -> A.
The mathematical expression for one eigenpair in your problem is B x = lambda A x, while for more than one eigenpair it becomes B X = A X E, where E is a diagonal matrix containing the eigenvalues.
If your A and B are dense matrices, you can do something like this:

import xitorch
import xitorch.linalg

Blinop = xitorch.LinearOperator.m(B)  # wrap the dense matrix B as a LinearOperator
Alinop = xitorch.LinearOperator.m(A)  # likewise for A
eivals, eivecs = xitorch.linalg.symeig(Blinop, M=Alinop)  # solves B X = A X E

It will retrieve the complete eigenvalues and eigenvectors for you.
If you are interested in implementing it yourself, you can take a look at

mfkasim1 commented on June 1, 2024

From my experience using the two-matrix eigendecomposition, it is differentiable.
A problem might occur if your M matrix is not posdef (but the problem itself is not well-defined for a non-posdef M) or if the eigenvalues are degenerate (see pytorch/pytorch#47599).
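
In the well-behaved case (posdef M, distinct eigenvalues), a quick way to convince yourself is torch.autograd.gradcheck. A minimal sketch (my addition, not from the original comment), assuming small, well-conditioned double-precision matrices:

import torch
import xitorch
import xitorch.linalg

torch.manual_seed(0)
n = 4
A = torch.rand(n, n, dtype=torch.float64)
B = torch.rand(n, n, dtype=torch.float64)
A = A + A.T + 2 * n * torch.eye(n, dtype=torch.float64)  # symmetric, diagonally dominant -> posdef
B = B + B.T                                               # symmetric
A.requires_grad_(); B.requires_grad_()

def sum_of_eigenvalues(B, A):
    # generalized problem B X = A X E, as in the snippet above
    eivals, _ = xitorch.linalg.symeig(
        xitorch.LinearOperator.m(B), M=xitorch.LinearOperator.m(A))
    return eivals.sum()

# raises if the analytical and numerical gradients disagree
torch.autograd.gradcheck(sum_of_eigenvalues, (B, A))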

bsun0802 commented on June 1, 2024

Thanks for your quick reply.

For your first comment,
In my case, A and B are both dense (7x7) covariance matrices (which are supposed to be non-negative definite). So I will give xitorch.linalg.symeig a try first.

For your second comment,
do you mean that torch.lobpcg should be differentiable in the posdef case, or that the xitorch solution should be differentiable?

Great thanks for your help!

mfkasim1 commented on June 1, 2024

I mean that, in general, the two-matrix eigendecomposition problem should be differentiable (unless the M matrix is not posdef; I am unsure about that case).

For torch.lobpcg, I'm not sure what the implementation status of the backward algorithm is, but xitorch.linalg.symeig should be differentiable (and, in my experience, numerically stable), even if the eigenvalues are degenerate.

mfkasim1 commented on June 1, 2024

Hi @bsun0802, could you please let me know whether it works for you?

bsun0802 commented on June 1, 2024

I haven't tried it yet, will let you know.

bsun0802 commented on June 1, 2024

Hi @mfkasim1,

Yes, your code worked: both the forward pass and the backward pass. But it increases my training time by about 1/3.

And I have another question. I brushed up on eigenvalues and determinants a bit, and I think the joint eigenvalues can be obtained via native torch functions as long as A is invertible.

The joint eigenvalues are the solutions of det(lambda*A - B) = 0. Since A in my case is a covariance matrix, which is positive definite, A^-1 exists.

Using the fact that det(AB) = det(A)det(B), we get det(lambda*A - B) = det(lambda*I - B*A^-1) * det(A) = 0.
And if we assume det(A) != 0, then the joint eigenvalues (lambda) are actually the eigenvalues of B*A^-1.

So lambdas = torch.symeig(BA^-1) is the solution.

But torch.symeig(BA^-1) gives different answers from yours. I also tried switching the roles of A and B as you mentioned above, and it is still different.

Any thoughts?

mfkasim1 commented on June 1, 2024

Thanks for trying it out!
Yes, I haven't optimized xitorch for speed because numerical stability is still the priority at the moment.
To complement my answer in the other thread, one way to calculate that is torch.eig(A^(-1)B).
However, torch.eig is not as developed as torch.symeig (if I remember correctly, it can't take batched matrices).
Also, the calculation with eig can produce complex eigenvectors in near-degenerate cases.
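
As a quick illustration (a sketch added here, not part of the original comment), the generalized eigenvalues from xitorch should match the eigenvalues of A^-1 B computed with a general eigensolver; torch.linalg.eig is the modern replacement for the now-deprecated torch.eig:

import torch
import xitorch
import xitorch.linalg

torch.manual_seed(0)
n = 7
A = torch.rand(n, n, dtype=torch.float64)
B = torch.rand(n, n, dtype=torch.float64)
A = A + A.T + 2 * n * torch.eye(n, dtype=torch.float64)  # symmetric posdef
B = B + B.T                                               # symmetric

# generalized eigenvalues of B x = lambda A x via xitorch
evals_gen, _ = xitorch.linalg.symeig(
    xitorch.LinearOperator.m(B), M=xitorch.LinearOperator.m(A))

# eigenvalues of A^-1 B via the general (non-symmetric) eigensolver
evals_eig = torch.linalg.eig(torch.linalg.solve(A, B)).eigenvalues

print(torch.sort(evals_gen).values)       # the two prints should agree
print(torch.sort(evals_eig.real).values)  # imaginary parts are ~0 here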

bsun0802 commented on June 1, 2024

Great thanks for your help and time!

bsun0802 commented on June 1, 2024

Yes, indeed BA^-1 isn't symmetric even if both A and B are symmetric, so torch.eig should be used. But according to the torch documentation, the backward pass is only supported for torch.symeig.

I will try both xitorch and the solution in the other thread (working with something symmetric) and compare the training and testing accuracy.

mfkasim1 commented on June 1, 2024

torch.eig has backward support as long as the eigenvectors are not complex (since PyTorch 1.7, if I remember correctly). I implemented it a long time ago, but I forgot to update the docs: pytorch/pytorch#33090
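
For reference, a small sketch (my addition, not from the original comment) of backprop through the non-symmetric solver using torch.linalg.eig, the successor of torch.eig; gradients are well-defined as long as the eigenvalues are distinct and the loss is real-valued:

import torch

torch.manual_seed(0)
mat = torch.rand(5, 5, dtype=torch.float64, requires_grad=True)
evals = torch.linalg.eig(mat).eigenvalues  # complex in general
loss = (evals.real ** 2).sum()             # real-valued loss
loss.backward()
print(mat.grad)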
