Comments (11)
Hi @bsun0802, thank you for your issue!
I think you can use `xitorch.linalg.symeig` with `A -> B` and `M -> A`.
The mathematical expression for one eigenpair in your problem is `B.x = lambda.A.x`, while for more than one eigenpair it becomes `B.X = A.X.E`, where `E` is a diagonal matrix containing the eigenvalues.
If your `A` and `B` are dense matrices, you can do something like this:

```python
Blinop = xitorch.LinearOperator.m(B)
Alinop = xitorch.LinearOperator.m(A)
eivals, eivecs = xitorch.linalg.symeig(Blinop, M=Alinop)
```
It will retrieve the complete eigenvalues and eigenvectors for you.
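As an independent cross-check (outside xitorch), the same generalized problem `B.x = lambda.A.x` can be solved with SciPy's `scipy.linalg.eigh`, which accepts a second matrix; the 7x7 SPD matrices below are made up to stand in for the covariance matrices in this issue:

```python
import numpy as np
from scipy.linalg import eigh  # solves a x = lambda b x when given two matrices

rng = np.random.default_rng(0)

def random_spd(n):
    # X X^T is positive semi-definite; adding n*I makes it safely posdef.
    X = rng.standard_normal((n, n))
    return X @ X.T + n * np.eye(n)

A = random_spd(7)
B = random_spd(7)

# eigh(B, A) solves B x = lambda A x, the problem discussed above.
eivals, eivecs = eigh(B, A)

# Check the defining relation B X = A X E with E = diag(eivals).
assert np.allclose(B @ eivecs, A @ eivecs @ np.diag(eivals), atol=1e-8)
```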
If you are interested in implementing it yourself, you can take a look at `xitorch/xitorch/_impls/linalg/symeig.py` (line 27 at commit b09d3bf).
From my experience using the two-matrices eigendecomposition, it is differentiable. A problem might occur if your `M` matrix is not posdef (but the problem itself is not well-defined for non-posdef `M`), or if the eigenvalues are degenerate (see pytorch/pytorch#47599).
from xitorch.
Thanks for your quick reply.
For your first comment: in my case, A and B are both dense 7x7 covariance matrices (which should be positive semi-definite), so I will give `xitorch.linalg.symeig` a try first.
For your second comment: do you mean `torch.lobpcg` should be differentiable in the posdef case, or that the xitorch solution should be differentiable?
Many thanks for your help!
I mean that generally the two-matrices eigendecomposition problem should be differentiable (unless the `M` matrix is not posdef; I am unsure about that case).
For `torch.lobpcg`, I'm not sure what the implementation status of the backward algorithm is, but `xitorch.linalg.symeig` should be differentiable (and numerically stable, in my experience), even if the eigenvalues are degenerate.
Hi @bsun0802 could you please let me know whether it works for you?
I haven't tried it yet, will let you know.
Hi @mfkasim1,
Yes, your code worked: both the forward and the backward pass work. But it increases my training time by about a third.
And yet I have another question. I brushed up on eigenvalues and determinants a bit. I think the joint eigenvalues can be obtained via native torch functions as long as A is invertible.
The joint eigenvalues are the solutions of det(lambda*A - B) = 0. Since A in my case is a covariance matrix which I take to be positive definite (full rank), A^-1 exists.
By the fact det(AB) = det(A)det(B), we get det(lambda*A - B) = det(lambda*I - B*A^-1) * det(A) = 0.
And if we assume det(A) != 0, then the joint eigenvalues lambda are exactly the eigenvalues of B*A^-1.
So lambdas = torch.symeig(B @ A.inverse()) should give the solution.
But torch.symeig(B @ A.inverse()) gives different answers from yours; I also tried switching the roles of A and B as you mentioned above, and it is still different.
Any thoughts?
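The discrepancy can be reproduced outside torch with a NumPy/SciPy sketch (all matrices below are made up): the eigenvalues of B·A^-1 do match the generalized eigenvalues, but B·A^-1 is not symmetric, and a symmetric-only solver fed that matrix silently reads just one triangle of it, which is a likely source of the mismatch:

```python
import numpy as np
from scipy.linalg import eigvalsh  # generalized symmetric eigenvalues

rng = np.random.default_rng(1)
X = rng.standard_normal((7, 7)); A = X @ X.T + 7 * np.eye(7)
Y = rng.standard_normal((7, 7)); B = Y @ Y.T + 7 * np.eye(7)

# Generalized eigenvalues of B x = lambda A x.
gen = np.sort(eigvalsh(B, A))

# B A^{-1} is similar to the symmetric A^{-1/2} B A^{-1/2}, so its
# eigenvalues are real and agree with the generalized ones...
BAinv = B @ np.linalg.inv(A)
plain = np.sort(np.linalg.eigvals(BAinv).real)
assert np.allclose(gen, plain, atol=1e-6)

# ...but BAinv itself is NOT symmetric, so a symmetric-only solver
# (which uses only one triangle) returns something else entirely.
assert not np.allclose(BAinv, BAinv.T)
wrong = np.sort(np.linalg.eigvalsh(BAinv))
assert not np.allclose(gen, wrong, atol=1e-6)
```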
Thanks for trying it out!
Yes, I haven't optimized xitorch for speed because numerical stability is still the priority at the moment.
To complement my answer in the other thread, one way to calculate that is `torch.eig(A^(-1)B)`. However, `torch.eig` is not as developed as `torch.symeig` (if I remember correctly, it can't take batched matrices). Also, the calculation with `eig` can produce complex eigenvectors in near-degenerate cases.
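That complex output is a property of general (non-symmetric) eigensolvers, easy to see with NumPy's `eig` (used here purely for illustration):

```python
import numpy as np

# A non-symmetric matrix can have genuinely complex eigenpairs;
# a 2x2 rotation matrix is the classic example.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eivals, eivecs = np.linalg.eig(R)
assert np.iscomplexobj(eivals)           # eigenvalues are exp(+/- i*theta)
assert np.allclose(np.abs(eivals), 1.0)  # both lie on the unit circle
```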
Many thanks for your help and time!
Yes, indeed B*A^-1 isn't symmetric even if both A and B are symmetric. So torch.eig should be used, but according to the torch documentation, the backward pass is only supported for torch.symeig.
I will try both xitorch and the solution in the other thread (working on something symmetric) to see the training and testing accuracy.
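For reference, the "work on something symmetric" route can be sketched with the standard Cholesky reduction (NumPy/SciPy, made-up matrices; this is the textbook transform, not code from either thread): with A = L L^T, the symmetric matrix C = L^{-1} B L^{-T} has the joint eigenvalues, and x = L^{-T} y maps its eigenvectors back.

```python
import numpy as np
from scipy.linalg import cholesky, eigh, solve_triangular

rng = np.random.default_rng(2)
X = rng.standard_normal((7, 7)); A = X @ X.T + 7 * np.eye(7)
Y = rng.standard_normal((7, 7)); B = Y @ Y.T + 7 * np.eye(7)

# A = L L^T (L lower-triangular), then C = L^{-1} B L^{-T} is symmetric.
L = cholesky(A, lower=True)
C = solve_triangular(L, solve_triangular(L, B, lower=True).T, lower=True)

# Ordinary symmetric eigendecomposition of C gives the joint eigenvalues.
eivals, Yv = eigh(C)

# Map eigenvectors of C back to generalized eigenvectors: x = L^{-T} y.
eivecs = solve_triangular(L.T, Yv, lower=False)
assert np.allclose(B @ eivecs, A @ eivecs * eivals, atol=1e-8)
```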
torch.eig has backward support as long as the eigenvectors are not complex (since 1.7, if I remember correctly). I implemented it a long time ago, but I forgot to update the doc: pytorch/pytorch#33090