alasdairtran / fourierflow
[ICLR 2023] Factorized Fourier Neural Operators
Home Page: https://arxiv.org/abs/2111.13802
License: MIT License
If this code were to run across different machines, we would want some way of parametrically interpolating settings into the data paths.
This already happens to a degree with $SM_MODEL_DIR, but it is not applied consistently. It could be, quite easily though. I'll prepare a pull request that allows configurable root directories.
The context here, @alasdairtran, is that since I want to send code both ways between us, it makes sense, at least for now, for me to work with something similar to your training infrastructure, since mine is totally ad hoc and therefore easy to throw out.
Obviously I don't know what all the Vevo stuff is, so I can't test my code against that, but I think it should be easy to get something that works for both of our experiments.
Don't worry, you are absolutely not required or expected to accept my pull requests, but if it is useful, please feel free to.
I think that `N` and `M` need to be flipped in the following lines. This only becomes clear when considering `N != M`.
Thank you for your excellent work! I want to ask a question about deepcopy: why is deepcopy used in the experiments? I notice that you also modified the nn.Linear code so that the weight norm can be deepcopied.
Hi,
This is a very nice paper, and thanks for keeping the code online. I have a question regarding how you include the viscosity and forcing function in your input. For example, in the original paper, for Navier-Stokes, the input shape to the Fourier layer was (batch_size, grid_x, grid_y, width).
Did you incorporate the viscosity and forcing function in this input? If so, can you please let me know how you added these parameters? Viscosity, I believe, is a scalar, so I am confused about how it fits into the inputs.
Looking forward to your reply,
Thanks, Tariq
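To make the question concrete: a common way to feed a scalar parameter such as viscosity into an FNO-style model is to broadcast it to a constant channel and concatenate it with the other per-point features. This is only a hedged sketch of that pattern (in NumPy, for illustration), not the repo's actual code; the channel names and values are assumptions.

```python
import numpy as np

# Sketch: broadcast a scalar viscosity to a constant grid channel and
# concatenate it with the state and forcing fields along the channel axis.
batch, nx, ny = 2, 64, 64
vorticity = np.random.rand(batch, nx, ny, 1)  # state field a(x, y)
forcing = np.random.rand(batch, nx, ny, 1)    # forcing field f(x, y)
nu = 1e-4                                     # scalar viscosity

nu_channel = np.full((batch, nx, ny, 1), nu)  # scalar repeated over the grid
features = np.concatenate([vorticity, forcing, nu_channel], axis=-1)
print(features.shape)  # (2, 64, 64, 3)
```

The lifting layer then maps these three channels to the model width, so the scalar enters the network exactly like any other per-point feature.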
Hi, first of all, thank you for the nice work you did.
I have a question about how you implemented the code in fourierflow/modules/zongyi_fno/grid_2d.py. In particular, it is not clear to me why you apply the affine transformation after calculating the kernel integral operator with the Fourier transform, instead of "in parallel". If I understand correctly, you are now calculating W K(z) + b instead of W z + b + K(z). Are you doing this because you noticed it works better, or for some other reason?
Thank you in advance
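To spell out the two orderings the question contrasts, here is a minimal sketch. `K` is just a placeholder linear map standing in for the spectral convolution, and `W`, `b` form the pointwise affine transform; all names are illustrative, not the repo's code.

```python
import numpy as np

# Compare "affine after kernel" (W K(z) + b) with the parallel form
# from the original FNO layer (W z + b + K(z)).
rng = np.random.default_rng(0)
d = 4
z = rng.normal(size=d)
W = rng.normal(size=(d, d))
b = rng.normal(size=d)
K = lambda v: 0.5 * v            # placeholder for the Fourier kernel operator

sequential = W @ K(z) + b        # affine applied after the kernel
parallel = W @ z + b + K(z)      # kernel and affine applied in parallel
print(np.allclose(sequential, parallel))
```

For a generic `W` the two expressions differ, so the choice of ordering genuinely changes the layer, which is what makes the question worth asking.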
Hi, I have a question about zongyi_fno/mesh_2d.py, class FNOMesh2D.
In the init function, self.fc0 = nn.Linear(4, self.width) # input channel is 3: (a(x, y), x, y) is defined.
I just wonder: why does the comment say the input channel is 3, while the nn.Linear input channel is 4?
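For reference, here is what the two channel counts would look like at the lifting layer's input. The stale comment likely describes the 3-channel case (a(x, y), x, y); a fourth channel would explain nn.Linear(4, width). The fourth feature below is purely hypothetical (a NumPy sketch, not the repo's code).

```python
import numpy as np

# Build the per-point feature vectors fed to an FNO lifting layer.
nx, ny = 32, 32
xs, ys = np.meshgrid(np.linspace(0, 1, nx),
                     np.linspace(0, 1, ny), indexing="ij")
a = np.random.rand(nx, ny)    # input field a(x, y)
extra = np.zeros((nx, ny))    # hypothetical fourth feature

inp3 = np.stack([a, xs, ys], axis=-1)         # matches the comment: 3 channels
inp4 = np.stack([a, xs, ys, extra], axis=-1)  # matches nn.Linear(4, width)
print(inp3.shape, inp4.shape)  # (32, 32, 3) (32, 32, 4)
```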
Hello, sorry for the possibly trivial question.
I don't understand how torch.fft.rfft2 works. It is clear that it performs the FFT over two dimensions of a real-valued function, but I couldn't find how the output is indexed. In particular, in your code, for example in fourierflow/modules/zongyi_fno/grid_2d.py, you take the Fourier coefficients indexed by `out_ft[:, :, :self.n_modes, :self.n_modes]` and `out_ft[:, :, -self.n_modes:, :self.n_modes]`. Calling `i` the first index that you extract and `j` the second, you extract (if I understand correctly) the first and last `n_modes` entries along `i` from the tensor `out_ft`. In the theory, we have to extract all values of `i, j` such that `|i| < n_modes`, and the same for `j`. For `j`, by the Hermitian symmetry of the FFT of a real-valued function, `out_ft` stores only non-negative values of `j`, so it is clear that the implementation's extraction corresponds to the theory. What is unclear to me is whether the extraction along `i` corresponds to the theory. Reformulating: it is unclear whether the implementation's extraction, `out_ft[:, :, :self.n_modes, :self.n_modes]` and `out_ft[:, :, -self.n_modes:, :self.n_modes]`, corresponds to the theoretical extraction `|i| < n_modes`. I couldn't find any information about this, so I'm asking whether you have already thought about it, and whether you found any information on this detail of the torch.fft.rfft2 implementation.
Thank you in advance.
Best regards,
Massimiliano Ghiotto
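The frequency ordering behind those two slices can be checked directly. The sketch below uses numpy.fft, whose full-FFT axis ordering matches torch.fft: along an axis of length n, index i in [0, n//2] holds frequency +i and index n - i holds frequency -i. Taking the first and last n_modes rows together therefore keeps frequencies 0 to n_modes-1 and -n_modes to -1, i.e. the lowest modes on both the positive and negative side.

```python
import numpy as np

# Frequencies stored at each index of a full FFT axis of length n.
n, n_modes = 16, 4
freqs = np.fft.fftfreq(n, d=1.0 / n)  # integer frequencies per index
low = freqs[:n_modes]                 # first n_modes rows: positive modes
high = freqs[-n_modes:]               # last n_modes rows: negative modes
print(low)   # [0. 1. 2. 3.]
print(high)  # [-4. -3. -2. -1.]
```

So the `:n_modes` slice covers the non-negative frequencies below n_modes and the `-n_modes:` slice covers the negative ones, which together correspond to the low-frequency truncation the theory describes (with the negative side reaching -n_modes inclusive).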