
Handling BCs (diffeqoperators.jl, closed)

sciml commented on July 20, 2024
Handling BCs

Comments (15)

dextorious commented on July 20, 2024

@shivin9 Okay, I may have insufficiently communicated my point in my last response. Let's clear this up.

There's a number of issues with your test, of which two are the most important:

  • You're using the parabolic test function I gave in the diffusion example. I used it because it was convenient for me to make the analytic solution go to zero, but that was a second order equation. You're now trying to test the modified fourth derivative (with fixed first derivative values at the ends). The first derivative of the test function is f'(x) = 1 - 2x, so setting f'(0)=2 and f'(1)=3 is inconsistent and anything after that is going to produce nonsense.

  • The choice of test function notwithstanding, the way you're testing this is wrong. You're calculating the modified fourth derivative (res in your code), then calculating the first derivative of res. That may, given an appropriate test function and a fine grid, match the fifth derivative of the test function, but it's not going to match the boundary values set for the first derivative.

Insofar as testing the modified derivative code is concerned, I don't think it makes any mathematical sense to test these approximations separately from equation solving. You can't retrieve the boundary values without either solving the equation or introducing further errors by other manipulations (you could technically integrate an appropriate number of times to get lower order derivatives back, but there are numerical issues with that).

dextorious commented on July 20, 2024

There's a number of ways to do this and the Fornberg algorithm gives us the freedom to choose whichever way we prefer, without the usual regard for how easy it is to derive particular approximations.

Purely as an illustrative counterpoint to the usual ghost point approach you're familiar with, let us consider the trivial example of approximating a second order derivative at the left end of a 1D grid (grid point x1) under the constraint that the first derivative is equal to a constant a (a Neumann condition). Using one-sided differences up to O(h^2) for the second derivative and O(h^3) for the first, we have:

2*x1 - 5*x2 + 4*x3 - x4 == f''
-11/6*x1 + 3*x2 - 3/2*x3 + 1/3*x4 == f' == a

All of those coefficients, to any order, are given by the Fornberg algorithm with no effort on our part. You can now simply multiply the second equation by 3 and add them together to get:

f'' == -3.5*x1 + 4*x2 - 0.5*x3 - 3*a

which is the formula for the second derivative at x1, taking into account the constraint on the value of the first derivative at that point. If you do the usual Taylor series math and calculate the remainder terms, you'll see that the final expression retains second order accuracy. This generalizes straightforwardly to any derivative and to any approximation order, and in simple cases like this one it doesn't even require solving an equation. You can do similar things with Robin conditions, but it gets a bit more complicated.
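As a quick sanity check on that claim, here is a minimal Julia sketch (purely illustrative; with the grid spacing h restored, the expression above becomes f'' ≈ (-3.5*x1 + 4*x2 - 0.5*x3)/h^2 - 3*a/h), applied to f(x) = exp(x), whose first derivative at the left end is known:

    # f(x) = exp(x): f'(0) = 1 (so a = 1) and the exact f''(0) = 1
    h = 0.01
    u = exp.(0:h:2*h)    # the three grid values u1, u2, u3 at spacing h
    a = 1.0              # prescribed first derivative at the boundary

    fpp = (-3.5*u[1] + 4*u[2] - 0.5*u[3]) / h^2 - 3*a / h
    # fpp ≈ 1 with an O(h^2) error (about -h^2/6 here); halving h shrinks it roughly 4x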

Obviously, this isn't the only way to do things. You can use ghost points and simple cases are just intuitively handled by reflection. I just thought I'd give you a slightly more systematic example in the sense that you can verify the accuracy directly.

shivin9 commented on July 20, 2024

"At the very last point, use a one-sided first derivative approximation of the correct order."

Well, I get that instead of solving the algebraic equation that comes up when using ghost points, we are using the Fornberg coefficients directly, but how do we incorporate the information about the value of the derivative at the ends? I mean, there is no equation to solve like we had in the ghost point case for zero Neumann (I derived the reflected coefficients by hand to see how they come up).

I can think of how to modify the coefficients at the edges using the ghost point technique. Although this would lead to a reduction in the order of accuracy (as @dextorious said), we would just take the reflected coefficients and subtract the adjusted value of the first derivative, as is done here on page 5.

ChrisRackauckas commented on July 20, 2024

To keep the order of accuracy, at the ends you can just use the one-sided derivative at the correct order. Then, at the very last point, use the one-sided derivative of the first derivative at the correct order. For example, with a second order approximation, the last cell on the right would use the second order backward discretization of the first derivative. The Fornberg algorithm should be able to compute the coefficients for that?
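For reference, such weights can also be generated by a plain Taylor-moment linear solve, which in exact arithmetic gives the same coefficients as Fornberg's recursion (the recursion is just the numerically nicer way to get them). A minimal Julia sketch, with fd_weights as a purely illustrative name rather than an existing API:

    # Weights w with sum(w .* f.(x)) ≈ f^(m)(x0) on the given nodes
    # (for nodes spaced by h, divide the result by h^m).
    function fd_weights(m::Int, x0, x::AbstractVector)
        n = length(x)
        A = [(xj - x0)^k for k in 0:n-1, xj in x]   # Taylor-moment (Vandermonde-type) matrix
        b = zeros(n); b[m+1] = factorial(m)         # select the m-th derivative, kill the rest
        return A \ b
    end

    fd_weights(2, 0, 0:3)   # ≈ [2, -5, 4, -1], the one-sided f'' stencil used above
    fd_weights(1, 0, 0:3)   # ≈ [-11/6, 3, -3/2, 1/3], the one-sided f' stencil
    fd_weights(1, 2, 0:2)   # ≈ [1/2, -2, 3/2], the 2nd order backward f' stencil for the right end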

dextorious commented on July 20, 2024

To expand a bit more on the previous comment, I played around a bit with the example I gave above and compared it to the usual ghost point approach in an IJulia notebook using a simple 1D heat equation: du/dt == d2u/dx2 defined on 0<=x<=1, with zero Neumann conditions on both ends and a parabolic initial condition. The analytic solution asymptotically goes to the line u=0, as you'd expect intuitively, though I also obtained the full dynamics using a Fourier series solution via Mathematica (yes, I'm lazy). The notebook doesn't have much in the way of comments, but it should be fairly self-explanatory and is meant more as an illustrative starting point that you can play around with to get a sense for these things.
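For anyone who wants to reproduce the gist of the setup without the notebook, here is a rough, self-contained Julia sketch (explicit Euler in time, the modified boundary stencil from the earlier comment with a = 0; a reconstruction of the setup, not the notebook itself):

    # u_t = u_xx on [0,1], zero Neumann BCs, zero-mean parabolic initial condition
    N = 51
    h = 1 / (N - 1)
    x = range(0, 1, length = N)
    u = [xi^2 - xi + 1/6 for xi in x]     # zero-mean parabola; the exact solution decays to 0

    function du_dt(u, h)
        N = length(u)
        d = similar(u)
        for i in 2:N-1
            d[i] = (u[i-1] - 2*u[i] + u[i+1]) / h^2          # interior centered stencil
        end
        # modified one-sided boundary rows with u'(0) = u'(1) = 0 (the -3a/h term vanishes)
        d[1] = (-3.5*u[1] + 4*u[2] - 0.5*u[3]) / h^2
        d[N] = (-3.5*u[N] + 4*u[N-1] - 0.5*u[N-2]) / h^2
        return d
    end

    dt = 0.25 * h^2                       # well inside the explicit-Euler stability limit
    for _ in 1:round(Int, 1.0 / dt)
        u .+= dt .* du_dt(u, h)
    end
    maximum(abs, u)                       # close to zero: the profile has flattened out

Swapping the two boundary rows for the reflected ghost-point rows ((-2, 2)/h^2 and its mirror) gives the comparison discussed next.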

You'll note that while both methods generally show second order accuracy, the approach I outlined above is more accurate on coarse grids and generally less prone to spurious oscillations far from the boundaries. The tradeoff, of course, is that the ghost point approach is slightly faster, as it requires fewer points.

Ultimately, I'd like to see both approaches implemented in PDEOperators.jl and leave the choice up to the user, but it makes sense to start with one of them. I think the more accurate choice makes sense as the default (particularly for users that may want to try highly non-linear PDEs where the oscillations might matter), but that's up to @ChrisRackauckas. Hope this helps a bit.

ChrisRackauckas commented on July 20, 2024

Yes, the ghost point method is a silly hack for order of accuracy, pushing the spot where it's order 1 out to a component whose error is not measured. Technically the right order, but order isn't the only thing that matters. In that sense, the one-sided difference is preferred.

However, the one-sided difference increases the bandsize. The difference in multiplications is negligible, so I'd ignore that, but the bandsize difference is huge. This means that when used in explicit solvers, or with iterative solvers, they will perform similarly (ignoring error differences). But when using a banded direct solver, the one-sided difference has a large overhead.
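To make the bandsize point concrete with the simple second-derivative example from earlier (a rough sketch, coefficients shown without the 1/h^2 factor, left end only; the right end is analogous):

    N = 6
    A_ghost = zeros(N, N); A_onesided = zeros(N, N)
    for i in 2:N-1
        A_ghost[i, i-1:i+1]    .= (1, -2, 1)      # interior centered stencil
        A_onesided[i, i-1:i+1] .= (1, -2, 1)
    end
    A_ghost[1, 1:2]    .= (-2, 2)                 # reflected ghost point, zero Neumann
    A_onesided[1, 1:3] .= (-3.5, 4, -0.5)         # modified one-sided row from earlier
    # upper bandwidth: 1 for A_ghost vs. 2 for A_onesided, and the gap grows with the
    # approximation order, which is what hurts a banded factorization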

I agree that it should be both, with the one-sided difference being the default. Ghost points will usually be worse, except in the edge case where a good banded matrix solver exists and you want to use it; even then it might only break even because of the error tradeoff. Ghost points are used since they are easy, not because they are good.

shivin9 commented on July 20, 2024

@dextorious @ChrisRackauckas , I have implemented the generalized version of the approach you showed with the example.
For any derivative order and approximation order, I am creating the one-sided stencil at the boundary points and also the corresponding 1st derivative stencil of the same length, i.e. of a higher order, so that I can cancel out the last coefficient in the same way you showed.

E.g., for the 4th derivative with a 4th order approximation, we get the one-sided stencil at x1 as:

    f'''' = [9.33333, -55.5, 142.0, -203.167, 176.0, -92.5, 27.3333, -3.5]

whereas the 1st derivative stencil at the boundary is

  f' = [-2.59286, 7.0, -10.5, 11.6667, -8.75, 4.2, -1.16667, 0.142857] = a

After eliminating x8, we get

  f'''' = [-54.1917, 116.0, -115.25, 82.6667, -38.375, 10.4, -1.25] - 24.5*a
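In code, the elimination step looks roughly like this (illustrative variable names, exact rational coefficients, unit grid spacing):

    d4 = [28/3, -111/2, 142, -1219/6, 176, -185/2, 82/3, -7/2]   # one-sided 4th derivative, 8 points
    d1 = [-363/140, 7, -21/2, 35/3, -35/4, 21/5, -7/6, 1/7]      # one-sided 1st derivative, same points

    α = d4[end] / d1[end]                 # = -24.5
    modified = (d4 .- α .* d1)[1:end-1]   # x8 eliminated; matches the stencil quoted above
    # f'''' ≈ sum(modified .* u[1:7]) + α * a   (with grid spacing h, divide the first
    # term by h^4 and the α*a term by h^3)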

Could there be any faults in this approach? Probably not, as we are just taking high order approximations at the boundaries. My only doubt is about testing the correctness, as in this code. Is that the expected result?

The corresponding code is here

dextorious commented on July 20, 2024

The coefficients are correct, so there shouldn't be any issues apart from bugs in the implementation. I haven't verified the implementation details yet, but it looks good.

As for your testing notebook, I don't think it does what you think it does. If I'm reading it correctly, you're computing the (modified) fourth derivative of the test function, then calculating the first derivative of that result. There's no reason that should match the boundary values.

I'm not sure about comprehensive unit tests, but in terms of functional tests and benchmarking, you can start building up examples with any PDEs you like that have easily verifiable solutions. My heat equation notebook example is one such case, for higher order derivatives you can always try the KdV equation, a hyperdiffusion equation or the usual linear elasticity problems (thin, clamped beam with an oscillating free end, etc.). Those will also come in handy for documentation later.

shivin9 commented on July 20, 2024

Well, this is what I saw in the second answer, but if the coefficients are correct then everything must be alright.

ChrisRackauckas commented on July 20, 2024

That test is not a good test. What's the analytical solution supposed to be? It's definitely not supposed to be flat zero, since the test function is symmetric on [0,1] and your BCs are asymmetric, meaning that at least one side has to be changing quite a bit.

sum(first_order_coeffs.*res[1:8]) # the first derivative at x1, should be equal to 2?

I'm not sure about this.

shivin9 commented on July 20, 2024

I'm taking the 4th derivative of a quadratic polynomial, so it should be flat zero, shouldn't it?
Right now I am just testing the operator and not solving any DE.

ChrisRackauckas commented on July 20, 2024

@dextorious was much clearer. The boundary equations have to be satisfied from the start for the approximate derivative to make sense. Your operator assumes the boundary derivative is correct, and calculates the 4th derivative assuming that the boundary derivatives are (2, 3). But since that's not the case, you're not getting the correct 4th derivative out. And numerically it's easy to see why that can never be the case: one boundary has a very different constant than the other (2 vs 3), while the test function is symmetric, meaning that even if one side were zero the other side could not be.

You should pick a different function to differentiate and make sure the first derivatives on the edges are correct (see the sketch below). Then the generated operator should produce the correct derivatives.

In PDEs, this is the property that the boundary condition has to be satisfied by the initial condition. If it's not satisfied by the initial condition, you cannot guarantee that there is a solution. Mathematically it makes no sense, and numerically you have this problem.
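For instance, a function consistent with those prescribed values is f(x) = 2x + x^2/2: it has f'(0) = 2, f'(1) = 3 and an identically zero 4th derivative, so the modified boundary row quoted above should return approximately zero once the prescribed derivative actually matches the function. A rough sketch:

    h = 0.1
    x = range(0, 0.7, length = 8)                # the 8 points entering the left boundary row
    u = [2*xi + xi^2 / 2 for xi in x]
    a = 2.0                                      # prescribed f'(0), now consistent with u

    d4 = [28/3, -111/2, 142, -1219/6, 176, -185/2, 82/3, -7/2]
    d1 = [-363/140, 7, -21/2, 35/3, -35/4, 21/5, -7/6, 1/7]
    modified = (d4 .+ 24.5 .* d1)[1:7]           # the modified 4th-derivative boundary row

    f4_at_x1 = sum(modified .* u[1:7]) / h^4 - 24.5 * a / h^3
    # ≈ 0 up to roundoff (amplified by 1/h^4), unlike with the symmetric parabola
    # and the inconsistent pair f'(0) = 2, f'(1) = 3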

anriseth commented on July 20, 2024

Is there a local branch anywhere with leaky BCs?

shivin9 commented on July 20, 2024

No, not yet. Work is going on in the new_boundary_conditions branch, where I will add leaky BCs soon.

shivin9 commented on July 20, 2024

TODO: fix the automatic direction case for Robin BCs.
