Comments (10)
The current behavior is technically correct, but the proposed behavior seems reasonable as well. Is there an efficient way to check this?
from dualnumbers.jl.
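On the "efficient way to check this" question: since -0.0 == 0.0, an equality test cannot distinguish the two zeros, but reading the sign bit directly is cheap. Julia has a built-in signbit for exactly this; the same idea in Python (purely illustrative, with a hypothetical helper name) looks like:

```python
import math

def is_negative_zero(x: float) -> bool:
    """Detect -0.0 without comparing to 0.0 (since -0.0 == 0.0 is True)."""
    # copysign exposes the IEEE-754 sign bit even when the value is zero
    return x == 0.0 and math.copysign(1.0, x) < 0.0

print(is_negative_zero(-0.0))  # True
print(is_negative_zero(0.0))   # False
print(is_negative_zero(-1.0))  # False (nonzero, so not a negative zero)
```

In Julia the equivalent one-liner would be `signbit(x) && x == 0`.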
I don't know enough about -0.0. Could you explain what makes the current behavior technically correct?
Absolute value isn't differentiable at zero, so all we can expect is a valid subgradient. Both 1.0 and -1.0 are valid subgradients at that point, because they're both tangent to the graph of absolute value.
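To make the subgradient point concrete, here is a minimal dual-number sketch in Python (illustrative only; `Dual` and `dabs` are hypothetical names, not the DualNumbers.jl implementation). At zero, either sign convention yields a valid subgradient of absolute value:

```python
# Minimal dual-number model: z = re + du * (nilpotent unit)
class Dual:
    def __init__(self, re, du):
        self.re, self.du = re, du
    def __repr__(self):
        return f"{self.re} + {self.du}du"

def dabs(z):
    # d|x|/dx is +1 for x > 0 and -1 for x < 0; at x == 0 any value in
    # [-1, 1] is a valid subgradient, so either sign convention is defensible.
    s = 1.0 if z.re > 0 else -1.0 if z.re < 0 else 1.0  # convention at 0: +1
    return Dual(abs(z.re), s * z.du)

print(dabs(Dual(0.0, 1.0)))   # 0.0 + 1.0du (the +1 subgradient convention)
print(dabs(Dual(-2.0, 1.0)))  # 2.0 + -1.0du
```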
What about the sign on the real part?
The difference between -0.0 and 0.0 is a weird floating-point curiosity:
julia> 0.0 == -0.0
true
julia> -0.0 >= 0.0
true
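The same signed-zero behavior is easy to reproduce outside Julia. A short Python illustration of why the comparisons hide the sign while the bit still matters:

```python
import math

# IEEE-754 signed zeros: equal under every comparison operator...
print(0.0 == -0.0)               # True
print(-0.0 >= 0.0)               # True

# ...but the sign bit is still there, and some functions observe it
print(math.copysign(1.0, -0.0))  # -1.0: the sign bit survives
print(math.atan2(0.0, -0.0))     # pi (atan2(0.0, 0.0) would be 0.0)
```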
Yeah, I'm with you...
I was thinking about this again.
Currently we have this situation:
julia> z = Dual(-0.0,1.0)
-0.0 + 1.0du
julia> abs(z)
-0.0 + 1.0du
julia> sqrt(abs2(z))
dual(0.0,NaN)
I understand why Calculus.jl defines the derivative of square root as it does; however, I think it is possible to define sqrt(::Dual) so that it works as expected in the above case.
# https://en.wikipedia.org/wiki/Dual_number#Linear_representation
function mat(z::Dual)
    r = eye(2)*z.re
    r[1,2] = z.du
    return r
end
julia> mat(z)
2x2 Array{Float64,2}:
-0.0 1.0
-0.0 -0.0
julia> sqrtm(mat(abs2(z)))
2x2 Array{Complex{Float64},2}:
0.0+0.0im 0.0+0.0im
0.0+0.0im 0.0+0.0im
The final matrix corresponds to Dual(0.0, 0.0).
Note that the current definition and the sqrtm(mat(::Dual)) operation above already agree in a lot of weird cases:
julia> sqrtm(mat(Dual(0.0,1.0)) )
2x2 Array{Float64,2}:
0.0 Inf
0.0 0.0
julia> sqrt(Dual(0.0,1.0))
dual(0.0,Inf)
julia> sqrtm(mat(Dual(-0.0,1.0)) )
2x2 Array{Float64,2}:
-0.0 -Inf
0.0 -0.0
julia> sqrt(Dual(-0.0,1.0))
dual(-0.0,-Inf)
They disagree only in the following case:
julia> sqrtm(mat(Dual(0.0,-0.0)))
2x2 Array{Complex{Float64},2}:
0.0+0.0im 0.0+0.0im
0.0+0.0im 0.0+0.0im
julia> sqrt(Dual(0.0, -0.0))
dual(0.0,NaN)
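The closed form behind sqrtm(mat(z)) makes the agreement unsurprising: for an upper-triangular [[a, b], [0, a]] with a > 0, the principal square root is [[√a, b/(2√a)], [0, √a]], which is exactly the dual-number chain rule sqrt(a + b·du) = √a + (b/(2√a))·du. A small Python sketch (`mat` and `sqrt_mat` are illustrative names, not library functions):

```python
import math

# A dual number a + b*du corresponds to the 2x2 matrix [[a, b], [0, a]]
# (https://en.wikipedia.org/wiki/Dual_number#Linear_representation).
def mat(a, b):
    return [[a, b], [0.0, a]]

# Principal square root of [[a, b], [0, a]] for a > 0; the off-diagonal
# entry b/(2*sqrt(a)) is the derivative of sqrt via the chain rule.
def sqrt_mat(m):
    a, b = m[0][0], m[0][1]
    r = math.sqrt(a)
    return [[r, b / (2.0 * r)], [0.0, r]]

print(sqrt_mat(mat(4.0, 1.0)))  # [[2.0, 0.25], [0.0, 2.0]]
```

At a = 0 the off-diagonal entry b/(2√a) blows up, which is precisely where the Inf/NaN disagreements discussed above come from.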
How is sqrtm defined at [0 1; 0 0]? The usual definitions of matrix functions (Taylor series, Cauchy integral formula) don't seem to apply when the spectrum touches a singularity. Or is it entrywise analytic continuation?
@dlfivefifty I don't know, but I can say that it is doing what matches my intuition based on the auto-diff properties of Dual numbers.
#repeated from above
julia> sqrtm(mat(Dual(0.0,1.0)) )
2x2 Array{Float64,2}:
0.0 Inf
0.0 0.0
julia> a = [0 1; 0 0]
2x2 Array{Int64,2}:
0 1
0 0
julia> sqrtm(a+eye(2)*nextfloat(0.0))
2x2 Array{Float64,2}:
2.22276e-162 2.24946e161
0.0 2.22276e-162
julia> sqrt(Dual(nextfloat(0.0), 1.0))
2.2227587494850775e-162 + 2.2494568972715982e161du
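The perturbation argument can be checked with plain floats: Python 3.9+ has math.nextafter, analogous to Julia's nextfloat, and the derivative rule 1/(2√x) reproduces the magnitudes in the output above:

```python
import math

# Perturb the real part to the smallest positive float, as in the
# sqrtm(a + eye(2)*nextfloat(0.0)) experiment above.
eps = math.nextafter(0.0, 1.0)  # 5e-324, smallest positive subnormal
r = math.sqrt(eps)              # ~2.2228e-162, the perturbed value
d = 1.0 / (2.0 * r)             # derivative of sqrt at eps: ~2.2495e161
print(r, d)
```

The huge-but-finite derivative shows why both sqrtm on the perturbed matrix and sqrt on the perturbed Dual agree: away from zero there is no ambiguity, and both are computing the same thing.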
Ok I guess that's the analytic continuation version.