Sign error on derivatives with certain types of functions #33
Comments
Could you give this a try with #27? That branch basically fixes everything.
Yeah, looks like this is correct: https://github.com/JuliaDiff/ForwardDiff.jl/pull/27/files#diff-85b7db0a1fe4b76c8f248e5b5d9ad774R71

Using the matrix syntax doesn't work, but rewriting it with `dot` does:

```julia
f(x) = [x[1]^2]' * [1.0]
ForwardDiff.gradient(f, [3.0])
# ERROR: MethodError: `load_gradient` has no method matching
# load_gradient(::Array{ForwardDiff.GradientNum{1,Float64,Tuple{Float64}},1})
#  in take_gradient at /home/kristoffer/.julia/v0.4/ForwardDiff/src/fad_api.jl:73
#  in gradient at /home/kristoffer/.julia/v0.4/ForwardDiff/src/fad_api.jl:78

f(x) = dot([x[1]^2], [1.0])
ForwardDiff.gradient(f, [3.0])
# 1-element Array{Float64,1}:
#  6.0
```
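For context on the `MethodError` above (this is a reading of the error text and of Julia 0.4 semantics, not something stated explicitly in the thread): `ForwardDiff.gradient` expects `f` to return a scalar, but on Julia 0.4 a transposed vector times a vector yields a 1-element `Array`, so `load_gradient` is handed an `Array{GradientNum,1}` it has no method for. A minimal sketch of the shape difference:

```julia
# Julia 0.4 semantics (the version in the stack trace above): `'` on a Vector
# gives a 1xN Matrix, and Matrix * Vector gives a Vector, so the result is a
# 1-element Array rather than a scalar. Later Julia versions return a scalar here.
a = [3.0^2]
b = [1.0]

a' * b      # 1-element Array{Float64,1} on Julia 0.4 (not a scalar)
dot(a, b)   # 9.0 -- a plain scalar, which is what gradient needs f to return
            # (on Julia >= 0.7, dot requires `using LinearAlgebra`)
```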
Just a note on why this results in an error:

Ah, of course, thank you for pointing that out.
Am I doing something forbidden here?
Edit: Oh, this is almost exactly #24, but shouldn't that have been fixed in #25?
Edit 2: Looked into it a bit more; the problem is not with transpose but with conjugate. Basically, why does `conj` on an `Array{ForwardDiff.GraDual{Float64,1},1}` negate the gradient part?
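To make the sign flip concrete, here is a minimal sketch with a hypothetical dual type (`MiniDual` is an illustration, not ForwardDiff's actual `GraDual`): since `'` (ctranspose) applies `conj` elementwise, a `conj` that negates the derivative part will silently flip the sign of any gradient that passes through a `'`.

```julia
# Hypothetical stand-in for a forward-mode dual number (NOT ForwardDiff's real
# GraDual type), just to show what a conj that negates the derivative part does.
struct MiniDual
    val::Float64   # primal value
    der::Float64   # derivative part
end

# Product rule for the derivative part.
Base.:*(a::MiniDual, b::MiniDual) = MiniDual(a.val * b.val, a.der * b.val + a.val * b.der)

# The questionable definition: treating the derivative part like an imaginary component.
Base.conj(d::MiniDual) = MiniDual(d.val, -d.der)

x = MiniDual(3.0, 1.0)   # seed d/dx = 1 at x = 3
y = x * x                # x^2 -> MiniDual(9.0, 6.0)
conj(y)                  # MiniDual(9.0, -6.0): the derivative's sign is flipped
```

For a real-valued dual number there is no imaginary part, so one would expect `conj` to leave the derivative untouched; treating it like an imaginary component is what produces the reported sign error.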