Nested derivative functions #31

Closed
ovainola opened this issue Jul 31, 2015 · 5 comments
@ovainola

Hi, I got my code working with the help of the last issue: #30. However, I found another example of how I would like my code to work.

Like the last example:

using ForwardDiff
f1(s) = (s/3)^2
f2(x) = x[1]^2 + exp(3*x[2]/sqrt(4))
function f(x)
    g = forwarddiff_gradient(f2, Float64, fadtype=:typed, n=2)
    return f1(x[1]) + g(x)
end
g = forwarddiff_gradient(f, Float64, fadtype=:typed, n=2)
g([-1., 2.0])

which ended up with this error message:

LoadError: MethodError: `g` has no method matching g(::Array{ForwardDiff.GraDual{Float64,2},1})
while loading In[11], in expression starting on line 9

 in f at In[11]:6
 in g at C:\Users\ovax03\.julia\v0.4\ForwardDiff\src\typed_fad\GraDual.jl:188
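
The error occurs because the outer forwarddiff_gradient evaluates f on an array of GraDual numbers, while the inner g only has a method for Float64 arrays. A minimal sketch of the same failure mode (the Dual type here is a hypothetical stand-in for illustration, not ForwardDiff's actual GraDual):

# Hypothetical two-field dual number, for illustration only.
struct Dual
    v::Float64  # primal value
    d::Float64  # derivative part
end

inner(x::Vector{Float64}) = sum(t -> t^2, x)  # accepts Float64 arrays only, like g above

# An outer differentiation pass supplies Dual numbers instead, which have
# no matching method, reproducing the MethodError above:
inner([Dual(-1.0, 1.0), Dual(2.0, 0.0)])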

I already found a workaround, but I'm not sure whether it should work like this:

using ForwardDiff
f1(s) = (s/3)^2
f2(x) = x[1]^2 + exp(3*x[2]/sqrt(4))
function f(x)
    xd = map(t->t.v, x)  # strip the GraDual inputs down to their primal Float64 values
    g = forwarddiff_gradient(f2, Float64, fadtype=:typed, n=2)
    return f1(x[1]) + g(xd)
end
g = forwarddiff_gradient(f, Float64, fadtype=:typed, n=2)
g([-1., 2.0])

which gave me a Hessian, as expected:

2x2 Array{Float64,2}:
 -0.222222  0.0
 -0.222222  0.0
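
One caveat with this workaround: map(t->t.v, x) keeps only the primal values, so the outer differentiation pass cannot see through g(xd); only f1(x[1]) carries derivatives, which is what the repeated -0.222222 column above reflects. Continuing the hypothetical Dual sketch from above:

xs = [Dual(-1.0, 1.0), Dual(2.0, 0.0)]  # the kind of input the outer pass supplies
xd = map(t -> t.v, xs)                  # Vector{Float64}: derivative parts discarded
# Anything computed from xd is a plain constant as far as the outer pass is
# concerned, so d(g(xd))/dx contributes nothing to the returned derivative.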

Note: I'm working with finite element material models, which happen to involve this kind of nested function.

@jrevels
Member

jrevels commented Jul 31, 2015

This is a really good use case, thanks for sharing. This is a problem with the new code in #27 as well. See the comment below.

@jrevels
Member

jrevels commented Jul 31, 2015

Rereading your code, I realize that f is a vector-valued function (f: R^2 --> R^2), meaning we want to take the Jacobian, not the gradient. You might try, then:

using ForwardDiff
f1(s) = (s/3)^2
f2(x) = x[1]^2 + exp(3*x[2]/sqrt(4))
function f(x)
    g = forwarddiff_gradient(f2, Float64, fadtype=:typed, n=2)
    return f1(x[1]) + g(x)
end
j = forwarddiff_jacobian(f, Float64, fadtype=:typed, n=2)
j([-1., 2.0])

I'm not sure whether that will actually work; it seems the old code uses the same implementation for the gradient as it does for the Jacobian. The #27 implementation, however, handles this fine:

julia> using ForwardDiff

julia> f1(s) = (s/3)^2;

julia> f2(x) = x[1]^2 + exp(3*x[2]/sqrt(4));

julia> f(x) = f1(x[1]) + gradient(f2, x, Partials{2});

julia> j = jacobian_func(f, Partials{2}, mutates=false);

julia> j([-1, 2.0])
2x2 Array{Float64,2}:
  1.77778    0.0
 -0.222222  45.1925
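
For reference, the same nested computation with the ForwardDiff API that grew out of #27 (a sketch against the modern exported ForwardDiff.gradient and ForwardDiff.jacobian; the broadcasted .+ is needed on current Julia to add a scalar to a vector):

using ForwardDiff

f1(s) = (s/3)^2
f2(x) = x[1]^2 + exp(3*x[2]/sqrt(4))

# f: R^2 -> R^2, so we take its Jacobian; the inner gradient call is
# differentiated through by the outer jacobian call (nested dual numbers).
f(x) = f1(x[1]) .+ ForwardDiff.gradient(f2, x)

ForwardDiff.jacobian(f, [-1.0, 2.0])  # 2x2 matrix, matching the output above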

@ovainola
Author

Thanks for answering so soon! I couldn't get the example code working, but I get the idea: you've implemented it in #27, and if that's coming within a couple of weeks I can live with that. My model converges quite nicely with Newton, at least in the 1D solution, even if the Jacobian might not be evaluated correctly.

@jrevels
Member

jrevels commented Aug 14, 2015

This is now fixed on master as of the merging of #27.

jrevels closed this as completed Aug 14, 2015
@ovainola
Author

👍
