Type of AutoGrad.Rec #42

Is it possible to make AutoGrad.Rec a subtype of Real so that this package can work with Distributions.jl?
I tried that naively but it doesn't successfully compile.
I don't think so; Rec is typically an array. But we may find another solution if you describe what you are trying to do.
What I am trying to do is to allow Distributions.jl (and the functions it depends on) to support AD via AutoGrad.jl.
For what it's worth, I think the same happens with the Distances.jl package. When trying to autodifferentiate functions that call functions in Distances.jl, a type error occurs similar to the one described in the original post. For instance, here is a simple example:

A = randn(3, 5);
f = x -> sum(pairwise(SqEuclidean(), A, reshape(x, size(A,1), size(A,2)))) # some arbitrary function that calls a function in Distances.jl
g = grad(f)
g(vec(randn(3, 5)))

Unfortunately, the above will give the error:

ERROR: MethodError: no method matching pairwise(::Distances.SqEuclidean, ::Array{Float64,2}, ::AutoGrad.Rec{Array{Float64,2}})
Closest candidates are:
  pairwise(::Distances.PreMetric, ::AbstractArray{T,2} where T) at /Users/ngiann/.julia/v0.6/Distances/src/generic.jl:125
  pairwise(::Distances.PreMetric, ::AbstractArray{T,2} where T, ::AbstractArray{T,2} where T) at /Users/ngiann/.julia/v0.6/Distances/src/generic.jl:118

I think the problem with using AutoGrad on Distances.jl is very similar to the one with using AutoGrad on Distributions.jl. Many thanks.
In general, for AutoGrad to work it needs to wrap the gradient variables and their descendants in the Rec type. Most primitive functions in Base have been extended with a method that can handle Rec inputs. Any function on the path from inputs to loss has to be either generic or extended with Rec methods. New primitives can be defined using the @primitive macro; see @doc @primitive and the documentation in core.jl.
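To make that last point concrete, here is a minimal sketch of registering a new primitive (p is a hypothetical scalar function; the gradient expression follows the same @primitive form suggested later in this thread, and exact output may vary by AutoGrad version):

using AutoGrad

p(x::Real) = x * x               # a Real-annotated function, as Distributions.jl-style code would write it
@primitive p(x::Real),dy dy*x*2  # register p and its gradient; dy is the incoming (backpropagated) gradient

f(w) = p(w[1])
df = grad(f)
df([3.0])                        # expected gradient: [6.0]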
Thanks for the response. I will see if I can modify my example above using the suggested macro to get around the problem.
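For the Distances.jl example above, one alternative to hand-deriving a @primitive for pairwise is to swap in a generic plain-Julia equivalent that only uses operations AutoGrad already records. This is only a sketch (sqpairwise is a hypothetical helper, written in the Julia 0.6-era syntax used elsewhere in this thread, and it assumes AutoGrad's broadcast, reduction, and matmul primitives cover these calls):

using AutoGrad

# Squared-Euclidean pairwise distances: D[i,j] = |A[:,i]|^2 + |B[:,j]|^2 - 2*A[:,i]'B[:,j]
sqpairwise(A, B) = sum(abs2.(A), 1)' .+ sum(abs2.(B), 1) .- 2 .* (A' * B)

A = randn(3, 5)
f = x -> sum(sqpairwise(A, reshape(x, size(A, 1), size(A, 2))))
g = grad(f)
g(vec(randn(3, 5)))              # gradient with respect to the flattened second argument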
@denizyuret If my understanding is correct, as long as the underlying code is written in pure Julia and the corresponding primitive functions are extended to handle Rec, I can use AutoGrad.jl on that function. Distributions.jl and its dependencies do have the corresponding code written in pure Julia, which means AutoGrad.jl could work as long as the primitives are supported. The only problem here is that Distributions.jl and its dependencies use Real in their function signatures for scalar variables; because Rec <: Real is false, we cannot pass a Rec to their functions. ForwardDiff.jl handles this by making its Dual type (similar to Rec, but for forward-mode AD) a subtype of Real, which makes it compatible with Distributions.jl.
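To illustrate the subtype point (a sketch; Dual's declaration comes from ForwardDiff, not from this thread):

using AutoGrad, ForwardDiff, Distributions

AutoGrad.Rec <: Real          # false, so Real-annotated methods reject Rec
ForwardDiff.Dual <: Real      # true, so Real-annotated methods accept Dual

# which is why forward-mode AD through Distributions.jl works out of the box:
ForwardDiff.derivative(x -> logpdf(Normal(0, 1), x), 0.5)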
The @primitive macro defines methods of primitives that will handle Rec inputs.
Yes, I understand this. The problem is not about the primitives but about the function signatures Distributions.jl and its dependencies use. Let me put it in an example. AutoGrad.jl works in the example below:

p(x) = x * x
f(w) = p(w[1])
df = grad(f)
w = KnetArray([1.0])
df(w)

But if we change p(x) = x * x to p(x::Real) = x * x, AutoGrad.jl doesn't work, because the restriction to Real prevents passing a Rec.
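Spelled out with a plain array instead of a KnetArray (a sketch; the exact error text depends on the AutoGrad version), the Real-annotated variant fails at dispatch:

using AutoGrad

p(x::Real) = x * x
f(w) = p(w[1])
df = grad(f)
df([1.0])    # MethodError: no method matching p(::AutoGrad.Rec{Float64})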
@primitive p(x::Real),dy dy*x*2
(or whatever the derivative is) should work, no?
Well, you can do it, but the point here is that p(x) is usually a very complex function, e.g. the logpdf of a distribution. We really don't want to derive the derivative on our own and define it by @primitive; otherwise there is no point in using automatic differentiation here. However, if we can either 1) change the function signatures in Distributions.jl or 2) make Rec <: Real, the problem itself no longer exists, because the function is written in pure Julia and all the primitives it uses are defined in AutoGrad.jl.
I understand. Rec <: Real in the general case may break other use cases. However, you could create a fork/branch of AutoGrad.jl with Rec <: Real as a short-term solution. @jrevels is working on a Cassette-based solution where a Rec type will not be necessary, and this problem will be solved.
I tried to simply make that change, but then precompilation fails:

julia> using AutoGrad
INFO: Precompiling module AutoGrad.
WARNING: Method definition broadcast(Any, Union{Number, AbstractArray{T, N} where N where T}...) in module AutoGrad at /home/kai/.julia/v0.6/AutoGrad/src/unfuse.jl:35 overwritten at /home/kai/.julia/v0.6/AutoGrad/src/unfuse.jl:37.
ERROR: LoadError: LoadError: MethodError: no method matching sign(::AutoGrad.Broadcasted{Array{Float64,1}})
Closest candidates are:
sign(::Bool) at bool.jl:76
sign(::Unsigned) at number.jl:81
sign(::Rational) at rational.jl:221
...
Stacktrace:
[1] broadcast(::Function, ::Array{Float64,1}) at /home/kai/.julia/v0.6/AutoGrad/src/unfuse.jl:37
[2] #randin#25(::Float64, ::Function, ::Tuple{Float64,Float64}, ::Int64, ::Vararg{Int64,N} where N) at /home/kai/.julia/v0.6/AutoGrad/src/gradcheck.jl:209
[3] addtest1(::Symbol, ::Tuple{Float64,Float64}) at /home/kai/.julia/v0.6/AutoGrad/src/gradcheck.jl:193
[4] macro expansion at /home/kai/.julia/v0.6/AutoGrad/src/base/number.jl:12 [inlined]
[5] anonymous at ./<missing>:?
[6] include_from_node1(::String) at ./loading.jl:569
[7] include(::String) at ./sysimg.jl:14
[8] include_from_node1(::String) at ./loading.jl:569
[9] include(::String) at ./sysimg.jl:14
[10] anonymous at ./<missing>:2
while loading /home/kai/.julia/v0.6/AutoGrad/src/base/number.jl, in expression starting on line 6
while loading /home/kai/.julia/v0.6/AutoGrad/src/AutoGrad.jl, in expression starting on line 25
ERROR: Failed to precompile AutoGrad to /home/kai/.julia/lib/v0.6/AutoGrad.ji.
Stacktrace:
[1] compilecache(::String) at ./loading.jl:703
[2] _require(::Symbol) at ./loading.jl:490
[3] require(::Symbol) at ./loading.jl:398

Yes, I know about the Cassette.jl package, but for now AutoGrad.jl seems to be the most mature AD solution with GPU support in Julia.
Also, would you consider adding a compilation flag that lets AutoGrad.jl opt into Rec <: Real?
Any idea on this?