I want to define a new softplus with a β parameter, similar to PyTorch's Softplus.
using Flux
using CUDA
mysoftplus(x, β=100f0) = 1f0 / β * log(1f0 + exp(β*x))
CUDA.@cufunc mysoftplus(x, β=100.0f0) = 1f0 / β * CUDA.log(1f0 + CUDA.exp(β*x))
# g(x) = CUDA.logsoftplus.(x)
# CUDA.@cufunc g(x) = CUDA.log.(x)
function Net(ndim)
net = Chain(Dense(ndim, 2, mysoftplus))
return net
end
net = Net(3)
net = Flux.fmap(cu, net)
x = cu(zeros(3,4))
net(x)
gradient( x -> sum(net(x)), x )
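For reference, one thing worth trying (a sketch I have not verified on the GPU, assuming `Zygote` is available alongside Flux) is to give the activation an explicit adjoint, so Zygote does not have to build its generic broadcast pullback, whose closure is the non-isbits object named in the error. The derivative of (1/β)·log(1 + exp(βx)) with respect to x is σ(βx):

```julia
using Zygote

# β-softplus; log1p(exp(βx))/β equals (1/β)*log(1 + exp(βx))
# but is slightly more accurate for small βx.
mysoftplus(x, β=100f0) = log1p(exp(β * x)) / β

# Explicit adjoint: d/dx softplus_β(x) = sigmoid(βx).
# β is treated as a constant here (its gradient is `nothing`).
Zygote.@adjoint function mysoftplus(x, β)
    y = mysoftplus(x, β)
    return y, Δ -> (Δ / (1f0 + exp(-β * x)), nothing)
end
```

Whether this avoids the non-isbits kernel argument in your setup I cannot say for certain, since Zygote may still take a different broadcast path on CuArrays; it is a sketch of the custom-adjoint approach, not a confirmed fix.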
I don't know what happened; the error message is below. (I searched for a similar error, but it didn't help.)
┌ Warning: calls to Base intrinsics might be GPU incompatible
│ exception =
│ You called log(x::Float32) in Base.Math at special/log.jl:290, maybe you intended to call log(x::Float32) in CUDA at /home/jakoo/.julia/packages/CUDA/5t6R9/src/device/cuda/math.jl:73 instead?
│ Stacktrace:
│ [1] log at special/log.jl:290
│ [2] #20 at /home/jakoo/.julia/packages/GPUArrays/JqOUg/src/host/broadcast.jl:57
└ @ GPUCompiler ~/.julia/packages/GPUCompiler/lqbF2/src/irgen.jl:68
ERROR: LoadError: GPU compilation of kernel broadcast(CUDA.CuKernelContext, CuDeviceArray{Tuple{Float32,typeof(∂(mysoftplus))},2,CUDA.AS.Global}, Base.Broadcast.Broadcasted{Nothing,Tuple{Base.OneTo{Int64},Base.OneTo{Int64}},Zygote.var"#1750#1757"{Zygote.Context,typeof(mysoftplus)},Tuple{Base.Broadcast.Extruded{CuDeviceArray{Float32,2,CUDA.AS.Global},Tuple{Bool,Bool},Tuple{Int64,Int64}}}}) failed
KernelError: passing and using non-bitstype argument
Argument 4 to your kernel function is of type Base.Broadcast.Broadcasted{Nothing,Tuple{Base.OneTo{Int64},Base.OneTo{Int64}},Zygote.var"#1750#1757"{Zygote.Context,typeof(mysoftplus)},Tuple{Base.Broadcast.Extruded{CuDeviceArray{Float32,2,CUDA.AS.Global},Tuple{Bool,Bool},Tuple{Int64,Int64}}}}, which is not isbits:
.f is of type Zygote.var"#1750#1757"{Zygote.Context,typeof(mysoftplus)} which is not isbits.
.context is of type Zygote.Context which is not isbits.
.cache is of type Union{Nothing, IdDict{Any,Any}} which is not isbits.
Passing non-isbits types is only allowed if they are unused by the kernel.