Task switch error on Enzyme v0.13 #2081
@vchuravy this looks like a problem of Julia not liking the locks of the caching mechanism all of a sudden? I have no idea why.
@MilesCranmer from the looks of it, it seems like this is related to doing a first compile in a task. Separately, we've been consistently running the integration CI you made a while ago, and things continue to pass. Presumably that is set to the version at the time, so perhaps you can find what changed on your end to cause the issue?
The main difference is that the first compile on my system is in a worker thread, whereas in the CI, it's the main thread. Nothing has changed on my side though.
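For reference, a minimal sketch of the scenario being described, where the first autodiff compile happens on a spawned task rather than the main thread (f here is a hypothetical placeholder, not the SymbolicRegression code):

using Enzyme

f(x) = x^2

t = Threads.@spawn begin
    # First compilation of this autodiff signature happens on a worker
    # thread here, not on the main thread.
    Enzyme.autodiff(Reverse, f, Active, Active(3.0))
end
fetch(t)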
Friendly ping on this.
This issue is a bit out of my wheelhouse. @vchuravy's help will probably be needed here.
@MilesCranmer it looks like you are calling autodiff from within the generator of a generated function?
Ah, no... Enzyme uses a generated function (side-eyes billy) and from there invokes …
@MilesCranmer Can you try using NonGenABI?
(Just confirming – my call to Enzyme is indeed not within any generated functions)
To be clear, that's not a global setting; you would do Enzyme.autodiff(set_abi(Forward, NonGenABI), …). Can you also include the stack trace?
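For reverse mode, the per-call equivalent would look something like this (a sketch, assuming set_abi and NonGenABI are exported as used above; f, x, and dx are placeholders, not from the issue):

using Enzyme

f(x) = sum(abs2, x)   # placeholder objective
x = [1.0, 2.0]
dx = zeros(2)         # shadow buffer that receives the gradient

# set_abi wraps the mode for this one call; nothing global changes.
Enzyme.autodiff(set_abi(Reverse, NonGenABI), f, Active, Duplicated(x, dx))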
Oh I see. I only have reverse-mode set up at the moment; here's the full error with that tweak:

$ julia --project=. examples/parameterized_function.jl
[ Info: Training machine(SRRegressor(defaults = nothing, …), …).
[ Info: Started!
┌ Error: Problem fitting the machine machine(SRRegressor(defaults = nothing, …), …).
└ @ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:694
[ Info: Running type checks...
[ Info: Type checks okay.
ERROR: LoadError: TaskFailedException
Stacktrace:
[1] wait(t::Task)
@ Base ./task.jl:370
[2] fetch
@ ./task.jl:390 [inlined]
[3] _main_search_loop!(state::SymbolicRegression.SearchUtilsModule.SearchState{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, Task, Channel}, datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}, ropt::SymbolicRegression.SearchUtilsModule.RuntimeOptions{:multithreading, 1, true, Nothing}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5})
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:833
[4] _equation_search(datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}, ropt::SymbolicRegression.SearchUtilsModule.RuntimeOptions{:multithreading, 1, true, Nothing}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, saved_state::Nothing)
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:535
[5] equation_search(datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}; options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, saved_state::Nothing, runtime_options::Nothing, runtime_options_kws::@Kwargs{niterations::Int64, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, heap_size_hint_in_bytes::Nothing, runtests::Bool, return_state::Bool, run_id::Nothing, verbosity::Int64, logger::Nothing, progress::Nothing, v_dim_out::Val{1}})
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:525
[6] equation_search
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:506 [inlined]
[7] equation_search(X::Matrix{Float64}, y::Matrix{Float64}; niterations::Int64, weights::Nothing, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, variable_names::Vector{String}, display_variable_names::Vector{String}, y_variable_names::Nothing, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, heap_size_hint_in_bytes::Nothing, runtests::Bool, saved_state::Nothing, return_state::Bool, run_id::Nothing, loss_type::Type{Nothing}, verbosity::Int64, logger::Nothing, progress::Nothing, X_units::Nothing, y_units::Nothing, extra::@NamedTuple{class::Vector{Int64}}, v_dim_out::Val{1}, multithreaded::Nothing)
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:476
[8] #equation_search#21
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:499 [inlined]
[9] _update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}}, y::Vector{Float64}, w::Nothing, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, class::Vector{Int64})
@ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:253
[10] _update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64}, w::Nothing, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, class::Nothing)
@ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:220
[11] update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64}, w::Nothing)
@ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:201
[12] fit
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:189 [inlined]
[13] fit(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64})
@ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:189
[14] fit_only!(mach::MLJBase.Machine{SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, true}; rows::Nothing, verbosity::Int64, force::Bool, composite::Nothing)
@ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:692
[15] fit_only!
@ ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:617 [inlined]
[16] #fit!#63
@ ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:789 [inlined]
[17] fit!(mach::MLJBase.Machine{SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, true})
@ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:786
[18] top-level scope
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/examples/parameterized_function.jl:101
nested task error: TaskFailedException
Stacktrace:
[1] wait(t::Task)
@ Base ./task.jl:370
[2] fetch
@ ./task.jl:390 [inlined]
[3] (::SymbolicRegression.var"#56#61"{SymbolicRegression.SearchUtilsModule.SearchState{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, Task, Channel}, Int64, Int64})()
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:810
nested task error: TaskFailedException
Stacktrace:
[1] wait(t::Task)
@ Base ./task.jl:370
[2] fetch
@ ./task.jl:390 [inlined]
[3] with_stacksize
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:31 [inlined]
[4] (::SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}})(::Float64, G::Vector{Float64}, x::Vector{Float64})
@ SymbolicRegressionEnzymeExt ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:41
[5] (::NLSolversBase.var"#69#70"{NLSolversBase.InplaceObjective{Nothing, SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}}, Nothing, Nothing, Nothing}, Float64})(G::Vector{Float64}, x::Vector{Float64})
@ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/objective_types/incomplete.jl:54
[6] value_gradient!!(obj::NLSolversBase.OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, x::Vector{Float64})
@ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/interface.jl:82
[7] initial_state(method::Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.BackTracking{Float64, Int64}, Nothing, Nothing, Optim.Flat}, options::Optim.Options{Float64, Nothing}, d::NLSolversBase.OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, initial_x::Vector{Float64})
@ Optim ~/.julia/packages/Optim/fBdaz/src/multivariate/solvers/first_order/bfgs.jl:94
[8] optimize
@ ~/.julia/packages/Optim/fBdaz/src/multivariate/optimize/optimize.jl:36 [inlined]
[9] optimize(f::NLSolversBase.InplaceObjective{Nothing, SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}}, Nothing, Nothing, Nothing}, initial_x::Vector{Float64}, method::Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.BackTracking{Float64, Int64}, Nothing, Nothing, Optim.Flat}, options::Optim.Options{Float64, Nothing}; inplace::Bool, autodiff::Symbol)
@ Optim ~/.julia/packages/Optim/fBdaz/src/multivariate/optimize/interface.jl:143
[10] optimize
@ ~/.julia/packages/Optim/fBdaz/src/multivariate/optimize/interface.jl:139 [inlined]
[11] _optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, algorithm::Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.BackTracking{Float64, Int64}, Nothing, Nothing, Optim.Flat}, optimizer_options::Optim.Options{Float64, Nothing}, idx::Nothing)
@ SymbolicRegression.ConstantOptimizationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:76
[12] dispatch_optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, idx::Nothing)
@ SymbolicRegression.ConstantOptimizationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:46
[13] optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5})
@ SymbolicRegression.ConstantOptimizationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:27
[14] macro expansion
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SingleIteration.jl:118 [inlined]
[15] macro expansion
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/Utils.jl:159 [inlined]
[16] optimize_and_simplify_population(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, pop::Population{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, curmaxsize::Int64, record::Dict{String, Any})
@ SymbolicRegression.SingleIterationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SingleIteration.jl:109
[17] _dispatch_s_r_cycle(in_pop::Population{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}; pop::Int64, out::Int64, iteration::Int64, verbosity::Int64, cur_maxsize::Int64, running_search_statistics::SymbolicRegression.AdaptiveParsimonyModule.RunningSearchStatistics)
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:1087
[18] macro expansion
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:762 [inlined]
[19] (::SymbolicRegression.var"#53#55"{Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, Float64, SymbolicRegression.SearchUtilsModule.RuntimeOptions{:multithreading, 1, true, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Int64, Task, SymbolicRegression.AdaptiveParsimonyModule.RunningSearchStatistics, Int64, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Int64})()
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SearchUtils.jl:263
nested task error: AssertionError: Base.isconcretetype(typ)
Stacktrace:
[1] abs_typeof(arg::LLVM.ExtractValueInst, partial::Bool, seenphis::Set{LLVM.PHIInst})
@ Enzyme.Compiler ~/.julia/packages/Enzyme/RvNgp/src/absint.jl:614
[2] abs_typeof(arg::LLVM.ExtractValueInst, partial::Bool)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/RvNgp/src/absint.jl:281
[3] codegen(output::Symbol, job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}; libraries::Bool, deferred_codegen::Bool, optimize::Bool, toplevel::Bool, strip::Bool, validate::Bool, only_entry::Bool, parent_job::Nothing)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:7095
[4] codegen
@ ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:6072 [inlined]
[5] _thunk(job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}, postopt::Bool)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:8375
[6] cached_compilation(job::GPUCompiler.CompilerJob)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:8416
[7] thunkbase
@ ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:8548 [inlined]
[8] thunk
@ ~/.julia/packages/Enzyme/RvNgp/src/compiler.jl:8631 [inlined]
[9] autodiff
@ ~/.julia/packages/Enzyme/RvNgp/src/Enzyme.jl:473 [inlined]
[10] autodiff
@ ~/.julia/packages/Enzyme/RvNgp/src/Enzyme.jl:537 [inlined]
[11] autodiff
@ ~/.julia/packages/Enzyme/RvNgp/src/Enzyme.jl:504 [inlined]
[12] (::SymbolicRegressionEnzymeExt.var"#1#2"{SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}}, Vector{Float64}, Vector{Float64}})()
@ SymbolicRegressionEnzymeExt ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:42
in expression starting at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/examples/parameterized_function.jl:101

The code is here: https://github.com/MilesCranmer/SymbolicRegression.jl/blob/667df823cd2dd111db524b7cb0c495d1a583eb89/ext/SymbolicRegressionEnzymeExt.jl#L42
Okay, good news: that's now a different error.
Same error on main. Is there any way I can get more debugging info out of this to see where abs_typeof is blowing up? As far as I know there should be no type instabilities here, and this same code used to work ok.
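One hedged way to get more output, assuming your Enzyme version exposes the Enzyme.API flags, is to dump the LLVM IR Enzyme processes; it's very verbose, but it can show which instruction abs_typeof is inspecting:

using Enzyme

# Debugging only: print the LLVM modules Enzyme generates and differentiates.
Enzyme.API.printall!(true)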
What's the backtrace on main (and list the commit you used)? The line number you have shouldn't throw that error. With that, I'll try to make a patch to add more info to the relevant assertion.
This might do it, if you give it a go: https://github.com/EnzymeAD/Enzyme.jl/pull/2149/files
Thanks. Here's the printout on e69f3c2:

nested task error: AssertionError: Illegal absint of %.fca.45.0.extract = extractvalue { {} addrspace(10)*, {} addrspace(10)*, { i8, {} addrspace(10)*, {} addrspace(10)*, i64, i64 }, i64, float, float, i32, i8, i8, float, i64, i64, i8, i8, i8, i8, {} addrspace(10)*, i64, float, i8, i8, i64, {} addrspace(10)*, float, float, i8, i8, double, i64, i64, float, float, i64, i64, i8, i8, float, i64, i64, i64, i8, {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, [1 x i64], i8, i8, i64, i8, {} addrspace(10)*, float, i64, {} addrspace(10)*, {} addrspace(10)*, float, {} addrspace(10)*, i64, i8, i64, i8, i8, {} addrspace(10)*, i8, i8, i8 } %2, 45, 0, !dbg !90 ltyp=Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, typ=Optim.AbstractOptimizer, offset=UInt32[0x0000002d, 0x00000000], ind=0

and the full backtrace:

1-element ExceptionStack:
LoadError: TaskFailedException
Stacktrace:
[1] wait(t::Task)
@ Base ./task.jl:370
[2] fetch
@ ./task.jl:390 [inlined]
[3] _main_search_loop!(state::SymbolicRegression.SearchUtilsModule.SearchState{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, Task, Channel}, datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}, ropt::SymbolicRegression.SearchUtilsModule.RuntimeOptions{:multithreading, 1, true, Nothing}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5})
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:833
[4] _equation_search(datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}, ropt::SymbolicRegression.SearchUtilsModule.RuntimeOptions{:multithreading, 1, true, Nothing}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, saved_state::Nothing)
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:535
[5] equation_search(datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}; options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, saved_state::Nothing, runtime_options::Nothing, runtime_options_kws::@Kwargs{niterations::Int64, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, heap_size_hint_in_bytes::Nothing, runtests::Bool, return_state::Bool, run_id::Nothing, verbosity::Int64, logger::Nothing, progress::Nothing, v_dim_out::Val{1}})
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:525
[6] equation_search
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:506 [inlined]
[7] equation_search(X::Matrix{Float64}, y::Matrix{Float64}; niterations::Int64, weights::Nothing, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, variable_names::Vector{String}, display_variable_names::Vector{String}, y_variable_names::Nothing, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, heap_size_hint_in_bytes::Nothing, runtests::Bool, saved_state::Nothing, return_state::Bool, run_id::Nothing, loss_type::Type{Nothing}, verbosity::Int64, logger::Nothing, progress::Nothing, X_units::Nothing, y_units::Nothing, extra::@NamedTuple{class::Vector{Int64}}, v_dim_out::Val{1}, multithreaded::Nothing)
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:476
[8] #equation_search#21
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:499 [inlined]
[9] _update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}}, y::Vector{Float64}, w::Nothing, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, class::Vector{Int64})
@ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:253
[10] _update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64}, w::Nothing, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, class::Nothing)
@ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:220
[11] update(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, old_fitresult::Nothing, old_cache::Nothing, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64}, w::Nothing)
@ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:201
[12] fit
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:189 [inlined]
[13] fit(m::SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, verbosity::Int64, X::@NamedTuple{x1::Vector{Float64}, x2::Vector{Float64}, class::Vector{Int64}}, y::Vector{Float64})
@ SymbolicRegression.MLJInterfaceModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/MLJInterface.jl:189
[14] fit_only!(mach::MLJBase.Machine{SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, true}; rows::Nothing, verbosity::Int64, force::Bool, composite::Nothing)
@ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:692
[15] fit_only!
@ ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:617 [inlined]
[16] #fit!#63
@ ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:789 [inlined]
[17] fit!(mach::MLJBase.Machine{SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, SRRegressor{DynamicQuantities.SymbolicDimensions{DynamicQuantities.FixedRational{Int32, 25200}}, DataType}, true})
@ MLJBase ~/.julia/packages/MLJBase/7nGJF/src/machines.jl:786
[18] top-level scope
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/examples/parameterized_function.jl:101
[19] include(fname::String)
@ Main ./sysimg.jl:38
[20] top-level scope
@ REPL[2]:1
nested task error: TaskFailedException
Stacktrace:
[1] wait(t::Task)
@ Base ./task.jl:370
[2] fetch
@ ./task.jl:390 [inlined]
[3] (::SymbolicRegression.var"#56#61"{SymbolicRegression.SearchUtilsModule.SearchState{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, Task, Channel}, Int64, Int64})()
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:810
nested task error: TaskFailedException
Stacktrace:
[1] wait(t::Task)
@ Base ./task.jl:370
[2] fetch
@ ./task.jl:390 [inlined]
[3] with_stacksize
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:31 [inlined]
[4] (::SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}})(::Float64, G::Vector{Float64}, x::Vector{Float64})
@ SymbolicRegressionEnzymeExt ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:41
[5] (::NLSolversBase.var"#69#70"{NLSolversBase.InplaceObjective{Nothing, SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}}, Nothing, Nothing, Nothing}, Float64})(G::Vector{Float64}, x::Vector{Float64})
@ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/objective_types/incomplete.jl:54
[6] value_gradient!!(obj::NLSolversBase.OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, x::Vector{Float64})
@ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/interface.jl:82
[7] initial_state(method::Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.BackTracking{Float64, Int64}, Nothing, Nothing, Optim.Flat}, options::Optim.Options{Float64, Nothing}, d::NLSolversBase.OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, initial_x::Vector{Float64})
@ Optim ~/.julia/packages/Optim/fBdaz/src/multivariate/solvers/first_order/bfgs.jl:94
[8] optimize
@ ~/.julia/packages/Optim/fBdaz/src/multivariate/optimize/optimize.jl:36 [inlined]
[9] optimize(f::NLSolversBase.InplaceObjective{Nothing, SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}}, Nothing, Nothing, Nothing}, initial_x::Vector{Float64}, method::Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.BackTracking{Float64, Int64}, Nothing, Nothing, Optim.Flat}, options::Optim.Options{Float64, Nothing}; inplace::Bool, autodiff::Symbol)
@ Optim ~/.julia/packages/Optim/fBdaz/src/multivariate/optimize/interface.jl:143
[10] optimize
@ ~/.julia/packages/Optim/fBdaz/src/multivariate/optimize/interface.jl:139 [inlined]
[11] _optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, algorithm::Optim.BFGS{LineSearches.InitialStatic{Float64}, LineSearches.BackTracking{Float64, Int64}, Nothing, Nothing, Optim.Flat}, optimizer_options::Optim.Options{Float64, Nothing}, idx::Nothing)
@ SymbolicRegression.ConstantOptimizationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:76
[12] dispatch_optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, idx::Nothing)
@ SymbolicRegression.ConstantOptimizationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:46
[13] optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5})
@ SymbolicRegression.ConstantOptimizationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ConstantOptimization.jl:27
[14] macro expansion
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SingleIteration.jl:118 [inlined]
[15] macro expansion
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/Utils.jl:159 [inlined]
[16] optimize_and_simplify_population(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, pop::Population{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, curmaxsize::Int64, record::Dict{String, Any})
@ SymbolicRegression.SingleIterationModule ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SingleIteration.jl:109
[17] _dispatch_s_r_cycle(in_pop::Population{Float64, Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}}, dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, options::Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}; pop::Int64, out::Int64, iteration::Int64, verbosity::Int64, cur_maxsize::Int64, running_search_statistics::SymbolicRegression.AdaptiveParsimonyModule.RunningSearchStatistics)
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:1087
[18] macro expansion
@ ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SymbolicRegression.jl:762 [inlined]
[19] (::SymbolicRegression.var"#53#55"{Float64, ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, Float64, SymbolicRegression.SearchUtilsModule.RuntimeOptions{:multithreading, 1, true, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Int64, Task, SymbolicRegression.AdaptiveParsimonyModule.RunningSearchStatistics, Int64, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Int64})()
@ SymbolicRegression ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/SearchUtils.jl:263
nested task error: AssertionError: Illegal absint of %.fca.45.0.extract = extractvalue { {} addrspace(10)*, {} addrspace(10)*, { i8, {} addrspace(10)*, {} addrspace(10)*, i64, i64 }, i64, float, float, i32, i8, i8, float, i64, i64, i8, i8, i8, i8, {} addrspace(10)*, i64, float, i8, i8, i64, {} addrspace(10)*, float, float, i8, i8, double, i64, i64, float, float, i64, i64, i8, i8, float, i64, i64, i64, i8, {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, {} addrspace(10)*, [1 x i64], i8, i8, i64, i8, {} addrspace(10)*, float, i64, {} addrspace(10)*, {} addrspace(10)*, float, {} addrspace(10)*, i64, i8, i64, i8, i8, {} addrspace(10)*, i8, i8, i8 } %2, 45, 0, !dbg !90 ltyp=Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, typ=Optim.AbstractOptimizer, offset=UInt32[0x0000002d, 0x00000000], ind=0
Stacktrace:
[1] abs_typeof(arg::LLVM.Value, partial::Bool, seenphis::Set{LLVM.PHIInst})
@ Enzyme.Compiler ~/.julia/packages/Enzyme/JYslI/src/absint.jl:642
[2] abs_typeof
@ ~/.julia/packages/Enzyme/JYslI/src/absint.jl:283 [inlined]
[3] codegen(output::Symbol, job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}; libraries::Bool, deferred_codegen::Bool, optimize::Bool, toplevel::Bool, strip::Bool, validate::Bool, only_entry::Bool, parent_job::Nothing)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/JYslI/src/compiler.jl:5257
[4] codegen
@ ~/.julia/packages/Enzyme/JYslI/src/compiler.jl:4196 [inlined]
[5] _thunk(job::GPUCompiler.CompilerJob{Enzyme.Compiler.EnzymeTarget, Enzyme.Compiler.EnzymeCompilerParams}, postopt::Bool)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/JYslI/src/compiler.jl:6298
[6] cached_compilation(job::GPUCompiler.CompilerJob)
@ Enzyme.Compiler ~/.julia/packages/Enzyme/JYslI/src/compiler.jl:6339
[7] thunkbase
@ ~/.julia/packages/Enzyme/JYslI/src/compiler.jl:6452 [inlined]
[8] thunk
@ ~/.julia/packages/Enzyme/JYslI/src/compiler.jl:6535 [inlined]
[9] autodiff
@ ~/.julia/packages/Enzyme/JYslI/src/Enzyme.jl:485 [inlined]
[10] autodiff
@ ~/.julia/packages/Enzyme/JYslI/src/Enzyme.jl:544 [inlined]
[11] autodiff
@ ~/.julia/packages/Enzyme/JYslI/src/Enzyme.jl:516 [inlined]
[12] (::SymbolicRegressionEnzymeExt.var"#1#2"{SymbolicRegression.ConstantOptimizationModule.GradEvaluator{SymbolicRegression.ConstantOptimizationModule.Evaluator{ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, @NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}, Options{SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}}, ParametricNode, ParametricExpression, @NamedTuple{max_parameters::Int64}, MutationWeights, false, false, nothing, ADTypes.AutoEnzyme{Nothing, Nothing}, 5}, Nothing}, ADTypes.AutoEnzyme{Nothing, Nothing}, @NamedTuple{storage_tree::ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::Nothing, variable_names::Nothing, parameters::Matrix{Float64}, parameter_names::Nothing}}, storage_refs::@NamedTuple{constant_refs::Vector{Base.RefValue{ParametricNode{Float64}}}, parameter_refs::Matrix{Float64}, num_parameters::Int64, num_constants::Int64}, storage_dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, @NamedTuple{class::Vector{Int64}}, Nothing, Nothing, Nothing, Nothing}}}, Vector{Float64}, Vector{Float64}})()
@ SymbolicRegressionEnzymeExt ~/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:42
in expression starting at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/examples/parameterized_function.jl:101
Do you know what this type is from?
And what is the 0-indexed 45th element type (i.e., the 46th, 1-indexed)? And then, of that result, what is the 0th (0-indexed) type?
That is the Options type from SymbolicRegression. Looks like the 46th 1-indexed field is:

julia> options = Options();

julia> propertynames(options)[46]
:expression_type

This is the ParametricExpression type. The struct is defined as:

struct ParametricExpression{
    T,
    N<:ParametricNode{T},
    D<:NamedTuple{(:operators, :variable_names, :parameters, :parameter_names)},
} <: AbstractExpression{T,N}
    tree::N
    metadata::Metadata{D}
    function ParametricExpression(tree::ParametricNode, metadata::Metadata)
        return new{eltype(tree),typeof(tree),typeof(_data(metadata))}(tree, metadata)
    end
end

However, this is only stored in the options as a constructor. The actual object evaluated should have a fully-resolved type.
Hm, rephrasing: what is the 46th non-ghost type (i.e., a type which has non-zero-size storage)? And what are the relevant subtypes of the fully specialized type above?
They should be something like: ParametricExpression{
Float64,
ParametricNode{Float64},
@NamedTuple{
operators::DynamicExpressions.OperatorEnumModule.OperatorEnum{
Tuple{typeof(+),typeof(*),typeof(-),typeof(/)},Tuple{}
},
variable_names::Vector{String},
parameters::Matrix{Float64},
parameter_names::Vector{String},
}
}

from:

julia> using SymbolicRegression
julia> options = Options(expression_type=ParametricExpression, binary_operators=[+, *, -, /]);
julia> x1 = ParametricNode{Float64}(; feature=1)
x1
julia> ex = ParametricExpression(x1; parameters=ones(Float64, 1, 1), parameter_names=["p1"], variable_names=["x1"], operators=options.operators)
x1
julia> typeof(ex)
ParametricExpression{Float64, ParametricNode{Float64}, @NamedTuple{operators::DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(-), typeof(/)}, Tuple{}}, variable_names::Vector{String}, parameters::Matrix{Float64}, parameter_names::Vector{String}}}
Is there a way to filter non-ghost types?
Okay, so this is basically the code we run, which is itself erring during this:

using Enzyme, SymbolicRegression, DynamicExpressions, ADTypes
offset=UInt32[0x0000002d, 0x00000000]
typ = Options{
SymbolicRegression.CoreModule.OptionsStructModule.ComplexityMapping{Int64, Int64}, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(*), typeof(/), typeof(-)}, Tuple{typeof(cos), typeof(exp)}},
ParametricNode,
ParametricExpression,
@NamedTuple{max_parameters::Int64},
MutationWeights,
false, false,
nothing,
ADTypes.AutoEnzyme{Nothing, Nothing},
5
}
ltyp = typ
for ind in offset
if !Base.isconcretetype(typ)
throw(AssertionError("Illegal absint ltyp=$ltyp, typ=$typ, offset=$offset, ind=$ind"))
end
cnt = 0
for i = 1:fieldcount(typ)
styp = Enzyme.Compiler.typed_fieldtype(typ, i)
if Enzyme.Compiler.isghostty(styp)
continue
end
if cnt == ind
typ = styp
break
end
cnt += 1
end
@show cnt, typ
end

slightly modified from https://github.com/EnzymeAD/Enzyme.jl/blob/79678f7a93fd1a65ffc633ed132baf8d33a1b4f8/src/absint.jl#L639C1-L656C16
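Distilled into a helper, this also answers the earlier question about filtering non-ghost types; it leans on Enzyme.Compiler internals (typed_fieldtype, isghostty), so treat it as a debugging sketch rather than a stable API:

# Field types of T that actually occupy storage, in the order Enzyme's
# abstract-interpretation indexing walks them (ghost types skipped).
function nonghost_fieldtypes(T)
    styps = [Enzyme.Compiler.typed_fieldtype(T, i) for i in 1:fieldcount(T)]
    return filter(s -> !Enzyme.Compiler.isghostty(s), styps)
end

nonghost_fieldtypes(ltyp)  # e.g. on the Options type from the snippet above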
Oh bleh, I see: unions mess up the indexing count.
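To illustrate (a hypothetical struct, not the elided output): a union-typed field is itself non-concrete, and in the lowered LLVM struct it occupies extra slots (data plus a selector byte), which can shift a simple non-ghost index so the walk lands on the wrong, possibly abstract, field type:

# Hypothetical example, only to show the non-concrete field type.
struct HasUnion
    a::Int
    b::Union{Nothing,Float64}   # union-typed field
end

Base.isconcretetype(fieldtype(HasUnion, :b))  # false: descending here would
                                              # trip the assertion above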
Might it be the …?
Does this fix it for you: #2155
okay with that branch, the code at the top works for me
Nice!!! Thanks. (I still seem to get segfaults after running for a bit; is that something about the caching interacting with the multithreading?)

[61256] signal 11 (2): Segmentation fault: 11 second: 4.16e+04. Press 'q' and then <enter> to stop execution early.
in expression starting at REPL[10]:1
gc_mark_obj8 at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:0 [inlined]
gc_mark_outrefs at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:2888 [inlined]
gc_mark_and_steal at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:2993
gc_mark_loop_parallel at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:3141
jl_parallel_gc_threadfun at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/scheduler.c:151
_pthread_start at /usr/lib/system/libsystem_pthread.dylib (unknown line)
Allocations: 785098264 (Pool: 782822967; Big: 2275297); GC: 866 |
Also are you running with |
I'm just running the code I pasted, directly and without changes, and I'm not seeing a segfault. However, there's a known segfault in Julia itself which we found before (the fix apparently wasn't backported to 1.10 yet). It's possible you're hitting that; if not, we should try to fix this one too. See JuliaLang/julia#55306 / JuliaLang/julia#56653 for the current Julia segfault. No idea, of course, if that's what you're hitting. |
Thanks. I'm on 1.11 at the moment (and macOS). I'll check if using 1.10 fixes anything. With the NonGenABI it also seems to work, which is great (let me know if I should close this?). I get a similar segfault. It happens after 10 iterations have passed, meaning we are successfully doing a lot of Enzyme-based optimizations, which is good! So I guess it's a bug in the Julia GC maybe?

Evolving for 100 iterations... 10% | ETA: 0:01:07
[61781] signal 11 (2): Segmentation fault: 11
in expression starting at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/examples/parameterized_function.jl:102
gc_mark_obj8 at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:0 [inlined]
gc_mark_outrefs at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:2888 [inlined]
gc_mark_and_steal at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:2993
gc_mark_loop_parallel at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:3133 [inlined]
gc_mark_loop at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:3152
_jl_gc_collect at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:3538
ijl_gc_collect at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:3899
maybe_collect at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:922 [inlined]
jl_gc_pool_alloc_inner at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:1325 [inlined]
ijl_gc_pool_alloc_instrumented at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/gc.c:1383
_eval_tree_array at /Users/mcranmer/.julia/packages/DynamicExpressions/LMkFg/src/Evaluate.jl:187
#eval_tree_array#2 at /Users/mcranmer/.julia/packages/DynamicExpressions/LMkFg/src/Evaluate.jl:156
eval_tree_array at /Users/mcranmer/.julia/packages/DynamicExpressions/LMkFg/src/Evaluate.jl:131 [inlined]
#eval_tree_array#17 at /Users/mcranmer/.julia/packages/DynamicExpressions/LMkFg/src/ParametricExpression.jl:353
eval_tree_array at /Users/mcranmer/.julia/packages/DynamicExpressions/LMkFg/src/ParametricExpression.jl:335 [inlined]
eval_tree_dispatch at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/ParametricExpression.jl:84 [inlined]
_eval_loss at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/LossFunctions.jl:87
#eval_loss#3 at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/LossFunctions.jl:146
eval_loss at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/src/LossFunctions.jl:138 [inlined]
evaluator at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:27 [inlined]
evaluator at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:0 [inlined]
diffejulia_evaluator_33238_inner_42wrap at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:0
macro expansion at /Users/mcranmer/.julia/packages/Enzyme/yOwHI/src/compiler.jl:5190 [inlined]
enzyme_call at /Users/mcranmer/.julia/packages/Enzyme/yOwHI/src/compiler.jl:4736 [inlined]
CombinedAdjointThunk at /Users/mcranmer/.julia/packages/Enzyme/yOwHI/src/compiler.jl:4608 [inlined]
autodiff at /Users/mcranmer/.julia/packages/Enzyme/yOwHI/src/Enzyme.jl:503 [inlined]
autodiff at /Users/mcranmer/.julia/packages/Enzyme/yOwHI/src/Enzyme.jl:544 [inlined]
autodiff at /Users/mcranmer/.julia/packages/Enzyme/yOwHI/src/Enzyme.jl:516 [inlined]
#1 at /Users/mcranmer/PermaDocuments/SymbolicRegressionMonorepo/SymbolicRegression.jl/ext/SymbolicRegressionEnzymeExt.jl:42
unknown function (ip: 0x36b8142ff)
jl_apply at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/./julia.h:2157 [inlined]
start_task at /Users/julia/.julia/scratchspaces/a66863c6-20e8-4ff4-8a62-49f30b1f605e/agent-cache/default-honeycrisp-R17H3W25T9.0/build/default-honeycrisp-R17H3W25T9-0/julialang/julia-release-1-dot-11/src/task.c:1202
Allocations: 364229539 (Pool: 363378515; Big: 851024); GC: 419 |
hm, yeah, I ran to completion on 1.10 on macOS [latest Enzyme commit]. I know the fix also was backported to 1.11 (JuliaLang/julia#55344), so this is likely something separate. Can you open a different issue for the segfault with as simplified an MWE as possible? (Having for loops and/or manual GC.gc calls is fine, but ideally it's some code with a single autodiff call [perhaps in a loop] and as simple an inner function as possible.) |
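For what it's worth, a hedged skeleton of the kind of MWE being requested (hypothetical inner function, not the actual SymbolicRegression code path):

using Enzyme

f(x) = sum(abs2, x)

Threads.@threads for i in 1:10_000
    x = rand(100)
    dx = zero(x)
    # Single reverse-mode autodiff call per iteration, under threads,
    # with periodic manual collection to provoke the GC.
    Enzyme.autodiff(Reverse, f, Active, Duplicated(x, dx))
    i % 100 == 0 && GC.gc()
end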
Confirming I can run it on 1.10 too; it's just 1.11 that hits the segfault. So I guess it's a Julia bug? Given how random the bug is, the interaction with multithreading, and my limited knowledge of Julia's GC internals, I think it will be pretty hard for me to make a clean MWE. I could try to make an rr trace, though, if that's useful? (Assuming I can repro on Linux.) |
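If it helps, the rr trace could presumably be captured with Julia's built-in bug-report hook (assuming the BugReporting.jl mechanism copes with a multithreaded run):

julia --bug-report=rr --project=. examples/parameterized_function.jl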
Yeah, totally understood; though of course making it as simple as possible would be helpful for trying to fix it. |
x-posted to JuliaLang/julia#56735 |
Crazy thing is when I run with |
When trying to use Enzyme as the autodiff backend for SymbolicRegression searches I ran into this error:
The full stack trace:
To reproduce, you can run the example here: https://ai.damtp.cam.ac.uk/symbolicregression/dev/examples/parameterized_function/ and swap :Zygote for :Enzyme. For example:
Could it be because I am running Enzyme.jl from a task within the code?
More context: Enzyme.jl used to work for this, and I don't think I changed anything that would cause new behavior on my end. But I just switched to v0.13, so I'm not sure if something changed there. I can't test v0.12 due to the error here: #2080
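For reference, the pattern being asked about reduces to roughly this shape (a hypothetical reduction, not the actual SymbolicRegression code path): the first autodiff call happens on a worker task rather than the main thread.

using Enzyme

f(x) = x^2

# First compile of the Enzyme thunk occurs inside a spawned task.
t = Threads.@spawn Enzyme.autodiff(Reverse, f, Active, Active(2.0))
fetch(t)  # ((4.0,),)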