Distributed High-Performance Symbolic Regression in Julia
APACHE-2.0 License
v0.24.1
Published by MilesCranmer 7 months ago
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.24.0...v0.24.1
v0.24.0
Published by MilesCranmer 7 months ago
- `PopMember{T,L}` is now `PopMember{T,L,N}`, for `N` the type of expression.
- New `node_type` option in the creation of `Options`. This `node_type <: AbstractExpressionNode` can be a `GraphNode`, which will result in expressions that can share nodes, and therefore have a lower complexity.
- New `form_connection` and `break_connection` functions, which control the merging and breaking of shared nodes in expressions. These are experimental.
- The `Dataset` struct has had many of its fields declared immutable (for memory safety). If you had relied on the mutability of the struct to set parameters after initializing it, you will need to modify your code.
- Nodes can now be constructed with keyword arguments:

  ```julia
  Node{T}(feature=...)        # leaf referencing a particular feature column
  Node{T}(val=...)            # constant-value leaf
  Node{T}(op=1, l=x1)         # unary operator node, using the 1st unary operator
  Node{T}(op=1, l=x1, r=1.5)  # binary operator node, using the 1st binary operator
  ```

  rather than the previous constructors `Node(op, l, r)` and `Node(T; val=...)` (though those will still work, just with a `depwarn`).
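To make the shared-node idea concrete, here is a toy pure-Julia sketch (not the package's types, just an illustration) of why an expression graph with shared nodes gets a lower complexity than the equivalent tree:

```julia
# Toy node type to illustrate shared subexpressions; this is NOT
# SymbolicRegression.jl's AbstractExpressionNode, just a sketch:
struct Toy
    op::Union{Symbol,Nothing}
    children::Vector{Toy}
end
leaf() = Toy(nothing, Toy[])

# Complexity when node sharing is respected (each distinct node counted once):
function count_shared(n::Toy, seen=Base.IdSet{Toy}())
    n in seen && return 0
    push!(seen, n)
    return 1 + sum(c -> count_shared(c, seen), n.children; init=0)
end

# Complexity when the graph is expanded into a tree (shared subtrees duplicated):
count_tree(n::Toy) = 1 + sum(count_tree, n.children; init=0)

x = leaf()
shared = Toy(:cos, [x])                      # cos(x), reused below
ex = Toy(:+, [shared, Toy(:*, [shared, x])]) # cos(x) + cos(x) * x

count_shared(ex)  # 4: the shared cos(x) node and x are each counted once
count_tree(ex)    # 7: duplicating the cos(x) subtree inflates the count
```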
- Passing `bumper=true` to `Options()` will result in using bump allocation for evaluation, which can reach speeds equivalent to LoopVectorization, and sometimes even better, due to better management of allocations. (https://github.com/MilesCranmer/SymbolicRegression.jl/pull/287)
- The main search loop was refactored into composable stages:

  ```julia
  function _equation_search(
      datasets::Vector{D}, ropt::RuntimeOptions, options::Options, saved_state
  ) where {D<:Dataset}
      _validate_options(datasets, ropt, options)
      state = _create_workers(datasets, ropt, options)
      _initialize_search!(state, datasets, ropt, options, saved_state)
      _warmup_search!(state, datasets, ropt, options)
      _main_search_loop!(state, datasets, ropt, options)
      _tear_down!(state, ropt, options)
      return _format_output(state, ropt)
  end
  ```
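As a usage sketch of the `bumper` option (the operator set and data below are illustrative choices, not from the release notes):

```julia
using SymbolicRegression

# bumper=true enables bump allocation during expression evaluation:
options = Options(
    binary_operators=(+, -, *),
    unary_operators=(cos,),
    bumper=true,
)

X = randn(Float64, 2, 100)              # 2 features, 100 rows
y = 2 .* cos.(X[1, :]) .+ X[2, :] .^ 2  # target built from those features

hall_of_fame = equation_search(
    X, y; options=options, niterations=10, parallelism=:serial
)
```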
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.23.3...v0.24.0
v0.23.1
Published by MilesCranmer 10 months ago
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.23.0...v0.23.1
v0.22.3
Published by MilesCranmer about 1 year ago
- `MutationWeights` by @MilesCranmer in https://github.com/MilesCranmer/SymbolicRegression.jl/pull/253
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.22.2...v0.22.3
v0.22.0
Published by MilesCranmer about 1 year ago
- With `batching=true`, searches on large datasets are improved, as the "winning expression" is not biased towards an expression that landed on a lucky batch.
- The `fast_cycle` feature was deprecated in #243. Use of this parameter will have no effect.
- No more `@eval`-ing of new operators (such as the methods that made `x1 + x2` work on nodes). All internal search code uses `Node()` explicitly to build expressions, so it did not rely on method invalidation at any point.
- Renamed parameters: `npop` => `population_size` and `npopulations` => `populations`. The old names still work, but will print a deprecation warning when Julia is started with `--depwarn=yes`.
- `predict` uses units if trained with them in https://github.com/MilesCranmer/SymbolicRegression.jl/pull/244. `MLJ.predict` will output predictions in the same units. Before this change, `MLJ.predict` would return numerical arrays with no units.

Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.21.5...v0.22.0
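A minimal pure-Julia sketch of how a keyword rename like `npop` => `population_size` can stay backwards compatible (the function name, default, and warning text here are hypothetical, not the package's):

```julia
# Hypothetical entry point, NOT SymbolicRegression's actual function,
# showing the old keyword forwarding to the new one with a depwarn:
function run_search(; population_size::Int=33, npop::Union{Int,Nothing}=nothing)
    if npop !== nothing
        # Printed only when Julia is started with --depwarn=yes:
        Base.depwarn("`npop` is deprecated; use `population_size` instead.", :run_search)
        population_size = npop
    end
    return population_size
end

run_search(npop=50)             # old keyword still honored
run_search(population_size=40)  # new keyword
```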
v0.21.5
Published by MilesCranmer about 1 year ago
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.21.4...v0.21.5
v0.21.3
Published by MilesCranmer about 1 year ago
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.21.2...v0.21.3
v0.21.2
Published by MilesCranmer about 1 year ago
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.21.1...v0.21.2
v0.21.1
Published by MilesCranmer about 1 year ago
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.21.0...v0.21.1
v0.21.0
Published by MilesCranmer about 1 year ago
- Support for the `Quantity` type in the MLJ interface.
- `X_units`, `y_units` arguments for the low-level `equation_search`.
- `print_precision` option.
- In printed equations, `x₁` is used rather than `x1`.
- `y =` is printed at the start (or `y₁ =` for multi-output). With units this becomes, for example, `y[kg] =`.
- `node_to_symbolic(::Node, ::AbstractSRRegressor)` (#228)

Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.20.0...v0.21.0
v0.19.0
Published by MilesCranmer over 1 year ago
- `@generated` functions

Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.18.0...v0.19.0
v0.16.2
Published by MilesCranmer over 1 year ago
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.16.1...v0.16.2
Published by MilesCranmer over 1 year ago
v0.15.3
Published by MilesCranmer over 1 year ago
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.15.2...v0.15.3
v0.15.2
Published by MilesCranmer over 1 year ago
- `check_constraints` by @MilesCranmer in https://github.com/MilesCranmer/SymbolicRegression.jl/pull/172
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.15.1...v0.15.2
v0.15.1
Published by MilesCranmer over 1 year ago
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.15.0...v0.15.1
v0.15.0
Published by MilesCranmer almost 2 years ago
Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.14.5...v0.15.0
v0.13.2
Published by MilesCranmer almost 2 years ago
- Strings are now accepted for the `parallelism` argument of `EquationSearch` (e.g., `"multithreading"` instead of `:multithreading`). This is to allow compatibility with PyJulia calls, which can't pass symbols.

Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.13.1...v0.13.2
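A minimal sketch of the idea (the helper name below is hypothetical, not the package's API):

```julia
# Accept either a Symbol or the String a PyJulia caller can actually pass,
# and normalize to a Symbol internally:
normalize_parallelism(p::Union{Symbol,AbstractString}) = Symbol(p)

normalize_parallelism("multithreading")  # from PyJulia
normalize_parallelism(:multithreading)   # native Julia
```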
v0.12.0
Published by MilesCranmer about 2 years ago
- `abs` everywhere, which results in fundamentally different functional forms.
- `Node{T}` type generalized to non-floats by @MilesCranmer in https://github.com/MilesCranmer/SymbolicRegression.jl/pull/122

Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.11.1...v0.12.0
v0.11.1
Published by MilesCranmer about 2 years ago
- `Inf` appears as loss for expression

Full Changelog: https://github.com/MilesCranmer/SymbolicRegression.jl/compare/v0.10.2...v0.11.1