
Commit

fix readme
manuelbb-upb committed Sep 19, 2024
1 parent 74421a8 commit e64436f
Showing 6 changed files with 105 additions and 138 deletions.
17 changes: 15 additions & 2 deletions README.md
@@ -22,7 +22,7 @@ I don't really keep up a consistent versioning scheme.
But the changes in this section have been significant enough to warrant some comments.

#### Version 0.3
I did not exactly keep track of all the changes, but some types and default
I did not exactly keep track of all the changes, but some types and default
settings have changed, so a new breaking version is warranted.
We have set-based algorithms now, but they are drafts only and mostly undocumented.
Originally, it was `optimize_set`, but I tested `optimize_many` the most.
@@ -31,6 +31,18 @@ Try `optimize_many` at your own risk.
#### Version 0.2.0
This was an intermediate version that was quickly superseded.

#### Version 0.2.0
* We import and re-export `@set` and `@reset` from Accessors.jl.
* `AlgorithmOptions` is now immutable and type-stable.
`@set algo_opts.float_type` will trigger conversion and setting of type-dependent defaults
(see the sketch after this list).
* Likewise, `RBFConfig` is no longer mutable and has concrete types.
* `TypedMOP` supports `@set` and `@reset` for `objectives`, `nl_eq_constraints` and
`nl_ineq_constraints`.
* The `AbstractNonlinearOperator` interface now requires `CompromiseEvaluators.operator_dim_in`
and `CompromiseEvaluators.operator_dim_out`.
* The `ReturnObject` now references the whole cache (basically a `NamedTuple` of internal structs).
* `SimpleMOP` resets call counters by default.
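
For illustration, a minimal sketch of the Accessors.jl workflow; the default construction
`AlgorithmOptions()` is an assumption here, not taken verbatim from the notes above:

````julia
using Compromise            # re-exports `@set` and `@reset` from Accessors.jl

algo_opts = AlgorithmOptions()                    # assumed default construction
# `@set` returns a modified copy; `algo_opts` itself is left untouched,
# and type-dependent defaults are set for the new float type.
algo_opts32 = @set algo_opts.float_type = Float32
# `@reset` rebinds the variable to the modified copy in a single step.
@reset algo_opts.float_type = Float32
````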

#### Version 0.1.0
This release is breaking, because the RBF database is no longer thread-safe by default.
Instead, `ConcurrentUtils` is a weak dependency and no longer mandatory.
@@ -316,7 +328,8 @@ You can pass `max_func_calls` as a keyword argument to `add_objectives!` and sim
Likewise, `max_grad_calls` restricts the number of gradient calls,
`max_hess_calls` limits Hessian computations.

For historic reasons, the count is kept between runs.
~~For historic reasons, the count is kept between runs.~~
The count is now reset between runs by default.
To reset the count between runs (sequential or parallel), indicate it when setting up
the MOP.
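
For illustration, here is a rough sketch of how such budgets could be passed; the `MutableMOP`
construction, the model-config symbol `:rbf`, and the exact positional arguments of
`add_objectives!` are assumptions here, not verbatim API:

````julia
using Compromise

mop = MutableMOP(; num_vars = 2)                          # assumed constructor
add_objectives!(
    mop, x -> [sum((x .- 1) .^ 2), sum((x .+ 1) .^ 2)], :rbf;
    dim_out = 2,
    max_func_calls = 100,   # at most 100 objective evaluations per run
    max_grad_calls = 20,    # budget for gradient calls (if gradients are used)
)
````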

12 changes: 11 additions & 1 deletion docs/literate_src/README.jl
@@ -26,6 +26,16 @@ constraints are violated.
I don't really keep up a consistent versioning scheme.
But the changes in this section have been significant enough to warrant some comments.
#### Version 0.3
I did not exactly keep track of all the changes, but some types and default
settings have changed, so a new breaking version is warranted.
We have set-based algorithms now, but they are drafts only and mostly undocumented.
Originally, it was `optimize_set`, but I tested `optimize_many` the most.
Try `optimize_many` at your own risk.
#### Version 0.2.0
This was an intermediate version that was quickly superseded.
#### Version 0.2.0
* We import and re-export `@set` and `@reset` from Accessors.jl.
* `AlgorithmOptions` is now immutable and type-stable.
@@ -39,7 +49,7 @@ But the changes in this section have been significant enough to warrant some com
* `SimpleMOP` resets call counters by default.
#### Version 0.1.0
This release is breaking, because the the RBF database is no longer thread-safe by default.
This release is breaking, because the RBF database is no longer thread-safe by default.
Instead, `ConcurrentUtils` is a weak dependency and no longer mandatory.
To use a thread-safe RBF database, either configure your problem functions
with `:rbfLocked`, use an `RBFConfig` with
82 changes: 30 additions & 52 deletions docs/src/CompromiseEvaluators.md
@@ -31,6 +31,9 @@ abstract type AbstractNonlinearOperatorNoParams <: AbstractNonlinearOperator end
operator_has_name(::AbstractNonlinearOperator)::Bool=false
operator_name(::AbstractNonlinearOperator)=error("No name.")

operator_dim_in(::AbstractNonlinearOperator)=-1
operator_dim_out(::AbstractNonlinearOperator)=-1
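# Illustration (not part of the original interface listing): a concrete operator
# type -- `MyOperator` is hypothetical -- would overload these as, e.g.,
#   operator_dim_in(op::MyOperator)  = 3
#   operator_dim_out(op::MyOperator) = 2
# instead of relying on the `-1` fallbacks above.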

operator_chunk_size(::AbstractNonlinearOperator)::Integer=1
operator_has_params(::AbstractNonlinearOperator)::Bool=false
operator_can_partial(::AbstractNonlinearOperator)::Bool=false
@@ -53,34 +56,29 @@ end
## Call Counting

````julia
import ..Compromise: AbstractStoppingCriterion, stop_message, @ignoraise
import ..Compromise: AbstractUltimateStoppingCriterion, stop_message, @ignoraise

Base.@kwdef mutable struct FuncCallCounter
val :: Int = 0
lock :: ReentrantLock = ReentrantLock()
val :: Threads.Atomic{Int} = Threads.Atomic{Int}(0)
end

function read_counter(fcc::FuncCallCounter)
lock(fcc.lock) do
fcc.val
end
fcc.val[]
end

function set_counter!(fcc::FuncCallCounter, v::Int)
lock(fcc.lock) do
fcc.val = v
end
Threads.atomic_xchg!(fcc.val, v)
return v
end

function inc_counter!(fcc::FuncCallCounter)
return set_counter!(fcc, read_counter(fcc) + 1)
return Threads.atomic_add!(fcc.val, 1) + 1
end

func_call_counter(op::AbstractNonlinearOperator, ::Val)=nothing
max_num_calls(op::AbstractNonlinearOperator, ::Val)::Real=Inf

struct BudgetExhausted <: AbstractStoppingCriterion
struct BudgetExhausted <: AbstractUltimateStoppingCriterion
ni :: Int
mi :: Int
order :: Int
@@ -97,16 +95,14 @@ end
request_func_calls(::Nothing, op::AbstractNonlinearOperator, v::Val, N::Integer)=Inf
function request_func_calls(fcc::FuncCallCounter, op::AbstractNonlinearOperator, v::Val{i}, N::Integer) where i
mfc = max_num_calls(op, v)
lock(fcc.lock) do
cfc = fcc.val
rem = min(mfc - cfc, N)
#show cfc, mfc, N, rem
if rem <= 0
return BudgetExhausted(cfc, mfc, i)
end
fcc.val += rem
return rem

cfc = read_counter(fcc)
rem = min(mfc - cfc, N)
if rem <= 0
return BudgetExhausted(cfc, mfc, i)
end
set_counter!(fcc, cfc + rem)
return rem
end
````
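
A small usage sketch of the atomic counter defined above (assuming the new
`Threads.Atomic`-based definitions are in scope):

````julia
fcc = FuncCallCounter()          # counter starts at 0
inc_counter!(fcc)                # atomic increment, returns the new value 1

# Safe to increment from several threads without an explicit lock:
Threads.@threads for _ in 1:100
    inc_counter!(fcc)
end
read_counter(fcc)                # 101
set_counter!(fcc, 0)             # reset, e.g. between optimization runs
````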

@@ -474,74 +470,53 @@ objects, such as databases, by reference so as to avoid a large memory-overhead.
Moreover, we only need copies for radius-dependent models!
You can ignore those methods otherwise.

````julia
copy_model(mod::AbstractSurrogateModel)=deepcopy(mod)
copy_model(::Nothing)=nothing
````

A surrogate is initialized from its configuration and the operator it is meant to model:

````julia
"""
init_surrogate(
model_config, nonlin_op, dim_in, dim_out, params, T
model_config, nonlin_op, params, T
)
Return a model subtyping `AbstractSurrogateModel`, as defined by
`model_config::AbstractSurrogateModelConfig`, for the nonlinear operator `nonlin_op`.
The operator (and model) has input dimension `dim_in` and output dimension `dim_out`.
`params` is the current parameter object for `nonlin_op` and is cached.
`T` is a subtype of `AbstractFloat` to indicate float_type of cache arrays.
"""
function init_surrogate(
::AbstractSurrogateModelConfig, op, dim_in, dim_out, params, T;
::AbstractSurrogateModelConfig, op, params, T;
require_fully_linear::Bool=true,
delta_max::Union{Number, AbstractVector{<:Number}}=Inf,
)::AbstractSurrogateModel
return nothing
end
````

A function to return a copy of a model. Should be implemented if
`depends_on_radius` returns `true`.
Note that the returned object does not have to be an “independent” copy; we allow
for shared objects (like mutable database arrays or something of that sort)...

````julia
universal_copy(mod::AbstractSurrogateModel)=mod
````

A function to copy parameters between source and target models, like `Base.copy!` or
`Base.copyto!`. Relevant mostly for implicit trainable parameters.

````julia
universal_copy!(mod_trgt::AbstractSurrogateModel, mod_src::AbstractSurrogateModel)=mod_trgt

function universal_copy_model(mod)
depends_on_radius(mod) && return universal_copy(mod)
return mod
end

function universal_copy_model!(mod_trgt, mod_src)
depends_on_radius(mod_trgt) && return universal_copy!(mod_trgt, mod_src)
return mod_trgt
end
````
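
As a small illustrative sketch (the stub type `ConstantModel` is hypothetical):
a model that does not depend on the trust-region radius is simply passed through,
while radius-dependent models go through `universal_copy` and `universal_copy!`.

````julia
struct ConstantModel <: AbstractSurrogateModel end   # hypothetical stub type
depends_on_radius(::ConstantModel) = false

mod = ConstantModel()
universal_copy_model(mod) === mod    # true: no copy is made for radius-independent models
````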

Because parameters are implicit, updates are in-place operations:

````julia
"""
update!(surrogate_model, nonlinear_operator, Δ, x, fx, lb, ub)
Update the model on a trust region of size `Δ` in a box with lower left corner `lb`
Update the model on a trust region of size `Δ` in a global box with lower left corner `lb`
and upper right corner `ub` (in the scaled variable domain).
`x` is a sub-vector of the current iterate conforming to the inputs of `nonlinear_operator`
in the scaled domain. `fx` are the outputs of `nonlinear_operator` at `x`.
"""
function update!(
surr::AbstractSurrogateModel, op, Δ, x, fx, lb, ub; log_level, indent, kwargs...
surr::AbstractSurrogateModel, op, Δ, x, fx, global_lb, global_ub;
log_level, indent, kwargs...
)
return nothing
end

function process_trial_point!(
surr::AbstractSurrogateModel, xtrial, fxtrial, is_next::Bool
surr::AbstractSurrogateModel, xtrial, fxtrial, is_next
)
return nothing
end
@@ -568,6 +543,9 @@ operator_name(op::AbstractNonlinearOperatorWrapper)=operator_name(wrapped_operat
func_call_counter(op::AbstractNonlinearOperatorWrapper, v::Val)=func_call_counter(wrapped_operator(op), v)
max_num_calls(op::AbstractNonlinearOperatorWrapper, v::Val)=max_num_calls(wrapped_operator(op), v)

operator_dim_in(op::AbstractNonlinearOperatorWrapper)=operator_dim_in(wrapped_operator(op))
operator_dim_out(op::AbstractNonlinearOperatorWrapper)=operator_dim_out(wrapped_operator(op))

preprocess_inputs(op::AbstractNonlinearOperator, x::RVec) = x
preprocess_inputs(op::AbstractNonlinearOperator, x::RMat) = x
preprocess_inputs(op::AbstractNonlinearOperator, x::RVec, p) = (preprocess_inputs(op, x), p)
27 changes: 25 additions & 2 deletions docs/src/README.md
@@ -25,8 +25,30 @@ constraints are violated.
I don't really keep up a consistent versioning scheme.
But the changes in this section have been significant enough to warrant some comments.

#### Version 0.3
I did not exactly keep track of all the changes, but some types and default
settings have changed, so a new breaking version is warranted.
We have set-based algorithms now, but they are drafts only and mostly undocumented.
Originally, it was `optimize_set`, but I tested `optimize_many` the most.
Try `optimize_many` at your own risk.

#### Version 0.2.0
This was an intermediate version that was quickly superseded.

#### Version 0.2.0
* We import and re-export `@set` and `@reset` from Accessors.jl.
* `AlgorithmOptions` is now immutable and type-stable.
`@set algo_opts.float_type` will trigger conversion and setting of type-dependent defaults.
* Likewise, `RBFConfig` is no longer mutable and has concrete types.
* `TypedMOP` supports `@set` and `@reset` for `objectives`, `nl_eq_constraints` and
`nl_ineq_constraints`.
* The `AbstractNonlinearOperator` interface now requires `CompromiseEvaluators.operator_dim_in`
and `CompromiseEvaluators.operator_dim_out`.
* The `ReturnObject` now references the whole cache (basically a `NamedTuple` of internal structs).
* `SimpleMOP` resets call counters by default.

#### Version 0.1.0
This release is breaking, because the the RBF database is no longer thread-safe by default.
This release is breaking, because the RBF database is no longer thread-safe by default.
Instead, `ConcurrentUtils` is a weak dependency and no longer mandatory.
To use a thread-safe RBF database, either configure your problem functions
with `:rbfLocked`, use an `RBFConfig` with
@@ -310,7 +332,8 @@ You can pass `max_func_calls` as a keyword argument to `add_objectives!` and sim
Likewise, `max_grad_calls` restricts the number of gradient calls,
`max_hess_calls` limits Hessian computations.

For historic reasons, the count is kept between runs.
~~For historic reasons, the count is kept between runs.~~
The count is now reset between runs by default.
To reset the count between runs (sequential or parallel), indicate it when setting up
the MOP.

23 changes: 7 additions & 16 deletions docs/src/mop.md
@@ -41,12 +41,8 @@ The optional function `float_type` returns the type of result and derivative vec

````julia
float_type(::AbstractMOP)::Type{<:AbstractFloat}=DEFAULT_FLOAT_TYPE
````

We would also like to deterministically query the expected surrogate model types:

````julia
model_type(::AbstractMOP)::Type{<:AbstractMOPSurrogate}=AbstractMOPSurrogate
stop_type(::AbstractMOP) = Any
````

The functions below are used to query dimension information.
@@ -98,12 +94,12 @@ end
function dim_lin_eq_constraints(mop::AbstractMOP)
A = lin_eq_constraints_matrix(mop)
b = lin_eq_constraints_vector(mop)
return dim_lin_cons(A, b)
return dim_lin_cons(A, b) :: Int
end
function dim_lin_ineq_constraints(mop::AbstractMOP)
A = lin_ineq_constraints_matrix(mop)
b = lin_ineq_constraints_vector(mop)
return dim_lin_cons(A, b)
return dim_lin_cons(A, b) :: Int
end
````

@@ -112,7 +108,6 @@ end
!!! note
All evaluation and differentiation methods that you see below should always
return `nothing`, **unless** you want to stop early.
Then return something else, for example a string.

Evaluation of nonlinear objective functions requires the following method:

@@ -136,14 +131,14 @@ end
To ensure they only get called if needed, we wrap them and assign shorter names:

````julia
function objectives!(y::RVec, mop::AbstractMOP, x::RVec)
function objectives!(y::RVecOrNothing, mop::AbstractMOP, x::RVec)
eval_objectives!(y, mop, x)
end
function nl_eq_constraints!(y::RVec, mop::AbstractMOP, x::RVec)
function nl_eq_constraints!(y::RVecOrNothing, mop::AbstractMOP, x::RVec)
dim_nl_eq_constraints(mop) <= 0 && return nothing
eval_nl_eq_constraints!(y, mop, x)
end
function nl_ineq_constraints!(y::RVec, mop::AbstractMOP, x::RVec)
function nl_ineq_constraints!(y::RVecOrNothing, mop::AbstractMOP, x::RVec)
dim_nl_ineq_constraints(mop) <= 0 && return nothing
eval_nl_ineq_constraints!(y, mop, x)
end
@@ -177,8 +172,6 @@ This cache is queried for evaluation data by methods such as
`cached_fx(mop_cache)` to retrieve the objective values, for example.
Note that the getter calls should return arrays, because we want to modify
these arrays.
When scalar values are expected (`cached_theta`, `cached_Phi`),
then the cache should implement setters (`cached_theta!`, `cached_Phi!`).

````julia
function init_value_caches(::AbstractMOP)::AbstractMOPCache
@@ -216,9 +209,7 @@ function eval_mop!(mop_cache, mop)
cached_Ax(mop_cache),
mop, ξ
)
cached_theta!(mop_cache, θ)
cached_Phi!(mop_cache, Φ)
return nothing
return θ, Φ
end

"Evaluate `mop` at unscaled site `ξ` and modify result arrays in place."
