
Commit 0bc903b

Update constraints in docs to be in-place
1 parent b145d49 commit 0bc903b

9 files changed: +72 -51 lines changed

docs/src/optimization_packages/optim.md

Lines changed: 3 additions & 3 deletions
@@ -46,7 +46,7 @@ The following special keyword arguments which are not covered by the common `sol
 * `show_every`: Trace output is printed every `show_every`th iteration.
 
 
-For a more extensive documentation of all the algorithms and options please consult the
+For a more extensive documentation of all the algorithms and options please consult the
 [`Documentation`](https://julianlsolvers.github.io/Optim.jl/stable/#)
 
 ## Local Optimizer

@@ -73,7 +73,7 @@ The Rosenbrock function can optimized using the `Optim.IPNewton()` as follows:
 
 ```julia
 rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
-cons= (x,p) -> [x[1]^2 + x[2]^2]
+cons= (res,x,p) -> res .= [x[1]^2 + x[2]^2]
 x0 = zeros(2)
 p = [1.0,100.0]
 prob = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff();cons= cons)

@@ -345,7 +345,7 @@ The Rosenbrock function can optimized using the `Optim.KrylovTrustRegion()` as f
 
 ```julia
 rosenbrock(x, p) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
-cons= (x,p) -> [x[1]^2 + x[2]^2]
+cons= (res,x,p) -> res .= [x[1]^2 + x[2]^2]
 x0 = zeros(2)
 p = [1.0,100.0]
 optprob = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff();cons= cons)
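
The change above moves the docs to the in-place constraint signature: instead of returning a vector, the constraint writes into a preallocated `res` buffer. A minimal runnable sketch of the updated usage (assuming the `using` lines and the trivial bounds `lcons = [-Inf]`, `ucons = [Inf]` that appear elsewhere on these docs pages):

```julia
using Optimization, OptimizationOptimJL

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
# In-place constraint: write each constraint value into the preallocated `res`.
cons = (res, x, p) -> res .= [x[1]^2 + x[2]^2]

x0 = zeros(2)
p = [1.0, 100.0]
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(); cons = cons)
prob = OptimizationProblem(optf, x0, p, lcons = [-Inf], ucons = [Inf])
sol = solve(prob, IPNewton())
```

The length of `res` must match the lengths of `lcons` and `ucons`, one entry per constraint.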

docs/src/tutorials/rosenbrock.md

Lines changed: 13 additions & 13 deletions
@@ -1,14 +1,14 @@
 # Solving the Rosenbrock Problem in >10 Ways
-
+
 This tutorial is a demonstration of many different solvers to demonstrate the
 flexibility of Optimization.jl. This is a gauntlet of many solvers to get a feel
 for common workflows of the package and give copy-pastable starting points.
 
 !!! note
 
 This example uses many different solvers of Optimization.jl. Each solver
-subpackage needs to be installed separate. For example, for the details on
-the installation and usage of OptimizationOptimJL.jl package, see the
+subpackage needs to be installed separate. For example, for the details on
+the installation and usage of OptimizationOptimJL.jl package, see the
 [Optim.jl page](@ref optim).
 
 ```@example rosenbrock

@@ -39,8 +39,8 @@ sol = solve(prob, NelderMead())
 
 # Now a gradient-based optimizer with forward-mode automatic differentiation
 
-optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
-prob = OptimizationProblem(optf, x0, _p)
+optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
+prob = OptimizationProblem(optf, x0, _p)
 sol = solve(prob, BFGS())
 
 # Now a second order optimizer using Hessians generated by forward-mode automatic differentiation

@@ -53,7 +53,7 @@ sol = solve(prob, Optim.KrylovTrustRegion())
 
 # Now derivative-based optimizers with various constraints
 
-cons = (x,p) -> [x[1]^2 + x[2]^2]
+cons = (res,x,p) -> res .= [x[1]^2 + x[2]^2]
 optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff();cons= cons)
 #prob = OptimizationProblem(optf, x0, _p)
 #sol = solve(prob, IPNewton()) # No lcons or rcons, so constraints not satisfied

@@ -64,24 +64,24 @@ sol = solve(prob, IPNewton()) # Note that -Inf < x[1]^2 + x[2]^2 < Inf is always
 prob = OptimizationProblem(optf, x0, _p, lcons = [-5.0], ucons = [10.0])
 sol = solve(prob, IPNewton()) # Again, -5.0 < x[1]^2 + x[2]^2 < 10.0
 
-prob = OptimizationProblem(optf, x0, _p, lcons = [-Inf], ucons = [Inf],
+prob = OptimizationProblem(optf, x0, _p, lcons = [-Inf], ucons = [Inf],
 lb = [-500.0,-500.0], ub=[50.0,50.0])
 sol = solve(prob, IPNewton())
 
-prob = OptimizationProblem(optf, x0, _p, lcons = [0.5], ucons = [0.5],
-lb = [-500.0,-500.0], ub=[50.0,50.0])
+prob = OptimizationProblem(optf, x0, _p, lcons = [0.5], ucons = [0.5],
+lb = [-500.0,-500.0], ub=[50.0,50.0])
 sol = solve(prob, IPNewton()) # Notice now that x[1]^2 + x[2]^2 ≈ 0.5:
 # cons(sol.minimizer, _p) = 0.49999999999999994
 
-function con2_c(x,p)
-[x[1]^2 + x[2]^2, x[2]*sin(x[1])-x[1]]
+function con2_c(res,x,p)
+res .= [x[1]^2 + x[2]^2, x[2]*sin(x[1])-x[1]]
 end
 
 optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff();cons= con2_c)
 prob = OptimizationProblem(optf, x0, _p, lcons = [-Inf,-Inf], ucons = [Inf,Inf])
 sol = solve(prob, IPNewton())
 
-cons_circ = (x,p) -> [x[1]^2 + x[2]^2]
+cons_circ = (res,x,p) -> res .= [x[1]^2 + x[2]^2]
 optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff();cons= cons_circ)
 prob = OptimizationProblem(optf, x0, _p, lcons = [-Inf], ucons = [0.25^2])
 sol = solve(prob, IPNewton()) # -Inf < cons_circ(sol.minimizer, _p) = 0.25^2

@@ -116,7 +116,7 @@ sol = solve(prob, Opt(:LD_LBFGS, 2))
 ## Evolutionary.jl Solvers
 
 using OptimizationEvolutionary
-sol = solve(prob, CMAES(μ =40 , λ = 100),abstol=1e-15) # -1.0 ≤ x[1], x[2] ≤ 0.8
+sol = solve(prob, CMAES(μ =40 , λ = 100),abstol=1e-15) # -1.0 ≤ x[1], x[2] ≤ 0.8
 
 ## BlackBoxOptim.jl Solvers
 
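
With the in-place form used above, `con2_c` writes into a preallocated residual vector whose length matches `lcons`/`ucons`. A small standalone check of that signature (a sketch reusing the tutorial's definition; the sample point and the `nothing` parameter slot are illustrative):

```julia
con2_c(res, x, p) = (res .= [x[1]^2 + x[2]^2, x[2] * sin(x[1]) - x[1]])

res = zeros(2)                    # one slot per constraint, matching lcons/ucons
con2_c(res, [1.0, 2.0], nothing)  # p is unused by this constraint
res[1] == 5.0                     # x[1]^2 + x[2]^2
res[2] ≈ 2 * sin(1.0) - 1.0       # x[2]*sin(x[1]) - x[1]
```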

src/function/finitediff.jl

Lines changed: 6 additions & 4 deletions
@@ -123,8 +123,10 @@ function instantiate_function(f, x, adtype::AutoFiniteDiff, p, num_cons = 0)
 cons_h = (res, θ) -> f.cons_h(res, θ, p)
 end
 
-return OptimizationFunction{true}(f, adtype; grad=grad, hess=hess, hv=hv,
-cons=cons, cons_j=cons_j, cons_h=cons_h,
-cons_jac_colorvec = cons_jac_colorvec,
-hess_prototype=f.hess_prototype, cons_jac_prototype=f.cons_jac_prototype, cons_hess_prototype=f.cons_hess_prototype)
+return OptimizationFunction{true}(f, adtype; grad = grad, hess = hess, hv = hv,
+cons = cons, cons_j = cons_j, cons_h = cons_h,
+cons_jac_colorvec = cons_jac_colorvec,
+hess_prototype = f.hess_prototype,
+cons_jac_prototype = f.cons_jac_prototype,
+cons_hess_prototype = f.cons_hess_prototype)
 end

src/function/forwarddiff.jl

Lines changed: 5 additions & 3 deletions
@@ -102,7 +102,9 @@ function instantiate_function(f::OptimizationFunction{true}, x,
 cons_h = (res, θ) -> f.cons_h(res, θ, p)
 end
 
-return OptimizationFunction{true}(f.f, adtype; grad=grad, hess=hess, hv=hv,
-cons=cons, cons_j=cons_j, cons_h=cons_h,
-hess_prototype=f.hess_prototype, cons_jac_prototype=f.cons_jac_prototype, cons_hess_prototype=f.cons_hess_prototype)
+return OptimizationFunction{true}(f.f, adtype; grad = grad, hess = hess, hv = hv,
+cons = cons, cons_j = cons_j, cons_h = cons_h,
+hess_prototype = f.hess_prototype,
+cons_jac_prototype = f.cons_jac_prototype,
+cons_hess_prototype = f.cons_hess_prototype)
 end

src/function/function.jl

Lines changed: 13 additions & 9 deletions
@@ -44,15 +44,19 @@ For more information on the use of automatic differentiation, see the
 documentation of the `AbstractADType` types.
 """
 function instantiate_function(f, x, ::AbstractADType, p, num_cons = 0)
-grad = f.grad === nothing ? nothing : (G,x)->f.grad(G,x,p)
-hess = f.hess === nothing ? nothing : (H,x)->f.hess(H,x,p)
-hv = f.hv === nothing ? nothing : (H,x,v)->f.hv(H,x,v,p)
-cons = f.cons === nothing ? nothing : (res,x)->f.cons(res,x,p)
-cons_j = f.cons_j === nothing ? nothing : (res,x)->f.cons_j(res,x,p)
-cons_h = f.cons_h === nothing ? nothing : (res,x)->f.cons_h(res,x,p)
-hess_prototype = f.hess_prototype === nothing ? nothing : convert.(eltype(x), f.hess_prototype)
-cons_jac_prototype = f.cons_jac_prototype === nothing ? nothing : convert.(eltype(x), f.cons_jac_prototype)
-cons_hess_prototype = f.cons_hess_prototype === nothing ? nothing : [convert.(eltype(x), f.cons_hess_prototype[i]) for i in 1:num_cons]
+grad = f.grad === nothing ? nothing : (G, x) -> f.grad(G, x, p)
+hess = f.hess === nothing ? nothing : (H, x) -> f.hess(H, x, p)
+hv = f.hv === nothing ? nothing : (H, x, v) -> f.hv(H, x, v, p)
+cons = f.cons === nothing ? nothing : (res, x) -> f.cons(res, x, p)
+cons_j = f.cons_j === nothing ? nothing : (res, x) -> f.cons_j(res, x, p)
+cons_h = f.cons_h === nothing ? nothing : (res, x) -> f.cons_h(res, x, p)
+hess_prototype = f.hess_prototype === nothing ? nothing :
+convert.(eltype(x), f.hess_prototype)
+cons_jac_prototype = f.cons_jac_prototype === nothing ? nothing :
+convert.(eltype(x), f.cons_jac_prototype)
+cons_hess_prototype = f.cons_hess_prototype === nothing ? nothing :
+[convert.(eltype(x), f.cons_hess_prototype[i])
+for i in 1:num_cons]
 expr = symbolify(f.expr)
 cons_expr = symbolify.(f.cons_expr)
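
Beyond the re-indentation, this fallback only re-wraps whatever user callbacks exist so that they close over `p`; downstream code then calls two-argument versions such as `cons(res, x)`. A minimal sketch of that wrapping pattern (standalone toy names, not the package source):

```julia
# User-supplied in-place constraint with the (res, x, p) signature.
user_cons = (res, x, p) -> res .= [x[1]^2 + x[2]^2]
p = [1.0, 100.0]

# What instantiate_function effectively does: keep `nothing` as `nothing`,
# otherwise capture `p` so solvers can call the closure as (res, x).
cons = user_cons === nothing ? nothing : (res, x) -> user_cons(res, x, p)

res = zeros(1)
cons(res, [0.5, 0.5])   # res[1] == 0.5
```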

src/function/reversediff.jl

Lines changed: 5 additions & 3 deletions
@@ -84,7 +84,9 @@ function instantiate_function(f, x, adtype::AutoReverseDiff, p = SciMLBase.NullP
 hv = f.hv
 end
 
-return OptimizationFunction{false}(f, adtype; grad=grad, hess=hess, hv=hv,
-cons=nothing, cons_j=nothing, cons_h=nothing,
-hess_prototype=f.hess_prototype, cons_jac_prototype=nothing, cons_hess_prototype=nothing)
+return OptimizationFunction{false}(f, adtype; grad = grad, hess = hess, hv = hv,
+cons = nothing, cons_j = nothing, cons_h = nothing,
+hess_prototype = f.hess_prototype,
+cons_jac_prototype = nothing,
+cons_hess_prototype = nothing)
 end

src/function/tracker.jl

Lines changed: 5 additions & 4 deletions
@@ -51,8 +51,9 @@ function instantiate_function(f, x, adtype::AutoTracker, p, num_cons = 0)
 hv = f.hv
 end
 
-
-return OptimizationFunction{false}(f, adtype; grad=grad, hess=hess, hv=hv,
-cons=nothing, cons_j=nothing, cons_h=nothing,
-hess_prototype=f.hess_prototype, cons_jac_prototype=nothing, cons_hess_prototype=nothing)
+return OptimizationFunction{false}(f, adtype; grad = grad, hess = hess, hv = hv,
+cons = nothing, cons_j = nothing, cons_h = nothing,
+hess_prototype = f.hess_prototype,
+cons_jac_prototype = nothing,
+cons_hess_prototype = nothing)
 end

src/function/zygote.jl

Lines changed: 5 additions & 3 deletions
@@ -66,7 +66,9 @@ function instantiate_function(f, x, adtype::AutoZygote, p, num_cons = 0)
 hv = f.hv
 end
 
-return OptimizationFunction{false}(f, adtype; grad=grad, hess=hess, hv=hv,
-cons=nothing, cons_j=nothing, cons_h=nothing,
-hess_prototype=f.hess_prototype, cons_jac_prototype=nothing, cons_hess_prototype=nothing)
+return OptimizationFunction{false}(f, adtype; grad = grad, hess = hess, hv = hv,
+cons = nothing, cons_j = nothing, cons_h = nothing,
+hess_prototype = f.hess_prototype,
+cons_jac_prototype = nothing,
+cons_hess_prototype = nothing)
 end

test/ADtests.jl

Lines changed: 17 additions & 9 deletions
@@ -148,7 +148,6 @@ optprob.grad(G2, x0)
 @test G1 == G2
 @test_throws ErrorException optprob.hess(H2, x0)
 
-
 prob = OptimizationProblem(optf, x0)
 
 sol = solve(prob, Optim.BFGS())
@@ -253,14 +252,17 @@ cons_j = (J, θ, p) -> optprob.cons_j(J, θ)
 hess = (H, θ, p) -> optprob.hess(H, θ)
 sH = sparse([1, 1, 2, 2], [1, 2, 1, 2], zeros(4))
 sJ = sparse([1, 1, 2, 2], [1, 2, 1, 2], zeros(4))
-optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(), hess=hess, hess_prototype=copy(sH), cons=con2_c, cons_j=cons_j, cons_jac_prototype=copy(sJ))
-optprob1 = Optimization.instantiate_function(optf, x0, Optimization.AutoForwardDiff(), nothing, 2)
+optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(), hess = hess,
+hess_prototype = copy(sH), cons = con2_c, cons_j = cons_j,
+cons_jac_prototype = copy(sJ))
+optprob1 = Optimization.instantiate_function(optf, x0, Optimization.AutoForwardDiff(),
+nothing, 2)
 @test optprob1.hess_prototype == sparse([0.0 0.0; 0.0 0.0]) # make sure it's still using it
 optprob1.hess(sH, [5.0, 3.0])
-@test all(isapprox(sH, [28802.0 -2000.0; -2000.0 200.0]; rtol=1e-3))
+@test all(isapprox(sH, [28802.0 -2000.0; -2000.0 200.0]; rtol = 1e-3))
 @test optprob1.cons_jac_prototype == sparse([0.0 0.0; 0.0 0.0]) # make sure it's still using it
 optprob1.cons_j(sJ, [5.0, 3.0])
-@test all(isapprox(sJ, [10.0 6.0; -0.149013 -0.958924]; rtol=1e-3))
+@test all(isapprox(sJ, [10.0 6.0; -0.149013 -0.958924]; rtol = 1e-3))
 
 grad = (G, θ, p) -> optprob.grad(G, θ)
 hess = (H, θ, p) -> optprob.hess(H, θ)
@@ -269,14 +271,20 @@ cons_h = (res, θ, p) -> optprob.cons_h(res, θ)
 sH = sparse([1, 1, 2, 2], [1, 2, 1, 2], zeros(4))
 sJ = sparse([1, 1, 2, 2], [1, 2, 1, 2], zeros(4))
 sH3 = [sparse([1, 2], [1, 2], zeros(2)), sparse([1, 1, 2], [1, 2, 1], zeros(3))]
-optf = OptimizationFunction(rosenbrock, SciMLBase.NoAD(), grad=grad, hess=hess, cons=con2_c, cons_j=cons_j, cons_h=cons_h, hess_prototype=sH, cons_jac_prototype=sJ, cons_hess_prototype=sH3)
+optf = OptimizationFunction(rosenbrock, SciMLBase.NoAD(), grad = grad, hess = hess,
+cons = con2_c, cons_j = cons_j, cons_h = cons_h,
+hess_prototype = sH, cons_jac_prototype = sJ,
+cons_hess_prototype = sH3)
 optprob2 = Optimization.instantiate_function(optf, x0, SciMLBase.NoAD(), nothing, 2)
 optprob2.hess(sH, [5.0, 3.0])
-@test all(isapprox(sH, [28802.0 -2000.0; -2000.0 200.0]; rtol=1e-3))
+@test all(isapprox(sH, [28802.0 -2000.0; -2000.0 200.0]; rtol = 1e-3))
 optprob2.cons_j(sJ, [5.0, 3.0])
-@test all(isapprox(sJ, [10.0 6.0; -0.149013 -0.958924]; rtol=1e-3))
+@test all(isapprox(sJ, [10.0 6.0; -0.149013 -0.958924]; rtol = 1e-3))
 optprob2.cons_h(sH3, [5.0, 3.0])
-@test sH3 ≈ [[2.0 0.0; 0.0 2.0], [2.8767727327346804 0.2836621681849162; 0.2836621681849162 -6.622738308376736e-9]]
+@test sH3 ≈ [
+[2.0 0.0; 0.0 2.0],
+[2.8767727327346804 0.2836621681849162; 0.2836621681849162 -6.622738308376736e-9],
+]
 
 # Can we solve problems? Using AutoForwardDiff to test since we know that works
 for consf in [cons, con2_c]
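
For reference, the expected matrices in the `cons_h` test are the analytic constraint Hessians evaluated at `x = [5.0, 3.0]`; a quick verification sketch (not part of the test file):

```julia
# Constraints: c1 = x[1]^2 + x[2]^2  and  c2 = x[2]*sin(x[1]) - x[1].
# Their Hessians are [2 0; 0 2] and [-x[2]*sin(x[1]) cos(x[1]); cos(x[1]) 0].
x = [5.0, 3.0]
H1 = [2.0 0.0; 0.0 2.0]
H2 = [-x[2]*sin(x[1]) cos(x[1]); cos(x[1]) 0.0]
# H2 ≈ [2.87677 0.28366; 0.28366 0.0], matching the tested values up to AD round-off.
```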

0 commit comments