
update docs for subpackages #227

Merged
merged 2 commits on Apr 28, 2022
21 changes: 16 additions & 5 deletions docs/src/optimization_packages/blackboxoptim.md
@@ -1,6 +1,21 @@
# BlackBoxOptim.jl
[`BlackBoxOptim`](https://github.com/robertfeldt/BlackBoxOptim.jl) is a Julia package implementing **(Meta-)heuristic/stochastic algorithms** that do not require the objective function to be differentiable.

## Installation: GalacticBBO.jl

To use this package, install the GalacticBBO package:

```julia
import Pkg; Pkg.add("GalacticBBO")
```

## Global Optimizers

### Without Constraint Equations

The algorithms in [`BlackBoxOptim`](https://github.com/robertfeldt/BlackBoxOptim.jl) perform global optimization on problems without
constraint equations. However, lower and upper constraints set by `lb` and `ub` in the `OptimizationProblem` are required.

A `BlackBoxOptim` algorithm is called using the `BBO_` prefix followed by the algorithm name:

* Natural Evolution Strategies:
@@ -29,11 +44,7 @@ The recommended optimizer is `BBO_adaptive_de_rand_1_bin_radiuslimited()`

The currently available algorithms are listed [here](https://github.com/robertfeldt/BlackBoxOptim.jl#state-of-the-library)

## Global Optimizer
### Without Constraint Equations

The algorithms in [`BlackBoxOptim`](https://github.com/robertfeldt/BlackBoxOptim.jl) are performing global optimization on problems without
constraint equations. However, lower and upper constraints set by `lb` and `ub` in the `OptimizationProblem` are required.
## Example

The Rosenbrock function can be optimized using `BBO_adaptive_de_rand_1_bin_radiuslimited()` as follows:
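
A minimal sketch of such a call, assuming the standard `GalacticOptim` workflow of building an `OptimizationProblem` with box constraints and passing the optimizer to `solve` (the bounds and parameter values are illustrative):

```julia
using GalacticOptim, GalacticBBO

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# Box constraints via `lb` and `ub` are required for the BBO algorithms
prob = OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited())
```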

9 changes: 9 additions & 0 deletions docs/src/optimization_packages/cmaevolutionstrategy.md
@@ -3,12 +3,21 @@

The CMAEvolutionStrategy algorithm is called by `CMAEvolutionStrategyOpt()`.

## Installation: GalacticCMAEvolutionStrategy.jl

To use this package, install the GalacticCMAEvolutionStrategy package:

```julia
import Pkg; Pkg.add("GalacticCMAEvolutionStrategy")
```

## Global Optimizer
### Without Constraint Equations

The method in [`CMAEvolutionStrategy`](https://github.com/jbrea/CMAEvolutionStrategy.jl) performs global optimization on problems without
constraint equations. However, lower and upper constraints set by `lb` and `ub` in the `OptimizationProblem` are required.

## Example

The Rosenbrock function can be optimized using `CMAEvolutionStrategyOpt()` as follows:
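
A minimal sketch, assuming the standard `GalacticOptim` workflow (bounds and parameter values are illustrative):

```julia
using GalacticOptim, GalacticCMAEvolutionStrategy

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# Box constraints via `lb` and `ub` are required
prob = OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, CMAEvolutionStrategyOpt())
```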

24 changes: 14 additions & 10 deletions docs/src/optimization_packages/evolutionary.md
@@ -1,29 +1,33 @@
# Evolutionary.jl
[`Evolutionary`](https://github.com/wildart/Evolutionary.jl) is a Julia package implementing various evolutionary and genetic algorithms.

A `Evolutionary` algorithm is called by one of the following:

- [`Evolutionary.GA()`](https://wildart.github.io/Evolutionary.jl/stable/ga/): **Genetic Algorithm optimizer**

- [`Evolutionary.DE()`](https://wildart.github.io/Evolutionary.jl/stable/de/): **Differential Evolution optimizer**
## Installation: GalacticEvolutionary.jl

- [`Evolutionary.ES()`](https://wildart.github.io/Evolutionary.jl/stable/es/): **Evolution Strategy algorithm**

- [`Evolutionary.CMAES()`](https://wildart.github.io/Evolutionary.jl/stable/cmaes/): **Covariance Matrix Adaptation Evolution Strategy algorithm**

Algorithm specific options are defined as `kwargs`. See the respective documentation for more detail.
To use this package, install the GalacticEvolutionary package:

```julia
import Pkg; Pkg.add("GalacticEvolutionary")
```

## Global Optimizer
### Without Constraint Equations

The methods in [`Evolutionary`](https://github.com/wildart/Evolutionary.jl) perform global optimization on problems without
constraint equations. These methods work both with and without lower and upper constraints set by `lb` and `ub` in the `OptimizationProblem`.

An `Evolutionary` algorithm is called by one of the following:

- [`Evolutionary.GA()`](https://wildart.github.io/Evolutionary.jl/stable/ga/): **Genetic Algorithm optimizer**

- [`Evolutionary.DE()`](https://wildart.github.io/Evolutionary.jl/stable/de/): **Differential Evolution optimizer**

- [`Evolutionary.ES()`](https://wildart.github.io/Evolutionary.jl/stable/es/): **Evolution Strategy algorithm**

- [`Evolutionary.CMAES()`](https://wildart.github.io/Evolutionary.jl/stable/cmaes/): **Covariance Matrix Adaptation Evolution Strategy algorithm**

Algorithm-specific options are defined as `kwargs`. See the respective documentation for more detail.

## Example

The Rosenbrock function can be optimized using `Evolutionary.CMAES()` as follows:
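
A minimal sketch, assuming the `GalacticEvolutionary` sub-package provides the wrapper and that `Evolutionary` is loaded for its exported algorithm types (the population-size parameters `μ` and `λ` are illustrative):

```julia
using GalacticOptim, GalacticEvolutionary, Evolutionary

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# Works with or without box constraints; none are set here
prob = OptimizationProblem(rosenbrock, x0, p)
sol = solve(prob, Evolutionary.CMAES(μ = 40, λ = 100))
```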

9 changes: 9 additions & 0 deletions docs/src/optimization_packages/flux.md
@@ -1,5 +1,14 @@
# Flux.jl

## Installation: GalacticFlux.jl

To use this package, install the GalacticFlux package:

```julia
import Pkg; Pkg.add("GalacticFlux")
```

## Local Unconstrained Optimizers

- [`Flux.Optimise.Descent`](https://fluxml.ai/Flux.jl/stable/training/optimisers/#Flux.Optimise.Descent): **Classic gradient descent optimizer with learning rate**
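
As a hedged sketch of passing one of these optimisers to `solve`, assuming the standard `GalacticOptim` workflow with an AD backend (here `GalacticOptim.AutoZygote()`; the learning rate and `maxiters` are illustrative):

```julia
using GalacticOptim, GalacticFlux, Flux

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# Gradient-based optimisers need an AD backend for the objective
f = OptimizationFunction(rosenbrock, GalacticOptim.AutoZygote())
prob = OptimizationProblem(f, x0, p)
sol = solve(prob, Flux.Optimise.Descent(0.01), maxiters = 1000)
```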

12 changes: 11 additions & 1 deletion docs/src/optimization_packages/gcmaes.md
@@ -1,14 +1,24 @@
# GCMAES.jl
[`GCMAES`](https://github.com/AStupidBear/GCMAES.jl) is a Julia package implementing the **Gradient-based Covariance Matrix Adaptation Evolutionary Strategy**, which can utilize gradient information to speed up the optimization process.

The GCMAES algorithm is called by `GCMAESOpt()` and the initial search variance is set as a keyword argument `σ0` (default: `σ0 = 0.2`)
## Installation: GalacticGCMAES.jl

To use this package, install the GalacticGCMAES package:

```julia
import Pkg; Pkg.add("GalacticGCMAES")
```

## Global Optimizer
### Without Constraint Equations

The GCMAES algorithm is called by `GCMAESOpt()`, and the initial search variance is set as a keyword argument `σ0` (default: `σ0 = 0.2`).

The method in [`GCMAES`](https://github.com/AStupidBear/GCMAES.jl) performs global optimization on problems without
constraint equations. However, lower and upper constraints set by `lb` and `ub` in the `OptimizationProblem` are required.

## Example

The Rosenbrock function can be optimized using `GCMAESOpt()` without utilizing gradient information as follows:

```julia
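# A minimal sketch, assuming the standard GalacticOptim workflow; the bounds
# and parameter values are illustrative, and box constraints are required.
using GalacticOptim, GalacticGCMAES

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

prob = OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, GCMAESOpt())
```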
11 changes: 10 additions & 1 deletion docs/src/optimization_packages/mathoptinterface.md
@@ -2,13 +2,22 @@

[MathOptInterface](https://github.com/jump-dev/MathOptInterface.jl) is a Julia abstraction layer for interfacing with a variety of mathematical optimization solvers.

## Installation: GalacticMOI.jl

To use this package, install the GalacticMOI package:

```julia
import Pkg; Pkg.add("GalacticMOI")
```

## Details

As of now, the `GalacticOptim` interface to `MathOptInterface` implements only the `maxtime` common keyword argument. An optimizer implemented against `MathOptInterface` can be called directly if no optimizer options have to be defined. For example, using the `Ipopt.jl` optimizer:

```julia
sol = solve(prob, Ipopt.Optimizer())
```


The optimizer options are handled in one of two ways: they can either be set via `GalacticOptim.MOI.OptimizerWithAttributes()` or passed as keyword arguments to `solve`. For example, using the `Ipopt.jl` optimizer:

```julia
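# A sketch of the `OptimizerWithAttributes` route; "print_level" is an Ipopt
# option used purely for illustration.
opt = GalacticOptim.MOI.OptimizerWithAttributes(Ipopt.Optimizer, "print_level" => 0)
sol = solve(prob, opt)
```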
16 changes: 14 additions & 2 deletions docs/src/optimization_packages/metaheuristics.md
@@ -1,6 +1,17 @@
# Metaheuristics.jl
[`Metaheuristics`](https://github.com/jmejia8/Metaheuristics.jl) is a Julia package implementing **metaheuristic algorithms** for global optimization that do not require the objective function to be differentiable.

## Installation: GalacticMetaheuristics.jl

To use this package, install the GalacticMetaheuristics package:

```julia
import Pkg; Pkg.add("GalacticMetaheuristics")
```

## Global Optimizer
### Without Constraint Equations

A `Metaheuristics` Single-Objective algorithm is called using one of the following:

* Evolutionary Centers Algorithm: `ECA()`
@@ -28,12 +39,13 @@ Lastly, information about the optimization problem such as the true optimum is s

The currently available algorithms and their parameters are listed [here](https://jmejia8.github.io/Metaheuristics.jl/stable/algorithms/).

## Global Optimizer
### Without Constraint Equations
## Notes

The algorithms in [`Metaheuristics`](https://github.com/jmejia8/Metaheuristics.jl) perform global optimization on problems without
constraint equations. However, lower and upper constraints set by `lb` and `ub` in the `OptimizationProblem` are required.

## Examples

The Rosenbrock function can be optimized using the Evolutionary Centers Algorithm `ECA()` as follows:

```julia
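# A minimal sketch, assuming the standard GalacticOptim workflow; the bounds
# and parameter values are illustrative, and box constraints are required.
using GalacticOptim, GalacticMetaheuristics, Metaheuristics

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

prob = OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, ECA())
```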
10 changes: 10 additions & 0 deletions docs/src/optimization_packages/multistartoptimization.md
@@ -12,12 +12,22 @@ Currently, the local methods can be one of the algorithms implemented in `NLopt.

If you check out the master branch of `MultistartOptimization` or have version `>=0.1.3`, you can use all optimizers found in `GalacticOptim` that work with an initial parameter set. See an example of this below.


## Installation: GalacticMultiStartOptimization.jl

To use this package, install the GalacticMultiStartOptimization package:

```julia
import Pkg; Pkg.add("GalacticMultiStartOptimization")
```

## Global Optimizer
### Without Constraint Equations

The methods in [`MultistartOptimization`](https://github.com/tpapp/MultistartOptimization.jl) perform global optimization on problems without
constraint equations. However, lower and upper constraints set by `lb` and `ub` in the `OptimizationProblem` are required.

## Examples

The Rosenbrock function can be optimized using `MultistartOptimization.TikTak()` with 100 initial points and the local method `NLopt.LD_LBFGS()` as follows:
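
A minimal sketch, assuming the wrappers re-export the underlying packages and that `solve` accepts the global/local optimizer pair; an AD backend is supplied for the gradient-based local method, and the bounds are illustrative:

```julia
using GalacticOptim, GalacticMultiStartOptimization, GalacticNLopt

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

f = OptimizationFunction(rosenbrock, GalacticOptim.AutoForwardDiff())
prob = OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, MultistartOptimization.TikTak(100), NLopt.LD_LBFGS())
```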

10 changes: 10 additions & 0 deletions docs/src/optimization_packages/nlopt.md
@@ -1,6 +1,16 @@
# NLopt.jl
[`NLopt`](https://github.com/JuliaOpt/NLopt.jl) is a Julia package interfacing with the free/open-source [`NLopt library`](http://ab-initio.mit.edu/nlopt), which implements many global and local optimization methods (see the [`NLopt Documentation`](https://nlopt.readthedocs.io/en/latest/NLopt_Algorithms/)).

## Installation: GalacticNLopt.jl

To use this package, install the GalacticNLopt package:

```julia
import Pkg; Pkg.add("GalacticNLopt")
```

## Methods

`NLopt.jl` algorithms are chosen either via `NLopt.Opt(:algname, nstates)`, where `nstates` is the number of states to be optimized, or (preferably) via `NLopt.AlgorithmName()`, where `AlgorithmName` can be one of the following:
* `NLopt.GN_DIRECT()`
* `NLopt.GN_DIRECT_L()`
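
A minimal sketch of selecting one of the global (`GN_`) methods, assuming the standard `GalacticOptim` workflow and that the wrapper re-exports `NLopt` (bounds and parameter values are illustrative):

```julia
using GalacticOptim, GalacticNLopt

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# Global NLopt methods require box constraints
prob = OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, NLopt.GN_DIRECT())
```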
9 changes: 9 additions & 0 deletions docs/src/optimization_packages/nomad.md
@@ -3,6 +3,14 @@

The NOMAD algorithm is called by `NOMADOpt()`.

## Installation: GalacticNOMAD.jl

To use this package, install the GalacticNOMAD package:

```julia
import Pkg; Pkg.add("GalacticNOMAD")
```

## Global Optimizer
### Without Constraint Equations

Expand All @@ -11,6 +19,7 @@ constraint equations. Currently however, linear and nonlinear constraints defin

NOMAD works both with and without lower and upper box constraints set by `lb` and `ub` in the `OptimizationProblem`.

## Examples

The Rosenbrock function can be optimized using `NOMADOpt()` with and without box constraints as follows:
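
A minimal sketch, assuming the standard `GalacticOptim` workflow (the bounds and parameter values are illustrative):

```julia
using GalacticOptim, GalacticNOMAD

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# Without box constraints
prob = OptimizationProblem(rosenbrock, x0, p)
sol = solve(prob, NOMADOpt())

# With box constraints
prob = OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, NOMADOpt())
```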

17 changes: 14 additions & 3 deletions docs/src/optimization_packages/nonconvex.md
@@ -1,6 +1,17 @@
# Nonconvex.jl
[`Nonconvex`](https://github.com/JuliaNonconvex/Nonconvex.jl) is a Julia package implementing and wrapping nonconvex constrained optimization algorithms.

## Installation: GalacticNonconvex.jl

To use this package, install the GalacticNonconvex package:

```julia
import Pkg; Pkg.add("GalacticNonconvex")
```

## Global Optimizer
### Without Constraint Equations

A `Nonconvex` algorithm is called using one of the following:

* [Method of moving asymptotes (MMA)](https://julianonconvex.github.io/Nonconvex.jl/stable/algorithms/mma/#Method-of-moving-asymptotes-(MMA)):
Expand All @@ -25,16 +36,16 @@ A `Nonconvex` algorithm is called using one of the following:

When optimizing a combination of integer and floating-point parameters, the `integer` keyword has to be set. It takes a Boolean vector indicating which parameters are integers.

## Notes

Some optimizers may require further options to be defined in order to work.

The currently available algorithms are listed [here](https://julianonconvex.github.io/Nonconvex.jl/stable/algorithms/algorithms/)

## Global Optimizer
### Without Constraint Equations

The algorithms in [`Nonconvex`](https://julianonconvex.github.io/Nonconvex.jl/stable/algorithms/algorithms/) perform global optimization on problems without constraint equations. However, lower and upper constraints set by `lb` and `ub` in the `OptimizationProblem` are required.

## Examples

The Rosenbrock function can be optimized using the Method of moving asymptotes algorithm `MMA02()` as follows:

```julia
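# A minimal sketch, assuming the standard GalacticOptim workflow and that the
# wrapper exports `MMA02` and handles the gradients MMA needs internally; the
# bounds and parameter values are illustrative.
using GalacticOptim, GalacticNonconvex

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

prob = OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, MMA02())
```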
13 changes: 11 additions & 2 deletions docs/src/optimization_packages/optim.md
@@ -1,6 +1,16 @@
# Optim.jl
# [Optim.jl](@id optim)
[`Optim`](https://github.com/JuliaNLSolvers/Optim.jl) is a Julia package implementing various algorithms to perform univariate and multivariate optimization.

## Installation: GalacticOptimJL.jl

To use this package, install the GalacticOptimJL package:

```julia
import Pkg; Pkg.add("GalacticOptimJL")
```

## Methods

`Optim.jl` algorithms are chosen via `Optim.AlgorithmName()`, where `AlgorithmName` can be one of the following:

* `Optim.NelderMead()`
@@ -39,7 +49,6 @@ The following special keyword arguments which are not covered by the common `sol
For more extensive documentation of all the algorithms and options, please consult the [`Documentation`](https://julianlsolvers.github.io/Optim.jl/stable/#).
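
A minimal sketch of calling one of these methods, assuming the standard `GalacticOptim` workflow and that `Optim` is loaded for its algorithm types (`Optim.NelderMead()` is derivative-free, so no AD backend is needed):

```julia
using GalacticOptim, GalacticOptimJL, Optim

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

prob = OptimizationProblem(rosenbrock, x0, p)
sol = solve(prob, Optim.NelderMead())
```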


## Local Optimizer
### Local Constraint

11 changes: 10 additions & 1 deletion docs/src/optimization_packages/quaddirect.md
@@ -3,11 +3,18 @@

The QuadDIRECT algorithm is called using `QuadDirect()`.

## Installation: GalacticQuadDIRECT.jl

To use this package, install the GalacticQuadDIRECT package:

```julia
import Pkg; Pkg.add("GalacticQuadDIRECT")
```

Also note that `QuadDIRECT` should (for now) be installed by doing:

`] add https://github.com/timholy/QuadDIRECT.jl.git`


## Global Optimizer
### Without Constraint Equations
The algorithm in [`QuadDIRECT`](https://github.com/timholy/QuadDIRECT.jl) performs global optimization on problems without
@@ -16,6 +23,8 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
Furthermore, `QuadDirect` requires `splits`, a list of 3-vectors with initial locations at which to evaluate the function (the values must be in strictly increasing order and lie within the specified bounds), such that the call becomes `solve(problem, QuadDirect(), splits)`.

## Example

The Rosenbrock function can be optimized using `QuadDirect()` as follows:

```julia
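# A minimal sketch, assuming the standard GalacticOptim workflow; the bounds
# and the `splits` locations are illustrative (each 3-vector must be strictly
# increasing and lie within the corresponding bounds).
using GalacticOptim, GalacticQuadDIRECT

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

prob = OptimizationProblem(rosenbrock, x0, p, lb = [-0.5, -0.5], ub = [1.5, 1.5])
splits = ([-0.4, 0.0, 1.4], [-0.4, 0.0, 1.4])
sol = solve(prob, QuadDirect(), splits)
```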
8 changes: 8 additions & 0 deletions docs/src/optimization_packages/speedmapping.md
@@ -3,6 +3,14 @@

The SpeedMapping algorithm is called by `SpeedMappingOpt()`.

## Installation: GalacticSpeedMapping.jl

To use this package, install the GalacticSpeedMapping package:

```julia
import Pkg; Pkg.add("GalacticSpeedMapping")
```

## Global Optimizer
### Without Constraint Equations
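
A minimal sketch of calling the optimizer, assuming the standard `GalacticOptim` workflow; an AD backend is supplied here since SpeedMapping can use gradient information (this choice and the parameter values are illustrative):

```julia
using GalacticOptim, GalacticSpeedMapping

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

f = OptimizationFunction(rosenbrock, GalacticOptim.AutoForwardDiff())
prob = OptimizationProblem(f, x0, p)
sol = solve(prob, SpeedMappingOpt())
```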
