README.md (16 additions & 5 deletions)

@@ -35,16 +35,30 @@ installation of dependencies. Below is the list of packages that need to be
   installed explicitly if you intend to use the specific optimization algorithms
   offered by them:

+  - OptimizationAuglag for augmented Lagrangian methods
   - OptimizationBBO for [BlackBoxOptim.jl](https://github.com/robertfeldt/BlackBoxOptim.jl)
+  - OptimizationCMAEvolutionStrategy for [CMAEvolutionStrategy.jl](https://github.com/jbrea/CMAEvolutionStrategy.jl)
   - OptimizationEvolutionary for [Evolutionary.jl](https://github.com/wildart/Evolutionary.jl) (see also [this documentation](https://wildart.github.io/Evolutionary.jl/dev/))
   - OptimizationGCMAES for [GCMAES.jl](https://github.com/AStupidBear/GCMAES.jl)
-  - OptimizationMOI for [MathOptInterface.jl](https://github.com/jump-dev/MathOptInterface.jl) (usage of algorithm via MathOptInterface API; see also the API [documentation](https://jump.dev/MathOptInterface.jl/stable/))
+  - OptimizationIpopt for [Ipopt.jl](https://github.com/jump-dev/Ipopt.jl)
+  - OptimizationLBFGSB for [LBFGSB.jl](https://github.com/Gnimuc/LBFGSB.jl)
+  - OptimizationMadNLP for [MadNLP.jl](https://github.com/MadNLP/MadNLP.jl)
+  - OptimizationManopt for [Manopt.jl](https://github.com/JuliaManifolds/Manopt.jl) (optimization on manifolds)
   - OptimizationMetaheuristics for [Metaheuristics.jl](https://github.com/jmejia8/Metaheuristics.jl) (see also [this documentation](https://jmejia8.github.io/Metaheuristics.jl/stable/))
+  - OptimizationMOI for [MathOptInterface.jl](https://github.com/jump-dev/MathOptInterface.jl) (usage of algorithm via MathOptInterface API; see also the API [documentation](https://jump.dev/MathOptInterface.jl/stable/))
   - OptimizationMultistartOptimization for [MultistartOptimization.jl](https://github.com/tpapp/MultistartOptimization.jl) (see also [this documentation](https://juliahub.com/docs/MultistartOptimization/cVZvi/0.1.0/))
   - OptimizationNLopt for [NLopt.jl](https://github.com/JuliaOpt/NLopt.jl) (usage via the NLopt API; see also the available [algorithms](https://nlopt.readthedocs.io/en/latest/NLopt_Algorithms/))
+  - OptimizationNLPModels for [NLPModels.jl](https://github.com/JuliaSmoothOptimizers/NLPModels.jl)
   - OptimizationNOMAD for [NOMAD.jl](https://github.com/bbopt/NOMAD.jl) (see also [this documentation](https://bbopt.github.io/NOMAD.jl/stable/))
-  - OptimizationNonconvex for [Nonconvex.jl](https://github.com/JuliaNonconvex/Nonconvex.jl) (see also [this documentation](https://julianonconvex.github.io/Nonconvex.jl/stable/))
+  - OptimizationODE for optimization of steady-state and time-dependent ODE problems
+  - OptimizationOptimJL for [Optim.jl](https://github.com/JuliaNLSolvers/Optim.jl)
+  - OptimizationOptimisers for [Optimisers.jl](https://github.com/FluxML/Optimisers.jl) (machine learning optimizers)
+  - OptimizationPolyalgorithms for polyalgorithm optimization strategies
+  - OptimizationPRIMA for [PRIMA.jl](https://github.com/libprima/PRIMA.jl)
+  - OptimizationPyCMA for Python's CMA-ES implementation via [PythonCall.jl](https://github.com/JuliaPy/PythonCall.jl)
   - OptimizationQuadDIRECT for [QuadDIRECT.jl](https://github.com/timholy/QuadDIRECT.jl)
+  - OptimizationSciPy for [SciPy](https://scipy.org/) optimization algorithms via [PythonCall.jl](https://github.com/JuliaPy/PythonCall.jl)
+  - OptimizationSophia for Sophia optimizer (second-order stochastic optimizer)
   - OptimizationSpeedMapping for [SpeedMapping.jl](https://github.com/NicolasL-S/SpeedMapping.jl) (see also [this documentation](https://nicolasl-s.github.io/SpeedMapping.jl/stable/))
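
Each of these subpackages follows the same pattern: install it alongside Optimization.jl, load it, and pass one of its algorithms to `solve`. The snippet below is a minimal sketch of that pattern (not taken from the diff above), using OptimizationBBO as an arbitrary example; the objective function, bounds, and starting point are illustrative assumptions.

```julia
using Pkg
Pkg.add(["Optimization", "OptimizationBBO"])  # core package plus one solver subpackage

using Optimization, OptimizationBBO

# Toy objective: squared distance to the point given by the parameter vector `p`.
objective(u, p) = (u[1] - p[1])^2 + (u[2] - p[2])^2

u0 = zeros(2)        # initial guess
p  = [2.0, -1.0]     # illustrative parameter values

# BlackBoxOptim-based solvers expect box constraints, so supply bounds.
prob = OptimizationProblem(objective, u0, p; lb = [-5.0, -5.0], ub = [5.0, 5.0])
sol = solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited())
```

Switching to another backend is then just a matter of adding its subpackage and replacing the algorithm constructor in the `solve` call.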
docs/src/examples/rosenbrock.md (4 additions & 4 deletions)

@@ -5,7 +5,7 @@ flexibility of Optimization.jl. This is a gauntlet of many solvers to get a feel
for common workflows of the package and give copy-pastable starting points.

!!! note

    This example uses many different solvers of Optimization.jl. Each solver
    subpackage needs to be installed separately. For example, for the details on
    the installation and usage of the OptimizationOptimJL.jl package, see the
@@ -14,12 +14,12 @@ for common workflows of the package and give copy-pastable starting points.
The objective of this exercise is to determine the $(x, y)$ value pair that minimizes the result of a Rosenbrock function $f$ with some parameter values $a$ and $b$. The Rosenbrock function is useful for testing because it is known *a priori* to have a global minimum at $(a, a^2)$.

```math
f(x,\,y;\,a,\,b) = \left(a - x\right)^2 + b \left(y - x^2\right)^2
```

The Optimization.jl interface expects functions to be defined with a vector of optimization arguments $\bar{x}$ and a vector of parameters $\bar{p}$, i.e.:

Parameters $a$ and $b$ are captured in a vector $\bar{p}$ and assigned some arbitrary values to produce a particular Rosenbrock function to be minimized.
@@ -164,7 +164,7 @@ sol = solve(prob, CMAEvolutionStrategyOpt())
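
To make the two-argument $(\bar{x}, \bar{p})$ convention described above concrete, here is a minimal sketch of how the Rosenbrock objective is typically set up and handed to the `CMAEvolutionStrategyOpt()` solver referenced in the hunk header above. The parameter values ($a = 1$, $b = 100$) and the starting point are illustrative assumptions, not part of the diff.

```julia
using Optimization, OptimizationCMAEvolutionStrategy

# First argument: vector of optimization variables; second: vector of parameters.
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

x0 = zeros(2)        # starting point
p  = [1.0, 100.0]    # a = 1, b = 100, so the minimum sits at (1, 1)

prob = OptimizationProblem(rosenbrock, x0, p)
sol = solve(prob, CMAEvolutionStrategyOpt())
sol.u   # should be close to [1.0, 1.0]
```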