Releases: SciML/Optimization.jl
v3.20.0
Optimization v3.20.0
Merged pull requests:
- CompatHelper: add new compat entry for HiGHS at version 1 for package docs, (keep existing compat) (#551) (@github-actions[bot])
- CompatHelper: bump compat for Flux to 0.14 for package docs, (keep existing compat) (#566) (@github-actions[bot])
- CompatHelper: bump compat for Documenter to 1 for package docs, (keep existing compat) (#594) (@github-actions[bot])
- CompatHelper: add new compat entry for OptimizationPRIMA at version 0.0.1 for package docs, (keep existing compat) (#618) (@github-actions[bot])
- Change typeof(x) <: y to x isa y (#623) (@pepijndevos)
- Fix optim maxsense (#624) (@Vaibhavdixit02)
- fix: use SII instead of explicitly accessing SciMLFunction.syms (#626) (@AayushSabharwal)
- Create polyopt.md (#627) (@Vaibhavdixit02)
- Update OptimizationOptimJL.jl (#630) (@Vaibhavdixit02)
- CompatHelper: add new compat entry for SciMLBase at version 2 for package docs, (keep existing compat) (#637) (@github-actions[bot])
- Aqua + typos CI (#639) (@ArnoStrouwen)
- Bump actions/checkout from 3 to 4 (#640) (@dependabot[bot])
- Update Flux compat (#642) (@vavrines)
- [skip ci] dependabot ignore typos patches (#643) (@ArnoStrouwen)
Closed issues:
- Linking Nonconvex.jl (#123)
- Add support for constraints for the other AD backends (#130)
- Document MOI Optimizers (#143)
- compiled tape ReverseDiff (#245)
- Wrap LsqFit.jl and LeastSquaresOptim.jl to replace DiffEqParamEstim implementation (#447)
- EnsembleProblem support (#491)
- Documentation callbacks. (#498)
- Provide new initial point without redefining the optimization problem (#500)
- Is it possible to set the parameter "min_mesh_size" of the NOMAD algorithm when calling from Optimization.jl? (#573)
- Finding a suitable algorithm to solve my problem (#610)
- Docs job erroring (#614)
- a bug in Optimization.jl's Enzyme extension (#621)
- Provide a way to create OptimizationProblem from NonlinearLeastSquaresProblem (#622)
- Optim.jl's IPNewton should not require constraints (#629)
- Enormous unnecessary memory allocation (#638)
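PR #623 above replaced `typeof(x) <: y` checks with `x isa y`. The two forms answer the same question, but `isa` is the idiomatic spelling. A minimal illustrative sketch (not the actual diff from the PR):

```julia
# Illustrative only: both expressions check whether a value is an
# instance of a type; `isa` is the idiomatic form that PR #623 adopted.
x = [1.0, 2.0]

old_style = typeof(x) <: AbstractVector   # works, but roundabout
new_style = x isa AbstractVector          # idiomatic and equivalent

@assert old_style == new_style == true
```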
v3.19.3
Optimization v3.19.3
Merged pull requests:
- Add OptimizationPRIMA to the docs (#616) (@ChrisRackauckas)
v3.19.2
Optimization v3.19.2
Merged pull requests:
- CompatHelper: bump compat for Optimisers to 0.3 for package OptimizationOptimisers, (keep existing compat) (#580) (@github-actions[bot])
- Documenter 1.0 upgrade (#598) (@ArnoStrouwen)
- Improve Performance for OptimizationBBO (#600) (@Zentrik)
- Add callback to MOI (#601) (@Vaibhavdixit02)
- Mark tests broken on v1.6 as broken on v1.6 (#603) (@ChrisRackauckas)
- Bound Optim.jl to the range where it is generic (#604) (@ChrisRackauckas)
- Require v1.9 for AD tests (#605) (@ChrisRackauckas)
- build: lift Optim bound (#607) (@sathvikbhagavan)
- build(OptimizationOptimJL): bump patch version (#608) (@sathvikbhagavan)
- Make AutoZygote robust to zero gradients (#609) (@ChrisRackauckas)
- fix typos (#611) (@spaette)
- [WIP] Add PRIMA wrapper (#612) (@Vaibhavdixit02)
- Fix zygote constraint bug and update rosenbrock doc (#615) (@Vaibhavdixit02)
Closed issues:
- Inclusion of PRIMA solvers in Optimization.jl (#593)
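PR #609 above made AutoZygote robust to zero gradients. Zygote represents a gradient that is identically zero as `nothing`, which a downstream optimizer cannot consume directly. The helper below is a hypothetical sketch of the idea, not the library's code:

```julia
# Zygote returns `nothing` for a structurally zero gradient. A robust
# AD wrapper substitutes an actual zero array of the right shape before
# handing the gradient to the optimizer. `fix_zero_grad` is a
# hypothetical helper illustrating the idea behind PR #609.
fix_zero_grad(g, x) = g === nothing ? zero(x) : g

x = [1.0, 2.0, 3.0]
@assert fix_zero_grad(nothing, x) == [0.0, 0.0, 0.0]
@assert fix_zero_grad([1.0, 1.0, 1.0], x) == [1.0, 1.0, 1.0]
```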
v3.19.1
Optimization v3.19.1
Merged pull requests:
- map maxiters to outer_iterations (#544) (@SebastianM-C)
- Eliminate some runtime dispatch and other things (#597) (@Vaibhavdixit02)
Closed issues:
- Bounds + maxiters breaks BFGS (#508)
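PR #544 above maps the generic `maxiters` keyword to the solver-specific `outer_iterations` option. The idea can be sketched as a keyword-translation helper; `remap_kwargs` is illustrative, not the actual Optimization.jl implementation:

```julia
# Hypothetical sketch of the idea behind PR #544: rename the generic
# `maxiters` keyword to the solver-specific `outer_iterations` keyword
# before forwarding options to the underlying solver.
function remap_kwargs(kwargs::Dict{Symbol,Any})
    out = copy(kwargs)
    if haskey(out, :maxiters)
        out[:outer_iterations] = pop!(out, :maxiters)
    end
    return out
end

mapped = remap_kwargs(Dict{Symbol,Any}(:maxiters => 100, :abstol => 1e-8))
@assert mapped[:outer_iterations] == 100
@assert !haskey(mapped, :maxiters)
```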
v3.19.0
Optimization v3.19.0
v3.18.0
Optimization v3.18.0
Merged pull requests:
- Support both float and array return extensions [Enzyme] (#589) (@wsmoses)
- Setup tape compilation for ReverseDiff (#590) (@ChrisRackauckas)
- Fix chunksize issue (#595) (@Vaibhavdixit02)
v3.17.0
Optimization v3.17.0
Merged pull requests:
- [WIP] Enzyme and sparse updates (#585) (@Vaibhavdixit02)
- Fix type instability in Enzyme extension (#586) (@wsmoses)
- Bump actions/checkout from 3 to 4 (#588) (@dependabot[bot])
v3.16.1
Optimization v3.16.1
Merged pull requests:
- Fix typos in the extension names (#582) (@ChrisRackauckas)
- Update extensions to not load packages directly only through Optimization (#583) (@Vaibhavdixit02)
v3.16.0
Optimization v3.16.0
Closed issues:
- Hessian coloring (#269)
- Move all of the AD overloads to subpackages instead of Requires (#309)
- Manual Hessian of Lagrangian function (#343)
- init interface for reduced overhead in repeated solves (#352)
- very easy to get bit by callbacks when there is a mismatch in the args of the callback function and the return of the loss (#538)
- Missing parameters in call to instantiate_function (#559)
- Bump compat for Enzyme.jl to 0.11.2 or higher (#564)
- Error with AutoZygote in OptimizationFunction after Julia version upgrade to 1.9.2 (#571)
Merged pull requests:
- [Experimental] Add Sophia method implementation (#534) (@Vaibhavdixit02)
- Handle sparse hessians, jacobians and hessvec product better (#553) (@Vaibhavdixit02)
- CompatHelper: add new compat entry for Symbolics at version 5, (keep existing compat) (#557) (@github-actions[bot])
- Try Enzyme 0.11.2 (#561) (@Vaibhavdixit02)
- Some NLopt and MOI updates (#562) (@Vaibhavdixit02)
- Update Project.toml to try Enzyme 0.11.4 and then 0.11.3 (#563) (@Vaibhavdixit02)
- Update hessian implementation and avoid closure in gradient (#565) (@Vaibhavdixit02)
- remove second copy of __moi_status_to_ReturnCode (#568) (@visr)
- Refactor: correct a typo (#569) (@tapyu)
- Throw error from callback true in NLopt to halt optimization (#570) (@Vaibhavdixit02)
- CompatHelper: bump compat for ADTypes to 0.2, (keep existing compat) (#574) (@github-actions[bot])
- CompatHelper: bump compat for NLopt to 1 for package docs, (keep existing compat) (#575) (@github-actions[bot])
- CompatHelper: bump compat for NLopt to 1 for package OptimizationNLopt, (keep existing compat) (#576) (@github-actions[bot])
- Add constraints support to ReverseDiff and Zygote (#577) (@Vaibhavdixit02)
- Fix mtk empty constraints creation and update rosenbrock example (#578)
v3.15.2
Optimization v3.15.2
Merged pull requests:
- Bring back AbstractRule in OptimizationOptimisers (#552) (@Vaibhavdixit02)