Releases: SciNim/numericalnim
meshgrid fix
What's Changed
- avoid name collisions of meshgrid by @HugoGranstrom in #42
Full Changelog: v0.8.8...v0.8.9
Extrapolation
The 1D interpolation methods now support extrapolation using these methods:
- `Constant`: Set all points outside the range of the interpolator to `extrapValue`.
- `Edge`: Use the value of the left/right edge.
- `Linear`: Uses linear extrapolation using the two points closest to the edge.
- `Native` (default): Uses the native method of the interpolator to extrapolate. For `Linear1D` it will be a linear extrapolation, and for Cubic and Hermite splines it will be cubic extrapolation.
- `Error`: Raises a `ValueError` if `x` is outside the range.
These are passed in as an argument to `eval` and `derivEval`:
```nim
let valEdge = interp.eval(x, Edge)
let valConstant = interp.eval(x, Constant, NaN)
```
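For a fuller picture, here is a minimal end-to-end sketch, assuming `newLinear1D` takes plain seqs of x and y values like the other 1D interpolators; the sampled data are made up for illustration:

```nim
import std/sequtils
import numericalnim

# Illustrative data: f(x) = x^2 sampled at 0, 1, ..., 10.
let xs = toSeq(0..10).mapIt(it.float)
let ys = xs.mapIt(it * it)

# Assumed constructor for the 1D linear interpolator; the other 1D interpolators work the same way.
let interp = newLinear1D(xs, ys)

echo interp.eval(5.5)                  # inside the range: extrapolation never kicks in
echo interp.eval(12.0, Edge)           # clamps to the value at the right edge (x = 10)
echo interp.eval(12.0, Constant, 0.0)  # returns the supplied extrapValue
echo interp.eval(12.0, Linear)         # extends the line through the two rightmost points
echo interp.eval(12.0)                 # Native (the default); linear for Linear1D
```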
`levmarq` uncertainties + CI Docs
- `levmarq` now accepts `yError`.
- `paramUncertainties` allows you to calculate the uncertainties of fitted parameters.
- `chi2` test added.
What's Changed
- build docs in CI when pushing to master by @HugoGranstrom in #35
- add chi2 + add uncertainties to levmarq by @HugoGranstrom in #36
Full Changelog: v0.8.5...v0.8.6
Fix rbf bug
Radial basis function interpolation
With radial basis function interpolation, numericalnim finally gets an interpolation method which works on scattered data in arbitrary dimensions!
Basic usage:
```nim
let interp = newRbf(points, values)
let result = interp.eval(evalPoints)
```
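A rough sketch of the scattered-data workflow follows. The shapes are assumptions based on the snippet above: `points` is an (nPoints, nDims) Tensor, `values` holds one row per point, and `evalPoints` uses the same layout as `points`:

```nim
import std/[math, random]
import arraymancer
import numericalnim

randomize(42)

# 100 scattered points in the unit square and the value of
# f(x, y) = sin(x) * cos(y) at each of them (illustrative data only).
var points = newTensor[float](100, 2)
var values = newTensor[float](100, 1)
for i in 0 ..< 100:
  let (x, y) = (rand(1.0), rand(1.0))
  points[i, 0] = x
  points[i, 1] = y
  values[i, 0] = sin(x) * cos(y)

let interp = newRbf(points, values)

# Evaluate the interpolant at new scattered points, one point per row.
let evalPoints = [[0.5, 0.5], [0.1, 0.9]].toTensor
echo interp.eval(evalPoints)
```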
What's Changed
- Radial Basis functions by @HugoGranstrom in #33
Full Changelog: v0.8.3...v0.8.4
Export LineSearchCriterion
What's Changed
- export linesearch and test it by @HugoGranstrom in #32
Full Changelog: v0.8.2...v0.8.3
Fix Nim CI - strictEffects
Fix Nim CI
Fixes #29
Optimization has joined the chat
Multi-variate optimization and differentiation have been introduced.
- `numericalnim/differentiate` offers `tensorGradient(f, x)`, which calculates the gradient of `f` w.r.t. `x` using finite differences, as well as `tensorJacobian` (returns the transpose of the gradient), `tensorHessian` and `mixedDerivative`. It also provides `checkGradient(f, analyticGrad, x, tol)` to verify that the analytic gradient is correct by comparing it to the finite difference approximation.
- `numericalnim/optimize` now has several multi-variate optimization methods: `steepestDescent`, `newton`, `bfgs` and `lbfgs`. They all have function signatures like:
```nim
proc bfgs*[U; T: not Tensor](f: proc(x: Tensor[U]): T, x0: Tensor[U], options: OptimOptions[U, StandardOptions] = bfgsOptions[U](), analyticGradient: proc(x: Tensor[U]): Tensor[T] = nil): Tensor[U]
```
where `f` is the function to be minimized, `x0` is the starting guess, `options` contains settings such as the tolerance (each method has its own options type, which can be created with, for example, `lbfgsOptions` or `newtonOptions`), and `analyticGradient` can be supplied to avoid having to do finite difference approximations of the derivatives. There are 4 different line search methods supported, and they are set in the options: `Armijo`, `Wolfe`, `WolfeStrong`, `NoLineSearch`.
- `levmarq`: non-linear least-square optimizer:

```nim
proc levmarq*[U; T: not Tensor](f: proc(params: Tensor[U], x: U): T, params0: Tensor[U], xData: Tensor[U], yData: Tensor[T], options: OptimOptions[U, LevmarqOptions[U]] = levmarqOptions[U]()): Tensor[U]
```

where `f` is the function you want to fit with parameters `params`, and `x` is the value to evaluate the function at. `params0` is the initial guess for the parameters, `xData` is a 1D Tensor with the x points, `yData` is a 1D Tensor with the y points, and `options` can be created using `levmarqOptions`. It returns the final parameters. Usage sketches of both the optimizers and `levmarq` are shown below.
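To make the new modules concrete, here is a sketch combining the finite-difference utilities with the optimizers. The test function, tolerance, and printing of `checkGradient`'s result are illustrative choices, not prescribed by the release notes:

```nim
import arraymancer
import numericalnim/[differentiate, optimize]

# A simple convex test function with its minimum at (1, -2) (illustrative only).
proc f(x: Tensor[float]): float =
  (x[0] - 1.0) * (x[0] - 1.0) + (x[1] + 2.0) * (x[1] + 2.0)

# Its analytic gradient, used both for checkGradient and to help the optimizers.
proc grad(x: Tensor[float]): Tensor[float] =
  [2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)].toTensor

let x0 = [0.0, 0.0].toTensor

echo tensorGradient(f, x0)             # finite-difference gradient at the starting point
echo checkGradient(f, grad, x0, 1e-6)  # compares the analytic gradient against finite differences

# Minimize with default options; the analytic gradient is optional.
echo bfgs(f, x0, analyticGradient = grad)
echo lbfgs(f, x0)
```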
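And a corresponding sketch for `levmarq`, fitting a two-parameter exponential decay to synthetic noise-free data; the model, data, and starting guess are made up, and convergence to the true parameters is only expected because the problem is well behaved:

```nim
import std/math
import arraymancer
import numericalnim/optimize

# Model: y = a * exp(-b * x), with params = [a, b] (illustrative choice).
proc fitFunc(params: Tensor[float], x: float): float =
  params[0] * exp(-params[1] * x)

# Synthetic, noise-free data generated from a = 2.0, b = 0.5.
let xData = arraymancer.linspace(0.0, 10.0, 50)
let yData = xData.map_inline(2.0 * exp(-0.5 * x))

let params0 = [1.0, 1.0].toTensor  # initial guess for [a, b]
let fitted = levmarq(fitFunc, params0, xData, yData)
echo fitted                        # should end up close to [2.0, 0.5]
```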
Note: There are basic tests to ensure these methods converge for simple problems, but they are not tested on more complex problems and should be considered experimental until more tests have been done. Please try them out, but don't rely on them for anything important for now. Also, the API isn't set in stone yet so expect that it may change in future versions.
Fix nim CI
Adds the task nimCI, which is to be run by the Nim CI.