Numeric.Optimization (numeric-optimization-0.1.1.0)

Copyright   : (c) Masahiro Sakai 2023
License     : BSD-style
Maintainer  : masahiro.sakai@gmail.com
Stability   : provisional
Portability : non-portable
Safety      : Safe-Inferred

Problem wrapper types
---------------------

WithConstraints
    Wrapper type for adding constraints to a problem.

WithBounds
    Wrapper type for adding bounds to a problem.

WithHessian
    Wrapper type for adding a Hessian to a problem.

WithGrad
    Wrapper type for adding a gradient function to a problem.

Constraint
    Type of constraint. Currently, no constraints are supported.

Optionally
    Optional constraint (class method: optionalDict).

Problem classes
---------------

HasHessian
    Optimization problem equipped with Hessian information.

    hessian
        Hessian of a function computed by func. It is called "hess" in
        scipy.optimize.minimize.

    hessianProduct
        The product of the Hessian H of a function f at x with a vector v.
        It is called "hessp" in scipy.optimize.minimize. See also
        https://hackage.haskell.org/package/ad-4.5.4/docs/Numeric-AD.html#v:hessianProduct.

HasGrad
    Optimization problem equipped with gradient information.

    grad
        Gradient of a function computed by func. It is called "jac" in
        scipy.optimize.minimize.

    grad'
        Pair of func and grad.

    grad'M
        Similar to grad', but destination-passing style is used for the
        gradient vector.

IsProblem
    Optimization problems.

    func
        Objective function. It is called "fun" in scipy.optimize.minimize.

    bounds
        Bounds.

    constraints
        Constraints.

Exceptions, statistics, and results
-----------------------------------

OptimizationException
    The bad things that can happen when you use the library.
    Constructors: UnsupportedProblem, UnsupportedMethod, GradUnavailable,
    HessianUnavailable.

Statistics
    Statistics of the optimization process.

    totalIters      Total number of iterations.
    funcEvals       Total number of function evaluations.
    gradEvals       Total number of gradient evaluations.
    hessianEvals    Total number of Hessian evaluations.
    hessEvals       Total number of Hessian evaluations (same as hessianEvals).

Result
    Optimization result.

    resultSuccess     Whether or not the optimizer exited successfully.
    resultMessage     Description of the cause of the termination.
    resultSolution    Solution.
    resultValue       Value of the function at the solution.
    resultGrad        Gradient at the solution.
    resultHessian     Hessian at the solution; may be an approximation.

    Further fields: resultHessianInv, resultStatistics.

Methods
-------

Method
    Numerical optimization algorithm to use.
    Constructors: CGDescent, LBFGS, LBFGSB, Newton.

LBFGSB
    References:

    [1] R. H. Byrd, P. Lu and J. Nocedal. "A Limited Memory Algorithm for
        Bound Constrained Optimization" (1995), SIAM Journal on Scientific
        and Statistical Computing, 16, 5, pp. 1190-1208.
        http://www.ece.northwestern.edu/~nocedal/PSfiles/limited.ps.gz

    [2] C. Zhu, R. H. Byrd and J. Nocedal. "Algorithm 778: L-BFGS-B,
        FORTRAN routines for large scale bound constrained optimization"
        (1997), ACM Transactions on Mathematical Software, Vol 23, Num. 4,
        pp. 550-560.
        http://www.ece.northwestern.edu/~nocedal/PSfiles/lbfgsb.ps.gz

    [3] J. L. Morales and J. Nocedal. "Remark on Algorithm 778: L-BFGS-B,
        FORTRAN routines for large scale bound constrained optimization"
        (2011), ACM Transactions on Mathematical Software, Vol 38, Num. 7,
        pp. 1-4.
        http://www.ece.northwestern.edu/~morales/PSfiles/acm-remark.pdf

    [4] https://hackage.haskell.org/package/l-bfgs-b

    [5] http://users.iems.northwestern.edu/~nocedal/lbfgsb.html

Newton
    Naïve implementation of the Newton method in Haskell. This method
    requires both gradient and Hessian.

isSupportedMethod
    Whether a Method is supported under the current environment.

hasOptionalDict
    Utility function to define Optionally instances.

boundsUnconstrained
    Bounds for unconstrained problems, i.e. (-∞, +∞). See also
    isUnconstainedBounds.
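Some algorithms may be unavailable in a given environment (hence
isSupportedMethod and the UnsupportedMethod exception). The sketch below
assumes isSupportedMethod has the type Method -> Bool suggested by its
description; the chooseMethod name and the preference order are purely
illustrative, not part of the package.

> import Numeric.Optimization (Method (..), isSupportedMethod)
>
> -- Illustrative sketch: pick the first algorithm the current environment
> -- supports, falling back to the pure-Haskell Newton implementation.
> -- The preference order here is arbitrary.
> chooseMethod :: Method
> chooseMethod =
>   case filter isSupportedMethod [LBFGSB, CGDescent, LBFGS, Newton] of
>     (m:_) -> m
>     []    -> Newton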
Params
    Parameters for optimization algorithms. Use def (from
    Data.Default.Class in the data-default-class package) as a default.
    Fields: paramsCallback, paramsTol, paramsFTol, paramsGTol,
    paramsMaxIters, paramsPast, paramsMaxCorrections.

minimize
    Arguments:

      Method   Numerical optimization algorithm to use.
      Params   Parameters for optimization algorithms. Use def as a default.
      problem  Optimization problem to solve.
      x0       Initial value.

    Example problem (the Rosenbrock function with its analytic gradient;
    the list patterns and literals on Vector assume the OverloadedLists
    extension):

    > -- Rosenbrock function
    > rosenbrock :: Vector Double -> Double
    > rosenbrock [x,y] = sq (1 - x) + 100 * sq (y - sq x)
    >
    > -- Analytic gradient of the Rosenbrock function
    > rosenbrock' :: Vector Double -> Vector Double
    > rosenbrock' [x,y] =
    >   [ 2 * (1 - x) * (-1) + 100 * 2 * (y - sq x) * (-2) * x
    >   , 100 * 2 * (y - sq x)
    >   ]
    >
    > sq :: Floating a => a -> a
    > sq x = x ** 2
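A minimal usage sketch, passing the problem above to minimize with its
gradient attached via the WithGrad wrapper. It assumes that minimize runs
in IO and returns a Result with the fields listed earlier, that Params has
a Default instance so def can be used, that WithGrad takes the objective
first and the gradient second, and that the example's Vector is the
storable vector type; verify these details against the package's actual
signatures before relying on them.

> {-# LANGUAGE OverloadedLists #-}
>
> import Data.Default.Class (def)
> import Data.Vector.Storable (Vector)  -- assumed Vector flavour of the example
> import Numeric.Optimization
>
> -- Reuses rosenbrock, rosenbrock' and sq from the example above.
> main :: IO ()
> main = do
>   -- Wrap the objective with its analytic gradient and start from (-3,-4).
>   result <- minimize LBFGS def (WithGrad rosenbrock rosenbrock') [-3, -4]
>   print (resultSuccess result)   -- whether the optimizer exited successfully
>   print (resultSolution result)  -- solution vector; the true minimum is at [1,1]
>   print (resultValue result)     -- objective value at the solution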