srtree-2.0.0.0 — Haddock documentation

Algorithm.SRTree.NonlinearOpt — high-level interface to the NLOPT optimizers
(c) Matthew Peddie 2017, BSD3, provisional

Solution -- This structure is returned in the event of a successful optimization.
  solutionCost    -- The objective function value at the minimum
  solutionParams  -- The parameter vector which minimizes the objective
  solutionResult  -- Why the optimizer stopped

AugLagAlgorithm -- The augmented Lagrangian solvers allow you to enforce
  nonlinear constraints while using local or global algorithms that don't
  natively support them. The subsidiary problem is used to do the
  minimization, but the AUGLAG methods modify the objective to enforce the
  constraints. Please see the NLOPT algorithm manual
  (http://ab-initio.mit.edu/wiki/index.php/NLopt_Algorithms) for more details
  on how the methods work and how they relate to one another. See the
  documentation for AugLagProblem for an important note about the constraint
  functions.
  AUGLAG_LOCAL     -- AUGmented LAGrangian with a local subsidiary method
  AUGLAG_EQ_LOCAL  -- AUGmented LAGrangian with a local subsidiary method and with penalty functions only for equality constraints
  AUGLAG_GLOBAL    -- AUGmented LAGrangian with a global subsidiary method
  AUGLAG_EQ_GLOBAL -- AUGmented LAGrangian with a global subsidiary method and with penalty functions only for equality constraints

AugLagProblem -- IMPORTANT NOTE: for augmented Lagrangian problems you, the
  user, are responsible for providing the appropriate type of constraint. If
  the subsidiary problem requires a derivative-based objective (ObjectiveD),
  you should provide constraint functions with derivatives. If the subsidiary
  problem requires a derivative-free objective (Objective), you should provide
  constraint functions without derivatives. If you don't do this, you may get
  a runtime error.
  alEquality  -- Possibly empty set of equality constraints
  alEqualityD -- Possibly empty set of equality constraints with derivatives
  alalgorithm -- Algorithm specification

LocalAlgorithm -- These are the local minimization algorithms provided by
  NLOPT. Please see the NLOPT algorithm manual
  (http://ab-initio.mit.edu/wiki/index.php/NLopt_Algorithms) for more details
  on how the methods work and how they relate to one another. Note that some
  local methods require you to provide derivatives (gradients or Jacobians)
  for your objective function and constraint functions. Optional parameters
  are wrapped in a Maybe; to use the default behavior, simply pass Nothing.
  LBFGS_NOCEDAL           -- Limited-memory BFGS
  LBFGS                   -- Limited-memory BFGS
  VAR2                    -- Shifted limited-memory variable-metric, rank-2
  VAR1                    -- Shifted limited-memory variable-metric, rank-1
  TNEWTON                 -- Truncated Newton's method
  TNEWTON_RESTART         -- Truncated Newton's method with automatic restarting
  TNEWTON_PRECOND         -- Preconditioned truncated Newton's method
  TNEWTON_PRECOND_RESTART -- Preconditioned truncated Newton's method with automatic restarting
  MMA                     -- Method of Moving Asymptotes
  SLSQP                   -- Sequential Least-Squares Quadratic Programming
  CCSAQ                   -- Conservative Convex Separable Approximation
  PRAXIS                  -- PRincipal AXIS gradient-free local optimization
  COBYLA                  -- Constrained Optimization BY Linear Approximations
  NEWUOA                  -- Powell's NEWUOA algorithm
  NEWUOA_BOUND            -- Powell's NEWUOA algorithm with bounds by SGJ
  NELDERMEAD              -- Nelder-Mead Simplex gradient-free method
  SBPLX                   -- NLOPT implementation of Rowan's Subplex algorithm
  BOBYQA                  -- Bounded Optimization BY Quadratic Approximations

LocalProblem
  lsize      -- The dimension of the parameter vector
  lstop      -- At least one stopping condition
  lalgorithm -- Algorithm specification
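The bound-constrained example below refers to "the example from the beginning
of the module", which is not preserved in this extract. For reference, the
unconstrained session (identical to the subsidiary run shown in the augmented
Lagrangian example further down) looks like this:

  >>> import Numeric.LinearAlgebra ( dot, fromList )
  >>> let objf x = x `dot` x + 22                       -- define objective
  >>> let stop = ObjectiveRelativeTolerance 1e-9 :| []  -- define stopping criterion
  >>> let algorithm = SBPLX objf [] Nothing             -- specify algorithm
  >>> let problem = LocalProblem 2 stop algorithm       -- specify problem
  >>> let x0 = fromList [5, 10]                         -- specify initial guess
  >>> minimizeLocal problem x0
  Right (Solution {solutionCost = 22.0, solutionParams = [0.0,0.0], solutionResult = FTOL_REACHED})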
GlobalAlgorithm -- These are the global minimization algorithms provided by
  NLOPT. Please see the NLOPT algorithm manual
  (http://ab-initio.mit.edu/wiki/index.php/NLopt_Algorithms) for more details
  on how the methods work and how they relate to one another. Optional
  parameters are wrapped in a Maybe; to use the default behavior, simply pass
  Nothing.
  DIRECT               -- DIviding RECTangles
  DIRECT_L             -- DIviding RECTangles, locally-biased variant
  DIRECT_L_RAND        -- DIviding RECTangles, "slightly randomized"
  DIRECT_NOSCAL        -- DIviding RECTangles, unscaled version
  DIRECT_L_NOSCAL      -- DIviding RECTangles, locally-biased and unscaled
  DIRECT_L_RAND_NOSCAL -- DIviding RECTangles, locally-biased, unscaled and "slightly randomized"
  ORIG_DIRECT          -- DIviding RECTangles, original FORTRAN implementation
  ORIG_DIRECT_L        -- DIviding RECTangles, locally-biased, original FORTRAN implementation
  STOGO                -- Stochastic Global Optimization. Only available if you have linked with libnlopt_cxx.
  STOGO_RAND           -- Stochastic Global Optimization, randomized variant. Only available if you have linked with libnlopt_cxx.
  CRS2_LM              -- Controlled Random Search with Local Mutation
  ISRES                -- Improved Stochastic Ranking Evolution Strategy
  ESCH                 -- Evolutionary Algorithm
  MLSL                 -- Original Multi-Level Single-Linkage
  MLSL_LDS             -- Multi-Level Single-Linkage with Sobol low-discrepancy sequence for starting points

GlobalProblem
  lowerBounds -- Lower bounds for x
  upperBounds -- Upper bounds for x
  gstop       -- At least one stopping condition
  galgorithm  -- Algorithm specification

VectorStorage -- This specifies the memory size to be used by algorithms like
  LBFGS which store approximate Hessian or Jacobian matrices.

Population -- This specifies the population size for algorithms that use a
  pool of solutions.

RandomSeed -- This specifies how to initialize the random number generator for
  stochastic algorithms.
  SeedValue    -- Seed the RNG with the provided value.
  SeedFromTime -- Seed the RNG using the system clock.
  Don'tSeed    -- Don't perform any explicit initialization of the RNG.

StoppingCondition -- A StoppingCondition tells NLOPT when to stop working on a
  minimization problem. When multiple StoppingConditions are provided, the
  problem will stop when any one condition is met.
  MinimumValue               -- Stop minimizing when an objective value less than or equal to the provided value is found.
  ObjectiveRelativeTolerance -- Stop minimizing when an optimization step changes the objective value J by less than the provided tolerance multiplied by |J|.
  ObjectiveAbsoluteTolerance -- Stop minimizing when an optimization step changes the objective value by less than the provided tolerance.
  ParameterRelativeTolerance -- Stop when an optimization step changes every element of the parameter vector x by less than x scaled by the provided tolerance.
  ParameterAbsoluteTolerance -- Stop when an optimization step changes every element of the parameter vector x by less than the corresponding element in the provided vector of tolerances.
  MaximumEvaluations         -- Stop when the number of evaluations of the objective function exceeds the provided count.
  MaximumTime                -- Stop when the optimization time exceeds the provided time (in seconds). This is not a precise limit.
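Stopping conditions are combined in a NonEmpty list with :|, as in the
examples throughout this module, and the solver stops as soon as any of them
is met. A minimal sketch (the concrete numeric argument types of
MaximumEvaluations and MaximumTime are assumed here, not taken from the
extract):

  >>> let stop = ObjectiveRelativeTolerance 1e-9 :| [MaximumEvaluations 10000, MaximumTime 5]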
Bounds -- Bound constraints are specified by vectors of the same dimension as
  the parameter space.

  Example program: the following interactive session enforces lower bounds on
  the example from the beginning of the module. This prevents the optimizer
  from locating the true minimum at (0, 0); a slightly higher constrained
  minimum at (1, 1) is found. Note that the optimizer returns XTOL_REACHED
  rather than FTOL_REACHED, because the bound constraint is active at the
  final minimum.

  >>> import Numeric.LinearAlgebra ( dot, fromList )
  >>> let objf x = x `dot` x + 22                          -- define objective
  >>> let stop = ObjectiveRelativeTolerance 1e-6 :| []     -- define stopping criterion
  >>> let lowerbound = LowerBounds $ fromList [1, 1]       -- specify bounds
  >>> let algorithm = NELDERMEAD objf [lowerbound] Nothing -- specify algorithm
  >>> let problem = LocalProblem 2 stop algorithm          -- specify problem
  >>> let x0 = fromList [5, 10]                            -- specify initial guess
  >>> minimizeLocal problem x0
  Right (Solution {solutionCost = 24.0, solutionParams = [1.0,1.0], solutionResult = XTOL_REACHED})

  LowerBounds -- Lower bound vector v means we want x >= v.
  UpperBounds -- Upper bound vector u means we want x <= u.

Constraints
  InequalityConstraintsD -- A collection of inequality constraints that supply constraint derivatives.
  EqualityConstraintsD   -- A collection of equality constraints that supply constraint derivatives.
  InequalityConstraints  -- A collection of inequality constraints that do not supply constraint derivatives.
  EqualityConstraints    -- A collection of equality constraints that do not supply constraint derivatives.
  InequalityConstraint   -- An inequality constraint, comprising the constraint function (or functions, if a preconditioner is used) along with the desired tolerance.
  EqualityConstraint     -- An equality constraint, comprising the constraint function (or functions, if a preconditioner is used) along with the desired tolerance.

  Scalar         -- A scalar constraint.
  Vector         -- A vector constraint.
  Preconditioned -- A scalar constraint with an attached preconditioning function.

  VectorConstraintD -- A constraint function which returns c(x) given the parameter vector x, along with the Jacobian (first-derivative) matrix of c(x) with respect to x at that point. The constraint will enforce c(x) == 0 (equality constraint) or c(x) <= 0 (inequality constraint).
  VectorConstraint  -- A constraint function which returns a vector c(x) given the parameter vector x. The constraint will enforce c(x) == 0 (equality constraint) or c(x) <= 0 (inequality constraint).
  ScalarConstraintD -- A constraint function which returns c(x) given the parameter vector x, along with the gradient of c(x) with respect to x at that point. The constraint will enforce c(x) == 0 (equality constraint) or c(x) <= 0 (inequality constraint).
  ScalarConstraint  -- A constraint function which returns c(x) given the parameter vector x. The constraint will enforce c(x) == 0 (equality constraint) or c(x) <= 0 (inequality constraint).

Preconditioner -- A preconditioner function, which computes vpre = H(x) v,
  where H is the Hessian matrix: the positive semi-definite second derivative
  at the given parameter vector x, or an approximation thereof.

ObjectiveD -- An objective function that calculates both the objective value
  and the gradient of the objective with respect to the input parameter
  vector, at the given parameter vector.

Objective -- An objective function that calculates the objective value at the
  given parameter vector.
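To make the (value, derivative) pairing concrete, the derivative-based
objective and scalar constraint used in the SLSQP example below have the
following shape (hmatrix types; the functions themselves are the running
x . x + 22 example):

  >>> import Numeric.LinearAlgebra ( dot, fromList, toList, scale )
  >>> let objfD x   = (x `dot` x + 22, 2 `scale` x)             -- ObjectiveD: (objective value, gradient)
  >>> let cnstrD x  = (sum (toList x) - 1.0, fromList [1, 1])   -- ScalarConstraintD: (constraint violation, gradient)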
minimizeGlobal -- Solve the specified global optimization problem.

  Example program: the following interactive session uses the ISRES algorithm,
  a stochastic, derivative-free global optimizer, to minimize a trivial
  function with a minimum of 22.0 at (0, 0). The search is conducted within a
  box from -10 to 10 in each dimension.

  >>> import Numeric.LinearAlgebra ( dot, fromList )
  >>> let objf x = x `dot` x + 22                              -- define objective
  >>> let stop = ObjectiveRelativeTolerance 1e-12 :| []        -- define stopping criterion
  >>> let algorithm = ISRES objf [] [] (SeedValue 22) Nothing  -- specify algorithm
  >>> let lowerbounds = fromList [-10, -10]                    -- specify bounds
  >>> let upperbounds = fromList [10, 10]                      -- specify bounds
  >>> let problem = GlobalProblem lowerbounds upperbounds stop algorithm
  >>> let x0 = fromList [5, 8]                                 -- specify initial guess
  >>> minimizeGlobal problem x0
  Right (Solution {solutionCost = 22.000000000002807, solutionParams = [-1.660591102367038e-6,2.2407062393213684e-7], solutionResult = FTOL_REACHED})

minimizeLocal
  Example program: the following interactive session enforces the same scalar
  constraint as the nonlinear constraint example, but this time it uses the
  SLSQP solver to find the minimum.

  >>> import Numeric.LinearAlgebra ( dot, fromList, toList, scale )
  >>> let objf x = (x `dot` x + 22, 2 `scale` x)
  >>> let stop = ObjectiveRelativeTolerance 1e-9 :| []
  >>> let constraintf x = (sum (toList x) - 1.0, fromList [1, 1])
  >>> let constraint = EqualityConstraint (Scalar constraintf) 1e-6
  >>> let algorithm = SLSQP objf [] [] [constraint]
  >>> let problem = LocalProblem 2 stop algorithm
  >>> let x0 = fromList [5, 10]
  >>> minimizeLocal problem x0
  Right (Solution {solutionCost = 22.5, solutionParams = [0.4999999999999998,0.5000000000000002], solutionResult = FTOL_REACHED})
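A hedged variation on the session above using an inequality instead of an
equality constraint. The position of the inequality-constraint list in the
SLSQP constructor is an assumption (mirroring the equality list in the
example) and is not confirmed by this extract:

  >>> let ineqf x = (sum (toList x) - 1.0, fromList [1, 1])  -- c(x) <= 0 would enforce x0 + x1 <= 1
  >>> let ineq = InequalityConstraint (Scalar ineqf) 1e-6
  >>> let algorithm' = SLSQP objf [] [ineq] []               -- assumed argument position for inequalities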
minimizeAugLag
  Example program: the following interactive session enforces the same scalar
  constraint as the nonlinear constraint example, but this time it uses the
  augmented Lagrangian method to enforce the constraint and the SBPLX
  algorithm, which does not support nonlinear constraints itself, to perform
  the minimization. As before, the parameters must always sum to 1, and the
  minimizer finds the same constrained minimum of 22.5 at (0.5, 0.5).

  >>> import Numeric.LinearAlgebra ( dot, fromList, toList )
  >>> let objf x = x `dot` x + 22
  >>> let stop = ObjectiveRelativeTolerance 1e-9 :| []
  >>> let algorithm = SBPLX objf [] Nothing
  >>> let subproblem = LocalProblem 2 stop algorithm
  >>> let x0 = fromList [5, 10]
  >>> minimizeLocal subproblem x0
  Right (Solution {solutionCost = 22.0, solutionParams = [0.0,0.0], solutionResult = FTOL_REACHED})
  >>> -- define constraint function:
  >>> let constraintf x = sum (toList x) - 1.0
  >>> -- define constraint object to pass to the algorithm:
  >>> let constraint = EqualityConstraint (Scalar constraintf) 1e-6
  >>> let problem = AugLagProblem [constraint] [] (AUGLAG_EQ_LOCAL subproblem)
  >>> minimizeAugLag problem x0
  Right (Solution {solutionCost = 22.500000015505844, solutionParams = [0.5000880506776678,0.4999119493223323], solutionResult = FTOL_REACHED})

Argument documentation
  VectorConstraintD : parameter vector; constraint vector size; result is (constraint violation vector, constraint Jacobian).
  VectorConstraint  : parameter vector; constraint vector size; result is the constraint violation vector.
  ScalarConstraintD : parameter vector; result is (constraint violation, constraint gradient).
  ScalarConstraint  : parameter vector x; result is the constraint violation (deviation from 0).
  Preconditioner    : parameter vector x; vector v to precondition at x; result is the preconditioned vector vpre.
  ObjectiveD        : parameter vector; result is (objective function value, gradient).
  Objective         : parameter vector; result is the objective function value.
  minimizeGlobal / minimizeLocal / minimizeAugLag : problem specification; initial parameter guess; result is the optimization results.

srtree-2.0.0.0
(c) Fabricio Olivetti 2021 - 2024, BSD3, fabricio.olivetti@gmail.com, experimental

Data.SRTree
  Function -- Supported functions
  Op       -- Supported operators
  SRTree   -- Tree structure to be used with Symbolic Regression algorithms.
              This structure is a fixed point of an n-ary tree.
    Var   -- index of the variable
    Param -- index of the parameter
    Const -- constant value, can be converted to a parameter
             (TODO: IConst Int for integer constants and RConst Ratio for rational constants)
    Uni   -- univariate function
    Bin   -- binary operator

  var    -- create a tree with a single node representing a variable
  param  -- create a tree with a single node representing a parameter
  constv -- create a tree with a single node representing a constant value
  arity  -- Arity of the current node
  getChildren -- Get the children of a node. Returns an empty list in case of a leaf node.
      >>> map showExpr . getChildren $ "x0" + 2
      ["x0", 2]
  childrenOf      -- Get the children of an unfixed node
  replaceChildren -- replaces the children with elements from a list
  getOperator     -- returns a node containing the operator and () as children
  countNodes -- Count the number of nodes in a tree.
      >>> countNodes $ "x0" + 2
      3
  countVarNodes -- Count the number of Var nodes.
      >>> countVarNodes $ "x0" + 2 * ("x0" - sin "x1")
      3
  countParams -- Count the number of Param nodes.
      >>> countParams $ "x0" + "t0" * sin ("t1" + "x1") - "t0"
      3
  countConsts -- Count the number of Const nodes.
      >>> countConsts $ "x0" * 2 + 3 * sin "x0"
      2
  countOccurrences -- Count the occurrences of the variable indexed as ix.
      >>> countOccurrences 0 $ "x0" * 2 + 3 * sin "x0" + "x1"
      2
  countUniqueTokens -- Counts the number of unique tokens.
      >>> countUniqueTokens $ "x0" + ("x1" * "x0" - sin ("x0" ** 2))
      8
  numberOfVars -- Returns the number of unique variables.
      >>> numberOfVars $ "x0" + 2 * ("x0" - sin "x1")
      2
  getIntConsts -- Returns the integer constants. We assume an integer constant
      is a value for which `floor x == ceiling x`.
      >>> getIntConsts $ "x0" + 2 * "x1" ** 3 - 3.14
      [2.0,3.0]
  relabelParams -- Relabel the parameter indices incrementally starting from 0.
      >>> showExpr . relabelParams $ "x0" + "t0" * sin ("t1" + "x1") - "t0"
      "x0" + "t0" * sin ("t1" + "x1") - "t2"
  constsToParam -- Change constant values to a parameter, returning the changed
      tree and a list of parameter values.
      >>> snd . constsToParam $ "x0" * 2 + 3.14 * sin (5 * "x1")
      [2.0,3.14,5.0]
  floatConstsToParam -- Same as constsToParam but does not change constant
      values that can be converted to integer without loss of precision.
      >>> snd . floatConstsToParam $ "x0" * 2 + 3.14 * sin (5 * "x1")
      [3.14]
  paramsToConst -- Convert the parameters into constants in the tree.
      >>> showExpr . paramsToConst [1.1, 2.2, 3.3] $ "x0" + "t0" * sin ("t1" * "x0" - "t2")
      x0 + 1.1 * sin(2.2 * x0 - 3.3)

  The IsString instance allows us to create a tree using a more practical
  notation:
      >>> :t "x0" + "t0" * sin ("x1" * "t1")
      Fix SRTree

Data.SRTree.Random
  RndTree -- RndTree is a monad transformer to generate random trees of type
      `SRTree ix val` given the parameters `p ix val` using the random number
      generator StdGen.
  P (FullParams)    -- A structure with every property
  HasEverything     -- Constraint synonym for all properties.
  randomVar         -- Returns a random variable; the parameter p must provide the corresponding property.
  randomConst       -- Returns a random constant; the parameter p must have the HasConst property.
  randomPow         -- Returns a random integer power node; the parameter p must provide the corresponding property.
  randomFunction    -- Returns a random function; the parameter p must provide the corresponding property.
  randomNode        -- Returns a random node; the parameter p must have every property.
  randomNonTerminal -- Returns a random non-terminal node; the parameter p must have every property.
  randomTree -- Returns a random tree with a limited budget; the parameter p must have every property.
      >>> let treeGen = runReaderT (randomTree 12) (P [0,1] (-10, 10) (2, 3) [Log, Exp])
      >>> tree <- evalStateT treeGen (mkStdGen 52)
      >>> showExpr tree
      "(-2.7631152121655838 / Exp((x0 / ((x0 * -7.681722660704317) - Log(3.378309080134594)))))"
  randomTreeBalanced -- Returns a random tree with approximately n nodes; the parameter p must have every property.
      >>> let treeGen = runReaderT (randomTreeBalanced 10) (P [0,1] (-10, 10) (2, 3) [Log, Exp])
      >>> tree <- evalStateT treeGen (mkStdGen 42)
      >>> showExpr tree
      "Exp(Log((((7.784360517385774 * x0) - (3.6412224491658223 ^ x1)) ^ ((x0 ^ -4.09764995657091) + Log(-7.710216839988497)))))"

Data.SRTree.Print
  removeProtection -- converts a tree with protected operators to a conventional math tree
  showExpr -- convert a tree into a string in math notation.
      >>> showExpr $ "x0" + sin ( tanh ("t0" + 2) )
      "(x0 + Sin(Tanh((t0 + 2.0))))"
  showExprWithVars -- convert a tree into a string in math notation given named vars.
      >>> showExprWithVars ["mu", "eps"] $ "x0" + sin ( "x1" * tanh ("t0" + 2) )
      "(mu + Sin(Tanh(eps * (t0 + 2.0))))"
  printExpr         -- prints the expression
  printExprWithVars -- prints the expression with named vars
  showPython -- Displays a tree as a numpy-compatible expression.
      >>> showPython $ "x0" + sin ( tanh ("t0" + 2) )
      "(x[:, 0] + np.sin(np.tanh((t[:, 0] + 2.0))))"
  printPython -- prints the expression in numpy notation
  showLatex -- Displays a tree as a LaTeX-compatible expression.
      >>> showLatex $ "x0" + sin ( tanh ("t0" + 2) )
      "\\left(x_{, 0} + \\operatorname{sin}(\\operatorname{tanh}(\\left(\\theta_{, 0} + 2.0\\right)))\\right)"
  printLatex -- prints the expression in LaTeX notation
  showTikz  -- Displays a tree in TikZ format
  printTikz -- prints the tree in TikZ format

Data.SRTree.Eval
  SRMatrix -- Matrix of feature values
  PVector  -- Vector of parameter values. Needs to be strict to be readily accessible.
  SRVector -- Vector of target values
  evalTree -- Evaluates the tree given a vector of variable values, a vector of
      parameter values, and a function that takes a Double and changes it to
      whatever type the variables have. This is useful when working with
      datasets of many values per variable.
  inverseFunc -- Returns the inverse of a function. This is a partial function.
  evalInverse -- evaluates the inverse of a function
  invright    -- evaluates the right inverse of an operator
  invleft     -- evaluates the left inverse of an operator
  invertibles -- List of invertible functions

Data.SRTree.Datasets
  loadMtx   -- Loads a list of lists of bytestrings into a matrix of doubles.
  isGZip    -- Returns True if the extension is .gz.
  detectSep -- Detects the separator automatically by checking whether the use
      of each separator generates the same number of columns in every row and
      at least two columns.
      >>> detectSep ["x1,x2,x3,x4"]
      ','
  readFileToLines -- Reads a file and returns a list of lists of ByteString
      corresponding to each element of the matrix. The first row can be a header.
  splitFileNameParams -- Splits the parameters from the filename. The expected
      format is *filename.ext:p1:p2:p3:p4*, where p1 and p2 are the starting
      and end rows for the training data (by default p1 = 0 and p2 = number of
      rows - 1), p3 is the target column (either a string corresponding to the
      header or an index), and p4 is a comma-separated list of columns (either
      index or name) to be used as input variables. These will be renamed
      internally as x0, x1, ... in the order of this list.
  parseVal   -- Tries to parse a string into an int.
  getColumns -- Given a map between column names and indices, the target column
      and the variable columns, returns the indices of the variable columns and
      the target.
  (helper)   -- Given the start and end rows, returns the hmatrix extractors for
      the training and validation data.
  loadDataset -- Loads a dataset with a filename in the format
      filename.ext:start_row:end_row:target:features. It returns X_train,
      y_train, X_test, y_test, varnames, and the target name, where varnames is
      a comma-separated list of the variable names and the target name is the
      name of the target. *start_row:end_row* is the range of the training rows
      (default 0:nrows-1); every other row not included in this range will be
      used as validation. *target* is either the name of the column (if the
      data file has headers) or the index of the target variable. *features* is
      a comma-separated list of column names or indices to be used as input
      variables of the regression model.
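As a hedged illustration of the filename specification described above (file
name and column names are hypothetical, introduced only for this example):

  -- "measurements.csv:0:700:y:x1,x3"
  --   rows 0-700       -> training split; remaining rows -> validation
  --   target "y"       -> column named y (or an index if the file has no header)
  --   features "x1,x3" -> used as inputs, renamed internally to x0 and x1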
Algorithm.Massiv.Utils
  cubicSplineCoefficients -- Given a list of (x, y) coordinates, produces a
      list of coefficients of cubic equations, with knots at each of the
      initially provided x coordinates. Natural cubic spline interpolation is
      used. See:
      http://en.wikipedia.org/wiki/Spline_interpolation#Interpolation_using_natural_cubic_spline

Algorithm.EqSat.Egraph
  Note: this module assumes up to 999 variables and params.
  emptyGraph   -- returns an empty e-graph
  emptyDB      -- returns an empty e-graph DB
  createEClass -- Creates a new e-class from an e-class id, a new e-node, and the info of this e-class.
  canonical    -- gets the canonical id of an e-class
  canonize     -- canonizes the e-node children
  getEClass    -- gets an e-class with id c
  trie         -- Creates a singleton trie from an e-class id.
  isConst      -- Checks whether an e-class is a constant value.

Algorithm.EqSat.Queries
  findRootClasses  -- returns all the root e-classes (e-classes without parents)
  getTopECLassThat -- returns the e-class id with the best fitness that satisfies a predicate

Algorithm.EqSat.Info
  joinData -- joins data from two e-classes (TODO: instead of folding, just do
      not apply rules; list of values instead of single value)
  makeAnalysis     -- Calculates e-node data (constant values and cost).
  calculateHeights -- updates the heights of each e-class; won't work if there's no root
  calculateCost    -- calculates the cost of a node
  calculateConsts  -- checks whether an e-node evaluates to a constant

Algorithm.EqSat.DB
  match          -- Returns the substitution rules for every match of the pattern inside the e-graph.
  compileToQuery -- Returns a Query (list of atoms) of a pattern.
  getElems       -- returns the list of the children values
  genericJoin    -- Creates the substitution map for the pattern variables for each one of the matched subgraphs.
  domainX        -- returns the e-class id for a certain variable that matches the pattern described by the atoms
  (helper)       -- returns all e-class ids that can match this sequence of atoms
  intersectAtoms -- searches for the intersection of e-class ids that match each
      part of the query; returns Nothing if the intersection is empty. var is
      the current variable being investigated, xs is the map of ids being
      investigated and their corresponding e-class id, trie is the current trie
      of the pattern, and (i:ids) is the sequence of root : children of the
      atom to investigate. NOTE: it must be Maybe Set to differentiate between
      an empty set and no answer.
  updateVar   -- updates all occurrences of var with the new id x
  isDiffFrom  -- checks whether two ClassOrVar are different; only checks if it is a pattern variable, else returns True
  elemOfAtom  -- checks if v is an element of an atom
  orderedVars -- sorts the variables in a query by the most frequently occurring
Data.SRTree.Derivative
  deriveBy -- Creates the symbolic partial derivative of a tree by variable dx
      (if p is False) or parameter dx (if p is True). This uses mutual
      recursion where the first recursion (alg1) holds the derivative w.r.t.
      the current node and the second (alg2) holds the original tree.
      >>> showExpr . deriveBy False 0 $ 2 * "x0" * "x1"
      "(2.0 * x1)"
      >>> showExpr . deriveBy True 1 $ 2 * "x0" * "t0" - sqrt ("t1" * "x0")
      "(-1.0 * ((1.0 / (2.0 * Sqrt((t1 * x0)))) * x0))"
  derivative -- Derivative of each supported function. For a function h(f) it returns the derivative dh/df.
      >>> derivative Log 2.0
      0.5
  doubleDerivative -- Second-order derivative of supported functions.
      >>> doubleDerivative Log 2.0
      -0.25
  deriveByVar   -- Symbolic derivative by a variable
  deriveByParam -- Symbolic derivative by a parameter

Algorithm.SRTree.AD
  (helper)    -- gets the value at a certain index if it is an array (Left) or returns the value itself if it is a scalar
  forwardMode -- Calculates the result of the error vector multiplied by the
      Jacobian of an expression using forward mode, provided a vector of
      variable values xss, a vector of parameter values theta, and a function
      that changes a Double value to the type of the variable values. Uses
      unsafe operations to use a mutable array instead of a tape.
  forwardModeUnique -- Calculates the numerical gradient of the tree and
      evaluates the tree at the same time. It assumes that each parameter has a
      unique occurrence in the expression. This should be significantly faster
      than forwardMode.
  reverseModeUnique -- Same as above, but using reverse mode, which is even faster.
  forwardModeUniqueJac -- As forwardModeUnique: calculates the numerical
      gradient of the tree and evaluates the tree at the same time, assuming
      each parameter has a unique occurrence in the expression.

Algorithm.SRTree.Likelihoods
  Distribution -- Supported distributions for the negative log-likelihood (Gaussian, Bernoulli, Poisson).
  sse      -- Sum-of-square errors or sum-of-square residuals
  (also)   -- Total sum-of-squares
  mse      -- Mean squared error
  rmse     -- Root of the mean squared error
  r2       -- Coefficient of determination
  logistic -- logistic function
  getSErr  -- gets the standard error from a Maybe Double; if it is Nothing,
      estimates it from the ssr, otherwise uses the given value. For
      distributions other than Gaussian, it defaults to a constant 1.
  nll        -- Negative log-likelihood
  predict    -- Prediction for the different distributions
  gradNLL    -- Gradient of the negative log-likelihood
  gradNLLNonUnique -- Gradient of the negative log-likelihood (parameters may repeat)
  fisherNLL  -- Fisher information of the negative log-likelihood
  hessianNLL -- Hessian of the negative log-likelihood. Note: though the Fisher
      information is just the diagonal of the return of this function, it is
      better to keep them as different functions for efficiency.
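For reference, the standard Gaussian negative log-likelihood that nll with the
Gaussian distribution presumably follows (the handling of constants and of the
standard error, see getSErr, may differ in this implementation):

  \mathrm{NLL}(\theta) \;=\; \frac{n}{2}\,\ln\!\left(2\pi\sigma^2\right)
    \;+\; \frac{1}{2\sigma^2}\sum_{i=1}^{n}\bigl(y_i - \hat{y}_i(\theta)\bigr)^2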
Algorithm.SRTree.Opt
  minimizeNLL          -- minimizes the negative log-likelihood of the expression
  minimizeNLLNonUnique -- minimizes the likelihood assuming repeated parameters in the expression
  minimizeNLLWithFixedParam -- minimizes the function while keeping the parameter ix fixed (used to calculate the profile)
  minimizeGaussian     -- minimizes using the Gaussian likelihood
  minimizeBinomial     -- minimizes using the Binomial likelihood
  minimizePoisson      -- minimizes using the Poisson likelihood

Algorithm.SRTree.ModelSelection
  bic      -- Bayesian information criterion
  aic      -- Akaike information criterion
  evidence -- Evidence
  mdl      -- MDL as described in Bartlett, Deaglan J., Harry Desmond, and
      Pedro G. Ferreira. "Exhaustive symbolic regression." IEEE Transactions on
      Evolutionary Computation (2023).
  mdlLatt  -- MDL lattice as described in Bartlett, Deaglan, Harry Desmond, and
      Pedro Ferreira. "Priors for symbolic regression." Proceedings of the
      Companion Conference on Genetic and Evolutionary Computation, 2023.
  mdlFreq  -- Same as mdl but weighting the functional structure by frequency,
      calculated using wiki information on physics and engineering functions.

Algorithm.SRTree.ConfidenceIntervals
  CI -- a confidence interval is composed of the point estimate, the lower
      bound (_lower_) and the upper bound (_upper_)
  BasicStats -- Basic stats of the data: covariance of the parameters, correlation, standard errors
  CIType     -- Confidence interval using Laplace approximation or profile likelihood.
  PType      -- profile likelihood algorithms: Bates (classical), ODE (faster), Constrained (fastest).
      The Constrained approach returns only the endpoints.
  paramCI      -- Calculates the confidence interval of the parameters using Laplace approximation or profile likelihood.
  predictionCI -- Calculates the prediction confidence interval using Laplace approximation or profile likelihood.

Algorithm.EqSat.Build
  add     -- adds a new or existing e-node (merging if necessary)
  rebuild -- rebuilds the e-graph after inserting or merging e-classes
  repair  -- repairs an e-node by canonizing its children; if the canonized e-node already exists in the e-graph, merges the e-classes
  repairAnalysis -- repairs the analysis of the e-class considering the newly added e-node
  merge          -- merges two equivalent e-classes
  modifyEClass   -- modifies an e-class, e.g., adds a constant e-node and prunes non-leaves
  createDB -- creates a database of patterns from an e-graph; it simply calls
      addToDB for every pair (e-node, e-class id) from the e-graph
  addToDB  -- adds an e-node and e-class id to the database
  populate -- Populates an IntTrie with a sequence of e-class ids.
  (helper) -- gets the e-node of the target of the rule (TODO: add consts and modify)
  (helper) -- adds the target of the rule into the e-graph
  isValidConditions -- returns True if the condition of a rule is valid for that match
  fromTree  -- Creates an e-graph from an expression tree.
  fromTrees -- Builds an e-graph from multiple independent trees.
  getBest   -- gets the best expression given the default cost function
  getExpressionFrom     -- returns one expression rooted at e-class eId (TODO: avoid looping)
  getAllExpressionsFrom -- returns all expressions rooted at e-class eId (TODO: check for infinite list)
  getRndExpressionFrom  -- returns a random expression rooted at e-class eId

Algorithm.EqSat
  Scheduler -- The Scheduler stores a map with the banned iterations of a
      certain rule. TODO: make it more customizable.
  eqSat -- runs equality saturation from an expression tree, a given set of
      rules, and a cost function. Returns the tree with the smallest cost.
  recalculateBest -- recalculates the costs with a new cost function
  runEqSat        -- runs equality saturation for a number of iterations
  applySingleMergeOnlyEqSat -- applies a single step of merge-only equality saturation
  matchWithScheduler        -- matches the rules given a scheduler

Algorithm.EqSat.Simplify
  myCost -- default cost function for simplification (TODO: num_params; length;
      terminal < nonterminal; symbol comparison (constants, parameters,
      variables x0, x10, x2); op priorities (+, -, *, inv_div, pow, abs, exp,
      log, log10, sqrt); univariates)
  simplifyEqSatDefault -- simplifies using the default parameters
  simplifyEqSat        -- simplifies with custom parameters
  applyMergeOnlyDftl   -- applies a single step of merge-only using the default rules
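A minimal sketch of the default simplifier, assuming (not confirmed by this
extract) that simplifyEqSatDefault maps a Fix SRTree to a simplified
Fix SRTree; the output is intentionally not shown:

  >>> import Algorithm.EqSat.Simplify ( simplifyEqSatDefault )
  >>> import Data.SRTree.Print ( showExpr )
  >>> showExpr (simplifyEqSatDefault ("x0" + "t0" - "t0"))  -- expected to fold the redundant terms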
Text.ParseSR
  Output    -- Supported outputs (PYTHON, MATH, TIKZ, LATEX).
  SRAlgs    -- Supported algorithms.
  ParseTree -- Parser of a symbolic regression tree with Int variable indices
      and numerical values represented as Double. The numerical value type can
      be changed with fmap.
  showOutput -- Returns the corresponding function from Data.SRTree.Print for a given Output.
  parseSR    -- Calls the corresponding parser for a given SRAlgs.
      >>> fmap (showOutput MATH) $ parseSR OPERON "lambda,theta" False "lambda ^ 2 - sin(theta*3*lambda)"
      Right "((x0 ^ 2.0) - Sin(((x1 * 3.0) * x0)))"
  binary -- Creates a parser for a binary operator.
  prefix -- Creates a parser for a unary function.
  parens -- Wraps the parser in parentheses.
  parseExpr -- Parses an expression using a user-defined parser given the lists
      containing the names of the functions and operators of that SR algorithm,
      a list of parsers (binFuns) for binary functions, a parser for variables,
      a boolean indicating whether to change floating point values to free
      parameters, and a list of variable names with their corresponding
      indices.
  intOrDouble -- Tries to parse as an Int; if it fails, parses as a Double.
  aq          -- analytic quotient
  binFun      -- Parser for binary functions
  parseTIR    -- parser for Transformation-Interaction-Rational
  parseOperon -- parser for Operon
  parseHL     -- parser for HeuristicLab
  parseBingo  -- parser for Bingo
  parseGOMEA  -- parser for GOMEA
  parsePySR   -- parser for PySR

Text.ParseSR.IO
  withInput -- Given a filename, the symbolic regression algorithm, a string of
      variable names, and two booleans indicating whether to convert float
      values to parameters and whether to simplify the expression or not, it
      reads the file and parses everything, returning a list of either an error
      message or a tree. An empty filename defaults to stdin.
  withOutput -- Outputs a list of either errors or trees to a file using the
      Output format. An empty filename defaults to stdout.
  withOutputDebug -- debug version of the output function to check the invalid parsers
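A hedged sketch of the IO helpers above; the argument order is read off the
prose descriptions and the file name is hypothetical, so treat this as an
assumption rather than the confirmed API:

  >>> -- read Operon-formatted expressions from "exprs.txt" with variables x and y,
  >>> -- converting float literals to parameters and without simplification:
  >>> trees <- withInput "exprs.txt" OPERON "x,y" True False
  >>> withOutput "" MATH trees   -- empty filename: print in math notation to stdout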