!"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~ Safe-Inferred Average $Population variance (divided by n). ICompute empirical qunatiles (using R type 7 continuous sample quantile). $Estimate continuous quantile (like R''s default type 7, SciPy (1,1), Excel). Median Interquartile range.  samples "probabilities in the range (0, 1) estimated quantiles' values n the number of samples xs samples prob numeric probability (0, 1) estimated quantile value      Safe-InferredNone,Yield a new randomly selected value of type a in the range (lo, hi).  See  for details. ,Yield a new randomly selected value of type a.  See  for details. 9Yield two randomly selected values which follow standard  normal distribution. EYield one randomly selected value from standard normal distribution. >Take at most n random elements from the list. Preserve order. Randomly reorder the list. AGiven a sequence (e1,...en) to shuffle, its length, and a random ? generator, compute the corresponding permutation of the input ; sequence, return the permutation and the new state of the  random generator. Modify value with probability p.. Return the unchanged value with probability 1-p.    NoneDA data type to distinguish the last and intermediate steps results. (On life cycle of the genetic algorithm:    [ start ]  |  v L (genomes) --> [calculate objective] --> (evaluated genomes) --> [ stop ] 4 ^ ^ | 4 | | | 4 | `-----------. | 4 | \ v 8 [ mutate ] (elite) <-------------- [ select ] 4 ^ | 4 | | 4 | | 4 | v ? (genomes) <----- [ crossover ] <-------- (evaluted genomes)  %PopulationState can represent either genomes or evaluated genomed. 0Iterations stop when the condition evaluates as True. stop when both conditions hold /stop when at least one of two conditions holds  terminate when evolution stalls :max number of generations for an indicator to be the same !stall indicator function "a counter (initially Nothing) #'stop when objective values satisfy the  predicate $ stop after n generations %1A single step of the genetic algorithm. See also nextGeneration. &FA mutation operator takes a genome and returns an altered copy of it. ' A crossover operator takes some parent genomes and returns some  children2 along with the remaining parents. Many crossover D operators use only two parents, but some require three (like UNDX) ? or more. Crossover operator should consume as many parents as 7 necessary and stop when the list of parents is empty. (AA selection operator selects a subset (probably with repetition) 9 of genomes for reproduction via crossover and mutation. )9A function to evaluate a genome should be an instance of  )- class. It may be called a cost function for ? minimization problems, or a fitness function for maximization  problems. 'Some genetic algorithm operators, like rouletteSelect , require  the 2 to be non-negative. +?A type of optimization problem: whether the objective function $ has to be miminized, or maximized. ./6 extracts a raw genome from any type which embeds it. 0!An entire population of observed 1s. 1&A genome associated with its observed 2 value. 2=A measure of the observed performance. It may be called cost B for minimization problems, or fitness for maximization problems. 34A genetic representation of an individual solution. 5Don' t crossover. 6Don' t mutate. @Evaluate fitness (cost) of all genomes, possibly changing their  order. ,Evaluate all fitness (cost) values at once. 2Evaluate fitness (cost) values genome per genome. 
$ !"#$%stop condition %population of the current generation "population of the next generation &'()*+,-./0123456 !"#$%&'()*+,-./01234563210./4+-,)*('&65%$# !"$# !"%&'()*+-,./0123456 None Generate n( random genomes made of elements in the  hyperrectangle ranges [(from_i,to_i)]. Return a list of genomes - and a new state of random number generator. 7 Generate n/ uniform random genomes with individual genome  elements bounded by ranges%. This corresponds to random uniform D sampling of points (genomes) from a hyperrectangle with a bounding  box ranges. 8;Crossover all available parents. Parents are not repeated. 9Produce exactly n& offsprings by repeatedly running the  crossover A operator between randomly selected parents (possibly repeated). random number generator !n, number of genomes to generate &ranges for individual genome elements 7n, how many genomes to generate &ranges for individual genome elements random genomes 89n#, number of offsprings to generate parents' genomes  crossover operator 789789 NoneCrossover two lists in exactly n random points. :GSelect a random point in two genomes, and swap them beyond this point.  Apply with probability p. ;ISelect two random points in two genomes, and swap everything in between.  Apply with probability p. <5Swap individual bits of two genomes with probability p. :;<589:;<:;< None=6An individual with all objective functions evaluated. AACalculate multiple objective per every genome in the population. =>?@A a list of problems a population of genomes =>?@A=>?@A NoneNone5A popular niching method proposed by D. Goldberg and J J. Richardson in 1987. The shared fitness of the individual is inversely " protoptional to its niche count. ? The method expects the objective function to be non-negative. 9An extension for minimization problems is implemented by @ making the fitnes proportional to its niche count (rather than  inversely proportional). 4Reference: Chen, J. H., Goldberg, D. E., Ho, S. Y., & Sastry, 8 K. (2002, July). Fitness inheritance in multiobjective > optimization. In Proceedings of the Genetic and Evolutionary B Computation Conference (pp. 319-326). Morgan Kaufmann Publishers  Inc.. distance function  niche radius niche compression exponent alpha (usually 1) !type of the optimization problem None BGApply given scaling or other transform to population before selection. C5Transform objective function values before seletion. D?Replace objective function values in the population with their " ranks. For a population of size n, a genome with the best value  of objective function has rank n' <= n, and a genome with the - worst value of objective function gets rank 1. D8 may be useful to avoid domination of few super-genomes  in F or to apply F when an objective ' function is not necessarily positive. E5A popular niching method proposed by D. Goldberg and J J. Richardson in 1987. The shared fitness of the individual is inversely " protoptional to its niche count. ? The method expects the objective function to be non-negative. 9An extension for minimization problems is implemented by @ making the fitnes proportional to its niche count (rather than  inversely proportional). 4Reference: Chen, J. H., Goldberg, D. E., Ho, S. Y., & Sastry, 8 K. (2002, July). Fitness inheritance in multiobjective > optimization. In Proceedings of the Genetic and Evolutionary B Computation Conference (pp. 319-326). Morgan Kaufmann Publishers  Inc.. 
Moo.GeneticAlgorithm.Selection

  rouletteSelect: Objective-proportionate (roulette wheel) selection: select n random
    items, with each item's chance of being selected proportional to its objective
    function (fitness). The objective function should be non-negative.
  tournamentSelect: Performs tournament selection among size individuals and returns
    the winner. Repeat n times.
  stochasticUniversalSampling: Stochastic universal sampling (SUS) is a selection
    technique similar to roulette wheel selection. It gives weaker members a fair
    chance to be selected, which is proportional to their fitness. The objective
    function should be non-negative.
  sortByFitnessDesc: Sort the population by decreasing objective function (also known
    as fitness for maximization problems). The genomes with the highest fitness are put
    at the head of the list.
  sortByCostAsc: Sort the population by increasing objective function (also known as
    cost for minimization problems). The genomes with the smallest cost are put at the
    head of the list.
  bestFirst: Reorders a list of individual solutions, putting the best at the head of
    the list.

  Arguments (withFitnessSharing): distance function; niche radius; niche compression
  exponent alpha (usually 1); type of the optimization problem.
  Arguments (tournamentSelect): type of the optimization problem; size of the
  tournament group; how many tournaments to run.
  Arguments (stochasticUniversalSampling): how many genomes to select.

Moo.GeneticAlgorithm.Run

  IOHook: Input-output actions, interactive and time-dependent stop conditions.
    TimeLimit: terminate iteration after t seconds.
    StopWhen: custom or interactive stop condition.
    DoEvery: action to run every n-th iteration, starting from 0; initially (at
      iteration 0) the objective value is zero.
  LogHook / WriteEvery: logging to run every n-th iteration starting from 0 (the first
    parameter). The logging function takes the current generation count and population.
  runGA: Helper function to run the entire algorithm in the Rand monad. It takes care
    of generating a new random number generator.
  runIO: Helper function to run the entire algorithm in the IO monad.
  nextGeneration: Construct a single step of the genetic algorithm. See
    Moo.GeneticAlgorithm.Binary and Moo.GeneticAlgorithm.Continuous for the building
    blocks of the algorithm.
  nextSteadyState: Construct a single step of the incremental (steady-state) genetic
    algorithm. Exactly n worst solutions are replaced with newly born children. See
    Moo.GeneticAlgorithm.Binary and Moo.GeneticAlgorithm.Continuous for the building
    blocks of the algorithm.
  makeStoppable: Wrap a population transformation with pre- and post-conditions to
    indicate the end of the simulation. Use this function to define custom replacement
    strategies in addition to nextGeneration and nextSteadyState.
  withElite: Select the n best genomes, then select more genomes from the entire
    population (elite genomes inclusive). Elite genomes will be the first in the list.
  loop: Run strict iterations of the genetic algorithm defined by step. Return the
    result of the last step.
  loopWithLog: GA iteration interleaved with same-monad logging hooks.
  loopIO: GA iteration interleaved with IO (for logging or saving the intermediate
    results); it takes and returns the updated random number generator explicitly.
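  A minimal standalone sketch of the objective-proportionate (roulette wheel) selection
  described above, built on cumulative fitness sums and plain IO randomness rather than
  the library's Rand monad; the function name is illustrative:

      import System.Random (randomRIO)

      -- Each phenotype's chance of being picked is proportional to its
      -- (non-negative) objective value.
      rouletteSketch :: Int -> [(genome, Double)] -> IO [(genome, Double)]
      rouletteSketch n pop = mapM (const spin) [1 .. n]
        where
          total      = sum (map snd pop)
          cumulative = scanl1 (+) (map snd pop)       -- running fitness totals
          spin = do
              r <- randomRIO (0, total)               -- a point on the wheel
              let picked = length (takeWhile (< r) cumulative)
              return (pop !! min picked (length pop - 1))

      main :: IO ()
      main = do
          let pop = [("weak", 1.0), ("average", 3.0), ("strong", 6.0)]
          selected <- rouletteSketch 5 pop
          mapM_ print selected    -- "strong" should appear most often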
JKLMNOPQRS&function to create initial population genetic algorithm, see also X and Y final population T&function to create initial population genetic algorithm, see also Z final population Ua type of the optimization problem objective function selection operator elite', the number of genomes to keep intact crossover operator mutation operator Vn', number of worst solutions to replace a type of the optimization problem objective function selection operator crossover operator mutation operator WXtermination condition cond step) function to produce the next generation initial population final population Yperiodic logging action termination condition cond step) function to produce the next generation initial population final population ZAinput-output actions, special and time-dependent stop conditions termination condition cond step) function to produce the next generation )reference to the random number generator initial population pop0 final population  !"#$JKLMNOPQRSTUVWXYZSTUVWXYZ$# !"QRJNMKOPL JNMKOPLQRSTUVWXYZNone[AHow many bits are needed to represent a range of integer numbers   (from, to) (inclusive). \&Encode an integer number in the range  (from, to) (inclusive) as B binary sequence of minimal length. Use of Gray code means that a B single point mutation leads to incremental change of the encoded  value. ]>Decode a binary sequence using Gray code to an integer in the  range  (from, to)$ (inclusive). This is an inverse of \. + Actual value returned may be greater than to. ^&Encode an integer number in the range  (from, to) (inclusive) @ as a binary sequence of minimal length. Use of binary encoding B means that a single point mutation may lead to sudden big change  of the encoded value. _4Decode a binary sequence to an integer in the range  (from, to) $ (inclusive). This is an inverse of ^. Actual value  returned may be greater than to. `"Encode a real number in the range  (from, to) (inclusive)  with n5 equally spaced discrete values in binary Gray code. a@Decode a binary sequence using Gray code to a real value in the  range  (from, to)#, assuming it was discretized with n equally  spaced values (see `). Represent a range  (from, to) of real numbers with n equally 4 spaced values. Use it to discretize a real number val.  Take a range  (from, to) of real numbers with n equally spaced values.  Convert i2-th value to a real number. This is an inverse of . b!Split a list into pieces of size n. This may be useful to split = the genome into distinct equally sized genes  which encode & distinct properties of the solution. c Generate n! random binary genomes of length len.  Return a list of genomes. dCFlips a random bit along the length of the genome with probability p.  With probability (1 - p) the genome remains unaffected. eFlip 1s and 09s with different probabilities. This may help to control  the relative frequencies of 1s and 0s in the genome. fFlip m4 bits on average, keeping the relative frequency of 0s  and 1s in the genome constant. gHamming distance between x and y is the number of coordinates  for which x_i and y_i are different. BReference: Hamming, Richard W. (1950), Error detecting and error C correcting codes , Bell System Technical Journal 29 (2): 147 160,  MR 0035935. 
  Arguments (encodeGrayReal / toDiscreteR): (from, to), the range to be encoded; n, how
  many discrete numbers from the range to consider; a real number in the range
  (from, to) to discretize; a discrete value (normally in the range (0, n-1)).
  Arguments (decodeGrayReal / fromDiscreteR): (from, to), the encoded range; n, how
  many discrete numbers from the range to consider; a discrete value in the range
  (0, n-1); a real number from the range.
  Arguments (getRandomBinaryGenomes): how many genomes to generate; genome length.
  Arguments (asymmetricMutate): probability of a False bit to become True; probability
  of a True bit to become False.
  Arguments (constFrequencyMutate): average number of bits to change.

Moo.GeneticAlgorithm.Continuous

  distance1: 1-norm distance: sum |x_i - y_i|.
  distance2: 2-norm distance: (sum (x_i - y_i)^2)^(1/2).
  distanceInf: Infinity-norm distance: max |x_i - y_i|.
  blendCrossover: Blend crossover (BLX-alpha) for continuous genetic algorithms. For
    each component let x and y be its values in the first and the second parent
    respectively. Choose the corresponding component values of the children
    independently from the uniform distribution in the range (L, U), where
    L = min (x,y) - alpha * d, U = max (x,y) + alpha * d, and d = abs (x - y). alpha is
    usually 0.5. Takahashi in [10.1109/CEC.2001.934452] suggests 0.366.
  unimodalCrossover: Unimodal normally distributed crossover (UNDX) for continuous
    genetic algorithms. Recommended parameters according to [ISBN 978-3-540-43330-9]
    are sigma_xi = 0.5 and sigma_eta = 0.35/sqrt(n), where n is the number of variables
    (dimensionality of the search space). UNDX mixes three parents, producing normally
    distributed children along the line between the first two parents, and using the
    third parent to build a supplementary orthogonal correction component. UNDX
    preserves the mean of the offspring, and also the covariance matrix of the
    offspring if sigma_xi^2 = 0.25. By preserving the distribution of the offspring,
    "the UNDX can efficiently search along the valleys where parents are distributed in
    functions with strong epistasis among parameters" (idem).
  unimodalCrossoverRP: Run unimodalCrossover with the default recommended parameters.
  simulatedBinaryCrossover: Simulated binary crossover (SBX) operator for continuous
    genetic algorithms. SBX preserves the average of the parents and has a spread
    factor distribution similar to single-point crossover of the binary genetic
    algorithms. If n > 0, then the highest probability density is assigned to the same
    distance between children as that of the parents. The performance of a real-coded
    genetic algorithm with SBX is similar to that of a binary GA with single-point
    crossover. For details see "Simulated Binary Crossover for Continuous Search Space"
    (1995), Agrawal et al.
  gaussianMutate: For every variable in the genome, with probability p replace its
    value v with v + sigma*N(0,1), where N(0,1) is a normally distributed random
    variable with mean 0 and variance 1. With probability (1 - p)^n, where n is the
    number of variables, the genome remains unaffected.

  Arguments (blendCrossover): alpha, range expansion parameter.
  Arguments (unimodalCrossover): sigma_xi, the standard deviation of the mix between
  two principal parents; sigma_eta, the standard deviation of the single orthogonal
  component.
  Arguments (simulatedBinaryCrossover): non-negative distribution parameter n, usually
  in the range from 2 to 5; for small values of n, children far away from the parents
  are more likely to be chosen.
  Arguments (gaussianMutate): probability p; sigma.
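  A minimal standalone sketch of the per-component BLX-alpha rule described above
  (draw each child component uniformly from (L, U) with L = min x y - alpha*d and
  U = max x y + alpha*d), using plain IO randomness instead of the library's Rand
  monad; the names are illustrative:

      import System.Random (randomRIO)
      import Control.Monad (zipWithM)

      -- One component of the child, drawn from the expanded parent interval.
      blendComponent :: Double -> Double -> Double -> IO Double
      blendComponent alpha x y = randomRIO (lo, hi)
        where
          d  = abs (x - y)
          lo = min x y - alpha * d
          hi = max x y + alpha * d

      -- Apply the rule independently to every component of two parent genomes.
      blendCrossoverSketch :: Double -> [Double] -> [Double] -> IO [Double]
      blendCrossoverSketch alpha = zipWithM (blendComponent alpha)

      main :: IO ()
      main = do
          child <- blendCrossoverSketch 0.5 [0.0, 1.0, 2.0] [1.0, 1.0, 0.0]
          print child   -- each component falls within its expanded range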
Moo.GeneticAlgorithm.Constraints

  Constraint: Define constraints using the .<., .<=., .>., .>=., and .==. operators,
    with a ConstraintFunction on the left-hand side. For double inequality constraints
    use pairs of .<, .<= and <., <=. respectively, with a ConstraintFunction in the
    middle. Examples:

        function .>=. lowerBound
        lowerBound .<= function <=. upperBound

    Constructors:
      left-hand-side inequality: a boolean flag indicates if the bound is inclusive;
      double inequality: boolean flags indicate if the bounds are inclusive;
      equality constraint: the function value is equal to the constraint value;
      non-strict inequality constraint: the function value is less than or equal to the
        constraint value;
      strict inequality constraint: the function value is less than the constraint value.
  satisfiesConstraint: Returns True if a genome represents a feasible solution with
    respect to the constraint.
  isFeasible: Returns True if a genome represents a feasible solution, i.e. satisfies
    all constraints.
  getConstrainedGenomes: Generate n feasible random genomes with individual genome
    elements bounded by ranges.
  getConstrainedBinaryGenomes: Generate n feasible random binary genomes.
  numberOfViolations: A simple estimate of the degree of (in)feasibility. Count the
    number of constraint violations. Return 0 if the solution is feasible.
  degreeOfViolation: An estimate of the degree of (in)feasibility. Given that f_j is
    the excess of the j-th constraint function value, return sum |f_j|^beta. For strict
    inequality constraints, return sum (|f_j|^beta + eta). Return 0.0 if the solution
    is feasible.
  withConstraints: Modify the objective function in such a way that 1) any feasible
    solution is preferred to any infeasible solution, 2) among two feasible solutions
    the one having the better objective function value is preferred, and 3) among two
    infeasible solutions the one having the smaller constraint violation is preferred.
    Reference: Deb, K. (2000). An efficient constraint handling method for genetic
    algorithms. Computer Methods in Applied Mechanics and Engineering, 186(2), 311-338.
  withDeathPenalty: Kill all infeasible solutions after every step of the genetic
    algorithm. "Death penalty is very popular within the evolution strategies
    community, but it is limited to problems in which the feasible search space is
    convex and constitutes a reasonably large portion of the whole search space"
    (Coello 1999). Coello, C. A. C. (1999). A survey of constraint handling techniques
    used with evolutionary algorithms. Lania-RI-99-04, Laboratorio Nacional de
    Informatica Avanzada.
  withFinalDeathPenalty: Kill all infeasible solutions once, after the last step of the
    genetic algorithm. See also withDeathPenalty.

  Arguments (satisfiesConstraint): genome; constraint.
  Arguments (isFeasible): constraints; genome.
  Arguments (getConstrainedGenomes): constraints; n, how many genomes to generate;
  ranges for individual genome elements; returns random feasible genomes.
  Arguments (getConstrainedBinaryGenomes): constraints; n, how many genomes to
  generate; L, genome length; returns random feasible genomes.
  Arguments (numberOfViolations): constraints; genome; returns the number of violated
  constraints.
  Arguments (degreeOfViolation): beta, single violation exponent; eta, equality penalty
  in strict inequalities; constraints; genome; returns the total degree of violation.
  Arguments (withConstraints): constraints; non-negative degree of violation, see
  numberOfViolations and degreeOfViolation.
  Arguments (withDeathPenalty): constraints; unconstrained step; returns constrained step.
  Arguments (withFinalDeathPenalty): constraints; unconstrained step; returns
  constrained step.
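  A minimal standalone sketch of the degree-of-violation estimate described above,
  using a simplified illustrative constraint representation (not the library's
  Constraint type): each constraint is a function of the genome with a strict or
  non-strict upper bound, and violations contribute |excess|^beta, plus eta for
  strict bounds:

      -- Simplified, illustrative constraint representation.
      data SimpleConstraint a
          = AtMost   ([a] -> Double) Double   -- g(x) <= bound
          | LessThan ([a] -> Double) Double   -- g(x) <  bound

      violation :: Double -> Double -> [SimpleConstraint a] -> [a] -> Double
      violation beta eta constraints genome = sum (map excess constraints)
        where
          excess (AtMost g bound)
              | v <= bound = 0
              | otherwise  = (v - bound) ** beta
              where v = g genome
          excess (LessThan g bound)
              | v < bound  = 0
              | otherwise  = (v - bound) ** beta + eta  -- extra penalty for strict bounds
              where v = g genome

      main :: IO ()
      main = do
          let cs = [AtMost sum 10, LessThan head 5]
          print (violation 1 1 cs [2, 3, 4 :: Double])   -- 0.0, feasible
          print (violation 1 1 cs [5, 6, 7 :: Double])   -- 9.0, violates both constraints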
Moo.GeneticAlgorithm.Multiobjective.NSGA2

  RankedSolution: A solution with its non-domination rank and local crowding distance.
    Rank 0 is the best; the crowding distance is Infinity for less-crowded boundary points.
  domination: Returns True if the first solution dominates the second one in some
    sense. A solution p dominates another solution q if at least one of the objective
    values of p is better than the respective value of q, and the others are not worse.
  constrainedDomination: A solution p is said to constrain-dominate a solution q if any
    of the following is true: 1) solution p is feasible and q is not; 2) solutions p
    and q are both infeasible, but solution p has a smaller overall constraint
    violation; 3) solutions p and q are feasible, and solution p dominates solution q.
    Reference: (Deb, 2002).
  nondominatedSort: Fast non-dominated sort from (Deb et al. 2002). It should be
    O(m N^2), with storage requirements of O(N^2).
  nondominatedSortFast: A direct translation of the pseudocode from (Deb et al. 2002).
  crowdingDistances: The crowding distance of a point p, as defined by Deb et
    al. (2002), is an estimate (the sum over dimensions in their pseudocode) of the
    largest cuboid enclosing the point without including any other point in the
    population.
  crowdedCompare: Given the non-domination rank rank_i and the local crowding distance
    distance_i assigned to every individual i, the partial order between individuals i
    and j is defined by the relation i ~ j if rank_i < rank_j, or (rank_i = rank_j and
    distance_i > distance_j).
  rankAllSolutions: Assign a non-domination rank and crowding distance to every
    solution. Return a list of non-domination fronts.
  nondominatedRanking: To every genome in the population, assign a single objective
    value according to its non-domination rank. This ranking is supposed to be used
    once, in the beginning of the NSGA-II algorithm. Note: it reorders the genomes.
  nsga2Ranking: To every genome in the population, assign a single objective value
    equal to its non-domination rank, and sort the genomes by decreasing local crowding
    distance within every rank (i.e. sort the population with the NSGA-II crowded
    comparison operator).
  stepNSGA2: A single step of the NSGA-II algorithm (Non-Dominated Sorting Genetic
    Algorithm for Multi-Objective Optimization). The next population is selected from a
    common pool of parents and their children, minimizing the non-domination rank and
    maximizing the crowding distance within the same rank. The first generation of
    children is produced without taking crowding into account. Every solution is
    assigned a single objective value which is its sequence number after sorting with
    the crowded comparison operator; the smaller value corresponds to solutions which
    are not worse than the one with the bigger value. Use evalAllObjectives to restore
    the individual objective values.
    Reference: Deb, K., Pratap, A., Agarwal, S., & Meyarivan, T. A. M. T. (2002). A
    fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on
    Evolutionary Computation, 6(2), 182-197.
    Deb et al. used a binary tournament selection based on the crowded comparison
    operator. To achieve the same effect, use stepNSGA2bt (or tournamentSelect with
    Minimizing 2 n, where n is the size of the population).
  stepNSGA2bt: A single step of the NSGA-II algorithm with binary tournament selection.
    See also stepNSGA2.
  stepConstrainedNSGA2: A single step of the constrained NSGA-II algorithm, which uses
    a constraint-domination rule: a solution i is said to constrain-dominate a solution
    j if any of the following is true: 1) solution i is feasible and j is not;
    2) solutions i and j are both infeasible, but solution i has a smaller overall
    constraint violation; 3) solutions i and j are feasible, and solution i dominates
    solution j. Reference: (Deb, 2002).
  stepConstrainedNSGA2bt: A single step of the constrained NSGA-II algorithm with
    binary tournament selection. See also stepConstrainedNSGA2.
  stepNSGA2'nextGeneration: Use normal selection, crossover, and mutation to produce
    new children. Select from a common pool of parents and children the best according
    to the least non-domination rank and crowding.
  stepNSGA2'poolSelection: Take a pool of phenotypes of size 2N, ordered by the crowded
    comparison operator, and select the N best.
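  A minimal standalone sketch of the Pareto domination test described above, assuming
  (for illustration) that every objective is to be minimized; this is not the library's
  implementation:

      -- p dominates q if p is no worse in every objective
      -- and strictly better in at least one.
      dominates :: [Double] -> [Double] -> Bool
      dominates p q = and (zipWith (<=) p q) && or (zipWith (<) p q)

      main :: IO ()
      main = do
          print (dominates [1, 2] [2, 2])   -- True:  better in one, equal in the other
          print (dominates [1, 3] [2, 2])   -- False: better in one, worse in the other
          print (dominates [2, 2] [2, 2])   -- False: equal is not domination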
"problem types per every objective  constraints !non-negative degree of violation "problem types per every objective list of problems a population of raw genomes  a list of  objective functions n), number of top-ranked genomes to select a population of raw genomes 1selected genomes with their non-domination ranks  a list of  objective functions  a list of  objective functions  constraints !non-negative degree of violation  a list of  objective functions  constraints !non-negative degree of violation  a list of  objective functions  a list of  objective functions a list of objective functions  a list of  objective functions n$, the number of solutions to select pool of genomes to select from n best phenotypes None =>?@A ?>=A@portable experimentalNone !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMN O P Q R S T U V W X YZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~  moo-1.0Moo.GeneticAlgorithm.RandomMoo.GeneticAlgorithm.StatisticsMoo.GeneticAlgorithm.TypesMoo.GeneticAlgorithm.ContinuousMoo.GeneticAlgorithm.Binary#Moo.GeneticAlgorithm.MultiobjectiveMoo.GeneticAlgorithm.Run Moo.GeneticAlgorithm.ConstraintsMoo.GeneticAlgorithm.LinAlgMoo.GeneticAlgorithm.UtilitiesMoo.GeneticAlgorithm.Crossover)Moo.GeneticAlgorithm.Multiobjective.Types"Moo.GeneticAlgorithm.StopConditionMoo.GeneticAlgorithm.NichingMoo.GeneticAlgorithm.Selection)Moo.GeneticAlgorithm.Multiobjective.NSGA2Moo.GeneticAlgorithmrandom-1.0.1.1 System.RandomRandommersenne-random-pure64-0.2.0.3System.Random.Mersenne.Pure64 newPureMTPureMTmonad-mersenne-random-0.1Control.Monad.Mersenne.Random getWord64 getDoublegetInt64getWordgetIntgetBool evalRandom runRandomRandaveragevariance quantilesmedianiqr getRandomR getRandom getNormal2 getNormal randomSampleshufflewithProbability StepResult ContinueGAStopGAPopulationStateCondAndOr GensNoChange c'maxgens c'indicator c'counter IfObjective GenerationsStepGA MutationOp CrossoverOp SelectionOpObjectiveFunction evalObjective ProblemType Maximizing Minimizing GenomeState takeGenome Population Phenotype ObjectiveGenometakeObjectiveValue noCrossover noMutationgetRandomGenomes doCrossovers doNCrossoversonePointCrossovertwoPointCrossoveruniformCrossoverMultiPhenotypeMultiObjectiveProblemSingleObjectiveProblemtakeObjectiveValuesevalAllObjectiveswithPopulationTransform withScale rankScalewithFitnessSharingrouletteSelecttournamentSelectstochasticUniversalSampling bestFirstIOHook TimeLimitio'tStopWhenDoEveryio'n io'actionLogHook WriteEveryrunGArunIOnextGenerationnextSteadyState makeStoppableloop loopWithLogloopIO bitsNeeded encodeGray decodeGray encodeBinary decodeBinaryencodeGrayRealdecodeGrayReal splitEverygetRandomBinaryGenomes pointMutateasymmetricMutateconstFrequencyMutatehammingDistance distance1 distance2 distanceInfblendCrossoverunimodalCrossoverunimodalCrossoverRPsimulatedBinaryCrossovergaussianMutateLeftHandSideInequality ConstraintConstraintFunction.<..<=..>..>=..==..<=.<<.<=. 