About the Book
Please note that the content of this book primarily consists of articles available from Wikipedia or other free sources online. Pages: 152. Chapters: Genetic algorithm, Dynamic programming, Gene expression programming, CMA-ES, Expectation-maximization algorithm, Newton's method, Swarm intelligence, Simplex algorithm, Particle swarm optimization, Simulated annealing, Criss-cross algorithm, Harmony search, Firefly algorithm, Minimax, Imperialist competitive algorithm, Cuckoo search, Bees algorithm, Gradient descent, Divide and conquer algorithm, Levenberg-Marquardt algorithm, Gauss-Newton algorithm, Alpha-beta pruning, Parallel metaheuristic, Coffman-Graham algorithm, Matrix chain multiplication, Bin packing problem, Tabu search, Derivation of the conjugate gradient method, Differential evolution, Evolutionary algorithm, Job shop scheduling, Limited-memory BFGS, Nelder-Mead method, Quasi-Newton method, Greedy algorithm, Extremal optimization, Hill climbing, Karmarkar's algorithm, Quantum annealing, BFGS method, Guided Local Search, Reactive search optimization, Golden section search, IOSO, Luus-Jaakola, Automatic label placement, Cutting-plane method, Newton's method in optimization, Augmented Lagrangian method, Nonlinear programming, Interior point method, Natural evolution strategy, Artificial bee colony algorithm, Meta-optimization, Cross-entropy method, Local search (optimization), Simultaneous perturbation stochastic approximation, Auction algorithm, Graduated optimization, Special ordered set, Multi-swarm optimization, Kantorovich theorem, Nonlinear conjugate gradient method, Branch and bound, Pattern search (optimization), Random optimization, Frank-Wolfe algorithm, Adaptive coordinate descent, Bat algorithm, Crew scheduling, Search-based software engineering, Trust region, Fourier-Motzkin elimination, Random search, Bland's rule, Maximum subarray problem, Sequential minimal optimization, Symmetric rank-one, MM algorithm, Negamax, Dykstra's projection algorithm, Sequential quadratic programming, Line search, Genetic algorithms in economics, Eagle strategy, Glowworm swarm optimization, Tree rearrangement, Branch and cut, Davidon-Fletcher-Powell formula, Active set, Powell's method, Delayed column generation, Very large-scale neighborhood search, IPOPT, Killer heuristic, Penalty method, Mehrotra predictor-corrector method, BHHH algorithm, Evolutionary programming, Successive parabolic interpolation, Big M method, Great Deluge algorithm, Iterated local search, Destination dispatch, Lemke's algorithm, Rosenbrock methods, Ordered subset expectation maximization, Sequence-dependent setup, Benson's algorithm, MCS algorithm, Branch and price, Fernandez's method, Zionts-Wallenius method, Local convergence, Biologically inspired algorithms, Stochastic hill climbing, Space allocation problem.

Excerpt: In the computer science field of artificial intelligence, a genetic algorithm (GA) is a search heuristic that mimics the process of natural evolution. This heuristic (also sometimes called a metaheuristic) is routinely used to generate useful solutions to optimization and search problems. Genetic algorithms belong to the larger class of evolutionary algorithms (EA), which generate solutions to optimization problems using techniques inspired by natural evolution, such as inheritance, mutation, selection, and crossover. Genetic algorithms find application in bioinformatics, phylogenetics, computational science, engineering, economics, chemistry, manufacturing, mathematics, physics, pharmacometrics, and other fields. In a genetic algorithm, a population of candidate solutions (called individuals, creatures, or phenotypes) to an optimization problem is evolved toward better solutions. Each candidate solution has a set of properties (its chromosomes or genotype) which can be mutated and altered; traditionally, solutions are represented in binary as strings of 0s and 1s, but other encodings are also...
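The excerpt describes the basic genetic-algorithm loop: a population of binary-string chromosomes evolved through selection, crossover, and mutation. The following is a minimal illustrative sketch, not taken from the book itself; the OneMax fitness function (counting 1-bits), tournament selection, single-point crossover, and all parameter values are assumptions chosen for clarity.

```python
import random

random.seed(0)

GENOME_LEN = 20       # bits per chromosome (assumed problem size)
POP_SIZE = 30
GENERATIONS = 60
MUTATION_RATE = 0.02  # per-bit flip probability


def fitness(genome):
    """OneMax: number of 1-bits; the optimum is the all-ones string."""
    return sum(genome)


def tournament(pop, k=3):
    """Selection: return the fittest of k randomly sampled individuals."""
    return max(random.sample(pop, k), key=fitness)


def crossover(a, b):
    """Single-point crossover of two parent genomes."""
    point = random.randrange(1, GENOME_LEN)
    return a[:point] + b[point:]


def mutate(genome):
    """Flip each bit independently with probability MUTATION_RATE."""
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in genome]


def evolve():
    # Initial population: random binary strings of 0s and 1s.
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Each child is produced by selection, crossover, then mutation.
        pop = [mutate(crossover(tournament(pop), tournament(pop)))
               for _ in range(POP_SIZE)]
    return max(pop, key=fitness)


best = evolve()
print(fitness(best))
```

In practice the binary encoding, selection scheme, and operator rates are all problem-dependent design choices, which is why so many of the chapters above (differential evolution, evolutionary programming, CMA-ES) are variations on this same evolve-and-select template.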