Nature-inspired optimization algorithms

Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background, and practical implementation, complements the extensive literature with well-chosen …


Bibliographic Details
Main Author: Yang, Xin-She
Format: eBook
Language: English
Published: London: Elsevier, 2014
Edition: 1
Series: Elsevier insights
ISBN: 0124167438, 9780124167438, 0124167454, 9780124167452
Table of Contents:
  • Intro -- Half Title -- Title Page -- Copyright -- Contents -- Preface -- 1 Introduction to Algorithms -- 1.1 What is an Algorithm? -- 1.2 Newton's Method -- 1.3 Optimization -- 1.3.1 Gradient-Based Algorithms -- 1.3.2 Hill Climbing with Random Restart -- 1.4 Search for Optimality -- 1.5 No-Free-Lunch Theorems -- 1.5.1 NFL Theorems -- 1.5.2 Choice of Algorithms -- 1.6 Nature-Inspired Metaheuristics -- 1.7 A Brief History of Metaheuristics -- References -- 2 Analysis of Algorithms -- 2.1 Introduction -- 2.2 Analysis of Optimization Algorithms -- 2.2.1 Algorithm as an Iterative Process -- 2.2.2 An Ideal Algorithm? -- 2.2.3 A Self-Organization System -- 2.2.4 Exploration and Exploitation -- 2.2.5 Evolutionary Operators -- 2.3 Nature-Inspired Algorithms -- 2.3.1 Simulated Annealing -- 2.3.2 Genetic Algorithms -- 2.3.3 Differential Evolution -- 2.3.4 Ant and Bee Algorithms -- 2.3.5 Particle Swarm Optimization -- 2.3.6 The Firefly Algorithm -- 2.3.7 Cuckoo Search -- 2.3.8 The Bat Algorithm -- 2.3.9 Harmony Search -- 2.3.10 The Flower Algorithm -- 2.3.11 Other Algorithms -- 2.4 Parameter Tuning and Parameter Control -- 2.4.1 Parameter Tuning -- 2.4.2 Hyperoptimization -- 2.4.3 Multiobjective View -- 2.4.4 Parameter Control -- 2.5 Discussions -- 2.6 Summary -- References -- 3 Random Walks and Optimization -- 3.1 Random Variables -- 3.2 Isotropic Random Walks -- 3.3 Lévy Distribution and Lévy Flights -- 3.4 Optimization as Markov Chains -- 3.4.1 Markov Chain -- 3.4.2 Optimization as a Markov Chain -- 3.5 Step Sizes and Search Efficiency -- 3.5.1 Step Sizes, Stopping Criteria, and Efficiency -- 3.5.2 Why Lévy Flights are More Efficient -- 3.6 Modality and Intermittent Search Strategy -- 3.7 Importance of Randomization -- 3.7.1 Ways to Carry Out Random Walks -- 3.7.2 Importance of Initialization -- 3.7.3 Importance Sampling
  • 3.7.4 Low-Discrepancy Sequences -- 3.8 Eagle Strategy -- 3.8.1 Basic Ideas of Eagle Strategy -- 3.8.2 Why Eagle Strategy is So Efficient -- References -- 4 Simulated Annealing -- 4.1 Annealing and Boltzmann Distribution -- 4.2 Parameters -- 4.3 SA Algorithm -- 4.4 Unconstrained Optimization -- 4.5 Basic Convergence Properties -- 4.6 SA Behavior in Practice -- 4.7 Stochastic Tunneling -- References -- 5 Genetic Algorithms -- 5.1 Introduction -- 5.2 Genetic Algorithms -- 5.3 Role of Genetic Operators -- 5.4 Choice of Parameters -- 5.5 GA Variants -- 5.6 Schema Theorem -- 5.7 Convergence Analysis -- References -- 6 Differential Evolution -- 6.1 Introduction -- 6.2 Differential Evolution -- 6.3 Variants -- 6.4 Choice of Parameters -- 6.5 Convergence Analysis -- 6.6 Implementation -- References -- 7 Particle Swarm Optimization -- 7.1 Swarm Intelligence -- 7.2 PSO Algorithm -- 7.3 Accelerated PSO -- 7.4 Implementation -- 7.5 Convergence Analysis -- 7.5.1 Dynamical System -- 7.5.2 Markov Chain Approach -- 7.6 Binary PSO -- References -- 8 Firefly Algorithms -- 8.1 The Firefly Algorithm -- 8.1.1 Firefly Behavior -- 8.1.2 Standard Firefly Algorithm -- 8.1.3 Variations of Light Intensity and Attractiveness -- 8.1.4 Controlling Randomization -- 8.2 Algorithm Analysis -- 8.2.1 Scalings and Limiting Cases -- 8.2.2 Attraction and Diffusion -- 8.2.3 Special Cases of FA -- 8.3 Implementation -- 8.4 Variants of the Firefly Algorithm -- 8.4.1 FA Variants -- 8.4.2 How Can We Discretize FA? -- 8.5 Firefly Algorithms in Applications -- 8.6 Why the Firefly Algorithm is Efficient -- References -- 9 Cuckoo Search -- 9.1 Cuckoo Breeding Behavior -- 9.2 Lévy Flights -- 9.3 Cuckoo Search -- 9.3.1 Special Cases of Cuckoo Search -- 9.3.2 How to Carry Out Lévy Flights -- 9.3.3 Choice of Parameters -- 9.3.4 Variants of Cuckoo Search -- 9.4 Why Cuckoo Search is So Efficient
  • 9.5 Global Convergence: Brief Mathematical Analysis -- 9.6 Applications -- References -- 10 Bat Algorithms -- 10.1 Echolocation of Bats -- 10.1.1 Behavior of Microbats -- 10.1.2 Acoustics of Echolocation -- 10.2 Bat Algorithms -- 10.2.1 Movement of Virtual Bats -- 10.2.2 Loudness and Pulse Emission -- 10.3 Implementation -- 10.4 Binary Bat Algorithms -- 10.5 Variants of the Bat Algorithm -- 10.6 Convergence Analysis -- 10.7 Why the Bat Algorithm is Efficient -- 10.8 Applications -- 10.8.1 Continuous Optimization -- 10.8.2 Combinatorial Optimization and Scheduling -- 10.8.3 Inverse Problems and Parameter Estimation -- 10.8.4 Classifications, Clustering, and Data Mining -- 10.8.5 Image Processing -- 10.8.6 Fuzzy Logic and Other Applications -- References -- 11 Flower Pollination Algorithms -- 11.1 Introduction -- 11.2 Flower Pollination Algorithm -- 11.2.1 Characteristics of Flower Pollination -- 11.2.2 Flower Pollination Algorithm -- 11.3 Multi-Objective Flower Pollination Algorithms -- 11.4 Validation and Numerical Experiments -- 11.4.1 Single-Objective Test Functions -- 11.4.2 Multi-Objective Test Functions -- 11.4.3 Analysis of Results and Comparison -- 11.5 Applications -- 11.5.1 Single-Objective Design Benchmarks -- 11.5.1.1 Spring Design Optimization -- 11.5.1.2 Welded Beam Design -- 11.5.1.3 Speed Reducer Design -- 11.5.1.4 Pressure Vessel Design -- 11.5.2 Multi-Objective Design Benchmarks -- 11.6 Further Research Topics -- References -- 12 A Framework for Self-Tuning Algorithms -- 12.1 Introduction -- 12.2 Algorithm Analysis and Parameter Tuning -- 12.2.1 A General Formula for Algorithms -- 12.2.2 Type of Optimality -- 12.2.3 Parameter Tuning -- 12.3 Framework for Self-Tuning Algorithms -- 12.3.1 Hyperoptimization -- 12.3.2 A Multi-Objective View -- 12.3.3 Self-Tuning Framework -- 12.4 A Self-Tuning Firefly Algorithm -- 12.5 Some Remarks
  • References -- 13 How to Deal with Constraints -- 13.1 Introduction and Overview -- 13.2 Method of Lagrange Multipliers -- 13.3 KKT Conditions -- 13.4 Penalty Method -- 13.5 Equality with Tolerance -- 13.6 Feasibility Rules and Stochastic Ranking -- 13.7 Multi-objective Approach to Constraints -- 13.8 Spring Design -- 13.9 Cuckoo Search Implementation -- References -- 14 Multi-Objective Optimization -- 14.1 Multi-Objective Optimization -- 14.2 Pareto Optimality -- 14.3 Weighted Sum Method -- 14.4 Utility Method -- 14.5 The ε-Constraint Method -- 14.6 Metaheuristic Approaches -- 14.7 NSGA-II -- References -- 15 Other Algorithms and Hybrid Algorithms -- 15.1 Ant Algorithms -- 15.1.1 Ant Behavior -- 15.1.2 Ant Colony Optimization -- 15.1.3 Virtual Ant Algorithms -- 15.2 Bee-Inspired Algorithms -- 15.2.1 Honeybee Behavior -- 15.2.2 Bee Algorithms -- 15.2.3 Honeybee Algorithm -- 15.2.4 Virtual Bee Algorithm -- 15.2.5 Artificial Bee Colony Optimization -- 15.3 Harmony Search -- 15.3.1 Harmonics and Frequencies -- 15.3.2 Harmony Search -- 15.4 Hybrid Algorithms -- 15.4.1 Other Algorithms -- 15.4.2 Ways to Hybridize -- 15.5 Final Remarks -- References -- Appendix A: Test Function Benchmarks for Global Optimization -- References -- Appendix B: Matlab Programs -- B.1 Simulated Annealing -- B.2 Particle Swarm Optimization -- B.3 Differential Evolution -- B.4 Firefly Algorithm -- B.5 Cuckoo Search -- B.6 Bat Algorithm -- B.7 Flower Pollination Algorithm