Introduction to algorithms for data mining and machine learning
Introduction to Algorithms for Data Mining and Machine Learning introduces the essential ideas behind all key algorithms and techniques for data mining and machine learning, along with optimization techniques. Its strong formal mathematical approach, well selected examples, and practical software re...
Saved in:
| Main author: | |
|---|---|
| Format: | E-book; Book |
| Language: | English |
| Published: | London : Academic Press, 2019 (Elsevier Science & Technology) |
| Edition: | 1 |
| Subjects: | |
| ISBN: | 0128172169, 9780128172162 |
| Online access: | Get full text |
Contents:
- Front Cover -- Introduction to Algorithms for Data Mining and Machine Learning -- Copyright -- Contents -- About the author -- Preface -- Acknowledgments -- 1 Introduction to optimization -- 1.1 Algorithms -- 1.1.1 Essence of an algorithm -- 1.1.2 Issues with algorithms -- 1.1.3 Types of algorithms -- 1.2 Optimization -- 1.2.1 A simple example -- 1.2.2 General formulation of optimization -- 1.2.3 Feasible solution -- 1.2.4 Optimality criteria -- 1.3 Unconstrained optimization -- 1.3.1 Univariate functions -- 1.3.2 Multivariate functions -- 1.4 Nonlinear constrained optimization -- 1.4.1 Penalty method -- 1.4.2 Lagrange multipliers -- 1.4.3 Karush-Kuhn-Tucker conditions -- 1.5 Notes on software -- 2 Mathematical foundations -- 2.1 Convexity -- 2.1.1 Linear and affine functions -- 2.1.2 Convex functions -- 2.1.3 Mathematical operations on convex functions -- 2.2 Computational complexity -- 2.2.1 Time and space complexity -- 2.2.2 Complexity of algorithms -- 2.3 Norms and regularization -- 2.3.1 Norms -- 2.3.2 Regularization -- 2.4 Probability distributions -- 2.4.1 Random variables -- 2.4.2 Probability distributions -- 2.4.3 Conditional probability and Bayesian rule -- 2.4.4 Gaussian process -- 2.5 Bayesian network and Markov models -- 2.6 Monte Carlo sampling -- 2.6.1 Markov chain Monte Carlo -- 2.6.2 Metropolis-Hastings algorithm -- 2.6.3 Gibbs sampler -- 2.7 Entropy, cross entropy, and KL divergence -- 2.7.1 Entropy and cross entropy -- 2.7.2 KL divergence -- 2.8 Fuzzy rules -- 2.9 Data mining and machine learning -- 2.9.1 Data mining -- 2.9.2 Machine learning -- 2.10 Notes on software -- 3 Optimization algorithms -- 3.1 Gradient-based methods -- 3.1.1 Newton's method -- 3.1.2 Newton's method for multivariate functions -- 3.1.3 Line search -- 3.2 Variants of gradient-based methods -- 3.2.1 Stochastic gradient descent -- 3.2.2 Subgradient method
- 3.2.3 Conjugate gradient method -- 3.3 Optimizers in deep learning -- 3.4 Gradient-free methods -- 3.5 Evolutionary algorithms and swarm intelligence -- 3.5.1 Genetic algorithm -- 3.5.2 Differential evolution -- 3.5.3 Particle swarm optimization -- 3.5.4 Bat algorithm -- 3.5.5 Firefly algorithm -- 3.5.6 Cuckoo search -- 3.5.7 Flower pollination algorithm -- 3.6 Notes on software -- 4 Data fitting and regression -- 4.1 Sample mean and variance -- 4.2 Regression analysis -- 4.2.1 Maximum likelihood -- 4.2.2 Linear regression -- 4.2.3 Linearization -- 4.2.4 Generalized linear regression -- 4.2.5 Goodness of fit -- 4.3 Nonlinear least squares -- 4.3.1 Gauss-Newton algorithm -- 4.3.2 Levenberg-Marquardt algorithm -- 4.3.3 Weighted least squares -- 4.4 Overfitting and information criteria -- 4.5 Regularization and Lasso method -- 4.6 Notes on software -- 5 Logistic regression, PCA, LDA, and ICA -- 5.1 Logistic regression -- 5.2 Softmax regression -- 5.3 Principal component analysis -- 5.4 Linear discriminant analysis -- 5.5 Singular value decomposition -- 5.6 Independent component analysis -- 5.7 Notes on software -- 6 Data mining techniques -- 6.1 Introduction -- 6.1.1 Types of data -- 6.1.2 Distance metric -- 6.2 Hierarchy clustering -- 6.3 k-Nearest-neighbor algorithm -- 6.4 k-Means algorithm -- 6.5 Decision trees and random forests -- 6.5.1 Decision tree algorithm -- 6.5.2 ID3 algorithm and C4.5 classifier -- 6.5.3 Random forest -- 6.6 Bayesian classifiers -- 6.6.1 Naive Bayesian classifier -- 6.6.2 Bayesian networks -- 6.7 Data mining for big data -- 6.7.1 Characteristics of big data -- 6.7.2 Statistical nature of big data -- 6.7.3 Mining big data -- 6.8 Notes on software -- 7 Support vector machine and regression -- 7.1 Statistical learning theory -- 7.2 Linear support vector machine -- 7.3 Kernel functions and nonlinear SVM -- 7.4 Support vector regression
- 7.5 Notes on software -- 8 Neural networks and deep learning -- 8.1 Learning -- 8.2 Artificial neural networks -- 8.2.1 Neuron models -- 8.2.2 Activation models -- 8.2.3 Artificial neural networks -- 8.3 Back propagation algorithm -- 8.4 Loss functions in ANN -- 8.5 Optimizers and choice of optimizers -- 8.6 Network architecture -- 8.7 Deep learning -- 8.7.1 Convolutional neural networks -- 8.7.1.1 Convolution and activation -- 8.7.1.2 Pooling -- 8.7.1.3 Flattening -- 8.7.1.4 Fully connected neural network -- 8.7.2 Restricted Boltzmann machine -- 8.7.3 Deep neural nets -- 8.7.4 Trends in deep learning -- 8.8 Tuning of hyperparameters -- 8.9 Notes on software -- Bibliography -- Index -- Back Cover

