Search Results - "Optimization Methods in Machine Learning"

  1.

    Gradient Optimization Methods in Machine Learning for the Identification of Dynamic Systems Parameters by Panteleev, A.V., Lobanov, A.V.

    ISSN: 2219-3758, 2311-9454
    Published: 2019
    Published in Modelling and Data Analysis (2019)
    “…The article considers one of the possible ways to solve the problem of estimating the unknown parameters of dynamic models described by differential-algebraic…”
    Journal Article
  2.

    Optimization methods in machine learning: Theory and applications by Saha, Ankan

    ISBN: 1303423448, 9781303423444
    Published: ProQuest Dissertations & Theses 01.01.2013
    “…We look at the integral role played by convex optimization in various machine learning problems. Over the last few years there has been a lot of machine…”
    Dissertation
  3.

    Operator Theory for Analysis of Convex Optimization Methods in Machine Learning by Gallagher, Patrick W

    ISBN: 9781321401738, 1321401736
    Published: ProQuest Dissertations & Theses 01.01.2014
    “…As machine learning has more closely interacted with optimization, the concept of convexity has loomed large. Two properties beyond simple convexity have…”
    Dissertation
  4.

    Secure Image Inference Using Pairwise Activation Functions by Jonas T. Agyepong, Mostafa I. Soliman, Yasutaka Wada, Keiji Kimura, Ahmed El-Mahdy

    ISSN: 2169-3536
    Published: Institute of Electrical and Electronics Engineers (IEEE) 01.01.2021
    Published in IEEE Access (01.01.2021)
    Journal Article
  5.

    A Survey of Optimization Methods From a Machine Learning Perspective by Sun, Shiliang, Cao, Zehui, Zhu, Han, Zhao, Jing

    ISSN: 2168-2267, 2168-2275
    Published: United States IEEE 01.08.2020
    Published in IEEE Transactions on Cybernetics (01.08.2020)
    “… With the exponential growth of data amount and the increase of model complexity, optimization methods in machine learning face more and more…”
    Journal Article
  6.

    Non-smooth Bayesian learning for artificial neural networks by Fakhfakh, Mohamed, Chaari, Lotfi, Bouaziz, Bassem, Gargouri, Faiez

    ISSN: 1868-5137, 1868-5145
    Published: Germany Springer Nature B.V. 01.10.2023
    “… A lot of work on solving optimization problems or improving optimization methods in machine learning has been proposed successively such as gradient-based method, Newton-type method, meta-heuristic method…”
    Journal Article
  7.

    Stochastic normalized gradient descent with momentum for large-batch training by Zhao, Shen-Yi, Shi, Chang-Wei, Xie, Yin-Peng, Li, Wu-Jun

    ISSN: 1674-733X, 1869-1919
    Published: Beijing Science China Press 01.11.2024
    Published in Science China Information Sciences (01.11.2024)
    “…Stochastic gradient descent (SGD) and its variants have been the dominating optimization methods in machine learning…”
    Journal Article
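
Result 7 above (and its arXiv preprint, result 12 below) concerns stochastic normalized gradient descent with momentum. As a rough NumPy sketch only, not the algorithm defined in that paper, a normalized-gradient momentum step could look like the following; the function name and the lr, momentum, and eps values are illustrative placeholders.

    import numpy as np

    def normalized_sgd_momentum_step(w, v, grad, lr=0.1, momentum=0.9, eps=1e-8):
        # Keep only the direction of the stochastic gradient, so the step
        # length is insensitive to large-batch gradient magnitudes.
        g = grad / (np.linalg.norm(grad) + eps)
        v = momentum * v + g        # momentum buffer on the normalized direction
        return w - lr * v, v        # updated parameters and buffer

    # Toy usage on f(w) = 0.5 * ||w||^2 (gradient w) with a decaying step size.
    w, v = np.array([3.0, -1.0]), np.zeros(2)
    for t in range(1, 201):
        w, v = normalized_sgd_momentum_step(w, v, w, lr=0.1 / np.sqrt(t))
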
  8.

    Exploring Physics-Informed Neural Networks for the Generalized Nonlinear Sine-Gordon Equation by Deresse, Alemayehu Tamirie, Dufera, Tamirat Temesgen

    ISSN: 1687-9724, 1687-9732
    Published: New York Hindawi 2024
    “…The nonlinear sine-Gordon equation is a prevalent feature in numerous scientific and engineering problems. In this paper, we propose a machine learning-based…”
    Journal Article
  9.

    Learning From Mistakes: A Multilevel Optimization Framework by Zhang, Li, Garg, Bhanu, Sridhara, Pradyumna, Hosseini, Ramtin, Xie, Pengtao

    ISSN: 2691-4581
    Published: IEEE 01.06.2025
    “…Bi-level optimization methods in machine learning are popularly effective in subdomains of neural architecture search, data reweighting, etc…”
    Journal Article
  10.

    Non-smooth Bayesian learning for artificial neural networks by Fakhfakh, Mohamed, Chaari, Lotfi, Bouaziz, Bassem, Gargouri, Faiez

    ISSN: 1868-5137, 1868-5145
    Published: Berlin/Heidelberg Springer Berlin Heidelberg 01.10.2023
    “… A lot of work on solving optimization problems or improving optimization methods in machine learning has been proposed successively such as gradient-based method, Newton-type method, meta-heuristic method…”
    Journal Article
  11.

    Investigation of training performance of convolutional neural networks evolved by genetic algorithms using an activity function by Betere, Job Isaac, Kinjo, Hiroshi, Nakazono, Kunihiko, Oshiro, Naoki

    ISSN: 1433-5298, 1614-7456
    Published: Tokyo Springer Science and Business Media LLC 01.02.2020
    Published in Artificial Life and Robotics (01.02.2020)
    “…) evolved by genetic algorithms (GA) using an activity function for image recognition. Globally, GA has been considered as one of the most robust search optimization methods in machine learning and artificial intelligent systems…”
    Journal Article
  12.

    Stochastic Normalized Gradient Descent with Momentum for Large-Batch Training by Zhao, Shen-Yi, Shi, Chang-Wei, Xie, Yin-Peng, Li, Wu-Jun

    ISSN: 2331-8422
    Published: Ithaca Cornell University Library, arXiv.org 15.04.2024
    Published in arXiv.org (15.04.2024)
    “…Stochastic gradient descent~(SGD) and its variants have been the dominating optimization methods in machine learning…”
    Paper
  13.

    Learning with Less: Low-Rank Dynamics, Communication, and Introspection in Machine Learning by Baker, Bradley

    ISBN: 9798263395957
    Published: ProQuest Dissertations & Theses 01.01.2023
    “…The enclosed research is a focused empirical and theoretical analysis of the optimization methods in machine learning, and the underlying role that the matrix rank of utilized learning statistics…”
    Dissertation
  14.

    Topics in Machine Learning Optimization by Fang, Biyi

    ISBN: 9798759972181
    Published: ProQuest Dissertations & Theses 01.01.2021
    “…Recently, machine learning and deep learning, which have made many theoretical and empirical breakthroughs and is widely applied in various fields, attract a…”
    Dissertation
  15.

    Large-scale learning with AdaGrad on Spark by Hadgu, Asmelash Teka, Nigam, Aastha, Diaz-Aviles, Ernesto

    Published: IEEE 01.10.2015
    “… (and often non-convex) functions and one of the most popular stochastic optimization methods in machine learning today…”
    Conference Proceeding
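
Result 15 above is about AdaGrad. For orientation, the standard per-coordinate AdaGrad update (not the distributed Spark implementation that paper describes) can be sketched in NumPy roughly as follows; the lr and eps values are placeholders.

    import numpy as np

    def adagrad_step(w, grad, accum, lr=0.1, eps=1e-8):
        # Accumulate squared gradients per coordinate, so coordinates that
        # have seen large gradients receive proportionally smaller steps.
        accum = accum + grad ** 2
        w = w - lr * grad / (np.sqrt(accum) + eps)
        return w, accum

    # Toy usage on f(w) = 0.5 * ||w||^2, whose gradient is w itself.
    w, accum = np.array([1.0, -2.0]), np.zeros(2)
    for _ in range(100):
        w, accum = adagrad_step(w, w, accum)
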
  16.

    A Survey of Optimization Methods from a Machine Learning Perspective by Sun, Shiliang, Cao, Zehui, Zhu, Han, Zhao, Jing

    ISSN: 2331-8422
    Published: Ithaca Cornell University Library, arXiv.org 23.10.2019
    Published in arXiv.org (23.10.2019)
    “… With the exponential growth of data amount and the increase of model complexity, optimization methods in machine learning face more and more…”
    Paper
  17.

    Learning rate adaptive stochastic gradient descent optimization methods: numerical simulations for deep learning methods for partial differential equations and convergence analyses by Dereich, Steffen, Jentzen, Arnulf, Riekert, Adrian

    ISSN: 2331-8422
    Published: Ithaca Cornell University Library, arXiv.org 20.06.2024
    Published in arXiv.org (20.06.2024)
    “… The default learning rate schedules for SGD optimization methods in machine learning implementation frameworks such as TensorFlow and Pytorch are constant learning rates…”
    Paper
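
The snippet in result 17 above notes that the default learning-rate schedules in frameworks such as TensorFlow and PyTorch are constant. A minimal PyTorch sketch of that distinction, with an arbitrary toy model and placeholder hyperparameters, might read:

    import torch

    model = torch.nn.Linear(10, 1)                              # any parameterized model
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)     # lr stays constant by default

    # Optional schedule: multiply the learning rate by 0.1 every 30 epochs.
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

    for epoch in range(90):
        optimizer.zero_grad()
        loss = model(torch.randn(4, 10)).pow(2).mean()          # dummy loss on random data
        loss.backward()
        optimizer.step()
        scheduler.step()  # omit this call and the rate remains at its initial constant value
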
  18.

    Meta-Learning Parameterized First-Order Optimizers using Differentiable Convex Optimization by Gautam, Tanmay, Pfrommer, Samuel, Sojoudi, Somayeh

    ISSN: 2331-8422
    Published: Ithaca Cornell University Library, arXiv.org 29.03.2023
    Published in arXiv.org (29.03.2023)
    “…Conventional optimization methods in machine learning and controls rely heavily on first-order update rules…”
    Paper
  19.

    Machines Explaining Linear Programs by Steinmann, David, Zečević, Matej, Dhami, Devendra Singh, Kersting, Kristian

    ISSN: 2331-8422
    Published: Ithaca Cornell University Library, arXiv.org 14.06.2022
    Published in arXiv.org (14.06.2022)
    “… Although successful, these methods have mostly focused on the deep learning methods while the fundamental optimization methods in machine learning such as linear programs (LP) have been left…”
    Paper
  20.

    Conformal Symplectic and Relativistic Optimization by França, Guilherme, Sulam, Jeremias, Robinson, Daniel P, Vidal, René

    ISSN: 2331-8422
    Published: Ithaca Cornell University Library, arXiv.org 27.10.2020
    Published in arXiv.org (27.10.2020)
    “…Arguably, the two most popular accelerated or momentum-based optimization methods in machine learning are Nesterov's accelerated gradient and Polyak's heavy ball, both corresponding to different…”
    Paper
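
Result 20 above contrasts Nesterov's accelerated gradient with Polyak's heavy ball. As a plain reference sketch (not the conformal symplectic or relativistic schemes developed in that paper), the two classical updates differ mainly in where the gradient is evaluated; the step size and momentum values below are placeholders.

    import numpy as np

    def heavy_ball_step(w, v, grad_fn, lr=0.01, momentum=0.9):
        # Polyak's heavy ball: gradient taken at the current iterate w.
        v = momentum * v - lr * grad_fn(w)
        return w + v, v

    def nesterov_step(w, v, grad_fn, lr=0.01, momentum=0.9):
        # Nesterov's accelerated gradient: gradient taken at the look-ahead point.
        v = momentum * v - lr * grad_fn(w + momentum * v)
        return w + v, v

    # Toy usage on f(w) = 0.5 * ||w||^2, whose gradient is w itself.
    grad_fn = lambda w: w
    w, v = np.array([5.0, -3.0]), np.zeros(2)
    for _ in range(200):
        w, v = nesterov_step(w, v, grad_fn)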