Search results - Batch gradient algorithm

  1.
  2.

    Convergence of batch gradient learning algorithm with smoothing L1/2 regularization for Sigma–Pi–Sigma neural networks. Author: Liu, Yan; Li, Zhengxue; Yang, Dakun; Mohamed, Kh.Sh.; Wang, Jing; Wu, Wei

    ISSN: 0925-2312, 1872-8286
    Published: Elsevier B.V., 03.03.2015
    Published in: Neurocomputing (Amsterdam) (03.03.2015)
    “… However, the nonsmoothness of L1/2 regularization may lead to oscillation phenomenon. The aim of this paper is to develop a novel batch gradient method with smoothing L1/2 regularization for Sigma–Pi…”
    Get full text
    Journal Article
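
For orientation, a minimal sketch of the kind of update the entry above refers to: a full-batch gradient step on a squared error plus a smoothed L1/2 penalty. The smoothing function (w^2 + eps^2)^(1/4) and the plain least-squares loss are illustrative assumptions, not the exact choices of the cited Sigma–Pi–Sigma paper.

```python
import numpy as np

def smoothed_l12_penalty(w, eps=1e-3):
    """Smooth surrogate for the L1/2 penalty sum(|w|**0.5).

    (w**2 + eps**2)**0.25 is one simple smooth approximation; the cited
    paper's exact smoothing function may differ.
    """
    return np.sum((w ** 2 + eps ** 2) ** 0.25)

def smoothed_l12_grad(w, eps=1e-3):
    # d/dw (w^2 + eps^2)^(1/4) = 0.5 * w * (w^2 + eps^2)^(-3/4)
    return 0.5 * w * (w ** 2 + eps ** 2) ** (-0.75)

def batch_gradient_step(w, X, y, lr=0.1, lam=1e-3):
    """One full-batch gradient step on 0.5*MSE plus the smoothed penalty
    (a least-squares stand-in for the network error function)."""
    residual = X @ w - y
    grad = X.T @ residual / len(y) + lam * smoothed_l12_grad(w)
    return w - lr * grad
```

Because the surrogate is differentiable at zero, the batch updates avoid the oscillation that the snippet attributes to the nonsmooth L1/2 term.
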
  3.

    A Mini-Batch Proximal Stochastic Recursive Gradient Algorithm with Diagonal Barzilai–Borwein Stepsize. Author: Yu, Teng-Teng; Liu, Xin-Wei; Dai, Yu-Hong; Sun, Jie

    ISSN: 2194-668X, 2194-6698
    Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.06.2023
    “… We propose a mini-batch proximal stochastic recursive gradient algorithm SRG-DBB, which incorporates the diagonal Barzilai–Borwein (DBB…”
    Get full text
    Journal Article
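
The entry above names a diagonal Barzilai–Borwein (DBB) stepsize. As a rough, simplified illustration only (the SRG-DBB paper's exact diagonal rule and safeguards are not reproduced), a per-coordinate BB-style stepsize can be formed from successive iterates and gradients:

```python
import numpy as np

def diagonal_bb_stepsize(x_prev, x_curr, g_prev, g_curr, lo=1e-6, hi=1e2):
    """Per-coordinate BB-style stepsizes from s = x_curr - x_prev and
    y = g_curr - g_prev, using the ratio s_i / y_i clipped to [lo, hi].

    Simplified illustration; the actual DBB rule in SRG-DBB may differ.
    """
    s = x_curr - x_prev
    y = g_curr - g_prev
    step = np.full_like(s, hi, dtype=float)
    np.divide(s, y, out=step, where=np.abs(y) > 1e-12)  # skip tiny denominators
    return np.clip(step, lo, hi)
```
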
  4.

    Deterministic convergence of complex mini-batch gradient learning algorithm for fully complex-valued neural networks. Author: Zhang, Huisheng; Zhang, Ying; Zhu, Shuai; Xu, Dongpo

    ISSN: 0925-2312, 1872-8286
    Published: Elsevier B.V., 24.09.2020
    Published in: Neurocomputing (Amsterdam) (24.09.2020)
    “…This paper investigates the fully complex mini-batch gradient algorithm for training complex-valued neural networks…”
    Get full text
    Journal Article
  5.

    Convergence of batch gradient algorithm with smoothing composition of group l0 and l1/2 regularization for feedforward neural networks. Author: Ramchoun, Hassan; Ettaouil, Mohamed

    ISSN: 2192-6352, 2192-6360
    Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.09.2022
    Published in: Progress in Artificial Intelligence (01.09.2022)
    “…In this paper, we prove the convergence of batch gradient method for training feedforward neural network…”
    Get full text
    Journal Article
  6.

    A mini-batch stochastic conjugate gradient algorithm with variance reduction. Author: Kou, Caixia; Yang, Han

    ISSN: 0925-5001, 1573-2916
    Published: New York: Springer US, 01.11.2023
    Published in: Journal of Global Optimization (01.11.2023)
    “… In this paper, in the spirit of SAGA, we propose a stochastic conjugate gradient algorithm which we call SCGA…”
    Get full text
    Journal Article
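
The snippet above places SCGA "in the spirit of SAGA". Below is a generic SAGA-style variance-reduced gradient estimate on a toy least-squares problem, assuming nothing about the cited method beyond that connection; the conjugate-gradient directions SCGA builds on top of it are not shown.

```python
import numpy as np

def saga_gradient(i, x, grad_fi, table, table_mean):
    """SAGA-style estimate: grad_fi(i, x) - stored_grad_i + mean of stored
    gradients, followed by refreshing the stored gradient for sample i."""
    g_new = grad_fi(i, x)
    estimate = g_new - table[i] + table_mean
    table_mean = table_mean + (g_new - table[i]) / len(table)  # keep the mean current
    table[i] = g_new
    return estimate, table_mean

# Toy problem: f_i(x) = 0.5 * (a_i . x - b_i)^2
rng = np.random.default_rng(0)
A, b = rng.normal(size=(100, 5)), rng.normal(size=100)
grad_fi = lambda i, x: A[i] * (A[i] @ x - b[i])

table = np.zeros((100, 5))          # one stored gradient per sample
table_mean = table.mean(axis=0)
x = np.zeros(5)
for _ in range(2000):
    i = int(rng.integers(100))
    g, table_mean = saga_gradient(i, x, grad_fi, table, table_mean)
    x -= 0.02 * g
```
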
  7.
  8.

    Batch Gradient Learning Algorithm with Smoothing L1 Regularization for Feedforward Neural Networks. Author: Mohamed, Khidir Shaib

    ISSN: 2073-431X, 2073-431X
    Published: Basel: MDPI AG, 01.01.2023
    Published in: Computers (Basel) (01.01.2023)
    “… In this paper, we propose a batch gradient learning algorithm with smoothing L1 regularization (BGSL1…”
    Get full text
    Journal Article
  9.

    Design and Analysis of Urban Land Lease Price Predicting Model Using Batch Gradient Descent Algorithm. Author: Berhane Niguse, Kifle

    ISSN: 2073-073X, 2220-184X
    Published: 21.05.2023
    Published in: Momona Ethiopian Journal of Science (21.05.2023)
    “… But with prediction, they tend to over-fit samples and simplify poorly to new, undetected data. This paper presents a batch gradient algorithm for predicting the price of land with large datasets…”
    Get full text
    Journal Article
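
Since the entry above applies plain batch gradient descent to a regression task, here is the generic full-batch loop for linear least squares, the simplest instance of the algorithm this search is about; it is a sketch, not the cited land-price model.

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.01, epochs=1000):
    """Plain batch gradient descent for linear least squares: every update
    uses the entire dataset, which is what distinguishes the batch variant
    from stochastic and mini-batch ones."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / n   # full-batch gradient of 0.5 * MSE
        w -= lr * grad
    return w
```
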
  10.

    Batch Gradient Learning Algorithm with Smoothing Regularization for Feedforward Neural Networks. Author: Khidir Shaib Mohamed

    ISSN: 2073-431X
    Published: MDPI AG, 01.12.2022
    Published in: Computers (Basel) (01.12.2022)
    “… In this paper, we propose a batch gradient learning algorithm with smoothing L1 regularization (BGS L1…”
    Get full text
    Journal Article
  11.

    Prediction of students’ academic performance using ANN with mini-batch gradient descent and Levenberg-Marquardt optimization algorithms. Author: Simanungkalit, F R J; Hanifah, H; Ardaneswari, G; Hariadi, N; Handari, B D

    ISSN: 1742-6588, 1742-6596
    Published: IOP Publishing, 01.11.2021
    Published in: Journal of Physics: Conference Series (01.11.2021)
    “… In this paper, we use artificial neural networks (ANN) to predict this performance. ANNs with two optimization algorithms, mini-batch gradient descent and Levenberg-Marquardt, are implemented on students…”
    Get full text
    Journal Article
  12.

    Convergence of a Batch Gradient Algorithm with Adaptive Momentum for Neural Networks. Author: Shao, Hongmei; Xu, Dongpo; Zheng, Gaofeng

    ISSN: 1370-4621, 1573-773X
    Published: Boston: Springer US, 01.12.2011
    Published in: Neural Processing Letters (01.12.2011)
    “…In this paper, a batch gradient algorithm with adaptive momentum is considered and a convergence theorem is presented when it is used for two-layer feedforward neural networks training…”
    Get full text
    Journal Article
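
The entry above concerns a batch gradient algorithm with adaptive momentum. The sketch below shows the standard momentum form of the update; the constant momentum coefficient is a placeholder, since the paper's adaptive rule for choosing it is not reproduced here.

```python
import numpy as np

def batch_gradient_with_momentum(grad_fn, w0, lr=0.05, epochs=200, mu=0.9):
    """Batch gradient descent with a momentum term:
        v <- mu * v - lr * grad(w);  w <- w + v.
    The fixed mu stands in for the paper's adaptive momentum coefficient."""
    w = np.array(w0, dtype=float)
    v = np.zeros_like(w)
    for _ in range(epochs):
        v = mu * v - lr * grad_fn(w)
        w = w + v
    return w
```
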
  13.

    Deterministic convergence analysis for regularized long short-term memory and its application to regression and multi-classification problems. Author: Kang, Qian; Yu, Dengxiu; Cheong, Kang Hao; Wang, Zhen

    ISSN: 0952-1976, 1873-6769
    Published: Elsevier Ltd, 01.07.2024
    “… This paper proposes a novel regularized LSTM based on the batch gradient method. Specifically, the L2 regularization is appended to the objective function as a systematic…”
    Get full text
    Journal Article
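
The snippet above says the L2 regularization is appended to the objective function. A minimal sketch of that construction follows; the LSTM loss itself is not reproduced, only the generic penalty and its effect on the gradient.

```python
import numpy as np

def l2_regularized_loss(base_loss, params, lam=1e-4):
    """Objective of the form E(w) + (lam / 2) * sum_k ||w_k||^2."""
    penalty = 0.5 * lam * sum(np.sum(p ** 2) for p in params)
    return base_loss + penalty

def l2_regularized_grad(base_grad, param, lam=1e-4):
    # The penalty simply adds lam * w to each parameter's plain gradient.
    return base_grad + lam * param
```
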
  14.

    L1/2 regularization learning for smoothing interval neural networks: Algorithms and convergence analysis. Author: Yang, Dakun; Liu, Yan

    ISSN: 0925-2312, 1872-8286
    Published: Elsevier B.V., 10.01.2018
    Published in: Neurocomputing (Amsterdam) (10.01.2018)
    “… In this paper, a novel batch gradient algorithm with smoothing L1/2 regularization is proposed to prevent the weights oscillation for a smoothing interval neural network (SINN…”
    Get full text
    Journal Article
  15.

    Convergence of a Batch Gradient Algorithm with Adaptive Momentum for Neural Networks. Author: Gaofeng Zheng; Dongpo Xu; Hongmei Shao

    ISSN: 1370-4621, 1573-773X
    Published: Springer Science and Business Media LLC, 22.07.2011
    Published in: Neural Processing Letters (22.07.2011)
    Get full text
    Journal Article
  16.

    An online self-organizing radial basis function neural network based on Gaussian Membership. Author: Jia, Lijie; Li, Wenjing; Qiao, Junfei; Zhang, Xinliang

    ISSN: 0924-669X, 1573-7497
    Published: New York: Springer US, 01.04.2025
    “…Radial basis function neural network (RBFNN) is one of the most popular neural networks, and an appropriate selection of its structure and learning algorithms is crucial for its performance…”
    Get full text
    Journal Article
  17.

    Convergence Analysis of Batch Gradient Algorithm for Three Classes of Sigma-Pi Neural Networks. Author: Zhang, Chao; Wu, Wei; Xiong, Yan

    ISSN: 1370-4621, 1573-773X
    Published: Dordrecht: Springer, 01.12.2007
    Published in: Neural Processing Letters (01.12.2007)
    “… A unified convergence analysis for the batch gradient algorithm for SPNN learning is presented, covering three classes of SPNNs: Σ-Π-Σ, Σ-Σ-Π and Σ-Π-Σ-Π…”
    Get full text
    Journal Article
  18.

    Multi-Pass Sequential Mini-Batch Stochastic Gradient Descent Algorithms for Noise Covariance Estimation in Adaptive Kalman Filtering. Author: Kim, Hee-Seung; Zhang, Lingyi; Bienkowski, Adam; Pattipati, Krishna R.

    ISSN: 2169-3536, 2169-3536
    Published: IEEE, 01.01.2021
    Published in: IEEE Access (01.01.2021)
    “… This paper presents stochastic gradient descent algorithms for noise covariance estimation in adaptive Kalman filters that are an order of magnitude faster than the batch method for similar or better…”
    Get full text
    Journal Article
  19.

    Strong Convergence Analysis of Batch Gradient-Based Learning Algorithm for Training Pi-Sigma Network Based on TSK Fuzzy Models. Author: Liu, Yan; Yang, Dakun; Nan, Nan; Guo, Li; Zhang, Jianjun

    ISSN: 1370-4621, 1573-773X
    Published: New York: Springer US, 01.06.2016
    Published in: Neural Processing Letters (01.06.2016)
    “… The aim of this paper is to present a gradient-based learning method for Pi-Sigma network to train TSK fuzzy inference system…”
    Get full text
    Journal Article
  20.

    Mini-Batch Gradient-Based Algorithms on High-Dimensional Quadratic Problems: Exact Dynamics and Consequences. Author: Cheng, Andrew

    ISBN: 9798382617244
    Published: ProQuest Dissertations & Theses, 01.01.2023
    “… (il- lustrated by Figure 1.2). A popular class of these algorithms are the mini-batch gradient- based methods (see Definition 2…”
    Get full text
    Dissertation
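
To close, a generic mini-batch gradient descent loop on a quadratic (linear least-squares) objective, the setting the dissertation above analyses; this sketches the method only, not its exact-dynamics results.

```python
import numpy as np

def minibatch_gd_quadratic(A, b, batch_size=32, lr=0.01, steps=500, seed=0):
    """Mini-batch gradient descent on f(x) = (1 / 2n) * ||A x - b||^2,
    sampling a random batch of rows at every step."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(steps):
        idx = rng.choice(n, size=batch_size, replace=False)
        grad = A[idx].T @ (A[idx] @ x - b[idx]) / batch_size
        x -= lr * grad
    return x
```
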