Search Results - Batch gradient algorithm

  2.

    Convergence of batch gradient learning algorithm with smoothing L1/2 regularization for Sigma–Pi–Sigma neural networks by Liu, Yan, Li, Zhengxue, Yang, Dakun, Mohamed, Kh.Sh, Wang, Jing, Wu, Wei

    ISSN: 0925-2312, 1872-8286
    Published: Elsevier B.V 03.03.2015
    Published in Neurocomputing (Amsterdam) (03.03.2015)
    “… However, the nonsmoothness of L1/2 regularization may lead to oscillation phenomenon. The aim of this paper is to develop a novel batch gradient method with smoothing L1/2 regularization for Sigma–Pi…”
    Journal Article
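    A rough illustration of the smoothing idea quoted above: the L1/2 penalty sum_i |w_i|^(1/2) is not differentiable at zero, so a smooth surrogate f(w) replaces |w| near zero before the batch gradient step is taken. The sketch below is a minimal schematic under an assumed quadratic smoother, not the authors' construction; grad_E stands for a hypothetical gradient of the network's batch error.

        import numpy as np

        def smoothed_abs(w, a=0.1):
            # Smooth surrogate for |w|: a quadratic inside [-a, a] that matches
            # |w| and its slope at +/- a (illustrative choice only).
            return np.where(np.abs(w) < a, w**2 / (2 * a) + a / 2, np.abs(w))

        def smoothed_l_half_penalty(w, a=0.1):
            # Smoothed L1/2 regularizer: sum_i f(w_i)^(1/2); differentiable
            # everywhere because f(w_i) >= a/2 > 0.
            return np.sum(np.sqrt(smoothed_abs(w, a)))

        def batch_step(w, grad_E, lam=1e-3, lr=0.05, a=0.1):
            # One full-batch update on E(w) + lam * smoothed L1/2 penalty.
            f = smoothed_abs(w, a)
            df = np.where(np.abs(w) < a, w / a, np.sign(w))   # f'(w)
            return w - lr * (grad_E(w) + lam * 0.5 * df / np.sqrt(f))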
  3.

    A Mini-Batch Proximal Stochastic Recursive Gradient Algorithm with Diagonal Barzilai–Borwein Stepsize by Yu, Teng-Teng, Liu, Xin-Wei, Dai, Yu-Hong, Sun, Jie

    ISSN: 2194-668X, 2194-6698
    Published: Berlin/Heidelberg Springer Berlin Heidelberg 01.06.2023
    “… We propose a mini-batch proximal stochastic recursive gradient algorithm SRG-DBB, which incorporates the diagonal Barzilai–Borwein (DBB…”
    Journal Article
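    For context on the stepsize named in this record: the classical Barzilai–Borwein rules pick the stepsize from successive iterate and gradient differences, as sketched below. The diagonal (DBB) variant used by SRG-DBB adapts one value per coordinate and is not reproduced here; x_prev, g_prev and friends are assumed iterates and gradients.

        import numpy as np

        def bb_stepsizes(x_prev, x_curr, g_prev, g_curr, eps=1e-12):
            # Classical Barzilai-Borwein stepsizes built from
            #   s = x_k - x_{k-1} and y = g_k - g_{k-1}.
            s, y = x_curr - x_prev, g_curr - g_prev
            bb1 = s.dot(s) / (s.dot(y) + eps)   # "long" BB step
            bb2 = s.dot(y) / (y.dot(y) + eps)   # "short" BB step
            return bb1, bb2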
  4.

    Deterministic convergence of complex mini-batch gradient learning algorithm for fully complex-valued neural networks by Zhang, Huisheng, Zhang, Ying, Zhu, Shuai, Xu, Dongpo

    ISSN: 0925-2312, 1872-8286
    Published: Elsevier B.V 24.09.2020
    Published in Neurocomputing (Amsterdam) (24.09.2020)
    “…This paper investigates the fully complex mini-batch gradient algorithm for training complex-valued neural networks…”
    Journal Article
  5.

    Convergence of batch gradient algorithm with smoothing composition of group l0 and l1/2 regularization for feedforward neural networks by Ramchoun, Hassan, Ettaouil, Mohamed

    ISSN: 2192-6352, 2192-6360
    Published: Berlin/Heidelberg Springer Berlin Heidelberg 01.09.2022
    Published in Progress in artificial intelligence (01.09.2022)
    “…In this paper, we prove the convergence of batch gradient method for training feedforward neural network…”
    Journal Article
  6.

    A mini-batch stochastic conjugate gradient algorithm with variance reduction by Kou, Caixia, Yang, Han

    ISSN: 0925-5001, 1573-2916
    Published: New York Springer US 01.11.2023
    Published in Journal of global optimization (01.11.2023)
    “… In this paper, in the spirit of SAGA, we propose a stochastic conjugate gradient algorithm which we call SCGA…”
    Journal Article
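    The SAGA scheme cited in this snippet controls gradient variance by storing the most recent gradient seen for each sample. The sketch below shows only that estimator with a plain update step, not SCGA's conjugate directions; grad_fn and grads_table are hypothetical placeholders.

        import numpy as np

        def saga_step(w, grads_table, grad_fn, i, lr=0.1):
            # Variance-reduced estimate for sample i:
            #   g = grad_fn(w, i) - stored_grad_i + mean(stored gradients)
            g_new = grad_fn(w, i)
            g = g_new - grads_table[i] + grads_table.mean(axis=0)
            grads_table[i] = g_new          # refresh the stored gradient
            return w - lr * g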
  8.

    Batch Gradient Learning Algorithm with Smoothing L1 Regularization for Feedforward Neural Networks by Mohamed, Khidir Shaib

    ISSN: 2073-431X, 2073-431X
    Published: Basel MDPI AG 01.01.2023
    Published in Computers (Basel) (01.01.2023)
    “… In this paper, we propose a batch gradient learning algorithm with smoothing L1 regularization (BGSL1…”
    Journal Article
  9.

    Design and Analysis of Urban Land Lease Price Predicting Model Using Batch Gradient Descent Algorithm by Berhane Niguse, Kifle

    ISSN: 2073-073X, 2220-184X
    Published: 21.05.2023
    Published in Momona Ethiopian journal of science (21.05.2023)
    “… But with prediction, they tend to over-fit samples and simplify poorly to new, undetected data. This paper presents a batch gradient algorithm for predicting the price of land with large datasets…”
    Journal Article
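    As a generic illustration of the method this record applies (not the paper's model or features), full-batch gradient descent on a least-squares price regression looks like the sketch below.

        import numpy as np

        def batch_gradient_descent(X, y, lr=0.01, epochs=500):
            # Least-squares regression fitted with full-batch gradient steps.
            n, d = X.shape
            w, b = np.zeros(d), 0.0
            for _ in range(epochs):
                err = X @ w + b - y            # residuals over the whole batch
                w -= lr * (X.T @ err) / n
                b -= lr * err.mean()
            return w, b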
  10.

    Batch Gradient Learning Algorithm with Smoothing L1 Regularization for Feedforward Neural Networks by Khidir Shaib Mohamed

    ISSN: 2073-431X
    Published: MDPI AG 01.12.2022
    Published in Computers (Basel) (01.12.2022)
    “… In this paper, we propose a batch gradient learning algorithm with smoothing L1 regularization (BGSL1…”
    Journal Article
  11.

    Prediction of students’ academic performance using ANN with mini-batch gradient descent and Levenberg-Marquardt optimization algorithms by Simanungkalit, F R J, Hanifah, H, Ardaneswari, G, Hariadi, N, Handari, B D

    ISSN: 1742-6588, 1742-6596
    Published: IOP Publishing 01.11.2021
    Published in Journal of physics. Conference series (01.11.2021)
    “… In this paper, we use artificial neural networks (ANN) to predict this performance. ANNs with two optimization algorithms, mini-batch gradient descent and Levenberg-Marquardt, are implemented on students…”
    Journal Article
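    Of the two optimizers named in this record, mini-batch gradient descent is the simpler; a generic epoch loop is sketched below (Levenberg-Marquardt is not shown). grad_fn is an assumed placeholder returning the loss gradient on a batch.

        import numpy as np

        def minibatch_gd(w, X, y, grad_fn, lr=0.01, batch_size=32, epochs=10):
            # grad_fn(w, X_batch, y_batch) -> gradient of the loss w.r.t. w.
            n = X.shape[0]
            for _ in range(epochs):
                idx = np.random.permutation(n)          # reshuffle each epoch
                for start in range(0, n, batch_size):
                    batch = idx[start:start + batch_size]
                    w = w - lr * grad_fn(w, X[batch], y[batch])
            return w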
  12.

    Convergence of a Batch Gradient Algorithm with Adaptive Momentum for Neural Networks by Shao, Hongmei, Xu, Dongpo, Zheng, Gaofeng

    ISSN: 1370-4621, 1573-773X
    Published: Boston Springer US 01.12.2011
    Published in Neural processing letters (01.12.2011)
    “…In this paper, a batch gradient algorithm with adaptive momentum is considered and a convergence theorem is presented when it is used for two-layer feedforward neural networks training…”
    Journal Article
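    The update analysed in this record adds a momentum term to the batch gradient step; a minimal fixed-coefficient (heavy-ball) version is sketched below, whereas the paper's point is choosing the momentum coefficient adaptively. grad_E is an assumed batch-error gradient.

        import numpy as np

        def momentum_step(w, v, grad_E, lr=0.05, beta=0.9):
            # Heavy-ball momentum on the full-batch gradient:
            #   v_{k+1} = beta * v_k - lr * grad E(w_k);  w_{k+1} = w_k + v_{k+1}
            v = beta * v - lr * grad_E(w)
            return w + v, v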
  13.

    Deterministic convergence analysis for regularized long short-term memory and its application to regression and multi-classification problems by Kang, Qian, Yu, Dengxiu, Cheong, Kang Hao, Wang, Zhen

    ISSN: 0952-1976, 1873-6769
    Published: Elsevier Ltd 01.07.2024
    “… This paper proposes a novel regularized LSTM based on the batch gradient method. Specifically, the L2 regularization is appended to the objective function as a systematic…”
    Journal Article
  14.

    L1/2 regularization learning for smoothing interval neural networks: Algorithms and convergence analysis by Yang, Dakun, Liu, Yan

    ISSN: 0925-2312, 1872-8286
    Published: Elsevier B.V 10.01.2018
    Published in Neurocomputing (Amsterdam) (10.01.2018)
    “… In this paper, a novel batch gradient algorithm with smoothing L1/2 regularization is proposed to prevent the weights oscillation for a smoothing interval neural network (SINN…”
    Journal Article
  15.

    An online self-organizing radial basis function neural network based on Gaussian Membership by Jia, Lijie, Li, Wenjing, Qiao, Junfei, Zhang, Xinliang

    ISSN: 0924-669X, 1573-7497
    Published: New York Springer US 01.04.2025
    “…Radial basis function neural network (RBFNN) is one of the most popular neural networks, and an appropriate selection of its structure and learning algorithms is crucial for its performance…”
    Journal Article
  16.

    Convergence Analysis of Batch Gradient Algorithm for Three Classes of Sigma-Pi Neural Networks by Zhang, Chao, Wu, Wei, Xiong, Yan

    ISSN: 1370-4621, 1573-773X
    Published: Dordrecht Springer 01.12.2007
    Published in Neural processing letters (01.12.2007)
    “… A unified convergence analysis for the batch gradient algorithm for SPNN learning is presented, covering three classes of SPNNs: Σ-Π-Σ, Σ-Σ-Π and Σ-Π-Σ-Π…”
    Journal Article
  17.

    Multi-Pass Sequential Mini-Batch Stochastic Gradient Descent Algorithms for Noise Covariance Estimation in Adaptive Kalman Filtering by Kim, Hee-Seung, Zhang, Lingyi, Bienkowski, Adam, Pattipati, Krishna R.

    ISSN: 2169-3536, 2169-3536
    Published: IEEE 01.01.2021
    Published in IEEE access (01.01.2021)
    “… This paper presents stochastic gradient descent algorithms for noise covariance estimation in adaptive Kalman filters that are an order of magnitude faster than the batch method for similar or better…”
    Journal Article
  18.

    Strong Convergence Analysis of Batch Gradient-Based Learning Algorithm for Training Pi-Sigma Network Based on TSK Fuzzy Models by Liu, Yan, Yang, Dakun, Nan, Nan, Guo, Li, Zhang, Jianjun

    ISSN: 1370-4621, 1573-773X
    Published: New York Springer US 01.06.2016
    Published in Neural processing letters (01.06.2016)
    “… The aim of this paper is to present a gradient-based learning method for Pi-Sigma network to train TSK fuzzy inference system…”
    Journal Article
  19.

    Mini-Batch Gradient-Based Algorithms on High-Dimensional Quadratic Problems: Exact Dynamics and Consequences by Cheng, Andrew

    ISBN: 9798382617244
    Published: ProQuest Dissertations & Theses 01.01.2023
    “… (illustrated by Figure 1.2). A popular class of these algorithms are the mini-batch gradient-based methods (see Definition 2…”
    Dissertation
  20.

    Convergence analysis of the batch gradient-based neuro-fuzzy learning algorithm with smoothing L1/2 regularization for the first-order Takagi–Sugeno system by Liu, Yan, Yang, Dakun

    ISSN: 0165-0114, 1872-6801
    Published: Elsevier B.V 15.07.2017
    Published in Fuzzy sets and systems (15.07.2017)
    “… The neuro-fuzzy learning algorithm involves two tasks: generating comparable sparse networks and training the parameters…”
    Journal Article