Search results - Batch gradient algorithm

  2.

    Convergence of batch gradient learning algorithm with smoothing L1/2 regularization for Sigma–Pi–Sigma neural networks by Liu, Yan, Li, Zhengxue, Yang, Dakun, Mohamed, Kh.Sh, Wang, Jing, Wu, Wei

    ISSN: 0925-2312, 1872-8286
    Published: Elsevier B.V 03.03.2015
    Published in Neurocomputing (Amsterdam) (03.03.2015)
    “… However, the nonsmoothness of L1/2 regularization may lead to oscillation phenomenon. The aim of this paper is to develop a novel batch gradient method with smoothing L1/2 regularization for Sigma–Pi …”
    Full text
    Journal Article
  3.

    A Mini-Batch Proximal Stochastic Recursive Gradient Algorithm with Diagonal Barzilai–Borwein Stepsize by Yu, Teng-Teng, Liu, Xin-Wei, Dai, Yu-Hong, Sun, Jie

    ISSN: 2194-668X, 2194-6698
    Published: Berlin/Heidelberg Springer Berlin Heidelberg 01.06.2023
    “… We propose a mini-batch proximal stochastic recursive gradient algorithm SRG-DBB, which incorporates the diagonal Barzilai–Borwein (DBB …”
    Full text
    Journal Article
  4.

    Deterministic convergence of complex mini-batch gradient learning algorithm for fully complex-valued neural networks by Zhang, Huisheng, Zhang, Ying, Zhu, Shuai, Xu, Dongpo

    ISSN: 0925-2312, 1872-8286
    Published: Elsevier B.V 24.09.2020
    Published in Neurocomputing (Amsterdam) (24.09.2020)
    “… This paper investigates the fully complex mini-batch gradient algorithm for training complex-valued neural networks …”
    Full text
    Journal Article
  5.

    Convergence of batch gradient algorithm with smoothing composition of group l0 and l1/2 regularization for feedforward neural networks by Ramchoun, Hassan, Ettaouil, Mohamed

    ISSN: 2192-6352, 2192-6360
    Published: Berlin/Heidelberg Springer Berlin Heidelberg 01.09.2022
    Published in Progress in artificial intelligence (01.09.2022)
    “… In this paper, we prove the convergence of batch gradient method for training feedforward neural network …”
    Full text
    Journal Article
  6.

    A mini-batch stochastic conjugate gradient algorithm with variance reduction by Kou, Caixia, Yang, Han

    ISSN: 0925-5001, 1573-2916
    Published: New York Springer US 01.11.2023
    Published in Journal of global optimization (01.11.2023)
    “… In this paper, in the spirit of SAGA, we propose a stochastic conjugate gradient algorithm which we call SCGA …”
    Full text
    Journal Article
  8.

    Batch Gradient Learning Algorithm with Smoothing L1 Regularization for Feedforward Neural Networks by Mohamed, Khidir Shaib

    ISSN: 2073-431X
    Published: Basel MDPI AG 01.01.2023
    Published in Computers (Basel) (01.01.2023)
    “… In this paper, we propose a batch gradient learning algorithm with smoothing L1 regularization (BGSL1 …”
    Full text
    Journal Article
  9.

    Design and Analysis of Urban Land Lease Price Predicting Model Using Batch Gradient Descent Algorithm by Berhane Niguse, Kifle

    ISSN: 2073-073X, 2220-184X
    Published: 21.05.2023
    Published in Momona Ethiopian journal of science (21.05.2023)
    “… But with prediction, they tend to over-fit samples and simplify poorly to new, undetected data. This paper presents a batch gradient algorithm for predicting the price of land with large datasets …”
    Full text
    Journal Article
  10.

    Batch Gradient Learning Algorithm with Smoothing Regularization for Feedforward Neural Networks by Khidir Shaib Mohamed

    ISSN: 2073-431X
    Published: MDPI AG 01.12.2022
    Published in Computers (Basel) (01.12.2022)
    “… In this paper, we propose a batch gradient learning algorithm with smoothing L1 regularization (BGSL1 …”
    Full text
    Journal Article
  11.

    Prediction of students’ academic performance using ANN with mini-batch gradient descent and Levenberg-Marquardt optimization algorithms by Simanungkalit, F R J, Hanifah, H, Ardaneswari, G, Hariadi, N, Handari, B D

    ISSN: 1742-6588, 1742-6596
    Published: IOP Publishing 01.11.2021
    Published in Journal of physics. Conference series (01.11.2021)
    “… In this paper, we use artificial neural networks (ANN) to predict this performance. ANNs with two optimization algorithms, mini-batch gradient descent and Levenberg-Marquardt, are implemented on students …”
    Full text
    Journal Article
  12.

    Convergence of a Batch Gradient Algorithm with Adaptive Momentum for Neural Networks by Shao, Hongmei, Xu, Dongpo, Zheng, Gaofeng

    ISSN: 1370-4621, 1573-773X
    Published: Boston Springer US 01.12.2011
    Published in Neural processing letters (01.12.2011)
    “… In this paper, a batch gradient algorithm with adaptive momentum is considered and a convergence theorem is presented when it is used for two-layer feedforward neural networks training …”
    Full text
    Journal Article
  13.

    Deterministic convergence analysis for regularized long short-term memory and its application to regression and multi-classification problems by Kang, Qian, Yu, Dengxiu, Cheong, Kang Hao, Wang, Zhen

    ISSN: 0952-1976, 1873-6769
    Published: Elsevier Ltd 01.07.2024
    Published in Engineering applications of artificial intelligence (01.07.2024)
    “… This paper propose a novel regularized LSTM based on the batch gradient method. Specifically, the L2 regularization is appended to the objective function as a systematic …”
    Full text
    Journal Article
  14.

    L1/2 regularization learning for smoothing interval neural networks: Algorithms and convergence analysis by Yang, Dakun, Liu, Yan

    ISSN: 0925-2312, 1872-8286
    Published: Elsevier B.V 10.01.2018
    Published in Neurocomputing (Amsterdam) (10.01.2018)
    “… In this paper, a novel batch gradient algorithm with smoothing L1/2 regularization is proposed to prevent the weights oscillation for a smoothing interval neural network (SINN …”
    Full text
    Journal Article
  15.

    Convergence of a Batch Gradient Algorithm with Adaptive Momentum for Neural Networks by Gaofeng Zheng, Dongpo Xu, Hongmei Shao

    ISSN: 1370-4621, 1573-773X
    Published: Springer Science and Business Media LLC 22.07.2011
    Published in Neural Processing Letters (22.07.2011)
    Full text
    Journal Article
  16.

    An online self-organizing radial basis function neural network based on Gaussian Membership by Jia, Lijie, Li, Wenjing, Qiao, Junfei, Zhang, Xinliang

    ISSN: 0924-669X, 1573-7497
    Published: New York Springer US 01.04.2025
    Published in Applied intelligence (Dordrecht, Netherlands) (01.04.2025)
    “… Radial basis function neural network (RBFNN) is one of the most popular neural networks, and an appropriate selection of its structure and learning algorithms is crucial for its performance …”
    Full text
    Journal Article
  17.

    Convergence Analysis of Batch Gradient Algorithm for Three Classes of Sigma-Pi Neural Networks by Zhang, Chao, Wu, Wei, Xiong, Yan

    ISSN: 1370-4621, 1573-773X
    Published: Dordrecht Springer 01.12.2007
    Published in Neural processing letters (01.12.2007)
    “… A unified convergence analysis for the batch gradient algorithm for SPNN learning is presented, covering three classes of SPNNs: Σ-Π-Σ, Σ-Σ-Π and Σ-Π-Σ-Π …”
    Full text
    Journal Article
  18.

    Multi-Pass Sequential Mini-Batch Stochastic Gradient Descent Algorithms for Noise Covariance Estimation in Adaptive Kalman Filtering by Kim, Hee-Seung, Zhang, Lingyi, Bienkowski, Adam, Pattipati, Krishna R.

    ISSN: 2169-3536
    Published: IEEE 01.01.2021
    Published in IEEE access (01.01.2021)
    “… This paper presents stochastic gradient descent algorithms for noise covariance estimation in adaptive Kalman filters that are an order of magnitude faster than the batch method for similar or better …”
    Full text
    Journal Article
  19.

    Strong Convergence Analysis of Batch Gradient-Based Learning Algorithm for Training Pi-Sigma Network Based on TSK Fuzzy Models by Liu, Yan, Yang, Dakun, Nan, Nan, Guo, Li, Zhang, Jianjun

    ISSN: 1370-4621, 1573-773X
    Published: New York Springer US 01.06.2016
    Published in Neural processing letters (01.06.2016)
    “… The aim of this paper is to present a gradient-based learning method for Pi-Sigma network to train TSK fuzzy inference system …”
    Full text
    Journal Article
  20.

    Mini-Batch Gradient-Based Algorithms on High-Dimensional Quadratic Problems: Exact Dynamics and Consequences by Cheng, Andrew

    ISBN: 9798382617244
    Published: ProQuest Dissertations & Theses 01.01.2023
    “… (illustrated by Figure 1.2). A popular class of these algorithms are the mini-batch gradient-based methods (see Definition 2 …”
    Full text
    Dissertation
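Several of the results above (entries 2, 5, 8, 10, and 14) concern batch gradient training with a smoothed L1/2 penalty: the raw penalty |w|^{1/2} is nonsmooth at zero and can make the weights oscillate, so |w| is first replaced near zero by a smooth piecewise polynomial. A minimal sketch of that idea on a least-squares problem, assuming one commonly used polynomial smoothing; the function names, the smoothing width `a`, and the constants `eta`/`lam` are illustrative, not taken from any one of the listed papers:

```python
import numpy as np

def smooth_abs(w, a=0.1):
    """Smooth surrogate of |w|: exact for |w| >= a, quartic polynomial inside."""
    inner = -w**4 / (8 * a**3) + 3 * w**2 / (4 * a) + 3 * a / 8
    return np.where(np.abs(w) >= a, np.abs(w), inner)

def smooth_abs_grad(w, a=0.1):
    """Derivative of smooth_abs (matches sign(w) at |w| = a)."""
    inner = -w**3 / (2 * a**3) + 3 * w / (2 * a)
    return np.where(np.abs(w) >= a, np.sign(w), inner)

def l_half_penalty(w, a=0.1):
    """Smoothed L1/2 regularizer: sum of sqrt(smooth_abs(w))."""
    return np.sum(np.sqrt(smooth_abs(w, a)))

def l_half_grad(w, a=0.1):
    # d/dw sqrt(f(w)) = f'(w) / (2 sqrt(f(w))); f >= 3a/8 > 0, so no division by zero
    f = smooth_abs(w, a)
    return smooth_abs_grad(w, a) / (2 * np.sqrt(f))

# Batch gradient descent: one update per pass over the WHOLE data set.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
w_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0])   # sparse target weights
y = X @ w_true + 0.01 * rng.normal(size=50)

w = rng.normal(size=5)
eta, lam = 0.05, 0.01
for _ in range(2000):
    grad = X.T @ (X @ w - y) / len(y) + lam * l_half_grad(w)
    w -= eta * grad
```

Because the smoothed penalty has a bounded gradient even at zero, the iteration stays monotone instead of oscillating, which is the property the convergence proofs in those entries rely on; the penalty still drives the truly zero weights toward zero.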
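Other entries (4, 6, 11, 18, and 20) concern mini-batch variants, which update on small random subsets of the data rather than on the full batch, trading gradient accuracy for many more updates per pass. A generic sketch of plain mini-batch gradient descent on least squares; the function name, batch size, and step size are illustrative and not drawn from any listed paper:

```python
import numpy as np

def minibatch_gd(X, y, batch_size=10, eta=0.1, epochs=100, seed=0):
    """Plain mini-batch gradient descent for the least-squares objective."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        perm = rng.permutation(n)                 # reshuffle each epoch
        for start in range(0, n, batch_size):
            b = perm[start:start + batch_size]    # indices of this mini-batch
            grad = X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= eta * grad
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
w_true = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ w_true + 0.01 * rng.normal(size=200)
w_hat = minibatch_gd(X, y)
```

Each epoch here performs n/batch_size updates instead of the single update of a batch method, which is why the mini-batch and stochastic schemes in these entries (SRG-DBB, SCGA, the multi-pass Kalman estimators) can be an order of magnitude faster per pass; their contribution is controlling the extra gradient noise this introduces.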