Search results - Batch gradient learning algorithm
1
Convergence of batch gradient learning algorithm with smoothing L1/2 regularization for Sigma–Pi–Sigma neural networks
ISSN: 0925-2312, 1872-8286 · Published: Elsevier B.V., 03.03.2015 · Published in: Neurocomputing (Amsterdam) (03.03.2015)
“… –Sigma neural networks. Compared with conventional gradient learning algorithm, this method produces sparser weights and simpler structure, and it improves the learning efficiency …”
Full text
Journal Article -
2
Deterministic convergence of complex mini-batch gradient learning algorithm for fully complex-valued neural networks
ISSN: 0925-2312, 1872-8286 · Published: Elsevier B.V., 24.09.2020 · Published in: Neurocomputing (Amsterdam) (24.09.2020)
“… This paper investigates the fully complex mini-batch gradient algorithm for training complex-valued neural networks …”
Full text
Journal Article -
3
Convergence analysis of the batch gradient-based neuro-fuzzy learning algorithm with smoothing L1/2 regularization for the first-order Takagi–Sugeno system
ISSN: 0165-0114 · Published: 15.07.2017 · Published in: Fuzzy sets and systems (15.07.2017)
Full text
Journal Article -
4
Batch Gradient Learning Algorithm with Smoothing L1 Regularization for Feedforward Neural Networks
ISSN: 2073-431X · Published: Basel: MDPI AG, 01.01.2023 · Published in: Computers (Basel) (01.01.2023)
“… In this paper, we propose a batch gradient learning algorithm with smoothing L1 regularization (BGSL1 …”
Full text
Journal Article -
5
Batch Gradient Learning Algorithm with Smoothing Regularization for Feedforward Neural Networks
ISSN: 2073-431X · Published: MDPI AG, 01.12.2022 · Published in: Computers (Basel) (01.12.2022)
“… In this paper, we propose a batch gradient learning algorithm with smoothing L1 regularization (BGSL1 …”
Full text
Journal Article -
6
Strong Convergence Analysis of Batch Gradient-Based Learning Algorithm for Training Pi-Sigma Network Based on TSK Fuzzy Models
ISSN: 1370-4621, 1573-773X · Published: New York: Springer US, 01.06.2016 · Published in: Neural processing letters (01.06.2016)
“… The aim of this paper is to present a gradient-based learning method for Pi-Sigma network to train TSK fuzzy inference system …”
Full text
Journal Article -
7
Convergence analysis of the batch gradient-based neuro-fuzzy learning algorithm with smoothing L1/2 regularization for the first-order Takagi–Sugeno system
ISSN: 0165-0114, 1872-6801 · Published: Elsevier B.V., 15.07.2017 · Published in: Fuzzy sets and systems (15.07.2017)
“… The neuro-fuzzy learning algorithm involves two tasks: generating comparable sparse networks and training the parameters …”
Full text
Journal Article -
8
A Batch Variable Learning Rate Gradient Descent Algorithm With the Smoothing L1/2 Regularization for Takagi-Sugeno Models
ISSN: 2169-3536 · Published: IEEE, 2020 · Published in: IEEE Access (2020)
“… A batch variable learning rate gradient descent algorithm is proposed to efficiently train a neuro-fuzzy network of zero-order Takagi-Sugeno inference systems …”
Full text
Journal Article -
9
A Batch Variable Learning Rate Gradient Descent Algorithm With the Smoothing L1/2 Regularization for Takagi-Sugeno Models
ISSN: 2169-3536 · Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020 · Published in: IEEE Access (2020)
“… A batch variable learning rate gradient descent algorithm is proposed to efficiently train a neuro-fuzzy network of zero-order Takagi-Sugeno inference systems …”
Full text
Journal Article -
10
Network revenue management with online inverse batch gradient descent method
ISSN: 1059-1478, 1937-5956 · Published: Los Angeles, CA: SAGE Publications, 01.07.2023 · Published in: Production and operations management (01.07.2023)
“… ' prices but is concave in products' market shares (or price‐controlled demand rates). This creates challenges in adopting any stochastic gradient descent …”
Full text
Journal Article -
11
Trustworthy Network Anomaly Detection Based on an Adaptive Learning Rate and Momentum in IIoT
ISSN: 1551-3203, 1941-0050 · Published: Piscataway: IEEE, 01.09.2020 · Published in: IEEE transactions on industrial informatics (01.09.2020)
“… and trustworthiness of IIoT devices has become an urgent problem to solve. In this article, we design a new hinge classification algorithm based on mini-batch gradient descent with an adaptive learning rate and momentum (HCA-MBGDALRM …”
Full text
Journal Article -
12
Big data dimensionality reduction-based supervised machine learning algorithms for NASH diagnosis
ISSN: 1471-2105 · Published: London: BioMed Central, 21.10.2025 · Published in: BMC bioinformatics (21.10.2025)
“… Optimization with Artificial Neural Networks (PSO-ANN) machine learning algorithm. Then, a gradient based Batch Least Squares (BLS …”
Full text
Journal Article -
13
A learning algorithm with a gradient normalization and a learning rate adaptation for the mini-batch type learning
Published: The Society of Instrument and Control Engineers (SICE), 01.09.2017 · Published in: 2017 56th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE) (01.09.2017)
“… The learning algorithms with gradient normalization mechanisms have been investigated, and their effectiveness has been shown …”
Full text
Conference paper -
14
Explore a deep learning multi-output neural network for regional multi-step-ahead air quality forecasts
ISSN: 0959-6526, 1879-1786 · Published: Elsevier Ltd, 01.02.2019 · Published in: Journal of cleaner production (01.02.2019)
“… ) neural network model that were incorporated with three deep learning algorithms (i.e., mini-batch gradient descent, dropout neuron and L2 regularization …”
Full text
Journal Article -
15
Adaptive stochastic conjugate gradient for machine learning
ISSN: 0957-4174, 1873-6793 · Published: Elsevier Ltd, 15.11.2022 · Published in: Expert systems with applications (15.11.2022)
“… ) algorithms have been widely used in machine learning. This paper considers conjugate gradient in the mini-batch setting …”
Full text
Journal Article -
16
A Static Security Region Analysis of New Power Systems Based on Improved Stochastic–Batch Gradient Pile Descent
ISSN: 2076-3417 · Published: Basel: MDPI AG, 01.05.2024 · Published in: Applied sciences (01.05.2024)
“… To address the slow training speed of traditional deep learning algorithms using batch gradient descent (BGD …”
Full text
Journal Article -
17
A new lightweight deep neural network for surface scratch detection
ISSN: 0268-3768, 1433-3015 · Published: London: Springer London, 01.11.2022 · Published in: International journal of advanced manufacturing technology (01.11.2022)
“… To this end, a large surface scratch dataset obtained from cylinder-on-flat sliding tests was used to train the WearNet with appropriate training parameters such as learning rate, gradient algorithm and mini-batch size …”
Full text
Journal Article -
18
A Fast Adaptive Online Gradient Descent Algorithm in Over-Parameterized Neural Networks
ISSN: 1370-4621, 1573-773X · Published: New York: Springer US, 01.08.2023 · Published in: Neural processing letters (01.08.2023)
“… Although many first-order adaptive gradient algorithms (e.g., Adam, AdaGrad) have been proposed to adjust the learning rate, they are vulnerable to the initial learning …”
Full text
Journal Article -
19
Reinforcement learning based optimal control of batch processes using Monte-Carlo deep deterministic policy gradient with phase segmentation
ISSN: 0098-1354, 1873-4375 · Published: Elsevier Ltd, 04.01.2021 · Published in: Computers & chemical engineering (04.01.2021)
“… • DDPG algorithm is modified with Monte-Carlo learning for stable agent training. • Suggested algorithm is applied to a batch polymerization process control problem …”
Full text
Journal Article -
20
Comparison of Stochastic Steepest Gradient Descent and Extended Kalman Filter as ARMA-FNN Learning Algorithms for Data-Driven System Identification of Batch Distillation Column
ISSN: 2470-640X · Published: IEEE, 02.10.2023 · Published in: IEEE International Conference on System Engineering and Technology (Online) (02.10.2023)
“… The plant used in this study is a batch-type distillation column system located in the ITB Honeywell Control Systems Laboratory, capable of separating binary mixtures of ethanol and water …”
Full text
Conference paper
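Several of the records above concern batch gradient training with a smoothing L1 (or L1/2) regularizer. As a minimal illustrative sketch of that general idea, and not the algorithm of any particular paper listed here, the following replaces the non-differentiable |w| with the smooth surrogate sqrt(w^2 + eps^2) and runs full-batch gradient descent on a linear model; all function names and parameter values are illustrative assumptions.

```python
import numpy as np

def smoothed_l1(w, eps=1e-2):
    # Smooth surrogate for |w|: sqrt(w^2 + eps^2) is differentiable at 0.
    return np.sqrt(w**2 + eps**2)

def train_batch(X, y, lam=0.01, lr=0.1, epochs=500, eps=1e-2):
    """Full-batch gradient descent on 0.5*MSE + lam * sum(smoothed |w_i|)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        r = X @ w - y                          # residuals over the full batch
        grad = X.T @ r / n                     # gradient of 0.5 * mean squared error
        grad += lam * w / smoothed_l1(w, eps)  # gradient of the smoothing L1 term
        w -= lr * grad
    return w
```

Because the surrogate's gradient stays bounded near zero, small weights are pushed toward zero without the non-differentiability of |w|, which is the property the convergence analyses in these papers exploit.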