Search results - Batch gradient learning algorithm
1
Convergence of batch gradient learning algorithm with smoothing L1/2 regularization for Sigma–Pi–Sigma neural networks
ISSN: 0925-2312, 1872-8286. Published: Elsevier B.V., 03.03.2015. Published in: Neurocomputing (Amsterdam) (03.03.2015). “…–Sigma neural networks. Compared with conventional gradient learning algorithm, this method produces sparser weights and simpler structure, and it improves the learning efficiency…”
Journal Article (an illustrative sketch follows)
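The record above concerns batch (full-gradient) training with a smoothed L1/2 penalty that pushes redundant weights toward zero. Purely as a hedged illustration, not the paper's algorithm or its Sigma-Pi-Sigma architecture, the sketch below smooths |w|^(1/2) as (w^2 + eps)^(1/4) so the penalty gradient stays bounded near zero, and adds it to a full-batch least-squares update; the data, eps and step size are invented for the example.

    import numpy as np

    # Hedged sketch: full-batch gradient descent on least squares plus a
    # smoothed L1/2 penalty.  |w|^(1/2) is approximated by (w^2 + eps)^(1/4),
    # an illustrative smoothing choice that keeps the gradient bounded.
    def smoothed_l_half_grad(w, eps=1e-3):
        return 0.5 * w * (w**2 + eps) ** (-0.75)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    y = X @ np.array([1.5, 0.0, 0.0, -2.0, 0.0]) + 0.1 * rng.normal(size=100)

    w, lr, lam = np.zeros(5), 0.05, 0.05
    for _ in range(500):
        grad_loss = X.T @ (X @ w - y) / len(y)   # batch (full-sample) gradient
        w -= lr * (grad_loss + lam * smoothed_l_half_grad(w))

    print(np.round(w, 3))   # small-magnitude weights are pulled toward zero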
2
Deterministic convergence of complex mini-batch gradient learning algorithm for fully complex-valued neural networks
ISSN: 0925-2312, 1872-8286. Published: Elsevier B.V., 24.09.2020. Published in: Neurocomputing (Amsterdam) (24.09.2020). “…This paper investigates the fully complex mini-batch gradient algorithm for training complex-valued neural networks…”
Journal Article (an illustrative sketch follows)
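The record above studies mini-batch gradient training of fully complex-valued networks. As a rough sketch only, on a complex-valued linear model rather than the paper's network, the usual steepest-descent direction for a real-valued loss of complex parameters is the conjugated (Wirtinger-style) gradient; the sizes and learning rate below are invented.

    import numpy as np

    # Hedged sketch: mini-batch gradient descent for a complex-valued linear
    # model y ≈ X w under the real loss 0.5*||Xw - y||^2, using the
    # conjugated gradient direction X^H (Xw - y).
    rng = np.random.default_rng(1)
    n, dim, batch = 256, 4, 32
    X = rng.normal(size=(n, dim)) + 1j * rng.normal(size=(n, dim))
    w_true = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    y = X @ w_true

    w, lr = np.zeros(dim, dtype=complex), 0.05
    for epoch in range(50):
        for start in range(0, n, batch):
            Xb, yb = X[start:start + batch], y[start:start + batch]
            w -= lr * Xb.conj().T @ (Xb @ w - yb) / batch

    print(np.round(np.max(np.abs(w - w_true)), 4))   # near zero after training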
3
Convergence analysis of the batch gradient-based neuro-fuzzy learning algorithm with smoothing L1/2 regularization for the first-order Takagi–Sugeno system
ISSN: 0165-0114. Published: 15.07.2017. Published in: Fuzzy sets and systems (15.07.2017).
Journal Article
4
Batch Gradient Learning Algorithm with Smoothing L1 Regularization for Feedforward Neural Networks
ISSN: 2073-431X. Published: Basel, MDPI AG, 01.01.2023. Published in: Computers (Basel) (01.01.2023). “… In this paper, we propose a batch gradient learning algorithm with smoothing L1 regularization (BGSL1…”
Journal Article (an illustrative sketch follows)
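The record above (and the near-duplicate record that follows) proposes a batch gradient method with a smoothed L1 penalty. Purely as an illustration of the idea, not necessarily the BGSL1 smoothing function, a common smooth surrogate for |w| is sqrt(w^2 + eps): its gradient approaches sign(w) away from zero but stays differentiable at zero, which is what lets a plain batch gradient update be applied and analysed.

    import numpy as np

    # Illustrative smooth surrogate for the L1 penalty (not necessarily the
    # smoothing used by BGSL1): sqrt(w^2 + eps) ≈ |w| with a bounded gradient.
    def smooth_l1(w, eps=1e-6):
        return np.sum(np.sqrt(w**2 + eps))

    def smooth_l1_grad(w, eps=1e-6):
        return w / np.sqrt(w**2 + eps)      # tends to sign(w) for |w| >> sqrt(eps)

    # One full-batch update, given some loss-gradient function `grad_loss`
    # (a placeholder for the network's backpropagated batch gradient).
    def batch_step(w, grad_loss, lr=0.05, lam=0.01):
        return w - lr * (grad_loss(w) + lam * smooth_l1_grad(w))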
5
Batch Gradient Learning Algorithm with Smoothing Regularization for Feedforward Neural Networks
ISSN: 2073-431X. Published: MDPI AG, 01.12.2022. Published in: Computers (Basel) (01.12.2022). “… In this paper, we propose a batch gradient learning algorithm with smoothing L1 regularization (BGSL1…”
Journal Article
6
Strong Convergence Analysis of Batch Gradient-Based Learning Algorithm for Training Pi-Sigma Network Based on TSK Fuzzy Models
ISSN: 1370-4621, 1573-773X. Published: New York, Springer US, 01.06.2016. Published in: Neural processing letters (01.06.2016). “… The aim of this paper is to present a gradient-based learning method for Pi-Sigma network to train TSK fuzzy inference system…”
Journal Article (an illustrative sketch follows)
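The record above trains a Pi-Sigma network with a batch gradient method. As a generic sketch of the architecture only, not the paper's TSK-based variant, a Pi-Sigma unit feeds the input through K trainable linear summing units and multiplies their outputs, realising a K-th order polynomial model with comparatively few weights.

    import numpy as np

    # Generic Pi-Sigma unit (illustrative, not the paper's TSK-based network):
    # a Sigma layer of K trainable linear units followed by a fixed Pi
    # (product) layer that has no weights of its own.
    def pi_sigma_forward(x, W, b):
        """x: (d,), W: (K, d), b: (K,)  ->  scalar output."""
        sums = W @ x + b          # Sigma layer
        return np.prod(sums)      # Pi layer

    rng = np.random.default_rng(2)
    W, b = rng.normal(size=(3, 4)), rng.normal(size=3)
    print(pi_sigma_forward(rng.normal(size=4), W, b))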
7
Convergence analysis of the batch gradient-based neuro-fuzzy learning algorithm with smoothing L1/2 regularization for the first-order Takagi–Sugeno system
ISSN: 0165-0114, 1872-6801. Published: Elsevier B.V., 15.07.2017. Published in: Fuzzy sets and systems (15.07.2017). “… The neuro-fuzzy learning algorithm involves two tasks: generating comparable sparse networks and training the parameters…”
Journal Article
8
A Batch Variable Learning Rate Gradient Descent Algorithm With the Smoothing L1/2 Regularization for Takagi-Sugeno Models
ISSN: 2169-3536. Published: IEEE, 2020. Published in: IEEE access (2020). “…A batch variable learning rate gradient descent algorithm is proposed to efficiently train a neuro-fuzzy network of zero-order Takagi-Sugeno inference systems…”
Journal Article
9
A Batch Variable Learning Rate Gradient Descent Algorithm With the Smoothing L1/2 Regularization for Takagi-Sugeno Models
ISSN: 2169-3536. Published: Piscataway, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2020. Published in: IEEE access (2020). “…A batch variable learning rate gradient descent algorithm is proposed to efficiently train a neuro-fuzzy network of zero-order Takagi-Sugeno inference systems…”
Journal Article
10
Network revenue management with online inverse batch gradient descent method
ISSN: 1059-1478, 1937-5956. Published: Los Angeles, CA, SAGE Publications, 01.07.2023. Published in: Production and operations management (01.07.2023). “…' prices but is concave in products' market shares (or price‐controlled demand rates). This creates challenges in adopting any stochastic gradient descent…”
Journal Article
11
Trustworthy Network Anomaly Detection Based on an Adaptive Learning Rate and Momentum in IIoT
ISSN: 1551-3203, 1941-0050. Published: Piscataway, IEEE, 01.09.2020. Published in: IEEE transactions on industrial informatics (01.09.2020). “… and trustworthiness of IIoT devices has become an urgent problem to solve. In this article, we design a new hinge classification algorithm based on mini-batch gradient descent with an adaptive learning rate and momentum (HCA-MBGDALRM…”
Journal Article (an illustrative sketch follows)
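The record above builds a hinge-loss classifier trained by mini-batch gradient descent with an adaptive learning rate and momentum (HCA-MBGDALRM). The sketch below only wires together the generic ingredients on a linear classifier; the 1/sqrt(t) decay stands in for the paper's adaptation rule, which is not reproduced, and the data and hyperparameters are invented.

    import numpy as np

    # Hedged sketch: linear hinge-loss classifier trained with mini-batch
    # gradient descent, classical momentum, and a simple 1/sqrt(t) learning
    # rate decay standing in for "adaptive learning rate".
    rng = np.random.default_rng(3)
    n, d, batch = 512, 10, 64
    X = rng.normal(size=(n, d))
    y = np.sign(X @ rng.normal(size=d))          # labels in {-1, +1}

    w, v = np.zeros(d), np.zeros(d)              # weights and momentum buffer
    lr0, beta, lam, t = 0.5, 0.9, 1e-3, 0
    for epoch in range(20):
        for start in range(0, n, batch):
            t += 1
            Xb, yb = X[start:start + batch], y[start:start + batch]
            active = yb * (Xb @ w) < 1.0         # samples violating the margin
            grad = -(yb[active, None] * Xb[active]).sum(axis=0) / batch + lam * w
            v = beta * v + grad                  # momentum accumulation
            w -= (lr0 / np.sqrt(t)) * v          # decaying learning rate step

    print((np.sign(X @ w) == y).mean())          # training accuracy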
12
Big data dimensionality reduction-based supervised machine learning algorithms for NASH diagnosis
ISSN: 1471-2105. Published: London, BioMed Central, 21.10.2025. Published in: BMC bioinformatics (21.10.2025). “… Optimization with Artificial Neural Networks (PSO-ANN) machine learning algorithm. Then, a gradient based Batch Least Squares (BLS…”
Journal Article
13
A learning algorithm with a gradient normalization and a learning rate adaptation for the mini-batch type learning
Published: The Society of Instrument and Control Engineers - SICE, 01.09.2017. Published in: 2017 56th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE) (01.09.2017). “… The learning algorithms with gradient normalization mechanisms have been investigated, and their effectiveness has been shown…”
Conference Paper (an illustrative sketch follows)
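The conference paper above combines gradient normalization with learning-rate adaptation for mini-batch training. The two helpers below only illustrate those generic ingredients under invented rules (unit-norm gradient steps and a multiplicative increase or decrease of the rate depending on whether the last batch loss improved); the paper's specific mechanisms are not reproduced.

    import numpy as np

    # Illustrative only: (1) normalise the mini-batch gradient so the step
    # length is set entirely by the learning rate, and (2) adapt the rate
    # multiplicatively depending on whether the batch loss went down.
    def normalized_step(w, grad, lr, eps=1e-12):
        return w - lr * grad / (np.linalg.norm(grad) + eps)

    def adapt_lr(lr, loss_new, loss_old, up=1.05, down=0.7):
        return lr * up if loss_new < loss_old else lr * down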
14
Explore a deep learning multi-output neural network for regional multi-step-ahead air quality forecasts
ISSN: 0959-6526, 1879-1786. Published: Elsevier Ltd, 01.02.2019. Published in: Journal of cleaner production (01.02.2019). “…) neural network model that were incorporated with three deep learning algorithms (i.e., mini-batch gradient descent, dropout neuron and L2 regularization…”
Journal Article (an illustrative sketch follows)
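The record above trains a multi-output forecasting network with mini-batch gradient descent, dropout and L2 regularization. The sketch below wires those three ingredients into a small one-hidden-layer regression network in plain NumPy; the architecture, data and hyperparameters are invented and the paper's model is not reproduced.

    import numpy as np

    # Hedged sketch: mini-batch gradient descent with inverted dropout on the
    # hidden layer and L2 weight decay, for a one-hidden-layer tanh network.
    rng = np.random.default_rng(4)
    n, d, h, batch = 512, 8, 16, 32
    X = rng.normal(size=(n, d))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

    W1, b1 = rng.normal(scale=0.3, size=(d, h)), np.zeros(h)
    W2, b2 = rng.normal(scale=0.3, size=h), 0.0
    lr, lam, keep = 0.05, 1e-4, 0.8              # L2 strength and dropout keep-rate

    for epoch in range(100):
        perm = rng.permutation(n)
        for start in range(0, n, batch):
            idx = perm[start:start + batch]
            Xb, yb = X[idx], y[idx]
            # forward pass with inverted dropout on the hidden layer
            H = np.tanh(Xb @ W1 + b1)
            mask = (rng.random(H.shape) < keep) / keep
            Hd = H * mask
            err = Hd @ W2 + b2 - yb              # (batch,)
            # mini-batch gradients plus L2 (weight decay) terms
            gW2 = Hd.T @ err / batch + lam * W2
            gb2 = err.mean()
            dH = np.outer(err, W2) * mask * (1 - H**2)
            gW1 = Xb.T @ dH / batch + lam * W1
            gb1 = dH.mean(axis=0)
            W1 -= lr * gW1
            b1 -= lr * gb1
            W2 -= lr * gW2
            b2 -= lr * gb2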
15
Adaptive stochastic conjugate gradient for machine learning
ISSN: 0957-4174, 1873-6793. Published: Elsevier Ltd, 15.11.2022. Published in: Expert systems with applications (15.11.2022). “…) algorithms have been widely used in machine learning. This paper considers conjugate gradient in the mini-batch setting…”
Journal Article (an illustrative sketch follows)
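The record above studies conjugate gradient in the mini-batch setting. The sketch below just runs the plain Fletcher-Reeves recursion on mini-batch gradients of a least-squares problem, with a fixed step size and a direction restart each epoch; coping with gradient noise in the beta coefficient and in the step size is exactly what dedicated stochastic CG methods address, and none of that is reproduced here.

    import numpy as np

    # Hedged sketch: Fletcher-Reeves conjugate directions computed from
    # mini-batch gradients (fixed step size, restart at every epoch).
    rng = np.random.default_rng(5)
    n, dim, batch = 400, 6, 50
    X = rng.normal(size=(n, dim))
    y = X @ rng.normal(size=dim)

    w, lr = np.zeros(dim), 0.02
    for epoch in range(30):
        d, g_prev_sq = np.zeros(dim), None       # restart the direction
        for start in range(0, n, batch):
            Xb, yb = X[start:start + batch], y[start:start + batch]
            g = Xb.T @ (Xb @ w - yb) / batch     # mini-batch gradient
            beta = 0.0 if g_prev_sq is None else (g @ g) / g_prev_sq
            d = -g + beta * d                    # Fletcher-Reeves direction
            g_prev_sq = g @ g
            w += lr * d

    print(np.round(np.linalg.norm(X @ w - y), 4))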
16
A Static Security Region Analysis of New Power Systems Based on Improved Stochastic–Batch Gradient Pile Descent
ISSN: 2076-3417. Published: Basel, MDPI AG, 01.05.2024. Published in: Applied sciences (01.05.2024). “… To address the slow training speed of traditional deep learning algorithms using batch gradient descent (BGD…”
Journal Article
17
A new lightweight deep neural network for surface scratch detection
ISSN: 0268-3768, 1433-3015. Published: London, Springer London, 01.11.2022. Published in: International journal of advanced manufacturing technology (01.11.2022). “… To this end, a large surface scratch dataset obtained from cylinder-on-flat sliding tests was used to train the WearNet with appropriate training parameters such as learning rate, gradient algorithm and mini-batch size…”
Journal Article
18
A Fast Adaptive Online Gradient Descent Algorithm in Over-Parameterized Neural Networks
ISSN: 1370-4621, 1573-773X. Published: New York, Springer US, 01.08.2023. Published in: Neural processing letters (01.08.2023). “… Although many first-order adaptive gradient algorithms (e.g., Adam, AdaGrad) have been proposed to adjust the learning rate, they are vulnerable to the initial learning…”
Journal Article (an illustrative sketch follows)
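The record above discusses adaptive first-order methods such as Adam and AdaGrad and their sensitivity to the initial learning rate. The standard AdaGrad update below makes that sensitivity visible: each coordinate's step is the initial rate lr0 divided by the root of that coordinate's accumulated squared gradients, so lr0 scales the whole trajectory.

    import numpy as np

    # Standard AdaGrad update: per-coordinate steps shrink as squared
    # gradients accumulate, but every step remains proportional to lr0.
    def adagrad_step(w, grad, accum, lr0=0.1, eps=1e-8):
        accum = accum + grad**2
        w = w - lr0 * grad / (np.sqrt(accum) + eps)
        return w, accum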
19
Reinforcement learning based optimal control of batch processes using Monte-Carlo deep deterministic policy gradient with phase segmentation
ISSN: 0098-1354, 1873-4375. Published: Elsevier Ltd, 04.01.2021. Published in: Computers & chemical engineering (04.01.2021). “…•DDPG algorithm is modified with Monte-Carlo learning for stable agent training.•Suggested algorithm is applied to a batch polymerization process control problem…”
Journal Article
20
Comparison of Stochastic Steepest Gradient Descent and Extended Kalman Filter as ARMA-FNN Learning Algorithms for Data-Driven System Identification of Batch Distillation Column
ISSN: 2470-640X. Published: IEEE, 02.10.2023. Published in: IEEE International Conference on System Engineering and Technology (Online) (02.10.2023). “… The plant used in this study is a batch-type distillation column system located in the ITB Honeywell Control Systems Laboratory, capable of separating binary mixtures of ethanol and water…”
Conference Paper