Batch Gradient Learning Algorithm with Smoothing L1 Regularization for Feedforward Neural Networks

Regularization techniques are critical in the development of machine learning models. Complex models, such as neural networks, are particularly prone to overfitting and to generalizing poorly beyond the training data. L1 regularization is the most extreme way to enforce sparsity, but, regrettably, it does...
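The title points to the paper's core idea: training a feedforward network with batch (full-gradient) descent while replacing the non-differentiable L1 penalty with a smooth surrogate. The sketch below only illustrates that general recipe, not the paper's specific algorithm: the surrogate sqrt(w^2 + eps), the one-hidden-layer tanh architecture, the squared-error loss, and all hyperparameters are assumptions chosen to keep the example minimal and runnable.

```python
import numpy as np

def smoothed_l1(w, eps=1e-4):
    """Smooth surrogate for |w|, differentiable at 0 (an assumed surrogate, not the paper's)."""
    return np.sqrt(w * w + eps)

def smoothed_l1_grad(w, eps=1e-4):
    """Derivative of the surrogate; approaches sign(w) away from 0."""
    return w / np.sqrt(w * w + eps)

def train_batch(X, y, hidden=8, lam=1e-3, lr=0.1, epochs=500, eps=1e-4, seed=0):
    """Batch-gradient training of a one-hidden-layer tanh network with
    squared-error loss plus a smoothed L1 penalty on all weights."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(scale=0.1, size=(d, hidden))
    W2 = rng.normal(scale=0.1, size=(hidden, 1))
    for _ in range(epochs):
        # Forward pass over the entire training set (one batch per epoch).
        H = np.tanh(X @ W1)          # hidden activations, shape (n, hidden)
        out = H @ W2                 # network output, shape (n, 1)
        err = out - y                # gradient of 0.5*(out - y)^2 w.r.t. out
        # Backward pass: data-fit gradient plus smoothed-L1 gradient.
        gW2 = H.T @ err / n + lam * smoothed_l1_grad(W2, eps)
        gH = (err @ W2.T) * (1.0 - H * H)      # back-propagate through tanh
        gW1 = X.T @ gH / n + lam * smoothed_l1_grad(W1, eps)
        # Single batch-gradient update per epoch.
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2

if __name__ == "__main__":
    # Tiny synthetic regression problem to exercise the routine.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 5))
    y = np.tanh(X[:, :1]) + 0.05 * rng.normal(size=(200, 1))
    W1, W2 = train_batch(X, y)
    weights = np.concatenate([W1.ravel(), W2.ravel()])
    print("fraction of near-zero weights:", np.mean(np.abs(weights) < 1e-2))
```

Because the surrogate is differentiable everywhere, the plain gradient update applies without subgradient or proximal steps; as eps shrinks, its gradient approaches sign(w) and the penalty behaves increasingly like true L1.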

Bibliographic Details
Published in: Computers (Basel), Vol. 12, No. 1, p. 4
Main Author: Mohamed, Khidir Shaib
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.01.2023
ISSN: 2073-431X