Why RELU Units Sometimes Die: Analysis of Single-Unit Error Backpropagation in Neural Networks

Neural networks in machine learning now commonly use rectified linear units (ReLUs) in their early processing layers to improve performance. Training these structures sometimes produces "dying ReLU units" whose outputs remain near zero. We first explore this condition via simulation using the CIFAR-10 dataset ...
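The abstract refers to the "dying ReLU" phenomenon, in which a unit's pre-activation becomes negative for essentially every input, so the ReLU gate zeroes its backpropagated gradient and the unit stops learning. The sketch below is a minimal, hypothetical illustration of that mechanism for a single unit trained by gradient descent on squared error; the toy data, the deliberately large step size lr, and all variable names are assumptions chosen for illustration and are not taken from the paper's CIFAR-10 experiments.

import numpy as np

rng = np.random.default_rng(0)

# Toy regression data for a single ReLU unit y = max(w.x + b, 0)
X = rng.normal(size=(256, 4))
t = np.maximum(X @ np.array([1.0, -0.5, 0.3, 0.8]) + 0.2, 0.0)

w = rng.normal(scale=0.1, size=4)   # hypothetical small random initialization
b = 0.0
lr = 5.0                            # deliberately large step size to provoke the effect

for step in range(100):
    z = X @ w + b                   # pre-activation
    y = np.maximum(z, 0.0)          # ReLU output
    grad_z = (y - t) * (z > 0)      # gradient of 0.5*(y - t)^2 w.r.t. z; zero where the unit is off
    w -= lr * X.T @ grad_z / len(X)
    b -= lr * grad_z.mean()
    if not np.any(z > 0):           # inactive for every input: gradient is identically zero
        print(f"step {step}: unit has died; its weights can no longer change")
        break

With a large enough step, an update can push w and b into a region where z <= 0 for every training input; the (z > 0) gate then zeroes grad_z permanently and the unit never recovers, which is the dying-ReLU behavior the paper analyzes. If a particular run never enters that region, the loop simply finishes without the message; the point of the sketch is the mechanism, not a reproducible experiment.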


Bibliographic Details
Published in: Conference Record - Asilomar Conference on Signals, Systems, & Computers, pp. 864-868
Main Authors: Douglas, Scott C.; Yu, Jiutian
Format: Conference Proceeding
Language: English
Published: IEEE, 01.10.2018
ISSN: 2576-2303