Improving Neural Network Efficiency Using Piecewise Linear Approximation of Activation Functions

Bibliographic Details
Published in: Proceedings of the International Florida Artificial Intelligence Research Society Conference, Vol. 38, No. 1
Main Authors: Reddy, Pavan; Gujral, Aditya Sanjay
Format: Journal Article
Language: English
Published: LibraryPress@UF, 14 May 2025
ISSN: 2334-0754, 2334-0762
Description
Summary: Activation functions play a pivotal role in neural networks by enabling the modeling of complex non-linear relationships within data. However, the computational cost of certain activation functions, such as the hyperbolic tangent (tanh) and its gradient, can be substantial. In this study, we demonstrate that a piecewise linear approximation of the tanh function, using pre-calculated slopes, achieves faster computation without significant degradation in performance. Conversely, we show that a piecewise linear approximation of the sigmoid function is computationally slower than its continuous counterpart. These findings suggest that the computational efficiency of a piecewise activation function depends on whether the indexing and arithmetic costs of the approximation are lower than those of the continuous function.
DOI: 10.32473/flairs.38.1.139005
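
The construction the abstract describes can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the segment count (64), the interval [-4, 4], and the uniform-grid indexing scheme are assumptions made for this example, as the record does not specify them.

import numpy as np

# Pre-calculate slopes and intercepts for 64 linear segments on a uniform
# grid over [-4, 4] (assumed values); outside this range tanh is saturated.
LO, HI, SEGMENTS = -4.0, 4.0, 64
xs = np.linspace(LO, HI, SEGMENTS + 1)
ys = np.tanh(xs)
SLOPES = (ys[1:] - ys[:-1]) / (xs[1:] - xs[:-1])  # slope of each segment
INTERCEPTS = ys[:-1] - SLOPES * xs[:-1]           # so y = slope * x + intercept
STEP = (HI - LO) / SEGMENTS

def piecewise_tanh(x):
    """Approximate tanh(x) by indexing into the pre-calculated segments."""
    xc = np.clip(x, LO, HI - 1e-9)            # saturate out-of-range inputs
    idx = ((xc - LO) / STEP).astype(int)      # segment index for each element
    return SLOPES[idx] * xc + INTERCEPTS[idx]

def piecewise_tanh_grad(x):
    """The gradient of a piecewise linear function is the stored slope,
    so the backward pass reduces to the same table lookup."""
    xc = np.clip(x, LO, HI - 1e-9)
    idx = ((xc - LO) / STEP).astype(int)
    return SLOPES[idx]

Whether this lookup wins over evaluating tanh directly depends, as the abstract notes, on whether the index computation and the multiply-add cost less than the continuous evaluation. The same table construction applies to the sigmoid, for which the authors report the approximation to be slower.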