Improving Neural Network Efficiency Using Piecewise Linear Approximation of Activation Functions


Bibliographic Details
Published in: Proceedings of the International Florida Artificial Intelligence Research Society Conference, Volume 38, Issue 1
Main Authors: Reddy, Pavan; Gujral, Aditya Sanjay
Format: Journal Article
Language: English
Published: LibraryPress@UF, 14 May 2025
ISSN: 2334-0754, 2334-0762
Description
Abstract: Activation functions play a pivotal role in Neural Networks by enabling the modeling of complex non-linear relationships within data. However, the computational cost associated with certain activation functions, such as the hyperbolic tangent (tanh) and its gradient, can be substantial. In this study, we demonstrate that a piecewise linear approximation of the tanh function, utilizing pre-calculated slopes, achieves faster computation without significant degradation in performance. Conversely, we show that a piecewise linear approximation of the sigmoid function is computationally slower than its continuous counterpart. These findings suggest that the computational efficiency of a piecewise activation function depends on whether the indexing and arithmetic costs of the approximation are lower than those of the continuous function.
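The abstract's core idea, a tanh approximation built from pre-calculated slopes selected by indexing, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the segment count, breakpoint range, and table layout here are assumptions, and the paper's actual choices are not given in this record.

```python
import numpy as np

# Hypothetical parameters: uniform breakpoints on [-4, 4]; outside this
# range tanh is nearly saturated, so inputs are clamped to the endpoints.
N_SEGMENTS = 64
LO, HI = -4.0, 4.0
xs = np.linspace(LO, HI, N_SEGMENTS + 1)

# Pre-calculated slope and intercept per segment (the "pre-calculated
# slopes" the abstract refers to), computed once ahead of time.
slopes = np.diff(np.tanh(xs)) / np.diff(xs)
intercepts = np.tanh(xs[:-1]) - slopes * xs[:-1]

def pw_tanh(x):
    """Piecewise linear tanh: clamp, index into the slope table, evaluate."""
    x = np.asarray(x, dtype=np.float64)
    xc = np.clip(x, LO, HI)
    # Uniform breakpoints let the segment index be computed with one
    # multiply and a cast instead of a search.
    idx = np.minimum(((xc - LO) / (HI - LO) * N_SEGMENTS).astype(int),
                     N_SEGMENTS - 1)
    return slopes[idx] * xc + intercepts[idx]
```

The abstract's conclusion maps onto this sketch directly: the approximation pays for a clip, an index computation, and two table loads per element, so it only wins when that total is cheaper than evaluating the continuous function itself.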
DOI: 10.32473/flairs.38.1.139005