Improving Neural Network Efficiency Using Piecewise Linear Approximation of Activation Functions

Bibliographic Details
Published in: Proceedings of the International Florida Artificial Intelligence Research Society Conference, Vol. 38, No. 1
Main authors: Reddy, Pavan; Gujral, Aditya Sanjay
Format: Journal Article
Language: English
Published: LibraryPress@UF, 14 May 2025
ISSN: 2334-0754; EISSN: 2334-0762
Online access: Full text

Abstract: Activation functions play a pivotal role in Neural Networks by enabling the modeling of complex non-linear relationships within data. However, the computational cost associated with certain activation functions, such as the hyperbolic tangent (tanh) and its gradient, can be substantial. In this study, we demonstrate that a piecewise linear approximation of the tanh function, utilizing pre-calculated slopes, achieves faster computation without significant degradation in performance. Conversely, we show that a piecewise linear approximation of the sigmoid function is computationally slower compared to its continuous counterpart. These findings suggest that the computational efficiency of a piecewise activation function depends on whether the indexing and arithmetic costs of the approximation are lower than those of the continuous function.
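
To make the idea described in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of a piecewise linear tanh approximation using pre-calculated slopes and intercepts over fixed breakpoints. The segment count, the clipping range, and all names are illustrative assumptions.

import numpy as np

# Pre-calculated breakpoints, slopes, and intercepts for a piecewise linear tanh.
# The 16-segment layout and the [-3, 3] range are illustrative choices; outside
# that range tanh is treated as saturated at -1 or +1.
BREAKPOINTS = np.linspace(-3.0, 3.0, 17)                       # 17 points -> 16 segments
SLOPES = np.diff(np.tanh(BREAKPOINTS)) / np.diff(BREAKPOINTS)  # slope of each segment
INTERCEPTS = np.tanh(BREAKPOINTS[:-1]) - SLOPES * BREAKPOINTS[:-1]

def piecewise_tanh(x):
    """Approximate tanh(x) with a table lookup plus one multiply-add."""
    x = np.asarray(x, dtype=np.float64)
    # Index of the segment containing each input, clipped to the valid range.
    idx = np.clip(np.searchsorted(BREAKPOINTS, x) - 1, 0, len(SLOPES) - 1)
    y = SLOPES[idx] * x + INTERCEPTS[idx]
    # Saturate outside the approximated region.
    return np.where(x <= BREAKPOINTS[0], -1.0,
                    np.where(x >= BREAKPOINTS[-1], 1.0, y))

if __name__ == "__main__":
    xs = np.linspace(-4.0, 4.0, 2001)
    err = np.max(np.abs(piecewise_tanh(xs) - np.tanh(xs)))
    print(f"max absolute error: {err:.4f}")   # on the order of 1e-2 with this segment count

Whether such an approximation is faster than evaluating tanh directly depends, as the abstract notes, on whether the segment lookup and the multiply-add cost less than the continuous function on the target hardware.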

DOI: 10.32473/flairs.38.1.139005
License: CC BY-NC 4.0 (https://creativecommons.org/licenses/by-nc/4.0)
ORCID (Reddy, Pavan): 0009-0001-4832-1845
Open access link: https://doaj.org/article/84f6ae1f5cc948188f24034f53a4cd62