Softermax: Hardware/Software Co-Design of an Efficient Softmax for Transformers

Detailed Bibliography
Published in: 2021 58th ACM/IEEE Design Automation Conference (DAC), pp. 469-474
Main authors: Stevens, Jacob R.; Venkatesan, Rangharajan; Dai, Steve; Khailany, Brucek; Raghunathan, Anand
Format: Conference paper
Language: English
Published: IEEE, 5 December 2021
Subjects: Deep learning; Design automation; Hardware; hardware/software codesign; Natural language processing; neural network accelerators; Neural networks; Software; Transformers
Online access: https://ieeexplore.ieee.org/document/9586134
Abstract: Transformers have transformed the field of natural language processing. Their superior performance is largely attributed to the use of stacked "self-attention" layers, each of which consists of matrix multiplies as well as softmax operations. As a result, unlike other neural networks, the softmax operation accounts for a significant fraction of the total run-time of Transformers. To address this, we propose Softermax, a hardware-friendly softmax design. Softermax consists of base replacement, low-precision softmax computations, and an online normalization calculation. We show Softermax results in 2.35x the energy efficiency at 0.90x the size of a comparable baseline, with negligible impact on network accuracy.
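To make the abstract's three ingredients concrete, here is a minimal Python sketch, assuming the base replacement is e -> 2 and the online normalization is the usual running-max/rescaled-sum pass; the function name `softermax_sketch` is illustrative, and the paper's low-precision fixed-point hardware arithmetic is not modeled here.

```python
def softermax_sketch(scores):
    """Illustrative base-2 softmax with online normalization.

    A single pass maintains a running maximum m and a running sum s of
    2**(x - m); whenever m grows, the accumulated sum is rescaled by
    2**(old_m - new_m) so earlier terms stay consistent.
    """
    m = float("-inf")  # running maximum of the scores seen so far
    s = 0.0            # running sum of 2**(x - m)
    for x in scores:
        m_new = max(m, x)
        # Rescale the accumulated sum to the new maximum, then add the term.
        s = s * 2.0 ** (m - m_new) + 2.0 ** (x - m_new)
        m = m_new
    # A second pass emits the normalized probabilities.
    return [2.0 ** (x - m) / s for x in scores]
```

For example, `softermax_sketch([1.0, 2.0, 3.0])` returns roughly [0.143, 0.286, 0.571], which sums to 1. Since 2**x equals e**(x * ln 2), base-2 softmax is equivalent to standard softmax on logits scaled by ln 2, which is why the paper can report negligible impact on network accuracy.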
Authors:
Jacob R. Stevens (Purdue University, West Lafayette)
Rangharajan Venkatesan (NVIDIA)
Steve Dai (NVIDIA)
Brucek Khailany (NVIDIA)
Anand Raghunathan (Purdue University, West Lafayette)
DOI: 10.1109/DAC18074.2021.9586134
EISBN: 9781665432740, 1665432748