Physical data embedding for memory efficient AI

Saved in:
Detailed bibliography
Title: Physical data embedding for memory efficient AI
Authors: Callen MacPhee, Yiming Zhou, Bahram Jalali
Source: Machine Learning: Science and Technology, Vol 6, Iss 4, p 045018 (2025)
Publisher Information: IOP Publishing
Publication Year: 2025
Collection: Directory of Open Access Journals: DOAJ Articles
Subject Terms: physics-AI symbiosis, interpretable AI, physics-inspired algorithms, physics-based neural networks, memory-efficient AI, Computer engineering. Computer hardware, TK7885-7895, Electronic computers. Computer science, QA75.5-76.95
Description: Deep neural networks have achieved exceptional performance across various fields by learning complex, nonlinear mappings from large-scale datasets. However, they face challenges such as high memory requirements and limited interpretability. This paper introduces an approach where master equations of physics are converted into multilayered networks that are trained via backpropagation. The resulting general-purpose model effectively encodes data in the properties of the underlying physical system. In contrast to existing methods wherein a trained neural network is used as a computationally efficient alternative for solving physical equations, our approach directly treats physics equations as trainable models. Rather than approximating physics with a neural network or augmenting a network with physics-inspired constraints, this framework makes the equation itself the architecture. We demonstrate this physical embedding concept with the nonlinear Schrödinger equation, which acts as a trainable architecture for learning complex patterns, including nonlinear mappings and memory effects, from data. The network embeds the data representation in orders of magnitude fewer parameters than conventional neural networks when tested on time series data. Notably, the trained ‘Nonlinear Schrödinger Network’ is interpretable, with all parameters having physical meanings. Additionally, this approach provides a blueprint for implementing such AI computations in physical analog systems, offering a direct path toward low-latency and energy-efficient hardware realizations. The proposed method is also extended to the Gross-Pitaevskii equation, demonstrating the broad applicability of the framework to other master equations of physics. Among our results, an ablation study quantifies the relative importance of physical terms such as dispersion, nonlinearity, and potential energy for classification accuracy.
We also outline the limitations and benefits of this approach as it relates to universality and generalizability. Overall, this work aims ...
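The core idea described above, propagating data through a discretized physics equation whose coefficients serve as the trainable parameters, can be illustrated with a minimal sketch. The snippet below assumes a split-step Fourier discretization of the nonlinear Schrödinger equation; the function names, layer count, parameter values, and intensity readout are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def nlse_layer(field, beta2, gamma, dz, omega):
    """One propagation step = one network 'layer'.

    beta2 (dispersion) and gamma (nonlinearity) are the physically
    meaningful parameters that would be trained via backpropagation."""
    # Linear (dispersion) step applied in the frequency domain
    field = np.fft.ifft(np.fft.fft(field) * np.exp(0.5j * beta2 * omega**2 * dz))
    # Nonlinear (Kerr-type) phase step applied in the time domain
    field = field * np.exp(1j * gamma * np.abs(field)**2 * dz)
    return field

def nlse_network(signal, beta2s, gammas, dz=0.1):
    """Stack of split-step layers: the input signal is launched into the
    simulated medium and the output intensity is read out as features."""
    n = signal.size
    omega = 2 * np.pi * np.fft.fftfreq(n)
    field = signal.astype(complex)
    for beta2, gamma in zip(beta2s, gammas):
        field = nlse_layer(field, beta2, gamma, dz, omega)
    return np.abs(field)**2  # intensity readout, one feature per sample

rng = np.random.default_rng(0)
x = rng.standard_normal(64)
# Three "layers", i.e. only six scalar physical parameters in total
features = nlse_network(x, beta2s=[0.5, -0.3, 0.2], gammas=[0.1, 0.1, 0.1])
print(features.shape)
```

Because both steps are phase-only (unitary) operations, the total signal power is conserved through the network; the representation lives entirely in a handful of physically interpretable coefficients rather than in large weight matrices.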
Document Type: article in journal/newspaper
Language: English
Relation: https://doi.org/10.1088/2632-2153/ae0f37; https://doaj.org/toc/2632-2153; https://doaj.org/article/b97d56612e9342afadd949cf144179ee
DOI: 10.1088/2632-2153/ae0f37
Availability: https://doi.org/10.1088/2632-2153/ae0f37
https://doaj.org/article/b97d56612e9342afadd949cf144179ee
Accession Number: edsbas.238D9702
Database: BASE
FullText Text:
  Availability: 0
CustomLinks:
  – Url: https://doi.org/10.1088/2632-2153/ae0f37#
    Name: EDS - BASE (s4221598)
    Category: fullText
    Text: View record from BASE
  – Url: https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=EBSCO&SrcAuth=EBSCO&DestApp=WOS&ServiceName=TransferToWoS&DestLinkType=GeneralSearchSummary&Func=Links&author=MacPhee%20C
    Name: ISI
    Category: fullText
    Text: Find this article in Web of Science
    Icon: https://imagesrvr.epnet.com/ls/20docs.gif
    MouseOverText: Find this article in Web of Science
Header DbId: edsbas
DbLabel: BASE
An: edsbas.238D9702
RelevancyScore: 997
AccessLevel: 3
PubType: Academic Journal
PubTypeId: academicJournal
PreciseRelevancyScore: 996.707214355469
IllustrationInfo
Items – Name: Title
  Label: Title
  Group: Ti
  Data: Physical data embedding for memory efficient AI
– Name: Author
  Label: Authors
  Group: Au
  Data: <searchLink fieldCode="AR" term="%22Callen+MacPhee%22">Callen MacPhee</searchLink><br /><searchLink fieldCode="AR" term="%22Yiming+Zhou%22">Yiming Zhou</searchLink><br /><searchLink fieldCode="AR" term="%22Bahram+Jalali%22">Bahram Jalali</searchLink>
– Name: TitleSource
  Label: Source
  Group: Src
  Data: Machine Learning: Science and Technology, Vol 6, Iss 4, p 045018 (2025)
– Name: Publisher
  Label: Publisher Information
  Group: PubInfo
  Data: IOP Publishing
– Name: DatePubCY
  Label: Publication Year
  Group: Date
  Data: 2025
– Name: Subset
  Label: Collection
  Group: HoldingsInfo
  Data: Directory of Open Access Journals: DOAJ Articles
– Name: Subject
  Label: Subject Terms
  Group: Su
  Data: <searchLink fieldCode="DE" term="%22physics-AI+symbiosis%22">physics-AI symbiosis</searchLink><br /><searchLink fieldCode="DE" term="%22interpretable+AI%22">interpretable AI</searchLink><br /><searchLink fieldCode="DE" term="%22physics-inspired+algorithms%22">physics-inspired algorithms</searchLink><br /><searchLink fieldCode="DE" term="%22physics-based+neural+networks%22">physics-based neural networks</searchLink><br /><searchLink fieldCode="DE" term="%22memory-efficient+AI%22">memory-efficient AI</searchLink><br /><searchLink fieldCode="DE" term="%22Computer+engineering%2E+Computer+hardware%22">Computer engineering. Computer hardware</searchLink><br /><searchLink fieldCode="DE" term="%22TK7885-7895%22">TK7885-7895</searchLink><br /><searchLink fieldCode="DE" term="%22Electronic+computers%2E+Computer+science%22">Electronic computers. Computer science</searchLink><br /><searchLink fieldCode="DE" term="%22QA75%2E5-76%2E95%22">QA75.5-76.95</searchLink>
– Name: Abstract
  Label: Description
  Group: Ab
  Data: Deep neural networks have achieved exceptional performance across various fields by learning complex, nonlinear mappings from large-scale datasets. However, they face challenges such as high memory requirements and limited interpretability. This paper introduces an approach where master equations of physics are converted into multilayered networks that are trained via backpropagation. The resulting general-purpose model effectively encodes data in the properties of the underlying physical system. In contrast to existing methods wherein a trained neural network is used as a computationally efficient alternative for solving physical equations, our approach directly treats physics equations as trainable models. Rather than approximating physics with a neural network or augmenting a network with physics-inspired constraints, this framework makes the equation itself the architecture. We demonstrate this physical embedding concept with the nonlinear Schrödinger equation, which acts as a trainable architecture for learning complex patterns, including nonlinear mappings and memory effects, from data. The network embeds the data representation in orders of magnitude fewer parameters than conventional neural networks when tested on time series data. Notably, the trained ‘Nonlinear Schrödinger Network’ is interpretable, with all parameters having physical meanings. Additionally, this approach provides a blueprint for implementing such AI computations in physical analog systems, offering a direct path toward low-latency and energy-efficient hardware realizations. The proposed method is also extended to the Gross-Pitaevskii equation, demonstrating the broad applicability of the framework to other master equations of physics. Among our results, an ablation study quantifies the relative importance of physical terms such as dispersion, nonlinearity, and potential energy for classification accuracy.
We also outline the limitations and benefits of this approach as it relates to universality and generalizability. Overall, this work aims ...
– Name: TypeDocument
  Label: Document Type
  Group: TypDoc
  Data: article in journal/newspaper
– Name: Language
  Label: Language
  Group: Lang
  Data: English
– Name: NoteTitleSource
  Label: Relation
  Group: SrcInfo
  Data: https://doi.org/10.1088/2632-2153/ae0f37; https://doaj.org/toc/2632-2153; https://doaj.org/article/b97d56612e9342afadd949cf144179ee
– Name: DOI
  Label: DOI
  Group: ID
  Data: 10.1088/2632-2153/ae0f37
– Name: URL
  Label: Availability
  Group: URL
  Data: https://doi.org/10.1088/2632-2153/ae0f37<br />https://doaj.org/article/b97d56612e9342afadd949cf144179ee
– Name: AN
  Label: Accession Number
  Group: ID
  Data: edsbas.238D9702
PLink https://erproxy.cvtisr.sk/sfx/access?url=https://search.ebscohost.com/login.aspx?direct=true&site=eds-live&db=edsbas&AN=edsbas.238D9702
RecordInfo BibRecord:
  BibEntity:
    Identifiers:
      – Type: doi
        Value: 10.1088/2632-2153/ae0f37
    Languages:
      – Text: English
    Subjects:
      – SubjectFull: physics-AI symbiosis
        Type: general
      – SubjectFull: interpretable AI
        Type: general
      – SubjectFull: physics-inspired algorithms
        Type: general
      – SubjectFull: physics-based neural networks
        Type: general
      – SubjectFull: memory-efficient AI
        Type: general
      – SubjectFull: Computer engineering. Computer hardware
        Type: general
      – SubjectFull: TK7885-7895
        Type: general
      – SubjectFull: Electronic computers. Computer science
        Type: general
      – SubjectFull: QA75.5-76.95
        Type: general
    Titles:
      – TitleFull: Physical data embedding for memory efficient AI
        Type: main
  BibRelationships:
    HasContributorRelationships:
      – PersonEntity:
          Name:
            NameFull: Callen MacPhee
      – PersonEntity:
          Name:
            NameFull: Yiming Zhou
      – PersonEntity:
          Name:
            NameFull: Bahram Jalali
    IsPartOfRelationships:
      – BibEntity:
          Dates:
            – D: 01
              M: 01
              Type: published
              Y: 2025
          Identifiers:
            – Type: issn-locals
              Value: edsbas
            – Type: issn-locals
              Value: edsbas.oa
          Titles:
            – TitleFull: Machine Learning: Science and Technology, Vol 6, Iss 4, p 045018 (2025)
              Type: main
ResultId 1