Generating Predicate Logic Expressions from Natural Language
| Published in: | Proceedings of IEEE Southeastcon, pp. 1 - 8 |
|---|---|
| Main Authors: | Levkovskyi, Oleksii; Li, Wei |
| Format: | Conference Proceeding |
| Language: | English |
| Published: | IEEE, 10 March 2021 |
| Subjects: | Bidirectional control; Data models; machine learning; Measurement; Natural languages; neural machine translation; NLP; predicate logic; Predictive models; Recurrent neural networks; Semantics |
| ISSN: | 1558-058X |
| Online Access: | Full text |
| Abstract | Formal logic expressions are commonly written in standardized mathematical notation. Learning this notation typically requires many years of experience and is not an explicit part of undergraduate academic curricula. Constructing and comprehending logical predicates can feel difficult and unintuitive. We hypothesized that this process can be automated using neural machine translation. Most machine translation techniques involve word-based segmentation as a preprocessing step. Given the nature of our custom dataset, which hosts first-order-logic (FOL) semantics primarily in unigram tokens, the word-based approach does not seem applicable. The proposed solution was to automate the translation of short English sentences into FOL expressions using character-level prediction in a recurrent neural network model. We trained four encoder-decoder models (LSTM, Bidirectional GRU with Attention, and two variants of Bidirectional LSTM with Attention). Our experimental results showed that several established neural translation techniques can be implemented to produce highly accurate machine translators of English sentences to FOL formalisms, given only characters as markers of semantics. We also demonstrated that attention-based enhancement to the encoder-decoder architecture can vastly improve translation accuracy. |
|---|---|
| Author | Levkovskyi, Oleksii; Li, Wei |
| Author Details | Oleksii Levkovskyi (ol150@mynsu.nova.edu) and Wei Li (lwei@nova.edu), College of Engineering and Computing, Nova Southeastern University, Fort Lauderdale, Florida, USA |
| DOI | 10.1109/SoutheastCon45413.2021.9401852 |
| EISBN | 9781665403795; 1665403799 |
| EISSN | 1558-058X |
| EndPage | 8 |
| PageCount | 8 |
| PublicationDate | 2021-March-10 |
| PublicationTitle | Proceedings of IEEE Southeastcon |
| PublicationTitleAbbrev | SOUTHEASTCON |
| PublicationYear | 2021 |
| Publisher | IEEE |
| StartPage | 1 |
| SubjectTerms | Bidirectional control; Data models; machine learning; Measurement; Natural languages; neural machine translation; NLP; predicate logic; Predictive models; Recurrent neural networks; Semantics |
| Title | Generating Predicate Logic Expressions from Natural Language |
| URI | https://ieeexplore.ieee.org/document/9401852 |
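
The abstract motivates character-level segmentation by noting that the dataset carries FOL semantics mostly in unigram tokens, where a word-based tokenizer would mangle operator and variable symbols. The sketch below illustrates what such character-level preprocessing could look like; the English/FOL pair, function names, and special tokens are illustrative assumptions, not taken from the paper's dataset or code.

```python
# Minimal sketch of character-level preprocessing for English -> FOL pairs.
# The parallel pair below is hypothetical, not drawn from the paper's dataset.

def build_char_vocab(texts, specials=("<pad>", "<sos>", "<eos>")):
    """Map every character seen in the corpus to an integer id."""
    chars = sorted({ch for text in texts for ch in text})
    vocab = {tok: idx for idx, tok in enumerate(specials)}
    vocab.update({ch: idx for idx, ch in enumerate(chars, start=len(specials))})
    return vocab

def encode(text, vocab):
    """Turn a string into a list of character ids framed by <sos>/<eos>."""
    return [vocab["<sos>"]] + [vocab[ch] for ch in text] + [vocab["<eos>"]]

# Hypothetical parallel example: an English sentence and a FOL rendering.
english = "all humans are mortal"
fol = "∀x(Human(x)→Mortal(x))"

src_vocab = build_char_vocab([english])
tgt_vocab = build_char_vocab([fol])

print(encode(english, src_vocab))  # source character ids
print(encode(fol, tgt_vocab))      # target character ids
```

In a full pipeline these id sequences would then be padded with `<pad>` to a common length and batched before being fed to the recurrent models.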
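
The abstract also reports that attention-based encoder-decoder variants (for example, a bidirectional GRU with attention) gave the largest accuracy gains. Below is a minimal PyTorch sketch of that general architecture, assuming illustrative layer sizes and a simplified additive-style attention score; it is not the authors' implementation or their reported configuration.

```python
# Sketch of a bidirectional GRU encoder with an attention-equipped GRU decoder,
# operating on character ids. Sizes are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)

    def forward(self, src):                               # src: (batch, src_len)
        outputs, hidden = self.rnn(self.embed(src))       # outputs: (batch, src_len, 2*hid)
        # Concatenate the final forward/backward states to initialize the decoder.
        hidden = torch.cat([hidden[0], hidden[1]], dim=-1).unsqueeze(0)
        return outputs, hidden

class AttnDecoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.attn = nn.Linear(4 * hid_dim, 1)             # simplified additive-style score
        self.rnn = nn.GRU(emb_dim + 2 * hid_dim, 2 * hid_dim, batch_first=True)
        self.out = nn.Linear(2 * hid_dim, vocab_size)

    def forward(self, tgt_char, hidden, enc_outputs):
        # tgt_char: (batch, 1) id of the previously generated target character.
        query = hidden[-1].unsqueeze(1).expand(-1, enc_outputs.size(1), -1)
        scores = self.attn(torch.cat([query, enc_outputs], dim=-1))   # (batch, src_len, 1)
        weights = torch.softmax(scores, dim=1)                        # attend over source chars
        context = (weights * enc_outputs).sum(dim=1, keepdim=True)    # (batch, 1, 2*hid)
        rnn_in = torch.cat([self.embed(tgt_char), context], dim=-1)
        output, hidden = self.rnn(rnn_in, hidden)
        return self.out(output.squeeze(1)), hidden                    # logits over target chars

# Toy forward pass with random ids, just to show the shapes line up.
enc, dec = Encoder(vocab_size=40), AttnDecoder(vocab_size=30)
src = torch.randint(0, 40, (2, 12))                       # batch of 2 sentences, 12 chars each
enc_out, hidden = enc(src)
logits, hidden = dec(torch.randint(0, 30, (2, 1)), hidden, enc_out)
print(logits.shape)                                       # torch.Size([2, 30])
```

At inference time the decoder would be run one character at a time, feeding each predicted character back in until an end-of-sequence token is produced.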