ChatEDA: A Large Language Model Powered Autonomous Agent for EDA

Bibliographic Details
Published in: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, Vol. 43, No. 10, pp. 3184–3197
Main Authors: Wu, Haoyuan, He, Zhuolun, Zhang, Xinyun, Yao, Xufeng, Zheng, Su, Zheng, Haisheng, Yu, Bei
Format: Journal Article
Language:English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), October 1, 2024
Subjects: Electronic design automation (EDA); Large language models (LLMs); Natural language processing; Design automation; Machine learning algorithms
ISSN: 0278-0070 (print); 1937-4151 (online)
Online Access: https://doi.org/10.1109/TCAD.2024.3383347
Abstract The integration of a complex set of electronic design automation (EDA) tools to enhance interoperability is a critical concern for circuit designers. Recent advancements in large language models (LLMs) have showcased their exceptional capabilities in natural language processing and comprehension, offering a novel approach to interfacing with EDA tools. This research article introduces ChatEDA, an autonomous agent for EDA empowered by an LLM, AutoMage, complemented by EDA tools serving as executors. ChatEDA streamlines the design flow from the register-transfer level (RTL) to the graphic data system version II (GDSII) by effectively managing task decomposition, script generation, and task execution. Through comprehensive experimental evaluations, ChatEDA has demonstrated its proficiency in handling diverse requirements, and our fine-tuned AutoMage model has exhibited superior performance compared to GPT-4 and other similar LLMs.
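The abstract describes ChatEDA as an agent that manages task decomposition, script generation, and task execution over EDA tools. A minimal sketch of such an agent loop, assuming a stubbed planner and a hypothetical `run_tool` CLI (the function names, stage list, and tool interface here are illustrative assumptions, not the paper's actual AutoMage implementation):

```python
# Illustrative LLM-agent loop: decompose a user request into
# sub-tasks, generate a tool script per sub-task, execute each.
# In ChatEDA the planning/generation role is played by the
# fine-tuned AutoMage LLM; here those steps are simple stubs.

from dataclasses import dataclass, field


@dataclass
class AgentResult:
    plan: list
    scripts: list = field(default_factory=list)
    logs: list = field(default_factory=list)


def decompose(request: str) -> list:
    # Stub planner: a real agent would prompt the LLM for a plan;
    # here we return a fixed RTL-to-GDSII stage sequence.
    return ["synthesis", "floorplan", "placement", "routing"]


def generate_script(task: str) -> str:
    # Stub script generator: map each sub-task to a hypothetical
    # EDA tool invocation (run_tool is not a real CLI).
    return f"run_tool --stage {task}"


def execute(script: str) -> str:
    # Stub executor: a real agent would shell out to the tool
    # and capture its log output.
    return f"OK: {script}"


def run_agent(request: str) -> AgentResult:
    result = AgentResult(plan=decompose(request))
    for task in result.plan:
        script = generate_script(task)
        result.scripts.append(script)
        result.logs.append(execute(script))
    return result


res = run_agent("Compile my RTL down to GDSII")
print(res.plan)  # ['synthesis', 'floorplan', 'placement', 'routing']
```

The three stubs mirror the three responsibilities the abstract names; swapping `decompose` and `generate_script` for LLM calls and `execute` for a real tool runner yields the general shape of such a system.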
Authors and Affiliations:
1. Haoyuan Wu (ORCID: 0000-0001-7090-0600), Basic Software Research Department, Shanghai Artificial Intelligence Laboratory, Shanghai, China
2. Zhuolun He (ORCID: 0009-0009-4909-6588), Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, SAR
3. Xinyun Zhang (ORCID: 0000-0002-7763-7507), Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, SAR
4. Xufeng Yao, Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, SAR
5. Su Zheng (ORCID: 0000-0003-1159-1611), Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, SAR
6. Haisheng Zheng, Basic Software Research Department, Shanghai Artificial Intelligence Laboratory, Shanghai, China
7. Bei Yu (ORCID: 0000-0001-6406-4810; email: byu@cse.cuhk.edu.hk), Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, SAR
CODEN ITCSDI
Copyright: © The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2024
DOI 10.1109/TCAD.2024.3383347
Genre orig-research
Funding: Research Grants Council of Hong Kong, SAR (Grant CUHK14210723)
IsPeerReviewed true
IsScholarly true
Issue 10
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
PageCount 14
PublicationDate 2024-10-01
PublicationPlace New York
PublicationTitle IEEE transactions on computer-aided design of integrated circuits and systems
PublicationTitleAbbrev TCAD
PublicationYear 2024
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
SubjectTerms Data systems
Design automation
Electronic design automation
Electronic design automation (EDA)
Integrated circuit modeling
Large language models
large language models (LLMs)
machine learning algorithms
Mathematical models
Natural language processing
Quantization (signal)
Task analysis
Training
Title ChatEDA: A Large Language Model Powered Autonomous Agent for EDA
URI https://ieeexplore.ieee.org/document/10485372
https://www.proquest.com/docview/3107254899
Volume 43