IndicBART for Translating Code-Mixed Kannada-English Sentences into Kannada: An Encoder-Decoder Transformer Approach
Saved in:

| Published in: | 2025 5th International Conference on Intelligent Technologies (CONIT), pp. 1-6 |
|---|---|
| Main authors: | Shruthi N (imshruthin29@gmail.com), Kavitha Sooda (kavithas.cse@bmsce.ac.in), Dept. of CSE, B.M.S. College of Engineering, Bangalore, India |
| Format: | Conference paper |
| Language: | English |
| Publication details: | IEEE, 20 June 2025 |
| Subjects: | Code-mixed texts; Complexity theory; Data models; Encoder-decoder transformer model; Few-shot learning; IndicBART; Kannada-English code-mixed; Multilingual; Neural machine translation; NLP; Semantics; Transformers; Translation |
| ISBN: | 9798331522322 (print); 9798331522308, 9798331522339 (electronic) |
| DOI: | 10.1109/CONIT65521.2025.11167161 |
| Online access: | Get full text: https://ieeexplore.ieee.org/document/11167161 |
| Tags: | No tags yet. Be the first to tag this record! |
| Abstract | Translating Kannada-English code-mixed text continues to pose a major challenge in NLP, owing to the limited resources available for Kannada, a low-resource Dravidian language, and the lack of parallel datasets. Existing models struggle with the structural complexity of code-mixed data, leading to suboptimal performance. To address this, we experimented with a transformer-based encoder-decoder model, leveraging two variants of IndicBART, a pre-trained multilingual model. We explored IndicBART's potential for transfer and few-shot learning by fine-tuning it on two Kannada-English code-mixed datasets: one in Roman script and the other in Kannada script, both paired with Kannada translations. Through self-attention and cross-attention mechanisms, IndicBART effectively captured the semantic essence of code-mixed sentences. Our experiments showed that both variants achieved significant BLEU scores of approximately 0.807, with each outperforming the other under different scenarios. This demonstrates their potential for code-mixed translation with minimal data. These findings highlight the effectiveness of our methodologies in tackling code-mixed translation challenges, establishing a basis for continued research in low-resource language settings. |
|---|---|
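The abstract describes fine-tuning two IndicBART variants on small Kannada-English code-mixed parallel datasets; the paper itself is only available via the IEEE link above. As a rough illustration of the general recipe, the sketch below fine-tunes the publicly released ai4bharat/IndicBART checkpoint with Hugging Face Transformers on a toy code-mixed pair. The model ID, the `<2en>`/`<2kn>` tag convention, the toy sentences, and all hyperparameters are assumptions drawn from the public IndicBART model card, not the authors' actual setup.

```python
# Minimal sketch (not the authors' exact pipeline): fine-tuning the public
# ai4bharat/IndicBART checkpoint on code-mixed Kannada-English -> Kannada pairs.
import torch
from torch.optim import AdamW
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "ai4bharat/IndicBART", do_lower_case=False, use_fast=False, keep_accents=True
)
model = AutoModelForSeq2SeqLM.from_pretrained("ai4bharat/IndicBART")

# Toy parallel data: code-mixed source sentence -> Kannada translation
# (illustrative placeholder, not from the paper's datasets).
pairs = [
    ("naanu office ge hogutta iddini", "ನಾನು ಕಚೇರಿಗೆ ಹೋಗುತ್ತಿದ್ದೇನೆ"),
]

pad_id = tokenizer.convert_tokens_to_ids("<pad>")
eos_id = tokenizer.convert_tokens_to_ids("</s>")

optimizer = AdamW(model.parameters(), lr=3e-5)
model.train()
for epoch in range(3):  # few-shot style: small data, few epochs
    for src, tgt in pairs:
        # The IndicBART model card formats inputs as "<sentence> </s> <2xx>"
        # on the encoder side and "<2yy> <sentence> </s>" on the decoder side.
        enc = tokenizer(f"{src} </s> <2en>", add_special_tokens=False, return_tensors="pt")
        dec = tokenizer(f"<2kn> {tgt} </s>", add_special_tokens=False, return_tensors="pt")
        decoder_input_ids = dec.input_ids[:, :-1].contiguous()  # teacher forcing
        labels = dec.input_ids[:, 1:].contiguous()              # shifted targets
        out = model(
            input_ids=enc.input_ids,
            attention_mask=enc.attention_mask,
            decoder_input_ids=decoder_input_ids,
            labels=labels,
        )
        out.loss.backward()
        optimizer.step()
        optimizer.zero_grad()

# Inference: force decoding to start with the Kannada language tag.
model.eval()
enc = tokenizer("naanu office ge hogutta iddini </s> <2en>",
                add_special_tokens=False, return_tensors="pt")
generated = model.generate(
    enc.input_ids,
    num_beams=4,
    max_length=64,
    early_stopping=True,
    pad_token_id=pad_id,
    eos_token_id=eos_id,
    decoder_start_token_id=tokenizer.convert_tokens_to_ids("<2kn>"),
)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```

One practical caveat, again from the public model card rather than the paper: the released IndicBART checkpoint represents Indic-script text in Devanagari internally, so Kannada-script input and output may need script conversion (for example with the Indic NLP Library) to produce sensible results.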
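The abstract reports BLEU scores of approximately 0.807, which suggests a 0-1 scale; the record does not say which BLEU implementation was used. The snippet below is one hedged way to compute corpus-level BLEU with sacrebleu (which reports on a 0-100 scale) and rescale it, using placeholder hypothesis and reference strings.

```python
# Minimal sketch of corpus-level BLEU scoring with sacrebleu; the hypothesis
# and reference strings are illustrative placeholders, not the paper's data.
import sacrebleu

hypotheses = ["ನಾನು ಕಚೇರಿಗೆ ಹೋಗುತ್ತಿದ್ದೇನೆ"]      # model outputs, one per test sentence
references = [["ನಾನು ಕಚೇರಿಗೆ ಹೋಗುತ್ತಿದ್ದೇನೆ"]]    # one reference stream, parallel to hypotheses

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU (0-100 scale): {bleu.score:.2f}")
print(f"BLEU (0-1 scale):   {bleu.score / 100:.3f}")
```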

