IndicBART for Translating Code-Mixed Kannada-English Sentences into Kannada: An Encoder-Decoder Transformer Approach


Bibliographic Details
Published in: 2025 5th International Conference on Intelligent Technologies (CONIT), pp. 1 - 6
Main Authors: N, Shruthi; Sooda, Kavitha
Format: Conference Proceeding
Language: English
Published: IEEE, 20.06.2025
Subjects:
ISBN: 9798331522322
Abstract Translating Kannada-English code-mixed text continues to pose a major challenge in NLP owing to limited resource availability for Kannada, a low-resource Dravidian language, and the lack of parallel datasets. Existing models struggle with the structural complexity of code-mixed data, leading to suboptimal performance. To address this, we experimented with a transformer-based encoder-decoder model, leveraging two variants of IndicBART, a pre-trained multilingual model. We explored IndicBART's potential for transfer and few-shot learning by fine-tuning it on two Kannada-English code-mixed datasets: one in Roman script and the other in Kannada script, both paired with Kannada translations. Through self-attention and cross-attention mechanisms, IndicBART effectively captured the semantic essence of code-mixed sentences. Our experiments showed that both variants achieved significant BLEU scores of approximately 0.807, with each outperforming the other under different scenarios. This demonstrates their potential for code-mixed translation with minimal data. These findings highlight the effectiveness of our methodologies in tackling code-mixed translation challenges, establishing a basis for continued research in low-resource language settings.
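For orientation, below is a minimal sketch of how a pre-trained IndicBART checkpoint can be prompted to translate a code-mixed sentence into Kannada via the Hugging Face transformers library. It assumes the publicly released ai4bharat/IndicBART checkpoint and its documented "<2xx>" language-tag convention; the example sentence, the choice of source tag for code-mixed input, and the decoding settings are illustrative assumptions and are not taken from the paper, which fine-tunes two IndicBART variants on parallel code-mixed data.

```python
# Minimal sketch, assuming the public "ai4bharat/IndicBART" checkpoint and its
# documented "<2xx>" language-tag format. The sentence, source-tag choice, and
# decoding settings are illustrative only, not the paper's fine-tuned setup.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "ai4bharat/IndicBART", do_lower_case=False, use_fast=False, keep_accents=True
)
model = AutoModelForSeq2SeqLM.from_pretrained("ai4bharat/IndicBART")

# Roman-script Kannada-English code-mixed input; the trailing tag marks the
# source side (treating it as English-script text is an assumption here).
src = "naanu tomorrow office ge hoguttene </s> <2en>"
inputs = tokenizer(src, return_tensors="pt", add_special_tokens=False)

# Decode into Kannada by forcing the "<2kn>" target-language tag as the first
# decoder token, following the IndicBART usage notes.
out = model.generate(
    inputs.input_ids,
    max_length=40,
    num_beams=4,
    early_stopping=True,
    pad_token_id=tokenizer.convert_tokens_to_ids("<pad>"),
    eos_token_id=tokenizer.convert_tokens_to_ids("</s>"),
    decoder_start_token_id=tokenizer.convert_tokens_to_ids("<2kn>"),
)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Fine-tuning for the setting described in the abstract would additionally pass Kannada reference translations as labels so the model learns the code-mixed-to-Kannada mapping; the datasets, hyperparameters, and BLEU evaluation are specific to the authors' experiments.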
Author N, Shruthi
Sooda, Kavitha
Author_xml – sequence: 1
  givenname: Shruthi
  surname: N
  fullname: N, Shruthi
  email: imshruthin29@gmail.com
  organization: Dept. of CSE, B.M.S. College of Engineering, Bangalore, India
– sequence: 2
  givenname: Kavitha
  surname: Sooda
  fullname: Sooda, Kavitha
  email: kavithas.cse@bmsce.ac.in
  organization: Dept. of CSE, B.M.S. College of Engineering, Bangalore, India
ContentType Conference Proceeding
DBID 6IE
6IL
CBEJK
RIE
RIL
DOI 10.1109/CONIT65521.2025.11167161
DatabaseName IEEE Electronic Library (IEL) Conference Proceedings
IEEE Xplore POP ALL
IEEE Xplore All Conference Proceedings
IEEE Electronic Library (IEL)
IEEE Proceedings Order Plans (POP All) 1998-Present
DatabaseTitleList
Database_xml – sequence: 1
  dbid: RIE
  name: IEEE/IET Electronic Library (IEL) (UW System Shared)
  url: https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
EISBN 9798331522308
9798331522339
EndPage 6
ExternalDocumentID 11167161
Genre orig-research
IEDL.DBID RIE
ISBN 9798331522322
IngestDate Wed Oct 01 07:05:12 EDT 2025
IsPeerReviewed false
IsScholarly false
Language English
LinkModel DirectLink
PageCount 6
ParticipantIDs ieee_primary_11167161
PublicationCentury 2000
PublicationDate 2025-June-20
PublicationDateYYYYMMDD 2025-06-20
PublicationDate_xml – month: 06
  year: 2025
  text: 2025-June-20
  day: 20
PublicationDecade 2020
PublicationTitle 2025 5th International Conference on Intelligent Technologies (CONIT)
PublicationTitleAbbrev CONIT
PublicationYear 2025
Publisher IEEE
Publisher_xml – name: IEEE
SourceID ieee
SourceType Publisher
StartPage 1
SubjectTerms Code-mixed texts
Complexity theory
Data models
Encoder-Decoder Transformer Model
Few shot learning
IndicBART
Kannada-English Code-mixed
Multilingual
Neural machine translation
NLP
Semantics
Transformers
Translation
Title IndicBART for Translating Code-Mixed Kannada-English Sentences into Kannada: An Encoder-Decoder Transformer Approach
URI https://ieeexplore.ieee.org/document/11167161
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider IEEE