IndicBART for Translating Code-Mixed Kannada-English Sentences into Kannada: An Encoder-Decoder Transformer Approach

Detailed bibliography
Published in: 2025 5th International Conference on Intelligent Technologies (CONIT), pp. 1-6
Main authors: N, Shruthi; Sooda, Kavitha
Format: Conference paper
Language: English
Published: IEEE, 20 June 2025
ISBN:9798331522322
Abstract Translating Kannada-English code-mixed text continues to pose a major challenge in NLP owing to limited resource availability for Kannada, a low-resource Dravidian language, and the lack of parallel datasets. Existing models struggle with the structural complexity of code-mixed data, leading to suboptimal performance. To address this, we experimented with a transformer-based encoder-decoder model, leveraging two variants of IndicBART, a pre-trained multilingual model. We explored IndicBART's potential for transfer and few-shot learning by fine-tuning it on two Kannada-English code-mixed datasets: one in Roman script and the other in Kannada script, both paired with Kannada translations. Through self-attention and cross-attention mechanisms, IndicBART effectively captured the semantic essence of code-mixed sentences. Our experiments showed that both variants achieved significant BLEU scores of approximately 0.807, with each outperforming the other under different scenarios. This demonstrates their potential for code-mixed translation with minimal data. These findings highlight the effectiveness of our methodologies in tackling code-mixed translation challenges, establishing a basis for continued research in low-resource language settings.
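The abstract reports BLEU scores of approximately 0.807 (on a 0-1 scale). As a rough illustration of how such a score is computed, here is a minimal corpus-level BLEU sketch (geometric mean of clipped n-gram precisions times a brevity penalty). This is not the authors' evaluation code; the function name `corpus_bleu` is mine, and published work typically relies on a standard tool such as sacreBLEU, which adds tokenization and smoothing details omitted here:

```python
from collections import Counter
import math

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def corpus_bleu(references, hypotheses, max_n=4):
    """Corpus-level BLEU (simplified, no smoothing): geometric mean of
    clipped n-gram precisions for n = 1..max_n, times a brevity penalty."""
    clipped = Counter()   # matched (clipped) n-gram counts per order
    totals = Counter()    # total hypothesis n-gram counts per order
    ref_len = hyp_len = 0
    for ref, hyp in zip(references, hypotheses):
        ref_tok, hyp_tok = ref.split(), hyp.split()
        ref_len += len(ref_tok)
        hyp_len += len(hyp_tok)
        for n in range(1, max_n + 1):
            ref_counts = Counter(ngrams(ref_tok, n))
            hyp_counts = Counter(ngrams(hyp_tok, n))
            # Clip each hypothesis n-gram count by its count in the reference
            clipped[n] += sum(min(c, ref_counts[g]) for g, c in hyp_counts.items())
            totals[n] += sum(hyp_counts.values())
    if min(clipped[n] for n in range(1, max_n + 1)) == 0:
        return 0.0  # any zero precision makes the geometric mean zero
    log_prec = sum(math.log(clipped[n] / totals[n]) for n in range(1, max_n + 1)) / max_n
    # Brevity penalty: punish hypotheses shorter than the references
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / max(hyp_len, 1))
    return bp * math.exp(log_prec)
```

On a perfect match the score is 1.0; a hypothesis with one wrong word scores strictly between 0 and 1, so a corpus score near 0.807 indicates very high n-gram overlap with the reference translations.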
Author N, Shruthi
Sooda, Kavitha
Author_xml – sequence: 1
  givenname: Shruthi
  surname: N
  fullname: N, Shruthi
  email: imshruthin29@gmail.com
  organization: Dept. of CSE, B.M.S. College of Engineering, Bangalore, India
– sequence: 2
  givenname: Kavitha
  surname: Sooda
  fullname: Sooda, Kavitha
  email: kavithas.cse@bmsce.ac.in
  organization: Dept. of CSE, B.M.S. College of Engineering, Bangalore, India
ContentType Conference Proceeding
DOI 10.1109/CONIT65521.2025.11167161
EISBN 9798331522308
9798331522339
EndPage 6
ExternalDocumentID 11167161
Genre orig-research
ISBN 9798331522322
IsPeerReviewed false
IsScholarly false
Language English
PageCount 6
PublicationCentury 2000
PublicationDate 2025-June-20
PublicationDateYYYYMMDD 2025-06-20
PublicationTitle 2025 5th International Conference on Intelligent Technologies (CONIT)
PublicationTitleAbbrev CONIT
PublicationYear 2025
Publisher IEEE
StartPage 1
SubjectTerms Code-mixed texts
Complexity theory
Data models
Encoder-Decoder Transformer Model
Few shot learning
IndicBART
Kannada-English Code-mixed
Multilingual
Neural machine translation
NLP
Semantics
Transformers
Translation
Title IndicBART for Translating Code-Mixed Kannada-English Sentences into Kannada: An Encoder-Decoder Transformer Approach
URI https://ieeexplore.ieee.org/document/11167161