Decoding as a linear ill-posed problem: The entropy minimization approach

Saved in:
Bibliographic Details
Title: Decoding as a linear ill-posed problem: The entropy minimization approach
Authors: Gauthier-Umaña, Valérie, Gzyl, Henryk, ter Horst, Enrique
Contributors: Facultad de Ingeniería::TICSw: Tecnologías de Información y Construcción de Software
Publisher Information: Universidad de los Andes
Facultad de Ingeniería
Departamento de Ingeniería de Sistemas
Publication Year: 2025
Collection: Universidad de los Andes Colombia: Séneca
Subject Terms: ill-posed inverse problems, decoding as inverse problem, convex optimization, gaussian random variables, Ingeniería
Description: The problem of decoding can be thought of as solving an ill-posed, linear inverse problem with noisy data and box constraints on the unknowns. Specifically, we aimed to solve $\mathbf{A}\mathbf{x}+\mathbf{e}=\mathbf{y}$, where $\mathbf{A}$ is a matrix with positive entries and $\mathbf{y}$ is a vector with positive entries. It is required that $\mathbf{x}\in\mathcal{K}$, which is specified below, and we considered two points of view about the noise term, in both of which the noise is treated as an unknown to be determined. On the one hand, the error can be thought of as a confounding error, intentionally added to the coded message. On the other hand, we may think of the error as a genuine additive transmission-measurement error. We solved the problem by minimizing an entropy of the Fermi-Dirac type defined on the constraint set of the problem. Our approach provided a consistent way to recover the message and the noise from the measurements. In an example with a generator code matrix of the Reed-Solomon type, we examined the two points of view about the noise. As our approach enabled us to recursively decrease the $\ell_1$ norm of the noise as part of the solution procedure, we saw that, if the required norm of the noise was too small, the message was not well recovered. Our work falls within the general line of work on near-optimal signal recovery. We also studied the case of Gaussian random matrices.
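As a reading aid, the following is a minimal sketch of the optimization problem outlined in the description above. It assumes the standard Fermi-Dirac entropy for box-constrained variables and takes $\mathcal{K}=[0,1]^n$ and a noise bound $\tau$ as illustrative placeholders; the exact functional, constraint set, and noise model are those specified in the paper itself.
$$\min_{\mathbf{x},\,\mathbf{e}}\ \sum_{j=1}^{n}\bigl[x_j\ln x_j+(1-x_j)\ln(1-x_j)\bigr]\quad\text{subject to}\quad \mathbf{A}\mathbf{x}+\mathbf{e}=\mathbf{y},\quad \mathbf{x}\in\mathcal{K},\quad \|\mathbf{e}\|_1\le\tau.$$
Recursively decreasing the placeholder bound $\tau$ would correspond to the procedure mentioned above, in which the $\ell_1$ norm of the noise is reduced as part of the solution.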
Document Type: article in journal/newspaper
File Description: 14 pages; application/pdf
Language: English
Relation: 4152; 4139; 10; AIMS Mathematics; https://hdl.handle.net/1992/76128; https://doi.org/10.3934/math.2025192; instname:Universidad de los Andes; reponame:Repositorio Institucional Séneca; repourl:https://repositorio.uniandes.edu.co/
DOI: 10.3934/math.2025192
Availability: https://hdl.handle.net/1992/76128
https://doi.org/10.3934/math.2025192
Rights: By consulting and making use of this resource, you accept the conditions of use established by the authors ; info:eu-repo/semantics/openAccess ; http://purl.org/coar/access_right/c_abf2
Document Code: edsbas.2550D0FD
Database: BASE