Convergence of a Gradient-Based Learning Algorithm With Penalty for Ridge Polynomial Neural Networks
| Published in: | IEEE Access, Volume 9, pp. 28742-28752 |
|---|---|
| Main authors: | Fan, Qinwei; Peng, Jigen; Li, Haiyang; Lin, Shoujin |
| Format: | Journal Article |
| Language: | English |
| Published: | Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2021 |
| Subjects: | Convergence; high order neural networks; ridge polynomial neural network; smoothing Group Lasso |
| ISSN: | 2169-3536 (print and electronic) |
| Online access: | Get full text (DOI: 10.1109/ACCESS.2020.3048235) |
| Abstract | Recently there has been renewed interest in high order neural networks (HONNs) because of their powerful mapping capability. The ridge polynomial neural network (RPNN) is an important kind of HONN that serves as an efficient instrument for classification and regression tasks. To speed up convergence and strengthen the generalization ability of the network, we introduce a regularization model for the RPNN with a Group Lasso penalty, which enforces structural sparsity at the group level. Introducing the Group Lasso penalty, however, raises two main obstacles: numerical oscillation during training and difficulty in the convergence analysis. We therefore adopt a smoothing function that approximates the Group Lasso penalty and overcomes these drawbacks. Strong and weak convergence theorems, as well as monotonicity theorems, are provided for this novel algorithm. We also demonstrate the efficiency of the proposed algorithm by numerical experiments, comparing it with no regularizer, the $L_{2}$ regularizer, the $L_{1/2}$ regularizer, the smoothing $L_{1/2}$ regularizer, and the Group Lasso regularizer, and the relevant theoretical analysis is verified. |
|---|---|
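The abstract above is the core technical content of this record. The sketch below, added for orientation, writes out the kind of penalized objective it describes; the notation ($E$, $\tilde{E}$, $w_k$, $f$, $\lambda$, $\mu$, $\eta$) is ours and the particular smoothing shown is one common choice, not necessarily the exact function used in the paper.

```latex
% Illustrative only; notation is ours, not taken from the paper.
% Squared training error over N samples plus a Group Lasso penalty that
% sums the Euclidean norms of the K weight groups w_1, ..., w_K:
\[
  E(w) = \frac{1}{2}\sum_{j=1}^{N}\bigl(y_j - f(x_j; w)\bigr)^2
         + \lambda \sum_{k=1}^{K} \lVert w_k \rVert_2 .
\]
% The group norm is not differentiable at w_k = 0, which is the source of
% the numerical oscillation mentioned in the abstract. A standard smoothing
% (one possible choice) replaces each norm by a smooth surrogate governed by
% a small parameter mu > 0:
\[
  \tilde{E}(w) = \frac{1}{2}\sum_{j=1}^{N}\bigl(y_j - f(x_j; w)\bigr)^2
                 + \lambda \sum_{k=1}^{K} \sqrt{\lVert w_k \rVert_2^{2} + \mu^{2}},
\]
% so that the gradient update w^{(t+1)} = w^{(t)} - \eta \nabla \tilde{E}(w^{(t)})
% is well defined everywhere, which is what makes the convergence analysis tractable.
```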
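As a complementary illustration, here is a minimal NumPy sketch of gradient descent on a smoothed group-penalized objective. It uses a toy linear model rather than an actual ridge polynomial network, and all names (`smoothed_group_penalty`, `penalty_grad`, `groups`, `lam`, `mu`, `eta`) are our own; this is not the authors' implementation.

```python
import numpy as np

def smoothed_group_penalty(w, groups, mu):
    """Smoothed Group Lasso penalty: sum over groups of sqrt(||w_g||^2 + mu^2)."""
    return sum(np.sqrt(np.sum(w[g] ** 2) + mu ** 2) for g in groups)

def penalty_grad(w, groups, mu):
    """Gradient of the smoothed penalty; finite even when a whole group is zero."""
    grad = np.zeros_like(w)
    for g in groups:
        grad[g] = w[g] / np.sqrt(np.sum(w[g] ** 2) + mu ** 2)
    return grad

# Toy regression problem standing in for the network; only the first
# group of inputs actually influences the target.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = X[:, :2] @ np.array([1.5, -2.0])
groups = [np.arange(0, 2), np.arange(2, 4), np.arange(4, 6)]

w = np.zeros(6)
lam, mu, eta = 0.1, 1e-3, 0.05   # penalty weight, smoothing parameter, learning rate
for _ in range(3000):
    residual = X @ w - y
    grad = X.T @ residual / len(y) + lam * penalty_grad(w, groups, mu)
    w -= eta * grad               # plain gradient step on the smoothed objective

print(np.round(w, 3))             # weights in the uninformative groups shrink toward zero
print("final penalty:", round(smoothed_group_penalty(w, groups, mu), 3))
```

Because the smoothed surrogate is differentiable everywhere, the update never hits the non-smooth point of the group norm, which is the practical reason such smoothing is used in place of the raw Group Lasso term.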
| Author | Li, Haiyang; Peng, Jigen; Lin, Shoujin; Fan, Qinwei |
| Author_xml | 1. Fan, Qinwei (ORCID 0000-0002-1017-3496), School of Mathematics and Information Science, Guangzhou University, Guangzhou, China; 2. Peng, Jigen (ORCID 0000-0002-7207-3183, jgpeng@gzhu.edu.cn), same affiliation; 3. Li, Haiyang (ORCID 0000-0001-7238-2456), same affiliation; 4. Lin, Shoujin, same affiliation |
| CODEN | IAECCG |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021 |
| DOI | 10.1109/ACCESS.2020.3048235 |
| Discipline | Engineering |
| EISSN | 2169-3536 |
| EndPage | 28752 |
| ExternalDocumentID | oai_doaj_org_article_d08d8abfe023489e97d4503a1561247b 10_1109_ACCESS_2020_3048235 9311170 |
| Genre | orig-research |
| GrantInformation_xml | 65th China Postdoctoral Science Foundation (Grant 2019M652837); National Science Foundation of China (Grant 11771347) |
| ISICitedReferencesCount | 10 |
| ISSN | 2169-3536 |
| IsDoiOpenAccess | true |
| IsOpenAccess | true |
| IsPeerReviewed | true |
| IsScholarly | true |
| Language | English |
| License | https://creativecommons.org/licenses/by-nc-nd/4.0 |
| ORCID | 0000-0002-1017-3496 0000-0001-7238-2456 0000-0002-7207-3183 |
| OpenAccessLink | https://doaj.org/article/d08d8abfe023489e97d4503a1561247b |
| PageCount | 11 |
| PublicationDate | 2021-01-01 |
| PublicationPlace | Piscataway |
| PublicationTitle | IEEE Access |
| PublicationTitleAbbrev | Access |
| PublicationYear | 2021 |
| Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| StartPage | 28742 |
| SubjectTerms | Algorithms; Biological neural networks; Convergence; high order neural networks; Input variables; Machine learning; Neural networks; Polynomials; Regression analysis; Regularization; ridge polynomial neural network; Smoothing; smoothing Group Lasso; Smoothing methods; Theorems; Training |
| Title | Convergence of a Gradient-Based Learning Algorithm With Penalty for Ridge Polynomial Neural Networks |
| URI | https://ieeexplore.ieee.org/document/9311170 https://www.proquest.com/docview/2492855958 https://doaj.org/article/d08d8abfe023489e97d4503a1561247b |
| Volume | 9 |