Direct Parallel Perceptrons (DPPs): Fast Analytical Calculation of the Parallel Perceptrons Weights With Margin Control for Classification Tasks
Saved in:
| Published in: | IEEE transactions on neural networks, Volume 22, Issue 11, pp. 1837 - 1848 |
|---|---|
| Main Authors: | Fernandez-Delgado, M.; Ribeiro, J.; Cernadas, E.; Ameneiro, S. B. |
| Format: | Journal Article |
| Language: | English |
| Published: | New York, NY: IEEE (Institute of Electrical and Electronics Engineers), 01.11.2011 |
| Subjects: | |
| ISSN: | 1045-9227 (print); 1941-0093 (electronic) |
| Online Access: | Get full text |
| Abstract | Parallel perceptrons (PPs) are very simple and efficient committee machines (a single layer of perceptrons with threshold activation functions and binary outputs, and a majority voting decision scheme), which nevertheless behave as universal approximators. The parallel delta (P-Delta) rule is an effective training algorithm, which, following the ideas of statistical learning theory used by the support vector machine (SVM), raises its generalization ability by maximizing the difference between the perceptron activations for the training patterns and the activation threshold (which corresponds to the separating hyperplane). In this paper, we propose an analytical closed-form expression to calculate the PPs' weights for classification tasks. Our method, called Direct Parallel Perceptrons (DPPs), directly calculates (without iterations) the weights using the training patterns and their desired outputs, without any search or numeric function optimization. The calculated weights globally minimize an error function which simultaneously takes into account the training error and the classification margin. Given its analytical and noniterative nature, DPPs are computationally much more efficient than other related approaches (P-Delta and SVM), and its computational complexity is linear in the input dimensionality. Therefore, DPPs are very appealing, in terms of time complexity and memory consumption, and are very easy to use for high-dimensional classification tasks. On real benchmark datasets with two and multiple classes, DPPs are competitive with SVM and other approaches but they also allow online learning and, as opposed to most of them, have no tunable parameters. |
|---|---|
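The abstract above describes the parallel perceptron (PP) committee as a single layer of perceptrons with threshold activation functions, binary outputs, and a majority-voting decision. The following is a minimal sketch of only that decision rule for a two-class problem; the weight matrix is assumed to be given in advance (for example, produced by the paper's closed-form DPP calculation or by P-Delta training, neither of which is reproduced here), and the names `pp_predict`, `W`, and `x` are illustrative choices, not the authors' code.

```python
import numpy as np

def pp_predict(W, x):
    """Majority-vote output of a parallel perceptron (PP) committee.

    W : (n_perceptrons, n_features) array, one weight vector per perceptron.
    x : (n_features,) input pattern (append a constant 1 if a bias term is used).
    Returns +1 or -1 for a two-class task.
    """
    activations = W @ x                        # each perceptron's activation w_i . x
    votes = np.where(activations >= 0, 1, -1)  # threshold activation -> binary output
    return 1 if votes.sum() >= 0 else -1       # majority voting over the committee

# Hypothetical example: a committee of 3 perceptrons on 2-D inputs plus a bias input.
W = np.array([[ 0.5, -1.0,  0.1],
              [ 1.2,  0.3, -0.4],
              [-0.7,  0.9,  0.2]])
x = np.array([0.8, -0.2, 1.0])                 # last component is the constant bias input
print(pp_predict(W, x))                        # prints +1 or -1
```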
| Author | Fernandez-Delgado, M.; Ameneiro, S. B.; Ribeiro, J.; Cernadas, E. |
| Author_xml | 1. Fernandez-Delgado, M. (manuel.fernandez.delgado@usc.es), Intell. Syst. Group, San Sebastian, Spain; 2. Ribeiro, J. (jribeiro@estg.ipvc.pt), Sch. of Technol. & Manage., Viana do Castelo Polytech. Inst., Viana do Castelo, Portugal; 3. Cernadas, E. (eva.cernadas@usc.es), Centro de Investig. en Tecnoloxias da Informacion da USC (CITIUS), Univ. of Santiago de Compostela, Santiago de Compostela, Spain; 4. Ameneiro, S. B. (senen.barro@usc.es), Intell. Syst. Group, San Sebastian, Spain |
| CODEN | ITNNEP |
| ContentType | Journal Article |
| Copyright | 2015 INIST-CNRS |
| DOI | 10.1109/TNN.2011.2169086 |
| Discipline | Engineering; Anatomy & Physiology; Computer Science; Applied Sciences |
| EISSN | 1941-0093 |
| EndPage | 1848 |
| Genre | orig-research; Research Support, Non-U.S. Gov't; Journal Article |
| ISSN | 1045-9227; 1941-0093 |
| Issue | 11 |
| Keywords | parallel perceptrons; Competitiveness; parallel delta rule; Modeling; Optimization; Direct method; Multidimensional analysis; Vector support machine; Linear complexity; Analytical closed-form weight calculation; Probability learning; Electronic vote; Threshold function; Single machine; Dimensionality; Statistical analysis; margin maximization; Error function; linear computational complexity; Neural network; Computational complexity; Hyperplane; Exact solution; Perceptron; Activation function; Storage management; Pattern classification; Time complexity; online learning |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html CC BY 4.0 |
| PMID | 21984498 |
| PageCount | 12 |
| PublicationCentury | 2000 |
| PublicationDate | 2011-11-01 |
| PublicationDecade | 2010 |
| PublicationPlace | New York, NY |
| PublicationTitle | IEEE transactions on neural networks |
| PublicationTitleAbbrev | TNN |
| PublicationTitleAlternate | IEEE Trans Neural Netw |
| PublicationYear | 2011 |
| Publisher | IEEE; Institute of Electrical and Electronics Engineers |
| StartPage | 1837 |
| SubjectTerms | Accuracy; Activation; Algorithmics. Computability. Computer arithmetics; Algorithms; Analytical closed-form weight calculation; Applied sciences; Artificial Intelligence; Classification; Classification - methods; Closed-form solutions; Computer science; control theory; systems; Connectionism. Neural networks; Data processing. List processing. Character string processing; Databases, Factual; Exact sciences and technology; Kernel; Linear approximation; linear computational complexity; Linear Models; margin maximization; Mathematical analysis; Mathematical models; Memory and file management (including protection and security); Memory organisation. Data processing; Neural Networks (Computer); online learning; parallel delta rule; parallel perceptrons; pattern classification; Reproducibility of Results; Software; Statistical learning; Support vector machines; Tasks; Theoretical computing; Thresholds; Training |
| Title | Direct Parallel Perceptrons (DPPs): Fast Analytical Calculation of the Parallel Perceptrons Weights With Margin Control for Classification Tasks |
| URI | https://ieeexplore.ieee.org/document/6035789 https://www.ncbi.nlm.nih.gov/pubmed/21984498 https://www.proquest.com/docview/902340807 https://www.proquest.com/docview/963851565 |
| Volume | 22 |