Soft Margin Multiple Kernel Learning


Bibliographic details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 24, No. 5, pp. 749-761
Main authors: Xinxing Xu, I. W. Tsang, Dong Xu
Format: Journal Article
Language: English
Published: New York, NY: IEEE, 1 May 2013
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
ISSN: 2162-237X; EISSN: 2162-2388
Online access: Full text
Abstract Multiple kernel learning (MKL) has been proposed for kernel methods to learn the optimal kernel from a set of predefined base kernels. However, the traditional L1-MKL method often achieves worse results than the simplest method using the average of the base kernels (i.e., the average kernel) in some practical applications. To improve the effectiveness of MKL, this paper presents a novel soft margin perspective for MKL. Specifically, we introduce an additional slack variable, called the kernel slack variable, for each quadratic constraint of MKL, each of which corresponds to one support vector machine model using a single base kernel. We first show that L1-MKL can be deemed hard margin MKL, and then we propose a novel soft margin framework for MKL. Three commonly used loss functions, including the hinge loss, the square hinge loss, and the square loss, can be readily incorporated into this framework, leading to new soft margin MKL objective functions. Many existing MKL methods can be shown to be special cases under our soft margin framework. For example, the hinge loss soft margin MKL leads to a new box constraint for the kernel combination coefficients. Using different hyper-parameter values for this formulation, we can inherently bridge the average-kernel method, L1-MKL, and the hinge loss soft margin MKL. The square hinge loss soft margin MKL unifies the family of elastic-net constraint/regularizer-based approaches, and the square loss soft margin MKL incorporates L2-MKL naturally. Moreover, we also develop efficient algorithms for solving both the hinge loss and the square hinge loss soft margin MKL. Comprehensive experimental studies of various MKL algorithms on several benchmark data sets and two real-world applications, video action recognition and event recognition, demonstrate that our proposed algorithms can efficiently achieve an effective yet sparse solution for MKL.
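The base-kernel combination at the heart of MKL can be illustrated with a minimal sketch (not the authors' implementation; the RBF bandwidths and function names below are illustrative): a set of base Gram matrices is mixed with nonnegative coefficients d_m, the average-kernel baseline corresponds to uniform weights d_m = 1/M, and L1-MKL restricts d to the probability simplex, which encourages sparse combinations.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Gram matrix of an RBF base kernel k(x, x') = exp(-gamma * ||x - x'||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def combined_kernel(kernels, d):
    # MKL combines base kernels as K = sum_m d_m K_m with d_m >= 0;
    # L1-MKL additionally constrains sum_m d_m = 1, and the average
    # kernel is the special case d_m = 1/M.
    return sum(dm * Km for dm, Km in zip(d, kernels))

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))
kernels = [rbf_kernel(X, g) for g in (0.1, 1.0, 10.0)]

K_avg = combined_kernel(kernels, np.full(3, 1.0 / 3.0))   # average-kernel baseline
K_sparse = combined_kernel(kernels, [0.0, 1.0, 0.0])      # a sparse L1-feasible weighting
```

Because each base kernel is positive semidefinite and the weights are nonnegative, the combined matrix remains a valid kernel, so it can be plugged into any SVM solver that accepts a precomputed Gram matrix.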
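The abstract's claim that a box constraint bridges the average kernel and L1-MKL can be made concrete with a small sketch (the projection routine and its name are illustrative, not taken from the paper): constraining the coefficients to the set {d : sum(d) = 1, 0 <= d_m <= u} forces the uniform weights d_m = 1/M when u = 1/M, and reduces to the plain L1 simplex (allowing fully sparse solutions) when u = 1.

```python
import numpy as np

def project_capped_simplex(v, u):
    # Euclidean projection of v onto {d : sum(d) = 1, 0 <= d_m <= u},
    # via bisection on the shift tau in d_m = clip(v_m - tau, 0, u).
    # Feasibility requires u * len(v) >= 1.
    lo, hi = v.min() - u, v.max()
    for _ in range(100):
        tau = 0.5 * (lo + hi)
        if np.clip(v - tau, 0.0, u).sum() > 1.0:
            lo = tau  # total mass too large: shift further down
        else:
            hi = tau
    return np.clip(v - 0.5 * (lo + hi), 0.0, u)

v = np.array([0.9, 0.3, 0.1])
d_uniform = project_capped_simplex(v, 1.0 / 3.0)  # box active everywhere: d_m = 1/M
d_l1 = project_capped_simplex(v, 1.0)             # box inactive: plain simplex projection
```

Intermediate values of u interpolate between these two extremes, which is one way to read the abstract's statement that varying the hyper-parameter bridges the average kernel, L1-MKL, and the hinge loss soft margin MKL.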
Author affiliations: Xinxing Xu, I. W. Tsang, and Dong Xu are all with the School of Computer Engineering, Nanyang Technological University, Singapore.
CODEN ITNNAL
ContentType Journal Article
Copyright 2014 INIST-CNRS; The Institute of Electrical and Electronics Engineers, Inc. (IEEE), May 2013
DOI 10.1109/TNNLS.2012.2237183
Discipline Computer Science
Applied Sciences
EISSN 2162-2388
EndPage 761
Genre orig-research
Research Support, Non-U.S. Gov't
Journal Article
ISICitedReferencesCount 118
ISSN 2162-237X
2162-2388
IsPeerReviewed false
IsScholarly true
Issue 5
Keywords Multiple kernel learning
Action
Event detection
support vector machines
Motion estimation
Video signal
Modeling
Kernel method
Loss function
Behavioral analysis
Efficiency
Scene analysis
Vector support machine
Objective function
Learning algorithm
Single machine
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
CC BY 4.0
PMID 24808425
PageCount 13
PublicationDate 2013-05-01
PublicationPlace New York, NY
PublicationTitle IEEE Transactions on Neural Networks and Learning Systems
PublicationTitleAbbrev TNNLS
PublicationTitleAlternate IEEE Trans Neural Netw Learn Syst
PublicationYear 2013
Publisher IEEE
Institute of Electrical and Electronics Engineers
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
SourceID proquest
pubmed
pascalfrancis
crossref
ieee
SourceType Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 749
SubjectTerms Algorithms
Applied sciences
Artificial intelligence
Computer science; control theory; systems
Data processing. List processing. Character string processing
Exact sciences and technology
Fasteners
Hinges
Kernel
Kernels
Learning
Learning and adaptive systems
Linear programming
Mathematical models
Memory organisation. Data processing
Multiple kernel learning
Neural networks
Operations research
Optimization
Pattern recognition. Digital image processing. Computational geometry
Recognition
Software
Studies
Support vector machines
Training
Vectors
Title Soft Margin Multiple Kernel Learning
URI https://ieeexplore.ieee.org/document/6459603
https://www.ncbi.nlm.nih.gov/pubmed/24808425
https://www.proquest.com/docview/1324478512
https://www.proquest.com/docview/1349465190
https://www.proquest.com/docview/1523403109
Volume 24