Distributed Center-Based Clustering: A Unified Framework

Bibliographic Details
Published in: IEEE Transactions on Signal Processing, Vol. 73, pp. 903–918
Main Authors: Armacki, Aleksandar, Bajovic, Dragana, Jakovetic, Dusan, Kar, Soummya
Format: Journal Article
Language: English
Published: New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2025
Subjects:
ISSN: 1053-587X, 1941-0476
Abstract We develop a family of distributed center-based clustering algorithms that work over connected networks of users. In the proposed scenario, each user holds a local dataset and communicates only with its immediate neighbours, with the aim of finding a clustering of the full, joint data. The proposed family, termed Distributed Gradient Clustering (DGC-$\mathcal{F}_{\rho}$), is parametrized by $\rho \geq 1$, controlling the proximity of users' center estimates, with $\mathcal{F}$ determining the clustering loss. Our framework allows for a broad class of smooth convex loss functions, including popular clustering losses like $K$-means and Huber loss. Specialized to $K$-means and Huber loss, DGC-$\mathcal{F}_{\rho}$ gives rise to novel distributed clustering algorithms DGC-KM$_{\rho}$ and DGC-HL$_{\rho}$, while novel clustering losses based on the logistic and fair loss lead to DGC-LL$_{\rho}$ and DGC-FL$_{\rho}$. We provide a unified analysis and establish several strong results under mild assumptions. First, the sequence of centers generated by the methods converges to a well-defined notion of fixed point, under any center initialization and value of $\rho$. Second, as $\rho$ increases, the family of fixed points produced by DGC-$\mathcal{F}_{\rho}$ converges to a notion of consensus fixed points. We show that consensus fixed points of DGC-$\mathcal{F}_{\rho}$ are equivalent to fixed points of gradient clustering over the full data, guaranteeing that a clustering of the full data is produced. For the special case of Bregman losses, we show that our fixed points converge to the set of Lloyd points. Numerical experiments on real data confirm our theoretical findings and demonstrate strong performance of the methods.
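The abstract describes penalized, gradient-based center updates in which each user trades off its local clustering loss against agreement with its neighbours' center estimates, with $\rho$ controlling that proximity. Below is a minimal illustrative sketch of one such iteration for the $K$-means loss, in the spirit of DGC-KM$_{\rho}$; the ring network, mixing matrix W, step size alpha, toy data, and the exact penalized-update form are assumptions made for illustration and are not taken from the paper.

# Illustrative sketch only: a DGC-KM_rho-style iteration under the assumed update form
# c_i <- c_i - alpha * (grad of local K-means loss + rho * sum_j W_ij (c_i - c_j)).
# The paper specifies the actual DGC-F_rho algorithm, step sizes, and assignment rule.
import numpy as np

rng = np.random.default_rng(0)

m, K, d = 4, 3, 2                      # users, clusters, data dimension (toy sizes)
rho, alpha, iters = 10.0, 0.05, 300    # proximity parameter, step size, iterations

# Toy local datasets: each user samples points around the same K ground-truth means.
true_means = rng.normal(scale=5.0, size=(K, d))
local_data = [true_means[rng.integers(0, K, size=60)] + rng.normal(size=(60, d))
              for _ in range(m)]

# Doubly stochastic mixing matrix for a ring of m users (each talks to two neighbours).
W = np.zeros((m, m))
for i in range(m):
    W[i, i] = 0.5
    W[i, (i - 1) % m] = 0.25
    W[i, (i + 1) % m] = 0.25

# Each user maintains its own K x d estimate of the cluster centers.
centers = rng.normal(scale=5.0, size=(m, K, d))

for _ in range(iters):
    mixed = np.einsum("ij,jkd->ikd", W, centers)   # neighbourhood averages of centers
    new_centers = centers.copy()
    for i in range(m):
        X = local_data[i]
        # Assign each local point to user i's nearest current center.
        dists = ((X[:, None, :] - centers[i][None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Gradient of the averaged local K-means loss with respect to the K centers.
        grad = np.zeros((K, d))
        for k in range(K):
            pts = X[labels == k]
            if len(pts):
                grad[k] = (centers[i, k] - pts).sum(axis=0) / len(X)
        # Assumed penalized update: local gradient plus rho-weighted disagreement.
        new_centers[i] = centers[i] - alpha * (grad + rho * (centers[i] - mixed[i]))
    centers = new_centers

# Larger rho should drive the users' center estimates toward consensus.
print("max disagreement across users:", np.abs(centers - centers.mean(axis=0)).max())

Running the sketch with a larger rho (and a correspondingly smaller alpha, to keep the step stable) illustrates the consensus behaviour the paper analyzes: the per-user center estimates become nearly identical, i.e., they approach a common clustering of the joint data.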
Author Kar, Soummya
Bajovic, Dragana
Jakovetic, Dusan
Armacki, Aleksandar
Author_xml – sequence: 1
  givenname: Aleksandar
  orcidid: 0000-0001-7916-585X
  surname: Armacki
  fullname: Armacki, Aleksandar
  email: aarmacki@andrew.cmu.edu
  organization: Electrical and Computer Engineering Department, Carnegie Mellon University, Pittsburgh, PA, USA
– sequence: 2
  givenname: Dragana
  orcidid: 0000-0003-1783-8734
  surname: Bajovic
  fullname: Bajovic, Dragana
  email: dbajovic@uns.ac.rs
  organization: Faculty of Technical Sciences, University of Novi Sad, Novi Sad, Serbia
– sequence: 3
  givenname: Dusan
  orcidid: 0000-0003-3497-5589
  surname: Jakovetic
  fullname: Jakovetic, Dusan
  email: dusan.jakovetic@dmi.uns.ac.rs
  organization: Faculty of Sciences, University of Novi Sad, Novi Sad, Serbia
– sequence: 4
  givenname: Soummya
  orcidid: 0000-0002-8060-5581
  surname: Kar
  fullname: Kar, Soummya
  email: soummyak@andrew.cmu.edu
  organization: Electrical and Computer Engineering Department, Carnegie Mellon University, Pittsburgh, PA, USA
CODEN ITPRED
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2025
DOI 10.1109/TSP.2025.3531292
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Computer and Information Systems Abstracts
Electronics & Communications Abstracts
Technology Research Database
ProQuest Computer Science Collection
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
Discipline Engineering
EISSN 1941-0476
EndPage 918
ExternalDocumentID 10_1109_TSP_2025_3531292
10847582
Genre orig-research
GrantInformation_xml – fundername: Ministry of Science, Technological Development and Innovation
  grantid: No. 451-03-65/2024-03/200156
– fundername: Provincial Secretariat for Higher Education and Scientific Research
  grantid: 142-451-2593/2021-01/2
– fundername: National Science Foundation
  grantid: ECCS 2330195
  funderid: 10.13039/100000001
– fundername: "Scientific and Artistic Research Work of Researchers in Teaching and Associate Positions at the Faculty of Technical Sciences, University of Novi Sad"
  grantid: No. 01-3394/1
– fundername: Science Fund of Republic of Serbia, project "LASCADO"
  grantid: 7359
– fundername: European Union's Horizon Europe program
  grantid: 101093006
– fundername: Serbian Ministry of Science, Technological development and Innovation, within the bilateral project Serbia-Slovakia
  grantid: No. 337-00-3/2024-05/16
ISSN 1053-587X
IsPeerReviewed true
IsScholarly true
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
ORCID 0000-0003-3497-5589
0000-0003-1783-8734
0000-0001-7916-585X
0000-0002-8060-5581
PQID 3169247906
PQPubID 85478
PageCount 16
PublicationCentury 2000
PublicationDate 2025
PublicationDecade 2020
PublicationPlace New York
PublicationTitle IEEE Transactions on Signal Processing
PublicationTitleAbbrev TSP
PublicationYear 2025
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
SourceID proquest
crossref
ieee
SourceType Aggregation Database
Index Database
Publisher
StartPage 903
SubjectTerms Algorithms
Clustering
Clustering algorithms
consensus
Convergence
Costs
Distributed databases
Europe
first-order methods
fixed points
Logistics
networks
peer-to-peer
Peer-to-peer computing
Privacy
Servers
Signal processing algorithms
Title Distributed Center-Based Clustering: A Unified Framework
URI https://ieeexplore.ieee.org/document/10847582
https://www.proquest.com/docview/3169247906
Volume 73