Robust vertex enumeration for convex hulls in high dimensions

Bibliographic Details
Published in: Annals of Operations Research, Vol. 295, No. 1, pp. 37–73
Main Authors: Awasthi, Pranjal; Kalantari, Bahman; Zhang, Yikai
Format: Journal Article
Language: English
Published: New York: Springer US, 01.12.2020
ISSN: 0254-5330; EISSN: 1572-9338
Abstract The problem of computing the vertices of the convex hull of a given input set S = {vᵢ ∈ ℝᵐ : i = 1, …, n} is a classic and fundamental problem, studied in the context of computational geometry, linear and convex programming, machine learning, and more. In this article we present the All Vertex Triangle Algorithm (AVTA), a robust and efficient algorithm for this problem. On the one hand, without any assumptions, AVTA computes an approximation to the subset S̄ of all K vertices of the convex hull of S, so that the convex hull of the approximate subset of vertices is as close to conv(S) as desired. On the other hand, assuming a known lower bound γ on the ratio Γ∗/R, where Γ∗ is the minimum of the distances from each vertex to the convex hull of the remaining vertices and R is the diameter of S, AVTA can recover all of S̄. Furthermore, assuming that instead of S the input is an ε-perturbation Sε of S, where ‖vᵢ − vᵢε‖ ≤ εR, AVTA can compute an approximation to conv(S̄ε) to any prescribed accuracy. Also, given a lower bound on the ratio Σ∗/R, where Σ∗ is the minimum of the distances from each vertex to the convex hull of the remaining points of S, AVTA can recover all of S̄ε. We show Σ∗ ≥ ρ∗Γ∗/R, where ρ∗ is the minimum distance between distinct pairs of points in S, and we prove the following main results. Given any t ∈ (0, 1), AVTA computes a subset S̄t of S̄ of cardinality K(t) in O(nK(t)(m + t⁻²)) operations so that for any p ∈ conv(S) its Euclidean distance to conv(S̄t) is at most tR. Given γ ≤ γ∗ = Γ∗/R, AVTA computes S̄ in O(nK(m + γ⁻²)) operations. If K is known, the complexity of AVTA is O(nK(m + γ∗⁻²) log(γ∗⁻¹)). Assuming that instead of S its ε-perturbation Sε is given, we prove the following. Given any t ∈ (0, 1), AVTA computes a subset S̄εt ⊂ S̄ε of cardinality Kε(t) in O(nKε(t)(m + t⁻²)) operations so that for any p ∈ conv(S) its distance to conv(S̄εt) is at most (t + ε)R. Given σ ∈ [4ε, σ∗ = Σ∗/R], AVTA computes S̄ε in O(nKε(m + σ⁻²)) operations, where K ≤ Kε ≤ n. If γ ≤ γ∗ = Γ∗/R is known and satisfies 4ε ≤ γρ∗/R, AVTA computes S̄ε in O(nKε(m + (γρ∗)⁻²)) operations. Given σ ∈ [4ε, σ∗], if K is known, AVTA computes S̄ε in O(nK(m + σ∗⁻²) log(σ∗⁻¹)) operations. We also consider the application of AVTA to the recovery of vertices through the projection of S or Sε under a Johnson–Lindenstrauss randomized linear projection L : ℝᵐ → ℝᵐ′. Denoting U = L(S) and Uε = L(Sε), by relating the robustness parameters of conv(U) and conv(Uε) to those of conv(S) and conv(Sε), we derive analogous complexity bounds for probabilistic computation of the vertex set of conv(U) or of conv(Uε), or an approximation to them. Finally, we apply AVTA to design new practical algorithms for two popular machine learning problems: topic modeling and non-negative matrix factorization. For topic models, our new algorithm leads to significantly better reconstruction of the topic-word matrix than the state-of-the-art approaches of Arora et al. (International conference on machine learning, pp 280–288, 2013) and Bansal et al. (Advances in neural information processing systems, pp 1997–2005, 2014). Additionally, we provide a robust analysis of AVTA and empirically demonstrate that it can handle larger amounts of noise than existing methods.
For non-negative matrix factorization we show that AVTA is competitive with existing methods that are specialized for this task in Arora et al. (Proceedings of the forty-fourth annual ACM symposium on theory of computing, ACM, pp 145–162, 2012a). We also contrast AVTA with the Greedy Clustering coreset algorithm of Blum et al. (Proceedings of the twenty-seventh annual ACM-SIAM symposium on discrete algorithms, Society for Industrial and Applied Mathematics, pp 548–557, 2016) for computing an approximation to the set of vertices, and argue that not only are there regimes where AVTA outperforms that algorithm, but it can also be used as a pre-processing step for their algorithm. Thus the two algorithms in fact complement each other.
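To make the abstract's robustness parameters concrete: a point of S is a vertex of conv(S) exactly when it lies outside the convex hull of the remaining points; Σ∗ is the minimum of these distances over the vertices, and Γ∗ is the analogous minimum measured against the remaining vertices only. The sketch below is not the authors' AVTA implementation; it is a minimal Python illustration, with hypothetical names, that estimates each point's distance to the hull of the others via a generic Frank–Wolfe-style iteration (in the spirit of the Frank and Wolfe 1956 and Clarkson 2010 references) and keeps the points whose distance exceeds a threshold tR.

```python
import numpy as np


def distance_to_hull(p, V, iters=2000, tol=1e-8):
    """Approximate the Euclidean distance from point p to conv(V) (rows of V)
    using a Frank-Wolfe-style iteration on f(x) = 0.5 * ||x - p||^2.

    Generic sketch for illustration only -- not the paper's AVTA code."""
    # Start from the row of V nearest to p.
    x = V[np.argmin(np.linalg.norm(V - p, axis=1))].astype(float)
    for _ in range(iters):
        g = x - p                       # gradient of f at x
        j = int(np.argmin(V @ g))       # linear minimization over conv(V)
        d = V[j] - x
        gap = -(g @ d)                  # Frank-Wolfe duality gap
        if gap <= tol:
            break
        step = min(1.0, gap / (d @ d))  # exact line search on the segment [x, V[j]]
        x = x + step * d
    return float(np.linalg.norm(x - p))


# Screen candidate vertices of a random point cloud: a point whose distance to
# the hull of the remaining points exceeds t * R is kept as an (approximate)
# vertex; the minimum of these distances over the true vertices is the
# abstract's Sigma*.
rng = np.random.default_rng(0)
S = rng.standard_normal((200, 10))                                # n = 200 points in R^10
R = np.linalg.norm(S[:, None, :] - S[None, :, :], axis=-1).max()  # diameter of S
t = 0.05
vertices = [i for i in range(len(S))
            if distance_to_hull(S[i], np.delete(S, i, axis=0)) > t * R]
print(f"{len(vertices)} of {len(S)} points flagged as approximate vertices")
```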
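The abstract's dimension-reduction step can be sketched in the same hedged spirit. The snippet below is a self-contained illustration, not taken from the paper: it applies a dense Gaussian Johnson–Lindenstrauss-style map L : ℝᵐ → ℝᵐ′ (the sizes n, m, m′ are arbitrary choices for the example) and empirically checks how well pairwise distances, and hence the diameter R and the ratios built from it, survive the projection; the vertex screen sketched above would then be run on U = L(S) instead of S.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, m_prime = 300, 2000, 60        # hypothetical sizes for illustration
S = rng.standard_normal((n, m))

# Dense Gaussian Johnson-Lindenstrauss map, scaled so squared lengths are
# preserved in expectation (one standard construction among several).
L = rng.standard_normal((m, m_prime)) / np.sqrt(m_prime)
U = S @ L                            # projected point set U = L(S)

# Empirical distortion of pairwise distances under the projection.
i, j = rng.integers(0, n, size=(2, 1000))
orig = np.linalg.norm(S[i] - S[j], axis=1)
proj = np.linalg.norm(U[i] - U[j], axis=1)
keep = orig > 0                      # skip accidental identical pairs (i == j)
print("median distance ratio after projection:",
      float(np.median(proj[keep] / orig[keep])))
```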
Audience Academic
Author Awasthi, Pranjal (Rutgers University)
Kalantari, Bahman (Rutgers University; kalantar@cs.rutgers.edu)
Zhang, Yikai (Rutgers University)
Copyright Springer Science+Business Media, LLC, part of Springer Nature 2020
DOI 10.1007/s10479-020-03557-0
Discipline Engineering
Computer Science
Business
GeographicLocations United States
IsPeerReviewed true
IsScholarly true
Issue 1
Keywords Random projections
Approximation algorithms
Linear programming
Convex hull membership
Machine learning
References Arora, S., Ge, R., Halpern, Y., Mimno, D., Moitra, A., Sontag, D., et al. (2013). A practical algorithm for topic modeling with provable guarantees. In International conference on machine learning (pp. 280–288).
Burges, C. J. C. (1998). A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery, 2(2), 121–167. https://doi.org/10.1023/A:1009715923555
Vu, K., Poirion, P.-L., & Liberti, L. (2017). Random projections for linear programming. arXiv preprint arXiv:1706.02768.
Yao, L., Mimno, D., & McCallum, A. (2009). Efficient methods for topic model inference on streaming document collections. In Proceedings of the 15th ACM SIGKDD international conference on knowledge discovery and data mining (pp. 937–946). ACM.
Kalantari, B. (2015). A characterization theorem and an algorithm for a convex hull problem. Annals of Operations Research, 226(1), 301–349. https://doi.org/10.1007/s10479-014-1707-2
Chan, T. M. (1996). Optimal output-sensitive convex hull algorithms in two and three dimensions. Discrete & Computational Geometry, 16(4), 361–368. https://doi.org/10.1007/BF02712873
Ding, W., Rohban, M. H., Ishwar, P., & Saligrama, V. (2013). Topic discovery through data dependent and random projections. ICML, 3, 1202–1210.
Arora, S., Ge, R., Kannan, R., & Moitra, A. (2012a). Computing a nonnegative matrix factorization—provably. In Proceedings of the forty-fourth annual ACM symposium on theory of computing (pp. 145–162). ACM.
Barber, C. B., Dobkin, D. P., & Huhdanpaa, H. (1996). The quickhull algorithm for convex hulls. ACM Transactions on Mathematical Software (TOMS), 22(4), 469–483. https://doi.org/10.1145/235815.235821
Blei, D. M. (2012). Probabilistic topic models. Communications of the ACM, 55(4), 77–84. https://doi.org/10.1145/2133806.2133826
Chvatal, V. (1983). Linear programming. New York: Macmillan.
Bansal, T., Bhattacharyya, C., & Kannan, R. (2014). A provable SVD-based algorithm for learning topics in dominant admixture corpus. In Advances in neural information processing systems (pp. 1997–2005).
Karmarkar, N. (1984). A new polynomial-time algorithm for linear programming. In Proceedings of the sixteenth annual ACM symposium on theory of computing (pp. 302–311). ACM.
Zhang, T. (2003). Sequential greedy approximation for certain convex optimization problems. IEEE Transactions on Information Theory, 49(3), 682–691. https://doi.org/10.1109/TIT.2002.808136
Jin, Y., & Kalantari, B. (2006). A procedure of Chvátal for testing feasibility in linear programming and matrix scaling. Linear Algebra and its Applications, 416(2–3), 795–798. https://doi.org/10.1016/j.laa.2005.12.022
Stevens, K., Kegelmeyer, P., Andrzejewski, D., & Buttler, D. (2012). Exploring topic coherence over many models and many topics. In Proceedings of the 2012 joint conference on empirical methods in natural language processing and computational natural language learning (pp. 952–961). Association for Computational Linguistics.
Donoho, D., & Stodden, V. (2003). When does non-negative matrix factorization give a correct decomposition into parts? In Advances in neural information processing systems.
Chan, T. M. (1996). Output-sensitive results on convex hulls, extreme points, and related problems. Discrete & Computational Geometry, 16(4), 369–387. https://doi.org/10.1007/BF02712874
Clarkson, K. L. (2010). Coresets, sparse greedy approximation, and the Frank–Wolfe algorithm. ACM Transactions on Algorithms (TALG), 6(4), 63.
Frank, M., & Wolfe, P. (1956). An algorithm for quadratic programming. Naval Research Logistics (NRL), 3(1–2), 95–110. https://doi.org/10.1002/nav.3800030109
Gärtner, B., & Jaggi, M. (2009). Coresets for polytope distance. In Proceedings of the twenty-fifth annual symposium on computational geometry (pp. 33–42). ACM.
Johnson, W. B., & Lindenstrauss, J. (1984). Extensions of Lipschitz mappings into a Hilbert space. Contemporary Mathematics, 26, 189–206.
Khachiyan, L. G. (1980). Polynomial algorithms in linear programming. USSR Computational Mathematics and Mathematical Physics, 20(1), 53–72. https://doi.org/10.1016/0041-5553(80)90061-0
Anandkumar, A., Foster, D. P., Hsu, D. J., Kakade, S. M., & Liu, Y.-K. (2012). A spectral algorithm for latent Dirichlet allocation. In Advances in neural information processing systems (pp. 917–925).
Blum, A., Har-Peled, S., & Raichel, B. (2016). Sparse approximation via generating point sets. In Proceedings of the twenty-seventh annual ACM-SIAM symposium on discrete algorithms (pp. 548–557). Society for Industrial and Applied Mathematics.
Gilbert, E. G. (1966). An iterative procedure for computing the minimum of a quadratic form on a convex set. SIAM Journal on Control, 4(1), 61–80. https://doi.org/10.1137/0304007
Clarkson, K. L. (1994). More output-sensitive geometric algorithms. In Proceedings of the 35th annual symposium on foundations of computer science (pp. 695–702). IEEE.
Arora, S., Ge, R., & Moitra, A. (2012b). Learning topic models—going beyond SVD. In 2012 IEEE 53rd annual symposium on foundations of computer science (FOCS) (pp. 1–10). IEEE.
Lee, D. D., & Seung, H. S. (2001). Algorithms for non-negative matrix factorization. In Advances in neural information processing systems (pp. 556–562).
Toth, C. D., O'Rourke, J., & Goodman, J. E. (2004). Handbook of discrete and computational geometry. Boca Raton: CRC Press. https://doi.org/10.1201/9781420035315
Chazelle, B. (1993). An optimal convex hull algorithm in any fixed dimension. Discrete & Computational Geometry, 10(1), 377–409. https://doi.org/10.1007/BF02573985
Jaggi, M. (2013). Revisiting Frank–Wolfe: Projection-free sparse convex optimization.
Blei, D. M., Ng, A. Y., & Jordan, M. I. (2003). Latent Dirichlet allocation. Journal of Machine Learning Research, 3(Jan), 993–1022.
SubjectTerms Algorithms
Angle
Apexes
Applications of mathematics
Approximation
Authorship
Business and Management
Clustering
Combinatorics
Complexity
Computational geometry
Convex surfaces
Convexity
Data processing
Enumeration
Euclidean geometry
Factorization
Greedy algorithms
Hulls
Lower bounds
Machine learning
Mathematical programming
Matrix methods
Measurement
Operations research
Operations Research/Decision Theory
Original Research
Parameter robustness
Perturbation
Robustness (mathematics)
Technology application
Theory of Computation
Triangles
Vertex sets
URI https://link.springer.com/article/10.1007/s10479-020-03557-0
https://www.proquest.com/docview/2471727850