Abstract
Finding a point that minimizes the maximal distortion with respect to a dataset is an important estimation problem that has recently received growing attention in machine learning, with the advent of one-class classification. We propose two theoretically founded generalizations, to arbitrary Bregman divergences, of a recent popular smallest enclosing ball approximation algorithm for Euclidean spaces devised by Bădoiu and Clarkson in 2002. © Springer-Verlag Berlin Heidelberg 2005.
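For context, the Euclidean algorithm being generalized is the Bădoiu–Clarkson iteration: start from an arbitrary data point and, at step t, move the current center a 1/(t+1) fraction of the way toward the farthest point of the dataset. A minimal sketch (function name, NumPy usage, and the synthetic example are illustrative assumptions, not from the paper):

```python
import numpy as np

def bc_enclosing_ball(points, iterations=500):
    """Badoiu-Clarkson approximation of the smallest enclosing ball
    (Euclidean case; the paper generalizes this to Bregman divergences).
    """
    c = points[0].astype(float).copy()  # arbitrary starting center
    for t in range(1, iterations + 1):
        # Farthest point from the current center.
        far = points[np.argmax(np.linalg.norm(points - c, axis=1))]
        # Move a 1/(t+1) fraction of the way toward it.
        c += (far - c) / (t + 1)
    radius = np.max(np.linalg.norm(points - c, axis=1))
    return c, radius

# Illustrative example: points on the unit circle, whose enclosing
# ball is centered near the origin with radius close to 1.
rng = np.random.default_rng(0)
angles = rng.uniform(0.0, 2.0 * np.pi, 50)
pts = np.stack([np.cos(angles), np.sin(angles)], axis=1)
center, r = bc_enclosing_ball(pts)
```

The appeal of the scheme is that the number of iterations, not the dimension or dataset size, controls the approximation quality, which is what makes generalizations beyond the Euclidean setting attractive.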
Nock, R., & Nielsen, F. (2005). Fitting the smallest enclosing Bregman ball. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3720 LNAI, pp. 649–656). https://doi.org/10.1007/11564096_65