Large Margin Non-Linear Embedding
2005
Conference Paper
It is common in classification methods to first place data in a vector space and then learn decision boundaries. We propose reversing that process: for fixed decision boundaries, we "learn" the location of the data. This way we (i) do not need a metric (or even stronger structure), since pairwise dissimilarities suffice; and additionally (ii) produce low-dimensional embeddings that can be analyzed visually. We achieve this by combining an entropy-based embedding method with an entropy-based version of semi-supervised logistic regression. We present results for clustering and semi-supervised classification.
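The abstract's core idea of "fixed decision boundaries, learned data locations" can be illustrated with a minimal sketch. This is not the paper's actual formulation (which uses entropy-based embedding and pairwise dissimilarities); it is a simplified, hypothetical variant where linear class boundaries in a 2-D embedding space are held fixed and the point coordinates are optimized by gradient descent on a softmax logistic loss. All names and constants below are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch (not the paper's method): fix k linear class
# directions in a 2-D embedding space and learn the coordinates X of
# the data points by gradient descent on a multinomial logistic loss.

rng = np.random.default_rng(0)
n, k, d = 30, 3, 2                      # points, classes, embedding dims
y = rng.integers(0, k, size=n)          # class labels

# Fixed decision structure: three unit class directions 120 degrees apart.
W = np.array([[1.0, 0.0],
              [-0.5, 0.87],
              [-0.5, -0.87]])           # shape (k, d), never updated

X = rng.normal(scale=0.1, size=(n, d))  # learned point locations

def loss_and_grad(X):
    scores = X @ W.T                            # (n, k)
    scores -= scores.max(axis=1, keepdims=True) # numerical stability
    P = np.exp(scores)
    P /= P.sum(axis=1, keepdims=True)
    loss = -np.log(P[np.arange(n), y]).mean()
    G = P.copy()
    G[np.arange(n), y] -= 1.0
    return loss, (G @ W) / n                    # gradient w.r.t. X only

lr = 1.0
for _ in range(500):
    loss, g = loss_and_grad(X)
    X -= lr * g

# Points of each class drift toward their fixed class direction,
# yielding a directly visualizable 2-D embedding.
```

After optimization, plotting the rows of `X` colored by `y` shows three well-separated clusters, one per fixed boundary region; this is the sense in which the embedding itself, rather than the classifier, absorbs the learning.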
Author(s): | Zien, A. and Candela, JQ. |
Book Title: | ICML 2005 |
Journal: | Proceedings of the 22nd International Conference on Machine Learning (ICML 2005) |
Pages: | 1065-1072 |
Year: | 2005 |
Month: | August |
Editors: | De Raedt, L. and Wrobel, S. |
Publisher: | ACM Press |
Department(s): | Empirical Inference |
Bibtex Type: | Conference Paper (inproceedings) |
DOI: | 10.1145/1102351.1102485 |
Event Name: | 22nd International Conference on Machine Learning |
Event Place: | Bonn, Germany |
Address: | New York, NY, USA |
Digital: | 0 |
Language: | en |
Organization: | Max-Planck-Gesellschaft |
School: | Biologische Kybernetik |
BibTex:

@inproceedings{3375,
  title        = {Large Margin Non-Linear Embedding},
  author       = {Zien, A. and Candela, JQ.},
  journal      = {Proceedings of the 22nd International Conference on Machine Learning (ICML 2005)},
  booktitle    = {ICML 2005},
  pages        = {1065-1072},
  editors      = {De Raedt, L. and Wrobel, S.},
  publisher    = {ACM Press},
  organization = {Max-Planck-Gesellschaft},
  school       = {Biologische Kybernetik},
  address      = {New York, NY, USA},
  month        = aug,
  year         = {2005},
  doi          = {10.1145/1102351.1102485},
  month_numeric = {8}
}