1999


Semiparametric support vector and linear programming machines

Smola, A., Friess, T., Schölkopf, B.

In Advances in Neural Information Processing Systems 11, pages: 585-591, (Editors: MS Kearns and SA Solla and DA Cohn), MIT Press, Cambridge, MA, USA, 12th Annual Conference on Neural Information Processing Systems (NIPS), June 1999 (inproceedings)

Abstract
Semiparametric models are useful tools in cases where domain knowledge exists about the function to be estimated or where emphasis is put on the understandability of the model. We extend two learning algorithms, Support Vector machines and Linear Programming machines, to this case and give experimental results for SV machines.

PDF Web [BibTex]

Kernel PCA and De-noising in feature spaces

Mika, S., Schölkopf, B., Smola, A., Müller, K., Scholz, M., Rätsch, G.

In Advances in Neural Information Processing Systems 11, pages: 536-542, (Editors: MS Kearns and SA Solla and DA Cohn), MIT Press, Cambridge, MA, USA, 12th Annual Conference on Neural Information Processing Systems (NIPS), June 1999 (inproceedings)

Abstract
Kernel PCA as a nonlinear feature extractor has proven powerful as a preprocessing step for classification algorithms. But it can also be considered as a natural generalization of linear principal component analysis. This gives rise to the question of how to use nonlinear features for data compression, reconstruction, and de-noising, applications common in linear PCA. This is a nontrivial task, as the results provided by kernel PCA live in some high dimensional feature space and need not have pre-images in input space. This work presents ideas for finding approximate pre-images, focusing on Gaussian kernels, and shows experimental results using these pre-images in data reconstruction and de-noising on toy examples as well as on real world data.
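
The approximate pre-image idea for Gaussian kernels can be sketched with a fixed-point iteration. This is a minimal illustration, not the paper's implementation: the function name and defaults are assumptions, and the coefficient vector `gamma` (which in the paper comes from projecting onto leading kernel PCA components; that projection step is omitted here) is taken as given.

```python
import numpy as np

def gaussian_pre_image(X, gamma, sigma=1.0, n_iter=100, tol=1e-8):
    """Approximate pre-image z of Psi = sum_i gamma_i phi(x_i) under a
    Gaussian kernel k(x, z) = exp(-||x - z||^2 / (2 * sigma**2)),
    via the fixed-point iteration
        z <- sum_i gamma_i k(x_i, z) x_i / sum_i gamma_i k(x_i, z).
    X is an (n, d) array of training points; gamma is an (n,) vector of
    expansion coefficients (assumed given here).
    """
    X = np.asarray(X, dtype=float)
    gamma = np.asarray(gamma, dtype=float)
    z = gamma @ X / gamma.sum()  # linear pre-image as the starting guess
    for _ in range(n_iter):
        k = np.exp(-np.sum((X - z) ** 2, axis=1) / (2.0 * sigma**2))
        w = gamma * k
        z_new = w @ X / w.sum()
        if np.linalg.norm(z_new - z) < tol:
            return z_new
        z = z_new
    return z
```

De-noising an input point would amount to projecting its feature-space image onto the leading kernel principal components to obtain `gamma`, then mapping the result back with this iteration.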

PDF Web [BibTex]

Kernel principal component analysis

Schölkopf, B., Smola, A., Müller, K.

In Advances in Kernel Methods - Support Vector Learning, pages: 327-352, (Editors: B Schölkopf and CJC Burges and AJ Smola), MIT Press, Cambridge, MA, 1999 (inbook)

[BibTex]

Estimating the support of a high-dimensional distribution

Schölkopf, B., Platt, J., Shawe-Taylor, J., Smola, A., Williamson, R.

(MSR-TR-99-87), Microsoft Research, 1999 (techreport)

Web [BibTex]

Single-class Support Vector Machines

Schölkopf, B., Williamson, R., Smola, A., Shawe-Taylor, J.

Dagstuhl-Seminar on Unsupervised Learning, pages: 19-20, (Editors: J. Buhmann, W. Maass, H. Ritter and N. Tishby), 1999 (poster)

[BibTex]

Spatial Learning and Localization in Animals: A Computational Model and Its Implications for Mobile Robots

Balakrishnan, K., Bousquet, O., Honavar, V.

Adaptive Behavior, 7(2):173-216, 1999 (article)

[BibTex]


SVMs for Histogram Based Image Classification

Chapelle, O., Haffner, P., Vapnik, V.

IEEE Transactions on Neural Networks, (9), 1999 (article)

Abstract
Traditional classification approaches generalize poorly on image classification tasks, because of the high dimensionality of the feature space. This paper shows that Support Vector Machines (SVM) can generalize well on difficult image classification problems where the only features are high dimensional histograms. Heavy-tailed RBF kernels of the form $K(\mathbf{x},\mathbf{y})=e^{-\rho\sum_i |x_i^a-y_i^a|^{b}}$ with $a\leq 1$ and $b\leq 2$ are evaluated on the classification of images extracted from the Corel Stock Photo Collection and shown to far outperform traditional polynomial or Gaussian RBF kernels. Moreover, we observed that a simple remapping of the input $x_i \rightarrow x_i^a$ improves the performance of linear SVMs to such an extent that it makes them, for this problem, a valid alternative to RBF kernels.
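
The kernel formula in the abstract is simple to state in code. The following sketch evaluates $K(\mathbf{x},\mathbf{y})=e^{-\rho\sum_i |x_i^a-y_i^a|^b}$ for non-negative inputs such as histogram bins; the function name and parameter defaults are illustrative assumptions, not values from the paper.

```python
import numpy as np

def heavy_tailed_rbf(x, y, a=0.5, b=1.0, rho=1.0):
    """Heavy-tailed RBF kernel K(x, y) = exp(-rho * sum_i |x_i^a - y_i^a|^b).

    a = b = 1 gives a Laplacian-style kernel; a = 1, b = 2 a Gaussian-style
    one. Exponents a < 1 compress large histogram bins, which is the input
    remapping x_i -> x_i^a discussed in the abstract. Inputs are assumed
    non-negative (histogram bin counts).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.exp(-rho * np.sum(np.abs(x**a - y**a) ** b)))
```

Such a function could be passed to any kernel-machine implementation that accepts a user-supplied kernel, e.g. as a precomputed Gram matrix.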

GZIP [BibTex]

Classifying LEP data with support vector algorithms

Vannerem, P., Müller, K., Smola, A., Schölkopf, B., Söldner-Rembold, S.

In Artificial Intelligence in High Energy Nuclear Physics 99, 1999 (inproceedings)

[BibTex]

Generalization Bounds via Eigenvalues of the Gram matrix

Schölkopf, B., Shawe-Taylor, J., Smola, A., Williamson, R.

(99-035), NeuroCOLT, 1999 (techreport)

[BibTex]

Pedestal effects with periodic pulse trains

Henning, G., Wichmann, F.

Perception, 28, pages: S137, 1999 (poster)

Abstract
It is important to know for theoretical reasons how performance varies with stimulus contrast. But, for objects on CRT displays, retinal contrast is limited by the linear range of the display and the modulation transfer function of the eye. For example, with an 8-c/deg sinusoidal grating at 90% contrast, the contrast of the retinal image is barely 45%; more retinal contrast is required, however, to discriminate among theories of contrast discrimination (Wichmann, Henning and Ploghaus, 1998). The stimulus with the greatest contrast at any spatial-frequency component is a periodic pulse train, which has 200% contrast at every harmonic. Such a waveform cannot, of course, be produced; the best we can do with our Mitsubishi display provides a contrast of 150% at an 8-c/deg fundamental, thus producing a retinal image with about 75% contrast. The penalty of using this stimulus is that the 2nd harmonic of the retinal image also has high contrast (with an emmetropic eye, more than 60% of the contrast of the 8-c/deg fundamental) and the mean luminance is not large (24.5 cd/m2 on our display). We have used standard 2-AFC experiments to measure the detectability of an 8-c/deg pulse train against the background of an identical pulse train of different contrasts. An unusually large improvement in detectability was measured, the pedestal effect or "dipper", and the dipper was unusually broad. The implications of these results will be discussed.

[BibTex]

Apprentissage Automatique et Simplicité (Machine Learning and Simplicity)

Bousquet, O.

Biologische Kybernetik, 1999, in French (diplomathesis)

PostScript [BibTex]

Classification on proximity data with LP-machines

Graepel, T., Herbrich, R., Schölkopf, B., Smola, A., Bartlett, P., Müller, K., Obermayer, K., Williamson, R.

In Artificial Neural Networks, 1999. ICANN 99, 470, pages: 304-309, Conference Publications, IEEE, 9th International Conference on Artificial Neural Networks, 1999 (inproceedings)

DOI [BibTex]

Kernel-dependent support vector error bounds

Schölkopf, B., Shawe-Taylor, J., Smola, A., Williamson, R.

In Artificial Neural Networks, 1999. ICANN 99, 470, pages: 103-108, Conference Publications, IEEE, 9th International Conference on Artificial Neural Networks, 1999 (inproceedings)

DOI [BibTex]

Linear programs for automatic accuracy control in regression

Smola, A., Schölkopf, B., Rätsch, G.

In Artificial Neural Networks, 1999. ICANN 99, 470, pages: 575-580, Conference Publications, IEEE, 9th International Conference on Artificial Neural Networks, 1999 (inproceedings)

DOI [BibTex]

Regularized principal manifolds

Smola, A., Williamson, R., Mika, S., Schölkopf, B.

In Lecture Notes in Artificial Intelligence 1572, pages: 214-229, (Editors: P Fischer and H-U Simon), Springer, Berlin, Germany, Computational Learning Theory: 4th European Conference, 1999 (inproceedings)

[BibTex]

Entropy numbers, operators and support vector kernels

Williamson, R., Smola, A., Schölkopf, B.

In Lecture Notes in Artificial Intelligence 1572, pages: 285-299, (Editors: P Fischer and H-U Simon), Springer, Berlin, Germany, Computational Learning Theory: 4th European Conference, 1999 (inproceedings)

[BibTex]

Sparse kernel feature analysis

Smola, A., Mangasarian, O., Schölkopf, B.

(99-04), Data Mining Institute, 1999, 24th Annual Conference of Gesellschaft für Klassifikation, University of Passau (techreport)

PostScript [BibTex]

Machine Learning and Language Acquisition: A Model of Child’s Learning of Turkish Morphophonology

Altun, Y.

Middle East Technical University, Ankara, Turkey, 1999 (mastersthesis)

[BibTex]


Is the Hippocampus a Kalman Filter?

Bousquet, O., Balakrishnan, K., Honavar, V.

In Proceedings of the Pacific Symposium on Biocomputing, 3, pages: 619-630, 1999 (inproceedings)

[BibTex]

A Comparison of Artificial Neural Networks and Cluster Analysis for Typing Biometrics Authentication

Maisuria, K., Ong, CS., Lai, .

In International Joint Conference on Neural Networks, 1999 (inproceedings)

PDF [BibTex]

Implications of the pedestal effect for models of contrast-processing and gain-control

Wichmann, F., Henning, G.

OSA Conference Program, pages: 62, 1999 (poster)

Abstract
Understanding contrast processing is essential for understanding spatial vision. Pedestal contrast systematically affects slopes of functions relating 2-AFC contrast discrimination performance to pedestal contrast. The slopes provide crucial information because only full sets of data allow discrimination among contrast-processing and gain-control models. Issues surrounding Weber's law will also be discussed.

[BibTex]


Entropy numbers, operators and support vector kernels

Williamson, R., Smola, A., Schölkopf, B.

In Advances in Kernel Methods - Support Vector Learning, pages: 127-144, (Editors: B Schölkopf and CJC Burges and AJ Smola), MIT Press, Cambridge, MA, 1999 (inbook)

[BibTex]

Advances in Kernel Methods - Support Vector Learning

Schölkopf, B., Burges, C., Smola, A.

MIT Press, Cambridge, MA, 1999 (book)

[BibTex]

Fisher discriminant analysis with kernels

Mika, S., Rätsch, G., Weston, J., Schölkopf, B., Müller, K.

In Proceedings of the 1999 IEEE Signal Processing Society Workshop, 9, pages: 41-48, (Editors: Y-H Hu and J Larsen and E Wilson and S Douglas), IEEE, Neural Networks for Signal Processing IX, 1999 (inproceedings)

DOI [BibTex]
