

1999


Classification on proximity data with LP-machines

Graepel, T., Herbrich, R., Schölkopf, B., Smola, A., Bartlett, P., Müller, K., Obermayer, K., Williamson, R.

In Artificial Neural Networks, 1999. ICANN 99, 470, pages: 304-309, Conference Publications, IEEE, 9th International Conference on Artificial Neural Networks, 1999 (inproceedings)

DOI [BibTex]



Kernel-dependent support vector error bounds

Schölkopf, B., Shawe-Taylor, J., Smola, A., Williamson, R.

In Artificial Neural Networks, 1999. ICANN 99, 470, pages: 103-108, Conference Publications, IEEE, 9th International Conference on Artificial Neural Networks, 1999 (inproceedings)

DOI [BibTex]



Linear programs for automatic accuracy control in regression

Smola, A., Schölkopf, B., Rätsch, G.

In Artificial Neural Networks, 1999. ICANN 99, 470, pages: 575-580, Conference Publications, IEEE, 9th International Conference on Artificial Neural Networks, 1999 (inproceedings)

DOI [BibTex]



Classifying LEP data with support vector algorithms.

Vannerem, P., Müller, K., Smola, A., Schölkopf, B., Söldner-Rembold, S.

In Artificial Intelligence in High Energy Nuclear Physics 99, 1999 (inproceedings)

[BibTex]



Generalization Bounds via Eigenvalues of the Gram matrix

Schölkopf, B., Shawe-Taylor, J., Smola, A., Williamson, R.

(99-035), NeuroCOLT, 1999 (techreport)

[BibTex]



Pedestal effects with periodic pulse trains

Henning, G., Wichmann, F.

Perception, 28, pages: S137, 1999 (poster)

Abstract
It is important to know for theoretical reasons how performance varies with stimulus contrast. But, for objects on CRT displays, retinal contrast is limited by the linear range of the display and the modulation transfer function of the eye. For example, with an 8 c/deg sinusoidal grating at 90% contrast, the contrast of the retinal image is barely 45%; more retinal contrast is required, however, to discriminate among theories of contrast discrimination (Wichmann, Henning and Ploghaus, 1998). The stimulus with the greatest contrast at any spatial-frequency component is a periodic pulse train, which has 200% contrast at every harmonic. Such a waveform cannot, of course, be produced; the best we can do with our Mitsubishi display provides a contrast of 150% at an 8-c/deg fundamental, thus producing a retinal image with about 75% contrast. The penalty of using this stimulus is that the 2nd harmonic of the retinal image also has high contrast (with an emmetropic eye, more than 60% of the contrast of the 8-c/deg fundamental) and the mean luminance is not large (24.5 cd/m2 on our display). We have used standard 2-AFC experiments to measure the detectability of an 8-c/deg pulse train against the background of an identical pulse train of different contrasts. An unusually large improvement in detectability was measured, the pedestal effect or "dipper," and the dipper was unusually broad. The implications of these results will be discussed.

[BibTex]

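The abstract's claim that a periodic pulse train carries 200% contrast at every harmonic follows directly from its Fourier series: each harmonic of an impulse train has amplitude twice the mean. A quick numerical check in NumPy (an illustration added here, not part of the original study):

```python
import numpy as np

N = 1024                          # samples in one period
x = np.zeros(N)
x[0] = N                          # one pulse per period; mean luminance 1.0
c = np.fft.rfft(x) / N            # Fourier coefficients of the waveform
mean_lum = c[0].real              # DC term = mean luminance
# harmonic contrast: amplitude of harmonic n relative to the mean
contrasts = 2 * np.abs(c[1:9]) / mean_lum   # = 2.0 (i.e., 200%) at every harmonic
```

Every entry of `contrasts` equals 2.0, confirming the 200%-per-harmonic figure quoted in the abstract.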


Apprentissage Automatique et Simplicité (Machine Learning and Simplicity)

Bousquet, O.

Biologische Kybernetik, 1999, in French (diplomathesis)

PostScript [BibTex]



Is the Hippocampus a Kalman Filter?

Bousquet, O., Balakrishnan, K., Honavar, V.

In Proceedings of the Pacific Symposium on Biocomputing, 3, pages: 619-630, 1999 (inproceedings)

[BibTex]



A Comparison of Artificial Neural Networks and Cluster Analysis for Typing Biometrics Authentication

Maisuria, K., Ong, CS., Lai

In unknown, pages: 9999-9999, International Joint Conference on Neural Networks, 1999 (inproceedings)

PDF [BibTex]



Regularized principal manifolds.

Smola, A., Williamson, R., Mika, S., Schölkopf, B.

In Lecture Notes in Artificial Intelligence, Vol. 1572, pages: 214-229, (Editors: P Fischer and H-U Simon), Springer, Berlin, Germany, Computational Learning Theory: 4th European Conference, 1999 (inproceedings)

[BibTex]



Entropy numbers, operators and support vector kernels.

Williamson, R., Smola, A., Schölkopf, B.

In Lecture Notes in Artificial Intelligence, Vol. 1572, pages: 285-299, (Editors: P Fischer and H-U Simon), Springer, Berlin, Germany, Computational Learning Theory: 4th European Conference, 1999 (inproceedings)

[BibTex]



Sparse kernel feature analysis

Smola, A., Mangasarian, O., Schölkopf, B.

(99-04), Data Mining Institute, 1999, 24th Annual Conference of Gesellschaft für Klassifikation, University of Passau (techreport)

PostScript [BibTex]



Machine Learning and Language Acquisition: A Model of Child’s Learning of Turkish Morphophonology

Altun, Y.

Middle East Technical University, Ankara, Turkey, 1999 (mastersthesis)

[BibTex]


Implications of the pedestal effect for models of contrast-processing and gain-control

Wichmann, F., Henning, G.

OSA Conference Program, pages: 62, 1999 (poster)

Abstract
Understanding contrast processing is essential for understanding spatial vision. Pedestal contrast systematically affects slopes of functions relating 2-AFC contrast discrimination performance to pedestal contrast. The slopes provide crucial information because only full sets of data allow discrimination among contrast-processing and gain-control models. Issues surrounding Weber's law will also be discussed.

[BibTex]


Advances in Kernel Methods - Support Vector Learning

Schölkopf, B., Burges, C., Smola, A.

MIT Press, Cambridge, MA, 1999 (book)

[BibTex]



Fisher discriminant analysis with kernels

Mika, S., Rätsch, G., Weston, J., Schölkopf, B., Müller, K.

In Proceedings of the 1999 IEEE Signal Processing Society Workshop, 9, pages: 41-48, (Editors: Y-H Hu and J Larsen and E Wilson and S Douglas), IEEE, Neural Networks for Signal Processing IX, 1999 (inproceedings)

DOI [BibTex]



Entropy numbers, operators and support vector kernels.

Williamson, R., Smola, A., Schölkopf, B.

In Advances in Kernel Methods - Support Vector Learning, pages: 127-144, (Editors: B Schölkopf and CJC Burges and AJ Smola), MIT Press, Cambridge, MA, 1999 (inbook)

[BibTex]


1996


The DELVE user manual

Rasmussen, CE., Neal, RM., Hinton, GE., van Camp, D., Revow, M., Ghahramani, Z., Kustra, R., Tibshirani, R.

Department of Computer Science, University of Toronto, December 1996 (techreport)

Abstract
This manual describes the preliminary release of the DELVE environment. Some features described here have not yet been implemented, as noted. Support for regression tasks is presently somewhat more developed than that for classification tasks. We recommend that you exercise caution when using this version of DELVE for real work, as it is possible that bugs remain in the software. We hope that you will send us reports of any problems you encounter, as well as any other comments you may have on the software or manual, at the e-mail address below. Please mention the version number of the manual and/or the software with any comments you send.

GZIP [BibTex]



Nonlinear Component Analysis as a Kernel Eigenvalue Problem

Schölkopf, B., Smola, A., Müller, K.

(44), Max Planck Institute for Biological Cybernetics, Tübingen, December 1996, This technical report has also been published elsewhere (techreport)

Abstract
We describe a new method for performing a nonlinear form of Principal Component Analysis. By the use of integral operator kernel functions, we can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map; for instance, the space of all possible 5-pixel products in 16 x 16 images. We give the derivation of the method, along with a discussion of other techniques which can be made nonlinear with the kernel approach, and present first experimental results on nonlinear feature extraction for pattern recognition.

[BibTex]

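The method the abstract describes reduces to an eigenvalue problem on the centred kernel (Gram) matrix. A minimal NumPy sketch of that computation, assuming an RBF kernel (the function name, kernel choice, and parameters are illustrative, not the authors' code):

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project training data onto leading nonlinear principal components."""
    # RBF kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # centre the kernel matrix in feature space
    n = len(X)
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # leading eigenvectors of the centred Gram matrix give the components
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    alphas = vecs[:, idx] / np.sqrt(vals[idx])  # normalise expansion coefficients
    return Kc @ alphas                          # projections of the training points

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
Z = kernel_pca(X, n_components=2, gamma=0.5)    # 20 points, 2 nonlinear components
```

The key point of the paper is that the feature space is never constructed explicitly; everything is expressed through the kernel matrix `K`.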


Quality Prediction of Steel Products using Neural Networks

Shin, H., Jhee, W.

In Proc. of the Korean Expert System Conference, pages: 112-124, Korean Expert System Society Conference, November 1996 (inproceedings)

[BibTex]



Comparison of view-based object recognition algorithms using realistic 3D models

Blanz, V., Schölkopf, B., Bülthoff, H., Burges, C., Vapnik, V., Vetter, T.

In Artificial Neural Networks: ICANN 96, LNCS, vol. 1112, pages: 251-256, Lecture Notes in Computer Science, (Editors: C von der Malsburg and W von Seelen and JC Vorbrüggen and B Sendhoff), Springer, Berlin, Germany, 6th International Conference on Artificial Neural Networks, July 1996 (inproceedings)

Abstract
Two view-based object recognition algorithms are compared: (1) a heuristic algorithm based on oriented filters, and (2) a support vector learning machine trained on low-resolution images of the objects. Classification performance is assessed using a high number of images generated by a computer graphics system under precisely controlled conditions. Training- and test-images show a set of 25 realistic three-dimensional models of chairs from viewing directions spread over the upper half of the viewing sphere. The percentage of correct identification of all 25 objects is measured.

PDF PDF DOI [BibTex]



Learning View Graphs for Robot Navigation

Franz, M., Schölkopf, B., Georg, P., Mallot, H., Bülthoff, H.

(33), Max Planck Institute for Biological Cybernetics, Tübingen, July 1996 (techreport)

Abstract
We present a purely vision-based scheme for learning a parsimonious representation of an open environment. Using simple exploration behaviours, our system constructs a graph of appropriately chosen views. To navigate between views connected in the graph, we employ a homing strategy inspired by findings of insect ethology. Simulations and robot experiments demonstrate the feasibility of the proposed approach.

[BibTex]



Incorporating invariances in support vector learning machines

Schölkopf, B., Burges, C., Vapnik, V.

In Artificial Neural Networks: ICANN 96, LNCS, vol. 1112, pages: 47-52, Lecture Notes in Computer Science, (Editors: C von der Malsburg and W von Seelen and JC Vorbrüggen and B Sendhoff), Springer, Berlin, Germany, 6th International Conference on Artificial Neural Networks, July 1996 (inproceedings)

Abstract
Developed only recently, support vector learning machines achieve high generalization ability by minimizing a bound on the expected test error; however, so far there existed no way of adding knowledge about invariances of a classification problem at hand. We present a method of incorporating prior knowledge about transformation invariances by applying transformations to support vectors, the training examples most critical for determining the classification boundary.

PDF DOI [BibTex]



A practical Monte Carlo implementation of Bayesian learning

Rasmussen, CE.

In Advances in Neural Information Processing Systems 8, pages: 598-604, (Editors: D. S. Touretzky, M. C. Mozer, M. E. Hasselmo), MIT Press, Cambridge, MA, USA, Ninth Annual Conference on Neural Information Processing Systems (NIPS), June 1996 (inproceedings)

Abstract
A practical method for Bayesian training of feed-forward neural networks using sophisticated Monte Carlo methods is presented and evaluated. In reasonably small amounts of computer time this approach outperforms other state-of-the-art methods on 5 data-limited tasks from real-world domains.

PDF Web [BibTex]

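The "sophisticated Monte Carlo methods" referred to are hybrid (Hamiltonian) Monte Carlo samplers. A toy Hamiltonian Monte Carlo sampler on a one-dimensional Gaussian target, sketched here for illustration (the function names, step size, and trajectory length are hypothetical choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def hmc_sample(logp_grad, x0, n_samples=500, eps=0.1, n_leapfrog=20):
    """Minimal HMC: leapfrog dynamics plus a Metropolis accept/reject step."""
    def U(x):                         # potential energy = -log p(x)
        return -logp_grad(x)[0]
    x, samples = x0, []
    for _ in range(n_samples):
        p = rng.normal(size=x.shape)  # resample momentum
        x_new, p_new = x.copy(), p.copy()
        # leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * eps * logp_grad(x_new)[1]
        for _ in range(n_leapfrog - 1):
            x_new += eps * p_new
            p_new += eps * logp_grad(x_new)[1]
        x_new += eps * p_new
        p_new += 0.5 * eps * logp_grad(x_new)[1]
        # accept/reject on the change in total energy preserves the target
        dH = (U(x_new) + 0.5 * p_new @ p_new) - (U(x) + 0.5 * p @ p)
        if rng.random() < np.exp(-dH):
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

# standard normal target: log p(x) = -x^2/2, gradient = -x
logp_grad = lambda x: (-0.5 * x @ x, -x)
samples = hmc_sample(logp_grad, np.zeros(1))
```

In the paper this machinery is applied to the posterior over network weights rather than a toy Gaussian, but the accept/reject logic is the same.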


Gaussian Processes for Regression

Williams, CKI., Rasmussen, CE.

In Advances in Neural Information Processing Systems 8, pages: 514-520, (Editors: D. S. Touretzky, M. C. Mozer, M. E. Hasselmo), MIT Press, Cambridge, MA, USA, Ninth Annual Conference on Neural Information Processing Systems (NIPS), June 1996 (inproceedings)

Abstract
The Bayesian analysis of neural networks is difficult because a simple prior over weights implies a complex prior over functions. We investigate the use of a Gaussian process prior over functions, which permits the predictive Bayesian analysis for fixed values of hyperparameters to be carried out exactly using matrix operations. Two methods, using optimization and averaging (via Hybrid Monte Carlo) over hyperparameters have been tested on a number of challenging problems and have produced excellent results.

PDF Web [BibTex]

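The exact predictive analysis "using matrix operations" mentioned in the abstract is short enough to sketch. A minimal Gaussian process regressor with a squared-exponential covariance (the hyperparameters `ell`, `sf2`, and `noise` are illustrative choices, not values from the paper):

```python
import numpy as np

def gp_predict(X, y, Xstar, ell=0.2, sf2=1.0, noise=1e-2):
    """Exact GP regression for 1-D inputs with a squared-exponential kernel."""
    def k(A, B):
        d = A[:, None] - B[None, :]
        return sf2 * np.exp(-0.5 * (d / ell) ** 2)
    # covariance of the noisy training targets
    K = k(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)                   # stable solves via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = k(Xstar, X)
    mean = Ks @ alpha                           # predictive mean
    v = np.linalg.solve(L, Ks.T)
    var = sf2 + noise - np.sum(v ** 2, axis=0)  # predictive variance (incl. noise)
    return mean, var

X = np.linspace(0.0, 1.0, 10)
y = np.sin(2 * np.pi * X)
mean, var = gp_predict(X, y, X)                 # predictions at the training inputs
```

This is the "fixed hyperparameters" case the abstract mentions; the paper's two extensions (optimisation and Hybrid Monte Carlo averaging) treat `ell`, `sf2`, and `noise` as unknowns.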


Evaluation of Gaussian Processes and other Methods for Non-Linear Regression

Rasmussen, CE.

Biologische Kybernetik, Graduate Department of Computer Science, University of Toronto, 1996 (phdthesis)

PostScript [BibTex]



Künstliches Lernen (Artificial Learning)

Schölkopf, B.

In Komplexe adaptive Systeme, Forum für Interdisziplinäre Forschung, 15, pages: 93-117, Forum für interdisziplinäre Forschung, (Editors: S Bornholdt and PH Feindt), Röll, Dettelbach, 1996 (inbook)

[BibTex]



Does motion-blur facilitate motion detection?

Wichmann, F., Henning, G.

OSA Conference Program, pages: S127, 1996 (poster)

Abstract
Retinal-image motion induces the perceptual loss of high spatial-frequency content - motion blur - that affects broadband stimuli. The relative detectability of motion blur and motion itself, measured in 2-AFC experiments, shows that, although the blur associated with motion can be detected, motion itself is the more effective cue.

[BibTex]



Aktives Erwerben eines Ansichtsgraphen zur diskreten Repräsentation offener Umwelten (Active Acquisition of a View Graph for the Discrete Representation of Open Environments).

Franz, M., Schölkopf, B., Mallot, H., Bülthoff, H.

Fortschritte der Künstlichen Intelligenz, pages: 138-147, (Editors: M. Thielscher and S.-E. Bornscheuer), 1996 (poster)

PDF PostScript [BibTex]


1993


Presynaptic and Postsynaptic Competition in models for the Development of Neuromuscular Connections

Rasmussen, CE., Willshaw, DJ.

Biological Cybernetics, 68, pages: 409-419, 1993 (article)

Abstract
The development of the nervous system involves in many cases interactions on a local scale rather than the execution of a fully specified genetic blueprint. The problem is to discover the nature of these interactions and the factors on which they depend. The withdrawal of polyinnervation in developing muscle is an example where such competitive interactions play an important role. We examine the possible types of competition in formal models that have plausible biological implementations. By relating the behaviour of the models to the anatomical and physiological findings we show that a model that incorporates two types of competition is superior to others. Analysis suggests that the phenomenon of intrinsic withdrawal is a side effect of the competitive mechanisms rather than a separate non-competitive feature. Full scale computer simulations have been used to confirm the capabilities of this model.

PostScript [BibTex]



Cartesian Dynamics of Simple Molecules: X Linear Quadratomics (C∞v Symmetry).

Anderson, A., Davison, T., Nagi, N., Schlueter, S.

Spectroscopy Letters, 26, pages: 509-522, 1993 (article)

[BibTex]
