1998


Fast approximation of support vector kernel expansions, and an interpretation of clustering as approximation in feature spaces.

Schölkopf, B., Knirsch, P., Smola, A., Burges, C.

In Mustererkennung 1998, pages: 125-132, Informatik aktuell, (Editors: P Levi and M Schanz and R-J Ahlers and F May), Springer, Berlin, Germany, 20th DAGM-Symposium, 1998 (inproceedings)

Abstract
Kernel-based learning methods provide their solutions as expansions in terms of a kernel. We consider the problem of reducing the computational complexity of evaluating these expansions by approximating them using fewer terms. As a by-product, we point out a connection between clustering and approximation in reproducing kernel Hilbert spaces generated by a particular class of kernels.
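
The reduced-expansion idea can be made concrete in a few lines of code. The sketch below is a minimal illustration, not the authors' algorithm: using scikit-learn's KMeans to place the new expansion points and a least-squares refit of the coefficients are assumptions made here. It compresses a Gaussian-kernel expansion to fewer terms and shows why clustering the expansion points arises as a natural sub-step.

import numpy as np
from sklearn.cluster import KMeans

def rbf(A, B, gamma=1.0):
    # Gaussian kernel matrix: k(a, b) = exp(-gamma * ||a - b||^2)
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def reduced_set(X, alpha, m, gamma=1.0):
    """Approximate f(.) = sum_i alpha_i k(x_i, .) by an m-term expansion.

    Step 1: choose m new expansion points by clustering the original ones
            (this is where the link to clustering appears).
    Step 2: refit the coefficients beta by least squares so the reduced
            expansion matches the full one on the original points.
    """
    Z = KMeans(n_clusters=m, n_init=10).fit(X).cluster_centers_
    K_xz = rbf(X, Z, gamma)                 # n x m design matrix
    f_x = rbf(X, X, gamma) @ alpha          # values of the full expansion
    beta, *_ = np.linalg.lstsq(K_xz, f_x, rcond=None)
    return Z, beta

# toy usage: compress a 200-term expansion to 10 terms
X = np.random.randn(200, 2)
alpha = np.random.randn(200)
Z, beta = reduced_set(X, alpha, m=10)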

Web [BibTex]

Kernel PCA pattern reconstruction via approximate pre-images.

Schölkopf, B., Mika, S., Smola, A., Rätsch, G., Müller, K.

In 8th International Conference on Artificial Neural Networks, pages: 147-152, Perspectives in Neural Computing, (Editors: L Niklasson and M Boden and T Ziemke), Springer, Berlin, Germany, 8th International Conference on Artificial Neural Networks, 1998 (inproceedings)

[BibTex]

A bootstrap method for testing hypotheses concerning psychometric functions

Hill, N., Wichmann, F.

1998 (poster)

Abstract
Whenever psychometric functions are used to evaluate human performance on some task, it is valuable to examine not only the threshold and slope values estimated from the original data, but also the expected variability in those measures. This allows psychometric functions obtained in two experimental conditions to be compared statistically. We present a method for estimating the variability of thresholds and slopes of psychometric functions. This involves a maximum-likelihood fit to the data using a three-parameter mathematical function, followed by Monte Carlo simulation using the first fit as a generating function for the simulations. The variability of the function's parameters can then be estimated (as shown by Maloney, 1990), as can the variability of the threshold value (Foster & Bischof, 1997). We will show how a simple development of this procedure can be used to test the significance of differences between (a) the thresholds, and (b) the slopes of two psychometric functions. Further, our method can be used to assess the assumptions underlying the original fit, by examining how goodness-of-fit differs in simulation from its original value. In this way data sets can be identified as being either too noisy to be generated by a binomial observer, or significantly "too good to be true." All software is written in MATLAB and is therefore compatible across platforms, with the option of accelerating performance using MATLAB's plug-in binaries, or "MEX" files.
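
A minimal Python sketch of the recipe described above (maximum-likelihood fit followed by a parametric bootstrap), assuming a three-parameter Weibull psychometric function for 2-AFC data; the parametrization, starting values and optimizer are illustrative choices, not the authors' MATLAB implementation.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

def weibull(x, alpha, beta, lam, guess=0.5):
    # 2-AFC psychometric function: guess rate 0.5, lapse rate lam,
    # threshold alpha, slope parameter beta
    return guess + (1 - guess - lam) * (1 - np.exp(-(x / alpha) ** beta))

def neg_log_lik(params, x, k, n):
    alpha, beta, lam = params
    if alpha <= 0 or beta <= 0 or not 0 <= lam < 0.5:
        return np.inf
    p = np.clip(weibull(x, alpha, beta, lam), 1e-6, 1 - 1e-6)
    return -np.sum(k * np.log(p) + (n - k) * np.log(1 - p))

def fit(x, k, n):
    # maximum-likelihood fit of (threshold, slope, lapse rate)
    res = minimize(neg_log_lik, x0=[np.median(x), 3.0, 0.01],
                   args=(x, k, n), method="Nelder-Mead")
    return res.x

def bootstrap_thresholds(x, k, n, n_sim=1000):
    # Parametric bootstrap: simulate binomial data from the first fit,
    # refit each simulated data set, and collect the threshold estimates.
    alpha_hat, beta_hat, lam_hat = fit(x, k, n)
    p_gen = weibull(x, alpha_hat, beta_hat, lam_hat)
    sims = np.array([fit(x, binom.rvs(n, p_gen), n)[0] for _ in range(n_sim)])
    return alpha_hat, sims   # point estimate plus its bootstrap distribution

Comparing two conditions then amounts to checking whether the observed difference in fitted thresholds (or slopes) lies in the tails of the bootstrap distribution of such differences, and comparing the original deviance with the simulated deviances gives the goodness-of-fit check mentioned in the abstract.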

[BibTex]


Convex Cost Functions for Support Vector Regression

Smola, A., Schölkopf, B., Müller, K.

In 8th International Conference on Artificial Neural Networks, pages: 99-104, Perspectives in Neural Computing, (Editors: L Niklasson and M Boden and T Ziemke), Springer, Berlin, Germany, 8th International Conference on Artificial Neural Networks, 1998 (inproceedings)

[BibTex]

Nonlinearities and the pedestal effect

Wichmann, F., Henning, G., Ploghaus, A.

Perception, 27, pages: S86, 1998 (poster)

Abstract
Psychophysical and physiological evidence suggests that luminance patterns are independently analysed in "channels" responding to different bands of spatial frequency. There are, however, interactions among stimuli falling well outside the usual estimates of channels' bandwidths (Henning, Hertz, and Broadbent, 1975, Vision Res., 15, 887-899). We examined whether the masking results of Henning et al. are consistent with independent channels. We postulated, before the channels, a point non-linearity which would introduce distortion products that might produce the observed interactions between stimuli two octaves apart in spatial frequency. Standard 2-AFC masking experiments determined whether possible distortion products of a 4.185 c/deg masking sinusoid revealed their presence through effects on the detection of a sinusoidal signal at the frequency of the second harmonic of the masker, 8.37 c/deg. The signal and masker were horizontally orientated and the signal was in-phase, out-of-phase, or in quadrature with the putative second-order distortion product of the masker. Significant interactions between signal and masker were observed: for a wide range of masker contrasts, signal detection was facilitated by the masking stimulus. However, the shapes of the functions relating detection performance to masker contrast, as well as the effects of relative phase, were inconsistent with the notion that distortion products were responsible for the interactions observed.
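
The distortion-product argument rests on a simple identity: a quadratic component in a point non-linearity turns a grating at frequency f into a constant plus a component at 2f, which is why a 4.185 c/deg masker could in principle inject energy at 8.37 c/deg. A small numerical check of that step (the quadratic form and its coefficient are purely illustrative, not a model of the visual system):

import numpy as np

f = 4.185                                    # masker spatial frequency (c/deg)
x = np.linspace(0, 2, 2048, endpoint=False)  # two degrees of visual angle

masker = np.cos(2 * np.pi * f * x)
quadratic = 0.2 * masker ** 2                # hypothetical second-order distortion

# cos^2 identity: 0.2*cos^2(u) = 0.1 + 0.1*cos(2u), i.e. a mean shift plus
# a component at 2f = 8.37 c/deg, the signal frequency in the experiment
second_harmonic = 0.1 + 0.1 * np.cos(2 * np.pi * 2 * f * x)
print(np.allclose(quadratic, second_harmonic))   # True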

[BibTex]

Support vector regression with automatic accuracy control.

Schölkopf, B., Bartlett, P., Smola, A., Williamson, R.

In ICANN'98, pages: 111-116, Perspectives in Neural Computing, (Editors: L Niklasson and M Boden and T Ziemke), Springer, Berlin, Germany, International Conference on Artificial Neural Networks (ICANN'98), 1998 (inproceedings)

[BibTex]

General cost functions for support vector regression.

Smola, A., Schölkopf, B., Müller, K.

In Ninth Australian Conference on Neural Networks, pages: 79-83, (Editors: T Downs and M Frean and M Gallagher), 9th Australian Conference on Neural Networks (ACNN'98), 1998 (inproceedings)

[BibTex]

Asymptotically optimal choice of ε-loss for support vector machines.

Smola, A., Murata, N., Schölkopf, B., Müller, K.

In 8th International Conference on Artificial Neural Networks, pages: 105-110, Perspectives in Neural Computing, (Editors: L Niklasson and M Boden and T Ziemke), Springer, Berlin, Germany, 8th International Conference on Artificial Neural Networks, 1998 (inproceedings)

[BibTex]

1996


Quality Prediction of Steel Products using Neural Networks

Shin, H., Jhee, W.

In Proc. of the Korean Expert System Conference, pages: 112-124, Korean Expert System Society Conference, November 1996 (inproceedings)

[BibTex]

Comparison of view-based object recognition algorithms using realistic 3D models

Blanz, V., Schölkopf, B., Bülthoff, H., Burges, C., Vapnik, V., Vetter, T.

In Artificial Neural Networks: ICANN 96, LNCS, vol. 1112, pages: 251-256, Lecture Notes in Computer Science, (Editors: C von der Malsburg and W von Seelen and JC Vorbrüggen and B Sendhoff), Springer, Berlin, Germany, 6th International Conference on Artificial Neural Networks, July 1996 (inproceedings)

Abstract
Two view-based object recognition algorithms are compared: (1) a heuristic algorithm based on oriented filters, and (2) a support vector learning machine trained on low-resolution images of the objects. Classification performance is assessed using a large number of images generated by a computer graphics system under precisely controlled conditions. Training and test images show a set of 25 realistic three-dimensional models of chairs from viewing directions spread over the upper half of the viewing sphere. The percentage of correct identification of all 25 objects is measured.

PDF PDF DOI [BibTex]

Incorporating invariances in support vector learning machines

Schölkopf, B., Burges, C., Vapnik, V.

In Artificial Neural Networks: ICANN 96, LNCS, vol. 1112, pages: 47-52, Lecture Notes in Computer Science, (Editors: C von der Malsburg and W von Seelen and JC Vorbrüggen and B Sendhoff), Springer, Berlin, Germany, 6th International Conference on Artificial Neural Networks, July 1996 (inproceedings)

Abstract
Developed only recently, support vector learning machines achieve high generalization ability by minimizing a bound on the expected test error; however, so far there existed no way of adding knowledge about invariances of a classification problem at hand. We present a method of incorporating prior knowledge about transformation invariances by applying transformations to support vectors, the training examples most critical for determining the classification boundary.
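
A rough sketch of that procedure with present-day tooling (scikit-learn), assuming flattened image data and one-pixel translations as the known invariance; the helper names, the chosen kernel and the single retraining pass are illustrative assumptions, not the original implementation.

import numpy as np
from sklearn.svm import SVC

def shift(images, dx, dy, shape=(16, 16)):
    # translate each flattened image by (dx, dy) pixels (wrap-around for brevity)
    imgs = images.reshape(-1, *shape)
    return np.roll(np.roll(imgs, dx, axis=2), dy, axis=1).reshape(len(images), -1)

def virtual_sv_svm(X, y, **svm_args):
    # 1) train an ordinary SVM
    svm = SVC(kernel="rbf", **svm_args).fit(X, y)
    sv, sv_y = X[svm.support_], y[svm.support_]
    # 2) generate "virtual" examples by applying the known invariance
    #    (here: one-pixel shifts) to the support vectors only
    shifts = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    X_virtual = np.vstack([sv] + [shift(sv, dx, dy) for dx, dy in shifts])
    y_virtual = np.tile(sv_y, 1 + len(shifts))
    # 3) retrain on the support vectors plus their transformed copies
    return SVC(kernel="rbf", **svm_args).fit(X_virtual, y_virtual)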

PDF DOI [BibTex]

A practical Monte Carlo implementation of Bayesian learning

Rasmussen, CE.

In Advances in Neural Information Processing Systems 8, pages: 598-604, (Editors: DS Touretzky and MC Mozer and ME Hasselmo), MIT Press, Cambridge, MA, USA, Ninth Annual Conference on Neural Information Processing Systems (NIPS), June 1996 (inproceedings)

Abstract
A practical method for Bayesian training of feed-forward neural networks using sophisticated Monte Carlo methods is presented and evaluated. In reasonably small amounts of computer time this approach outperforms other state-of-the-art methods on 5 data-limited tasks from real-world domains.

PDF Web [BibTex]

Gaussian Processes for Regression

Williams, CKI., Rasmussen, CE.

In Advances in Neural Information Processing Systems 8, pages: 514-520, (Editors: DS Touretzky and MC Mozer and ME Hasselmo), MIT Press, Cambridge, MA, USA, Ninth Annual Conference on Neural Information Processing Systems (NIPS), June 1996 (inproceedings)

Abstract
The Bayesian analysis of neural networks is difficult because a simple prior over weights implies a complex prior over functions. We investigate the use of a Gaussian process prior over functions, which permits the predictive Bayesian analysis for fixed values of hyperparameters to be carried out exactly using matrix operations. Two methods, using optimization and averaging (via Hybrid Monte Carlo) over hyperparameters, have been tested on a number of challenging problems and have produced excellent results.
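
The matrix operations mentioned here are the standard GP predictive equations; below is a minimal NumPy sketch with a squared-exponential covariance and fixed hyperparameters (the kernel choice, length-scale and noise level are illustrative assumptions).

import numpy as np

def se_kernel(A, B, lengthscale=1.0, variance=1.0):
    # squared-exponential covariance: k(a, b) = s^2 * exp(-||a - b||^2 / (2 l^2))
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d / lengthscale ** 2)

def gp_predict(X, y, X_star, noise=0.1):
    """Exact GP regression posterior for fixed hyperparameters.

    mean = K_*^T (K + sigma^2 I)^{-1} y
    cov  = K_** - K_*^T (K + sigma^2 I)^{-1} K_*
    """
    K = se_kernel(X, X) + noise ** 2 * np.eye(len(X))
    K_star = se_kernel(X, X_star)
    K_ss = se_kernel(X_star, X_star)
    L = np.linalg.cholesky(K)                       # stable inversion via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = K_star.T @ alpha
    v = np.linalg.solve(L, K_star)
    cov = K_ss - v.T @ v
    return mean, cov

# toy usage: noisy sine data, predictions on a grid
X = np.random.uniform(-3, 3, (20, 1))
y = np.sin(X[:, 0]) + 0.1 * np.random.randn(20)
mu, cov = gp_predict(X, y, np.linspace(-3, 3, 100)[:, None])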

PDF Web [BibTex]

Aktives Erwerben eines Ansichtsgraphen zur diskreten Repräsentation offener Umwelten [Active acquisition of a view graph for the discrete representation of open environments].

Franz, M., Schölkopf, B., Mallot, H., Bülthoff, H.

Fortschritte der Künstlichen Intelligenz, pages: 138-147, (Editors: M. Thielscher and S.-E. Bornscheuer), 1996 (poster)

PDF PostScript [BibTex]

Does motion-blur facilitate motion detection?

Wichmann, F., Henning, G.

OSA Conference Program, pages: S127, 1996 (poster)

Abstract
Retinal-image motion induces the perceptual loss of high spatial-frequency content (motion blur) that affects broadband stimuli. The relative detectability of motion blur and motion itself, measured in 2-AFC experiments, shows that, although the blur associated with motion can be detected, motion itself is the more effective cue.

[BibTex]
