


Kernel methods in medical imaging

Charpiat, G., Hofmann, M., Schölkopf, B.

In Handbook of Biomedical Imaging, pages: 63-81, 4, (Editors: Paragios, N., Duncan, J. and Ayache, N.), Springer, Berlin, Germany, June 2015 (inbook)

Web link (url) [BibTex]


Statistical and Machine Learning Methods for Neuroimaging: Examples, Challenges, and Extensions to Diffusion Imaging Data

O’Donnell, L. J., Schultz, T.

In Visualization and Processing of Higher Order Descriptors for Multi-Valued Data, pages: 299-319, (Editors: Hotz, I. and Schultz, T.), Springer, 2015 (inbook)

[BibTex]


Justifying Information-Geometric Causal Inference

Janzing, D., Steudel, B., Shajarisales, N., Schölkopf, B.

In Measures of Complexity: Festschrift for Alexey Chervonenkis, pages: 253-265, 18, (Editors: Vovk, V., Papadopoulos, H. and Gammerman, A.), Springer, 2015 (inbook)

DOI [BibTex]

2009


Methods for feature selection in a learning machine

Weston, J., Elisseeff, A., Schölkopf, B., Pérez-Cruz, F.

United States Patent, No 7624074, November 2009 (patent)

[BibTex]


Toward a Theory of Consciousness

Tononi, G., Balduzzi, D.

In The Cognitive Neurosciences, pages: 1201-1220, (Editors: Gazzaniga, M.S.), MIT Press, Cambridge, MA, USA, October 2009 (inbook)

Web [BibTex]


Acquiring web page information without commitment to downloading the web page

Heilbron, L., Platt, J. C., Simard, P. Y., Schölkopf, B.

United States Patent, No 7565409, July 2009 (patent)

[BibTex]


Text Clustering with Mixture of von Mises-Fisher Distributions

Sra, S., Banerjee, A., Ghosh, J., Dhillon, I.

In Text mining: classification, clustering, and applications, pages: 121-161, Chapman & Hall/CRC data mining and knowledge discovery series, (Editors: Srivastava, A. N. and Sahami, M.), CRC Press, Boca Raton, FL, USA, June 2009 (inbook)

Web DOI [BibTex]


Data Mining for Biologists

Tsuda, K.

In Biological Data Mining in Protein Interaction Networks, pages: 14-27, (Editors: Li, X. and Ng, S.-K.), Medical Information Science Reference, Hershey, PA, USA, May 2009 (inbook)

Abstract
In this tutorial chapter, we review the basics of frequent pattern mining algorithms, including itemset mining, association rule mining, and graph mining. These algorithms can find frequently appearing substructures in discrete data. They can discover structural motifs, for example, from mutation data, protein structures, and chemical compounds. Because they have primarily been used for business data, biological applications are not yet common, but their potential impact would be large. Recent advances in computing, including multicore machines and ever-increasing memory capacity, support the application of such methods to larger datasets. We explain technical aspects of the algorithms, but do not go into details. Current biological applications are summarized and possible future directions are given.
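
A minimal, self-contained sketch of the simplest flavour of frequent pattern mining mentioned above, frequent itemset counting. It is an illustration only, not code from the chapter; the brute-force enumeration stands in for the candidate pruning that algorithms such as Apriori perform.

from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support, max_size=3):
    """Return all itemsets of size <= max_size that occur in at least
    min_support transactions (brute force; Apriori prunes this search)."""
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))
        for k in range(1, max_size + 1):
            for itemset in combinations(items, k):
                counts[itemset] += 1
    return {s: c for s, c in counts.items() if c >= min_support}

# Toy "transactions", e.g. sets of structural motifs observed per protein.
transactions = [{"A", "B", "C"}, {"A", "B"}, {"A", "C"}, {"B", "C"}, {"A", "B", "C"}]
print(frequent_itemsets(transactions, min_support=3))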

Web [BibTex]


Large Margin Methods for Part of Speech Tagging

Altun, Y.

In Automatic Speech and Speaker Recognition: Large Margin and Kernel Methods, pages: 141-160, (Editors: Keshet, J. and Bengio, S.), Wiley, Hoboken, NJ, USA, January 2009 (inbook)

Web [BibTex]


Pre-processed feature ranking for a support vector machine

Weston, J., Elisseeff, A., Schölkopf, B., Pérez-Cruz, F., Guyon, I.

United States Patent, No. 7475048, January 2009 (patent)

[BibTex]


Covariate shift and local learning by distribution matching

Gretton, A., Smola, A., Huang, J., Schmittfull, M., Borgwardt, K., Schölkopf, B.

In Dataset Shift in Machine Learning, pages: 131-160, (Editors: Quiñonero-Candela, J., Sugiyama, M., Schwaighofer, A. and Lawrence, N. D.), MIT Press, Cambridge, MA, USA, 2009 (inbook)

Abstract
Given sets of observations of training and test data, we consider the problem of re-weighting the training data such that its distribution more closely matches that of the test data. We achieve this goal by matching covariate distributions between training and test sets in a high dimensional feature space (specifically, a reproducing kernel Hilbert space). This approach does not require distribution estimation. Instead, the sample weights are obtained by a simple quadratic programming procedure. We provide a uniform convergence bound on the distance between the reweighted training feature mean and the test feature mean, a transductive bound on the expected loss of an algorithm trained on the reweighted data, and a connection to single class SVMs. While our method is designed to deal with the case of simple covariate shift (in the sense of Chapter ??), we have also found benefits for sample selection bias on the labels. Our correction procedure yields its greatest and most consistent advantages when the learning algorithm returns a classifier/regressor that is "simpler" than the data might suggest.
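
As a rough illustration of the quadratic programming step described above, the sketch below computes importance weights for the training points by matching kernel feature means. It is not the chapter's own code; numpy and cvxpy are assumed purely for the illustration, and the bound B and slack eps are arbitrary example values.

import numpy as np
import cvxpy as cp  # assumed QP solver for this illustration

def gaussian_kernel(X, Y, sigma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def kmm_weights(X_train, X_test, sigma=1.0, B=10.0, eps=0.1):
    """Re-weight training points so that their weighted kernel feature mean
    approaches the test feature mean (solved as a small QP)."""
    n_tr, n_te = len(X_train), len(X_test)
    K = gaussian_kernel(X_train, X_train, sigma) + 1e-8 * np.eye(n_tr)  # keep PSD
    kappa = (n_tr / n_te) * gaussian_kernel(X_train, X_test, sigma).sum(axis=1)
    beta = cp.Variable(n_tr)
    objective = cp.Minimize(0.5 * cp.quad_form(beta, K) - kappa @ beta)
    constraints = [beta >= 0, beta <= B,                       # bounded weights
                   cp.abs(cp.sum(beta) - n_tr) <= eps * n_tr]  # weights average to ~1
    cp.Problem(objective, constraints).solve()
    return beta.value

rng = np.random.default_rng(0)
X_tr = rng.normal(0.0, 1.0, size=(50, 2))
X_te = rng.normal(0.5, 1.0, size=(30, 2))  # covariate-shifted test sample
weights = kmm_weights(X_tr, X_te)

The resulting weights can then be handed to any learning algorithm that accepts per-sample weights.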

PDF Web [BibTex]


An introduction to Kernel Learning Algorithms

Gehler, P., Schölkopf, B.

In Kernel Methods for Remote Sensing Data Analysis, pages: 25-48, 2, (Editors: Gustavo Camps-Valls and Lorenzo Bruzzone), Wiley, New York, NY, USA, 2009 (inbook)

Abstract
Kernel learning algorithms are currently becoming a standard tool in the area of machine learning and pattern recognition. In this chapter we review the fundamental theory of kernel learning. As the basic building block we introduce the kernel function, which provides an elegant and general way to compare possibly very complex objects. We then review the concept of a reproducing kernel Hilbert space and state the representer theorem. Finally we give an overview of the most prominent algorithms, which are support vector classification and regression, Gaussian processes, and kernel principal component analysis. With multiple kernel learning and structured output prediction we also introduce some more recent advancements in the field.
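
As a brief illustration of these ingredients (not taken from the chapter), the sketch below combines a kernel function with the representer theorem in the simplest setting, kernel ridge regression, whose predictor is a kernel expansion f(x) = sum_i alpha_i k(x_i, x); it is closely related to the Gaussian process posterior mean.

import numpy as np

def rbf(a, b, gamma=1.0):
    """Gaussian (RBF) kernel: compares two inputs without an explicit feature map."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

def kernel_ridge_fit(X, y, lam=0.1, gamma=1.0):
    # By the representer theorem the solution lies in the span of k(x_i, .),
    # so fitting reduces to solving a linear system for the coefficients alpha.
    K = np.array([[rbf(a, b, gamma) for b in X] for a in X])
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, x_new, gamma=1.0):
    return sum(a * rbf(x_i, x_new, gamma) for a, x_i in zip(alpha, X_train))

X = np.linspace(-3, 3, 30).reshape(-1, 1)
y = np.sin(X).ravel()
alpha = kernel_ridge_fit(X, y)
print(kernel_ridge_predict(X, alpha, np.array([0.5])))  # roughly sin(0.5)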

link (url) DOI [BibTex]

2002


Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond

Schölkopf, B., Smola, A.

pages: 644, Adaptive Computation and Machine Learning, MIT Press, Cambridge, MA, USA, December 2002, Parts of this book, including an introduction to kernel methods, can be downloaded here. (book)

Abstract
In the 1990s, a new type of learning algorithm was developed, based on results from statistical learning theory: the Support Vector Machine (SVM). This gave rise to a new class of theoretically elegant learning machines that use a central concept of SVMs, the kernel, for a number of learning tasks. Kernel machines provide a modular framework that can be adapted to different tasks and domains by the choice of the kernel function and the base algorithm. They are replacing neural networks in a variety of fields, including engineering, information retrieval, and bioinformatics. Learning with Kernels provides an introduction to SVMs and related kernel methods. Although the book begins with the basics, it also includes the latest research. It provides all of the concepts necessary to enable a reader equipped with some basic mathematical knowledge to enter the world of machine learning using theoretically well-founded yet easy-to-use kernel algorithms and to understand and apply the powerful algorithms that have been developed over the last few years.
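
A small sketch of the modularity point made above: the same base algorithm (a support vector classifier) adapted to a dataset simply by swapping the kernel. scikit-learn is assumed here only for the illustration; the book itself predates it.

from sklearn.datasets import make_moons
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
for kernel in ("linear", "poly", "rbf"):
    # Same learning machine, different kernel; only the similarity measure changes.
    score = cross_val_score(SVC(kernel=kernel), X, y, cv=5).mean()
    print(f"{kernel:>6} kernel: mean CV accuracy = {score:.2f}")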

Web [BibTex]

2000


Robust ensemble learning

Rätsch, G., Schölkopf, B., Smola, A., Mika, S., Onoda, T., Müller, K.

In Advances in Large Margin Classifiers, pages: 207-220, Neural Information Processing Series, (Editors: AJ Smola and PL Bartlett and B Schölkopf and D Schuurmans), MIT Press, Cambridge, MA, USA, October 2000 (inbook)

[BibTex]


Entropy numbers for convex combinations and MLPs

Smola, A., Elisseeff, A., Schölkopf, B., Williamson, R.

In Advances in Large Margin Classifiers, pages: 369-387, Neural Information Processing Series, (Editors: AJ Smola and PL Bartlett and B Schölkopf and D Schuurmans), MIT Press, Cambridge, MA, USA, October 2000 (inbook)

[BibTex]


Advances in Large Margin Classifiers

Smola, A., Bartlett, P., Schölkopf, B., Schuurmans, D.

pages: 422, Neural Information Processing Series, MIT Press, Cambridge, MA, USA, October 2000 (book)

Abstract
The concept of large margins is a unifying principle for the analysis of many different approaches to the classification of data from examples, including boosting, mathematical programming, neural networks, and support vector machines. The fact that it is the margin, or confidence level, of a classification (that is, a scale parameter) rather than a raw training error that matters has become a key tool for dealing with classifiers. This book shows how this idea applies to both the theoretical analysis and the design of algorithms. The book provides an overview of recent developments in large margin classifiers, examines connections with other methods (e.g., Bayesian inference), and identifies strengths and weaknesses of the method, as well as directions for future research. Among the contributors are Manfred Opper, Vladimir Vapnik, and Grace Wahba.
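
As a one-line reminder of the quantity at stake (an illustration, not an excerpt from the book): for a linear classifier f(x) = <w, x> + b and a label y in {-1, +1}, the functional and geometric margins are

\rho_f(x, y) = y\, f(x), \qquad \rho_{\mathrm{geom}}(x, y) = \frac{y\,(\langle w, x \rangle + b)}{\lVert w \rVert},

and a large margin classifier maximizes the smallest geometric margin over the training set, \max_{w, b} \min_i \rho_{\mathrm{geom}}(x_i, y_i).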

Web [BibTex]


Natural Regularization from Generative Models

Oliver, N., Schölkopf, B., Smola, A.

In Advances in Large Margin Classifiers, pages: 51-60, Neural Information Processing Series, (Editors: AJ Smola and PL Bartlett and B Schölkopf and D Schuurmans), MIT Press, Cambridge, MA, USA, October 2000 (inbook)

[BibTex]


Solving Satisfiability Problems with Genetic Algorithms

Harmeling, S.

In Genetic Algorithms and Genetic Programming at Stanford 2000, pages: 206-213, (Editors: Koza, J. R.), Stanford Bookstore, Stanford, CA, USA, June 2000 (inbook)

Abstract
We show how to solve hard 3-SAT problems using genetic algorithms. Furthermore, we explore other genetic operators that may be useful to tackle 3-SAT problems, and discuss their pros and cons.
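
A compact, hedged sketch of the approach described above: a generic genetic algorithm for 3-SAT with bit-string individuals, fitness equal to the number of satisfied clauses, tournament selection, uniform crossover, and bit-flip mutation. The operators and parameters are illustrative choices, not necessarily those studied in the chapter.

import random

def ga_3sat(clauses, n_vars, pop_size=100, generations=500, p_mut=0.02, seed=0):
    """Genetic algorithm for 3-SAT: individuals are truth assignments encoded
    as bit lists; fitness is the number of satisfied clauses."""
    rng = random.Random(seed)
    satisfied = lambda a, clause: any((lit > 0) == a[abs(lit) - 1] for lit in clause)
    fitness = lambda a: sum(satisfied(a, c) for c in clauses)
    pop = [[rng.random() < 0.5 for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        best = max(pop, key=fitness)
        if fitness(best) == len(clauses):
            return best                                   # satisfying assignment
        def pick():                                       # tournament of size 2
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        pop = [[(x if rng.random() < 0.5 else y) ^ (rng.random() < p_mut)
                for x, y in zip(pick(), pick())]          # uniform crossover + mutation
               for _ in range(pop_size)]
    return max(pop, key=fitness)                          # best assignment found

# Clauses as triples of literals: k means variable k is true, -k means false.
clauses = [(1, -2, 3), (-1, 2, -3), (2, 3, -1), (-2, -3, 1)]
print(ga_3sat(clauses, n_vars=3))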

PDF [BibTex]


Statistical Learning and Kernel Methods

Schölkopf, B.

In Data Fusion and Perception, CISM Courses and Lectures, International Centre for Mechanical Sciences, 431(23):3-24, (Editors: G Della Riccia and H-J Lenz and R Kruse), Springer, Vienna, 2000 (inbook)

[BibTex]


An Introduction to Kernel-Based Learning Algorithms

Müller, K., Mika, S., Rätsch, G., Tsuda, K., Schölkopf, B.

In Handbook of Neural Network Signal Processing, 4, (Editors: Yu Hen Hu and Jenq-Neng Hwang), CRC Press, 2000 (inbook)

[BibTex]

1998


Support-Vektor-Lernen

Schölkopf, B.

In Ausgezeichnete Informatikdissertationen 1997, pages: 135-150, (Editors: G Hotz and H Fiedler and P Gorny and W Grass and S Hölldobler and IO Kerner and R Reischuk), Teubner Verlag, Stuttgart, 1998 (inbook)

[BibTex]

1996


Künstliches Lernen

Schölkopf, B.

In Komplexe adaptive Systeme, Forum für interdisziplinäre Forschung, 15, pages: 93-117, (Editors: S Bornholdt and PH Feindt), Röll, Dettelbach, 1996 (inbook)

[BibTex]