Adaptation and Robust Learning of Probabilistic Movement Primitives

Gomez-Gonzalez, S., Neumann, G., Schölkopf, B., Peters, J.

IEEE Transactions on Robotics, 36(2):366-379, IEEE, March 2020 (article)

arXiv DOI Project Page [BibTex]


Real Time Trajectory Prediction Using Deep Conditional Generative Models

Gomez-Gonzalez, S., Prokudin, S., Schölkopf, B., Peters, J.

IEEE Robotics and Automation Letters, 5(2):970-976, IEEE, January 2020 (article)

arXiv DOI [BibTex]


An Adaptive Optimizer for Measurement-Frugal Variational Algorithms

Kübler, J. M., Arrasmith, A., Cincio, L., Coles, P. J.

Quantum, 4, pages: 263, 2020 (article)

link (url) DOI [BibTex]


Counterfactual Mean Embedding

Muandet, K., Kanagawa, M., Saengkyongam, S., Marukatat, S.

Journal of Machine Learning Research, 2020 (article) Accepted

[BibTex]


Causal Discovery from Heterogeneous/Nonstationary Data

Huang, B., Zhang, K., Zhang, J., Ramsey, J., Sanchez-Romero, R., Glymour, C., Schölkopf, B.

Journal of Machine Learning Research, 21(89):1-53, 2020 (article)

link (url) [BibTex]

2019


Convolutional neural networks: A magic bullet for gravitational-wave detection?

Gebhard, T., Kilbertus, N., Harry, I., Schölkopf, B.

Physical Review D, 100(6):063015, American Physical Society, September 2019 (article)

link (url) DOI [BibTex]


Data scarcity, robustness and extreme multi-label classification

Babbar, R., Schölkopf, B.

Machine Learning, 108(8):1329-1351, September 2019, Special Issue of the ECML PKDD 2019 Journal Track (article)

DOI [BibTex]


SPINDLE: End-to-end learning from EEG/EMG to extrapolate animal sleep scoring across experimental settings, labs and species

Miladinovic, D., Muheim, C., Bauer, S., Spinnler, A., Noain, D., Bandarabadi, M., Gallusser, B., Krummenacher, G., Baumann, C., Adamantidis, A., Brown, S. A., Buhmann, J. M.

PLOS Computational Biology, 15(4):1-30, Public Library of Science, April 2019 (article)

DOI [BibTex]


A 32-channel multi-coil setup optimized for human brain shimming at 9.4T

Aghaeifar, A., Zhou, J., Heule, R., Tabibian, B., Schölkopf, B., Jia, F., Zaitsev, M., Scheffler, K.

Magnetic Resonance in Medicine, 2019, (Early View) (article)

DOI [BibTex]


Multidimensional Contrast Limited Adaptive Histogram Equalization

Stimper, V., Bauer, S., Ernstorfer, R., Schölkopf, B., Xian, R. P.

IEEE Access, 7, pages: 165437-165447, 2019 (article)

arXiv link (url) DOI [BibTex]


TD-regularized actor-critic methods

Parisi, S., Tangkaratt, V., Peters, J., Khan, M. E.

Machine Learning, 108(8):1467-1501, (Editors: Karsten Borgwardt, Po-Ling Loh, Evimaria Terzi, and Antti Ukkonen), 2019 (article)

DOI [BibTex]


Probabilistic solutions to ordinary differential equations as nonlinear Bayesian filtering: a new perspective

Tronarp, F., Kersting, H., Särkkä, S., Hennig, P.

Statistics and Computing, 29(6):1297-1315, 2019 (article)

DOI [BibTex]


Learning to Control Highly Accelerated Ballistic Movements on Muscular Robots

Büchler, D., Calandra, R., Peters, J.

2019 (article) Submitted

Abstract
High-speed and high-acceleration movements are inherently hard to control. Applying learning to the control of such motions on anthropomorphic robot arms can improve the accuracy of the control but might damage the system. The inherent exploration of learning approaches can lead to instabilities and the robot reaching joint limits at high speeds. Having hardware that enables safe exploration of high-speed and high-acceleration movements is therefore desirable. To address this issue, we propose to use robots actuated by Pneumatic Artificial Muscles (PAMs). In this paper, we present a four degrees of freedom (DoFs) robot arm that reaches high joint angle accelerations of up to 28000 °/s^2 while avoiding dangerous joint limits thanks to the antagonistic actuation and limits on the air pressure ranges. With this robot arm, we are able to tune control parameters using Bayesian optimization directly on the hardware without additional safety considerations. The achieved tracking performance on a fast trajectory exceeds previous results on comparable PAM-driven robots. We also show that our system can be controlled well on slow trajectories with PID controllers due to careful construction considerations such as minimal bending of cables, lightweight kinematics and minimal contact between the PAMs and between the PAMs and the links. Finally, we propose a novel technique to control the co-contraction of antagonistic muscle pairs. Experimental results illustrate that choosing the optimal co-contraction level is vital to reach better tracking performance. Through the use of PAM-driven robots and learning, we take a small step towards the future development of robots capable of more human-like motions.
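
The abstract describes tuning control parameters with Bayesian optimization directly on the hardware. The sketch below illustrates what such a loop can look like in general (a Gaussian-process surrogate with an expected-improvement acquisition over PID gains); the gain ranges, candidate sampling, and the tracking_error stub are hypothetical stand-ins for rollouts on the real robot, not the setup used in the paper.

```python
# Hypothetical sketch: Bayesian optimization of PID gains against a
# black-box cost. tracking_error() stands in for running one trajectory
# on the robot and measuring tracking error; it is NOT the paper's setup.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
bounds = np.array([[0.1, 20.0],   # assumed Kp range
                   [0.0, 5.0],    # assumed Ki range
                   [0.0, 2.0]])   # assumed Kd range

def tracking_error(gains):
    """Placeholder rollout: returns a scalar cost for a gain setting."""
    kp, ki, kd = gains
    return (kp - 8.0) ** 2 / 50 + (ki - 1.0) ** 2 + (kd - 0.4) ** 2 + rng.normal(0, 0.01)

def sample(n):
    return rng.uniform(bounds[:, 0], bounds[:, 1], size=(n, len(bounds)))

X = sample(5)                                  # initial design
y = np.array([tracking_error(x) for x in X])
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(20):
    gp.fit(X, y)
    cand = sample(500)
    mu, std = gp.predict(cand, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(std, 1e-9)
    ei = (best - mu) * norm.cdf(z) + std * norm.pdf(z)   # expected improvement (minimization)
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, tracking_error(x_next))

print("best gains:", X[np.argmin(y)], "cost:", y.min())
```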

Arxiv Video [BibTex]


Robustifying Independent Component Analysis by Adjusting for Group-Wise Stationary Noise

Pfister*, N., Weichwald*, S., Bühlmann, P., Schölkopf, B.

Journal of Machine Learning Research, 20(147):1-50, 2019, *equal contribution (article)

ArXiv Code Project page PDF link (url) Project Page Project Page [BibTex]


Enhancing Human Learning via Spaced Repetition Optimization

Tabibian, B., Upadhyay, U., De, A., Zarezade, A., Schölkopf, B., Gomez Rodriguez, M.

Proceedings of the National Academy of Sciences, 116(10):3988-3993, National Academy of Sciences, 2019 (article)

link (url) DOI Project Page Project Page [BibTex]


Entropic Regularization of Markov Decision Processes

Belousov, B., Peters, J.

Entropy, 21(7):674, 2019 (article)

link (url) DOI [BibTex]


Searchers adjust their eye-movement dynamics to target characteristics in natural scenes

Rothkegel, L., Schütt, H., Trukenbrod, H., Wichmann, F. A., Engbert, R.

Scientific Reports, 9(1635), 2019 (article)

DOI [BibTex]


Spatial statistics for gaze patterns in scene viewing: Effects of repeated viewing

Trukenbrod, H. A., Barthelmé, S., Wichmann, F. A., Engbert, R.

Journal of Vision, 19(6):19, 2019 (article)

DOI [BibTex]


Quantum mean embedding of probability distributions

Kübler, J. M., Muandet, K., Schölkopf, B.

Physical Review Research, 1(3):033159, American Physical Society, 2019 (article)

link (url) DOI [BibTex]


Inferring causation from time series with perspectives in Earth system sciences

Runge, J., Bathiany, S., Bollt, E., Camps-Valls, G., Coumou, D., Deyle, E., Glymour, C., Kretschmer, M., Mahecha, M., Munoz-Mari, J., van Nes, E., Peters, J., Quax, R., Reichstein, M., Scheffer, M., Schölkopf, B., Spirtes, P., Sugihara, G., Sun, J., Zhang, K., Zscheischler, J.

Nature Communications, 10(2553), 2019 (article)

DOI [BibTex]


Analysis of cause-effect inference by comparing regression errors

Blöbaum, P., Janzing, D., Washio, T., Shimizu, S., Schölkopf, B.

PeerJ Computer Science, 5, pages: e169, 2019 (article)

DOI [BibTex]


Learning Intention Aware Online Adaptation of Movement Primitives

Koert, D., Pajarinen, J., Schotschneider, A., Trick, S., Rothkopf, C., Peters, J.

IEEE Robotics and Automation Letters, 4(4):3719-3726, 2019 (article)

DOI [BibTex]


Spread-spectrum magnetic resonance imaging

Scheffler, K., Loktyushin, A., Bause, J., Aghaeifar, A., Steffen, T., Schölkopf, B.

Magnetic Resonance in Medicine, 82(3):877-885, 2019 (article)

DOI [BibTex]


How Cognitive Models of Human Body Experience Might Push Robotics

Schürmann, T., Mohler, B. J., Peters, J., Beckerle, P.

Frontiers in Neurorobotics, 13(14), 2019 (article)

DOI [BibTex]


Dense connectomic reconstruction in layer 4 of the somatosensory cortex

Motta, A., Berning, M., Boergens, K. M., Staffler, B., Beining, M., Loomba, S., Hennig, P., Wissler, H., Helmstaedter, M.

Science, 366(6469):eaay3134, American Association for the Advancement of Science, 2019 (article)

DOI [BibTex]


Learning Trajectory Distributions for Assisted Teleoperation and Path Planning

Ewerton, M., Arenz, O., Maeda, G., Koert, D., Kolev, Z., Takahashi, M., Peters, J.

Frontiers in Robotics and AI, 6, pages: 89, 2019 (article)

DOI [BibTex]


Brainglance: Visualizing Group Level MRI Data at One Glance

Stelzer, J., Lacosse, E., Bause, J., Scheffler, K., Lohmann, G.

Frontiers in Neuroscience, 13(972), 2019 (article)

DOI [BibTex]


Eigendecompositions of Transfer Operators in Reproducing Kernel Hilbert Spaces

Klus, S., Schuster, I., Muandet, K.

Journal of Nonlinear Science, 2019, First Online: 21 August 2019 (article)

DOI [BibTex]


Workshops of the seventh international brain-computer interface meeting: not getting lost in translation

Huggins, J. E., Guger, C., Aarnoutse, E., Allison, B., Anderson, C. W., Bedrick, S., Besio, W., Chavarriaga, R., Collinger, J. L., Do, A. H., Herff, C., Hohmann, M., Kinsella, M., Lee, K., Lotte, F., Müller-Putz, G., Nijholt, A., Pels, E., Peters, B., Putze, F., Rupp, R. S. G., Scott, S., Tangermann, M., Tubig, P., Zander, T.

Brain-Computer Interfaces, 6(3):71-101, Taylor & Francis, 2019 (article)

DOI [BibTex]


Compatible natural gradient policy search

Pajarinen, J., Thai, H. L., Akrour, R., Peters, J., Neumann, G.

Machine Learning, 108(8):1443-1466, (Editors: Karsten Borgwardt, Po-Ling Loh, Evimaria Terzi, and Antti Ukkonen), 2019 (article)

DOI [BibTex]


Learning stable and predictive structures in kinetic systems

Pfister, N., Bauer, S., Peters, J.

Proceedings of the National Academy of Sciences (PNAS), 116(51):25405-25411, 2019 (article)

DOI [BibTex]


Fairness Constraints: A Flexible Approach for Fair Classification

Zafar, M. B., Valera, I., Gomez-Rodriguez, M., Gummadi, K. P.

Journal of Machine Learning Research, 20(75):1-42, 2019 (article)

link (url) [BibTex]

2003


Concentration Inequalities for Sub-Additive Functions Using the Entropy Method

Bousquet, O.

Stochastic Inequalities and Applications, 56, pages: 213-247, Progress in Probability, (Editors: Giné, E., C. Houdré and D. Nualart), November 2003 (article)

Abstract
We obtain exponential concentration inequalities for sub-additive functions of independent random variables under weak conditions on the increments of those functions, like the existence of exponential moments for these increments. As a consequence of these general inequalities, we obtain refinements of Talagrand's inequality for empirical processes and new bounds for randomized empirical processes. These results are obtained by further developing the entropy method introduced by Ledoux.
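
As a rough illustration of the kind of result described, and not the exact statement or constants of the paper, exponential bounds obtained via the entropy method typically take a Bernstein-type form such as the following, where v acts as a variance proxy and c bounds the effect of the increments:

```latex
% Representative Bernstein-type concentration bound (illustrative form only;
% see the paper for the precise assumptions, quantities, and constants).
\[
  \Pr\bigl( Z \ge \mathbb{E}[Z] + t \bigr)
  \;\le\;
  \exp\!\left( - \frac{t^2}{2\,(v + c\,t/3)} \right),
  \qquad t \ge 0,
\]
% where $Z$ is a sub-additive function of independent variables, $v$ is a
% variance-type term, and $c$ controls the size of the increments of $Z$.
```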

PostScript [BibTex]


Statistical Learning Theory, Capacity and Complexity

Schölkopf, B.

Complexity, 8(4):87-94, July 2003 (article)

Abstract
We give an exposition of the ideas of statistical learning theory, followed by a discussion of how a reinterpretation of the insights of learning theory could potentially also benefit our understanding of a certain notion of complexity.

Web DOI [BibTex]


Dealing with large Diagonals in Kernel Matrices

Weston, J., Schölkopf, B., Eskin, E., Leslie, C., Noble, W.

Annals of the Institute of Statistical Mathematics, 55(2):391-408, June 2003 (article)

Abstract
In kernel methods, all the information about the training data is contained in the Gram matrix. If this matrix has large diagonal values, which arises for many types of kernels, then kernel methods do not perform well. We propose and test several methods for dealing with this problem by reducing the dynamic range of the matrix while preserving the positive definiteness of the Hessian of the quadratic programming problem that one has to solve when training a Support Vector Machine, which is a common kernel approach for pattern recognition.
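
One way to picture "reducing the dynamic range of the matrix" is the hedged sketch below: an element-wise subpolynomial transform of the Gram matrix followed by a projection back onto the positive semidefinite cone. This is only an illustrative variant under those assumptions, not the exact procedure evaluated in the paper.

```python
# Illustrative sketch (not the paper's exact method): shrink the dynamic
# range of a Gram matrix K with an element-wise subpolynomial transform,
# then restore positive semidefiniteness by clipping negative eigenvalues.
import numpy as np

def reduce_dynamic_range(K, p=0.3):
    # |k|^p keeps the ordering of similarities but compresses large values,
    # so the diagonal no longer dominates as strongly.
    K_small = np.sign(K) * np.abs(K) ** p
    # The transform need not preserve positive definiteness, so project
    # back to the PSD cone via an eigendecomposition.
    w, V = np.linalg.eigh((K_small + K_small.T) / 2)
    return (V * np.clip(w, 0.0, None)) @ V.T

def diag_dominance(K):
    off = K[~np.eye(len(K), dtype=bool)]
    return np.diag(K).mean() / np.abs(off).mean()

# Toy example: a kernel matrix whose diagonal dominates the off-diagonal.
X = np.random.randn(50, 200)
K = np.exp(X @ X.T / 50.0)
print("before:", diag_dominance(K))
print("after: ", diag_dominance(reduce_dynamic_range(K)))
```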

PDF DOI [BibTex]


The em Algorithm for Kernel Matrix Completion with Auxiliary Data

Tsuda, K., Akaho, S., Asai, K.

Journal of Machine Learning Research, 4, pages: 67-81, May 2003 (article)

PDF [BibTex]


Constructing Descriptive and Discriminative Non-linear Features: Rayleigh Coefficients in Kernel Feature Spaces

Mika, S., Rätsch, G., Weston, J., Schölkopf, B., Smola, A., Müller, K.

IEEE Transactions on Pattern Analysis and Machine Intelligence, 25(5):623-628, May 2003 (article)

Abstract
We incorporate prior knowledge to construct nonlinear algorithms for invariant feature extraction and discrimination. Employing a unified framework in terms of a nonlinearized variant of the Rayleigh coefficient, we propose nonlinear generalizations of Fisher's discriminant and oriented PCA using support vector kernel functions. Extensive simulations show the utility of our approach.
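
For reference, the Rayleigh coefficient at the core of this construction, and its kernelized form in terms of expansion coefficients alpha, can be written as follows (standard kernel-Fisher-discriminant notation; M and N denote the between-class and within-class matrices built from kernel evaluations):

```latex
% Rayleigh coefficient in input space and its kernelized counterpart
% (standard notation; S_B, S_W are between-/within-class scatter matrices).
\[
  J(w) \;=\; \frac{w^\top S_B\, w}{w^\top S_W\, w},
  \qquad
  w \;=\; \sum_{i=1}^{n} \alpha_i\, \Phi(x_i)
  \;\;\Longrightarrow\;\;
  J(\alpha) \;=\; \frac{\alpha^\top M\, \alpha}{\alpha^\top N\, \alpha},
\]
% maximized over $\alpha$; the leading generalized eigenvector of
% $M \alpha = \lambda N \alpha$ gives the discriminant direction.
```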

DOI [BibTex]


Tractable Inference for Probabilistic Data Models

Csato, L., Opper, M., Winther, O.

Complexity, 8(4):64-68, April 2003 (article)

Abstract
We present an approximation technique for probabilistic data models with a large number of hidden variables, based on ideas from statistical physics. We give examples for two nontrivial applications. © 2003 Wiley Periodicals, Inc.

PDF GZIP Web [BibTex]


Feature selection and transduction for prediction of molecular bioactivity for drug design

Weston, J., Perez-Cruz, F., Bousquet, O., Chapelle, O., Elisseeff, A., Schölkopf, B.

Bioinformatics, 19(6):764-771, April 2003 (article)

Abstract
Motivation: In drug discovery a key task is to identify characteristics that separate active (binding) compounds from inactive (non-binding) ones. An automated prediction system can help reduce resources necessary to carry out this task. Results: Two methods for prediction of molecular bioactivity for drug design are introduced and shown to perform well in a data set previously studied as part of the KDD (Knowledge Discovery and Data Mining) Cup 2001. The data is characterized by very few positive examples, a very large number of features (describing three-dimensional properties of the molecules) and rather different distributions between training and test data. Two techniques are introduced specifically to tackle these problems: a feature selection method for unbalanced data and a classifier which adapts to the distribution of the unlabeled test data (a so-called transductive method). We show both techniques improve identification performance and in conjunction provide an improvement over using only one of the techniques. Our results suggest the importance of taking into account the characteristics in this data which may also be relevant in other problems of a similar type.

Web [BibTex]


Use of the Zero-Norm with Linear Models and Kernel Methods

Weston, J., Elisseeff, A., Schölkopf, B., Tipping, M.

Journal of Machine Learning Research, 3, pages: 1439-1461, March 2003 (article)

Abstract
We explore the use of the so-called zero-norm of the parameters of linear models in learning. Minimization of such a quantity has many uses in a machine learning context: for variable or feature selection, minimizing training error and ensuring sparsity in solutions. We derive a simple but practical method for achieving these goals and discuss its relationship to existing techniques of minimizing the zero-norm. The method boils down to implementing a simple modification of vanilla SVM, namely via an iterative multiplicative rescaling of the training data. Applications we investigate which aid our discussion include variable and feature selection on biological microarray data, and multicategory classification.
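
The "iterative multiplicative rescaling of the training data" can be sketched as follows for a linear SVM: after each fit, every feature is rescaled by the magnitude of its learned weight, so uninformative features decay toward zero across iterations. The data, hyperparameters, and stopping rule below are illustrative assumptions, not the paper's exact algorithm or experiments.

```python
# Hedged sketch of zero-norm-style feature selection via iterative
# multiplicative rescaling with a linear SVM (illustrative only).
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n, d, d_informative = 200, 50, 5
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:d_informative] = 1.0
y = np.sign(X @ w_true + 0.1 * rng.normal(size=n))   # synthetic labels

scale = np.ones(d)
for _ in range(10):
    clf = LinearSVC(C=1.0, max_iter=10000).fit(X * scale, y)
    w = np.abs(clf.coef_).ravel()
    scale *= w            # multiplicative rescaling: weak features decay
    scale /= scale.max()  # normalize for numerical stability

print("surviving features:", np.flatnonzero(scale > 1e-3))
```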

PDF PostScript PDF [BibTex]


An Introduction to Variable and Feature Selection.

Guyon, I., Elisseeff, A.

Journal of Machine Learning Research, 3, pages: 1157-1182, 2003 (article)

[BibTex]


New Approaches to Statistical Learning Theory

Bousquet, O.

Annals of the Institute of Statistical Mathematics, 55(2):371-389, 2003 (article)

Abstract
We present new tools from probability theory that can be applied to the analysis of learning algorithms. These tools allow one to derive new bounds on the generalization performance of learning algorithms and to propose alternative measures of the complexity of the learning task, which in turn can be used to derive new learning algorithms.

PostScript [BibTex]

2002


Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond

Schölkopf, B., Smola, A.

pages: 644, Adaptive Computation and Machine Learning, MIT Press, Cambridge, MA, USA, December 2002, Parts of this book, including an introduction to kernel methods, are available for download. (book)

Abstract
In the 1990s, a new type of learning algorithm was developed, based on results from statistical learning theory: the Support Vector Machine (SVM). This gave rise to a new class of theoretically elegant learning machines that use a central concept of SVMs, kernels, for a number of learning tasks. Kernel machines provide a modular framework that can be adapted to different tasks and domains by the choice of the kernel function and the base algorithm. They are replacing neural networks in a variety of fields, including engineering, information retrieval, and bioinformatics. Learning with Kernels provides an introduction to SVMs and related kernel methods. Although the book begins with the basics, it also includes the latest research. It provides all of the concepts necessary to enable a reader equipped with some basic mathematical knowledge to enter the world of machine learning using theoretically well-founded yet easy-to-use kernel algorithms and to understand and apply the powerful algorithms that have been developed over the last few years.

Web [BibTex]


Constructing Boosting algorithms from SVMs: an application to one-class classification.

Rätsch, G., Mika, S., Schölkopf, B., Müller, K.

IEEE Transactions on Pattern Analysis and Machine Intelligence, 24(9):1184-1199, September 2002 (article)

Abstract
We show via an equivalence of mathematical programs that a support vector (SV) algorithm can be translated into an equivalent boosting-like algorithm and vice versa. We exemplify this translation procedure for a new algorithm—one-class leveraging—starting from the one-class support vector machine (1-SVM). This is a first step toward unsupervised learning in a boosting framework. Building on so-called barrier methods known from the theory of constrained optimization, it returns a function, written as a convex combination of base hypotheses, that characterizes whether a given test point is likely to have been generated from the distribution underlying the training data. Simulations on one-class classification problems demonstrate the usefulness of our approach.

DOI [BibTex]
