

2015


Causal Inference for Empirical Time Series Based on the Postulate of Independence of Cause and Mechanism

Besserve, M.

53rd Annual Allerton Conference on Communication, Control, and Computing, September 2015 (talk)

[BibTex]



Independence of cause and mechanism in brain networks

Besserve, M.

DALI workshop on Networks: Processes and Causality, April 2015 (talk)

[BibTex]



Information-Theoretic Implications of Classical and Quantum Causal Structures

Chaves, R., Majenz, C., Luft, L., Maciel, T., Janzing, D., Schölkopf, B., Gross, D.

18th Conference on Quantum Information Processing (QIP), 2015 (talk)

Web link (url) [BibTex]



Assessment of brain tissue damage in the Sub-Acute Stroke Region by Multiparametric Imaging using [89-Zr]-Desferal-EPO-PET/MRI

Castaneda, S. G., Katiyar, P., Russo, F., Disselhorst, J. A., Calaminus, C., Poli, S., Maurer, A., Ziemann, U., Pichler, B. J.

World Molecular Imaging Conference, 2015 (talk)

[BibTex]



Early time point in vivo PET/MR is a promising biomarker for determining efficacy of a novel Db(αEGFR)-scTRAIL fusion protein therapy in a colon cancer model

Divine, M. R., Harant, M., Katiyar, P., Disselhorst, J. A., Bukala, D., Aidone, S., Siegemund, M., Pfizenmaier, K., Kontermann, R., Pichler, B. J.

World Molecular Imaging Conference, 2015 (talk)

[BibTex]



Cosmology from Cosmic Shear with DES Science Verification Data

Abbott, T., Abdalla, F. B., Allam, S., Amara, A., Annis, J., Armstrong, R., Bacon, D., Banerji, M., Bauer, A. H., Baxter, E., others,

arXiv preprint arXiv:1507.05552, 2015 (techreport)

link (url) [BibTex]



The DES Science Verification Weak Lensing Shear Catalogs

Jarvis, M., Sheldon, E., Zuntz, J., Kacprzak, T., Bridle, S. L., Amara, A., Armstrong, R., Becker, M. R., Bernstein, G. M., Bonnett, C., others,

arXiv preprint arXiv:1507.05603, 2015 (techreport)

link (url) [BibTex]



The search for single exoplanet transits in the Kepler light curves

Foreman-Mackey, D., Hogg, D. W., Schölkopf, B.

IAU General Assembly, 22, pages: 2258352, 2015 (talk)

link (url) [BibTex]


2012


Support Vector Machines, Support Measure Machines, and Quasar Target Selection

Muandet, K.

Center for Cosmology and Particle Physics (CCPP), New York University, December 2012 (talk)

[BibTex]



Hilbert Space Embedding for Dirichlet Process Mixtures

Muandet, K.

NIPS Workshop on Confluence between Kernel Methods and Graphical Models, December 2012 (talk)

[BibTex]



Simultaneous small animal PET/MR in activated and resting state reveals multiple brain networks

Wehrl, H., Lankes, K., Hossain, M., Bezrukov, I., Liu, C., Martirosian, P., Schick, F., Pichler, B.

20th Annual Meeting and Exhibition of the International Society for Magnetic Resonance in Medicine (ISMRM), May 2012 (talk)

Web [BibTex]



A new PET insert for simultaneous PET/MR small animal imaging

Wehrl, H., Lankes, K., Hossain, M., Bezrukov, I., Liu, C., Martirosian, P., Reischl, G., Schick, F., Pichler, B.

20th Annual Meeting and Exhibition of the International Society for Magnetic Resonance in Medicine (ISMRM), May 2012 (talk)

Web [BibTex]



Evaluation of a new, large field of view, small animal PET/MR system

Hossain, M., Wehrl, H., Lankes, K., Liu, C., Bezrukov, I., Reischl, G., Pichler, B.

50. Jahrestagung der Deutschen Gesellschaft für Nuklearmedizin (NuklearMedizin), April 2012 (talk)

Web [BibTex]



High Gamma-Power Predicts Performance in Brain-Computer Interfacing

Grosse-Wentrup, M., Schölkopf, B.

(3), Max-Planck-Institut für Intelligente Systeme, Tübingen, February 2012 (techreport)

Abstract
Subjects operating a brain-computer interface (BCI) based on sensorimotor rhythms exhibit large variations in performance over the course of an experimental session. Here, we show that high-frequency gamma-oscillations, originating in fronto-parietal networks, predict such variations on a trial-to-trial basis. We interpret this finding as empirical support for an influence of attentional networks on BCI-performance via modulation of the sensorimotor rhythm.

PDF [BibTex]



Simultaneous small animal PET/MR reveals different brain networks during stimulation and rest

Wehrl, H., Hossain, M., Lankes, K., Liu, C., Bezrukov, I., Martirosian, P., Reischl, G., Schick, F., Pichler, B.

World Molecular Imaging Congress (WMIC), 2012 (talk)

[BibTex]



Support Measure Machines for Quasar Target Selection

Muandet, K.

Astro Imaging Workshop, 2012 (talk)

Abstract
In this talk I will discuss the problem of quasar target selection. Object attributes in astronomy, such as fluxes, are often subject to substantial and heterogeneous measurement uncertainties, especially for medium-redshift (2.2 to 3.5) quasars, which are relatively rare and must be targeted down to g ~ 22 mag. Most previous approaches to quasar target selection, including UV-excess, kernel density estimation, a likelihood approach, and artificial neural networks, cannot directly deal with heterogeneous input uncertainties. Recently, extreme deconvolution (XD) has been used to tackle this problem in a well-posed manner. In this work, we present a discriminative approach for quasar target selection that can deal with input uncertainties directly. To do so, we represent each object as a Gaussian distribution whose mean is the object's attribute vector and whose covariance is the given flux measurement uncertainty. Given a training set of Gaussian distributions, the support measure machine (SMM) algorithm is trained and used to build the quasar targeting catalog. Preliminary results will also be presented. Joint work with Jo Bovy and Bernhard Schölkopf.
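The Gaussian-object representation described above admits a closed-form expected RBF kernel between two input distributions, which is the basic ingredient of an SMM kernel matrix. A minimal sketch (the bandwidth theta and the toy means and covariances are illustrative choices, not values from the talk): since x - y ~ N(m1 - m2, s1 + s2), the expectation of the Gaussian kernel is a Gaussian integral with a closed form.

```python
import numpy as np

def expected_rbf(m1, s1, m2, s2, theta=1.0):
    # Closed form of E[exp(-||x - y||^2 / (2 theta^2))] for x ~ N(m1, s1), y ~ N(m2, s2):
    # x - y ~ N(m1 - m2, s1 + s2), and the Gaussian integral yields the expression below.
    d = len(m1)
    s = s1 + s2
    mu = np.asarray(m1) - np.asarray(m2)
    return (np.linalg.det(np.eye(d) + s / theta**2) ** -0.5
            * np.exp(-0.5 * mu @ np.linalg.solve(s + theta**2 * np.eye(d), mu)))

# Monte Carlo sanity check on a 2-d toy pair of "uncertain objects".
rng = np.random.default_rng(0)
m1, m2 = np.array([0.0, 0.0]), np.array([1.0, 0.0])
s1, s2 = 0.2 * np.eye(2), 0.1 * np.eye(2)
x = rng.multivariate_normal(m1, s1, 200_000)
y = rng.multivariate_normal(m2, s2, 200_000)
mc = np.exp(-np.sum((x - y) ** 2, 1) / 2).mean()
print(expected_rbf(m1, s1, m2, s2), mc)   # closed form and Monte Carlo agree
```

Evaluating this kernel between all pairs of training objects gives a Gram matrix that can be fed to any standard kernel classifier.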

Web [BibTex]


PAC-Bayesian Analysis: A Link Between Inference and Statistical Physics

Seldin, Y.

Workshop on Statistical Physics of Inference and Control Theory, 2012 (talk)

Web [BibTex]



PET Performance Measurements of a Next Generation Dedicated Small Animal PET/MR Scanner

Liu, C., Hossain, M., Lankes, K., Bezrukov, I., Wehrl, H., Kolb, A., Judenhofer, M., Pichler, B.

Nuclear Science Symposium and Medical Imaging Conference (NSS-MIC), 2012 (talk)

[BibTex]



PAC-Bayesian Analysis of Supervised, Unsupervised, and Reinforcement Learning

Seldin, Y., Laviolette, F., Shawe-Taylor, J.

Tutorial at the 29th International Conference on Machine Learning (ICML), 2012 (talk)

Web Web [BibTex]



Influence of MR-based attenuation correction on lesions within bone and susceptibility artifact regions

Bezrukov, I., Schmidt, H., Mantlik, F., Schwenzer, N., Brendle, C., Pichler, B.

Molekulare Bildgebung (MoBi), 2012 (talk)

[BibTex]



Structured Apprenticeship Learning

Boularias, A., Kroemer, O., Peters, J.

European Workshop on Reinforcement Learning (EWRL), 2012 (talk)

[BibTex]



PAC-Bayesian Analysis and Its Applications

Seldin, Y., Laviolette, F., Shawe-Taylor, J.

Tutorial at The European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD), 2012 (talk)

Web [BibTex]



Kernel Bellman Equations in POMDPs

Nishiyama, Y., Boularias, A., Gretton, A., Fukumizu, K.

Technical Committee on Information-Based Induction Sciences and Machine Learning (IBISML'12), 2012 (talk)

[BibTex]



Beta oscillations propagate as traveling waves in the macaque prefrontal cortex

Panagiotaropoulos, T., Besserve, M., Logothetis, N.

42nd Annual Meeting of the Society for Neuroscience (Neuroscience), 2012 (talk)

[BibTex]


2005


Spectral clustering and transductive inference for graph data

Zhou, D.

NIPS Workshop on Kernel Methods and Structured Domains, December 2005 (talk)

PDF Web [BibTex]



Some thoughts about Gaussian Processes

Chapelle, O.

NIPS Workshop on Open Problems in Gaussian Processes for Machine Learning, December 2005 (talk)

PDF Web [BibTex]



Popper, Falsification and the VC-dimension

Corfield, D., Schölkopf, B., Vapnik, V.

(145), Max Planck Institute for Biological Cybernetics, November 2005 (techreport)

PDF [BibTex]



A Combinatorial View of Graph Laplacians

Huang, J.

(144), Max Planck Institute for Biological Cybernetics, Tübingen, Germany, August 2005 (techreport)

Abstract
Discussions of different graph Laplacians, mainly the normalized and unnormalized versions, have been ardent with respect to various methods in clustering and graph-based semi-supervised learning. Previous research on graph Laplacians investigated their convergence properties to Laplacian operators on continuous manifolds. There is still no strong proof of convergence for the normalized Laplacian. In this paper, we analyze different variants of graph Laplacians directly from the way they solve the original graph partitioning problem. The graph partitioning problem is a well-known NP-hard combinatorial optimization problem. The spectral solutions provide evidence that the normalized Laplacian encodes more reasonable considerations for graph partitioning. We also provide some examples to show their differences.
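For concreteness, the two Laplacian variants discussed above can be written down directly. A minimal sketch on a toy two-cluster graph (the weights are chosen ad hoc), using the sign of the Fiedler vector of the normalized Laplacian as a spectral partition:

```python
import numpy as np

# Two triangle clusters {0,1,2} and {3,4,5} joined by one weak edge (2,3).
w = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    w[i, j] = w[j, i] = 1.0
w[2, 3] = w[3, 2] = 0.1

deg = w.sum(1)
l_unnorm = np.diag(deg) - w                  # unnormalized: L = D - W
d_isqrt = np.diag(deg ** -0.5)
l_norm = np.eye(6) - d_isqrt @ w @ d_isqrt   # normalized: I - D^{-1/2} W D^{-1/2}

# The Fiedler vector (eigenvector of the second-smallest eigenvalue)
# recovers the two-cluster partition.
vals, vecs = np.linalg.eigh(l_norm)
labels = vecs[:, 1] > 0
print(labels)
```

Both Laplacians are positive semidefinite with smallest eigenvalue 0; they differ in how vertex degrees enter, which is exactly where the partitioning behaviour discussed in the report diverges.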

[BibTex]



Beyond Pairwise Classification and Clustering Using Hypergraphs

Zhou, D., Huang, J., Schölkopf, B.

(143), Max Planck Institute for Biological Cybernetics, August 2005 (techreport)

Abstract
In many applications, relationships among objects of interest are more complex than pairwise. Simply approximating complex relationships as pairwise ones can lead to loss of information. An alternative for these applications is to analyze complex relationships among data directly, without the need to first represent the complex relationships into pairwise ones. A natural way to describe complex relationships is to use hypergraphs. A hypergraph is a graph in which edges can connect more than two vertices. Thus we consider learning from a hypergraph, and develop a general framework which is applicable to classification and clustering for complex relational data. We have applied our framework to real-world web classification problems and obtained encouraging results.
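One standard construction in this line of work (a sketch; the normalized hypergraph Laplacian I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} is assumed here, and the toy incidence matrix is illustrative, not from the report):

```python
import numpy as np

# Incidence matrix H for 5 vertices and 2 hyperedges: e0 = {0,1,2}, e1 = {2,3,4}.
h = np.array([[1, 0], [1, 0], [1, 1], [0, 1], [0, 1]], float)
w = np.array([1.0, 1.0])          # hyperedge weights

dv = h @ w                        # vertex degrees (sum of incident edge weights)
de = h.sum(0)                     # hyperedge degrees (cardinalities)
dv_isqrt = np.diag(dv ** -0.5)

theta = dv_isqrt @ h @ np.diag(w / de) @ h.T @ dv_isqrt
lap = np.eye(5) - theta           # normalized hypergraph Laplacian

vals = np.linalg.eigvalsh(lap)
print(np.round(vals, 4))          # spectrum starts at 0 and stays in [0, 1]
```

With `lap` in hand, spectral clustering and semi-supervised label propagation proceed exactly as in the graph case, which is the generality the abstract emphasizes.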

PDF [BibTex]



Building Sparse Large Margin Classifiers

Wu, M., Schölkopf, B., BakIr, G.

The 22nd International Conference on Machine Learning (ICML), August 2005 (talk)

PDF [BibTex]



Learning from Labeled and Unlabeled Data on a Directed Graph

Zhou, D.

The 22nd International Conference on Machine Learning, August 2005 (talk)

Abstract
We propose a general framework for learning from labeled and unlabeled data on a directed graph in which the structure of the graph including the directionality of the edges is considered. The time complexity of the algorithm derived from this framework is nearly linear due to recently developed numerical techniques. In the absence of labeled instances, this framework can be utilized as a spectral clustering method for directed graphs, which generalizes the spectral clustering approach for undirected graphs. We have applied our framework to real-world web classification problems and obtained encouraging results.
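A sketch of the kind of construction the abstract alludes to, assuming the teleporting-random-walk symmetrization Theta = (Pi^{1/2} P Pi^{-1/2} + Pi^{-1/2} P^T Pi^{1/2}) / 2 with Pi the diagonal of the stationary distribution (the toy graph and damping factor are illustrative):

```python
import numpy as np

# Directed toy graph: two 3-cycles {0,1,2} and {3,4,5} with a weak two-way bridge 2<->3.
a = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)]:
    a[i, j] = 1.0
a[2, 3] = a[3, 2] = 0.3

eta = 0.99                                   # teleport so the walk is ergodic
p = eta * a / a.sum(1, keepdims=True) + (1 - eta) / 6

pi = np.full(6, 1 / 6)                       # stationary distribution (power iteration)
for _ in range(5000):
    pi = pi @ p

pi_s, pi_is = np.diag(pi ** 0.5), np.diag(pi ** -0.5)
theta = 0.5 * (pi_s @ p @ pi_is + pi_is @ p.T @ pi_s)
lap = np.eye(6) - theta                      # symmetrized directed Laplacian

vals, vecs = np.linalg.eigh(lap)
labels = vecs[:, 1] > 0                      # spectral bipartition
print(labels)
```

In the unsupervised case this is exactly the directed spectral clustering mentioned in the abstract; with labels, the same operator drives the regularization.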

PDF [BibTex]



Machine-Learning Approaches to BCI in Tübingen

Bensch, M., Bogdan, M., Hill, N., Lal, T., Rosenstiel, W., Schölkopf, B., Schröder, M.

Brain-Computer Interface Technology, June 2005, Talk given by NJH. (talk)

[BibTex]



Generalized Nonnegative Matrix Approximations using Bregman Divergences

Sra, S., Dhillon, I.

Univ. of Texas at Austin, June 2005 (techreport)

[BibTex]



Learning Motor Primitives with Reinforcement Learning

Peters, J., Schaal, S.

ROBOTICS Workshop on Modular Foundations for Control and Perception, June 2005 (talk)

Web [BibTex]



Measuring Statistical Dependence with Hilbert-Schmidt Norms

Gretton, A., Bousquet, O., Smola, A., Schölkopf, B.

(140), Max Planck Institute for Biological Cybernetics, Tübingen, Germany, June 2005 (techreport)

Abstract
We propose an independence criterion based on the eigenspectrum of covariance operators in reproducing kernel Hilbert spaces (RKHSs), consisting of an empirical estimate of the Hilbert-Schmidt norm of the cross-covariance operator (we term this a Hilbert-Schmidt Independence Criterion, or HSIC). This approach has several advantages, compared with previous kernel-based independence criteria. First, the empirical estimate is simpler than any other kernel dependence test, and requires no user-defined regularisation. Second, there is a clearly defined population quantity which the empirical estimate approaches in the large sample limit, with exponential convergence guaranteed between the two: this ensures that independence tests based on HSIC do not suffer from slow learning rates. Finally, we show in the context of independent component analysis (ICA) that the performance of HSIC is competitive with that of previously published kernel-based criteria, and of other recently published ICA methods.
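The empirical estimate described above reduces to a trace of products of centered kernel matrices. A minimal sketch using the biased form tr(KHLH) / (n-1)^2 with Gaussian kernels (the bandwidth is an ad-hoc choice, not tuned as in the report):

```python
import numpy as np

def rbf_kernel(x, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    d2 = np.sum(x**2, 1)[:, None] + np.sum(x**2, 1)[None, :] - 2 * x @ x.T
    return np.exp(-d2 / (2 * sigma**2))

def hsic(x, y, sigma=1.0):
    # Biased empirical HSIC: tr(K H L H) / (n - 1)^2, with H the centering matrix.
    n = x.shape[0]
    h = np.eye(n) - np.ones((n, n)) / n
    k, l = rbf_kernel(x, sigma), rbf_kernel(y, sigma)
    return np.trace(k @ h @ l @ h) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
y_dep = x + 0.1 * rng.normal(size=(200, 1))   # strongly dependent on x
y_ind = rng.normal(size=(200, 1))             # independent of x
print(hsic(x, y_dep), hsic(x, y_ind))         # dependent pair scores markedly higher
```

Note there is no regularization parameter anywhere in the estimate, which is the first advantage claimed in the abstract.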

PDF [BibTex]



Consistency of Kernel Canonical Correlation Analysis

Fukumizu, K., Bach, F., Gretton, A.

(942), Institute of Statistical Mathematics, 4-6-7 Minami-azabu, Minato-ku, Tokyo 106-8569 Japan, June 2005 (techreport)

PDF [BibTex]



Motor Skill Learning for Humanoid Robots

Peters, J.

First Conference on Undergraduate Computer Sciences and Information Sciences (CS/IS), May 2005 (talk)

[BibTex]



Kernel Constrained Covariance for Dependence Measurement

Gretton, A., Smola, A., Bousquet, O., Herbrich, R., Belitski, A., Augath, M., Murayama, Y., Schölkopf, B., Logothetis, N.

AISTATS, January 2005 (talk)

Abstract
We discuss reproducing kernel Hilbert space (RKHS)-based measures of statistical dependence, with emphasis on constrained covariance (COCO), a novel criterion to test dependence of random variables. We show that COCO is a test for independence if and only if the associated RKHSs are universal. That said, no independence test exists that can distinguish dependent and independent random variables in all circumstances. Dependent random variables can result in a COCO which is arbitrarily close to zero when the source densities are highly non-smooth. All current kernel-based independence tests share this behaviour. We demonstrate exponential convergence between the population and empirical COCO. Finally, we use COCO as a measure of joint neural activity between voxels in MRI recordings of the macaque monkey, and compare the results to the mutual information and the correlation. We also show the effect of removing breathing artefacts from the MRI recording.

PostScript [BibTex]



Approximate Inference for Robust Gaussian Process Regression

Kuss, M., Pfingsten, T., Csato, L., Rasmussen, C.

(136), Max Planck Institute for Biological Cybernetics, Tübingen, Germany, 2005 (techreport)

Abstract
Gaussian process (GP) priors have been successfully used in non-parametric Bayesian regression and classification models. Inference can be performed analytically only for the regression model with Gaussian noise. For all other likelihood models inference is intractable and various approximation techniques have been proposed. In recent years expectation-propagation (EP) has been developed as a general method for approximate inference. This article provides a general summary of how expectation-propagation can be used for approximate inference in Gaussian process models. Furthermore we present a case study describing its implementation for a new robust variant of Gaussian process regression. To gain further insights into the quality of the EP approximation we present experiments in which we compare to results obtained by Markov chain Monte Carlo (MCMC) sampling.

PDF [BibTex]



Maximum-Margin Feature Combination for Detection and Categorization

BakIr, G., Wu, M., Eichhorn, J.

Max Planck Institute for Biological Cybernetics, Tübingen, Germany, 2005 (techreport)

Abstract
In this paper we are concerned with the optimal combination of features of possibly different types for detection and estimation tasks in machine vision. We propose to combine features such that the resulting classifier maximizes the margin between classes. In contrast to existing approaches, which are non-convex and/or generative, we propose to use a discriminative model leading to a convex problem formulation and complexity control. Furthermore, we assert that decision functions should not compare apples and oranges by comparing features of different types directly. Instead, we propose to combine different similarity measures for each different feature type. Furthermore, we argue that the question "Which feature type is more discriminative for task X?" is ill-posed, and show empirically that the answer to this question might depend on the complexity of the decision function.

PDF [BibTex]



Towards a Statistical Theory of Clustering

von Luxburg, U., Ben-David, S.

Presented at the PASCAL workshop on clustering, London, 2005 (techreport)

Abstract
The goal of this paper is to discuss statistical aspects of clustering in a framework where the data to be clustered has been sampled from some unknown probability distribution. Firstly, the clustering of the data set should reveal some structure of the underlying data rather than model artifacts due to the random sampling process. Secondly, the more sample points we have, the more reliable the clustering should be. We discuss which methods can and cannot be used to tackle those problems. In particular we argue that generalization bounds as they are used in statistical learning theory of classification are unsuitable in a general clustering framework. We suggest that the main replacements of generalization bounds should be convergence proofs and stability considerations. This paper should be considered as a road map paper which identifies important questions and potentially fruitful directions for future research about statistical clustering. We do not attempt to present a complete statistical theory of clustering.

PDF [BibTex]



Approximate Bayesian Inference for Psychometric Functions using MCMC Sampling

Kuss, M., Jäkel, F., Wichmann, F.

(135), Max Planck Institute for Biological Cybernetics, Tübingen, Germany, 2005 (techreport)

Abstract
In psychophysical studies the psychometric function is used to model the relation between the physical stimulus intensity and the observer's ability to detect or discriminate between stimuli of different intensities. In this report we propose the use of Bayesian inference to extract the information contained in experimental data and estimate the parameters of psychometric functions. Since Bayesian inference cannot be performed analytically, we describe how a Markov chain Monte Carlo method can be used to generate samples from the posterior distribution over parameters. These samples are used to estimate Bayesian confidence intervals and other characteristics of the posterior distribution. In addition we discuss the parameterisation of psychometric functions and the role of prior distributions in the analysis. The proposed approach is exemplified using artificially generated data and in a case study for real experimental data. Furthermore, we compare our approach with traditional methods based on maximum-likelihood parameter estimation combined with bootstrap techniques for confidence interval estimation. The appendix provides a description of an implementation for the R environment for statistical computing and the code for reproducing the results discussed in the experiment section.
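The general approach can be miniaturized as follows. This is a sketch, not the report's R implementation: it assumes a logistic psychometric function, an improper flat prior, and a plain random-walk Metropolis sampler on simulated yes/no detection data.

```python
import numpy as np

rng = np.random.default_rng(1)

def psi(x, alpha, beta):
    # Logistic psychometric function: detection probability at intensity x.
    return 1.0 / (1.0 + np.exp(-(x - alpha) / beta))

# Simulated detection data with true parameters alpha = 0.5, beta = 0.6.
x = np.repeat(np.linspace(-2.0, 2.0, 9), 40)
y = rng.random(x.size) < psi(x, 0.5, 0.6)

def log_post(alpha, beta):
    # Bernoulli log-likelihood; flat prior with the constraint beta > 0.
    if beta <= 0:
        return -np.inf
    p = np.clip(psi(x, alpha, beta), 1e-12, 1 - 1e-12)
    return np.sum(y * np.log(p) + (~y) * np.log(1 - p))

# Random-walk Metropolis over (alpha, beta).
theta, lp = np.array([0.0, 1.0]), log_post(0.0, 1.0)
samples = []
for t in range(20_000):
    prop = theta + 0.05 * rng.normal(size=2)
    lp_prop = log_post(prop[0], prop[1])
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if t >= 5_000:                            # discard burn-in
        samples.append(theta)

samples = np.asarray(samples)
print(samples.mean(0))
```

Quantiles of `samples` give the Bayesian confidence intervals the abstract describes; the report's MCMC machinery differs mainly in the sampler and the priors.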

PDF [BibTex]


2002


Kernel Dependency Estimation

Weston, J., Chapelle, O., Elisseeff, A., Schölkopf, B., Vapnik, V.

(98), Max Planck Institute for Biological Cybernetics, August 2002 (techreport)

Abstract
We consider the learning problem of finding a dependency between a general class of objects and another, possibly different, general class of objects. The objects can be for example: vectors, images, strings, trees or graphs. Such a task is made possible by employing similarity measures in both input and output spaces using kernel functions, thus embedding the objects into vector spaces. Output kernels also make it possible to encode prior information and/or invariances in the loss function in an elegant way. We experimentally validate our approach on several tasks: mapping strings to strings, pattern recognition, and reconstruction from partial images.

PDF [BibTex]



Global Geometry of SVM Classifiers

Zhou, D., Xiao, B., Zhou, H., Dai, R.

Max Planck Institute for Biological Cybernetics, Tübingen, Germany, June 2002 (techreport)

Abstract
We construct a geometric framework for arbitrary-norm Support Vector Machine (SVM) classifiers. Within this framework, separating hyperplanes, dual descriptions, and solutions of SVM classifiers are constructed in a purely geometric fashion. In contrast to the optimization theory used for SVM classifiers, no complicated computations are required; each step in our theory is guided by elegant geometric intuitions.

PDF PostScript [BibTex]



Computationally Efficient Face Detection

Romdhani, S., Torr, P., Schölkopf, B., Blake, A.

(MSR-TR-2002-69), Microsoft Research, June 2002 (techreport)

Web [BibTex]



Kernel-based nonlinear blind source separation

Harmeling, S., Ziehe, A., Kawanabe, M., Müller, K.

EU-Project BLISS, January 2002 (techreport)

GZIP [BibTex]



A compression approach to support vector model selection

von Luxburg, U., Bousquet, O., Schölkopf, B.

(101), Max Planck Institute for Biological Cybernetics, 2002, see more detailed JMLR version (techreport)

Abstract
In this paper we investigate connections between statistical learning theory and data compression on the basis of support vector machine (SVM) model selection. Inspired by several generalization bounds we construct "compression coefficients" for SVMs, which measure the amount by which the training labels can be compressed by some classification hypothesis. The main idea is to relate the coding precision of this hypothesis to the width of the margin of the SVM. The compression coefficients connect well-known quantities such as the radius-margin ratio R^2/rho^2, the eigenvalues of the kernel matrix, and the number of support vectors. To test whether they are useful in practice we ran model selection experiments on several real-world datasets. As a result we found that compression coefficients can fairly accurately predict the parameters for which the test error is minimized.

[BibTex]
