2015


Causal Inference for Empirical Time Series Based on the Postulate of Independence of Cause and Mechanism

Besserve, M.

53rd Annual Allerton Conference on Communication, Control, and Computing, September 2015 (talk)

[BibTex]

Independence of cause and mechanism in brain networks

Besserve, M.

DALI workshop on Networks: Processes and Causality, April 2015 (talk)

[BibTex]

Information-Theoretic Implications of Classical and Quantum Causal Structures

Chaves, R., Majenz, C., Luft, L., Maciel, T., Janzing, D., Schölkopf, B., Gross, D.

18th Conference on Quantum Information Processing (QIP), 2015 (talk)

Web link (url) [BibTex]

Assessment of brain tissue damage in the Sub-Acute Stroke Region by Multiparametric Imaging using [89-Zr]-Desferal-EPO-PET/MRI

Castaneda, S. G., Katiyar, P., Russo, F., Disselhorst, J. A., Calaminus, C., Poli, S., Maurer, A., Ziemann, U., Pichler, B. J.

World Molecular Imaging Conference, 2015 (talk)

[BibTex]

Early time point in vivo PET/MR is a promising biomarker for determining efficacy of a novel Db(αEGFR)-scTRAIL fusion protein therapy in a colon cancer model

Divine, M. R., Harant, M., Katiyar, P., Disselhorst, J. A., Bukala, D., Aidone, S., Siegemund, M., Pfizenmaier, K., Kontermann, R., Pichler, B. J.

World Molecular Imaging Conference, 2015 (talk)

[BibTex]

The search for single exoplanet transits in the Kepler light curves

Foreman-Mackey, D., Hogg, D. W., Schölkopf, B.

IAU General Assembly, 22, pages: 2258352, 2015 (talk)

link (url) [BibTex]

2012

Support Vector Machines, Support Measure Machines, and Quasar Target Selection

Muandet, K.

Center for Cosmology and Particle Physics (CCPP), New York University, December 2012 (talk)

[BibTex]

Hilbert Space Embedding for Dirichlet Process Mixtures

Muandet, K.

NIPS Workshop on Confluence between Kernel Methods and Graphical Models, December 2012 (talk)

[BibTex]

Simultaneous small animal PET/MR in activated and resting state reveals multiple brain networks

Wehrl, H., Lankes, K., Hossain, M., Bezrukov, I., Liu, C., Martirosian, P., Schick, F., Pichler, B.

20th Annual Meeting and Exhibition of the International Society for Magnetic Resonance in Medicine (ISMRM), May 2012 (talk)

Web [BibTex]

A new PET insert for simultaneous PET/MR small animal imaging

Wehrl, H., Lankes, K., Hossain, M., Bezrukov, I., Liu, C., Martirosian, P., Reischl, G., Schick, F., Pichler, B.

20th Annual Meeting and Exhibition of the International Society for Magnetic Resonance in Medicine (ISMRM), May 2012 (talk)

Web [BibTex]

Evaluation of a new, large field of view, small animal PET/MR system

Hossain, M., Wehrl, H., Lankes, K., Liu, C., Bezrukov, I., Reischl, G., Pichler, B.

50. Jahrestagung der Deutschen Gesellschaft für Nuklearmedizin (NuklearMedizin), April 2012 (talk)

Web [BibTex]

Simultaneous small animal PET/MR reveals different brain networks during stimulation and rest

Wehrl, H., Hossain, M., Lankes, K., Liu, C., Bezrukov, I., Martirosian, P., Reischl, G., Schick, F., Pichler, B.

World Molecular Imaging Congress (WMIC), 2012 (talk)

[BibTex]

Support Measure Machines for Quasar Target Selection

Muandet, K.

Astro Imaging Workshop, 2012 (talk)

Abstract
In this talk I will discuss the problem of quasar target selection. Object attributes in astronomy, such as fluxes, are often subject to substantial and heterogeneous measurement uncertainties, especially for medium-redshift quasars (between 2.2 and 3.5), which are relatively rare and must be targeted down to g ~ 22 mag. Most previous approaches to quasar target selection, including UV excess, kernel density estimation, a likelihood approach, and artificial neural networks, cannot directly deal with these heterogeneous input uncertainties. Recently, extreme deconvolution (XD) has been used to tackle the problem in a well-posed manner. In this work, we present a discriminative approach to quasar target selection that deals with input uncertainties directly. To do so, we represent each object as a Gaussian distribution whose mean is the object's attribute vector and whose covariance is the given flux measurement uncertainty. Given a training set of such Gaussian distributions, the support measure machine (SMM) algorithm is trained and used to build the quasar targeting catalog. Preliminary results will also be presented. Joint work with Jo Bovy and Bernhard Schölkopf.
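
For illustration, a minimal sketch (not the talk's actual pipeline) of the idea: under a Gaussian RBF base kernel, the expected kernel between two Gaussian-represented objects has a closed form, and the resulting Gram matrix can be fed to a precomputed-kernel SVM. The data, bandwidth, and labels below are purely hypothetical.

```python
import numpy as np
from sklearn.svm import SVC

def expected_rbf_kernel(means, covs, gamma2):
    """Closed-form E[k(x, y)] for x ~ N(mu_i, S_i), y ~ N(mu_j, S_j)
    with base kernel k(x, y) = exp(-||x - y||^2 / (2 * gamma2))."""
    n, d = means.shape
    K = np.empty((n, n))
    I = np.eye(d)
    for i in range(n):
        for j in range(n):
            S = covs[i] + covs[j] + gamma2 * I
            diff = means[i] - means[j]
            K[i, j] = (gamma2 ** (d / 2)) / np.sqrt(np.linalg.det(S)) \
                      * np.exp(-0.5 * diff @ np.linalg.solve(S, diff))
    return K

# Illustrative stand-in data: attribute vectors with per-object measurement covariances.
rng = np.random.default_rng(0)
means = rng.normal(size=(200, 5))                      # object attribute vectors
covs = np.array([np.diag(rng.uniform(0.01, 0.3, 5))    # heteroscedastic uncertainties
                 for _ in range(200)])
labels = (means[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)  # stand-in classes

K = expected_rbf_kernel(means, covs, gamma2=1.0)
clf = SVC(kernel="precomputed", C=1.0).fit(K, labels)
print("training accuracy:", clf.score(K, labels))
```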

Web [BibTex]


PAC-Bayesian Analysis: A Link Between Inference and Statistical Physics

Seldin, Y.

Workshop on Statistical Physics of Inference and Control Theory, 2012 (talk)

Web [BibTex]

PET Performance Measurements of a Next Generation Dedicated Small Animal PET/MR Scanner

Liu, C., Hossain, M., Lankes, K., Bezrukov, I., Wehrl, H., Kolb, A., Judenhofer, M., Pichler, B.

Nuclear Science Symposium and Medical Imaging Conference (NSS-MIC), 2012 (talk)

[BibTex]

PAC-Bayesian Analysis of Supervised, Unsupervised, and Reinforcement Learning

Seldin, Y., Laviolette, F., Shawe-Taylor, J.

Tutorial at the 29th International Conference on Machine Learning (ICML), 2012 (talk)

Web Web [BibTex]

Influence of MR-based attenuation correction on lesions within bone and susceptibility artifact regions

Bezrukov, I., Schmidt, H., Mantlik, F., Schwenzer, N., Brendle, C., Pichler, B.

Molekulare Bildgebung (MoBi), 2012 (talk)

[BibTex]

Structured Apprenticeship Learning

Boularias, A., Kroemer, O., Peters, J.

European Workshop on Reinforcement Learning (EWRL), 2012 (talk)

[BibTex]

PAC-Bayesian Analysis and Its Applications

Seldin, Y., Laviolette, F., Shawe-Taylor, J.

Tutorial at The European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD), 2012 (talk)

Web [BibTex]

Kernel Bellman Equations in POMDPs

Nishiyama, Y., Boularias, A., Gretton, A., Fukumizu, K.

Technical Committee on Information-Based Induction Sciences and Machine Learning (IBISML'12), 2012 (talk)

[BibTex]

Beta oscillations propagate as traveling waves in the macaque prefrontal cortex

Panagiotaropoulos, T., Besserve, M., Logothetis, N.

42nd Annual Meeting of the Society for Neuroscience (Neuroscience), 2012 (talk)

[BibTex]

2008

BCPy2000

Hill, N., Schreiner, T., Puzicha, C., Farquhar, J.

Workshop "Machine Learning Open-Source Software" at NIPS, December 2008 (talk)

Web [BibTex]

Logistic Regression for Graph Classification

Shervashidze, N., Tsuda, K.

NIPS Workshop on "Structured Input - Structured Output" (NIPS SISO), December 2008 (talk)

Abstract
In this paper we deal with graph classification. We propose a new algorithm for performing sparse logistic regression on graphs, which is comparable in accuracy to other graph classification methods and additionally produces probabilistic output. Sparsity is required for interpretability, which is often necessary in domains such as bioinformatics or chemoinformatics.
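
As a rough sketch of the two ingredients highlighted here (sparsity and probabilistic output), the snippet below assumes a precomputed binary subgraph-indicator feature matrix and uses plain L1-penalised logistic regression; the talk's own algorithm, which selects subgraph features during training, is not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative stand-in: rows are graphs, columns are binary indicators of
# whether a particular (hypothetical) subgraph pattern occurs in that graph.
rng = np.random.default_rng(0)
X = (rng.random((300, 500)) < 0.05).astype(float)   # hypothetical subgraph features
w_true = np.zeros(500)
w_true[:10] = 3.0                                    # only a few patterns matter
y = (X @ w_true + rng.normal(size=300) > 0.5).astype(int)

# The L1 penalty enforces sparsity, so the selected patterns stay interpretable,
# and predict_proba gives the probabilistic output mentioned in the abstract.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
print("non-zero coefficients:", np.sum(clf.coef_ != 0))
print("class probabilities for the first graph:", clf.predict_proba(X[:1]))
```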

Web [BibTex]

New Projected Quasi-Newton Methods with Applications

Sra, S.

Microsoft Research Tech-talk, December 2008 (talk)

Abstract
Box-constrained convex optimization problems are central to several applications in a variety of fields such as statistics, psychometrics, signal processing, medical imaging, and machine learning. Two fundamental examples are the non-negative least squares (NNLS) problem and the non-negative Kullback-Leibler (NNKL) divergence minimization problem. The non-negativity constraints are usually based on an underlying physical restriction: e.g., when dealing with applications in astronomy, tomography, statistical estimation, or image restoration, the underlying parameters represent physical quantities such as concentration, weight, intensity, or frequency counts, and are therefore only interpretable with non-negative values. Several modern optimization methods can be inefficient for simple problems such as NNLS and NNKL, as they are really designed to handle far more general and complex problems. In this work we develop two simple quasi-Newton methods for solving box-constrained (differentiable) convex optimization problems that utilize the well-known BFGS and limited-memory BFGS updates. We position our method between projected gradient (Rosen, 1960) and projected Newton (Bertsekas, 1982) methods, and prove its convergence under a simple Armijo step-size rule. We illustrate our method with applications to image deblurring, positron emission tomography (PET) image reconstruction, and non-negative matrix approximation (NMA). On medium-sized data we observe performance competitive with established procedures, while for larger data the results are even better.
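
The talk's own two-metric projected (L-)BFGS method is not reproduced below; as a stand-in for the same problem class, this hedged sketch solves an NNLS instance with SciPy's box-constrained L-BFGS-B solver. Problem sizes and data are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Non-negative least squares: minimize 0.5 * ||A x - b||^2  subject to  x >= 0.
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 50))
b = A @ np.abs(rng.normal(size=50)) + 0.01 * rng.normal(size=200)

def objective(x):
    r = A @ x - b
    return 0.5 * r @ r, A.T @ r          # objective value and gradient

res = minimize(objective, x0=np.zeros(50), jac=True, method="L-BFGS-B",
               bounds=[(0.0, None)] * 50)
print("NNLS residual:", np.linalg.norm(A @ res.x - b))
print("active (zero) coordinates:", np.sum(res.x <= 1e-10))
```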

PDF [BibTex]

MR-Based PET Attenuation Correction: Initial Results for Whole Body

Hofmann, M., Steinke, F., Aschoff, P., Lichy, M., Brady, M., Schölkopf, B., Pichler, B.

Medical Imaging Conference, October 2008 (talk)

[BibTex]

Nonparametric Independence Tests: Space Partitioning and Kernel Approaches

Gretton, A., Györfi, L.

19th International Conference on Algorithmic Learning Theory (ALT08), October 2008 (talk)

PDF Web [BibTex]

Data-driven goodness-of-fit tests

Langovoy, M.

2008 Barcelona Conference on Asymptotic Statistics (BAS), September 2008 (talk)

Web [BibTex]

mGene: A Novel Discriminative Gene Finder

Schweikert, G., Zeller, G., Zien, A., Behr, J., Sonnenburg, S., Philips, P., Ong, C., Rätsch, G.

Worm Genomics and Systems Biology meeting, July 2008 (talk)

[BibTex]

Discovering Common Sequence Variation in Arabidopsis thaliana

Rätsch, G., Clark, R., Schweikert, G., Toomajian, C., Ossowski, S., Zeller, G., Shinn, P., Warthman, N., Hu, T., Fu, G., Hinds, D., Cheng, H., Frazer, K., Huson, D., Schölkopf, B., Nordborg, M., Ecker, J., Weigel, D., Schneeberger, K., Bohlen, A.

16th Annual International Conference Intelligent Systems for Molecular Biology (ISMB), July 2008 (talk)

Web [BibTex]

Coding Theory in Brain-Computer Interfaces

Martens, SMM.

Soria Summerschool on Computational Mathematics "Algebraic Coding Theory" (S3CM), July 2008 (talk)

Web [BibTex]

Motor Skill Learning for Cognitive Robotics

Peters, J.

6th International Cognitive Robotics Workshop (CogRob), July 2008 (talk)

Abstract
Autonomous robots that can assist humans in situations of daily life have been a long-standing vision of robotics, artificial intelligence, and cognitive sciences. A first step towards this goal is to create robots that can learn tasks triggered by environmental context or higher-level instruction. However, learning techniques have yet to live up to this promise, as only few methods manage to scale to high-dimensional manipulators or humanoid robots. In this tutorial, we give a general overview of motor skill learning for cognitive robotics, using research at ATR, USC, CMU, and Max Planck to illustrate the problems in motor skill learning. To do so, we discuss task-appropriate representations and algorithms for learning robot motor skills. Among the topics are learning basic movements or motor primitives by imitation and reinforcement learning, learning rhythmic and discrete movements, fast regression methods for learning inverse dynamics, and setups for learning task-space policies. Examples on various robots, e.g., SARCOS DB, the SARCOS Master Arm, BDI Little Dog, and a Barrett WAM, are shown and include Ball-in-a-Cup, T-Ball, Juggling, Devil-Sticking, Operational Space Control, and many others.

Web [BibTex]

Painless Embeddings of Distributions: the Function Space View (Part 1)

Fukumizu, K., Gretton, A., Smola, A.

25th International Conference on Machine Learning (ICML), July 2008 (talk)

Abstract
This tutorial will give an introduction to the recent understanding and methodology of the kernel method: dealing with higher-order statistics by painlessly embedding random variables/probability distributions. In the early days of kernel machines research, the "kernel trick" was considered a useful way of constructing nonlinear algorithms from linear ones. More recently, however, it has become clear that a potentially more far-reaching use of kernels is as a linear way of dealing with higher-order statistics by embedding distributions in a suitable reproducing kernel Hilbert space (RKHS). Notably, unlike the straightforward expansion of higher-order moments or the conventional characteristic-function approach, the use of kernels or RKHSs provides a painless, tractable way of embedding distributions. This line of reasoning leads naturally to the questions: What does it mean to embed a distribution in an RKHS? When is this embedding injective (and thus, when do different distributions have unique mappings)? What implications are there for learning algorithms that make use of these embeddings? This tutorial aims at answering these questions. A great variety of applications in machine learning and computer science require distribution estimation and/or comparison.
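
A minimal sketch of the tutorial's central object, the empirical mean embedding, and of the induced distance between two embedded samples (an MMD-style statistic), assuming a Gaussian kernel; the data below are illustrative.

```python
import numpy as np

def rbf_gram(X, Y, sigma=1.0):
    """Gaussian kernel matrix k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    """Squared distance between the empirical mean embeddings of two samples:
    ||mu_X - mu_Y||_H^2 = E k(x,x') - 2 E k(x,y) + E k(y,y') (biased estimate)."""
    return (rbf_gram(X, X, sigma).mean()
            - 2 * rbf_gram(X, Y, sigma).mean()
            + rbf_gram(Y, Y, sigma).mean())

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 2))        # sample from P
Y = rng.normal(0.5, 1.0, size=(500, 2))        # sample from a shifted Q
print("MMD^2(P, P'):", mmd2(X, rng.normal(0.0, 1.0, size=(500, 2))))
print("MMD^2(P, Q): ", mmd2(X, Y))
```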

PDF Web [BibTex]

Reinforcement Learning for Robotics

Peters, J.

8th European Workshop on Reinforcement Learning for Robotics (EWRL), July 2008 (talk)

Web [BibTex]

Multi-Classification by Categorical Features via Clustering

Seldin, Y.

25th International Conference on Machine Learning (ICML), June 2008 (talk)

Abstract
We derive a generalization bound for multi-classification schemes based on grid clustering in categorical parameter product spaces. Grid clustering partitions the parameter space in the form of a Cartesian product of partitions for each of the parameters. The derived bound provides a means to evaluate clustering solutions in terms of the generalization power of a built-on classifier. For classification based on a single feature the bound serves to find a globally optimal classification rule. Comparison of the generalization power of individual features can then be used for feature ranking. Our experiments show that in this role the bound is much more precise than mutual information or normalized correlation indices.

PDF Web [BibTex]

Thin-Plate Splines Between Riemannian Manifolds

Steinke, F., Hein, M., Schölkopf, B.

Workshop on Geometry and Statistics of Shapes, June 2008 (talk)

Abstract
With the help of differential geometry we describe a framework to define a thin-plate-spline-like energy for maps between arbitrary Riemannian manifolds. The so-called Eells energy only depends on the intrinsic geometry of the input and output manifolds, not on their respective representations. The energy can then be used for regression between manifolds; we present results for cases where the outputs are rotations, sets of angles, or points on 3D surfaces. In the future we plan to also target regression where the output is an element of "shape space", understood as a Riemannian manifold. One could also further explore the meaning of the Eells energy when applied to diffeomorphisms between shapes, especially with regard to its potential use as a distance measure between shapes that does not depend on the embedding or the parametrisation of the shapes.
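
For orientation, a loosely notated sketch of the two energies involved: the classical Euclidean thin-plate spline energy, and an Eells-type energy on maps between manifolds that penalises the covariant second derivative of the map. The precise function spaces and norms are those of the talk and are not spelled out here.

```latex
% Classical thin-plate spline energy for f : R^d -> R (flat case):
E_{\mathrm{TPS}}(f) \;=\; \int_{\mathbb{R}^d} \sum_{i,j=1}^{d}
  \Bigl(\tfrac{\partial^2 f}{\partial x_i \,\partial x_j}\Bigr)^{2}\,dx .

% Eells-type energy for a map \phi : M \to N between Riemannian manifolds,
% penalising the second covariant derivative of \phi (sketch, notation loose):
E_{\mathrm{Eells}}(\phi) \;=\; \int_{M} \bigl\|\nabla d\phi\bigr\|^{2}\,dV_{M} .
```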

Web [BibTex]

Learning resolved velocity control

Peters, J.

2008 IEEE International Conference on Robotics and Automation (ICRA), May 2008 (talk)

Web [BibTex]

Bayesian methods for protein structure determination

Habeck, M.

Machine Learning in Structural Bioinformatics, April 2008 (talk)

Web [BibTex]

2005

Spectral clustering and transductive inference for graph data

Zhou, D.

NIPS Workshop on Kernel Methods and Structured Domains, December 2005 (talk)

PDF Web [BibTex]

Some thoughts about Gaussian Processes

Chapelle, O.

NIPS Workshop on Open Problems in Gaussian Processes for Machine Learning, December 2005 (talk)

PDF Web [BibTex]

Building Sparse Large Margin Classifiers

Wu, M., Schölkopf, B., BakIr, G.

The 22nd International Conference on Machine Learning (ICML), August 2005 (talk)

PDF [BibTex]

Learning from Labeled and Unlabeled Data on a Directed Graph

Zhou, D.

The 22nd International Conference on Machine Learning, August 2005 (talk)

Abstract
We propose a general framework for learning from labeled and unlabeled data on a directed graph in which the structure of the graph including the directionality of the edges is considered. The time complexity of the algorithm derived from this framework is nearly linear due to recently developed numerical techniques. In the absence of labeled instances, this framework can be utilized as a spectral clustering method for directed graphs, which generalizes the spectral clustering approach for undirected graphs. We have applied our framework to real-world web classification problems and obtained encouraging results.
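
A hedged sketch of the kind of construction described here: out-degree-normalise the adjacency matrix, make the random walk ergodic by teleportation, symmetrise the walk operator with its stationary distribution, and solve the resulting regularised labelling problem in closed form. The teleport parameter and the closed-form solve are assumptions of this sketch, not details from the talk.

```python
import numpy as np

def directed_label_propagation(A, y, alpha=0.9, eta=0.01):
    """Semi-supervised labelling on a directed graph via a teleporting random walk.
    A is the (possibly asymmetric) adjacency matrix; y holds +1/-1 for labelled
    nodes and 0 for unlabelled ones."""
    n = A.shape[0]
    P = A / np.maximum(A.sum(axis=1, keepdims=True), 1e-12)   # out-degree normalisation
    P = (1 - eta) * P + eta / n                                # teleport to make it ergodic
    pi = np.full(n, 1.0 / n)
    for _ in range(1000):                                      # power iteration: stationary dist.
        pi = pi @ P
    s = np.sqrt(pi)
    W = s[:, None] * P / s[None, :]                            # Pi^{1/2} P Pi^{-1/2}
    Theta = (W + W.T) / 2                                      # symmetrised operator
    f = np.linalg.solve(np.eye(n) - alpha * Theta, y)          # closed-form labelling
    return np.sign(f)

# Tiny illustration: a 4-node directed cycle with two labelled nodes.
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)
y = np.array([1.0, 0.0, -1.0, 0.0])
print(directed_label_propagation(A, y))
```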

PDF [BibTex]

Machine-Learning Approaches to BCI in Tübingen

Bensch, M., Bogdan, M., Hill, N., Lal, T., Rosenstiel, W., Schölkopf, B., Schröder, M.

Brain-Computer Interface Technology, June 2005, Talk given by NJH. (talk)

[BibTex]

Learning Motor Primitives with Reinforcement Learning

Peters, J., Schaal, S.

ROBOTICS Workshop on Modular Foundations for Control and Perception, June 2005 (talk)

Web [BibTex]

Motor Skill Learning for Humanoid Robots

Peters, J.

First Conference Undergraduate Computer Sciences and Information Sciences (CS/IS), May 2005 (talk)

[BibTex]

Kernel Constrained Covariance for Dependence Measurement

Gretton, A., Smola, A., Bousquet, O., Herbrich, R., Belitski, A., Augath, M., Murayama, Y., Schölkopf, B., Logothetis, N.

AISTATS, January 2005 (talk)

Abstract
We discuss reproducing kernel Hilbert space (RKHS)-based measures of statistical dependence, with emphasis on constrained covariance (COCO), a novel criterion to test dependence of random variables. We show that COCO is a test for independence if and only if the associated RKHSs are universal. That said, no independence test exists that can distinguish dependent and independent random variables in all circumstances. Dependent random variables can result in a COCO which is arbitrarily close to zero when the source densities are highly non-smooth. All current kernel-based independence tests share this behaviour. We demonstrate exponential convergence between the population and empirical COCO. Finally, we use COCO as a measure of joint neural activity between voxels in MRI recordings of the macaque monkey, and compare the results to the mutual information and the correlation. We also show the effect of removing breathing artefacts from the MRI recording.
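
A minimal sketch of an empirical constrained-covariance-type statistic computed from centred Gram matrices (the largest covariance attainable by unit-norm RKHS functions of the two variables); the exact normalisation and the tests used in the talk may differ.

```python
import numpy as np

def rbf_gram(X, sigma=1.0):
    d2 = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma**2))

def coco(X, Y, sigma=1.0):
    """Empirical constrained covariance (sketch): largest centred cross-covariance
    achievable by unit-norm RKHS functions of X and Y."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    Kc = H @ rbf_gram(X, sigma) @ H
    Lc = H @ rbf_gram(Y, sigma) @ H
    # largest eigenvalue of Kc @ Lc (real and non-negative up to round-off)
    lam = np.max(np.real(np.linalg.eigvals(Kc @ Lc)))
    return np.sqrt(max(lam, 0.0)) / n

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
print("independent:", coco(X, rng.normal(size=(200, 1))))
print("dependent:  ", coco(X, np.sin(3 * X) + 0.1 * rng.normal(size=(200, 1))))
```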

PostScript [BibTex]
