2012


Scalable graph kernels

Shervashidze, N.

Eberhard Karls Universität Tübingen, Germany, October 2012 (PhD thesis)

Web [BibTex]


Probabilistic Modelling of Expression Variation in Modern eQTL Studies

Zwießele, M.

Eberhard Karls Universität Tübingen, Germany, October 2012 (Master's thesis)

[BibTex]


Learning Motor Skills: From Algorithms to Robot Experiments

Kober, J.

Technische Universität Darmstadt, Germany, March 2012 (PhD thesis)

PDF [BibTex]


Structure and Dynamics of Diffusion Networks

Gomez Rodriguez, M.

Department of Electrical Engineering, Stanford University, 2012 (PhD thesis)

Web [BibTex]


Blind Deconvolution in Scientific Imaging & Computational Photography

Hirsch, M.

Eberhard Karls Universität Tübingen, Germany, 2012 (PhD thesis)

Web [BibTex]


Mining correlated loci at a genome-wide scale

Velkov, V.

Eberhard Karls Universität Tübingen, Germany, 2012 (Master's thesis)

[BibTex]

2011


Optimization for Machine Learning

Sra, S., Nowozin, S., Wright, S.

494 pages, Neural Information Processing series, MIT Press, Cambridge, MA, USA, December 2011 (book)

Abstract
The interplay between optimization and machine learning is one of the most important developments in modern computational science. Optimization formulations and methods are proving to be vital in designing algorithms to extract essential knowledge from huge volumes of data. Machine learning, however, is not simply a consumer of optimization technology but a rapidly evolving field that is itself generating new optimization ideas. This book captures the state of the art of the interaction between optimization and machine learning in a way that is accessible to researchers in both fields. Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties. The increasing complexity, size, and variety of today's machine learning models call for the reassessment of existing assumptions. This book starts the process of reassessment. It describes the resurgence in novel contexts of established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods. It also devotes attention to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods. Many of these techniques draw inspiration from other fields, including operations research, theoretical computer science, and subfields of optimization. The book will enrich the ongoing cross-fertilization between the machine learning community and these other fields, and within the broader optimization community.

Web [BibTex]


Bayesian Time Series Models

Barber, D., Cemgil, A., Chiappa, S.

432 pages, Cambridge University Press, Cambridge, UK, August 2011 (book)

[BibTex]


Crowdsourcing for optimisation of deconvolution methods via an iPhone application

Lang, A.

Hochschule Reutlingen, Germany, April 2011 (Master's thesis)

[BibTex]


Learning functions with kernel methods

Dinuzzo, F.

University of Pavia, Italy, January 2011 (PhD thesis)

PDF [BibTex]


Handbook of Statistical Bioinformatics

Lu, H., Schölkopf, B., Zhao, H.

627 pages, Springer Handbooks of Computational Statistics, Springer, Berlin, Germany, 2011 (book)

Web DOI [BibTex]


Model Learning in Robot Control

Nguyen-Tuong, D.

Albert-Ludwigs-Universität Freiburg, Germany, 2011 (PhD thesis)

[BibTex]