

2007


Advances in Neural Information Processing Systems 19: Proceedings of the 2006 Conference

Schölkopf, B., Platt, J., Hofmann, T.

Proceedings of the Twentieth Annual Conference on Neural Information Processing Systems (NIPS 2006), pages: 1690, MIT Press, Cambridge, MA, USA, 20th Annual Conference on Neural Information Processing Systems (NIPS), September 2007 (proceedings)

Abstract
The annual Neural Information Processing Systems (NIPS) conference is the flagship meeting on neural computation and machine learning. It draws a diverse group of attendees (physicists, neuroscientists, mathematicians, statisticians, and computer scientists) interested in theoretical and applied aspects of modeling, simulating, and building neural-like or intelligent systems. The presentations are interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, brain imaging, vision, speech and signal processing, reinforcement learning, and applications. Only twenty-five percent of submitted papers are accepted for presentation at NIPS, so the quality is exceptionally high. This volume contains the papers presented at the December 2006 meeting, held in Vancouver.

Web [BibTex]



Mathematik der Wahrnehmung: Wendepunkte [Mathematics of Perception: Turning Points]

Wichmann, F., Ernst, MO.

Akademische Mitteilungen zwölf: Fünf Sinne, pages: 32-37, 2007 (misc)

[BibTex]


2004


Advanced Lectures on Machine Learning

Bousquet, O., von Luxburg, U., Rätsch, G.

ML Summer Schools 2003, LNAI 3176, pages: 240, Springer, Berlin, Germany, ML Summer Schools, September 2004 (proceedings)

Abstract
Machine Learning has become a key enabling technology for many engineering applications and for the investigation of scientific questions and theoretical problems alike. To stimulate discussions and to disseminate new results, a summer school series was started in February 2002, the documentation of which is published as LNAI 2600. This book presents revised lectures of two subsequent summer schools held in 2003 in Canberra, Australia, and in Tübingen, Germany. The tutorial lectures included are devoted to statistical learning theory, unsupervised learning, Bayesian inference, and applications in pattern recognition; they provide in-depth overviews of exciting new developments and contain a large number of references. Graduate students, lecturers, researchers and professionals alike will find this book a useful resource in learning and teaching machine learning.

Web [BibTex]



Pattern Recognition: 26th DAGM Symposium, LNCS, Vol. 3175

Rasmussen, C., Bülthoff, H., Giese, M., Schölkopf, B.

Proceedings of the 26th Pattern Recognition Symposium (DAGM'04), pages: 581, Springer, Berlin, Germany, 26th Pattern Recognition Symposium, August 2004 (proceedings)

Web DOI [BibTex]



Advances in Neural Information Processing Systems 16: Proceedings of the 2003 Conference

Thrun, S., Saul, L., Schölkopf, B.

Proceedings of the Seventeenth Annual Conference on Neural Information Processing Systems (NIPS 2003), pages: 1621, MIT Press, Cambridge, MA, USA, 17th Annual Conference on Neural Information Processing Systems (NIPS), June 2004 (proceedings)

Abstract
The annual Neural Information Processing Systems (NIPS) conference is the flagship meeting on neural computation. It draws a diverse group of attendees: physicists, neuroscientists, mathematicians, statisticians, and computer scientists. The presentations are interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, brain imaging, vision, speech and signal processing, reinforcement learning and control, emerging technologies, and applications. Only thirty percent of submitted papers are accepted for presentation at NIPS, so the quality is exceptionally high. This volume contains all the papers presented at the 2003 conference.

Web [BibTex]



Statistische Lerntheorie und Empirische Inferenz [Statistical Learning Theory and Empirical Inference]

Schölkopf, B.

Jahrbuch der Max-Planck-Gesellschaft 2004, pages: 377-382, 2004 (misc)

Abstract
Statistical learning theory studies the process of inferring regularities from empirical data. The fundamental problem is what is called generalization: how is it possible to infer a law that will be valid for an infinite number of future observations, given only a finite amount of data? This problem hinges on fundamental issues of statistics and of science in general, such as the complexity of explanations, a priori knowledge, and the representation of data.
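
To give a flavor of the generalization guarantees the theory provides, one classical result (Vapnik's VC bound, quoted here as standard background rather than from the abstract above; the symbols $n$, $h$, and $\eta$ are introduced for illustration) states that, with probability at least $1 - \eta$ over a random sample of size $n$, every function $f$ in a class of VC dimension $h$ satisfies

$$ R(f) \;\le\; R_{\mathrm{emp}}(f) \;+\; \sqrt{\frac{h\left(\ln\frac{2n}{h} + 1\right) + \ln\frac{4}{\eta}}{n}}, $$

so the gap between the true risk $R(f)$ and the empirical risk $R_{\mathrm{emp}}(f)$ shrinks as the sample grows and widens with the complexity of the function class.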

PDF Web [BibTex]
