novel causal inference methods and their foundations
physics of causality and information flow
notions of complexity and their application in machine learning
statistical physics, in particular the link between causality and the second law of thermodynamics

I founded the group "causal inference" together with Bernhard Schölkopf; its website can be found here.
I worked on quantum information theory for many years and am still interested in it; my current causality research is strongly influenced by the paradigm that information is physical. To see the publications from my previous field, visit the following website.
Dominik Janzing studied physics in Tübingen (Germany) and Cork (Ireland) and received a Ph.D. in mathematics from the University of Tübingen in 1998. From 1998 to 2006 he was a postdoc and senior scientist at the Computer Science Department of the University of Karlsruhe (TH), where he worked on quantum thermodynamics, quantum control, and quantum complexity theory and its physical foundations. In 2006 he received his teaching permission (Habilitation) from the Computer Science Department at Universität Karlsruhe (now the "Karlsruhe Institute of Technology (KIT)"). Since 2007 he has been working as a senior scientist at the Max Planck Institute for Biological Cybernetics in Tübingen, where he founded the group "causal inference" together with Bernhard Schölkopf.
The group develops novel methods for causal reasoning from statistical data. These approaches use the complexity of conditional probability distributions for causal reasoning. The idea is strongly influenced by his previous work on the complexity of physical processes and the thermodynamics of information flow.
Ried, K., Agnew, M., Vermeyden, L., Janzing, D., Spekkens, R., Resch, K.
Nature Physics, 11(5):414-420, March 2015 (article)
The problem of inferring causal relations from observed correlations is relevant to a wide variety of scientific disciplines. Yet given the correlations between just two classical variables, it is impossible to determine whether they arose from a causal influence of one on the other or a common cause influencing both. Only a randomized trial can settle the issue. Here we consider the problem of causal inference for quantum variables. We show that the analogue of a randomized trial, causal tomography, yields a complete solution. We also show that, in contrast to the classical case, one can sometimes infer the causal structure from observations alone. We implement a quantum-optical experiment wherein we control the causal relation between two optical modes, and two measurement schemes—with and without randomization—that extract this relation from the observed correlations. Our results show that entanglement and quantum coherence provide an advantage for causal inference.
In Proceedings of The 32nd International Conference on Machine Learning, 37, pages: 2218–2226, JMLR Workshop and Conference Proceedings, (Editors: Bach, F. and Blei, D.), JMLR, ICML, 2015 (inproceedings)
In Proceedings of the 32nd International Conference on Machine Learning, 37, pages: 1917–1925, JMLR Workshop and Conference Proceedings, (Editors: Bach, F. and Blei, D.), JMLR, ICML, 2015 (inproceedings)
In Proceedings of the 18th International Conference on Artificial Intelligence and Statistics, 38, pages: 847-855, JMLR Workshop and Conference Proceedings, (Editors: Lebanon, G. and Vishwanathan, S.V.N.), JMLR.org, AISTATS, 2015 (inproceedings)
In Proceedings of the Twenty-Ninth Annual Conference on Uncertainty in Artificial Intelligence, pages: 440-448, (Editors: Nicholson, A. and Smyth, P.), AUAI Press, Corvallis, Oregon, UAI, 2013 (inproceedings)
Minds and Machines, 23(2):227-249, May 2013 (article)
Independence of Conditionals (IC) has recently been proposed as a basic rule for causal structure learning. If a Bayesian network represents the causal structure, its Conditional Probability Distributions (CPDs) should be algorithmically independent. In this paper we compare IC with causal faithfulness (FF), stating that only those conditional independences that are implied by the causal Markov condition hold true. The latter is a basic postulate in common approaches to causal structure learning. The common spirit of FF and IC is to reject causal graphs for which the joint distribution looks ‘non-generic’. The difference lies in the notion of genericity: FF sometimes rejects models just because one of the CPDs is simple, for instance if the CPD describes a deterministic relation. IC does not behave in this undesirable way. It only rejects a model when there is a non-generic relation between different CPDs although each CPD looks generic when considered separately. Moreover, it detects relations between CPDs that cannot be captured by conditional independences. IC therefore helps in distinguishing causal graphs that induce the same conditional independences (i.e., they belong to the same Markov equivalence class). The usual justification for FF implicitly assumes a prior that is a probability density on the parameter space. IC can be justified by Solomonoff’s universal prior, assigning non-zero probability to those points in parameter space that have a finite description. In this way, it favours simple CPDs, and therefore respects Occam’s razor. Since Kolmogorov complexity is uncomputable, IC is not directly applicable in practice. We argue that it is nevertheless helpful, since it has already served as inspiration and justification for novel causal inference algorithms.
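The contrast drawn in the abstract can be made concrete with a small numerical sketch (a hypothetical toy example, not taken from the paper): in the graph X → Y, X → Z, the causal Markov condition implies only Y ⊥ Z | X. If the CPD p(Y|X) is deterministic (say Y = X), the additional independence X ⊥ Z | Y also holds; since it is not implied by the Markov condition, faithfulness (FF) rejects the true graph, even though the deterministic CPD is merely simple rather than fine-tuned. IC does not reject it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# True causal graph: X -> Y and X -> Z.
# The CPD p(Y|X) is deterministic (Y = X): simple, but not fine-tuned.
x = rng.integers(0, 2, n)                        # X ~ Bernoulli(1/2)
y = x.copy()                                     # Y = X, a deterministic CPD
z = x ^ (rng.random(n) < 0.1).astype(int)        # Z: noisy copy of X (10% flips)

def cond_mi(a, b, c):
    """Empirical conditional mutual information I(A;B | C) in nats,
    for binary 0/1 arrays, via direct plug-in estimation."""
    mi = 0.0
    for cv in (0, 1):
        mask = c == cv
        pc = mask.mean()
        if pc == 0:
            continue
        for av in (0, 1):
            for bv in (0, 1):
                pab = ((a == av) & (b == bv) & mask).mean() / pc
                pa = ((a == av) & mask).mean() / pc
                pb = ((b == bv) & mask).mean() / pc
                if pab > 0:
                    mi += pc * pab * np.log(pab / (pa * pb))
    return mi

# Independence implied by the causal Markov condition: Y _||_ Z | X.
print(cond_mi(y, z, x))                          # ~0
# Extra independence created by the deterministic CPD: X _||_ Z | Y.
# It is not implied by the Markov condition for this graph, so FF
# rejects the true graph; IC does not, since p(Y|X) alone being
# simple is no evidence of a non-generic relation between CPDs.
print(cond_mi(x, z, y))                          # ~0
# X and Z remain dependent marginally, as the graph requires.
print(cond_mi(x, z, np.zeros(n, dtype=int)))     # clearly > 0
```

The deterministic CPD makes Y an exact proxy for X, which is what manufactures the "extra" conditional independence that FF mistakes for non-genericity.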
Our goal is to understand the principles of Perception, Action and Learning in autonomous systems that successfully interact with complex environments, and to use this understanding to design future systems.