novel causal inference methods and their foundation
physics of causality and information flow
notions of complexity and their application in machine learning
statistical physics, in particular the link between causality and the second law of thermodynamics. I founded the group "causal inference" together with Bernhard Schölkopf.
I have been working on quantum information theory for many years and I'm still interested in it; my current causality research is strongly influenced by the paradigm that information is physical.
Dominik Janzing studied physics in Tübingen (Germany) and Cork (Ireland) and received a Ph.D. in mathematics from the University of Tübingen in 1998. From 1998 to 2006 he was a postdoc and senior scientist at the Computer Science department of the University of Karlsruhe (TH), where he worked on quantum thermodynamics, quantum control, as well as quantum complexity theory and its physical foundations. Since 2007 he has been working as a senior scientist at the Max Planck Institute for Biological Cybernetics in Tübingen, where he founded the group "causal inference" together with Bernhard Schölkopf.
The group develops novel methods for causal reasoning from statistical data. These approaches use the complexity of conditional probability distributions for causal reasoning. The idea is strongly influenced by his previous work on the complexity of physical processes and the thermodynamics of information flow.
While conventional approaches to causal inference are mainly based on conditional (in)dependences, recent methods also account for the shape of (conditional) distributions. The idea is that the causal hypothesis “X causes Y” imposes that the marginal distribution P(X) and the conditional distribution P(Y|X) represent independent mechanisms of nature. Recently it has been postulated that the shortest description of the joint distribution P(X,Y) should therefore be given by separate descriptions of P(X) and P(Y|X). Since description length in the sense of Kolmogorov complexity is uncomputable, practical implementations rely on other notions of independence. Here we define independence via orthogonality in information space. This way, we can explicitly describe the kind of dependence that occurs between P(Y) and P(X|Y), making the causal hypothesis “Y causes X” implausible. Remarkably, this asymmetry between cause and effect becomes particularly simple if X and Y are deterministically related. We present an inference method that works in this case. We also discuss some theoretical results for the non-deterministic case, although it is not clear how to employ them for a more general inference method.
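The deterministic case described above admits a simple slope-based inference rule: if Y = f(X) for a monotonic f and the input density is independent of the mechanism f, the average log-slope of the map X → Y tends to be negative in the true causal direction (and positive in the reverse one). The sketch below is a hedged, simplified illustration of that asymmetry, not the paper's exact method; the function names (`igci_slope_score`, `infer_direction`) and the normalization to [0, 1] are my own assumptions for the example.

```python
import numpy as np

def igci_slope_score(x, y):
    # Illustrative helper (not from the paper): finite-sample estimate of
    # the average log-slope E[log |f'(X)|] for a deterministic map y = f(x),
    # computed from consecutive sorted samples.
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    dx, dy = np.diff(xs), np.diff(ys)
    keep = (dx > 0) & (dy != 0)  # drop degenerate steps
    return np.mean(np.log(np.abs(dy[keep] / dx[keep])))

def infer_direction(x, y):
    # Rescale both variables to [0, 1] so the two scores are comparable.
    x = (x - x.min()) / (x.max() - x.min())
    y = (y - y.min()) / (y.max() - y.min())
    # The direction with the smaller score is taken as the causal one.
    return "X->Y" if igci_slope_score(x, y) < igci_slope_score(y, x) else "Y->X"

# Toy example: uniformly sampled x with the deterministic mechanism y = x^3.
# The uniform input density carries no information about the cubic mechanism,
# so the slope asymmetry should favor the true direction X -> Y.
rng = np.random.default_rng(0)
x = rng.uniform(0.1, 1.0, 1000)
y = x ** 3
print(infer_direction(x, y))
```

By Jensen's inequality, the mean log-slope in the causal direction is at most the log of the mean slope (which is zero after normalization), which is why the score is negative exactly when the slope is non-constant.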
Allahverdyan, A. E., Hovhannisyan, K. V., Janzing, D., Mahler, G.
Physical Review E, 84(4), 16 pages, October 2011
We study dynamic cooling, where an externally driven two-level system is cooled via a reservoir, a quantum system with an initial canonical equilibrium state. We obtain explicitly the minimal possible temperature Tmin>0 reachable for the two-level system. The minimization goes over all unitary dynamic processes operating on the system and reservoir and over the reservoir energy spectrum. The minimal work needed to reach Tmin grows as 1/Tmin. This work cost can be significantly reduced, though, if one is satisfied with temperatures slightly above Tmin. Our results on Tmin>0 prove the unattainability of the absolute zero temperature without the ambiguities that surround its derivation from the entropic version of the third law. We also study cooling via a reservoir consisting of N≫1 identical spins. Here we show that Tmin∝1/N and find the maximal cooling compatible with the minimal work determined by the free energy. Finally, we discuss cooling via a reservoir with an initially microcanonical state and show that although a purely microcanonical state can yield zero temperature, the unattainability is recovered when taking into account imperfections in preparing the microcanonical state.
Our goal is to understand the principles of Perception, Action and Learning in autonomous systems that successfully interact with complex environments, and to use this understanding to design future systems.