Judea Pearl has won the 2011 A. M. Turing Award. The citation reads: "For fundamental contributions to artificial intelligence through the development of a calculus for probabilistic and causal reasoning." The A. M. Turing Award, given annually since 1966 by the Association for Computing Machinery, is widely considered the "Nobel Prize of computing." Past recipients include Marvin Minsky, Donald Knuth, and Edsger Dijkstra.
Oftentimes, notions that seem so obvious and fundamental that we take them for granted turn out to be much more complex than we initially gave them credit for. It took philosophy and science entire millennia to come up with the first logically correct and useful definition of truth, due to Alfred Tarski. It took a few more decades for Judea Pearl to do the same for the concept of cause and effect. For those of you who don't know Pearl's work, go read his book Causality; I guarantee you it will be one of the most important pieces of prose you'll ever read. (If you would like a short and non-technical introduction, just read the book's closing part, Epilogue: The Art and Science of Cause and Effect.)
The gist of Pearl's idea for defining causality is that you cannot do this without first formalizing the notion of an intervention. An intervention is an act of assigning a specific value to a variable by means that are completely independent of that variable's "natural environment". In Pearl's notation, an intervention is denoted do(X = x); think of it as choosing a treatment in a randomized experiment, or as using the assignment operator in computer programming. Pearl defines the causal effect of a variable X on a variable Y as the probability distribution of Y induced by deliberately setting the value of X to x (i.e. "doing" the do(X = x)). Or, in other words, event A causes event B if and only if the smallest possible set of do() operations performed on the whole system that brings about the realization of A brings about the realization of B as well.
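The difference between observing X = x and "doing" do(X = x) can be made concrete with a small simulation. Here is a minimal sketch in Python (the toy model, its probabilities, and all variable names are illustrative, not taken from Pearl's book): a hidden confounder Z drives both X and Y, so X and Y are correlated even though X has no causal effect on Y at all. Conditioning on X = 1 and intervening with do(X = 1) give visibly different answers.

```python
import random

random.seed(0)

def sample(do_x=None):
    """One draw from a toy structural causal model.
    Z is a hidden confounder influencing both X and Y.
    Passing do_x forces X's value, severing the Z -> X link:
    this is exactly Pearl's do(X = x)."""
    z = 1 if random.random() < 0.5 else 0
    if do_x is None:
        x = 1 if random.random() < (0.8 if z else 0.2) else 0  # X listens to Z
    else:
        x = do_x  # do(X = x): assignment independent of Z
    # Y depends only on Z, so X has no causal effect on Y.
    y = 1 if random.random() < (0.9 if z else 0.1) else 0
    return x, y

N = 100_000

# Observational: condition on the event X = 1.
obs = [sample() for _ in range(N)]
p_obs = sum(y for x, y in obs if x == 1) / sum(x for x, _ in obs)

# Interventional: force X = 1 in every draw.
p_do = sum(y for _, y in (sample(do_x=1) for _ in range(N))) / N

print("P(Y=1 | X=1)     =", round(p_obs, 2))  # about 0.74: correlation via Z
print("P(Y=1 | do(X=1)) =", round(p_do, 2))   # about 0.50: no causal effect
```

Seeing X = 1 makes Z = 1 more likely (and hence Y = 1), but forcing X = 1 tells us nothing about Z, so the interventional distribution of Y stays at its base rate. That gap is precisely what the do() operator is designed to capture.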
This may all sound trivial, but it actually has some non-trivial implications for probabilistic reasoning, which Pearl works out in full detail. Another interesting feature of his formalization of causality is that it provides a natural framework for mathematically rigorous thinking about the old philosophical ideas of "possible worlds" and counterfactuals. For example, it makes counterfactuals non-tautological (that is, sometimes true and sometimes false, and thus interesting, as opposed to their treatment in vanilla propositional logic, where every counterfactual has a false antecedent and is therefore vacuously true and completely uninformative). Here is the truth condition of a counterfactual implied by Pearl's theory:
The proposition "If A were true then B would be true" is true if and only if in the possible world closest to ours in which A holds, B holds as well, where world V is closest to world W iff there does not exist any world Z such that the set of do() operations required to transform W into Z is a proper subset of the set of do() operations required to transform W into V.
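Pearl's theory also yields a concrete recipe for evaluating a counterfactual in a structural model: abduction (infer the exogenous background conditions from what was actually observed), action (perform the do() operations of the antecedent), and prediction (recompute the consequent). A minimal sketch, assuming a one-mechanism toy model Y = X XOR U that is mine, not Pearl's:

```python
def counterfactual_y(x_obs, y_obs, x_cf):
    """Evaluate "had X been x_cf, what would Y have been?" in the
    toy structural model Y = X XOR U, given that we actually
    observed X = x_obs and Y = y_obs.
    Step 1, abduction: infer the exogenous bit U from the observation.
    Step 2, action: replace X's mechanism with do(X = x_cf).
    Step 3, prediction: recompute Y under the modified model."""
    u = x_obs ^ y_obs   # abduction: U = X XOR Y
    x = x_cf            # action: do(X = x_cf)
    return x ^ u        # prediction

# We observed X = 1, Y = 1, which pins down U = 0. In the closest
# world where X = 0 (same U, one do() operation), Y = 0:
print(counterfactual_y(1, 1, 0))  # prints 0
```

The "closest world" of the truth condition above is exactly the one the procedure constructs: everything exogenous (U) is held at its abduced value, and only the single do() operation named in the antecedent is applied.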