
If We Cut Emissions, Will Global Warming Stop?

A clickbait title, admittedly. This post is actually about semantic information and belief revision. For details, see my article on the Tetherless World Blog:

http://tw.rpi.edu/weblog/2011/03/30/can-cutting-co2-emission-avoid-global-warming/

The gist: once we know p → q, our belief in ¬p ∧ ¬q is strengthened as well, because p → q carries positive mutual information about ¬p ∧ ¬q.

P = CO2 emissions

Q = global warming

==============

OK, you have been fooled by the title. This post will not talk about environmental policy, as I have neither the courage nor the knowledge to fight for either school in the global warming debate.

As part of my recent work on “semantic information theory”, I’m reading Compression Without a Common Prior: An Information-theoretic Justification for Ambiguity in Language, by Brendan Juba of Harvard and his co-authors. I had some nice conversations with Brendan on Universal Semantic Communication when he was at MIT. It’s nice to read another paper from him.

In his paper, Brendan uses an example:

For an English example, consider the example sentence “You may step forward when your number is called.” The implication is that you may not step forward before your number is called, for if that was not the intention, the sentence “You may step forward at any time” could have been used.

Logically, that raises the question: if we know p → q, is ¬p → ¬q also true?

We know this is not a valid inference (it is the fallacy of denying the antecedent). But why do people so often fall for fallacies of this kind?

I tried to come up with a reasonable explanation using semantic information theory (SIT). First introduced by Carnap and Bar-Hillel, SIT studies the meaning carried by messages. The less likely a sentence is to be true, the more surprising it is. So “Today is hot, and tomorrow is also hot” means more than “Today is hot”. On the other hand, if we say “Today is hot, or today is not hot”, we convey very little information.

In classical information theory, the entropy of a message is determined by the statistical probability of the symbols appearing in it. In SIT, the entropy of a statement is determined by its logical probability, i.e., the likelihood of observing a possible world (model) in which the statement is true. To see the difference, consider another example: the message “Rex is not a tyrannosaurus” (M1) is less “surprising” than “Rex is not a dog” (M2), not because the word “tyrannosaurus” is more common than “dog”, but because the individuals denoted by “tyrannosaurus” (now considered extinct) are less common than the individuals denoted by “dog”. Thus, M1 carries less semantic information than M2, even though it may carry more Shannon information based on the statistical distribution of English words.
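
To make logical probability concrete, here is a minimal Python sketch (my own illustration, not taken from either paper): a possible world is a truth assignment to the atomic propositions, m(s) is the fraction of worlds in which s is true (assuming, for simplicity, that all worlds are equally likely), and the Carnap–Bar-Hillel-style measures are cont(s) = 1 − m(s) and inf(s) = log2(1/m(s)).

```python
from itertools import product
from math import log2

def logical_probability(statement, atoms):
    """m(s): fraction of truth assignments to `atoms` that make `statement` true."""
    worlds = [dict(zip(atoms, values))
              for values in product([True, False], repeat=len(atoms))]
    return sum(statement(w) for w in worlds) / len(worlds)

def cont(statement, atoms):
    """Content measure: cont(s) = 1 - m(s)."""
    return 1 - logical_probability(statement, atoms)

def inf(statement, atoms):
    """Information measure: inf(s) = log2(1 / m(s))."""
    return log2(1 / logical_probability(statement, atoms))

atoms = ["hot_today", "hot_tomorrow"]
both_hot  = lambda w: w["hot_today"] and w["hot_tomorrow"]
today_hot = lambda w: w["hot_today"]
tautology = lambda w: w["hot_today"] or not w["hot_today"]

print(inf(both_hot, atoms))   # 2.0 bits -- the most informative of the three
print(inf(today_hot, atoms))  # 1.0 bit
print(inf(tautology, atoms))  # 0.0 bits -- "hot or not hot" says nothing
```

On the hot-day example above, the conjunction carries 2 bits, the single claim 1 bit, and the tautology 0 bits, matching the intuition that less likely means more surprising.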

Now back to (p → q) → (¬p → ¬q). We have the truth table:

p   q   p → q   ¬p → ¬q
T   T     T        T
T   F     F        T
F   T     T        F
F   F     T        T

Since we are ignorant of the likelihood of p and q, let's suppose all four situations in the truth table are equally likely. The logical probability of ¬p → ¬q is then

m(¬p → ¬q) = 3/4

Now we learn that p → q is true, so the second row of the table is ruled out. The conditional logical probability becomes

m(¬p → ¬q | p → q) = 2/3   [less surprising, less information]
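
These two numbers are easy to verify by brute-force enumeration of the four worlds. A small Python sketch (the helper names are mine, and it assumes, as above, that all worlds are equally likely):

```python
from itertools import product

worlds = list(product([True, False], repeat=2))   # all (p, q) combinations

def m(statement, given=None):
    # Logical probability: fraction of (equally likely) worlds satisfying
    # `statement`, optionally restricted to worlds where `given` holds.
    pool = [w for w in worlds if given(*w)] if given is not None else worlds
    return sum(statement(*w) for w in pool) / len(pool)

implies = lambda a, b: (not a) or b               # material implication

p_implies_q       = lambda p, q: implies(p, q)
notp_implies_notq = lambda p, q: implies(not p, not q)

print(m(notp_implies_notq))                       # 0.75      = 3/4
print(m(notp_implies_notq, given=p_implies_q))    # 0.666...  = 2/3
```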

Thus, on hearing “You may step forward when your number is called”, it is rational to revise one's belief in “You may not step forward before your number is called” downwards. The first sentence, while not a logically sufficient condition for the second, does carry some (here negative) semantic mutual information about it.

Wait, isn't that the reverse of what we wanted to justify?

Maybe the real implication of “You may step forward when your number is called” is “No number called, no stepping forward”, i.e., instead of causation (¬p → ¬q), we mean correlation (¬p ∧ ¬q). If that is the case, it is reasonable not to move before your number is called:

m(¬p ∧ ¬q) = 1/4

m(¬p ∧ ¬q | p → q) = 1/3   [belief increases!]
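
The same kind of enumeration (again under the equal-likelihood assumption, with helper names of my own) confirms the increase and shows the pointwise mutual information between p → q and ¬p ∧ ¬q coming out positive, which is the sense in which the summary at the top speaks of positive mutual information:

```python
from itertools import product
from math import log2

worlds = list(product([True, False], repeat=2))   # all (p, q) combinations
implies = lambda a, b: (not a) or b               # material implication

def m(statement, given=None):
    pool = [w for w in worlds if given(*w)] if given is not None else worlds
    return sum(statement(*w) for w in pool) / len(pool)

p_implies_q   = lambda p, q: implies(p, q)
notp_and_notq = lambda p, q: (not p) and (not q)

prior     = m(notp_and_notq)                      # 0.25      = 1/4
posterior = m(notp_and_notq, given=p_implies_q)   # 0.333...  = 1/3
print(prior, posterior)

# Pointwise mutual information of the two statements: positive, so
# learning p -> q raises our belief in ~p & ~q.
print(log2(posterior / prior))                    # ~0.415 bits
```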

Now return to the title. Assuming p is “CO2 emission” and q is “global warming”, and also assuming that the causation p → q holds, does ¬p ∧ ¬q, i.e., no CO2 emission occurring together with no global warming, make more sense? Based on the analysis above, it does. Logicians may disagree, but polar bears will certainly appreciate the argument.

References

[1] R. Carnap and Y. Bar-Hillel. An Outline of a Theory of Semantic Information. RLE Technical Report 247, Research Laboratory of Electronics, Massachusetts Institute of Technology, Cambridge, MA, October 1952.

[2] B. Juba, A. Kalai, S. Khanna, and M. Sudan. Compression Without a Common Prior: An Information-theoretic Justification for Ambiguity in Language. In 2nd Symposium on Innovations in Computer Science, Beijing, P.R. China, 2011.
