Thursday, June 03, 2004

Information Theory overview

Steve Den Beste of USS Clueless tries to work through an information overflow problem, and in so doing provides a nice overview of information theory.

Claude Shannon rigorously examined the basic question, "What is information?" in the late 1940s while working at Bell Labs. He developed what we now call "Information Theory," and there may be no single theoretical work that is more important and less well known. For many electrical engineers and computer programmers it's central and vital, but few laymen have ever heard of Shannon, and quite a few programmers don't know his name.


One of Shannon's fundamental insights was that transmission is not the same as information. He concentrated particularly on the fundamental properties of bit streams (his 1948 paper was the first to use the word "bit" in print for binary digits) and concluded that information is a function of surprise, or unpredictability. When someone receives a message encoded as a string of bits, if, based on the value of the bit stream up to a given point, the receiver has no better than a 50:50 chance of predicting the next bit, then that bit contains maximal information. At the other extreme, if the receiver can predict the next bit unfailingly, then that bit contains no information at all.
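
To make that concrete, here's a minimal sketch (mine, not from Den Beste's piece) of Shannon's standard binary entropy formula, H(p) = -p log2(p) - (1-p) log2(1-p), which measures the information in a bit that is 1 with probability p:

    import math

    def binary_entropy(p):
        """Shannon entropy, in bits, of a single bit that is 1 with probability p."""
        if p in (0.0, 1.0):
            return 0.0  # a perfectly predictable bit carries no information
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    print(binary_entropy(0.5))   # 1.0   -- a fair coin flip: maximal information
    print(binary_entropy(0.99))  # ~0.08 -- mostly predictable: almost no information

A 50:50 bit yields exactly one bit of information; as the bit becomes predictable, the entropy falls toward zero, just as the quoted passage describes.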


Read the whole thing. As usual, Den Beste is thorough and articulate.
