Complexity of the universe



Shannon complexity explained



Claude Shannon came up with a measure of complexity which worked for signals. He brought this to the world in A Mathematical Theory of Communication (1948). His idea was that the information contained in a signal is inversely related to the signal's probability.


The idea that the information of a signal is inversely related to its probability was actually posited first by Nyquist. Shannon expanded on this to account for noise introduced into the transmitted signal. Here we are most interested in the concept that the information intrinsic to a signal is inversely related to its probability.

According to this metric, a signal which has a small chance of being sent contains a lot of information, while a signal which has a large chance of being sent contains only a small amount. At the extreme, a signal with a 100% probability of being sent contains no information at all.

Shannon was actually interested in reducing noise, but came up with this metric along the way.

His mathematical formulation: the amount of information equals log(1/p), where p is the probability of the signal being sent.
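This formula can be sketched as a small Python function (the name self_information is mine, not from the text):

```python
import math

def self_information(p: float, base: float = 2) -> float:
    """Information, in units of the given log base, of a signal with probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    # log(1/p): rare signals yield large values, certain signals yield zero.
    return math.log(1 / p, base)

print(self_information(0.5))  # 1.0 bit
print(self_information(1.0))  # 0.0 -- a certain signal carries no information
```

With base 2 the result is measured in bits, the unit used in the examples below.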

For a signal consisting of a single binary bit with a 50 percent chance of being on or off, the probability is 1/2. Using a logarithm of base 2, the information contained in it is log2(2) = 1: one bit.

For a signal with one of eight equiprobable possibilities, receiving one of them provides log2(8) = 3 bits of information. Thus for a random sequence of n bits, the information contained in the string equals n, the number of bits in the string.
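Both cases can be checked numerically; n = 16 below is an arbitrary illustrative length:

```python
import math

# Eight equiprobable signals: each has probability 1/8.
p = 1 / 8
info = math.log2(1 / p)
print(info)  # 3.0 bits

# A random n-bit string: 2**n equally likely strings, each with
# probability 2**-n, so receiving one conveys n bits.
n = 16
info_string = math.log2(2 ** n)
print(info_string)  # 16.0 bits
```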

A fundamental limitation of Shannon's approach is that it is observer dependent: it depends on the observer's expectation of the signal.
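The observer dependence can be made concrete: the same symbol carries different amounts of information under different observers' probability models. The probabilities 0.5 and 0.01 below are illustrative values, not from the text:

```python
import math

# Observer A expects the symbol half the time; observer B considers it
# very unlikely. The received symbol is identical, the information is not.
info_for_A = math.log2(1 / 0.5)   # 1 bit
info_for_B = math.log2(1 / 0.01)  # about 6.64 bits
print(info_for_A)
print(info_for_B)
```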









