Overview

Shannon’s theory deals with the average information of a source. Algorithmic Information Theory (AIT) deals with the information content of individual objects (strings, numbers), measured by Kolmogorov complexity: the length of the shortest program that outputs the object.
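
A minimal contrast in Python, using only the standard library; nothing here computes Kolmogorov complexity (nothing can), the comments just mark the distinction:

```python
import math

# Shannon entropy is a property of the SOURCE: a fair coin carries
# 1 bit per flip on average, regardless of which outcomes occur.
H = -sum(p * math.log2(p) for p in (0.5, 0.5))
print(f"fair coin: {H} bit/symbol")

# AIT asks about the individual outcome instead: "0101...01" and a
# patternless flip sequence get the same Shannon accounting (both are
# typical fair-coin outputs) but have very different shortest
# descriptions, and that difference is what K(x) measures.
```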

Core Idea

Randomness is Incompressibility. A string is random if it has no patterns. If it has no patterns, it cannot be compressed. Therefore, a random string is one whose shortest description is the string itself.
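
The pigeonhole argument behind this fits in a few lines. A sketch in Python, counting descriptions rather than running any programs:

```python
# There are 2**n binary strings of length n, but only
# 2**0 + 2**1 + ... + 2**(n-k-1) = 2**(n-k) - 1 descriptions
# (programs) shorter than n - k bits. So fewer than a 2**-k fraction
# of n-bit strings can be compressed by k or more bits.
def fraction_compressible(n: int, k: int) -> float:
    descriptions = 2 ** (n - k) - 1   # programs of length < n - k bits
    return descriptions / 2 ** n

for k in (1, 8, 16):
    bound = fraction_compressible(64, k)
    print(f"64-bit strings compressible by {k:>2} bits: < {bound:.6f}")
```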

Formal Definition

Kolmogorov complexity: $K(x)$ is the length of the shortest program that outputs $x$ on a fixed universal Turing machine. Chaitin’s Incompleteness Theorem: for any consistent, sufficiently strong formal system there is a constant $L$ such that the system cannot prove “$K(x) > L$” for any string $x$, even though this is true for all but finitely many strings. In particular, the system can prove only finitely many strings to be random.
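
In symbols, using one standard formulation (the bound $L_F$ depends on the system’s own description length):

```latex
% Plain Kolmogorov complexity of a string x, relative to a fixed
% universal Turing machine U, where \ell(p) is the length of program p:
K_U(x) = \min \{\, \ell(p) : U(p) = x \,\}

% Chaitin's theorem, schematically: for each consistent, sufficiently
% strong formal system F there is a constant L_F such that F proves
% no statement of the form
K(x) > L_F
% even though all but finitely many strings x satisfy it.
```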

Intuition

  • Berry Paradox: “the smallest positive integer not definable in fewer than twelve words.” (That phrase itself has only eleven words, so it defines something it says cannot be so defined.) Chaitin’s proof turns exactly this paradox into mathematics.
  • The Library of Babel: contains every possible book. Most are gibberish (high entropy/complexity); a few are Shakespeare, whose structure makes them compressible and hence of lower complexity. The compressor sketch after this list makes that concrete.
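
A computable stand-in makes the Library of Babel point concrete: the length of a compressed encoding is an upper bound on (a version of) Kolmogorov complexity. A sketch using Python’s zlib; the repeated quotation is just a stand-in for structured prose:

```python
import os, zlib

meaningful = b"To be, or not to be, that is the question. " * 25
gibberish = os.urandom(len(meaningful))

for name, data in (("structured text", meaningful),
                   ("random bytes", gibberish)):
    ratio = len(zlib.compress(data)) / len(data)
    print(f"{name:>15}: compressed to {ratio:.0%} of original size")
# The structured text shrinks dramatically; the random bytes do not
# compress at all (typically slightly over 100%, from header overhead).
```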

Examples

  • Solomonoff Induction: a perfect (but uncomputable) way to predict the future. It weighs every program $p$ that reproduces the past data by its length, assigning it prior probability $2^{-\ell(p)}$, so shorter programs, i.e. simpler theories, are more likely (Occam’s Razor). A toy sketch follows.
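
A toy sketch of the weighting scheme, with three hand-written “programs” whose bit lengths are invented stand-ins for real program sizes:

```python
def always_zero(history): return "0"
def alternating(history): return "01"[len(history) % 2]
def copy_last(history):   return history[-1] if history else "0"

HYPOTHESES = [("always 0", always_zero, 5),     # (name, predictor,
              ("alternating", alternating, 8),  #  assumed description
              ("copy last", copy_last, 10)]     #  length in bits)

def posterior(data: str) -> dict:
    """Prior 2**-length per hypothesis; zero out any hypothesis that
    ever mispredicted the observed data; renormalize the survivors."""
    w = {}
    for name, predict, bits in HYPOTHESES:
        ok = all(predict(data[:i]) == data[i] for i in range(len(data)))
        w[name] = 2.0 ** -bits if ok else 0.0
    total = sum(w.values())
    return {n: v / total for n, v in w.items()} if total else w

print(posterior("010101"))  # only "alternating" survives
print(posterior("000000"))  # "always 0" beats "copy last": it is shorter
```

Real Solomonoff induction sums over all programs for a universal machine; this toy keeps only the two essential ingredients, consistency with the data and the $2^{-\ell(p)}$ prior.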

Common Misconceptions

  • “It’s useful for engineering.” (It is mostly pure math/philosophy: the core measures are uncomputable, so practice can only use computable stand-ins such as real compressors.)
  • “Surely $K(x)$ is computable in principle, by trying all short programs.” (It is not: the Halting Problem, the impossibility of a program that decides whether another program runs forever, blocks exactly that search. See the sketch after this list.)
  • “This is just Gödel’s Incompleteness Theorem again.” (Related but distinct: Gödel shows that any sufficiently strong consistent system leaves true statements unprovable; Chaitin exhibits a concrete family of them, the statements “$K(x) > L$”.)
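
Why uncomputability is forced, as a deliberately non-functional sketch: `K` below is a stub for an oracle that cannot exist, and the names are hypothetical, used only to mechanize the Berry paradox:

```python
from itertools import count, product

def K(s: str) -> int:
    """HYPOTHETICAL oracle for Kolmogorov complexity. The point of
    this sketch is that no such computable function can exist."""
    raise NotImplementedError("K is uncomputable")

def first_string_with_K_above(threshold: int) -> str:
    """If K were computable, this exhaustive search would return the
    first binary string x with K(x) > threshold. One must exist:
    only finitely many strings have complexity <= threshold."""
    for n in count(1):
        for bits in product("01", repeat=n):
            x = "".join(bits)
            if K(x) > threshold:
                return x

# The Berry paradox, mechanized: this program is a few hundred bytes,
# plus O(log threshold) bits to write down the threshold. For a huge
# threshold it would therefore be a SHORT description of a string
# chosen precisely because it has NO short description. Contradiction,
# so K is not computable. (The halting problem blocks the obvious
# workaround of simulating all short programs: you can never be sure
# a still-running candidate won't eventually print x.)
```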

Applications

  • Philosophy of Science: a formal justification of Occam’s Razor, i.e. why simpler theories deserve higher prior credence.
  • AI: theoretical foundations of Artificial General Intelligence, notably Hutter’s AIXI, which couples Solomonoff induction with sequential decision theory (and inherits its uncomputability).

Criticism / Limitations

It relies on the choice of the Universal Turing Machine (in effect, the programming language). By the invariance theorem, switching machines changes the complexity by at most an additive constant, not a constant factor; that constant depends on the pair of machines and can be large, so the theory says little about short strings.
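
The statement usually cited here is the invariance theorem:

```latex
% For any two universal machines U and V there is a constant
% c_{U,V}, independent of x, such that for all strings x:
|K_U(x) - K_V(x)| \le c_{U,V}
% Complexity is language-dependent only up to an additive constant,
% which is small comfort for short strings, where that constant
% can dominate.
```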

Further Reading

  • Chaitin, Gregory. Meta Math! The Quest for Omega.
  • Calude, Cristian. Information and Randomness: An Algorithmic Perspective.