Structured Programming
Lesson 2: Programming Fundamentals Prerequisites
Objective: Verify that you have the right background for this course.

Programming Fundamentals Prerequisites

There really aren't any prerequisites for this course other than a general familiarity with computers and an interest in learning how to write computer programs.
No previous programming experience is required. If your background includes downloading and installing software from the Web, or creating Web pages with HTML and a scripting language, some of the tasks in this course will come a bit more easily.
In the next lesson, you will learn what you need to take this course.

Concept and History of Algorithms

The concept of an algorithm has existed for centuries; however, a partial formalization of what would become the modern algorithm began with attempts to solve the Entscheidungsproblem (the "decision problem") posed by David Hilbert in 1928. Subsequent formalizations were framed as attempts to define "effective calculability" or the "effective method".
Alan Turing's Turing machines of 1936–37 and 1939 gave a formal definition of the algorithm that corresponds to this intuitive notion.

Kolmogorov Complexity

Kolmogorov Complexity plays a central role in Algorithmic Information Theory, serving as a quantitative measure of the complexity of finite strings or objects. Introduced by the Russian mathematician Andrey Kolmogorov, this metric quantifies the amount of computational resources needed to specify or reproduce a given string and, by extension, any digital object. Formally, the Kolmogorov Complexity \( K(x) \) of a string \( x \) is defined as the length of the shortest binary program \( p \) that, when run on a Universal Turing Machine (UTM), produces \( x \) as output and then halts.
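Stated symbolically, for a fixed universal Turing machine \( U \) (a standard formulation of the definition above):
\[
K_U(x) = \min \{\, |p| \;:\; U(p) = x \text{ and } U \text{ halts on } p \,\}
\]
where \( |p| \) denotes the length of the binary program \( p \).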

Relevance to Algorithmic Information Theory

Algorithmic Information Theory investigates the intrinsic complexity of data, functions, or any other form of information based on their shortest effective descriptions. Kolmogorov Complexity is instrumental in this study, offering an objective standard by which the complexity of individual strings can be evaluated. This makes it applicable to a wide range of questions regarding the compressibility, randomness, and structure inherent in data sets.

Uncomputability and Invariance

One of the pivotal results in Algorithmic Information Theory is that Kolmogorov Complexity is uncomputable: there exists no algorithm that can calculate the exact Kolmogorov Complexity of an arbitrary string. However, this theoretical limitation does not diminish its conceptual utility. In fact, the notion of "invariance" lends it a robust quality: although the absolute value of Kolmogorov Complexity depends on the choice of UTM, the values assigned to any given string by two different UTMs differ by at most an additive constant that depends only on the two machines, not on the string. This invariance theorem provides a stable, machine-independent framework for complexity analysis.
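The invariance theorem can be stated precisely as follows (the standard formulation, where \( U \) and \( V \) range over universal Turing machines):
\[
\forall\, U, V \;\; \exists\, c_{U,V} \;\; \forall\, x : \quad \bigl| K_U(x) - K_V(x) \bigr| \le c_{U,V}
\]
The constant \( c_{U,V} \) is essentially the length of a program by which one machine simulates the other, which is why it does not depend on \( x \).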

Applications in Complexity Analysis

Kolmogorov Complexity finds applications in many areas of computer science. For example, in data compression, the complexity gives a lower bound on the compressibility of a string, signifying that no lossless algorithm can compress the data below this bound. In machine learning and pattern recognition, it is used to gauge the complexity of models. It is also employed to distinguish between truly random sequences and those that only appear random but have an underlying deterministic structure.
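Because \( K(x) \) itself is uncomputable, practical work often substitutes a computable upper bound: the output length of a real compressor. The following Python sketch illustrates the idea (the specific test strings are illustrative assumptions, and zlib is only a crude stand-in for the shortest program); a highly structured string compresses far below its length, while random bytes barely compress at all:

import os
import zlib

def compression_upper_bound(data: bytes) -> int:
    """Length of the zlib-compressed form of data.

    A computable *upper-bound* proxy for Kolmogorov complexity:
    K(data) <= len(compressed) + O(1), where the O(1) term covers
    the fixed decompressor.
    """
    return len(zlib.compress(data, 9))

structured = b"ab" * 5000          # highly regular: a short program prints it
random_bytes = os.urandom(10000)   # OS entropy: near-incompressible

for label, data in [("structured", structured), ("random", random_bytes)]:
    print(f"{label:>10}: {len(data)} bytes -> "
          f"{compression_upper_bound(data)} bytes compressed")

On a typical run the structured string shrinks by orders of magnitude, while the random bytes may even grow slightly, mirroring the lower-bound claim above.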

Randomness Testing and Other Uses

In the realm of cryptography and randomness testing, Kolmogorov Complexity offers insights into the quality of pseudorandom number generators. A sequence is considered algorithmically random if its Kolmogorov Complexity is approximately equal to its length, implying that the sequence cannot be generated by any program substantially shorter than the sequence itself.
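This criterion is often formalized with an explicit slack term: for a constant \( c \ge 0 \), a string \( x \) is called \( c \)-incompressible (and hence algorithmically random) if
\[
K(x) \ge |x| - c,
\]
that is, no program shorter than the string itself, minus a small constant, can produce it.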
In summary, Kolmogorov Complexity serves as the cornerstone metric for understanding and quantifying information content in the field of Algorithmic Information Theory. Its applications are manifold, spanning areas from data compression and cryptography to machine learning and complexity theory. While it is theoretically uncomputable, its conceptual utility and machine-independent nature make it an indispensable tool for evaluating the intrinsic complexity of data and algorithms.
In algorithmic information theory, the Kolmogorov complexity of an object, such as a piece of text, is the length of the shortest computer program (in a fixed programming language) that produces the object as output. It is a measure of the computational resources needed to specify the object, and is also known as:
  1. descriptive complexity,
  2. Kolmogorov-Chaitin complexity,
  3. algorithmic entropy, or
  4. program-size complexity.
It is named after Andrey Kolmogorov, who first published on the subject in 1963.
The notion of Kolmogorov complexity can be used to state and prove impossibility results akin to
  1. Cantor's diagonal argument,
  2. Gödel's incompleteness theorem, and
  3. Turing's halting problem.
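A standard sketch makes this connection concrete: suppose, for contradiction, that \( K \) were computable. Then for each \( n \) one could write a program that searches through all strings in length order and outputs the first string \( x_n \) with \( K(x_n) > n \). That program has length \( m + O(\log n) \) for some fixed \( m \) (the \( O(\log n) \) bits encode \( n \)), so it describes \( x_n \) in
\[
m + O(\log n) < n
\]
bits for sufficiently large \( n \), contradicting \( K(x_n) > n \). This Berry-paradox-style argument parallels the diagonalization behind all three results listed above.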