Adam Case, PhD Final Oral Defense

Speaker: Adam Case

Faculty Advisor: Jack Lutz

Status: PhD

Date: Tuesday, June 28, 2016

Time: 11:00 am to 1:30 pm

Location: 213 Atanasoff Hall

Title: Mutual Dimension, Data Processing Inequalities, and Randomness

Abstract: This dissertation makes progress in the area of constructive dimension, an effectivization of classical Hausdorff dimension, by introducing a new framework for reasoning about the mutual algorithmic information between two infinite objects (e.g., points, sequences). Several results on mutual information, data processing inequalities, and algorithmic randomness are obtained.

We begin by analyzing the mutual algorithmic information between two points in Euclidean space at a given precision. We describe two plausible definitions of this quantity and show that they are closely related; to do so, we prove and apply a generalization of Levin's coding theorem.
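The abstract keeps the notation informal; as a hedged sketch (assuming the precision-r Kolmogorov complexity K_r of Lutz and Mayordomo, with B_{2^{-r}}(x) the open ball of radius 2^{-r} about x), the two candidate definitions can be written as

    K_r(x) = \min\{ K(q) : q \in \mathbb{Q}^n \cap B_{2^{-r}}(x) \},

    I_r(x : y) = \min\{ I(p : q) : p \in \mathbb{Q}^m \cap B_{2^{-r}}(x),\ q \in \mathbb{Q}^n \cap B_{2^{-r}}(y) \},

    J_r(x : y) = K_r(x) + K_r(y) - K_r(x, y),

where I(p : q) = K(p) + K(q) - K(p, q) is the algorithmic mutual information between rationals. "Closely related" then means that I_r and J_r agree up to terms of lower order than r; the exact error bound is established in the dissertation.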

The constructive dimension of a point in Euclidean space is its density of algorithmic information. We develop a more general notion of constructive dimension called mutual dimension, which is the density of algorithmic information shared by two points, and show that it has all of the properties that one would expect a measure of mutual information to have. Perhaps the most important of these properties is the data processing inequality, which says that the quantity of shared information between two objects cannot be significantly increased when one of the objects is processed by a particular kind of transformation. We show that if the transformation f is computable and Lipschitz, then, for all points x and y, the mutual dimension between f(x) and y is no greater than the mutual dimension between x and y. We prove the existence of other data processing inequalities based on various forms of continuity.
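In symbols (the notation dim and mdim is assumed here, following the papers underlying this work), the point-level definitions and the Lipschitz data processing inequality described above read

    \dim(x) = \liminf_{r \to \infty} \frac{K_r(x)}{r}, \qquad \mathrm{mdim}(x : y) = \liminf_{r \to \infty} \frac{I_r(x : y)}{r},

    f \text{ computable and Lipschitz} \implies \mathrm{mdim}(f(x) : y) \le \mathrm{mdim}(x : y) \quad \text{for all } x, y.

Roughly, the Lipschitz condition is what keeps f from packing information into a smaller region: if |f(x) - f(x')| \le c\,|x - x'|, then a precision-r approximation of x yields an approximation of f(x) at precision r - \log c, so the density of information per precision level cannot increase by more than a vanishing amount.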

Constructive dimension can also be used to measure the density of algorithmic information of an individual sequence. Likewise, one may use mutual dimension to measure the density of mutual algorithmic information between two sequences. We show that the mutual dimension between two sequences is equal to the mutual dimension between the sequences' real representations and develop several data processing inequalities, where the transformations used are bounded Turing functionals.
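A plausible sequence-level formalization (assumed here, for binary sequences S and T, with S \upharpoonright n the length-n prefix and x_S \in [0, 1] the real number whose binary expansion is S) is

    \mathrm{mdim}(S : T) = \liminf_{n \to \infty} \frac{I(S \upharpoonright n : T \upharpoonright n)}{n},

and the correspondence result stated above then takes the form \mathrm{mdim}(S : T) = \mathrm{mdim}(x_S : x_T).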

Finally, we explore the nature of coupled randomness, i.e., the algorithmic randomness of two sequences generated by independent tosses of coins whose biases may or may not be correlated. We show that the elements of a particular subset of coupled random sequences can be characterized using classical Shannon mutual information. We also prove that every pair of independently random sequences has zero mutual dimension and that it is possible to construct two sequences with zero mutual dimension that are not independently random. We conclude with some initial observations regarding Billingsley mutual dimension and an investigation of a mutual divergence formula.
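Symbolically, the independence results read (with "independently random" as defined in the dissertation; the precise notion of independence is not fixed by this abstract):

    (R_1, R_2) \text{ independently random} \implies \mathrm{mdim}(R_1 : R_2) = 0,

while the converse fails: there exist sequences S and T with \mathrm{mdim}(S : T) = 0 that are not independently random.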