The investigator recently developed effective fractal dimensions to measure the density of information in large data objects. In this project, he and his group are using these dimensions to extend the theoretical foundations of high-precision scientific computing, to study prediction and compression of data streams that are truly massive relative to available computational resources, and to attack fundamental questions concerning the number of computation steps required to solve complex problems. The research on foundations of scientific computing incorporates multiresolution processing of data from continuous geometric domains to develop an algorithmic extension of geometric measure theory in Euclidean spaces. Investigations of prediction and compression focus on computation by finite-state devices and extend methods of ergodic number theory. Computational complexity research topics include derandomization, diagonalization, and dimension characterizations of time-bounded Kolmogorov complexity. Overall, the project is developing new analytic methods for use in theoretical computer science, working toward a greater unity between computational complexity and information theory, and training young researchers to cross traditional boundaries in the conduct of rigorous research.
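As background for the Kolmogorov-complexity remark above, one well-known characterization (due to Mayordomo) relates the constructive dimension of an infinite binary sequence to the asymptotic Kolmogorov complexity of its prefixes; the time-bounded variants studied in such work replace the complexity measure with a resource-bounded one. In the sketch below, $S \upharpoonright n$ denotes the first $n$ bits of $S$ and $K$ denotes prefix-free Kolmogorov complexity; this is illustrative context rather than a result of the project.

```latex
% Constructive (effective Hausdorff) dimension of a sequence S,
% characterized via the Kolmogorov complexity of its prefixes:
\dim(S) \;=\; \liminf_{n \to \infty} \frac{K(S \upharpoonright n)}{n}
```

Intuitively, the dimension measures the asymptotic density of algorithmic information in the sequence: a sequence whose prefixes are maximally incompressible has dimension 1, while a highly compressible sequence has dimension near 0.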