**Title:** Weak Derandomizations with Applications in Complexity Theory

**Abstract:** A randomized algorithm, unlike a traditional algorithm, can make decisions by tossing coins, and is only required to be correct with reasonable probability. Many practical problems have fast randomized algorithms and no comparable deterministic algorithms. For example, the fastest way to generate large primes is to randomly select a large number and test whether it is prime. Randomized complexity classes also occupy an important position in theoretical computer science. Several randomized classes lie between P and NP, so a natural question is how powerful randomized algorithms are. One approach to bounding their power is to "derandomize" them, that is, to simulate them without using randomness.
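The prime-generation example above can be sketched concretely. The following is a minimal illustration (not from the thesis) of the "pick a random number and test it" approach, using the standard Miller-Rabin primality test; all function names here are our own.

```python
import random

def is_probable_prime(n, rounds=40):
    """Miller-Rabin test: always accepts a prime; accepts a composite
    with probability at most 4**(-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)  # random base: the coin tosses
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness that n is composite
    return True

def random_prime(bits):
    """Sample random odd candidates of the given size until one passes."""
    while True:
        # Force the top bit (so the number has exactly `bits` bits) and oddness.
        n = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(n):
            return n
```

By the prime number theorem, roughly one in every `bits * ln 2 / 2` odd candidates of this size is prime, so the loop terminates quickly in expectation; no comparably fast deterministic generator is known.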

We achieve a weak derandomization result for the multipass model of randomness. The multipass model allows the machine to review its past random choices a limited number of times. We show that it is possible to derandomize this model given the added power of advice. We also show how to estimate the number of solutions to an NP problem with a constant number of influential bits. When there are multiple correct outputs, as with prime number generation, a randomized algorithm is allowed to output any of them. Influential bits are the bits of the random string that determine which output is produced. A constant number of influential bits means that the algorithm chooses from only a very small set of outputs. Finally, we show that reducing our algorithm to 0 influential bits would lead to a circuit lower bound for BPP^NP_tt.

**Committee:** Pavan Aduri (major professor), Giora Slutzki, Jack Lutz, Soma Chaudhuri, Chinmay Hegde