It has been theorized that mutation rates in cancer cells are much greater than those in normal cells. Because a higher mutation rate supplies more adaptive variation but also imposes a greater load of deleterious mutations, it is likely that an optimal mutation rate for cellular evolution exists. We hypothesized that moderately elevated mutation rates in cells growing in a competitive environment facilitate adaptation and allow those cells to increase their fitness, a measure of offspring production. To test this, various strains of Escherichia coli, a Gram-negative bacterium and model organism, were used. Most strains harbored a mutated DNA polymerase I (Pol I), an enzyme involved in chromosomal DNA replication, conferring different replication fidelities. As a control, two wild-type strains were competed against each other in replicate cultures; each strain won at about the same frequency. Each mutant was then competed at an equal initial ratio against a wild-type strain, again in replicate cultures, in an environment with limited resources. In most cases, a moderate mutator strain with a mutation rate 10- to 47-fold higher than wild-type completely overtook the culture. Because the mutators won most, but not all, of the competitions, these results suggest that their advantage lies in a greater probability of acquiring advantageous mutations rather than in an intrinsic initial growth advantage. As we learn more about mutation rate as a determinant of the ability to evolve, we hope one day to find a way to force mutators to lose, thereby decreasing their evolutionary fitness. If we can model this in bacteria, the next step would be to apply these methods in tissue culture and eventually in humans. In the future, this could be a promising route for drug development that targets mutator pathways and delays the progression of cancer.
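A minimal simulation sketch of this interpretation (not the experimental method) is shown below: in a toy Wright-Fisher-style model where the only difference between the strains is the supply rate of beneficial mutations, the moderate mutator is expected to win most, but not all, replicate competitions. All parameter values (population size, generations, mutation rates, fitness benefit) are hypothetical and chosen only for illustration; they are not taken from the study.

```python
import random

# Toy Wright-Fisher-style competition between a moderate mutator ("A") and a
# wild-type strain ("B"). Illustrative sketch only: every parameter below is a
# hypothetical value chosen for the demonstration, not data from the study.

POP_SIZE = 1_000        # cells per generation
GENERATIONS = 300       # length of one competition
MU_WT = 5e-4            # beneficial mutations per cell per generation (wild type)
MU_MUT = 20 * MU_WT     # ~20-fold higher beneficial-mutation supply (mutator)
BENEFIT = 0.1           # multiplicative fitness gain per beneficial mutation
REPLICATES = 50

def compete(mu_a, mu_b):
    """Return True if strain A wins (fixes, or holds the majority at the end)."""
    # Each strain is tracked as {number of beneficial mutations: cell count}.
    a = {0: POP_SIZE // 2}
    b = {0: POP_SIZE - POP_SIZE // 2}
    for _ in range(GENERATIONS):
        labels, weights = [], []
        for strain, pops, mu in (("A", a, mu_a), ("B", b, mu_b)):
            for k, n in pops.items():
                labels.append((strain, k, mu))
                weights.append(n * (1 + BENEFIT) ** k)  # expected offspring weight
        # Resample a constant-size next generation in proportion to fitness.
        offspring = random.choices(labels, weights=weights, k=POP_SIZE)
        a, b = {}, {}
        for strain, k, mu in offspring:
            if random.random() < mu:  # offspring may gain one new beneficial mutation
                k += 1
            target = a if strain == "A" else b
            target[k] = target.get(k, 0) + 1
        if not b:
            return True   # wild type extinct: mutator fixed
        if not a:
            return False  # mutator extinct: wild type fixed
    return sum(a.values()) > sum(b.values())  # no fixation: majority wins

wins = sum(compete(MU_MUT, MU_WT) for _ in range(REPLICATES))
print(f"Mutator won {wins}/{REPLICATES} replicate competitions")
```

In this toy model, giving one strain an intrinsic growth advantage instead of a higher beneficial-mutation supply would make its wins nearly deterministic, which is the contrast the win-frequency argument draws on.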