Analytic number theorists usually seek to show that sequences which appear
naturally in arithmetic are "well-distributed" in some appropriate sense.
In various discrepancy problems, combinatorialists have analyzed limitations
to equidistribution, as have Fourier analysts in their work on the
"uncertainty principle". In this article we find that these ideas have a natural
setting in the analysis of distributions of sequences in analytic number theory:
we formulate a general principle and give several examples.