Motivated by a desire to study the performance limits of Delay-Tolerant Networks, we develop approximations for the occurrence times of (hopefully) rare events, using the Poisson clumping methods described in D. Aldous's 1989 book, Probability Approximations via the Poisson Clumping Heuristic. In a Delay-Tolerant Network, mobile nodes collect data continuously as they move around and offload it when they reach an access point. A node that fails to reach an access point in time can exhaust its limited onboard memory, risking data loss.
We demonstrate how to apply basic results from Brownian motion theory and the Poisson Clumping Heuristic to determine how many access points suffice to keep data loss small. In particular, we derive a function that bounds the expected amount of time a mobile node spends without coverage, and functions that relate access point density to expected data loss and to the probability of buffer overflow. We also present simulation results that demonstrate the strengths and weaknesses of our approximations.
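As a toy illustration of the density-to-overflow relationship described above (not the paper's actual model or derivations), the sketch below Monte-Carlo-estimates the probability that a node's buffer overflows before it reaches coverage. The 1-D random-walk mobility model, the access-point spacing, and the buffer capacity are all illustrative assumptions introduced here:

```python
import random


def overflow_probability(ap_spacing, capacity, n_trials=2000, seed=0):
    """Estimate the probability that a mobile node overflows its buffer
    before reaching an access point, in a toy 1-D model (an assumption
    for illustration, not the paper's setup).

    Access points sit at positions 0 and ap_spacing; the node starts
    midway between them, takes unit random-walk steps, and collects one
    unit of data per step.  An overflow occurs if the buffer reaches
    `capacity` before the node hits either access point.
    """
    rng = random.Random(seed)
    overflows = 0
    for _ in range(n_trials):
        pos = ap_spacing / 2.0  # start midway between two access points
        buffered = 0            # units of data collected so far
        while 0 < pos < ap_spacing:
            pos += rng.choice((-1, 1))  # unit random-walk step
            buffered += 1
            if buffered >= capacity:
                overflows += 1
                break
    return overflows / n_trials
```

In this toy model, halving the access-point spacing quarters the node's expected time to coverage (the random walk's expected hitting time from the midpoint of an interval of length d is d^2/4 steps), so the estimated overflow probability should drop sharply as access points are packed more densely, which is the qualitative trade-off the derived functions quantify.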