The number e is as ubiquitous as pi, turning up everywhere, and especially in probability theory. If you draw numbers uniformly at random from the interval (0, 1), stopping once their sum exceeds 1, the expected number of draws is e.
(Noted in Calculus Made Easy, p. 153.)
Of course, this begs for a Python simulation:
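Here is a minimal sketch of such a simulation. It repeats the draw-until-the-sum-exceeds-1 experiment many times and reports the sample mean, which should land near e ≈ 2.71828; it also reports the sample variance, since that question comes up below. Function names and the trial count are illustrative choices, not anything prescribed by the source.

```python
import random

def draws_until_sum_exceeds_one():
    """Count uniform(0,1) draws needed for the running sum to exceed 1."""
    total, count = 0.0, 0
    while total <= 1.0:
        total += random.random()
        count += 1
    return count

def estimate(trials=1_000_000):
    """Return the sample mean and sample variance of the draw count."""
    counts = [draws_until_sum_exceeds_one() for _ in range(trials)]
    mean = sum(counts) / trials
    var = sum((c - mean) ** 2 for c in counts) / trials
    return mean, var

if __name__ == "__main__":
    mean, var = estimate()
    print(f"estimated mean:     {mean:.4f}  (e is about 2.7183)")
    print(f"estimated variance: {var:.4f}")
```

With a million trials the mean typically agrees with e to two or three decimal places.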
A derivation would be nice, but that'll have to wait. I wonder about the variance.