
We know that when $n$ is large and $p$ is small, the chance of $k$ successes in $n$ i.i.d. Bernoulli $(p)$ trials is roughly

$$P(k) ~ \approx ~ e^{-\mu} \frac{\mu^k}{k!}, ~~ k = 0, 1, 2, \ldots, n$$

where $\mu = np$.

The terms in the approximation are proportional to terms in the series expansion of $e^\mu$, but that expansion is infinite. It doesn’t stop at $n$, so we won’t either.
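As a quick numerical illustration of the approximation (the particular values of $n$ and $p$ below are just an example), you can compare binomial and Poisson probabilities using `scipy`:

```python
from scipy import stats

n, p = 1000, 0.004      # large n, small p
mu = n * p              # mu = 4

# Compare the exact binomial chance of k successes
# with its Poisson approximation, term by term.
for k in range(7):
    print(k,
          round(stats.binom.pmf(k, n, p), 5),
          round(stats.poisson.pmf(k, mu), 5))
```

The two columns agree closely, as the approximation predicts.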

A little care is required before we go further. First, we must state the additivity axiom of probability theory in terms of countably many outcomes:

If events $A_1, A_2, \ldots$ are mutually exclusive, then

$$P\Big(\bigcup_{i=1}^\infty A_i\Big) ~ = ~ \sum_{i=1}^\infty P(A_i)$$

This is called the countable additivity axiom, in contrast to the finite additivity axiom we have thus far assumed. It doesn’t follow from finite additivity, but of course finite additivity follows from it.

In this course, we will not go into the technical aspects of countable additivity and the existence of probability functions that satisfy the axioms on the spaces that interest us. But those technical aspects do have to be studied before you can develop a deeper understanding of probability theory. If you want to do that, a good start is to take Real Analysis and then Measure Theory.

In this course, you don’t have to worry about it. Just assume that all our work is consistent with the axioms.

Here is our first distribution on infinitely many values.


7.1.1 Poisson Probabilities

A random variable $X$ has the Poisson distribution with parameter $\mu > 0$ if

$$P(X = k) ~ = ~ e^{-\mu} \frac{\mu^k}{k!}, ~~~~ k = 0, 1, 2, \ldots$$

The terms are proportional to the terms in the infinite series expansion of $e^{\mu}$. These terms $\frac{\mu^k}{k!}$ for $k \ge 0$ determine the shape of the distribution.

The constant of proportionality is $e^{-\mu}$. It doesn’t affect the shape of the histogram. It just ensures that the probabilities add up to 1.

$$\sum_{k=0}^\infty P(X = k) ~ = ~ \sum_{k=0}^\infty e^{-\mu} \frac{\mu^k}{k!} ~ = ~ e^{-\mu} \sum_{k=0}^\infty \frac{\mu^k}{k!} ~ = ~ e^{-\mu} \cdot e^{\mu} ~ = ~ 1$$
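You can confirm this numerically: the partial sums of the terms approach 1 quickly. A minimal sketch using `scipy` (the cutoff of 30 terms below is an arbitrary choice, large enough for this $\mu$):

```python
from scipy import stats

mu = 3.74

# Partial sum of the first 30 Poisson probabilities;
# the remaining tail is negligibly small for this mu.
partial_sum = stats.poisson.pmf(range(30), mu).sum()
print(partial_sum)   # very close to 1
```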

Keep in mind that the Poisson is a distribution in its own right. It does not have to arise as a limit, though it is sometimes helpful to think of it that way. Poisson distributions are often used to model counts of rare events, not necessarily arising out of a binomial setting.

7.1.2 An Interpretation of the Parameter

To understand the parameter $\mu$ of the Poisson distribution, a first step is to notice that the mode of the distribution is just around $\mu$. Here is an example where $\mu = 3.74$. No computing system can calculate infinitely many probabilities, so we have calculated Poisson probabilities only until the sum is close enough to 1 for the prob140 library to accept them as a probability distribution.

# imports used throughout this course
from datascience import *
from prob140 import *
from scipy import stats
import matplotlib.pyplot as plt

mu = 3.74
k = range(20)
poi_probs_374 = stats.poisson.pmf(k, mu)
poi_dist_374 = Table().values(k).probabilities(poi_probs_374)
Plot(poi_dist_374)
plt.title('Poisson (3.74)');

The mode is 3. To find a formula for the mode, follow the process we used for the binomial: calculate the consecutive odds ratios, notice that they are decreasing, and see where they cross 1. This is left to you as an exercise. Your calculations should conclude the following:

The mode of the Poisson distribution is the integer part of $\mu$. That is, the most likely value is $\mu$ rounded down to an integer. If $\mu$ is an integer, both $\mu$ and $\mu - 1$ are modes.
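If you want a numerical check of the odds-ratio argument before doing the exercise, note that the consecutive odds ratio works out to $P(X=k)/P(X=k-1) = \mu/k$, which is at least 1 exactly when $k \le \mu$. A quick sketch with `scipy`, again using $\mu = 3.74$:

```python
import numpy as np
from scipy import stats

mu = 3.74
ks = np.arange(1, 8)

# Consecutive odds ratios P(X=k) / P(X=k-1); each equals mu/k.
ratios = stats.poisson.pmf(ks, mu) / stats.poisson.pmf(ks - 1, mu)
for k, r in zip(ks, ratios):
    print(k, round(r, 3), round(mu / k, 3))
```

The ratios are decreasing and cross 1 between $k = 3$ and $k = 4$, so the mode is 3, the integer part of 3.74.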

mu = 4
k = range(20)
poi_probs_4 = stats.poisson.pmf(k, mu)
poi_dist_4 = Table().values(k).probabilities(poi_probs_4)
Plot(poi_dist_4)
plt.title('Poisson (4)');

In later chapters we will learn a lot more about the parameter $\mu$ of the Poisson distribution. For now, just keep in mind that the most likely value is essentially $\mu$.


7.1.3 Sums of Independent Poisson Variables

Let $X$ have the Poisson $(\mu)$ distribution, and let $Y$, independent of $X$, have the Poisson $(\lambda)$ distribution. Then the sum $S = X+Y$ has the Poisson $(\mu + \lambda)$ distribution.

To prove this, first notice that the possible values of $S$ are the non-negative integers. For a non-negative integer $s$, find $P(S = s)$ by partitioning the event according to values of $X$, keeping in mind that both $X$ and $Y$ have to be non-negative because both are Poisson.

$$\begin{align*}
P(S = s) &= \sum_{k=0}^s P(X=k, Y=s-k) \\
&= \sum_{k=0}^s e^{-\mu} \frac{\mu^k}{k!} \cdot e^{-\lambda} \frac{\lambda^{s-k}}{(s-k)!} \\
&= e^{-(\mu+\lambda)} \frac{1}{s!} \sum_{k=0}^s \frac{s!}{k!(s-k)!} \mu^k \lambda^{s-k} \\
&= e^{-(\mu+\lambda)} \frac{(\mu+\lambda)^s}{s!}
\end{align*}$$

by the binomial expansion of $(\mu+\lambda)^s$. This is the Poisson $(\mu + \lambda)$ probability formula for the value $s$.
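The algebra above can be checked numerically for particular values (the parameters and the value $s$ below are just an example):

```python
from scipy import stats

mu, lam = 2.5, 1.5
s = 4

# Convolution sum over the partition by values of X,
# exactly as in the derivation above.
convolution = sum(stats.poisson.pmf(k, mu) * stats.poisson.pmf(s - k, lam)
                  for k in range(s + 1))

# Direct Poisson(mu + lam) probability of the value s.
direct = stats.poisson.pmf(s, mu + lam)
print(convolution, direct)   # the two agree
```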

One important application of this result is that if $X_1, X_2, \ldots , X_n$ are i.i.d. Poisson $(\mu)$ variables, then their sum $X_1 + X_2 + \cdots + X_n$ has the Poisson $(n\mu)$ distribution.
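A quick simulation sketch (the values of $n$ and $\mu$ are chosen arbitrarily) is consistent with this:

```python
import numpy as np

rng = np.random.default_rng(0)
n, mu = 5, 2.0

# 100,000 replications of the sum of n i.i.d. Poisson(mu) variables.
sums = rng.poisson(mu, size=(100_000, n)).sum(axis=1)

# The sum should behave like a Poisson(n*mu) variable,
# so in particular its average should be close to n*mu.
print(sums.mean())   # close to n*mu = 10
```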