
Informally, the definition of independence is the same as before: two random variables that have a joint density are independent if additional information about one of them doesn’t change the distribution of the other.

One quick way to spot the lack of independence is to look at the set of possible values of the pair $(X, Y)$. If that set is not a rectangle, then $X$ and $Y$ can't be independent. A non-rectangular shape implies that there must be two values of $X$ for which the corresponding sets of possible values of $Y$ are different.


If the set of possible values is rectangular then you have to check independence using the old definition:

Jointly distributed random variables $X$ and $Y$ are independent if

$$
P(X \in A, Y \in B) = P(X \in A)P(Y \in B)
$$

for all intervals $A$ and $B$.

Let $X$ have density $f_X$, let $Y$ have density $f_Y$, and suppose $X$ and $Y$ are independent. Then if $f$ is the joint density of $X$ and $Y$,

$$
\begin{align*}
f(x, y)dxdy &\sim P(X \in dx, Y \in dy) \\
&= P(X \in dx)P(Y \in dy) ~~~~~ \text{(independence)} \\
&= f_X(x)dx f_Y(y)dy \\
&= f_X(x)f_Y(y)dxdy
\end{align*}
$$

Thus if $X$ and $Y$ are independent then their joint density is given by

$$
f(x, y) = f_X(x)f_Y(y)
$$

This is the product rule for densities: the joint density of two independent random variables is the product of their densities.

The converse is also true: if the joint density factors into a function of $x$ times a function of $y$, then $X$ and $Y$ are independent.
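As a quick numerical sanity check (a sketch, not part of the text), the defining property $P(X \in A, Y \in B) = P(X \in A)P(Y \in B)$ can be illustrated by simulation. The uniform distribution and the intervals $A$ and $B$ below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent uniform (0, 2) random variables (an arbitrary choice)
x = rng.uniform(0, 2, n)
y = rng.uniform(0, 2, n)

# Arbitrary intervals A = (0.5, 1.5) and B = (1, 2)
in_A = (x > 0.5) & (x < 1.5)
in_B = (y > 1) & (y < 2)

joint = np.mean(in_A & in_B)             # estimates P(X in A, Y in B)
product = np.mean(in_A) * np.mean(in_B)  # estimates P(X in A) P(Y in B)
print(joint, product)                    # both close to 0.5 * 0.5 = 0.25
```

The two estimates agree up to simulation error, as the definition requires.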

Independent Standard Normal Random Variables

Suppose $X$ and $Y$ are i.i.d. standard normal random variables. Then their joint density is given by

$$
f(x, y) = \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}x^2} \cdot \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}y^2}, ~~~~ -\infty < x, y < \infty
$$

Equivalently,

$$
f(x, y) = \frac{1}{2\pi} e^{-\frac{1}{2}(x^2 + y^2)}, ~~~~ -\infty < x, y < \infty
$$

Here is a graph of the joint density surface.

import math
import numpy as np
from prob140 import Plot_3d

def indep_standard_normals(x, y):
    return 1/(2*math.pi) * np.exp(-0.5*(x**2 + y**2))

Plot_3d((-4, 4), (-4, 4), indep_standard_normals, rstride=4, cstride=4)

Notice the circular symmetry of the surface. This is because the formula for the joint density involves the pair $(x, y)$ only through the expression $x^2 + y^2$, which is symmetric in $x$ and $y$.

Notice also that $P(X = Y) = 0$, as the probability is the volume over a line. This is true of all pairs of independent random variables with a joint density: $P(X = Y) = 0$. So for example $P(X > Y) = P(X \ge Y)$. You don't have to worry about whether or not the inequality should be strict.
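A simulation sketch (not from the text) of the fact that $P(X > Y) = P(X \ge Y)$ for i.i.d. standard normals: both proportions come out near $1/2$, and the simulated event $X = Y$ essentially never occurs.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)

strict = np.mean(x > y)     # estimates P(X > Y)
weak = np.mean(x >= y)      # estimates P(X >= Y)
print(strict, weak)         # both close to 0.5
```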

Competing Exponentials

Let $X$ and $Y$ be independent random variables. Suppose $X$ has the exponential $(\lambda)$ distribution and $Y$ has the exponential $(\mu)$ distribution. The goal of this example is to find $P(Y > X)$.

By the product rule, the joint density of XX and YY is given by

$$
f(x, y) ~ = ~ \lambda e^{-\lambda x} \mu e^{-\mu y}, ~~~~ x > 0, ~ y > 0
$$

The graph below shows the joint density surface in the case $\lambda = 0.5$ and $\mu = 0.25$, so that $E(X) = 2$ and $E(Y) = 4$.

def independent_exp(x, y):
    return 0.5 * 0.25 * np.e**(-0.5*x - 0.25*y)

Plot_3d((0, 10), (0, 10), independent_exp)

To find $P(Y > X)$ we must integrate the joint density over the region of the first quadrant where $y > x$: the upper triangle above the line $y = x$.

The probability is therefore

$$
P(Y > X) ~ = ~ \int_0^\infty \int_x^\infty \lambda e^{-\lambda x} \mu e^{-\mu y}\, dy\, dx
$$

We can do this double integral without much calculus, just by using probability facts. As you calculate, try to involve densities as much as possible, and remember that the integral of a density over an interval is the probability of that interval.

$$
\begin{align*}
P(Y > X) &= \int_0^\infty \int_x^\infty \lambda e^{-\lambda x} \mu e^{-\mu y}\, dy\, dx \\ \\
&= \int_0^\infty \lambda e^{-\lambda x} \Big( \int_x^\infty \mu e^{-\mu y}\, dy \Big)\, dx \\ \\
&= \int_0^\infty \lambda e^{-\lambda x} e^{-\mu x}\, dx ~~~~~~ \text{(survival function of } Y \text{, evaluated at } x \text{)} \\ \\
&= \frac{\lambda}{\lambda + \mu} \int_0^\infty (\lambda + \mu) e^{-(\lambda + \mu)x}\, dx \\ \\
&= \frac{\lambda}{\lambda + \mu} ~~~~~~~ \text{(total integral of the exponential } (\lambda + \mu) \text{ density is 1)}
\end{align*}
$$

Thus

$$
P(Y > X) ~ = ~ \frac{\lambda}{\lambda + \mu}
$$

Analogously,

$$
P(X > Y) ~ = ~ \frac{\mu}{\lambda + \mu}
$$
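The closed form can also be checked by numerical integration. This is a sketch using `scipy.integrate.dblquad` (assuming SciPy is available), with the same $\lambda = 0.5$ and $\mu = 0.25$ as in the graph above, for which $\lambda/(\lambda + \mu) = 2/3$.

```python
import math
from scipy import integrate

lam, mu = 0.5, 0.25

# Joint density of independent X ~ exponential(lam), Y ~ exponential(mu);
# dblquad expects the inner variable (y) as the first argument
def f(y, x):
    return lam * math.exp(-lam * x) * mu * math.exp(-mu * y)

# Integrate over the region 0 < x < infinity, x < y < infinity
prob, err = integrate.dblquad(f, 0, math.inf, lambda x: x, lambda x: math.inf)
print(prob, lam / (lam + mu))   # both 2/3, up to numerical error
```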

Notice that the two chances are proportional to the parameters. This is consistent with intuition if you think of $X$ and $Y$ as two lifetimes. If $\lambda$ is large, the corresponding lifetime $X$ is likely to be short, and therefore $Y$ is likely to be larger than $X$, as the formula implies.

If $\lambda = \mu$ then $P(Y > X) = 1/2$, which you can see by symmetry since $P(X = Y) = 0$.
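The lifetimes intuition can be sketched by simulation (not in the text): with $\lambda = 0.5$ and $\mu = 0.25$, the proportion of pairs with $Y > X$ should be near $\lambda/(\lambda+\mu) = 2/3$. Note that NumPy parametrizes the exponential by its mean $1/\lambda$, not by the rate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
lam, mu = 0.5, 0.25

# NumPy's scale parameter is the mean, i.e. 1/rate
x = rng.exponential(scale=1/lam, size=n)
y = rng.exponential(scale=1/mu, size=n)

p_hat = np.mean(y > x)
print(p_hat, lam / (lam + mu))   # both close to 2/3
```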

If we had attempted the double integral in the other order – first $x$, then $y$ – we would have had to do more work. The integral is

$$
P(Y > X) ~ = ~ \int_0^\infty \int_0^y \lambda e^{-\lambda x} \mu e^{-\mu y}\, dx\, dy
$$

Let’s take the easy way out by using SymPy to confirm that we will get the same answer.

from sympy import Symbol, Integral, exp, oo, simplify

# Create the symbols; they are all positive

x = Symbol('x', positive=True)
y = Symbol('y', positive=True)
lamda = Symbol('lamda', positive=True)
mu = Symbol('mu', positive=True)
# Construct the expression for the joint density

f_X = lamda * exp(-lamda * x)
f_Y = mu * exp(-mu * y)
joint_density = f_X * f_Y
joint_density
# Display the integral – first x, then y

Integral(joint_density, (x, 0, y), (y, 0, oo))
# Evaluate the integral

answer = Integral(joint_density, (x, 0, y), (y, 0, oo)).doit()
answer
# Confirm that it is the same 
# as what we got by integrating in the other order

simplify(answer)