
1. A random vector $\mathbf{Y} = [Y_1 ~~ Y_2 ~~ \cdots ~~ Y_n]^T$ has mean vector $\boldsymbol{\mu}$ and covariance matrix $\sigma^2 \mathbf{I}_n$, where $\sigma > 0$ is a number and $\mathbf{I}_n$ is the $n \times n$ identity matrix.

(a) Pick one option and explain: $Y_1$ and $Y_2$ are

     (i) independent.     (ii) uncorrelated but might not be independent.     (iii) not uncorrelated.

(b) Pick one option and explain: $Var(Y_1)$ and $Var(Y_2)$ are

     (i) equal.     (ii) possibly equal, but might not be.     (iii) not equal.

(c) For $m \le n$, let $\mathbf{A}$ be an $m \times n$ matrix of real numbers, and let $\mathbf{b}$ be an $m \times 1$ vector of real numbers. Let $\mathbf{V} = \mathbf{A}\mathbf{Y} + \mathbf{b}$. Find the mean vector $\boldsymbol{\mu}_\mathbf{V}$ and covariance matrix $\boldsymbol{\Sigma}_\mathbf{V}$ of $\mathbf{V}$.

(d) Let $\mathbf{c}$ be an $m \times 1$ vector of real numbers and let $W = \mathbf{c}^T\mathbf{V}$ for $\mathbf{V}$ defined in Part (c). In terms of $\mathbf{c}$, $\boldsymbol{\mu}_\mathbf{V}$, and $\boldsymbol{\Sigma}_\mathbf{V}$, find $E(W)$ and $Var(W)$.
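If you want to check your answers to Parts (c) and (d) numerically, here is a minimal simulation sketch. The values of $n$, $m$, $\sigma$, $\boldsymbol{\mu}$, $\mathbf{A}$, $\mathbf{b}$, and $\mathbf{c}$ below are arbitrary placeholders, and the coordinates of $\mathbf{Y}$ are drawn as independent normals, which is just one distribution with the stated mean and covariance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative choices (none of these are specified in the problem)
n, m, sigma = 4, 2, 1.5
mu = np.array([1.0, -2.0, 0.5, 3.0])            # mean vector of Y (length n)
A = np.array([[1.0, 0.0, 2.0, -1.0],
              [0.5, 1.0, 0.0, 3.0]])            # m x n matrix of reals
b = np.array([0.2, -1.0])                       # m x 1 vector
c = np.array([1.0, 2.0])                        # m x 1 vector

# Draw many copies of Y; independent normal coordinates give mean mu
# and covariance sigma^2 I_n (the problem does not require normality)
reps = 200_000
Y = mu + sigma * rng.standard_normal(size=(reps, n))

V = Y @ A.T + b                                 # each row is A Y + b
W = V @ c                                       # each entry is c^T V

print("empirical mean of V:\n", V.mean(axis=0))
print("empirical cov of V:\n", np.cov(V, rowvar=False))
print("empirical E(W), Var(W):", W.mean(), W.var())
```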

2. Let $[U ~ V ~ W]^T$ be multivariate normal with mean vector $[0 ~ 0 ~ 0]^T$ and covariance matrix

$$
\begin{bmatrix} 1 & \rho_1 & \rho_2 \\ \rho_1 & 1 & \rho_3 \\ \rho_2 & \rho_3 & 1 \end{bmatrix}
$$

(a) What is the distribution of $U$?

(b) What is the distribution of $U+2V$?

(c) What is the joint distribution of $U$ and $U+2V$?

(d) Under what condition on the parameters is $U$ independent of $U+2V$?
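A minimal NumPy sketch for exploring this problem numerically; the values of $\rho_1, \rho_2, \rho_3$ are arbitrary placeholders, chosen only so that the matrix is a valid covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder correlations; the problem leaves them as parameters
rho1, rho2, rho3 = 0.3, -0.2, 0.5
Sigma = np.array([[1.0, rho1, rho2],
                  [rho1, 1.0, rho3],
                  [rho2, rho3, 1.0]])

samples = rng.multivariate_normal(np.zeros(3), Sigma, size=200_000)
U, V, W = samples[:, 0], samples[:, 1], samples[:, 2]

T = U + 2 * V
print("empirical Var(U):      ", U.var())
print("empirical Var(U + 2V): ", T.var())
print("empirical Cov(U, U+2V):", np.cov(U, T)[0, 1])
```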

3. Let $[X_1 ~~ X_2 ~~ X_3]^T$ be multivariate normal with mean vector $\boldsymbol{\mu}$ and covariance matrix $\boldsymbol{\Sigma}$ given by

$$
\boldsymbol{\mu} ~=~ \begin{bmatrix} \mu \\ \mu \\ \mu \end{bmatrix} \qquad\qquad \boldsymbol{\Sigma} ~=~ \begin{bmatrix} \sigma_1^2 & \sigma_{12} & \sigma_{13} \\ \sigma_{21} & \sigma_2^2 & \sigma_{23} \\ \sigma_{31} & \sigma_{32} & \sigma_3^2 \end{bmatrix}
$$

Find $P\big( (X_1 + X_2)/2 < X_3 + 1 \big)$.
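An optional simulation check; the particular $\mu$ and $\boldsymbol{\Sigma}$ below are made-up placeholders, since your answer should be a formula in the parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up placeholder parameters (any valid covariance matrix works)
mu = 2.0
mean_vec = np.array([mu, mu, mu])
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.5, 0.4],
                  [0.3, 0.4, 1.0]])

X = rng.multivariate_normal(mean_vec, Sigma, size=500_000)
event = (X[:, 0] + X[:, 1]) / 2 < X[:, 2] + 1
print("empirical probability:", event.mean())
```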

4. Let $X$ be standard normal. Construct a random variable $Y$ as follows:

  • Toss a fair coin.

  • If the coin lands heads, let $Y = X$.

  • If the coin lands tails, let $Y = -X$.

(a) Find the cdf of $Y$ and hence identify the distribution of $Y$.

(b) Find $E(XY)$ by conditioning on the result of the toss.

(c) Are $X$ and $Y$ uncorrelated?

(d) Are $X$ and $Y$ independent?

(e) Is the joint distribution of $X$ and $Y$ bivariate normal?
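An optional simulation of the construction, useful for comparing against your answers; the sample size is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

reps = 500_000
X = rng.standard_normal(reps)
heads = rng.random(reps) < 0.5              # fair coin toss for each repetition
Y = np.where(heads, X, -X)                  # Y = X on heads, Y = -X on tails

print("mean and variance of Y:", Y.mean(), Y.var())
print("empirical E(XY):       ", (X * Y).mean())
print("empirical corr(X, Y):  ", np.corrcoef(X, Y)[0, 1])
# The distribution of X + Y is worth examining when thinking about Part (e)
print("proportion with X + Y exactly 0:", (X + Y == 0).mean())
```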

5. Normal Sample Mean and Sample Variance, Part 1

Let $X_1, X_2, \ldots, X_n$ be i.i.d. with mean $\mu$ and variance $\sigma^2$. Let

$$
\bar{X} ~=~ \frac{1}{n} \sum_{i=1}^n X_i
$$

denote the sample mean and

$$
S^2 ~=~ \frac{1}{n-1} \sum_{i=1}^n (X_i - \bar{X})^2
$$

denote the sample variance as defined earlier in the course.

(a) For $1 \le i \le n$ let $D_i = X_i - \bar{X}$. Find $Cov(D_i, \bar{X})$.

(b) Now assume in addition that $X_1, X_2, \ldots, X_n$ are i.i.d. normal $(\mu, \sigma^2)$. What is the joint distribution of $\bar{X}, D_1, D_2, \ldots, D_{n-1}$? Explain why $D_n$ isn't on the list.

(c) True or false (justify your answer): The sample mean and sample variance of an i.i.d. normal sample are independent of each other.
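An optional simulation check for Parts (a) and (c); the values of $n$, $\mu$, and $\sigma$ are arbitrary placeholders, and the samples are drawn as normal, as in Parts (b) and (c).

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative parameters
n, mu, sigma = 5, 10.0, 2.0
reps = 200_000

X = rng.normal(mu, sigma, size=(reps, n))   # each row is one i.i.d. normal sample
Xbar = X.mean(axis=1)                       # sample mean of each row
D1 = X[:, 0] - Xbar                         # D_1 = X_1 - Xbar for each row
S2 = X.var(axis=1, ddof=1)                  # sample variance with divisor n-1

print("empirical Cov(D_1, Xbar):  ", np.cov(D1, Xbar)[0, 1])
print("empirical corr(Xbar, S^2): ", np.corrcoef(Xbar, S2)[0, 1])
```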

6. Normal Sample Mean and Sample Variance, Part 2

(a) Let $R$ have the chi-squared distribution with $n$ degrees of freedom. What is the mgf of $R$?

(b) For $R$ as in Part (a), suppose $R = V + W$ where $V$ and $W$ are independent and $V$ has the chi-squared distribution with $m < n$ degrees of freedom. Can you identify the distribution of $W$? Justify your answer.

(c) Let $X_1, X_2, \ldots, X_n$ be any sequence of random variables and let $\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i$. Let $\alpha$ be any constant. Prove the sum of squares decomposition

$$
\sum_{i=1}^n (X_i - \alpha)^2 ~=~ \sum_{i=1}^n (X_i - \bar{X})^2 ~+~ n(\bar{X} - \alpha)^2.
$$
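Before proving the identity in Part (c), you can convince yourself numerically; a minimal sketch (the particular numbers are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Any numbers work here; the identity is purely algebraic
X = rng.normal(size=10)
alpha = 3.7
Xbar = X.mean()
n = len(X)

lhs = np.sum((X - alpha) ** 2)
rhs = np.sum((X - Xbar) ** 2) + n * (Xbar - alpha) ** 2
print(lhs, rhs, np.isclose(lhs, rhs))
```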

(d) Now let $X_1, X_2, \ldots, X_n$ be i.i.d. normal with mean $\mu$ and variance $\sigma^2 > 0$. Let $S^2$ be the “sample variance” defined by

$$
S^2 ~=~ \frac{1}{n-1} \sum_{i=1}^n (X_i - \bar{X})^2.
$$

Find a constant $c$ such that $cS^2$ has a chi-squared distribution. Provide the degrees of freedom.

[Use Parts (b) and (c) as well as the result of the previous exercise.]