BDA3 Chapter 5 Exercise 7
Here’s my solution to exercise 7, chapter 5, of Gelman’s Bayesian Data Analysis (BDA), 3rd edition. There are solutions to some of the exercises on the book’s webpage.
Part a
Suppose y∣θ∼Poisson(θ) with prior θ∼Gamma(α,β). Let’s derive the marginal (prior predictive) expectation and variance of y. Using equation 1.8 (page 21), the expectation is
$$
E(y) = E\bigl(E(y \mid \theta)\bigr) = E(\theta) = \frac{\alpha}{\beta}.
$$
Using equation 1.9 (page 21), the variance is
$$
V(y) = E\bigl(V(y \mid \theta)\bigr) + V\bigl(E(y \mid \theta)\bigr) = E(\theta) + V(\theta) = \frac{\alpha}{\beta} + \frac{\alpha}{\beta^2} = \alpha\,\frac{1+\beta}{\beta^2}.
$$
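As a quick sanity check (not part of the exercise), here is a minimal Monte Carlo sketch of the marginal mean and variance. The hyperparameters α = 3 and β = 2 are arbitrary values I picked for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary hyperparameters for illustration (not from the exercise).
alpha, beta = 3.0, 2.0   # Gamma(alpha, beta) with rate beta, as in BDA
draws = 1_000_000

# theta ~ Gamma(alpha, beta); numpy parameterizes the gamma by scale = 1/rate.
theta = rng.gamma(alpha, 1.0 / beta, size=draws)
# y | theta ~ Poisson(theta)
y = rng.poisson(theta)

print(y.mean(), alpha / beta)                 # both ~ 1.5
print(y.var(), alpha * (1 + beta) / beta**2)  # both ~ 2.25
```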
Part b
Suppose y₁,…,yₙ ∣ μ,σ² ∼ Normal(μ,σ²) i.i.d. with the noninformative prior p(μ,σ²) ∝ σ⁻², and write ȳ for the sample mean and s² for the sample variance. Then the expectation of μ∣y is
$$
E(\mu \mid y) = E\bigl(E(\mu \mid \sigma^2, y) \mid y\bigr) = E(\bar{y} \mid y) = \bar{y}.
$$
For posterior expectations, we condition on the data, which allows us to treat ȳ, n, and s as constants. Since θ := √n(μ−ȳ)/s is a linear function of μ, its posterior expectation is zero. For s to be well-defined and nonzero, it is necessary that n ≥ 2. Moreover, the first identity implicitly assumes that the expectation E(μ∣y) exists; from the form of the marginal posterior p(μ∣y) it follows that this is the case only when n ≥ 3 (see below).
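For context, this restates the standard result from Chapter 3 of BDA rather than anything specific to this exercise: under this prior the marginal posterior of μ is a Student-t distribution,

$$
\mu \mid y \sim t_{n-1}\!\left(\bar{y},\ \frac{s^2}{n}\right),
\qquad
p(\mu \mid y) \propto \left[1 + \frac{n(\mu - \bar{y})^2}{(n-1)\,s^2}\right]^{-n/2},
$$

and a t distribution with ν degrees of freedom has a finite mean only for ν > 1 (here ν = n−1, so n ≥ 3) and a finite variance only for ν > 2 (so n ≥ 4).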
The posterior variance is
$$
\begin{aligned}
V\!\left(\frac{\sqrt{n}}{s}(\mu - \bar{y}) \,\middle|\, y\right)
&= E\!\left(V\!\left(\frac{\sqrt{n}}{s}(\mu - \bar{y}) \,\middle|\, \sigma^2, y\right) \middle|\, y\right)
 + V\!\left(E\!\left(\frac{\sqrt{n}}{s}(\mu - \bar{y}) \,\middle|\, \sigma^2, y\right) \middle|\, y\right) \\
&= E\!\left(\frac{n}{s^2}\,V(\mu \mid \sigma^2, y) \,\middle|\, y\right) + V(0 \mid y)
 = E\!\left(\frac{n}{s^2}\cdot\frac{\sigma^2}{n} \,\middle|\, y\right) + 0
 = \frac{E(\sigma^2 \mid y)}{s^2}
 = \frac{n-1}{n-3},
\end{aligned}
$$

where the last equality uses σ² ∣ y ∼ Inv-χ²(n−1, s²), whose mean is (n−1)s²/(n−3).
Since n−3 appears in the denominator, it is necessary that n ≥ 4 for E(σ²∣y) to be finite. Again, the first identity implicitly assumes that the variance on the left-hand side is finite; since √n(μ−ȳ)/s ∣ y follows a t distribution with n−1 degrees of freedom, its variance is finite exactly when n ≥ 4.
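As in part a, a minimal Monte Carlo sketch can be used to check the result. The values n = 10, ȳ = 2.5, and s = 1.3 below are arbitrary choices of mine; the distribution of the pivot √n(μ−ȳ)/s depends only on n.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary data summaries for illustration (not from the exercise).
n, ybar, s = 10, 2.5, 1.3
draws = 1_000_000

# sigma^2 | y ~ Inv-chi^2(n-1, s^2): draw X ~ chi^2_{n-1}, set sigma^2 = (n-1) s^2 / X.
sigma2 = (n - 1) * s**2 / rng.chisquare(n - 1, size=draws)
# mu | sigma^2, y ~ Normal(ybar, sigma^2 / n)
mu = rng.normal(ybar, np.sqrt(sigma2 / n))

# The pivot sqrt(n) (mu - ybar) / s should have mean 0 and variance (n-1)/(n-3).
pivot = np.sqrt(n) * (mu - ybar) / s
print(pivot.mean())                    # ~ 0
print(pivot.var(), (n - 1) / (n - 3))  # both ~ 9/7 ≈ 1.286
```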