This is a self-study question, so I'll provide hints that will hopefully help you find the solution, and I'll edit the answer based on your feedback/progress.
The parameter estimates that minimize the sum of squares are
$$
\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}, \qquad
\hat{\beta}_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})\, y_i}{\sum_{i=1}^n (x_i - \bar{x})^2}.
$$
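If it helps to make these formulas concrete, here is a minimal numerical sketch in Python/NumPy (the data are made up purely for illustration) that checks the closed-form estimates against `np.polyfit`:

```python
import numpy as np

# Made-up illustrative data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form least-squares estimates
beta1_hat = np.sum((x - x.mean()) * y) / np.sum((x - x.mean()) ** 2)
beta0_hat = y.mean() - beta1_hat * x.mean()

# Cross-check against NumPy's degree-1 polynomial fit (returns slope, intercept)
slope, intercept = np.polyfit(x, y, 1)
print(beta0_hat, beta1_hat)
print(intercept, slope)   # should match the values above
```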
To get the variance of $\hat{\beta}_0$, start from its expression, substitute the expression of $\hat{\beta}_1$, and do the algebra:
$$\operatorname{Var}(\hat{\beta}_0) = \operatorname{Var}(\bar{Y} - \hat{\beta}_1 \bar{x}) = \dots$$
Edit:
We have
$$\operatorname{Var}(\hat{\beta}_0) = \operatorname{Var}(\bar{Y} - \hat{\beta}_1 \bar{x}) = \operatorname{Var}(\bar{Y}) + \bar{x}^2 \operatorname{Var}(\hat{\beta}_1) - 2\bar{x}\operatorname{Cov}(\bar{Y}, \hat{\beta}_1).$$
The two variance terms are
$$\operatorname{Var}(\bar{Y}) = \operatorname{Var}\!\left(\frac{1}{n}\sum_{i=1}^n Y_i\right) = \frac{1}{n^2}\sum_{i=1}^n \operatorname{Var}(Y_i) = \frac{\sigma^2}{n},$$
and
$$\operatorname{Var}(\hat{\beta}_1) = \frac{1}{\left[\sum_{i=1}^n (x_i - \bar{x})^2\right]^2}\sum_{i=1}^n (x_i - \bar{x})^2 \operatorname{Var}(Y_i) = \frac{\sigma^2}{\sum_{i=1}^n (x_i - \bar{x})^2},$$
and the covariance term is
$$\begin{aligned}
\operatorname{Cov}(\bar{Y}, \hat{\beta}_1)
&= \operatorname{Cov}\!\left\{\frac{1}{n}\sum_{i=1}^n Y_i,\ \frac{\sum_{j=1}^n (x_j - \bar{x}) Y_j}{\sum_{i=1}^n (x_i - \bar{x})^2}\right\} \\
&= \frac{1}{n}\,\frac{1}{\sum_{i=1}^n (x_i - \bar{x})^2}\operatorname{Cov}\!\left\{\sum_{i=1}^n Y_i,\ \sum_{j=1}^n (x_j - \bar{x}) Y_j\right\} \\
&= \frac{1}{n\sum_{i=1}^n (x_i - \bar{x})^2}\sum_{i=1}^n \sum_{j=1}^n (x_j - \bar{x})\operatorname{Cov}(Y_i, Y_j) \\
&= \frac{1}{n\sum_{i=1}^n (x_i - \bar{x})^2}\sum_{j=1}^n (x_j - \bar{x})\,\sigma^2 = 0
\end{aligned}$$
since
$\sum_{j=1}^n (x_j - \bar{x}) = 0$.
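If you want to see this zero covariance empirically, here is a small Monte Carlo sketch (Python/NumPy, with made-up true parameter values) that simulates the model many times and estimates $\operatorname{Cov}(\bar{Y}, \hat{\beta}_1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # fixed design
beta0, beta1, sigma = 1.0, 2.0, 0.5        # made-up true values
n_sims = 20_000

ybar_draws = np.empty(n_sims)
beta1_draws = np.empty(n_sims)
for s in range(n_sims):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    ybar_draws[s] = y.mean()
    beta1_draws[s] = np.sum((x - x.mean()) * y) / np.sum((x - x.mean()) ** 2)

# Sample covariance between Ybar and beta1_hat; should be close to 0
print(np.cov(ybar_draws, beta1_draws)[0, 1])
```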
And since
$$\sum_{i=1}^n (x_i - \bar{x})^2 = \sum_{i=1}^n x_i^2 - 2\bar{x}\sum_{i=1}^n x_i + \sum_{i=1}^n \bar{x}^2 = \sum_{i=1}^n x_i^2 - n\bar{x}^2,$$
we have
$$\operatorname{Var}(\hat{\beta}_0) = \frac{\sigma^2}{n} + \frac{\sigma^2 \bar{x}^2}{\sum_{i=1}^n (x_i - \bar{x})^2} = \frac{\sigma^2}{n\sum_{i=1}^n (x_i - \bar{x})^2}\left\{\sum_{i=1}^n (x_i - \bar{x})^2 + n\bar{x}^2\right\} = \frac{\sigma^2 \sum_{i=1}^n x_i^2}{n\sum_{i=1}^n (x_i - \bar{x})^2}.$$
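As a quick sanity check, here is a sketch (Python/NumPy, illustrative design values and an assumed error variance) comparing this closed form with the intercept variance obtained from the standard matrix formula $\sigma^2 (X^\top X)^{-1}$:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # fixed design, made-up values
sigma2 = 0.25                              # assumed error variance
n = x.size

# Closed form derived above
var_beta0 = sigma2 * np.sum(x**2) / (n * np.sum((x - x.mean())**2))

# Matrix formula Var(beta_hat) = sigma^2 (X'X)^{-1}; the intercept is entry [0, 0]
X = np.column_stack([np.ones(n), x])
var_beta0_matrix = sigma2 * np.linalg.inv(X.T @ X)[0, 0]

print(var_beta0, var_beta0_matrix)   # the two should agree
```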
Edit 2:
Why do we have
$\operatorname{Var}\!\left(\sum_{i=1}^n Y_i\right) = \sum_{i=1}^n \operatorname{Var}(Y_i)$?
The assumed model is $Y_i = \beta_0 + \beta_1 X_i + \epsilon_i$, where the $\epsilon_i$ are independent and identically distributed random variables with $E(\epsilon_i) = 0$ and $\operatorname{Var}(\epsilon_i) = \sigma^2$.
Once we have a sample, the $X_i$ are known and the only random terms are the $\epsilon_i$. Recall that for a random variable $Z$ and a constant $a$, we have $\operatorname{Var}(a + Z) = \operatorname{Var}(Z)$. Thus,
$$\begin{aligned}
\operatorname{Var}\!\left(\sum_{i=1}^n Y_i\right)
&= \operatorname{Var}\!\left(\sum_{i=1}^n (\beta_0 + \beta_1 X_i + \epsilon_i)\right)
= \operatorname{Var}\!\left(\sum_{i=1}^n \epsilon_i\right) \\
&= \sum_{i=1}^n \sum_{j=1}^n \operatorname{Cov}(\epsilon_i, \epsilon_j)
= \sum_{i=1}^n \operatorname{Cov}(\epsilon_i, \epsilon_i) \\
&= \sum_{i=1}^n \operatorname{Var}(\epsilon_i)
= \sum_{i=1}^n \operatorname{Var}(\beta_0 + \beta_1 X_i + \epsilon_i)
= \sum_{i=1}^n \operatorname{Var}(Y_i).
\end{aligned}$$
The fourth equality holds because $\operatorname{Cov}(\epsilon_i, \epsilon_j) = 0$ for $i \neq j$, by the independence of the $\epsilon_i$.
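Again, if a numerical illustration helps, here is a tiny simulation sketch (Python/NumPy, made-up true parameters) showing that under independent errors the variance of the sum matches $\sum_{i=1}^n \operatorname{Var}(Y_i) = n\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # fixed design
beta0, beta1, sigma = 1.0, 2.0, 0.5        # made-up true values
n_sims = 20_000

# Draw many replicates of (Y_1, ..., Y_n) and record the sum each time
sums = [np.sum(beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size))
        for _ in range(n_sims)]

print(np.var(sums))        # empirical Var(sum_i Y_i)
print(x.size * sigma**2)   # n * sigma^2 = sum_i Var(Y_i)
```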