
Suppose that X1, ..., Xn are independent random variables with density or mass function f(x; theta), and suppose that we estimate theta using the maximum likelihood estimator theta_hat. We estimate its standard error using the observed Fisher information estimator

se(theta_hat) = { -sum_{i=1}^n l''(X_i; theta_hat) }^{-1/2},

where l'(x; theta) and l''(x; theta) are the first two partial derivatives of ln f(x; theta) with respect to theta. Alternatively, we could use the jackknife to estimate the standard error of theta_hat. If our model is correct, then we would expect (hope) that the two estimates are similar. In order to investigate this, we need to be able to get a good approximation to the "leave-one-out" estimators {theta_hat_{-i}}. Show that theta_hat_{-i} satisfies the equation

l'(X_i; theta_hat_{-i}) = sum_{j=1}^n l'(X_j; theta_hat_{-i}).
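To make the comparison concrete, here is a minimal numerical sketch of the two standard-error estimates, assuming (as an illustration not given in the problem) an exponential model f(x; theta) = theta * exp(-theta * x), for which the MLE is theta_hat = 1/xbar, l'(x; theta) = 1/theta - x, and l''(x; theta) = -1/theta^2:

```python
import math
import random

random.seed(0)
theta_true = 2.0
n = 200
x = [random.expovariate(theta_true) for _ in range(n)]

# MLE for the exponential model: theta_hat = 1 / sample mean
theta_hat = n / sum(x)

# Observed Fisher information SE: l''(x; theta) = -1/theta^2, so
# se = { -sum_i l''(X_i; theta_hat) }^{-1/2} = theta_hat / sqrt(n)
se_info = theta_hat / math.sqrt(n)

# Jackknife SE: leave-one-out MLEs, then (n-1)/n * sum of squared deviations
loo = [(n - 1) / (sum(x) - xi) for xi in x]
loo_mean = sum(loo) / n
se_jack = math.sqrt((n - 1) / n * sum((t - loo_mean) ** 2 for t in loo))

print("observed-information SE:", se_info)
print("jackknife SE:", se_jack)
```

For a correctly specified model the two numbers should agree closely, as the exercise suggests.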

Solution

Suppose we have a sample X = (X1, X2, ..., Xn) and an estimator theta_hat = S(X). The jackknife works with the samples that leave out one observation at a time,

X_{-i} = (X1, ..., X_{i-1}, X_{i+1}, ..., Xn),

called the jackknife samples. The jackknife estimate of the standard error is defined by

se_jack^2 = (n-1)/n * sum_{i=1}^n (theta_hat_{-i} - theta_hat_{(.)})^2,

where theta_hat_{(.)} is the average of the leave-one-out estimates. By independence, the likelihood of the jackknife sample X_{-i} is the product of f(X_j; theta) over j != i, so its log-likelihood is sum_{j != i} l(X_j; theta). The leave-one-out MLE theta_hat_{-i} maximizes this, and therefore satisfies the likelihood equation

sum_{j != i} l'(X_j; theta_hat_{-i}) = 0.

Adding l'(X_i; theta_hat_{-i}) to both sides gives

l'(X_i; theta_hat_{-i}) = sum_{j=1}^n l'(X_j; theta_hat_{-i}),

which is the required equation.
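The identity can be checked numerically. The sketch below again assumes the exponential model f(x; theta) = theta * exp(-theta * x) purely as an example, for which l'(x; theta) = 1/theta - x and the leave-one-out MLE has the closed form theta_hat_{-i} = (n-1) / sum_{j != i} X_j:

```python
import random

random.seed(1)
n = 50
x = [random.expovariate(1.5) for _ in range(n)]
total = sum(x)

def lprime(xi, theta):
    # d/dtheta ln f(x; theta) for the exponential model (illustrative choice)
    return 1.0 / theta - xi

for i, xi in enumerate(x):
    theta_mi = (n - 1) / (total - xi)  # leave-one-out MLE theta_hat_{-i}
    lhs = lprime(xi, theta_mi)
    rhs = sum(lprime(xj, theta_mi) for xj in x)
    # l'(X_i; theta_hat_{-i}) equals the sum over the full sample,
    # since the sum over j != i vanishes at theta_hat_{-i}
    assert abs(lhs - rhs) < 1e-9

print("identity holds for every i")
```

The check works because sum_{j != i} l'(X_j; theta_hat_{-i}) = 0 by definition of theta_hat_{-i}, so the full-sample sum reduces to the single omitted term.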

