Abstract:
Textbook approaches to forming asymptotically justified confidence
intervals for the spectrum under
very general assumptions were developed by the mid-1970s. This paper shows that under the
textbook assumptions, the true confidence level for these intervals
does not converge to the asymptotic level, but is instead fixed at
zero for all sample sizes. The paper explores necessary
conditions for solving this problem, most notably showing that
under weak conditions, forming valid confidence intervals
requires restricting attention to a finite-dimensional time
series model.