Testing the Goodness of Fit of a Parametric Density Function by Kernel Method
Published online by Cambridge University Press: 11 February 2009
Abstract
Let F denote a distribution function defined on the probability space (Ω, ℱ, P) that is absolutely continuous with respect to Lebesgue measure on ℝᵈ, with probability density function f. Let f₀(·, β) be a parametric density function depending on an unknown p × 1 vector β. In this paper, we consider tests of the goodness of fit of f₀(·, β) for f(·), for some β, based on (i) the integrated squared difference between a kernel estimate of f(·) and the quasi-maximum likelihood estimate of f₀(·, β), denoted by Iₙ, and (ii) the integrated squared difference between a kernel estimate of f(·) and the corresponding kernel-smoothed estimate of f₀(·, β), denoted by Jₙ. It is shown that the amount of smoothing applied to the data in constructing the kernel estimate of f(·) determines the form of the test statistic based on Iₙ. For each test developed, we examine its asymptotic properties, including consistency and local power. In particular, we show that the tests developed in this paper, except the first, are more powerful than the Kolmogorov-Smirnov test under the sequence of local alternatives introduced in Rosenblatt [12], although they are less powerful than the Kolmogorov-Smirnov test under the sequence of Pitman alternatives. A small simulation study examines the finite-sample performance of one of these tests.
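As a rough illustration of the Iₙ-type statistic described above, the following sketch computes the integrated squared difference between a Gaussian-kernel density estimate and a fitted parametric density. It is a minimal illustration, not the paper's procedure: the parametric family, the bandwidth h = 0.4, the grid-based numerical integration, and the sample sizes are all assumptions chosen for readability, and the centering/normalization needed for the asymptotic theory is omitted.

```python
import numpy as np

def kernel_density(x_grid, data, h):
    # Gaussian-kernel density estimate: f_hat(x) = (1/(n*h)) * sum_i K((x - X_i)/h)
    u = (x_grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2.0 * np.pi))

def ise_statistic(data, h, grid_size=512):
    # I_n-style quantity: integrated squared difference between the kernel
    # estimate of f and the fitted parametric density f0(., beta_hat).
    # Here f0 is taken to be Gaussian, so beta_hat = (mu_hat, sigma_hat) is the MLE.
    mu, sigma = data.mean(), data.std()
    grid = np.linspace(data.min() - 3 * h, data.max() + 3 * h, grid_size)
    f_hat = kernel_density(grid, data, h)
    f0 = np.exp(-0.5 * ((grid - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    # Numerical integration of the squared difference over the grid
    return np.trapz((f_hat - f0) ** 2, grid)

rng = np.random.default_rng(0)
x_null = rng.normal(size=500)           # data actually drawn from the fitted family
stat_null = ise_statistic(x_null, h=0.4)
print(stat_null)
```

Under the null (data truly Gaussian) the statistic should be close to zero, while data from a density far from the fitted family, e.g. a well-separated two-component mixture, should produce a visibly larger value.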
Type: Research Article
Copyright: © Cambridge University Press 1994