Likelihood Ratio Tests for Model Selection and Non-Nested Hypotheses
Econometrica 57 (2): 307-333 (1989)

In this paper, we develop a classical approach to model selection. Using the Kullback-Leibler Information Criterion to measure the closeness of a model to the truth, we propose simple likelihood-ratio based statistics for testing the null hypothesis that the competing models are equally close to the true data generating process against the alternative hypothesis that one model is closer. The tests are directional and are derived successively for the cases where the competing models are non-nested, overlapping, or nested, and where both, one, or neither is misspecified. As a prerequisite, we fully characterize the asymptotic distribution of the likelihood ratio statistic under the most general conditions. We show that it is a weighted sum of chi-square distributions or a normal distribution depending on whether the distributions in the competing models closest to the truth are observationally identical. We also propose a test of this latter condition.
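For the strictly non-nested case, the test statistic described in the abstract reduces to the standardized sum of pointwise log-likelihood ratios, which is asymptotically standard normal under the null that both models are equally close to the truth in the KLIC sense. The following is a minimal sketch of that computation, not the paper's exact procedure; the function name and interface are illustrative, and the inputs are assumed to be per-observation log-likelihood contributions evaluated at each model's (quasi-)maximum likelihood estimates.

```python
import numpy as np
from scipy import stats

def vuong_statistic(loglik_f, loglik_g):
    """Vuong-type statistic for two strictly non-nested models.

    loglik_f, loglik_g: arrays of per-observation log-likelihood
    contributions under model F and model G, each evaluated at that
    model's (quasi-)MLE. Returns the standardized statistic and a
    two-sided p-value under the asymptotic N(0, 1) null of equal
    closeness to the true data generating process.
    """
    lr = np.asarray(loglik_f) - np.asarray(loglik_g)  # pointwise log-likelihood ratios
    n = lr.shape[0]
    omega_hat = lr.std(ddof=0)                        # estimate of the LR standard deviation
    z = lr.sum() / (np.sqrt(n) * omega_hat)           # LR_n / (sqrt(n) * omega_hat)
    p_value = 2 * stats.norm.sf(abs(z))               # two-sided; the sign of z indicates
                                                      # which model is closer to the truth
    return z, p_value
```

The test is directional: a significantly positive statistic favors model F, a significantly negative one favors model G, and a non-significant value means the data cannot discriminate between them.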