Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgments
- Symbols
- Acronyms
- 1 An introduction to empirical modeling
- 2 Probability theory: a modeling framework
- 3 The notion of a probability model
- 4 The notion of a random sample
- 5 Probabilistic concepts and real data
- 6 The notion of a non-random sample
- 7 Regression and related notions
- 8 Stochastic processes
- 9 Limit theorems
- 10 From probability theory to statistical inference*
- 11 An introduction to statistical inference
- 12 Estimation I: Properties of estimators
- 13 Estimation II: Methods of estimation
- 14 Hypothesis testing
- 15 Misspecification testing
- References
- Index
13 - Estimation II: Methods of estimation
Published online by Cambridge University Press: 06 July 2010
Summary
Introduction
In the previous chapter we discussed estimators and their properties. The main desirable finite sample properties discussed in chapter 12 were unbiasedness and efficiency, with sufficiency being a property relating to specific probability models. The desirable asymptotic properties discussed in the previous chapter were consistency, asymptotic Normality, and asymptotic efficiency. The notion of the ideal estimator was used as a measuring rod in order to enhance the intuitive understanding of these properties. The question of how one can construct good estimators was sidestepped in the previous chapter. The primary objective of this chapter is to consider this question in some detail by discussing four estimation methods:
1 The moment matching principle,
2 The least-squares method,
3 The method of moments, and
4 The maximum likelihood method.
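As a preview of the last of these methods, the following is a minimal sketch, not taken from the book: for an i.i.d. Bernoulli sample the maximum likelihood estimator of the success probability has the well-known closed form of the sample proportion, which can be checked against the log-likelihood directly. The data and function names here are illustrative assumptions.

```python
import math

def bernoulli_log_likelihood(theta, xs):
    """Log-likelihood of an i.i.d. Bernoulli(theta) sample xs."""
    s = sum(xs)          # number of successes
    n = len(xs)
    return s * math.log(theta) + (n - s) * math.log(1 - theta)

def bernoulli_mle(xs):
    """Closed-form MLE for Bernoulli: the sample proportion of successes."""
    return sum(xs) / len(xs)

# Hypothetical sample of 0/1 observations (illustrative, not from the book).
data = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
theta_hat = bernoulli_mle(data)

# The MLE should yield a log-likelihood at least as large as nearby values.
ll_hat = bernoulli_log_likelihood(theta_hat, data)
assert ll_hat >= bernoulli_log_likelihood(0.6, data)
assert ll_hat >= bernoulli_log_likelihood(0.8, data)
```

For models without a closed-form maximizer, the same log-likelihood function would instead be handed to a numerical optimizer; the closed form is a special feature of simple models like the Bernoulli.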
A bird's eye view of the chapter
In section 2 we discuss an approach to estimation that has intuitive appeal but lacks generality. We call this procedure the moment matching principle because we estimate unknown parameters by matching distribution and sample moments. The relationship between the distribution and the sample moments is also of interest in the context of the other methods. Section 3 introduces the least-squares method, first as a mathematical approximation method and then as a proper estimation method in modern statistical inference. In section 4 we discuss Pearson's method of moments and then compare it with the parametric method of moments, an adaptation of the original method for the current paradigm of statistical inference.
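The moment matching idea described above can be sketched in a few lines of code. The example below, an illustrative assumption rather than material from the book, estimates the parameters of a Normal distribution by equating its first two distribution moments, E[X] = μ and E[X²] = μ² + σ², to the corresponding sample moments and solving for μ and σ².

```python
def moment_match_normal(xs):
    """Estimate (mu, sigma^2) of a Normal distribution by matching the
    first two distribution moments to their sample counterparts."""
    n = len(xs)
    m1 = sum(xs) / n                 # first sample moment
    m2 = sum(x * x for x in xs) / n  # second sample moment
    mu_hat = m1                      # from E[X] = mu
    sigma2_hat = m2 - m1 * m1        # from E[X^2] = mu^2 + sigma^2
    return mu_hat, sigma2_hat

# Hypothetical data (illustrative only).
sample = [2.1, 1.9, 2.4, 2.0, 1.6]
mu_hat, sigma2_hat = moment_match_normal(sample)
```

The same recipe applies to any model whose moments are known functions of the parameters: write down as many moment equations as there are unknown parameters, replace distribution moments with sample moments, and solve.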
Chapter in: Probability Theory and Statistical Inference: Econometric Modeling with Observational Data, pp. 637-680. Publisher: Cambridge University Press. Print publication year: 1999.