This book is intended for use in a rigorous introductory PhD-level course in econometrics, or in a field course in econometric theory. It covers the measure-theoretic foundations of probability theory; the multivariate normal distribution, with its application to classical linear regression analysis; various laws of large numbers, central limit theorems, and related results for independent random variables as well as for stationary time series, with applications to the asymptotic inference of M-estimators; and maximum likelihood theory. Some chapters have their own appendices containing more advanced topics and/or difficult proofs. Moreover, there are three appendices with material that is assumed to be known. Appendix I contains a comprehensive review of linear algebra, including all proofs; Appendix II reviews a variety of mathematical topics and concepts that are used throughout the main text; and Appendix III reviews complex analysis. The book is therefore uniquely self-contained.
Herman J. Bierens is Professor of Economics at the Pennsylvania State University and part-time Professor of Econometrics at Tilburg University, The Netherlands. He is Associate Editor of the Journal of Econometrics and Econometric Reviews, and has been an Associate Editor of Econometrica. Professor Bierens has written two monographs, Robust Methods and Asymptotic Theory in Nonlinear Econometrics and Topics in Advanced Econometrics (Cambridge University Press, 1994), as well as numerous journal articles. His current research interests are model (mis)specification analysis in econometrics and its application in empirical research, time series econometrics, and the econometric analysis of dynamic stochastic general equilibrium models.
Part I. Probability and Measure: 1. The Texas lotto; 2. Quality control; 3. Why do we need sigma-algebras of events?; 4. Properties of algebras and sigma-algebras; 5. Properties of probability measures; 6. The uniform probability measure; 7. Lebesgue measure and Lebesgue integral; 8. Random variables and their distributions; 9. Density functions; 10. Conditional probability, Bayes's rule, and independence; 11. Exercises: A. Common structure of the proofs of Theorems 6 and 10; B. Extension of an outer measure to a probability measure; Part II. Borel Measurability, Integration and Mathematical Expectations: 12. Introduction; 13. Borel measurability; 14. Integral of Borel measurable functions with respect to a probability measure; 15. General measurability and integrals of random variables with respect to probability measures; 16. Mathematical expectation; 17. Some useful inequalities involving mathematical expectations; 18. Expectations of products of independent random variables; 19. Moment generating functions and characteristic functions; 20. Exercises: A. Uniqueness of characteristic functions; Part III. Conditional Expectations: 21. Introduction; 22. Properties of conditional expectations; 23. Conditional probability measures and conditional independence; 24. Conditioning on increasing sigma-algebras; 25. Conditional expectations as the best forecast schemes; 26. Exercises: A. Proof of Theorem 22; Part IV. Distributions and Transformations: 27. Discrete distributions; 28. Transformations of discrete random vectors; 29. Transformations of absolutely continuous random variables; 30. Transformations of absolutely continuous random vectors; 31. The normal distribution; 32. Distributions related to the normal distribution; 33. The uniform distribution and its relation to the standard normal distribution; 34. The gamma distribution; 35. Exercises: A. Tedious derivations; B. Proof of Theorem 29; Part V. 
The Multivariate Normal Distribution and its Application to Statistical Inference: 36. Expectation and variance of random vectors; 37. The multivariate normal distribution; 38. Conditional distributions of multivariate normal random variables; 39. Independence of linear and quadratic transformations of multivariate normal random variables; 40. Distribution of quadratic forms of multivariate normal random variables; 41. Applications to statistical inference under normality; 42. Applications to regression analysis; 43. Exercises: A. Proof of Theorem 43; Part VI. Modes of Convergence: 44. Introduction; 45. Convergence in probability and the weak law of large numbers; 46. Almost sure convergence, and the strong law of large numbers; 47. The uniform law of large numbers and its applications; 48. Convergence in distribution; 49. Convergence of characteristic functions; 50. The central limit theorem; 51. Stochastic boundedness, tightness, and the Op and op notations; 52. Asymptotic normality of M-estimators; 53. Hypothesis testing; 54. Exercises: A. Proof of the uniform weak law of large numbers; B. Almost sure convergence and strong laws of large numbers; C. Convergence of characteristic functions and distributions; Part VII. Dependent Laws of Large Numbers and Central Limit Theorems: 55. Stationarity and the Wold decomposition; 56. Weak laws of large numbers for stationary processes; 57. Mixing conditions; 58. Uniform weak laws of large numbers; 59. Dependent central limit theorems; 60. Exercises: A. Hilbert spaces; Part VIII. Maximum Likelihood Theory: 61. Introduction; 62. Likelihood functions; 63. Examples; 64. Asymptotic properties of ML estimators; 65. Testing parameter restrictions; 66. Exercises.
- ISBN: 9780521542241