A First Course on Parametric Inference (2nd Revised edition)
By B. K. Kale (author) | Hardback
Availability: 1-2 weeks
After a brief historical perspective, the text discusses the basic concept of a sufficient statistic and the classical approach based on the minimum variance unbiased estimator. There is a separate chapter on simultaneous estimation of several parameters. Large sample theory of estimation is also introduced, based on consistent asymptotically normal estimators obtained by the method of moments, the method of percentiles, and the method of maximum likelihood. Hypothesis testing for finite samples is developed using the classical Neyman-Pearson theory, pointing out its connection with the Bayesian approach. Hypothesis testing and confidence interval techniques are then developed, leading to likelihood ratio tests, score tests, and tests based on maximum likelihood estimators.
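As a minimal illustration of the method of maximum likelihood mentioned above (this example is not taken from the book), the MLE of the rate of an exponential distribution has a simple closed form, obtained by maximising the log-likelihood:

```python
def mle_exponential_rate(data):
    """Closed-form MLE for the rate of an exponential distribution.

    Maximising the log-likelihood l(lam) = n*log(lam) - lam*sum(x)
    over lam > 0 gives lam_hat = n / sum(x), i.e. the reciprocal
    of the sample mean.
    """
    if not data or any(x <= 0 for x in data):
        raise ValueError("exponential data must be positive")
    return len(data) / sum(data)


# For a sample with mean 2.0, the estimated rate is 1/2.0 = 0.5.
sample = [1.0, 3.0, 2.0, 2.0]
print(mle_exponential_rate(sample))  # 0.5
```

The same maximise-the-log-likelihood recipe underlies the large-sample theory the blurb refers to: under regularity conditions such estimators are consistent and asymptotically normal.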
B. K. Kale: Professor of Statistics (Retd.), University of Pune
Preface to the Second Edition / Preface to the First Edition / Introduction / Sufficient Statistic / Minimum Variance Unbiased Estimation / Simultaneous Estimation of Several Parameters / Consistent Estimators / Consistent Asymptotically Normal Estimators / Method of Maximum Likelihood / Tests of Hypotheses - I / Tests of Hypotheses - II / Interval Estimation / Nonparametric Statistical Inference / References / Index.
- ISBN: 9781842652190
- Edition: 2nd Revised edition