ISBN: 978-81-8487-416-7
Publication Year: Reprint 2016
Pages: 334
Binding: Paperback
About the book
Parametric Inference: An Introduction discusses the basic concept of a sufficient statistic and the classical approach based on minimum variance unbiased estimation. There is a separate chapter on simultaneous estimation of several parameters. Large sample theory of estimation is also introduced, based on consistent asymptotically normal estimators obtained by the method of moments, the percentile method, and the method of maximum likelihood. Tests of hypotheses for finite samples are developed using the classical Neyman–Pearson theory, pointing out its connection with the Bayesian approach. Hypothesis testing and confidence interval techniques are then developed, leading to likelihood ratio tests, score tests, and tests based on maximum likelihood estimators.
Table of Contents
Preface / Introduction / Sufficient Statistic / Minimum Variance Unbiased Estimation / Simultaneous Estimation of Several Parameters / Consistent Estimators / Consistent Asymptotically Normal Estimators / Method of Maximum Likelihood / Tests of Hypotheses – I / Tests of Hypotheses – II / Interval Estimation / Nonparametric Statistical Inference / Inference for Instantaneous and Early Failure Time Distributions / References / Index.
Audience
Senior Undergraduate, Graduate Students & Teachers