The book opens with a discussion of increasing levels of data summarisation to achieve maximal reduction, and its connection with sufficient and minimal sufficient statistics. It gives a thorough account of the theory and results on uniformly minimum variance unbiased estimators (UMVUE), including the well-known Rao-Blackwell and Lehmann-Scheffé theorems for producing a UMVUE. After introducing Fisher information, it develops the Cramér-Rao and Bhattacharyya variance lower bounds for regular models, and the Chapman-Robbins-Kiefer variance lower bound for Pitman (non-regular) models. The book also examines large-sample properties of estimators, such as consistency, consistent asymptotic normality (CAN), and best asymptotic normality (BAN), and introduces several estimation methods, including the well-known method of maximum likelihood. Separate chapters are devoted to finding the Pitman estimator among equivariant estimators for location and scale models, utilising the symmetry structure built into the model, and to Bayes, Empirical Bayes, and Hierarchical Bayes estimators in various statistical models. One of the presentation's many strengths is its systematic treatment of the theory and results across many statistical contexts and models. Each chapter ends with a number of solved examples based on various statistical models, along with an explanation of the underlying concepts and results.
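To give a flavour of one of the topics listed above, here is a minimal simulation sketch (not taken from the book) illustrating the Cramér-Rao lower bound: for a normal model with known variance, the sample mean is the UMVUE of the mean, and its variance attains the bound σ²/n. The sample sizes and parameter values below are arbitrary choices for illustration.

```python
import numpy as np

# Model assumption for this sketch: X_1, ..., X_n i.i.d. N(theta, sigma^2),
# sigma known. Fisher information per observation is 1/sigma^2, so the
# Cramer-Rao lower bound for an unbiased estimator of theta is sigma^2 / n.
rng = np.random.default_rng(0)
n, sigma, theta = 50, 2.0, 5.0
reps = 20000

samples = rng.normal(theta, sigma, size=(reps, n))
estimates = samples.mean(axis=1)          # sample mean = MLE = UMVUE here

empirical_var = estimates.var()
cramer_rao_bound = sigma**2 / n           # 1 / (n * Fisher information)

print("empirical variance:", empirical_var)
print("Cramer-Rao bound:  ", cramer_rao_bound)
```

The two printed values should agree closely, showing that the sample mean is an efficient estimator in this regular model.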
Author | Manoj Kumar Srivastava, Abdul Hamid Khan, Namita Srivastava
Publisher | PHI Learning Private Limited
Language | English
Binding Type | Paperback
Main Category | Science & Mathematics
Sub Category | Statistics
ISBN13 | 9788120349308
SKU | BK 0133406
© Copyright 2022 | GetMyBook.com All Rights Reserved.