About this item
viii, 1 leaf, 208 pp. Original cloth. Near Fine, in near fine dust jacket (price-clipped). 'A second major result of the fifties that I want to mention is that of a student of Wald, Charles Stein, which he published in 1956, and which further led to publication of the famous James-Stein estimator in the early sixties. This result of Stein has been termed by Efron [27] "one of the most striking theorems of post-war mathematical statistics." In this work Stein discovered the fact that the mean of a sample from a multivariate normal distribution of dimension p may be inadmissible if p ≥ 3, i.e. there are estimators whose risk functions are everywhere smaller than the risk of the sample mean. He also supplied a class of estimators with this property; these have since become known as James-Stein estimators. These estimators are closely related to shrinkage, Bayes and empirical Bayes estimators. Stein was certainly a most remarkable person. In 1956 he was 35 years old, a professor at Stanford University, which he had joined in 1953, and had already done some fundamental research in mathematical statistics, including the famous Hunt-Stein theorem relating group theory and statistical invariance. Apart from the results already mentioned, he contributed a number of other important ones to mathematical statistics' ('Statistics in the Fifties' Presidential Address; Efron = B. Efron, Introduction to James and Stein (1961), 'Estimation with Quadratic Loss', Breakthroughs in Statistics, Volume I, 437-442). Robbins's paper is reprinted, with an Introduction by I. J. Good (pp. 379-387), in Samuel Kotz & Norman Lloyd Johnson (eds), Breakthroughs in Statistics, Vol. I: Foundations and Basic Theory, 1992, pp. 388-394. 'Another important contribution of 1956 was the introduction of the Empirical Bayes method by Robbins.
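The inadmissibility result quoted above can be illustrated numerically. The following is a minimal sketch, not taken from the volume itself: it applies the James-Stein shrinkage rule to a single p-dimensional observation from N(theta, I) and compares its Monte Carlo risk with that of the sample mean (here the observation itself). The choice theta = (1, …, 1) and the simulation sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def james_stein(x):
    """James-Stein shrinkage of a p-dimensional observation toward the
    origin, for the unit-variance normal model with p >= 3:
    (1 - (p - 2) / ||x||^2) * x."""
    p = x.size
    return (1.0 - (p - 2) / np.dot(x, x)) * x

# Monte Carlo comparison of risk (expected squared error loss).
p, n_trials = 10, 20000
theta = np.ones(p)                        # hypothetical true mean vector
mse_mle = mse_js = 0.0
for _ in range(n_trials):
    x = theta + rng.standard_normal(p)    # X ~ N(theta, I_p)
    mse_mle += np.sum((x - theta) ** 2)
    mse_js += np.sum((james_stein(x) - theta) ** 2)

print("risk of X itself:", mse_mle / n_trials)   # close to p = 10
print("risk of JS:      ", mse_js / n_trials)    # strictly smaller
```

The estimated risk of the unshrunk estimator hovers around p, while the James-Stein rule does uniformly better for every theta when p ≥ 3, which is exactly the dominance property the address describes.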
The question Robbins asked himself in a previous paper was whether one could usefully apply Bayes' theorem even if the prior for a parameter is unknown but known to exist. In his 1956 paper he answered this question by considering a random variable X which has a probability distribution depending on a parameter L, which is assumed to have a prior distribution. Let X1, X2, …, Xn be n values of X occurring with different values of L, say L1, L2, …, Ln. He considered estimating Ln when all n observations X1, X2, …, Xn are available. In the discrete Poisson case he showed how the Bayes estimator with respect to squared error loss, i.e. the posterior mean, can essentially be written as the ratio of the marginal mass function of X evaluated at two different values. These could then each be estimated in a natural fashion by the corresponding sample mass functions. Robbins also carried this through for the geometric and binomial distributions and he gave some indications of the more general case. Indeed a very clever idea!' ('Statistics in the Fifties'). 'Robbins' most influential contribution to statistical theory' (Bradley Efron, 'Robbins, empirical Bayes and microarrays', Ann. Statist., Vol. 31, No. 2, 2003, 366-378). 'Since the 1950's Wald's theory has been refined and extended in many directions. His conceptual framework has been influential in a number of fields in mathematical statistics, including Le Cam's asymptotic theory [in the volume offered here], Stein estimation [in the volume offered here], Huber's theory of robust estimation and others' ('Statistics in the Fifties'). This volume also includes: HOEFFDING, W., 'The role of assumptions in statistical decisions' (105-114); KARLIN, S., 'Decision theory for Pólya type distributions. Case of two actions, I' (115-128).
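The Poisson device quoted above rests on the identity E[L | X = x] = (x + 1) f(x + 1) / f(x), where f is the marginal mass function of X; Robbins' empirical Bayes step replaces f by the observed sample frequencies. A minimal sketch of that step (the function name and the toy data are our own, not Robbins'):

```python
from collections import Counter

def robbins_poisson(xs):
    """Robbins' (1956) empirical Bayes estimate of the Poisson mean
    behind each observation, under squared error loss.

    The Bayes rule E[L | X = x] = (x + 1) * f(x + 1) / f(x) is applied
    with the unknown marginal f replaced by the empirical frequencies
    count(x) / n; the factors n cancel in the ratio.
    """
    counts = Counter(xs)
    # counts[x] >= 1 for every observed x, so the ratio is well defined;
    # counts[x + 1] may be 0, giving a (crude) estimate of 0.
    return [(x + 1) * counts[x + 1] / counts[x] for x in xs]

# Hypothetical toy sample of Poisson counts.
est = robbins_poisson([0, 0, 1, 1, 1, 2, 3])
print(est)  # [1.5, 1.5, 0.666..., 0.666..., 0.666..., 3.0, 0.0]
```

For instance, each observed 0 is assigned (0 + 1) * count(1) / count(0) = 3/2, borrowing strength from how often 1 occurs in the rest of the sample; this pooling across parallel problems is what makes the idea "empirical Bayes".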
Item code 17518