# In All Likelihood: Statistical Modelling and Inference Using Likelihood, by Yudi Pawitan

Really an outstanding second book to read on estimation by Fisher's method of maximum likelihood. A little light on the mathematics, but you can fill that in on your own if you really need to. I first heard about this book in 2006 at the International Meeting of the Psychometric Society in Montreal from a colleague; I ordered it the day I heard about it, from one of the conference computers. It's great, and I've found examples in it for teaching, plus a lot of things that I simply didn't know, didn't remember, or didn't really understand the first time.

The examples make this book really useful compared to technical texts like Bickel and Doksum or Lehmann. Those books are useful, of course, but not so much as texts for courses. Pawitan's book has tons of really great little examples that bring the concepts down to earth for the reader. For instance, when he plots four score functions (normal, Poisson, binomial, and Cauchy), you see immediately why estimation is difficult in models such as the Cauchy compared to the normal. It also builds intuition about what the score function actually is. I have unpublished notes from John Marden (Statistics, UIUC), who was my statistical theory professor, which are very, very good; Pawitan's book is on par. The fact that the R code is available is fantastic.
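The score-function comparison the reviewer describes is easy to reproduce. The sketch below (in Python rather than the book's R, using a made-up sample) evaluates the score S(θ) = d log L/dθ for a location parameter under the normal and Cauchy models and counts zero-crossings on a grid; the normal score is linear and crosses zero once, while the bounded Cauchy score can cross several times, which is exactly why Cauchy estimation is harder:

```python
import numpy as np

# Score function S(theta) = d/dtheta log L(theta) for a location parameter,
# evaluated over a grid for one fixed (hypothetical) sample.  The normal
# score is linear in theta, so the likelihood equation S(theta) = 0 has a
# single root; the Cauchy score is bounded and can cross zero several
# times, producing multiple local maxima of the likelihood.

x = np.array([0.0, 1.0, 2.0, 20.0])      # hypothetical sample with an outlier
grid = np.linspace(-5.0, 25.0, 3001)

def normal_score(theta):
    return np.sum(x - theta)             # sigma fixed at 1

def cauchy_score(theta):
    d = x - theta
    return np.sum(2.0 * d / (1.0 + d ** 2))

def zero_crossings(score):
    s = np.sign([score(t) for t in grid])
    s = s[s != 0]                        # drop exact zeros before comparing
    return int(np.sum(s[:-1] != s[1:]))

print("normal score zero-crossings:", zero_crossings(normal_score))  # 1
print("cauchy score zero-crossings:", zero_crossings(cauchy_score))  # 3
```

With the outlier at 20, the Cauchy likelihood equation has three roots (two local maxima and a local minimum), while the normal score still has exactly one.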
As a practicing data analyst, I frequently find the standard classical statistical techniques useless, cumbersome, or convoluted for all but the most straightforward applications. Before running across this book, I thought my only other option was to use Bayesian methods, which are infinitely flexible and elegant, to perform inference with complex, non-normal, non-linear models. Unfortunately, solving Bayesian models relies on two rather unappealing concepts: a prior probability (so that all your calculations still yield probabilities), and Markov chain Monte Carlo (a computationally intensive, sometimes non-convergent algorithm that is only asymptotically correct with respect to the true Bayesian posterior distribution).

Reading Dr. Pawitan's book introduced me to a very satisfying "third way," as he calls it. Instead of force-fitting all uncertainty into a probability, the likelihood approach recognizes two types of uncertainty, which is both novel in statistics and extremely refreshing once you understand why two types are necessary. The first, which I would call well-calibrated uncertainty, is analogous to a confidence interval for the mean of a normal sample. With this type, we know how often we would be wrong under repeated sampling from the population, so we have a good idea how well our method brackets the true mean (i.e., it is well calibrated).

The other type of uncertainty is unique to the likelihood approach. It arises when precise repeated-sampling error rates cannot be derived or estimated. In that case you are left with basically two choices (apart from collecting more data): create asymptotically well-calibrated inferences (i.e., assume that as N goes to infinity your repeated-sampling probability statements would become precise), or admit that you do not precisely know the error rate and rely solely on the likelihood, perhaps with some non-frequentist calibration metric. Dr. Pawitan shows how to do this using the AIC to calibrate the likelihood.
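Pawitan's own AIC treatment is developed in the book; as a generic illustration of the standard formula AIC = 2p − 2 log L(θ̂), the sketch below (hypothetical data, not an example from the book) compares two nested normal models, where the penalty term 2p is what puts likelihoods of models with different parameter counts p on one comparable scale:

```python
import math

# AIC = 2p - 2*log L(theta_hat): a non-frequentist way to calibrate
# maximized likelihoods across models with different parameter counts p.
# Hypothetical data; model A fixes the mean at 0, model B estimates it.

x = [1.2, 0.4, -0.3, 2.1, 1.5, 0.9]
n = len(x)

def normal_loglik(x, mu, sigma2):
    return sum(-0.5 * math.log(2 * math.pi * sigma2)
               - (xi - mu) ** 2 / (2 * sigma2) for xi in x)

# Model A: mu fixed at 0, sigma^2 estimated  -> p = 1
s2_a = sum(xi ** 2 for xi in x) / n
aic_a = 2 * 1 - 2 * normal_loglik(x, 0.0, s2_a)

# Model B: mu and sigma^2 both estimated     -> p = 2
mu_b = sum(x) / n
s2_b = sum((xi - mu_b) ** 2 for xi in x) / n
aic_b = 2 * 2 - 2 * normal_loglik(x, mu_b, s2_b)

print(f"AIC (mu = 0): {aic_a:.2f}   AIC (mu free): {aic_b:.2f}")
# lower AIC -> better trade-off of fit against complexity
```

For this sample the extra parameter earns its keep: model B's AIC comes out lower despite the larger penalty.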
This type of uncertainty is NOT stated in terms of probability, which I find incredibly honest, as giving probabilities conveys an air of accuracy and knowledge that is usually unwarranted with complex models.

From an applied standpoint, I think the likelihood approach is superior to the Bayesian approach not because it is necessarily more accurate, but because it possesses a far less cumbersome theoretical apparatus while retaining all the flexibility and elegance of a Bayesian approach. Likelihood methods do not require Markov chain Monte Carlo, nor do they require Jacobians for transformations on inferences; instead, all you need is a good old root finder to solve essentially all problems. Simulation is useful if you are doing bootstrapping along with likelihood, but it is not an essential part of the inference machine, so to speak. For missing data, you can use Expectation-Maximization, which again only requires a simple computer package. Finally, you can incorporate prior information (subjective opinion or objective data) as a prior likelihood, which, unlike a prior probability, does not need to integrate to one (in Bayesian statistics, you have to resort to improper priors and hope for the best in situations where you want to represent complete ignorance). A prior likelihood is also probably psychologically closer to what most of us do when we evaluate the prior plausibility of a hypothesis, as we usually aren't very good at estimating raw probabilities.

The actual book is very complete, with good coverage of the fundamental mathematical statistics. Sometimes he could be clearer about his motivation for a particular topic, but overall I found it an excellent applied statistics text with great theoretical underpinning. If you are looking for a modern, flexible, and nuanced approach to applied statistics, you can do no better than this book.

OK textbook, but a useful reference for any graduate math or stat person.
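The "good old root finder" remark can be made concrete: the MLE solves the score equation S(θ) = 0, so a simple bisection search suffices whenever the score can be evaluated. A minimal sketch with made-up Poisson counts, where the closed-form answer (the sample mean) provides a check:

```python
# The MLE solves the likelihood (score) equation S(theta) = 0.  A minimal
# bisection sketch, using the Poisson model, where the root is known in
# closed form (theta_hat = sample mean) and serves as a sanity check.

counts = [3, 0, 2, 5, 1, 4, 2]           # hypothetical Poisson counts
n, total = len(counts), sum(counts)

def score(theta):
    # d/dtheta log L(theta) = sum(x)/theta - n for Poisson(theta)
    return total / theta - n

def bisect(f, lo, hi, tol=1e-10):
    # Assumes f changes sign exactly once on [lo, hi]
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

theta_hat = bisect(score, 1e-6, 100.0)
print(f"MLE by root-finding: {theta_hat:.6f}  (sample mean: {total / n:.6f})")
```

In harder models (Cauchy, mixtures) the score can have several roots, so in practice one brackets the search or compares the likelihood at each candidate root.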
Excellent coverage of likelihood, with significant depth. Explanations are very clear, but be forewarned: they are targeted at a higher-level audience; I'm thinking an advanced undergraduate stats course or an entry-level graduate textbook. This is a textbook with lots of math, but a writer who really wants you to understand. I especially appreciated the coverage of the score and Fisher information matrix, both observed and empirical. I also thought the treatment of nuisance and random parameters was fantastic. The examples and R code on his website are also a great addition to this book.

Based on a course in the theory of statistics, this text concentrates on what can be achieved using the likelihood (Fisherian) method of taking account of uncertainty when studying a statistical problem. It takes the concept of the likelihood as providing the best methods for unifying the demands of statistical modelling and the theory of inference. Every likelihood concept is illustrated by realistic examples, which are not compromised by computational problems. Examples range from a simple comparison of two accident rates to complex studies that require generalised linear or semiparametric modelling. The emphasis is that the likelihood is not simply a device to produce an estimate, but an important tool for modelling. The book generally takes an informal approach, where most important results are established using heuristic arguments and motivated with realistic examples. With the currently available computing power, examples are not contrived to allow a closed analytical solution, and the book can concentrate on the statistical aspects of the data modelling. In addition to classical likelihood theory, the book covers many modern topics such as generalized linear models and mixed models, non-parametric smoothing, robustness, the EM algorithm, and empirical likelihood.
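The observed Fisher information praised in the review above is the curvature of the log-likelihood at the MLE, and its inverse approximates the variance of the estimate. A minimal single-parameter sketch with hypothetical Poisson counts (the book treats the general matrix case):

```python
import math

# Observed Fisher information I(theta_hat) = -d^2/dtheta^2 log L at the MLE;
# 1/sqrt(I) gives the usual Wald standard error.  Poisson example with
# hypothetical counts, where everything is available in closed form.

counts = [4, 2, 6, 3, 5]
n, total = len(counts), sum(counts)
theta_hat = total / n                    # Poisson MLE

# log L(theta) = sum(x)*log(theta) - n*theta + const
# S(theta)     = sum(x)/theta - n
# I(theta)     = sum(x)/theta^2          (negative second derivative)
obs_info = total / theta_hat ** 2
se = 1.0 / math.sqrt(obs_info)           # Wald standard error

print(f"theta_hat = {theta_hat:.2f}, observed information = {obs_info:.3f}, "
      f"SE = {se:.3f}")
```

For curved or misspecified models the observed and expected (and empirical) versions of the information differ, which is one of the distinctions the book develops.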