HAPPY BIRTHDAY SIR DAVID COX! Today is David Cox’s birthday; he would have been 98 years old. Below is a remembrance I contributed to Significance when he died, with a link to others in that same issue.
“In celebrating Cox’s immense contributions, we should recognise how much there is yet to learn from him”
By Deborah G. Mayo
It is fitting that we take this time to reflect on Sir David Cox’s immense contributions to science. I am honoured to add a personal reflection and remembrance.
Importance of statistical foundations
Alongside his monumental contributions that are widely recognised as transforming applied statistics, Cox returned again and again to questions of the foundations, basic concepts, and theory of statistical inference. In a 1978 paper, he writes that “the aims of a study of foundations include” [27]:
- qualitative clarification of the objectives of statistical work;
- formal justification of, or even improvements to, procedures of analysis…;
- the provision of a systematic basis for tackling new problems.
In the preface to his 2006 book, Principles of Statistical Inference, he explains [19]:
Without some systematic structure statistical methods for the analysis of data become a collection of tricks that are hard to assimilate and interrelate to one another. … The development of new methods appropriate for new problems would become entirely a matter of ad hoc ingenuity. … [O]ne role of theory is to assimilate, generalize and perhaps modify and improve the fruits of such ingenuity.
Much of the theory is concerned with … assessing the relative merits of different methods of analysis, and it is important even at a very applied level to have some understanding of the strengths and limitations of such discussions. This is connected with somewhat more philosophical issues connected with the nature of probability. A final reason, and a very good one, for study of the theory is that it is interesting.
There are two central themes in Cox’s statistical philosophy. First, there is the importance of calibrating methods by considering how they would behave in (actual or hypothetical) repeated sampling (“it seems clear that any proposed method of analysis that in repeated application would mostly give misleading answers is fatally flawed” [9]). Second, there is the need to ensure that the “calibration is relevant to the specific data under analysis, often taking due account of how the data were obtained”. Crucial questions in relation to these facets of Cox’s statistical philosophy had long been of importance to my work in philosophy of science: how can the frequentist calibration be used as an epistemic assessment of what can be learned from data? How can the assessment be made relevant to the specific data without leading to the unique case, precluding error probabilities? Little did I know that I would have the good fortune to talk and work directly with Cox on tackling them.
It was late in the summer of 2003 when I boldly emailed Cox to invite him to be part of a session on “Philosophy of Statistics” that I was organising for the second Erich Lehmann conference, to be held in May 2004 at Rice University. To my surprise he said yes. (The session also included David Freedman.) For the next two years we talked about how to view “frequentist statistics as a theory of inductive inference”, which became a joint paper in the conference proceedings. In June 2006, Cox presented this joint work at a conference I organised at Virginia Tech, ERROR ’06. A 2010 workshop (“Statistical Science and Philosophy of Science: Where Should They Meet?”), and a 2010 paper, followed. Our collaboration led to an important change in my research trajectory: I would focus on applying philosophy of statistics to problems in science (and much less on using probabilistic ideas in philosophy of science). My 2014 paper, “On the Birnbaum Argument for the Strong Likelihood Principle”, would not have been written without Cox’s support and encouragement. His insights and feedback over several years were very important to the completion of my 2018 book, which ends with his words: “It’s time” [31].
It was an extraordinary experience to learn from Cox’s own reflections about such key statistical figures as Barnard, Birnbaum, Box, Fisher, Neyman, Egon Pearson, Jeffreys and many others. He had a unique and wonderfully irreverent sense of humour. He was unfailingly optimistic, unpretentious, open-minded, and had the uncanny ability to synthesise complex ideas in a succinct, clarifying form.
Learning from Cox
Returning to the preface of Cox’s 2006 book on statistical inference, he explains: “The object of the present book is to … describe and compare the main ideas and controversies over more foundational issues that have rumbled on at varying levels of intensity for more than 200 years.” Many such foundational controversies are ones about which Cox wrote illuminatingly for over 60 of those years, notably on statistical significance tests, from 1958 onward. Attention to Cox’s delineation of different types of null hypotheses, contexts, and corresponding interpretations points the way to avoiding many of today’s misuses and misunderstandings.
The objective is to recognize explicitly the possibility of error and to use that recognition to calibrate significance tests and confidence intervals as an aid to interpretation. This is to provide a link with the real underlying system, as represented by the probabilistic model of the data generating process. [19]
In celebrating Cox’s immense contributions, we should recognise how much there is yet to learn from him.
The ASA newsletter, AMSTATNEWS, has announced the David R. Cox Foundations of Statistics Award in time for his birthday.