ARNOLD ZELLNER

by Michael Wiper
mwiper@est-econ.uc3m.es

Professor Zellner is a founding member of ISBA and was president of ISBA in 1994-1995. ISBA gave him a Founder's Award Plaque in 1998. Professor Zellner was President of the American Statistical Association (ASA) in 1991, first Chair of the ASA Section on Bayesian Statistical Science in 1993 and Seminar Leader of the NBER-NSF Bayesian Seminar for 25 years. He has worked at the University of Chicago since 1966 and is Distinguished Service Professor Emeritus of Economics and Statistics. He has worked in Bayesian statistics and economics for more than 30 years and has published more than 200 articles, monographs and books, many of which have been extremely influential in Bayesian research. He has received numerous awards and worked with statisticians and economists all over the world. A fuller description of his work can be seen at his homepage.

We e-mailed Professor Zellner a number of questions about his career and the Bayesian world in general. Here are his responses.

1. Why did you decide to become a statistician?

After getting an undergraduate degree in physics at Harvard, I completed about a year and a half of graduate work in physics at the U. of California at Berkeley, where I interacted with my brother Norman and his friends who were doing doctoral work in quantitative economics. I became aware of a great opportunity to develop and use quantitative methods and data to solve economic and business problems. I have been pursuing these objectives for many years and have come to view methods for learning from data and making decisions, that is, statistics, as the foundation of all the sciences.

And why Bayesian?

Ever since reading Sir Harold Jeffreys's books in the 1960s, I have been impressed by the central role that his axioms and successful applied studies gave to Bayes's Theorem and its uses in estimation, testing, prediction, etc. Thus, I started a program of research to compare Bayesian and non-Bayesian statistical solutions to various problems. Over the years, I and others found that Bayesian solutions were generally better, and thus I and many others were happy to become identified as BAYESIAN.

2. Can you name some of the people and events that have had a great influence on you during your career?

In addition to interaction with my Ph.D. thesis advisors George Kuznets, Ivan Lee and Robert Gordon, who were very helpful, interaction with George Box in the 1960s, when I was a faculty member at the U. of Wisconsin, was very stimulating given his deep understanding of statistical theory and application. Then too, Sir Harold Jeffreys's work and comments were and are extremely influential with respect to my theoretical and applied Bayesian work. Jeffreys's philosophy, simplicity postulate, invariant priors, information theory measures, and applications had an extremely important influence on me and my work. On the several occasions when I visited with Jeffreys at Cambridge, our conversations were most constructive. The same can be said with respect to many interactions with George Barnard and Jack Good. Both offered constructive comments and helpful references on many occasions. In addition, it was Barnard and Jenkins' JRSS paper in the 1960s on a "weighted likelihood" approach to the analysis of time series models that led me to view the weighting function as a prior density and to realize that this then produced exact finite sample inferences for time series models ... no asymptotics needed! What a lovely realization that was. Then too, George Barnard and Edwin Jaynes were very constructive in the late 1980s and 1990s in connection with my derivation of optimal information processing rules, including Bayes's Theorem, and my development of the Bayesian method of moments (BMOM), which yields inverse probability statements regarding parameters and future observations without the use of a likelihood function and Bayes's Theorem. We have now used post-data odds to compare BMOM and traditional Bayesian predictive densities on data, a procedure that appealed very much to George Barnard. Last, I have been very fortunate to have had many able colleagues and graduate students work with me on my NSF grants from the 1960s to the present, many of whom co-authored papers with me (see the papers listed on my homepage). Our hard work and long discussions were very influential in shaping my thoughts, as were the many comments received at semi-annual meetings of the NBER-NSF Seminar on Bayesian Inference from 1970 onward. Since coming to the U. of Chicago in 1966, the following past and present colleagues have been very influential and helpful: Milton Friedman, Ed George, Al Madansky, Rob McCulloch, Jim Press, Harry Roberts, Peter Rossi, Steve Stigler, Hodson Thornber, George Tiao and David Wallace.

3. Modesty apart, what do you consider to be your own contribution to statistics?

Modesty apart, I believe that I have done much through my research, teaching and other efforts to have the Bayesian approach accepted in statistics and econometrics. Working many problems from several points of view and comparing solutions, as mentioned earlier, has been a particularly effective approach. Further, I take great pride in my work on seemingly unrelated regression (SUR) models, which have been very useful in many fields. Then, immodestly, I believe that the optimal, information-theoretic maximal data information priors (MDIPs), which can, with appropriate side conditions, be made invariant to relevant transformations, are extremely useful. Then, producing an optimal information processing procedure that yields Bayes's Theorem as a 100% efficient information processing rule, and a first relation between Bayes's Theorem and entropy, as noted by Ed Jaynes, appears important to me. Also, employing this approach with other inputs has produced new, optimal information processing rules. Next, I have produced a whole range of minimum expected loss (MELO) estimates and point predictions for many problems, balanced loss functions, and solutions to many applied problems including point and turning point forecasting, portfolio problems, control problems, etc. And most recently, there is the Bayesian method of moments (BMOM), which has been applied to many different models and problems and permits Bayesians to make inverse probability statements when the form of the likelihood function is unknown. Last, and very important, I take great pride in having worked with a large number of doctoral students over the years who have produced remarkable theses, most of them Bayesian, and gone on to very successful careers in research, teaching and consulting in the U.S. and many other countries.
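[For readers who have not seen the 100% efficiency result, here is a rough sketch in standard notation (assumed here, not quoted from the original papers). Write \(\pi(\theta)\) for the prior, \(f(y\mid\theta)\) for the likelihood, \(h(y)=\int \pi(\theta)f(y\mid\theta)\,d\theta\) for the marginal density, and \(g(\theta)\) for a candidate post-data density. The output information minus the input information is

\[
\Delta[g] \;=\; \Big( \int g(\theta)\ln g(\theta)\,d\theta + \ln h(y) \Big)
\;-\; \Big( \int g(\theta)\ln \pi(\theta)\,d\theta + \int g(\theta)\ln f(y\mid\theta)\,d\theta \Big)
\;=\; \int g(\theta)\,\ln\frac{g(\theta)\,h(y)}{\pi(\theta)\,f(y\mid\theta)}\,d\theta \;\ge\; 0,
\]

and \(\Delta[g]=0\) exactly when \(g(\theta)=\pi(\theta)f(y\mid\theta)/h(y)\), i.e. when the output is the Bayes posterior; in that sense Bayes's Theorem processes the input information with 100% efficiency.]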

And what is your "best" piece of work?

Given my prejudices, I would have to say my book, An Introduction to Bayesian Inference in Econometrics, Wiley, 1971, reprinted in Wiley Classics Library, 1996. It was an early report on the usefulness of Bayesian inference and provided many examples comparing Bayesian and non-Bayesian solutions to estimation, prediction, testing and control problems. Many different models, including my seemingly unrelated regression (SUR) model, were analyzed and examples computed. Much of my later work was done to extend and improve analyses that appeared in this 1971 volume, including work on estimating reciprocals, ratios, structural coefficients, and other functions of parameters in the MELO approach, extending the MDIP approach to producing priors, introducing broadened, "balanced" loss functions, etc.
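[As a reminder of what the SUR model is, in standard textbook notation rather than the book's own: the system collects m regression equations whose disturbances are correlated across equations at each observation,

\[
y_i = X_i\beta_i + u_i, \qquad i=1,\dots,m, \qquad \mathrm{E}[u_{it}u_{jt}] = \sigma_{ij},
\]

so that the stacked system \(y = X\beta + u\) has \(\mathrm{Cov}(u) = \Sigma \otimes I_T\). Analyzing the equations jointly, whether by a Bayesian treatment or by generalized least squares, exploits the cross-equation correlation that equation-by-equation least squares ignores, which is the source of the efficiency gains.]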

4. Have you ever experienced any discrimination against Bayesianism?

No. The best that I can provide along these lines is a remark that Ted Anderson made at his 80th birthday party last June, when I said he looked more like 40 than 80. He replied, "You Bayesians never could count." Also, Jim Durbin once remarked that genetics determines who is Bayesian and who is not. A few years later, he presented a Bayesian paper at a seminar at the U. of Chicago. I remarked that his genes must have changed. He replied, "What's all this fuss about a little mathematics?"

5. For students starting out, would you recommend them to go into Bayesian statistics and, if so, what advice would you give them?

By all means I would tell them to learn Bayesian statistics and to make sure that, in their courses and work, they work inference and decision problems from different points of view and compare solutions. To make meaningful comparisons, they have to understand not only the Bayesian approach but others as well, e.g. the sampling-theory, likelihood, structural and empirical likelihood approaches, among others. In recent lecturing to grad students, I have found that many coming into my courses are all mixed up about definitions of probability, interpretations of Bayesian and non-Bayesian confidence and prediction intervals, testing procedures, etc. In an old-fashioned approach to getting them straightened out, I tell them that these and similar issues will be featured on the final examination. Then they get serious and learn what's what.

6. Having worked with economists for many years, do you find their views about Bayesian methods as a whole to be different from workers in other fields, e.g. physicists, engineers etc?

Economists, in contrast to workers in other fields, tend to be more familiar with the theory of decision-making under uncertainty developed by Ramsey, Savage, Friedman and others. Also, economic theorists have used Bayes's Theorem as a learning model for many years. However, many older quantitative, applied economists and econometricians received only non-Bayesian material in their training, and many tend to be wary of the Bayesian approach, using arguments that have appeared in old statistics and econometrics texts. The new generation, however, is much better educated in Bayesian matters and has already produced many important and useful Bayesian results. With respect to physicists, many, if not most, are not well trained in statistics, Bayesian or non-Bayesian. Surprisingly, not very many physicists with whom I have come into contact have read Jeffreys's book, Theory of Probability, although more of them tell me that they will read it. Many engineers have taken courses in engineering statistics and appear to me to be quite pragmatic. They will use anything that works in practice, including Bayesian analysis, perhaps prodded along by Richard Barlow, who teaches engineers Bayesian analysis at the U. of California at Berkeley.

7. What do you enjoy most about your work?

Getting solutions that work well in practice and seeing grad students succeed in their doctoral research and careers.

And least?

Grading examinations and attending "windy" academic committee meetings.

8. What is your favourite statistics book?

H. Jeffreys's Theory of Probability. [Note that physicists tend to refer to statistics as "probability theory," perhaps because Rutherford is supposed to have said that if you need statistics to analyze your data, your experiment needs redesigning.] I also like Jim Berger's book, Statistical Decision Theory and Bayesian Analysis, 2nd ed., Springer-Verlag, 1985.

9. What is your favourite Bayesian statistics joke?

Would you want your daughter to marry a Bayesian?

10. Why did you decide to start ISBA?

Bayesian statistics had become so important as a foundation for all the sciences that many, including myself, thought it appropriate to aid its development world-wide by creating ISBA. In addition, our NBER-NSF Bayesian Seminar group had had many productive and very enjoyable meetings in Venezuela, Mexico, India, Canada, Brazil, etc. and thus the decision to extend these interactions into the future in a more organized manner was not hard.

What do you think about ISBA now after 6 years?

ISBA's growth and development, particularly its successful meetings, new chapters in India, Chile and S. Africa, and enlarged membership are very impressive. Also, I particularly like the plans for the new, expanded ISBA Newsletter that have recently been circulated worldwide. The new Newsletter, it appears to me, will be one step along the way to a much needed ISBA Journal of Bayesian Analysis serving all the disciplines.

How should ISBA develop over the next few years? Are there any specific things you would like to see it do?

While there are many possibilities, including the production of an ISBA Journal of Bayesian Analysis, I would like to see ISBA do all it can to support the activities, meetings and publications of the Chapters, new and old. With respect to world meetings, it would also appear worthwhile to have volumes produced that contain the major theoretical and applied papers and general discussions of major issues presented at each meeting. If well designed and if many relevant issues are treated, e.g. how to use Bayesian analysis to make tax and budget policy, as done by Charles Whiteman in connection with his consulting with the governor of the State of Iowa, these volumes may become "best sellers" and generate revenue for ISBA and its Chapters.

11. What have been the greatest changes in Bayesian methods in the years since you started?

One is the current ability to compute almost any integral by numerical techniques, e.g. MCMC. Second, we now have a plethora of procedures for producing both diffuse and informative priors. Third, and very important, we now have many more successful applications of Bayesian analysis, e.g. Mike West's impressive applied work, Jose Quintana's and Blu Putnam's very useful Bayesian portfolio formation applications, etc. Now we not only have good theory, we also have impressive performance in practice to which we can point.
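[A toy illustration of the first point, as a generic sketch rather than code from any of the work mentioned: a short random-walk Metropolis sampler approximates the posterior mean of a normal mean with a flat prior. The model, data and tuning constants below are assumptions made only for the example.

    import numpy as np

    rng = np.random.default_rng(0)
    y = rng.normal(loc=1.0, scale=2.0, size=50)    # simulated data for the toy example
    sigma = 2.0                                    # error standard deviation treated as known

    def log_post(theta):
        # log posterior of theta under y_i ~ N(theta, sigma^2) with a flat prior
        return -0.5 * np.sum((y - theta) ** 2) / sigma ** 2

    draws = np.empty(10_000)
    theta = 0.0
    for i in range(draws.size):
        proposal = theta + rng.normal(scale=0.5)   # random-walk proposal
        if np.log(rng.uniform()) < log_post(proposal) - log_post(theta):
            theta = proposal                       # Metropolis accept step
        draws[i] = theta

    print("posterior mean of theta approx.", draws[2_000:].mean())  # discard burn-in

In practice one would check convergence and tune the proposal scale; the point is only that the posterior integral is obtained by simulation rather than analytically.]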

Are all these changes for the good or has anything been lost?

I can't think of anything that has been lost by experiencing these remarkable, valuable changes.

12. What do you predict will be the changes in the next 10 years of Bayesian statistics?

Forecasting 10 years into the future is very difficult. Hence, take the following with a few grains of salt. While Bayes's Theorem has been a valuable learning model for workers in all the sciences, as with all models, the Bayesian learning model will probably be generalized and changed in certain ways. Then work to evaluate the modified versions will be undertaken. Work by Diaconis, Zabell, Goldstein and myself has already resulted in new learning models that are in the process of being evaluated, which should make Bayesian learning applicable to a broader range of problems and even more effective than it is today. After all, the Model T was followed by the Model A, the V-8 model, etc., and Newton's Laws by Einstein's.

13. Are we headed for a Bayesian millennium?

In my article, A Bayesian Era, read at a Valencia meeting and published in the 1988 volume, Bayesian Statistics 3, ed. J. M. Bernardo et al., I stated that a Bayesian Era had already started. Subsequent developments, including the founding of ISBA and of the ASA Section on Bayesian Statistical Science in 1992 and the strong upsurge in the number of Bayesian papers and publications world-wide, would lead me to say, immodestly, that I was right. Also, the benefits to society of having Bayesian statistical methods that are sound and that yield good solutions to inference, decision and control problems are enormous and deserve to be measured and reported. Congratulations to all of us who have helped make this Bayesian Era come into existence.

If you would like to read more about Professor Zellner's career, you can see another interview with him in the journal Econometric Theory, 5, 1989, 287-317, which is reprinted in Professor Zellner's book Bayesian Analysis in Econometrics and Statistics: The Zellner View and Papers, Edward Elgar Publ. Ltd, 1997. There is also an interesting collection of 48 papers by 98 authors in honour of Professor Zellner: Bayesian Analysis in Statistics and Econometrics: Essays in Honor of Arnold Zellner, edited by Donald A. Berry, Kathryn M. Chaloner and John K. Geweke, Wiley, 1996.