Since Allen T. Craig's death in 1978, Bob Hogg has revised the later editions of this text. However, when Prentice Hall asked him to consider a sixth edition, he thought of his good friend Joe McKean and asked him to help. That was a great choice, for Joe made many excellent suggestions on which we both could agree, and these changes are outlined later in this preface. In addition to Joe's ideas, our colleague Jon Cryer gave us his marked-up copy of the fifth edition, from which we changed a number of items. Moreover, George Woodworth and Kate Cowles made a number of suggestions concerning the new Bayesian chapter; in particular, Woodworth taught us about a "Dutch book" used in many Bayesian proofs. Of course, in addition to these three, we must thank others, both faculty and students, who have made worthwhile suggestions. However, our greatest debts of gratitude are to our special friends: Tom Hettmansperger of Penn State University, who used our revised notes in his mathematical statistics course during the 2002-2004 academic years, and Suzanne Dubnicka of Kansas State University, who used our notes in her mathematical statistics course during Fall of 2003. From these experiences, Tom and Suzanne and several of their students provided us with new ideas and corrections.

While in earlier editions Hogg and Craig had resisted adding any "real" problems, Joe did insert a few among his more important changes. While the level of the book is aimed at beginning graduate students in statistics, it is also suitable for senior undergraduate mathematics, statistics, and actuarial science majors. The major differences between this edition and the fifth edition are:

- It is easier to find various items because more definitions, equations, and theorems are given by chapter, section, and display numbers. Moreover, many theorems, definitions, and examples are given names in boldfaced type for easier reference.
- Many of the distribution-finding techniques, such as transformations and moment generating function methods, are in the first three chapters. The concepts of expectation and conditional expectation are treated more thoroughly in the first two chapters.

- Chapter 3, on special distributions, now includes contaminated normal distributions, the multivariate normal distribution, the t- and F-distributions, and a section on mixture distributions.

- Chapter 4 presents large sample theory on convergence in probability and in distribution, and ends with the Central Limit Theorem. In the first semester, if the instructor is pressed for time, he or she can omit this chapter and proceed to Chapter 5.

- To enable the instructor to include some statistical inference in the first semester, Chapter 5 introduces sampling, confidence intervals, and testing. These include many of the normal theory procedures for one- and two-sample location problems and the corresponding large sample procedures. The chapter concludes with an introduction to Monte Carlo techniques and bootstrap procedures for confidence intervals and testing. These procedures are used throughout the later chapters of the book.

- Maximum likelihood methods, Chapter 6, have been expanded. For illustration, the regularity conditions have been listed, which allows us to provide better proofs of a number of associated theorems, such as the limiting distributions of the maximum likelihood procedures. This forms a more complete inference for these important methods. The EM algorithm is discussed and is applied to several maximum likelihood situations.

- Chapters 7-9 contain material on sufficient statistics, optimal tests of hypotheses, and inferences about normal models.

- Chapters 10-12 contain new material. Chapter 10 presents nonparametric procedures for the location models and simple linear regression. It presents estimation and confidence intervals as well as testing. Sections on optimal scores and adaptive methods are presented.

Robert V. Hogg is the author of Introduction to Mathematical Statistics, published in 2004 under ISBN 9780130085078 and ISBN 0130085073.