For all but the most trivial scientific questions, rigorous analysis requires conditioning, such as stratification or adjustment. Practically all observational data analyses, and nearly all RCTs, involve conditioning and adjustment. Statistical modeling offers several immediate advantages here: efficiency when information is limited; use of continuous covariates with minimal information loss; management of complexity (bias, effect modification, missingness, heteroskedasticity, etc.) in an integrated and coherent framework; and, with modern tools, visualization of information and patterns in the data.
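As a minimal sketch of the efficiency point, consider a simulated randomized trial with a strong continuous baseline covariate. The variable names, effect sizes, and use of `statsmodels` below are illustrative assumptions, not material from the RMS text; the sketch simply shows that adjusting for the covariate (kept continuous) tightens the standard error of the treatment-effect estimate relative to the unadjusted comparison.

```python
# Illustrative simulation (assumed setup, not from the source): covariate
# adjustment in a randomized trial improves precision of the treatment effect.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 400
treat = rng.integers(0, 2, n)                        # randomized assignment
age = rng.normal(60, 10, n)                          # continuous prognostic covariate
y = 1.0 * treat + 0.5 * age + rng.normal(0, 5, n)    # outcome; true effect = 1.0

# Unadjusted analysis: outcome regressed on treatment alone
unadj = sm.OLS(y, sm.add_constant(treat)).fit()

# Adjusted analysis: treatment plus the continuous covariate, left continuous
X = sm.add_constant(np.column_stack([treat, age]))
adj = sm.OLS(y, X).fit()

print("unadjusted effect: %.2f (SE %.2f)" % (unadj.params[1], unadj.bse[1]))
print("adjusted effect:   %.2f (SE %.2f)" % (adj.params[1], adj.bse[1]))
```

Both estimates target the same treatment effect, but the adjusted model removes the outcome variation explained by the covariate, so its standard error is noticeably smaller.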
The regression modeling framework for statistical analysis accommodates all the fundamental statistical aims: (i) estimation, (ii) hypothesis testing, (iii) description, and (iv) prediction. Even simple univariate and bivariate analyses are special cases of more general regression problems, ones that make strong assumptions about the ignorability of covariates and arbitrarily set those covariates' coefficients to zero. There are special advantages to approaching regression problems from a prediction perspective: if you get the left-hand side (\(\hat{Y}\)) right, you get better performance for estimation, hypothesis testing, and even description. Methods for optimizing prediction therefore confer benefits for other statistical and research objectives. But doing prediction well is as much craft as technique, and acquiring the experience and insight needed to cultivate that craft is an important factor in effective and successful scientific work. The wisdom cataloged in the Regression Modeling Strategies (RMS) text and short course, much of it non-obvious or even counter-intuitive, is a rare opportunity to rapidly develop statistical acumen and accelerate scientific sophistication. RMS is also a gateway and stepping stone to other modern statistical applications such as Bayesian methods and hierarchical models, robust statistical techniques, biomarker identification, adaptive trials, comparative effectiveness research, and meta-analysis and evidence synthesis, as well as to judicious use of machine learning and AI.
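To make the "special case" point concrete, here is a brief sketch, with simulated data and variable names that are my own assumptions rather than content from the RMS text: a pooled two-sample t-test gives exactly the same inference as a regression of the outcome on a single binary covariate, and the regression form is the one that generalizes once further covariates must be conditioned on rather than ignored.

```python
# Illustrative sketch (assumed data): a two-sample t-test is a special case of
# regression on a binary covariate; the t and p values coincide exactly.
import numpy as np
import scipy.stats as stats
import statsmodels.api as sm

rng = np.random.default_rng(1)
group = np.repeat([0, 1], 50)                 # binary grouping variable
y = 2.0 * group + rng.normal(0, 1, 100)       # outcome with a group difference

# Classical pooled-variance two-sample t-test
res = stats.ttest_ind(y[group == 1], y[group == 0], equal_var=True)

# Equivalent regression: the coefficient on `group` carries the same test
fit = sm.OLS(y, sm.add_constant(group)).fit()

print("t-test:     t = %.3f, p = %.4f" % (res.statistic, res.pvalue))
print("regression: t = %.3f, p = %.4f" % (fit.tvalues[1], fit.pvalues[1]))
```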
RMS helps us better understand and manage sources of uncertainty in scientific work; it thereby makes us all better scientists and more sophisticated consumers and administrators of research.