Advanced Methodology and Statistics Seminars (AMASS)

These seminars are designed to enhance researchers' methodological and statistical skills; generally, two are offered on Thursday or during the course of the convention. Each seminar is 4 hours long and limited to 40 attendees, and participants can earn 4 continuing education credits per seminar.

 

Thursday, November 20 | 8:00AM – 12:00PM

#1: Introduction to Multilevel Modeling
Presented by:

Meghan K. Cain,
Assistant Director, Educational Services, StataCorp LLC

Participants earn 4 continuing education credits


Category: Research Methods and Statistics

Keywords: Statistics, Research Methods

Basic to moderate level of familiarity with the material.

Nested data are ubiquitous in clinical psychology research. Nesting occurs when there are natural groupings in the data, such as when students are nested in classrooms or patients are nested within doctors, and in longitudinal data, where repeated measurements are nested within individuals. If not properly modeled, such data structures create statistical dependencies that can lead to inflated Type I error rates. Multilevel models are one solution: they not only provide valid inference from nested data, but also allow researchers to ask more complex questions about their data.
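The dependence problem described above can be previewed with a quick simulation (a hypothetical sketch for illustration, not workshop material): when each cluster contributes its own random shift, observations within a cluster are correlated, and the intraclass correlation (ICC) quantifies how much of the total variance lies between clusters.

```python
import numpy as np

rng = np.random.default_rng(0)
n_clusters, n_per = 30, 20

# Simulate nested data: each cluster gets its own random intercept,
# so observations within a cluster share a common shift.
cluster_effect = rng.normal(0, 1.0, n_clusters)  # between-cluster sd = 1
y = cluster_effect[:, None] + rng.normal(0, 1.0, (n_clusters, n_per))  # within sd = 1

# ANOVA-style ICC estimate: share of variance lying between clusters.
within_var = y.var(ddof=1, axis=1).mean()
between_var = y.mean(axis=1).var(ddof=1) - within_var / n_per
icc = between_var / (between_var + within_var)
print(f"estimated ICC ~ {icc:.2f}")  # true ICC here is 1 / (1 + 1) = 0.5
```

A naive analysis that pools all 600 observations as if they were independent would substantially overstate the effective sample size whenever the ICC is nonzero, which is what inflates Type I error rates.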

In this workshop, I will offer both a conceptual and a technical introduction to multilevel modeling. In particular, we will focus on how to formulate models, interpret coefficients, and perform model building and model comparison. We will also spend significant time disaggregating effects between versus within clusters. For example, some variables collected at the student level may explain both student-level differences and average school-level differences. Disaggregating these effects makes it possible to design interventions at the appropriate level, i.e., the student or the school.
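As a language-agnostic preview of the disaggregation step (the workshop itself uses Stata, and the variable names below are hypothetical), cluster-mean centering splits a level-1 predictor into a between-cluster component (the cluster mean) and a within-cluster component (the deviation from that mean):

```python
import numpy as np

rng = np.random.default_rng(1)
n_schools, n_students = 5, 4

# Hypothetical student-level predictor (e.g., hours studied), nested in schools.
school = np.repeat(np.arange(n_schools), n_students)
hours = rng.normal(10, 2, n_schools * n_students)

# Between component: each student's school mean (a level-2 variable).
school_mean = np.array([hours[school == s].mean() for s in range(n_schools)])
between = school_mean[school]

# Within component: deviation from one's own school mean (cluster-mean centered).
within = hours - between

# Sanity checks: the two components reassemble the original variable exactly,
# and the within part averages to zero inside every school.
assert np.allclose(between + within, hours)
assert np.allclose([within[school == s].mean() for s in range(n_schools)], 0)
```

Entering the between and within components as separate predictors in a multilevel model yields distinct school-level and student-level coefficients, which is what allows interventions to be targeted at the appropriate level.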

All examples will be illustrated in Stata. Prior experience with Stata is not required. All participants will be sent a temporary Stata license.

Outline:

  • Nested data
    • What it is
    • Why it is a problem
    • Exploring nested data
  • Multilevel models
    • Conceptual introduction
    • Technical introduction
    • Fitting and interpreting multilevel models
  • Special topics
    • Longitudinal data
    • Disaggregating within- versus between-cluster effects

At the end of this session, the learner will be able to:

  • Identify and explore nested data, particularly categorizing variables as level-1 or level-2.
  • Create a “spaghetti” plot to visualize nested data.
  • Fit a multilevel model to nested data in Stata.
  • Interpret results from a multilevel model, particularly fixed versus random effects.
  • Disaggregate an effect into its level-1 and level-2 components, by adding cluster means and cluster-mean centering the original variable.

Recommended Readings:

Antonakis, J., Bastardoz, N., & Rönkkö, M. (2021). On ignoring the random effects assumption in multilevel models: Review, critique, and recommendations. Organizational Research Methods, 24(2), 443-483.

Enders, C. K., Mistler, S. A., & Keller, B. T. (2016). Multilevel multiple imputation: A review and evaluation of joint modeling and chained equations imputation. Psychological Methods, 21(2), 222.

Hoffman, L., & Walters, R. W. (2022). Catching up on multilevel modeling. Annual Review of Psychology, 73(1), 659-689.

Ployhart, R. E., Bliese, P. D., & Strizver, S. D. (2025). Intensive longitudinal models. Annual Review of Organizational Psychology and Organizational Behavior, 12(1), 343-367.

Thursday, November 20 | 1:00PM – 5:00PM

#2: Optimizing the Use of Discrepant Results When Using, Interpreting, and Integrating Assessment Data

 

Presented by:

Andres De Los Reyes, Ph.D., Professor of Psychology, University of Maryland, College Park

Elizabeth Talbott, Ph.D., Professor, William and Mary

Bryce McLeod, Ph.D., Professor, Virginia Commonwealth University

Sarah J. Racz, Ph.D., Assistant Clinical Professor, University of Maryland, College Park

Participants earn 4 continuing education credits

Category: Research Methods and Statistics

Keywords: Assessment, Measurement, Research Methods

Basic level of familiarity with the material.

Researchers often administer multiple instruments designed to measure the same domain. In research, scores taken from multiple instruments may be used to assess their psychometric properties (e.g., establishing convergent, discriminant, and criterion-related validity) or to address substantive research aims (e.g., identifying risk factors for mental health concerns, selecting appropriate evidence-based therapies, and predicting treatment outcomes).

In applied settings, these same instruments may be used to make high-stakes decisions about clients (e.g., diagnosing, treatment planning, response monitoring). Several decades of research indicate that in both applied and research settings, two instruments designed to assess individuals on the same domain commonly produce discrepant results (De Los Reyes, 2024). By “discrepant results” we mean distinct estimates about the same domain in terms of its form, function, level, and relations to other domains.

Yet, for nearly 70 years, researchers have made sense of these discrepant results using paradigms that assume converging results contain all of the valid data (e.g., converging operations, the multitrait-multimethod matrix; Campbell & Fiske, 1959; Garner et al., 1956). This assumption translates into data integration strategies that misclassify discrepant results as measurement error when these results, in fact, contain valid data (Makol et al., 2025).

Grounded in new training and coursework resources (https://bit.ly/3CySQPx), this workshop provides attendees with a falsifiable approach to discrepant results in mental health assessments. This approach addresses five competencies linked to these discrepant results: (a) theory, (b) epistemology, (c) selection of data sources, (d) construction of analytic models, and (e) design of studies. A series of collaborative, experiential activities will allow attendees to apply these competencies to their work. The workshop will also provide attendees with syntax for the analytic procedures illustrated in the workshop, along with examples of publicly available datasets on which to implement them. In turn, we expect these resources to support the production of high-quality research and the construction of training modules, coursework, and other forms of pedagogy at attendees' home institutions.


Outline:

  • Hour 1: Principles of Measurement and Methodology
    • Workshop Overview and Brief History about Measuring Mental Health
    • Data Conditions: Describing Discrepant Results in Mental Health
    • Theory, Epistemology, and Assumptions: Relevance to Discrepant Results
    • Think-Pair-Share Activities Throughout the Hour
  • Hour 2: How Traditional Practices Conflict with Mental Health Data
    • Traditional Strategies for Analyzing and Integrating Mental Health Data
    • Discrepant Results Aren’t Created Equal: The Operations Triad Model
    • Q and A about Principles and Practices
    • Think-Pair-Share Activities Throughout the Hour
  • Hour 3: A Paradigm for Capitalizing on Discrepant Results in Mental Health
    • Interpreting Your Discrepant Results: The CONTEXT Validation Paradigm
    • CONTEXT Part I: Selecting Your Data Sources
    • CONTEXT Part II: Selecting Your Analytic Strategies
    • CONTEXT Part III: Designing Your Studies and Selecting Your Validity Criteria
    • Think-Pair-Share Activities Throughout the Hour
  • Hour 4: Group-Based Activities and Extended Q and A
    • Group Activity: Implementing Strategies for Integrating Data
    • Extended Q and A and Informal Discussion about Discrepant Results

At the end of this session, the learner will be able to:

  • Synthesize meta-analytic reviews demonstrating the robust nature of discrepant results in mental health research.
  • Compare and contrast traditional and contemporary paradigms for conceptualizing and interpreting discrepant results in research findings.
  • Compare and contrast methods for integrating discrepant results to understand links between data assumptions and the conditions to which those methods will be applied.
  • Describe a set of conceptual and measurement validation paradigms designed to optimize the utility of discrepant results in mental health research.
  • Apply conceptual and measurement validation paradigms to experiential activities, with the goal of learning principles for implementing analytic procedures for integrating and modeling data that contain discrepant results.
  • Connect and apply contemporary principles and assumptions about discrepant results to the learner's own research and training activities.

Recommended Readings:

De Los Reyes, A., Wang, M., Lerner, M.D., Makol, B.A., Fitzpatrick, O., & Weisz, J.R. (2023). The Operations Triad Model and youth mental health assessments: Catalyzing a paradigm shift in measurement validation. Journal of Clinical Child and Adolescent Psychology, 52(1), 19-54. https://doi.org/10.1080/15374416.2022.2111684

Makol, B.A., Youngstrom, E.A., Racz, S.J., Qasmieh, N., Glenn, L.E., & De Los Reyes, A. (2020). Integrating multiple informants’ reports: How conceptual and measurement models may address long-standing problems in clinical decision-making. Clinical Psychological Science, 8(6), 953-970. https://doi.org/10.1177/2167702620924439

McLeod, B. D., Porter, N., Hogue, A., Becker-Haimes, E. M., & Jensen-Doss, A. (2023). What is the status of multi-informant treatment fidelity research? Journal of Clinical Child and Adolescent Psychology, 52(1), 74-94. https://doi.org/10.1080/15374416.2022.2151713

Rescorla, L.A., Ivanova, M.Y., Achenbach, T.M., Almeida, V., Anafarta-Sendag, M., Bite, I., Caldas, J.C., Capps, J.W., Chen, Y.C., Colombo, P., da Silva Oliveira, M., Dobrean, A., Erol, N., Frigerio, A., Funabiki, Y., Gedutienė, R., Guðmundsson, H.S., Heo, M.Q., Kim, Y.A., Zasępa, E. (2022). Older adult psychopathology: International comparisons of self-reports, collateral reports, and cross-informant agreement. International Psychogeriatrics, 34(5), 467-478. https://doi.org/10.1017/S1041610220001532

von der Embse, N., & De Los Reyes, A. (2024). Advancing equity in access to school mental health through multiple informant decision-making. Journal of School Psychology, 104, 101310. https://doi.org/10.1016/j.jsp.2024.101310