[BEAR Seminar] When Growth Mixture Models Break
I’m excited to be giving a BEAR Seminar talk at UC Berkeley:
When: Tuesday, September 9, 2025, at 2:00 p.m.
Where: Berkeley Way West 4310 and via Zoom
Event page: Berkeley Events Calendar
Talk overview
Growth Mixture Models (GMMs) are widely used to capture unobserved heterogeneity in longitudinal data. But they are fragile: nonidentifiability can cause classes to collapse or merge, and common information criteria (like AIC or traditional DIC) often fail under skewed or multimodal likelihoods.
In this talk, I’ll discuss posterior pathologies such as minuscule-class behavior, twin-like class degeneracy, and stuck chains, and show why plug-in deviance penalties sometimes become negative. I’ll introduce diagnostics (moving-SD checks and a Distinguishability Index) and argue that variance-based penalties (DIC_pV), WAIC, and LOO-CV align more closely with the marginal likelihood and provide more reliable guidance for class enumeration.
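To make the contrast concrete, here is a minimal sketch of how a variance-based DIC penalty (p_V) and WAIC are computed from a matrix of pointwise posterior log-likelihoods. This is an illustrative toy example using a simple normal model, not the GMMs from the talk; all variable names are hypothetical. The key property it demonstrates is that p_V, being half the posterior variance of the deviance, is nonnegative by construction, whereas the plug-in penalty p_D can go negative under skewed or multimodal posteriors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting (illustrative, not the talk's GMMs): normal data with known
# variance 1, and approximate posterior draws for the mean.
y = rng.normal(0.5, 1.0, size=50)                               # observed data
mu_draws = rng.normal(y.mean(), 1 / np.sqrt(len(y)), size=2000) # posterior draws

# Pointwise log-likelihood matrix: S posterior draws x n observations.
loglik = -0.5 * np.log(2 * np.pi) - 0.5 * (y[None, :] - mu_draws[:, None]) ** 2

# Variance-based DIC penalty: p_V = Var(deviance) / 2. Nonnegative by
# construction, unlike the plug-in p_D = mean(D) - D(posterior mean).
deviance = -2 * loglik.sum(axis=1)
p_v = deviance.var(ddof=1) / 2
dic_pv = deviance.mean() + p_v

# WAIC: log pointwise predictive density minus the summed pointwise
# posterior variance of the log-likelihood, on the deviance scale.
lppd = np.log(np.exp(loglik).mean(axis=0)).sum()
p_waic = loglik.var(axis=0, ddof=1).sum()
waic = -2 * (lppd - p_waic)

print(f"p_V = {p_v:.2f}, DIC_pV = {dic_pv:.1f}, WAIC = {waic:.1f}")
```

Because both penalties are posterior variances, they stay well defined even when a plug-in point estimate (such as a posterior mean that falls between modes) would make p_D misleading.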
If you’re nearby—or joining remotely—I’d love to see you there!