Why does an lmer model converge in one experimental condition but not another?

Problem description

I am new to using linear mixed-effects models. I have a dataset where participants (ID, N = 7973) completed two experimental conditions (A and B). A subset of participants are siblings and thus nested in families (famID, N = 6908).

omnibus_model <- lmer(Outcome ~ Var1*Var2*Cond + (Cond|ID) + (1|famID), data=df)

The omnibus model converges and indicates a significant three-way interaction between Var1, Var2, and Cond. As a post-hoc analysis, to better understand what is driving the omnibus model effect, I subsetted the data so that there is only one observation per ID.

condA <- df[which(df$condition=='A'),]
condA_model <- lmer(Outcome ~ Var1*Var2 + (1|famID), data=condA)
condB <- df[which(df$condition=='B'),]
condB_model <- lmer(Outcome ~ Var1*Var2 + (1|famID), data=condB)

condA_model converges; condB_model does not. In condB_model, the "famID (Intercept)" variance is estimated at 0. In condA_model, I get a small but non-zero estimate (variance = 0.001479). I know I could get an estimate of the fixed effect of interest in condition A versus B by a different method (such as randomly selecting one sibling per family for the analysis and not using random effects), but I am concerned that this differential convergence pattern may indicate differences between the conditions that would influence the interpretation of the omnibus model effect.

What difference in the two conditions could be causing the model in one subset not to converge? How would I test for the possible differences in my data? Shouldn't the random effect of famID be identical in both subsets and thus equally able to be estimated in both post-hoc models?

Tags: statistics, regression, lme4, mixed-models, convergence

Solution


"As a post-hoc analysis, to better understand what is driving the omnibus model effect, I subsetted the data so that there is only one observation per ID."

This procedure does not make sense.

"What difference in the two conditions could be causing the model in one subset not to converge?"

There are many possible reasons. For one thing, these reduced datasets are, by construction, smaller, so there is much less statistical power to detect the "effects" you are interested in, such as the variance of a random effect. In that case the variance can be estimated as zero, which produces a singular fit.
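
You can see this directly in your own output. A minimal sketch, reusing the model names from your question; isSingular() and VarCorr() are standard lme4 helpers, not something specific to your models:

library(lme4)

# A "singular fit" means at least one variance component is estimated at
# (or numerically indistinguishable from) zero
isSingular(condB_model)              # expected TRUE for the condition-B subset model
isSingular(condA_model)              # likely FALSE, but close to the boundary

# Inspect the estimated variance components directly
VarCorr(condB_model)                 # famID variance estimated at 0
as.data.frame(VarCorr(condA_model))  # small but non-zero famID variance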

"Shouldn't the random effect of famID be identical in both subsets and thus equally able to be estimated in both post-hoc models?"

No. These are entirely different models, because the underlying data are different. There is no reason to expect the estimates from the two models to be the same.
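
If you want to check what actually differs between the two subsets, here is a minimal sketch, reusing the condA, condB, condition, famID, and Outcome names from your question; this is one simple diagnostic, not an exhaustive test:

# How much data does each subset contain?
nrow(condA); nrow(condB)

# The famID variance is informed mainly by families that contribute
# more than one participant; compare that between conditions
fam_sizes_A <- table(condA$famID)
fam_sizes_B <- table(condB$famID)
sum(fam_sizes_A > 1)   # families with 2+ members observed in condition A
sum(fam_sizes_B > 1)   # families with 2+ members observed in condition B

# Crude look at the between-family spread of the outcome in each subset
var(tapply(condA$Outcome, condA$famID, mean))
var(tapply(condB$Outcome, condB$famID, mean))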

