TY - JOUR
T1 - A generalized expectation model selection algorithm for latent variable selection in multidimensional item response theory models
AU - Zheng, Qian-Zhen
AU - Xu, Ping-Feng
AU - Shan, Na
AU - Tang, Man-Lai
N1 - © 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2023/11/2
Y1 - 2023/11/2
AB - In this paper, we propose a generalized expectation model selection (GEMS) algorithm for latent variable selection in multidimensional item response theory models, which are commonly used to identify the relationships between latent traits and test items. Under some mild assumptions, we prove the numerical convergence of GEMS for model selection by minimizing the generalized information criteria of observed data in the presence of missing data. For latent variable selection in multidimensional two-parameter logistic (M2PL) models, we present an efficient implementation of GEMS that minimizes the Bayesian information criterion. To ensure parameter identifiability, the variances of all latent traits are assumed to be unity and each latent trait is required to have an item exclusively associated with it. The convergence of GEMS for the M2PL models is verified. Simulation studies show that GEMS is computationally more efficient than the expectation model selection (EMS) algorithm and the expectation-maximization-based L1-penalized method (EML1), and that it achieves a higher correct rate of latent variable selection and a lower mean squared error of parameter estimates than EMS and EML1. The GEMS algorithm is illustrated by analyzing a real dataset related to the Eysenck Personality Questionnaire.
KW - Bayesian information criterion
KW - Exploratory item factor analysis
KW - Generalized expectation model selection algorithm
KW - Multidimensional item response theory model
UR - http://www.scopus.com/inward/record.url?scp=85178347701&partnerID=8YFLogxK
U2 - 10.1007/s11222-023-10360-x
DO - 10.1007/s11222-023-10360-x
M3 - Article
VL - 34
SP - 1
EP - 15
JO - Statistics and Computing
JF - Statistics and Computing
IS - 1
M1 - 49
ER -