18 Item selection in multidimensional computerized adaptive testing (Presented by Nicky)

Wayne's comment

CHEN Chia Wen -

This study gives a helpful overview of item selection methods in computerized adaptive testing (CAT). It explains how these methods extend to multidimensional CAT and how they relate to one another. The first method introduced is the D-optimality rule, which uses the posterior Fisher information, incorporating the prior variance-covariance matrix. Next, the Kullback-Leibler information index (KI) is described. It represents the distance between the likelihood at the true theta and the likelihood at the current estimate; in other words, KI measures how well an item can discriminate between the true theta and other theta values within an interval. A larger KI indicates that the item is more sensitive to differences between the current theta estimate and alternative thetas. The third method mentioned is the continuous entropy method (CEM). It uses the entropy formula with the posterior likelihood as the probability, and the selection rule is to choose the item that minimizes the expected posterior entropy. Finally, the mutual information method measures how strongly the current theta is correlated with the response on a candidate item. In Section 2.5 the author illustrates the mathematical relationships among these methods.

In the results, it is interesting that KI turned out to be a weak rule for multidimensional item selection. The discussion interprets this with a special example: for item 1 in that example, even when the true theta and another theta differ, the KL distance can still equal zero. This means the discrimination power of the multivariate KL index does not perform well.
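To make the KI rule above concrete, here is a minimal sketch for the unidimensional dichotomous case (not the paper's multidimensional implementation): the item KL divergence is computed between the response distributions at the current estimate and at an alternative theta, then integrated over an interval around the estimate. The 2PL model, the item parameters, and the interval width `delta` are illustrative assumptions, not values from the paper.

```python
import numpy as np

def p_2pl(theta, a, b):
    # 2PL probability of a correct response (a = discrimination, b = difficulty)
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def kl_item(theta_hat, theta, a, b):
    # KL divergence between the item's response distributions
    # at the current estimate theta_hat and an alternative theta
    p0, p1 = p_2pl(theta_hat, a, b), p_2pl(theta, a, b)
    return p0 * np.log(p0 / p1) + (1 - p0) * np.log((1 - p0) / (1 - p1))

def kl_index(theta_hat, a, b, delta=1.0, n_grid=101):
    # KI: integrate the item KL over [theta_hat - delta, theta_hat + delta]
    # (trapezoidal rule on a uniform grid)
    grid = np.linspace(theta_hat - delta, theta_hat + delta, n_grid)
    vals = np.array([kl_item(theta_hat, t, a, b) for t in grid])
    return np.sum((vals[:-1] + vals[1:]) / 2 * np.diff(grid))
```

The selection rule then picks the candidate item with the largest `kl_index` at the current theta estimate; a highly discriminating item located near the estimate yields a larger index than a flat one, matching the "larger KI, more sensitive" reading above.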

1) The difference between KLP and KLB is the position of the current and new posterior distributions. Why didn't this study compare these two methods? I am curious about the difference in their effects.