ICCS 2009 Technical Report - IEA

Latent variables link to observable variables via measurement equations. An observed variable $x$ is thus modeled as

$$ x = \Lambda_x \xi + \delta , \qquad (1) $$

where $\Lambda_x$ is a $q \times k$ matrix of factor loadings, $\xi$ denotes the latent variable(s), and $\delta$ is a $q \times 1$ vector of unique error variables. The expected covariance matrix is fitted according to the theoretical factor structure.

During the confirmatory factor analyses, selected model-fit indices were also used to measure the extent to which a model with an assumed a-priori structure "fitted the data." For the ICCS analysis, model fit was assessed primarily through the root mean square error of approximation (RMSEA), the comparative fit index (CFI), and the non-normed fit index (NNFI), all of which are less affected than other indices by sample size and model complexity (see Bollen & Long, 1993).

It was assumed for this analysis that RMSEA values over 0.10 suggest an unacceptable model fit, while values below 0.05 indicate a close model fit. As additional fit indices, CFI and NNFI are bounded between 0 and 1; values below 0.90 and 0.95 indicate a non-satisfactory model fit, whereas values greater than 0.95 suggest a close model fit.

In addition to these fit indices, standardized factor loadings and residual variances were used to assess model structures for questionnaire data. A standardized factor loading $\lambda'$ can be interpreted in the same way as a standardized regression coefficient obtained when the indicator variable is regressed on the latent factor. The loadings also reflect the extent to which each indicator measures the underlying construct. Squared standardized factor loadings indicate how much variance in an indicator variable can be explained by the latent factor; they are related to the (standardized) residual variance estimate $\delta'$, which gives the unexplained proportion of variance, as

$$ \delta' = 1 - \lambda'^2 $$

(this relationship is illustrated in the sketch below).

Multidimensional models were used to assess the estimated correlation(s) between latent factors and to review the similarity of the different dimensions measured by the item sets.

Generally, maximum likelihood estimation based on covariance matrices is not appropriate for analyses of (categorical) questionnaire items because this approach treats the items as if they were continuous. Weighted least squares estimation with polychoric correlations (see Jöreskog, 1990, 1994) was therefore used to estimate the confirmatory factor models. The software package used for this purpose was LISREL 8.72 (Jöreskog & Sörbom, 2004).

A decision was made to use confirmatory factor analyses for sets of conceptually related questionnaire items that measured between one and four different factors. This approach made it possible to describe both the extent to which items measured underlying latent traits and the associations between latent factors. The analyses employed data from the (pooled) ICCS calibration samples of students, teachers, and schools, which ensured equal representation of countries in the analyses.
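As an illustrative sketch of how the measurement equation and the loading–residual relation fit together, the short Python fragment below builds the model-implied correlation matrix $\Lambda \Phi \Lambda^{\top} + \Theta_\delta$ for a single standardized factor. The loading values are hypothetical rather than ICCS estimates, and the fragment only demonstrates the algebra; the actual ICCS models were estimated with weighted least squares in LISREL, as described above.

```python
# Illustrative sketch only: hypothetical standardized loadings for a
# single-factor measurement model (not ICCS estimates).
import numpy as np

loadings = np.array([0.75, 0.68, 0.80, 0.55])   # standardized loadings (lambda')
residual_var = 1.0 - loadings ** 2              # delta' = 1 - lambda'^2

# Model-implied correlation matrix: Lambda * Phi * Lambda' + Theta_delta,
# with the factor variance (Phi) fixed to 1 in the standardized solution.
Lambda = loadings.reshape(-1, 1)
implied = Lambda @ Lambda.T + np.diag(residual_var)

print("Variance explained by the factor:", np.round(loadings ** 2, 3))
print("Residual (unexplained) variance: ", np.round(residual_var, 3))
print("Model-implied correlation matrix:\n", np.round(implied, 3))
```

In a confirmatory analysis, this model-implied matrix is compared with the observed matrix, and the discrepancy between the two is what fit indices such as RMSEA, CFI, and NNFI summarize.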
Item response modeling

Item response modeling was typically used to scale questionnaire items. The one-parameter (Rasch) model (Rasch, 1960) for dichotomous items models the probability of selecting Category 1 instead of 0 as

$$ P_i(\theta) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)} , \qquad (2) $$

where $P_i(\theta)$ is the probability of person $n$ scoring 1 on item $i$, $\theta_n$ is the estimated latent trait of person $n$, and $\delta_i$ is the estimated location of item $i$ on this dimension. For each item, item responses are modeled as a function of the latent trait $\theta_n$.
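To make the Rasch response function concrete, the following sketch evaluates this probability for a few hypothetical person and item parameters; the values are purely illustrative and do not correspond to ICCS calibration results.

```python
# Illustrative sketch of the Rasch (one-parameter) response probability;
# the person abilities and item location below are hypothetical values.
import numpy as np

def rasch_probability(theta, delta):
    """P(score 1) = exp(theta - delta) / (1 + exp(theta - delta)),
    written in the numerically stable logistic form."""
    return 1.0 / (1.0 + np.exp(-(np.asarray(theta) - delta)))

thetas = np.array([-1.0, 0.0, 0.5, 1.0])   # hypothetical latent trait values
delta_i = 0.5                              # hypothetical item location
print(np.round(rasch_probability(thetas, delta_i), 3))
# When a person's trait equals the item location (theta = delta), the
# probability of scoring 1 is exactly 0.5.
```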
