Conceptual Framework and Overview of Psychometric Properties

The reliability coefficient for all students (N = 569) across all items on The Report was a respectable .83. This indicates a fair degree of stability in students' responses, consistent with other psychometric tools measuring attitudes and experiences (Crocker & Algina, 1986). Some sections of the survey were more stable than others. For example, the reliability coefficient for the 20 College Activities items was .77. The coefficient for the 10 Opinions About Your School items was .70; for the 14 Educational and Personal Growth items, .69; for the five reading, writing, and nature of examinations items, .66; and for the six time usage items, .63. The mental activities and program items were the least stable, with coefficients of .58 and .57, respectively.

In 2002, we conducted a similar test-retest analysis with 1,226 respondents who completed the paper survey twice. For this analysis, we used the Pearson product-moment correlation to examine the reliability coefficients for the items used to construct our benchmarks. For the items related to three of the benchmarks (academic challenge, enriching educational experiences, and active and collaborative learning), the reliability coefficients were .74. The student interaction with faculty members items and supportive campus environment items had reliability coefficients of .75 and .78, respectively.

Summary. Taken together, these analyses suggest that the NSSE survey appears to be reliably measuring the constructs it was designed to measure. Assuming that respondents were representative of their respective institutions, data aggregated at the institutional level on an annual basis should yield reliable results.
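The test-retest calculation described above is straightforward to reproduce. The sketch below is a minimal illustration in Python, using made-up item scores (the actual NSSE response data are not reproduced here): it computes the Pearson product-moment correlation between two administrations of the same items, which is the reliability coefficient reported for the 2002 analysis.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired score lists."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores on one item from two administrations of the survey
time1 = [3, 4, 2, 5, 4, 3, 1, 4]
time2 = [3, 4, 3, 5, 4, 2, 1, 4]
print(round(pearson_r(time1, time2), 2))
```

A coefficient in the .74–.78 range reported above would indicate comparable stability; coefficients below roughly .60, as for the mental activities and program items, signal more measurement noise across administrations.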
The correlations are high between the questions common to both years. Some of the lower correlations (e.g., nature of exams, rewriting papers, tutoring) may be a function of slight changes in item wording and modified response options for other items on the later surveys (e.g., number of papers written). At the same time, compared with 2000, the 2001 and 2002 data reflect a somewhat higher level of student engagement on a number of NSSE items, though the relative magnitude of these differences is small.

Checking for Mode of Administration Effects

Using multiple modes of survey administration opens up the possibility of introducing a systematic bias in the results associated with the method of data collection. That is, do the responses of students who use one mode (i.e., Web) differ in certain ways from those who use an alternative mode such as paper? Further complicating this possibility is that there are two paths by which students can use the Web to complete the NSSE survey: (1) students receive the paper survey in the mail but have the option to complete it via the Web (Web-option), or (2) students attend a Web-only school and must complete the survey online (Web-only).

Using ordinary least squares (OLS) or logistic regressions, we analyzed the data from NSSE 2000 to determine if students who completed the survey on the Web responded differently than those who responded via a traditional paper format. Specifically, we analyzed responses from 56,545 students who had complete data for survey mode and all control variables. The sample included 9,933 students from Web-exclusive institutions and another 10,013 students who received a paper survey but exercised the Web-option.
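The regression check described above can be sketched as follows. This is a minimal illustration, not the actual NSSE model: the scores, the single control variable, and the `ols` helper are all invented for the example, and the real analysis used many more controls plus logistic regression for categorical outcomes. The idea is simply to regress an engagement score on a Web-mode dummy while holding a covariate constant.

```python
def ols(X, y):
    """Solve the normal equations (X'X)b = X'y by Gauss-Jordan elimination."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    A = [XtX[i] + [Xty[i]] for i in range(k)]  # augmented matrix
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(A[r][c]))  # partial pivoting
        A[c], A[p] = A[p], A[c]
        for r in range(k):
            if r != c:
                f = A[r][c] / A[c][c]
                A[r] = [a - f * b for a, b in zip(A[r], A[c])]
    return [A[i][k] / A[i][i] for i in range(k)]

# Hypothetical data. Columns: intercept, Web-mode dummy (1 = Web), one control
X = [[1, 1, 19], [1, 1, 22], [1, 0, 20], [1, 0, 25], [1, 1, 21], [1, 0, 18]]
y = [3.2, 3.5, 3.0, 3.4, 3.3, 2.9]
b0, b_mode, b_ctrl = ols(X, y)
print(f"adjusted Web-mode effect: {b_mode:.3f}")
```

With the control included, the mode coefficient estimates the Web-versus-paper difference net of that covariate; a coefficient near zero, coupled with a small effect size, would indicate no meaningful mode-of-administration effect.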
We controlled for a variety of student and institutional characteristics that may be linked to both engagement and mode. The control variables included: class, enrollment status, housing, sex, age, race/ethnicity, major field, 2000 Carnegie Classification, sector, undergraduate enrollment from IPEDS, admissions selectivity (from Barron's, 1996), urbanicity from IPEDS, and academic support expenses per student from IPEDS. In addition to tests of statistical significance, we computed effect sizes to ascertain if the
