Question and Questionnaire Design - Stanford University

randomized response technique significantly reduced socially desirable answers. However, many respondents probably do not understand the procedure, which may cause them not to follow the instructions. Edgell, Himmelfarb, and Duchan (1982), for example, found that many respondents would not give the directed response to a question if that response was a socially undesirable one and the question was sufficiently sensitive (see also Holbrook & Krosnick, 2005).

An approach similar to the randomized response technique, but one less likely to arouse respondent suspicion or confusion, is the "item count technique" (see, e.g., Droitcour et al., 1991). This approach randomly assigns respondents to one of two lists of items that differ only in whether a focal sensitive item is included. Respondents are asked how many of the items, in total, apply to them, not which apply to them. If the items are chosen appropriately, essentially no one will choose all or none, so it is possible to estimate the proportion to whom the focal item applies without knowing the identity of any particular respondent to whom it applies. As is true for the randomized response technique, however, the item count technique introduces an additional source of sampling error, which means that larger sample sizes are required. Experiments have found that, when compared to direct self-reports, the item count technique often yielded more reports of socially undesirable behaviors or attitudes (for reviews, see Holbrook & Krosnick, in press; Tourangeau & Yan, 2007).
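The item count estimator described above can be sketched in a few lines: because the two lists differ only in the sensitive item, the difference between the groups' mean counts estimates the proportion to whom that item applies. The function name and the example data below are invented for illustration only.

```python
# Minimal sketch of the item count technique's estimator:
# prevalence of the sensitive item = difference in mean counts
# between the group shown the long list (with the sensitive item)
# and the group shown the short list (without it).

def item_count_estimate(long_list_counts, short_list_counts):
    """Estimate the proportion to whom the sensitive item applies.

    long_list_counts: item counts reported by respondents randomly
        assigned the list that includes the sensitive item.
    short_list_counts: counts from respondents assigned the list
        without the sensitive item.
    """
    mean_long = sum(long_list_counts) / len(long_list_counts)
    mean_short = sum(short_list_counts) / len(short_list_counts)
    # The group-mean difference estimates prevalence without
    # revealing any individual respondent's answer.
    return mean_long - mean_short

# Invented example: four non-sensitive items plus one sensitive item.
long_group = [2, 3, 3, 4, 2, 3, 4, 3]
short_group = [2, 3, 2, 3, 2, 3, 3, 2]
print(item_count_estimate(long_group, short_group))  # prints 0.5
```

Note that this difference-in-means estimator has a larger variance than a direct self-report of the same proportion, which is why, as the text observes, larger sample sizes are required.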
In the instances where this difference did not appear, it could have been because social desirability bias did not distort the direct self-reports.

Another method designed to reduce social desirability bias attempts to save face for respondents by legitimating the less desirable response option. The most common approach involves noting in the question that many people do not engage in the socially desirable behavior, for instance: "In talking to people about elections, we often find that a lot of people were not able to vote because they weren't registered, were sick, or just didn't have time." Holbrook and Krosnick (2005) showed that this wording reduces voting reports.

In addition, yes/no response options can be converted into multiple response options, only one of which represents the desirable state, for instance:

1. I did not vote in the November 5th election.
2. I thought about voting this time, but didn't.
3. I usually vote, but didn't this time.
4. I am sure I voted in the November 5th election.

Belli, Traugott, Young, and McGonagle (1999) reported that offering these categories reduced voting reports, though their comparisons simultaneously varied other features as well.

Finally, consistent with our advice in the preceding section on don't knows, it is better not to provide explicit DK options for sensitive items, as they are more apt to provide a cover for socially undesirable responses.
