Applied Research

Single Sourcing and Content Management

(IRB) for the Protection of Human Participants at Towson University in Maryland.

Based on several formal interviews and some informal conversations with technical communicators about single sourcing and content management methods and tools, Dayton revised the survey and solicited reviews of the new draft from three practitioners with expertise in the subject matter and from an academic with expertise in survey research. Dayton again revised the survey in response to those reviewers' suggestions. Hopper then converted the survey into an interactive Web-delivered questionnaire using Zoomerang (a copy of the survey that does not collect data may be explored freely at http://www.zoomerang.com/Survey/WEB22B38UWBJKZ).

Moving the survey from a page-based format to multi-screen Web forms proved challenging. Multiple branching points in the sequence of questions created five primary paths through the survey: no SS/CM, SS only, CM only, SSwCM, and academics. Respondents not using SS or CM were presented with 20 or 21 questions, depending on whether their work group had considered switching to SS/CM methods and tools. Respondents in the three subgroups of SS/CM were presented with 30 to 33 questions, depending on their answers to certain ones. The version of the survey for academics contained 24 questions, but we ultimately decided to leave academics out of the sampling frame for reasons explained later.

For all paths through the survey, question types included choose one, choose all that apply, and open-ended. All fixed-choice questions included a final answer choice of "Other, please specify" followed by a space for typing an open-ended answer.

The first complete draft of the Web-based survey was pilot tested by about 30 participants, including practitioners, graduate students, and academics. The reported times for completing the survey ranged from less than 8 minutes to 25 minutes. Testers who went through the path for academics and the path for those not using SS or CM reported the fastest completion times and offered the fewest suggestions. Testers answering the questions for those using SS/CM suggested some improvements in wording, formatting, and answer options, most of which we agreed with and made changes to address.

Deployment of the Survey

The version of the survey for academics was entirely different from the four variations for practitioners. Following the pilot test, we reassessed the pros and cons of fielding two surveys at the same time. We were particularly concerned that the number of academic respondents would be quite small unless we drew a separate sample of only academic members. After the STC Marketing Manager assured us that academics could be filtered from the membership database before drawing a sample, we decided to limit the sampling frame to practitioners. (The sampling frame is the total population of people from whom the random sample is drawn.)

The sampling frame consisted of about 13,500 STC members, about 3,000 fewer than the total membership at that time (May 2008). In addition to excluding academics, students, and retirees, the STC Marketing Manager also excluded STC members who had opted not to receive messages from third-party vendors. From the sampling frame of about 13,500 members, the STC Marketing Manager drew a random sample of 1,000 using an automated function for that purpose available in the STC office's membership database application.

Over 11 days, the Marketing Manager e-mailed the sample four messages that we composed. The first e-mail went out on a Thursday: a brief message from STC President Linda Oestreich describing the survey and encouraging participation. The second e-mail was sent the following Tuesday, signed by us, inviting recipients to take the survey and providing a link to the consent form. (Researchers working for federally funded institutions are required by law to obtain the informed consent of anyone asked to participate in a research study.) Respondents accessed the survey by clicking the link at the bottom of the consent form. (Appendix C contains copies of the two e-mails mentioned above and the consent form.)

The Internet server housing the survey was configured to prohibit multiple submissions from the same computer. When a respondent completed the survey by clicking the Submit button on the final screen, a confirmation page displayed our thank-you message and offered respondents the option of e-mailing the

378 Technical Communication ● Volume 57, Number 4, November 2010
