
ENSURING VALIDITY 145

- avoiding having too long or too short an interval between pretests and post-tests
- ensuring inter-rater reliability
- matching control and experimental groups fairly
- ensuring standardized procedures for gathering data or for administering tests
- building on the motivations of the respondents
- tailoring the instruments to the concentration span of the respondents and addressing other situational factors (e.g. health, environment, noise, distraction, threat)
- addressing factors concerning the researcher (particularly in an interview situation); for example, the attitude, gender, race, age, personality, dress, comments, replies, questioning technique, behaviour, style and non-verbal communication of the researcher.
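The inter-rater reliability point above is commonly quantified with Cohen's kappa, which corrects raw agreement between two raters for the agreement expected by chance alone. A minimal sketch in Python (standard library only; the two raters' codings of ten hypothetical interview transcripts are invented for illustration):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codings of ten transcripts by two raters.
a = ["pos", "pos", "neg", "neg", "pos", "neu", "neg", "pos", "neu", "pos"]
b = ["pos", "neg", "neg", "neg", "pos", "neu", "neg", "pos", "pos", "pos"]
print(round(cohens_kappa(a, b), 3))  # -> 0.672
```

Kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance, so it is a more honest reliability figure than the raw 80 per cent agreement in this example.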

At the stage of data analysis there are several areas where invalidity lurks; these might be minimized by:

- using respondent validation
- avoiding subjective interpretation of data (e.g. being too generous or too ungenerous in the award of marks), i.e. lack of standardization and moderation of results
- reducing the halo effect, where the researcher’s knowledge of the person or knowledge of other data about the person or situation exerts an influence on subsequent judgements
- using appropriate statistical treatments for the level of data (e.g. avoiding applying techniques from interval scaling to ordinal data, or using incorrect statistics for the type, size, complexity, sensitivity of data)
- recognizing spurious correlations and extraneous factors which may be affecting the data (i.e. tunnel vision)
- avoiding poor coding of qualitative data
- avoiding making inferences and generalizations beyond the capability of the data to support such statements
- avoiding the equating of correlations and causes
- avoiding selective use of data
- avoiding unfair aggregation of data (particularly of frequency tables)
- avoiding unfair telescoping of data (degrading the data)
- avoiding Type I and/or Type II errors (see http://www.routledge.com/textbooks/9780415368780 – Chapter 6, file 6.2.doc). A Type I error is committed where the researcher rejects the null hypothesis when it is in fact true (akin to convicting an innocent person: Mitchell and Jolley 1988: 121); this can be addressed by setting a more rigorous level of significance (e.g. ρ
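The significance level is exactly the Type I error rate the researcher is willing to tolerate: when the null hypothesis is true, a test at level α wrongly rejects about α of the time, and lowering α makes such false rejections rarer (at the cost of more Type II errors). A minimal simulation sketch in Python (standard library only; the one-sample z-test setup is invented for illustration):

```python
import math
import random

random.seed(42)

def p_value_two_sided(z):
    """Two-sided p-value for a standard-normal test statistic."""
    phi = 0.5 * (1 + math.erf(abs(z) / math.sqrt(2)))  # Phi(|z|)
    return 2 * (1 - phi)

ALPHA = 0.05      # significance level = Type I error rate we accept
N, TRIALS = 30, 20_000
false_rejections = 0
for _ in range(TRIALS):
    # H0 is true by construction: the sample really does come from N(0, 1).
    sample = [random.gauss(0, 1) for _ in range(N)]
    z = (sum(sample) / N) * math.sqrt(N)   # z-statistic for testing mean = 0
    if p_value_two_sided(z) < ALPHA:
        false_rejections += 1              # a Type I error

# The observed rate is close to ALPHA; a stricter ALPHA (e.g. 0.01)
# would make it correspondingly smaller.
print(false_rejections / TRIALS)
```

Running the loop again with a smaller `ALPHA` demonstrates the trade-off directly: fewer innocent "convictions", but a weaker chance of detecting a real effect.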
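The earlier point about matching the statistic to the level of the data can be made concrete: Spearman's rho uses only the rank order of the values, so it suits ordinal data (e.g. Likert ratings), whereas Pearson's r assumes interval scaling. A sketch in Python (standard library only; the questionnaire ratings are invented for illustration):

```python
def ranks(xs):
    """Average (midrank) 1-based ranks, handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank for the tied run
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    # Spearman's rho = Pearson's r computed on the ranks, so only the
    # ordering of the values matters -- appropriate for ordinal data.
    return pearson(ranks(x), ranks(y))

# Hypothetical ordinal (1-5 Likert) ratings from two questionnaire items.
item1 = [1, 2, 2, 3, 4, 5, 5, 3]
item2 = [2, 1, 3, 3, 4, 5, 4, 2]
print(round(spearman(item1, item2), 3))
```

Applying `pearson` directly to Likert codes would treat the gap between "agree" and "strongly agree" as a fixed quantity, which the data cannot support; ranking first avoids that assumption.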
