
patterns (Colarelli, 1998; Griffiths, 1999; Hartley, 2002; Heaney, Israel, Schurman, Baker, House, & Hugentobler, 1993). In unpredictable or uncontrolled settings such an approach (1) raises the risk of Type III error (erroneously concluding an intervention is ineffective when it is actually its implementation that is faulty; Dobson & Cook, 1980) and (2) limits explanatory yield (e.g., inconsistent intervention effects remain difficult to explain; see Cox et al., 2000b; Parkes & Sparkes, 1998). Moreover, controlled or predictable intervention exposure patterns occur so infrequently that there is a need for alternative ways of managing quantitative evaluation¹ that are viable in the face of unpredictable and uncontrollable exposure patterns (Colarelli, 1998; Kompier, Aust, van den Berg, & Siegrist, 2000a). In summary, the identification of causal relationships may be hindered unless study designs are adapted to reflect true, but uncontrollable and unpredictable, patterns of intervention exposure.

ADAPTED STUDY DESIGNS AS AN EVALUATION STRATEGY

Applied social scientists (such as those evaluating public health promotion or large-scale community education programmes) have achieved good results by being flexible in their application of the principles of study design (Fitzgerald & Rasheed, 1998; Harachi, Abbott, Catalano, Haggerty, & Fleming, 1999; Lipsey & Cordray, 2000). When working in community settings, for example, they have adapted study designs, through the use of process evaluation, to reflect actual intervention exposure patterns (Lipsey, 1996; Lipsey & Cordray, 2000). On-going or post hoc measures of intervention exposure (i.e., process evaluation: see Kompier & Kristensen, 2000; Yin, 1994, 1995; Yin & Kaftarian, 1997) have been used to identify or adapt the evaluation design so that the evaluation can "work backward from the target clientele and what they actually receive/experience, not forward from the intervention activities and what the intervention agents purportedly deliver" (Lipsey, 1996, p. 301).
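As a purely illustrative sketch (not drawn from the studies cited above), this "working backward" logic can be expressed as an exposure-conditioned analysis: participants are grouped by their measured exposure to the intervention, as recorded in a process evaluation, rather than by their intended assignment, and change in the outcome is then compared across those groups. All data and variable names below (exposure_score, strain_pre, strain_post) are hypothetical.

```python
# Illustrative sketch only: an exposure-conditioned comparison that "works
# backward" from what participants actually received. All values and column
# names are hypothetical.
import pandas as pd

# One row per participant: a process-evaluation measure of actual intervention
# exposure plus pre- and post-intervention strain scores.
df = pd.DataFrame({
    "exposure_score": [0, 1, 3, 4, 0, 2, 4, 1],   # e.g., number of intervention activities actually experienced
    "strain_pre":     [3.2, 3.5, 3.1, 3.8, 3.0, 3.4, 3.9, 3.3],
    "strain_post":    [3.1, 3.4, 2.6, 3.0, 3.1, 3.0, 3.2, 3.3],
})

# Group participants by measured exposure rather than by intended assignment.
df["exposure_group"] = pd.cut(df["exposure_score"], bins=[-1, 0, 2, 4],
                              labels=["none", "partial", "full"])

# Compare mean change in strain across the exposure groups.
df["strain_change"] = df["strain_post"] - df["strain_pre"]
print(df.groupby("exposure_group", observed=True)["strain_change"].mean())
```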

Given the sometimes insurmountable difficulties associated with intentionally introducing and controlling intervention exposure, this flexible approach to evaluation offers a practical means of evaluating stress management interventions. Data on exposure to interventions can be obtained through an intervention process evaluation (i.e., questioning participants about their experiences and triangulating those data with documentary information and by interviewing those involved in planning and implementing interventions; Griffiths, 1999; Nytro, Saksvik, Mikkelsen,

¹ The authors recognize that qualitative methods also offer viable alternative approaches to evaluation in chaotic organizational settings (see Kompier et al., 2000a).
