
ISBN 9789526046266


There are values that are internal to science and epistemically relevant (Phillips and Burbules, 2000, p. 54). For instance, one must not falsify evidence and must be open to criticism. Without these values – which “foster the epistemic concerns of science as an enterprise that produces competent warrants for knowledge claims” – “scientific inquiry loses its point”.

External and epistemically irrelevant values range from the personal (e.g., “computing education is important and we must make sure everyone who does our courses learns at least the basics”) to externally given agendas (e.g., “we need to find out how best to advertise our department to potential future students”). These values may and do (and must) guide the scientist as they choose which research questions are important enough to merit investigation – a decision which may change during an investigation. They also affect the courses of action that we take and recommend on the basis of our results. However, we should minimize the intrusion of epistemically irrelevant values into the process of establishing empirical evidence for knowledge claims about our object of research. In other words, our results are ideally not affected by what we would prefer them to be.

Phillips and Burbules (2000) are careful to point out that “of course the judgment about what is epistemically relevant or irrelevant is itself – like all judgments – potentially a fallible one” (p. 54).

On methods and results<br />

Various pragmatists and mixed-methods scholars (recently, e.g., Patton, 2002; Johnson and Onwuegbuzie, 2004; Morgan, 2007; Tashakkori and Teddlie, 2010) have argued that there is a disconnect, or only a very loose connection, between philosophy and research methods. Johnson and Onwuegbuzie (2004), for example, caution that “there is rarely entailment from epistemology to methodology” (p. 15). Patton (2002) makes a similar comment and concludes: “In short, in real-world practice, methods can be separated from the epistemology out of which they have emerged” (p. 136). Similarly, a set of research findings can be useful to researchers and practitioners who subscribe to different ontological and epistemological assumptions than the original researchers. The final word on the epistemological status of our results lies with our readers, who will interpret our results on the basis of their own beliefs.

I will return to the fundamentals of qualitative and quantitative research shortly to discuss the trustworthiness of our research. Now, let us return to our research questions.

16.2.3 The following chapters describe several interrelated studies<br />

The pragmatist may use any methods for empirical work that produce useful results and that suit the research goals. That is what we have done.

To answer Question 1, “In what ways do novice programmers experience learning through visual program simulation?”, we adopted a phenomenographic perspective. As discussed in Chapter 7, phenomenography is a research approach that is geared towards studying the different ways in which people experience phenomena in educational contexts. In keeping with our research questions – and as is typical of phenomenographic studies – we used qualitative methods to explore learners’ understandings of VPS.

We also adopted a qualitative orientation to answer Question 2, “What happens during visual program simulation sessions?” More specifically, we used a form of data-driven, qualitative content analysis, an approach suitable for the exploration of complex data.

We approached Question 3, “Does a short VPS session help produce short-term improvement in learners’ ability to predict the behavior of given programs?”, quantitatively, and used an experimental setup to measure the short-term impact of VPS in classroom use.

Finally, we sought to answer Question 4, “How do students react to the use of UUhistle in CS1 (and why)?”, through both qualitative and quantitative analyses of course feedback questionnaires, which are an inexpensive way of surveying the opinions of many students.

Table 16.1 presents an overview of our studies.<br />

