A Moving Target—The Evolution of Human-Computer Interaction

Definitions: HCI, CHI, HF&E, IT, IS, LIS

HCI is often used narrowly to refer to work in the discipline of an author or instructor. I define it very broadly to cover major threads of research in four disciplines: human factors, information systems, computer science, and library & information science. Later, I elaborate on differences in the use of simple terms that make the relevant literatures difficult to explore. Here I explain how several key disciplinary labels are used. CHI (computer-human interaction) here has a narrower focus than HCI; CHI is associated mainly with computer science, the Association for Computing Machinery Special Interest Group (ACM SIGCHI), and the latter's annual CHI conference. I use human factors and ergonomics interchangeably and refer to the discipline as HF&E. (Some writers define ergonomics more narrowly around hardware.) The Human Factors Society (HFS) became the Human Factors and Ergonomics Society (HFES) in 1992. IS (information systems) refers to the management discipline that has also been labeled data processing (DP) and management information systems (MIS). I follow common parlance in referring to organizational information systems specialists as IT professionals or IT pros. LIS (library and information science) represents an old field with a new digital incarnation that includes important HCI research. With IS taken, I do not abbreviate information science, a discipline that increasingly goes by simply 'information,' as in "Information School" or "School of Information."

Conventions: Moore's Law and Inflation<br />

A challenge in interpreting past events and the literature is to keep in mind the radical differences in what a typical computer was from one decade to the next. We are familiar with Moore's law, but not with its many subtle effects. To some extent, conceptual development can be detached from hardware, but the evolving course of research and development cannot. Because we do not reason well about supralinear or exponential growth, we often failed to anticipate how rapidly change would come, and when it came we did not realize the role of the underlying technology. Moore's law specifies the number of transistors on an integrated circuit, but it is useful to consider the broader range of phenomena that exhibit exponential growth. Narrowly defined, Moore's law may soon be revoked, but broadly defined this is unlikely. The health of the technology industry is tied to ongoing hardware innovation. This provides the motivation and resources to continue, perhaps through novel materials and cooling techniques, three-dimensional architectures, optical computing, more effective parallelism, or other means. Do not underestimate human ingenuity when so much is at stake. There is also great opportunity for innovation and increased efficiency in software. Finally, there is a tendency in the literature to state costs in terms of dollars at the time of the events described, although one dollar when the first commercial computers appeared was equivalent to ten dollars today. I have converted prices, costs and grant funding to U.S. dollars as of 2012.
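
A minimal sketch may make these two conventions concrete. The snippet below is illustrative only, not from the original text: the two-year doubling period is the commonly quoted form of Moore's law, the tenfold multiplier simply restates the conversion figure given above, and the function names are invented for illustration.

    # Illustrative sketch (assumptions noted): steady doubling, and the
    # chapter's inflation convention. The two-year doubling period is the
    # commonly quoted form of Moore's law; the tenfold multiplier restates
    # the early-1950s-dollars-to-2012-dollars conversion given in the text.

    DOUBLING_PERIOD_YEARS = 2    # assumed: common statement of Moore's law
    INFLATION_MULTIPLIER = 10    # assumed: $1 then is roughly $10 in 2012

    def capability_growth(years: float) -> float:
        """Relative capability after `years` of steady doubling."""
        return 2 ** (years / DOUBLING_PERIOD_YEARS)

    def to_2012_dollars(amount_then: float) -> float:
        """Convert an early-1950s price to approximate 2012 dollars."""
        return amount_then * INFLATION_MULTIPLIER

    for years in (10, 20, 40, 60):
        print(f"{years:2d} years of doubling -> {capability_growth(years):,.0f}x")
    # 10 years gives 32x; 40 years gives roughly a million-fold increase.
    # This is why linear intuition fails so badly across decades.

    print(f"$1 at the dawn of commercial computing ~= ${to_2012_dollars(1):,.0f} in 2012")

Forty years of steady doubling yields about a million-fold improvement, which is the scale of change that separates the typical computer of one era from that of another a few decades later.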

HUMAN-TOOL INTERACTION AND INFORMATION PROCESSING AT THE DAWN OF COMPUTING

In the century prior to the advent of the first digital computers, advances in technology gave rise to two fields of research that later contributed to human-computer interaction. One focused on making the human use of tools more efficient, the other on ways to represent and distribute information more effectively.

Origins of Human Factors

Frederick Taylor (1911) employed technologies and methods developed in the late 19th century—photography, moving pictures, and statistical analysis—to improve work practices by reducing performance time. Time-and-motion studies were applied to assembly-line manufacturing and other manual tasks. Despite the uneasiness with “Taylorism” reflected in Charlie Chaplin’s popular satire Modern Times, scientists and engineers continued working to boost efficiency and productivity using this approach.

Lillian Gilbreth (1914) and her husband Frank were the first engineers to combine psychology and scientific management. Lillian Gilbreth focused more holistically than Taylor on efficiency and worker experience; some consider her the founder of modern Human Factors. Her PhD was the first degree awarded in industrial psychology. She went on to advise five U.S. presidents and became the first woman inducted into the National Academy of Engineering.

World War I and World War II accelerated efforts to match people to jobs, train them, and design equipment that could be more easily mastered. Engineering psychology was born during World War II after simple flaws in the design

