Encyclopedia of Computer Science and Technology

risks of computing

Programmers and managers of software development are generally aware of the need for software to deal properly with erroneous data (see error handling). They know that any significant program will have bugs that must be rooted out (see bugs and debugging). Good software engineering practices and a systematic approach to assuring the reliability and quality of software can minimize problems in the finished product (see software engineering and quality assurance, software).
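Handling erroneous data starts with refusing to act on input that cannot be validated. The following sketch is purely illustrative (the function name, field, and limits are invented, not drawn from any real system): it parses an operator-entered value and rejects anything non-numeric or out of range rather than silently accepting it.

```python
def parse_dose(raw: str, max_units: float = 200.0) -> float:
    """Parse an operator-entered dose value, rejecting erroneous
    data instead of passing it on (hypothetical example)."""
    try:
        dose = float(raw)
    except ValueError:
        raise ValueError(f"dose must be numeric, got {raw!r}")
    if not (0.0 < dose <= max_units):
        raise ValueError(f"dose {dose} outside safe range (0, {max_units}]")
    return dose

print(parse_dose("150"))       # accepted: prints 150.0
try:
    parse_dose("15O")          # letter O for zero, a classic entry error
except ValueError as e:
    print("rejected:", e)
```

The point is that the safe failure mode (reject and report) is chosen deliberately, rather than letting a conversion error propagate into later processing.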
However, serious bugs are not always caught, and sometimes the consequences can be catastrophic. For example, in the Therac-25 computerized X-ray cancer treatment machine, poorly thought-out command entry routines plus a counter overflow resulted in three patients being killed by massive X-ray overdoses. The overdoses ultimately occurred because the designers had removed a physical interlock mechanism they believed was no longer necessary.

Any computer application is part of a much larger environment of humans and machines, where unforeseen interactions can cause problems ranging from inconvenience to loss of privacy to potential injury or death. Seeing these potential pitfalls requires thinking beyond the specifications and needs of a particular project. For many years the Usenet newsgroup comp.risks (and its collected form, Risks Digest) has chronicled what amounts to an ongoing symposium where knowledgeable programmers, engineers, and others have pointed out potential risks in new technology and suggested ways to minimize them.

Unexpected Situations

A common source of risks arises from designers of control systems failing to anticipate extreme or unusual environmental conditions (or interactions between conditions). This is a particular problem for mobile robots, which unlike their tethered industrial counterparts must share elevators, corridors, and other places with human beings. For example, a hospital robot was not designed to recognize when it was blocking an elevator door, a situation that could have delayed a patient being rushed into surgery. A basic principle of coping with unexpected situations is to try to design a fail-safe mode that does not make the situation worse.
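The fail-safe principle can be expressed in code as a controller that, on any fault, drives its actuator to the state that cannot harm people. This is a minimal sketch under assumed names (the `DoorController` class and its commands are hypothetical, not from any real product):

```python
# Fail-safe control sketch: any fault releases the lock, so a
# failure can never leave people trapped behind a locked door.
class DoorController:
    def __init__(self):
        self.locked = False

    def command(self, action: str) -> None:
        try:
            if action not in ("lock", "unlock"):
                raise ValueError(f"unknown command {action!r}")
            self.locked = (action == "lock")
        except Exception:
            # Fail safe: on error, fall back to the harmless state
            # (unlocked) rather than an unknown, possibly locked one.
            self.locked = False
            raise

door = DoorController()
door.command("lock")
try:
    door.command("lcok")   # garbled input triggers the fail-safe path
except ValueError:
    pass
print(door.locked)         # False: the fault left the door unlocked
```

The design choice is which state counts as "safe": for a door it is unlocked; for a radiation source it would be off.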
For example, an automatic door should be designed so that if it fails it can be opened manually rather than trapping people in a fire or other disaster.

Unanticipated Interactions

The more systems there are that can respond to external inputs, the greater the risk that a spurious input might trigger an unexpected and dangerous response. For example, the growing number of radio-controlled (wireless) devices creates great potential for unexpected interactions between different devices. In one case reported to the Risks Forum, a Swedish policeman’s handheld radio inadvertently activated his car’s airbag, which slammed the radio into him. Several military helicopters have crashed because of radio interference. Banning the use of electronic devices at certain times and places (for example, aboard an aircraft that is taking off or landing) can help minimize interference with the most safety-critical systems.

At the same time, regulations themselves introduce the risk that people will engage in other forms of risky behavior in an attempt to either follow or circumvent the rule. For example, the Japanese bullet train system imposed a stiff penalty on operators who failed to wear a hat. In one case an operator left the train cabin to retrieve his hat while the train kept running unsupervised. This minor incident actually conceals two additional sorts of risks: that of automating a system so much that humans no longer pay attention, and the inability of the system to sense the lack of human supervision.

Unanticipated Use of Data

The growing number of different databases that track even the intimate details of individual lives has raised many privacy issues (see privacy in the digital age).
Designers and maintainers of such databases had some awareness of the threat of unauthorized persons breaking into systems and stealing such data (see computer crime and security). However, most people were surprised and alarmed by the new crime of identity theft, which began to surface in significant numbers in the mid- to late 1990s (see identity theft).

It turned out that while a given database (such as customer records, bank information, or illicitly obtained DMV records) usually did not hold enough information to allow someone to successfully impersonate another person, it was not difficult to use several of these sources together to obtain, for example, the information needed to apply for credit in another’s name. In particular, while most people guarded their credit card numbers, they tended not to worry as much about Social Security numbers (SSNs). However, since many institutions use the SSN to index their records, the number has become a key for unlocking personal data.

Further, as more organizations put their records online and make them Web-accessible, the ability of hackers, private investigators (legitimate or not), and “data brokerage”
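The SSN-as-shared-key problem described above can be sketched in a few lines: no single record is enough to impersonate someone, but joining several sources on the SSN assembles a usable profile. All of the data below is fabricated for illustration (the SSN shown is a well-known invalid sample number).

```python
# Toy records from three unrelated sources, each indexed by SSN.
bank   = {"078-05-1120": {"name": "J. Doe", "account": "9912"}}
dmv    = {"078-05-1120": {"address": "12 Elm St", "dob": "1970-01-01"}}
retail = {"078-05-1120": {"phone": "555-0100"}}

def merge_on_ssn(ssn: str, *sources: dict) -> dict:
    """Join records from multiple databases on their common SSN key."""
    profile = {"ssn": ssn}
    for src in sources:
        profile.update(src.get(ssn, {}))
    return profile

print(merge_on_ssn("078-05-1120", bank, dmv, retail))
# The merged profile combines name, address, DOB, and phone:
# enough, in aggregate, to apply for credit in another's name.
```

The risk is structural: any widely shared index key turns otherwise harmless partial records into a join key for a complete dossier.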
