
The Big Idea

[Figure: Percentage of U.S. Labor Force in Agriculture, Industry, Information, and Services, 1880–1980, shown in three stages. Source: SRI International, "The Post-Industrial Society: Shift in Economic Activity Already Made," cited in Stewart Brand, The MIT Media Lab (Viking Penguin, New York, 1987). Reprinted with permission of SRI International, Menlo Park, California.]

Article abstract: It also becomes critical to manage knowledge more deliberately when: the products and services you sell have a greater knowledge component, as compared to the cost of the raw materials; you expand your business into other parts of the world, where much that you know now does not apply; and/or your organization has sprawled across lines of business and geography, making informal knowledge-sharing difficult. All three of these conditions are affecting sufficient numbers of firms to be characterized as major trends. There are a few others forcing managerial attention on knowledge . . .

In a relatively stable business environment, an organization's people tend to stay put and naturally become highly knowledgeable over time. Tacitly, they absorb and socialize knowledge about the company's products and services, its markets, customers, competitors, and suppliers—and once gained, that knowledge sustains them indefinitely. Knowledge becomes embedded in the firm's routines and culture. New recruits learn from old hands purely by working alongside them, and exposure and seasoning are a far more important learning mechanism than training. In such an environment it is safe to assume that sufficient knowledge and capabilities exist in the organization, or that incremental learning occurs fast enough, to deal with contingencies. Time, logic, and experiments solve most problems.

Now, rapid change means quicker knowledge obsolescence, and a need to scale new learning curves in unnaturally compressed time frames. Every week in a typical company brings news of some emerging market, some hot technology, some unexpected form of competition—opportunities all, if only the company had the knowledge base to deal with them. Trying to keep pace, management constantly introduces internal change. New strategies, new structures, new processes, new tools—all create the need for many people to learn new things at once.

Huge overhauls of knowledge bases don't happen naturally. So where they have happened, they have often brought trauma. The fastest way to change the knowledge of an organization, after all, is to replace the people. Unfortunately, it's a stupid way, because it only means that the same type of coup will have to occur the next time major change hits the business. And meanwhile, much that was valuable in the old knowledge base—but that was captured only in organizational stories and cultural artifacts—has been lost.

Smart Products and Service Intensity

The need to manage knowledge actively becomes more obvious when what you sell is knowledge. For a research lab, a consulting firm, or a software vendor not to manage knowledge would be equivalent to Wal-Mart not managing inventory, or Ford not managing production. Interestingly, though, it's not just the gurus who are selling knowledge these days. Firms from BP, which drills oil, to Senco, which makes nails, now routinely describe themselves as being "in the knowledge business."

This is because the make-up of today's products and the way in which they are delivered encapsulate an unprecedented amount of knowledge. In the extreme, this takes the form of "smart products"—things that can, for example, diagnose their own maintenance requirements or adapt to a particular owner's preferences. More broadly, we are seeing a rise in R&D expense (one proxy for measuring knowledge investment) as a proportion of cost of goods sold. The price of a camcorder has fallen by about 80% in six years' time, yet today's models have more engineering expertise behind them than ever. Knowledge-intensity in products is also resulting from a trend toward "mass customization," which essentially builds greater knowledge of particular customers' needs into what used to be a standardized product. John Deere seeders roll efficiently off the production line, but given thousands of possible variants, each one is tailored to its individual buyer. Could anyone deny this is now a more knowledge-intensive product? (One wonders how they're going to keep them down on the farm . . . )

Finally, as firms increasingly bundle products with service in their pricing, they are increasing the knowledge component of what they sell. A seller of lighting fixtures quickly discovers that different levels of service are sought by Home Depot, Saks Fifth Avenue, and an interior design firm. The firm that is able to translate that knowledge into tailored offerings stands to increase its business with every account. And in what might otherwise be a commodity business, it will see its profit margin widened disproportionately by this knowledge component.

It's Not a Small World After All

Global integration of the economy lets more and more firms, globally run and sourced, produce more and more goods for each dollar of profit. In the US, our share of global GDP went from 52% to 23% in just a few decades. Even though the pie has grown much bigger, our share in it is fiercely contested. In fact, the challenges of globalization may be alerting more executives to the need for knowledge management than anything else.

As companies try to position themselves to expand within the global economy, their efforts are often stultified by clear deficiencies in knowledge. Their people simply do not know enough about how to spot global opportunities, or once an opportunity is spotted, how business is done in that part of the world. Worse yet, they may not understand the basic model by which the business succeeds, or how to replicate that success in new outposts.

The huge scope of the modern organization makes an important case for more deliberate knowledge management. Sheer numbers are one problem: at Ernst & Young, for example, a piece of intellectual capital (i.e., knowledge codified and distributed) that is important to only one-tenth of employees must still find its way into 7,000 heads! Geography brings additional challenges: if knowledge is only transferred through proximity and exposure, how long does it take for something that is known in Munich to make it to Michigan? This is the problem that inspired Hewlett-Packard's Lew Platt to say: "If only HP knew what HP knows, we could be three times more productive!"


One last point on the scope of today's organization: highly diversified or vertically integrated firms may have heightened needs for knowledge management because they do not choose to concentrate on core competencies. Where the variety of businesses and types of operations is great, the chances diminish that important knowledge will simply seep through the organization informally and naturally. As Dorothy Leonard-Barton points out, in a volatile world, core competencies can become core rigidities. It is more expedient to learn how to learn than to learn a specific subject.

Here Today, Gone Tomorrow

Even those rare firms who have not seen their knowledge needs change dramatically—who perhaps operate in mature industries or rely little on innovation—recognize an increasing need for knowledge management. This is because, while they may require the same basic knowledge base, they are typically asking a smaller number of employees to house it. Downsizing, the scourge of the nineties, is a severe strain on organizational knowledge. By removing slack from a worker's day, it makes new knowledge generation or acquisition difficult. At worst, downsizing is the intellectual capital equivalent of strip-mining, since it usually begins by early-retiring a firm's most experienced people and driving away its most talented.

Whether due to firms' disloyalty to workers or vice versa, or other forces altogether, workforce mobility is a fact of modern life. No organization can take its knowledge base for granted—erosion occurs with every position that turns over. Recognizing this means understanding that continuous investment is necessary, and not just in the knowledge base of individuals, but in the shared knowledge base of the firm. Firms who do only the former may become exploited as training grounds: spend two years in their new management program, then cash in by taking that expertise elsewhere. Enlightened firms don't react by curtailing such development, but they do find ways to make knowledge transfer a two-way street. By setting out to manage knowledge, to represent what people know and make it accessible, they turn individual knowledge into a transferable asset.

The reduction of employee bases and the growing attrition rate within them become, of course, even bigger problems when the firm does not have the luxury of stable knowledge needs, but in fact must advance rapidly in gaining new knowledge. It seems inconceivable that, without active management, a firm could hope to meet escalating knowledge needs with fluctuating—or fewer—knowledge workers.

The Coming Virtuality

Knowledge management is also being necessitated by the changing structure of organizations, and particularly the desire to integrate far-flung operations. Businesses that were once organized along geographic lines are now reorienting themselves according to markets, or products, or processes—or all of the above in complex matrices. Within


Article abstract: Recognition is growing that there is much to be gained through knowledge management. As well as the underlying forces outlined in this article, there are the positive examples of a vanguard of firms. Among other data points, the growing number of "CKOs" and equivalents indicates the commitment many firms are now making to a more knowledge-based future.

[Figure: Forecasted Dollars Organizations Will Spend on Knowledge Management Consulting Services, 1994–1999. Source: DataQuest, 1996.]

Even Better Reasons for Managing Knowledge

The pace of change, the knowledge-intensity of goods and services, the growth in organizational scope, staff attrition, new structures, and information technology . . . All of these forces are leading executives to more formal knowledge management. There is a growing recognition of knowledge as an asset, which can be substituted for land, labor, or capital, and can be a greater force than any of those in the production of goods and services.

For those executives, however, who somehow remain untouched or unmoved by such underlying forces, there is an even more powerful argument for knowledge management: the success of the vanguard who have already taken on the challenge. Consider the team at Hoffmann-LaRoche, which worked to make the knowledge requirements of new drug approval more explicit. By substantially reducing the time to market of their next new product, they earned the company millions. Or the architects of the several initiatives underway at Hewlett-Packard, improving how knowledge is generated, captured, and transferred around the organization. Or the group at Monsanto which has constructed a knowledge base to make new and important insights instantly accessible. Successes like these began with a recognition that knowledge management is now possible and necessary in ways it hasn't been in the past. And they are just the beginning. After all, firms have been managing, analyzing, and measuring land, labor, and capital for several hundred years. By contrast, we have only just begun to understand and analyze the workings of knowledge in organizations.

It's no wonder that most executives are struggling to understand exactly what to do with knowledge. But the few who are figuring it out are showing us the way forward. Along the way, they're making it clear why the rest of the business world is turning its attention to knowledge, and why, if your management team hasn't, it should now.


Decreto Rectoral 1088, page 11

Article 24. Induction Course for Academic Monitors or Peer Tutors. As part of the initial training process, a student who has been selected as an Academic Monitor or Peer Tutor for the first time must attend the induction course for academic monitors or peer tutors, which will be delivered in a single eight-hour in-person session, on the dates set by the Departamento de Planeación y Desarrollo Académico.

Article 25. Teacher Training Seminars. As part of the ongoing training process, and for as long as he or she serves as an academic monitor or peer tutor, the student may take three teacher training seminars out of the full set of seminars the University offers for this purpose. This offering will appear in the University's course catalog.

Teacher training seminars will be recorded in the Academic Monitor's or Peer Tutor's academic record as two-credit elective courses; they will receive a numerical grade that counts toward the average for the term in which they are taken, and enrolling in them will not incur tuition fees.

However, if the student fails the course, he or she may repeat it up to two times, under the conditions established by the Academic Regulations, but must pay the amount corresponding to the course's number of credits.

Students who are not part of the Academic Monitors or Peer Tutors program but who wish to take the teacher training seminars as electives may register those credits within the established enrollment ranges, with the corresponding payment of tuition fees.

Paragraph. To complement ongoing training, the participation of Academic Monitors or Peer Tutors as research assistants on projects related to education, and in the faculty development courses offered by the Departamento de Planeación y Desarrollo Académico for the University's professors, will be encouraged. The latter will serve as an incentive for further training but will not be recognized as elective credits in the student's academic record.

CHAPTER X. EVALUATION OF THE PROGRAM

Article 26. Definition of Evaluation. Evaluation is defined as the set of activities through which the fulfillment of the objectives proposed for the Academic Monitors or Peer Tutors Program, as well as the Monitor's or Peer Tutor's performance of his or her duties and responsibilities, is assessed quantitatively and qualitatively.

It is a permanent and systematic process that, by identifying achievements, strengths, and weaknesses, supports decision-making for the continuous improvement of the Program.

Article 27. Objectives of the Evaluation. The fundamental objectives of the evaluation process are:


The Big Idea

A: That's right. And to do that, it has to have memory, and it has to have filters. And that brings us to another feature of smart products: that they anticipate. So we've said that a knowledge-based TV Guide should be connected, customized, upgradable, interactive, learning, and anticipating. Well, we've just said quite a mouthful when you apply it to any other offering.

Q: Are you saying that many products in the future will have all these attributes?

A: It's not that every one of them is required for something to be called a "knowledge-based" offering. But the more of them you have in your offering, the more indeed it is knowledge-based. And I would hasten to add that this applies not only to products and services. If you're talking about building a knowledge-based business, you have to look at every aspect of the business. For example, what does it mean to talk about knowledge-based customers and markets? Knowledge-based resources? Knowledge-based models of competing? Knowledge-based processes for running the business? All of these things can involve connectedness, customization, learning, and the rest.

Q: Let's talk about the competitive model. How is that going to change?

A: Brian Arthur's recent article in Harvard Business Review notes how the competitive model is slowly, slowly changing from an industrial model in which the law of diminishing returns operated to the information-era model in which the law of increasing returns operates. In a knowledge-intensive business, the critical competitive issue is to establish the standard—get everybody using your product. Once everybody's "locked in" to your standard, you have effectively shut out competitors and your returns continue to increase rather than diminish. This turns the industrial model on its head. Instead of the conventional wisdom of "build a better mousetrap and the world will beat a path to your door," it's about building a better path. And forget the mousetrap, because it is a trap. The product can't be the focus because it's going to have a lifecycle measured in months, if not seconds, and will be constantly upgraded.


Q: You're obviously thinking about knowledge at a highly strategic level. Most of the "knowledge initiatives" we hear about focus more on the internal organization—creating processes, tools, and structures to facilitate knowledge transfer and use. Are these focusing on the wrong goal?

A: Let me say this about "knowledge-based organizations." The only reason an organization exists is to carry out the fundamental purpose of the institution: a hospital exists to take care of people's health; a school exists to educate people; a commercial firm exists to satisfy marketplace needs for goods and services. The organization is not the end, but the means to the end. Therefore, it has to follow and not precede the business. I believe you cannot have a knowledge-based organization until you have a knowledge-based business. And all these people who are trying to build knowledge-based organizations without first focusing on building knowledge-based businesses are putting the cart before the horse. They are going to make a lot of gurus rich by buying books and going to speeches, but it's all going to come to naught. If they don't understand what it is they're doing it for, they're just going to build bureaucracies, and take their eye off the reason they exist—which, again, is to satisfy market needs with appropriate products and services.

Q: Why aren't more firms "knowledge-based" today? Are there still major hurdles to overcome? Does information technology have to evolve further? Do we need even cheaper MIPS?

A: No! The supporting technology is to a great degree already there. Well, one thing we need is increased bandwidth. But certainly, in terms of crunching power . . .

Q: Wait! First talk to me about increased bandwidth.

A: I mentioned that data come in basically four forms: numbers, words, sounds, and images. Actually, there are other forms as well, such as smell, taste, touch, intuition, imagination, emotion—but we don't have very sophisticated technologies for those and therefore they haven't become major forms. And although there are those four basic forms, there have been only two "killer applications" for computing—namely, spreadsheets and word processing—and they deal only with numbers and words. One thing that is fairly safe to predict is that, as we move into the knowledge era, sound and image will become as important as numbers and words have been in the past. More bandwidth will enable sound and image to become equal partners with numbers and words. This is something that few if any companies are prepared for. But it's coming. Multimedia is simply the first shot across the bow in making that happen.


Q: Interesting. But if technology is not what you consider the biggest obstacle . . . what is?

A: If you want it in a word, it's mindset. It takes a shift in mindset to see yourself as a knowledge-based business. It means, for example, if you are a hotel, seeing that you can probably make more money off your knowledge of how to run a hotel than off the traffic you can generate in that particular piece of real estate, brick, and mortar.

Q: So it's another way of defining your core competence?

A: Yes, if you will. It's defining yourself in terms of the knowledge you have. Which may be about your customers, or may be about a set of products, or may be about an underlying technology. It's stripping away the tangible matter and defining your business in terms of its essence—which may be intangible but is hardly immaterial to its economic value.

Q: How do you anticipate this new mindset will take hold?

A: What I believe is going to happen is that an awareness is going to burst on the scene that we have to become more knowledge-based. All of our products should be smart. And a lot of this will be just talk, throwing around jargon. But there will be firms that will bring out knowledge-based offerings, and those firms will gain a competitive advantage. The qualities that characterize knowledge-based offerings and businesses will come to be well known, and it will become routine to ask: "How could that apply to my company?" And it's just going to pick up momentum from there.


Innovation in Action

Knowing the Drill: Virtual Teamwork at BP

Don Cohen

About the author: Don Cohen is editor of GroundWork, which communicates findings of research into organizational knowledge management. He also provides freelance assistance to researchers and authors, and is currently helping to compile a book on managing knowledge. Contact him at DonJCohen@aol.com.

On a cold day on the North Sea in 1995, a group of BP Exploration drilling engineers had a problem. Equipment failure had brought operations to a halt—and because they couldn't diagnose the trouble, they faced the prospect of taking the mobile drilling ship (leased at a cost of $150,000 a day) back to port indefinitely. Instead, they hauled the faulty hardware in front of a tiny video camera connected to a newly installed computer workstation. Using a satellite link, they dialed up a BP drilling equipment expert in Aberdeen. To him, the problem was apparent, and he guided them quickly through the repair. The downtime, as it turned out, lasted only a few hours.

The equipment aboard the ship was there thanks to a pilot project BP had just undertaken called "Virtual Teamwork." The name reflects the aim: to support collaboration across the barriers of distance and organizational structure, through the use of sophisticated technology.

The project had grown out of BP Exploration's reorganization a year earlier into forty-two separate business assets. Prior to that, exploration activities had been carried out by a few closely controlled regional operating centers. Believing that smaller, more autonomous businesses could work more efficiently and creatively, Managing Director John Browne had overseen the transformation of the company into "a federation of assets," each with the freedom to develop processes and solutions to serve its own local needs. The assumption was that some local initiatives would turn out to be applicable elsewhere in the company. BP would benefit from the variety and creative power of forty-two moderate-sized companies sharing their experiences.[1]

Article abstract: After highly centralized BP Exploration was reorganized into a "federation of assets," new ways had to be found to enable knowledge-sharing across parts of the business. One success has been videoconferencing. BP piloted the technology in five geographically dispersed work communities, being careful to set clear business goals by which to measure its value. The project team knew it was not true that "if you build it, they will come." People with much to gain from the new capability still required coaching to see how it could enhance their work.

Good communication was clearly essential to making the federation work. In videoconferencing technology, Browne and others saw potential for fostering some of the creative synergy they sought. Accordingly, management authorized an eighteen-month, $13 million pilot project to test the concept.

Designing the Pilot

Browne asked John Cross, then head of Information Technology, to lay the groundwork. Cross's first decision was that the project should be undertaken by an independent group formed for the purpose—not "owned" by Information Technologies. He wanted to emphasize that the objective was behavior and work pattern change, not technology. It also made sense, given the aim of transcending organizational boundaries, that the project team be drawn from diverse parts of the company. A core team of five was appointed, most of whom had had experience in more than one area. It was led by Kent Greenes, who worked in Human Resources and had a background in Operations.

The team began work in December 1994, specifying hardware and software for the Virtual Teamwork station (or "client"). The package included desktop videoconferencing equipment, multimedia e-mail, application sharing, shared chalkboards, tools to record video clips, groupware, and a web browser. A document scanner—a last-minute addition that proved extremely useful—completed the setup. Connections were made using ISDN lines and, where necessary, satellite links.

For the pilot, the team decided to equip five different communities with Virtual Teamwork clients, to provide enough variety for a fair test. First, they chose the Andrew Project group, which was completing a new drilling platform for an emerging oil field. The others included a mature oil field group; an established network of experts who had already been communicating with each other by e-mail, newsletters, and occasional meetings; a new network of geoscientists and engineers formed specifically for the project; and what the team called the "business center network." This last consisted of five VT clients placed in key BP offices around the world, each with a full-time "host" whose job was to encourage its use at that location.


In establishing the goals of the project, the emphasis was entirely on promoting the achievement of business goals. Performance agreements were co-developed by the core team and participants in each of the five groups. Goals included increasing the efficiency and effectiveness of decision-making, reducing costs, adhering to schedules, and solving problems creatively. Recognizing the importance of measuring these results objectively, the core team hired an independent consulting firm to perform the task. The consultants helped generate the list of expected benefits at the start and tracked actual results as the pilot progressed.

The Importance of Coaching

A subgroup of the core team called the Change Management Team was responsible for helping participants understand both how to use the technology and how it could further their work. This effort was deliberately called "coaching" rather than "training": coaches work to get the best out of players—they don't simply present information to passive recipients. Only twenty percent of the coaches' time was designated for training people in how to use the system. The rest would consist of challenging and helping them to exploit its capabilities to serve their business needs. The core team was so convinced that extensive coaching was essential to the success of the project that they spent approximately half the pilot's budget on it.

An unplanned experiment helped prove them right. Due to budget constraints, one of the projects—the new network of geoscientists and engineers—was set up without coaching. The members of what was called the Virtual Petrotechnical Team were given VT equipment and essentially left alone to find uses for it. This project was the only one of the five that failed. The problem was not that the group couldn't make the technology work—it was fairly simple to operate. What they lacked was an understanding of why they should bother. Remarks from the team ("I don't see how this fits in with my work." "The people I want to talk to are not on the network.") were similar to those made initially by other teams. In part because there was no one to help the group explore the value of the system and overcome their skepticism, their VT network declined and eventually fell silent.

Proof of Concept

In the four other groups, once clients were put in place, project directors were surprised at how quickly virtual teamworking became an integral part of their work. Teams began to experience the benefits of the system within weeks—in some cases within days—and enthusiasm and use increased.

Article abstract: The ability to assemble the most knowledgeable individuals around a problem ad hoc, rapidly and inexpensively, has been one benefit of BP's "Virtual Teamwork" program. Amongst established teams, videoconferencing has enriched collaboration, reduced iterations and slack time, and fostered greater levels of mutual trust and commitment. Based on the success of its pilot testing, BP has rolled out VT to many additional sites.

The Andrew Project provides a good example of the positive impact of virtual teamworking. The use of VT technology was one of two innovations on the project. The other was the decision to complete as much of the platform as possible on shore before moving construction to the offshore drilling site. There was no link between the innovations, but together they seemed to create an enthusiasm for doing work in new ways. Building the platform was a joint effort by BP and two other companies: Brown & Root, a Houston-based design and engineering firm with an office in Wimbledon; and Trafalgar House, a Teesside construction company. This project would test virtual teamworking's usefulness not only in connecting employees over distance but also in linking separate organizations. Initially, Trafalgar House expressed doubt; they questioned whether the value of a face-to-face meeting could really be provided by viewing distant team members on a computer monitor.

Certainly, virtual teamworking did not eliminate the need for meetings. Colleagues still needed them to establish mutual trust and to hash out important issues involving large groups. Meetings were, however, significantly reduced. Having met once, participants found that videoconferencing maintained a richness of communication and a sense of direct personal contact that phone calls, e-mail, or memos could not match. Before long, even Trafalgar House praised the system.

But the quantifiable benefits on the Andrew Project went well beyond reductions in travel expenses and time. There were also measurable productivity improvements related to more efficient information searches and issue resolution, and less "miscommunication." One finding was that commitments made "face-to-face" using the VT stations were honored much more consistently than commitments made by phone or mail. (This underscores John Cross's point that the project was principally about behavior, not technology.) Time frames were also compressed by things like the VT clients' application-sharing feature, which allowed teams to write memos jointly, avoiding hours or days of sending drafts back and forth. In sum, virtual teamworking contributed significantly to the project's meeting its target date and incurring a much lower total cost in steadily bringing forward "first oil," a principal milestone in the development of a new field.


Unexpected Uses

The VT team was even more encouraged by some spontaneous and relatively unstructured uses of the technology they observed. Although the immediate benefits are less clear than the cost reductions and productivity increases of the Andrew Project, these explorations suggest that virtual teamworking is developing a life of its own, and may have far-ranging impact on the way work at BP Exploration is done.

For example, VT users began communicating across projects, with members of the Andrew Project, for instance, contacting members of the Miller Team. The connection was important: much of the knowledge the latter team had gained from a now-mature oil field was highly applicable to work on the emerging Andrew field. The collaboration inspired an imaginary headline—"Scottish oil discovered in Alaska!"—coined by the core team to proclaim VT's ability to nullify distance and transfer knowledge.

The "hosts" of the Business Center Network, meanwhile, decided to hold weekly "virtual coffee breaks." The idea was to try to mimic a knowledge-sharing opportunity that co-located employees enjoy every day. Famously, "water cooler conversations" are how people absorb corporate culture; they also bring about chance conversations that sometimes spark creative ideas. With no set agenda announced, these virtual coffee breaks have attracted up to twenty people at eight separate locations. Their expectation—and the company's—is that the conversations will pay off in unpredictable ways.

Another unplanned use of the network allowed teamwork to transcend not only distance but time. Quarterly, Rodney Chase (head of BP Exploration) holds a Performance Review where reports are submitted by managers of all the firm's assets. In the past, the review was a mish-mash of live presentations and, from those managers who could not make the meeting, prose reports or bullet-point slides. Four days before the December 1995 Quarterly Performance Review, it occurred to one of Chase's assistants that VT might enrich the review. Working through the Business Center Network hosts, the VT team arranged for every asset manager to create a video report. As well as increasing the review's quality, the effort extended its reach. All reports were taped and subsequently published on CD-ROM, to make them available to senior managers worldwide.

Next Steps

Based on the success of the pilot, plans were approved to expand Virtual Teamworking by a significant number of new clients in 1996. By the end of 1997, the team hopes the equipment will be available to a high proportion of professional staff—providing the "critical mass" needed to transform the company into a far-flung but close-knit federation of business units and workers.

They are currently developing online videoconferencing "yellow pages" to replace the pilot project's simple phonebook. Yellow page listings will include photographs and short biographies noting individuals' interests, not just their formal roles. The team also hopes to integrate a knowledge base into the system that will guide people with questions to sources of expertise.

At the same time, BP established a knowledge management task force, reporting to Director Russell Seal, whose purpose is to identify and recommend new opportunities and strategies for organizational learning and knowledge sharing. The task force will evaluate knowledge activities inside and outside the company to determine which should be expanded or introduced. Building on the objective of the Virtual Teamworking project, the goal is to improve performance by promoting behavioral changes that will make continuous learning and knowledge-sharing the company norm.

[1] The idea of a corporate federation is similar to the "multilocal" structure Nonaka and Takeuchi describe in The Knowledge-Creating Company as part of Matsushita's corporate aim to become a "possibility-searching company." Nonaka and Takeuchi remark on "the importance of transcending the dichotomy between localization and globalization," an apt description of Browne's aim.


Innovation in Action

"If Only HP Knew What HP Knows . . ."

Thomas H. Davenport

About the author: Tom Davenport, professor of Information Management at the University of Texas, Austin, is best known for his research on how organizations bring about major innovations in their work processes. His 1993 book, Process Innovation: Reengineering Work through Information Technology, was the first book to describe what has become known as "business reengineering." More recently, Davenport's research interest has shifted to the question of whether "knowledge work" is characterized by processes and amenable to process improvement. Last year, he published "Improving Knowledge Work Processes" in Sloan Management Review. He also has two books forthcoming on related topics.

Hewlett-Packard is a large, successful company with over $38 billion in 1996 revenues. Its fast annual revenue growth—approximately 30%—from such a large base has astounded observers. The company competes in many markets, including computers and peripheral equipment, test and measurement devices, electronic components, and medical devices. It has 112,000 employees and over 600 locations around the world.

HP is known for its relaxed, open culture. All employees, including the CEO, work in open cubicles. Many employees are technically oriented engineers who enjoy learning and sharing their knowledge. The company is perceived as being somewhat benevolent to its employees, and fast growth has obviated the need for major layoffs. All employees participate in a profit-sharing program.

The company is also known for its decentralized organizational structure and mode of operations. Business units that perform well have a very high degree of autonomy. There is little organized sharing of information, resources, or employees across units. HP managers feel that the strong business-specific focus brought by decentralization is a key factor in the firm's recent success. Although culturally open to sharing, few business units are willing to invest time or money in "leveraged" efforts that do not have an obvious and immediate payback for the unit. It is common, however, for employees to move from one business unit to another; this mobility makes possible some degree of informal knowledge transfer within HP.

Article abstract: If ever there were a "knowledge-intensive" company, it's Hewlett-Packard, the huge and hugely successful high-tech firm. There is widespread recognition at HP that its knowledge—about markets, products, and customers—is its biggest source of competitive advantage. But because the firm is highly decentralized, its knowledge is dispersed across business units that have little perceived need to share with one another. In such an environment, it's no surprise that knowledge management efforts have proliferated. Three notable ones have been: the "Trainer's Trading Post"; the "Connex" guide to internal experts; and "HP Network News," a resource for HP dealers. All have been successful, leading managers to the conclusion that knowledge, like the firm that houses it, may not require a central management function. Instead, the emphasis is on building awareness of and sharing lessons from the many projects underway.

In mid-1995 it became apparent that several knowledge management initiatives were underway in various HP business units. Some had been in place for several years; others were just beginning. Noticing this phenomenon, Bob Walker, HP's CIO and Vice President, and Chuck Sieloff, Manager of Information Systems Services and Technology (ISST), decided to attempt to facilitate knowledge management at HP by holding a series of workshops on the topic. Their idea was to bring together a diverse group of people within the company who were already doing knowledge management in some form, or who were interested in getting started. The corporate ISST group had previously sponsored similar workshop initiatives in the areas of reengineering and organizational change management. Key objectives for the workshops included the facilitation of knowledge sharing through informal networking, and the establishment of common language and management frameworks for knowledge management. Walker and Sieloff appointed Joe Schneider, an ISST staff member who also focused on Web-based systems, to organize the workshops.

The first workshop was held in October of 1995. An outside consultant facilitated the meeting and presented some proposed definitions and frameworks. About 20 people attended the first session; 13 were from corporate units, and the rest from various business units. Joe Schneider asked participants at the meeting if they were aware of other knowledge management initiatives. From this discussion Schneider compiled a list of more than 20 HP sites where some form of proactive knowledge management was underway. Several of the initiatives are described below.

Trainer's Trading Post

One knowledge management initiative involves HP educators. Bruce Karney is a member of the infrastructure team for the Corporate Education organization, part of HP's Personnel function. Karney estimates that there are more than 2,000 educators or trainers distributed around HP, most of whom work within small groups and find it difficult to share knowledge. About two years ago, in response to complaints by the education community that "we don't know what's going on," Karney began work on approaches to knowledge sharing for HP educators. He hoped to make the group more of a community; until this effort, it had no shared history, process, or tool set.


Article abstract: "Trainer's Trading Post" is a Lotus Notes-based forum to help HP's thousands of internal trainers and educators share ideas, materials, and methods. This group obviously appreciates the value of knowledge transfer, but motivating them to participate still requires an "evangelist." "Connex" is short for "connection to experts." It's a guide to knowledgeable people within HP Laboratories, the company's research arm. "Knowledge Links" was prototyped by a group supporting new product generation, and was to have housed a variety of knowledge important to that function. The design was overly ambitious, however, and in the end the system was not built.

Using Lotus Notes as the technology vehicle, Karney established three different "knowledge bases" for educators to use:

"Trainer's Trading Post," a discussion database on training topics;
"Training Library," a collection of training documents (e.g., course binders);
"Training Review," a "Consumer Reports"-style collection of evaluations of training resources.

Training Review never took off; educators were reluctant to opine online about the worth of course materials or external providers, and there was no reward structure for participating. It was therefore merged with Trainer's Trading Post. Training Library did receive many contributions, but as participants discovered that they could attach materials to submissions to Trainer's Trading Post, that knowledge base became the dominant medium for educator use, and Karney expects that it will be the sole offering in the future.

Karney adopted innovative tactics to get submissions to the knowledge bases. He gave out free Notes licenses to prospective users. When a new knowledge base was established, he gave out 3,000 free airline miles to the first 50 readers and another 500 miles to anyone who posted a submission. Later promotions involved miles for contributions, for questions, and for responses to questions. By early 1996, more than two-thirds of the identified educator community had read at least one posting, and more than a third had submitted a posting or comment themselves. Still, Karney was frustrated. Despite his countless attempts with free miles and e-mail and voice mail exhortations, he still felt the need to continually scare up fresh contributions. "The participation numbers are still creeping up," he notes, "but this would have failed without an evangelist. Even at this advanced stage, if I got run over by a beer truck, this database would be in trouble."

Building a Network of Experts

Another knowledge project was initiated by the library function within HP Laboratories, the company's research arm. The goal of this project is to provide a guide to human knowledge resources within the Labs and, eventually, to other parts of Hewlett-Packard. If successful, the guide will help to address a problem identified by a previous director of the Labs: "If only HP knew what HP knows."

The directory of HP experts, called Connex, is being developed by Tony Carrozza, an "Information Technical Engineer." He has been working part-time on the project for almost a year; the system is scheduled to go into its pilot phase soon. It uses a Web browser as an interface to a relational database. The primary content of the database is a set of expert "profiles," or guides to the backgrounds and expertise of individuals who are knowledgeable on particular topics. By browsing or searching Connex, it will be easy to find, for example, someone in HP who speaks German, knows ISDN technology, and has a master's or PhD in a technical field. Upon finding someone, the searcher can quickly link to the individual's home page if it exists.

One concern Carrozza has is how to create a manageable list of knowledge categories in the database that will be widely understood and will accurately reflect the Labs' broad universe of knowledge. Carrozza plans to rely on the experts themselves to furnish their original knowledge profiles and to maintain them over time. He expects that this will be a challenge, and speculated that experts might be given incentives—for example, Carrozza suggested, "a Dove Bar for each profile"—to submit and maintain profiles. As a back-up, a "nag" feature is built into the system to remind people to update their profiles. Carrozza also anticipates that there may be problems with the term "expert"; he is trying to identify less politically laden terms.

Connex will be implemented initially for the Labs, but Carrozza hopes that the expert network will eventually expand throughout all of HP. He knows that other parts of the company will be developing their own databases, but he hopes that they will use the Connex structure. He is already working with the Corporate Education group described above to create a network of educators using Connex. He adds, "I know other people are building expert databases. I just don't know who they are."

Knowledge Management on Product Processes

HP's Product Processes Organization (PPO) is a corporate group with the mission of advancing product development and introduction. It includes such diverse functions as Corporate Quality, Procurement, Product Marketing, Safety and Environmental, and Organizational Change. The Product Generation Information Systems (PGIS) group serves each of these functions. Bill Kay, the PPO director, put PGIS at the center of the PPO organization chart because he felt that information management needed to become a core competence of PPO.

As part of that competence, Kay asked Garry Gray, the manager of PGIS, and Judy Lewis, another PGIS manager, to begin a knowledge management initiative. As a "proof of concept," the PPO knowledge management group developed Knowledge Links, a Web-based collection of product development knowledge from the various PPO functions. Consistent with the philosophy of the knowledge management group, Knowledge Links contained knowledge contributed by "knowledge reporters and editors," who obtained it through interviews with experts. The system prototype has been used many times to demonstrate the concept of knowledge management with PPO "customers," but the goal of summarizing knowledge across PPO proved overly ambitious, and the system was never built.
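Connex, as described above, is essentially a relational store of expert profiles searched through a Web browser. The article does not give the actual schema, so the following is only a minimal sketch, assuming a hypothetical SQLite table and field names, of how the kind of query mentioned above (someone who speaks German, knows ISDN, and holds a graduate technical degree) might be expressed.

```python
# Hypothetical sketch of a Connex-style expert-profile search.
# The table layout and field names are illustrative assumptions,
# not HP's actual Connex schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE profiles (
        name       TEXT,
        languages  TEXT,   -- e.g. 'English, German'
        skills     TEXT,   -- e.g. 'ISDN, telephony'
        degree     TEXT,   -- e.g. 'MS Electrical Engineering'
        home_page  TEXT    -- link shown to the searcher, if it exists
    )
""")
conn.execute(
    "INSERT INTO profiles VALUES (?, ?, ?, ?, ?)",
    ("A. Example", "English, German", "ISDN, telephony",
     "MS Electrical Engineering", "http://hpl.example/~aexample"),
)

# Find someone who speaks German, knows ISDN technology, and holds
# a master's or PhD in a technical field.
rows = conn.execute("""
    SELECT name, home_page FROM profiles
    WHERE languages LIKE '%German%'
      AND skills    LIKE '%ISDN%'
      AND (degree LIKE 'MS%' OR degree LIKE 'PhD%')
""").fetchall()

for name, home_page in rows:
    print(name, home_page)
```

In practice, the hard part is less the query than keeping the skill and degree vocabulary consistent and current, which is exactly the "manageable list of knowledge categories" concern Carrozza raises above.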


Article abstract: "HP Network News" began as a simple database of the questions frequently asked of the Computer Products Organization by HP's network of dealers. Because dealers have direct access to it, it has significantly reduced the number of phone calls to HP technical support. The database is constantly mined and carefully managed for even greater usefulness. This is a classic example of leveraging knowledge, and has been highly successful.

The PPO knowledge management group is currently working on three projects. One involves competitor information for HP's Components group. The goal of the second project is to create a Web-based interface to primary and secondary research information. The third system manages international marketing intelligence. Each of these projects is being developed in a collaboration between PGIS and other PPO groups, e.g., Product Marketing and Change Management. The goal is not for PGIS to manage knowledge by itself, but rather to facilitate the process of structuring and disseminating knowledge through the use of information technology.

Managing Knowledge for the Computer Dealer Channel

Perhaps one of the earliest initiatives to explicitly manage knowledge at HP was an effort to capture and leverage HP product knowledge for the Computer Products Organization (CPO) dealer channel. It began in 1985. Technical support for the dealer channel had previously involved answering phone calls; the business unit was growing at 40% annually, and calls from dealers were growing at the same rate. Eventually, answering all the phone calls would require all the people in Northern California. HP workers began to put frequently asked questions on a dial-up database, and the number of dealer support calls began to decline. According to David Akers, who managed the project, the development group views each support call as an error.

The system came to be called HP Network News. It was converted to Lotus Notes and has been remarkably successful in reducing the number of calls. One key reason for the system's effectiveness is the developers' close attention to the actual problems faced by dealers—not their own ideas about what knowledge is important. Another important factor is the constant effort by developers to add value to the knowledge. For example, lists are constantly made of the most frequently asked questions, frequently encountered problems, and most popular products. These lists are publicized, and dealers are encouraged to download the information from the Notes database. Less valuable information is pruned away. HP Network News is still going after 10 years, and it has been a significant factor in the high support ratings HP receives from its dealers.

Summary

Chuck Sieloff and Joe Schneider are committed to advancing the state of knowledge management, but in a decentralized company like Hewlett-Packard, it is not clear what steps should be taken. They discuss whether there are actions they could take beyond facilitating the Knowledge Management Workshop. They feel that knowledge is already exchanged well within work groups and even business units, but there is little support in the culture for sharing across units. However, for ISST to try to change the culture just for the purpose of knowledge management seems like the tail wagging the dog.
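The HP Network News passage above turns on a simple editorial loop: count which dealer questions recur most often, publicize those, and prune what attracts little interest. The case gives no implementation details, so the fragment below is only an illustrative sketch, with invented log data, field names, and a made-up threshold, of what that frequency-ranking step could look like.

```python
# Illustrative sketch of the "most frequently asked questions" ranking
# described in the HP Network News passage above. The call log and the
# pruning threshold are invented for the example; the real process is
# not documented in the article.
from collections import Counter

support_call_log = [
    "printer driver install",
    "printer driver install",
    "SCSI termination",
    "printer driver install",
    "memory upgrade part number",
    "SCSI termination",
]

question_counts = Counter(support_call_log)

# Publicize the questions dealers ask most often...
print("Most frequently asked questions:")
for question, count in question_counts.most_common(3):
    print(f"  {count:3d}  {question}")

# ...and flag entries that attract too little interest to keep.
MIN_HITS = 2
pruned = [q for q, count in question_counts.items() if count < MIN_HITS]
print("Candidates for pruning:", pruned)
```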


Schneider and Sieloff also wonder just how different managing "knowledge" is from managing information. Many of the HP initiatives are arguably a mixture of knowledge and information, and drawing the line between the two is difficult. Sieloff feels that the same fact could be either data, information, or knowledge for different people. Of course, the various information systems groups at HP have a great deal of experience at managing data and information. How relevant is the experience gained in these areas to problems of knowledge management?

Schneider believes that facilitating knowledge management at HP can be viewed as a knowledge management problem. The company has both internal expertise and external sources of knowledge on knowledge management. At the corporate level, Schneider is using the workshops as one mechanism to understand who needs this knowledge and how best to transfer it. He also wants to get the workshop participants involved in an ongoing knowledge management network that shares best practices and transfers emerging knowledge.

However, neither Chuck Sieloff nor Joe Schneider has knowledge management as the only component (or in Sieloff's case, even a major component) of his job. They know that other firms are establishing permanent, full-time positions overseeing knowledge management issues at the corporate level—a "Chief Knowledge Officer," for example. When Sieloff and Schneider discuss the concept with regard to HP, they question whether a corporate knowledge executive would make sense in such a decentralized company. The current HP approach, which emphasizes awareness-building and the development of common vocabulary and frameworks through workshops, is a subtle one. The two managers feel it is appropriate for HP's culture, but they are always looking for other techniques and methods that might be introduced.

This case was prepared with research assistance from David De Long.


Innovation in Action

A Prescription for Knowledge Management
What Hoffmann-LaRoche's Case Can Teach Others

About Patricia Seemann: This article reflects the thinking and work of Dr. Patricia Seemann, who was formerly Director of Knowledge Systems for Hoffmann-LaRoche. Since the events described here, Seemann has left that firm to accept a position with Ernst & Young's European management consulting practice. Seemann is a medical doctor by training, but has spent her career in the pharmaceutical industry. Prior to joining Hoffmann-LaRoche, she restructured and led two departments for a large biotechnology firm. Contact her at 106424.2323@compuserve.com.

Managers thinking about embarking on knowledge management projects might benefit from the experience of Hoffmann-LaRoche, the international pharmaceutical firm. Management there credits a single knowledge initiative with making a significant difference in the profitability of new products. Yet the initiative was a fairly modest one—it called for no huge new information systems, no army of information processors. In fact, the lessons it teaches are all about focus—on the right problem, at the right level, and on the right goals for the business.

Targeting the Right Problem

For Hoffmann-LaRoche, as for every pharmaceutical company (and many other types of companies), much depends on the speed of new product launches. Industry observers estimate that development of a new drug takes, on average, five to eight years and costs over $250 million. Firms that can expedite that process stand to gain tremendously. First, they recoup their development costs faster—and generate higher profits. At the same time, shorter development time means more new product ideas can be placed in the pipeline—helping to hedge the risk of any of them fizzling (as many do). In an industry like pharmaceuticals, where a firm's market standing is only as good as its current patents, fast and sure drug development is the key to survival.


Article abstract: "At the outset, we wondered 'where should we start?' And my view was that we needed to go for a part of the business that was truly relevant to its strategy. Which meant that it would get senior management attention, and that we would have tangible economic and business results. If we had said, 'let's start with something in human resources'—like how we deploy expatriates globally, for instance—that would not have been the right move."

The competitive intensity is similar in other industries, but for pharmaceutical firms there is even an additional complication. At a computer company, for example, the Eureka! may have barely passed the scientists' lips before marketers are making their first sales. No such luck in pharmaceuticals. Great science only gets you to the next step in the process: the FDA. Without approval from the US Food & Drug Administration (and its counterparts around the world), the product goes nowhere—and that's a step that can take months or even years.

When Patricia Seemann took on the role of Roche's Director of Knowledge Systems in 1993, the firm's track record on new drug applications (NDAs) was mixed. On many occasions, new products sailed through the approval process and enjoyed a prolonged marketplace advantage. Other times, though, NDAs got hung up by requests for more information or additional trials—or were approved for more limited usage than hoped. Dr. Seemann's responsibility in general was to determine where better knowledge management might make a real difference to Roche. Looking across the company's operations, her eye fell naturally on this spot.

Goal-Setting: A Balance of Ideals and Conservatism

The opportunity seemed almost too good to be true. Here was an area where the company had proven its abilities again and again. The problem was not its capability but only its consistency. Clearly, this was a case of harnessing the knowledge the company already had and applying it more uniformly. It was also an area that promised huge payback. For every day gained in the market availability of a new drug, Roche had determined, the company stood to gain a million dollars. If only its occasional height of achievement could be made the norm, Seemann considered, Roche could leapfrog its competitors handily.

The potential benefit of accelerating NDAs was further compounded by the fact that there were over 30 new drug projects in progress at the time—a typical number for the firm. If better knowledge management could pare down average approval time by only a fraction, the return on investment would be immense. Scanning the opportunity, it seemed reasonable to Seemann to expect at least a three-month reduction in major new drug approvals. At $1 million per day, that would represent some $90 million in additional profitability—per product. At the meeting where her team presented their plan, says Seemann, "we had a reasonably captive audience."
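The payback arithmetic above is simple enough to spell out. Assuming, as the article does, roughly $1 million of additional profit per day of earlier market availability and a three-month (about 90-day) acceleration, a back-of-the-envelope check looks like this; the portfolio line is only an illustration, since the article does not claim every project would capture the full gain.

```python
# Back-of-the-envelope check of the figures quoted above.
# The per-product math follows the article; the portfolio line is
# purely illustrative and assumes every project saw the full benefit.
PROFIT_PER_DAY = 1_000_000   # dollars gained per day of earlier availability
DAYS_SAVED = 90              # roughly a three-month reduction

per_product_gain = PROFIT_PER_DAY * DAYS_SAVED
print(f"Per product: ${per_product_gain:,.0f}")        # $90,000,000

ACTIVE_PROJECTS = 30         # "over 30 new drug projects in progress"
print(f"If all projects benefited: ${per_product_gain * ACTIVE_PROJECTS:,.0f}")
```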


"A question we asked ourselves at Hoffmann-LaRoche was: when we develop a product, what are we really developing? We're not developing the tablet. We're developing the knowledge around a disease and its treatment. That is what we're putting out on the market. All of a sudden, knowledge is no longer just a tool to do your business. It becomes your business."

Defining the Knowledge Challenge

As Seemann's team looked further into the new drug approval process, it became more and more clear that this was truly a challenge of knowledge management. First, the product involved—the application document—was purely and simply a knowledge product. And second, compiling it drew on the work and insight of literally hundreds of knowledge workers. These two basic observations led Seemann to design a knowledge management initiative with two thrusts: 1) to help product teams prototype the knowledge required for their new drug applications; and 2) to produce a comprehensive "map" of the knowledge sources in the company that might contribute to their completion.

Prototyping Knowledge

The first—and possibly most important—step at Roche was to perceive that its primary product is knowledge. This is particularly true in the new drug approval process, where the "customer" is the regulator. Regulators, after all, do not need or want the drug made by a company, but only the information about it. This shift in perspective, considering the regulator as customer and the new drug application as product, was a crucial one. It became evident immediately that the NDA was a product that, unlike Roche's drug products, was not manufactured for efficient or effective performance. To prove the point, Roche could reflect on one particularly ineffective NDA. In that document, Roche's collaborators had failed to emphasize evidence that the drug was effective even when taken just once a day. As a result, the FDA approved it at a twice-a-day dosage, negating what would have been a significant advantage over competitors' products.

In general, the problem with NDAs was that they were treated as data dumping grounds. Often amounting to tens of thousands of pages, they buried the most important information (like possible side effects) with the most available (like how often the lab animals' cages were cleaned). Picture a cold remedy with 10 milligrams of active ingredient delivered in a tablet the size of a suitcase. That was how Roche was delivering its most important knowledge.

Prototyping knowledge requirements meant understanding better the real needs of its customers. To get that insight, the Roche team assembled a group of ex-regulators who had real insight into the approval process. They helped Seemann's team by observing that regulators are really only trying to answer three crucial questions: Is the drug safe? Does it work? And is it of sufficient quality? The myriad data requirements specified by regulators' guidelines were simply extensions of those questions. The team also worked to compile the various guidelines issued by the separate regulating bodies of Roche's top 20 markets worldwide.


"The goal that we set for ourselves at Roche was the following: We wanted to make sure that people would be able to consistently access and effectively access the organization's knowledge. We included three areas that people really had to have access to. One was customer knowledge. The second was experience knowledge. And the third was process knowledge."

The prototypes that emerged from this work represented the total accumulation of knowledge required for each NDA. For one thing, they covered in one place the requirements to get a drug approved around the globe—something that previously would have been considered a miracle. Even more importantly, they focused directly on the three questions that had to be answered—and on communicating Roche's key messages relating to each. Yes, these prototypes still responded to the minutiae of the regulators' requirements—but they did so by way of presenting a logical and compelling response to what those regulators were really asking. In other words, these were knowledge products potent and beneficial enough to bear the Roche name.

Mapping the Knowledge of the Organization

If the prototypes represented where the product teams needed to be with their knowledge, Seemann still needed to provide a map of how to get there. Prototyping knowledge raised as many questions as it answered: Where would all the supporting information come from? When would it be available? Who should contribute at what point? To clarify the sources and flows of knowledge related to Roche's new drug development, Seemann's team constructed a knowledge map.

In its simplest form, a knowledge map is a straightforward directory pointing people who need access to knowledge to the places where it can be found. Usually, such maps recognize both explicit and tacit knowledge—that is, knowledge that has been captured in documents and databases, and knowledge that resides only in the heads of experts. Seemann's map certainly included this "Yellow Pages" functionality, but it went further. She incorporated features that would tie the map more directly to the completion of an NDA, and make it a better tool for accelerating the launch of new drugs.


"When people don't share their knowledge, in the majority of cases, it's not because they're malevolent. It's not because they're not being rewarded for it. And it's not because the structure of the organization somehow gets in the way of it. It's because they don't know why and when and how they should share it."

The first of these features was a Question Tree. Any map needs an organizing framework, an architecture of the content it will cover. Seemann's map took as its starting point the rewritten guidelines developed with help from former regulators. Again, these focused on the three basic questions Roche had to answer about every new drug: Is it safe? Does it work? And what is its quality? Branching out from those questions, the map showed the more detailed sub-questions that followed. Under "Is it safe?", for example, were things like: "At what dosage level is it safe?", "Does it have side effects for the patient?", and "Is there any effect on the patient's offspring?" For each question, the map pointed to the source of its answer. It simultaneously detailed what knowledge would have to be gained for the typical new drug approval and where to go to find it.

The second key feature Seemann added to the Roche map was a specification of "Knowledge Links." It wasn't enough, she believed, to create a purely passive map. That would serve people who needed knowledge, but not people who had knowledge to offer. Knowledge links are road signs that show with whom and at what point a person or group should share their knowledge. For example, the team conducting animal testing was not in the habit of immediately sharing what was learned about dose-response with the person who would eventually direct the clinical (human) tests. Their work was too far separated by time, geography, and hierarchy for this to happen naturally. But having those findings when they became available would help clinical testers design more efficient studies more quickly, and perhaps collapse the drug approval time frame significantly. With the addition of Knowledge Links, the map became a more active driver of knowledge-sharing. To specify the links, Seemann's team simply had to get people from various parts of the organization to sit down together and talk about what they would like to know and when.

Finally, the knowledge team added a great deal of content to the map to show how specific questions in it had been answered successfully in the past, or how they might be better answered in the future. With all these valuable features, the map had become a powerful—and sometimes controversial—thing. Part directory, part process description, part best-practice repository, it was precisely the knowledge tool the organization needed.
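To make these two features concrete, here is a minimal sketch of how a question tree and its knowledge links might be represented as data. The class names, fields, and sample entries are hypothetical illustrations built from the examples above, not a description of Roche's actual map.

```python
# Hypothetical sketch of a question-tree knowledge map with knowledge links.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class QuestionNode:
    question: str                                     # e.g. "Is it safe?"
    sources: list[str] = field(default_factory=list)  # where the answer can be found
    children: list[QuestionNode] = field(default_factory=list)

@dataclass
class KnowledgeLink:
    sender: str     # group that has the knowledge
    receiver: str   # group that needs it
    what: str       # the finding to be shared
    when: str       # the trigger point in the process

safety = QuestionNode("Is it safe?", children=[
    QuestionNode("At what dosage level is it safe?", sources=["toxicology reports"]),
    QuestionNode("Does it have side effects for the patient?", sources=["clinical trial results"]),
    QuestionNode("Is there any effect on the patient's offspring?", sources=["reproductive studies"]),
])

links = [
    KnowledgeLink(sender="animal testing team",
                  receiver="clinical trial director",
                  what="dose-response findings",
                  when="as soon as results are available"),
]
```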


Condition Cured—Plus Some Beneficial Side Effects

Roche's first indication that its knowledge management effort was succeeding came early on, when it asked the ex-regulators to react to its knowledge map. The group responded with high praise. Had they been equipped with the insight it promised, they said, they could readily have approved a new drug.

Could it be true? More to the point, could Roche's knowledge map really deliver more effective NDAs? It wasn't long until Roche had a real test, and this time the feedback was definitive. In an application for new indications for a drug, Roche expected its filing time to consume 18 months. With the new knowledge tools in place, it took just 90 days. Approval from the FDA, projected at three years, came within nine months.

Perhaps even more important in the long run are some of the less measurable—and in some cases, unexpected—benefits of the initiative. Seemann reports, for example, a surprising outpouring of gratitude from people using the knowledge map (it was rolled out to over 3,000 employees). For the first time, many of them had a clear picture of the multi-year work underway, and where their current work fit into it. Many, too, were gratified to be included in the map's "yellow pages" as important sources of knowledge. This recognition had nothing to do with hierarchy and everything to do with value to the organization. These two benefits—greater transparency of Roche's workings and a motivating recognition of knowledge sharing—will yield much greater returns than any single new drug approval.

Seemann also believes that making Roche's knowledge and knowledge processes more visible has helped to break down some vexing interdepartmental barriers. Every type of organization has its factions and its walls. In pharmaceutical companies, there is a traditional rift between the MDs and the PhDs. When every group has a greater perspective on how their work dovetails, greater tolerance and cooperation result. This works across geographic and cultural barriers as well. After the knowledge map was issued, Roche's animal testing department, which had wrestled with globalization for years, was able to find a globalizing structure within a few months. It seemed to help, too, in achieving a rapid assimilation of Syntex after that firm was acquired by Roche.

Whether due more to its tangible or its intangible benefits, the knowledge management initiative was embraced by Roche. In fact, it is generally agreed that it is the first management initiative to become embedded in the organization.


"In very large and complex organizations and projects, it is very difficult for people to understand 'where are we in all this?' When the knowledge map was implemented, what people were most grateful for was the possibility of truly understanding what they were doing—where they were in the company, and how it all fit together."

Lessons to Take Away

Even this quick telling of the Hoffmann-LaRoche story makes some important lessons clear. First, managers hoping to make a difference through better knowledge management should start by focusing on the right problem. Patricia Seemann chose a spot that was closely tied to the strategy of the business, and a driver of the firm's future growth. She also focused on a process that was undeniably knowledge-intensive, ensuring that the impact of knowledge improvements would be great. This points to a second lesson: set definitive goals for what the effort will achieve. Preferably, as at Roche, these can be stated in terms of ultimate increases in profitability.

The third lesson to take away from Roche's story is that knowledge management need not be technology-intensive, and should not be technology-driven. Tools like prototypes and knowledge maps can be surprisingly low-tech. They don't require people to buy into major infrastructural overhauls up front and on faith—they simply get a job done, and win converts along the way.

Finally, Roche's success teaches a lesson about bringing together the right project team. A mix of twenty-five Roche people and a variety of outside consultants, Seemann's team was small enough to move fast, but big enough to bring a variety of perspectives to the table. Most importantly, every member of the team was drawn from the best and the brightest Roche had to offer. Too often, Seemann knew, internal projects are staffed with employees who have time on their hands. Unfortunately, they may be free for good reason—they are not the organization's most valued contributors. Getting the best benefits a project on two levels: it gets the work done faster and better, and it makes a very visible statement about the project's importance to top management. In Seemann's words, "Do not divest knowledge management to your deadwood. Knowledge is something that is so dear to the company that only the best and brightest can actually bring it out."

Epilogue: Why Knowledge Management Matters

Better knowledge management certainly made a difference for Hoffmann-LaRoche. It also made a difference for the rest of us. Recall that Roche is a company dedicated to developing treatments for people with life-threatening diseases. When the time came to test its new knowledge tools, one major drug in question was Invirase™, and the disease it targeted was AIDS.


"The creation of value is coming increasingly from the collaboration of groups—the combination of their experience and skills. So, that combining and recombining is becoming the true challenge for companies. The point is no longer to manage the silos, but to bring together around a problem the right group of people with the right knowledge."

Invirase™ is the first compound in a new class of antiretroviral agents called proteinase inhibitors. They work by interfering with the activity of proteinase, an enzyme that is critical to the replication of HIV, the virus that causes AIDS. Hoffmann-LaRoche submitted the NDA for Invirase on August 31, 1995. It was reviewed by the Antiviral Drugs Advisory Committee of the US FDA on November 7, 1995.

Today, the drug is being used widely by HIV-positive patients in ongoing studies of its safety and efficacy. Thousands are receiving the drug free of charge from Roche through its "Compassionate Treatment" program. Most are experiencing fewer negative side effects than with former treatments. And the treatment is the most effective to date in the fight to eradicate this terrible disease.


Innovation in Action

Creating Fertile Ground for Knowledge at Monsanto
Bipin Junnarkar

About the author:
Bipin Junnarkar is Director of Knowledge Management at The Monsanto Company, based in St. Louis, Missouri. Monsanto is a science-based company devoted to discovering, manufacturing, and marketing agricultural products, performance chemicals used in consumer products, prescription pharmaceuticals, and food ingredients. Junnarkar has been with the firm for several years, working mostly in the area of management information systems. His long-time emphasis has been on introducing new technologies, processes, roles, and behaviors to tap Monsanto's collective intellect more effectively.

There are various paths by which organizations come to the realization that they must do more to manage their knowledge. For many, it follows in the wake of reengineering and downsizing: with fewer people to do the work, they need to equip each to work smarter. For others, it's a wake-up call from a major customer, taking their business to a more state-of-the-art competitor. But at Monsanto, the motivation is more positive, if no less pressing: here, in the midst of prosperity, the driving concern is growth.

White Spaces and Gray Matter

When our Chairman and CEO Bob Shapiro took office in 1995, his first priority was to make Monsanto more growth-oriented. The demands of an increasingly global economy were making it clear that profits for the foreseeable future were not enough; world-class competitors would be vying for share, and Monsanto would have to grow faster to remain a dominant player.

Mr. Shapiro stresses two major themes in his quest for growth: more agility in existing businesses; and faster recognition and exploitation of new business possibilities. With an eye to the first, he effected a "radical decentralization," transforming Monsanto's four huge operating companies into fifteen business units—each of a size more conducive to flexibility, focus, and speed of adaptation. To help with the second, he charged one of those units to focus purely on growth opportunities.


At Monsanto, a need to grow quickly in new business areas convinced management it must focus on knowledge. Leaving the firm's knowledge base to evolve at its traditional pace would simply not be sufficient. Instead, management needed to actively support the creation of new knowledge (by connecting people and information and converting information into knowledge), the encapsulation of that knowledge into forms that could be shared, and the dissemination of knowledge throughout the organization. Achieving these objectives requires attention to people, process, and technology. The most important process to understand and manage is how new knowledge is created—and the best guide Monsanto has found is the work of Nonaka and Takeuchi.

Called "Growth Enterprises," the unit's mission is: "to grow existing business within business units and create new business by exploiting 'white spaces' where core competencies exist to increase the overall profitability of the enterprise."

The "white spaces" concept is an important one drawn from the work of Gary Hamel and C.K. Prahalad, but it raised immediate challenges. How would we find such spaces? What determines whether they are unexploited opportunities, or truly barren ground? How can priorities be set among the opportunities? The key is in the reference to "core competencies": growth at Monsanto will be driven by how well we are able to apply and build on the knowledge our people already have. The mission of the Growth Enterprises unit was soon accompanied by a vision: "to create and enable a learning and sharing environment where knowledge and information are effectively used across the enterprise."

What Knowledge Management Can Do

From the outset, there has been no quibbling at Monsanto about the need for more explicit knowledge management. Our Board of Directors readily approved a significant investment in it. But the way to apply that investment was not immediately apparent: what would do most to make individuals' knowledge more accessible to others? To ensure that the best knowledge is being applied to decisions? To uncover knowledge gaps and to fill them?

In considering the right approach to knowledge management, it helped to consult the available literature on the topic and, even more so, to share ideas with other managers focusing on knowledge. Our current thinking is that knowledge management at Monsanto should focus on five objectives:

- Connecting people with other knowledgeable people
- Connecting people with information
- Enabling the conversion of information to knowledge
- Encapsulating knowledge, to make it easier to transfer
- Disseminating knowledge around the firm

It also seems clear that, despite the claims of some technology vendors, there are no "silver bullets" to accomplish any of these objectives. In our knowledge management efforts, as in any major business initiative, lasting change can only come about through a sustained and balanced interplay of process, technology, and people.

Knowledge Creation as a Process

Most managers today would agree that managing an area requires an understanding of the basic processes involved. Certainly I, as a manager tasked with improving knowledge management, felt I needed a better understanding of knowledge processes, and particularly of those involved in knowledge creation.


Figure 1: Modes of Knowledge Conversion—socialization, externalization, combination, and internalization. Adapted from Nonaka and Takeuchi, The Knowledge-Creating Company.

How does a business become knowledgeable about a new area (a "white space")? What's the difference between collecting data points and advancing knowledge? How would Monsanto know if it were becoming more knowledgeable in net over time?

An excellent resource in thinking about these questions was the work of Ikujiro Nonaka and Hirotaka Takeuchi, who wrote The Knowledge-Creating Company. Their starting point became Monsanto's: that, "in a strict sense, knowledge is created only by individuals." That observation, simple as it seems, has served many times as a touchstone for proposed initiatives. Far from denying the value of organization-level knowledge management, it emphasizes the need for explicit efforts to make knowledge more widely known. In their words, "Organizational knowledge creation should be understood as a process that 'organizationally' amplifies the knowledge created by individuals and crystallizes it as a part of the knowledge network of the organization."

That process, as Nonaka and Takeuchi describe it, can seem chaotic—yet there is some orderliness to it. (See Figure 1.) There is certainly no strict sequence of steps 1, 2, 3, and 4, but typically knowledge is created in four ways. "Socialization" involves activities like brainstorming, discussion, and debate, where people expose their knowledge to others and test its validity. "Externalization" involves putting knowledge to use; this happens when the organization makes a decision, for example, or states a goal. "Combination" is the bringing together of diverse pieces of knowledge to produce new insight. And "internalization" happens when an individual, exposed to someone else's knowledge, makes it their own.

The interesting thing is that the process can start in any of the four quadrants, and will trigger activity in the others. "Organizational knowledge creation," explain Nonaka and Takeuchi, "is a spiral process in which the above interaction takes place repeatedly." For this spiral to remain active and ascending, it must take place in an "open system," in which knowledge is constantly exchanged with the outside environment. And it must be fueled by seeming contradictions and paradoxes; by constantly challenging the existing knowledge, these infusions will force higher discoveries and syntheses. (It's important to realize that good knowledge management is not about making everyone's life more comfortable. Better to make it uncomfortable! The knowledge creation process should generate more questions than answers.)

Reflecting on any typical work week, it's clear most of us vacillate between upward and downward spirals. As might be expected, the downward trend is set in motion when knowledge activity does not culminate in "internalization." Meetings may be called, opinions may be vocalized, decisions may be made—but if a significant number of individuals do not leave the table with their own knowledge enhanced, there is no lasting gain. For Monsanto, the process laid out by Nonaka and Takeuchi made the management challenge clear: to stay in an upward spiral.
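For quick reference, the four modes as characterized here can be kept in a small lookup structure. This is only an illustrative summary of the paragraphs above (the authors stress there is no strict sequence), not Nonaka and Takeuchi's own formalism; the helper function is a hypothetical restatement of the "downward spiral" warning.

```python
# Illustrative summary of the four conversion modes as described in the text.
CONVERSION_MODES = {
    "socialization": "exposing knowledge to others and testing its validity "
                     "(brainstorming, discussion, debate)",
    "externalization": "putting knowledge to use (making a decision, stating a goal)",
    "combination": "bringing diverse pieces of knowledge together to produce new insight",
    "internalization": "an individual makes someone else's knowledge their own",
}

def lasting_gain(modes_touched: set[str]) -> bool:
    """Warning sign for a downward spiral: activity that never culminates
    in internalization produces no lasting gain."""
    return "internalization" in modes_touched

print(lasting_gain({"socialization", "externalization"}))  # False
```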


Knowledge is created through socialization, externalization, combination, and internalization. The challenge is to support all of these processes so that the organization enjoys an upward spiral of knowledge creation. Since all of them involve making connections among people and information sources, the objective of Monsanto's technology and people-oriented changes has been to turn the company into a "connection-making machine." New IT capabilities have been implemented to create knowledge repositories, cross-link them for easy navigation, and support decision-making. At the same time, people are connecting more effectively through networks or "communities of practice."

Technology: Building a "Connection-Making Machine"

Socialize, externalize, combine, internalize. At its heart, the knowledge creation process is about making connections. The objective of Monsanto's knowledge work, then, is to facilitate those connections, first, among knowledgeable people (by helping them find and interact with one another) and second, between people and sources of information.

Information technology, of course, is a key enabler of connections, and this has been a big part of our work to date. Through a variety of information initiatives we have implemented data warehousing, full-text search engines, internet/intranet capabilities, collaborative workgroup software, and major new operational systems (SAP). More broadly, our work in information management has three thrusts:

- To create repositories (data warehouses, operational systems) to house important information, both quantitative and qualitative;
- To cross-link those repositories so that navigation is easy and the technology is transparent to users; and
- To improve our capabilities to perform analyses in support of decision-making.

Our IT focus in knowledge management has been on infrastructure—that is, on creating enterprise-wide capabilities and not on delivering information systems per se. Together, the systems we have implemented comprise a logical architecture by which end-user applications tap the structured and unstructured knowledge available to the organization.

The People Aspects: Networks and New Roles

As consuming as some of our technology implementations have been, we have tried to keep in mind at all times that any information system is simply a means to an end. Information technology may be a necessary but will never be a sufficient condition for knowledge creation and sharing. Coming back to Nonaka and Takeuchi's observation, we realize at Monsanto that knowledge is fundamentally about people management—equipping and encouraging people to generate knowledge important to our future and share it with others in the organization.

Accordingly, much of our work has been directed at the creation, care, and feeding of networks, or "communities of practice." Networks of people are not only mechanisms for communicating; they help to advance collective understanding by providing a forum for "sense-making." In so doing, they create value for their individual members as well as the organization. What is needed to sustain a vibrant network? We believe it takes information, permeable and natural communication, dynamism in the network itself, and some key supporting roles: what we have come to call "stewards" (or "shepherds"), topic experts, and "cross-pollinators."


Various important "knowledge worker" roles are being recognized and formalized at Monsanto. These include knowledge stewards, topic experts, cross-pollinators, and knowledge teams. In its ongoing knowledge efforts, Monsanto will benefit from an emerging methodology that ties knowledge management to business strategy. The methodology calls for mapping the business at successive levels, so that the overall business model is translated into requirements for information, knowledge, performance measures, and supporting systems. In future efforts, as always, the greatest successes will come from a balanced approach featuring people, process, and technology-related changes.

The role of "knowledge steward" can be stated in terms of Nonaka and Takeuchi's emphasis on the "upward spiral" of knowledge creation. Sustaining that spiral requires focus and resources—and this is the responsibility of the steward. Think about the mission director's role in Apollo 13, the famous NASA moon shot in which a crew was nearly stranded in space. His role was to clarify the most important knowledge problem facing the group, marshal the right resources and experts, and tighten their collective focus on finding a solution. Importantly, at Monsanto, the steward is not necessarily the ranking member of a given team; playing the role has much more to do with having the right instincts and inclinations than having the right title. Similarly, "topic experts" can come from all walks of the organization; they are the knowledge workers whose perspectives help the network "make sense" of the information before them, by recognizing patterns and providing context. "Cross-pollinators," finally, are the conduits to and from other networks and other sources of knowledge. Their activity supports synthesis and "outside the box" thinking by the networks.

Supporting all these functions at Monsanto is a web of knowledge teams, tasked with creating and maintaining a "yellow pages" guide to the company's knowledge, and serving as points of contact for people seeking information about different subjects. These teams are far more than information "help desks." They are proactive and creative in thinking about Monsanto's knowledge needs in their assigned topic areas. If a business manager asks for information, they know not to stop simply at fulfilling the letter of the request. Instead, they probe further into why that information is valuable, how it will be used, and therefore who else might find the same information—or access to the requester's increasing knowledge—of value. Because these teams must cast a wide net, covering internal and external, qualitative and quantitative information, their composition is similarly diverse: each brings together people trained in information technology, library science, and relevant content areas. The teams are geographically dispersed and self-directed.


Mapping the Way Forward

If there is any such thing as a "knowledge management methodology," it is being invented on the fly by organizations like Monsanto in the midst of knowledge experiments. With the benefit of only our experience to date, we would propose a methodology based on a series of maps, charting the knowledge challenges facing a firm at progressively finer degrees of resolution.

First, it makes sense to map the overall business model driving the firm's performance and profitability. What actions and decisions are most important to its success? We have come to call a map drawn at this level a "learning map"—mainly because it is so useful as a tool to educate the entire organization and help them to internalize the strategic aims of the business.

Once a learning map has illustrated what drives the business, it is possible to construct an "information map" noting the information required to support that activity and decision-making. This information map must consider qualitative (or unstructured) information as well as quantitative, and information from both internal and external sources. (See Figure 2.)

Figure 2: The Information Map Points to Sources of Existing Information. Internal, quantitative (structured) sources: financial, sales, budgeting, LRP. Internal, qualitative (unstructured) sources: strategic plan, monthly reports, documents. External, quantitative sources: point of sale, competitor information, financials. External, qualitative sources: newswires, Federal Register.

A "knowledge map" can come next, illustrating how information is codified, transformed into knowledge, and used. Among the important uses of this map are to highlight knowledge strengths and shortfalls in the organization and to inform the creation and support of knowledge networks.

The fourth type of map is something more widely known as a "Balanced Scorecard"—that is, the set of performance measures that top management should use to gauge the health and progress of the business. Such scorecards are "balanced" because they combine traditional financial measures (which lag performance) with important non-financial measures (which are often leading indicators). Many firms are implementing balanced scorecards with or without knowledge initiatives. It's important to include them in a knowledge management methodology for two reasons: 1) because knowledge fuels performance in fundamental ways, and 2) because measures must be defined to track the impact of knowledge management efforts.

All four of these maps highlight needs for data and information storage, manipulation, and integration. Thus, the final map is an information technology map reflecting the infrastructure and systems needed to support the knowledge work of the organization.

This basic five-map methodology, stated here at a very high level, is the framework we intend to apply to our ongoing knowledge management efforts at Monsanto.
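Stated as data, the sequence reads as a simple ordered pipeline. The five map names and the question each answers are taken from the description above; the class and field names are an illustrative sketch only, not anything Monsanto documented.

```python
# A minimal sketch of the five-map methodology as an ordered pipeline.
from dataclasses import dataclass

@dataclass
class BusinessMap:
    name: str
    answers: str  # the question this map is meant to answer

FIVE_MAP_METHODOLOGY = [
    BusinessMap("learning map",
                "What actions and decisions drive the firm's performance and profitability?"),
    BusinessMap("information map",
                "What internal/external, quantitative/qualitative information supports them?"),
    BusinessMap("knowledge map",
                "How is that information codified, transformed into knowledge, and used?"),
    BusinessMap("balanced scorecard",
                "Which financial and non-financial measures track the health of the business?"),
    BusinessMap("information technology map",
                "What infrastructure and systems support the knowledge work above?"),
]

for m in FIVE_MAP_METHODOLOGY:
    print(f"{m.name}: {m.answers}")
```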


Knowledge Management: The Essential Elements

If Monsanto's efforts at knowledge management are succeeding, it is probably due most to our holistic approach. Rather than relying on a single bullet—like knowledge-sharing incentives, for example, or groupware—Monsanto is drawing on a whole arsenal of people-, process-, and technology-related changes. The first priority in terms of people has been to recognize and formalize the roles of different kinds of "knowledge workers." The work on process has been, first, to focus on knowledge creation, and then to define ways in which individual knowledge becomes an organizational asset. And the focus of technology efforts has been to impose better organization on knowledge and enable connections among people and information.

Is it working? Certainly, Monsanto is growing profitably, and our success in "exploiting white spaces"—particularly in the exploding field of biotechnology—has been proven. One of our most interesting new products, for example, is Bollgard™, which can properly be called a "smart product." Bollgard is a new kind of cotton plant, genetically engineered for greater defense against a pest which accounts for 80% of cotton plant destruction in the United States. In fact, in one six-month period since the firm began actively managing knowledge, it was awarded four regulatory approvals to sell innovative new products—this, in an industry where the typical EPA approval process has taken over eight years.

Is this purely due to better knowledge management? Of course not. The number of variables that come into play in a regulatory approval process is great. But does there seem to be a direct correlation between investment in knowledge and better performance in new product and business development? Clearly, yes. Monsanto is moving ahead as a source of innovation and effectiveness. And if stock price is any indicator, the word is getting out that we have some very productive units, and that we are figuring out how to engage the collective intellect of our people.


Innovation in Action

Selling Knowledge on the 'Net

About "Ernie":
"Ernie" is the name (and face) given by Ernst & Young to its newest consulting resource—an internet-based service to entrepreneurs. Humanizing the service made sense, because Ernie is no robot; he's an amalgam of real people—the firm's best advice-givers—and their most up-to-date knowledge. "Ernie," claims the firm's ads, "has 1,000 years of experience and over 150 professional degrees, knows IPO to IRS and ESOP to MIS. Ernie has been there, done that, and never takes a minute off." Contact him at http://ernie.ey.com.

Of the thousands of corporations that have built sites on the World Wide Web, only a handful have dared to explore the Web's potential for offering new and compelling ways of doing business. Ernst & Young has joined that select group with its ambitious new Web-based service, Ernie.

"The consulting industry advises clients on how to deal with all the changes occurring in technology, yet rarely do we talk about how our own industry is being impacted by all these changes," says Roger Nelson, deputy chairman of Ernst & Young Consulting Services. "The Internet offers a powerful new way to deliver services directly to clients. We wanted to use the power of the medium to its fullest potential."

If Ernie lives up to its promise—quick, direct access to the knowledge and resources of Ernst & Young, whenever and wherever the client needs it—it will change the consulting industry as we know it. Ernie is designed to serve an emerging market that most large consulting firms cannot effectively serve: the swelling segment of startup companies with revenues under $200 million per year. "These companies are growing quickly and are faced with building an organization around their successful product or service," says Greg Erickson, director of Ernst & Young's Entrepreneurial Consulting division. "They face an endless set of issues that they must navigate to grow the business: setting up compensation plans, creating a computer infrastructure, establishing human resource policies, interpreting tax and accounting regulations—all the things that large organizations take for granted."


Sometimes, the goal in knowledge management is to introduce new knowledge-based products or services. Such was the case with "Ernie," a new internet-based consulting service launched in 1996 by Ernst & Young. Ernie enables the firm to leverage further its single most important asset: the knowledge of its professionals. The service is marketed to entrepreneurs—whose needs for business advice are wide-ranging, but who are rarely able to justify involved consulting projects.

Traditionally, these companies have had to build their corporate structures piece by piece, hiring specialists in each area, or training staffers to become those specialists. Hiring consultants was out of the question—they were considered to be too expensive and took too long. "We knew we had the resources to help them, but we lacked a cost-effective means to provide services when and where they needed them," laments Erickson.

No means, that is, until the Internet and the World Wide Web began to gain corporate acceptance. Ernst & Young realized that the Internet could offer an effective alternative for companies that can't afford or don't need a full team of onsite consultants. "We began looking at efficient means of linking a business and all its departments directly to our resources," says Brian Baum, Ernst & Young's director of Internet service delivery. "We realized that the power of the Internet was that we could use it as a medium to establish that direct connection."

Though the medium certainly has powerful potential, Baum doesn't mean to imply that the Internet will replace Ernst & Young's traditional means of delivering consulting. Bringing a team of consultants onsite remains the best way to implement large computer systems or bring large-scale change to an organization. Ernie offers something completely different, according to Baum. "When we first announced Ernie, the press kept asking, 'Won't this cannibalize your traditional services?'," says Baum. "We were surprised by the question, because we didn't think it was an issue."

Why not? "The pace of change is so fast today that there is a need for immediate support to help navigate the waters of change," replies Baum. "That need hasn't always been there. Organizations used to have time to adapt to change. Now they don't. They need help today, and traditional consulting can't offer that kind of help. So Ernie is serving an entirely new market—the market for decision support."

Understanding the Internet's potential for delivering services to clients was one thing. Creating a service capable of providing real value to customers—and convincing them that it was worth paying for—was quite another, however. When it explored ways to structure the service, Ernst & Young decided to look to its strongest asset: the many years of experience that are captured in various minds and databases around the company. "At one time or another, we have answered just about any question these emerging companies could have," says Baum. "It was just a question of developing a quick, effective interface for linking clients with the resources of Ernst & Young."

Simply offering online access to a database full of documents would not be much of a help to clients, however. That's why Ernst & Young decided to build Ernie not just out of bits but out of brains as well.


Ernst & Young accumulates business knowledge at a fantastic rate in the course of helping clients around the world solve problems every day. But, as knowledge delivery vehicles, its specialists are inefficient. They can only be in one place at a time, and their available time is scarce. The Internet provided a new delivery vehicle, and allowed the firm to leverage its knowledge to serve a whole new tier of clients.

When clients access the Ernie Web site, they are asked to formulate a question, assign a title to it, and offer some background on how they plan to use the information. Then they choose one of eight categories—tax, for example—and submit the question. If they have trouble formulating the question, or don't know what to ask, they can access Ernie's extensive Frequently Asked Questions (FAQ) database for help. If the FAQ database doesn't meet their needs, the client can send a specific query to Ernie asking for help.

Once the query is formulated and submitted, Ernie automatically routes it to the appropriate department—and the appropriate consultant within that department—who receives an e-mail saying he or she has a question from Ernie. The consultant is then responsible for packaging together a response that includes his or her specific experience in the subject area, as well as data from Ernst & Young's resource database. Clients are guaranteed a two-day turnaround time on all their questions. And since clients pay a flat yearly fee to use Ernie, they can ask as many questions as they like.

While the Internet offers Ernie customers quick and inexpensive access to consulting knowledge, the medium also provides a very important benefit to Ernst & Young consultants: flexibility. "Airplanes aren't flexible," says Baum, "the Internet is." With portable laptops and fast Internet links to Ernst & Young's powerful computer network, consultants can access their Ernie questions wherever and whenever they want. This means Ernie queries can go to the most qualified consultants while the logistics remain simple. "We need to keep the service fresh, and that means having access to our most experienced people working out in the field," says Baum. "The service would quickly fail if we began sending out pre-programmed or scripted responses."

Because Ernie is such a radical break with the traditional consulting business model, it has taken a while for customers—and Ernst & Young itself—to get used to the concept. "We are aware that the consulting industry in general and the Big Six in particular have some image baggage: high cost, limited access, and only high-level contacts in client companies," says Baum. "We needed a way to broaden our reach, to interact with all levels and all departments of the client company. Internally, we met with a bit of skepticism at first, but when the skeptics realized that this was an opportunity to explore an untapped market and deflate some of the Big Six stereotypes, they joined the team."

Even the name of the service reflects a kind of accessibility and informality that one would not normally associate with the Big Six. "We had a lot of discussion over the name," chuckles Baum. "We had some formal names that perpetuated all the stereotypes about the Big Six, and then we had Ernie mixed in there. We realized that since the service represented a major shift in the way we did business and marketed ourselves, the name should reflect that shift."
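The flow just described can be sketched in a few lines of code. The two-day turnaround, the flat yearly fee, and the eight categories (of which only "tax" is named in the article) come from the text above; the other category names, the routing rule, and all identifiers are hypothetical.

```python
# Hypothetical sketch of Ernie's query flow: a titled question in one of
# eight categories is routed to a department and consultant, with a
# two-day response guarantee.
from dataclasses import dataclass
from datetime import date, timedelta

# Only "tax" is named in the article; the rest are assumed placeholders.
CATEGORIES = ["tax", "accounting", "technology", "human resources",
              "finance", "operations", "strategy", "legal"]

@dataclass
class ErnieQuery:
    title: str
    question: str
    background: str   # how the client plans to use the information
    category: str

def route(query: ErnieQuery, submitted: date) -> dict:
    if query.category not in CATEGORIES:
        raise ValueError(f"unknown category: {query.category!r}")
    return {
        "department": query.category,                   # routed by category
        "consultant": "most qualified available",       # notified by e-mail
        "response_due": submitted + timedelta(days=2),  # two-day turnaround
    }

q = ErnieQuery("ESOP setup", "How should we structure an employee stock plan?",
               "Pre-IPO startup, 40 employees", "tax")
print(route(q, date(1996, 10, 1)))
```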


Ernie's best customers to date have been managers dealing with information technology decisions, tax planning, and human resource issues. All are areas where up-to-date knowledge is critical, but relatively generic and easy to share. What makes Ernie a knowledge offering and not simply an information service is that it combines database access with human access. Responses to inquiries are not automatic but compiled by experienced professionals sensitive to the context of the inquiry.

So far, industry observers don't seem to have any problems with the name—or the service. "Entrepreneurial companies usually require the most help with tax, accounting, and technology questions as they struggle to grow, yet they lack the internal resources to tackle them on their own," says Susan Siew-Joo Tan, an analyst with International Data Corporation. "With Ernie, they can now engage the help of a world-class firm at a competitive price."

Ernie is most popular among functional and departmental executives, who must grapple with the short-term effects of rapid change. When Ernst & Young piloted Ernie with 88 of its client companies, the concept had its greatest power among HR directors, MIS directors, and accounting executives. "People who have to act quickly," says Baum.

People like Lyle DeWitt, who is controller for Trinet Employer Group, a fast-growing, San Leandro, California-based company that provides human resource services—such as payroll checks and benefit plans—to companies with fewer than 100 employees. "Ernie is a great product for our needs," says DeWitt, "because it gives us quick answers about various tax codes and other human resource-related issues."

Companies also find that Ernie increases the productivity of their own people. Ron Wacek, vice president of finance for the American division of Syskonnect, a German computer networking hardware maker, used to depend solely on peers in the company to help him get answers to complex questions regarding international trading and currency issues. Sometimes his colleagues had the answers at their fingertips; more often, they had to spend time researching the problem. Now Wacek can access the research already done by Ernst & Young. "Ernie gives me peace of mind," he says. "Ernie's vast database can handle any question asked and that makes it well worth it—even if I only have one question a month."

Yet these emerging companies aren't merely looking for answers to current problems. Since their primary objective is growth, they also want to be able to predict future trends. Helmuth, Obata & Kassabaum, a St. Louis-based international architecture firm, uses Ernie as a quick, inexpensive means to get strategic information about the rapid changes that are affecting the airline industry. "We do a fair amount of airport work, but we wanted to be a little bit ahead of the game, so we are using Ernie to get advance information," says Chris Strom, director of marketing for the company.

"Just-in-time consulting" such as this has never been a possibility before Ernie, notes Baum. "We never used to be on these people's radar screens, either because they didn't have time for consulting engagements or because of their perceptions about the Big Six," says Baum. "Now they can get direct input into the decision-making process while they're making the decision. We see this electronic delivery of services as becoming a cornerstone of the consulting industry," he adds.

If so, that cornerstone may have an inscription that reads simply: "Ernie."


A Blueprint for Change

Choosing Your Spots for Knowledge Management
Peter Novins and Richard Armstrong

About the authors:
Richard Armstrong is Manager of Continuous Improvement Initiatives at Bechtel, the huge engineering and construction firm. He was part of the design team that developed Bechtel's Organizational Learning process and was chartered with its implementation. Currently, he is developing change strategies to better leverage the knowledge of the Engineering and Construction Central Functions organization. Contact him at rmarmstr@bechtel.com.

Peter Novins is an Ernst & Young Partner based in New York. He specializes in helping clients enhance the performance of their knowledge assets and the productivity of their knowledge workers. Contact him at Peter.Novins@ey.com.

If you think the decisions that make or break a company are those made by strategists at the top, go back and re-read your Tolstoy. Whether in war or in commerce, it's the sum total of countless decisions made every day on the front lines that determine the course of future events. Their amassed weight can create a momentum—or a chaos—far beyond the power of senior leaders to redirect. Success, in business or other large-scale endeavors, depends on good individual, daily decisions outweighing bad ones over time. And the most important thing top management can do to ensure success is to empower people throughout the organization to make good decisions. Partly, this is a question of simply granting the authority for decision-making—and establishing accountability for decisions made. But much more importantly, it's a question of equipping people with the knowledge required to make decisions well.

We see companies of all kinds coming to this same realization, and also recognizing its implication: that they must manage their organizational knowledge in a much more deliberate, explicit fashion. When bad decisions are made, after all, it is rarely because the knowledge did not exist in the organization to make them well. It's because that knowledge wasn't brought to bear on the decision, because it was buried in a corner of the organization where it could not be found in time—or because it was never sought in the first place.


The purpose of knowledge management is to ensure that every decision-maker has the benefit of the whole organization's experience and intellect. But managers often start down the wrong path in trying to facilitate knowledge transfer because they categorize knowledge according to domains (what it is about). In considering what knowledge to manage and how, it is more useful to consider its applicability and transferability.

Applying the fullness of an organization's knowledge to its decisions means working hard to represent it, transfer it, make it accessible, and encourage its use. None of this happens automatically.

Unfortunately, while many companies are seeing the need for better knowledge management, we see most of them leading the charge by heading directly and forcefully down the wrong path. The major problem seems to be the model they adopt for organizational knowledge and learning—which is based too simplistically on how people learn as individuals—and the false priorities that model points them toward.

Thinking About Knowledge In a New Way

It's understandable that, as managers start to think about making organizations knowledgeable, they look for guidance to the only model they know: making individuals knowledgeable. They make an immediate assumption that organizational knowledge is simply individual knowledge writ large, and that the analogy will hold up through any level of expansion. It's not true.

The first problem that this false start leads to is a tendency to categorize the knowledge that is important to the business into domains. If organizational knowledge management is basically the challenge of sharing knowledge broadly that used to be trapped in pockets, then the first questions to arise are: what knowledge are we hoping to share, and with whom?

When individuals think about the "what" of knowledge, they think in terms of domains, or what it is they are knowledgeable about. A person may be knowledgeable about political history, fly-fishing, or accounting. Asked to categorize their knowledge, they would organize it into buckets like sixties trivia, seventeenth-century poetry, and current affairs. Our bookstores are set up in this way; so are our newspapers. In fact, our entire educational system encourages us to think in terms of domains of knowledge, and the mastery of them over time.

It's hardly surprising, then, that when aspiring knowledge managers in organizations begin thinking about the question "what knowledge?", they think in terms of domains. "We need to manage our knowledge about the Japanese market," they might say—or chemical engineering, or new product introduction. Recognizing that there are various means by which and levels to which knowledge can be managed, they will then set priorities amongst these domains.

There are other ways, however, of classifying knowledge that have little to do with content. Rather than slotting a piece of knowledge based on whether it is about this or that, for example, we might slot it according to where it could be found: does it come from a single source or from multiple sources? This would be a focus on origin as opposed to domain. Some other characteristics that might guide knowledge classification include:


- Recipient: Who is likely to need to use it?
- Applicability: How broadly does the knowledge apply? Is it local or global in nature?
- Transferability: How easy is it to impart the knowledge to others, and how difficult for them to apply correctly?
- Richness: How much is the knowledge dependent on its context, and how much meaning would be lost through simplification?
- Currency: How old is the knowledge? How timeless?
- Trustworthiness: Is it easy to test? Does it come from a reliable source?

The list could go on and on. The point is that, in considering the question of what knowledge to share, the answer need not be expressed as "knowledge about x" or "knowledge about y." In fact, our work indicates that thinking about knowledge in terms of domains is not very useful at all in guiding knowledge management. Instead, the real insight comes when we look at relative levels of applicability and transferability.

Categorizing Knowledge for Management

Any given piece of knowledge that may be important to a business decision falls somewhere along a continuum of applicability. (See Figure 1.) At one extreme, a thing that is known might be purely local in nature. That is, it applies only to its immediate set of conditions, and is dependent on a given physical or geographic situation. At the other extreme, the knowledge might be global in nature, applying widely across the business, and across process, industry, technical, and cultural bounds. One way to think of this range is as detailed versus general knowledge. It seems clear that a given piece of knowledge should be managed differently according to how broadly applicable it is.

Figure 1: Knowledge Can Be More or Less Universally Applicable. Local: applies only to a limited set of conditions; dependent on physical and/or geographic situation; "detailed" knowledge. Global: widely applicable across the business; crosses process, industry, technical, and cultural bounds; "general" knowledge.


Equally important is the level of transferability exhibited by a given piece of knowledge. (See Figure 2.) Knowledge that is rule-based is highly transferable, because it can be stated simply and definitively ("if x condition is present, then the best approach is y"). It can be transferred multiple times without losing its validity. At the other extreme, transferability is low when knowledge is judgment-based and/or very context-sensitive. While it may be highly applicable in future situations, it is hard to capture in such a way that it can be accessed "just in time." The continuum of transferability can be seen as ranging from the programmable to the unique.

Figure 2: Knowledge Can Be More or Less Easily Transferred. Programmable: rule-based knowledge; can be applied multiple times; "learning from history" to avoid repeating mistakes. Unique: judgment-based; context-sensitive to the specific situation; "just-in-case" knowledge, projecting into possible future problems.

If we think along both of these dimensions simultaneously, then we begin to have an answer to the question of "what types of knowledge are we dealing with?" Figure 3 shows a matrix of four possible categories of knowledge. Their labels point to key differences in how they should be managed, as discussed below.

Figure 3: Four Useful Categories of Knowledge to Manage. The matrix crosses applicability (local vs. global) with transferability (programmable vs. unique): local and programmable knowledge is "Quick Access" knowledge; local and unique, "One-Off"; global and programmable, "Broad-Based"; and global and unique, "Complex."


"Quick Access" knowledge is well served by placement in a database where it can be sought when needed. "Broad-Based" knowledge, on the other hand, should be distributed proactively to large groups of people. "Complex" knowledge deserves the most management attention, and can't be transferred purely through technology. "One-Off" knowledge rarely justifies management investment beyond creating informal communities of interest.

Quick Access Knowledge

A piece of knowledge may be easily transferable (even programmable) but not very broadly applicable. For example, a reservations clerk in the Ritz-Carlton Hotel may learn that when Mr. Smith books a room, he wants a non-smoking one. This is a piece of knowledge easily transferred to others throughout the chain, but not very broadly applicable. (We cannot infer, for example, that all people from Smith's hometown are non-smokers.) The term "quick access" makes sense for this kind of knowledge because it is best managed by placing it in an accessible spot—most likely a sophisticated database—for use if and when needed. It would be a mistake to distribute this knowledge proactively to all personnel, just in case they might someday need it.

Broad-Based Knowledge

Other pieces of knowledge in the organization may be both easily transferable and broadly applicable. An example might be the organization's personnel policies, such as the knowledge of how to fill out a timesheet. With such "broad-based" knowledge types, it does make sense to broadcast to the organization by packaging the knowledge and distributing it proactively. Unfortunately, there is a tendency in firms to think more knowledge is broad-based than actually is; this is the source of the "information overload" felt by so many employees. One antidote is to broadcast information about how to access commonly needed knowledge, rather than broadcasting the knowledge itself.

Complex Knowledge

When a piece of knowledge is broadly applicable but not easily transferable, it is best transferred through structured training efforts. An example of such complex knowledge might be, in a consulting firm, the knowledge of how to manage a large-scale project. Many people in the organization need this knowledge, but the vicissitudes of good project management are largely resistant to hard-and-fast rules. In other industries, the approach to managing complex knowledge transfer is often apprenticeship. In both cases, there is a recognition that the learner must develop a feel for the area that can only be gained through proximity or attention to someone already knowledgeable about it.

One-Off Knowledge

Finally, there is knowledge in organizations that is neither easy to transfer nor broadly applicable. A network manager in one office may know a great deal about configuring Macintosh systems, but if most of the organization uses Windows, the knowledge is not worth sharing broadly—and would be difficult to transfer in any case. Because the payoff of managing this category of knowledge is very low, it makes little sense to focus knowledge management efforts here. It is sufficient to support the establishment of informal, special-interest networks of people who might benefit from interacting occasionally with each other.
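The four prescriptions collapse into a simple decision rule. The sketch below treats the two continua as booleans for brevity; the category names and management approaches come from the discussion above, while the function itself is only an illustration.

```python
# Illustrative classifier for the applicability/transferability matrix.

def categorize(broadly_applicable: bool, easily_transferable: bool) -> tuple[str, str]:
    """Return (category, suggested management approach)."""
    if easily_transferable and broadly_applicable:
        return ("Broad-Based", "package the knowledge and distribute it proactively")
    if easily_transferable and not broadly_applicable:
        return ("Quick Access", "store it in an accessible database for use when needed")
    if broadly_applicable and not easily_transferable:
        return ("Complex", "transfer it through structured training or apprenticeship")
    return ("One-Off", "support informal special-interest networks only")

# The Ritz-Carlton example: highly transferable, narrowly applicable.
print(categorize(broadly_applicable=False, easily_transferable=True))
# ('Quick Access', 'store it in an accessible database for use when needed')
```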


Article abstract: Traditionally, we think of knowledge being transferred from a knowledgeable individual to another individual or, in a classroom setting, to a group. The larger opportunity for knowledge management is in transferring the knowledge of the many bright and experienced people in an organization to the individual decision-maker—or better still, to many decision-makers. Going from one-to-one knowledge transfer to many-to-many requires a shift in mindset. It takes people out of the comfort zone that has been established through a lifetime of traditional learning.

These brief notes already make it clear that thinking about knowledge in terms of applicability and transferability yields much clearer guidance for management than thinking of it in terms of domain. In all four cases, as in most areas of business, the best form of management is a careful balance of influencing people's behavior, introducing effective processes, and putting in place supporting technology. The mix differs, however, with the category. Quick Access knowledge, for example, is highly amenable to computerization, and management here should be at its most IT-intensive. Complex knowledge, on the other hand, demands the highest level of people management. The four categories have clear management implications, too, for levels of investment and effort. One-Off knowledge yields little return on high management effort. Complex knowledge management may represent the single greatest source of competitive advantage.

Getting Knowledge to the Right People

Earlier, we said that the challenge of knowledge management starts with a key question: what knowledge are we hoping to share, and with whom? Many companies have started down the wrong path in the first part of the question, by looking at knowledge in terms of content domains rather than in terms of applicability and transferability. The mistake follows naturally from the fact that the only model most managers have for knowledge management is that of the individual. The same bias gets companies into trouble on the second part of the question: among whom does knowledge need to be shared?

Individuals are most comfortable with knowledge sharing that originates with individuals. A single knower envisions himself imparting knowledge to one other person, or imparting knowledge to many other people. Similarly, when it comes to gaining new knowledge, he is apt to think of that knowledge being imparted to him by some one other person, who is addressing either him alone or him as part of a group. Figure 4 shows that this mindset of one-to-one or one-to-many knowledge transfer is only half the universe of possibilities. Our own experience would indicate further that it is the poorer half. The real opportunity lies in the realm where individuals and companies are least comfortable—knowledge transfer from many to many.


Figure 4: Possible Levels of Knowledge Sharing. A two-by-two matrix of knowledge origin (one or many) against recipient (one or many), with examples including apprenticeship, coaching, mentoring, presentations, books, articles, and networks; the greatest leverage sits in the many-to-many cell.

A company learns more in a day than an individual learns in a career. It makes sense, then, that when it comes time for an individual to make a business decision, he or she will do better to draw on the knowledge of the total organization rather than the knowledge of a specific individual, however intelligent. When a decision-maker floats an inquiry with a group of advisers, he is inviting knowledge transfer from many to one. When a project team raises questions on a networked discussion database, the transfer taking place is many-to-many. Both hold the promise of applying greater amounts of useful knowledge to people at the point in time when a decision needs to be made.

Knowledge transfer from many to many will not become comfortable overnight. Managers will continue to tend toward their traditional means of acquiring knowledge individually and from trusted individuals. And as they begin to experiment with broader scopes, they will quickly run up against a disconcerting sense of loss of control. (Is too much knowledge being given away too broadly?) Nevertheless, the allure of many-to-many knowledge transfer is already undeniable; what else could account for the wild popularity of groupware—a technology that has yet to prove itself with hard results? The recognition is growing that the real leverage to be gained from knowledge assets is in this kind of transfer, and that knowledge management efforts should focus here first.

A Case of Knowledge Management: Bechtel Corporation

One company that has focused its knowledge management efforts appropriately on what knowledge to share with whom is Bechtel Corporation. Given the company's recent competitive challenges, there is no question it needed to get more leverage out of its knowledge assets. But just as surely, its constrained operating budget meant that any knowledge management projects had to be targeted at the areas of greatest payback. As in many companies, the path forward was hardly obvious at the outset and Bechtel had a few false starts. (See the sidebar, "Managing Complex Knowledge at Bechtel.")

Very quickly, Bechtel's management came to understand that the real payback opportunity lay in learning how to leverage its complex knowledge. Project teams working on multi-year engineering and construction jobs were inevitably generating huge amounts of new knowledge—but unfortunately, much of that knowledge was remaining hidden to the rest of the organization. Isolated in those teams, it was inaccessible and unleveraged. Wheels were being reinvented left and right.


Article abstract: The frameworks outlined in this article were developed in work at Bechtel, where they helped to focus knowledge management efforts on the areas that promised the greatest impact on quality and profitability. One tendency that had to be countered was individual project managers' assumptions that their challenges and experiences were unique. In fact, much of the learning taking place in Bechtel's hundreds of active projects is highly applicable to others.

Determining what types of knowledge could and should be shared was an important—and sometimes contentious—discussion. There was a real bias on the part of individual project managers to believe that each of their projects was unique. If that were true, then the knowledge gathered for or generated by their teams would not be relevant to other teams, and any effort to capture and transfer it would be a waste of time. In fact, however, the level of applicability is surprisingly high. People had to have this demonstrated to them in a variety of cases before they could see how much of Bechtel's knowledge fell in the category of "complex knowledge" as opposed to being "one-off."

Today, Bechtel is busy creating the infrastructure that will support the rapid transfer of complex knowledge from many to many. The effort holds great promise for increasing the value Bechtel delivers to clients around the world. Happily, it also promises greater profitability on major projects.

Keeping Knowledge Management in Perspective

As an increasing number of success stories are publicized about benefits gained from more active knowledge management, we have only one fear: that the approach will begin to be pursued for its own sake. Knowledge management is not an end in itself. Businesses do not exist for the purpose of propagating and advancing knowledge—they exist to sell products and services. But to the extent that competitive advantage relies on informed decision-making within the business, knowledge management has a critical role to play.

Keeping knowledge management in perspective also means understanding that it may be possible to have too much of a good thing. As with process innovation, quality management, or any improvement program requiring resources, there are problems in businesses that simply aren't big enough, aren't important enough, aren't worth the effort of fixing. In the end, the success of knowledge management will hang on the thoughtfulness of managers in choosing their spots.


Managing Complex Knowledge at Bechtel
Richard Armstrong, Bechtel Corporation

Since its founding in 1898, Bechtel has worked on more than 15,000 projects in 140 nations on all seven continents. Its people have mastered every kind of engineering, construction, management, development, and financing challenge imaginable. At the end of 1995, Bechtel was working for about 580 clients on almost 1,000 active projects and studies.

At Bechtel, we realized we had to do a better job of leveraging the complex knowledge we gain in every major project. Bechtel is an engineering and construction firm, focused on the large-scale infrastructural needs of industrial clients, the petrochemical refining industry, the power utility industry, and the US government. (An example of the type of project we take on is Boston's "Big Dig"—depressing a major highway that cuts through the city.)

Customers hire us for our deep expertise, but given the huge investment levels typically associated with these projects, they are also understandably obsessed with cost. Looking to the future, we know that the single biggest driver of our competitive success will be our ability to lower the total installed cost to the customer.

There are any number of ways to go about lowering costs, and unfortunately many of them involve cutting quality as well, or cutting investments in the future of the company. Our approach may have seemed a little counter-intuitive, because it meant spending some money to save money. Essentially, we decided that we could deliver higher value and cut costs if we could capitalize better on our accumulated knowledge. We saw that a competence in organizational learning could be a strategic advantage and a differentiator for Bechtel.

It hasn't been as easy as we might have thought at the outset. We thought then that we could simply create some kind of "lessons learned" clearinghouse and make it accessible through information technology and presto—time to move on to the next challenge. In fact, it has taken a lot more careful attention to people and process issues. For example, we assumed that people would naturally see the benefit of knowledge-sharing and willingly populate our "Knowledge Bank." We didn't stop to wonder: why would someone whose personal marketability depends on their knowledge and productivity take time out from a busy day to make what they know widely available?


It would be an injustice to Bechtel's people to imply that the whole problem was self-interested knowledge-hoarding. In fact, there were deeper forces at work. One was a fundamental cultural bias against openness; since Bechtel is privately held, it simply isn't in the habit of disclosing information. In some cases, too, knowledge can't be shared, where Bechtel has access to clients' proprietary information. Most of all, we suffered from a lack of clarity on what kinds of knowledge people should be sharing. There is a strong tendency on the part of our project managers to believe the challenges they are tackling and the clients they are serving at any given time are utterly unique. They don't recognize that the knowledge they gain is also relevant to their colleagues. But in fact, there is great applicability, especially where the same work processes are being used (another major goal at Bechtel). Trying to communicate to managers that their work is less "one-off" than they assume has been a difficult task.

Part of the solution has been to create new organizational roles specifically tasked with knowledge management. A Chief Knowledge Officer position is being created with overall responsibility and accountability for promoting learning and knowledge. As well as implementing a management process around knowledge, this person will be actively concerned with communicating the business case for knowledge investments. Other key players are Knowledge Stewards. Each of these holds responsibility for a portion of our Knowledge Bank, seeing that the appropriate content is collected and disseminated. This relieves us of some of the burden of teaching everyone in the organization to recognize what knowledge is broadly applicable or easily transferable.

The Knowledge Bank itself has evolved since our initial conception of it as a simple repository for lessons learned. We have expanded its mission to include "quick access" knowledge and information from both internal and external sources. Again, our people are very focused on adhering to project timelines. A major obstacle for them is the amount of time it takes to assemble the information they need from multiple sources. By creating more of a "one-stop shop" for knowledge, we hope to compress project lifecycles, enhance customer service, and drive greater profitability.

It would be premature to say our knowledge management efforts have paid off; we've just begun the process. But the time spent so far in designing the change has been well invested. With a better notion of what knowledge we must manage, and a greater appreciation of how people, process, and technology elements must be integrated, we can be confident of gaining the maximum payback on our knowledge investments.


Accelerating New Product Development
Karl T. Ulrich
Steven D. Eppinger

On the Horizon

About the authors: Karl T. Ulrich is an associate professor at The Wharton School at the University of Pennsylvania, and also holds a secondary appointment in the Department of Mechanical Engineering and Applied Mathematics. He has participated as a member of development teams for dozens of products ranging from snack foods to surgical instruments. Steven D. Eppinger is an associate professor at the Massachusetts Institute of Technology Sloan School of Management. He specializes in the study of large product development projects and has advised development teams at many firms in the automotive, electronics, and consumer products industries.

Karl Ulrich and Steven Eppinger published Product Design and Development in 1995, but it has already become the most respected text on the subject in use at leading business schools. The book's structure makes for clear teaching; each chapter explores an aspect of product introduction with reference to a specific actual example. Here, we excerpt a chapter on project management, and the case example is from Kodak.

A manufacturer of microfilm imaging equipment approached the Eastman Kodak Company to design and supply microfilm cartridges for use with a new machine under development. The target specifications were similar to previous products developed by the cartridge group at Kodak. However, in contrast to the usual 24-month development time, the customer needed prototype cartridges for demonstration at a trade show in just 8 months, and production was to begin 4 months later. Kodak accepted this challenge of cutting its normal development time in half and called its efforts the Cheetah Project.

Effective project management was crucial to the successful completion of the project. Cheetah project managers focused on defining the critical path, collecting the best possible project team, and finding ways to accelerate project tasks.


Exhibit 1: PERT chart for the Cheetah project. The critical path is designated by the thicker lines connecting tasks. Note that tasks G, H, and I are grouped together because the PERT representation does not depict coupled tasks explicitly. Tasks and durations (weeks): A. Receive & accept specification (2); B. Concept generation/selection (4); C. Design beta cartridges (8); D. Produce beta cartridges (8); E. Develop testing program (5); F. Test beta cartridges (2); G. Design production cartridge, H. Design mold, and I. Design assembly tooling (coupled; shown as a 14-week block); J. Purchase assembly equipment (6); K. Fabricate molds (10); L. Debug molds (4); M. Certify cartridge (2); N. Initial production run (2).

The Critical Path

The dependencies among several tasks, some of which may be arranged sequentially and some of which may be arranged in parallel, lead to the concept of a critical path. The critical path is the single sequence of tasks whose combined required times define the minimum possible completion time for the entire set of tasks. Consider the project plan represented in Exhibit 1. Either the sequence C-D-F or the sequence C-E-F defines how much time is required to complete the four tasks C, D, E, and F. In this case, the path C-D-F requires 18 weeks and the path C-E-F requires 15 weeks, so the critical path for the whole project includes C-D-F. Identifying the critical path is important because a delay in any of these critical tasks would result in an increase in project duration. All other paths contain some slack, meaning that a delay in one of the noncritical tasks does not automatically create a delay for the entire project. Several software packages are available for producing Gantt charts and PERT (program evaluation and review technique) charts; these programs can also compute the critical path.

Team Staffing and Organization

The project team is the collection of individuals who complete project tasks. Whether or not this team is effective depends on a wide variety of individual and organizational factors. Smith and Reinertsen (1991) propose seven criteria as determinants of the speed with which a team will complete product development; in our experience these criteria predict many of the other dimensions of team performance as well:

1. There are 10 or fewer members of the team.
2. Members volunteer to serve on the team.
3. Members serve on the team from the time of concept development until product launch.
4. Members are assigned to the team full-time.
5. Members report directly to the team leader.
6. The key functions, including at least marketing, design, and manufacturing, are on the team.
7. Members are located within conversational distance of each other.

While few teams are staffed and organized ideally, these criteria raise several key issues: how big should the team be? How should the team be organized relative to the larger enterprise? Which functions should be represented on the team? How can the development team of a very large project exhibit some of the agility of a small team?
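The critical-path calculation described above is easy to make concrete. The following sketch computes earliest finish times over a small task graph and then walks back along the zero-slack tasks. The durations come from Exhibit 1, but the dependency structure is limited to the fragment the text spells out (C precedes D and E, which both precede F), so this is an illustration rather than the full Cheetah network.

```python
# Critical-path sketch for a fragment of the Cheetah task graph.
from functools import lru_cache

durations = {"C": 8, "D": 8, "E": 5, "F": 2}                 # weeks, from Exhibit 1
predecessors = {"C": [], "D": ["C"], "E": ["C"], "F": ["D", "E"]}

@lru_cache(maxsize=None)
def earliest_finish(task):
    # Earliest finish = task duration plus the latest finish among predecessors.
    start = max((earliest_finish(p) for p in predecessors[task]), default=0)
    return start + durations[task]

def critical_path(end_task):
    # Walk backwards from the end, always following the predecessor
    # that actually determines the start time (i.e., zero slack).
    path = [end_task]
    while predecessors[path[-1]]:
        path.append(max(predecessors[path[-1]], key=earliest_finish))
    return list(reversed(path))

print(earliest_finish("F"))    # 18 weeks, matching the C-D-F figure in the text
print(critical_path("F"))      # ['C', 'D', 'F']
```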


Exhibit 2: Project staffing for the Cheetah project, by month over the 12-month schedule. Numbers shown in the original are approximate percentages of full time for each role: Team Leader, Schedule Coordinator, Customer Liaison, two Mechanical Designers, two CAD Technicians, two Mold Designers, an Assembly Tool Designer, a Manufacturing Engineer, and a Purchasing Engineer. The Team Leader is shown at 100 percent and the Schedule Coordinator at 25 percent for all 12 months, while the specialist roles ramp up and down over the life of the project.

The minimum number of people required on the project team can be estimated by dividing the total estimated time to complete the project tasks by the planned project duration. For example, the estimated task time of the Cheetah project was 354 person-weeks. The team hoped to complete the project in 12 months (or about 50 weeks), so the minimum possible team size would be seven people. All other things being equal, small teams seem to be more efficient than large teams, so the ideal situation would be to have a team made up of the minimum number of people, each dedicated 100 percent to the project.

Three factors make realizing this ideal difficult. First, specialized skills are often required to complete the project tasks. For example, one of the Cheetah tasks was to design molds. Mold designers are highly specialized, and the team could not use a mold designer for a full year. Second, one or more key team members may have other unavoidable responsibilities. For example, one of the engineers on the Cheetah project was responsible for assisting in the production ramp-up of a previous project. As a result, she was only able to commit half of her time to the Cheetah project initially. Third, the work required to complete tasks on the project is not constant over time. In general, the work requirement increases steadily until the beginning of production ramp-up and then begins to taper off. As a result, the team will generally have to grow in size as the project progresses in order to complete the project as quickly as possible.

After considering the need for specialized people, the reality of other commitments of the team members, and the need to accommodate an increase and subsequent decrease in workload, the project leader, in consultation with his or her management, identifies the full project staff and approximately when each person will join the team. When possible, team members are identified by name, although in some cases they will be identified only by area of expertise (e.g., mold designer, industrial engineer). The project staffing for the Cheetah project is shown in Exhibit 2.
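The team-sizing estimate above is a one-line calculation, shown here for concreteness using the Cheetah figures as given in the text.

```python
# Back-of-the-envelope minimum team size: total estimated effort divided by
# the planned project duration (numbers as stated above).
total_effort_person_weeks = 354
planned_duration_weeks = 50          # roughly 12 months

print(total_effort_person_weeks / planned_duration_weeks)
# -> about 7.1, i.e. roughly seven full-time people
```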


Article abstract: Kodak's Cheetah project was initiated when the firm accepted a commission to produce a new microfilm cartridge—in half the time it normally took to develop a new product. Meeting the deadline would require that many things be done faster, and everything be done right. The first key step was to define the "critical path" to clarify which tasks would be the best target of additional resources and attention. Defining the best composition of the project team was also important. It had to be multi-functional, drawing on highly specialized skills, but as small and dedicated as possible.

Accelerating the Project

Product development time is often the dominant concern in project planning and execution. The following guidelines can help to accelerate product development projects. Most of these guidelines are applicable at the project planning stage, although a few can be applied throughout a development project. Accelerating a project before it has begun is much easier than trying to expedite a project that is already underway.

The first set of guidelines applies to the project as a whole.

Start the project early. Saving a month at the beginning of a project is just as valuable as saving a month at the end of a project, yet teams often work with little urgency before development formally begins. For example, the meeting to approve a project plan and review a contract book is often delayed for weeks because of difficulty in scheduling a meeting with senior managers. This delay at the beginning of a project is exactly as costly as the same delay during production ramp-up. The easiest way to complete a project sooner is to start it early.

Manage the project scope. There is a natural tendency to add additional features and capabilities to the product as development progresses. Some companies call this phenomenon "creeping elegance," and in time-sensitive contexts it may result in an elegant product without a market. Disciplined teams and organizations are able to "freeze the design" and leave incremental improvements for the next generation of the product.

Facilitate the exchange of essential information. A tremendous amount of information must be transferred within the product development team. Every task has one or more internal customers for the information it produces. For small teams, frequent exchange of information is quite natural and is facilitated by team meetings and co-location of team members. Larger teams may require more structure to promote rapid and frequent information exchange. Blocks of coupled tasks identify specific needs for intensive information exchange. Computer networks and emerging software technology offer some promise for enhancing this exchange within larger development teams.

The second set of guidelines is aimed at decreasing the time required to complete the tasks on the critical path. These guidelines arise from the fact that the only way to reduce the time required to complete a project is to shorten the critical path. Note that a decision to allocate additional resources to shortening the critical path should be based on the value of accelerating the entire project. For some projects, time reductions on the critical path can be worth hundreds of thousands, or even millions, of dollars per day.


Article abstract: There are a variety of ways to accelerate new product development, beginning with the earliest stages. Start early, keep the project's scope from creeping, and provide for rapid and rich information exchange amongst team members. Focus on the critical path, finding ways to get those tasks done more quickly. Avert any delays spent waiting for resources, overlap as many tasks as possible, "pipeline" others, and outsource where possible. Finally, focus on delays involved in iterations, by increasing the frequency of the necessary ones, and de-coupling tasks to avoid the unnecessary ones.

Complete individual tasks on the critical path more quickly. The benefit of recognizing the critical path is that the team can focus its efforts on this vital sequence of tasks. The critical path generally represents only a small fraction of the total project effort, so additional spending on completing a critical task more quickly can usually be justified quite easily. Sometimes completing critical tasks more quickly can be achieved simply by identifying a task as critical, so that it gets special attention. Note that the accelerated completion of a critical task may cause the critical path to shift to include previously noncritical tasks.

Eliminate some critical path tasks entirely. Scrutinize each and every task on the critical path and ask whether it can be removed or accomplished in another way.

Eliminate waiting delays for critical path resources. Tasks on the critical path are sometimes delayed by waiting for a busy resource. The waiting time is frequently longer than the actual time required to complete the task. Delays due to waiting are particularly prominent when procuring special components from suppliers. In such cases, purchasing a fraction of the capacity of a vendor's production system in order to expedite the fabrication of prototype parts may make perfect economic sense in the context of the overall development project, even though the expenditure may seem extravagant when viewed in isolation. In other cases, administrative tasks such as purchase order approvals may become bottlenecks. Because in past cartridge development projects periodic budget approvals had caused delays, the Cheetah project leader began early to pursue aggressively the necessary signatures so as not to hold up the activities of the entire team.

Overlap selected critical tasks. By modifying the relationships between sequentially dependent tasks on the critical path, the tasks can sometimes be overlapped or executed in parallel. In some cases, this may require a significant redefinition of the tasks or even changes to the architecture of the product. In other cases, overlapping entails simply transferring partial information earlier or more frequently between nominally sequential tasks (Krishnan, 1993).

Pipeline large tasks. The strategy of pipelining is applied by breaking up a single large task into smaller tasks whose results can be passed along as soon as they are completed. For example, the process of finding and qualifying the many vendors who supply the components of a product can be time consuming and can even delay the production ramp-up if not completed early enough. Instead of waiting until the entire bill of materials is complete before the purchasing department begins qualifying vendors, purchasing could qualify vendors as soon as each component is identified. Pipelining in effect allows nominally sequential tasks to be overlapped.


Outsource some tasks. Project resource constraints are common. When a project is constrained by available resources, assigning tasks to an outside firm or to another group within the company may prove effective in accelerating the overall project.

The final set of guidelines is aimed at completing coupled tasks more quickly. Coupled tasks are those that must be completed simultaneously or iteratively because they are mutually dependent.

Perform more iterations quickly. Much of the delay in completing coupled tasks is in passing information from one person to another and in waiting for a response. If the iteration cycles can be completed at a higher frequency, then the coupled tasks can sometimes be completed more quickly. In the Cheetah project, the mechanical engineer worked closely with the mold designer, who in turn worked closely with the mold maker. In many cases, these three shared a single computer terminal for the purpose of exchanging ideas about how the design was evolving from their three different perspectives.

De-couple tasks to avoid iterations. Iterations can often be reduced or eliminated by taking actions to de-couple tasks. For example, by clearly defining an interface between two interacting components early in the design process, the remaining design of the two components can proceed independently and in parallel. The definition of the interface may take some time in advance, but the avoidance of subsequent iterations may result in net time savings.

The Postmortem

The Cheetah project was completed on time, despite the aggressive schedule. The team agreed that the most important contributors to project success were:
- Empowerment of a team leader
- Effective team problem solving
- Emphasis on adherence to schedule
- Effective communication links
- Full participation from multiple functions
- Building on prior experience in cartridge development
- Use of computer-aided design (CAD) tools for communication and analysis
- Early understanding of manufacturing capabilities

The Cheetah team also identified a few opportunities for improvement:
- Use of three-dimensional CAD tools and plastic molding analysis tools
- Earlier participation by the customer in the design decisions
- Improved integration of tooling design and production system design


The Connected Economy: Beyond the Information Age
Christopher Meyer

On the Horizon

About the author: Christopher Meyer is Director of the Ernst & Young Center for Business Innovation and a partner in the firm's management consulting practice. He also leads the firm's research effort focused on one aspect of the "connected economy": the applications of complexity science to business. Contact him at Chris.Meyer@ey.com.

Computers are incredibly fast, accurate, and stupid; humans are incredibly slow, inaccurate, and brilliant; together they are powerful beyond imagination.
—Albert Einstein

Over the last fifteen years, business has focused on two themes: time and technology. Because information technology can link people and processes ever more quickly and cheaply, we have multiplied the instant connections among individuals, organizations, and information itself. Attention has been focused on the resulting acceleration of business. But connections are doing more than accelerating the economy; they are changing the way it works. As the number of connections among the elements of a system grows, the system no longer behaves predictably—the system as a whole begins to exhibit unforeseen, "emergent" properties.1 Two famous examples of the unanticipated results of connectivity are the 1965 Northeast Blackout and the 1987 stock market crash.

The Northeast Blackout was the largest in history because of the connections shared by the utilities in the power grid. These connections translated an overload at one point in the system into a cascade of failures throughout the grid. Similarly, the stock market crash of October 1987 was created not by economic fundamentals or trader sentiment, but by the interaction of programmed instructions created by independent trades.


These unconnected instructions became linked by the mechanisms of the market, increasing the volatility of the system as a whole and causing the Dow Jones Industrial Average to lose 23% of its value in a day.

Figure 1: Competition is driving business to form connections at an unprecedented rate. This connected economy is having unanticipated consequences which will combine in an even greater transformation, to the adaptive economy. The timeline (1980-2000) shows three drivers (a focus on time as a driver of cost; communications technology, from data communications to wireless/mobile; and customer expectations for "anytime, anyplace") feeding the connected economy, which links people, processes, and computers. Expected outcomes are acceleration and globalization; unexpected outcomes are inter-enterprise relationships, increasing returns economics, adaptive systems theory, and emotion in the workplace, which together produce the adaptive economy.

These examples reveal that connection is not necessarily a matter of information technology, though connections are often enabled by it. Connections are a matter of linked decision-making mechanisms. The electric utilities were not connected by a computer, but by electromechanical control systems. The trading rules were connected by the particulars of their timing and price instructions.2

In biology, it would have been impossible to foretell the emergence of mammals, reptiles, and the rest by observing the new cellular forms that arose when two types of bacteria combined to create the first cell with a nucleus.3 Similarly, we cannot predict the shape of the economy as the emergent properties of connection begin to appear. We can, however, observe four trends that, like the nucleated cell, have the ability to transform today's landscape:
- Inter-enterprise Relationships
- Increasing Returns Economics
- Adaptive Systems Theory
- Emotion in the Workplace

Does this mean that a more connected economy will be characterized by an increasing frequency of calamitous events? Perhaps, but explosive growth events will be equally characteristic (e.g., Microsoft, the World Wide Web, and electronic commerce). Beyond volatility, however, the connected economy will begin to exhibit some novel, emergent properties.

In this article, we argue that connections are growing enormously in number, speed, and type, and that this trend will turn the economy into a "complex adaptive system." Market economies have always been adaptive to a degree. But future rates of adaptation will make capitalistic "gales of creative destruction" everyday weather.

Inter-enterprise Relationships

In the 1980's, Procter & Gamble's Pampers Division entered into a strategic partnership with the Wal-Mart stores that sell so many of its diapers. Wal-Mart agreed to transmit daily Pampers sales data from each of its nearly 2,000 locations to Procter & Gamble. Using this data, P&G restocks Wal-Mart's stores with no action needed by Wal-Mart. The result almost doubled Pampers' inventory turns in the first year, to nearly one hundred. A decade later, the level of inter-company coordination achieved by Procter & Gamble and Wal-Mart still exceeds the level of interdepartmental communication and coordination at most companies.4


Article abstract: Connection eliminates delay at all levels of an industry:
- Between manufacturer and assembler, just-in-time connections reduce inventories;
- Between customer and supplier, Internet communications provide instant availability status;
- Between managers, message store-and-forward across 12 time zones shortens product development time.

The Evolution of Connection

The power of a computer is proportional to the square of the number of computers it is connected to.
—Robert Metcalfe, the inventor of Ethernet

In the 1980's, early work in activity-based costing revealed that time was a primary driver of cost in the industrial corporation. Studies of "time-based competition" showed that more than 80% of the cycle time in many production processes was spent in "non-value-added activities" such as waiting in queues and inventories.5 Eliminating these delays and their attendant costs was a primary focus of "just-in-time" manufacturing and logistics and, more broadly, of "reengineering."

At the same time, technologies such as 800 numbers, overnight package delivery (based on bar coding), and customer databases changed the economics of customized, remote service. Consequently, "anytime, anyplace" delivery became the customer's expectation, and seven-day, 24-hour service the provider's obligation.

These cost-saving opportunities and escalating customer expectations, combined with the falling cost of telecommunications, strengthened the push for connection. Innovative businesses quickly capitalized on the advantages such connection could convey. The Italian clothier Benetton, for example, substituted information for inventory by collecting daily sales data from around the world. This allowed the company to make manufacturing decisions based on purchases, rather than relying on stock made months in advance. This new system created a direct feedback loop from the customer to the maker, circumventing the delays of information transfer through the retailer and wholesaler. In addition to cutting inventory, carrying costs, and retail markdown expense, the new connection provided better input on customer design preferences.

Electronic connections continue to shrink the length of time between action and response, thanks to the growing prevalence of pagers, cellular phones, voicemail systems, electronic data interchange, etc. Connection innovations continue to proliferate. Examples of technologies that help make faster, more robust, more reliable connections include: electronic commerce; broadband networks; multimedia; global positioning systems; on-line language translation; effective data-to-voice and voice-to-data interfaces; seamless man-to-machine interfaces; data communications protocols; non-invasive medical monitoring devices; and integrated vehicle-highway systems.


Article abstract: The extraordinary growth of inter-enterprise relationships is one unanticipated consequence of an increasingly connected world. P&G's and Wal-Mart's famous partnership was an early hint of the future. Today, arm's-length relationships are giving way across the board as firms cooperate in complex, connected webs. Another form of inter-enterprise relationship is outsourcing. With robust connections possible, a firm can choose to focus only on what it does well, and have much work that is necessary but not "core" performed by others who specialize in it.

Prior to the connected age, little attention was paid to the collaborative relationships possible between enterprises. The industrial economy built self-sufficient institutions focused on mass production. Factories were large and often isolated, and companies developed the ancillary capabilities—from legal departments to whole company towns—needed to support their own activities. The best available means for economically sensitive coordination between departments was hierarchy: sharing a boss.

In the connected economy, high technology businesses are reshaping inter-enterprise relationships.6 Arm's-length relationships are giving way to complex, connected webs. Capabilities are being linked together to create value, often for a relatively brief period. Not infrequently, the entities cooperating in one area may compete in another.

Inter-enterprise connections have arisen from necessity. When IBM first marketed the PC in 1981, customers perceived it to be produced entirely by IBM. In reality, IBM's main achievement was putting its brand on a working set of components made by Intel, Microsoft, Quantum, et al. The interchangeability of these "modules" of the PC value chain became apparent over the next decade, as the component brands achieved parity with IBM's. The current array of computer competitors—including Compaq, Dell, IBM, and now Intel itself—features every combination of value chain modules. Each module can be bought separately.

This modularization of value chain components is not limited to high technology. The "third party logistics" industry sells warehousing, rapid response shipping, and tracking services, allowing organizations like pharmaceutical companies to focus on their research. Enterprise software systems from vendors including SAP, Baan, and PeopleSoft provide templates for core processes ranging from manufacturing to human resources management.

This has been called the "hollowing out" of corporations. But it is also the creation of "value webs," highly interdependent enterprises each tightly focused on a narrow set of capabilities. Companies will be unable to maintain capabilities that are not world class, and will rely on richly connected relationships with specialists to create their value webs. It seems likely that these webs will be far more adaptive than the vertically integrated corporation.


Figure 2: Alliances in selected high-technology fields, 1970-1993. The chart plots the number of alliances formed (0 to 300) from 1970 to 1990 for four fields: information technology, autos and aircraft, biotechnology, and chemicals and new materials. (Source: The Alliance Revolution, Benjamin Gomes-Casseres, Harvard University Press, 1996.)

The economy will increase its adaptability by affording instant access to well-developed capabilities, rather than taking time to grow them for each new enterprise. The ability to rapidly and successfully form intimate and effective inter-organizational connections is becoming a key corporate skill in the connected economy. Eventually, this trend may extend to individual employees who will become entrepreneurs providing specialized services to a value chain.

Increasing Returns Economics

Why has everyone with a modem received a dozen AOL discs? Why is the hottest software product of the decade—the Web browser—given away free? Why did Sun Microsystems, Oracle, and seven other companies with an interest in the success of the Java Internet script language recently amass $100 million to create a Java venture capital fund managed by Kleiner Perkins Caufield & Byers?

The answer is increasing returns economics. Economists have traditionally taught that businesses grow to the point where returns to scale diminish. That is, the benefits of scale are overwhelmed by the disadvantages of size, such as the difficulty of coordination, or the distances between producer and customer. These ideas fit well in the era when communication and transportation were difficult.

The connected economy facilitates coordination, and increasingly the "goods" can be delivered over a wire—dramatically increasing the size of the enterprise at which diminishing returns set in. Just as important, the connecting parties must share a standard of communication. Such a standard can establish an economic community of interest.

These factors create increasing returns to scale: under such circumstances, the positive economic feedback that is created drives the market to a single solution rather than several competing ones.

The phenomenon is not new. Clockmakers and time-tellers in the middle ages had to decide between what we now call clockwise and counterclockwise motion. Similarly, an increasing returns scenario also arose in the early days of the telephone industry: it was understood that a single network connecting everyone was a superior solution to several competing networks. Economists deemed telephone networks "natural monopolies" and recommended regulation.


Article abstract: In a more connected world, traditional economics is surprisingly overturned. Rather than experiencing diminishing returns, a product approaching market saturation is rewarded with increasing returns. Consider Windows. The key to increasing returns is to set the standard and achieve "lock-in" to your product—whether it's a safety razor or a web browser. Lock-in involves infrastructure investment, and in an economy where much for sale is intangible, this includes learning curves. Going forward, achieving increasing returns will increasingly require cooperation among a number of firms.

The experience curve has long been recognized as a special case of increasing returns. The positive feedback loop between market share and cost measured by the curve makes market leadership an imperative and drives stable industries toward oligopoly, a process limited only by the threat of antitrust action.

What's different with, say, clocks and other knowledge products is that the positive feedback is between market share and ease of use. To achieve this ease of use, knowledge products rely on the protocols of financial transactions (Quicken), the formats of videodiscs (consumer electronics), the API's7 of operating systems, and on the accumulated learning of their customers. A new Bill Gates can no longer propose a radically new operating system, because the industry is "locked-in"8 to the hardware, software, and (most important) the learning investment in the existing product.

As connections grow, we will enter a period where companies fight to achieve increasing returns in ever-larger segments of the economy. Eventually, some of these battles will result in standards we take for granted, like 60-cycle, 110-volt current.9 But lock-in will be increasingly difficult for a single company to achieve—it will more often occur at the industry level. Just as there are many makers of clockwise clocks, there are many manufacturers of fax machines following the Group III fax standard, and many purveyors of Windows PCs.

Thus, while Windows itself stands (temporarily) as a locked-in monopoly, the consortium approach taken by Sun Microsystems et al. to invest in Java startups will be more prevalent in the future. The strategy improves Java's chances for success, reduces each company's risk, and injects the unique investment management skills of Kleiner Perkins into the value web. Because the economy will become increasingly both intangible and connected, increasing returns will become a strategic imperative. We will see more strategic innovations in the management of increasing returns.

To compete in an increasing returns market, a business must rapidly establish its product as the standard. The goal is to quickly lock in the investment of users and of providers of complementary goods, whether it be in equipment or in knowledge. Once you've installed and learned to use the software AOL sent to you, GEnie's disc will likely be ignored.

Increasing returns suggests that new solutions will diffuse—and disappear—much more rapidly, a second mechanism of increased adaptivity. But the economy won't adapt the way economists predict—in a smooth approach to the "optimum." It will proceed by leaps, in a path-dependent evolution. To understand such behavior, we will turn to the sciences of adaptation.
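Before moving on, the tipping behavior behind lock-in is easy to see in a toy simulation. The sketch below is not Arthur's model or any real market; it simply lets each new adopter choose between two hypothetical standards based on random taste plus a small bonus proportional to installed base, which is enough to produce path-dependent lock-in.

```python
# Toy path-dependent adoption model in the spirit of increasing returns.
# Each adopter picks the standard whose payoff (random taste plus a network
# bonus proportional to installed base) is currently higher.
import random

def simulate(adopters=10_000, network_bonus=0.01, seed=None):
    random.seed(seed)
    installed = {"A": 0, "B": 0}
    for _ in range(adopters):
        payoff = {s: random.random() + network_bonus * installed[s] for s in installed}
        winner = max(payoff, key=payoff.get)
        installed[winner] += 1
    return installed

# One standard typically ends up with most of the adopters;
# which one it is varies with the seed, i.e. with early happenstance.
for seed in range(3):
    print(simulate(seed=seed))
```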


Article abstract: The connected economy increasingly resembles other highly complex, adaptive systems. So, unexpectedly, management theory is learning from the new sciences of chaos and complexity. Complex adaptive systems theory teaches us to view systems as groups of individual agents, each acting according to some basic rules. It's their interaction—the sum of their individual behaviors—that makes the system behave as it does. The system's properties are emergent, not planned. The role of the manager, then, is less the dictator and more the platform builder and selective intervener.

Adaptive Systems Theory

Farm equipment manufacturer John Deere offers automated planters for sowing every kind of seed in every imaginable farming condition: over 1.6 million configurations in all. Due to the huge number of options, none of the usual array of optimization tools such as dynamic programming were able to efficiently schedule the company's planter factory. After Deere began using genetic algorithms to optimize scheduling, the company's order cancellation rate plummeted from hundreds to 5 or 6 over a six-month period. Overtime went down drastically, too.10

Genetic algorithms are just one of a growing number of tools and ideas germinated in the as yet arcane world of adaptive systems theory—also known as complexity science—that are finding profitable applications in the business world. The most exciting possibilities, however, remain in the research stage: better models of business and the economy.

Traditional economic theory has explored "very thoroughly the domain of problems that are tractable by static equilibrium analysis," says Brian Arthur, the scientist who pioneered the concept of increasing returns.11 "But it ... virtually ignored the problems of process, evolution and pattern formation—problems where things were not at equilibrium, where there's a lot of happenstance, where history matters a great deal, where adaptation and evolution might go on forever," he explains. As investigators seek the laws by which systems of all kinds grow and adapt, economists are beginning to tackle these "non-equilibrium" problems.12

Adaptive systems theories hold the potential to amplify our understanding of the evolution of the enterprise and the economy. As the connected economy emerges, enterprises are recognizing that the interactions among economic actors have become at least as important as the efficient functioning of each agent.

Complexity science focuses on the behavior of such systems, called complex adaptive systems (CAS). A CAS consists of independent "agents," each capable of making decisions using a few rules. Agents can be people, circuit breakers, trading instructions, or DNA—any entity whose decisions can be defined by rules. The actions of one agent affect the choices of another, so they are connected. Simulations with large numbers of interacting agents show that they behave in ways that are often unpredictable; the properties of such systems are said to "emerge" from the behavior of the connected individuals. Often these emergent properties are "non-linear"—unpredictable and volatile, like the stock market and the power grid.
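For readers who have not met genetic algorithms, the sketch below shows the basic loop of selection, crossover, and mutation on a toy job-sequencing problem. It is purely illustrative: Deere's production scheduler was far more elaborate, and the jobs and cost function here are hypothetical.

```python
# Bare-bones genetic algorithm: evolve an ordering of jobs to minimize a toy
# "changeover cost" between consecutive configurations. Illustrative only.
import random

JOBS = list(range(12))                        # hypothetical job IDs

def cost(order):                              # toy objective: penalize big jumps
    return sum(abs(a - b) for a, b in zip(order, order[1:]))

def crossover(p1, p2):                        # order crossover: keep a slice of p1,
    i, j = sorted(random.sample(range(len(p1)), 2))   # fill the rest in p2's order
    middle = p1[i:j]
    rest = [g for g in p2 if g not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(order, rate=0.2):                  # occasionally swap two jobs
    if random.random() < rate:
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]
    return order

def evolve(generations=200, pop_size=30):
    pop = [random.sample(JOBS, len(JOBS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]      # selection: keep the better half
        children = [mutate(crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
print(best, cost(best))   # best sequence found; the optimum for this toy cost is 11
```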


These ideas afford a new way of looking at business: an economy is an adaptive system of agents (firms and individuals) interacting. Under certain environmental conditions (a legal system, capital availability), properties such as growth, cyclicality, and distribution of income emerge. Likewise, a company can be reframed not as a machine in equilibrium, but as an adaptive system of individuals. Success, however measured, is an emergent outcome, not the product of a machine. In the connected economy, the organizations providing the infrastructure upon which agents can most effectively and productively organize themselves will attract the greatest talent.13

The development of complexity theory will help us understand the behavior of the economy as its adaptivity grows.

Emotion in the Workplace

A book on Trust14 is currently a common subject of business meeting conversations. Improvisational acting troupes appear at executive conferences. "Casual day" is prevalent enough to spur new retailing ventures. The art of "constructive confrontation" is a required course for new Intel employees. New York's Morgan Hotel has redecorated, replacing its hard-edged Eurostyle with soft colors, comfy armchairs, and a philosophy of comforting people—rather than providing the arena for their power lunches.

Though some of these trends may prove fads, they point to a tendency of more enduring significance. Human emotion will become a more important factor in organizational life over the next twenty years, for several reasons:

1. The Battle for Attention. As monitoring and control functions become automated, management will increasingly be making choices about what to pay attention to. Since managers will be ever more bombarded by demands for attention, the battle will escalate. Advertising and entertainment professionals have long known the answer: appeal to emotion. The techniques used in television and other media to engage our emotions will be increasingly employed in business situations.

2. High-Affect Technology. The reduced cost to present sound, color, and motion will help "arm" this battle for attention. These media engage parts of the brain wired for emotion, as black and white text does not.15 Business relationships will employ new tools already used by the advertising and entertainment industries to capture attention. Stan Davis believes smell may become the next medium. The tools will be chosen based on their ability to elicit emotion.

3. The Impact on Productivity. All attention is not of equal quality: Mihaly Csikszentmihalyi's book Flow: The Psychology of Optimal Experience describes in detail the familiar state of intense concentration, high performance, and controlled excitement that occurs when individuals are emotionally engaged in work.


Article abstract: Another unexpected consequence of the connected economy is greater emphasis on emotion in the workplace. This is true for five reasons: 1) attention is the manager's scarcest resource—and cutting through requires appealing to emotion; 2) delivering emotion-impacting messages is easier with today's multimedia capabilities; 3) the recognition is growing that emotionally engaged people are more productive; 4) flexible work arrangements are making it harder to delineate one's work from one's "life"; and 5) as organizations cease to be viewed as machines, people will stop being asked to behave like automatons.

He asserts that individuals in flow are much more productive,16 and they associate the state with feelings of pleasure, power, and accomplishment. It is thus in the interest of business to create an environment where people spend a greater proportion of their time in this fully engaged flow state. Since individuals take greater pleasure from their work in such a state, the workplace that encourages this state will attract the most talented labor.

4. The Blurring of Personal Workspace. Telecommunications options will create choices for finding an ideal flow-state workspace. It is harder to retain the traditional "manager-as-economic-man" stance when working at home, in the midst of family and personal environment.

5. The Waning of the Machine Metaphor. As biological ideas take hold in business, the idea of the employee as a part of a mechanical process will lose its power, and it will become less defensible to suppress emotion at work.

How this focus on emotional connection may change adaptiveness is unclear. Perhaps the non-linear aspects of the economy will be better understood and reconciled as a result. But for all the above reasons, managers in the future—or whatever managers will become—will have to have a far greater facility for eliciting, channeling, and modulating emotions than today. Organizations that can hire and develop this capability will achieve higher organizational performance.

The Adaptive Advantage

In the primeval soup from which we all descended, atoms connected with other atoms to form molecules. These reacted to form organic chemicals, which eventually—through millions of years and billions of chance opportunities—created bacteria and higher organisms. This property of "self-organization," as adaptive systems theorist Stuart Kauffman calls it, is a general characteristic of systems whose components can interact with one another (i.e., connect). And, in such systems, connections grow ever richer over time.

The new density of connections is now evolving the four new features of economic life discussed in this article. Perhaps these features are like the adaptations—lungs? legs? wings? vocal cords?—that helped mammals evolve. If the growing density of connections in the economy parallels the development of new cell types in the biosphere, what will be the outcome?


Article abstract: As in the biosphere, a growing density of connections will continue to yield new and more complex forms of economic life. The shape of things to come is utterly unpredictable—so how are managers to manage? The enlightened will take their cue from adaptive systems and facilitate experimentation, recombination, evolution, and proliferation. The ideal will be a balance of stability and innovation.

We can no more predict the range of future enterprises from today's innovations than we can envision an elephant by considering the properties of the bacteria from which it evolved. The implication for us as managers is that we won't know what is right—the companies that will thrive in the new economy might have trunks and tusks, but they also might have wings and feathers. As managers, how do we cope?

Biology has been working on this problem for about four billion years. What does it teach us? The answers are far from complete, but on the frontier of adaptive systems theory is a set of ideas about adaptiveness itself. The adaptive system, it appears, balances control against disorder, efficiency against experimentation, and standardization against diversity. This balance gives it the best chance both to thrive when the environment is stable and to re-adapt when the situation changes radically.

We are just beginning to learn how to use the laws of the biosphere for the econosphere: how to find the balance between stability and innovation. Given our recent mechanistic bent, our overarching answer for the present is to favor adaptivity. The answer is not to fight the adaptiveness, but to go with it—you can't fight emergence. Learn the "Lessons for the Adaptive Economy" in the sidebar that follows. It will be painful, threatening, and dangerous to abandon the structures, systems, and dicta that brought us to where we are now. But if they prevent us from adapting to the connected future, they must be left behind.

It's the best possible time to be alive, when almost everything you thought you knew is wrong.
—Tom Stoppard, Arcadia, Act I, Scene Four


Lessons for the Adaptive Economy

Experiment, don't plan. The best-laid plans will be confounded by events to which you are newly connected. In complex systems, it only takes a small deviation to create a huge change (the flapping of a butterfly's wings in Beijing can cause a hurricane in Hawaii). In the connected economy, your business is an agent in a massive, complex system.

Recombine, don't invent. Nature makes progress by mixing together the "ideas" of evolution—literally, by swapping elements of code. Build on things that work by recombining them in new ways. The connected economy makes these things more accessible faster than before.

Innovate, don't perfect. Nature never reaches a maximum—it finds an adaptation that does better than what went before. In the connected economy, someone else will create more value by building on your innovation than you can by continuing to work on it.

Act, don't coordinate. Stuart Kauffman has experimented with the way simulated organizations perform when they are controlled in different ways: centrally (the "Stalinist Limit"); completely individually (the "Leftist Italian Limit"); and various options in between. His results suggest that adaptation occurs most effectively and requires the lowest amount of energy when organizations are broken into "patches" of modest size—six to ten. While these results are derived from "toy world" simulations, their implications ring true to experienced managers.

Trust, but verify. The biosphere is full of nasty characters. Wasps lay their eggs in caterpillars, which die to support the wasps. The Ebola virus has little redeeming social value. But both have adapted effectively to their environments. Computer scientist John Holland's research has shown that the "tit for tat" strategy, which may be seen as equivalent to the Golden Rule, can be beaten by more sophisticated players (actually software programs genetically bred for the purpose) that first clean out the players who are too "moral" for their own good. The players then revert to the Golden Rule behavior among sophisticated opponents who understand the game. Neither the connected economy nor the adaptive economy that will succeed it is "new age"—both are as red in tooth and ink as ever.
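The "tit for tat" result above is easy to see in a toy game. The sketch below is a minimal illustration, not a reconstruction of Holland's genetically bred programs: it plays an iterated prisoner's dilemma with the conventional 3/0/5/1 payoffs (an assumed scoring, chosen for illustration) and pits the Golden Rule strategy against a purely exploitative one.

```python
# Illustrative iterated prisoner's dilemma: how "tit for tat" fares head-to-head
# against an exploiter. Payoff values are the conventional ones and are an
# assumption for illustration, not figures from the research cited above.

PAYOFF = {  # (my_move, their_move) -> my score; 'C' = cooperate, 'D' = defect
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate first, then mirror the opponent's previous move."""
    return 'C' if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    """An exploiter with little redeeming social value."""
    return 'D'

def play(strategy_a, strategy_b, rounds=20):
    opp_of_a, opp_of_b = [], []   # each side's record of the other's past moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(opp_of_a)
        move_b = strategy_b(opp_of_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        opp_of_a.append(move_b)
        opp_of_b.append(move_a)
    return score_a, score_b

if __name__ == "__main__":
    print("tit for tat vs tit for tat:   ", play(tit_for_tat, tit_for_tat))
    print("tit for tat vs always defect: ", play(tit_for_tat, always_defect))
```

Against itself, tit for tat does well; against the exploiter it loses the opening round and never recovers the gap, which is the "clean out the players who are too moral" effect described above in miniature.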


1 Quantitative simulations show that as the number of connections among elements in a system rises above half the number of elements, the probability of cascading events rises dramatically. This represents a "phase change" in the system, resulting in non-linear responses to external inputs. See Stuart Kauffman, At Home in the Universe (New York: Oxford University Press, 1995).
2 Underscoring the point, the SEC installed automatic rules designed to limit similar cascades—and called them "circuit breakers."
3 See Gould, Stephen Jay, Wonderful Life: The Burgess Shale and the Nature of History (New York: Norton, 1989).
4 See Stan Davis, 2020 Vision (1992) for an explanation of these ideas.
5 Stalk, George, "Time: The Next Source of Competitive Advantage," Harvard Business Review, July/August 1988.
6 See Moore, James, The Death of Competition: Leadership and Strategy in the Age of Business Ecosystems (New York: Harper Business, 1996).
7 Application Program Interface.
8 Arthur, W. Brian, "Increasing Returns and the New World of Business," Harvard Business Review, July/August 1996.
9 Note the penalty customers pay for using portable phones and computers—carrying around low-voltage PC power supplies (the "softwindows" of the electrical novice).
10 Petzinger, Thomas, "At Deere They Know a Mad Scientist May Be a Firm's Biggest Asset," The Wall Street Journal, July 14, 1995.
11 Waldrop, M. Mitchell, Complexity: The Emerging Science at the Edge of Order and Chaos (New York: Simon & Schuster, 1992), p. 325.
12 In fact, adaptive systems theory presented Arthur with the tools necessary for modeling the theory of increasing returns economics. See also Anderson, Philip W., Kenneth J. Arrow, and David Pines, The Economy as an Evolving Complex System (Boston: Addison Wesley, 1988).
13 This, in turn, will create an increasing returns feedback loop: the best talent using the best infrastructure creates the most value, which creates the greatest means to improve the infrastructure and attract more talent.
14 Fukuyama, Francis, Trust (New York: Free Press, 1996).
15 Damasio, Antonio R., Descartes' Error: Emotion, Reason, and the Human Brain (New York: G.P. Putnam's Sons, 1994).
16 Csikszentmihalyi, Mihaly, Flow: The Psychology of Optimal Experience (New York: Harper Collins, 1990).
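The "phase change" claim in note 1 can be checked with a very small simulation: connect N elements with a growing number of random links and watch the size of the largest connected cluster. This is a hedged sketch of that experiment; the network size and the link counts chosen are arbitrary, and the code is not drawn from Kauffman's own models.

```python
# Minimal sketch of the phase change in note 1: once the number of random
# connections passes roughly half the number of elements, one giant cluster
# suddenly emerges. N and the link counts below are illustrative choices.
import random

def largest_cluster(n_nodes, n_links, seed=0):
    rng = random.Random(seed)
    parent = list(range(n_nodes))          # union-find over the elements

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for _ in range(n_links):
        a, b = rng.randrange(n_nodes), rng.randrange(n_nodes)
        parent[find(a)] = find(b)          # connect two randomly chosen elements

    sizes = {}
    for node in range(n_nodes):
        root = find(node)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values())

if __name__ == "__main__":
    n = 1000
    for links in (200, 400, 500, 600, 800):
        frac = largest_cluster(n, links) / n
        print(f"{links:4d} connections -> largest cluster covers {frac:.0%} of elements")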


John Kao on Corporate Creativity

About the author: John Kao draws on an appropriately eclectic background in his study of corporate creativity. He is a Harvard-trained psychiatrist, a successful high-tech entrepreneur, and a film producer (of the award-winning "Sex, Lies, and Videotape"). But it's his avocation as a jazz pianist that has given him the best metaphors to describe how some companies manage creativity better than others. He teaches a course on the subject at Harvard Business School, and shared some highlights with us . . .

"Managing creativity really requires a new managerial mindset," claims John Kao. "If you think about what traditional management is all about—how it's taught in business schools and practiced in organizations—the skills that are rewarded have a lot to do with analyzing options, decreasing uncertainty, and paying a lot of attention to detail. But those kinds of skills may actually be highly dysfunctional in an environment where the mission is to generate new insights, ideas, and processes that lead to the realization of value."

So how are managers concerned with enhancing creativity supposed to approach their work? "One of the biggest clues for me," says Kao, "came from 'Schindler's List.'" In one scene of that film, a worker observes Schindler not working like the others and questions his role. "And—if you remember the great line—he says: 'I handle the presentation.' More and more, when we talk about managing creativity, we are talking about stage managing—about creating hot environments where unprecedented things can take place."

Implicit in this new managerial mindset is the idea that the real strength of a company is the creativity of its people, and that great creative leaps happen only when people are able to improvise. Kao notes that good improvisation (or "jamming") does require important kinds of knowledge, but also requires a comfort level with not knowing. In the words of Zen philosophers, part of what an organization must achieve is "beginner's mind."


article abstract
A few voices stood out at the second annual Knowledge Advantage colloquium (held in November 1995 in Chicago), representing very different perspectives on the topic of knowledge management. John Kao explored the challenge of managing creativity, applying the metaphor of an improvisational jazz band and weaving in some Zen teachings. Nobel Laureate Murray Gell-Mann looked at knowledge through the lens of complexity science. And Motorola's Bob Galvin brought a bottom-line perspective to the conversation, describing why he believes it is just good business to invest in employees' knowledge growth.

Some companies have very effective techniques for clearing the mind—that is, challenging their expertise and perceiving new realities. "Jan Timmer," notes Kao, "who ran the big change management process at Philips over the last few years (Project Centurion), called his managers together and said, 'Nobody who has functional expertise in a particular area is going to be allowed to run a project having to do with that area.'" Similarly, The Coca-Cola Company recently hired 125 marketing executives, none of whom, by intention, had any beverage experience.

Also at Coke, CEO Doug Ivester considers it his personal responsibility to visit typical stores anonymously, to get an unadulterated view of the market. Other companies make this a distinct role in the organization. Meiji Seika, a Japanese confectionery company, told one manager in 1994 that his job description was to live in Brussels, eat dinners out, and visit grocery stores. Period. His nickname at the company was "Tastebuds."

The organizational equivalent of "beginner's mind" can be facilitated in other ways, too, says Kao. As well as clearing minds, managers can think about how to clear spaces and how to clear beliefs.

When Kao talks about "clearing the space," he means the actual physical work environment. "You can get very detailed about how to create environments that are conducive to creativity, right down to the architectural details." This was the case at Oticon, a hearing aid manufacturer in Denmark. An example: "They decided to build their staircase twice as wide. Why? Because they wanted people to be able to go up and down without having to say, 'Oh, excuse me'—so they could continue the informal conversations they were having. Oticon's premise was: If we're going to compete through creativity, we have to maximize the opportunity for conversations internally." Certainly, the office furniture maker Steelcase believes this. In its research, design, and communications investments, it is "betting the company that creative collaboration will be the theme of the future."

Finally, "clearing the beliefs" of a company is about "creating a climate of belief—fostering an expectation of not just the possibility but the inevitability of creativity." Those who manage creativity astutely raise the company's level of expectation of it and make it more tangible. The value and priority placed on creative work is clear.


Murray Gell-Mann on the Complex Adaptive Business

About the author: Murray Gell-Mann was awarded the Nobel Prize in Physics in 1969, and is best known for his theory predicting the existence of "quarks"—a discovery which sparked the development of a new branch of physics known as quantum chromodynamics. His interests extend to many other subjects, including archaeology, history, evolutionary biology, linguistics, learning, and creative thinking. He was able to bring all these subjects together as a founding member of the Santa Fe Institute, where he is involved in the study of complex adaptive systems.

When W. Edwards Deming began his pioneering work with Japanese firms in the 1950s, he was, says Murray Gell-Mann, essentially recognizing the firm as a complex adaptive system. He saw that organizations, like other evolving organisms, were constantly gathering information from their environments and adapting to it. Deming's contribution, as Gell-Mann describes it, is something to which all managers should aspire: to ensure that the firm is always adapting to "the real selection pressures on the organization—namely the need to please and retain customers and to make a profit—rather than selection pressures generated by individuals in the organization, which may not be coincident with the needs of the organization as a whole."

Gell-Mann explains himself by way of a quick introduction to complexity science. The complexity that we see in the world all around us, he explains, is due to three basic things: "very simple rules, initial order, and the operation over and over and over again—the relentless operation—of chance." What do we mean when we speak of an object's or system's complexity? A good definition is that its complexity is measured by "the length of a very concise description of its regularities."


Finding these simple rules and regularities is the objective of scientific enterprise—itself a complex adaptive system. In science, the rules (or as Gell-Mann calls them, schema) are the theories and equations proposed to describe the world and predict its workings. Like Maxwell's equations or, today, superstring theory, they state the simple, robust laws that govern a diverse set of phenomena. Such schema are not absolute—there are always competing theories about, to be proved or disproved by further experimentation. As more information is gathered, the complex system that is science continues to adapt.

Businesses too, of course, have their schema, in the forms of policies and practices. Those that lead to success are selected for continuance, while the others die off. The problem comes when the pressures on the selection process are coming not from paying customers but from internal players with personal agendas and the power to skew the feedback. As in science, Gell-Mann notes, "we see other selection pressures coming from human frailty: excessive ambition, greed, and so on." Responsible managers will fight this threat by working to make the company into "a genuinely adaptive system."

Gell-Mann notes another implication of complexity science (or what he calls "plectics") for business: that the formulation of strategy is at best problematic. "A hundred years ago, every scientist would have said: 'Of course, if you know the initial condition and you know the law, you can predict what will happen.' Now we know that's absolutely untrue."

Two sources of unpredictability are the problem. The first is that the fundamental laws of the universe are quantum mechanical: "all they give is a set of probabilities for alternative histories of the universe—not a clear prediction that one history will occur instead of the others." Second, there is the famous phenomenon of chaos to contend with, which involves extreme sensitivity to initial conditions: "tiny, tiny changes in initial conditions can distort the outcome by huge amounts."

The translation to business is that it's frequently impossible to find a "best" strategy. Instead, Gell-Mann posits, "what is most important is to have a family of strategies, such that one can vary the response to one's changing circumstances according to success." Robustness doesn't consist so much in having a particular pattern of response as in having an enormous set of possible responses. "At the very worst, you can move around at random among a set of related strategies—and if you can do better than random, fine."
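One way to make "vary the response according to success" concrete is a simple chooser that leans toward whichever member of a strategy family has been paying off lately while still sampling the others at random. The sketch below is only an illustration of that idea, not a method proposed by Gell-Mann; the strategy names, payoffs, and the exploration rate are all invented for the example.

```python
# Illustrative "family of strategies" chooser: exploit the recent winner most
# of the time, explore at random the rest of the time. All names and numbers
# here are assumptions made for the sake of the example.
import random

def choose(strategies, score, explore=0.2, rng=random):
    """Mostly pick the best-scoring strategy, sometimes pick one at random."""
    if rng.random() < explore:
        return rng.choice(strategies)
    return max(strategies, key=lambda s: score[s])

def run(periods=200, seed=1):
    rng = random.Random(seed)
    strategies = ["cut price", "add service", "bundle", "hold steady"]
    score = {s: 0.0 for s in strategies}    # recency-weighted payoff estimate
    counts = {s: 0 for s in strategies}
    for t in range(periods):
        # Circumstances change halfway through: a different strategy starts paying.
        best = "cut price" if t < periods // 2 else "add service"
        s = choose(strategies, score, rng=rng)
        payoff = rng.gauss(1.0 if s == best else 0.3, 0.1)
        score[s] += 0.3 * (payoff - score[s])   # exponential moving average
        counts[s] += 1
    return counts

if __name__ == "__main__":
    # Usage counts concentrate on whichever strategy is currently working.
    print(run())
```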


Bob Galvin on Learning at Motorola

About the author: Robert W. Galvin stepped down as chairman of Motorola in January 1990 to become Chairman of the Executive Committee of the Board. Exactly 50 years earlier, his father, Motorola founder Paul Galvin, had invited him to perform his first assignment for the company: an address to its national sales convention. Under Bob Galvin's leadership, Motorola expanded into international markets in the 1960s and, over the next decade, shifted its focus away from consumer electronics and into high-technology markets. By the end of the 1980s, Motorola had become the premier worldwide supplier of cellular telephones.

Bob Galvin, by his own account, has "been around a long time" in business, and has seen at least four eras of thinking about what makes for corporate competitiveness. But the best answer came to him one day when he was reflecting on his tennis game. "I simply noted that, when I lost, I lost because the other person was better trained than I." From that, a broader concern dawned: How could Motorola prevail unless it were competitive counterpart-to-counterpart, person-to-person?

"I came in to my associates the next day and said: 'I think I've finally identified the essence of how we're going to become more competitive. We must all strive to be as good or better than the best—not the average, but the best—of anybody who holds our job at a competitor.'" This was the beginning of what would eventually become a huge commitment to training by the company, ultimately taking the form of Motorola University.

But the idea was hardly a no-brainer. While no one refuted the value of learning, many claimed the cost would be too high. Galvin offered the counterintuitive argument. "My position was—and I'd done some thinking before I sprung this on the gang—that training wasn't going to cost us anything." Despite proposing an initial expenditure of $40 million, he was convinced the payback would more than cover the cost.


Galvin increased Motorola's knowledge advantage in other ways, as well. One idea he borrowed from the US government has since become common practice in industry: to establish an "intelligence department." He even hired away an agent from the CIA to found it and build a staff of "professional knowledge acquirers." The group has been invaluable to Motorola and to Galvin personally: "They knew what the Internet would do before its creators knew what it would do."

One key in such intense knowledge acquisition is, of course, to avoid "information overload." This was a problem when Motorola executives were preparing to crack the Japanese market. "We were just buried with all the things we ultimately knew about Japan." The breakthrough came when the team stepped back from the maze of information and focused on an essential factor: "that what the Japanese really respect is power." As Galvin describes it, "we knew then we were going to have to use power as a two-by-four to break our way into Japan—or risk having the principle of sanctuary bury our entire industry."

Focusing on the essence of what one knows or should know is not easy. But Galvin believes it is, like much of creative problem solving, a learnable skill—even, he says, a "vocational skill." He personally undertook to learn how to think creatively some 40 years ago, with the help of a seminal guide: Applied Imagination by Alex Osborn. Like any learned skill, creativity takes practice. And practice Galvin does. "Now I do it almost instinctively. I'll have an idea walking down the street by putting some four things together. It may have no relevance for me, but at least I'm in motion, I'm practicing."

Not surprisingly, Galvin is now working to bring this skill to all of Motorola's leaders. "We are explicitly directing our people, from middle managers on up, to be extremely conscious of the processes of creative thinking." Just as importantly, they are being urged to go use it. Explains Galvin: "We've got to be bold. If we don't leave a legacy—if we don't use our knowledge to cause something different and better to occur, then we shouldn't have our jobs. Because that is our job. We are leaders."


TECHNOLOGY WATCH

Five Myths that Slow Down Software Development
John Parkinson

About the author: John Parkinson is an Ernst & Young Partner and the firm's Chief Scientist. He specializes in improving the quality and productivity of information systems professionals, and was the chief architect of Ernst & Young's Navigator System Series methodology. He received his degrees in mathematics and information sciences at Exeter University in the UK. Contact him at John.Parkinson@ey.com.

In 1975, Fred Brooks wrote The Mythical Man-Month, an insightful book that exposed a particularly virulent example of flawed thinking. The thinking went (and in some quarters still goes) that, if it takes a team of five developers one year to create a new software program, then that translates to a job of sixty man-months. Clearly, some reasoned, those sixty months could be divided up differently. The addition of five more full-time people, for example, should mean the software could be done in six months. Throwing more people at projects became a common tactic for trying to accelerate the software development process.

Unfortunately, as Brooks points out, it doesn't work. For one thing, larger teams consume more time in communication, management, and integration. But even beyond these "transaction" costs, the tactic has fundamental problems because it assumes that one man-month is indistinguishable from another. In fact, given the varying skills of the men and women and the different stages of the project, some man-months are definitely more equal than others. In the end, the tactic makes about as much sense as trying to divide up gestation responsibility among three women, and expecting a baby in three months.

Over the two decades since Brooks wrote his book, we've been observing and collecting other myths that slow down improvement processes. Particularly in the past five years, this has taken the more positive form of research into how to accelerate software development.
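Part of the reason the man-month arithmetic fails is that coordination effort grows with team size: n people have n(n-1)/2 pairwise communication channels. The back-of-the-envelope sketch below makes that visible; the assumption that each coworker costs 5% of a person's time is an invented figure for illustration, not a number from Brooks or from the research described in this article.

```python
# A toy model of Brooks's point: adding people shrinks the naive schedule but
# inflates communication overhead, so calendar time stops improving (and can
# get worse). The 5%-per-coworker coordination cost is an assumed figure.

def naive_months(total_man_months, team_size):
    """The 'mythical' calculation: effort divides evenly among people."""
    return total_man_months / team_size

def months_with_overhead(total_man_months, team_size, cost_per_coworker=0.05):
    """Charge each person a fraction of their time per coworker they must talk to."""
    comm_fraction = min(0.9, cost_per_coworker * (team_size - 1))
    productive_capacity = team_size * (1 - comm_fraction)  # man-months delivered per month
    return total_man_months / productive_capacity

if __name__ == "__main__":
    job = 60  # the five-person, one-year project discussed above
    for team in (5, 10, 15, 20):
        print(f"{team:2d} people: naive {naive_months(job, team):5.1f} months, "
              f"with overhead {months_with_overhead(job, team):5.1f} months")
```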


article abstract
Software developers could accomplish more, faster, by shedding some of the myths they now labor under. The Universality Myth leads developers to apply standard development methodologies without adjusting for the particular characteristics of the challenge (there are, in fact, at least eight dimensions of variability). The Completeness Myth leads developers to overengineer or "polish" their solutions in the (often) mistaken belief that customers will accept nothing less than perfection on delivery. The "More Must Be Better" Myth refers to time spent up-front capturing customer requirements. In fact, less time can be better—far more important is the use of prototyping.

Following are explanations of five myths still making the rounds, and some suggestions on how to get beyond them.

The Universality Myth

When our research program first started, we analyzed the development process to discover where time and effort are typically spent. Our findings were surprising.

Only 45% of the total resources could be associated with the deliverables. This was such a low result that we suspected that either our research methodology or data capture tool was at fault. Further interviews with the project teams showed that in fact there were two sources of attributable effort that did not show up in our analysis: effort required for the problem-discovery process; and effort expended to develop and analyze solution options that were not actually used. Still, the combined effect of these two findings accounted for about 20% of the total project effort, taking our "valid resource consumption" total to about 65%. Where did the remaining 35% of our consumed resources go?

Some effort was expended to correct problems resulting from "defects" that entered the process at some earlier stage. But the use of standard development methodologies was the major culprit. Methodologies treated every project identically, which resulted in plans that contained unnecessary or non-value-added work for a particular project. Project teams faced with work plans that they did not fully understand, but with utilization measures very much in mind, worked the plan. This explained virtually all of the 35% "lost" effort.

If every project is to some degree different, what are the characteristics that determine these differences? We grouped our suspected sources of variability into eight "dimensions" that seemed to account for most of the differences that mattered to us.

1. The characteristics of the problem domain.
2. The characteristics of the solution domain.
3. The approach to be taken to identifying the problem and creating a solution.
4. The skills and experience of the project team.
5. The tools and techniques used to support the solution development process.
6. The target technology environment for the solution.
7. The baseline knowledge available at the start of the project.
8. The selected implementation approach for the solution.

We then began to use this project characteristics model to influence the planning of projects. We took about 20 (out of 300) of the already completed projects for which we had the original work plans and planning assumptions, and looked at the differences in resource estimates that would have resulted if they had been planned using the characteristics model.
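Before the results, here is a minimal sketch of what a characteristics-based planning adjustment could look like: score each of the eight dimensions and scale the one-size-fits-all estimate accordingly. The dimension weights, the scoring scale, and the example scores are invented for illustration; they are not the actual Navigator planning model or the research team's factors.

```python
# Toy characteristics-based estimate adjustment. Scores run from 0.0 (far
# simpler than the standard plan assumes) to 1.0 (fully as complex). The
# example scores are chosen so the result lands near the 28% average
# reduction reported in the article; they are not real project data.

EXAMPLE_PROJECT = {
    "problem domain":              0.70,
    "solution domain":             0.60,
    "problem/solution approach":   0.75,
    "team skills and experience":  0.80,
    "tools and techniques":        0.70,
    "target technology":           0.60,
    "baseline knowledge":          0.90,
    "implementation approach":     0.70,
}

def adjusted_estimate(standard_estimate_months, characteristics):
    """Scale a standard-methodology estimate by the average characteristic score."""
    scale = sum(characteristics.values()) / len(characteristics)
    return standard_estimate_months * scale

if __name__ == "__main__":
    standard = 100.0  # person-months from the one-size-fits-all work plan
    adjusted = adjusted_estimate(standard, EXAMPLE_PROJECT)
    print(f"Adjusted plan: {adjusted:.0f} person-months (vs. {standard:.0f} standard)")
```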


These projects would have been on average 28% smaller, if all factors known to the project planners were taken into account. The absolute range was from 11% to 43%. None would have been larger.

In addition, we began to plan new projects using the characterization approach. Every project was also planned in parallel, using the old approach. This time, we got slightly different improvement results. The projects were still mostly smaller in terms of planned resources, but only on average by 24%. The absolute range was from 13% to 34%.

As the projects planned using our new process were completed, we analyzed the actual resources they required and compared this to the projects from our original research database. We found an average of 16% reduced resources at completion. Just by modifying the project planning process, we were able to get back almost half of the 35% "wasted" resources. Failure to capture all of the improvements in the project outcomes was due to:

Overoptimism in planning and estimating assumptions. Although no individual assumption was unreasonable, the assumptions taken as a whole added up to too optimistic a view.

Unanticipated requirements. Here, the discovery process identified a significant number of unanticipated but essential requirements.

Various "decelerators." These included unavoidable but unplanned staff substitutions and unexpected unavailability of key resources, infrastructure delays and technology stability problems, development tool failures, and sponsorship changes.

Clearly we can't anticipate all sources of variability of outcome. But we can be smarter about how we plan and estimate work.

The Completeness Myth

Our next myth concerns the very common IS trait of overengineering solutions. In many projects, the deliverables that were produced were far more elaborate than was required. In some cases, more than 20% of total resources were consumed in what we came to call "polishing the deliverables." When asked why so much time was spent "improving" what was already complete, project teams invariably gave the "quality matters to us and our customers" response. When asked if they appreciated the extra effort, most customers indicated that they were satisfied with the first or second attempt, and assumed that all that extra work—which tended to slow things down—was for the project team's benefit.

We are not condoning the delivery of poor, unfinished, or inappropriate work. We are suggesting, however, that the project team develop an understanding of the customer's desired levels of quality before the project starts.

Accelerating New Product Development, pg. 55


The important rule here is just another manifestation of the Pareto principle: 80% of value is realized from 20% of the solution—and there is no value at all until you implement something. Getting the first 20% developed and into use is the best way to be successful.

The "More Must Be Better" Myth

As our research data built up, we continued to look for ways to reduce cycle times. One area consuming time and resources was "requirements analysis," the early stages of the project dedicated to identifying and documenting customer needs. We had adopted the underlying principles of Information Engineering (IE),1 which recognize that many rework problems stem from process failures during requirements analysis. By getting the requirements analysis right, therefore, these sources of rework will be eliminated. When we put this principle into practice, we did see more requirements analysis, but not always better requirements.

We defined requirements quality in terms of the number of changes that were required during the design and development phases of a project and the level of customer satisfaction after implementation of the project's deliverables. In particular, we looked at the 25% of projects that had both high customer satisfaction scores and low change rates to see how much requirements analysis had been done. But we discovered that there was no correlation between the amount of effort that went into requirements analysis and the quality of the requirements that resulted from the work.

Testing this result on the complete population of projects, we discovered the same thing. Finally, we tried it on the "worst" 25% of the projects that had low customer satisfaction scores and high change rates. The findings held.

When we re-analyzed the best 25% of the projects, we discovered that customers had been shown examples and prototypes that were close to the finished application. And this process had begun very early on in requirements analysis. We then tested the hypothesis that there should be a link between requirements quality and early exposure to something close to the final solution. Not surprisingly, projects that had not used prototyping or best-practice examples during requirements analysis scored poorly on requirements quality. Projects that did use prototypes, but that did not manage to create a prototype that was close to the final solution, also fared poorly.

We then formed the conclusion that requirements analysis is not just, or even mostly, about analyzing requirements using the classic "find the problem, then solve it" model. Instead, requirements analysis is mostly about mutual education between customers and developers. By designing a requirements capture process that recognizes the nature of successful mutual education, we can manage to the needs of the process rather than an abstract model of objectives.

Accelerating New Product Development, pg. 55    A Prescription for Knowledge Management, pg. 26
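A toy version of the correlation check described above is shown below. The quality score (satisfaction minus a change-rate penalty) and the eight project records are invented so that quality tracks early prototype fidelity rather than analysis effort, mirroring the reported finding; none of it is actual research data.

```python
# Illustrative check: does requirements quality track analysis effort or early
# prototype fidelity? The project records and the quality formula are
# assumptions constructed to illustrate the finding described in the article.
import math

# (weeks of requirements analysis, prototype fidelity 0-1,
#  changes per 100 requirements, post-implementation satisfaction 1-10)
PROJECTS = [
    (2, 0.9, 5, 9), (8, 0.2, 40, 3), (5, 0.8, 10, 8), (12, 0.3, 35, 4),
    (3, 0.1, 45, 2), (10, 0.9, 5, 9), (6, 0.5, 25, 5), (9, 0.6, 20, 6),
]

def quality(change_rate, satisfaction):
    """Higher satisfaction and fewer downstream changes mean better requirements."""
    return satisfaction - change_rate / 10.0

def pearson(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

if __name__ == "__main__":
    effort = [p[0] for p in PROJECTS]
    fidelity = [p[1] for p in PROJECTS]
    q = [quality(p[2], p[3]) for p in PROJECTS]
    print(f"analysis effort vs quality:    r = {pearson(effort, q):+.2f}")
    print(f"prototype fidelity vs quality: r = {pearson(fidelity, q):+.2f}")
```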


article abstract
In most cases, it's a myth that business customers know exactly what they want in a piece of software and can articulate that desire. They provide more useful guidance when they are given ideas and examples to which they can respond. The "This Is Different" Myth leads developers to reinvent much code that could simply be appropriated from other sources. Research shows that a typical application involves about 45% new coding versus reuse; it should never be necessary to develop more than 35% from scratch—and the goal should be 5%.

The "We Know What We Want" Myth

Recently, there has been a trend to present technology matters in mainstream business publications. These articles, usually well-researched and well-written, target a non-technical business audience and lack much in the way of technical depth. As awareness-building vehicles, they do a competent job of describing complex technologies in straightforward terms. Yet many CIOs dread them. Why? Because these articles create the impression that the technologies described are readily available and mature; that they are easy to use; that just about everyone is already using them; and that if your company is not, you are missing the boat. So what happens? The next day, in comes an executive demanding to know when XYZ Inc. will be using object-oriented programming to develop neural agents to do business over the Internet, and why it's not happening already. "Isn't that what we pay you IS guys for?"

Meanwhile, as PCs have gotten cheaper and cheaper, many executives who don't have PCs at work do have them at home. They use extremely sophisticated and very cheap software2 that supports color, has instant tutorials, and provides "Wizards" for the hard parts. The software is also relatively defect-free. This is the application benchmark they come to expect from all the applications they use. They never experience the problems of scaling up single-user experience to the corporate environment. But they certainly know that they want their large-scale business software to look and behave just like the products they use at home.

Smart IS organizations can use this growing technological literacy among their customers to their advantage if they are prepared to change their own processes to respond to it. Instead of traditional requirements capture methods ("tell me what you want"), substitute "best of breed" product comparisons and technology demonstrations to detect and define improvement opportunities. Get ahead of your customers and establish the IS organization as a reliable source of information about new technology and the solutions it can provide.

The "This Is Different" Myth

Over time, we have built up a collection of analyses of the executing image of various types of software that allows the "origin" of the code to be recorded. This structural analysis has allowed us to identify and record the source of the many different components that make up a modern software application. It also allows us to answer three other related questions:

1. How much actual influence do we have over the code that makes up our applications?
2. How much of the code that we can influence should we care about?
3. How much of what we should be doing are we actually doing when we develop application software?


Origin of code in executing application image                                      %

Operating system-supplied services and system utilities                           22%
Data communications, networking, and transaction management, including
  concurrency control, transaction recovery, and commit logic                     13%
Data management, including standard data access, data and view
  synchronization, and metadata management                                        15%
User interface behavior, interface control, data presentation, and
  interaction management                                                          15%
Industry- or marketwide business rules and interenterprise process and data
  structure definition conventions, including regulatory requirements
  and conventions                                                                 15%
Enterprisewide business rules and process or data structure definition
  conventions                                                                     15%
Problem-specific rules, processing logic, or data structure definitions            5%

Table 1: Summarized structural analysis of applications software

The execution image of a modern application is enormous. It will include components from many different sources. The image is generally much bigger than it was prior to the introduction of graphical user interfaces and distributed data architectures, but the distribution of the code has remained remarkably stable. Table 1 summarizes the results of the analysis of several thousand application images developed to run on a fairly wide range of target technology platforms. The profile that results is remarkably independent of the actual technologies used. Although the specific values do vary somewhat, the line between what we can influence and what we cannot is pretty well fixed at about 35% (the last three items in Table 1).

The logical target for our developers? Build only the 5% or so of the application that consists of problem-specific logic. We should be able to build everything else just once, or get it from someone who has already developed, tested, and published it. This is the ultimate target of the component reuse model. Even if we don't have any of the common industry and enterprise components available, we should still never have to build more than 35% of an application.

If 35% of the execution image is the maximum we actually need to develop from scratch, what's the average proportion that we actually do develop? A surprising 45%. On average, we develop 10% more of the total execution image of an application than the maximum we actually need. This can largely be attributed to inexperienced or undertrained developers, who develop code that was already available from other sources, or don't trust the services supplied by the operating system and develop their own versions, and so on.

Stopping this kind of behavior requires awareness-building and the establishment and reinforcement of specific competencies, but it can quickly pay for itself in terms of reduced work effort and improved component reuse levels.

Although software engineering processes have been described and debated for nearly two decades, sustainable levels of performance in software development remain elusive. We have shared our research results on the leading practices for software development, and the myths that prevent their successful adoption, in the hope that they will contribute to improved development processes—and in the hope that the multi "man-year" development project may, in time, also prove to be a myth.
1. Information Engineering, Volumes 1-4, James Martin, Prentice Hall, 1987.
2. Microsoft's Excel spreadsheet package, which can be bought for under $100, represents about 5,000 function points of content. That's about the same as a medium-sized business application. Microsoft can sell it, retail, for 2¢ a function point because it will sell millions of copies. Most users will only touch about 10% of the available functionality—but for under $100, who cares? The IS organization will build only a few copies of applications software, and even a few thousand is too few to get Microsoft's economies of scale.
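Working the numbers in Table 1 makes the argument concrete: the rows a development team can influence sum to 35%, the problem-specific row is the 5% target, and the 45% typically written from scratch overshoots the maximum by 10 points. The short sketch below only restates the article's own figures; the category labels are abbreviated.

```python
# Arithmetic behind the "This Is Different" myth, using the Table 1 shares.
CODE_ORIGIN = {                                # % of executing application image
    "operating system services":          22,
    "data comms / transaction management":13,
    "data management":                    15,
    "user interface":                     15,
    "industry/marketwide rules":          15,
    "enterprisewide rules":               15,
    "problem-specific logic":              5,
}

INFLUENCEABLE = ("industry/marketwide rules", "enterprisewide rules",
                 "problem-specific logic")     # the last three rows of Table 1

if __name__ == "__main__":
    assert sum(CODE_ORIGIN.values()) == 100
    can_influence = sum(CODE_ORIGIN[k] for k in INFLUENCEABLE)   # 35%
    target = CODE_ORIGIN["problem-specific logic"]                # 5% goal
    actually_built = 45                                           # reported average
    print(f"Maximum worth building in-house: {can_influence}%")
    print(f"Logical target (problem-specific only): {target}%")
    print(f"Typically built from scratch: {actually_built}% "
          f"({actually_built - can_influence}% more than the maximum needed)")
```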


WELL-READ MANAGER

On the Theme of Knowledge Management

A good background for understanding the role of knowledge in societies and within companies is Peter Drucker's Post-Capitalist Society (Boston, Butterworth-Heinemann, 1993).

Two more recent books have focused on the use of knowledge within business firms and the competitive advantage of managing knowledge effectively. One is The Knowledge-Creating Company (New York, Oxford University Press, 1995) by Ikujiro Nonaka and Hirotaka Takeuchi; the other is Dorothy Leonard-Barton's Wellsprings of Knowledge (Boston, Harvard Business School Press, 1995).

Butterworth-Heinemann is in the midst of publishing a series of "readers" on knowledge management, which collect some of the most influential articles to date. Researchers from the Center for Business Innovation have served as editors of the first several volumes. So far, Knowledge and Organizational Structure and Knowledge Management Tools have appeared. Another good anthology of articles, not part of that series, is Organizational Learning, edited by Michael Cohen and Lee Sproull (Sage Publications, 1996).

We're looking forward to a few new books in 1997. Tom Stewart's Intellectual Capital (Currency Doubleday) will finally give full treatment to the insights he has gained over three years of reporting on the topic for Fortune. And Thomas Davenport and Laurence Prusak are teaming up on two new books: Information Ecology (Oxford University Press) and Working Knowledge (Harvard Business School Press).

Some excellent articles include "Informal Networks: The Company Behind the Chart" by David Krackhardt and Jeffrey R. Hanson (Harvard Business Review, 71:4, 1993), "Learning by Knowledge-Intensive Firms" (Journal of Management Studies, 29:6, 1992) by William H. Starbuck, and "Improving Knowledge Work Processes" (Sloan Management Review, 1996) by Tom Davenport et al.

Finally, two books produced by The Ernst & Young Center for Business Innovation (and available only through it) capture the content of the 1994 and 1995 "Knowledge Advantage" conferences. The volumes are a good mix of practitioner stories and theoretical and research-based insights. A third volume is in the works to represent the 1996 event.


HEADS-UP

Upcoming Events

Managing the Knowledge of the Organization, Semi-Annual Conference
February 23-26, 1997, Napa CA
Contact: Mare Rasmussen, Ernst & Young, 617-725-1557

Maximizing Customer Loyalty, Workshop and Roundtable
February 24-25, 1997, Scottsdale AZ
Contact: ITSMA Events, 800-404-8762

International Software Partnering Conference
March 2-4, 1997, Palm Springs CA
Contact: Registrar, 415-908-2659

Forrester Technology Management Forum
March 4, 1997, San Francisco CA
Contact: Forrester Research Events, 617-497-7090

Internet & Electronic Commerce Conference & Exposition
March 18-20, 1997, New York NY
Contact: Expocon Events, www.iec@expocon.com

Leading the Charge in Turbulent Environments: IT as a Change Agent, CIO Perspectives Conference
March 23-26, 1997, San Diego CA
Contact: Registrar, 800-366-0246

Performance Measurement for Customer Profitability in Banking
March 24-26, 1997, Atlanta GA
Contact: IQPC Events, 800-303-9160 or www.iqpc.com

Managing the Supply Chain: Integration, Implementation and Optimization
April 14-16, 1997, Fort Lauderdale FL
Contact: Gartner Group Events, 203-316-6757

Year 2000 Issues and Answers Conference
April 16-18, 1997, Toronto, Canada
Contact: DCI Events, 508-470-3880

IT Management in the 21st Century: Converging Business and IT Management Agendas
April 23-25, 1997, Palm Springs CA
Contact: Gartner Group, 203-316-6757

A Tale of Two Webs: Making Your Intranet and the Internet Pay Off, WebMaster Perspectives Conference
May 18-21, 1997, Miami FL
Contact: Registrar, 800-366-0246

Outsourcing Life Cycle Conference: Managing Through and Beyond the Sourcing Decision
June 3-5, 1997, Monterey CA
Contact: DCI Events, 508-470-3880

Computerworld Smithsonian Awards
June 9-10, 1997, Washington DC
Contact: Dave Petrou, Ernst & Young, 216-861-2068

Conference Board Strategic Outsourcing Conference
June 19-20, 1997, New York NY
Contact: Julia Kirby, Ernst & Young, 617-725-1581


RESEARCH ROUNDUP

Measures that Matter

The Ernst & Young Center for Business Innovation recently conducted an important research effort into the "motivations of money men." Its mission: to gauge the degree to which stock analysts' recommendations (and therefore stock prices) are influenced by various non-financial performance criteria. The research employed a rigorous survey methodology and enjoyed a solid response rate. Results show that major investors are heavily influenced by information about a company's performance beyond its financial reporting.

Those surveyed were asked to say which specific types of non-financial performance information had the greatest impact on their decision-making. Thirty-nine possibilities were suggested, including such things as market share, product defect rates, and employee attrition rates. Ranking highest were:

1. The quality of a company's execution of its corporate strategy; and
2. The credibility of a company's management team.

Both these criteria were reported on average to be higher than a "6" in importance on a scale from 1 (not at all important) to 7 (very important). (Note that, when confronted with such Likert scales, it is a known phenomenon that respondents will avoid the extremes. Therefore a rating of "6" is quite high.) Over 80% of respondents rated both these criteria as a "5" or "6" in importance.

Least important, the survey found, are compensation issues and the receipt of formal quality awards. For example, the ratio of CEO compensation to workforce compensation was ranked last in importance, with most respondents assigning it minimal importance. Interestingly, the use of employee teams also scored very low as an influencing factor.

Where do investors go to get the information they consider most useful to their analysis? Most popular as sources of insight are a company's management presentations and its public filings/reports. (Both average-ranked above 5.) Other forms of easily attainable, public information are used but less valued.

The second part of the research employed a "discrete choice" technique. This technique analyzes actual investment choices made by securities analysts, in order to assign relative weights to factors influencing their decisions. Hypothetical investment options were presented to the analysts, and researchers examined what happened to their evaluation of a stock when, for example, a manufacturer's defect rate went down. This research technique is considered more reliable in revealing true behavior than techniques that rely on self-reporting of behavior. Happily, the technique corroborated the self-reported findings noted earlier.

Differences in findings emerged along industry lines. (Four were studied specifically: computers, pharmaceuticals, food products, and oil & gas.)


In the computer industry, for example, quality of management was rated highest as a factor influencing analyst recommendations, followed by strength of market position and, in third place, the quality of products and services. This implies that a computer company hoping to boost its share price would be wise to communicate more aggressively its positive developments and industry standing in those areas. In the pharmaceutical industry, by contrast, communications about new product development would have greatest impact.

Naturally, Ernst & Young is most interested in applying these findings for the benefit of its client base—those publicly-traded corporations who benefit from better perceptions of their share value. The firm anticipates advising clients along four lines:

- improvement of measurement systems to allow more accurate and reliable reporting of non-financial performance
- enhanced understanding of the "value creation process"—that is, the underlying mechanics of how share prices are influenced by performance and reporting
- more efficient resource allocation, so that performance and reporting enhancements can be made in areas of highest impact
- achievement of higher share price, resulting from all these types of effort

Already, Ernst & Young has developed a model based on this study which allows it to predict for clients the impact they might expect on share price of specific efforts to improve and communicate non-financial performance. For more information on the study and its potential uses, contact Amy Blitz at the Ernst & Young Center for Business Innovation (617-725-1589 or Amy.Blitz@ey.com).

Accelerating New Product Development, pg. 55    A Prescription for Knowledge Management, pg. 26


RESEARCH ROUNDUP

Exploration and Prediction

Scenario planning has been widely and successfully used in the oil and gas industry to prepare for a variety of contingencies—from geopolitical developments to oil price movements to competitive strategic possibilities. A new research report applies the technique to one of the industry's fastest changing aspects—the information technology environment.

The study, which Ernst & Young helped to fund and conduct, was led by Daniel Yergin of Cambridge Energy Research Associates (CERA). Yergin is the author of the Pulitzer Prize-winning best-seller The Prize: The Epic Quest for Oil, Money, and Power. Thirty-eight oil and gas industry firms contributed to the research through a series of interactive workshops and one-on-one interviews.

Together, these firms explored the likelihood and implications of three distinct scenarios labeled by researchers "woven world," "strongholds," and "coral islands." The names reflect the fundamentally different directions the industry could take with regard to interoperability. The other key driver of the scenarios is ease of communications, based on broadening bandwidth and the convergence of computing and communications technologies.

The report issued by CERA explains the scenarios and why the assumptions behind each are reasonable. It also devotes several chapters to two other products of the study: a description of the current "state of play" in the use of IT in exploration and production, and a set of prescriptions for management. The prescriptions are cast as two complementary agendas: one for senior management, which must take more interest in and responsibility for IT as a strategic asset, and one for Information Systems management, which must align better with business needs.

What can we predict with confidence about IT in this industry? Five technologies, say the researchers, will assuredly have an impact: the World Wide Web, photonics, data warehouses, object technology, and microscopic robotics (combining nanotechnology with virtual reality and artificial intelligence capabilities).

For more information on the study, contact the E&Y members of the team: Tim.Crichfield@ey.com (or 415-951-3207) or Tom.Franklin@ey.com (or 713-750-8250). Copies of "Quiet Revolution: Information Technology and the Reshaping of the Oil and Gas Business" are also available from CERA: 617-441-2632.


RESEARCH ROUNDUP

Year 2000 Looms for IS Executives

It seems unbelievable, but it appears the information technology industry never expected the world to last till the year 2000. Information systems everywhere—in business, government, academe, everywhere—are predestined to go haywire as their internal clocks roll over from 99 to 00. The problem is that computers record dates using six digits (June 5, 1996, for example, is recorded 960605). So after the turn of the century, computers will have no way of distinguishing between, say, 1961 and 2061. If you think it's a problem that only affects the IS group, think for a moment about your retirement savings. And how nice it would be not to have a computer wipe them out.

A recent survey by Ernst & Young's technology consulting practice confirmed the nervousness the Year 2000 is causing in the ranks of IT executives, and gauged their progress in preparing for the day. Most recognize serious ramifications for their companies. A typical comment: "The impact of the Year 2000 problem, if it isn't solved, will be devastating." Key systems, they say, will cease to function, or worse, begin producing erroneous results.

While these companies are taking the Year 2000 problem very seriously and are committed to solving it, only a small minority have actually started working on the conversion. Nevertheless, many of the executives interviewed hope to complete their conversion and testing by the beginning of 1999 to provide one full year of working experience before the drop-dead date.

Due to the magnitude and complexity—but one-time nature—of the problem, the majority of companies will call on outside help in the solution. Good sources for finding it, they report, are Gartner Group research, focused seminars, and the Internet.

Meanwhile, a few cockeyed optimists are even able to see a silver lining in the situation. Noted one, "a side benefit will be a complete description of our software inventory and interfaces." Some also see this as a forcing function to clean up obsolete code in their systems.

For more information on the survey and its implications, contact Clint Alston, Ernst & Young National Director, at 214-665-5336 (or by e-mail at clint.alston@ey.com).

Five Myths ..., pg. 79
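The six-digit date problem described above is easy to reproduce in miniature. The sketch below uses the article's own 960605 example plus an invented maturity date, and shows a simple "windowing" repair (reading low two-digit years as 20xx). Windowing is offered here as one common remediation approach, not as a description of any particular conversion project; the pivot year is an assumption.

```python
# YYMMDD storage in miniature: once "00" can mean 1900 or 2000, ordinary date
# arithmetic breaks. The windowing pivot (30) is an illustrative assumption.
from datetime import date

def parse_yymmdd_naive(stamp: str) -> date:
    """The legacy assumption: every two-digit year belongs to the 1900s."""
    return date(1900 + int(stamp[:2]), int(stamp[2:4]), int(stamp[4:6]))

def parse_yymmdd_windowed(stamp: str, pivot: int = 30) -> date:
    """Windowing fix: years below the pivot are read as 20xx, the rest as 19xx."""
    yy = int(stamp[:2])
    century = 2000 if yy < pivot else 1900
    return date(century + yy, int(stamp[2:4]), int(stamp[4:6]))

if __name__ == "__main__":
    opened, matured = "960605", "010605"   # e.g. a hypothetical five-year deposit
    bad = (parse_yymmdd_naive(matured) - parse_yymmdd_naive(opened)).days
    good = (parse_yymmdd_windowed(matured) - parse_yymmdd_windowed(opened)).days
    print(f"naive term:    {bad} days")    # negative: the deposit appears to mature in 1901
    print(f"windowed term: {good} days")
```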


RESEARCH ROUNDUP

Best Practices in Global Supply Chain Management

Ernst & Young teamed up with management professors from the University of Virginia and the University of Western Ontario on an ambitious study: "Global Supply Chain Management: Partnering for World-Class Performance." Its objective: to identify the supply chain partnering and management practices that contribute most to revenue enhancement and cost reduction.

Firms pursue supply chain relationships with an eye to making some key performance improvements. Their first and overarching goal is to form relationships that increase end-customer satisfaction. Improving profits was the goal cited second most often. Reducing overall operating costs was considered important, but interestingly not the primary motivation.

Seventy-five firms participated in the study by providing comprehensive data on their practices and performance. The ticket of admittance was high: to participate, a core company had to provide access to suppliers and their suppliers and to customers and their customers. Moreover, many of the chains touched multiple continents and were considered truly global. In terms of breadth and depth, the study was unprecedented.

Findings outlined four distinct levels of "partnering," each of which is effective in different circumstances. True "collaboration," for example, involves effective practices in areas like training and technology sharing, and observable behaviors having to do with joint planning, communication, and conflict resolution. "Coordination" stops short of these collaborative practices, but entails tighter linkages than mere "Cooperation." At the lowest level, "Open Market Negotiations" are marked by price-based decisions, short-term agreements, more suppliers, and more adversarial relations with them—hardly partnering, but appropriate in some situations.

Not surprisingly, significant differences showed up in the practices and perspectives of sellers versus buyers, and upstream players versus downstream. Sellers, for example, are more interested in and are better prepared to build relationships than are buyers.

The findings also reflect leading practices in performance measurement. The best global supply chain managers don't settle for blunt measures of financial outcome, but fine-tune their process and market measures. Time-to-market, inventory levels, and market share are some of the things they watch closely.

So far, the findings of the study have been shared with only the study participants and selected groups of Ernst & Young clients. A summary report will be made available more broadly. For information on how to receive it, contact Leanne Gershkowitz at 216-737-1901 (e-mail: Leanne.Gershkowitz@ey.com).

Measures that Matter, pg. 87


INCONCLUSION

Issues and Non-Issues
Ian Frazier

Summary: In May, the American Prosperity Foundation, Inc., an office-based sampling organization, chose from a preselected group a smaller group, which it believed was unusually significant. Then professionals took that subset and divided it even further. At issue was whether an issue was an issue or a non-issue. On certain issues/non-issues, disagreement was so small as to be statistically negligible. For example:

ISSUE: My taxes
NON-ISSUE: Your taxes

So far, so good. Other i/non-i inquiries, while less clear-cut, nevertheless fell within an acceptable margin of certainty—

I: The Russian Suicide Death Chair
Non-I: Regular Chairs

—where acceptable certainty was taken to be a percentage greater than sixty-six, or slightly more than America's ninety-three million television households.

So far, so good. However, what were researchers to do in cases like the following?

I (Non-I): "She's the Sheriff"
Non-I (I): "Turner & Hooch"

If the first was designated an issue, although possibly not, in the judgment of many respondents, and the second was definitely not an issue except insofar as the first one was (albeit to a lesser degree), what then? The question seemed to lop the entire procedure off at the knees, and progress stalled. Enter Nils Garrickson, a twenty-five-year-old wunderkind trained in the emerging science of cybernetics.

Unfortunately, he was fired, leaving us right back at square one. Then they brought in somebody else, Tom somebody. He also got fired. Then they brought in Marcie, who was more or less kicked upstairs from Accounting. What she did, first off, was to go through all the non-issues and take a whole new look just at them. She found, to her surprise, that many did not strictly qualify as non-issues at all, but included a sprinkling of pseudo-issues, sub-issues, secondary issues, meta-issues, and dead issues, as well as one or two real serious issues that had somehow been misfiled.

Why Knowledge? Why Now?, pg. 2


Now we were getting somewhere. Print-outs of the new, culled list of non-issues were issued to every department head. Marcie's managerial style was hands-on, direct, and at times confrontational. Part Welsh, part Greek, with a slight mustache and a big, strapping form, she got the most from her smaller male associates. First off, she established a standard of "i/non-i-ness," based on the following model:

I: Golf junkets
Non-I: Miniature-golf junkets

Later quantifying the standard by means of a simple algebraic formula (included in work sheet), she received the Nobel Prize.

At the time, my department was working on an issue for which we had not yet found a corresponding non-issue:

I: Sex in the workplace
Non-I: ?

I had run through all the non-i tables without success, and Marcie was becoming impatient. One Easter I stayed over just to get some hours to myself on the computer. Monday morning rolled around and I hadn't had a chance to go home and shower. Suddenly it hit me! I ran into Marcie's office. She was watering her plants. She'd just arrived. Puzzled, she looked up as I scrawled on her blackboard:

I: Sex in the workplace
Non-I: Sex in the fireplace

Marcie plugged the coordinates into her formula—and sure enough, they checked out. We examined our figures again and again to make absolutely certain. Overjoyed, we reviewed my data sheets to see if they contained any discoveries that might be patentable, and we found plenty.

From then on, everything seemed to happen at once. Funding poured in. People had been waiting for a system that could reliably provide non-issues for any issues that came up, and vice versa. Now we had that system in place, with an exclusive seventeen-year license worldwide. In short order, we were able to engineer the following i/non-i couplings:

I                         Non-I
Scofflaw diplomats        Diplomats in general
Gangsta rap               Gangsta gift wrap
The B-1 bomber            The B-flat bomber
Young Elvis, old Elvis    Old Elvis, dead Elvis

Murray Gell-Mann, pg. 75


Each of these produced revenues for the foundation well in excess of thirty-five hundred dollars. Everyone began to look forward to going to work in the mornings. Staffers took each other out to lunch and splurged on health insurance. Every day, it seemed like, someone was coming up with a new "eureka" and shooting off a Roman candle in the commissary.

Then, one afternoon just before quitting time—we'd been getting along so well, and our system was working so beautifully!—Marcie fired me. The first thought that ran through my mind was, Never sleep with someone from the office! Of course, I hadn't slept with anyone, but that was small comfort now. As she turned to leave my cubicle and stepped into the hall, someone fired her. Then the guy who fired her heard his phone ringing and, when he picked it up, learned he had been fired. I cleaned out my desk and fired some people and went home, only to find a message on my machine from Personnel telling me I'd been rehired. But that turned out to be an error: the next day I received official notification that I'd been fired.

Naturally, the stage was now set for Nils Garrickson, Part Two. He was calling himself Nilsa and was taking a whole new approach. Apparently he/she had obtained some funds for a business to warehouse closed issues which technicians would then attempt to reopen. NilsCo offered me a flat daily rate, no benefits, everything off the books. Some of the issues I was working with were so closed that I was forced to resort to procedures which were bad science, even dangerous. Once or twice I managed to turn a closed issue into a fuzzy issue, but that was about it. After a few months, I quit.

I sat at home collecting unemployment and waiting for my phone to ring. Meanwhile, the world moved farther away from the old i/non-i classical polarities in which I had been trained. Some would say that it had never conformed to our model to begin with, and perhaps they would be right. The rare piecework assignments I picked up almost never involved a good textbook non-issue—just issues that someone wanted me to skirt or talk around.

Now, as I look back over my career, I realize that issues versus non-issues, as an issue, is something of a false issue. We all get caught up in discussion of the issues, and we try to use reason, and it's such a waste. Our country is being destroyed. Focusing on our differences blinds us to an evil that threatens everything we've worked for and cherish. In addition, we must try to develop a new mode that defines issues less in terms of what they are not (or are). This can sound more complicated than it really is, if only we break it down, which can be done easily by someone with the proper theoretical tools.

The Connected Economy ..., pg. 61
