
Technology Forecast
A quarterly journal
Summer 2008

In this issue:
1998: Globalization… 2008: Continuous change
Making complexity manageable
Operational Web 2.0
Bringing order to chaos

Contents

Features

04 1998: Globalization… 2008: Continuous change
Are agile management approaches the recipe for success in the face of change? How can IT contribute to making management more agile?

28 Making complexity manageable
What do you do when complexity can’t be avoided?

40 Operational Web 2.0
Finally, enterprise-class tools and methods are emerging to create informal online networks. Already, business users are collaborating to deliver better decisions.

60 Bringing order to chaos
The next suite—an intelligent business performance platform—blends business intelligence, rules, and process management, linking knowledge with the processes that need it.

Interviews

16 Where to find bedrock
Henning Kagermann of SAP discusses IT’s dual role of providing a stable foundation while enabling innovation.

22 The era of CIO accountability
Lakshmi Narayanan of Cognizant provides a look at the global delivery model and the CIO’s blended business and IT role.

52 The freedom to connect
Richard Rosenblatt of Demand Media describes how to borrow ideas from the consumer Web for enterprise benefit.

56 The dos and don’ts of mashups
John Crupi of JackBe clarifies what enterprise mashups are capable of, what they aren’t, and where their power lies.

Departments

02 Message from the editor
72 Acknowledgments
74 Subtext

Message from the editor

Welcome to the relaunch of PricewaterhouseCoopers’ Technology Forecast. To our former readers who valued past publications of Technology Forecast, we welcome you back. Our goal remains the same: to provide insight, forward-looking perspective, and points of view that can once again help you understand, anticipate, and deliver the competitive advantage to be gained from new and emerging technologies. To a new generation of readers, we look forward to engaging with you in conversation—and debate—about the future of technology in the enterprise.

Experienced readers of Technology Forecast books will immediately notice a difference. We’ve lost weight! More specifically, a journal format will allow more frequent publication and will bring more immediacy to the topics we cover. Commencing with the next issue of the new Technology Forecast, the focus will be on one theme per issue. We’ll explore that theme first in one article at the level of the general business audience and in more technical detail in a second article. And then, case studies will provide additional perspectives on that overarching theme.

A print publication can cover only a small portion of the enterprise technology landscape. For additional perspectives and for greater depth on individual topics, our Web site will contain a growing collection of associated material, including white papers, case studies, interviews, and quick-hitting perspectives on late-breaking developments. We hope you’ll want to take time to visit.

The first issue of the new Technology Forecast looks at today’s enterprise from two perspectives: (1) the combined impact of Y2K remediation, e-business, and enterprise-resource-planning (ERP) adoption from their emergence 10 years ago through to the present day and (2) how today’s business imperatives and technologies will transform enterprises over the next 10 years.
The article “1998: Globalization… 2008: Continuous change” describes how enterprises are currently in the early stages of more effectively leveraging technology so as to enhance their agility around strategic decision making. Business leaders are recognizing that responding to multiple simultaneous changes in the business environment is now the norm. We see strong parallels today between Y2K remediation and IT-complexity remediation and between e-business adoption and Web 2.0 adoption in the enterprise, together with ERP suite adoption and the emergence of a suite

that comprises business intelligence, business process management, and business rules management. Vendors are providing new tools in support of a new model of management: a collaborative, distributed model of accountability and of authority to act, complemented by more-comprehensive approaches to managing risk. The opportunity is there to greatly enhance the quality of decision making throughout the enterprise. Will your enterprise engage with that opportunity? Will it weigh the risks and rewards in proper proportion?

Two of our interviewees in this issue offer context for and perspective on trends in enterprise technology and discuss the lessons they’ve learned about technology adoption patterns. Henning Kagermann, chairman of the executive board and CEO of SAP AG, contrasts fast-moving startups with what his customers expect from SAP, with a clear role for both. Lakshmi Narayanan, vice chairman of Cognizant, discusses the emergence of the global delivery model that originated with Y2K remediation and how productivity in services is extending the life of offshoring.

The three articles that follow then break down the individual themes that compose the foundation for enterprise transformation. The article “Making complexity manageable” describes IT complexity, analyzes what’s driving it, and covers the approaches enterprises are taking to move from its episodic remediation to ongoing management.

The article “Operational Web 2.0” focuses on the rise in enterprise use of Web 2.0. Even though Web 2.0 is still in its early days, every person we interviewed for this issue was sure it would have a huge impact on the enterprise. We offer a narrative of the ways that Web 2.0, combined with the other trends described here, can enhance management decision making.

To back up our focus on Web 2.0, Richard Rosenblatt, cofounder, chairman, and CEO of Demand Media and former chairman of MySpace, is interviewed on the rise of user-created content.
And John Crupi, chief technology officer of JackBe, takes Web 2.0 fully into the enterprise context by describing the increasing importance of mashups.

Finally, the article “Bringing order to chaos” looks the furthest out. In this article we describe the adoption patterns of business intelligence, business process, and business rules software and the potential to consolidate those three tools into a packaged suite.

I welcome you to these pages dedicated to understanding the future of information technology in the enterprise. And I want to hear from you! Do share your own thoughts and experiences with us as together we accept the ongoing challenge to leverage technology to create more-competitive, more-innovative, and more-customer-friendly enterprises.

Paul Horowitz
Partner
Technology Solutions Leader

1998: Globalization… 2008: Continuous change

Are agile management approaches the recipe for success in the face of change? How can IT contribute to making management more agile?

Globalization. Business-IT alignment. Today, they’re givens, but 10 years ago, not so much. In 1998, new forces of globalization were reshaping the business environment, generating new competitors, and demanding new sourcing strategies and markets. At the same time, three other trends were at work: Y2K readiness and e-business initiatives dominated IT spending plans. And “big-bang” enterprise-resource-planning (ERP) deployments defined new roles and departments for business and IT while making others obsolete.

It was a perfect storm that for many companies resulted in improved alignment between business and technology. IT systems ultimately redefined business models, culminating for successful companies in the extended virtual enterprise, an outcome that most did not fully anticipate when they started. Arriving at a successful outcome was often a long, drawn-out process.

Today, we find the enterprise on the cusp of a similar shift. Major business forces are fueling strategic IT investments that promise to transform the enterprise. Instead of a specific change agent, such as globalization or Y2K, the emerging business driver is change itself and the continuous nature of change. And instead of e-business and ERP packages, new business performance platforms and interactive Web technologies are the tools of choice.

Ten years ago, the enterprise and with it the IT organization viewed enterprise transformation as a short-term, tactical process: define requirements, design a solution, implement the change, and retire the project. Even though these projects would usually run for multiple years, the assumption was that requirements set in year one would still be viable at the conclusion of the project in, say, year four.

The new imperative of managing continuous change is the business driver that, together with key technology trends, will fuel the next enterprise transformation.
Initiatives that improve management of IT complexity, leverage Web 2.0 capabilities, and adopt more efficient business performance platforms will be the key technology drivers. They will enable agile enterprise management and spur the development of new business models, organizational designs, and competitive responses.

[Figure 3: Overview of agile management infrastructure, spanning highly centralized leadership to devolved leadership and authority. Source: PricewaterhouseCoopers, 2008]

For example, IT complexity frames all discussions of new technologies. The potential benefits that enterprises can generate from Web 2.0 and an intelligent business performance platform depend on the degree to which an enterprise has come to grips with IT complexity. The intelligent business performance platform will support Web 2.0 technologies such as mashups by exposing data and reports as Web services. Large quantities of user-generated data on company blogs and third-party “advisor” Web sites will be useful only if business intelligence analytics can make sense of the data.

IT complexity and business processes

IT infrastructure was never simple, but several forces have amplified complexity in recent years. Mergers and acquisitions, the drive for early adoption of strategies such as virtualization, wireless and remote IT deployments, and other factors have contributed to complexity. Reducing IT complexity begins by addressing the unintended consequences of past investments in technology. The haphazard accumulation of IT assets that have become complex and inflexible is the biggest roadblock to a distributed organizational leadership and authority scheme.

IT complexity will only increase as enterprises take on new initiatives such as the adoption of service-oriented architectures or master data management techniques. The key for enterprises will be to move from IT complexity remediation to ongoing complexity management.
This will demand an explicit plan of attack for sampling, adjusting, promoting, and standardizing on innovative technologies.

Historically, CIOs have had a rough time making a purely business case for reducing IT complexity. But the push for management agility in response to continuous change will move enterprises to the tipping point. When time to market is weighed as a part of the business case, then agility can be measured as a return on IT investment. With a proper IT complexity management strategy in place, the stage is set for adoption of innovations like Web 2.0 that promise to add value to the organization.

One of the main causes of IT complexity is the lack of enterprise maturity in understanding its business processes. IT complexity rarely operates in a vacuum; rather, IT systems can be no more orderly than the processes they enable. It is not uncommon for large enterprises, especially those created from many mergers and acquisitions, to maintain small but significant differences between fundamentally identical business processes. These differences are then reflected in the applications supporting them. The result is a growing mountain of applications that must be maintained, even though the differences in process create no actual business value.

Here IT can help clear the way for change. An intelligent business performance platform provides suites of tools to model and modify processes, to measure them, and to analyze what the measurements reveal. Leading companies today use business analytics and business process modeling tools to identify the highest performing business units, correlate value to a specific process,

and standardize on these proven processes. Inevitably, these process redesigns allow enterprises to retire redundant applications, a key contributor to IT complexity.

Inside Web 2.0

The adoption of Web 2.0 and its host of social technologies has been a phenomenon few could have missed. What hasn’t been fully established is how enterprises will define and measure the business value of blogs, wikis, social networking, mashups, tagging, folksonomies, and other hallmarks of Web 2.0.

Web 2.0 technologies help mitigate two important challenges in achieving agile management:

• Web 2.0 capabilities address the need for cultural change by enabling more open, multiway conversations across the enterprise and beyond.
• Web 2.0 capabilities help solve the challenge of communicating with a large number of business-unit leaders who are typically geographically dispersed.

Agile management requires high levels of communication and collaboration. In enterprises that have hundreds or thousands of business-unit leaders all over the globe, traditional meetings for analysis, planning, and decision making would be overwhelmed. Virtual meetings, internal blogs, enterprise equivalents of MySpace or Facebook pages for connecting and sharing business-relevant documents, and widget environments for quick development and sharing of business analytics are some examples of how Web 2.0 will support agile management teams. However, the multiway, public nature of blogging, wikis, and other new social technologies will require a shift of mindset at the senior management level before enterprise Web 2.0 can have an impact.

Beyond blogging and other communications-oriented Web 2.0 technologies is an emerging form of user-generated programming technique called “mashups.” Mashups typically rely on Web services to combine multiple sources of information and process logic to create new Web applications. Some of the most useful mashups for agile management will combine business information and analytics.
The intelligent business performance platform will support mashups by exposing data and reports as Web services. Mashups will be most useful when they allow business analysts to combine business intelligence analytics with other, perhaps external, sources of data.

Once a valuable management tool has been pulled together as a mashup, an agile management team can use the self-publishing capability of social networks and blogs to share it widely—both the information and the business logic. Others in the organization can then build on or adjust the business logic for their own purposes. (See the sidebar, “How agile management could work in practice,” on page 8.)

Dealing with this plethora of internal communication and collaboration, comments and requests from customers, and blogging reviews by outside experts is likely to be one of the challenges for command-and-control-style management. Business-unit leaders at the edge of the enterprise will need a platform to assess these ideas, the ability to test them against the constraints of the higher-level strategy, and mechanisms for directly introducing process changes in their organizations. In short, they will rely on an intelligent business performance platform.
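At its core, a mashup of this kind is a join of two service payloads on a shared key. The sketch below is a minimal, hypothetical illustration: the two “service” functions stand in for HTTP calls to an internal analytics Web service and an external market-data feed (both invented for this example), and the mashup merges their JSON records by region.

```python
import json

# Hypothetical JSON payloads standing in for two Web services: an internal
# business intelligence service and an external market-data feed.
def internal_sales_service():
    return json.loads('[{"region": "EMEA", "revenue": 120},'
                      ' {"region": "APAC", "revenue": 95}]')

def external_market_service():
    return json.loads('[{"region": "EMEA", "market_growth_pct": 2.1},'
                      ' {"region": "APAC", "market_growth_pct": 6.4}]')

def mashup():
    """Join internal analytics with external data on the shared region key."""
    market = {row["region"]: row for row in external_market_service()}
    combined = []
    for row in internal_sales_service():
        merged = dict(row)                            # start from the internal record
        merged.update(market.get(row["region"], {}))  # enrich with external data
        combined.append(merged)
    return combined

for row in mashup():
    print(row)
```

A real enterprise mashup would add authentication, caching, and a presentation layer, but the data-combination step it performs is essentially this join.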

The role of intelligent business performance platforms

An intelligent business performance platform supports applications that monitor and manage critical business processes and outcomes. It is a way to characterize the emerging suite of functionality combining business intelligence, business process management, and business rules management. This emerging platform will focus on automating and enhancing dynamic, knowledge-driven processes rather than low-level transactions. Examples include merger integration, strategy management, budgeting, enterprise risk management, governance and compliance, and many industry-specific activities.

Business intelligence and business analytics are key components. Intelligent business performance platforms not only encourage a company to formalize its business policies, but also can execute those policies automatically, rapidly, and consistently, allowing staff to focus on creative solutions to exceptional circumstances.

An intelligent business performance management platform combines centrally created business analytics services, business process analysis tools, business process design, execution, and monitoring capability, and business rules engines. This is an area where the IT organization can add value by standardizing and supporting these tools.

The fundamental reason for pursuing an agile management strategy is to increase the overall agility of the enterprise. Agile management will not succeed if business units have complete flexibility to simply do as they please.
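The policy-execution idea can be sketched as a toy rules engine: policies are formalized as data (a condition plus an action) rather than buried in application code, so they can be applied consistently and changed without rewriting applications. Every rule name, threshold, and action below is invented for illustration; a production business rules management system would add rule repositories, versioning, and audit trails.

```python
# Toy rules engine: each policy pairs a condition with an action.
# The rules and thresholds here are hypothetical examples.
rules = [
    {"name": "large-order-review",
     "condition": lambda order: order["amount"] > 50_000,
     "action": "route to credit review"},
    {"name": "auto-approve",
     "condition": lambda order: order["amount"] <= 50_000,
     "action": "approve automatically"},
]

def evaluate(order):
    """Return the action of every rule whose condition the order satisfies."""
    return [rule["action"] for rule in rules if rule["condition"](order)]

print(evaluate({"amount": 75_000}))   # prints ['route to credit review']
print(evaluate({"amount": 4_000}))    # prints ['approve automatically']
```

Because the policies are data, changing the review threshold is an edit to the rule set, not a code release, which is what lets staff focus on the exceptions rather than the routine cases.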
The ability of managers at the edges of the enterprise to make the best decisions can be immeasurably enhanced by business performance management tools. Business-unit leaders must have access to comprehensive information about internal processes, business outcomes, partner processes, customer preferences, and broader market conditions and forecasts.

Additionally, if these informed and empowered business-unit leaders determine a change in course is warranted, they need the tools to make business process changes directly. Putting in a request to an overwhelmed central IT department will only result in delays that threaten the success of the business opportunity.

Looking forward: Enterprise adoption and risk management

Several indicators suggest that IT spending will be constrained in the next five years, especially when compared to spending a decade ago. IT organizations are often told to “do more with less.” And global economic conditions in the near term will forestall new capital investments. The investments in technologies needed to support improved agility will be gradual and predominantly funded from IT cost savings that accrue from efficiency gains. New pricing and provisioning models, such as software as a service and cloud computing, will augment the outright purchase of IT assets. The big-bang implementations associated with ERP adoption—with associated high costs—will not be repeated.

IT trends for managing complexity, adopting Web 2.0, and building out intelligent business performance platforms show different adoption patterns.

A substantial percentage of enterprises are already reducing the complexity of their IT systems. The problem is widely recognized, and the value proposition for taking on complexity is becoming increasingly clear. What’s less clear is how far along enterprises are in addressing complexity as an ongoing management priority rather than a series of one-time projects.

Enterprise adoption of Web 2.0 technologies, such as blogs, wikis, mashups, and other social technologies, and of the collaborative, open environments Web 2.0 really represents is accelerating. Unlike complexity remediation projects, the connections between Web 2.0 investments and enterprise value creation are not yet well established. As a result, the adoption patterns for Web 2.0 in the enterprise are more experimental and departmentally focused than enterprisewide. Not all risks are well understood. For example, how should an enterprise monitor public blogs for storms of negative comments about enterprise products or services?

Further out still is the trend toward adoption of an intelligent business performance platform. In fact, the consolidation of tools defined by business intelligence, business process management, and business rules management has not yet occurred. However, the investments by such companies as IBM, Oracle, and SAP to acquire and develop the full range of capabilities described later in this forecast demonstrate that this trend is already in place. (See “Bringing order to chaos,” page 60.)

Rate of change and risk

Will the business case for addressing continuous change drive adoption of these new IT services and solutions? Or will the perceived risk of attempting a significant transformation of the enterprise outweigh its perceived benefits?

Vendors may focus on sophisticated use cases for the distributed management of strategy, authority, and accountability.
They will develop compelling virtual environments that support management teams in

PricewaterhouseCoopers and technology forecasting: An interview with Sheldon Laube

Sheldon Laube leads PwC’s Center for Advanced Research (CAR), where he directs research and development on business problems that have no known solution in the marketplace. Laube first joined Price Waterhouse in 1984 and later became CIO. He left the firm in 1995 to found two successful startups and returned to lead PwC’s Center for Advanced Research in 2003. Laube discusses the reason why technology forecasting became important to Price Waterhouse 20 years ago and the challenges that technology forecasting presents.

PwC: You were there at the beginning when management at Price Waterhouse decided to publish the Technology Forecast book. How did this idea get started?

SL: It goes back to the late 1980s. The consulting business of what was Price Waterhouse at the time figured out that success came from larger engagements, but larger meant higher levels of complexity, with major technology and systems integration components. The focus of our practice at that time was around mainframe solutions; new technology architectures based on UNIX and client/server caught some parts of our practice by surprise. This raised a concern among our practice leaders: “How do we keep our consultants informed about the latest trends in technology?”

At the time we had a research group in Menlo Park, California, staffed with PhDs in computer science. I suggested they be given the task of producing an annual publication—to be used for internal training and knowledge development—that would describe “what’s coming next” in technology. We called it the Technology Forecast. After the first issue came out, it

was quickly recognized as having external as well as internal value. The Technology Forecast was a great mechanism for telling our clients that we cared about the future of technology, that we spent our own time and energy and money developing a perspective on the future of technology. (See Table 1.)

Other firms have their own ways of signaling the market that a topic is of interest to them. Coming from an accounting firm set a different context. CPAs as certified professionals have a public responsibility to invest in extending knowledge of accounting. Price Waterhouse published thought leadership as a sort of public responsibility, an effort to perfect accounting principles and practice by contributing to public knowledge. It was a natural extension for Price Waterhouse to want to do the same with technology knowledge, especially if it had a goal of being “objective.” And the process that created the content was an aggregation of many points of view in an effort to have an objective perspective.

It was really a learning process. Having a point of view based on bringing together a wide range of expertise and synthesizing it into a coherent whole has tremendous value. That’s because the issues being dealt with, the impact of technology on business and society, are not diminishing in any meaningful way. You can search on the Web and retrieve 1,000 separate individual perspectives. But editorial voice is still important; it was the synthesis in juxtaposition of what we chose to talk about and not talk about that defined value for clients. Each year there was a different set of issues.

PwC: Looking at the issues chosen for attention, clearly the Internet was a key topic. How did your perspective on the Internet at the time define where you looked for value, and what has surprised you in terms of how the Internet has evolved?

SL: In terms of the Internet, at USWeb our insight was that if there’s gold in the hills, the money will be made in tools and services to the miners.
We believed every business would have a URL. And that’s come to pass; that vision we got right. The thing we didn’t catch, as many didn’t, was the rise of personal Web sites and the notion of social networks and things of that nature. We didn’t understand the notion of social interaction through the Web and that it might be the driving force now going forward for the next 10 years, the Web 2.0 world. All of these, it’s just unreal, MySpace and Facebook, they’re very different ideas. And even the phrase “social interaction” doesn’t really capture it. You’re not really replicating a phone call on the Web; what’s striking is that you are capturing or “recording” in some sense your interactions and publishing them for others, even the public, to read, including a public you will never actually know personally. It really does represent a whole new way of interacting, in the same revolutionary way as we thought of Web sites as revolutionizing the enterprise. Web 2.0 is now revolutionizing social interaction, and no one even knows where that’s going to go.

PwC: How have these latest developments in the Web 2.0 world impacted the way you think about technology forecasting?

SL: I think the really interesting thing about the world of forecasting is to try to understand the things that you can actually predict and then realize that, just because you can see the logic of that, the range of possibilities of what might happen next because of it is beyond any normal person’s ability to extrapolate, to make the right choices about what will happen. That means you have to be very aware. You’ve got to keep watching for the trends of how those fundamental changes that technology enables get created and then instantiated in very clever and very unusual ways. I mean, just the transition of Facebook, for example, from an education-only to a commercial enterprise was not operationally predictable.

PricewaterhouseCoopers and technology forecasting

Our previously published books sought broad perspectives on trends of the day. We and our collaborators usually got it right, notwithstanding a number of surprises.

Table 1: Technology Forecasts 1998 to 2004: Highlights and surprises

Internet
On-the-mark forecasts:
• Steady growth for Internet access overall
• Total dollar size of the business-to-business Internet far surpasses the business-to-consumer market
Biggest surprises:
• Lack of success of wireless broadband, including low-earth-orbit satellite services
• Emergence of Web 2.0 and the importance of user-generated content
• Paid search fundamentally altering Web business models
• Impact of vertical Web products and services not constrained by their information content

Hardware and infrastructure
On-the-mark forecasts:
• Smart handheld devices and cell phones rationalize into a single device
• Importance of software applications in data center management and lowering total cost of ownership
• Linux operating system redefining cost of computing
Biggest surprises:
• The ubiquity of the personal computer, which did not decline in favor of more manageable, appliance-like devices
• Seriousness of the growing problem of IT complexity

Software
On-the-mark forecasts:
• The rise of enterprise suites for financial systems, product management and manufacturing, customer and supplier relationships, and sales and marketing automation
• Addition of e-business support and Web portal functionality to suites
• Integration of platforms and component architectures
Biggest surprises:
• Failure of knowledge management systems to achieve broad adoption in the enterprise
• Massive vendor consolidation in many areas
• The failure of the application service provider model in absence of a multi-tenant architecture

hyper-collaborative structures. But that does not guarantee market adoption.

The most significant barrier to the transition toward agile management techniques is the management of risk. Imagine, for example, how a large bank would react to the proposal that decision making be further devolved. But let’s face it: For most companies today, the imperative is to become more agile at decision making or face ever more daunting competition. New risk management investments must be linked to agility enhancement initiatives.

Today’s governance, risk, and compliance planning parameters reflect the existing top-heavy command-and-control structures in place in most enterprises. As the tools that enable more distributed-yet-coordinated management of strategy improve, a parallel development of how the activities of governance, risk management, and compliance are conceptualized and ultimately distributed will need to happen as well. Without effective controls in place, the steps to develop more agile management are likely to be seen as too risky.

However, organizations today must commit to managing the constant change that’s come to dominate global business. In the end, success will come to those who recognize and mitigate the risks as part of an overall strategy that unifies intelligent business performance platforms and Web 2.0 technologies while managing complexity.

In fact, the key to managing that risk may lie in the new technologies themselves. A collaborative, distributed model of accountability and of authority brings with it a more-comprehensive approach to managing risk. Says Henning Kagermann of SAP, “If you have this more agile strategy… and if you do it right, it’s even better.
The risk is higher, but the reward is as well.”

For more information on the topics discussed in this article, contact

Where to find bedrock

Henning Kagermann of SAP discusses IT’s dual role of providing a stable foundation while enabling innovation.

Interview conducted by Bo Parker

Henning Kagermann is CEO of SAP, since April 2008 jointly with Léo Apotheker. He joined SAP in 1982, initially overseeing product development in the areas of cost accounting and controlling. He has been on the executive board of SAP since 1991 and became CEO in 1998, until 2003 jointly with SAP cofounder Hasso Plattner. In this broad-ranging interview, Kagermann discusses the interdependencies of transactional and business intelligence systems, and he weighs the pros and cons of Web 2.0.

PwC: When you look back 10 years and recall the excitement of the time, with e-business and the huge investment in enterprise resource planning (ERP) suites, what conclusions can we draw today regarding the promise and potential for technology to contribute to business success?

HK: I have always been a little more conservative than the other players, particularly the players in Silicon Valley. If I look back, the status of e-commerce was obviously overhyped. Collectively, all players together caused a lot of damage to our industry by promising clients things they couldn’t fulfill, and I doubt we have recovered from that. In particular, the overhyped rip-and-replace messages were not good. At the end of the day, many of the topics turned out to be the right ones, but from a perspective of timing it was by far too early.

What I experienced—and I was sitting there sometimes, asking, “What about the customer?”—was that CEOs of key clients felt forced to do something because of all this hype, and then they did the wrong thing. They made investments because of technology hype, not because of business need.

We are in a situation now where we have learned our lesson. First of all, there is stickiness of legacy code and stickiness of clients, and that will not go away. Second, business has taken power away from IT.
It’s about business and having a good business case. The good news is: We can now deliver the functionality and capabilities promised by the earlier e-commerce hype, but we must continue to manage or hide the complexity.

PwC: So despite the hype 10 years ago, you see that the vendor community has pretty much finally delivered the technologies needed to transform companies into the virtual, networked enterprises promised by the e-commerce hype? Is there anything you would describe as unfinished work from that era?

HK: I feel the legal context is a barrier to the virtual enterprise. Obviously, there is more openness, more collaboration, more networking of businesses—all fine. On the other hand, we have more regulation, more rigid compliance frameworks like Sarbanes-Oxley, and more. Apparently little things where, if you make a mistake, you can put your company at risk. So integrity, reputation, sharing IP appropriately—these topics are getting trickier and trickier. As long as you’re doing things within your own enterprise, it’s a closed shop in terms of legal and integrity. You can change many things; to some extent you’re flexible. But if you move to a business network model with shared processes outside your enterprise, now legal complications can quickly multiply. You have to do business in a way so that you are not, let’s say, violating the rights of other partners. So this is, for me, one of the most urgent questions: How far will we go with the networked economy? Will regulation, will society push back here?

PwC: Do you see any specific technology trends that will drive enterprise transformation in the next 10 years?

HK: Many, many trends are coming, not one—whether you look at software as a service, open source, cloud computing, pervasive computing, next-generation Internet, in-memory databases, and others. No single trend will be the dominating one; transformation comes from the variety.

PwC: Has anything surprised you about Web 2.0 as it has developed thus far in the enterprise context?

HK: Web 2.0 is important, but not the next big thing. We should resist the tendency to overhype again. I must admit that communities are taking off more than I thought. We have our SAP Developer Network with more than one million participants. But if I look to the Business Process Expert community, where 280,000 people share knowledge about processes: that’s the real surprise to me. I thought it would work in the engineering world because that’s their nature, but I assumed consultants would not share knowledge, because it’s not their nature.
So it was my litmus test: I said, OK, if consultants start sharing knowledge, then something is happening. These are not just new ways of collaboration; they can even change the character of collaboration. That is a trend that’s surprising, and this is something with Web 2.0 that really adds value.

PwC: You mentioned the hype from technology innovators around the rip and replace of legacy IT. Hasn’t the add-on strategy led many enterprises to unmanageable levels of IT complexity? How is the industry going to resolve that, and how is SAP going to participate?

HK: There is only one low-risk option: You have to link system landscape consolidation to a business case. For example, if a company moves into new business fields, it can build the future platform in that new area. Start fresh, grow the platform, and then over time consolidate the other areas on that platform. That is a compelling business case, and it makes sense. So, pick an opportunity where you have a case, and then build the future. That is much easier to do today, because the new technologies are far more open. You can consolidate the legacy at your preferred pace. And many companies are doing it that way. They consolidate the landscape over time and combine it with a business case.

PwC: How will companies avoid running into this creeping complexity problem again?

HK: I’m afraid it’s the nature of business and IT. It’s always the same pattern. Customers look for an IT partner who is a reliable, long-term partner with a high profile of engineering excellence, integration, globalization, quality, and support. That’s what companies are looking for—assurance that their investment will be safe for the next 10 to 15 years.

So let’s assume we are such a partner, and I really believe we are. In that case, you give up some speed and entrepreneurship by definition. Why? If you have this reputation and work hard to deliver on your clients’ expectations, you cannot quickly offer a great new product that is not integrated or proven and works only in North America and the UK and nowhere else; it damages your reputation, your brand. So in order to ensure the reliability and high quality customers expect from their long-term partner, in certain areas you will be slower than some startup in the Valley.

Now say you have these nimble, quick guys, 200 people, very innovative. Nobody complains if they come up with a solution that is not localized. No one expects it from them. Great. And because it appeals to the users, in some departments they will say, “I need it. I want it. Give it to me now.” So they increase the complexity of the internal IT partner, and two years later they will look and say, “Gosh, why have I done this? Do I really want this?” In the end, they will turn to their long-term partner to take care of it. They come to you and ask you, and then you say, “Yes, we can do it, but it will take a little longer,” because they want you to localize it for Costa Rica, for Nigeria, for China.
You cannot say, “Look, this is not a market for me.” They will answer, “But you are a global company with products for the global economy.”

PwC: So what you’re saying is that there’s been a failure to look at the total cost of ownership. Companies look at the project cost, but not the total cost for the whole enterprise. Are companies becoming more aware of this as they adopt new technology this time around?

HK: No, it’s once again going in the other direction. This has to do with the need, demand, and push for businesspeople to be successful. Most of them are still looking to their silo and saying, “I need this in order to do my job.” In certain areas the CEO will give up and say, “Let them do it and consolidate later.” It’s a tricky situation. Only a few companies are extremely disciplined and do not allow it, but those are mostly companies that go for efficiency and operational excellence. If you look at the companies that are focused on being a customer relationship master or that want to be an innovator, they adopt technology without considering the whole cost picture.

PwC: For some companies, then, it’s a tradeoff. They cannot afford to wait for the fully engineered, globally localized, highly reliable innovations without being challenged by competitors. So should they proceed with their “eyes open” and then have the discipline two years later, when the opportunity is there, to clean up—to remove the complexities that have ceased to be associated with value add?

HK: You have to clean up. That’s the key point. Of course, you have to innovate, and if you do it fast, you will add complexity. But, in parallel, you have to clean up those things where it’s no longer worth it to differentiate yourself against others. This is a kind of sedimentation approach to technology, where you allow for complexity where it pays off, but after some time everyone will have the same capability. Then you have to clean up, so that your non-differentiating processes are supported by a very clean backbone.
Because there’s only a certain level of complexity you can master, and that is something companies need to be aware of. Some companies can master a little more, others less, but it’s not endless.

PwC: We’ve been talking about IT complexity as if IT was off on its own building these complex systems, when, in fact, they’re responding to business units wanting to support a process in a new way or do a new process. Do you see a focus on process excellence and process modeling capability? Do you see executives actually engaging with business process modeling tools and making that the driver of innovation? Because that’s a way to reduce complexity in IT—to start by reducing complexity in the process, right?

HK: We see two trends. One is to bring the knowledge about processes closer to the business by using business process management with integrated modeling tools. We are in the process of switching from purely descriptive models to active models. You can simulate the impact of a business process change immediately and include the business expert in the design process. Of course you cannot change everything, but you have a sufficient level of flexibility built into the process model. In navigating through the model, you can see what you can change and what you cannot change because of regulatory or business integrity reasons.

For example, if somebody tries to change the revenue recognition process, the built-in compliance will ensure that the change conforms to accounting standards. If you have a modeling environment that knows these constraints but still allows process flexibility without violating them, it will be highly valued by business units. Especially if, when businesspeople ask, “Why are we doing this here?”, the model explains the constraints but also shows that they have some process options. They can ask, “Let’s see what happens if I take another option,” and you can show it without any development effort or delay.

PwC: But most executives are not conversant today in business modeling terminology and notation.

HK: I think it’s not necessary that you as an executive understand the model and work with it. The key is the speed. A consultant may give you an overview and use it. You can ask questions, and then the consultant translates them into change, and you see the results immediately and make your decisions. Customers like this approach.
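In sketch form, a constraint-aware process model of the kind Kagermann describes might look like the following. This is a minimal illustration of the idea only; all names are hypothetical, and no SAP tooling or API is implied:

```python
# Minimal sketch of a process model with built-in compliance constraints.
# Hypothetical names throughout; illustrates the pattern, not any product.

from dataclasses import dataclass, field


@dataclass
class Step:
    name: str
    locked: bool = False  # True = protected by a regulatory/integrity constraint
    reason: str = ""      # the model can explain *why* a step cannot change


@dataclass
class ProcessModel:
    steps: list = field(default_factory=list)

    def try_remove(self, name):
        """Return (allowed, explanation) for a proposed change."""
        for step in self.steps:
            if step.name == name:
                if step.locked:
                    return False, f"'{name}' is constrained: {step.reason}"
                self.steps.remove(step)
                return True, f"'{name}' removed; process remains compliant"
        return False, f"no step named '{name}'"


# The revenue recognition example from the interview: compliance is built in.
order_to_cash = ProcessModel([
    Step("capture order"),
    Step("recognize revenue", locked=True,
         reason="must conform to accounting standards"),
    Step("send courtesy email"),
])

allowed, explanation = order_to_cash.try_remove("recognize revenue")
print(allowed, "-", explanation)  # the model explains the constraint

allowed, explanation = order_to_cash.try_remove("send courtesy email")
print(allowed, "-", explanation)  # flexible steps can be changed freely
```

The point of the sketch is the explanation: the model does not merely refuse a change, it says which constraint applies and leaves the businessperson free to try other options, mirroring the “Why are we doing this here?” exchange above.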
They feel like co-designers.

Maybe businesspeople of the future will read models like books, instead of, let’s say, a 400-page MBA-type book. It’s a matter of education. There are people who understand mathematics and can read formulas very fast; other people can read music the same way. It’s a similar concept. Models are an abstraction; they are condensed—a kind of language that describes business processes and rules. If you’re educated this way, I think you’re pretty quick to see if a model is different or not. If you see it the first time, you say, “OK, why should I do that?”

PwC: But there’s also the level of granularity, where the model you’re looking at is appropriate to the role you have in an organization. Do you see this also as an opportunity?

HK: Yes, because a lot of people speak about semantics these days. You have the semantic Web and all those things coming. From my point of view, that’ll really be a breakthrough—more than Web 2.0—because it tries to give things a meaning, which will greatly facilitate collaboration. What happens if you get two drawing files from two engineers, let’s say from Daimler and BMW, for example, and they want to collaborate—can they interpret each other’s drawing files?

If the semantics are the same—the meaning—you understand it immediately. Mathematics is like this. If you have a mathematician from India and another from China or Germany, they would understand a mathematical formula, although it has taken centuries to get there. And music is the same. But we don’t have it on this level for business process modeling, so first of all a universal modeling language is required. This is something people are working on. In the world of service-oriented architecture (SOA), we manage business connectivity with services. However, it will take some time until we evolve a collection of services into a lingua franca of business.

PwC: There seems to be wide recognition today of the value of standardizing on a single ERP platform and single instance of that platform. How far does this go as you focus more on business intelligence platforms?

HK: We always believed that you cannot define one set of metadata for transactions and another set of metadata for business intelligence, because they are interconnected. Software will continue to replace work that is routine in an enterprise. The way people work will be more and more driven by exceptions, by insight, by analytics, and not by just doing transactions. That’s behind us. The more routine the work is, the more it can be described by rules, and the more we can replace people’s work with software. We minimize human interaction in this area as much as possible, because it’s not value adding.

When do you need human interaction? If there are failures, first of all, and we hope we have few of those. Second are exceptions—something unexpected happens, and you have to solve it.
This is when collaboration is needed, where, for example, a supplier couldn’t deliver, but production has to go on. What to do now? And even here you can have some rules and templates to support teams. And finally it’s about the unforeseen, or what we call intelligence, where people look for new upsell opportunities in their customer base, turn strategy into operational plans, or prepare for entering new business opportunities.

When you automate all the other stuff, people have time for this. That means intelligence, and business intelligence is initiating transactions more and more. So transactions might be at the end, where you say, “I made up my mind, and I’m going to make these price changes in these territories to maximize margins.” You cannot always divide transactions from intelligence. Therefore, over time it will be more integrated. In the future, designers also must have analytical options in mind when they create new applications. This is a kind of convergence, and the same will happen for the collaborative aspects of applications.

PwC: Right now there really are three different markets for the technologies we have been discussing: the business intelligence market, the business process management market, and the business rules engine market. Where are you seeing the transition to a consolidated platform of all of those coming together? Is it going to be driven by the need for semantic consistency, or is something else going to be driving it?

HK: We promote this concept of “closed loop,” where we say these things will be loosely coupled for a while. Returning to the concept of big-bang versus evolutionary consolidation: to drive it into large clients, you can’t come in and say, “I have a great new idea, but first you have to rip and replace all the stuff you already have.” Companies will answer, “Look, I have all these processes in place. I don’t want to replace them, but please augment them with analytics, give it to me, and connect it.” Therefore, you need the decoupling, but I think you have to connect them in a way that generates more value for the clients. We are saying, “If you have a very modern, consistent, and complete business intelligence platform, for example, you can continue to extract data from everywhere, from your data warehouse—it’s all fine. But you could also improve the underlying process platform by firing exception events into this platform, saying this has to be solved, and trigger those things in more real time.” This is a loosely coupled integration, but it’s very worthwhile. If you want to be smart in doing it, you have to understand both sides: the process platform and the intelligence platform. So we see some connectivity, some kind of integration, but it’s a different way. I’m not talking about everything in one big box.

PwC: As enterprise software moves forward, solving semantics problems, dealing with complexity at a business process level and not just a technology level, and as we move toward more collaborative and open environments—what Web 2.0 really represents—how do you see the SAP platform evolving?

HK: I think there’s been an interesting reversal of sorts in technology. In the last 10 or 15 years, if new technology changed, for example your programming language or your database language or your operating system, then the application adjusted. Now it’s more the opposite: the applications live longer than parts of the technology. More and more it’s becoming like an aircraft, where you have the same aircraft flying 20 years, but the engines are different. In the meantime, you have Internet inside, and so you constantly incorporate new technologies.
And we feel that’s the way larger applications in businesses will evolve, becoming more adaptive to new technologies. That means they have to be more modular, with better interfaces, etc. I believe that’s the trend. If you view it this way, and if you drive in this direction, then you avoid the surprises of new technologies that you can’t see today. This is very important. I believe we cannot anticipate everything that’s coming in two years. But I can assure my clients that whatever happens, I can take advantage of it in a way that insulates them from disruption—no sleepless nights. This is a better approach. So many things are happening, we shouldn’t assume we can anticipate them all.

PwC: So it’s a new sense of agility, is what I’m hearing you say. Not only do companies themselves need to be agile—obviously, that’s been increasingly the case, and you are a company, so you need to be agile—but the product you give companies needs to be agile.

HK: Yes.

PwC: And that is a different way of thinking about a product, because in many ways there’s been great value for technology companies in making products not agile, because that creates customer stickiness.

HK: Yes, and that’s a big mistake. We came to the conclusion a few years ago that you have two choices. You can select a lock-in strategy, which has given many companies a high return in the past, so it’s OK that some people believe this might be a better strategy. I don’t agree, because it’s more a question of whether you feel you’re strong enough to survive in a truly open world. But if you have this more agile strategy that we’re just discussing, and if you do it right, it’s even better. The risk is higher, but the reward is as well. Because you have no lock-in; you invite others to play around and take business away from you. On the other side, it keeps you agile. You always have a second chance. If competition is faster or better, eventually the innovation becomes standard fare and clients want it in their stable platform. If you are a trusted partner and not complacent, you can do the sedimentation and come up with a robust and long-lived solution even two or three years later. •
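The loosely coupled “closed loop” Kagermann describes—a business intelligence platform firing exception events into a process platform rather than the two being merged—can be sketched as a small publish/subscribe arrangement. The sketch below is illustrative only; the topic names and payload fields are hypothetical:

```python
# Sketch of the loosely coupled "closed loop": a BI layer detects an
# exception and fires an event into a process platform, which triggers
# a resolution workflow. Hypothetical names; pattern illustration only.

from collections import defaultdict


class EventBus:
    """Minimal publish/subscribe coupling between the two platforms."""

    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self.handlers[topic]:
            handler(payload)


bus = EventBus()
triggered = []

# Process platform side: subscribes to exception events, starts a workflow.
bus.subscribe("supply.exception",
              lambda e: triggered.append(f"workflow: re-source {e['part']}"))

# BI platform side: analytics spot an anomaly and fire an event in near
# real time instead of waiting for a batch report to be read.
inventory = {"part": "P-100", "on_hand": 3, "reorder_point": 25}
if inventory["on_hand"] < inventory["reorder_point"]:
    bus.publish("supply.exception", inventory)

print(triggered)  # the exception, not a person, initiated the transaction
```

The two sides share only the event contract, which is what keeps the coupling loose: either platform can be replaced or upgraded independently as long as the events keep flowing.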

The era of CIO accountability

Lakshmi Narayanan of Cognizant provides a look at the global delivery model and the CIO’s blended business and IT role.

Interview conducted by Vinod Baya and Hari Rajagopalachari

Lakshmi Narayanan has been with Cognizant since 1994: He joined as CTO and was CEO and president from 2003 until 2006. Now as vice chairman, Narayanan travels extensively in the United States and Europe to meet with clients of Cognizant. He is a member of the board of the US-India Business Council and, until recently, was also chairman of the board of the National Association of Software and Services Companies (NASSCOM). In this interview, Narayanan offers insights on IT developments in the past decade, from the Y2K phenomenon, to the rise of the global delivery model, to the impacts of enterprise 2.0 trends.

PwC: As you think about the evolution of IT during the past 10 years, what aspects really surprised you?

LN: What surprised me, first and foremost, was the mobility of data and information. We really didn’t anticipate mobility in the manner in which it is happening today. We knew that the signs were there: credit mobility, capital mobility, commodities mobility and, to some extent, time mobility. But the data and information that corporations were using pretty much resided in specific locations. Then that started moving about. So people had access to both proprietary and public data, which could be put to use in an enormous number of ways. That is a significant difference. Combine that mobility with the new technology of computing power—the solid-state physics of process memory and the iPod type of way you can carry 20 gigabytes of data. It’s conceivable that you can carry your corporate database on your laptop every day.

The related surprise was the ability to carry or deal with data from anywhere in the world. It really surprised us, as did the many ways of putting it to use.
For example, today’s cardiogram machines and CT (computed tomography) scan machines have such a huge capacity to store images, and, once collected, these images are available for telediagnostics from anywhere in the world. Just as there are Internet kiosks today, it’s conceivable that in the near future you will have some of these clinical kiosks at roadside corners. You just walk in, get your scan done, and then be diagnosed by a doctor sitting somewhere in the Philippines or India or China.

PwC: Do you foresee any changes that might have a similar impact in the next 5 to 10 years?

LN: It’s essentially the extension of what I have already said. Some equipment in an automobile, for example, has a chip that is communicating. If it doesn’t perform properly or if it doesn’t complete a task in a certain amount of time, it transmits a message. In India, particularly, if you look at the rate of growth in automobile sales, you have to ask, “How are all these cars going to be serviced?” The new method will direct the automobile diagnostics to a central place, allowing one to control and monitor the state of the automobile. The same thing applies to medical equipment and other such segments. All functioning equipment will be able to communicate constantly.

The other change is the miniaturization that’s happening. What’s happening to storage will happen to processors. It’s already in the works: startup companies are looking at servers that use only 55 percent—or even 15 percent or 20 percent—of the energy that they consume today. These efforts will yield compact servers or green tech machines. With less power required, you no longer need huge data centers and centralization. This will lead to new applications using the massive computing power that will be available. The only bottleneck that I see is bandwidth: You have the storage, you have the computing power, but the communication bandwidth is not expanding as rapidly as it should.

PwC: That’s interesting, because the story of the past 10 years has been about deployment of broadband. So you’re saying that it is not good enough—particularly compared with the advancements in storage and computing power.

LN: Correct. Today, everybody is streaming. Streaming video is common. Most people want to sit in an office and monitor their home, or sit at home and monitor the office. All these require constant inputs. Insurance agencies in the United States, for example, have these devices on automobiles that record every 30 minutes and keep writing over the recorded data. Anytime there is an accident—boom—the video is available for everything: for evaluation, for damage estimation, for calling up the body shop, for processing the insurance, and things of that nature.
That capability requires a lot of streaming, and we don’t have sufficient bandwidth today for such applications. It’s not a technology constraint. Unfortunately, it’s an issue involving government, telecom, bandwidth, and spectrum—not only here, but all over the world.

PwC: Talking about the industry Cognizant operates in—the global service delivery model has made robust progress during the past 10 years. How much of this growth were you expecting, and how much of it is a surprise?

LN: The global or remote delivery model has been in the world for the last 20 to 25 years, and it was growing at a rather slow pace. In the last 10 to 12 years, the pace has really picked up, primarily because of the talent shortage elsewhere. That’s been one of the driving forces behind the expansion of this particular industry. Now, the element that contributes to rapid growth—apart from the availability of talent—is the outcome. Successful delivery of solutions to businesses the world over—the consistency with which it has been delivered, the high quality, and the lower cost—has significantly contributed to the rapid growth.

What we didn’t anticipate was how long we could push out the cost advantage. In 1994-95, it was predominantly a labor arbitrage kind of model; that was the primary reason. And people asked how long the cost advantage would continue. Even then there was wage inflation, and not enough people were available. The best guess then was that we definitely would not have any challenges for the next four to five years, but not so thereafter. By 1997-98, the answer to that question was still the same, but it was increasingly apparent that it would be difficult to sustain that kind of advantage for long because the costs were increasing.

Now, after about 12 years, if you asked me the same question, the answer would be the same: We would not face any challenges for the next four to five years. So we have been able to push the cost advantage outward. We thought it would last for only about four or five years or, at best, 10 years. It has been pushing out.

PwC: So you don’t think we’ve passed the limits of supply-side elasticity on labor yet.

LN: It’s not due to supply-side elasticity. There are various factors that complicate this. In terms of newer, better tools, productivity and efficiency have improved. The ability to put together a different type of model to tackle a problem has improved. Getting talent from different places to work in a highly collaborative way—almost a global collaboration—is something we had not anticipated.
All those factors of global collaboration—whereby minds from different parts of the world are collaborating on a single problem, and tools are available to implement the solution far more rapidly today—have led to the sustainability of this kind of advantage—the cost advantage—although wages in India have been increasing 10 percent to 15 percent in the last 10 to 12 years, and the corresponding wage increases in the markets we serve, such as the United States and Europe, have been no more than 3 percent to 5 percent every year. So just mathematical logic will tell us that at some point, these two curves will have to meet. But they seem to be going on almost in parallel in our area, primarily because of the productivity improvements and the tools and the technologies that have become available.

PwC: Global IT service delivery has obviously matured a lot more than business process outsourcing, or BPO. Can you talk about the development of the BPO market vis-à-vis IT services? What needs to be done to make it more mature? What are the barriers? What do you see after that?

LN: The BPO industry is clearly not as mature as the technology industry, but in the limited time that the industry has existed, it has made tremendous progress. The BPO industry is just about four or five years old. It started in the captive mode and then transformed into an industry of third-party service providers. To that extent, these companies have done well. The processes have been streamlined and so on.

For this industry, serving customers—whether internal customers or external customers—is the mantra. But the customer orientation of BPO companies and the impact of their work are not very well understood, because the context is not very clear. People are very unfamiliar with the markets that they’re dealing with. The way things operate in the US, for example, is not clear. Here, people are providing contact and customer service support, which is a very process-driven approach. They will deliver on SLAs (service-level agreements), but they will not deliver a unique customer experience. As a result, you will have challenges if the industry is to scale to the next level, to meet all customers’ challenges and service requirements. Companies that have demonstrated that kind of mindset and history have not come out into the open yet, which is the biggest challenge.

PwC: How much of a bottleneck is the fact that companies have not yet started investing in domain competencies—for example, F&A (finance and accounting)?

LN: That would be a fair assessment. Under normal circumstances, you can categorize, say, F&A as a horizontal activity in that you’re doing the receivables, payables, accounting, bookkeeping, and those types of activities, but the moment a company is engaged in an M&A (mergers and acquisitions) activity, there is a certain level of domain specialization that has to come about.

For example, we do alternative fund accounting for one of our customers. That’s highly specialized. It’s not something that can be done as part of your regular accounting work. You have to set up a separate team and you have to have qualified people who understand that business. That is the point about domain-specific competencies.

Are there projects under way right now that leverage domain competencies? Yes, but not as many as those in the horizontal BPO space. However, the sustainable opportunities are in this area.
In my view, horizontal BPO will eventually get automated out. That’s the purpose. For example, you might start with a 100-member team, but with the application of technology, finally, you are no more than a 10-member team. All you’re doing, in addition to delivering the service, is shrinking that team—you’re making it more efficient, you’re automating the processes. On the domain side, some amount of shrinking is possible, but then newer applications are going to come about because the growth will constantly be there. The migration from this horizontal BPO to domain-specific BPO has been gradual, because horizontal BPO is clearly the low-hanging fruit.

PwC: In the past you have been vocal about the shortage of talent in India also. How do you see that impacting the global delivery or BPO trends?

LN: Let me share an anecdote. One of the customers of a BPO company doing F&A work told the company representative: “You have a team of about 80 people working on this problem for me. I could do it with 40 people in my country if I had the right 40 people, but unfortunately I don’t have them. Why is it that you are not as efficient as the 40 people?” The company representative’s answer was: “In the USA, you could do it with 40 people. If I had the 20 or 30 right people, I too could do it with them. The fact of the matter is that neither you nor I have access to those 20 or 30 right people. We have to accept that all we have is these 80 people. Three or four years down the line, out of these 80, I’ll get the right 30 or 40, and those are the people who will be working on it going forward. But today’s context is different.” That captures the situation quite well. Things are taking a lot more time and entailing larger teams. As a result, there is a greater demand today for a larger number of people. The IT industry here in India provides employment to 1.6 million professionals, and that number is expected to go to about 2.4 or 2.5 million by 2010. We’re talking about a shortage of close to 400,000 people because we don’t have enough people.

While we are setting up finishing schools and new universities are emerging in order to meet this demand, there is a revolution in progress where these 1.6 million people in the industry are also getting more and more efficient, like that 80-member team reducing to a 30- or 40-member team.
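Narayanan’s “mathematical logic”—two wage curves that must eventually meet—can be checked with compound-growth arithmetic. The sketch below uses the midpoints of the growth ranges he cites, while the 5x starting cost gap and the productivity figure are assumptions for illustration only, not Cognizant data:

```python
# Toy compound-growth model of the offshore cost advantage. All figures
# are illustrative assumptions, not actual wage data.

import math

ratio = 5.0          # assume onshore fully loaded cost starts at 5x offshore
india_growth = 0.12  # midpoint of the 10-15 percent range cited
us_growth = 0.04     # midpoint of the 3-5 percent range cited

# Years until the curves meet: solve (1 + 0.12)^t = ratio * (1 + 0.04)^t
years_to_parity = math.log(ratio) / math.log((1 + india_growth) / (1 + us_growth))
print(f"parity in ~{years_to_parity:.0f} years")

# A modest annual productivity gain on the offshore side slows the
# effective cost growth, pushing the crossover out much further.
productivity_gain = 0.05
effective_growth = (1 + india_growth) / (1 + productivity_gain) - 1
years_with_productivity = math.log(ratio) / math.log(
    (1 + effective_growth) / (1 + us_growth))
print(f"with 5% yearly productivity gains: ~{years_with_productivity:.0f} years")
```

Under these assumptions the crossover sits roughly two decades out, and even a modest annual productivity gain pushes it out by decades more, which is consistent with the curves running “almost in parallel.”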
We are getting a certain relief on thebase that we’ve already established as an industry. Thatis working in favor of the industry, taking some pressureoff. The people available are not necessarily suitable forthe task. However, that suitability gap is being bridgedby the increased efficiency and productivity that we’rederiving on the basis of that workforce.PwC: Obviously, in your work with global servicedelivery and BPO, you deal with CIOs a lot. Andas IT has evolved, so have CIOs. How has therole of the CIO changed from 10 years ago tonow, and how do you see that role changing inthe next 5 to 10 years?LN: It has changed in two ways. One, CIOs understandthe business a lot more today than they did about 10years ago. Then, they were almost not expected tounderstand the business, whereas today, if you don’tunderstand the business, you cannot be a CIO. Two,the accountability has shifted: No longer can they saythat they delivered a technically superior project or aproduct to the business, but the business has notreaped any benefit out of that system. That is not thebusiness’s fault; it is the CIO’s fault. There is a totalalignment between business and technology, which isexpected, and the outcome is not a technologyoutcome, but a business outcome.26 PricewaterhouseCoopers Technology Forecast

We see that, and it directly translates to us beingpartners with CIOs. Now, CIOs expect us to understandthe business a lot more than before. And whatever workwe do, we are accountable for the business outcome—such as customer satisfaction or improving marketshare—rather than the project or the technologyoutcome. There’s no point saying, “I gave you code ontime and on budget and that code has only four defectsin a million lines.” This doesn’t carry very far these days.That’s a big difference.PwC: Does that affect how you price yourservices?LN: Yes, it affects how we price and staff our services.For example, about six years ago, we never put muchemphasis on having MBAs and business analysts in thecompany. Today, we are saying our target is one MBAfor every 25 technical people in the company and theyhave to work together on teams. It’s almost mandated.That’s a big shift, because you need to understandtechnology and also the business. Our efforts to maketechnology people business savvy weren’t assuccessful as envisioned. It’s not possible for hard-coretechnical people to embrace business the way an MBAor a business analyst would do. And because of theMBA dynamics—these are clearly higher-priced peoplecapable of doing more for the business efficacy of theproject—the pricing has changed.the company so that all these smart people cancollaborate from within? That is exactly what we havedone now and put together as the Cognizant 2.0platform, a platform that provides the ability for globalcollaboration, where a developer in China, a tester inHungary, a designer in Chennai (Madras), and ananalyst in the US can collaborate in real time using thisplatform to execute a project. That’s already becomingavailable. As we speak, we have beta customers, wherewe have inducted the customer team into thisdevelopment environment. It’s almost like the opensourcedevelopment, except that it is a defined project,there are deliverables and they have to follow aparticular process. 
So the concepts of social networkingsites have been applied to this collaborative platform.PwC: What form does it take with respect to aparticular client?LN: We’ll open this up to those clients who are receptiveto this idea and want to be a part of this platform. We’llthen see the challenges that confront us. For example,once I open it up to one client, you cannot open it upinternally to a whole bunch of other people that you’repartnering with. There are certain restrictions there. •PwC: The emergence of enterprise Web 2.0,mashups, and Facebook-style applicationprogramming interfaces is making programmingaccessible to more and more users, evenbusiness users. You have thousands ofprogrammers in your company, so how does thatimpact your business?LN: Yes, I agree, it is becoming easier. It has to becomeeasier. I’ll give you the example of open source. Thereare so many open-source products and stacks thathave been developed and put out there. Very smartpeople collaborated on the fly and created theseproducts. So why can’t that context be created withinThe era of CIO accountability 27

Making complexity manageable

What do you do when complexity can't be avoided?

Agile management teams need new technologies to support smarter and faster decision making. But capitalizing on opportunities to innovate with new technologies is difficult in today's enterprise environment because of the overwhelming complexity created by generations of previous technology investments. To confront the costs of complexity, both management and their IT organizations are approaching it in sophisticated new ways. When enterprises successfully manage that complexity, their appetite for innovation through technology investments will accelerate.

More than 75 percent of about 1,400 global CEOs surveyed by PricewaterhouseCoopers (PwC) in 2006 said the level of complexity in their organization is higher than it was three years earlier. (See Figure 1.) Some level of complexity is an inevitable by-product of doing business today, so the challenge becomes how to keep complexity at a manageable level.

What creates this complexity? Expansion into new territories, mergers and acquisitions, and the launch of new products and services have been the primary sources, according to these CEOs. Moreover, complexity creates such a drag on enterprise performance that nearly 80 percent of the CEOs said that reducing unnecessary complexity was a personal priority. The CEOs' primary focus areas were information technology (84 percent), organizational structure (79 percent), financial reporting and controls (69 percent), and customer sales and service (69 percent).

Figure 1: Primary focus areas for CEOs to reduce complexity
Information technology: 84%
Organizational structure: 79%
Financial reporting & controls: 69%
Customer sales & service: 69%
Source: PricewaterhouseCoopers survey of 1,400 global CEOs, 2006

For many, it's no surprise that the IT function is the highest-priority complexity challenge.
First, IT echoes the business: it is now integral to all business functions, and as businesses and business processes have become complex, so has IT. Second, IT complexity occurs not only in operations, but also in the architectures, applications, and data solutions deployed in the IT environment.

The three dimensions of IT complexity

In IT, complexity can be seen as a result of three principal dimensions, as shown in Figure 2:

• Number of entities. The large number of products and solutions, either hardware or software, that make up the overall IT system. The larger the number, the more complex the system. Monitoring, managing, and maintaining a large number of IT entities takes considerable human and specialized resources.

• Degree of heterogeneity. The lack of standardization among the various entities as they are sourced from several vendors and by different parts of the enterprise. This means that entities behave differently and require specialized knowledge and skills for their implementation, integration, and operation.

• Number of interconnections. The frequency with which the entities are integrated or interconnected with each other for the purposes of automation or other needs. Integration among these parts means they have unique dependencies and connections, making it onerous to introduce changes or quickly diagnose the system if a problem occurs.

Figure 2: Three dimensions of complexity. Remediation along each axis (number of entities, degree of heterogeneity, number of interconnections) moves the current state toward a desired state that is less complex, lower cost, and supports agile management. Source: PricewaterhouseCoopers, 2008

As a result of these dimensions, complexity increases when enterprises interconnect a large number of nonstandard, heterogeneous parts (infrastructure, applications, or data) to carry out necessary business functions.

Many trends have contributed to the complexity within IT environments. (See Figure 3.) Enterprises have adopted several generations of architectures and solutions over the years, thereby adding layers of technologies from many vendors. Those technologies and solutions often duplicated functionality in different business units and spanned various operating systems, hardware platforms, and versions.
Complexity increased as most systems subsequently became integrated for business efficiency and process automation. It increased further as enterprises extended their IT systems into supplier, partner, and customer IT environments. Additional complexity stems from incompatible, duplicated, and poor-quality data that applications store and share. Furthermore, business continuity requirements and rising customer demands forced data centers to focus on nonstop operations, including hot-swap backup data centers. This trend has multiplied the number of IT assets that must be integrated and managed.

For competitive reasons, enterprises often developed custom innovations in-house, especially those that used emerging technologies. Competition or market conditions required fast action that couldn't wait until technology solutions were standardized or simplified, and as a result, complexity increased further.

"Of course, you have to innovate, and if you do it fast, you will add complexity. But, in parallel, you have to clean up those things where it's not worth any longer to differentiate yourself against others," suggests Henning Kagermann, CEO of SAP. (For more of Kagermann's thoughts, see the interview on page 13.) While most enterprises have embraced technology as part of their efforts to differentiate themselves, they have not been serious about the cleanup that Kagermann is talking about.
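As a back-of-the-envelope illustration, the three dimensions described earlier can be combined into a crude complexity index. This is a hypothetical scoring sketch (the formula and weights are invented for illustration; they are not a PwC metric):

```python
# Illustrative-only complexity index over the three dimensions:
# number of entities, degree of heterogeneity, number of interconnections.
# The formula and weights are hypothetical.

def complexity_index(num_entities, num_distinct_platforms, num_interconnections):
    """Crude score: more entities, more platform variety, and more
    point-to-point links each push the score up."""
    heterogeneity = num_distinct_platforms / max(num_entities, 1)
    return num_entities * (1 + heterogeneity) + 2 * num_interconnections

# A remediation program shrinks all three dimensions at once:
before = complexity_index(num_entities=2000, num_distinct_platforms=40,
                          num_interconnections=5000)
after = complexity_index(num_entities=800, num_distinct_platforms=10,
                         num_interconnections=1200)
print(before, after, after < before)
```

However the weights are chosen, any score of this shape falls when entities are retired, platforms are standardized, and interconnections are removed, which is the point of the three-axis picture in Figure 2.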

Figure 3: IT complexity over time. The adoption of new architectures (mainframe, client/server, Web, service-oriented) and the imposition of new business requirements (fault tolerance, high availability, application and data integration, 24x7 operations, business continuity, departmental computing, custom applications, distributed computing, Web applications, cloud computing, inter-enterprise applications) have continued to increase complexity. Source: PricewaterhouseCoopers, 2008

Business impact of IT complexity

Years of technology absorption have steadily worsened the complexity problem. This complexity has an undesirable impact on business in two principal ways:

• Higher cost. The continual management of patches, upgrades, new versions, and complex interfaces in the environment requires specialized skills. Industry estimates suggest that more than 70 percent of IT budgets are spent on ongoing operations and maintenance, leaving few funds for innovation and for the adoption of more efficient, emerging solutions that drive enterprise competitiveness.

• Lack of agility. The ability to respond quickly to changing business requirements and market opportunities decreases. Many IT infrastructures are now brittle because of numerous integrations and associated dependencies. Most changes are onerous, require exhaustive testing, and take too long. They also increase the risk that a critical function may not work as before or may have an unexpected effect on another part of the system.

Despite those impacts, most organizations have not addressed the issue directly because more compelling areas for IT spending—such as regulatory compliance, new process automation, and new business opportunities—have taken priority.
Additionally, fear of breaking what is already running has kept enterprises from addressing the issue.

As awareness of IT complexity's impact grows, however, some companies are beginning to address it directly. Hewlett-Packard Co. (HP), for instance, is working hard to reduce complexity in its own IT operations and plans to cut ongoing operations and maintenance costs from 70 percent of its total IT budget to 20 percent, according to Forrester Research. HP intends to simplify IT on multiple dimensions: from 85 data centers to 3; from more than 5,000 applications to 1,100; from more than 21,700 servers to 14,000; and from 762 data marts to a single view of the enterprise.
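HP's targets translate into sizable reduction ratios, which can be computed directly from the figures above (a quick arithmetic sketch; only the before/after counts come from the article):

```python
# Reduction implied by HP's simplification targets (figures from the article).
hp_targets = {
    "data centers": (85, 3),
    "applications": (5000, 1100),
    "servers": (21700, 14000),
}

for asset, (before, after) in hp_targets.items():
    pct = 100 * (before - after) / before
    print(f"{asset}: {before} -> {after} ({pct:.0f}% reduction)")
```

The asymmetry is telling: consolidating data centers cuts roughly 96 percent of the sites, while the server count falls by only about a third, because the remaining servers must absorb the displaced workloads.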

Complexity remediation

Although there is no off-the-shelf solution to address complexity, many emerging technologies and architectural approaches can mitigate IT complexity when implemented appropriately. The adoption of those new technologies and approaches will accelerate during the next decade.

In broad terms, IT systems consist of infrastructure, application, and data layers, as detailed in Table 1. The three layers of IT face different challenges and will benefit from different approaches to complexity remediation. Because managing IT across these three layers is a responsibility shared among the business units, so is complexity remediation. The following discussions explore the approaches enterprises adopt to reduce complexity.

Table 1: The three layers of IT face different challenges and will benefit from different approaches to IT complexity remediation.

• Infrastructure layer. Organizational groups involved: can be pursued by enterprise IT alone. Remediation approach: virtualization and cloud computing. Applies to: servers, storage, networking, operating systems.

• Application layer. Organizational groups involved: business unit and enterprise IT. Remediation approach: application rationalization and application portfolio management (APM). Applies to: legacy applications, vendor applications, middleware, Web services.

• Data layer. Organizational groups involved: business unit and enterprise IT. Remediation approach: master data management (MDM). Applies to: databases, business intelligence solutions, data warehousing solutions.

Note that Table 1 highlights only some of the approaches to complexity remediation. Other approaches and technologies not listed here can be brought to bear in each of the layers.

Reducing complexity with virtualization and cloud computing

Virtualization is the aggregation of IT resources and their physical characteristics so as to make them available to applications and users in an on-demand manner. Virtualization helps organizations optimize resource utilization. The resources can be servers, storage, or other network components.

In a non-virtualized environment, servers typically are dedicated to specific applications. Server utilization can average below 20 or 30 percent, depending on the hardware platform. Declining hardware costs encouraged IT organizations to overprovision in order to accommodate peak loads. Dedicating additional hardware to an application was easier than optimizing utilization. However,

while hardware costs were flat or declining, operations and management costs rose significantly as the number of servers to manage increased.

Virtualization software eliminates the tight coupling that had existed between applications and the IT assets that support them. By pooling hardware resources, standardizing operating systems, and dynamically allocating resources according to application demands, IT organizations can deploy fewer servers. Average utilization across them can reach optimal levels without sacrificing either availability or performance.

In many ways, virtualization of IT infrastructure is a return to the mainframe era, when computing was offered as an on-demand, centralized service coupled with a system of controls and usage-based charges. All major systems vendors—including HP, IBM, and Sun Microsystems—as well as software vendors—such as Citrix Systems (which acquired XenSource in 2007), Microsoft Corp., and VMware—are offering virtualization solutions.

Cloud computing—server resources made available by third-party services over the Internet—leverages and extends virtualization technologies. These shared computing, storage, database, and networking resources typically are hidden behind standard interfaces that multiple businesses and users access in a multi-tenant environment.

Cloud computing is a continuation of the vision of utility computing, which suggests that IT capabilities will be accessible as a utility—much like electricity and telephony. The utility approach means that enterprises need not maintain and manage their own IT infrastructure and data centers.
Instead, they access Internet services as needed. Collectively, these technologies challenge the existing economics of IT infrastructure and, over the long term, offer lower cost and simpler operations.

Cloud computing strategies are being actively pursued by vendors that already own and operate large data centers, such as Amazon, Google, and IBM. Amazon's service, called Amazon Web Services, brings together computing, storage, and database functionality over the Internet and enables enterprises to create and operate Web applications without owning any IT infrastructure.

Benefits: Lower cost and agility

Virtualization reduces operating costs by increasing asset utilization. According to a 2007 data center survey by Symantec Corp., more than 66 percent of data center managers say that server consolidation and server virtualization are the two strategies they are currently deploying to cut data center costs, followed by other approaches, such as automation of routine tasks and data center consolidation. Virtualization reduces costs and complexity because IT staffs require training in fewer technologies, replication of best practices from one area to another becomes easier, and IT organizations can source products from fewer vendors.

In addition to lowering costs, virtualization brings flexibility to the IT infrastructure. The provisioning of IT assets to fluctuating workloads is more efficient. For example, on-demand scalability can speed the deployment of a new application, since no new hardware needs to be installed and configured by either IT or the end user. Also, in fast-growing businesses, the infrastructure can scale for new or existing applications by provisioning more capacity dynamically. And in turn, resources that support applications taken out of service can be put to new uses.

Cloud computing's benefits are similar to those of virtualization, but cloud computing is more comparable to utility or telecom carrier service-level agreement (SLA)

situations. By giving responsibility for the IT infrastructure to a service provider who specializes in managing multi-tenant infrastructures, enterprises pay to have someone else manage the complexity for them. This realizes a clear benefit, but one that comes at the cost of some flexibility and customization capability.

Mainstream adoption

Adoption of virtualization solutions is expected to accelerate over the next few years. According to Forrester Research, half of surveyed enterprise IT organizations were using x86 server virtualization in early 2008, increasing to two-thirds by 2009. Among enterprises using virtualization, 24 percent of servers are virtualized today, and that percentage is expected to reach 45 percent by 2009.

Because virtualization is an emerging technology, most of the applications running in virtualized environments thus far are non-mission critical. However, as the technology matures, adoption will spread to mission-critical applications. It will also spread from large enterprise data centers to small and midsize businesses and desktops, thereby continuing the momentum for the foreseeable future.

Emerging concerns about the power requirements of data centers are also fueling the adoption of virtualization and cloud computing. Fewer servers mean less power to run the data centers, which helps IT comply with an increasing number of green initiatives.

Challenges

Although virtualization and cloud computing reduce complexity and cost, they also create new challenges. Because servers are not dedicated to particular applications in virtualized environments, IT departments will need robust capacity and priority management processes. And they'll need controls and protocols to ensure adequate performance for specific applications. Business users accustomed to having dedicated resources will need to understand and negotiate SLAs.

The prevailing licensing practices of most software vendors that sell packaged applications are not suitable for virtualized environments.
Applications are typically tied to a specific hardware or operating system. Over the long term, software licensing and pricing must evolve toward value-based pricing. (For more information on pricing models, see "Software pricing trends: How vendors can capitalize on the shift to new revenue models.") Over the shorter term, however, virtualization is challenging existing licensing practices that depend on a strong coupling between application software and the hardware platform it runs on.

With respect to the three dimensions of complexity in Figure 2, virtualization reduces complexity primarily by reducing the number of IT entities that the IT organization needs to manage. Additionally, by standardizing on a few selected hardware platforms, virtualization also reduces the heterogeneity in the infrastructure layer. Cloud computing reduces complexity even further by taking away the need to manage an IT infrastructure altogether.

Reducing complexity with application rationalization

In most enterprises, the number of applications—whether sourced from vendors or custom developed in-house—has increased steadily over the years. It is not uncommon for large enterprises to have several thousand applications that need to be maintained, managed, and supported.

According to a study by the BPM Forum, 78 percent of enterprises with more than $500 million in revenues say they maintain and support redundant, deficient, or obsolete applications. They also estimate that more than 20 percent of the IT budget goes toward such applications. In the same vein, according to the 2007 data center survey by Symantec Corp., more than 67 percent of IT managers say their data centers are getting too complex and that they have too many applications to manage.

All of these applications require version control, patches, upgrades, bug fixes, feature enhancements, and related support, which makes for a substantial administrative burden on the overall IT function.

At the same time, an organization may no longer need all the applications it has. Although enterprises keep creating new applications to respond to competitive dynamics and market opportunities, most do not have a structured or mature process for retiring or modernizing older or outdated applications.

Application rationalization is the process of consolidating, streamlining, and simplifying the application portfolio. Applications that do not support the enterprise's business objectives, or that are redundant, get weeded out. Required functionality gets consolidated in a core set of applications with standardized interfaces or modular services. Application rationalization also brings a governance structure that ensures enterprisewide visibility, so that new applications get added in light of existing applications and aligned with emerging business needs.

A leading technique that can help with application rationalization is application portfolio management (APM).
APM uses portfolio management techniques to track, measure, and weigh the benefits of a particular application against its costs. This usually requires that enterprises aggregate pertinent information about applications and integrate it with business information to create intelligence and visibility into the application portfolio. According to IDC, the worldwide APM market in 2006 was about $1.81 billion, and IDC expects it to grow to about $2.44 billion by 2011. Tools available from software vendors to support rationalization include IBM Rational Portfolio Manager, Planview's application portfolio optimization solution, and Serena's Mariner APM.

Trends facilitating application rationalization

Consolidation in the enterprise software industry is reducing the number of vendors from which enterprises will source applications. At the same time, vendors are providing preintegrated solutions, thereby reducing an enterprise's need for custom integrations. Less custom integration means fewer resources that the enterprise must allocate to support integrations.

Moreover, emerging delivery models such as software as a service (SaaS)—as popularized by Google Enterprise, NetSuite, and other vendors—also reduce application-specific complexity by eliminating an enterprise's need to manage, maintain, or upgrade applications and their infrastructure. System integrators and professional services providers also offer solutions for application rationalization. Active service providers include PricewaterhouseCoopers, Accenture, IBM, Infosys, and Wipro.

Application rationalization reduces IT complexity by addressing all three dimensions in Figure 2. It reduces the number of entities by eliminating duplication and retiring unwanted applications. It reduces the degree of heterogeneity by driving the standardization of software platforms and vendors. Finally, with fewer applications, there are fewer resulting interconnections that need to be managed or maintained.
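The cost-versus-benefit bookkeeping behind APM can be illustrated with a toy portfolio scan. This is a hypothetical sketch (the data, scoring, and threshold are invented; it does not reproduce any vendor tool's logic): applications whose assessed business value falls below a floor, or that duplicate functionality available elsewhere, are flagged as candidates for retirement or consolidation.

```python
# Toy application-portfolio scan: flag retire/consolidate candidates.
# Records, scores, and the value threshold are hypothetical.
portfolio = [
    # (name, annual_cost_k_usd, business_value 1-10, redundant_with_other_app)
    ("legacy-billing", 900, 3, True),
    ("core-erp", 2500, 9, False),
    ("dept-crm", 400, 4, True),
    ("reporting-hub", 300, 7, False),
]

def retire_candidates(apps, value_floor=5):
    """An app is a candidate if its business value is below the floor
    or it duplicates functionality available elsewhere."""
    return [name for name, cost, value, redundant in apps
            if value < value_floor or redundant]

print(retire_candidates(portfolio))  # ['legacy-billing', 'dept-crm']
```

A real APM effort layers governance on top of such a scan: an owner must approve each retirement, and the freed budget is tracked against the portfolio baseline.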

Reducing complexity in data

Applications both create and consume data. As applications have proliferated over the years, so have data and databases. Data exists in departmental silos and is often duplicated, mischaracterized, or inaccurately stored.

The proliferation of data creates a major problem with regard to data quality and integrity. A survey by the Data Warehousing Institute found that 83 percent of organizations suffer problems with data because of inaccurate reporting, internal disagreements, and incorrect definitions. Problems with data quality and integrity have a negative impact on enterprises' productivity, decision-making processes, and overall market competitiveness. Such problems are critical enough to cause a surge of interest in solutions and approaches that address those concerns.

A leading approach to reducing data complexity is master data management (MDM). The goal of MDM is to create a single, unified view of an organization's data—in other words, a single version of the truth. MDM gets at the heart of the data problem by creating a trusted source for data quality, data integrity, and data consistency.

Master data consists of information about critical resources, such as customers, products, employees, and other business assets. Critical business operations and processes depend on such data. Master data is usually centrally managed, is subject to enterprise governance policies, and is distributed and used across all systems.
Vendors and service providers are actively offering solutions and services to help enterprises benefit from master data management.

The MDM market

MDM products aggregate data from disparate systems and then clean and normalize the data. The result is a single view of the data—one that can be synchronized and shared throughout the enterprise. Other functions include support for workflow, business rules, and data models, plus integration capabilities to make the right data available to the right resource at the right time.

Solutions are offered by established platform vendors such as IBM, Oracle Corp., and SAP, as well as by smaller niche vendors such as Informatica Corp., Initiate Systems, and Purisma. MDM-oriented professional services are offered by many of the large systems integrators as well as by outsourcers such as PricewaterhouseCoopers, Accenture, IBM, Infosys Technologies, and Wipro Technologies.

According to IDC, the worldwide market for MDM software will grow at a 20.2 percent five-year compound annual growth rate (CAGR), advancing from $1.25 billion in 2006 to $3.12 billion in 2011. The associated professional services market is projected to more than double, from $3.03 billion to $6.08 billion in 2011, a 15 percent CAGR.

MDM adoption

Many current business imperatives underpin the robust growth prospects for MDM. Regulatory compliance requirements, for example, are encouraging enterprises to invest in systems and processes that validate and ensure appropriate controls on sensitive customer and financial data.

Remediation in action

Many new-product rollouts taught a large multinational consumer products company that complexity in its IT environment was delaying the time to market for new products and services. This negatively affected the company's performance in a highly competitive marketplace. To improve the speed to market of new products, the company launched a remediation initiative targeting its infrastructure, application, and data layers.

Currently, this global enterprise has more than 2,000 applications—the result of mergers and acquisitions, rollups of companies, expansion into new territories and markets, custom development, and other factors over many years. Often, the enterprise has multiple instances of a product—such as enterprise resource planning (ERP) and customer relationship management—from the same vendor. In some cases, the enterprise has different versions of the same product, and in other cases, it has the same product on different platforms.

Application portfolio management will help the enterprise rationalize its application portfolio from more than 2,000 applications to a final target of about 800. It will achieve this goal by retiring nonstrategic and unused applications, removing redundancies, and standardizing on versions and platforms.

For instance, all ERP solutions are being consolidated into a single, central ERP environment using minimal customization and simple integrations to other applications. In some cases, data from older applications will be transitioned to a modern architecture, and the older applications will be retired. These efforts are concurrent with the architectural transition to a service-oriented architecture (SOA) to promote modularity and standard interfaces.
In addition to the focus on packaged applications, a few work streams are examining the adoption of software as a service (SaaS) for noncore functions. The transition is likely to span four to five years.

Parallel to the application rationalization tasks are efforts focused on the creation of master data and the governance policies for managing its integration and distribution. Data about customers and products is being aggregated from the existing systems and rationalized to remove differences in definitions, accuracy, and completeness. The enterprise is performing this rationalization by creating a unified view in a master data hub with a persistent relational data store built on open standards using Java 2 Platform, Enterprise Edition (J2EE).

Upfront work in application rationalization and master data management (MDM) will improve the infrastructure consolidation work stream. These efforts will shrink the infrastructure footprint by reducing the number of servers by about 25 percent. By leveraging virtualization technologies, utilization on the x86 servers will jump from about 8 percent now to 25 to 30 percent in the future, and utilization on the UNIX servers will increase from 30 percent now to about 65 percent. Consolidation of servers is initially directed at non-customer-facing functions in the back office, such as Dynamic Host Configuration Protocol (DHCP) and Domain Name Service (DNS). Over time, consolidation will spread to other back-office and some front-office functions.
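The utilization targets in the sidebar imply a consolidation ratio that can be estimated with simple capacity arithmetic — a rough sketch in which the utilization percentages come from the sidebar but the 100-server fleet size is a hypothetical example:

```python
import math

def servers_after_virtualization(servers, util_now_pct, util_target_pct):
    """Hold delivered capacity (servers x utilization) constant and solve
    for the smaller fleet running at the target utilization."""
    return math.ceil(servers * util_now_pct / util_target_pct)

# Sidebar figures: x86 utilization ~8% now, ~25% after; UNIX ~30% -> ~65%.
# The 100-server fleets are hypothetical example sizes.
print(servers_after_virtualization(100, 8, 25))   # -> 32
print(servers_after_virtualization(100, 30, 65))  # -> 47
```

Note the model ignores headroom for peak loads and failover, which is why real consolidation targets (such as the sidebar's roughly 25 percent server reduction across the whole estate) are far more conservative than raw utilization arithmetic would suggest.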

Integration of data after mergers and acquisitions requires the migration and aggregation capabilities of MDM to extract the synergies necessary for achieving financial returns.

Finally, enterprises are investing heavily in business intelligence solutions to leverage the large amounts of data that enterprises generate and store. That effort benefits from a trusted source for the data that gets processed, a key value proposition of MDM.

Other IT initiatives also benefit from MDM solutions. For instance, the service-oriented architecture (SOA) approach promotes the development of services that provide data in a standardized form. These solutions provide real-time transactional and business analysis data delivered through Web services technologies that feed relevant business processes and applications. Like MDM, SOA initiatives also benefit from an enterprise-level look at data that has common definitions and is not duplicated.

Business process management systems (BPMSs) take the information feed and bring the right information to important decision-making processes in a timely manner. (See “Bringing order to chaos,” p. 59.) Incorporating MDM functionality into an SOA or BPMS investment means that trusted, high-quality data and information will be available to applications and processes.

To succeed with MDM, many organizations will need to change their existing practices. Most important, MDM requires that IT managers and business managers agree on the semantics of the data and the underlying information.
These managers need to resolve all of the inevitable internal disagreements about the definitions of data describing customers, products, employees, and other business assets. In addition, the managers must build a governance structure and establish processes for maintaining consistent definitions, avoiding duplication of records, and resolving variances among data sources.

An effective MDM strategy can alleviate the complexity related to data by managing metadata in a central location and improving data quality and consistency for all applications.

Although enterprises can use MDM to reduce the costs associated with managing data, the key benefit in the future will be the responsiveness of IT. A centralized repository of master data means changes can be made easily and propagated efficiently across the enterprise. Effective governance practices would mean that the duplication and proliferation of bad data would be contained or eliminated.

Returning to our dimensions of complexity in Figure 2, MDM reduces IT complexity by reducing the number of data entities that need to be managed. It also reduces heterogeneity by developing common definitions and standards for data integrity. The number of connections is reduced as well, as a single version of the data eliminates the need for point-to-point connections for accessing that data.
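The duplicate-record resolution that such governance processes enforce can be illustrated with a toy matching-and-merge routine. This is a simplified sketch, not the master data hub described above; the field names, the match key, and the survivorship rule (longest non-empty value wins) are all assumptions made for illustration:

```python
# Sketch of MDM-style record matching: normalize source records, group
# duplicates by a match key, and merge each group into one "golden"
# master record (the most complete value wins per field).

def normalize(record):
    """Canonicalize fields so equivalent records compare equal."""
    return {
        "name": record.get("name", "").strip().lower(),
        "email": record.get("email", "").strip().lower(),
        "phone": "".join(ch for ch in record.get("phone", "") if ch.isdigit()),
    }

def merge(records):
    """Build a golden record: prefer the longest non-empty value per field."""
    golden = {}
    for field in ("name", "email", "phone"):
        values = [r[field] for r in records if r[field]]
        golden[field] = max(values, key=len) if values else ""
    return golden

def build_master(source_records):
    groups = {}
    for rec in source_records:
        norm = normalize(rec)
        key = norm["email"] or (norm["name"], norm["phone"])  # match key
        groups.setdefault(key, []).append(norm)
    return [merge(group) for group in groups.values()]

crm = {"name": "ACME Corp ", "email": "BUY@ACME.COM", "phone": "555-0100"}
erp = {"name": "Acme Corp", "email": "buy@acme.com", "phone": "(555) 0100"}
masters = build_master([crm, erp])
print(masters)  # one merged master record instead of two near-duplicates
```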

Hiding complexity

Virtualization, application portfolio management, and master data management consider IT needs at the enterprise level: across all silos, departments, and functions. As a result, IT assets and resources are optimized globally, with a view toward overall enterprise needs and strategy, in addition to meeting local requirements. When done right, these approaches bring the current IT house in order and prepare it for absorbing future waves of technologies and solutions in a manner that can keep complexity in check.

Complexity is a double-edged sword: it creates value by bringing rich new functionality and higher levels of automation to business operations. However, it can cripple an organization by causing management burden, lack of responsiveness, and out-of-control costs. Complexity also creates value when it is hidden away from the end users and accessed with simple, standard interfaces. “Any simplification needs to focus on the user experience; that is where maximum breakthroughs will also occur,” points out S. Ramadorai, CEO of Tata Consultancy Services. (To read more of Ramadorai’s comments, please see the interview at )

For instance, an automobile is a complex piece of machinery that has evolved in quality and convenience over the years. When automatic transmissions were introduced in automobiles, they no doubt increased the complexity of the car compared with that of a manual transmission. But the complexity of the overall driving experience was reduced because of the standard interface and automation.
Most of the automobile’s complexity either is not visible or is abstracted with dials and messages appearing on the dashboard.

IT environments need to evolve in a similar manner. Unnecessary complexity must be avoided, and value-creating complexity must be harnessed and hidden behind standardized processes or interfaces. The incentives—in terms of either cost savings or increased flexibility—are in place for enterprises, vendors, and service providers to encourage this evolution.

For more information on the topics discussed in this article, contact

Operational Web 2.0

Finally, enterprise-class tools and methods are emerging to create informal online networks. Already, business users are collaborating to deliver better decisions.

Tech book publisher Tim O’Reilly coined the term “Web 2.0” in 2003. In the process, he identified a common thread that connects many seemingly separate things such as blogs, wikis, and mashups.

The term “Web 2.0” refers to social software that enables people to interact and share information in new ways. It is the interactivity of Web 2.0 services that distinguishes them from those of Web 1.0. If Web 1.0 was a one-way broadcast, then Web 2.0 is a two-way conversation. Web 2.0 provides a way for users to add their own input directly and instantly to what someone else has posted on the Web. This form of persistent interaction converts what used to be a one-to-one or a one-to-many information flow into a many-to-many information flow.

Web 2.0 is associated with familiar public online services such as blogs, wikis, social networking, bookmarking, and mashups. Web 2.0 describes interactive social networking platforms such as MySpace, Second Life, and Facebook, as well as standalone services such as Blogger, Wikipedia, Bloglines, Google Maps, Digg, and Yahoo! Pipes.

Web 2.0 and agility

Agile management demands the kind of decision-making support that Web 2.0 offers. With Web 2.0, decision makers can use virtual-presence technologies to facilitate instant access to networks of subject-matter experts. Additionally, smart enterprise search can locate the knowledge trails those networks of experts leave in blogs, wikis, and mashups.

In the process, the art of business communication is being redefined. Informal channels within and among organizations have always been important, and Web 2.0 is rapidly becoming the communications backbone for those informal channels.
Figure 1: The many-to-many aspect of Web 2.0. One-to-one media (face-to-face meetings, phone calls, messaging) and one-to-many media (documents, broadcasts, Webcasts, Web 1.0) are joined by many-to-many tools and techniques such as wikis, blogs, mashups, filters, ranking systems, and folksonomies. As information flows become many-to-many, knowledge and insight expand and timeliness increases, while formality decreases and validation is less rigorous. Source: PricewaterhouseCoopers, 2008

Beyond wikis and blogs

Interactivity and personalization are what have made Web 2.0 so popular for consumers. For enterprises, it’s about more than just interaction; it’s about tools that can help devolve authority, empower the workforce, and build bridges between organizational silos. Transformation of this sort is possible only for established enterprises focused on the cultural shift that must accompany the technology adoption. This implies a long road and a steady course for those who want the real benefits the interactive Web promises for enterprises.

A recent surge in media coverage of the topic makes it seem new, but Web 2.0 concepts have been around for more than five years now. Over those years, Web 2.0 has altered the landscape of the consumer Web and changed how people exchange information. By December 2007, MySpace and Facebook were ranked fifth and ninth, respectively, in audience share among the top 20 Web sites in the US, according to Hitwise. Neither service existed six years ago: MySpace began service in late 2003, and Facebook, in early 2004.

By January 2008, hundreds of millions of people worldwide had joined the current generation of social networking sites:

• MySpace led the rest, with more than 200 million users.
• A Chinese site had more than 90 million.
• Facebook reached 63 million.
• Bebo, popular in the UK, claimed 40 million.

Given the sheer numbers, the current generation of Web 2.0 services is a communications phenomenon with implications beyond the online communities that preceded them.

So how do they fit in? And where are they leading us?

Online social networking: Sharing experience online

Electronic capabilities improved our ability to share experiences by orders of magnitude—to the point that global, instantly available multimedia are largely taken for granted. Those improvements have come more frequently with the mass availability of computer technology and high-bandwidth telecommunications.

In the 1950s, the professional media elite dominated television. By the mid-2000s, many of the most popular videos on YouTube were being made and posted by amateurs. The popularity of YouTube rivals or exceeds that of any single television channel. As early as 2006, 22 percent of the UK population watched YouTube at least occasionally, and 2 percent of those surveyed indicated they consequently watched less television, according to the BBC.

At that point, YouTube was serving up a total of 100 million videos a day to its worldwide audience. As of March 2008, the most-watched video on YouTube—made by an amateur—had been viewed 107.5 million times since its posting a year earlier, an average of 293,000 views a day.

Much of this public information sharing deserves attention, but what about the enterprise side? True, many of the same people who’ve registered on MySpace or Facebook are in the workforce. (For example, the PricewaterhouseCoopers (PwC) network on Facebook reached 14,000 users in March 2008.) Recent college graduates who are coming of age with tools like Facebook find e-mail a less desirable way to communicate.
As they enter the workforce, they will expect comparable tools inside the enterprise.

Much of the activity on the consumer Web is frivolous, and some of the companies experimenting with social software seem preoccupied with superficial aspects of it. Enterprises are right to be skeptical, which raises a question: What value do Web 2.0 and the methods behind it actually provide for enterprises?

Business value of social media

The value of the Web is tied to the content that results from the interaction itself. Ideas created during conversations used to be ephemeral—lost to those not present at the time of the conversation. Social networking means these informal conversations get captured in a persistent way and are broadly accessible. The reusability of conversations between smart, engaged employees in unanticipated contexts defines a new dimension for information communications.

Once information becomes part of the aggregate, recommendation engines can assess the preferences of users and retrieve more relevant results. For example, the collaborative filtering technology behind recommendation engines (already in use at call centers to help customer service representatives suggest products to customers) can be used to suggest relevant content to employees.
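As a rough illustration of the collaborative filtering idea (the employees, content items, and ratings below are invented, and production recommendation engines use far larger data sets and more robust similarity measures), a user-based variant scores unseen content by weighting other users’ ratings with cosine similarity:

```python
# User-based collaborative filtering sketch: score content a user hasn't
# seen by weighting other users' ratings with cosine similarity.
from math import sqrt

ratings = {  # employee -> {content item -> rating}
    "ana":  {"wiki:soa": 5, "blog:mdm": 3, "mashup:sales": 4},
    "ben":  {"wiki:soa": 4, "blog:mdm": 2, "video:hr": 5},
    "cruz": {"blog:mdm": 5, "video:hr": 4},
}

def cosine(a, b):
    """Cosine similarity over the items two users have both rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[i] * b[i] for i in shared)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

def recommend(user):
    """Rank items the user hasn't seen by similarity-weighted ratings."""
    seen = ratings[user]
    scores = {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        sim = cosine(seen, theirs)
        for item, rating in theirs.items():
            if item not in seen:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("ana"))  # content rated highly by similar colleagues
```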

Figure 2: Spending on social media. Annual sales, in billions of dollars, from 2007 to 2013 for social networking, RSS, blogs, wikis, mashups, podcasting, and widgets. Source: Forrester Research, 2008

By reading directly about the experiences of others, employees can learn from each other. BT Design managing director J. P. Rangaswami observed in a 2007 blog post, “Knowledge management is not really about the content; it is about creating an environment where learning takes place.”

Enterprise search tools are beginning to make it possible to analyze free-form text and mine for facts that can be exported. Other tools—like Really Simple Syndication (RSS)—enable information to be pushed from a platform such as Facebook into places on the company intranet or into an enterprise application. Still other tools encourage interaction and become destinations for users to share and respond to ideas. IDC analyst Rachael Happe says she uses Twitter—a microblog platform that many people use to let others know what they’re doing—to post ideas instead.

With a circle of like-minded acquaintances online, Happe gets quick feedback on each idea. The ideas get scrutinized not only by people she’s interacted with before but also by others who happen to find the discussion thread.

Certainly, there are privacy issues, but giving employees the option of sharing what is ultimately related to their work product anyway won’t always be an issue. Most may decide to share at least some information. The benefits include increased connectedness, reuse of lessons learned, and elimination of the duplication that results when experiences are shared.

These benefits have not gone unnoticed. Enterprise spending on social networking alone is forecast to grow 66 percent annually from 2007 to 2013, according to Forrester Research.
As Figure 3 shows, 51 percent of Global 2000 companies expect to buy Web 2.0 tools in 2008.

Essentially, Web 2.0 media promise an additional communications channel for types of information exchange less easily facilitated by other media.

Mashups

While Web 2.0 technologies assist in capturing and bringing information online, mashups help aggregate it, help analyze it, and help users understand what the information means.

Mashups address the ad hoc analysis and display needs of enterprise users. They simplify the process of retrieving and aggregating data quickly from multiple

Web sources by partially automating the process. And they make it possible to repurpose the use of information on the fly inside a browser.

Figure 3: Web 2.0 adoption in 2008. Source: Forrester Research, 2008

                                      Buying   Considering only   Not considering   Sample size
Global 2000 (20,000+ employees)         51%          12%                37%           N = 236
Very large (5,000 to 19,999)            40%          16%                44%           N = 257
Large (1,000 to 4,999)                  41%          15%                44%           N = 510
Medium-large (500 to 999)               33%          15%                52%           N = 226
Medium-small (100 to 499)               26%          16%                58%           N = 481
Small (6 to 99)                         20%          13%                68%           N = 519

Mashups make rapid development of browser-based, rich Internet applications possible. Users—as opposed to programmers—can develop, refine, and share highly customizable views of data from disparate sources, both internally within the enterprise and externally.

To a greater extent than before, users are able to specify how they’d like their data presented. Mashups encourage higher levels of customization and more pervasive use of internal and external information. The resulting applications are thus very user-centric.

At the same time, content developers are using more standardized, semistructured data such as microformats in Web pages. Microformats make it easier to reliably extract contact information or geographic location information, for example, which leads to an improved ability to aggregate and integrate data derived from separate sources on the Web.

Such companies as IBM, Denodo Technologies, JackBe, Kapow Technologies, and Serena Software have offered enterprise mashup platforms since 2005 or so. JackBe’s examples of how organizations are using its mashup platform provide some insights into what the platforms enable.

JackBe’s chief technology officer John Crupi noted during a briefing that the US Defense Intelligence Agency (DIA) has the ability to do scenario-based risk analysis of military assets in real time. (See “The dos and don’ts of mashups,” page 56.)
Instead of using dated PowerPoint slide snapshots of these scenarios, DIA analysts wanted to pose the scenarios during the briefings and display the results the system presented. JackBe helped the DIA create a portal that displayed scenario results that analysts could present visually to others, as well as bookmark and share online. (See Figure 4.)

Figure 4: DIA’s scenario mashup. Source: JackBe, 2008

Limitations of enterprise mashups

While mashups represent a powerful technique, they can be only as good as the information on which they rely. Mashups provide an additional layer of analysis and presentation capability on top of service-oriented architecture (SOA) and Web-enabled data sources. Being at the top of the stack, they depend on each layer beneath them.

Without consistency at each layer—especially consistency in quality, governance, and uniform semantics at the data layer and sufficient, consistent services and application programming interfaces—mashups that aggregate and analyze data across silos are ineffective. Thus, enterprises that encounter inconsistencies when building business intelligence applications are likely to encounter similar problems when developing mashups.

Users may encounter more access complexity with mashups. Enterprises implement access control in many different ways, and current identity management infrastructures can create a complex maze of permissions. Enabling mashups means adding to that complexity and revising access control wherever it imposes a barrier.

Like many other legacy technologies, most of the existing identity management mechanisms did not anticipate mashups. Moreover, information technology organizations struggle whenever users request reports from data sources that cross silos. For example, it might be a good idea to develop a mashup of internal vendor data combined with external data about vendor creditworthiness, but it presupposes that the different units have agreed on a single definition of a vendor.

Reuse of mashups developed by others is another big potential win, but reuse hinges on proper version control, design, and metadata—hardly a given when end users are developing mashups. Enterprise 2.0 consultant and blogger Dion Hinchcliffe alluded to the version control problem in a list of mashup issues that he posted in 2007.
Some bona fide enterprise mashup platforms with version control exist, and others are in development.

Design and metadata issues speak to governance of mashups themselves: How well are they tagged and otherwise designed so that a person in one department can use a mashup created by a person in another department or in another organization?

Mashup application environments

Although some claim that users themselves can create true applications, it’s more accurate to say that a mashup application environment provides an additional layer of end-user aggregation capability and configurability. The platform acts as a bridge—with flexible application functionality—between SOA services or databases and an AJAX (asynchronous JavaScript and XML)-enabled browser client. By contrast with the composite applications available in suites, mashup functions are essentially more malleable and more ad hoc.

Users can produce mashups in a way that can preserve data integrity: they can avoid the equivalent of uncontrolled growth, because the data aren’t duplicated

or decoupled from the sources and because the logic stays and can be modified in one place. Enterprise mashup platforms mediate between the data source layer and the presentation layer in a way that’s consistent with the underlying SOA governance framework that enterprises are building, as in the Denodo Technologies architecture. (See Figure 5.)

Figure 5: A sample mashup architecture. A user interface in a Web container or application server calls mashups that orchestrate, join or union, filter, and annotate data drawn from virtual services; an enterprise mashup server virtualizes and governs those services over Web and other data sources. Source: JackBe, 2008

Value propositions for mashups

There are four key value propositions that support the use of mashups.

Data analysis efficiency
• Knowledge workers are investing many hours of effort in desktop tools such as Excel and PowerPoint, pulling data from separate enterprise systems, combining it, analyzing it, and charting it.
• Enterprises can optimize that labor by making the logic and presentation configuration sets created with the tools directly sharable on the Web. The sets then become reusable assets, and should the logic need to change, they provide a single place for making those changes.

Improved database utilization
• Mashups are well suited for the kind of perishable information that now exists in often-underused databases, such as those that contain customer contact records. They can serve as a mechanism for increasing the utilization of this kind of resource generally.
• Enterprise mashup platforms can combine internal databases with public data (geolocation data, for example) and at the same time adhere to the access control rules of the internal data sets.
• Previously stranded, ad hoc databases can be accessed and used in more powerful ways. Once information from several places gets linked and becomes more useful, it becomes more important to maintain and add to it.
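The orchestrate, join, filter, and annotate steps shown in Figure 5 can be sketched as a small data pipeline. The two feeds below are invented stand-ins for governed enterprise services, along the lines of the internal vendor data and external creditworthiness data mentioned earlier:

```python
# Sketch of the mashup pipeline in Figure 5: join two "virtual service"
# feeds on a shared key, filter the rows, and annotate the result for
# display. Both feeds are invented stand-ins for governed services.

customers = [  # internal CRM service
    {"id": 1, "name": "Acme", "region": "EMEA"},
    {"id": 2, "name": "Globex", "region": "AMER"},
]
credit = [  # external creditworthiness service
    {"customer_id": 1, "score": 710},
    {"customer_id": 2, "score": 580},
]

def join(left, right, lkey, rkey):
    """Inner-join two record lists on lkey == rkey."""
    index = {row[rkey]: row for row in right}
    return [{**l, **index[l[lkey]]} for l in left if l[lkey] in index]

def filter_rows(rows, predicate):
    return [r for r in rows if predicate(r)]

def annotate(rows):
    """Add a display flag derived from the joined data."""
    for r in rows:
        r["flag"] = "review" if r["score"] < 600 else "ok"
    return rows

mashup = annotate(filter_rows(join(customers, credit, "id", "customer_id"),
                              lambda r: r["region"] in ("EMEA", "AMER")))
print(mashup)  # joined, filtered, annotated rows ready for display
```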

Knowledge transfer efficiency
• An effective mashup platform based on sound governance principles can result in more of the benefits of broad information sharing without risking information compromise. Such sharing then allows more knowledge to cross functional lines—so that research and development can leverage the mashups in operations, for example.

Power-user leverage
• Mashups are a natural fit for people who can develop Excel macros, offering a new way to tap into the skills of this group of workers.

Mashups are not replacements for high-volume enterprise applications. Fundamentally, a mashup platform provides a configurable way to link live data; integrate, filter, and analyze it; and share the persistent result visually or in tabular form.

Crupi made this observation: “We get customers asking if we can mash up a million records. We say you would never want to mash up a million records. You have access to your database, which knows how to manage a million records; the mashup addresses the small subset of records that you need to get to the user.”

Enterprise media controls

Because user-created content and ubiquitous shared media are new, businesses are still trying to make sense of them. Clearly, they understand the value of capturing what Rangaswami calls watercooler information. Making watercooler information persistent with the help of social networking tools is now feasible.

As Rangaswami points out, it’s the kind of information that enterprises have wanted to capture for decades now. Who people are specifically, what they’ve done, what they’re talking about that’s work related, and who and what they know are all pieces of information of high value to enterprises once such information gets networked, becomes searchable, and is tagged.

There are many different business, legal, and personal reasons to impose controls on data.
Any company has its own closely held information, such as intellectual property, trade secrets, business processes, and personal information. The viability and ease of use of these media controls consistently lag behind technological advances in media.

By analogy, e-mail has existed since the mid-1960s and has been common in the business world since the 1990s, but until recently, most e-mail controls were procedural rather than technical.

Technical access control of electronic documents such as e-mail attachments, offered by enterprise rights management software vendors, has only within the past year or two seen adoption outside of highly regulated verticals such as healthcare, telecommunications, pharmaceuticals, and the public sector.

Out of a $41-billion global security software and services market, IDC estimated that only $1.1 billion was spent in 2007 specifically on information protection and control—that is, on products providing enterprise rights management and content-filtering software. The research firm forecasts that market will grow to $3.2 billion by 2011.

Because Web 2.0 media and mashups further democratize information; because they span boundaries between departments, divisions, and enterprises; and because they introduce security vulnerabilities in the process, they demand procedural controls similar to the procedural control of e-mail 20 years ago.

However, many enterprises do not yet see the value propositions of social media as clearly as they did those of e-mail. While e-mail supplanted most regular mail, social media seem to merely augment information already on hand. At a tactical level, the benefits don’t always seem to warrant the apparent risks. In spite of those perceptions, enterprises’ social media adoption is on the increase.

To manage risks effectively, enterprises will again need to stress more-elaborate procedural controls that make use of what’s available from a technical standpoint. In some ways, some forms of Web 2.0 media allow better risk mitigation than e-mail does, because the information is more visible and thus more easily moderated.

Content filtering is also possible. Companies can either screen each item before it’s posted or monitor the traffic with tools that detect and stop anomalous behavior. Other aspects of Web 2.0 content security, however, are murky at this point.

Some organizations provide places internally where employees can store personal information. Others take advantage of public social networking tools and link to profiles stored at Facebook or LinkedIn.
This technique raises the question of whether Facebook profile information is appropriate for internal audiences.

Outlook: Transformations enabled by Web 2.0

To be able to use this Web 2.0 information effectively, enterprises will need to:
• Facilitate dynamic blending and display of information to improve its utility
• Invest in text mining and filtering to identify and retrieve the most valuable information

This must be done quickly and cheaply enough to exploit the information before it loses significant value. Six hypothetical examples illustrate how enterprises might use Web 2.0 to gain competitive advantage.

Example 1: A company records a human resources (HR) orientation session and segments, tags, and indexes it using an automated tool.
• Reuse of the same session over and over is an immediate initial benefit.
• Segmentation means the information can be repurposed and customized as well. HR can add segments for different roles and departments over time.
• At any time in the future, any employee can retrieve the most relevant and most needed portion of the information provided.

Example 2: With the right mashup infrastructure, an employee who struggled to understand what a time-reporting system requires could annotate a standard orientation video and combine it with self-made videos showing mistakes being made and corrected.
• Engineers optimize an entire process flow by capturing the information in each part of the process at one plant. They then can more easily replicate the

same process in multiple plants with the help of interactive Web tools that encourage, rather than restrict, sharing.
• Segmenting the process into subprocesses helps make it possible for users to optimize the workflow processes they are responsible for. The Web tools automate the reintegration of these subprocesses so that the entire revised flow becomes visible to all users.

Example 3: Through blog searches, workers find out about nearly identical projects taking place in two different locations and decide to pool resources.
• More generally, Web 2.0 technologies can provide a backbone for distant collaboration. Finding colleagues with needed skill sets and relevant past experiences is facilitated by Web 2.0 infrastructure.

Example 4: Once the contents of a meeting become persistent and accessible, virtual-meeting participants can continue the conversation by adding comments as well as capturing input from those who couldn’t attend in real time.
• Global companies will find examples like this helpful in serving employees and managers in different time zones.

Example 5: Purchasing agents take advantage of the interactive Web to develop mashups that serve as custom catalogs. For example, the procurement mashup will cut the time required to compare and contrast pricing information from suppliers.
• More generally, custom access to select information applies to many disciplines.
For instance, a physician might want to know when new articles appear in select journals or when US Food and Drug Administration findings become available.

Example 6: An executive can take advantage of a business intelligence report mashup that evaluates the tax ramifications of shifting the balance of operations from one group of countries to another.
• Similar applications could be developed for currency hedging, commodity trading, and other tasks that track shifting data.
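A minimal sketch of the procurement catalog mashup in Example 5 (the supplier feeds, part numbers, and prices are invented for illustration):

```python
# Toy procurement mashup: merge price lists from two supplier feeds and
# pick the best quote per part, the comparison Example 5 describes.

supplier_a = {"PN-100": 12.50, "PN-200": 7.25}
supplier_b = {"PN-100": 11.95, "PN-300": 3.10}

def best_quotes(*feeds):
    """Map each part to its (lowest price, supplier index)."""
    best = {}
    for idx, feed in enumerate(feeds):
        for part, price in feed.items():
            if part not in best or price < best[part][0]:
                best[part] = (price, idx)
    return best

for part, (price, idx) in sorted(best_quotes(supplier_a, supplier_b).items()):
    print(f"{part}: {price:.2f} from supplier {idx}")
```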

Conclusion

Ease of use is not yet in place. Building mashups in many cases will be nonintuitive, often requiring the skill of power users. But ease of use should improve quickly. By the end of the decade, casual enterprise users will themselves become used to simple mashups. Already today, casual users can construct consumer (versus enterprise) Web 2.0 applications with such tools as Intel Mash Maker, Microsoft Popfly, and Yahoo! Pipes.

Web pioneer Tim Berners-Lee predicted in March 2008 that electronic documents such as bank statements and calendars would acquire enough intelligence on their own that users could create mashups simply by dragging one document on top of another. He offered the example of dragging a calendar on top of a bank statement to be able to remember where and when money was spent.

While there may be value even in simple mashups like the one Berners-Lee describes, the fact remains that many of the business-value benefits of mashups, social networking, and other elements of Web 2.0 will remain intangible for some time. As that time approaches, though, the workplace will become friendlier because employees can personalize their own profiles. Ad hoc working arrangements will be able to come together more quickly. And documents, services, and databases will see greater reuse and interconnectedness.

In general, improved interconnectedness will mean that work that used to get done in isolation can be done more effectively together. The degree of effectiveness will depend on how well enterprises can make the cultural changes necessary for true interconnectedness—something the tools alone cannot accomplish. With a high degree of interconnectedness, much more of the power of the informal organization can be brought to bear.

For more information on the topics discussed in this article, contact

The freedom to connect

Richard Rosenblatt of Demand Media describes how to borrow ideas from the consumer Web for enterprise benefit.

Interview conducted by Alan Morrison

Richard Rosenblatt cofounded Demand Media in 2006. Previously, he was CEO of Intermix and chairman of MySpace. Demand Media takes a vertical approach through a single platform for Web content development and distribution. In this interview, Rosenblatt discusses how the Web has evolved and how enterprises can take advantage of the interactive aspects of it.

PwC: You’ve been involved with the Web for quite a long time—all the way back to its progenitors. What did the Web look like as an opportunity for you at that point? Can you think back to ’94?

RR: Absolutely. It’s great to talk about it because remember, the World Wide Web didn’t even start until 1994. Before then, there was no GUI (graphical user interface) that allowed you to put up HTML (hypertext markup language) pictures, photos, and all the stuff that everyone just takes for granted as the Web. Before then, it was e-mail, and you weren’t able to really do anything commercially interesting unless you were a hard-core techie. So, the invention of the World Wide Web was the first time that it really became a media opportunity. Before that it was just a technology opportunity.

After I was introduced to the World Wide Web, I thought, “Wait a second, you can put up text and some pictures and it’s free as long as you know how to program HTML.” Now, if you combine that with the Internet’s global reach, you have a new, exciting commercial opportunity.

It was really just a static billboard type of ad. Then, in 1995 and until we sold the company in 1999, we tried to figure out a way to make the Web site more interactive—a place where you could actually sell stuff online.
That’s what iMall was; it was the first online shopping mall and e-commerce solution to let small businesses sell products through the Web.

The Web has gone now from a very flat, more utilitarian type of communication vehicle to a multimedia experience that has the potential to replace television.

PwC: As you were doing this in the ’90s and the early 2000s, were there any aha moments with implications beyond what you might have assumed previously?

RR: There were. With iMall, we first created what you might call a business software business. It was a traditional service where we talked to clients and tried to sell them Web sites with a pretty simple pitch: “We

are going to build new Web sites for you and put them on iMall, which has traffic, just like shopping malls.”

That approach is very difficult to scale because you need lots of people to build many Web sites, and people want lots of changes to their Web sites and new features—all the different things that make it very difficult for client-centric types of businesses.

So in about 1995 or 1996, we built a set of tools that allowed the small business to use a very simple template, and the business itself was able to build the Web site. So the aha to me was that the Internet allows you to turn the power over to the small business to build its own Web site, and you just take a recurring fee. That was a beautiful model.

That model for the small business allowed me to focus and invest heavily in MySpace, the next generation. It wasn’t for the businesses; it was for individuals. So now individuals could build their own space, connect with their friends, put up their own information, do everything that they wanted to do through the Web. The aha for the last 13 years has been to enable the users to do what they want with it and then figure out the business model as you are giving them that power.

PwC: Demand Media seems like a logical extension of that.

RR: Yes, but vertically focused. When we started Demand Media, we knew social media was becoming more vertical and more personal. That presented a business opportunity because it’s very easy to sell to advertisers that are interested in reaching golfers, for example. So we have Web sites and domains from birding to paintball, and they get more microfocused, like sports cars to green-friendly sports cars.

As you go more and more vertical, you can manage on one platform an infinite number of verticals, and they are very apt for targeted advertising because you know why the people are there.

PwC: On the Web now, things are getting more and more fragmented. Where are we headed?

RR: For a long time now, I have been talking about what I call the portable profile.
It’s like a passport: if you log into one site, there is all your stuff; if you log into another site, there is all your stuff—any Web site. We are able to do all this on our platform, and because all the sites are on the same platform, user information is automatically retrievable from any site or domain. So if you are a member on any of our Web sites, you are a member on all of our Web sites if you so choose. Or, you may not want to be the same person on one site as you are on another.

I do believe at one point your content will be aggregated in what I think ends up being your own domain. At my own domain, for instance, I will log in, and it will suck in all my different information from all my profiles around the Web. And I do think that’s the future. It will be very interesting to see who does that.

PwC: What about other consumer developments that will affect the enterprise? Take mashups, for example. Companies have so many databases that are pretty much stranded internally: A few people know about them, and others don’t. If you have the means of aggregating and presenting that information internally, that would be powerful. Are you working at all with mashups?

RR: We announced in March that we bought Pluck. One reason we did was for its SiteLife social media platform—Guardian News and Media and others in the publishing business, for example, have been using that platform to add interactivity and online community capabilities to their Web sites. The Pluck APIs (application programming interfaces) could present some interesting mashup possibilities. Otherwise, when we do mashups, it’s more on the consumer side. We have a partnership on the video side, where people can come and get everything from Star Wars clips to all kinds of professional clips and music and mash that into their own personal video. So they could create whatever they chose with their own Star Wars rendition of something related to another hip-hop song, all legal content. The main thing here is giving users the ability to grab lots and lots of content from what we call the studio and build up their own Web site, which we monetize.

PwC: Do you think that licensing in general is sort of opening up a bit more because of all the video mashup activity on YouTube?

RR: It is. We’ve talked to lots of publishers. When I sold MySpace to Fox, I spoke about this with Rupert (Murdoch) and Peter (de Monnink) and the whole crew. I talked about the need to set their content free. People were stealing it anyhow. They should put their content on the Web and include ads, because if it’s streamed, you can’t skip the ads. When you let people do what they want, they can put the clip on their home pages, send it around to their friends, do whatever that takes. Hulu leverages the ability to take all this professional content and give it to the MySpace and Yahoo! users and all these people in the world. So people don’t steal it. That’s a big move. When people ask me what Web 3.0 is, I really think it’s Web 2.0 with Web 1.0. It’s the traditional media model of producing great content with the low-cost distribution of the Web. And I don’t mean re-creating expensive content.
I mean taking these libraries that these media companies have had for a hundred years and literally starting to figure out new ways of content integration—starting to figure out a digital way of distributing it.

PwC: Another aspect to this involves widgets and the modularization of content frames that you can embed anywhere. Does that present an opportunity we haven’t thought of before? I mean, something beyond what’s already happening in sites like Netvibes, for example?

RR: I am very excited about widgets because widgets let you do what I mentioned previously about being able to grab content. So by wrapping your text, your video, your applications—anything you want—in a widget, someone can just click on it and put that right on their page. It’s the ultimate syndication opportunity. As long as the quality of the widgets is high, people are going to want to pass those widgets around, and there can be advertising embedded in those widgets. And you may have found another form of advertising—a widget business model, in other words. That’s why we are investing so heavily in video. We want to have as much inventory as possible, and we believe we are the largest Internet-created library of video content on the entire Web. We are the largest supplier of content to YouTube right now. We receive over 600,000 views of our videos per day on YouTube alone.

PwC: Let’s put this in an enterprise context. Where many enterprises are internally is still Web 1.0, especially those that have a lot of history. They have a lot of legacy systems, and one of the problems they’re facing has to do with their intranets: people don’t go there; they are not interested in them. So how do we make this legacy approach to the Web more dynamic? Are there things you could suggest that enterprises do to enliven their existing Web sites internally?

RR: The best thing would be to add social media. Seventy percent of people on the Internet are interested in interacting with the Web sites they visit.
So, whether companies start with just adding simple blogs or they start with something like the ability for people to post comments, being open and allowing people to publish content about the company—good and bad—will get them enormous credibility.

The fear companies have is: What if the community decides to put down one of their products or services? Well, they are going to do it anyhow. And the fact is that if you have a product or a service at a company, people will rally to your defense—to defend you. And it won’t be behind the scenes. It will be right out in front of everybody. And you could use it to dispel lots of rumors and hopefully motivate the team, but you have to be open, and you have to let them do what they want to do and just be ready for some good and some bad. In the end, it’s going to work out. Companies should let people comment on their jobs and what they think about the company.

PwC: Speaking of social networking, how far do you think we are from having just about every meeting that could be useful to larger audiences actually recorded and posted on the internal Web for these companies? Is it simple enough for most companies to do that now?

RR: It is so simple. I can’t believe companies are not doing that already. That’s shocking to me. From an efficiency perspective, you should have all the general questions and answers on the Web. So everything from your vacation policy to standard stuff—which, in an enterprise, thousands or tens of thousands of employees need—should all be recorded and available in a video archive. Let people go through it.

PwC: So how are you doing that inside your company? Are you tagging your videos for those purposes?

RR: Yes. We have an internal network where employees can find everything and manage everything related to their business life right there. We then gave all of our employees their own free dot-tv domain. They are able to create their own personal Web site to do whatever they choose, to build a persona. And on that channel they can record their own video, their own chat, or whatever they want. Right now, we are just finishing up a group of videos for employees—everything they need to know about Demand Media. The only reason we have hesitated is because our business is changing so fast.
And for competitive reasons, we needed to make sure our network was secure enough that we could preclude someone from sending those videos somewhere outside the company. And that technology is available.

PwC: What about searching on video? Enterprise search is something that’s way behind what’s available on the public Web, for example. Are you able to search on your internal videos?

RR: In a very basic way. There are video search products that you can license, but video search has not really been figured out yet—whether from the consumer or business side at all. There are still technical hurdles to overcome. For example, we had to go through a whole process to understand why, when searching on a topic for video, some thumbnails show up and some don’t. So video search is still at an embryonic stage.

PwC: If you had to name one essential thing companies could do to improve their intranet user experience now, what would it be?

RR: Giving employees the freedom to personalize their profiles—and personalization, generally—is a big deal. It’s not going to be clean and white and always exactly the same, but personalization is one of the most important things to people. They really believe the profile is an extension of them. Allowing them to use technology to connect to related job duties, related projects, even related ads, or, you know, related things about the company they love or hate is a big deal. Now they have a reason to go to the intranet. There is a whole discovery process that people love about social networks: the ability to discover new people and new things that they didn’t even know existed, to say, “That person lives 10,000 miles away, and he cares about the things I care about.”

So it’s about communication, it’s about connecting, and it’s about giving them the freedom to update as much as they want. •

The dos and don’ts of mashups

John Crupi of JackBe clarifies what enterprise mashups are capable of, what they aren’t, and where their power lies.

Interview conducted by Alan Morrison

John Crupi joined JackBe in April 2006 and brought Deepak Alur and Dan Malks with him from Sun Microsystems, where Crupi was chief technical officer of the Enterprise Web Services Global Practice. Together, Crupi’s team has helped JackBe develop a mashup platform called Presto that includes the governance, security, and user interface capabilities necessary for the enterprise. In the process, Crupi has discovered quite a bit about how a mashup ecosystem might function, as well as how the development role of business units is expanding.

PwC: How did JackBe get started?

JC: JackBe started in 2001 as a technology company trying to enable call centers to take orders online using the browser. IE (Internet Explorer) back in the late ’90s would allow asynchronous communications, which has now become AJAX (asynchronous JavaScript and XML). Because of that, they could offer specialized applications that were very interactive, but only in IE at the time. They were able to solve a lot of the business problems call centers had getting a Web browser to behave more interactively.

This thing was so popular and so powerful that they decided to sell off the company and actually take the technology and build a company based on that. That technology is called AJAX and is now ubiquitous in browsers, but really, it was more or less the starting point. JackBe made the transition from an AJAX company to an enterprise mashup company in late 2005.

PwC: We’ve seen the term enterprise mashup for a couple of years now.
How has the concept evolved during that time?

JC: If you think about a mashup as combining data, or taking data from disparate sources and putting it together in a way that you can use it and interact with it in various different forms, that really sounds like what we do every day in Excel by copying and pasting and then manipulating it and running formulas and macros on it and maybe showing graphs. But what we do in the enterprise with cut and paste in Excel wastes a ridiculous amount of time.

The way we collaborate with that end product is that we e-mail a spreadsheet. So if you think about it, the Excel sheet might serve just one campus or destination. But if you automate or allow users to integrate and mash it up themselves and feed it into Excel or feed it to a portal or feed it to a Web app running on iPhone, then the enterprise really starts getting excited, because they have been trying to do this for years. IT is just way overburdened to do that.

PwC: You’re describing a single, online, persistent version of a macro that everybody has access to. So are we avoiding the problem of spreadsheet hell by making it persistent online and avoiding the version problem, or is there still going to be some of this aspect of a lack of control?

JC: If you talk about governance and you say, “We don’t have governance; we don’t know how to govern something,” that’s basically a showstopper: IT can stop any type of deployment or any type of application. They say we don’t have governance in place to do that.

But then you say, “Well, wait a second; I see that I have 5,000 spreadsheets that are going throughout my organization every day; how is that governed?” IT doesn’t govern that, and that’s true, right? It’s completely ungoverned. So in some sense, by introducing mashups and providing automatic connectivity into Excel as a point of destination, you actually have more governance around the data, because it’s almost as though you are controlling the data that they can get out into the spreadsheet, and not only that: you are also able to push it out.

So if I have data in my worksheet and I select an area, that can become a data service that’s pushed out, and I can then attach entitlements to it, so that others in my group can have access to that information. You actually get around sending spreadsheets for people to just view them or look at separate cells. Now you can start treating spreadsheets as a service because of enterprise mashups.

PwC: Are you envisioning that you would see a mashup capability in portals—or in a software as a service platform?

JC: If you think about it, in the last 10 years, not a lot of innovation has happened on the user side, especially in terms of UI (user interface). The innovations were in the browser or desktop application. So what we found was that this ecosystem of many UIs was already formed and that organizations didn’t want to be forced into one destination to see these mashups.
So, we completely pulled out of providing a tool kit of our own. Instead, we are looking at existing front ends as a publishing destination.

We have a concept called Mashlets, which is our terminology for an enterprise widget or a mashup widget. It’s a wrapper, and it’s a black box around whatever mashup I create. I can put a face on a mashup via a Mashlet, and I can publish that Mashlet to my portal as a portlet, I can publish it to a Web page, I can publish it to my iPhone and just consume it as a Web page, or I can push it into some of the next-generation portals like Netvibes and Pageflakes.

This is the trend that’s happening; I want mashups to be small pieces of data integration, but I also want to be able to push that data wherever I want to.

PwC: Are you making Mashlets available to Web developers rather than end users?

JC: No, actually it’s to the end user. A Mashlet is an auto-generated UI. A mashup in the enterprise is data that’s been integrated or combined; whether it’s merged or joined or filtered, it’s just data. We could get that data to the browser or to the spreadsheet. It’s just data. We can put all different types of faces on it, and that’s what we give the user.
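The idea Crupi describes (one mashup result, many faces) can be sketched generically. The Python below is our own illustration, not JackBe’s Presto API: the mashup is just combined rows, and each “face” is a renderer that publishes the same data to a different destination.

```python
# Illustrative sketch of the "data plus interchangeable faces" pattern.
# Function names and sample rows are invented for this example.

def merge_sources(*sources):
    """The mashup itself: combine rows from disparate sources."""
    merged = []
    for rows in sources:
        merged.extend(rows)
    return merged

def as_json_rows(data):
    """Face for a Web app or phone client: plain rows, ready to serialize."""
    return data

def as_html_table(data):
    """Face for a portal page: a minimal HTML table."""
    head = "".join(f"<th>{k}</th>" for k in data[0])
    body = "".join(
        "<tr>" + "".join(f"<td>{v}</td>" for v in row.values()) + "</tr>"
        for row in data)
    return f"<table><tr>{head}</tr>{body}</table>"

orders = [{"region": "East", "sales": 120}]
returns = [{"region": "West", "sales": 90}]
mashup = merge_sources(orders, returns)
```

Publishing to a portlet, a Web page, or a phone then means picking a renderer, while the underlying data stays the same.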

People are kind of sick and tired of hearing about SOA (service-oriented architecture) when it comes to the business unit because they have been hearing about this for five years or so. It’s going to be agile; it’s going to give you your ROI (return on investment); and so on. Well, in our experience, business units don’t see that SOA has really brought them too much. As a matter of fact, business isn’t even asking for SOA. But why are they asking for mashups? Because they see mashups as a way to get data integrated, whether they do it themselves, or they use a tool, or they have an analyst do it for them. Ultimately, it’s getting closer to them. This is a nonthreatening way for them to talk about what SOA should have given them before. So, we put a face on the SOA with mashups, and you can show them the mashup very quickly, and if in fact that mashup is incorrect, or is not visualized correctly, or doesn’t have all the data they want, they can see that quickly. They could see it in a matter of hours versus weeks or months. So mashups are SOA for the ultimate end user.

PwC: It sounds like there is a very strong connection between user spreadsheets and mashups. Both tend to take data from sources, combine it, and analyze it, but then use it for reporting or charting, not to collect new data. Is that a fair way of thinking about it?

JC: That has been the way that we’ve been used to doing things. And it’s been part of the problem, if you look at the BI (business intelligence) tools. A lot of them—once you get all the data models and analytics done—are fancy reporting engines. End users are used to doing creative, customized reports, but the reports are relatively dead; you can’t do much with them other than look at them and print them. The same thing with Excel. It turns out that Excel is a great place; it’s almost an enterprise canvas for enterprise users to take data and put it somewhere so that as soon as they get it, they can put it there, and they can do something with it and integrate it.
And we know that users want to get at more and more data faster and faster, so it’s kind of a hybrid model. You have a core piece of data you are always getting, but the real need is to connect it to other data, whether it’s siloed internally, maybe in an SAP system and in PeopleSoft where the systems won’t talk together, so you have to pull it out to different UIs, or external data. You want to be able to integrate or combine that data and see it in a much faster, dynamic way that doesn’t require you to go to IT and be put on a wait list for a year or two.

PwC: One of the things about consumer mashups is that they have a wisdom-of-crowds quality about them. The crowd, with the help of search engine algorithms or recommendation systems, acts as a filter to surface certain mashups—ones that millions of people have heard about. An enterprise tends to cover a smaller area. Do enterprises have the scale to mimic mass filtering?

JC: That’s a great question. The good news about being a startup, or at least solving problems in an emerging space, is that you start hearing people’s horror stories and dirt about what they have been doing in the past to solve problems or do workarounds. And one of the things we hear all the time is that a given company spent a ridiculous amount of time filtering through log files just to find out what their users are doing and then starting to invest in semantic analysis engines, ontologies, taxonomies, and all these tagging things that are looking at unstructured data. It would just be too hard to build an infrastructure to support that.

Now, what enterprise mashups give you—and what JackBe’s mashup platform supports—is a two-step process. We know that IT has to govern all the services that are being consumed. So all the services that will be mashed up or exposed to the user or the business are actually registered or, as we say, virtualized first in the system.
Whether it’s a database, a WSDL (Web Services Description Language) SOA service, or a REST (representational state transfer) service, it actually gets registered with our mashup server. Then it becomes

exposed for mashups. Because we act as an intermediary and all data is flowing between our mashup server and the service (which may be internal or external), we know what services are being used; we allow users to tag any service they are consuming, and they can apply tags to any mashup they create. You can rate services; you can rate mashups; you can do all these sorts of things and attach metadata to the services you are consuming and the mashups that you are creating. And we push it all out in what we call the mashup hub. Also, instead of just connecting people to people, we drive interest-driven networks, so that when people are consuming or creating a mashup, it may be around an area of interest. That’s how we start to create these informal or organic networks. If I could see mashups that were around my area of research, I would want to attach myself to the network that exists around them. I don’t really care too much about the people; I care about the artifacts and the systems, the things that they are mashing up. That is the more natural fit, we think, than just doing tagging and ad hoc searches.

PwC: Would you say that mashups that bring together structured and unstructured data will be more successful than mashups that rely primarily on structured transactional data?

JC: Yes, because users have a continually growing need to get at data, and much of that data is very unstructured, and a lot of it has to do with the way that data is created. We have a multitiered process. If we create mashups that are consuming unstructured data and providing more structure to it, that itself can be tagged by the user. So by adding more structure to that data so it can be consumed, we are not only betting on the fact that systems will be making data more and more structured; users will in fact participate in adding structure to this information.
I think the unstructured side will exceed the transactional side of data integration.

PwC: Do your developer communities still consist primarily of people from the chief information officer’s organization? They may be linked in to business units, but they are basically people who know what XML (extensible markup language) is. Or are you seeing people who would be thinking, I can do this with Excel, or I can do this with mashups; which one do I want to use?

JC: We do have to go through IT in many instances, but we are also noticing that the business is getting much more savvy in funding development themselves. They don’t go to IT and say, “Build the systems for me.” They really go to IT to say, “We need the systems, and we need access to these services, and provision me some boxes. I will fund the developers, or I have my own developers that are going to build this. But ultimately, I need this built for my business.”

PwC: So are mashups defining a boundary line that allows safe things to happen without having to use internal IT resources?

JC: That’s really the big trend here. Not only are the business units trying to get things done; they are trying to look for new revenue opportunities. Mashups also create new revenue opportunities, because now you can get data out in ways you weren’t able to before—not big pieces of the information in terms of large apps, but data. We get a lot of customers asking if we can mash up a million records. We say you would never want to mash up a million records. Mashups are intended to get to the user. You have access to your database, which knows how to manage a million records; the mashup addresses the small subset of records that you need to get to the user. This little pocket is being built and funded by the business. It looks like software as a service, but it’s internal. The business usually has developers who do nothing but create wrappers and service interfaces to these siloed systems. Then the business starts building these applications on top.
So this is the trend; that’s what’s starting to happen. •

Bringing order to chaos

The next suite—an intelligent business performance platform—blends business intelligence, rules, and process management, linking knowledge with the processes that need it.

Improving business prospects is not easy. It fundamentally requires insight derived from a combination of knowledge about how business processes work and sound business performance measures.

This insight can lead to new and improved business processes, which then can encourage the creation of business value. In today’s complex, global enterprise, the needed alignment is rare.

Information technology can help. Vendors are developing, and early adopters are implementing, what PricewaterhouseCoopers calls intelligent business performance platforms. These platforms support and integrate applications, and they monitor and manage business processes and outcomes. The three core components of this emerging platform include the following:

• Business intelligence applications, which collect and interpret business events. The percentage of visitors making a purchase at a Web site is a simple example.

• Business process management applications, which monitor and manage the company’s workflow. The automated routing of claims at an insurance company is an example.

• Business rules management applications, which automatically and consistently apply business policies. Exception handling in a financial transaction system is an example.

Today, with heroic effort, these applications can be stitched together to guide management and facilitate process enhancement.
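The three components can be pictured working together in miniature. The Python below is a hedged illustration, with names and thresholds of our own choosing: a BI measure computes the Web-site purchase percentage mentioned above, a business rule compares it to a threshold, and a process step routes the result.

```python
# Toy illustration of the three IBPP components wired together.
# The 2% floor and the queue names are invented for this sketch.

def conversion_rate(visitors, purchases):
    """BI measure: percentage of site visitors who made a purchase."""
    return 100.0 * purchases / visitors if visitors else 0.0

def check_rule(rate, floor=2.0):
    """Business rule: flag an exception when the rate drops below floor."""
    return "exception" if rate < floor else "ok"

def route(status):
    """Process step: exceptions are routed for review automatically."""
    return "review-queue" if status == "exception" else "archive"

rate = conversion_rate(visitors=5000, purchases=75)  # 1.5 percent
outcome = route(check_rule(rate))
```

In a real platform each function would be a separate product (BI, BRMS, BPM); the point of the pre-integrated suite is that this wiring is supplied rather than hand-built.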
Enterprises that create value through rapid process changes are beginning to define a market opportunity for a pre-integrated suite of tools that supports intelligent business performance management.

Drivers for improved business process insight

IT departments in today’s enterprises are being squeezed by three important pressures from the business:

• Internet business activities have real-time requirements that are crucial factors in maintaining customer loyalty, in supporting business partners, and in managing key business processes.

• The constant pressure exerted by globalization means that any competitor—whether local,

domestic, or abroad—poses a threat. As a result, organizations must compete effectively on service and product quality, responsiveness, and price—often, all three at once.

• Integration with partners adds additional pressure for business efficiency. Companies that are part of a supply chain no longer have the luxury of improving business processes at their own pace. Because of the integration with systems belonging to partners and customers, operations must be performed as quickly as the external systems require.

To compete, businesses must be more efficient than ever, and the focus on efficiency must remain an ongoing priority. All competitors will be constantly upgrading their own abilities to run processes optimally in the context of changing market conditions.

One of the principal solutions to the need for optimized business processes has been the use of business intelligence (BI) and business process management (BPM). BI helps companies identify key events and developments in different areas of operations, while BPM enables companies to streamline their business processes. Done right, BPM increases efficiency and effectiveness—including the appropriate handling of exceptional events.

BI and BPM are frequently combined with business rules management systems (BRMSs), which perform analysis and help coordinate processes. BRMSs also provide complex decision making—that is, the ability to automate decision making on the basis of extensive sets of rules and constraints.

Taken together, the three technologies of BI, BPM, and BRMS must be integrated with each other by an intelligent business performance platform (IBPP) and atop the enterprise IT infrastructure, as shown in Figure 1.
Often, however, enterprises implement only some of the technologies and grow organically toward the adoption of the others, as needs dictate.

As Figure 1 shows, the IT infrastructure is the foundation for intelligent business performance applications. The IT infrastructure provides access to information about business processes and outcomes. Enterprise Web applications, for example, provide necessary information to business intelligence applications.

Distinguishing characteristics of IBPP

IBPP technologies have a set of common characteristics that define their role in the enterprise. Most often they help organizations achieve better business processes. These characteristics include:

• Better use of real-time data from business processes. These technologies use monitoring tools that watch business activities, compare performance against established thresholds and targets, and identify exceptional events. Real-time data can result from the minimal monitoring of performance parameters or from highly complex analysis that looks for meaningful patterns in enterprise data.

• Reporting of the real-time data in user-friendly presentations such as portals or, more frequently, executive dashboards. These tools are designed to have straightforward top-level interfaces, so that users can tell at a glance the general state of a given process.

Figure 1: The IBPP stack. Business intelligence, business process management, and business rules management sit on an intelligent business performance platform, atop the IT infrastructure. Source: PricewaterhouseCoopers, 2008

Managers review the real-time data and regular reports to further optimize a process and enhance responsiveness. Organizations strive to establish a cycle of process optimization followed by monitoring and a new round of optimization. When successful, an organization’s ability to execute this optimization cycle on a consistent basis will be a defining competitive advantage.

• Implementation of IBPP processes as a series of loosely coupled, highly configurable services using common semantics. Taken together, the services form a complete and efficient business process. To achieve the component modularity as well as the needed integration, the business activities are implemented as Web services.

Usually, the IBPP architecture is service oriented. This is not an absolute requirement; however, the IBPP design works best when rigid, proprietary, command-and-control workflows are migrated to flexible, open, standards-based Web services.

These changes represent a significant shift for many organizations. Fortunately, BPM and BI technologies can be implemented slowly. They are not rip-and-replace solutions, but approaches that can be added incrementally to an IT infrastructure and serve as a target for the migration of existing business processes at a pace determined by the company’s need for business process efficiency.

The focus on business performance is fostering the creation of a new management position, the chief performance officer (CPO).
The CPO is in charge of consolidating, analyzing, and presenting analyses to senior management and to the board regarding the current state of companywide operations and identifying where performance deviates from expectations and from industry benchmarks. In addition, the CPO has responsibility for designing and helping to implement key performance indicators (KPIs) for a given business within the context of a specific industry.

Whether or not companies go so far as to set up a corporate performance office and appoint a CPO, they will need professional managers trained in the use of performance metrics to analyze the data and to recognize how the data tracks to specific events in the business processes. To do this, companies will first need to have good IBPP tools in place.

Business intelligence

Business intelligence (BI) is the discipline of gathering data about the results of specific business processes. BI is often combined with business process monitoring; however, BI relates less to the management of a business process’s internal operations than to how a given process operates within the context of the larger business model.

For example, a BI product can track the sales of a certain item by region for a certain time period. In contrast, a business process monitoring product can track how long it took for an order to be completed.
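That contrast can be made concrete with a small example. The order records below are invented; the point is that the BI metric aggregates business outcomes, while the monitoring metric measures the internal operation of the process itself.

```python
# Illustrative contrast between a BI metric (sales by region over a
# period) and a process-monitoring metric (how long each order took to
# complete). The order records are invented for the example.
from datetime import datetime

orders = [
    {"region": "West", "amount": 1200,
     "placed": datetime(2008, 3, 1), "completed": datetime(2008, 3, 4)},
    {"region": "East", "amount": 800,
     "placed": datetime(2008, 3, 2), "completed": datetime(2008, 3, 3)},
    {"region": "West", "amount": 500,
     "placed": datetime(2008, 3, 5), "completed": datetime(2008, 3, 9)},
]

# BI view: how the sales process performs within the larger business model.
sales_by_region = {}
for o in orders:
    sales_by_region[o["region"]] = sales_by_region.get(o["region"], 0) + o["amount"]

# Process-monitoring view: the internal operation of the process itself.
cycle_times_days = [(o["completed"] - o["placed"]).days for o in orders]

print(sales_by_region)        # {'West': 1700, 'East': 800}
print(max(cycle_times_days))  # slowest order took 4 days
```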

Because BI products track metrics that relate to distinct areas of activity, they tend to be specific to various business departments and lines of business in an organization. Human resources (HR) and marketing are typical areas of focus for BI products.

According to analysis by Gartner, BI has represented one of the largest areas of investment in business improvement during the last few years. This surge is attributable to several factors:

• Greater acceptance of the need for real-time metrics and dashboards to measure business activities

• The growing importance of providing guidance to operations based on analysis of patterns in enterprise data (detection of fraud, money laundering, unusual customer patterns, and so forth)

• Regulatory compliance (demonstrating compliance with Sarbanes-Oxley, the Health Insurance Portability and Accountability Act [HIPAA], and so forth)

• The availability of better tools for data transformation and data mining

Another key reason for the greater adoption of BI is its own increasing maturity. For years, BI was often synonymous with expensive solutions that generated numerous reports containing little real value. However, improvements in the tools have made it easier to customize reports, view real-time data in dashboards and portals, and obtain relevant information in a timely fashion.

IT departments can deploy BI solutions incrementally without disrupting existing systems. However, this flexibility does not imply that BI requires no changes. BI must have clean data to provide useful results, and the challenge of establishing single, consolidated, and clean (so-called gold version) instances of data records is a persistent issue that IT departments must tackle before they can consider BI.

The primary approach to this problem is master data management (MDM). According to a 2007 survey performed by InfoWorld magazine, IT managers reported that the two largest challenges to implementing BI were the problem of data quality and, to a lesser extent, integrating BI with operational system data sources.

Going forward, one of the biggest changes in BI will be the transition presently in progress toward active business management based on the use of real-time data. BI is evolving from a pull activity, in which data is obtained only when specifically asked for, to a push activity, in which constant updates of real-time business data are pushed to dashboards and portals.

As business line managers assume greater control of specific business processes and therefore manage closer to the business operations, this demand for real-time data on a push basis is expected to accelerate.

BI market

The BI market is dominated by a few enterprise-tools vendors, which spent much of 2007 acquiring smaller, market-leading BI companies. These major vendors are Business Objects (acquired by SAP), Cognos (now part of IBM), Hyperion (now part of Oracle, but with a suite of products different from Oracle’s), Microsoft, and Oracle.
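The pull-to-push evolution described above amounts to a change in who initiates the data flow, which a brief sketch can illustrate. The class and method names here are hypothetical, not drawn from any BI product.

```python
# A hypothetical sketch of the pull-to-push shift: in pull mode the
# dashboard asks for data on demand; in push mode the data source
# notifies every subscribed dashboard as soon as a new value arrives.

class MetricSource:
    def __init__(self):
        self.latest = None
        self.subscribers = []
    def subscribe(self, dashboard):
        self.subscribers.append(dashboard)
    def record(self, value):
        self.latest = value
        for d in self.subscribers:      # push: update arrives unasked
            d.display(value)

class Dashboard:
    def __init__(self):
        self.shown = []
    def display(self, value):
        self.shown.append(value)
    def refresh(self, source):          # pull: value fetched on demand
        self.display(source.latest)

source = MetricSource()
pushed = Dashboard()
source.subscribe(pushed)
source.record(42)       # the pushed dashboard updates immediately
print(pushed.shown)     # [42]
```

A pull-only dashboard would show stale data until its next `refresh`; the subscription makes the update arrive as the business event happens.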

A second tier of independent BI vendors includes Information Builders, MicroStrategy, SPSS, and to a lesser extent, SAS. As market consolidation continues during the next few years, these independents are likely to be acquired or to merge to remain competitive with the larger, established vendors in this market.

Demand for BI solutions is expected to expand the market for these tools at a compound average growth rate (CAGR) of 8.6 percent through 2011, according to Gartner. Reasons for this growth, in addition to those listed previously, include the following:

• A newfound appreciation by senior management of the value of data derived from business processes and operations

• Greater familiarity with quantifiable measures by business analysts within organizations

• The projected adoption of BI by midsize companies

Some of the prospective midsize customers, however, might opt for open-source solutions, such as Pentaho’s Open BI Suite.

Longer term, BI may begin disappearing as a standalone product category. The acquisitions of BI vendors in 2007 have so far not resulted in any notable integration of product lines. However, that integration is almost certain to come to pass, probably starting with SAP. At that point, BI products simply will be another module in a larger enterprise tools suite. This suite will be characterized by a common shared metadata layer with which all tools can interact intelligently.

This shared metadata design is sometimes referred to as a closed-loop approach. It’s clear, however, that closed-loop integration must not destroy the loose coupling between modules, so that organizations are still able to adopt parts of the BI or enterprise resource planning (ERP) suites without having the entire suite imposed upon them.

BI may disappear in another direction as well: BI tools may be subsumed by BPM suites. Already, BPM tool kits have strong reporting capabilities, including dashboards and portals.
Integrating those technologies with the data management and data analysis strengths of BI would begin that cycle of consolidation.

Figure 2: A typical BPM activity life cycle. The cycle runs from design and modeling through execution and operation to optimization, and back to design.
Source: Adapted from IDC, 2008

Business process management

Business process management (BPM) focuses on the optimization of business processes. Many organizations are actively pursuing BPM for two principal reasons:

• The need for leaner business processes

• The demand for agility in regulatory compliance

Although the need to be lean is well understood, the demand for agile compliance is less evident. As companies have been forced to tighten record keeping and more closely manage their processes to assure auditability and compliance (particularly with Sarbanes-Oxley and HIPAA), they’ve come to realize that existing processes are not easily adapted to the new requirements.
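A toy rendition of the cycle in Figure 2 may help: a process design is executed, an outcome is monitored, and the design is adjusted for the next round. The cost function and parameters below are invented purely for illustration.

```python
# The Figure 2 cycle reduced to a toy loop. The "process" is a single
# numeric parameter; monitoring measures an outcome (cost) and
# optimization nudges the parameter for the next round.

def execute(batch_size):
    # Pretend cost per item falls as batch size approaches 50
    # and rises again past it.
    return abs(batch_size - 50) + 10   # monitored outcome: cost

def optimize(batch_size, cost, step=5):
    # Try a neighboring design; keep whichever monitors better.
    candidate = batch_size + step
    return candidate if execute(candidate) < cost else batch_size - step

batch = 30                  # initial process design
for _ in range(4):          # design -> execute -> monitor -> optimize -> ...
    cost = execute(batch)   # operation plus monitoring
    batch = optimize(batch, cost)
print(batch)                # converges toward the optimum at 50
```

Real BPM cycles optimize whole process models rather than one parameter, but the shape is the same: each round of monitoring feeds the next round of design.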

Managers are increasingly targeting business activities impacted by regulation for increased transparency and control. Many regulated activities have been poorly documented and left to evolve without sufficient management oversight. BPM is one solution to this challenge.

The foundation of BPM is the view that a business process is an asset—a single coherent and tangible entity that can be managed as a unit made up of numerous activities.

This view underlies the series of optimization-oriented activities that are the core elements of BPM. These include modeling the process, designing and implementing changes that better deliver on the goal of optimization, monitoring the business process by using real-time data, and finally, using the derived data to create the potential for additional cycles of optimization. This cycle is shown in Figure 2. Different BPM vendors and analysts have slightly different names for the various stages, but the basic cycle of activities is constant.

These BPM activities are frequently grouped into a higher-level cycle:

• Business process modeling

• Implementation

• Business activity monitoring

Each of these three phases has its own set of tools. Occasionally, as in the case of Oracle, for example, these tools are integrated so that outputs generated by the modeling tool can be used as inputs directly in the implementation stage, and later they can help dictate the monitoring stage.

Getting started with BPM

Organizations considering BPM as an optimization strategy need to take certain preliminary steps to ensure success. The first and perhaps most important is to have stable business processes that can be modeled and mapped.
If a company has not reached this level of maturity and so does not understand its operations in terms of defined processes, it is not ready for BPM.

SAP’s vision of a business performance product suite

PricewaterhouseCoopers sat down with Doug Merritt at SAP’s research labs in Palo Alto, California, to discuss the emerging category of business performance applications and how SAP plans to introduce the technologies to new and existing customers. Merritt is a corporate officer and member of SAP’s Executive Council.

PwC: How are you framing these new categories of applications that integrate process flows, business intelligence, and business rules to your client base so that they feel compelled to adopt a new suite of applications similar to the ERP (enterprise resource planning) adoption pattern?

DM: I don’t think that big, broad-based market categories like ERP start as big, broad-based categories. What we now call ERP began as a series of application modules in related but separate functional domains: financials, HR, manufacturing, materials management, sales, and distribution.

Over time, it made more sense to even more tightly package and integrate these offerings into a unified ERP suite, but for the first three, four, five years after we named it ERP, people were still buying the offering for specific functionality, such as financials or HR or manufacturing. The way that we’ve framed our new category of applications—what we’re loosely calling business user apps or performance optimization apps—is by targeting specific pain points around knowledge-centric work that is relatively chaotic and unstructured today. We chose to start our focus on the “office of finance” within the organization, but that was simply based on our understanding of need and market opportunity. That is why you saw governance, risk, and compliance

(GRC) first, and then financial performance management next, which will quickly be followed by other performance management and new applications focused on a line of business. Within the performance optimization apps category, we’re creating mini categories that are for specific needs.

PwC: Do you anticipate an evolving application suite?

DM: Absolutely. We’ve been trying to focus on discovery of the business-critical categories that are a top imperative within organizations. Compliance was pretty obvious. That wasn’t a hard one to pick. But the real challenge is understanding where, within compliance, is the “burning platform”? That is, what is the most ubiquitous and horizontal business pain?

We then have to choose the relevant package of complementary business solutions that we could assemble. It needs to have a high-level business purpose, so that the CFO feels strong value in buying an entire GRC suite.

Our starting point was our Access Controls offering. This was our primary value proposition. Almost every prospect we’ve approached says, “I have to have a way of understanding segregation of duties and conflicts within segregation of duties.” What makes the suite concept work is that closely tied to access-control issues are process controls, and closely tied to those is the need for a risk management framework.

And we broaden the suite a little bit and say, “What about trade compliance?” and “What about environmental compliance?” We wrapped that suite together and said, “Here’s our GRC offering.” And it has gotten great traction. We had more than $200 million of license sales just in that category last year, and it was only its first full year of production.

PwC: Where else have you used this approach?

DM: We took a similar approach with financial performance management (FPM); again, trying to redefine the category.
For a long time, the market has been focused on delivering budgeting and consolidations solutions, but very closely related to these is the need for profitability management and strategy management. So, we took a next-generation approach to a truly usable and fully integrated planning and consolidation solution, and we tied a profitability and strategy management message with it to create a bigger and more compelling bundle. The value proposition remains the pain point that planning and consolidation solutions don’t address for the business user.

PwC: What’s next at SAP?

DM: We’ve had a team working for almost a year on other integration scenarios between GRC and FPM, areas such as risk-adjusted planning, compliant consolidations, compliant close, and strategic risk management. And we keep looking for other line-of-business pain points where we can find a unique angle to meet an unfilled need. We are asking, “What is the compelling, must-have functionality needed to optimize performance for sales, marketing, supplier, procurement, and human resources management?”

The key point is that these are not highly scripted, standardized applications similar to ERP. To be successful with this next generation of applications, we are positioning the user as the control point, offering guidance and tools that anticipate the knowledge aggregation process. The next applications focus heavily on supporting collaborative, creative activities that leverage business intelligence and an on-the-fly ability to modify processes that support teams that span the functional hierarchy of an organization, or even work outside of the organization.

For this reason, many organizations begin by deploying BPM at the departmental level. In some cases, the deployments begin at subdepartment levels in which a single business process is mature enough for modeling and can be isolated sufficiently to serve as a first deployment of the process. The intention is that if the project works successfully, it will be expanded to the full department and then to larger portions of the organization.

In some organizations, the desire to implement BPM has provided the impetus to clean up business processes and make them amenable for modeling by defining them clearly and identifying activities that are nonessential. These activities, in theory, are among the items that the organization will remove in the BPM implementation process.

Thereafter, monitoring is installed to obtain real-time data that is made available using the standard reporting mechanisms: portals, dashboards, and reports. Data gleaned from these sources then fuels a new cycle of optimization.

It should be noted that few organizations today are sufficiently mature to repeat this cycle on a continual basis. More often, an organization completes the BPM process and a long time gap ensues before a new cycle is undertaken.

As corporate performance becomes a defining aspect of a company’s competitive differentiation, cycle times will decrease. The role of a CPO is key to championing the changes necessary for this strategy to work successfully.

Once processes at an organization are mature, or at least well defined, BPM becomes possible.
The first step is to model the process or processes using any one of several standard notations. The organization then analyzes the processes for inefficiencies and makes changes to the process models.

After appropriate review, implementation begins, almost always relying on a Web services architecture to provide the underlying infrastructure. The shape of the implementation is determined by formal process definitions generated from the modeling stage, including the definitions of needed services and, in the case of Web services, the Business Process Execution Language (BPEL) code that makes sure the services follow the proper sequence.

BPM market

The BPM market is dominated by a small number of vendors that have true end-to-end solutions. These include EMC Documentum, IBM, Microsoft, and Oracle. A large number of niche vendors provide key parts of the BPM solution to companies that have specific needs.

Vendors without end-to-end enterprise tools but that offer BPM packages include Lombardi Software (which recently began offering BPM software as a service), Pegasystems (which integrates a high-end business rules engine into its product), and Savvion.

In the open-source market, the leading BPM vendors are Intalio and the JBoss division of Red Hat.
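The modeling-to-execution handoff described at the start of this section, in which a formal process definition dictates the sequence of service invocations, can be sketched in miniature. Plain functions stand in for Web services here, and the definition is an ordered list rather than BPEL; every name below is invented for illustration.

```python
# A loose sketch of what an execution engine does with a formal process
# definition: invoke independent services in the sequence the definition
# dictates, passing each step's output to the next. Real engines
# orchestrate Web services via BPEL; plain functions stand in for them.

def receive_order(order):
    return dict(order, status="received")

def check_credit(order):
    return dict(order, credit_ok=order["amount"] <= 1000)

def ship(order):
    status = "shipped" if order["credit_ok"] else "held"
    return dict(order, status=status)

# The "process definition": an ordered list of service names.
definition = ["receive_order", "check_credit", "ship"]
services = {"receive_order": receive_order,
            "check_credit": check_credit,
            "ship": ship}

def run_process(definition, services, payload):
    for step in definition:          # the engine enforces the sequence
        payload = services[step](payload)
    return payload

result = run_process(definition, services, {"id": 7, "amount": 400})
print(result["status"])   # shipped
```

Because the sequence lives in the definition rather than in code, reordering or inserting a step is a change to data, which is what gives model-driven implementation its agility.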

Demand for BPM tools is expected to exceed that for BI tools. Gartner predicts the market will grow from $1 billion to $2.6 billion by 2011, a CAGR of 23.8 percent. To put this number in context, IDC projects that of the $100 billion spent on process automation software in 2007, only $1 billion went to BPM tools. So, BPM is still a small niche, but as the projected CAGR indicates, it’s one that should grow quickly during the forecast period.

One emerging trend that might fuel this growth is the closer collaboration between BPM vendors and enterprise content management (ECM) systems vendors. This trend is driven by the basic fact that much of the workflow in business processes today is based on the movement of documents. The trend is especially evident in the case of Documentum, an ECM vendor that is now the BPM division of EMC, the software and storage vendor. Likewise, alliances such as the recently announced product integration between Intalio and ECM vendor Alfresco Software are further examples of this trend.

Business rules

Business processes, in whatever form they take, depend heavily on business rules. Rules drive the activities, coordinate data movement and workflow, and provide decision automation in complex situations.

In the past, business rules were hard-coded into applications by developers; today, rules are discrete components that encapsulate logic. They are commonly developed by business analysts—rather than IT staff—and they are executed inside a business rules engine (BRE), a high-performance tool that can test thousands of individual rules or rule sets that have been bundled for a specific business purpose.

Most BPM vendors provide a basic BRE and rules-development tools as part of their solution. And the design of modern BPM systems around a Web services structure typically extends to the BRE components. This makes the rules accessible to other software designed to make use of Web services.

The use of a BRE rather than hard-coded rules gives BPM tremendous agility.
A change in corporate policy or a new regulation usually can be implemented quickly by simply adding or changing rules, so that a process treats a business event differently.

Organizations that have complex process requirements, however, need rules support beyond what is found in BPM product suites. These companies rely instead on business rules management systems (BRMSs), which are comprehensive, standalone, high-volume business rules design and execution systems.

Financial services companies, for example, use BRMS products to process loan applications. The data is analyzed via rules, and a decision is made regarding the applicant’s suitability for different loan products. The analysis, triage, and decision automation often involve the testing of hundreds of rules, sometimes thousands, before a tailored offering (including terms and rates that match the applicant’s situation) is formulated. BRMSs can make these decisions in a matter of seconds.

This automation of decision making provides significant benefits and guarantees that decisions are made consistently and impartially. Most BRMS products have built-in auditing functions, so that in the event of a complaint or inquiry, the financial institution can show exactly which rules were applied and why. For this reason, BRMSs are also useful for ensuring and demonstrating regulatory compliance.

For enterprises, BRMSs provide great agility, not only because they automate decisioning but because they enable organizations to modify policies or implement new programs quickly. For example, to provide a new loan offering for select applicants, a financial services firm needs only to add rules to the rules repository. Likewise, heavily regulated industries can quickly implement new regulations and requirements.
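The loan-decisioning pattern just described can be reduced to a toy example: rules live in a repository as data rather than code, the engine evaluates them, and an audit trail records exactly which rules fired. The rule names and thresholds below are invented, not drawn from any actual BRMS.

```python
# An illustrative miniature of BRMS-style loan decisioning. Rules are
# data, not hard-coded logic; the engine records which rules passed so
# the decision can be audited later. Thresholds are invented.

RULES = [
    ("min_income",   lambda a: a["income"] >= 30000),
    ("score_floor",  lambda a: a["credit_score"] >= 620),
    ("debt_ceiling", lambda a: a["debt"] / a["income"] <= 0.4),
]

def decide(applicant, rules=RULES):
    # Evaluate every rule, keeping a per-rule audit trail.
    audit = {name: bool(test(applicant)) for name, test in rules}
    return {"approved": all(audit.values()), "audit": audit}

decision = decide({"income": 48000, "credit_score": 700, "debt": 12000})
print(decision["approved"])   # True: every rule passed

# Changing policy means editing the rule repository, not the engine:
RULES.append(("flood_zone", lambda a: not a.get("flood_zone", False)))
```

The last line is the agility point: a new policy arrives as one more entry in the repository, and the next decision is audited against it automatically.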

Business rules market

The BRMS market is smaller than either the BI or BPM segments, and it is dominated by vendors of smaller size. The principal vendors of pure-play BRMSs include Corticon, Fair Isaac, ILOG, and Pegasystems. In open-source software, the leading package is Drools, which was recently folded into Red Hat’s JBoss product line.

The small size of these vendors has led to continual speculation that they would be bought by larger BPM or enterprise vendors. However, that projection has not materialized, although as discussed later, the interest of acquirers might be satisfied through other means.

The BRMS market is a mature market with established vendors and solid products. In 2006, IDC estimated the market to be $230 million and projected it to grow at a CAGR of 19.3 percent through 2010.

Although most rules vendors sell BRMS packages as standalone products, they are increasingly working with BPM vendors to bid together on requests for proposals (RFPs) and to offer solutions that integrate their respective products. This has been particularly evident in the co-marketing between ILOG and EMC Documentum, as well as in similar relationships between ILOG and other BPM vendors.

In the near term, the BRMS market segment is unlikely to be subject to the consolidation that has occurred in the BPM market; rather, BRMS vendors will continue as standalone companies. Most BPM tools already have rules engines, so a BRMS acquisition would provide only incremental lift. Moreover, the major BRMS vendors offer products and services unrelated to BPM—thereby making an acquisition expensive.
In addition, the good prospects for growth in the BRMS sector, particularly in industries where decision automation is critical—banking, financial services, and insurance—suggest that the BRMS vendors would be resistant to acquisition overtures.

The quality of rules engines in BPM solutions will likely emerge as a competitive differentiator for BPM vendors; as a result, BPM vendors will pursue original equipment manufacturer (OEM) relationships or other close alliances with BRMS vendors.

Effects on enterprises

For enterprises to improve business efficiency by using intelligent business performance products, they need to have achieved a specific level of IT maturity. Two milestones, in particular, must have been reached:

• The organization, or at least the department, must have defined business processes. This step is the sine qua non for optimization to deliver much benefit. Some industries, such as manufacturing, healthcare, and financial services, have attained this level. Other industries are still behind the curve and must address the lack of formal process definitions first.

• The IT organization must be capable of deploying and using a service-oriented architecture (SOA). Even though SOA is not required, it is emerging as one of the best technology architectures for designing a highly responsive and efficient enterprise.

Most IT departments have some Web services experience, and therefore have practice with their implementation. However, for some organizations, the shift to the new model and away from monolithic, highly integrated approaches represents a difficult cultural and technological change. Until these organizations can move toward greater acceptance of services-based architectures, business process-oriented solutions will not deliver the expected benefits.

Most analysts believe that process-based business optimization will continue to grow because of a track record of effective implementation and because a process focus does not use technologies that require rip-and-replace change. Instead, software packages can be folded into existing IT infrastructure and, once installed, can lead to continually improved processes within the context of a single business process, a department, or even an entire enterprise.

For this reason, some departments have implemented some of these technologies without the participation of IT. Due to the flexibility of services-based implementations, integrating these projects—whether done by IT or the business units—is not terribly difficult or expensive.

The incursion of the Web deep into the heart of corporate operations has already demonstrated that the rate of change of technology is increasing and that IT departments must keep up. To do so, they need lean processes coupled with infrastructure that is reconfigurable and adjustable, so that the overall organization can leverage the new technologies to respond effectively to customer needs and competitive pressures.
Intelligent business performance technologies provide exactly that capability while fulfilling the mandates for efficiency, agility, and regulatory compliance.

Outlook: Intelligent business performance platform

Enterprises use three key technologies to be more agile and more responsive in the face of changing competitive pressures and market vagaries. These solutions fit together naturally:

• BI and BPM both generate reports as a key benefit, although they report on different domains.

• Both technologies also depend on business rules—BPM more so than BI—so there is an undeniable linkage between these technologies.

Moreover, in many scenarios, the same enterprise, or even the same division within an enterprise, would use all three technologies. Vendors of BI and BPM tools are well aware of these possibilities, so it is fair to expect that eventually the major enterprise vendors—especially IBM, Microsoft, Oracle, and SAP—will formulate integrated, end-to-end product suites that contain substantial products from all three categories.

Although the IBPP will take some time to come together as an integrated suite of software, its advent is likely to push the constituent technologies into a much higher rate of adoption. Likewise, in small and midsize businesses, a similar consolidation among open-source products will do much to introduce these technologies to companies unaccustomed to focusing their IT efforts on business process optimization.

For more information on the topics discussed in this article, contact

Acknowledgments

Advisory
Sponsoring Partner
Paul Horowitz

US Thought Leadership
Partner-in-Charge
Tom Craren

Center for Technology and Innovation
Managing Editor
Bo Parker

Editors
Vinod Baya, Alan Morrison

Contributors
Andrew Binstock, David King, Susan King, Hari Rajagopalachari, Christine Wendin

Editorial Advisors
Larry Best, Galen Gruman, Mark Haller, Glen Hobbs, Hema Kadali, Jim Kinsman, Robert Lattimore, Bud Mathaisel, Justin McPherson, Avinash Rajeev, Jim Ramsey, Terry Retter

Copyeditors
Lea Anne Bantsari, Paula Plantier

Marketing
Angela Chambliss

Transcription
Paula Burns, Saba Mirza

Graphic Design
Creative Director
Howard Allen

Designers
Laura Tu, Marina Waltz

Illustrations
Bruce Leininger

Online
Director, Online Marketing
Jack Teuber

Designer and Producer
Joe Breen

Reviewers
John Benge, Eric Berg, Randy Browning, Bill Cobourn, Sean Middleton, Rob Scott, Ricardo Suieras, Gerard Verweij, Bob Zukis

Special thanks to
Randy Churchill, Victoria Huff, George Kennedy, Don McGovern, Christof Menzies, Jacqueline Olynyk, Sohail Siddiqi

Industry perspectives
During the preparation of this report, we benefited greatly from interviews and conversations with the following executives:

John Crupi, CTO, JackBe
Henning Kagermann, chairman of the executive board and CEO, SAP
Sheldon Laube, partner and director, PwC Center for Advanced Research
Doug Merritt, president, SAP Labs North America
Lakshmi Narayanan, vice chairman, Cognizant
Glenn Ricart, managing director, PwC Center for Advanced Research
Richard Rosenblatt, cofounder, chairman, and CEO, Demand Media
Subramanian Ramadorai, CEO, Tata Consultancy Services

Subtext

The intelligent business performance platform
Business intelligence, business process management, and business rules are piece parts. But converged, they can accelerate knowledge-driven activities.

Agile management
In an era of continual change, the leading edge has embraced the need to push authority and accountability to lower levels of management. But a devolved management approach in fast-changing markets implies the need to tackle risk first, and then governance, all with the help of the best information available.

Web 2.0
The most important business aspect of social networks isn’t that they’re social—it’s the networks. How can enterprises use these to tap into the power of the informal organization?

IT complexity
Cars have evolved to be internally complex, but simple to operate. Why can’t IT be the same way?

Privacy
The expression “You have no privacy—get over it” has been replaced with “You can have privacy—enforce the policies you’ve set with the IBPP rules engine.”

Application
Instead of functions given to the user, they’re functions the user has more control over.

Decision-making support
“Support” now means the lowest-level decisions get automated. Instead of information, the ability to act on information. Instead of passive automation, active.

Comments or requests? Please go to (…)

PricewaterhouseCoopers provides industry-focused assurance, tax and advisory services to build public trust and enhance value for its clients and their stakeholders. More than 146,000 people in 150 countries across our network share their thinking, experience and solutions to develop fresh perspectives and practical advice.

© 2008 PricewaterhouseCoopers LLP. All rights reserved. “PricewaterhouseCoopers” refers to the PricewaterhouseCoopers global network or other member firms of the network, each of which is a separate and independent legal entity.
