ARTICLE IN PRESS

T. Williams et al. / International Journal of Project Management xxx (2009) xxx–xxx

up the OGC in 2000, pulling together staff from various agencies [14]. The report led to the establishment of the OGC (2004) Gateway Process™ with six well-defined, standardized and documented Gateways. Gateway Review 0 looked at strategic management at the programme level, and Gateways 1–5 at the project level, covering different stages of the project life-cycle. Private sector engagement came from the use of experienced consultants who had been individually accredited by OGC for Gateways. The Six Gateways start at Ministerial level and work all the way down to suppliers. The Parliamentary/Governmental level is undertaken by mechanisms outside this study.

Categorisation came later, looking at high political significance, riskiness of the programme and cost. At the top level were the "Top 20" Mission Critical projects, the OGC sitting on the project board. The next level was "High Criticality": for these, Gateway reviews had to use senior people or independents. Different rules applied to "Medium" and "Low Criticality" projects. Later still, a general concern for better programme management gave rise to the development of small Centres of Excellence as part of the framework, bringing "best practice" to the Department, acting as an OGC liaison point within a Department and reporting directly to the Head of Department. More recently came a Project Initiation Process. The espoused aim of the framework is specifically for the OGC to achieve financial savings (according to procedures laid down by the National Audit Office). At the time of this research, the OGC worked by influence; its recommendations had not then been mandated (although this is set to change).
This is the traditional UK civil service culture. The OGC did not consider individual project Gateway reports; rather they looked for systemic trends. Reports on a particular project went only to the OGC and the sponsor (the "Senior Responsible Owner"/SRO), although special reports on the top "mission critical" projects go to the Prime Minister's Office. A substantial number of people were involved in implementing the framework and giving advice.

A Treasury report entitled "Transforming government procurement" [15] stated that the OGC would become more focused, powerful and smaller, and, since this research was carried out, the Gateway Process has been mandated, with four areas of development in the OGC framework:

Major Projects Portfolio: The system of criticality has been replaced by the Major Projects Portfolio, a list of the key projects across the public sector for delivering the Government's service imperatives. A single integrated quarterly report on the health of the Government's Major Projects Portfolio will be produced in conjunction with the Cabinet Office.

Major Projects Review Group: This is a scrutiny committee for major Government projects, sponsored by the Treasury, challenging projects on deliverability, affordability and value-for-money. Its intervention will not only help the team, but will also be of the nature of a scrutiny, so will have much stronger power, with an emphasis on actions to be taken. It consists of eleven very senior members of the Civil Service or Government agencies.

Enhanced Gateway Reviews: There will be a new overarching rating of "delivery confidence" to supplement the current rating, indicating the Review team's assessment of their confidence that the project will deliver its intended outcomes and benefits.
When there is a "Red" rating, the report will not only go to the sponsor, but will be escalated to the Head of Department and beyond. The reviews will also include action plans, monitored by the OGC.

"Starting Gate": A new intervention is to be introduced, intended to provide assurance at the stage of developing major new policy options, prior to initiation of a project or programme.

2.2. UK Ministry of Defence (MoD)

The one major section of the UK public sector that uses a different framework is the MoD. The MoD has always had an "extended life-cycle", extending both very early and very late. The framework came in as the relationship with industry changed, becoming more co-operative, and ensuring that both the whole industrial base and UK sovereign capability are considered. Contracting defence budgets provided the motivation for value-for-money and for procuring more accurate predictions. The CADMID process, part of SMART acquisition, was introduced in 1998, following McKinsey work, which also showed the need for a stronger customer within the MoD. This became known as "Capability Management" and was led by a Deputy Chief of Defence Staff. The framework is anchored within the MoD Main Board. Following the McKane report [16], the procurement and logistics agencies were unified into "DE&S". This encompassed the other espoused goal of the framework: to manage the MoD's projects as a single portfolio in order to get the best capability for the MoD as a whole. The UK MoD system works with different types of projects, each having a different categorisation. There are two Gates: the Initial Gate to release funds for assessment, and the Main Gate to release funds for the main project.
Projects go to the Investment Appraisal Board via two routes simultaneously: from the advocate of the project (the SRO) and via independent scrutiny (within the MoD but independent of the project). A preliminary "Foundation Review" is also being brought in. The system is vertically integrated, in that Gates look at the entire project, including the industrial base. Each project is undertaken by an "Integrated Project Team" (IPT), responsible on the project to the SRO, but responsible overall within DE&S. Thus, the MoD considers the whole portfolio of projects; the "Capability" customer considers the programme; and the IPT, the individual project. The Chief of Defence Materiel reports to corporate targets on DE&S overall performance. Current transitions of the framework consist of minor changes following the McKane report.

Please cite this article in press as: Williams T et al. An investigation of governance frameworks for public projects in Norway and the UK. Int J Project Manage (2009), doi:10.1016/j.ijproman.2009.04.001
2.3. Norway

The triggering incident in Norway was a series of unsuccessful major projects and repeated project overspends during the 1980s–90s. Deputy Secretary General of the Ministry of Finance, Peder Berg, led a government committee investigating a number of project cases [17]. The Ministry of Finance initiated the development of an obligatory Quality Assurance Scheme in 2000, with mandatory external assessment of projects, performed by consultant companies, before the financing decision by Parliament. This was mandatory for all state-financed projects over NOK 500 million/£42 million, excluding Oil and Gas. The goal was to ensure improved quality-at-entry. It was a bottom-up process within the Ministry, with Peder Berg as a driving force. The decision to introduce this governance framework was made by the Prime Minister's Office. In 2005 there was a second generation of the framework, reflecting the need to do something at an earlier stage. The same entity is responsible for the governance framework across all sectors, with few exceptions. For both generations of the QA Scheme the intention was to establish a system where politics and administration are well divided, with the interplay between these two sides well understood. The whole framework is a control measure. Control rules are documented in the contracts between the Ministry of Finance and the consulting companies, the control object being the documents assessed in the regime.
There are two gateways. QA1 focuses on the rationale of the project: it covers the early choice of concept and strategy, and the decision to initiate project pre-planning, using a compulsory dossier of four documents, and looking at many alternatives. QA2 considers the decision to finance the project, looking at one alternative only, and controlling the Project Management Plan, with several sub-documents and a focus on cost. QA1 and QA2 provide a tool for control from the top: Parliament–Government–Ministry–Agency. Vertical integration stops at the agency level and the private sector is not addressed. There are several coordination Forums where the Ministry of Finance gathers key interested people for discussions, often resulting in common understanding and definition of terms and professional standards. The Concept Research Programme supports the development of the regime and studies the practices of the agencies and consultants.

As soon as the new framework was introduced in 2005, the need to develop new common definitions and guidelines was evident. Using the same model as in the previous introduction period, a number of development processes have been started, resulting in a series of new guidelines.

2.4. Comparison of frameworks

The three initiatives seem to have been prompted by similar developments and motivations; the OGC and Norwegian initiatives are both anchored at the top political level and organised under the Ministry of Finance. OGC goals are more explicit, administratively focused and measured in terms of money; in Norway there are more clearly politically anchored goals, which do not specify the expected effect of implementation. All frameworks aimed to include transparency, being open to scrutiny, and particularly candid about the basis for decision-making.
Also included were learning, willingness to change, the setting of common, high professional standards, and political anchoring of the framework at a high level, or non-political QA/Gateway review. The process of development, however, was genuinely different. In Norway the initiating process was bottom-up, as was the implementation of the improvement. In the UK both processes were top-down, as was the implementation of the management system. Different strategies were chosen: Norway breaking with tradition and introducing a new arrangement, the UK building on tradition. The Norwegian and MoD's frameworks are mandatory; the OGC framework currently works by influence. The Norwegian framework is a bottom-up process of learning from cases, transferring experience to other sectors and building "the new profession". The UK OGC and MoD frameworks, to some extent, are a top-down introduction of a common "quality system", the Centres of Excellence representing the "new profession". Both Norway and the OGC have established a support organisation looking for systemic trends: in the UK as a permanent public entity; in Norway as an external research programme. The MoD reports on systemic trends at a top level. The OGC looks only at systemic trends; Norway and the MoD also report on single cases. Norway has a centralized co-ordination forum, while the OGC has established distributed "Centres of Excellence". (The MoD is already a single, organised entity.)

Comparison of the two frameworks highlights some differences. Vertical and horizontal integration is different.
A notable characteristic of the Norwegian framework, and its components, is its simplicity, with a more macro-analytic perspective; the UK side is more comprehensive, and adequate for more detailed control measures at a lower hierarchical level, being more micro-analytic from a PPM point of view. The organisation implementing the UK governance framework also supplies the answer to the question "how to achieve", whereas the Norwegian framework only answers "what to achieve". The use of external consultants is similar in both countries, but in Norway companies are assigned to carry out reviews; in the UK this is done by individuals. The Norwegian framework is mandatory, so consultants do not have to persuade the agencies and project organisations. In the UK, the assessment requires only a small amount of effort from senior consultants [18], review roles are defined in detail, and there is a standard report format; in Norway the QA team performs a complete independent analysis of the project, over many months, with roles agreed in the Forum; within the MoD, assessments are made internally, roles and the dossier format being defined in detail. In Norway the control measures were focused initially on cost and risk, but are