Carnegie Mellon

Software Engineering Institute

Pittsburgh, PA 15213-3890

CMMI® Today

Mike Phillips

CMMI Program Manager

Software Engineering Institute

Carnegie Mellon University

® CMMI is registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.

Sponsored by the U.S. Department of Defense

© 2003 by Carnegie Mellon University

page 1


Carnegie Mellon

Software Engineering Institute

Brief History – CMMI

1992 – Software CMM created

1994 – Systems Engineering CMM created

1998 – CMMI Product Suite initiated

2001 – CMMI-SE/SW V1.0 released

2002 – CMMI-SE/SW/IPPD/SS V1.1 Product Suite released

2003 – 10,000 people trained in “Intro to CMMI”; 150+ SCAMPI benchmark appraisals in at least 12 countries; CMMI web site “hits” exceed 1M/month

© 2003 by Carnegie Mellon University page 2


Carnegie Mellon

Software Engineering Institute

CMMI Today

The stable Version 1.1 CMMI Product Suite was released in January 2002.

Errata sheets cover known errors and changes with book publication.

FAQs are generated to cover broader issues.

Yahoo hosts CMMI Process Improvement and Lead Appraiser Group sites.

CMMI web page hits have surpassed 1M/month.

The Change Request announcement addressed a 90-day review period through Dec 12.

© 2003 by Carnegie Mellon University page 3


Carnegie Mellon

Software Engineering Institute

SW-CMM v1.1 vs. CMMI Process Areas

LEVEL 5 – OPTIMIZING
SW-CMM: Defect Prevention; Technology Change Mgmt; Process Change Management
CMMI: Causal Analysis and Resolution; Organizational Innovation & Deployment

LEVEL 4 – MANAGED
SW-CMM: Quantitative Process Mgmt; Software Quality Mgmt
CMMI: Organizational Process Performance; Quantitative Project Management

LEVEL 3 – DEFINED
SW-CMM: Organization Process Focus; Organization Process Definition; Training Program; Integrated Software Mgmt; Software Product Engr; Intergroup Coordination; Peer Reviews
CMMI: Organization Process Focus; Organization Process Definition; Organizational Training; Integrated Project Management; Risk Management; Requirements Development; Technical Solution; Product Integration; Verification; Validation; Decision Analysis and Resolution

LEVEL 2 – REPEATABLE
SW-CMM: Requirements Management; Software Project Planning; Software Project Tracking & Oversight; Software Subcontract Mgmt; Software Quality Assurance; Software Configuration Mgmt
CMMI: Requirements Management; Project Planning; Project Monitoring and Control; Supplier Agreement Management; Product & Process Quality Assurance; Configuration Management; Measurement and Analysis

© 2003 by Carnegie Mellon University page 4



Carnegie Mellon

Software Engineering Institute

CMMI Improvements over the CMM

Emphasis on measurable improvements to achieve business objectives.

Process areas have been added to place more emphasis on some important practices:

• Risk Management

• Measurement and Analysis

• Engineering Process Areas

• Decision Analysis

© 2003 by Carnegie Mellon University page 5


Carnegie Mellon

Software Engineering Institute

Adoption—What else is happening now?

Publication of SEI Series books with Addison-Wesley; titles include:

• CMMI Distilled: Second Edition

• Systematic Process Improvement Using ISO 9001:2000 and CMMI

• Balancing Agility and Discipline

Annual NDIA/SEI CMMI User Workshop

• Denver Hyatt Technical Center

• Nov 17-20

• 400+ attendees

Mappings taken on by IEEE

© 2003 by Carnegie Mellon University page 6


Carnegie Mellon

Software Engineering Institute

How about SEI Publications?

Technical notes and special reports:

• Interpretive Guidance Project: Preliminary Report

• CMMI and Product Line Practices

• CMMI and Earned Value Management

• Interpreting CMMI for Operational Organizations

• Interpreting CMMI for COTS-Based Systems

• Interpreting CMMI for Service Organizations

• Providing Safety and Security Assurance (in progress)

• Interpreting CMMI for Acquisition (in progress)

© 2003 by Carnegie Mellon University page 7


Carnegie Mellon

Software Engineering Institute

CMMI Transition Status

As of 12/31/03

Training

Introduction to CMMI – 10,103 trained

Intermediate CMMI – 777 trained

Introduction to CMMI Instructors – 219 trained

SCAMPI Lead Appraisers – 379 trained

Authorized

Introduction to CMMI V1.1 Instructors – 176

SCAMPI V1.1 Lead Appraisers – 267

© 2003 by Carnegie Mellon University page 8


Carnegie Mellon

Software Engineering Institute

Number of CMMI Students Trained (Cumulative)

[Chart: cumulative number of students trained, 1999 through YTD 2003, shown separately for CMMI (Staged) and CMMI (Continuous) courses; data as of 4-30-03.]

© 2003 by Carnegie Mellon University page 9


Carnegie Mellon

Software Engineering Institute

Number of Lead Appraisers Authorized (Cumulative)

[Chart: cumulative number of lead appraisers authorized, 1994 through YTD 2003, by method: Lead Evaluators (SCE), Lead Assessors (CBA IPI), and Lead Appraisers (SCAMPI); data as of 9-30-03.]

© 2003 by Carnegie Mellon University page 10


Carnegie Mellon

Software Engineering Institute

Intro to the CMM and CMMI Attendees

(Cumulative)

[Chart: cumulative Intro to the CMM and CMMI course attendees, 1995 through YTD 2003, for CMM Intro, CMMI Intro, and CMMI Intermediate; data as of 9-30-03.]

© 2003 by Carnegie Mellon University page 11



Carnegie Mellon

Software Engineering Institute

Number of Assessments Reported to the SEI by Year

30 September 2003

[Chart: number of assessments reported to the SEI by year, 1987 through 2003, by method: SPA, SEI CBA IPI, and SCAMPI vX.]

© 2003 by Carnegie Mellon University page 12


Carnegie Mellon

Software Engineering Institute

CMMI® Results Study Framework

[Diagram: framework relating process-improvement costs to benefits.]

Costs (– costs/expenses): investment in improvement; costs of improvement; operational costs.

Benefits / Savings / Improvements (+ revenue/savings), via process capability or maturity: quality; schedule/cycle time; product cost; customer satisfaction; rework; effort; productivity; predictability; enhanced functionality; “ilities”; employee morale; process compliance.

Costs and benefits combine into cost of quality / ROI.

© 2003 by Carnegie Mellon University page 13


Carnegie Mellon

Software Engineering Institute

Boeing, Australia

Making the transition to CMMI from SW-CMM and EIA 731; early CMMI pilot in Australia

RESULTS on One Project

• 33% decrease in the average cost to fix a defect

• Turnaround time for releases cut in half

• 60% reduction in work from Pre-Test and Post-Test Audits; passed with few outstanding actions

• Increased focus on product quality

• Increased focus on eliminating defects

• Developers seeking improvement opportunities

(Results framework categories: product cost; schedule/cycle time; quality.)

Source: “In Processes is there a Pay-Off?” Terry Stevenson, Boeing Australia, Software Engineering Australia 2003 conference.

© 2003 by Carnegie Mellon University page 14


Carnegie Mellon

Software Engineering Institute

Lockheed Martin M&DS

SW-CMM ML2 (1993) to ML3 (1996) to CMMI ML5 (2002)

Results

• Award fees during 2002 are 45% of unrealized award fees at ML2 (customer satisfaction)

1996–2002:

• Increased software productivity by 30% (productivity)

• 16% reduction in dollars/KLOC (product cost)

• Decreased defect find-and-fix costs by 15% (quality)

Source: internal data shared through collaboration, August 2003.

© 2003 by Carnegie Mellon University page 15


Carnegie Mellon

Software Engineering Institute

General Motors Corporation

CMMI focus 2001

Goal is Integration of Supplier Work & GM Project Execution

Results:

• Improved schedule – projects met milestones and were fewer days late (schedule/cycle time)

Source: Camping on a Seesaw: GM’s IS&S Process Improvement Approach. Hoffman, Moore & Schatz, SEPG 2003.

© 2003 by Carnegie Mellon University page 16


Carnegie Mellon

Software Engineering Institute

Aggregated Appraisal Results

Results from 18 Defence Community* appraisals conducted over the period mid-2000 to present

• *Includes Defence Industry and Department of Defence appraisal results

© Copyright Commonwealth of Australia – September 2003

© 2003 by Carnegie Mellon University page 17


Carnegie Mellon

Software Engineering Institute

The Road Ahead…

Formal Review period ends mid-December

CMMI Team will review CRs to determine possible Change Packages for a V1.2 of the model and/or method

CCB will determine which CPs, if any, are needed (stability goal remains)

Improvement Packages would be an FY05 effort, with piloting

V1.2 would be ~FY06

© 2003 by Carnegie Mellon University page 18


Carnegie Mellon

Software Engineering Institute

CMMI Staged and Six Sigma

[Diagram: the five CMMI staged maturity levels with Six Sigma relationships noted at each level.]

Maturity levels: 5 Optimizing – focus on process improvement; 4 Quantitatively Managed – process measured and controlled; 3 Defined – process characterized for the organization and is proactive; 2 Managed – process characterized for projects and is often reactive; 1 Initial – process unpredictable and poorly controlled.

Six Sigma relationships noted in the diagram:

• Organization-wide 6σ improvements and control

• Correlation between process areas & 6σ methods

• 6σ used within CMMI efforts

• Infrastructure in place; defined processes feed 6σ

• 6σ philosophy & method focus

• 6σ “drilldown” drives local (but threaded) improvements

• 6σ may drive toward and accelerate CMMI solution

Six Sigma is enterprise-wide.

Six Sigma addresses product and process.

Six Sigma focuses on “critical to quality” factors.

© 2003 by Carnegie Mellon University page 19


Carnegie Mellon

Software Engineering Institute

Six Sigma and CMMI Continuous

Achieve high capability in PAs that build Six Sigma skills.

• MA, QPM, CAR, OPP

Use capability to help prioritize remaining PAs

[Chart: capability levels (0–5) across process areas, with the foundational PAs (MA, QPM, CAR, OPP) grouped first, followed by DAR, PP, PMC, SAM, IPM, RSK, REQM, RD, TS, PI, VER, VAL, CM, PPQA, OPF, OPD, OT, OID.]

Remaining PAs are ordered by business factors, improvement opportunity, etc., which are better understood using foundational capabilities. CMMI Staged groupings and DMAIC vs. DMADV are also factors that may drive the remaining order.

[Vickroy 03]

© 2003 by Carnegie Mellon University page 20


Carnegie Mellon

Software Engineering Institute

LMC M&DS Process Standard Roadmap

[Table: roadmap mapping CMMI SE/SW v1.1 process areas and specific goals (including Organizational Process Focus, Organizational Process Definition, Organizational Training, Organizational Process Performance, Organizational Innovation and Deployment, and Project Planning) to the M&DS Program Process Standard (2002) sections and procedures, ISO 9001:2000 clauses, EIA-632-1999 requirements, and ISO/IEC 12207-1995 processes.] [S-P 03]

Six Sigma links: Level 2 Measurement & Analysis PA, Level 4/5 PAs

© 2003 by Carnegie Mellon University page 21


Carnegie Mellon

Software Engineering Institute

Six Sigma Approach at Northrop Grumman

Linked with Business Planning and Oversight
• Business planning
• Project selection
• SPM

Enabled by Infrastructure
• Training
• Tools (e.g., Minitab; Startit! – a NGMS byproduct)
• Awareness
• Database

Tied to Employee Performance
• Goals, awards
• Job and career paths

Quantitatively Driven
• Six Sigma improvements are quantified

Engaged with External Customers
• Visibility
• Participation

Integrated with Quality Program
• Integrated training, awareness, & policies
• Integrated CMMI & Six Sigma projects
• Integrated tracking and reporting via DB, PRA, etc.

Level 2 Self Assessment & Monitoring

[Example of tool usage: a “Quantitative Process Capability Measurement – Performance and Objectives Assessment Report” worksheet. Each performance metric (percent defect reduction for quality goals, percent complaint reduction for customer satisfaction, completed competitive analysis studies, number of IPD teams started, percent of employees retrained, percent of ISO 9000 orientation meetings) is given a relative weight (weights sum to 100), a benchmark/baseline, a circled current actual performance, and an improvement rating from 0 to 10. Each rating is multiplied by its weight, and the previous and current totals (480 and 600 in the example) are read against potential totals of 1000 (goal met), 200 (no gain), and 0 (worse).]

© 2003 by Carnegie Mellon University page 22
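The scoring arithmetic in the example worksheet is simple to reproduce. The sketch below is a hypothetical illustration: the metric names, weights, and ratings are invented for the example, and only the rating-times-weight sum (weights totaling 100, ratings 0–10) and the 1000/200/0 reference totals come from the slide.

```python
# A minimal, hypothetical sketch of the weighted improvement-scoring worksheet:
# each performance metric gets a relative weight (weights sum to 100) and an
# improvement rating from 0 to 10; rating x weight is summed and read against
# the worksheet's reference totals (1000 = goal met, 200 = no gain, 0 = worse).

GOAL_MET, NO_GAIN, WORSE = 1000, 200, 0  # reference totals from the worksheet

# metric name: (relative weight, previous rating, current rating) -- invented values
metrics = {
    "percent defect reduction":       (25, 4, 6),
    "customer satisfaction":          (20, 5, 6),
    "percent of employees retrained": (20, 3, 7),
    "ISO 9000 orientation meetings":  (20, 6, 8),
    "completed competitive studies":  (15, 2, 4),
}

assert sum(w for w, _, _ in metrics.values()) == 100  # relative weights sum to 100

def weighted_total(rating_index):
    """Sum of rating x weight; rating_index selects previous (1) or current (2)."""
    return sum(vals[0] * vals[rating_index] for vals in metrics.values())

previous, current = weighted_total(1), weighted_total(2)
print(f"Previous total: {previous}, current total: {current}")
print(f"Reference points: {GOAL_MET} = goal met, {NO_GAIN} = no gain, {WORSE} = worse")
```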


Carnegie Mellon

Software Engineering Institute

Northrop Grumman’s Six Sigma Implementation

Started implementing Six Sigma in 2001

Trained over 3000 Green Belts (80 hours) and over 200 Black Belts (160 hours)

Completed several hundred projects covering all functional areas

• Customer involvement and award fee citations

About half of the projects are improving an engineering process

© 2003 by Carnegie Mellon University page 23


Carnegie Mellon

Software Engineering Institute

3 Keys to Competitive Leverage at Northrop Grumman

CMMI: provides guidance for measuring, monitoring, and managing processes.

Six Sigma: a business strategy to deliver value and develop a sustainable competitive advantage.

Knowledge Management: KM provides a strategy to utilize data and transform it into knowledge to enable informed and decisive management leadership.

© 2003 by Carnegie Mellon University page 24


Carnegie Mellon

Software Engineering Institute

Process Maturity Profile

CMMI® v1.1

SCAMPI℠ v1.1 Appraisal Results

December 2003

Mike Phillips

CMMI Program Manager

Software Engineering Institute

Carnegie Mellon University

We could not produce this report without the support of the organizations and lead appraisers who reported their appraisal results to the SEI℠. Our many thanks for their continuing cooperation with our data collection and analysis efforts.

Software Engineering Measurement and Analysis Initiative

The Software Engineering Institute is a federally funded research and development center sponsored by the U.S. Department of Defense and operated by Carnegie Mellon® University

® CMMI and Carnegie Mellon are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University

SM SCAMPI and SEI are service marks of Carnegie Mellon University


© 2003 by Carnegie Mellon University page 25


Carnegie Mellon

Software Engineering Institute

Outline

Introduction

Current Status

Summary

Terms used in this Briefing

How to Report your Appraisal Results to the SEI

© 2003 by Carnegie Mellon University page 26


Carnegie Mellon

Software Engineering Institute

Introduction: Purpose

Characterize the adoption of the CMMI

Describe results from Standard CMMI Appraisal Method for Process Improvement (SCAMPI v1.1) Class A appraisals using Capability Maturity Model Integration (CMMI) v1.1*

Encourage continued reporting of results

* Organizations previously appraised against CMMI v1.0 that have not reappraised against v1.1 are not included in this report

Please visit http://www.sei.cmu.edu/sema/profile_about.html for additional information or questions you may have about this briefing before contacting the SEI directly

© 2003 by Carnegie Mellon University page 27


Carnegie Mellon

Software Engineering Institute

Current Status

SCAMPI v1.1 appraisals conducted since the April 2002 release through September 2003 and reported to the SEI by October 2003

• 136 appraisals

• 123 organizations

• 68 participating companies

• 11 reappraised organizations

• 520 projects

• 44% offshore organizations

Please refer to: Terms Used in this Report on page 21

© 2003 by Carnegie Mellon University page 28


Carnegie Mellon

Software Engineering Institute

Reporting Organizational Types - 1

[Bar chart: % of organizations by organizational type]

Commercial/In-house: 49.6%
Contractor for Military/Government: 44.7%
Military/Government Agency: 5.7%

Based on 123 organizations

© 2003 by Carnegie Mellon University page 29


Carnegie Mellon

Software Engineering Institute

Reporting Organizational Types - 2

[Bar chart: % of organizations by organizational type, split between USA and Offshore]

Commercial/In-house: USA 8.1%, Offshore 41.5%
Contractor for Military/Government: USA 34.1%, Offshore 10.6%
Military/Government Agency: 5.7%

Based on 123 organizations

© 2003 by Carnegie Mellon University page 30


Carnegie Mellon

Software Engineering Institute

Types of Organizations

Based on Primary Standard Industrial Classification (SIC) Code

[Pie chart: % of organizations by SIC category]

Services: 57.1%
• Engineering & Management Services: 27.0%
• Business Services: 27.0%
• Health Services: 1.6%
• Services, Nec: 1.6%

Manufacturing: 23.8%
• Electronic & Other Electric Equipment: 7.9%
• Instruments and Related Products: 7.9%
• Transportation Equipment: 4.8%
• Industrial Machinery and Equipment: 3.2%

Public Administration (Including Defense): 11.1%

Finance, Insurance and Real Estate: 4.8%

Retail Trade: 1.6%

Wholesale Trade: 1.6%

Based on 63 organizations reporting SIC code. For more information visit: http://www.osha.gov/oshstats/sicser.html

© 2003 by Carnegie Mellon University page 31


Carnegie Mellon

Software Engineering Institute

Organization Size

Based on the total number of employees within the area of the organization that was appraised

[Pie chart: % of organizations by size]

1 to 100 employees: 33.6%
• 25 or fewer: 8.2%
• 26 to 50: 10.7%
• 51 to 75: 9.0%
• 76 to 100: 5.7%

101 to 200 employees: 16.4%

201 to 2000+ employees: 50.0%
• 201 to 300: 6.6%
• 301 to 500: 13.9%
• 501 to 1000: 13.1%
• 1001 to 2000: 11.5%
• 2000+: 4.9%

Based on 122 organizations reporting size data

© 2003 by Carnegie Mellon University page 32


Carnegie Mellon

Software Engineering Institute

Use of Model Representations in Appraisals

[Bar chart: number of appraisals by model representation used – Staged vs. Continuous]

Based on 136 appraisals

© 2003 by Carnegie Mellon University page 33


Carnegie Mellon

Software Engineering Institute

Disciplines Selected for Appraisals

[Bar chart: number of appraisals by discipline combination – SW + SE, SW, SE, SW + SE + SS, SW + SE + IPPD + SS, SW + SE + IPPD, SW + IPPD + SS, SE + IPPD + SS, SE + SS]

SW = Software Engineering
SE = Systems Engineering
SS = Supplier Sourcing
IPPD = Integrated Product and Process Development

Based on 136 appraisals reporting coverage

© 2003 by Carnegie Mellon University page 34


Carnegie Mellon

Software Engineering Institute

Summary Organizational Maturity Profile

[Bar chart: % of organizations at each maturity level]

Initial: 23.0%
Managed: 33.3%
Defined: 26.4%
Quantitatively Managed: 11.5%
Optimizing: 5.7%

Based on most recent appraisal of 87 organizations reporting a maturity level rating

© 2003 by Carnegie Mellon University page 35
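For readers reproducing profiles like the one above from raw appraisal records, the tally is a simple count. The sketch below uses hypothetical records (per-organization data are not published in this briefing) and keeps only each organization's most recent appraisal that reported a maturity level rating, as the chart's note describes.

```python
# A minimal sketch of tallying an organizational maturity profile from
# hypothetical appraisal records: keep each organization's most recent
# appraisal with a maturity level rating, then count organizations per level.
from collections import Counter
from datetime import date

LEVELS = ["Initial", "Managed", "Defined", "Quantitatively Managed", "Optimizing"]

# (organization, appraisal end date, reported maturity level) -- invented data
appraisals = [
    ("Org A", date(2002, 11, 1), "Managed"),
    ("Org A", date(2003, 8, 15), "Defined"),   # reappraisal supersedes the 2002 result
    ("Org B", date(2003, 3, 10), "Managed"),
    ("Org C", date(2003, 6, 20), "Optimizing"),
    ("Org D", date(2002, 12, 5), "Initial"),
]

# Most recent appraisal per organization
latest = {}
for org, when, level in appraisals:
    if org not in latest or when > latest[org][0]:
        latest[org] = (when, level)

counts = Counter(level for _, level in latest.values())
total = len(latest)

for level in LEVELS:
    pct = 100.0 * counts.get(level, 0) / total
    print(f"{level}: {pct:.1f}%")
```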


Carnegie Mellon

Software Engineering Institute

Maturity Profile by Organizational Type

[Bar chart: % of organizations at each maturity level, by organizational type]

Initial: Commercial/In-house 28.6%, Contractor for Military/Government 13.9%, Military/Government Agency 50.0%
Managed: Commercial/In-house 18.4%, Contractor for Military/Government 55.6%
Defined: Commercial/In-house 32.7%, Contractor for Military/Government 16.7%, Military/Government Agency 50.0%
Quantitatively Managed: Commercial/In-house 14.3%, Contractor for Military/Government 8.3%
Optimizing: Commercial/In-house 6.1%, Contractor for Military/Government 5.6%

Based on most recent appraisal of 87 organizations reporting organization type and a maturity level rating

© 2003 by Carnegie Mellon University page 36


Carnegie Mellon

Software Engineering Institute

Countries Where Appraisals Have Been Performed and Reported to the SEI

Australia Canada China Colombia France India

Japan Korea, Republic of Russia Switzerland Taiwan United Kingdom

United States

© 2003 by Carnegie Mellon University page 37


Carnegie Mellon

Software Engineering Institute

USA and Offshore Summary Organizational Maturity Profiles

[Bar chart: % of organizations at each maturity level, USA vs. Offshore]

Initial: USA 15.2%, Offshore 27.8%
Managed: USA 54.5%, Offshore 20.4%
Defined: USA 21.2%, Offshore 29.6%
Quantitatively Managed: USA 6.1%, Offshore 14.8%
Optimizing: USA 3.0%, Offshore 7.4%

Based on 33 U.S. organizations and 54 offshore organizations reporting their maturity level rating

© 2003 by Carnegie Mellon University page 38


Carnegie Mellon

Software Engineering Institute

Process Area Satisfaction – Maturity Level 2

[Bar chart: % of appraisals in which each maturity level 2 process area was satisfied]

Number of appraisals rating each process area: REQM 119, PP 121, PMC 121, SAM 93, MA 118, PPQA 119, CM 120

Based on the number of appraisals listed above that rated the process area

© 2003 by Carnegie Mellon University page 39


Carnegie Mellon

Software Engineering Institute

Process Area Satisfaction – Maturity Level 3

[Bar chart: % of appraisals in which each maturity level 3 process area was satisfied]

Number of appraisals rating each process area: RD 89, TS 88, PI 84, Ver 92, Val 85, OPF 91, OPD 90, OT 88, IPM 83, RSKM 89, IT 13, ISM 15, DAR 82, OEI 11

Based on the number of appraisals listed above that rated the process area

* 11 of the 83 appraisals rating IPM also examined the IPPD goals. 10 of these 11 appraisals satisfied IPM with the IPPD goals.

© 2003 by Carnegie Mellon University page 40


Carnegie Mellon

Software Engineering Institute

Process Area Satisfaction – Maturity Levels 4 & 5

[Bar chart: % of appraisals in which each maturity level 4 and 5 process area was satisfied]

Number of appraisals rating each process area: OPP 37, QPM 36, OID 28, CAR 30

Based on the number of appraisals listed above that rated the process area

© 2003 by Carnegie Mellon University page 41


Carnegie Mellon

Software Engineering Institute

Process Area Profiles - 1

Organizations Appraised at Maturity Level 1

[Horizontal bar chart: for each process area, the % of appraisals in which it was rated and the % in which it was fully satisfied. Maturity level 2 (Managed) process areas: REQM, PP, PMC, SAM, MA, PPQA, CM. Maturity level 3 (Defined) process areas: RD, TS, PI, Ver, Val, OPF, OPD, OT, IPM, RSKM, IT, ISM, DAR, OEI.]

Based on 12 appraisals reporting a maturity level rating

* None of the 14 appraisals selected the IPPD discipline

© 2003 by Carnegie Mellon University page 42


Carnegie Mellon

Software Engineering Institute

Process Area Profiles - 2

Organizations Appraised at Maturity Level 2

[Horizontal bar chart: for each process area, the % of appraisals in which it was rated and the % in which it was fully satisfied. Maturity level 3 (Defined) process areas: RD, TS, PI, Ver, Val, OPF, OPD, OT, IPM, RSKM, IT, ISM, DAR, OEI. Maturity level 4 (Quantitatively Managed): OPP, QPM. Maturity level 5 (Optimizing): OID, CAR.]

Based on 22 appraisals reporting a maturity level rating

* 1 of the 22 appraisals rating IPM also examined the IPPD goals. That appraisal satisfied IPM.

© 2003 by Carnegie Mellon University page 43


Carnegie Mellon

Software Engineering Institute

Summary

Reporting is relatively even between the Commercial and Contractor communities; however, Commercial organizations are primarily outside of the U.S., while Government Contractors are primarily located in the U.S.

Of U.S. organizations, the services and manufacturing industries reported the most appraisals.

Compared to the early reports of the SW-CMM maturity profile, the early data reflects a relatively more mature CMMI profile.

Additional information and charts will be added to this briefing as more appraisals are reported and more data becomes available to support these breakdowns.

© 2003 by Carnegie Mellon University page 44


Carnegie Mellon

Software Engineering Institute

Terms Used in this Report

Company – Parent of the appraised entity. A company can be a commercial or non-commercial firm, a for-profit or not-for-profit business, a research and development unit, a higher-education unit, a government agency or branch of service, etc.

Organization – a.k.a. appraised entity. The organizational unit to which the appraisal results apply. An appraised entity can be the entire company, a selected business unit, units supporting a particular product line or service, etc.

Offshore – An appraised entity whose geographic location is not within the United States. The parent of the appraised entity may or may not be based within the United States.

© 2003 by Carnegie Mellon University page 45


Carnegie Mellon

Software Engineering Institute

Report your Appraisal Results to the SEI

This briefing is only possible because of the cooperation of organizations and individuals sending their appraisal results to the SEI.

Providing this information and service in the future will depend on this continued cooperation.

Please visit http://www.sei.cmu.edu/sema/report.html for forms, information, and instructions on how to report appraisals to the SEI.

© 2003 by Carnegie Mellon University page 46


Carnegie Mellon

Software Engineering Institute

Contact Information

Please visit:

http://www.sei.cmu.edu/sema/profile_about.html

and review the information provided before contacting:

SEI Customer Relations (412) 268-5800

SEI FAX number (412) 268-5758

Internet Address

customer-relations@sei.cmu.edu

Mailing Address

Customer Relations

Software Engineering Institute

Carnegie Mellon University

Pittsburgh, PA 15213-3890

© 2003 by Carnegie Mellon University page 47
