CMMI: Building Mature Software Development Processes
A practical guide to implementing CMMI for software development, from the Initial to the Optimizing maturity level, with guidance for agile organisations.
CMMI (Capability Maturity Model Integration) provides a framework for improving organisational processes. For software development teams, it offers a roadmap from ad-hoc practices to optimised, continuously improving processes.
Understanding CMMI
CMMI V2.0 Overview
CMMI V2.0 Structure:
├── Categories (each grouping Capability Areas)
│ ├── Doing - Building and delivering solutions
│ │ ├── Ensuring Quality (ENQ)
│ │ ├── Engineering & Developing Products (ED&DP)
│ │ └── Delivering & Managing Services (D&MS)
│ │
│ ├── Managing - Planning and managing work
│ │ ├── Planning & Managing Work (P&MW)
│ │ ├── Managing Business Resilience (MBR)
│ │ └── Managing the Workforce (MTW)
│ │
│ ├── Enabling - Supporting implementation
│ │ ├── Supporting Implementation (SI)
│ │ └── Managing Supplier Agreements (MSA)
│ │
│ └── Improving - Sustaining behaviour
│ ├── Improving Performance (IP)
│ └── Sustaining Habit & Persistence (SH&P)
│
├── Practice Areas (20 total)
│ └── Specific practices for each capability area
│
└── Maturity Levels (1-5)
└── Staged representation of organisational capability
Maturity Levels Explained
| Level | Name | Characteristics |
|---|---|---|
| 1 | Initial | Ad-hoc, unpredictable, reactive |
| 2 | Managed | Work planned, monitored, and controlled at project level |
| 3 | Defined | Organisation-wide standards |
| 4 | Quantitatively Managed | Data-driven decisions |
| 5 | Optimizing | Continuous improvement culture |
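For teams building internal assessment tooling, the staged levels are easy to capture as data. The sketch below is purely illustrative; the type and helper names are ours, not defined by CMMI.

// Illustrative encoding of the five maturity levels for assessment tooling
type MaturityLevel = 1 | 2 | 3 | 4 | 5;

interface LevelProfile {
  level: MaturityLevel;
  name: string;
  characteristics: string;
}

const maturityLevels: LevelProfile[] = [
  { level: 1, name: 'Initial', characteristics: 'Ad-hoc, unpredictable, reactive' },
  { level: 2, name: 'Managed', characteristics: 'Work planned, monitored, and controlled at project level' },
  { level: 3, name: 'Defined', characteristics: 'Organisation-wide standards' },
  { level: 4, name: 'Quantitatively Managed', characteristics: 'Data-driven decisions' },
  { level: 5, name: 'Optimizing', characteristics: 'Continuous improvement culture' }
];

// Hypothetical helper: which level should the organisation target next?
const nextLevel = (current: MaturityLevel): LevelProfile | undefined =>
  maturityLevels.find(l => l.level === current + 1);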
Level 2: Managed
Core Practice Areas
// cmmi-level2.ts
// Shape of an individual practice, derived from the entries below
interface Practice {
  id: string;
  name: string;
  description: string;
  evidence: string[];
}

interface Level2PracticeArea {
  id: string;
  name: string;
  intent: string;
  practices: Practice[];
}
const level2PracticeAreas: Level2PracticeArea[] = [
{
id: 'EST',
name: 'Estimating',
intent: 'Develop and maintain estimates of work effort and resources',
practices: [
{
id: 'EST.1',
name: 'Establish and maintain estimation approach',
description: 'Define estimation methods, models, and data sources',
evidence: ['Estimation guidelines', 'Historical data repository']
},
{
id: 'EST.2',
name: 'Develop estimates',
description: 'Create estimates for work products and tasks',
evidence: ['Effort estimates', 'Resource estimates', 'Assumptions log']
},
{
id: 'EST.3',
name: 'Review and update estimates',
description: 'Maintain estimates as work progresses',
evidence: ['Estimate revisions', 'Variance analysis']
}
]
},
{
id: 'PLAN',
name: 'Planning',
intent: 'Establish and maintain plans that define work activities',
practices: [
{
id: 'PLAN.1',
name: 'Establish project plans',
description: 'Develop comprehensive project plans',
evidence: ['Project plans', 'Schedules', 'Resource allocations']
},
{
id: 'PLAN.2',
name: 'Identify and analyse risks',
description: 'Identify risks and plan mitigations',
evidence: ['Risk register', 'Mitigation plans']
},
{
id: 'PLAN.3',
name: 'Obtain commitment',
description: 'Obtain stakeholder commitment to plans',
evidence: ['Sign-offs', 'Meeting minutes']
}
]
},
{
id: 'MC',
name: 'Monitor and Control',
intent: 'Provide understanding of project progress',
practices: [
{
id: 'MC.1',
name: 'Monitor project performance',
description: 'Track actual vs planned progress',
evidence: ['Status reports', 'Dashboards', 'Metrics']
},
{
id: 'MC.2',
name: 'Manage corrective actions',
description: 'Address deviations from plan',
evidence: ['Issue logs', 'Corrective action records']
}
]
}
];
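Capturing practice areas as data makes appraisal readiness easy to check. The sketch below is one way to report missing evidence; the function and the evidence index are illustrative, not part of any appraisal method.

// Illustrative: find practices that still lack required evidence
type EvidenceIndex = Record<string, string[]>; // practice id -> artefacts collected so far

const findEvidenceGaps = (
  areas: Level2PracticeArea[],
  collected: EvidenceIndex
): { practiceId: string; missing: string[] }[] =>
  areas.flatMap(area =>
    area.practices
      .map(p => ({
        practiceId: p.id,
        missing: p.evidence.filter(e => !(collected[p.id] ?? []).includes(e))
      }))
      .filter(gap => gap.missing.length > 0)
  );

// Example: EST.1 has guidelines but no historical data repository yet
findEvidenceGaps(level2PracticeAreas, { 'EST.1': ['Estimation guidelines'] });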
Implementing Project Management
# level2-project-management.yml
estimation_process:
inputs:
- Work breakdown structure
- Historical project data
- Resource availability
- Technical complexity assessment
estimation_techniques:
- name: Expert judgement
when_to_use: Novel work with limited historical data
accuracy: Low to medium
- name: Analogous estimation
when_to_use: Similar projects completed previously
accuracy: Medium
- name: Parametric estimation
when_to_use: Well-understood work with good metrics
accuracy: Medium to high
- name: Three-point estimation
when_to_use: Uncertain work requiring risk consideration
formula: (Optimistic + 4*Most Likely + Pessimistic) / 6
outputs:
- Effort estimates (hours/days)
- Duration estimates
- Resource requirements
- Assumptions and constraints
- Confidence levels
planning_process:
plan_components:
- Project scope statement
- Work breakdown structure
- Schedule and milestones
- Resource plan
- Risk management plan
- Quality plan
- Communication plan
- Stakeholder management plan
review_and_approval:
- Technical review
- Management review
- Stakeholder sign-off
monitoring_process:
metrics_to_track:
- Schedule variance
- Effort variance
- Defect trends
- Risk status
- Issue resolution rate
reporting_frequency:
- Daily standups
- Weekly status reports
- Monthly steering committee
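The three-point formula above drops straight into tooling. A small sketch with a worked example; the function name is ours.

// Illustrative three-point (PERT) estimate, matching the formula above
const threePointEstimate = (
  optimistic: number,
  mostLikely: number,
  pessimistic: number
): number => (optimistic + 4 * mostLikely + pessimistic) / 6;

// Example: 4 days best case, 6 days most likely, 14 days worst case
threePointEstimate(4, 6, 14);  // (4 + 24 + 14) / 6 = 7 days
const spread = (14 - 4) / 6;   // ≈ 1.7 days, the usual PERT standard deviation approximation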
Level 3: Defined
Organisation-Wide Standards
// cmmi-level3.ts
// Practice has the same shape as in cmmi-level2.ts (id, name, description, evidence)
interface Level3PracticeArea {
  id: string;
  name: string;
  organisationalFocus: string;
  practices: Practice[];
}
const level3PracticeAreas: Level3PracticeArea[] = [
{
id: 'OPF',
name: 'Organisational Process Focus',
organisationalFocus: `
Establish and maintain organisational process assets
that enable consistent project execution.
`,
practices: [
{
id: 'OPF.1',
name: 'Establish organisational process assets',
description: 'Create standard processes, guidelines, and templates',
evidence: [
'Standard process documentation',
'Process asset library',
'Tailoring guidelines'
]
},
{
id: 'OPF.2',
name: 'Deploy organisational process assets',
description: 'Make assets available and ensure adoption',
evidence: [
'Training materials',
'Deployment records',
'Usage metrics'
]
}
]
},
{
id: 'OPD',
name: 'Organisational Process Definition',
organisationalFocus: `
Define and maintain the organisation's set of
standard processes.
`,
practices: [
{
id: 'OPD.1',
name: 'Establish standard processes',
description: 'Document standard software development lifecycle',
evidence: [
'SDLC documentation',
'Process maps',
'Role definitions'
]
},
{
id: 'OPD.2',
name: 'Establish measurement repository',
description: 'Create central repository for process metrics',
evidence: [
'Metrics definitions',
'Measurement database',
'Analysis reports'
]
}
]
},
{
id: 'OT',
name: 'Organisational Training',
organisationalFocus: `
Develop skills and knowledge to perform roles
effectively.
`,
practices: [
{
id: 'OT.1',
name: 'Establish training capability',
description: 'Define training needs and develop curriculum',
evidence: [
'Training needs analysis',
'Training plan',
'Course materials'
]
},
{
id: 'OT.2',
name: 'Deliver training',
description: 'Provide training and track completion',
evidence: [
'Training records',
'Competency assessments',
'Feedback surveys'
]
}
]
}
];
// Standard SDLC Implementation
interface SDLCPhase {
  name: string;
  activities: string[];
  entryGate: string;
  exitGate: string;
}

interface QualityGate {
  name: string;
  criteria: string[];
}

interface Artefact {
  phase: string;
  artefact: string;
}

interface Role {
  role: string;
  responsibilities: string[];
}

interface StandardSDLC {
  phases: SDLCPhase[];
  gates: QualityGate[];
  artefacts: Artefact[];
  roles: Role[];
}
const organisationalSDLC: StandardSDLC = {
phases: [
{
name: 'Requirements',
activities: [
'Stakeholder identification',
'Requirements elicitation',
'Requirements analysis',
'Requirements documentation',
'Requirements validation'
],
entryGate: 'Project approval',
exitGate: 'Requirements baseline'
},
{
name: 'Design',
activities: [
'Architecture design',
'Detailed design',
'Interface design',
'Design review'
],
entryGate: 'Requirements baseline',
exitGate: 'Design approval'
},
{
name: 'Implementation',
activities: [
'Coding',
'Code review',
'Unit testing',
'Static analysis'
],
entryGate: 'Design approval',
exitGate: 'Code complete'
},
{
name: 'Testing',
activities: [
'Integration testing',
'System testing',
'Performance testing',
'User acceptance testing'
],
entryGate: 'Code complete',
exitGate: 'Test completion'
},
{
name: 'Deployment',
activities: [
'Deployment planning',
'Production deployment',
'Post-deployment verification'
],
entryGate: 'Test completion',
exitGate: 'Go-live'
}
],
gates: [
{
name: 'Requirements Baseline',
criteria: [
'All requirements documented',
'Requirements reviewed and approved',
'Traceability matrix established'
]
},
{
name: 'Design Approval',
criteria: [
'Design documents complete',
'Technical review passed',
'Security review completed'
]
}
],
artefacts: [
{ phase: 'Requirements', artefact: 'Requirements Specification' },
{ phase: 'Requirements', artefact: 'Traceability Matrix' },
{ phase: 'Design', artefact: 'Architecture Document' },
{ phase: 'Design', artefact: 'Design Specification' },
{ phase: 'Implementation', artefact: 'Source Code' },
{ phase: 'Implementation', artefact: 'Unit Test Results' },
{ phase: 'Testing', artefact: 'Test Plan' },
{ phase: 'Testing', artefact: 'Test Results' }
],
roles: [
{ role: 'Project Manager', responsibilities: ['Planning', 'Monitoring', 'Reporting'] },
{ role: 'Business Analyst', responsibilities: ['Requirements', 'Stakeholder management'] },
{ role: 'Architect', responsibilities: ['Technical design', 'Standards'] },
{ role: 'Developer', responsibilities: ['Implementation', 'Unit testing'] },
{ role: 'QA Engineer', responsibilities: ['Testing', 'Quality assurance'] }
]
};
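With phases and gates held as data, gate checks can be scripted rather than tracked by hand. A minimal sketch; the function and the list of completed criteria are illustrative.

// Illustrative: has a quality gate been satisfied?
const gateSatisfied = (
  sdlc: StandardSDLC,
  gateName: string,
  completedCriteria: string[]
): boolean => {
  const gate = sdlc.gates.find(g => g.name === gateName);
  return gate !== undefined && gate.criteria.every(c => completedCriteria.includes(c));
};

// Example: traceability matrix still outstanding, so the gate fails
gateSatisfied(organisationalSDLC, 'Requirements Baseline', [
  'All requirements documented',
  'Requirements reviewed and approved'
]); // -> false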
Level 4: Quantitatively Managed
Statistical Process Control
// cmmi-level4.ts
interface QuantitativeManagement {
objectives: PerformanceObjective[];
baselines: ProcessBaseline[];
controlCharts: ControlChart[];
}
interface PerformanceObjective {
id: string;
name: string;
metric: string;
target: number;
unit: string;
toleranceBand: { lower: number; upper: number };
}
interface ProcessBaseline {
process: string;
metric: string;
mean: number;
standardDeviation: number;
dataPoints: number;
establishedDate: Date;
}
const defectDensityBaseline: ProcessBaseline = {
process: 'Code Review',
metric: 'Defects per KLOC',
mean: 3.2,
standardDeviation: 1.1,
dataPoints: 50,
establishedDate: new Date('2024-01-01')
};
// Statistical Process Control Implementation
interface DataPoint {
  timestamp: Date; // any ordering key works; a date is used here for illustration
  value: number;
}

interface ControlChart {
  metric: string;
  centerLine: number;
  upperControlLimit: number;
  lowerControlLimit: number;
  dataPoints: DataPoint[];
}
const calculateControlLimits = (
baseline: ProcessBaseline
): { ucl: number; lcl: number } => {
// 3-sigma control limits
const ucl = baseline.mean + (3 * baseline.standardDeviation);
const lcl = Math.max(0, baseline.mean - (3 * baseline.standardDeviation));
return { ucl, lcl };
};
interface Trend {
  type: 'run';
  startIndex: number;
  length: number;
  side: 'above' | 'below';
}

interface StabilityAnalysis {
  stable: boolean;
  outOfControlPoints: DataPoint[];
  trends: Trend[];
  recommendation: string;
}

const analyseProcessStability = (
  chart: ControlChart
): StabilityAnalysis => {
const outOfControl: DataPoint[] = [];
const trends: Trend[] = [];
// Check for points outside control limits
for (const point of chart.dataPoints) {
if (point.value > chart.upperControlLimit ||
point.value < chart.lowerControlLimit) {
outOfControl.push(point);
}
}
// Check for runs (7+ consecutive points on one side of center line)
let runCount = 0;
let runSide: 'above' | 'below' | null = null;
for (const point of chart.dataPoints) {
const currentSide = point.value > chart.centerLine ? 'above' : 'below';
if (currentSide === runSide) {
runCount++;
if (runCount === 7) { // record the run once, when it first reaches 7 points
trends.push({
type: 'run',
startIndex: chart.dataPoints.indexOf(point) - runCount + 1,
length: runCount,
side: runSide
});
}
} else {
runSide = currentSide;
runCount = 1;
}
}
return {
stable: outOfControl.length === 0 && trends.length === 0,
outOfControlPoints: outOfControl,
trends,
recommendation: outOfControl.length > 0
? 'Investigate special cause variation'
: trends.length > 0
? 'Monitor for systematic shift'
: 'Process is stable'
};
};
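Putting the pieces together: derive control limits from the baseline, build a chart, and check stability. The sample data points below are invented for illustration.

// Illustrative usage of the baseline and SPC functions above
const { ucl, lcl } = calculateControlLimits(defectDensityBaseline);
// ucl = 3.2 + 3 × 1.1 = 6.5, lcl = max(0, 3.2 - 3.3) = 0

const reviewDefectChart: ControlChart = {
  metric: 'Defects per KLOC',
  centerLine: defectDensityBaseline.mean,
  upperControlLimit: ucl,
  lowerControlLimit: lcl,
  dataPoints: [
    { timestamp: new Date('2024-02-01'), value: 2.9 },
    { timestamp: new Date('2024-02-08'), value: 3.4 },
    { timestamp: new Date('2024-02-15'), value: 7.1 } // above the UCL
  ]
};

const analysis = analyseProcessStability(reviewDefectChart);
console.log(analysis.recommendation); // -> 'Investigate special cause variation'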
Performance Measurement
# level4-metrics.yml
quality_metrics:
- name: Defect Density
formula: Defects / KLOC
target: < 3.0
measurement_point: Post-release
baseline: 3.2 (σ = 1.1)
- name: Defect Removal Efficiency
formula: (Defects found before release / Total defects) × 100
target: "> 95%"
measurement_point: Post-release
baseline: 93% (σ = 4%)
- name: Code Review Effectiveness
formula: Defects found in review / Total pre-release defects
target: "> 60%"
measurement_point: Per release
baseline: 58% (σ = 8%)
productivity_metrics:
- name: Velocity Stability
formula: Standard deviation of sprint velocity
target: < 15% of mean
measurement_point: Per sprint
baseline: 12% (σ = 3%)
- name: Estimation Accuracy
formula: "|Actual - Estimated| / Estimated × 100"
target: < 20%
measurement_point: Per project
baseline: 22% (σ = 10%)
schedule_metrics:
- name: Schedule Performance Index
formula: Earned Value / Planned Value
target: 0.95 - 1.05
measurement_point: Monthly
baseline: 0.97 (σ = 0.08)
- name: Milestone Hit Rate
formula: Milestones met on time / Total milestones
target: "> 90%"
measurement_point: Per release
baseline: 88% (σ = 6%)
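The formulas above translate directly into code. Two of the metrics as small functions; a sketch with invented example values.

// Illustrative calculators for two of the metrics defined above
// Estimation accuracy: |Actual - Estimated| / Estimated × 100
const estimationAccuracy = (actual: number, estimated: number): number =>
  (Math.abs(actual - estimated) / estimated) * 100;

// Schedule Performance Index: Earned Value / Planned Value
const schedulePerformanceIndex = (earnedValue: number, plannedValue: number): number =>
  earnedValue / plannedValue;

estimationAccuracy(130, 100);              // 30 -> misses the < 20% target
schedulePerformanceIndex(95_000, 100_000); // 0.95 -> lower edge of the 0.95-1.05 band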
Level 5: Optimizing
Continuous Process Improvement
// cmmi-level5.ts
interface OptimizingPractices {
defectPrevention: DefectPreventionProcess;
processInnovation: ProcessInnovationProcess;
performanceOptimization: PerformanceOptimizationProcess;
}
interface DefectPreventionProcess {
causalAnalysis: CausalAnalysisMethod;
preventiveActions: PreventiveAction[];
effectivenessMeasurement: EffectivenessMeasure[];
}
// Causal Analysis and Resolution
const performCausalAnalysis = async (
defects: Defect[]
): Promise<CausalAnalysisResult> => {
// Categorize defects by root cause
const categorizedDefects = categorizeByRootCause(defects);
// Pareto analysis to identify vital few
const paretoAnalysis = performParetoAnalysis(categorizedDefects);
// Identify top causes (80/20 rule)
const topCauses = paretoAnalysis.categories
.filter(c => c.cumulativePercentage <= 80);
// Root cause analysis for top causes
const rootCauseAnalyses: RootCauseAnalysis[] = [];
for (const cause of topCauses) {
const rca = await performRootCauseAnalysis(cause, {
method: 'five_whys',
depth: 5
});
rootCauseAnalyses.push(rca);
}
// Generate preventive actions
const preventiveActions = rootCauseAnalyses.flatMap(rca =>
generatePreventiveActions(rca)
);
return {
defectsAnalysed: defects.length,
topCauses,
rootCauseAnalyses,
preventiveActions,
expectedImpact: calculateExpectedImpact(preventiveActions)
};
};
// Process Innovation
interface ProcessInnovation {
id: string;
description: string;
expectedBenefit: string;
pilotResults?: PilotResult;
deploymentStatus: 'proposed' | 'piloting' | 'deploying' | 'deployed';
}
const evaluateProcessInnovation = async (
innovation: ProcessInnovation
): Promise<InnovationEvaluation> => {
// Pilot the innovation
const pilotResult = await runPilot(innovation, {
duration: '2 sprints',
teams: ['pilot_team_1', 'pilot_team_2']
});
// Measure against baseline
const comparison = await compareToBaseline(pilotResult);
// Statistical significance test
const significanceTest = performStatisticalTest(
comparison.baseline,
comparison.pilot,
{ confidenceLevel: 0.95 }
);
return {
innovation,
pilotResult,
comparison,
statisticallySignificant: significanceTest.significant,
recommendation: significanceTest.significant && comparison.improvement > 10
? 'Deploy organisation-wide'
: significanceTest.significant
? 'Continue piloting'
: 'Do not proceed'
};
};
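performCausalAnalysis above leans on a performParetoAnalysis helper that is not shown. A minimal sketch of what it might look like, with input and output shapes assumed from how it is used rather than taken from any published asset.

// Illustrative sketch of the performParetoAnalysis helper used above (shapes assumed)
interface CauseCategory {
  rootCause: string;
  count: number;
}

interface ParetoCategory extends CauseCategory {
  cumulativePercentage: number;
}

const performParetoAnalysis = (
  categories: CauseCategory[]
): { categories: ParetoCategory[] } => {
  const total = categories.reduce((sum, c) => sum + c.count, 0);
  const sorted = [...categories].sort((a, b) => b.count - a.count);
  let cumulative = 0;
  return {
    categories: sorted.map(c => {
      cumulative += c.count;
      return { ...c, cumulativePercentage: (cumulative / total) * 100 };
    })
  };
};

// Example: the top two causes account for 70% of defects (the "vital few")
performParetoAnalysis([
  { rootCause: 'Ambiguous requirement', count: 40 },
  { rootCause: 'Missed edge case', count: 30 },
  { rootCause: 'Environment difference', count: 20 },
  { rootCause: 'Merge error', count: 10 }
]).categories.filter(c => c.cumulativePercentage <= 80);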
Technology and Process Change Management
# level5-improvement.yml
improvement_framework:
sources_of_improvement:
- Causal analysis of defects
- Process performance analysis
- Industry best practices
- Technology innovations
- Employee suggestions
evaluation_criteria:
- Expected improvement magnitude
- Implementation cost
- Risk level
- Time to benefit
- Alignment with strategic goals
pilot_process:
duration: 2-4 sprints
scope: 1-2 teams
measurements:
- Key process metrics
- Adoption challenges
- Unintended consequences
deployment_process:
phases:
- Pilot completion and analysis
- Process asset updates
- Training material development
- Phased rollout
- Full deployment
- Effectiveness verification
continuous_improvement_cycle:
- Identify improvement opportunity
- Analyse root causes
- Propose solution
- Evaluate and pilot
- Deploy if successful
- Measure effectiveness
- Institutionalise
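This cycle maps onto the deploymentStatus field of the ProcessInnovation interface shown earlier. One way to enforce the ordering, sketched with an illustrative guard.

// Illustrative: advance an innovation through the improvement lifecycle
type DeploymentStatus = 'proposed' | 'piloting' | 'deploying' | 'deployed';

const nextStatus: Record<DeploymentStatus, DeploymentStatus | null> = {
  proposed: 'piloting',
  piloting: 'deploying',
  deploying: 'deployed',
  deployed: null
};

const advanceInnovation = (
  status: DeploymentStatus,
  pilotSuccessful: boolean
): DeploymentStatus => {
  // Only move beyond piloting when the pilot shows a statistically meaningful improvement
  if (status === 'piloting' && !pilotSuccessful) return 'piloting';
  return nextStatus[status] ?? status;
};

advanceInnovation('piloting', true); // -> 'deploying'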
CMMI and Agile
Integrating CMMI with Agile Practices
// cmmi-agile-mapping.ts
interface CMMIAgileMapping {
practiceArea: string;
agileEquivalent: string[];
integrationApproach: string;
}
const cmmiAgileMapping: CMMIAgileMapping[] = [
{
practiceArea: 'Estimating (EST)',
agileEquivalent: [
'Story point estimation',
'Planning poker',
'Historical velocity'
],
integrationApproach: `
Use velocity-based estimation with story points.
Track estimation accuracy over time.
Maintain estimation guidelines for consistency.
`
},
{
practiceArea: 'Planning (PLAN)',
agileEquivalent: [
'Sprint planning',
'Release planning',
'Backlog refinement'
],
integrationApproach: `
Create release plans from product roadmap.
Detailed sprint plans during sprint planning.
Update plans based on velocity and scope changes.
`
},
{
practiceArea: 'Monitor & Control (MC)',
agileEquivalent: [
'Daily standups',
'Sprint reviews',
'Burndown charts',
'Sprint retrospectives'
],
integrationApproach: `
Daily monitoring through standups.
Sprint-level tracking with burndowns.
Formal review and corrective action in retrospectives.
`
},
{
practiceArea: 'Requirements (RD)',
agileEquivalent: [
'User stories',
'Acceptance criteria',
'Definition of Done'
],
integrationApproach: `
Document requirements as user stories with acceptance criteria.
Use Definition of Done for completeness criteria.
Maintain traceability through tooling.
`
},
{
practiceArea: 'Verification (VER)',
agileEquivalent: [
'TDD',
'Automated testing',
'Code review',
'CI/CD'
],
integrationApproach: `
Verification built into development through TDD.
Automated testing in CI pipeline.
Mandatory code review before merge.
`
}
];
// Agile Metrics for CMMI
interface Metric {
  name: string;
  frequency: string;
}

interface AgileMetrics {
  level2: Metric[];
  level3: Metric[];
  level4: Metric[];
}
const agileMetricsForCMMI: AgileMetrics = {
level2: [
{ name: 'Sprint velocity', frequency: 'per sprint' },
{ name: 'Sprint burndown', frequency: 'daily' },
{ name: 'Defects found in sprint', frequency: 'per sprint' },
{ name: 'Stories completed vs committed', frequency: 'per sprint' }
],
level3: [
{ name: 'Cross-team velocity comparison', frequency: 'quarterly' },
{ name: 'Process compliance', frequency: 'per sprint' },
{ name: 'Training completion', frequency: 'quarterly' }
],
level4: [
{ name: 'Velocity stability (std dev)', frequency: 'rolling 6 sprints' },
{ name: 'Defect density trend', frequency: 'per release' },
{ name: 'Estimation accuracy', frequency: 'per sprint' },
{ name: 'Control chart analysis', frequency: 'monthly' }
]
};
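The Level 4 entries above assume something computes velocity stability from recent sprints. A minimal sketch; the sample velocities are invented.

// Illustrative: velocity stability as standard deviation relative to the mean
const velocityStability = (velocities: number[]): number => {
  const mean = velocities.reduce((sum, v) => sum + v, 0) / velocities.length;
  const variance =
    velocities.reduce((sum, v) => sum + (v - mean) ** 2, 0) / velocities.length;
  return (Math.sqrt(variance) / mean) * 100; // coefficient of variation, as a percentage
};

// Rolling six sprints: roughly 5.7% variation, well inside a 15% target
velocityStability([42, 38, 45, 40, 44, 41]);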
Implementation Roadmap
Phased Approach
# cmmi-implementation-roadmap.yml
phase_1_foundation:
duration: 3-6 months
focus: Level 2 practices
activities:
- Establish project planning processes
- Implement estimation guidelines
- Create status reporting templates
- Define change management process
- Train project managers
success_criteria:
- All projects have documented plans
- Weekly status reports produced
- Estimation accuracy tracked
phase_2_standardisation:
duration: 6-12 months
focus: Level 3 practices
activities:
- Document standard SDLC
- Create process asset library
- Establish training programme
- Define organisational metrics
- Deploy process management tools
success_criteria:
- Standard processes documented
- All staff trained on processes
- Process compliance > 80%
phase_3_measurement:
duration: 6-12 months
focus: Level 4 practices
activities:
- Establish measurement programme
- Create process baselines
- Implement SPC techniques
- Deploy metrics dashboards
- Train staff on quantitative methods
success_criteria:
- Baselines established for key metrics
- Control charts maintained
- Data-driven decisions documented
phase_4_optimization:
duration: Ongoing
focus: Level 5 practices
activities:
- Implement causal analysis programme
- Establish innovation pipeline
- Deploy continuous improvement framework
- Measure improvement effectiveness
success_criteria:
- Regular causal analysis sessions
- Measurable process improvements
- Innovation programme active
Key Takeaways
- Start with Level 2: Build project management fundamentals before organisational standardisation
- CMMI + Agile works: Agile practices can satisfy CMMI requirements with proper documentation
- Metrics matter: Quantitative management requires consistent measurement and analysis
- Process assets: Create reusable templates, guidelines, and tools for consistency
- Training is essential: Staff must understand and follow defined processes
- Continuous improvement: Level 5 is a journey, not a destination
- Evidence collection: Document compliance for appraisals and internal review
- Tailor appropriately: Standard processes should allow project-specific tailoring
CMMI provides a framework for process maturity, but implementation must fit your organisation's context. Focus on real improvement, not just appraisal ratings.