Governance Utility // Benchmark + Monte Carlo

Timeline Delusion Index

A calm, clinical duration sanity-check. Compare declared timelines against benchmark bands and simulate uncertainty using Monte Carlo to quantify “how likely this plan is to survive contact with reality.”

Built with: HTML • Tailwind • Vanilla JS
Benchmarks are generic sanity bands, not vendor commitments.
Inputs
Define your benchmark band (months)

Scope Size is ignored in custom mode. Use your org’s historical data.

Tip: Use months for clean comparisons across project classes.

Departments or units requiring cross-team coordination. More teams = more overhead, per Brooks's Law: "Adding manpower to a late software project makes it later." — Fred Brooks, The Mythical Man-Month (1975)

Training, adoption, process disruption.
Scale (low → high): Cosmetic tweaks → New reports, minor workflow → Process redesign, retraining → Multi-dept overhaul, role changes → Org-wide transformation
Data hygiene, process maturity, decision speed. Feeds Monte Carlo only.
Scale (low → high): No data, no process → Scattered docs, tribal knowledge → Defined processes, some gaps → Clean data, engaged sponsor → Battle-tested, change-ready
Monte Carlo Controls
Enable Monte Carlo
Quantifies probability of hitting declared date

The baseline uses a triangular distribution across the benchmark band, with stochastic overhead from integrations, teams, and change load, and a capped readiness benefit.
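That sampling step can be sketched as follows. This is an illustrative reconstruction, not the tool's actual source: the function names, the overhead factor, and the 15% readiness cap are assumptions; only the triangular-over-the-band shape is stated above.

```javascript
// Sample one duration (months) from a triangular distribution
// over [low, high] with its mode at the benchmark median,
// using the standard inverse-CDF method.
function sampleTriangular(low, mode, high) {
  const u = Math.random();
  const cut = (mode - low) / (high - low);
  return u < cut
    ? low + Math.sqrt(u * (high - low) * (mode - low))
    : high - Math.sqrt((1 - u) * (high - low) * (high - mode));
}

// One simulated run: baseline draw, multiplicative overhead from
// integrations/teams/change, then a capped readiness benefit.
// The 0.15 cap and the factor shapes are illustrative assumptions.
function simulateOnce(band, overheadFactor, readinessFactor) {
  const base = sampleTriangular(band.low, band.median, band.high);
  const withLoad = base * overheadFactor;
  const benefit = Math.min(readinessFactor, 0.15); // capped benefit
  return withLoad * (1 - benefit);
}
```

Repeating `simulateOnce` thousands of times yields the distribution the outputs below are read from.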

Interpretation Rules
  • Reality Ratio — compares declared duration to the benchmark median.
  • Delusion Index — measures structural compression: how far below median is the declared duration, given integration, team, and change load? Org Readiness is excluded — it affects delivery probability, not the compression gap.
  • Coordinating Teams — overhead scales quadratically per Brooks’s Law. Going from 5 to 6 teams costs more than going from 2 to 3.
  • Monte Carlo — simulates full probability distribution including Org Readiness. Answers: “What are the odds we finish by the declared date?”
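The rules above can be sketched as a small calculation. The weights below are hypothetical placeholders; only the quadratic team scaling, the exclusion of Org Readiness, and the 8.0 cap come from this page.

```javascript
// Brooks's Law: communication channels grow as n(n-1)/2, i.e.
// quadratically, so 5→6 teams costs more than 2→3.
// The 0.02 per-channel weight is an illustrative assumption.
function coordinationOverhead(teams) {
  return 1 + 0.02 * teams * (teams - 1) / 2;
}

// Reality Ratio: declared duration vs. the benchmark median.
function realityRatio(benchmarkMedian, declaredMonths) {
  return benchmarkMedian / declaredMonths;
}

// Delusion Index: structural compression against the load-adjusted
// median. Org Readiness is deliberately excluded — it affects
// delivery probability, not the compression gap.
function delusionIndex(benchmarkMedian, declaredMonths, teams, loadFactor) {
  const adjustedMedian = benchmarkMedian * coordinationOverhead(teams) * loadFactor;
  return Math.min(adjustedMedian / declaredMonths, 8.0); // 8.0 = cap
}
```

With these shapes, going from 5 to 6 teams adds 5 communication channels while going from 2 to 3 adds only 2, which is the asymmetry the rule describes.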
When to Use This Tool
01
Annual planning & budgeting. When leadership is locking in delivery dates to fit fiscal calendars rather than delivery reality. Validate timelines before they become commitments.
02
Vendor selection. When an implementation partner pitches an aggressive timeline. Use this to pressure-test their estimates against industry benchmarks before signing.
03
Scope change decisions. Mid-project, when someone wants to add integrations or bring in more teams. Plug in the new numbers and show what just happened to the probability.
04
Re-baselining conversations. When a project is already in trouble. “I feel like we need more time” is weak. “There’s a 4% probability of hitting the current date” is a different conversation.
05
Portfolio prioritisation. Running multiple initiatives? Score each one to see which timelines are credible and where to concentrate governance attention.
Benchmark Lookup (months)
Type  Small  Medium  Large
AI    3–6    6–12    12–24
ERP   6–9    9–18    18–36
CRM   3–6    6–12    12–18
HR    3–6    6–9     9–15

Generic heuristic baselines. Active selection highlighted.
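The lookup table maps directly onto a small data structure. This is a sketch: the bands are transcribed from the table above, but the object shape and the use of the band midpoint as the median are assumptions.

```javascript
// Benchmark bands in months, keyed by project type and scope size,
// transcribed from the lookup table.
const BENCHMARKS = {
  AI:  { small: [3, 6], medium: [6, 12], large: [12, 24] },
  ERP: { small: [6, 9], medium: [9, 18], large: [18, 36] },
  CRM: { small: [3, 6], medium: [6, 12], large: [12, 18] },
  HR:  { small: [3, 6], medium: [6, 9],  large: [9, 15] },
};

// Band midpoint, used here as the benchmark median (an assumption;
// a skewed median could be substituted from historical data).
function bandMedian([low, high]) {
  return (low + high) / 2;
}
```

For example, `bandMedian(BENCHMARKS.ERP.medium)` gives 13.5 months as the sanity-check midpoint for a medium ERP rollout.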

Reality Telemetry
Benchmark Band (months)
Median:
With your load:
Median extended by integration, team & change overhead
Declared Duration
Reality Ratio:
Delusion Index

1.0 = aligned • 3.0+ = hallucination • 5.0+ = fantasy • 8.0 = cap

Classification
Suggested Posture
Monte Carlo Output
Disabled
Probability of hitting declared date
Runs:

Low % = schedule is a political statement.
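The probability figure is, under a reasonable reading, just the fraction of simulated runs that finish on or before the declared duration — a one-liner once the samples exist (the function name is hypothetical):

```javascript
// Probability of hitting the declared date: the share of simulated
// durations (in months) at or under the declared duration.
function hitProbability(samples, declaredMonths) {
  const hits = samples.filter((d) => d <= declaredMonths).length;
  return hits / samples.length;
}
```

If only 400 of 10,000 runs come in on time, that is the 4% figure used in the re-baselining example above.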

Simulated duration percentiles
P50
P80
P90

Suggested commitment:
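Percentiles like these come from sorting the simulated durations. A minimal nearest-rank sketch (the tool's own interpolation method is not stated, and the "commit at P80" posture is a common convention, not a claim about this tool):

```javascript
// Nearest-rank percentile over an array of simulated durations.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  const idx = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.min(sorted.length - 1, Math.max(0, idx))];
}

// One common posture: plan to P50, commit at P80, escalate past P90.
function simulatedPercentiles(samples) {
  return {
    p50: percentile(samples, 50),
    p80: percentile(samples, 80),
    p90: percentile(samples, 90),
  };
}
```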

Recommended Governance

Disclaimer

Benchmarks are generic planning heuristics. Monte Carlo simulation illustrates uncertainty under assumed variability; it is a decision aid, not a guarantee. Actual timelines vary by scope definition, vendor maturity, integrations, data quality, regulatory burden, and change adoption.

Benchmark Sources & References

The benchmark bands used in this tool are derived from industry research, analyst reports, and vendor-neutral advisory data. They represent generic heuristic baselines across project types and scope sizes, not vendor-specific commitments. Key sources are listed below.

ERP Implementation
CRM Implementation
AI Implementation
HR / HRIS Implementation

Note: Benchmark bands in this tool are synthesised from the above sources and represent conservative planning heuristics. Individual project timelines will vary based on scope, vendor, org complexity, and change readiness. Sources last verified February 2026.