Evaluation Metrics and International Standards for Selecting or Acquiring SI Companies

 

Technical and Development Capability Metrics

To evaluate an SI firm’s development capability, it’s important to quantify software delivery capacity and engineering efficiency. Widely used today are the DevOps/DORA metrics—deployment frequency, lead time for changes, time to restore service (MTTR), and change failure rate (CFR)—as a baseline for engineering performance. For example, DORA’s “Elite” performers ship with lead time < 1 day and MTTR < 1 hour; checking whether a team meets these thresholds helps gauge its strength (see figure). Also useful is whether the firm is appraised under CMMI (Capability Maturity Model Integration) and at what level. CMMI-DEV (for development) assesses process maturity across defined Process Areas and quantifies organizational capability via formal appraisals. Higher maturity (Levels 4–5) indicates stronger process improvement capability, though the level alone doesn’t guarantee outcomes. For product quality, ISO/IEC 25010 defines eight characteristics—functional suitability, reliability, performance efficiency, usability, security, compatibility, maintainability, portability—that you can map to concrete measures and assess implementation status. Combining these development and quality indicators enables a clearer comparison of SI vendors (see figure).

Figure: Example DORA performance categories (software delivery speed and stability for DevOps teams)
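As a rough sketch, the DORA thresholds above can be turned into a simple tier classifier for screening purposes. The function name, the High/Medium cutoffs, and the change-failure-rate bound are illustrative assumptions of ours, not an official DORA tool:

```python
# Illustrative sketch: classify a team's DORA performance tier.
# Elite thresholds (lead time < 1 day, MTTR < 1 hour) follow the text;
# the High/Medium/Low cutoffs and CFR bound are assumed for illustration.

def dora_tier(lead_time_hours: float, mttr_hours: float,
              deploys_per_day: float, change_failure_rate: float) -> str:
    """Return a rough performance tier from the four DORA metrics."""
    if (lead_time_hours < 24 and mttr_hours < 1
            and deploys_per_day >= 1 and change_failure_rate <= 0.15):
        return "Elite"
    if lead_time_hours < 24 * 7 and mttr_hours < 24:      # within a week / a day
        return "High"
    if lead_time_hours < 24 * 30 and mttr_hours < 24 * 7:  # within a month / a week
        return "Medium"
    return "Low"

print(dora_tier(6, 0.5, 3, 0.05))      # frequent small releases, fast recovery
print(dora_tier(336, 48, 0.1, 0.30))   # two-week lead time, two-day recovery
```

In diligence, the same classification can be run per team or per product line to reveal internal variance that a single company-wide average would hide.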

Quality and Security Management Maturity

Quality and information-security management maturity are critical to an SI firm’s trustworthiness. Certification to ISO/IEC 27001 (ISMS) evidences alignment with a recognized standard—covering risk assessment, asset management, access control, etc.—and signals commitment to protecting customer data. For development-specific security maturity, OWASP SAMM (Software Assurance Maturity Model) evaluates the software security program across the lifecycle—design, implementation, verification, deployment—so you can quantify balance and resourcing. The U.S. NIST SSDF (Secure Software Development Framework) provides a common language of secure-development practices to reduce vulnerabilities, useful for both buyers and suppliers. Assessing adoption and conformance to ISO 27001, SAMM, and SSDF gives an objective view of quality and security maturity.
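To make the "balance" point concrete, here is a hedged sketch of aggregating SAMM-style practice scores (0–3) into per-function averages so imbalances stand out. The business-function and practice names follow SAMM v2; every score below is invented for illustration:

```python
# Hedged sketch: aggregate OWASP SAMM-style maturity scores (0-3 per
# security practice) into per-function averages. All scores are invented.

samm_scores = {
    "Governance":     {"Strategy & Metrics": 2, "Policy & Compliance": 1,
                       "Education & Guidance": 1},
    "Design":         {"Threat Assessment": 1, "Security Requirements": 2,
                       "Security Architecture": 1},
    "Implementation": {"Secure Build": 2, "Secure Deployment": 2,
                       "Defect Management": 1},
    "Verification":   {"Architecture Assessment": 0, "Requirements-driven Testing": 1,
                       "Security Testing": 2},
    "Operations":     {"Incident Management": 1, "Environment Management": 2,
                       "Operational Management": 1},
}

# Average each business function, then flag the weakest one for diligence.
averages = {fn: sum(p.values()) / len(p) for fn, p in samm_scores.items()}
weakest = min(averages, key=averages.get)
print(averages)
print("weakest function:", weakest)
```

A lopsided profile (e.g., strong implementation but weak verification) is often more informative than the overall average.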

Process Maturity

To measure the maturity of development/operations processes themselves, use CMMI or the ISO/IEC 33000 series (the successor to ISO/IEC 15504 “SPICE”). CMMI-DEV’s staged representation rates maturity across five levels; the continuous representation scores capability by Process Area, both via formal appraisals. The ISO/IEC 330xx standards define process capability/maturity assessment; ISO/IEC 15504 has been progressively replaced by the 330xx series (core standards published from 2015), with documents such as ISO/IEC 33020 (process measurement framework) and ISO/IEC 33063 (assessment model for software testing) supplying assessment methods and criteria. Third-party assessment results (e.g., CMMI Level 3) and process assessment reports are valuable for M&A due diligence and comparative evaluation.

Financial Health

Financial health is essential for selection and acquisition. Look at revenue scale and growth, operating margin/EBITDA margin, and cash flow. Assess customer concentration to understand dependency risk—e.g., if any single client accounts for a large share (often ≥10%), the vendor may face switching risk and skewed bargaining power. In diligence, quantify dependency on the top 5–10 clients and verify multi-year contract renewal rates and churn. High churn in maintenance/subscription models weakens cash-flow stability. Also evaluate operating cash-flow trends, leverage and equity ratios, FX exposure, and asset liquidity to judge short- and long-term solvency.
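The concentration check above reduces to a small calculation. The helper name, the HHI supplement, and the revenue figures below are illustrative assumptions, not a prescribed diligence formula:

```python
# Hypothetical sketch: quantify customer concentration from a
# revenue-by-client breakdown (figures invented for illustration).

def concentration(revenues: list[float], top_n: int = 5) -> dict:
    """Largest-client share, top-N share, and a Herfindahl-Hirschman index."""
    total = sum(revenues)
    shares = sorted((r / total for r in revenues), reverse=True)
    return {
        "top_share": shares[0],                    # largest single client
        f"top_{top_n}_share": sum(shares[:top_n]),
        "hhi": sum(s * s for s in shares),         # 0..1; higher = more concentrated
    }

m = concentration([40, 25, 15, 10, 5, 5])
print(m)  # flags a 40% dependency on the largest client
```

A top-client share well above the 10% heuristic in the text, or a high HHI, signals the switching risk and bargaining-power skew discussed above.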

Customer Portfolio, Industry Coverage, and Track Record

An SI firm’s strength appears in its customer mix, industry coverage, and case record. Review industry domains served (finance, manufacturing, public sector, etc.), specialties, major reference customers, and past large projects delivered. Vendor-selection frameworks often score industry expertise, and published success stories and references are weighted heavily. Because requirements differ by sector (e.g., healthcare vs. manufacturing), a wider cross-industry track record indicates versatility. Global accounts and overseas delivery centers also signal capacity for scale. Since these are harder to quantify, list project summaries (scope/value), client scale, and outcomes for comparison. Supplement with repeat engagement rate and customer referrals as qualitative evidence of loyalty.

Governance, Contract Compliance, and Risk Management

Evaluate corporate governance, contract management, and risk response maturity. ISO 31000 provides an enterprise-wide risk-management framework—assessment, treatment planning, monitoring, and continuous improvement—applicable to any organization. Conformance indicates aligned risk understanding and executive oversight. ISO/IEC 38500 is the international standard for IT governance, embedding IT strategy, investment, and compliance into corporate governance. Check principles like accountability, business alignment, performance, compliance, and human behavior in practice. In SI-specific contracts, examine contract adherence (uptime/service quality guarantees, maintenance/support obligations) and compliance programs (data-leak prevention, privacy protection). Verify regulatory readiness (e.g., GDPR, sector rules) and certifications (e.g., Privacy Mark, SOC 2). Turning these governance/risk factors into a checklist with scoring helps make risk reduction visible.

Customer Satisfaction, Repeat Business, CSAT/NPS

Ultimately, service quality shows up in CSAT (Customer Satisfaction), NPS (Net Promoter Score), and repeat/renewal rates. CSAT uses 5- or 10-point scales to rate satisfaction with deliverables and service responsiveness. NPS gauges loyalty via “likelihood to recommend,” indicating long-term relationship health and brand value. Collect recent survey results, year-over-year scores, renewal/expansion rates for major accounts, and qualitative feedback on incident handling. For vendor comparisons, average CSAT/NPS across multiple clients and show renewal rate (%) to visualize customer-centric performance.
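For reference, NPS reduces to a simple calculation over 0–10 "likelihood to recommend" responses: the percentage of promoters (9–10) minus the percentage of detractors (0–6). The sample scores below are invented:

```python
# Minimal sketch of the standard NPS calculation (sample data invented).

def nps(scores: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

print(nps([10, 9, 9, 8, 7, 6, 10, 3]))  # 4 promoters, 2 detractors of 8
```

Note that 7–8 responses ("passives") dilute the score without counting either way, which is why a stable NPS across accounts is a stronger signal than a single high reading.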

Evaluation Methods, Visualization, and Checklist (Examples)

To compare across dimensions, use scorecards and radar charts. A vendor scorecard might plot axes like Quality, On-time Delivery, Cost, Support, and Compliance to make relative differences obvious. Likewise, a structured checklist with weighted scoring (e.g., 1–5 points per item) is standard.
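A minimal sketch of such weighted scoring follows; the axis weights and the two vendors' scores are invented for illustration:

```python
# Illustrative weighted scorecard: each axis scored 1-5, weighted to a
# single comparable total. Weights and vendor scores are invented.

WEIGHTS = {"quality": 0.30, "delivery": 0.20, "cost": 0.20,
           "support": 0.15, "compliance": 0.15}

def weighted_score(scores: dict[str, int]) -> float:
    """Weighted sum of 1-5 axis scores; weights must total 1.0."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[axis] * scores[axis] for axis in WEIGHTS)

vendor_a = {"quality": 4, "delivery": 5, "cost": 3, "support": 4, "compliance": 5}
vendor_b = {"quality": 5, "delivery": 3, "cost": 4, "support": 3, "compliance": 4}
print(weighted_score(vendor_a), weighted_score(vendor_b))
```

The same per-axis scores can feed a radar chart directly, so the scorecard and the visualization stay consistent.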

Sample checklist (excerpt): tailor weights/thresholds to your requirements.

  • Technical/Development Capability: DORA metrics (deployment freq., lead time, MTTR, CFR); CMMI maturity level; engineering headcount/capacity; test and CI automation rate; code-quality diagnostics

  • Quality/Security: ISO/IEC 25010 characteristic coverage; ISO 27001 certification; OWASP SAMM level; NIST SSDF adoption; vulnerability remediation rate

  • Process Maturity: CMMI maturity/capability; ISO/IEC 330xx process capability results; process documentation and SOP completeness

  • Financial Health: Revenue and growth; EBITDA margin; cash-flow trends; equity ratio/leverage; customer concentration (top-5 share); renewal/churn rates

  • Customer Portfolio & Industry Coverage: Key clients (industry/size); sector-specific case counts; large-deal history (contract value); global delivery footprint

  • Governance & Risk: ISO 31000 alignment and risk organization; ISO/IEC 38500 alignment; contract compliance history; legal/regulatory readiness (privacy, compliance)

  • Customer Satisfaction & Loyalty: CSAT/NPS (latest surveys); renewal rate; referrals; support satisfaction

Mapping these indicators to scorecards and comparison tables supports selection and acquisition decisions with objective evidence. Finally, third-party certifications (ISO, CMMI, etc.) and industry benchmarks should be referenced alongside your internal scoring to triangulate the risk and investment thesis.

Figure: Example vendor scorecard combining multiple axes (Quality, On-time, Cost, Support, Compliance)

