Benchmarking & Metrics Summary Reports (Best Practice)


Overview

The Construction Industry Institute (CII) established the CII Benchmarking and Metrics (BM&M) program in 1995 to provide self-analysis tools to member companies, quantify the benefits of CII Best Practices, and support research teams. By 2002 the program had achieved those goals by deploying the first online benchmarking questionnaire and by generating key reports and many industry-sponsored studies. The program has continued to evolve since then to meet the needs of CII members and to support the Institute’s strategic goals.

Key Findings and Implementation Tools

1 : CII Guidance

The BM&M Committee guides the program, interfacing with CII leadership through the Board of Advisors and other committees to better serve the membership. The committee oversees the program and works alongside the CII benchmarking staff to promote and develop benchmarking initiatives. As the needs of the membership have changed, so have the services the BM&M program provides; as the membership continues to evolve, the program will stay attuned to its emerging needs. (BMM2013-1, page 2)

Reference: (BMM2013-1)

2 : Growth of Benchmarking Database

CII started collecting project data with a paper-based questionnaire in 1996. By the time the 2002 summary report was published, the database had grown to 1,037 projects with a total installed value of $54.2 billion, and the program had begun accepting project data through a web-based questionnaire. Submissions were split almost evenly between owners and contractors, as shown in Figure 1. After consistent decreases in the average project cost reported by both owners and contractors, the trend had begun to level off, with small projects dominant. As the trend toward small projects continues, changes in aggregate performance and practice use metrics may also become apparent.

Reference: (BMM2002-3)
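The totals cited above imply an average project size; a minimal sketch of that arithmetic, using only the figures stated in the 2002 report (1,037 projects, $54.2 billion total installed value):

```python
# Average installed value per project in the 2002 database,
# derived from the totals cited in the summary report.
total_installed_value = 54.2e9  # USD, total installed value of all projects
project_count = 1_037

avg_value = total_installed_value / project_count
print(f"Average installed value per project: ${avg_value / 1e6:.1f} million")
```

This works out to roughly $52 million per project, consistent with the report's observation that small projects had become dominant in the database.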

3 : 2013 Report

BM&M Committee members composed a summary report in 2013 to provide insight into the current status of the program and the services it provides, with emphasis on the changes that have taken place since 2002. CII members that have not been involved with the program for several years are strongly encouraged to revisit it to learn how it has changed and how it may benefit their organizations. Given the past and recent work CII has done to improve the program, participating companies will gain tremendous value in both the near and long terms. (BMM2013-1, page 1)
Reference: (BMM2013-1)

4 : Sector Specific Programs

The program has responded to industry-specific benchmarking needs with absolute metrics and data to support member companies engaged in the pharmaceuticals and biotechnology, upstream oil and gas, downstream oil and gas, and healthcare sectors. The set of metrics has been broadened, the database has grown, and the toolset available for interfacing with the data has been improved significantly. Program expansion will continue in industry sectors that have a need and are willing to participate and provide support. Productivity benchmarking for both engineering and construction has also been developed. (BMM2013-1, page 12)
Reference: (BMM2013-1)

5 : Performance Assessment Labs

CII’s strategic goal of expanding its geographical reach to serve members’ global operations is supported by the benchmarking program through the establishment of Performance Assessment Labs (PALs) in regions throughout the world. This expansion allows the program to leverage local support, knowledge, and resources, while expanding and improving the program’s capacity to serve member companies. PALs currently operate in Brazil and Canada, and CII plans to develop more PALs as opportunities arise. (BMM2013-1, page 13)
Reference: (BMM2013-1)

6 : Value of Benchmarking Database

The traditional research objectives of CII continue to find support from the benchmarking program. Many new research teams use data and definitions from the program to help form their data collection and research analysis frameworks. (BMM2013-1, page 15)
The BM&M program also continues to produce the annual safety report, which has become an industry-standard reference for CII members and non-members alike. In 2001, the BM&M system assessed the level of implementation of safety practices to provide feedback and to quantify practice impacts on project performance. Figure 13, provided for illustration, shows safety practice use norms for owner domestic projects segregated by cost category. As the figure depicts, project size has a significant impact on the level of practice use: larger projects report greater practice use on average. This finding prompted research into small projects to determine how practices may need to be modified for use on them. Note that BM&M materials for safety are included in the Safety Knowledge Area. (BMM2013-1, page 6)

Reference: (BMM2013-1)
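The kind of norm shown in Figure 13 is a grouped average: practice-use scores binned by project cost category. A minimal sketch of that computation, with invented sample data; the cost brackets and the 0-10 practice-use index here are assumptions for illustration, not CII's actual metric definitions:

```python
# Illustrative sketch (hypothetical data): safety practice use norms by
# project cost category, in the spirit of Figure 13. Cost breakpoints and
# the 0-10 practice-use index are assumed, not taken from CII definitions.
from statistics import mean

projects = [
    # (total cost in $MM, safety practice use index on an assumed 0-10 scale)
    (3, 4.1), (8, 5.0), (22, 6.2), (45, 6.8), (120, 8.1), (300, 8.7),
]

def cost_category(cost_mm):
    """Assign a project to an assumed cost bracket."""
    if cost_mm < 15:
        return "<$15MM"
    elif cost_mm < 50:
        return "$15-50MM"
    return ">$50MM"

# Group practice-use scores by cost category, then average each group.
norms = {}
for cost, use in projects:
    norms.setdefault(cost_category(cost), []).append(use)

for category, scores in norms.items():
    print(f"{category}: mean practice use = {mean(scores):.1f}")
```

With real database submissions in place of the sample tuples, this reproduces the pattern the report describes: mean practice use rising with project size.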

7 : Benchmarking Process Automation

Several years ago, it was recognized that automating the entire CII benchmarking process on the Web was essential to running the system in the lean, efficient manner that participant needs and resource constraints demanded. Significant strides have been made in this regard, and members are currently reaping the benefits. Data are now collected exclusively via a Web-based questionnaire, and reports are returned in the same manner. By 2002, data could be submitted during project execution, when it was most convenient to the project team, and an interim report, the Progress Key Report, was available online even before a project was finally submitted. (BMM2002-3, page 23)
Reference: (BMM2002-3)

8 : Data Mining and Self-Analysis

The database can be mined to formulate typical industry norms that help users understand how the industry in general performs. The BM&M program enables members to perform self-analysis of project performance and Best Practices use. The newly released Performance Assessment System (PAS) interface provides a more intuitive, user-friendly approach to conducting self-analysis and mining the data to support decision-making. Several examples of representative industry data analyses, and of how they may be used in practical applications, are provided. (BMM2013-1, page 10)
Reference: (BMM2013-1)
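One common form of self-analysis is placing a single project's metric against norms mined from the database. A minimal sketch of that comparison, assuming a cost-growth metric and invented benchmark values (neither is CII's actual metric definition or data):

```python
# Hypothetical sketch of database-backed self-analysis: ranking one
# project's cost growth against a set of benchmark projects. All values
# are invented for illustration; lower cost growth is better.
def percentile_rank(values, x):
    """Fraction of benchmark values at or below x."""
    at_or_below = sum(1 for v in values if v <= x)
    return at_or_below / len(values)

# Assumed benchmark sample: fractional cost growth of peer projects.
benchmark_cost_growth = [0.02, 0.04, 0.05, 0.08, 0.10, 0.12, 0.15, 0.20]
my_project = 0.05  # this project's cost growth (5%)

rank = percentile_rank(benchmark_cost_growth, my_project)
print(f"Cost growth at or below {rank:.0%} of benchmark projects")
```

A tool like the PAS interface layers filtering (sector, project size, owner vs. contractor) on top of this same idea, so the comparison set matches the project being assessed.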

9 : Implementation Tool #1

IR-BMM-2, Benchmarking & Metrics Implementation Toolkit Pocket Guide

This pocket guide is a companion to the online Benchmarking & Metrics Implementation Toolkit. It defines benchmarking, explains the reasoning behind it, and details the steps and activities needed to implement a benchmarking process as a Best Practice.

Reference: (IR-BMM-2)

Key Performance Indicators

Improved cost, Improved schedule, Improved quality (reduced errors & omissions), Reduced change, Improved safety

Research Publications

Benchmarking & Metrics Summary Report - BMM2013-1

Publication Date: 01/2014 Type: Performance Assessment Pages: 39 Status: Reference

Benchmarking & Metrics Implementation Toolkit Pocket Guide - IR-BMM-2

Publication Date: 05/2004 Type: Performance Assessment Pages: 44 Status: Tool

Benchmarking and Metrics Summary Report for 2001 - BMM2002-3

Publication Date: 02/2002 Type: Performance Assessment Pages: 38 Status: Supporting Product


Presentations from CII Events

Session - 10-10: Measures that Matter

Number of Slides: 60 Event Code: AC2014

Plenary Session - CII’s 10-10 Performance Assessment Campaign

Number of Slides: 14 Event Code: AC2013

Implementation Session - CII’s 10-10 Performance Assessment Campaign

Number of Slides: 31 Event Code: AC2013

Plenary Session - Quantitative Easing 3.0 – Boosting the Amount of Information in the Project System

Number of Slides: 10 Event Code: AC2011

Implementation Session - Quantitative Easing 3.0 – Boosting the Amount of Information in the Project System

Number of Slides: 52 Event Code: AC2011

Plenary Session - Don’t Gamble with Your Project’s Performance

Number of Slides: 20 Event Code: AC2009

Implementation Session - Don’t Gamble with Your Project’s Performance

Number of Slides: 35 Event Code: AC2009

Plenary Session - Industry Demands Clear Measurements: The Case for Tailored, Transparent Metrics

Number of Slides: 21 Event Code: AC2006

Implementation Session - Productivity Metrics: Which Circle Is Yours?

Number of Slides: 25 Event Code: AC2006

Implementation Session - Industry Specific Metrics: The Case for Tailored, Transparent Metrics

Number of Slides: 30 Event Code: AC2006

Plenary Session - Benchmarking: What the Data Tell Us

Number of Slides: 16 Event Code: AC2007

Implementation Session - Benchmarking: What the Data Tell Us

Number of Slides: 39 Event Code: AC2007

Session - Benchmarking as a Best Practice

Number of Slides: 18 Event Code: AC2004

Session - Benchmarking: The Journey to Improvement

Number of Slides: 34 Event Code: AC2001

Session - Benchmarking in the Information Technology Age

Number of Slides: 20 Event Code: AC2000

Session - Benchmarking for the Next Millennium

Number of Slides: 13 Event Code: AC1999

Session - BM&M: What Can It Do for You?

Number of Slides: 55 Event Code: AC1998

