Data: Navigating a new currency

The Construction Playbook1 stresses that: “A critical success factor for the effective completion and transition of a project or programme is the sharing of high quality, robust data and information between parties during the project lifecycle and into operation.”2 A few years earlier, in 2018, the Gemini Principles3 around data sharing for the forthcoming National Digital Twin had put a figure on this, stating that greater data sharing could release an additional £7bn per year of benefits across UK infrastructure, equivalent to 25% of total spend.4

Establishing protocols and processes around data sharing is essential for the transformation of construction. While data sharing practices have yet to be fully established and normalised, they will happen – and civil engineering surveyors should be enacting best practice and ensuring their continuing professional development factors in skills in data management. While protocols and standards focus on quality processes, surveyors focus on quality data and therefore are natural leaders in managing and specifying data requirements.

The information delivery lifecycle

Information is developed and built up through the lifecycle of a project, commonly referred to as the digital plan of work (DPoW). The unified CIC/APM digital plan of work consists of eight generic stages, numbered 0 to 7, each ending in a stage gateway.

The level of information need (formerly known as the ‘level of definition’) is defined for each stage gateway and is the aggregate of the ‘level of detail’ and the ‘level of information’. The ‘level of detail’ is the description of the graphical content required to address the decisions at each stage gateway; the ‘level of information’ is the description of the non-graphical content required for the same purpose.
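A minimal sketch (in Python) of how the level of information need at a stage gateway could be recorded as the pairing of a graphical and a non-graphical requirement. The bands and field names below are illustrative only and are not drawn from any standard.

```python
from dataclasses import dataclass
from enum import Enum


class LevelOfDetail(Enum):
    """Illustrative bands for the graphical content required at a stage gateway."""
    SCHEMATIC = 1     # symbolic or placeholder geometry
    APPROXIMATE = 2   # correct form, indicative dimensions
    PRECISE = 3       # dimensionally accurate, as-designed
    AS_BUILT = 4      # verified against the constructed works


class LevelOfInformation(Enum):
    """Illustrative bands for the non-graphical content required at a stage gateway."""
    OUTLINE = 1       # assumptions and ranges, many unknowns
    DEVELOPED = 2     # specified products and performance data
    COMPLETE = 3      # full commissioning, operation and maintenance data


@dataclass
class LevelOfInformationNeed:
    """Aggregate of the graphical and non-graphical requirements for one stage gateway."""
    stage_gateway: int                         # digital plan of work stage (0-7)
    level_of_detail: LevelOfDetail             # graphical content required
    level_of_information: LevelOfInformation   # non-graphical content required


# Example: more is demanded at detailed design (Stage 4) than at concept (Stage 2)
concept = LevelOfInformationNeed(2, LevelOfDetail.APPROXIMATE, LevelOfInformation.OUTLINE)
detailed = LevelOfInformationNeed(4, LevelOfDetail.PRECISE, LevelOfInformation.DEVELOPED)
```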

As information progressively develops at each stage throughout project delivery, it collectively forms the project information model (PIM). The graphical representation may not change at every stage, but information will be added at each one.

For example, at concept stage the graphical detail may look very realistic but be spatially inaccurate, and the information is likely to be low grade with a lot of unknowns. At handover and close-out, by contrast, the graphical detail will accurately reflect the as-built position of the works and the information delivered will be sufficient to maintain and operate them.

The production and delivery of information on a project is assigned to specific task teams (disciplines) - for example civil, mechanical and electrical. These ‘own’ the information they are responsible for producing and only they can create or edit that data. All information, regardless of the work stage at which it is developed, can be assigned one of three states: work-in-progress, shared or published.

The work-in-progress state is used for information while it is being developed by its task team/discipline. Information in this state is not visible or accessible to any other discipline.

When a discipline is ready to share its information, the information must pass through a check, review and approval workflow and is given a status code (often referred to as a suitability code). This is necessary so that the receiving party can have confidence in the information shared and understands the purpose for which it was shared. The status codes that can be assigned are:

The purpose of the shared state is to enable constructive and collaborative development of the Information Model within a delivery team.

When a discipline promotes information to the published state it must pass through a further review and authorisation workflow. The published status codes assigned - A0, A1, A2, A3, A4, A5, A6, A7 - all indicate the stage gateway of the digital plan of work they refer to. Information in the shared and published states is visible and accessible to other disciplines within a delivery team but is not editable by them. If the information requires editing, it is returned to the work-in-progress state for amendment and resubmission by the discipline that owns it.
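A minimal sketch of these state transitions and the ownership rules they enforce, assuming illustrative class names and codes rather than the behaviour of any particular CDE product:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class InformationContainer:
    """An item of project information owned by the task team (discipline) that produces it."""
    name: str
    owner: str                          # owning task team / discipline
    state: str = "work-in-progress"
    status_code: Optional[str] = None   # suitability code assigned on sharing or publication

    def share(self, actor: str, status_code: str) -> None:
        """Promote work-in-progress information after its check, review and approval workflow."""
        if actor != self.owner:
            raise PermissionError("only the owning discipline may share its information")
        if self.state != "work-in-progress":
            raise ValueError("only work-in-progress information can be shared")
        self.state, self.status_code = "shared", status_code

    def publish(self, actor: str, stage_code: str) -> None:
        """Promote shared information after the further review and authorisation workflow."""
        if actor != self.owner:
            raise PermissionError("only the owning discipline may publish its information")
        if self.state != "shared":
            raise ValueError("only shared information can be published")
        self.state, self.status_code = "published", stage_code   # e.g. 'A4' for stage gateway 4

    def edit(self, actor: str) -> None:
        """Shared or published information returns to work-in-progress before amendment."""
        if actor != self.owner:
            raise PermissionError("other disciplines may read but not edit this information")
        self.state, self.status_code = "work-in-progress", None


drainage = InformationContainer("drainage-layout", owner="civil")
drainage.share("civil", status_code="S2")    # illustrative suitability code
drainage.publish("civil", stage_code="A4")   # published against stage gateway 4
```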

This process of information development and exchange is defined by BS EN ISO 19650-2:2018 and is undertaken within a common data environment (CDE). A CDE is the single source of information for a project, used to collect, manage, and disseminate all relevant project information through a managed process. A critical function of the CDE is to provide a clear and secure audit trail or journal of all changes and amendments to that information, including who created it, who read it, who edited it, who shared it (and for what purpose), who checked and reviewed it, who approved it, who authorised it to be ‘published’ and when all these activities took place.
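A minimal sketch of the kind of append-only journal entry such an audit trail might hold; the field names are illustrative and not taken from any CDE platform.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class AuditEntry:
    """One immutable line in the CDE journal: who did what to which container, why and when."""
    container: str    # information container affected
    actor: str        # person or discipline performing the action
    action: str       # created, read, edited, shared, checked, approved, authorised, published
    purpose: str      # the purpose or suitability with which it was shared, if any
    timestamp: datetime


journal: list[AuditEntry] = []


def record(container: str, actor: str, action: str, purpose: str = "") -> None:
    """Append-only: entries are never edited or removed, preserving the audit trail."""
    journal.append(AuditEntry(container, actor, action, purpose,
                              datetime.now(timezone.utc)))


record("drainage-layout", "civil", "shared", purpose="for coordination")
record("drainage-layout", "lead appointed party", "authorised")
```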

At the end of Stage 6, the as-built information represents the as-built asset in content and dimensional accuracy and is submitted to the client for acceptance, along with the commissioning and handover documentation. The complete PIM is handed over at the end of the project, culminating in the transfer of relevant information from the PIM to the asset information model (AIM), for use in asset management and potentially within a digital twin.

Reaching this state of high-quality, robust information requires a careful and structured approach, including adherence to strict processes and standards and an element of risk management.

Standards and standardisation 

The UK government’s National Data Strategy5 of December 2020 stated that, while the standards were ‘well recognised’, SMEs generally did not use information management. The key hurdles to be overcome included software licensing and cost; a lack of in-house training and skills; interoperability; a perception that BIM was only for larger construction projects; and a lack of demand from clients.

Since then, the Construction Playbook has clearly set out to ensure that client demand is there (at least in the public sector); the Government and Industry Interoperability Group (GIIG) has been established to support interoperability; software houses and market forces are addressing licensing costs; and training is filtering through the supply chain from the major contractors. Perception will change in time and professional bodies have a role to play in providing learning opportunities around specification requirements and standards, and the importance of a balanced and structured approach to data management throughout the lifetime of an asset.

Standardisation of data is necessary for collaboration. The Construction Playbook is very clear about government expectations of contractors around data management, explicitly saying they should use the UK BIM Framework to standardise the approach to generating and classifying data, data security and data exchange, and to support the adoption of the Information Management Framework and the creation of the National Digital Twin.

Naming protocols for information containers, objects and layers should be established early in a project and align with the needs of the client. It is imperative that these requirements are communicated to the project delivery team via the BIM Execution Plan (BEP) and that everyone adheres to them. The Geospatial Commission uses FAIR terminology6 to assess the fitness for purpose of data, with data that is: findable, accessible, interoperable and reusable.

The term Q-FAIR is also used by the commission, adding ‘quality’ to the data ideal. Civil engineering surveyors – both commercial and geospatial – should keep the Q-FAIR principle in mind when commissioning, capturing and managing data. The role of the surveyor in determining quality is key to the success of projects and will cover the currency, accuracy level, verification and suitability of data – addressing concerns around how much the data can be trusted and how it will be used with other data. This relates to the ‘level of information need’, which might require a higher level of detail and accuracy at DPoW Stage 4 (detail design) than at DPoW Stage 2 (concept design), for example.
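As an illustration only, and not a published schema, a Q-FAIR record for a survey dataset might capture something like the following. The container name, accuracy figures and stage thresholds are hypothetical.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class DatasetQualityRecord:
    """Illustrative Q-FAIR metadata for a survey dataset."""
    container_name: str       # should follow the naming protocol set out in the BEP
    findable: bool            # catalogued and discoverable
    accessible: bool          # available to the delivery team via the CDE
    interoperable: bool       # delivered in an agreed, exchangeable format
    reusable: bool            # licensing and metadata permit reuse beyond this task
    captured_on: date         # currency of the data
    accuracy_mm: float        # stated positional accuracy
    verified: bool            # independently checked against the specification
    suitable_for_stage: int   # DPoW stage whose level of information need it satisfies


topo = DatasetQualityRecord(
    container_name="PRJ-SRV-XX-XX-M3-G-0001",   # hypothetical container name
    findable=True, accessible=True, interoperable=True, reusable=True,
    captured_on=date(2023, 4, 12), accuracy_mm=15.0,
    verified=True, suitable_for_stage=4,
)

# Quality gate: detailed design (Stage 4) demands tighter accuracy than concept (Stage 2)
required_accuracy_mm = {2: 100.0, 4: 25.0}   # hypothetical thresholds
assert topo.accuracy_mm <= required_accuracy_mm[topo.suitable_for_stage]
```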

Another initiative that will aid data standardisation is the International Cost Measurement Standard (ICMS).7 ICMS provides a high-level structure and format for classifying, defining, measuring, recording, analysing and presenting life cycle costs and carbon emissions associated with construction projects. CICES is one of 49 global bodies in the coalition steering the development of the standard.
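As a rough sketch of the idea, rather than the published ICMS hierarchy, a line item classified for both cost and carbon might be recorded as below; all names and figures are placeholders.

```python
from dataclasses import dataclass


@dataclass
class LifeCycleLineItem:
    """One measured quantity classified for both life cycle cost and carbon reporting."""
    project: str
    category: str           # placeholder top-level classification, e.g. capital construction
    group: str              # placeholder sub-classification, e.g. substructure
    description: str
    cost_gbp: float         # life cycle cost contribution
    carbon_kgco2e: float    # associated carbon emissions


items = [
    LifeCycleLineItem("Bypass scheme", "Capital construction", "Substructure",
                      "Piled foundations", cost_gbp=1_250_000, carbon_kgco2e=410_000),
    LifeCycleLineItem("Bypass scheme", "Operation", "Maintenance",
                      "Resurfacing (year 20)", cost_gbp=300_000, carbon_kgco2e=95_000),
]

# Analyse and present cost and carbon side by side over the analysis period
total_cost = sum(i.cost_gbp for i in items)
total_carbon = sum(i.carbon_kgco2e for i in items)
print(f"£{total_cost:,.0f} and {total_carbon:,.0f} kgCO2e over the analysis period")
```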

Sharing securely

Geospatial surveyors should be mindful of the adage ‘capture once, use many times’. The geospatial project execution plan should be developed as part of early engagement with the client, address what existing data is already known about and available, and ensure that new data capture is carried out with the whole project lifecycle in mind.

The potential for sharing data in future projects needs to be addressed in the contract. The surveyor is best placed to comment on how the data could be used in future projects for other clients.

Surveyors have access to a huge range of data, and need to be mindful of their responsibility to keep that data secure, especially on national infrastructure projects. Clients will increasingly specify data security requirements, such as Cyber Essentials8 accreditation, in tender documents. The Centre for the Protection of National Infrastructure (CPNI) has a wealth of guidance material on developing a security-mindedness approach9 and assessing the security of data management systems. The National Cyber Security Centre (NCSC) has developed guidance on cyber security for construction businesses.10

For underground utility surveys, the cross-industry endorsed Secure Data Management for Utility Surveys11 published by CICES is also useful. 

Facing the risks

Data sharing can appear highly risky to those whose careers have been shaped through the traditional adversarial culture of construction. This leads to a reluctance to share between stakeholders, particularly where added value has been embedded based on personal judgements and interpretation of information.

Further work needs to be carried out to determine the most effective processes for data validation. Currently, the recipient of data expects it to have been validated by the sender. However, there is a school of thought that turns the tables on this expectation and recommends that the recipient verifies the data it receives. Recipient verification transfers risk away from the sender, and has commercial implications around who pays for the verification and who manages the actions stemming from its outcome, including any resulting change management. This model is outlined in the Construction Playbook, where one item in a list of dos and don’ts reads: “Don’t... hold incoming suppliers responsible for errors in data (excluding forecasts) where they are unable to complete due diligence. Where data turns out to be incorrect, there should be a contractual mechanism for reflecting this adjusting for errors.”12

However data validation is carried out, the process should be collaborative, with a structure in place to notify parties of any discrepancies and clashes. Who is in charge of the truth, and when, needs to be decided right at the start of a project. In the Construction Playbook, the second step in the delivery model assessment for public works projects and programmes is to identify data inputs. This sits immediately after framing the challenge of what type of sponsor and governance approach is being taken, and before considering the delivery model.
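A minimal sketch of recipient-side verification with discrepancy notification, assuming hypothetical checks that would in practice come from the agreed exchange requirements:

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Discrepancy:
    container: str
    description: str


def verify_received_data(container: str,
                         checks: dict[str, Callable[[], bool]],
                         notify: Callable[[Discrepancy], None]) -> list[Discrepancy]:
    """Run the agreed checks on received data and notify the parties of any failures,
    rather than silently assuming the sender has already validated it."""
    found = []
    for description, check in checks.items():
        if not check():
            issue = Discrepancy(container, description)
            found.append(issue)
            notify(issue)   # e.g. raise an early warning or technical query to sender and client
    return found


# Hypothetical usage with placeholder checks
issues = verify_received_data(
    "drainage-layout",
    checks={
        "coordinates are in the agreed project CRS": lambda: True,
        "attribute schema matches the BEP": lambda: False,
    },
    notify=lambda d: print(f"Discrepancy on {d.container}: {d.description}"),
)
```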

With data thought about early and often, and an accurate and reliable pipeline of information flowing through a project, the natural progression is to put it to further work. Good data should be used as a benchmark to aid decisions in forthcoming projects. Ensuring the quality of this data as it is used in future evaluation is a further role where the skills of the commercial manager will be beneficial. Machine learning programs are already being used by public sector clients to manage risks on megaprojects by assessing historical data. As machine learning and AI become more developed and familiar tools, this kind of analysis will become more common.

We cannot not share 

Open data initiatives, where non-sensitive data is made available without constraint for transparency, engagement and innovation purposes, are increasingly encouraged by the UK government. Surveyors need to take care that legal and security liabilities are considered when sharing data for the public good.

Commercial barriers to data sharing were addressed in Data for the Public Good13 from the National Infrastructure Commission in December 2017, where perceived commercial risk was studied under the glare of overall industry efficiencies. Putting it simply, the report stated that: “By refusing to share data, a private company or organisation keeps control of that data as it grows... as the volume of data increases and machine learning techniques are applied, the quality of the data improves and so becomes more valuable. Thus there are increasing returns to data, which if retained in the private sphere, will remain as narrow returns to the private company rather than wider returns to the economy as a whole.”14

Professional civil engineering surveyors are bound by the royal charter that governs them to benefit society.

That narrow view of protecting commercial returns has to widen. 

---

1 https://www.gov.uk/government/publications/the-construction-playbook

2 Page 68, The Construction Playbook

3 https://www.cdbb.cam.ac.uk/system/files/documents/TheGeminiPrinciples.pdf

4 Page 2, The Gemini Principles

5 https://www.gov.uk/government/publications/uk-national-data-strategy/national-data-strategy

6 https://www.gov.uk/government/collections/best-practice-guidance-and-tools-for-geospatial-data-managers

7 https://icms-coalition.org

8 https://www.ncsc.gov.uk/cyberessentials/overview

9 https://www.cpni.gov.uk/developing-security-mindedness-approach

10 https://www.ncsc.gov.uk/guidance/cyber-security-for-construction-businesses

11 https://www.cices.org/content/uploads/2022/03/Secure-Data-Management-for-Utility-Surveys.pdf

12 Page 50, The Construction Playbook

13 https://nic.org.uk/app/uploads/Data-for-the-Public-Good-NIC-Report.pdf

14 Page 48, Data for the Public Good