Site Investigation

Challenges in site investigation for infrastructure projects

Johan van Staveren, Geotechnical Engineering Specialist, Van Oord, David Kinlan, Contracts Director, Inframara, and Jason Errey, Director, OEMG Global, with Dr Euan J Provost, Surveyor, OEMG Global 

Tendering processes and risk management

THERE are numerous examples of major cost blowouts in civil infrastructure projects in Australia. A recent audit of the AU$120bn Australian federal infrastructure pipeline revealed AU$33bn in currently forecast overspend. Worse still, this figure is expected to rise, as many projects are yet to commence.

As a result of the overspend, the government has flagged that infrastructure projects are likely to be cut; however, no criteria have been made available for stakeholders and communities to understand whether their projects are at risk, causing substantial anxiety.

Two notable recent examples of projects experiencing overruns are Melbourne's West Gate Tunnel (WGT) project and Snowy Hydro 2.0. Together these projects represent AU$14bn in cost overruns. Both share a similar issue: insufficient acquisition of site data, and poor communication of the data acquired, resulting in an underestimation of risk.

The WGT Project is projected to create 1.5 million m³ of tunnel spoil (excavated rock and soil) over an 18-month period. During excavation on the WGT Project, PFAS (per- and polyfluoroalkyl substances) soil contamination was discovered. PFAS are ‘forever’ chemicals linked to liver disease, fertility issues and cancer, among other complications.

The regulator rightly determined that the excavated soils required treatment prior to disposal, resulting in tunnelling being delayed for a number of years.

The likely levels of contamination were clearly underestimated, despite knowledge of the site history and its industrial heritage.

The Victoria State Budget in 2022 revealed a massive AU$4.7bn cost blowout to the project. It is now projected to cost AU$10.2bn to complete and will be delivered eight years late.

There is a lack of transparency regarding the work or costs associated with the ground modelling for this project, which is typical for infrastructure programmes throughout the world.

Snowy Hydro 2.0 is located remotely within the Kosciuszko National Park in the Snowy Mountains of New South Wales. In essence, the Snowy Hydro 2.0 system will link the Talbingo Reservoir (bottom storage) and Tantangara Reservoir (top storage) via 27km of new tunnels.

The project also includes a new power station which will be located in a cavern some 800m underground.

In October 2023, the Australian Broadcasting Corporation’s programme Four Corners investigated the problem-plagued Snowy 2.0 pumped hydro project, shedding light on the current situation: a bogged tunnelling machine, an unexpected volume of sludge, and toxic gas.

This investigation provided unusual insight into the ground modelling studies and costs for major government infrastructure.

In August 2023, the federal government further increased funding for Snowy 2.0 to AU$12bn – triple the October 2018 figure, when the final decision was made to go ahead, and six times the initial AU$2bn estimate claimed when the project was originally proposed in March 2017.

The Four Corners programme reported on the geotechnical assessment, finding that as much as AU$2bn of the cost overruns can be attributed to an intersection of poor decision-making, inadequate ground characterisation, and ground conditions that were worse than foreseen.

The bogged tunnel boring machine (named Florence) stalled when a sinkhole appeared just 150m after it started operations.

While engineers had expressed concerns regarding uncertainty about the ground during the feasibility stage, the request for more geotechnical information was turned down by project engineers.

The project engineer acknowledged that Florence was understood to be proceeding into soft ground; however, both the softness and the extent of this ground were underestimated, to the point that the machine was stopped by inflowing sludge that caused a sinkhole to progress to the ground surface.

With the intense public scrutiny surrounding Snowy Hydro 2.0, it was announced during a parliamentary enquiry, that AU$100m had been spent on ground studies.

This figure is about right (approximately 5 per cent of the initially projected AU$2bn spend); however, given that the project consultants were asking for additional factual site investigation data, and given the failures seen, it is unclear whether it was money well spent.

It is telling, however, that adverse ground conditions, delays and AU$2bn in additional costs attributable to ground conditions were encountered just 150m into the 15km tunnel. The question now becomes: has the project engineer altered data acquisition and management strategies to account for hazards that are currently unforeseen along the remaining 14.85km of the tunnel route, and if so, how?

Currently, the administrators of the federal infrastructure budget must consider Snowy Hydro to be an uncontrolled risk and a contagion that will impact the overall infrastructure portfolio. The project is sold to the public as a key enabler of an electrical grid powered by renewable energy, but at the moment it is likely being delivered at the expense of regional road upgrades that would save lives.

Better data acquisition strategies and communication during the pre-construction phase of a project’s lifecycle are essential. This will assist in the mitigation of latent ground conditions claims, support sustainable design and construction strategies, and improve budget forecasting in the early stages of a project lifecycle. The above examples show that failing to deliver the right data, at the right time, to the right people can severely impact decision-making, and result in the large blowouts in project budgets and timelines seen.

Poor data and communication provide ideal conditions for creating ‘white elephants’, where costs exceed benefits. This article looks at how best to address key issues around data acquisition and communication for the project ground model during the pre-construction phase of the project lifecycle, and the positive outcomes this will have on economic and environmental forecasts, and on client/contractor relationships through better contracts.

Effective site investigation

The site investigation for major infrastructure projects involves balancing a number of initial challenges, including a lack of verified subsurface data, budget limitations at project inception, and the complexity of accessing built-up, remote or offshore locations. The ground model that binds geophysical, environmental and geotechnical data should be viewed as the foundational set of constraints that guides how these challenges are addressed.

While essential during the pre-tender phase, this foundational ground model will support the project through to asset management and project decommissioning. An effective ground model is the result of a multi-modal study that is phased, and includes spatial data, remote sensing, high resolution geophysics, and targeted environmental and geotechnical studies.

The cornerstone of an effective ground model is an appropriately funded site investigation, and a communication strategy that embraces both expert and non-expert stakeholders. Effective ground modelling and communication of data underpin sustainable design, positive community engagement, construction strategies, and the contracts that underpin successful owner/non-owner relationships.
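
As a simple illustration of the ground model as a binding structure, the minimal Python sketch below shows how geophysical, environmental and geotechnical layers might be held together with their provenance and verification status. The class and field names are our own, for illustration only, and do not reflect any standard schema.

```python
from dataclasses import dataclass, field

@dataclass
class Layer:
    """One dataset contributing to the ground model."""
    name: str          # e.g. "sub-bottom profiler survey"
    modality: str      # "geophysical", "environmental" or "geotechnical"
    crs: str           # coordinate reference system, e.g. "EPSG:28355"
    provenance: str    # who acquired it, when, and under what QA regime
    verified: bool = False

@dataclass
class GroundModel:
    """Binds the phased, multi-modal datasets for one project site."""
    site: str
    layers: list[Layer] = field(default_factory=list)

    def unverified(self) -> list[str]:
        """Layers that cannot yet be relied on for design decisions."""
        return [layer.name for layer in self.layers if not layer.verified]

# Illustrative usage with invented entries.
model = GroundModel(site="Example tunnel portal")
model.layers.append(Layer("bathymetric survey", "geophysical",
                          "EPSG:28355", "Contractor A, 2023", verified=True))
model.layers.append(Layer("CPT campaign", "geotechnical",
                          "EPSG:28355", "Contractor B, 2024"))
print(model.unverified())  # -> ['CPT campaign']
```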

Traditionally, the prevailing thought has been that more intrusive investigations, such as boreholes and/or cone penetration tests (CPTs), will yield better results. This is a blunt-force approach that on its own will likely fail to adequately describe the existing geological and environmental settings, and the upside or downside risks associated with construction in the prevailing environmental setting.

The construction contractor can never trust an interpretation of the entire geological and environmental setting of a project site made solely from randomly placed point samples. Even a borehole placed at the proposed location of infrastructure foundations will not provide any assurance that the ground is the same even five metres away, and the courts have recognised this in cases such as Obrascon (2015).

The targeting of physical sampling, both environmental and geotechnical, is a poorly understood and poorly executed element of ground studies. Both geotechnical and environmental sampling efficiencies rely on a prior detailed understanding of the subsurface geological setting and the embodied risks. For example, to ensure site contamination is understood, all soil regimes present must be adequately sampled. In the marine environment, this means the sand, silt and clay soils must each be adequately sampled, as contaminants are likely to reside preferentially in one of these regimes, normally the silts.

However, if randomised sampling has missed the silt areas, then overall site contamination will be under-reported, as was seen in the West Gate Tunnel project and in the made ground (un-engineered fill) at issue in the Obrascon court case.
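
The under-reporting effect can be shown numerically. The following minimal Python sketch uses entirely synthetic contamination values (the 15 per cent silt fraction and ppm ranges are our own illustrative assumptions) to compare a blind random sampling campaign with one stratified across soil regimes:

```python
import random

random.seed(1)

# Synthetic site grid: 100 sampling cells, of which 15 are silt.
# Contaminant concentrations are illustrative only: contaminants are
# assumed to reside preferentially in the silts.
cells = ([{"soil": "silt", "ppm": random.uniform(80, 120)} for _ in range(15)]
         + [{"soil": "other", "ppm": random.uniform(1, 10)} for _ in range(85)])

true_mean = sum(c["ppm"] for c in cells) / len(cells)

# Blind randomised sampling: ten cells drawn at random can easily miss
# the silts, under-reporting site-wide contamination.
blind = random.sample(cells, 10)
blind_mean = sum(c["ppm"] for c in blind) / len(blind)

# Stratified sampling: every soil regime is sampled, and the estimate
# weights each regime's mean by its areal extent.
silts = [c for c in cells if c["soil"] == "silt"]
others = [c for c in cells if c["soil"] == "other"]
silt_mean = sum(c["ppm"] for c in random.sample(silts, 2)) / 2
other_mean = sum(c["ppm"] for c in random.sample(others, 8)) / 8
strat_mean = 0.15 * silt_mean + 0.85 * other_mean

print(f"true mean {true_mean:.1f} ppm, blind estimate {blind_mean:.1f} ppm, "
      f"stratified estimate {strat_mean:.1f} ppm")
```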

Similarly, untargeted borehole studies are likely to miss high-risk anomalous ground, or to misreport its impact. Either way, if the extent of anomalous or contaminated ground is reported to all stakeholders before design and construction strategies are established, such ground can be avoided or appropriately accounted for. Advanced digital geophysical techniques can offer the best way of visualising site conditions prior to establishing geotechnical and environmental sampling campaigns.

It has been demonstrated on a number of successful projects that a holistic site characterisation process, or ground modelling, used to effectively communicate appropriate data to expert and non-expert stakeholders, realises better economic and environmental outcomes, as well as a collaborative approach to infrastructure delivery.

The phased multi-modal investigations include the orderly acquisition of data through desk studies, bathymetric/topographic and geomorphological surveys, high-definition digital geophysical investigations and lastly well-targeted intrusive environmental and geotechnical investigations.

Together this orderly acquisition of ground data offers a comprehensive and trusted view of the prevailing ground conditions. Digital geophysical techniques are an essential part of the modern ground model. When appropriate to an environment, these techniques are used to acquire high resolution imagery of the sub-bottom and distinguish soil and rock types/quality and depth to layer information.

Equally important to the efficiency and success of data collection for the ground model is the administration of data collection. Both internal and external contractors must be engaged and managed to ensure that simple aspects of data acquisition, such as sample site naming and spatial/positional control, are consistent. As such, overall communication and data acquisition strategies must begin with the owner's engineer and engineering consultant in the pre-data acquisition phase.
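
As a minimal illustration of this pre-acquisition administration, the Python sketch below screens contractor deliverables for a shared naming convention and a single agreed coordinate reference system. The ID pattern, EPSG code and record layout are assumptions for illustration, not a prescribed standard.

```python
import re

# Agreed once, before any acquisition begins.
ID_PATTERN = re.compile(r"^BH-\d{3}$")   # e.g. BH-001
PROJECT_CRS = "EPSG:28355"               # single project-wide CRS

# Illustrative deliverables from two contractors.
deliverables = [
    {"contractor": "A", "sample_id": "BH-001", "crs": "EPSG:28355"},
    {"contractor": "B", "sample_id": "bh1",    "crs": "EPSG:4326"},
]

for d in deliverables:
    problems = []
    if not ID_PATTERN.match(d["sample_id"]):
        problems.append(f"non-conforming sample ID {d['sample_id']!r}")
    if d["crs"] != PROJECT_CRS:
        problems.append(f"CRS {d['crs']} differs from project CRS {PROJECT_CRS}")
    if problems:
        print(f"Contractor {d['contractor']}: " + "; ".join(problems))
```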

Ground studies that are not adequately funded, and/or that have poor data acquisition and communication strategies which fail to integrate geophysical, environmental and geotechnical studies, will inevitably result in an unverified ground model and lead to cascading problems and unmanaged risk, with lost economic and environmental opportunities.

In plain language, it is cheaper to undertake and deliver ground studies with appropriate, integrated acquisition and communication strategies than with ad hoc and ill-conceived studies, and project outcomes will be improved.

Understanding, and effectively communicating ground risk during the early stages of a project allows infrastructure designs to be modified to avoid risk, or appropriate construction strategies to be put into place to manage risk. It would have been far cheaper if strategies had been in place to manage the volumes of contaminated soils in the WGT Project, or the soft soils for Snowy 2.0.

As seen with both of these examples, despite the AU$100m spend on ground studies, the failure to understand and/or communicate early-stage risk resulted in designs and construction strategies that were not suited to the geological and environmental conditions.

Other downside impacts may include inappropriate stakeholder engagement or combative contracts where one side (usually the contractor) is forced to accept excessive and unreasonable risk.

These failures inevitably lead to the ground condition claims seen, negatively impact project execution, and often produce acrimonious working relationships between the contractor and the client. To mitigate these challenges, an overhaul of the current ad hoc approach to site characterisation, data gathering, data sharing and ground risk assessment is crucial.

Ground models and tendering

During the project feasibility stages, clients are best positioned, through their design engineers, to procure ground condition data and ensure fair and equitable communication of the ground model. However, design engineers do not have clear insight into the construction strategies, or innovative approaches, that might be implemented by each contractor.

A lack of client-side understanding could result in data that is not fit for purpose and cost clients more through additional works, sub-optimal designs, progress delays and claims. Additionally, a competitive tendering process may not always consider the knowledge level of tenderers, leading to a mismatch between the risk profile of the project and the contractors’ assessments. It is therefore recommended that clients involve multiple likely tendering contractors in an early contractor involvement (ECI) process, to comment on the scope and type of proposed client-run site investigation campaigns.

Depending on the size of the project, clients should consider paying for this service, as quality industry collaboration during the site investigation tender phase will achieve better results. Regardless of the tender process, clients must always strive for fairness and equity by ensuring raw site data is sufficient to provide an accurate risk assessment for contractor methodologies, and data is communicated effectively.

A lack of stakeholder confidence in data or the communication of risk often tempts clients to divest risk to third parties, usually the contractor, through generic disclaimers or non-reliance clauses rather than improving the risk setting.

While contractual solutions may appear cheaper in the short term, clients must focus on improving the quality of data, as the responsibility for ground risk and downside costs will impact them. Ultimately, it is the client that drives the need, time frames and often locations of infrastructure and it has the best opportunity to properly assess the proposed site and risks. Kicking the proverbial “risk-can” down the road to contractors will never mitigate reputational harm for clients or actual harm to customers from project delays.

Regardless of the contract conditions, delays and downstream costs resulting from unforeseen ground conditions cannot be fully recovered, so they are best avoided. Leaving the reliance ‘investigation’ to tenderers during the tender or early construction phase beds uncontrolled risk into a project, and a potential contagion into the client’s infrastructure portfolio. This is an inadequate approach to a very real risk.

The effective communication of complex data and analysis is critical during the tender process. Tenderers face time constraints when assessing likely subsurface conditions during the tender process. Insufficient or poorly targeted soil data that results in data gaps and sub-par laboratory testing campaigns can lead to conservative design and additional risk allocations, and inevitably lead to claims in the future.

Occasionally, it can also lead to unforeseen risks, overly optimistic design and knock-on delays, all adding to final-cost uncertainty. Therefore, focusing on maximising price certainty should be a priority for clients and consultants to support confidence for all stakeholders. Early involvement of contractors can aid this process.
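
The link between ground data quality and price certainty can be sketched with a simple Monte Carlo simulation. The Python example below uses entirely illustrative figures, not data from any real project; its only point is that reducing ground uncertainty narrows the gap between the P50 and P90 out-turn costs on which tender prices and contingencies are based.

```python
import random

random.seed(42)
N = 20_000

def out_turn_cost(ground_uncertainty: float) -> float:
    """One simulated out-turn cost in AU$m: a well-understood base scope
    plus a ground-risk overrun whose spread scales with how poorly the
    ground is characterised. All figures are illustrative assumptions."""
    base = random.gauss(1000, 50)
    overrun = max(0.0, random.gauss(0, 400 * ground_uncertainty))
    return base + overrun

for label, u in [("sparse, untargeted data", 1.0),
                 ("targeted, verified data", 0.2)]:
    costs = sorted(out_turn_cost(u) for _ in range(N))
    p50, p90 = costs[N // 2], costs[int(N * 0.9)]
    print(f"{label}: P50 AU${p50:,.0f}m, P90 AU${p90:,.0f}m, "
          f"P90-P50 spread AU${p90 - p50:,.0f}m")
```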

Communication through visualisation

The communication of ground data is still predominantly through multiple PDF documents, typically in excess of 1,000 pages, rather than in a readily accessible digital form. This style of data delivery is problematic, promoting data silos, and does not constitute ‘digital delivery’.

Critically, these reports isolate geophysical, geotechnical and environmental data, and are largely meaningless and unusable to the non-expert. Indeed, such hefty reports may even convey a sense of comfort to the non-expert that is unwarranted, especially if point data are not targeted and verified.

This point is clearly seen in the Snowy Hydro 2.0 project. Here, despite around AU$100m spent on ground studies, a failure to recognise or effectively communicate the extent and risk embodied in the soft soils resulted in at least AU$2bn cost uplift to the project. Further, it is likely that there was a similar spend in the case of the WGT Project, where a failure to understand contaminated soils resulted in a AU$4bn cost uplift.

To improve the communication to expert and non-expert stakeholders the construction industry can leverage the congruence of maturing technologies including, visualisation (Cesium, Unity, Unreal), immersive technologies (HoloLens, Apple Vision), artificial intelligence (Nvidia, ChatGPT), knowledge graphs (Stardog, RDFox), cloud computing (Azure, AWS, Google Cloud), parametric design and design automation (Python-enabled scripting and software control), ruggedised high powered portable computers, and increased coverage of high-speed internet (5G, Starlink).

Using these technologies conjunctively could allow for the development of a true digital twin with the aim to design, analyse, simulate, verify and iterate infrastructure and construction strategies in a realistic digital facsimile of the environmental setting.

The digital twin would allow experts access to digital data and provenance while providing intuitive storyboarding for the non-expert (lawyers, accountants, contract specialists, etc), with the goal of allowing all stakeholders to understand positive and negative ground risk opportunities and to improve collaboration.

While the true digital twin is still some way off, some clients already create their own geotechnical models to assist in the assessment of likely site conditions at a project site. However, most modelling software is complex and uses proprietary processes and data formats.

This may impose an additional burden on tenderers, who must ensure compatibility with these models and their processes, and such models are generally unhelpful to the non-expert. Standards organisations such as the W3C and OGC have not yet finalised standards for the delivery of spatial data over the web, so many different formats remain in use.

To promote a level playing field in the meantime, it is recommended that clients share geotechnical models with tendering contractors in common and open formats and at an early stage. Early contractor engagement will assist clients in understanding this fast-evolving field.
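
As an illustration of what sharing in a common, open format might look like, the minimal Python sketch below publishes borehole locations and headline logs as GeoJSON, an open format readable by most GIS and web-mapping tools, rather than as a proprietary model file. The field names and values are invented for illustration.

```python
import json

# Illustrative borehole records; IDs, coordinates and logs are invented.
boreholes = [
    {"id": "BH-001", "lon": 144.90, "lat": -37.82,
     "depth_m": 32.5, "log": "clay over basalt"},
    {"id": "BH-002", "lon": 144.91, "lat": -37.81,
     "depth_m": 28.0, "log": "fill over clay"},
]

# Build a GeoJSON FeatureCollection: one point feature per borehole,
# with the non-spatial attributes carried as properties.
geojson = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "Point",
                         "coordinates": [b["lon"], b["lat"]]},
            "properties": {k: b[k] for k in ("id", "depth_m", "log")},
        }
        for b in boreholes
    ],
}

with open("boreholes.geojson", "w") as f:
    json.dump(geojson, f, indent=2)
```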

The availability of open-source data, such as that from the British Geological Survey (BGS), should also be considered. MinView in New South Wales and the Dutch BRO (Basisregistratie Ondergrond, or Basic Ground Register) are additional open-source examples. These datasets can provide all parties with more informed insights into site conditions and the regional setting, aiding a better understanding of the geology, geological hazards and geotechnical ground conditions.

The BGS and others make some source data, particularly borehole records, available. However, it must be remembered that this data is likely not validated: the provenance of the data is not always shown, and no site investigation contractor comments relating to the accuracy of the data are retained.
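
A simple provenance screen can make this caution systematic. The Python sketch below flags any third-party borehole record that lacks basic acquisition metadata; the field names are assumptions for illustration, as real open datasets each have their own schemas.

```python
# Illustrative third-party records; values are invented.
records = [
    {"id": "A1", "source": "open archive", "date": "1962-04-01",
     "contractor": None},
    {"id": "A2", "source": "open archive", "date": "2019-08-14",
     "contractor": "Firm X"},
]

# Metadata a record must carry before it can be treated as traceable.
REQUIRED = ("source", "date", "contractor")

for r in records:
    missing = [k for k in REQUIRED if not r.get(k)]
    if missing:
        print(f"{r['id']}: use with caution, missing {', '.join(missing)}")
    else:
        print(f"{r['id']}: provenance complete")
```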

While the opportunities associated with digital twins and open data sources are appealing, it must be remembered that the model is only as good as the data it contains. Unverified open-source data supplied via rudimentary ground modelling environments provides a good example of the caution that must be exercised with new visualisation technologies.

Without improving data collection techniques and strategies, the same errors will continue to be seen in infrastructure execution. It is hoped that the provenance model within the digital twin will have robust structures to provide realistic probabilities surrounding the success of any predicted outcome.

Conclusions

While the infrastructure industry is moving towards full digitisation, it is essential to recognise that creating accurate ground model digital twins requires comprehensive, three-dimensional ground and subsurface surveys and strategies.

Automation initiatives and the rise of 3D design software are driving advancements in the preparation of ground models, but the quality of data remains crucial to their effectiveness. To leverage these developments effectively, it is imperative to involve specialists early in the project and heed their advice.

Sound data gathering and understanding of ground conditions are fundamental to successful projects, even though they may not be considered glamorous.

Decision makers must always bear in mind that the costs associated with managing and mitigating risks decrease as the understanding of, and trust in, ground conditions increases; conversely, the opportunity to leverage environmental and economic upsides from quality ground models decreases as trust decreases or the project lifecycle progresses.

Noteworthy initiatives, such as the British Geological Survey’s ‘Big Borehole Dig’ transformation from paper to digital, and CIRIA’s publication of the Geotechnical Baseline Reports guide (C807) in January 2023, contribute to the improvement of data availability and equitable contract terms.

The CIRIA guide emphasises the importance of developing a geotechnical baseline report (GBR) to address uncertainty in below-ground conditions, fostering a common understanding among all project stakeholders. However, effort is still required to ensure data retains sufficient provenance and can be trusted.

By addressing the challenges in site investigation, tendering processes, and risk management, the industry can enhance project outcomes and promote greater efficiency and safety in infrastructure development.

The AU$33bn of overruns seen in the current Australian federal infrastructure pipeline represent significant anxiety and harm to the public, and will result in delays or cancellations of promised developments.

Addressing the productivity losses that stem from poor data acquisition and poorly conceived acquisition strategies will significantly reduce risk and cost to the public through better owner/non-owner contracts and trust. Further, this will lead to better, faster and cheaper infrastructure for customers, more accurate forecasting for owners, and profit surety for non-owners.

Johan van Staveren, geotechnical engineering specialist, Van Oord, David Kinlan, contracts director, Inframara, and Jason Errey, director, OEMG Global, with Dr Euan J Provost, surveyor, OEMG Global

johan.vanstaveren@vanoord.com

david.kinlan@inframara.com

jasonerrey@oemg-global.com

www.vanoord.com www.inframara.com www.oemg-global.com

All images courtesy of the authors.