
We look at modern TSF management practices centered on the digital twin of the site, seamlessly integrating monitoring and modeling.



Pieter Neethling
Segment Director for Mining Operations – Seequent


22 min


Video Transcript

(gentle music)

<v Janina>Hello and welcome everyone.</v>

Thank you very much for joining us today

at our Seequent presentation.

My name is Janina Elliott.

I am Seequent’s Global Central Technical Lead,

and I’m joined today by Jennifer Biddlecombe,

our Senior Account Executive for North America.

And the two of us are part

of Seequent’s tailings solution team,

and have the immense pleasure to moderate today’s talk

on The Role of a Dynamically Updated Digital Twin

in a Modern Tailings Storage Facility.

At the end of the talk, we will make sure to address several

of your burning questions regarding the solution.

But in case there isn’t enough time,

rest assured that we will provide you with an answer

just after the webinar via a personalized email.

So without further ado, I’d like to introduce our colleague,

team leader and speaker for the day, Pieter Neethling.

Now, as the Segment Director for Mining Operations,

Pieter is strongly focused

on making Seequent’s Geoscience Portfolio more relevant

to production geologists and environmental experts.

And through this focused growth,

he aspires to solve essential operational challenges

centered on safety and productivity workflow improvements.

Now, Pieter has more than 30 years of experience

in the mining industry with varying roles in operation,

including lead positions in consulting

and mining technical services

at multiple top tier organizations.

But a special interest and passion of Pieter’s

has always been the safe development and maintenance

of tailings storage facilities.

And this is the focus for his talk today.

Okay, over to you Pieter, here we go.

<v Pieter>Thank you for the introduction Janina.</v>

Hi everyone.

Today I’m going to talk to you about the value

of a dynamically updating digital twin,

and the role it might play in mitigating potential failures

of tailings storage facilities.

Before I progress with the presentation,

I would like to make a statement

of confidentiality and disclaimer.

Please note that the presentation

is for informational purposes only,

and it’s not a commitment

to deliver software features or functionality.

And now for some context on why the digital twin

has become so important.

The mining sector is increasingly exposed to environmental,

social and governance compliance.

And the need to comply

with the associated regulatory frameworks

has necessitated a shift in the way the industry

manages risk and adheres to responsible mining practices.

Investors also want to ensure that their money is used

in a sustainable and responsible fashion,

as evidenced by the signatories

to the Principles for Responsible Investment.

Greater transparency in tailings management disclosure,

and working with industry, community, regulatory

and financial stakeholders

to promote the application of consistent disclosure

that informs better tailings stewardship,

has become a key operational objective.

These social, economic and political risks,

along with the need to digitally transform the industry,

means everyone is talking about tailings.

Recent events have reminded us

that more work needs to be done

to increase dam resilience and due diligence.

To truly be able to learn from a failure event

and fulfill the ultimate goal of the global standard,

it is essential that there is complete transparency

regarding the chain of events.

All data analysis and decision-making processes

need to be clearly understood.

However, achieving this objective is easier said than done,

with people, processes and technology challenges

core to mitigating the risks

and embedding a cohesive safe tailings management solution.

Let’s take a look at some of the challenges.

Complexity comes in many forms and here are a few examples.

Monitoring and understanding the data and systems

for some of the largest man-made structures on earth

is inherently challenging, and the fact

that these structures are constantly evolving makes it more so.

The changing factor of safety at a TSF as it evolves

is a major concern.

And scenario testing to help predict potential failure

as a TSF evolves over time is a significant pain point.
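To make the changing factor of safety concrete, here is a minimal sketch using the textbook infinite-slope equation. It only illustrates how a rising pore pressure (say, a climbing phreatic surface) erodes the factor of safety as a facility evolves; it is not Seequent's implementation, and all parameter values are invented for illustration.

```python
import math

def infinite_slope_fos(c_eff, phi_eff_deg, gamma, depth, beta_deg, pore_pressure):
    """Classic infinite-slope factor of safety:
    FoS = (c' + (gamma*z*cos^2(beta) - u) * tan(phi'))
          / (gamma*z*sin(beta)*cos(beta))
    c_eff in kPa, gamma in kN/m^3, depth in m, pore_pressure u in kPa."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_eff_deg)
    normal_stress = gamma * depth * math.cos(beta) ** 2
    resisting = c_eff + (normal_stress - pore_pressure) * math.tan(phi)
    driving = gamma * depth * math.sin(beta) * math.cos(beta)
    return resisting / driving

# As pore pressure rises, the factor of safety falls:
for u in (0.0, 20.0, 40.0):  # kPa, hypothetical values
    print(u, round(infinite_slope_fos(c_eff=5.0, phi_eff_deg=30.0,
                                      gamma=18.0, depth=10.0,
                                      beta_deg=20.0, pore_pressure=u), 2))
```

Real TSF stability analysis uses far richer models (e.g. limit equilibrium or stress-deformation analysis in GeoStudio), but the sketch captures why a twin that tracks pore pressure over time matters.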

The monitor-to-model-to-design workflows are,

by and large, still unnecessarily complex.

This consumes resources and introduces risk.

And finally, there are a host

of multi-stakeholder collaborations to manage.

Technical teams that we have engaged

believe it’s not just data and file incompatibility

that is an issue,

but also a lack of multidisciplinary interaction

and comprehension that causes miscommunication.

Our conversations with senior management

find a common thread:

reporting on all of their storage facilities,

which are at varying ages, conditions and locations,

particularly where there is little standardization

in how the structures are being monitored,

is a major issue.

They agree that their teams waste a lot of time struggling

to get the data into a useful and consistent format.

Managers are therefore not fully confident

that they are delivering a comprehensive picture

of all of their assets, and that problems are not being missed.

So how is Seequent helping tackle these challenges?

The key to any solution is that it provides the means

to work effectively as a team and ensure data transparency.

These are the underlying principles that allow

for a robust review and decision making process.

But how is this accomplished?

Firstly, all stakeholders in the project,

be they modelers from different geoscientific groups,

project managers, or third parties such as consultants

and JV partners, need to have access to the latest data

in as near real time as possible.

Secondly, everyone needs to work collaboratively

from a single source of truth to create up-to-date models

that facilitate the development of a digital twin.

A robust monitor to model to design workflow

bridges the gap between typically disconnected monitoring

and technical analysis workflows.

With the Seequent TSF solution, a continuous modeling paradigm

is established that solves the data management

and multi-stakeholder conundrum,

and centers activities on as-built performance

and failure prevention, with good communication

a key success factor.

Core to our solution is Seequent Central,

a cloud hosted model and data management system,

with web based visualization capability

that facilitates collaboration.

Seequent’s modeling suite comprises Leapfrog,

Oasis montaj and Geostudio,

all of which address the modeling and analytical needs

of geologists, geophysicists and geotechnical engineers.

Seequent works to provide the best of breed solution

by working with partners and their products.

For example Esri.

Modflow and Feflow for flow modeling

are examples of industry standard products

Seequent’s TSF solution works with.

In combination, the outputs define a live digital twin

and complete the monitoring to modeling

and design workflow.

Let’s take a look at the role of our expert applications

in a little more detail.

Geophysical methods from the ground, air and satellite

provide efficient remote sensing of TSFs.

Geophysical methods are a great way to see non-intrusively

into the subsurface and provide information

on, say, water content.

These methods can easily be repeated at regular intervals

over the same area,

giving a great reference point for monitoring changes.

Permanently in-place systems are also on the market.

These dense data collection techniques are superb

for filling in between physical measurements

like piezometers, basically guiding the construction

of the 3D digital twin and making it more accurate.

The geological model forms the foundation

of the digital twin, enabling continuous assessment

of how the altered rock and tailings deposits,

along with the structural discontinuities,

can impact the physical structure.

The geological model and the hydrogeological

and geotechnical analyses are inseparable

in a robust workflow, and together are needed

to fully understand changes

in the foundation characteristics of the structure.

More specifically, it is typical

that geotechnical model domains

and their associated soil and rock design parameters

are attributed to geological model domains.

This generally means a geotechnical model

is only as good as its geological model,

with the geological model playing a significant role

in informing geotechnical design parameters.
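The attribution Pieter describes — geotechnical design parameters hung off geological model domains — can be sketched as a simple lookup. The domain names and parameter values below are entirely invented for illustration; they are not from any real model.

```python
# Hypothetical sketch: soil/rock design parameters attributed to
# geological model domains. Names and values are invented.
geotech_params = {
    "tailings_beach":  {"unit_weight_kN_m3": 18.0, "phi_deg": 30.0, "c_kPa": 0.0},
    "glacial_till":    {"unit_weight_kN_m3": 21.0, "phi_deg": 35.0, "c_kPa": 5.0},
    "upper_clay_unit": {"unit_weight_kN_m3": 17.5, "phi_deg": 22.0, "c_kPa": 10.0},
}

def parameters_for(geological_domain: str) -> dict:
    """Look up design parameters for a geological domain.

    A geotechnical model is only as good as the geological model
    it maps onto: a mis-drawn domain boundary silently assigns the
    wrong strength parameters to that volume."""
    try:
        return geotech_params[geological_domain]
    except KeyError:
        raise ValueError(f"No design parameters attributed to {geological_domain!r}")

print(parameters_for("upper_clay_unit")["phi_deg"])
```

The point of the sketch is the dependency direction: refine the geological domains and the geotechnical inputs update with them, which is exactly what a dynamically updated twin automates.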

In summary, an accurate representation

of the geological conditions

is fundamental to developing optimized tailings solutions,

particularly at the design stage.

This adds significant value and cost savings,

both at construction and during ongoing maintenance

over the lifetime of the tailings facility.

What is key to the stability of tailings?

Water management is one of the biggest challenges

for managing the tailings facility,

especially those using wet transport and deposition.

The hydrogeological extension provides an efficient

and robust workflow for transferring the model

in the digital twin to Modflow and Feflow,

including importing results back into the digital twin.

This saves time and effort

as well as facilitates a regular update

of the forward-looking analysis for TSF behavior.

Geotechnical analysis is the key

to understanding factors of safety and reliability.

Geotechnical analysis is an integral part

of modern TSF design and management.

The analysis is used not only to provide insights

on deformation, consolidation

and stability at the design stage,

but to adapt the design to actual conditions

during construction based on the interpretation

of monitoring data.

As such, the key benefit of geotechnical analysis

within the digital twin lies in the ability

to make informed decisions about the immediate

and forward looking performance of the facility.

A dynamically updated digital twin of the physical system

ensures knowledge transfer

throughout the history of the site,

meaning that decisions are intentional rather than reactive.

The importance of a geotechnical digital twin

was recently highlighted at Seequent Lyceum

by a colleague of mine.

He presented the findings

of a retrospective numerical simulation

of the construction history

of the Mount Polley Tailings Storage Facility.

The simulation, as seen in this video,

incorporated a thin layer of clay

that exhibited strength loss during deformation.

The simulation revealed a repeating pattern

at each stage of construction:

excess pore water pressure generation, dissipation

and equilibration of the flow system.

More importantly, each stage of construction

was associated with a change in strength

along the developing rupture zone.

The strength varied between peak and residual

right up to the point of failure.

All it took was one additional bump

at that stage of construction

to cause full strength loss in the clay,

resulting in a catastrophic failure of the entire facility.

A key realization was that deformations of the TSF

were negligible up to the point of collapse.

As such, monitoring would not have been diagnostic

of the impending failure,

which highlights the need for a geotechnical analysis

as an integral component of the digital twin.

We can easily imagine how a dynamically updated digital twin

with version control, multiple realizations of the geology

and a single source of truth would have allowed the engineers

to explore different physical phenomena

as the facility evolved.

Moreover, the digital twin could have been passed down

from the outgoing to incoming engineers of record,

allowing the knowledge transfer to be unbroken.

So in summary, when we assess what is required

to manage tailings storage facilities safely,

and the requirements of the global tailings standard,

teams have to think about holistic design.

The digital twin becomes the basis for design

used at all phases of the project life cycle.

Development of the digital twin forces the engineers

to understand the physical system

and make informed decisions

about the facility’s performance as it evolves.

A comprehensive digital twin

that consistently incorporates changing data

and evaluates all spatial, numeric

and intellectual information in a 3D plus temporal context

helps to identify problems early.

It can also help design targeted monitoring programs.

Interpreting monitoring data is a significant challenge

because it goes beyond plotting a time series

and trigger thresholds.

Again, monitoring data is only valuable if it’s interpreted

in the context of the digital twin.
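The trigger-threshold baseline that interpretation has to go beyond can be sketched as a simple classification of readings against alert and action levels. Everything here — the class name, the levels, the readings — is invented for illustration; real trigger action response plans are site- and instrument-specific.

```python
# Hypothetical sketch of the trigger-threshold baseline: classify each
# piezometer reading against alert/action levels. Levels are invented.
from dataclasses import dataclass

@dataclass
class TriggerLevels:
    alert: float   # e.g. piezometric head in metres
    action: float

def classify(reading: float, levels: TriggerLevels) -> str:
    """Return the trigger state for a single reading."""
    if reading >= levels.action:
        return "action"
    if reading >= levels.alert:
        return "alert"
    return "normal"

levels = TriggerLevels(alert=12.0, action=14.0)
readings = [10.8, 11.5, 12.3, 14.6]  # successive piezometer readings
print([classify(r, levels) for r in readings])
# → ['normal', 'normal', 'alert', 'action']
```

A threshold crossing only flags that something changed; saying why the head is rising, and what it means for stability, requires the geological and hydrogeological context the digital twin carries.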

A continuously updated digital twin

enables an adaptable design that allows material changes

in the design to be identified in the moment.

Informed field decisions can be made

to either alter the design

or accept the current construction trajectory

that meets the factors of safety.

We truly believe a paradigm shift is required,

whereby tailings governance needs to shift

from a dominantly reactive, long-term modeling approach

to a more agile,

even predictive, short-term modeling method.

Helping prevent failure isn’t about one piece of data

or a single technology;

it’s how you bring it all together that counts.

Thank you for your time and attention.

But just before I hand you back to Janina,

please note that we have a host of additional information

on tailings that you can source

from our website.

Over to you Janina.

<v Janina>Thank you very much, Pieter</v>

and yeah that is wonderful.

So I would like, of course, to address the audience

in the last couple of minutes.

And if you have any questions regarding Pieter’s talk

please feel free to write them here in the question window.

And in previous conversations with a few of the attendees,

there were a few questions asked,

and we want to make sure that we address these as well.

So I’m going to ask my colleague Jenny if she’s available

to help me address those questions.

(Jennifer chuckling)

<v Jennifer>Hi everybody.</v>

We’ve just got a couple of questions here Janina.

What was meant by more agile

rather than reactive workflow

is one of the questions that just came up.

<v Janina>Oh yeah okay.</v>

Yeah, so the way that tailings monitoring and governance

is currently conducted is a somewhat lengthy

and not always linear process.

So the flow of observational and measured data

is often inhibited by lack of interconnectivity

between technology,

but also between those multidisciplinary teams.

The consultants, the reviewers, the geotechnicians,

the third parties,

everyone involved is somewhat disconnected from each other.

And so the very nature of this aggregated process means

that one never really sees the full real-time picture,

or the complete digital twin if you will,

and all of those influencing factors at play.

And because of that,

one can only really react when or if a red flag goes up.

So only when everyone works from one source of truth

and has insight into each other’s expertise

and forward-thinking 3D models and design work

can one partake in a more agile and even predictive workflow

that mitigates the risks from the start

and also identifies issues before they truly turn

into real problems.

That’s the idea.

<v Jennifer>Fantastic.</v>

We’ve just got one other question here as well.

How does the solution allow companies

to adhere more closely to those new global standards

that have recently been put together?

<v Janina>Yeah, the new global standard.</v>

(indistinct chattering)

Yeah, when you have a close look at this document,

and believe me, we have.

(Janina laughing)

The headlines are really six chapters,

and they all ask for the operator and everyone involved

in the tailings facility to meaningfully engage people.

To build an interdisciplinary knowledge base

and to develop robust designs that integrate

that knowledge base.

And together develop an organizational culture

that promotes learning, communication

and really early problem recognition.

That’s the key.

And all of these aspects are truly fulfilled

when an uninhibited and real time communication

in 3D occurs.

So when there’s interactive collaboration

that can take place between everyone involved,

and that’s exactly what our solutions are aiming to do.

Our individual products integrate with each other

and are underpinned through Central.

It really brings people and data together

and promotes that intellectual exchange

between subject matter experts, stakeholders

and of course also the public, to build a really clear

and perpetually transparent view of everything

that governs that site.

And so in that sense,

we really quite closely adhere to the global standard.


<v Jennifer>Thanks Janina, I think that was,</v>

that answered that really well.

I think that was all the questions.

So if anyone else who, oh, sorry,

let me just check there’s something else.

Does Leapfrog allow real-time data streaming into the model,

to, say, monitor and visualize the phreatic surface?

<v Janina>Yeah, it depends on what kind of,</v>

very good question.

It depends on which format it comes in.

But ultimately in Central we have built a dynamic link

where you can bring in point information for example,

and link it dynamically and directly to Leapfrog.

So in Leapfrog you would be able to build a phreatic surface

or any kind of 3D surface

based on that xyz point information.

And as soon as that information in its raw form

is refreshed in Central,

in the Central data room and its repository,

then your Leapfrog project will automatically

be notified about it.

So in that sense you can just right click, refresh

and then your phreatic surface will rebuild

based on the new information.

And depending on how you set up your project,

you have an opportunity then to actively compare

what that surface looks like

relative to the previous interpretation, the previous model

and what those changes mean going down the road

in terms of your hydrogeological assessment

of the site, yeah.
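The refresh-and-rebuild loop Janina describes can be sketched generically. The inverse-distance-weighting interpolant below is just a stand-in for Leapfrog's own surface engine (this is not its API), and the piezometer coordinates and water levels are invented; it only illustrates turning refreshed xyz point data into a surface that can be compared against the previous interpretation.

```python
import math

def idw_elevation(x, y, points_xyz, power=2.0):
    """Estimate water-table elevation at (x, y) by inverse-distance
    weighting of known (x, y, z) measurement points."""
    num = den = 0.0
    for px, py, pz in points_xyz:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0.0:
            return pz  # exact hit on a measurement point
        w = 1.0 / d2 ** (power / 2.0)
        num += w * pz
        den += w
    return num / den

# Refreshed point data (x, y, water-table elevation), e.g. piezometers:
points = [(0, 0, 95.0), (100, 0, 94.0), (0, 100, 93.5), (100, 100, 92.0)]

# Re-run over a grid whenever the raw points are refreshed, then compare
# the new surface against the previous interpretation of the site.
print(round(idw_elevation(50, 50, points), 2))
```

In the workflow described in the answer, the "refresh" step is Central notifying the Leapfrog project of new raw data; the sketch stands in for what happens conceptually when the phreatic surface rebuilds.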

<v Jennifer>Fantastic.</v>

And if there are any questions about workflow

or if anyone would like to get in touch with us

about setting up those different workflows

to dynamically link that data then we’d love to talk to you

and perhaps have a look at what you’re working with

at the moment.

<v Janina>Thank you very much for joining</v>

and thanks Jenny, thanks Pieter for the great talk

and we hope to hear from you soon.

Have a great remaining conference.

(gentle music)