
ICMM Vendor Engagement for Tailings Storage Facilities Monitoring

The industry has already come a long way to match the Global Tailings Standard, and the ICMM in particular has greatly contributed to this development. Yet, as we all know, technological evolution never stops and indeed is speeding up. And so, while monitoring will continue to represent a cornerstone of our understanding of Tailings Storage Facilities, there are elements of the overall analytical workflow that can be optimised to let the data speak more loudly and facilitate an agile decision-making process.

In this presentation we take a look at dynamic digital twin technology and review what a digital twin represents from a Seequent and Sensemetrics perspective and how a transparent and real-time flow of information can assist an iterative workflow approach.



Iain McLean
Executive Vice President – North America Seequent

Janina Elliot
Global Central Technical Lead – Seequent

Alex Pienaar
Director of Mining – Sensemetrics


15 min




Video Transcript

<v Iain>The unique issues surrounding tailings dams</v>

require a combination of solutions

all of which are proven components.

We have a 20 year plus history in the industry

and have been an integral part

of decision-making throughout.

Now, we wanted to share a view with the ICMM

that our proactive approach to TSF management

can be achieved by bringing together

trusted monitoring technology

with geoscience modeling tools

to generate a dynamic digital twin,

to enable better planning and decision-making.

Now the original request for proposal

and the intentions of the committee were

to evaluate the current state of the art

in monitoring systems, including dashboards.

And we recognize that they provide a vital link

in the management of facilities,

but we wanted to highlight that they’re part of the solution

and we aim to propose to them

and to you here that there is a more holistic view

to be had that can be a crucial tool

in securing the safety and social contract

of these tailings sites.

Janina, please, if you could just go to the next slide.

We assure the committee that we respect the intentions

that are expressed in this statement.

And so now I'm going to introduce Janina,

who is going to run the agenda and set the context

for our approach.


<v Janina>Thanks Iain.</v>

Okay, so let’s have a look at the agenda

for today’s presentation.

We will start by providing a brief overview

of your trusted partners in the industry

and the established technologies

that you rely upon on a daily basis.

Then we will establish what it is that we wish

to achieve as a community of partners,

particularly considering the goals

of the global tailings standard.

Now the industry has already come a long way

to match the standards

and the ICMM in particular has greatly contributed

to this development

yet, as we all know, technological evolution never stops

and indeed is speeding up.

And so while monitoring will continue

to represent a cornerstone of our understanding of TSFs,

there are elements of the overall analytical workflow

that can be optimized to let the data speak more loudly

and facilitate an agile decision-making process.

As such, we will have a look

at the dynamic digital twin technology

and we'll review what a digital twin represents

from a Seequent and Sensemetrics perspective

and how a transparent and real-time flow

of information can assist an iterative workflow approach.

Now, many of you will be familiar with Seequent

as we have been an established partner

at the global mining houses for many decades,

but what may be new to you is that Seequent

as well as Sensemetrics

have recently joined forces

with Bentley, a company that aspires

to bring both the digital above

and below surface worlds together

to create true project insight

through digital twin technology.

As such, we've become one family with Sensemetrics

and have now a unique opportunity

to leverage each other’s expertise

and vast industry experience to take a leap

into the future of TSF management.

Now, what is it that drives the industry today?

I think all of us know what the answer to this question is,

and I won’t elaborate on past challenges

and events we are all familiar with.

Instead I’d like to highlight

how far the industry has come since,

and the direction it has taken to facilitate the new standard.

Arguably, the biggest commitment

is to a digital transformation

and creation of a comprehensive

and accessible knowledge base

that provides complete transparency regarding the management

of TSFs and the chain of events in case of failure.

Now here’s the ideal case scenario

of an integrated workflow approach

whereby the accumulated and real-time knowledge,

3D monitoring and modeling,

and ultimately a digital twin drive TSF governance.

The idea is that a dynamic digital twin provides the basis

for a robust decision-making process to maintain

or adjust the site strategy regarding technical execution,

data acquisition, learning, emergency response,

as well as reporting and public outreach.

However, that is more easily said than done.

And so we have some challenges:

the analysis, design, and operation of a TSF are challenging,

and there's no doubt about that.

Both the design and our understanding

of the site are constantly evolving.

And the engineer of record is responsible

for detecting changes in the current

and future performance of the facility

through all phases of the life cycle.

Therefore targeted monitoring is essential,

but interpreting the data is challenging as well

and must be done in the context of the physical system.

What we would like to do today is to take you on a journey

and show you what modern TSF management

could look like.

Rigorous monitoring through sensor data is

and remains a cornerstone of how we assess TSFs.

It allows us to understand how the facilities behave today.

The element we wish to add to the current

and established process is prediction.

That is the ability to detect a system change

through monitoring, correlate

and assess the information in a 3D context

with all other system information, i.e. the digital twin,

and react in a preventative manner

to create safer TSF sites.

Of course, to allow for this process to unfold

and support an iterative continuation,

the digital twin needs to be flexible

and dynamic to incorporate change

at any stage of the life cycle.

The key ingredient here is not only integrated technology

and transparent data flow,

but that all entities involved, and that is management,

technical engineers, consultants, and internal

and external reviewers, are interconnected in real time.

This allows all parties to network and collaborate,

to support the perpetuation of the digital twin,

utilizing the latest data

and each other’s knowledge and expertise.

And now I hand the conversation to my colleague, Alex,

who will take us through the technical execution

of the dynamic digital twin workflow

as imagined by Seequent and Sensemetrics.


<v Alex>Thank you very much.</v>

So the evergreen digital twin, powered by data,

provides a new set of tools

to meet new standards,

equipping engineers with a technological leap

through the looking glass,

into the inner mechanics of critical assets,

such as tailings storage facilities.

It represents the next evolution of static 3D models,

dashboards, and control variables.

In the context of TSF,

this allows engineers to visualize

their assets, track changes, simulate events,

and perform analyses to dynamically recalibrate.

In so doing, it enables learning, reasoning,

and an overall better understanding of the asset itself.

What does it mean to sensorize a TSF in practice?

Does that mean increasing the sensor count by five times,

maybe ten times?

What about surface and subsurface deformation?

The increased density and diversity of sensor types make

for an extremely complex landscape of sensing connectivity,

device control, network and data management

before measurements can even be converted

into usable information.

IoT removes the multiple friction points

inherent in bringing all of this together.

It essentially increases efficiency and productivity

while at the same time,

enabling scalability and flexibility,

not just across a single site, but across multiple sites.

IoT ultimately becomes the foundational piece

of the data science pyramid,

and I myself am far from having any accomplishments

in this domain,

but I remember in high school being presented

with Maslow's hierarchy of needs;

the best I can describe it as today is the different stages

that humans have to go through to find happiness.

The key thing to get out of this, though,

is that you have to build on the underlying layer

in a cumulative way.

Needs lower down the hierarchy must first be satisfied

before individuals can attend to the needs higher up.

The same is true for the data science pyramid.

You cannot get

to the artificially intelligent, blockchain-powered,

extended-reality, insert-buzzword-here stage

before being accomplished in the underlying stage.

If you progress to the next stage too fast,

it's not a solid pyramid.

And moreover, to properly accomplish each of the stages,

you must use the output from the previous stage

as the input to the next step.

IoT has this unique capability

to build an extremely solid foundation on top of a broad

and diverse sensor base.

This is achieved through edge computing devices

that deliver a self-provisioning, sensor-agnostic interface,

and through the ability of cloud computing

to enable a comprehensive system

of record powering an infinitely scalable

computational engine, with all of this accessible

to the subsequent layer through an easy-to-use

Application Programming Interface, or API.
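The layered flow described here, from edge devices up to an API that feeds the next stage, can be sketched in miniature. Everything in this sketch (the payload shape, field names, and device IDs) is a hypothetical stand-in for whatever the real sensemetrics interface provides; it only illustrates a sensor-agnostic layer turning raw measurements into usable, grouped information:

```python
import json

# Hypothetical payload from an edge device; the actual sensemetrics
# API will differ. This only sketches the idea of a sensor-agnostic
# layer converting raw measurements into usable information.
RAW_PAYLOAD = json.dumps({
    "device": "edge-01",
    "readings": [
        {"sensor": "piezometer-07", "type": "pore_pressure", "value_kpa": 312.5},
        {"sensor": "inclinometer-02", "type": "deformation", "value_mm": 4.1},
    ],
})

def normalize(payload: str) -> dict:
    """Group raw readings by measurement type, hiding device specifics."""
    by_type: dict = {}
    for reading in json.loads(payload)["readings"]:
        by_type.setdefault(reading["type"], []).append(reading)
    return by_type

readings = normalize(RAW_PAYLOAD)
print(sorted(readings))  # prints ['deformation', 'pore_pressure']
```

A subsequent layer (a model, dashboard, or digital twin) would then consume these grouped readings without needing to know anything about the individual devices.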

And as these layers build on top of each other,

we evolve beyond data acquisition,

beyond understanding how facilities behave

as a separate dashboard or static model, to a state

where we are controlling how our facilities behave

in the future through evergreen digital twins,

where we are equipping engineers

with that technological leap through the looking glass,

into the inner mechanics of these facilities,

enabling learning and reasoning on a whole new level

where we facilitate collaboration through the easy access

to contextually understood data derived

from the same source of truth,

to make decisions as a network of stakeholders

or as a team.

An integrated workflow

therefore instantly connects the site situation

with the analytical one,

in essence allowing all stakeholders or team members

to understand how facilities behave,

and more importantly,

this provides them with two mechanisms

to control how facilities behave in the future.

The first: digital twins have the ability

to improve predictive performance over time

through improvements to the digital twin itself.

This can only be achieved

when real-time sensor data is tightly integrated

with tools such as 3D seepage models,

slope stability analysis,

and stress and strain calculations.

The second mechanism is through the ability

to inform the user when predicted performance will be poor.

This means identifying these regions of improvement

and offering a course of action

for improving its predictive capabilities

through updated engineering designs,

3D geological models, and/or 3D water table models.

And with that, I’m handing it back to my colleague, Janina.

<v Janina>Thanks Alex.</v>

Okay, now let’s have a look

at how Seequent modeling products pick up on the journey

after data acquisition

and make a dynamic digital twin come to life.

And here we see a dynamic 3D model of the local geology.

The surfaces are implicitly generated in Leapfrog.

The resulting geological model can be used

as a foundation for a hydrogeological grid model.

In addition, we can introduce geophysical data

or other numeric point information such as sensor data

to build a full 3D interpolation model.

This information can be correlated to build a site model

that can then be actively shared in 3D

or via a 2D cross-section as shown here.

And here we have Central.

The key aspect for collaboration is the maintenance

of a model and communication history.

Central is our cloud-based data management system,

where all active stakeholders

can access version-controlled knowledge bases.

Not only can they review and visualize 3D models live

on the web,

but they can actively exchange comments

and notify team members in real time.

It is also outfitted with a version control data store

that allows for the easy exchange dynamic linking

of essential data files for subsequent analysis.

And here we have GeoStudio,

where the initial 2D cross-section from Leapfrog is used

for geotechnical analysis of the foundation response

to construction loading,

ending with a graph

of the key performance indicator for construction.

The results of this analysis

can then reenter the iterative workflow,

actively informing essential target ranges

for sensor data and monitoring.
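That feedback step, where analysis results inform target ranges for sensor monitoring, might look like this in miniature. The metric names and limits below are invented for illustration, not outputs of any Seequent product:

```python
# Hypothetical target ranges (engineering units) that a geotechnical
# analysis might produce; real ranges would come from the site's own
# models and the engineer of record.
TARGET_RANGES = {
    "pore_pressure_kpa": (0.0, 350.0),
    "crest_settlement_mm": (0.0, 25.0),
}

def check_reading(metric: str, value: float) -> str:
    """Flag a sensor reading that falls outside its target range."""
    low, high = TARGET_RANGES[metric]
    return "ok" if low <= value <= high else "review"

print(check_reading("pore_pressure_kpa", 312.5))   # prints ok
print(check_reading("crest_settlement_mm", 31.0))  # prints review
```

In an iterative workflow, each new round of analysis would update these ranges, closing the loop between modeling and monitoring.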

And this is the proposed workflow.

And now I'd like to hand the conversation back to Iain

to conclude our presentation.

<v Iain>Thank you.</v>

So indeed thanks Janina.

So in summary,

when we assess what it takes to take the next step

in modern tailings storage facility

monitoring and management, teams have

to consider the digital transformation of the process.

Monitoring data provides one essential cornerstone

for the development of a transparent

and dynamic digital twin,

which in turn becomes the basis for designs used

at all phases of the project’s life cycle.

Now this holistic approach invites the engineers

to participate in the investigation of the physical system

to understand the geological constraints

and make informed decisions

about the facility’s performance as it evolves.

A comprehensive digital twin

that consistently incorporates changing data

and evaluates all spatial, numeric,

and intellectual information

in a three-dimensional plus temporal context helps

to identify problems early.

It can also help develop

and adjust the targeted monitoring programs

and enables the operator to create an adaptable design

that allows changes to be identified in the moment

and the current construction trajectory to be adapted

so that it meets the factors of safety required.

So the essential takeaway from our presentation

is that we want to convey a conviction

that there is a paradigm shift occurring.

Tailings governance is able now to shift

from a predominantly reactive long-term modeling approach

to a more strongly agile predictive

short-term modeling method.

Preventing failure is not about a single data point

or technology,

rather it’s how you bring

the whole complex information set together that counts.

Thank you very much for the time

and of course we welcome further questions.