
In the first of our five-part Tailings series, Janina Elliott, Global Central Technical Lead at Seequent, shares Seequent’s workflow solution that is helping safeguard operations with digital twin technology.


Environmental, socio-economic, and political risks, along with the need to digitally transform the mining industry, have put Tailings Storage Facilities (TSFs) in the spotlight. To truly learn from a failure event and fulfil the goal of the global standard, complete transparency regarding the chain of events is essential.


In this session, Janina discusses the vast and complex challenges the industry is facing, what a dynamically updated digital twin is and its role in bridging the physical and digital world, the power of 3D visualization and modelling, and how to uncover valuable insights from data and share across teams.



Janina Elliott
Global Central Technical Lead – Seequent


32 min


Video Transcript

Hello,

and welcome to the first installment of our webinar series,

“Seequent’s Dynamic Digital Twin Solution

for Modern Tailings Storage Facilities Management.”

My name is Dr. Janina Elliott,

Seequent’s Global Central Technical Lead.

And today I will guide you through

the first portion of our proposed workflow,

from data acquisition to an integrated 3D model in Leapfrog.


Before we begin, let’s do a little housekeeping.

At this time,

I would like to make a statement of confidentiality

and provide a disclaimer.

Please note that this presentation

is for informational purposes only,

and is not a commitment to deliver

software features or functionality.

The software products that will be shown today

are the latest versions of Leapfrog Geo and Central.

And despite the webinar’s technical nature,

the presentation is designed for a wide audience,

both from the technical and non-technical domains.

During the webinar, the audience is muted

to ensure that the presentation doesn’t run over time.

But should you have any questions,

please don’t hesitate to write them

into the question window in GoToMeeting.

We will make sure that a personalized reply

will be sent to you via email in due time.

After the webinar,

we would like to ask you to remain

for one or two minutes longer

to partake in a short survey

that will help us understand your needs

and learn how we can improve our offerings.

And as always, if you wish to retain

or share a recording of this webinar,

a link to the video will be sent to you

shortly after the presentation.

Okay, so let’s get started.

Before we go ahead and jump into the demonstration,

I would like to introduce ourselves

to the viewers who haven’t met us yet.

So who is Seequent?

Some of you may have heard of us

through one of our subsurface modeling applications,

such as Oasis montaj, Leapfrog, or GeoStudio,

but others may never have had a touch point.

We are a software company that builds solutions

for the geoscientific community.

And it is our mission to enable our customers

to make better decisions about the earth,

environment, and energy challenges.

Because it is that robust decision-making process

that provides security and longevity to your organization.

Seequent has a truly global presence with over 430 staff.

But not only do we hire geologists from the mining side,

we also bring in experts from other geoscientific fields

and provide active solutions

to the civil, energy, and environmental sectors.

And through this cross-sector engagement

with our global customers,

our team continuously learns and develops

new cutting edge technology

that aspires to support you in your decision-making.

Now, arguably one of the most important decisions

the global mining community recently made,

was to widely commit to an improved due diligence process

regarding the safe and sustainable design,

construction, maintenance, and remediation

of tailings storage facilities.

This commitment was formalized

in the Global Industry Standard on Tailings Management,

released in early August 2020.

And since, operators and consultancies alike

have begun to investigate new strategies and technologies

that would allow them to fulfill

the ultimate goal of the standard,

to cause zero harm to people and the environment.

But to truly be able to effect change

and learn from failure events,

it is essential that there is complete transparency

regarding the chain of events.

All data, analysis, and decision-making processes

need to be clearly understood at any given time,

and ideally, in real-time.

However, to achieve this objective

is easier said than done.

Monitoring and understanding the ever-changing data

coming from some of the largest man-made structures on earth

is inherently challenging.

Our conversations with senior management highlight

that reports we see from a multitude of storage facilities

vary not only in location, age, and conditions,

but also in terms of the techniques applied

to monitor and test scenarios for potential failure.

This distinct lack of standardization makes the overall

monitor to model to design workflow

unnecessarily complex,

consumes resources and introduces risk.

In addition, supervisors and reviewers also fear

that the local technical teams struggle

not only with inconsistent file formats and compatibility,

but also with the absence of multidisciplinary interaction

and sufficient comprehension,

which is a cause for miscommunication.

As such, both managers and technical teams agree

that they’re not fully confident

that a comprehensive picture of all the assets is delivered,

and that problems might be missed.

To create a truly comprehensive and live model

of a tailings storage facility,

all the groups involved need to work and communicate

effectively as a team,

and ensure consistent data transparency.

These underlying principles are what allow

for a robust review and decision-making process

throughout the life cycle of a site.

But how is this best accomplished?

Firstly, all stakeholders in a project

whether modelers from different geoscientific groups,

project managers, or third parties,

such as consultants and JV partners,

need to have access to the latest data

in as near real-time as possible.

Secondly, everyone needs to work collaboratively

from a single source of truth

to create up-to-date models that facilitate

the development of a digital twin,

that is the complete conceptualization

of the physical system through a model,

numeric or otherwise.

Thirdly, to build the digital twin,

all modelers need to understand the data contextually.

Each geoscientific group views and utilizes

raw data differently.

And, you need to be able to trace data from the origin

throughout its transformation process.

And finally,

every stakeholder needs to have access to the team

and its collective intelligence,

that is to each other’s thought processes and expertise

to collaboratively analyze,

iteratively refine the digital twin,

and then take the next steps together

to arrive at robust target ranges

for the reviewing decision-maker.

When all of these conditions are met,

then tailings governance can shift

from a dominantly reactive, long-term modeling approach

to a more agile, even predictive, short-term one.

Now, Seequent has listened very closely,

and we have developed an integrated solution

in which Seequent modeling and data management products

are used in an intuitive workflow

to develop a digital twin.

The overall workflow begins with

the various types of geospatial data

collected across the tailings site.

This may include drilling and GIS data,

but also geochemical and geophysical information,

whereby the latter can be fully analyzed

in Seequent’s Oasis montaj software.

All of these data points

are then introduced into the Leapfrog suite

to create an implicit geological model.

At the same time,

hydrogeological data from boreholes,

piezometric measuring stations, and more,

is collected to create a hydrogeological model,

either with the help of Leapfrog

or through our partner products,

such as FEFLOW and MODFLOW.

The 2D and 3D results

of the hydrogeological and geological models,

as well as engineering data

and material property information,

are then communicated and shared

with the geotechnical engineers.

The team is then able to carry out

long and short-term stability analysis through GeoStudio

to arrive at a geotechnical model.

In combination, all spatial and numeric models

provide the means to define a true digital twin

of the tailings environment.

Through Central, this collective information

can then be shared, version tracked,

and peer-reviewed by all the stakeholders

to define the next steps in the cycle,

and continuously update the digital twin

as the tailings site evolves.

In today’s webinar,

we will tackle the first portion of the workflow,

and show how raw field data can be effectively stored,

version tracked, and communicated through Central,

and actively integrated into a dynamic 3D model

in Leapfrog for subsequent analysis.

Okay, so let’s start by looking at Seequent Central.

Just in case you haven’t come across

Seequent Central as of yet,

Central is a cloud-based data and model management system

that provides a platform not only for the exchange

and retention of data,

but also cross-collaborative communication

in a 3D environment.

Central also facilitates the holistic integration

of data coming from a variety of software programs,

and as such, assists in a breakdown

of perceived technical and disciplinary barriers

between traditionally siloed groups.

And here we are in the Central portal,

the hub where all tailings facility stakeholders

can interactively communicate their work,

visualize Leapfrog models,

and build an auditable record

of project specific data and models.

As I mentioned earlier,

teamwork and data transparency are key factors,

and underpin the entirety of the indicated workflow.

As such, Central will resurface multiple times,

whenever major milestones are achieved

in the modeling process of the digital twin

that require preservation and communication.

For today’s workflow component,

we focus only on a subset of Central’s many features,

which we will explore in more detail

in the third installment of our webinar series.

We begin by having a closer look

at the data room environment of our demo project,

which resides in the files tab.

Each project stored in Central

is outfitted with a unique data room,

where you and all other project stakeholders

have the opportunity to store and version track

their project specific data,

no matter their software origin.

So once you’ve mapped out who provides essential data

on a regular basis and requires access,

you can create a custom folder structure.

Within each folder,

different files are chronologically organized,

which makes it easy to find the latest geotechnical report,

the latest lidar files and drone imagery,

or up-to-date water table point data, as shown here.

Should you utilize geophysics

for non-intrusive investigation of your TSF site,

then you’d be pleased to hear

that Oasis montaj-generated grids and voxels

can be directly linked to and stored within

the Central data room.

Another great advantage of the Central system

that goes beyond simple data retention,

is the integrated nature of our Seequent software.

As such, specific file types,

for example, aforementioned 2D grids,

point files, borehole tables,

polylines, planar structural data, and meshes,

can be dynamically linked to a live 3D Leapfrog model.

Once linked, the Central system notifies the modeler

that a project team member has introduced new data

that is ready for instant use,

thus aiding a real-time flow of information.

Now, let’s have a look at that data transfer in Leapfrog.

And here we are in the Leapfrog modeling suite,

where the dominant portion of the geospatial data

collected at site can be correlated and evaluated

in a comprehensive 3D model,

which represents a significant component

of the complete digital twin.

As you can see, Leapfrog is organized

in a very intuitive way.

On the left-hand side, we have the project tree,

which allows you to import geospatial field data

from multiple sources, including various databases,

and of course, Central.

The principal idea here is to start on the surface

and introduce topography meshes,

GIS data, and drone imagery,

that can be adjusted over time

to document the evolution of the project site.

Then you can step into the subsurface,

and introduce raw data relating to invasive

or non-invasive investigative methods.

These include borehole information,

designs relating to above or below-surface structures,

points relating to monitoring data

and other numeric measurements,

polylines, geophysical data, and even structural data.

As I mentioned earlier,

the majority of these data types

can be dynamically linked and sourced from Central.

Let me give you a quick example

here regarding the points folder.

To access data, all you need to do is to right click

and choose “Import Points from Central.”

This action will grant you access

to any Central project that you are privy to,

as well as the data collected in its modeling history,

and of course, its data room.

Once accessed, the link is established

and the point file is transferred.

As soon as the new version

has been made available in the data room,

for example, through another team member or consultant,

you’re automatically notified through a small clock symbol

right here in your Leapfrog project.

It is then up to you to decide

whether you want to go ahead

and use the new data right away,

to update your current model,

or build a second interpretation in parallel.

And how do we go about building a 3D model in Leapfrog?

The raw data collected in the folders

at the top of the project tree

can be actively linked

to the subsequent modeling folders below,

such as the geological models folder, estimation,

numeric interpolant models, and so on.

Here, the geospatial data is then evaluated

by a mathematical algorithm

that creates an implicit 3D model.

To visualize what that means,

let me give you an example utilizing water table information

collected in boreholes that vary over time.

The principal idea of an implicit 3D model

is that we take advantage of a mathematical algorithm

in order to ease our work.

The mathematical algorithm Leapfrog uses

is called the Fast Radial Basis Function,

which is akin to dual kriging.

What it does, is that it takes each individual XYZ point,

and statistically evaluates them against each other

to then create a best fit surface

that passes through each individual data point.

This surface can then represent a water table,

a geological contact,

a hypothetical domain boundary

representative of a specific value, et cetera.
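To make the idea concrete, here is a minimal sketch of implicit surface fitting with radial basis functions. It uses SciPy's generic `RBFInterpolator` rather than Leapfrog's proprietary FastRBF implementation, and the water-table observations are invented for the example, so treat it as an illustration of the principle, not of Leapfrog's internals.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical water-table observations: easting, northing (m) and elevation (m)
xy = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0],
               [100.0, 100.0], [50.0, 50.0]])
z = np.array([320.0, 318.5, 321.2, 319.0, 319.8])

# Fit an RBF surface; with no smoothing it honours every data point exactly
surface = RBFInterpolator(xy, z)

# Evaluate the best-fit surface on a regular grid
gx, gy = np.meshgrid(np.linspace(0, 100, 11), np.linspace(0, 100, 11))
grid = np.column_stack([gx.ravel(), gy.ravel()])
elev = surface(grid).reshape(gx.shape)

# The surface passes through the observations (implicit modelling)
assert np.allclose(surface(xy), z)
```

When a new version of the point file arrives, refitting the interpolator on the updated array is all it takes to regenerate the surface, which is essentially what the dynamic link automates.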

So let’s have a closer look

at our active water table model here.

Each surface shown is organized

in a surface chronology folder,

which means that these surfaces are also placed

in a stratigraphic or chronological context with each other.

When I open up the surface,

we can see the overall dependencies

captured all the way to the hyperlinked source information.

Once the source information is updated,

the new data points will automatically flow

into the dynamically linked surface,

and the Fast Radial Basis Function will modify

the surface interpretation

to match the new statistics.

It is then up to the modeler

to apply their expert knowledge,

to identify if the surface needs to be altered manually

to match the natural environment.

For example,

in areas where data is scarce,

you can add explicit modeling methods

to augment the implicit model.

You can introduce polylines and additional data points

that support the building of a best fit surface environment

that takes all of the information into account.

Through this modeling method,

you have an opportunity

not only to arrive at a model very quickly,

but build a number of interpretations

based on the same data.

These interpretations can then be peer-reviewed,

and you can collaboratively arrive

at the best possible representation of the physical system

to support the development of the overall digital twin.

Once the geological interpretation is established,

you can go ahead and utilize the information

to create hydrogeological models within Leapfrog.

In a hydrogeological modeling folder,

you can build active grids that can then be introduced

into our partner products, such as MODFLOW and FEFLOW,

for subsequent flow modeling.

Now, what does this exactly look like?

Here’s an example of a finalized model.

To build a brand new model,

all we have to do is to right click

into the hydrogeological model folder,

choose MODFLOW, for example,

and click on “New Structured Model.”

The principal idea is that you can pick

any previously created geological model

as a foundation for your hydrogeological grid structure.

Here, I’m going to go with my TSF model,

built in the geological modeling folder,

and introduce the specific lithological surfaces

that I deem appropriate

for this particular hydrogeological model.

Once in place,

I can also decide how to separate the individual layers,

and what kind of grid structure I wish to develop.

So for example,

what kind of grid size or cell spacing I wish to employ,

et cetera.
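For intuition, the grid definition boils down to choosing horizontal cell spacings plus a top surface and layer bottoms. A minimal NumPy sketch with hypothetical dimensions might look like this; a real project would, of course, use the Leapfrog dialog and the actual model surfaces rather than hand-written arrays.

```python
import numpy as np

# Horizontal discretization: 20 columns x 15 rows of 25 m cells (hypothetical)
ncol, nrow, cell = 20, 15, 25.0
x_edges = np.arange(ncol + 1) * cell
y_edges = np.arange(nrow + 1) * cell

# Vertical discretization: model top and two layer bottoms, each a
# (nrow, ncol) elevation array (flat layers here, purely for simplicity)
top = np.full((nrow, ncol), 330.0)               # ground surface (m)
botm = np.stack([np.full((nrow, ncol), 310.0),   # base of tailings
                 np.full((nrow, ncol), 280.0)])  # base of foundation soil

# Per-layer cell thicknesses, used to sanity-check the grid
thickness = -np.diff(np.concatenate([top[None], botm]), axis=0)
assert (thickness > 0).all(), "layers must not cross"
print(thickness[:, 0, 0])  # [20. 30.]
```

A MODFLOW-style structured grid is exactly this kind of data: cell spacings, a top, and a bottom elevation per layer, plus the hydrogeological properties assigned to each cell.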

Now, in this particular case,

I have already prepared one of the models.

And what I’d like to show you,

is that in addition to defining your grid,

you also have the opportunity to set dry head value,

as well as edit the hydrogeological properties

of the model ahead of time.

Once ready,

you can export the information directly to MODFLOW.


Now, this particular project

has been actively outfitted with geological,

hydrogeological, and geophysical data.

More specifically, resistivity information.

The resistivity data was calculated in Oasis montaj,

our geophysical modeling software,

and linked to Leapfrog through the geophysical data folder,

which can access information directly from Central.

In this case, the resistivity information relates

to multiple 2D sections within the TSF environment.

The point information displayed on these sections

can be actively interpolated into the 3D space,

through our numeric models folder.

Just to give you an idea of what that looks like,

I have prepared a numeric model ahead of time.

Similarly to what I explained earlier

regarding the water table surface development,

an interpolation model takes each individual XYZ point

into account.

Again, it then utilizes the Fast Radial Basis Function

to statistically evaluate the volumetric distribution

of individual value ranges to create a 3D model

of the resistivity environment.

In addition, it allows for the manual adjustment

of statistical parameters

to modify the orientation of the established volumes

according to the observed trend.

The resulting resistivity model

shows us that there’s a distinct low resistivity pathway,

likely relating to a structural feature

within this environment.

Should you wish for more control

on the statistical distribution of, let’s say, contaminants,

or geochemical variables within the environment,

you can employ the estimation folder,

which is outfitted with an experimental variogram,

and a number of alternate mathematical algorithms

such as ordinary kriging, and more.
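As a rough illustration of what ordinary kriging does under the hood, here is a minimal NumPy sketch built around a hypothetical spherical variogram. The data, variogram parameters, and function names are invented for the example; this is not Leapfrog's estimation engine, just the textbook kriging system it generalizes.

```python
import numpy as np

def spherical_variogram(h, sill=1.0, rng=50.0, nugget=0.0):
    """Spherical variogram model gamma(h) with gamma(0) = 0."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < rng,
                 nugget + sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3),
                 nugget + sill)
    return np.where(h == 0, 0.0, g)

def ordinary_krige(pts, vals, target):
    """Estimate the value at `target` from known points via ordinary kriging."""
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    # Kriging system: [[Gamma, 1], [1^T, 0]] @ [w, mu] = [gamma0, 1]
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical_variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical_variogram(np.linalg.norm(pts - target, axis=-1))
    w = np.linalg.solve(A, b)[:n]   # weights sum to 1 via the Lagrange row
    return float(w @ vals)

# Hypothetical sample locations and concentrations
pts = np.array([[0.0, 0.0], [40.0, 0.0], [0.0, 40.0]])
vals = np.array([10.0, 14.0, 12.0])
est = ordinary_krige(pts, vals, np.array([10.0, 10.0]))
```

The variogram controls how quickly influence decays with distance, which is why the estimation folder exposes it for manual fitting before any interpolation is run.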

Now in this particular example,

the results of the geophysical analysis

have not yet been added to the geological interpretation,

and need to be introduced to show a more realistic model.

The advantage of Leapfrog is that you can go ahead

and dynamically evaluate information

from all sorts of different data sources

to truly build a coherent holistic model

within one package.

Now here’s the complete model of the TSF site,

including the interpreted fault.

Once the geological model has been updated,

that change needs to be actively communicated

to your colleagues,

and more specifically, the geotechnical engineers,

for subsequent stability analysis.

In order to do so,

I first need to create a set of sections

that represent the current geometry.

Luckily, building sections in Leapfrog is easy.

Here’s the basic layout of a cross section in Leapfrog.

And how did we arrive at this?

It’s fairly straightforward.

In the cross section folder,

you have to decide whether you wish to build

an individual section, a fence section, or a serial section.

For example, in case of a serial section,

you will notice here that the set that automatically appears

hinges itself in the 3D space on an existing section.

However, you can decide of course,

on a specific orientation of the particular center section,

as well as the relative spacing

regarding the remaining sections of the set.

Once the set is established,

you can then go ahead and define

just exactly what you wish to present on your section

through evaluations.

You can, for example,

evaluate any model that you’ve previously created,

which is then dynamically linked.

This means that any changes to the linked model

are automatically reflected here,

without having to go through another evaluation.

You can also go ahead and portray

any surface and line in your project

that matters to your geotechnical analysis.

In addition, if you build a new section,

you also have an opportunity to actively copy a layout

and direct it to another section.

Thus, you don’t have to reconstruct

the general distribution of your objects

every single time.

It’s done automatically.

This layout can be exported as a PDF,

scalable vector graphic,

or a GeoTIFF.

Or the entire set, or an individual section, can be exported

in DXF, AutoCAD,

or Bentley drawing formats.

Now that we’ve reached a milestone in our interpretation,

it is time to communicate the change to our team

and publish the updated model into Central.

That way the model can be accessed

by all stakeholders online,

reviewed in almost real-time,

and stored for audit purposes if required.

The publication process itself can actively take place

in the Leapfrog modeling suite.

All you need to do is choose the modeling objects

you wish to visualize in the web portal, add the project,

and define a customizable project stage

to make it easy for your colleagues to understand

whether the current publication is of experimental nature

or needs to be approved through a peer-review process.

In addition,

you can then define a project branch

in our continued version tree.

The concept here is to differentiate

individual Leapfrog models

by content, site location, and modeling approach.

I will speak about the concept

of branching and its advantages

in more detail in the third part of our webinar series.

Once uploaded, all stakeholders associated with the project

are notified of the changes

through our online notification system in the web portal,

or via an email.

It is your choice to decide

how you wish to be alerted to the change.

The key is that you will hear

about the modifications right away,

to maximize the real-time integrated workflow approach.


Let’s navigate back to the Central portal

where we can visualize the model we just uploaded.

Here on the history page of the Central project,

we can see the version tree and the latest upload.

Each individual version upload

is outfitted with a set of metadata,

including a succinct comment,

that explains just exactly what has changed.

This consistent record makes it easy to navigate

the evolution of the project,

even a year or two down the road,

and to understand just exactly

why certain decisions were made in the first place.

To review more specific detail,

we can click on a particular version,

and navigate to the right.

Here, I’d like to highlight the comment panel,

where the collective conversation and intellectual exchange

around this version upload is preserved.

The beauty here is that each comment

is outfitted with a thumbnail image

that, when clicked, allows the reader to be placed

quite figuratively into the midst of the conversation.

And indeed, by clicking on the image in the comment,

we now find ourselves

in the web visualization service of Central,

and in the middle of the 3D model.

The geotag placed identifies the exact XYZ location

which requires discussion.

And makes it easy for all the stakeholders,

even the non-Leapfrog users,

to quickly visualize the issue at hand

and define the next steps together.

Placing a comment is easy, and once done,

all stakeholders tagged with @-mentions,

or who have subscribed to Central’s project notifications

can start interacting in real-time.

Being able to actively partake in a conversation,

providing comments and simultaneously review

geological objects and more within the 3D environment,

allows for a true multidimensional peer-review process,

that adds to the security of your organization

by building a consistent audit trail.

In this case, the comment left here

is designed to inform the geotechnical engineers in my team

that the geological model has been altered,

and that a new set of sections is available

for subsequent stability analysis in GeoStudio.

Now they can access the 3D model in the web portal,

and investigate the newly altered geology,

yet again, without the need for a Leapfrog license,

and actively reply or create a new commentary within here.

To access the sections of interest,

the geotechnical team can now navigate

to the data room of the project,

and choose the exact file format required

for subsequent analysis.

Once finalized,

they can then return and communicate

the modified stability analysis

to the remaining team through Central.

While this process is vastly accelerated

through the improved real-time communication,

the download of the files remains a manual step for now.

Having said that, you, our global customers,

have asked for more seamless integration.

And we have listened.

Both Seequent’s July and November releases

fall under the mantle of 2D

and 3D integration with Central.

The next part of our webinar series,

that focuses on the continued development

of the digital twin within GeoStudio,

will demonstrate what functionality

is on the immediate roadmap

to create a dynamic link between Central and GeoStudio.

So in summary, when we assess what it takes

to manage tailings storage facilities safely,

and consider the requirements

of the Global Tailings Standard,

teams have to think about the holistic modeling approach.

That is, the digital twin.

The digital twin becomes the basis for designs,

used at all phases of the project’s life cycle.

It invites the engineers to participate

in the investigation of the physical system,

to understand the geological constraints,

and make informed decisions

about the facility’s performance as it evolves.

A comprehensive digital twin

that consistently incorporates changing data,

and evaluates all spatial, numeric,

and intellectual information

in a 3D plus temporal context,

helps to identify problems early.

It can also help design targeted monitoring programs.

Interpreting monitoring data is a significant challenge,

as it goes beyond plotting a time series

and trigger thresholds.

Again, data is only valuable

if it is interpreted in the context of the digital twin.

A continuously updated digital twin

enables an adaptable design

that allows changes to be identified in the moment,

and confirms that the current construction trajectory

meets the factors of safety.

The benefits associated with the proposed

integrated workflow are as follows:

The combined use of the products

allows the team of geoscientists

to truly collaborate, break down barriers,

and make confident decisions together.

Particularly by being able to notify

all stakeholders in real-time,

and by providing direct access

to each other’s data when it’s needed,

the team can arrive at important decisions faster.

The proposed workflow also enhances the team’s efficiency

through the ability to track, understand,

and link peer-reviewed modeling changes.

The removal of redundancies

through a more standardized process

using intuitive technologies and practices

allows for greater team productivity.

Coherent data flow,

and transparency of the decision-making process,

provides the team with an opportunity to learn

what has changed, why it has changed over time,

and how to fix it.

In the short term, that’s reducing risk preemptively.

And lastly, being able to create

a coherent audit trail for internal and public review

provides security to the operator

and organization as a whole.

We at Seequent truly believe a paradigm shift is required,

whereby tailings governance needs to shift

from dominantly reactive, long-term modeling,

to a more agile, even predictive,

short-term modeling approach.

Helping prevent failure isn’t about one piece of data,

or a single piece of technology,

but it’s how you bring all of the pieces together

that counts.

Thank you for your time and attention.

We look forward to welcoming you again in mid-June,

to the second part of our webinar series,

“From a 3D Leapfrog Model to a Comprehensive

Geotechnical Analysis in GeoStudio.”

In the meantime, please don’t hesitate to contact us

should you have any questions.

And if you have a few minutes to spare,

we would greatly appreciate your participation

in a short survey after this webinar.

Thanks again, and have a great day.