This step-by-step webinar reviews the integrated geotechnical workflow for analysing pore-water pressures and stability of a tailings storage facility.
The steps, from creating a geological model in Leapfrog Works through to analysing the slope stability results in SLOPE/W, are presented.
Interoperability is key for an integrated, efficient workflow, especially when dealing with dynamic data in large geoengineering projects. Leapfrog Works and GeoStudio offer such connectivity through Seequent’s Central hub. The workflow presented in this webinar will step you through the creation of a geological model representing a tailings storage facility in Leapfrog Works, and through the analysis of the slope stability results of a generated 2D cross-section in SLOPE/W. This integration is made possible through the sharing of data across the Central hub. The final step will illustrate how the results of the SLOPE/W analysis can be uploaded back to Central and imported into Leapfrog Works.
<v Jeff>Hello, and welcome to the Leapfrog Works</v>
GeoStudio interoperability webinar.
My name is Jeff McKeon, a project geologist with Seequent,
and today I will be joined by Kathryn Dompierre,
a research and development engineer with Geoslope.
Today, I’m going to walk you through some
of the basic workflows in Leapfrog Works
while my colleague Kathryn will cover
some of the workflows in GeoStudio.
We are going to be using Central, Seequent’s system
for data and model management, allowing version tracking,
intellectual exchange and collaboration on your models
with different groups within your company
to ultimately validate and approve your model
prior to your geotechnical analysis.
Before we go ahead and jump in to the demonstration,
I wanted to provide a brief introduction to Seequent.
Some of you may have heard of us
through one of our applications,
such as Leapfrog or GeoStudio,
but others may have never had this touch point.
We are a software company that builds solutions
for the geoscientific community.
It is our mission to enable our customers
to make better decisions about the earth,
environment and energy challenges,
because it is that robust decision making process
that provides security and longevity to your organization.
Seequent has a truly global presence with over 430 staff.
We provide active solutions to the civil, energy,
environmental and mining sectors.
Through this cross sector engagement
with our global customers, our team continuously learns
and develops new cutting edge technology
that aspires to support your team and your decision making.
The key to our solution is that it provides
the means to work effectively as a team
and ensure data transparency.
These are the underlying principles
that allow for a robust review and decision making process
throughout the life cycles of various project types.
Ideally, this is accomplished by all stakeholders
in the project, whether modelers
from different geoscientific groups,
project managers, or third parties,
such as consultants and JV partners, having access
to data in as near real-time as possible.
In addition, everyone needs to work collaboratively
from a single source of truth
to create up-to-date models that facilitate
the development of a digital twin
or the digital representation of the physical system
through 3D and numeric models.
Once the digital twin is established,
then target ranges can be set that support a proactive
decision making process throughout the life cycle
of the project.
Now, this is the ideal scenario,
but it can only be achieved if certain conditions are met.
For example, all data needs to be consistently accessible,
no matter where stakeholders are.
The data needs to exist in a singular place
rather than being scattered over multiple servers,
external hard drives and computers.
To build this perfect, ideal holistic model,
we also need to understand the data contextually.
Each geoscientific group uses raw data differently,
and we need to be able to trace data from the origin
throughout its transformation process.
And finally, every stakeholder needs to have direct access
to the team and to each other’s thought processes
and expertise to collaboratively analyze,
iteratively refine the digital twin
and take the next steps together to arrive
at robust target ranges.
When we meet all of these conditions,
then we can more rapidly identify a potential failure
and take action as a team.
Seequent’s modeling suites, such as Leapfrog,
Oasis montaj and GeoStudio,
address the analytical needs of geologists,
geophysicists and geotechnical engineers.
As you know, we also partner
with industry standard software providers
such as FEFLOW, MODFLOW, Esri and more
to support hydrogeological models and the visualization
of spatial data in Central.
Central represents our cloud hosted data management system.
It underpins and enables the collaborative analytical
and decision-making processes
by providing a platform for active communication
and data tracking.
In combination, the outputs define a live digital twin.
What we wish to present to you today
is how our products allow for a transparent flow
of information, both physical and intellectual
to break free from reports and to allow the collective team
to act as a unit when it’s needed the most.
The workflow we would like to present today
shows how Seequent’s modeling and data management solutions
are actively used in an iterative modeling process.
We are going to focus on the linkage
between the geological model
built on Leapfrog with GeoStudio.
We start off by collecting various types
of geospatial data across the site.
This includes GIS data, borehole data and surface data.
All these data points are then introduced into Leapfrog
to create an implicit geological model.
The geological model results as well as engineering data
and material property information
are then communicated and shared
with the geotechnical engineers.
The team is then able to carry out long and short-term
stability and seepage analyses through GeoStudio
to arrive at a geotechnical model.
In combination, all spatial and numeric models
provide the means to define a true digital twin
of your project environment.
Through Central, this collective information
can then be shared, version tracked and peer reviewed
by all stakeholders to define the next steps
in the cycle and continuously update the digital twin
as the site evolves.
So at this stage, let’s just go ahead
and jump into our active demonstration of the workflows,
starting with Leapfrog Works.
So for those of you who have not worked with Leapfrog,
it is an implicit modeling engine
designed to create a holistic representation,
or digital twin, of your project area.
In this specific instance,
I’m going to be working with a tailings dam project.
It is highly interoperable, in that it allows you
to pull data from many different software sources
and many different exchange formats
into its respective folders.
The interface is split up into three different components.
You have your project tree, where all of your data
is housed and organized.
You have your scene view which is where you visualize
your data and your object list or shape list
where whatever is visualized in the scene
appears just so you can keep track
of what you’re looking at.
Jumping over to the project tree.
To import data into any of these folders,
all you do is right click
where it gives you your import options.
For boreholes, this is showing up gray
because I already have them in the scene.
Just going to pull this in really quickly.
So boreholes can be pulled in from an array
of different sources.
You can connect to a local database like Access,
make a direct connection through OpenGround,
or use local files like CSV or text files.
You can also pull directly from Central,
which is Seequent’s cloud-hosted system
for data and model management,
which is what I did in this case
that’s why there’s this little C icon right here.
So once the data is loaded into your project,
you can start to build these holistic representations
of your project area.
So Leapfrog works by introducing X, Y, Z data
in space, where it then uses the radial basis function,
or RBF, which is akin to dual kriging,
to build implicit models.
So each point is weighted statistically
to create a best fit surface.
This can be a topographic surface,
a lithological contact surface, or a water table surface.
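As a rough sketch of that idea, here is a hedged Python illustration using SciPy's general-purpose RBFInterpolator; Leapfrog's actual engine and kernel choices are proprietary, and the contact picks below are invented for demonstration:

```python
# Illustrative only: fitting a best-fit surface through scattered X, Y, Z
# picks with radial basis functions, conceptually similar to (but not the
# same as) Leapfrog's implicit modeling engine. Data values are invented.
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical borehole contact picks: plan coordinates (m) and elevation (m)
rng = np.random.default_rng(0)
xy = rng.uniform(0, 1000, size=(25, 2))
z = 50 + 0.01 * xy[:, 0] + 5 * np.sin(xy[:, 1] / 200)

# Each data point contributes a radially symmetric basis function;
# the solved weights define a smooth surface through every pick.
surface = RBFInterpolator(xy, z, kernel="thin_plate_spline")

# Evaluate the interpolated surface on a regular grid
gx, gy = np.meshgrid(np.linspace(0, 1000, 50), np.linspace(0, 1000, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
zg = surface(grid)

# With no smoothing, the surface honors the input data exactly
print(np.allclose(surface(xy), z))  # True
```

The same machinery applies whether the picks come from a topographic survey, lithological contacts, or water table measurements.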
All objects you build are dynamically linked to the source
data indicated by these hyperlinks underneath the object.
So I built this topography in Leapfrog
using the Martis 10 ft point data.
If you click on the hyperlink, it will direct you
in the project tree to that source data.
So if I drag this in, we can see that this surface
was created by interpolating between these points.
Another thing to mention
is that these are dynamically linked.
So if I were to change the source data, this object
that was built from it will reprocess,
and then in turn anything using the topography
and your downstream modeling processes
will also be reinterpreted.
So with this source data,
you can create your geologic models,
your water table models, numeric models, contaminant models,
all in these folders down here.
Really quickly, I’m just going to show
the geological model building process
for those who have not used Leapfrog Works.
So to build a model, and this works in any
of these modeling folders.
You right click, and I’m going to select
a new geological model, I’m also really quickly going
to pull my borehole data into the scene.
Okay, so here, I’m going to select my lithology,
my surface resolution, as well as the model extents,
you can set them manually, modify them in the scene
or select an object to enclose.
I’m just going to enclose my lithological data and click okay,
and then this will create your geological model.
So it’s comprised of your boundary,
which is what we just set, fault system,
if you have faults in your project, your lithology,
and probably the two most important folders,
surface chronology and your output volumes.
So initially it will create this unknown output volume
based on the model extents that we set with the upper bound
being the topographic surface.
To carve this up, let me make it a little more transparent.
We’re going to come
into the surface chronology folder and create some surfaces.
I’m just going to create one to show this process.
So there’s a bunch of different surfaces,
I’m going to select a deposit from base lithology.
Here I’m going to create a surface
for this purply blue unit.
This is just a sedimentary unit.
So I’m going to select that and select use contacts below.
This will then create a point for every lower boundary
contact of this Q-O-G-O unit and when I’m happy,
I’m going to click okay,
and that will generate our surface right here.
Drag that into this scene and we can see
that the surface honors our borehole data very well.
So I already have this model built,
so I’m just going to go ahead and clear the scene
and show you all of the surfaces that I created
using this lithological data.
So these surfaces, a majority of these surfaces
were created just using the contact information.
You can also incorporate other sources of information,
such as GIS lines or contacts at your surface.
You can introduce your own points and polylines,
so in areas where data is sparse, such as over in this area,
some surfaces may blow out where you don’t want them to.
So for instance, this red surface
is my fill surface right here.
So originally the surface was blowing out all over this area
because there aren’t any holes.
So I just went in and manually constrained them
with these two polyline objects.
So we can see one polyline here, one right here, here,
and up here, just to make sure
that we are building something that is representative
of our project area.
So all that being said, I’m just going to clear this scene
and drag the geological model back into the scene.
So once those surfaces carve up that unknown output volume,
it will then generate these different volumes for your units.
So once the model is built, it can then be published
to Central for peer review.
Central is one of our solutions,
it’s a system for data and model management.
It’s a secure environment that can consistently version track
data coming into the project from the field
and dynamically link it.
To publish, you just click on this little Central icon
in the bottom left and select publish.
Here you have the option of selecting what you would like
to publish as well as the ability to include
the entire project so people can copy it down
on their server and work on it.
You can also select which stage this publication is.
I already have one published, so I’m just going
to directly jump into the Central server to do so.
You come over to Central projects and select open portal,
and then you’re going to select your project.
So I’m going to be jumping in and out
of Central in this webinar.
So I’m going to come back to all of the version tracking
and other functionality, but right now,
I want to highlight the Central data room.
That is just this little files tab here.
So the Central data room is a cloud hosted system
that allows you to organize your project data
and dynamically link it to specific objects in your project.
Here, you can add data as it comes into the field,
as well as version track it.
In this instance, I have an additional
round of borehole information.
So I’m going to jump into my borehole data,
I’m just going to select my collar.
If you want to update an object,
you click on it and then come over
and select upload new version.
So I’m just going to click on the new collar information
and click Okay, and then that will allow you
to version track right here.
So I already did that for these two,
so I’m just going to jump back into Leapfrog.
When the data in Central has been updated,
a little clock icon will appear
in the project tree next to the object.
This indicates that this is out of date
and will need to be reloaded.
To do so, I’m just going to pull this in
so we can visualize this and maybe adjust this transparency.
Okay, so as we can see, the little clock icon is here.
So I’m just going to right click on this object
and here you have the option to reload
from the latest file version, or you can reload
from any of the files in your file history.
In this case, I’m just going to reload
the latest file version, and that will prompt this screen.
So here we can see our mappings of our collar information.
It’s going to retain the mappings
that you had in the last import,
so I’m happy with this, I’m just going to click finish.
And now that is going in and reprocessing everything
that was dynamically linked to this collar information.
So here we can see that new round of boreholes is in,
and it is now reprocessing our geological model.
So throughout the life cycle of the project,
we will constantly be bringing in more and more information.
Like we just brought in a new round of drilling
and the project will consistently,
dynamically update throughout this entire process.
This can be geophysical data from Oasis montaj,
which is our geophysical suite of tools,
geotechnical analysis through GeoStudio,
our geotechnical suite of tools,
or any other suite of software
that you’re pulling information from.
And that information will go into its appropriate folders.
We also have many APIs
with different software companies.
So for flow models, we can create MODFLOW and FEFLOW grids
in Leapfrog to be used
in these respective software packages.
So I updated the model again,
and I’m going to go ahead and publish.
So I’m just going to jump back into the portal.
So again, here we can see the latest model
publishing event that I just did.
We can see what changes have occurred,
who’s made these changes and established a peer review
process, as well as version track.
In the browser, you can also limit
who has access to these models
and you can connect stakeholders,
other people working on the model,
other modelers, geophysicists, geologists,
so that everybody can collaborate on building
the best representation or a digital twin
of your project area.
So in this browser as well,
it’s not only the raw data files and the data room
that we looked into previously, or the version tracking,
but you can also allow intellectual exchanges
between modelers and stakeholders
to actively peer review in real time.
So anybody connected to this project will get notifications
when any updates occur and those notifications appear
in this bell icon.
So if I click here, I can see that a day ago,
Steven Donovan commented on the master branch.
So if I click on that notification,
it will bring up the model that this applies to.
So we can see here, this is one of the older revisions.
So we can see here that he said,
fill material looks to be blowing out,
please constrain the model.
And so that’s what I did.
I went in and I fixed this area.
I republished and said that I’ve fixed the blow out here.
So when I replied to his comment,
he will also get those notifications.
So this is the Central web browser.
Really quickly, I’m going to jump over
to our Central desktop application.
It has pretty similar functionality,
it’s just a little more built out.
So here is the Central browser.
Again, it is a desktop application.
Here you can see all of your project revisions.
It’s very similar to the web browser, but again,
it has a bit more functionality.
I’m going to go ahead and click on the most recent version.
And here you have the ability to pull any
of these objects into the scene.
The feature that I wanted to highlight
is the ability to actively compare different models,
different objects that can be applied to boreholes,
faults, lithological units,
and allows you to compare your interpretations
and see how they’ve changed over time.
So for this exercise, I’m going to focus on my fill unit.
To go ahead and compare this to a previous revision,
you just come to the top and select
this compare revisions button.
You’re going to come over and select the revision
that you want to compare this to.
I’m going to select my first revision
and then modify the appearance of the object.
I’m just going to turn wireframing on for the older revision.
That way we can see those differences.
So here we can see that the wireframe fill object
is the older revision and we can see
that this is that big blowout area that my colleague
previously commented on to fix, which is what we did.
And we can also see here that the additional round
of drilling also modified this object.
So once we’ve decided that this fill unit
and our entire model is at the highest geological certainty,
we can go ahead and jump back over.
And now that this model is validated,
you can change the stage to approved,
which will update the project and then notify everybody
attached to the project that this model is now approved.
So now I’m going to switch back over to Leapfrog.
Now that my project is approved
and the geological model has been validated,
I can move forward with my geotechnical analysis,
just going to jump into this scene view.
I want to do a slope stability and seepage analysis
along my dam axis, so I’m going to need to create
a few cross sections along this chainage.
This yellow line right here is loaded into my GIS.
This is some vector data that has been draped
onto my topography along with the map.
To go ahead and build cross sections along this,
I’m going to come down to the cross sections
and contours folder, right click,
and then this is going to give me multiple options
for my cross sections.
Because I want to make cross-sections along this chainage,
I’m going to select new alignment serial section,
and here it’s going to give me the ability
to select a new polyline or an existing one.
I’m just going to go ahead and select the existing one.
And here we can see that these cross sections
are being populated along this chainage.
I’m going to modify some of these parameters,
going to change the width and the height to 500.
I’m going to move the line or move the cross sections
down a bit, and then I’m going to change my chainage
spacing to something like 250.
And when I am happy with this,
I’m just going to zoom in to increase this transparency a bit.
It looks like they’re a little bit below,
so I’m just going to modify this and that looks good.
And we can see here that they all say F,
meaning that this is the front of the cross section,
while the back of the cross section says B for back.
So when I’m happy with this,
I’m going to go ahead and click okay.
And that’s going to create the sections right here.
First, I’m going to need to evaluate
my geological model onto my sections.
So I’m going to right click on these and select evaluations.
And I’m just going to pull over my geological model
and click okay, and then that will evaluate
the geological model onto all of these sections.
Now that is done, I’m going to pull these into the scene
and I’m going to turn these off
and here we can see the geological model
now evaluated onto all of these sections.
I am then going to create a section layout.
So to do this, I just right click on the cross sections
and select new master section layout.
And here we have a bunch of different options
for our cross sections.
I’m going to select my model
and I’ll just keep the default settings
as they are and click okay.
And so now this is populated.
I’m just going to move some things around.
To move things in the section around,
you just drag and drop, maybe move my sections down a bit,
drag this to the center.
I’m also going to want to evaluate
my boreholes onto my section.
So to do that, you just come to boreholes in this list,
right click and select add boreholes.
I’m going to select my lithology data.
And here we can select all of the boreholes or filter
by a minimum distance.
In this case, I’m going to filter by maybe 125
and then check this on and click okay.
So now my boreholes are on.
And when you are happy with your section layout,
you just click save, then right click
on your master section layout, select copy to,
and go ahead and select all of the cross sections
you just created.
And once that is done processing,
you can drag this into the scene.
And here are multiple cross sections
with your section layout applied along this chainage.
So now that my cross sections are created,
I’m just going to grab a few,
put them in the data room for Kathryn to pull down
and do the slope stability analysis, and the SEEP analysis.
You can export these cross sections in an array
of different exchange formats.
You just right click and select export,
select your geological model
and here are the exchange formats that we offer.
We can do DXF, DWG as well as DGN.
So I’ve already exported these
and thrown them in the data room.
So at this point, I’m just going to hand it over
to Kathryn to run that analysis.
<v Kathryn>Thanks, Jeff.</v>
As Jeff demonstrated, he published cross sections
from his geological model of the tailings embankment
to Central and granted me access to this project.
Subsequently, I have opened up Central through my internet
browser and can now click on this project
to view its history.
I see there are two branches of the project.
I wish to use the cross sections from the master branch
that were approved and uploaded by Jeff.
These cross sections were placed in the files associated
with the project under geotechnical cross sections.
Here, I will select each one and download them
so I can use them to create my project geometry.
In Central, I can also view the positioning
of these cross sections in the overall geological model,
by going back to the overview tab
and selecting the master project file,
which opens it in the project scene.
Here, I will toggle on the output volumes
from the geological model,
and then select the three cross sections
at 2500, 2750 and 3000.
I see that the cross sections cut through the curved section
of the tailings embankment.
In GeoStudio, I have created a new project
with metric units.
In this file, I will add three two-dimensional geometries,
one for each of the cross sections
that I have just downloaded from Central.
I will name them according to the position
of the cross section in the geological model
and close the Define Project dialogue.
In the first two-dimensional geometry,
I will select file import, go to my downloads folder
and select the cross section at 2500.
I will ensure that the materials are imported
from the layer names and turn off the option
to translate the cross section horizontally
to start at X is equal to zero.
You will notice that the analysis details are very small,
and so I will go to define scale
and change the reference scale ratio to one to 3000.
Following the same procedure,
I will import the cross sections at 2750 and 3000
to the corresponding GeoStudio geometries.
My geometry definition is now complete.
The next step for setting up a GeoStudio analysis
is to define the physics.
In order to do so, I must first establish
what types of analysis I wish to simulate
on the model domain.
This is done in the Define Project window.
I wish to simulate transient
pore-water pressure conditions.
To do so, I will first add a steady state SEEP/W analysis,
which establishes my initial conditions
throughout the domain.
Then I will add a transient SEEP/W analysis as its child.
I will set the duration of the transient analysis to 60 days
with half day time steps that are saved every four steps.
I will add similar analyses
to the other two geometries in the project file.
Once I’ve finished adding the analyses,
I must now specify the material properties
to complete the definition of the physics on the domain.
I’ve already copied the material properties
over from a previous GeoStudio file
conducted on the same project site.
Note that some of the materials use the saturated only
material model while others closer to the ground surface
use the saturated-unsaturated material model.
The next step in setting up our GeoStudio project
is to define the boundary conditions
for the SEEP/W analysis.
There are two default hydraulic boundary conditions.
The drainage boundary represents the conditions
along a potential seepage face.
I will use this on the downstream side of the embankment.
I must also add a new boundary condition that represents
the water level in the tailings storage facility.
I will add a new water total head boundary condition
defined by a function that represents
the changing water level over time.
To define the function, I will copy the field data
for the water surface elevation over time
into the function definition window.
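A boundary function of this kind is simply a lookup of head against time. As a hedged sketch (outside GeoStudio, with invented field values), it can be modelled with linear interpolation:

```python
# Illustrative only: a total-head-vs-time function analogous to the
# boundary function defined in SEEP/W. The day numbers and water surface
# elevations below are invented for demonstration.
import numpy as np

days      = np.array([0.0, 15.0, 30.0, 45.0, 60.0])   # time (days)
elevation = np.array([312.0, 313.5, 315.2, 316.0, 316.4])  # water level (m)

def total_head(t_days: float) -> float:
    """Linearly interpolate the reservoir water level at time t (days)."""
    return float(np.interp(t_days, days, elevation))

# Sample at each saved time step: half-day steps saved every 4 steps = 2 days
save_times = np.arange(0.0, 60.1, 2.0)
heads = [total_head(t) for t in save_times]
print(total_head(7.5))  # 312.75, midway between day 0 and day 15
```

The solver evaluates such a function at every time step to update the head applied along the upstream boundary.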
I can now apply the boundary conditions to the model domain.
I will select the apply to multiple analyses option
in the Draw Boundary Conditions dialogue,
and apply my boundary conditions to both the steady state
and transient seepage analysis.
The last step of defining the seepage analysis
is to set the finite element mesh size,
which I will specify as five meters.
I’m now ready to solve the analysis.
When solved, the file automatically changes to results view.
I will add an isosurface at a pore-water pressure value
of zero kPa to represent the position
of the phreatic surface throughout the domain.
I could also turn on the total head
or pore water pressure contours.
In the transient analysis, I can view the results over time
by clicking through the timestamps shown
in the result times window.
I can now add a SLOPE/W analysis to determine the stability
of the system as the water level changes.
This project includes both soil and rock materials,
which can be represented with a wide range
of material models available in SLOPE/W.
I have already done so in a different GeoStudio file.
You will note that as the water level in the tailings
facility rises, the critical factor of safety decreases.
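The mechanism behind that trend can be seen in a simple limit equilibrium expression. This is an illustrative sketch using the infinite-slope formula, not SLOPE/W's method, and all parameter values are invented:

```python
# Illustrative only: the infinite-slope factor of safety shows why higher
# pore water pressure (a rising phreatic surface) reduces stability.
import math

def infinite_slope_fos(c=5.0, phi_deg=30.0, gamma=19.0, depth=5.0,
                       beta_deg=25.0, pore_pressure=0.0):
    """Factor of safety for an infinite slope.

    c: cohesion (kPa), phi_deg: friction angle (deg), gamma: unit
    weight (kN/m3), depth: slip depth (m), beta_deg: slope angle (deg),
    pore_pressure: pore water pressure on the slip plane (kPa).
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    normal_stress = gamma * depth * math.cos(beta) ** 2
    shear_stress = gamma * depth * math.sin(beta) * math.cos(beta)
    # Shear strength (Mohr-Coulomb, effective stress) over driving shear
    return (c + (normal_stress - pore_pressure) * math.tan(phi)) / shear_stress

# Higher pore water pressure -> lower effective stress -> lower FoS
dry = infinite_slope_fos(pore_pressure=0.0)
wet = infinite_slope_fos(pore_pressure=30.0)
print(dry > wet)  # True
```

SLOPE/W searches over many trial slip surfaces with more general methods, but the same effective stress principle drives the result.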
Once I’ve completed my model in GeoStudio,
I can export my analysis cross section to a DXF or DWG file
that could be imported back
into the original Leapfrog Works project file.
I will reopen Central and update the cross sections
in the project data files.
I can also upload the GeoStudio project file,
so my colleagues have access to it.
We’ve now reached the end of this webinar.
Please take the time to complete the short survey
that appears on your screen, so we know what types
of webinars you’re interested in attending in the future.
Thank you very much for joining us
and have a great rest of your day.