
Lyceum 2021 | Together Towards Tomorrow

Open pit and underground mining operations dedicate significant resources to collecting costly and highly detailed datasets, often with a narrow scope of use.

As a geoscientist, gaining access to such information can provide an opportunity to re-assess the understanding of mine-scale structural geology and mineral systems, with significant implications for assessing geotechnical risk. This session showcases some novel avenues of investigation relevant to the often-neglected overlap between structural geologists and geotechnical engineers.



Anthony Reed
Senior Structural Project Geologist, Mining One


30 min


Video transcript

(bright electronic music)

Hi everyone.

I am Anthony Reed, a Structural Geologist

working with Mining One,

and today I’m presenting analysis of high resolution

and consistent data for structural geology

and geotechnical studies.

I have to put the standard disclaimer in there

not to make any critical business decisions

based on this presentation alone.

But do reach out to me if you’d like

a similar treatment of your data.

So a little bit about Mining One,

we’re one of the largest mining consultancies in the world

at the moment, and we service, primarily,

mining companies of course,

but actually provide services to a number of other

sectors that are related to mining as well.

We have offices all over the world

and we operate in all those orange countries

that you can see on the screen there.

Got most of the big players in the mining industry,

as clients already, but always happy to add some more.

And we cover everything in the mining space,

from conceptual stages through to the implementation

of underground and open cut pits

all the way through to mine closure.

And you can see from that list,

generally everything that you need to run a mine,

we’re involved with.

Honing in on the geology space,

we do cover all stages of geology from early exploration,

through geoscientific modeling,

mineral resource modeling and reserve estimation,

independent valuations for internal or external valuations.

We can design drill programs,

and we have Competent Persons qualified

under the JORC and NI 43-101 codes to sign off.

We conduct QA/QC and have independent experts

qualified under VALMIN.

In the geotechnical space,

we cover pretty much everything to do with mine stability,

so very large geotechnical team.

And while there’s a large list

that I’m not going to go through,

I’d like to focus on down the bottom,

our partnership with a software package called Cavroc,

which is essentially a front end plugin

for the FLAC3D geotechnical modeling software,

which is mostly command line based.

It consists of an Octree Mesher

which brings geotechnical data into the block model space,

much as you would with a resource modeling technique.

It’s got our IUCM,

which is a physics engine that uses that data

to model rock performance.

And then we’ve got two additional plug-ins,

StopeX and SlopeX, so one's for underground

and one's for open pit,

which are GUIs specifically designed for FLAC3D

to make the entire process far more accessible

to people on sites,

and it turns into a very agile and iterative process.

So that’s the sort of thing

that we would roll out for mines.

Now, why do I care about Cavroc?

So I’m a structural geologist and I’ve been working very

heavily with geotechnical engineers

for over a year at Mining One.

And what a surprise,

we have significant overlap in our spaces.

In the rest of the industry that I’ve been involved with,

structural geologists and geotechnical engineers

are usually kept at arm’s length,

but working together we’ve both had our eyes opened

into each other’s fields.

So the sort of data that’s particularly useful

for geotechnical modeling, that I also care about,

are discrete faults,

diffuse faults and zones of weakness, alteration zones

and halos, lithological volumes and boundaries,

which all geologists are familiar with,

and any anisotropy that’s in the rock mass, i.e., foliation.

Back to the title screen,

what do I mean by high resolution, consistent data sets?

Well, high resolution is sort of self-explanatory.

Consistent, I mean a data set that is collected

with a well understood method

that’s very well leveled and very repeatable.

So things that I would normally consider that fit that bill

in the geology space are assays from diamond drill

and RC core for exploration or resource;

hopefully that data's of good quality

if you've got an operation.

More in the geotechnical and mining space,

we have huge data sets such as blast hole assays,

which geologists don’t usually get to see,

geotechnical parameters get logged, such as RQD,

machine learning derived data, such as color or clustering

on core or rock chips.

You can have observations directly from modeled geometry

of other objects,

so things like implicitly modeled orebodies

or other geology surfaces, and of course, photogrammetry,

which I’m sure everyone is into now,

and laser scanning for things such as stope pickups.

They’re all good examples of very high quality datasets.

I’ll start with the geotechnical data

and what that looks like.

So RQD is not something that I've played with

a hell of a lot in the past, as a geologist,

doing geological domaining,

but it’s a very consistent data set that remains consistent

from operation to operation.

It’s a fairly simple calculation.
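As an aside, the simple calculation just mentioned can be sketched in a few lines of Python. This is my own illustration of the standard RQD definition (percentage of a core run made up of intact pieces at least 10 cm long); the piece lengths below are hypothetical, not from the talk.

```python
def rqd(piece_lengths_cm, run_length_cm):
    """Rock Quality Designation for one core run, as a percentage:
    the proportion of the run made up of pieces >= 10 cm long."""
    sound = sum(p for p in piece_lengths_cm if p >= 10.0)
    return 100.0 * sound / run_length_cm

# A hypothetical 150 cm run recovered as pieces of these lengths:
print(rqd([25, 12, 8, 30, 5, 40, 20], 150))  # ~84.7, i.e. good-quality rock
```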

Even when it’s logged manually,

you do still have to consider that there’s a human factor,

and occasionally you’ll get some leveling issues

with different people logging during different shifts.

But mainly the only problems with the data

are things like the first five meters of holes

sometimes have to be excluded

because you are hitting a rock with a large machine

and it doesn’t always reflect actual rock quality.

Things like drilling artifacts can be stuck in the database,

like navi-drilling or wedges that need to be taken out.

Cover sequences can swamp any deeper weaker signals

that you might want to consider.

And occasionally you might want to throw out entire

campaigns, like the exploration core drilled from surface.

It could be something to do with drilling technique

rather than logging, but you know,

sometimes you want to make sure

you have a really consistent data set,

so that has to be done sometimes.

And what are we looking for?

Well the image on the left-hand side

is a really nice planar fault.

You’ve got a lovely, bright red signal there,

and that’s something that can be digitized and wireframed,

and you’ve got a brilliant fault

that can head through to your geotechnical modeling.

And what we would do with this data set is zoom around

and rotate it until we can find all of those

very clear planar features and model them all up

in a fault network.

Once you’ve done that,

that data can then be excluded and you can re-level

the remaining data set to look for weaker signals.

Now on the right, you can see from the upper left

to the lower right of that image with all the drill holes,

some fairly strong striations in that data set.

Remember this is RQD data,

so that is showing a rock weakness

that is lighting up in a plane.

It turns out that they are, or one of them at least,

is in the same position as a logged slate horizon

in this deposit.

However, you can see there’s a number of striations

and we really only have a single slate horizon logged,

so something else is going on.

So in this case, we did some further investigation

and found just enough evidence in the rest of the logging,

looking at a whole myriad of codes

that could be related to shearing and slates.

But yes, we do indeed have a lot more slate horizons

that are heavily sheared in this deposit.

And so we’ve used the geotechnical data

and interpreted that as a lithology log

to build a whole stack of new features.

In addition to that,

we found these ramping structures that run between them,

and after a lot of discussions with the site geologists,

as well as looking at core photos,

the entire set of slates has kind of been reclassified

as a shear zone network,

where pretty much all of the things that are modeled

as slates and these ramping structures between them

are some sort of fault,

so they’re very, they’re geotechnically important.

They’re very, very thin,

but we finally have a way to visualize them and model them.

Now from a geotechnical modeling perspective,

they’re so thin that in the block model space

they don’t take up enough volume to even assign blocks

with their own sort of rock parameters.

So instead, in this case, we would have treated them

as discrete faults and fed that through

into the Cavroc plugin for numerical modeling.

Blast hole assays are an absolutely brilliant data set,

and as a geologist, I’ve only recently started asking

to look at these.

But while they usually lack a breadth of assay types,

they make up for it in the sheer amount of data

that you get.

So what we’re looking at here is iron assays

up high on the pit wall of a deposit.

Well and truly outside of your normal resource drilling.

So this is essentially a blank slate up here

until we get the blast hole assay data.

In this case, it’s been deemed of critical geotechnical

value to understand where the dikes are in this deposit.

And a lot of them haven’t been picked out from the previous

drilling or mapping efforts because they’re difficult to see

on the ground, and we don’t fully understand the geometry

from these sparse resource drill outs.

But as you can see from the blast holes,

they just shine.

Blast holes at any deposit really give you the best insight

into the shape of your deposit.

And a lot of how the mineralization style is actually working

for you there as well.

With some aid of photogrammetry,

’cause I had it in this case,

but mainly tracing those features

we were able to build that into a dike network,

that again, gets fed into your geotechnical model.

Now in the previous images we had iron highs,

but this time we’ve got an economic mineral

and you can see the dikes are taking that material out,

and so you’d be tracing the lows in this case.

And frequently the same feature can track from a signal

of being anomalously high to anomalously low,

as it tracks through an orebody.

Now from sparse resource drilling,

from one drill hole to another,

the rock can look completely different and you cannot link

those features, but in blast holes,

the drill spacing is so close and the data

is such high quality that you can see

that it’s actually the same feature tracking

all the way through.

So using this sort of data to reconcile your geology model

and your understanding of your orebody

is really quite an amazing opportunity.

CMS (cavity monitoring system) stope scans, so once we have a stope,

we often send in the CMS to laser scan that stope.

And in many cases, especially where we have overbreak,

everyone runs around

hoping that they can get more information

about why a stope overbroke.

So in this case, you can see from upper left

to lower right, there are a series of structures

that look like they’re involved in some overbreak

in this image.

You’ve got large blocks breaking back

to a very planar feature.

Now those were joints or faults that were not picked up

in the logging, not picked up in the mapping,

but clearly quite important in this area.

And so from analysis of this series of CMS scans,

we’ve identified in the yellow ovals

areas that we deemed risky,

even though they’re adjacent to a fallen stope anyway,

so they would have been thought of as risky,

but we’ve decided there’s a mechanism there.

Lo and behold, two months later,

we’ve got the rest of the data,

and yes, those zones were of risk,

they overbroke in the same way,

and there’s a whole lot more data to suggest that those

faults were involved in that area as well.

So, you know, the structures that are important

to the stope scale, are best visualized in the stope itself.

So it’s worth your geologists getting in there,

looking at the scans of the stopes

after they’ve actually been mined out.

Often, this is not an accessible dataset to a geology department.

So mapping on high resolution photogrammetry,

everyone’s on the photogrammetry bandwagon,

but I’m going to look at it again anyway,

just to show you what we do with it.

So what we’ve got is a couple of really obvious structures

sitting in this image.

You know, we’ve got one on the left-hand side

that’s colored by our dip direction on the right-hand side,

it’s just the photographic image.

And there’s a number of ways to model such a feature

for use in a geotechnical model.

So for instance,

we can see features where blocks

have fallen out of a pit wall

and literally just trace it up multiple benches.

And we can do that with a series of structural disks,

if we can really pick up the orientation well.

And if we can’t pick up the orientation,

then we can use striations,

so things like polylines or points to tie it together,

and often there’s a combination of the two.

Part of the problem that I have

with some of the more automated data sets that you can get

from Sirovision or ADAM Technology is that the photogrammetry

meshes themselves do have some inaccuracies.

And so if you’re trying to perform structural geology

on this data, you can’t always rely on an automated pickup

with flat faces.

You do need to have your geological brain on.

Does this make sense?

Is it a planar feature?

Does it bend while you’re doing the interpretation?

And from all that interp, we can make a structural network,

as you can see on the right-hand side there.

So just a little bit of a case study of what to do

with some really complex structural environments.

So in this one, we had some failures in a pit,

and it was deemed very

important to get a structural geologist out there,

to do some mapping and find out what we could.

On only 12 of the 40 days he was on site

was he able to even access the pit.

There were logistical issues with even getting in there.

The maximum approximate density that you can collect

when you’re walking around the pit is about one measurement

every 20 meters, realistically,

just in terms of time constraints.

There are heaps of inaccessible areas that we really needed,

of course, the ones close to various failures

are the ones that we didn’t get data for.

The positional accuracy of the GPS wasn’t particularly good,

and the relevance of individual measurements is questionable

because you really can’t tell a global context

when you’re walking out.

So in order to augment this study,

we decided to do a drone survey

and we have covered the entire pit in this,

but this is one little example of one little corner.

We set some ground control points, used just a consumer drone,

and flew a simple inclined grid with a back-and-forth

flight pattern.

And we processed fit for purpose, high detail products

within a day using Agisoft Metashape.

And this is what they look like.

So one of the joys of doing it all in house

is that we can tweak the final products

to be exactly what we need to pick up the features

that we want.

And if it’s not quite right,

we can dial up that detail and reprocess it again.

Now, if you're dealing with an open and shut

request to provide photogrammetry data

from someone that’s not involved in the structural or

geotechnical model, often the product that is returned

is not always appropriate to do that level of mapping.

The outputs, we configure them to be compatible

with all the major stakeholders on the site as well,

so everyone was able to have a look at this data.

What we're seeing on the screen in this particular case

is a 30-meter-high slip along bedding.

There’s no physical access to this location.

There’s rubble, there’s overhangs,

and even if you were able to walk up to it on foot,

you’re unlikely to get a representative measurement.

As you can see, that slip surface is bending

quite considerably, even just this plane of view.

So instead we took the photogrammetry output into a package

called Cloud Compare and in a combination of picking faces

in Cloud Compare and manual digitizing in Leapfrog,

we’ve got a decent dataset together of bedding and joints,

all digitized on the computer screen.

Sighting down measurements for Leapfrog digitization

is often a little bit better when you've got,

you know, rough edges but can still see a good feature.

And the context is decided during the digitizing process.

So I knew I wanted to make a structural model,

so I digitized with that context in mind,

rather than just ending up

with a cloud of unlabeled measurements.

It took only about two hours to do this entire area.

So a similar number of measurements

to our structural geologist friend

that was there for, that was in the pit for 12 days.

It’s not automatic or replacing a skilled geologist,

it requires a skilled geologists to do this work.

We haven’t quite managed to automate that out of it yet.

We picked up all the bedding and the joints of any visible

outcrop, and it’s easy to refine targets

for walk-up in the pit, if we’re not sure about something.

More to the story, three years prior,

there were some pseudo bedding models created.

And it certainly looks like from the old measurements

that there’s a scoop geometry that aligns pretty well

with where the failure was in the pit.

So the thought was, well we think we understand

that there is a failure risk along bedding.

Can we highlight that risk in great detail

on the photogrammetry?

Often in a pit study, I’ll be asked,

can I please have a dip and dip direction

of the features that we’re interested in?

And as you can see from that image there,

the feature that we’re interested in,

doesn’t have a single dip and dip direction,

it meanders all over the place,

so the answer would be, no,

we need to find some way a little bit cleverer to do that.

And hopefully we did.

We interpolated the unit vectors from the raw bedding data

that was taken from the photogrammetry and the previous

studies and mapped them directly

onto the photogrammetry data set.

So we’ve found the angular difference calculated

between the rock face from the photogrammetry

and the expected bedding,

and there are a couple of little issues:

we need trig functions in Leapfrog

to make this a bit more accurate,

and if we could get access to the

form interpolant algorithm

to directly map values onto points,

that would make this a lot more accurate as well.

But if you’re close to data,

this method works pretty well.
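The angular comparison just described can be sketched as follows. This is my own illustration, not the production workflow: a single fixed bedding pole stands in for the interpolated vector field, and all vectors and names are hypothetical.

```python
import numpy as np

def dip_ddir_to_pole(dip_deg, ddir_deg):
    """Unit pole (downward normal) of a plane from dip / dip direction,
    in a frame of x = east, y = north, z = up."""
    dip, ddir = np.radians(dip_deg), np.radians(ddir_deg)
    return np.array([np.sin(dip) * np.sin(ddir),
                     np.sin(dip) * np.cos(ddir),
                     -np.cos(dip)])

def angle_to_bedding(face_normal, bedding_pole):
    """Angle in degrees between a photogrammetry face normal and the
    locally expected bedding pole. Small angle = face close to bedding."""
    n = face_normal / np.linalg.norm(face_normal)
    # Poles are axial: a normal and its negation describe the same plane.
    c = abs(np.dot(n, bedding_pole))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

bedding = dip_ddir_to_pole(45.0, 90.0)  # expected bedding: 45 deg to the east
face = dip_ddir_to_pole(47.0, 92.0)     # a nearby photogrammetry face
print(angle_to_bedding(face, bedding))  # ~2.4 degrees: would plot bright yellow
```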

And this is the output.

So what we’re looking at here is that same survey area,

where we have very bright yellow,

is very close to aligned to bedding.

So faces in the rock, as they existed during

the photogrammetry survey, that match the bedding orientation,

grading down to red pixels that are five degrees away from that.

So the brighter, the color,

the closer to the bedding we were.

And you can see from that image,

that in the zone of the failure, towards the left-hand side,

there’s a huge area of yellow, and over to the right,

where the bedding veers away from the exposed

bench angles in the pit plan,

you get little planes of failed rock,

but none of them are really huge.

So from that data, we turned a couple of hundred

measurements manually into possibly millions

and picked up everywhere in the rock face

that looks to be relevant to bedding failure.

And from that we can evaluate, you know,

risk at any given location

based on what we can see in the pit walls.

But I think it’s a good case for having

some augmented reality as well,

where you could walk around the pit and see

what’s in front of you.

What is bedding?

What is a joint?

Is it close to your expected angle,

or is it something completely different?

Okay, onwards to our implicit vein

model structural analysis.

So we’re looking at the analysis

of an implicitly modeled object.

In this case, it’s a high grade ore lens.

It’s a fairly planar feature,

but there are a number of structures that interact with it.

We know from looking at the core and underground,

that there are structures in a number of orientations

that are important, but their persistence,

the significance of them and how they interact,

and yeah, it’s a little bit difficult to work out.

I call this a consistent data set

because we’ve got fairly good drill hole coverage

for this object, about 12.5 meter centers

across the entire thing, along a strike of 2.5 kilometers,

and down dip of about one kilometer.

And it’s modeled from assays.

So even though there’s some geology interp,

it’s still a fairly consistent shape,

and every contact is very similar to the next contact along

in terms of how it’s defined.

So is there a way to get a bit more value

out of seeing those perturbations in that shape

than just looking at it?

And the answer is yes.

Again, we’ve taken the vertices from that object

into Cloud Compare

and looked at the different dip direction

and the constituent unit vectors that are orthogonal

to that mesh.

And in the top left-hand side,

you can see our dZ unit vector,

which is actually fairly similar to dX and dip

in this instance,

because it’s a north south striking orebody.

And we can see those horizontal features that were visible

are now far more continuous.

And we can also see some features that are ramping up

towards the left as well, that are fairly clear.

Beneath that we’re coloring with dY,

which is analogous to aspect in this case,

a different value, but a similar pattern.

And so some features that are running up and down

the length of this are seen quite prominently

in the purple and the green bands next to the purple.
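A minimal sketch of the per-vertex attributes being discussed: decompose a mesh normal into its dX/dY/dZ unit-vector components and a dip / dip direction. The normal below is a hypothetical stand-in for one exported from the ore-lens mesh, and the frame convention is an assumption.

```python
import numpy as np

def normal_to_dip_ddir(n):
    """Dip and dip direction (degrees) of a plane from its normal,
    assuming a frame of x = east, y = north, z = up."""
    n = n / np.linalg.norm(n)
    if n[2] < 0:
        n = -n                          # always use the upward-pointing normal
    dip = np.degrees(np.arccos(n[2]))   # 0 = flat, 90 = vertical
    ddir = np.degrees(np.arctan2(n[0], n[1])) % 360.0
    return dip, ddir

n = np.array([0.1, 0.7, 0.7])           # one hypothetical vertex normal
n = n / np.linalg.norm(n)               # dX, dY, dZ components after normalising
dip, ddir = normal_to_dip_ddir(n)
print(f"dX={n[0]:.2f} dY={n[1]:.2f} dZ={n[2]:.2f} dip={dip:.1f} ddir={ddir:.1f}")
```

Colouring every vertex of the lens mesh by one of these scalars is what makes the near-horizontal and ramping structures jump out.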

The top right-hand side, we can see thickness,

which is calculated from the Leapfrog vein algorithm.

And those horizontal features can be seen causing an extreme

thinning of the orebody in the same location.

So not as much with the ones that ramp up

towards the left-hand side,

but definitely those horizontal ones are important.

The vertical ones in that image

appear to be thickened rather than thinned as well,

so starting to understand what the structures are doing

and what their angles are,

how they interact with the orebody

and why they’re important.

In the lower right-hand side, they’ve all just been traced.

So running around with the digitizing tool

and picking out everything we can see

from any of these datasets.

And of course there was a large amount of mucking around

with color and contrast in order to really understand

where everything was.

And from that, we generated a structural network.

Now we didn’t necessarily have the exact angles

for this study.

They can be tied into underground observations

when there’s time,

but what we did get out of this is an understanding

of where families of structures interact with the orebody,

where they interact with each other,

and what would be considered a structurally complex zone

in this orebody versus one that’s not.

And that is quite an important outcome.

So we went from an understanding

that there are structures everywhere,

and we have a risk everywhere in this orebody

to really nailing down some locations of importance.

Finally, all of this is in aid of

filling a geotechnical numeric model

so we get the best evaluation of rock quality,

rock mechanics, and risk we can in open pit or underground.

Now Leapfrog does have a part to play

in populating this block model as well,

because, especially with the Edge module,

we can populate a number of things into a block model space

that is useful in Cavroc.

So things like lithological and structural domains,

structures themselves, structural trends,

and they can be individual per domain,

or they can be global.

And we can feed the trend data into Cavroc as well.

For the geotechnical logging, RQD can be interpolated

fairly nicely in Leapfrog.

You do have to flip it around to do 100 – RQD

as the RBF algorithm likes to contour around high values.
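The flip just described is trivial but easy to get wrong in a rush, so here is a minimal illustration with hypothetical logged values (the block estimate at the end is also hypothetical):

```python
# RBF interpolants shell around HIGH values, but for RQD the weak rock
# (low RQD) is what we want surfaces around, so interpolate the complement.
logged_rqd = [95, 88, 40, 15, 72, 100]

inverse = [100 - v for v in logged_rqd]  # what actually gets interpolated
print(inverse)                            # [5, 12, 60, 85, 28, 0]

# ...run the RBF on `inverse`, evaluate onto blocks, then flip back:
estimated_inverse = 62.0                  # hypothetical block estimate
block_rqd = 100 - estimated_inverse
print(block_rqd)                          # 38.0
```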

And so I find that visualizing RQD as it’s being

interpolated is an extremely powerful thing

that a lot of people don’t do.

And you can ensure that your RQD interpolation

is geologically reasonable before it gets sent onto

a geotechnical numeric model.

Orientation interpolation, there is actually a workflow

using Edge to get downhole measurements

through into the block model space

that involves creating form interpolant surfaces,

and then using that to populate a variable

interpolation field in Edge.

But you can see from the lower right image,

the final result of that is an evaluation of angle

from the input data at every centroid in a block,

which is a particularly useful thing.

In the final geotech block model,

in this case from Leapfrog only,

I was able to populate the domain, foliation orientation,

foliation intensity, the RQD, distance functions to objects,

and distance to reliable data in the model.

And so while it’s not my place to evaluate risk,

I was able to do little calculations

like looking inside the orebody for poor quality rock

that’s highly foliated

that's sitting next to some valid data, and pointing out

areas of particular interest, geotechnically,

that are worth focusing on.
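The kind of little calculation described here can be sketched as a simple boolean query over block-model attributes. All field names, thresholds, and the random stand-in data below are hypothetical, not from the talk.

```python
import numpy as np

# Hypothetical block model with the attributes mentioned: orebody flag,
# RQD, foliation intensity, and a distance-to-reliable-data function.
rng = np.random.default_rng(0)
n = 1000
blocks = {
    "in_orebody":          rng.random(n) < 0.3,
    "rqd":                 rng.uniform(0, 100, n),
    "foliation_intensity": rng.uniform(0, 1, n),
    "dist_to_data_m":      rng.uniform(0, 200, n),
}

# Flag blocks inside the orebody with poor-quality, highly foliated rock
# sitting close enough to valid data to be trusted.
flag = (blocks["in_orebody"]
        & (blocks["rqd"] < 40)
        & (blocks["foliation_intensity"] > 0.7)
        & (blocks["dist_to_data_m"] < 25))

print(f"{flag.sum()} of {n} blocks flagged for geotechnical review")
```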

And that block model can just be delivered directly

to our numeric models.

So some final thoughts,

there’s a lot of opportunities for geology

in the geotechnical space.

Geotechnical engineers often have access to a lot more

really high quality data than a geology team

would normally see.

And so it's worth the two teams working together.

The critical mass of data to do this level of interpretation

in the structural space,

and in the orebody knowledge space,

usually arrives when a mine is more mature,

rather than when you've got your exploration department going

great guns on your deposit.

Interpretation and reconciliation of the geology model

using these massive data sets at a mine

is not commonly done from what I can tell.

Most of the data is restricted to very narrow use cases,

such as the blast hole databases being for mining block

models, rather than being seen as an opportunity

to really understand your orebody again.

And better geology always feeds into better

geotechnical modeling, both the understanding and the products.

So it’s really worth having this partnership

between geologists and geotechnical engineers.

Sadly, geotechnical numeric modeling is often the only good

justification for getting this work done.

They tend to be the ones that have the budgets and the need,

and the link directly to production,

rather than a structural geologist, such as myself,

who would often only be asked to do this

as part of a more academic study.

So hopefully what I’ve shown is that

it actually does provide a level of rigor to geotechnical

numeric modeling to get some really high quality structural

geology in there,

and especially from these datasets that geotechnical

engineers get to use every day.

So thank you very much for your time.

(bright electronic music)