A step-by-step webinar to go through analysing EM data in Oasis montaj and 1D and 2.5D TDEM VOXI modelling of conductive anomalies.
Recent improvements in geophysical algorithms can pull more information out of new and historical electromagnetic datasets. Oasis montaj is a geoscience platform for processing, modelling, and sharing geophysical data with other stakeholders. In Oasis montaj 9.10, improvements to the VOXI TDEM workflow allow for ground and airborne electromagnetic surveys to be inverted into conductivity or resistivity models of the earth. This workflow shows how to use the EM Utilities extension to set up the data for modelling, calculate and remove EM array noise, calculate the Tau values and analyse EM arrays. The data is then brought into VOXI for geophysical modelling using the benefits of the simple VOXI workflow for setting up data space parameters. This data is shared with the geological team through new upload and notification tools in Seequent Central.
Project Geophysicist – Seequent
<v Mark>Hello, everyone.</v>
Welcome to the webinar today.
Let’s take a moment to allow everyone to log on
and get connected.
I'll just welcome everyone to the webinar today.
So, today we’re going to be using Oasis montaj
and looking at some EM data.
My name’s Mark Lowe, I’m a project geophysicist
in the Perth office for Seequent.
And I’m just going through the plan for this webinar.
Just before we do some housekeeping,
there is an option to add some comments
and chats in the link that’s there provided on the screen.
Definitely encourage people to use that
to type your questions in and send them through
and I'll endeavor to get back to you
soon after the webinar.
So today we’re going to be looking at a dataset.
It’s an airborne EM data set,
regional data set in Queensland.
We’re going to go through setting up the TDEM database,
looking at the EM utilities extension,
noise calculations, tau calculations,
viewing coincidence arrays and error envelopes,
and making some changes to those.
And also setting up
and running an unconstrained VOXI inversion
using the 1D TDEM EM VOXI tool.
The data for today is courtesy of Geoscience Australia.
If you were to search for the Mount Isa East VTEM survey,
that's the data set we'll be looking at today.
Welcome to this webinar on the EM utilities extension,
and also VOXI TDEM inversion within Oasis montaj.
My name’s Mark Lowe,
I’m going to be presenting today,
looking at an airborne EM survey in Queensland Australia.
And we’re looking at modeling a conductive anomaly there.
Using the EM utilities extension
for QA QC and inversion setup,
and also for looking at the time constant calculation.
And then we’ll be prepping the data set
for inversion in VOXI and running an inversion
and having a look at the results
so that we can share them
to a geological modeling platform
or viewing those in the 3D viewer.
So to start off with today,
I’m just going to create a new project
and give it a name.
And I’m going to load the EM utilities extension
using the manage menus icon
in the project Explorer page up here.
So EM Utilities is a menu item with a couple
of different tools for setting up a database
for a time domain EM system.
So usually an airborne system,
but it could also be a ground-based system.
So setting up the time windows
and also doing noise calculations
and time constant calculations as well.
We’re going to bring in our database.
And this database I’ve gotten straight
from Geoscience Australia website courtesy of them.
It is a dB/dt data set near Mount Isa.
And I’ve already selected a number of lines
that I’m interested in,
which are centered over a target,
which is actually on line 1940.
It's a stratigraphic unit
that's fairly conductive
and has some orientation information I'd like to model.
So I’m just going to firstly create a subset database
based on this database that they’ve provided.
And then we’ll just get it ready to invert.
The original database is quite large.
I think around three and a half gigabytes,
I’ll just make that small.
And we brought in the database now.
Oh, here it is.
Sorry, there we are.
And I’ve got all my lines selected.
And what I’d like to do, firstly,
is just use the coordinate systems tool.
So right click on my easting coordinate system.
And I’ll just set up the coordinate system.
So VOXI requires projected coordinates
for running the model.
So everything is run in a projected coordinate system.
So in this case, I was going to convert it,
but it's already actually in a projection.
It's actually in GDA94 MGA zone 54,
so I'm just going to assign that.
That’s one of my favorites anyway,
and bring that in.
So my X and Y has now been assigned.
There are obviously a lot of channels here.
I might just remove some of the unnecessary ones,
hiding those by pressing the space bar.
longitude and latitude I don’t need,
the height will be important.
So I might keep that channel there.
I don't need the laser and radar.
The current channel we will keep.
Now, the power line monitor is also useful
because of the location of the power lines,
so I might keep that for now.
some of these other channels that we don’t need,
and we’re just going to be inverting the vertical component
rather than the horizontal component today.
But of course, if you'd like, VOXI
and the EM Utilities tools are able
to run these same analyses
on the X component of the field as well.
I’m going to hide it for now
and I don’t need distance.
So we’ve got the vertical component here.
I’m just going to show the array
and I’m going to show it with a logarithmic stretch.
And I’ll just rescale that.
So here’s our survey data that’s been provided
and just cycle through lines here.
This section here is the anomaly
I'd like to look at a bit further.
If I bring up, well,
you can see there's some noise in areas here;
let me actually bring up the power line monitor.
There are some high voltage power lines in the area
to be careful with.
So they obviously are affecting
the measured component there as well,
and quite a way either side of the line.
So I could produce a mask channel
for the power line monitor,
to make sure that that's not included
when we're doing some of these calculations.
So I'm going to create a mask
and I’ll just do a threshold
based on around about one standard deviation
of that power line monitor.
Sorry, just greater than three.
Oh, it's telling me there's a syntax error.
Here we go.
Just a simple Boolean expression.
And we set it a little bit low;
it removes information that we want to try and model.
There we go.
So play around with the mask
and this is just so that I’m not bringing in
any extra noise when we are doing
our noise calculations in here.
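If you want to reproduce that masking step outside montaj, the logic is just a Boolean threshold test. Here's a minimal numpy sketch; the threshold value and monitor readings are illustrative, and this is not the montaj mask expression syntax:

```python
import numpy as np

def powerline_mask(plm, threshold=3.0):
    """1 where the power line monitor is quiet; NaN (a dummy) where it
    exceeds the threshold, mirroring a Boolean mask channel."""
    plm = np.asarray(plm, dtype=float)
    return np.where(plm > threshold, np.nan, 1.0)

# Illustrative monitor values: two readings near a power line trip the mask.
plm = np.array([0.2, 0.5, 4.1, 7.8, 0.3])
mask = powerline_mask(plm)
```

Multiplying a data channel by that mask knocks the noisy rows out of any later statistics, which is exactly the point of the mask channel here.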
Another thing I can notice here is that
we've got some nulls in there as well.
If we look at the array,
there are some null channels
in those early times that obviously
weren't recorded, quite close to the off time.
And again, right at the end,
it looks like it goes up to element 49,
but the last data is at element 48,
so the last window is also not collected.
And I’ve actually brought in the window database straight
from the logistics report.
So I created a window data set here,
which shows the table of windows straight
from the logistics report.
And I'm going to use this,
as it has the index of each of those windows.
I’m going to use that to set up
our array-based properties for this data,
but firstly, I’m going to have to subset this.
so that there are no dummies in there as well.
So what I’ll do is I’ll just firstly, subset the data
by going to database tools,
Array Channels, Subset Array.
I'm going to input the dB/dt array
and output the subset;
start element will be the fourth element
and the end element will be the 48th.
And now our first value there is
the first recorded value as well.
Seems a little low.
I might just do a quick check just
to make sure I’m not removing any data.
First values should be 3.3.
Right it’s the same values.
So that’s good.
So just subset it to the first values.
And so I can remove that channel now,
as I've replaced it with the dB/dt subset,
and I'm going to set up the
array-based properties by right clicking
and selecting Array Based Properties.
In here I’m going to import from our window database
and this wasn't provided as such, but
I just copied and pasted those results in
from logistics report associated with that survey.
And you can see there’s a couple of different ways
we can set up these time windows.
Start and end times, mid times and widths, for example.
And it will automatically pick up
the index offset as I've determined in here as well.
But because I’ve removed that offset,
I’m going to just bring that back to zero here.
So here we go, I’ve got my width.
They're all increasing, which is great.
That’s how it should be set up.
And it's continuous all the way through;
start and end times are all in there as well.
And there’s a few options for base as well,
but we've just brought in discrete time windows for each.
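Since the windows can be specified several ways, it's worth noting that given start and end times, the mid times and widths follow directly. A quick sketch with made-up window times (not the survey's actual windows):

```python
import numpy as np

def window_mid_and_width(starts, ends):
    """Derive window mid times and widths from start/end times (ms)."""
    starts = np.asarray(starts, dtype=float)
    ends = np.asarray(ends, dtype=float)
    mids = 0.5 * (starts + ends)
    widths = ends - starts
    return mids, widths

# Illustrative, roughly logarithmically spaced windows in milliseconds.
starts = np.array([0.03, 0.05, 0.08, 0.12, 0.18])
ends   = np.array([0.05, 0.08, 0.12, 0.18, 0.27])
mids, widths = window_mid_and_width(starts, ends)
```

For a sensible TDEM window scheme the widths should come out steadily increasing, which is the same sanity check being made on screen.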
So if I just plot that again,
what we can do here is
we can also show selected array elements
rather than all of the array elements.
So this can be useful for looking at early times,
late times, every fifth channel, for example,
just for now, I’m going to show some late time channels,
the last 10 channels.
And you can just see it's another way
of sort of observing that data in there.
Ah, I've lost the logarithmic stretch.
I’ll apply that again
and just rescale.
So there’s that feature there, I was trying to model.
So, EM Utilities.
So we've set up the data.
Now we can go ahead and calculate noise on this data set.
So usually what you want to do is
look for an area which is fairly resistive
to calculate noise on
as it will be a background response for the survey area.
So if I just go ahead and show
all of those array elements again.
Here it is
and I’ll just do the logarithmic stretch.
So we're looking for areas which are fairly continuous,
a flat, pretty average sort of background response.
So in here, for example,
there's a bit of a late-time anomaly
running all the way through in there,
but let's try another line.
Whereas, for example, in here
it’s pretty flat overall,
it does bounce around a little,
but there's no serious noise in there.
It's away from those spikes
that are obviously the power lines,
and we can use that to calculate our noise.
So we're going to go to Corrections,
Calculate Noise, and input that dB/dt.
we could select the mask.
I won’t worry for now.
And I can select those marked rows as well.
And you can see I could calculate
based on those marked rows,
I could calculate based on the displayed line,
which will bring in the entire line,
selected lines, all lines for example.
For now I want to do it on the marked rows.
You can see the values get quite low
and bounce around zero.
In this case, I'll estimate the noise from that.
Some of those mid-ish to late times
were bouncing around about
0.01 picovolts per amp metre to the fourth,
so I'll set that as a noise floor.
That's on from about channel 20 onwards.
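The idea behind that step can be sketched very simply: take a per-window statistic over the quiet marked rows, then clamp it to a floor so the late-time estimate never falls below a sensible level. This is only an illustration with synthetic numbers, not the montaj noise calculation itself:

```python
import numpy as np

def estimate_noise(data, floor=0.01):
    """Per-window noise as the standard deviation over quiet background
    rows, with a floor applied so late times never drop below it.
    data: (n_rows, n_windows) array from a resistive background area."""
    std = np.nanstd(data, axis=0)
    return np.maximum(std, floor)

# Illustrative quiet-line data: early windows noisy, late window near zero.
rng = np.random.default_rng(0)
quiet = rng.normal(0.0, [0.5, 0.1, 0.001], size=(200, 3))
noise = estimate_noise(quiet)
```

Here the last window's scatter is well below the floor, so its estimate gets clamped to 0.01, which mirrors setting the noise floor from channel 20 onwards.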
Press okay. And talking about units,
that's something we have to set as well.
I'm going to edit this channel
to make sure that the units are applied in here.
So I’m going to grab these directly
from the logistics report actually.
And you can see the SFZ channel, there it is.
I'll use the selection tool; those are the units there
which we want to use,
and copy those to my data.
Now we’re going to create a denoised data set.
So reject noise,
the output is going to be our DB DT denoise,
lines will be all lines selected lines is the same.
And you can see, I can,
I could also put a global multiplier in there as well,
sort of a factor that we could apply.
As we look at the results after we apply this noise,
we don’t want to be getting rid of real data.
We don’t want it to be underestimating the noise as well.
For now, I’m quite happy with that.
I’ve selected that range to do it on.
And there's the benefit of having a preview of it as well:
a total of 145,000 spikes removed from the data.
So overall, that's not heaps in this set of lines.
So we’ve got about 45 data points per row,
and there’s tens of thousands of rows per section of lines.
So it’s not, not a huge amount.
And we’ll see if we can see some of the effects on there.
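The reject step itself amounts to comparing each sample against its window's noise estimate times the global multiplier, dummying what falls below, and counting what was removed. A hedged sketch with illustrative numbers:

```python
import numpy as np

def reject_noise(data, noise, multiplier=1.0):
    """Replace samples whose magnitude falls below the per-window noise
    (scaled by a global multiplier) with dummies; count rejections."""
    data = np.asarray(data, dtype=float)
    below = np.abs(data) < multiplier * np.asarray(noise)
    return np.where(below, np.nan, data), int(below.sum())

# Two rows, three windows; a per-window noise of 0.01 everywhere.
data = np.array([[0.5, 0.02, 0.004],
                 [0.3, 0.009, 0.05]])
noise = np.array([0.01, 0.01, 0.01])
denoised, n_rejected = reject_noise(data, noise)
```

Raising the multiplier rejects more aggressively, which is exactly the trade-off mentioned above: you don't want to throw away real data, but you don't want to underestimate the noise either.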
What I might do is,
I'll just show the denoised channel as well,
which looks pretty similar to the above.
And if I just bring that down a little bit,
make this view a little larger.
We might find it difficult to see much change in there.
I’ll go back to that line of sort of interest in there.
So what’s another way that we could observe this data
rather than these array profiles.
So in EM utilities there’s another tool we can use,
if I just select an area of interest,
which I’m going to do it near sort
of a gradient changing area,
sort of in here,
and see if there are any effects on any of these array elements.
If you go to EM utilities, interpretation, view,
this is a really powerful tool
because it allows you to show the onboard noise
that you’ve just calculated
and also show comparison to your de-noise channel.
And if I just change the scaling,
and plot that noise floor as well,
we can see the effect on here,
and also move this around
to an area of interest.
So I'm going to get back to that bit
I was interested in,
sort of close to that anomaly there.
I just want to see if there’s any, ah, there we go.
There’s a green value in there.
So this was changed from 0.102 to 0.115.
So it was obviously slightly noisy,
slightly outside of that standard deviation,
or that noise floor; in this case
I think I had 0.01.
And so it has refitted that point slightly.
So those are the sorts of things,
we can test and see if it’s doing a good job.
Oh, there was another one.
So in here we’ve got a bit of noise
and it’s fitted that point as well.
So if I scale these... let's fix that scale.
Of course, scale to all channels,
put a fixed range on this so that it stays consistent.
Lots of tools we can use in this view of coincidence arrays.
And of course it’s all data tracked as well.
We can see that location.
If we have a map view,
we can see where that,
that represents that on a map as well.
So I’m going to use that again later on.
For now we're going to do a tau calculation.
So just go down to EM Utilities, Interpretation,
and select our denoised calculated data
or our dB/dt, either one,
and set the lines we want to run it on,
so maybe just the selected line.
And it's going to create a fit tau,
which will be the maximum tau
from those windows that we set.
In this case it can be a five-point moving average
down the array,
the minimum and maximum values obviously set
from the base frequency.
I think this was the 25 Hertz.
So 20 milliseconds sounds about right.
And we can set those parameters
to use particular channels or not.
It does get noisy towards the end.
I could switch those off as well.
There’s quite a lot of settings
you could apply in here as well.
So let’s go and press okay for now
and calculate; I just ran it on one line.
While that's running...
So, time constants, or tau values,
obviously can be used
as an interpretation tool in themselves.
That maximum tau value could be gridded up,
or you could show it on a profile and look for peaks.
It's definitely another tool there
that can be used to see
the characteristics of conductors.
I can show that value
and obviously there's a bit of noise in there,
but we can see that there is a peak
with a particular peak value as well,
4.9, for example, milliseconds.
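For intuition, tau comes from fitting an exponential decay d(t) = A·exp(-t/τ), which is a straight line in log space, so a short moving log-linear fit down the windows and keeping the largest τ is a reasonable sketch of the idea. This is a simplification, not the montaj algorithm; the cap at 20 ms follows the 25 Hz base frequency half-period mentioned above:

```python
import numpy as np

def max_tau(times_ms, decay, npts=5, tau_max=20.0):
    """Moving 5-point log-linear fit of an exponential decay; returns the
    largest time constant (ms) found, capped at the base-frequency limit."""
    t = np.asarray(times_ms, dtype=float)
    d = np.asarray(decay, dtype=float)
    best = 0.0
    for i in range(len(t) - npts + 1):
        ts, ds = t[i:i + npts], d[i:i + npts]
        if np.any(ds <= 0):
            continue  # a log fit needs positive amplitudes
        slope, _ = np.polyfit(ts, np.log(ds), 1)
        if slope < 0:
            best = max(best, -1.0 / slope)
    return min(best, tau_max)

# Synthetic decay with a 5 ms time constant over illustrative window times.
t = np.linspace(0.1, 15.0, 20)
d = 100.0 * np.exp(-t / 5.0)
tau = max_tau(t, d)
```

On clean data this recovers the true time constant; on real windows the moving fit plus the cap is what keeps noisy late channels from blowing the estimate out.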
All right, so those are some of the tools
in EM Utilities. Apart from that,
there is VOXI TDEM.
So let's go ahead; we'll do a model
of some of these lines as well
and do an inversion.
I’ll just create a map showing these line locations,
map new, scan the data,
next, I’ll just call it a base map
and just to show that line path as well,
don’t need compass direction.
Okay, here we go.
It's got the line path on there.
And obviously we could show the power line monitor;
it would show where the power lines are.
We could grid the tau channel, for example;
lots of other things we could do to try
and highlight that anomaly.
I've already created a polygon,
which was from Create Polygon File.
I’m going to draw that polygon.
The reason why I created this
was so that it makes it quite easy
to set up the model in VOXI as well,
over an area of interest.
So I actually had an area here
that I’d like to model further.
It's centered over that
conductive body I showed before
in the late time windows.
So the next thing: you need three main things
for a VOXI inversion.
That's the polygon; the database or a grid,
in this case a database for the EM data,
having all of that array data;
And you’re also going to need the topography.
So I’ve just grabbed some topography data
from the Seeker data services tool:
the SRTM 30 metre topography,
which is fine for the resolution of this area.
This is a regional exploration survey,
two kilometres between each of these survey lines,
So the SRTM topography will be fine.
Obviously I can display that on here.
There it is,
and I'll just put a topography stretch on there too.
So that was just covering
that area of interest there as well.
And there’s my topography and my polygon.
So I’m going to go ahead
and create a new VOXI from polygon.
I'll call it a VTEM 1D inversion.
The polygon I had was the area of interest,
and the topography is the SRTM.
So it’s just going to discretize that up
into a sort of a general resolution
based on the size of the area.
Make sure I select 1D TEM
and I’ll boost up the resolution
just to make sure it’s below the
250 by 250 subscription size cutoff as well.
And press okay.
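The cell count being checked there is just the area extents divided by the horizontal cell sizes. A quick sketch of the 250 by 250 check, with an illustrative area extent rather than this survey's actual dimensions:

```python
import math

def horizontal_cells(extent_x_m, extent_y_m, cell_x_m, cell_y_m, limit=250):
    """Number of horizontal mesh cells, and whether that fits under the
    250 x 250 subscription size cutoff mentioned in the talk."""
    nx = math.ceil(extent_x_m / cell_x_m)
    ny = math.ceil(extent_y_m / cell_y_m)
    return nx, ny, (nx <= limit and ny <= limit)

# Illustrative area: 10 km along line at 50 m cells, 12 km across at 300 m.
nx, ny, ok = horizontal_cells(10_000, 12_000, 50, 300)
```

That's why boosting along-line resolution usually pairs with coarser across-line cells: the product of counts has to stay under the cutoff.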
So what this will do:
it's going to create a mesh
based on those dimensions, input the topography,
and do a Cartesian cut to the top surface of that
with the top cells.
And then the next thing,
apart from trimming it to the area,
is it's going to ask to add data to it.
So this is where we can go through three simple steps
to bring in that EM data.
So I’m going to go, yes.
And we’ll go ahead and bring in the data.
Okay, so it's noticed that I haven't saved
since I've made some changes,
so I just want to make sure that
those changes are all saved.
So yes, and it's going to automatically
bring in the easting and northing channels.
And with the elevation,
you have a couple of options,
either the elevation itself or a terrain clearance,
which we have the EM loop heights.
I’m going to use that as terrain clearance here.
The optimized sampling we want to make sure
is set, to speed up our calculations here,
'cause obviously the along-line distance
at the base frequency is going
to be quite close between each sample point,
maybe 10 or so metres.
And I’m going to be using more around
a 50 to a 100 meter cell size here.
So I want to optimize that to speed up that inversion.
One sample per cell is quite enough.
It's a conductivity model.
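One sample per cell is easy to picture: bin the along-line stations (here roughly every 10 m) into cells of the mesh size and keep one reading per bin. A hedged numpy sketch of that idea, not the montaj resampler:

```python
import numpy as np

def one_sample_per_cell(distance_m, cell_m=50.0):
    """Indices of the first station falling in each along-line cell,
    i.e. roughly one sample per mesh cell."""
    bins = (np.asarray(distance_m, dtype=float) // cell_m).astype(int)
    _, keep = np.unique(bins, return_index=True)
    return keep

# Stations every 10 m over 500 m: 51 readings reduce to 11 samples.
dist = np.arange(0.0, 501.0, 10.0)
keep = one_sample_per_cell(dist)
```

A roughly five-fold reduction in data points is where much of the inversion speed-up comes from.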
The type of data: it is a VTEM system.
I had to change the configuration slightly for this one.
It is a long pulse,
and I’ll just put a little subscript GA on there.
The way I did that is I just set up
a new configuration in here.
I wrote it in and put in the parameters
for transmitter and receiver
and the wave form and everything in there.
The channel is my denoised channel;
you can see there are options for the X component.
I’m just going to do the vertical for now,
and I’ve actually calculated the noise.
So I’m going to use that here.
So that calculated noise is actually attached
to that channel now,
and we can load that from the EM channel.
And again, it's going to ask for that same error multiplier,
which I set as one before, so I'm happy with that.
I'm going to stay with no error multiplier applied.
Press next; some parameters around what it's going to import,
which all look good: start time, end time,
et cetera. Finish that.
And it’s going to go ahead
and bring in that data set,
sub-sample it to the mesh size
and create a new subset database
with the error information,
as well as the data information.
So it's created a new database, discretized on
the dimensions that we've input here.
Would you like to add the waveform? Yes.
So now we want to go ahead
and add the wave form information,
and that’s going to come from a database too.
So Geoscience Australia actually provide
a wave form with this data set on their website,
which was fantastic.
And it was in a Geosoft GDB format as well,
so I just sort of grabbed it.
It was supplied as a separate database.
There it is, waveforms. Press okay.
And that was straight from the website actually.
And you can see that it’s brought in the time
and current channels from there as well,
which is, which is fantastic.
You can obviously make this waveform
from a logistics report
or from the contractor as well.
And you can see, once that's imported, it comes
into this new window,
which shows the current build-up
and the off time as well,
and the length of that,
and the window channels all brought in here as well,
as well as that error database
that we set up with EM utilities.
And if we just press the Calculate here,
it's going to zoom to that off-time cutoff
and then offset based on the start time
for those first windows
and the continuing windows after there.
And you can see where each of those windows are.
And for example, that last window wasn’t being measured.
So that’s fine.
So I’m going to press okay.
So I brought in a waveform.
I’ve set my channel windows.
This dataset could be ready to invert already,
but I’m just going to set up my mesh a little differently.
And this is actually going to then go back
to the original database and resample it
based on my new mesh parameters.
So I'll right click and Modify. Because
I want to try and retain as much
along-line resolution as possible,
I'm going to increase this to 50 metres.
And perpendicular to the line direction,
there's not much information, of course, off line,
so I could coarsen that to a larger cell size,
maybe around 300 metres across those lines.
And four metres maximum resolution
in the vertical direction
is quite good as well.
And you could set advanced properties,
if you’d like to as well here.
I’ll go ahead and press okay.
So update my mesh, it’ll create a new database,
re-sample that to that new database.
It does a lot of things in the back end here
that you don't have to worry about,
but you can look further into these,
and I'll show you how to do that too,
'cause that can be quite worthwhile.
So there we go.
It's resampled to that larger mesh.
We could right click on the data source
and press view.
And if you do this,
it'll bring up that new sub-sampled database.
And the reason why I’d like to view this,
is to make sure that the data I’m inverting
actually has the anomaly information in it,
that we want to try and target here.
So here’s the de-noise channel I’ve imported.
If I show that just the late times.
Yeah, there it is.
That's great. And then if I show my error,
so this should only show a value of 0.01
in those late times anyway.
I can see where that cut off is going to be.
And obviously it’s going to cut off most of this noise here.
If it’s below the noise threshold,
there's a little bit of geology over here
it's going to target,
and this anomaly is definitely going
to be modeled in the late times.
So that’s fantastic.
That’s what I wanted to confirm there.
So that can be helpful just to check as well.
We're going to go ahead and run this model.
There are constraints we could set up.
I don’t have much information at the moment
about what I expect to see
if I had a background conductivity,
I could set that in here.
I could set a voxel if I'd already run an inversion
and would like to constrain it further
based on that; or, rather,
start off the model from those parameters
to speed up the inversion.
If I had a value of the background conductivity,
for example, if it was around 20 millisiemens per metre,
I could have set that as a constant value here as well.
For now I’m just going to run with the default,
which is going to start from a zero background.
And from there, I just go to run the inversion.
So this will set it up.
It will send it to the Microsoft Azure cloud.
It’ll break it all down into a binary format,
which is, has no coordinate information in it.
It’s very secure.
It’s obviously running on Microsoft Azure platform
and apart from the security aspects,
it's the speed that really is a lot of the benefit here.
So we can run this inversion.
It should take around half an hour.
So I might just switch off for a moment
and I’ll save you waiting around
and, we’ll come back, I’ll come back on
when that inversion is complete,
but you’ll see that this is going
to speed up that process,
and then produce a model here
that we can have a little bit
of a further investigation of.
All right, I just let that run.
Actually, it was not as long as I thought:
it took about 26 minutes to run that inversion
and we’ve got our results in here.
So we’ll have a closer look at those
let’s turn off the mesh.
So, it's done.
So we've got a conductivity model here.
We’ve got a predicted response.
We can look at it as well
and see how it fits to our observed data.
We could obviously clip through
the result we got here as well.
See if we can observe the anomaly in there,
which we can.
It's not bringing in as much in the deeper areas
as I thought, but it's okay.
We just needed a little bit more
fine tuning of the setup.
And that’s just the first pass.
So why don’t we have a look at the predicted response?
I think this is really worthwhile to start off with,
and as we know, inversion
is an iterative process.
This was just one go;
we can run it again and try and fine-tune that.
Let's see if we can tighten up that result as well.
So you can see, when I
click to look at the predicted response,
it’s going to show the original observed data.
So for example, I can show those late time elements again.
I can cycle through to those lines of interest,
and I can also show the final
or various iterations as well.
And if I just show that with the same scale,
we can see how well it fits.
Now, it can still be a bit tricky in there
to observe what it looks like for each row.
So we can still use the EM Utilities
interpretation view, the coincidence arrays, here as well.
And if I do that, then I can show my observed data.
I could show my error channel,
and I’ll just show the display
the second channel as the error
and show my final inversion and see how they compare.
And if I just moved down to where that anomaly is,
again we can see how well it fits.
And we could probably tell more;
we can see that there is still a bit
that's not fitting so well here.
It might be that I can change those model parameters,
or I could check where those nodes are sitting
in comparison to the mesh size as well.
Those could be some mesh parameters
we can do a bit of a fix-up on.
We can cycle through to the other line as well,
where we can see that late-time anomaly there,
see how well it fits.
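A common way to quantify "how well it fits" is the error-normalized residual per window: observed minus predicted, divided by the assigned error. Values near or below one mean the fit is within the error envelope. A small illustrative sketch (the numbers are made up, not this survey's):

```python
import numpy as np

def normalized_residual(observed, predicted, error):
    """Residual in units of the data error: values near or below 1 mean
    the inversion fits that window within its assigned uncertainty."""
    return np.abs(np.asarray(observed) - np.asarray(predicted)) \
        / np.asarray(error)

# Illustrative late-time windows: the middle one is poorly fit.
obs  = np.array([0.110, 0.080, 0.045])
pred = np.array([0.105, 0.050, 0.046])
err  = np.array([0.010, 0.010, 0.010])
r = normalized_residual(obs, pred, err)
```

A window sitting at three error units, like the middle one here, is the kind of misfit that would prompt revisiting the noise estimate or the mesh parameters.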
We're focusing on some of these late channels.
Not too bad; the early times weren't fitting so well,
and a bit later on, on these edges,
it's not the best fit either.
And we can see that in the result,
it’s a bit more vertical than what we expect,
but it’s a bit of a first pass.
What we can do here is right click
and choose Display in 3D View.
And if I create and plot sections along lines,
these will produce GSF section grids,
which could then also be copied
and pasted into a Leapfrog Geo project,
with the right information, in the right location as well.
So this is the latest version of Leapfrog.
The June release, Leapfrog 2021.1.2,
does allow importing GSF crooked section grids as well.
Isosurfaces we can generate;
on this one I might leave that for now.
But you know, once we fix up
that inversion a little bit more,
doing that iterative process
and rerunning it again, it might be worthwhile,
probably would be worthwhile, producing some isosurfaces
of particular conductivity values, for example.
I might just turn off some of these layers
just to show those section grids,
turn off the voxel for now as well,
and turn off the surface.
And you can see some of those section grids
in there as well.
And obviously we could do some further modeling
and make some isosurfaces of that as well,
but that’s just a rundown on using those tools.
Of course the next thing I’d do,
I’d go back to the model
and see if we can fix this up a little bit further.
I’m not a 100% happy with how that turned out.
It’s always one of those things
when you’re doing the webinar
and it worked better the day before,
but that’s okay.
So it’s probably a matter of playing around
with those mesh parameters,
playing around with those noise parameters again.
Trying to fit that noise possibly a little bit better
and tidying up those mesh constraints
and rerunning that model again,
until it makes a bit more geological sense in there as well.
But I hope that gives you a good rundown,
of the tools and using those tools.
And another thing we can do, obviously, is
play around with some of these
model-building parameters in there.
Also, I didn't quite go through those,
but there are some settings if we want
to run that inversion again.
We could set that as a starting model, for example.
Just to quickly show that: right click, Modify,
and set that final inversion result
as our starting model.
And we could then use that
to run the model again
and try and refine this a little bit further
rather than having to start
from the starting point.
So that could be another first step as well,
even if we actually resize the model mesh,
we can still run that.
So there’s a few steps to look at there,
as I’m sure everyone knows inversion
is an iterative process.
It’s a matter of, you know,
playing around with these constraints,
trying a few different models,
but the power of these tools, you know,
really is that EM Utilities gives you
more options to view those results,
and to recalculate that noise
based on what you see as fitting
the geology and the system parameters that are there.
We also went through how to set up the system in VOXI,
ready for inversion.
And then obviously running that on the cloud,
means we can speed this up.
So that was 25 minutes to run that inversion,
we can run it again, get a better result
and just continue that process as well,
and get straight into the geological modeling,
adding value using that
geophysics data as well.
So hope everyone enjoyed that webinar today.