
Lyceum 2021 | Together Towards Tomorrow

What have been the benefits of digital ground modelling, and how has it improved the project outcome?

If you could change one thing about how the industry does something today, what would it be and why? These are just two of the questions we’ll be posing to a panel of industry thought leaders from Mott MacDonald, DB Engineering and Arcadis. Join us and together let’s explore how digital transformation has already positively affected the role ground engineering plays in transport infrastructure projects, and what still needs to change in the near future.



Bernd Heer
Senior Geotechnical Consultant, DB Engineering & Consulting GmbH

Peter Fair
Geotechnical Data & 3D Ground Modelling Specialist, Mott MacDonald – UK

Andrea Gillarduzzi
Technical Director, Geotechnical Engineering, Arcadis Consulting (UK) LTD

Gareth Crisford
Regional Head of Civil, Environment and Energy, EMEA, Seequent


40 min


Video transcript

(enlightening music)

Hello, and welcome everybody

and thank you for joining us for this session

on Digital Information in Transportation Infrastructure

in Ground Engineering.

I’m going to be joined today

by three other panelists.

We have Bernd Heer from DB Engineering Consulting GmbH,

joining us from Germany,

Andrea Gillarduzzi from Arcadis,

and Peter Fair from Mott MacDonald.

And we’re going to be going through

talking about examples of where digital innovation

is helping in ground engineering to date,

and also, discussing some of the challenges

ahead in this sector where we can expect

to see improvements.

So without further ado,

let’s make best use of the time

and go round with some introductions.

Bernd, could I ask you to kick off please?

<v ->Yeah, so as you said, my name is Bernd Heer.</v>

I’ve been working in the railway sector

for about 23 years now,

most of the time for companies within the German Railway.

Now, with DB Engineering and Consulting,

and I’m looking after the strategic

development of the geotechnical branch

of DB Engineering and Consulting.

<v ->Thank you. Andrea?</v>

<v ->Hello, my name is Andrea Gillarduzzi.</v>

I’m a Technical Director for Geotechnics for Arcadis UK.

I lead the Arcadis global initiative

for subsurface digital modeling,

and I lead the Arcadis UK Tunneling Community of Practice.

At present, I’m mostly working as engineering manager

for one of the lots of HS2.

<v ->Thank you, and finally, Peter.</v>

<v ->Hi there, so I work for Mott MacDonald</v>

within the geotechnics practice,

and currently, working on HS2 as a GI data manager,

making sure that the data is all there,

ready to be used by our designers,

and then bringing that into the 3D, 4D environment,

the BIM environment, making sure that

our geotechnical data is talking

with the wider BIM platforms.

<v ->Great, and so on that note, I’m sorry,</v>

I should add, I’m the Head of Segments,

which is Seequent’s regional business

for civil, environment, and energy.

And yeah, delighted to be joined by

panelists who know far more about this industry

than I could hope to cover here on my own,

so it’s great to have you joining.

With the ground conditions increasingly

being incorporated in the BIM environment,

the building information modelling,

are there any examples of where going digital

is actually connecting things faster

within the project and improving the outcomes?

Maybe you could start with that, Bernd.

<v ->Yeah, well, the 3D models we’re providing now</v>

enable faster and smoother tendering processes,

especially in relation to the German regulation

called VOB Part C, where we have to provide

the contractors with very detailed information

about the ground and the amount,

the volumes of soil or rocks

they’re expecting of a certain type,

and Leapfrog enables us to do just that,

to give detailed information about that,

which enables a more precise tendering.

And also, in the end,

during the production and construction process,

there is less room for discussion with the contractors,

which helps the project in general, really.

And we’re seeing that.

We’re working on that.

We expect a lot better outcomes in the future.

<v ->Thank you, how about you, Andrea?</v>

<v ->Yes, we use digital subsurface modeling</v>

in a whole plethora of jobs at Arcadis.

For example, when we prepare the tender documentation

and the DCO application for Lower Thames Crossing,

which is a major proposed new crossing

to the East of London,

we used an approach based on GIS

and the proprietary software.

The GIS included over 200 layers of information

ranging from aerial photographs and satellite imagery,

like InSAR and colour-infrared,

to over 800 historical boreholes which were digitized,

new borehole information, topography, bathymetry.

They were all integrated to create a ground model

which was then used as the basis

for mass haul assessment, for risk assessment,

for planning the GI and its delivery,

and then effectively, as the core of the BIM model,

which included, obviously, the existing ground conditions,

and the modification proposal,

and the assets at the area,

all the utilities, the proposed diversions,

and the land ownership,

stakeholder management, and the like.

We are also using the GIS approach extensively

on G2S, which is one of the lots of HS2,

where it is used for anything from mass haul

to planning the works, including some

automatic construction, for example,

for the trenches which are going

to be located at the top of the embankments.

<v ->Wow, that’s quite a lot.</v>

Thank you, Andrea.

And to you, Peter, same question.

<v ->Yeah, we’ve certainly seen</v>

being able to work within BIM environment,

the 3D environment really beneficial, again,

working within the HS2 arena,

where we’re working on lots M1 and M2.

We’ve got over 6,000, increasing

to nearly 7,000, boreholes along the route,

and so many cuttings that the contractor,

Balfour Beatty VINCI’s asking us

to use the BIM environment to help understand

the earthworks material class.

Probably a year ago, we did that

based on statistical analysis,

which wasn’t looking at the spatial data

apart from is it in the cutting

or is it not in the cutting?

So where we’re digging out the cutting,

which could be several hundred meters long

up to a couple of kilometers long,

and some of them, 20, 30 meters deep.

There’s a huge amount of material coming out there.

And being able to do a 3D analysis of that data,

so looking at the liquid limit,

the moisture state, the percentage passing

from the PSD, the total sulfate content,

and the final one, which is the slake durability.

Bringing the actual GI data directly

into the BIM environment, 3D environment

using Leapfrog, we’ve been able

to do numerical 3D models based on that.

We’ve been able to combine those

with the original geological models

that we created for those cuttings,

and then the output of that has allowed us

to have a full earthworks material class

geological model that we can give real volumes

back to the contractor of the types of material

we’re expecting to excavate.

Now, one of the really key things about that

is that it’s allowed us to see

not only the percentage of the material,

the volume of material,

but how easy that’s going to be to excavate.

So for example, if you’ve got lots of good material

across the whole length of the cutting,

but it’s only in small pockets here and there,

the contractor can’t actually excavate it.

So we can see that in the 3D model,

much easier than we could

when we were just doing the statistical analysis.

So we can now assess whether we’re actually

able to get that good material out and reuse it,

or whether it will just be pulled out

as a mass of poorer quality material.
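Peter’s workflow, from per-sample lab results to classed excavation volumes, can be sketched in a few lines of Python. The thresholds, sample values, and the nearest-neighbour assignment below are invented simplifications for illustration; a real classification follows the project earthworks specification, and the spatial modelling is done in Leapfrog, far more carefully than this.

```python
# Toy sketch: from borehole lab results to earthworks material-class volumes.
# Thresholds and data are invented, not a project's actual acceptance criteria.

# Each sample: (x, y, z) position in a cutting plus two index properties.
samples = [
    {"pos": (10.0, 5.0, -2.0), "liquid_limit": 35.0, "sulfate_pct": 0.2},
    {"pos": (40.0, 5.0, -8.0), "liquid_limit": 62.0, "sulfate_pct": 0.1},
    {"pos": (80.0, 5.0, -4.0), "liquid_limit": 30.0, "sulfate_pct": 1.5},
]

def classify(s):
    """Assign an earthworks class from index tests (hypothetical limits)."""
    if s["sulfate_pct"] > 1.0:
        return "unacceptable"        # too much sulfate for reuse
    if s["liquid_limit"] > 50.0:
        return "acceptable_wet"      # reusable only with conditioning
    return "acceptable_general"

def nearest_class(cell, samples):
    """Crude spatial model: take the class of the nearest sample."""
    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return classify(min(samples, key=lambda s: d2(s["pos"], cell)))

# Walk a coarse grid through the cutting and total up volume per class.
cell_volume = 10.0 * 10.0 * 2.0   # m^3 per grid cell
volumes = {}
for x in range(0, 100, 10):
    for z in range(-10, 0, 2):
        cls = nearest_class((x + 5.0, 5.0, z + 1.0), samples)
        volumes[cls] = volumes.get(cls, 0.0) + cell_volume

print(volumes)  # per-class volumes in m^3
```

Even this toy shows the point of the 3D step: the class totals depend on where the samples sit in the cutting, not just on how many samples fall into each class, which is exactly what a purely statistical split misses.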

<v ->Thanks very much.</v>

So there’s quite a variety of use there

from tendering to in-project coordination

and even the material reuse there

during the construction.

So let’s move on and think about another question,

and that is thinking about how international standards,

like ISO 19650, have already come in

to improve digital project delivery.

As we think about the digital assets

and the design life, what are your

thoughts on improving the digital asset?

And maybe Andrea, we start with you on this one.

<v ->I think it’s fairly fundamental that any model</v>

which gets generated by a consultant

during any phase of design development,

or indeed maintenance, needs

to be transferred to the owner.

Without a single model,

we’re not going to have a single

understanding of the ground setting.

Information needs to be collected together

to be better understood.

I think we’re still facing

a challenge as an industry

about the transfer of data,

the definition of ownership of the data,

and the liability attached to it.

At present, we are sometimes faced with a concern

that sharing data expands our liability,

and this discourages collaboration

between the different parties,

and is something which I believe we need

to overcome as soon as possible,

so as not to lose an immense opportunity.

<v ->Thanks, Andrea, and it’s a great point.</v>

Peter, do you have a comment on this?

<v ->Yes, sure, what I think is one of the key things</v>

is that we’re finding that on a lot of jobs

where the asset has already been built,

and we’re going back to do some surveying,

or inspections, or the client

wants some more information,

particularly on the national infrastructure type clients

is that we’re going back, and we’re doing say,

ground investigation or other assessments,

very close to where they’ve previously been done before,

and you can see that, year after year, boreholes have been drilled

through that concrete lining or whatever it might be,

and that data is not being collected,

it’s not being saved,

it’s not being put back in a usable way.

There might be a PDF stuck somewhere

or a paper report even,

and we can’t access those very easily.

So being able to have those in an environment

where we can easily see what information was collected

and all of that information that was collected

is saved in a way that is usable going forward,

and that those investigations

aren’t just limited to whether

there was a void behind the liner or not, say,

but actually, what else was found

when they did that investigation.

Just collecting all that information,

making sure it’s stored in a reusable way is really key,

and something like ISO 19650 will enable us

to keep reaching towards that.

<v ->Great, thanks, Peter.</v>

And over to you, Bernd.

<v ->Yeah, I think we’re in a data age</v>

where we look at data, use data,

and try to store data, but nonetheless,

we have to have a use case for storage and accessibility.

Whether we actually keep a single asset,

the BIM model, saved for the next 120 years

is a question of whether it’s worth the cost as well.

For critical infrastructure,

I think, there’s no question.

We need to have that model available

for use in emergencies for future projects

on that critical infrastructure.

For every little single building,

well, that discussion will come

if there is no use case,

and the cost outstrips the benefits for that.

I think, we have to talk about that,

and we have to find a way in the industry

to deal with that.

But generally, yes, we want that data stored.

We want access to it now,

and we want to have access to the data in the future,

and making sure that we do

is one of the challenges we face, I think.

<v ->Yeah, interesting observation on…</v>

And yeah, we all know the cost of storage

gets cheaper and cheaper over time,

but also, everything that we’re collecting

increases as well, so it’s always a delicate balance.

<v ->And also, if it’s not ensured</v>

that the data is continuously updated,

which is not such a big point for ground conditions,

but it is for every other aspect of the building.

If there’s been work done in the building

which has not been put into the model, well,

then we’re probably working on wrong assumptions.

So there needs to be a standard for that.

<v ->And I’m hearing here that it’s not just</v>

about technological solutions

because a lot of the time, they exist,

or they just need a bit more connectivity,

or interoperability with another application,

but there are systemic challenges within the industry

of agreeing how, and what, to save from a project.

And talking about liability,

talking about exposure and risk throughout the project.

Any more comments on that?

<v ->Well, we need to address the question</v>

whether it’s, okay, Peter, please.

<v ->Sorry, Bernd, yes.</v>

<v ->Okay, we need to raise the question</v>

whether it’s actually more dangerous

to have an incomplete model

than to have no model at all.

If it’s not ensured that the model

is continuously updated for an asset,

well, is it worth keeping it then?

And I’m sure that’s the danger

we all have in mind that that might happen.

<v ->And Peter, did you have a point now?</v>

<v ->I was, oh yeah.</v>

And the comment I was going to say

is agreeing with Andrea in terms

of when models are passed on

from one consultant to another,

how do we ensure that that is able

to be used by the ongoing consultant?

I think that’s a really good point that Andrea raised.

So often, we don’t have the assurance of that model.

And then as a consultant, we then feel

that we have to rebuild that model from scratch

just to prove to ourselves that

it is of the required level,

and that seems to undermine all this.

<v ->I think that the interesting point</v>

is the fact that we should share factual data,

but any kind of ground model is, by its very nature,

inherently an interpretation.

There is nothing right or wrong,

it’s just trying to collect information

and trying to make sense of it.

So in many respects, when we share a model,

we have already shared a liability,

and we need to acknowledge that is the case.

I think it is important that

we become economical in the way we work.

If you bring, for example,

what is happening in London every day,

probably today, they must’ve been excavating

a couple of tens of meters of trenches for utilities.

Somebody must have flown a drone

over a canal to check whether something is leaking.

Somebody must have done a CCTV camera survey.

All data which gets lost,

not even remotely recorded.

Imagine how much data, how much value

we deprive ourselves of as a society,

and how much bankable value gets wasted.

It is unnatural to keep on working this way.

<v ->It’s a great point.</v>

And especially, about starting again

on each contractor coming new to a project,

starting again with the ground modeling.

And an interesting observation from before

was around the boreholes being considered

as factual data when they are, in fact,

interpretations already, having been

run through human judgment.

But largely, they’re considered as fact

at the start of the project,

but they’re their own form of interpretation, aren’t they?

It’s just that everybody’s agreed that that’s become fact.

<v ->Gareth, I think that is where it gets to the point</v>

of the significance of having standards.

We agree that there is a conventional way of logging

the boreholes, describing the geological information,

acquiring data, and the testing.

Similarly, we need to agree a similar

approach to deal with data,

and that will make it more transferable,

and easier to use.

Having said that, there are also technologies

which allow us to overcome the subjectiveness of,

for example, the logging,

or, as we’ve got these days,

scanners for solid rock core,

and this approach is extensively used

in industries other than civil engineering,

for example, in mining.

And we can then get

much more information from scanning,

because it’s not just optical scanning:

we can use colour-infrared, we can use X-ray,

we can use a lot of other techniques

which allow us to squeeze out more

from the same set of data,

and once again, to become economical.

<v ->Very good, and there’s…</v>

Is it then becoming a thing to go back

to the core imagery within a project

and not just the borehole logs?

<v ->I think that the matter is, at present,</v>

out of a borehole, we use only the visual.

Fundamentally, somebody described a core,

and that is the only thing

which gets perceived

and then transmitted to others.

Anybody who has been a field engineer

or field geologist would understand that

when you touch a core,

you gain additional information.

If you can smell the core,

you can get information

about the organic material and so on.

At present, we’re at the primordial stage

of transferring the information

we acquire in the field to a user

who is located remotely.

And there’s a lot of new technology

which, in future, will help us

to get a more rounded experience

of what happened in the field,

even if you’re not there.

And, I don’t know, augmented reality,

fully-immersive VR, and so on are all tools

which probably will come into use,

and will enhance our experience,

and that will bring in a new layer of data

which at present is completely disregarded.

<v ->So we could end up with a multi-sensory</v>

experience for the information acquired.

So I’m just thinking,

with the effects of future climate change,

more extreme weather events,

the volatile weather that we’re seeing

with heat waves and flash flooding becoming more common,

and seeing the effects of that

on our transportation infrastructure,

could digital assets and common data environments

be the key to updating ground models from sensor data?

And how would you like to see this evolving?

Maybe we start with you, Peter, on this one.

<v ->Okay, thanks, Gareth.</v>

Yeah, the key thing for me, I think,

in terms of understanding the sort of first part

of that question, Gareth, is actually

when we have the digital assets,

it allows us to respond quicker.

So it’s not so much in terms of the design,

but more, if we’ve got the digital asset,

when events happen, we can respond quicker

’cause we’ve got information at our fingertips.

And we understand the ground,

we can see the ground, we can see the bores,

we can see the structures that are being put in them,

and very quickly, with a drone survey,

we can see now, what’s happened.

So you’ve got a cutting that’s slipped,

or an embankment that’s slipped, on a highway

or a railway alignment, a longitudinal asset.

And we can quickly make a good,

informed, data-driven decision

without having to wait for further

ground investigation to be done

or further intrusive investigation.

We’ve got the data there,

we’ve got the asset there,

we’ve got the model there,

and we can very quickly respond to that

in a much more informed way

than we have been able to in the past.

<v ->Great, Bernd, any comment there?</v>

<v ->Well, I agree with Peter, that’s one use of this.</v>

Based on the experience in Germany,

we just had, four weeks back in the Ahrtal,

our complete infrastructure

along the river completely destroyed

over 35, 40 kilometers.

We were looking a bit deeper,

and if we have prognosis systems

like rain forecasts, weather forecasts,

and we know the terrain,

and we can make all that smart

in an internet of systems, an internet of things,

and they can actually talk to our assets,

to our structures, then I can see a vision

where the structure actually puts out warnings

about what’s going to happen to it, within that

sort of parameter set, within

the next few days or weeks.

But that’s just based on the experience we just had.

Even if we can’t save the infrastructure itself,

we can save other infrastructure,

or we can save human lives.

<v ->It’s a great goal, but Andrea, any comment?</v>

<v ->I think that going digital is going</v>

to have a potentially significant impact

in having a better strategy to deal with climate change.

At present, we tend to deal with climate change

by changing the range of assumptions we use in our design,

and typically, we make them broader

to deal with the greater variability.

This will result

in infrastructure which is more expensive

and, let’s say, bigger to construct,

which in turn likely means

a more significant carbon footprint,

which then triggers more climate change,

and so on, and so on.

I think that should we start to use BIM

and the digital in a more clever way,

we would be able to obtain much more significant data

about the real range of variability,

and that might result in a more economical design.

And then as mentioned by my colleagues,

we would be able to maintain the infrastructure

in a much more effective way,

and therefore, we will be able to design it

less robust in the sense that we will start

to use a leaner design, which in turn

will have a lesser impact on the environment.

So I would say: data, data, data.

When you obtain the data,

you never know what it might be used for,

but if you’ve got the data,

you can decide; anything that

you haven’t got is just guesswork,

and that’s not a nice place to be

for any engineering practice.

<v ->So then, during-</v>
<v ->Andrea, in your mind,</v>

okay, just a question for Andrea.

In your mind, are you doing something

like a backend analysis as well with that?

<v ->So Peter, would you mind repeating?</v>

The volume went down a bit.

<v ->Okay, if you think about producing leaner designs</v>

using all the data you have

and having a custom maintenance system on the asset,

are you really also looking back during the asset life

and seeing how you could have designed it differently,

even leaner, or where you went too lean,

a sort of cycle of learning?

<v ->Yep, in the future,</v>

we’re going to move to some

sort of internet of things.

And so, assets will be installed with sensors,

software, and other technologies,

and the like, to provide some feedback on their performance,

which might help us to design something

in future which is better,

but also to retrofit and adapt what is already there.

In many respects, this has already

been happening for decades and decades.

I remember jobs I designed 20 years ago,

on highways on unstable slopes,

where the entire asset was installed

with extensometers, inclinometers,

reflectors for interferometry, and so on,

and that allowed us to have a leaner design.

So I don’t think that there is anything new.

It’s just that we’re going to use it more

extensively than in the past,

and present technology is going to let us use it

in a more effective manner, to transfer

the information rather than it being owned

by a single project or single entity.

<v ->So then maybe back to you, Bernd.</v>

You’ve touched on it earlier,

in fact, you all have, on the sensors

driving the ongoing design and maintenance.

Where do you see the biggest

and most disruptive technology changes

coming from in the future with regards

to internet of things, kind of smart everything?

<v ->It’s everything, I think,</v>

everything will change eventually.

There’s a midterm effect, I think,

in the next five years where we learn

to integrate the available data much quicker,

which is, mostly for me, in the design phase,

where we start doing automated foundation design

with the 3D modeling being available,

and also, other data comes into play

which we automate directly,

or which we generate directly from available sources.

Probably, as Andrea has already mentioned,

with pipes and cables with augmented reality

during the conceptual process as well.

In the long-term, I think,

the role of the geotechnician, as it is today,

will change more to that of a curator of the results,

a curator of the data that has been

automatically gathered by sensors,

by different systems, from historical sources.

And we, as the geotechnicians,

will actually only review, curate,

and validate that data before it gets put into the system.

I think that’s one of the main changes we will see.

There will be a lot less report-writing,

a lot less checking of drawings,

a lot less setting up meetings

talking about what the data actually means,

’cause we will all agree on the available data.

<v ->Great, anyone else got a comment, Andrea?</v>

<v ->I totally agree with Bernd;</v>

I think that’s going to be the way forward.

I have to say that data

coming from different sources which are independent,

from different databases,

helps to validate the data.

An example, if we go

to traditional geotechnical engineering:

if you’ve got the geophysics for the ground by itself,

it is hard to validate.

If you combine it with boreholes against which you can calibrate,

then it takes on a completely different dimension.

If you then combine it with, for example,

excavations at the very same site

where you’ve got geophysics and boreholes,

you squeeze out even more.

The same happens with any other set of data.

A single set of data on its own

lacks a certain value.

If you combine it with the rest, it’s much better.

So for example, let’s say you’ve got to monitor

an expansion joint on a bridge.

You combine it with a meteorological monitoring,

you get a better understanding of why things change.

You combine it with the traffic data

over a long period, you can understand,

for example, the wear of the joint, and so on, and so on.

So, have the data, and you add value.

And a lot of the data can be easily obtained

by embedding sensors and other technology

in what we build, or by retrofitting existing structures,

existing assets, with sensors.
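Andrea’s expansion-joint example can be put in concrete terms: regress the measured joint gap against concurrent air temperature, and the residual (the movement the thermal model cannot explain) becomes the candidate wear signal. The readings and the simple linear model below are invented for illustration; real monitoring would use far denser time series.

```python
# Sketch: separating thermal movement of a bridge expansion joint from
# residual movement by regressing joint gap against air temperature.
# All readings are invented for illustration.

def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Paired daily readings: air temperature (degC) and joint gap (mm).
temps = [2.0, 8.0, 15.0, 22.0, 28.0]
gaps  = [44.9, 42.1, 38.8, 35.2, 32.4]

slope, intercept = linear_fit(temps, gaps)

# Residuals: movement not explained by temperature (candidate wear signal).
residuals = [g - (slope * t + intercept) for t, g in zip(temps, gaps)]

print(round(slope, 2))  # thermal closure rate in mm per degC, prints -0.48
```

Adding a third stream, such as the traffic counts Andrea mentions, would then let you test whether the residual correlates with cumulative loading rather than weather.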

<v ->Great, Peter, any thoughts on this?</v>

<v ->A couple of quick ones.</v>

In the short term, I’m thinking

in terms of the amount of rigs that we have out onsite

and the feedback we get from those.

They’re recording so much data all the time,

the drilling rigs or the piling rigs,

and being able to use that directly to refine,

to help create leaner designs,

almost as we’re building the assets,

could be something that could change things,

being able to bring that data

directly back into the design.

So we’ve done an initial design,

and then we’re out onsite, we’re drilling,

and using that data in a way

that can directly influence the design,

almost as the asset’s being constructed,

could be something that’s of interest.

And then longer term, with the AI stuff,

I could see that actually,

and I think Bernd touched on it earlier,

we can use that data to almost do automatic design.

And as Bernd said, we become more curators,

and then we become the checkers

and the approvers of an automated design.

We’re starting to see that now with the CPT assessments

where the AI is doing a better assessment

and making fewer errors than some

of the human assessors already are.

And so, being able to get the technology

to do a lot of the hard grafting,

the initial assessment first up

would certainly be something that’s very interesting.
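A first-pass automated CPT interpretation of the kind Peter describes can be reduced to rules over cone resistance and friction ratio. The thresholds and readings below are invented and much cruder than the Robertson-style charts or the machine-learned classifiers a real tool would use.

```python
# Minimal sketch of a rule-based CPT soil-behaviour interpretation, the kind
# of first-pass assessment that could be automated. Thresholds are simplified
# stand-ins, not a published classification chart.

def soil_behaviour(qc_mpa, friction_ratio_pct):
    """Very coarse soil-behaviour label from cone resistance and friction ratio."""
    if qc_mpa > 10.0 and friction_ratio_pct < 1.0:
        return "sand"
    if qc_mpa < 2.0 and friction_ratio_pct > 3.0:
        return "clay"
    return "intermediate"

# Depth (m), cone resistance qc (MPa), friction ratio Rf (%) -- invented readings.
cpt = [(0.5, 12.0, 0.6), (1.0, 14.5, 0.8), (1.5, 1.2, 4.1), (2.0, 0.9, 4.8)]

# Label each depth increment; a real tool would also smooth and group layers.
layers = [(depth, soil_behaviour(qc, rf)) for depth, qc, rf in cpt]
print(layers)
```

An AI version replaces the hand-written thresholds with a model trained on logged profiles, with the engineer acting as the checker and approver of its layer picks, exactly the curator role Bernd describes.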

<v ->Great stuff.</v>

<v ->Yeah, Peter, two very good points.</v>

In terms of drilling, I cannot agree more with you.

Even now, when we do, for example, an investigation,

the machinery that we use

records a lot of data

which doesn’t get transferred.

For example, the torque used by the rig,

which can give information about the strengths

of the rock and of the soil.

All of that data is available;

should somebody be in the field,

they would probably use it.

When we’re remote, at present, we miss it,

and that is a waste.

Regarding automatic design,

another good, very good point,

effectively, automatic design is already

happening in certain industries.

For example, in drill and blast,

we already have jumbos

which effectively position the drill holes

for the blaster almost automatically,

in the best position to obtain the best rate of production,

and the safest production,

and avoidance of over-excavation, over-profiling.

So something is already happening,

probably not as fast as it should,

but it is probably the way forward,

especially for certain jobs of significant length

which are rather linear in function.

The trickier ones might still require

a significant, say, manual input.

<v ->Good, all right, so.</v>

<v ->It requires a change of thought within the industry.</v>

If I think about what Peter said,

that we actually adapt the design

during the construction process

based on the results from piling platforms.

I think that will take a lot of

change management within the civil engineering community,

at least in Germany, because the design

has been checked, has been approved

by the states and local government.

So you have to provide an automated approval cycle

that actually repeats that approval,

as it were, very quickly,

so you can adjust the design onsite.

It’s a big step for everybody involved in the industry,

and we have to walk that way,

but we’re not there yet, at least not over here.

<v ->Yeah.</v>
<v ->So that probably brings us</v>

to our final question,

which is Magic Wand Time.

If you could change one thing about the industry today,

what would it be and why?

Bernd, do you want to kick that off?

<v ->I can start.</v>

What springs to mind, from personal experience

within the company over the last few years,

and even now, with Leapfrog,

and bringing Leapfrog data into other systems,

is that software providers have an interest,

but not enough interest, in making their systems

not proprietary, but open to everybody.

I think there’s a lot of work to be done

if we want to share data across different platforms,

across different processes.

And if that can be done,

it will be a big help to all of us

in the industry, I think.

<v ->Very good.</v>
<v ->I’m sorry for that, Gareth.</v>

<v ->Hey, no, we’re on a continual journey</v>

where we’re trying to be open to all,

all the connectors and all the types.

In fact, we’ve got APIs being developed,

an open API into the cloud repository.

So yeah, it’s a continual journey for us.

<v ->I see that, I see that in Seequent,</v>

and as I already said, there’s a conversation going on.

We were being asked what we would change,

and I want to see that across

our industry, making it available to everybody, really.

<v ->Great, thank you.</v>

And Andrea, your Magic Wand moment.

<v ->I would like to be sure that digitalization</v>

is not only for an elite of engineers.

There is a risk that people

who are more technologically savvy,

possibly a younger generation and so on,

will take over, or that private companies have an interest

in having a few people driving the initiative,

and might not be able to invest in that

and train everybody.

And there’s a serious risk of a disconnect

between the younger generation,

which is technologically savvy and maybe the older one,

which have a plethora of experience

and can bring an immense value to jobs.

We need to be sure that the knowledge of a driller

in the field, who might have worked

for 20, 30 years, gets used.

We need to be sure that we’re not just waiting

for the older generation to be phased out

because we’re going to miss

an entire generation of knowledge,

and I am really scared we would lose quite a lot.

Therefore, my wish would be let’s go digital

but all together.

Everybody might have a different pace

and a different take on it,

but we cannot leave anybody behind in this industry.

It would be disastrous for us.

<v ->That’s a great sentiment.</v>

So we’ve got open software engagement,

where you can move your data

and your observations around;

we’ve got keeping hold of experience

and making sure that it’s opened up

to the technology generation,

so that experience isn’t lost.

Peter, your Magic Wand?

<v ->Magic Wand, yeah, it’s actually very similar to Andrea’s.</v>

It’s about the training, the learning environment.

It’s a very real question for me at the moment.

There is so much new stuff coming out,

especially with these big providers

like Bentley and Autodesk.

When I say Bentley, I’m not necessarily meaning

the Seequent side of things,

but there is so much to learn.

And many of my colleagues are struggling

just to keep up with all the new software

and new environments, how to even use things

like ProjectWise, or BIM 360,

or the Seequent Central environment.

They’re all similar but when I start introducing

something like the Seequent Central environment,

they get confused very quickly

because they’re trying to learn

several other environments at the same time.

Or when you’re talking about modeling,

then they’re used to another environment,

or they’re trying to train up in that environment too.

We were trying to encourage people to use

the Seequent Central environment as a visualization tool,

but they’re also being told

they need to look at iTwin,

or one of the other navigators,

or these other visualization tools.

And so really, I don’t know what the answer is.

Implants into brains? I’m not sure.

But for me, that seems to be a real struggle,

and I think even the trainers,

or perhaps the software providers themselves,

need to recognize that there is so much new technology,

software technology, that’s coming out,

that some people are just getting bamboozled by it all.

And going back to Andrea’s point,

especially some of the colleagues now

that have been around a few years,

they’ve really got that wisdom in them,

central to the geotechnical triangle.

But if they can’t use the software,

then we’re worried about that wisdom getting lost.

So that’s it from me.

<v ->Great, great, great sentiment.</v>

Again, that’s around kind of workflows,

and best practice, and trying to figure out

how we can utilize the great tools

that are coming out in the software environment,

but to solve the problems that we have.

So it’s always bringing it back

to: what problem are we trying to solve?

What tools do we need to do that?

How much do we understand about

the workflows and the technology?

And then how do we set that up

as how we’re going to do it for the short-term future,

until it’s surpassed by something else?

But I absolutely agree with you.

There’s such a pace of change at the moment,

with everything that’s coming out

and all the possibilities, that we will start

racing on to the next thing.

<v ->Gareth, some time back,</v>

I came across a very interesting paper

from an author called Maca Eka,

who effectively developed

a cube matrix to identify

the priorities of a company

when they buy or develop software.

And effectively, you’ve got three axes.

On one axis, you’ve got the interactivity

between the humans and the machines.

So effectively, it ranges from people using,

for example, the 3D model for doing analysis,

to others who might use it simply for visualization,

just as a replacement for paper and pencil.

Then there is the knowledge dimension,

where you need to acknowledge which people

will be experts in geology and geotechnics,

or civil engineering, and which will not be.

In an organization, for example,

the management might have lost

certain kinds of skills.

And then you’ve got the usage dimension,

which effectively ranges between the people

who, for attitude, skills, or any other reason,

will remain fairly passive in their engagement,

and the others, who will become

a leading force behind the software,

and possibly start to develop scripts, and so on.

There is a very fine balance between the three axes.

And effectively, what is successful for a company,

who in turn will select the software,

is trying to find a balance between the three,

and identify a product which allows

most people to use it, doesn’t alienate

whoever is over-skilled or under-skilled,

can be used by pretty much

the average Joe or the hyper-expert,

and in the end leaves us

with what we need for our own databases.

It’s quite important that these kinds of analyses

are carried out by companies and by developers,

because there is hardly any point in dabbling in something

which is not going to be used

or not going to be effective.

And that is also a good way

to assess the software we’ve got

and understand whether it’s really fit

for purpose for what we do on a daily basis.

<v ->Brilliant, great, great final observation.</v>

So I think we’re probably at time.

We may even need some trimming on this one

but I’d like to thank you all

for taking part in this conversation.

There’s many things illuminated in here

that I think we should solve over a beer or two

at some point in the future.

And we’ll have encouraged,

hopefully, a lot of thinking outside of this session,

that will continue us on our path

for a while to come.

But thanks very much for your contributions,

and I’m sure we can answer things

in the Chat as well.

(enlightening music)