Another Change of the Guard? Could Micro-oil Dominate the North Sea?

The recent 30th licensing round in the UK North Sea has had geologist Jake thinking about where the next sustainable growth in the basin might come from.

To see a closer view of the whiteboard click on the picture below.

board image.png

TRANSCRIPT

Hello, and welcome back to the Cognitive Whiteboard. My name's Jake, and I'm a geological intern here at Cognitive Geology. Today, I would like to talk to you about how we could potentially add a new sustainable wedge onto the life cycle of the North Sea. Firstly, as in most basins, the giant discoveries were made in the early stages of the North Sea's life. In the case of the North Sea, these discoveries were made by the supermajors, who were capable of developing customized facilities to cope with the hostile environment. Into the '90s and the early 2000s, these facilities became very common in our industry, and we saw mid-sized IOCs become capable of developing them and operating assets offshore, which actually led to our production rates picking back up to the highs we saw in the '80s. Now, there are two more wedges here at the end. The independents joined the party around the late 2000s, bringing more assets online. But what I'm really interested in is this blue wedge here at the end. As the oil price picked up, we saw production increase massively as well, which is what we would expect. Then, as the price dropped to around $30 a barrel, we saw production just completely stop. If that shows anything, it's that this is a very unsustainable way of operating. So why did this happen? What led to this?
 
In the early days, these huge fields were built and developed with customized facilities that were volume-driven, with large NPVs and huge scale, and recovery was extended by enhanced recovery methods: water injection and gas injection as well. But if this has shown us anything, maybe we need to change how we do things and move to a primary depletion, or primary production, method. This would allow us to operate more quickly and more leanly, at a much lower operating cost and upfront cost. Now, it's going to require a change of economic focus, and it's going to require both companies and government to be on board with it. The big fields at the start were initially NPV-driven; they had huge volumes of oil and gas being produced, and they had their lives extended by enhanced recovery. But if this graph has shown us anything, it's that production declines throughout the life cycle, and we don't want to end up with a wedge like this blue one at the end, where production completely stops when prices drop.
 
So maybe, for a more sustainable wedge to be developed at the end, we need to move towards a discounted profitability index. What that would involve is governments accepting this primary depletion and ultimately lower recovery factors, which would bring down our initial upfront costs and also push production earlier into the life cycle of the project. We'd be developing and producing oil much earlier, which would see our revenues being generated much sooner. Now, why would we go about this? Why would governments accept this? Maybe, if we could convince the government that, as an industry, this is the right thing to do, then the UK North Sea is the perfect place to do it. There are over 140 fallow fields that have seismic, well data and, for the most part, production tests that prove there are hydrocarbons in place. So right there, we've got no exploration cost upfront, which is going to bring down our initial costs. It's just a case of convincing our government that this is the right way to go about it. These resources will stay in the ground under the existing regime, where we're enhancing recovery and spending lots of money to extract high volumes of oil, but if we accept primary depletion and move forward with this, then maybe we could add a sustainable wedge onto the end of our life cycle. I'm really interested to hear your thoughts. Do you think this is the way to go? Do you think we could convince our government, as an industry, that this is the best thing to do? Please leave your comments in the box below.

2001: A Geomodelling Odyssey

We welcome friend of Cognitive Geology, geomodelling guru, and creator of Geo2Flow, Dan O'Meara to the Cognitive Whiteboard, to share some of his insights into 3D fluid saturations.

If you would like to visit the Geo2Flow website please follow this link: Geo2Flow

Click on the image below to see a larger version of the whiteboard.

dan.png

TRANSCRIPT

Hello, and welcome back to the Cognitive Whiteboard. My name is Dan O'Meara, and I'm here to talk to you about a way to evaluate your 3D saturation models; along the way, the plot that I show you will stimulate a lot of discussion with your interdisciplinary asset teams. I've been in the industry for quite some time, and I was actually there at the dawn of geomodeling. When we started to put together geomodels, we had some simple expectations. One was that our models were going to honor our observations, and the main observations were the well data, the well logs. Another was that our models should honor the physics. When it comes to porosity, there is no physics to honor, so people tend to use just geostats for porosity. But when it comes to saturation, we know there is physics to be honored, and that's the physics of capillarity. We have plenty of data from the laboratory, which is where we get capillary pressure curves. So the physics becomes very important when it comes to saturations, and the physics tells us that as you go up in a reservoir, it's not simply that the oil saturation increases uniformly; the oil saturation is going to depend upon the heterogeneity in both the porosity and the permeability, because they're all connected.
 
If you put together models that are driven just by the physics, what we're going to do here is evaluate those models, and I put together this kind of plot to do that. Here's how the plot works. We're looking at 3D models, and we're only going to consider cells that are penetrated by wells, because the wells are where we have observations. Along the x-axis here, we're going to plot the properly pore-volume-weighted water saturation from the logs. And on this axis, the y-axis, we're going to plot the water saturation that's sitting out there in the 3D model. All we're going to do is look and see how we're doing, how they compare. You would expect to get a 45-degree line, at least I did, with all of the data generally falling on that 45-degree line; because, after all, if you honor the physics, you should be honoring the observations. But what I found was kind of strange, and one of the things that jumped out right away is the data you see up here along the top. What that's saying is that the water saturation coming out of the model is 100%, yet the well log contradicts it, saying we have as much as 80 or 90% oil. So what could possibly be going on? I turned back to the geological models here and realized, "Hey, there are faults in those models." With the faults comes the possibility of compartmentalization. So, for instance, in this model here we've got four free water levels, and you start to realize that this kind of data is a signature of compartmentalization.
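As a rough sketch of the kind of QC crossplot Dan describes, assuming you can export, for each well-penetrated cell, the upscaled (pore-volume-weighted) log water saturation and the model water saturation; the file and column names below are illustrative, not from any particular package:

import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical export: one row per well-penetrated grid cell, with the
# pore-volume-weighted log saturation and the model saturation for that cell.
cells = pd.read_csv("well_cells.csv")

plt.scatter(cells["sw_log_pv_weighted"], cells["sw_model"], s=8, alpha=0.5)
plt.plot([0, 1], [0, 1], "k--", label="45-degree line: model honors the logs")
plt.xlabel("Sw from well logs (pore-volume weighted)")
plt.ylabel("Sw from 3D model")
plt.legend()
plt.show()

# Points stacked along sw_model = 1.0 while the logs show significant oil are the
# signature described above: candidate compartments with separate free water levels.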
 
If you look at this plot more, you'll see that usually I was not seeing nice straight lines even when I discounted for this. If you're in this area here, the water saturation in 3D is higher than the water saturation in the wells, so here you're underestimating the reserves; conversely, you're overestimating reserves in this part of the plot. Going around the industry over the last 20 years or so, I would see people who were struggling: they had computer programs that basically said, "Hey, I can do the physics," but when it came to observations, when we put together these kinds of plots, we were repeatedly seeing things like this, with lots of scatter here and what seemed to be evidence of multiple compartments there. So what I did was come up with a methodology that honors the physics, which is the first thing people were doing, but also honors the observations, so that every time we put together a 3D saturation model, we get plots like this, which is what you'd expect from porosity. You should honor the observations and you should honor the physics, because after all, that's what Mother Nature does. If you want to learn more about how to get there, Luke will provide you with links to our website below. So that's all from the Cognitive Whiteboard today, see you again next time.

Cognitive Geology are shortlisted for Innovation Award

We're very excited to have been shortlisted alongside Air Control Entech, Rig Deluge Ltd, and Tendeka for the Innovation Award at this year's Press and Journal Gold Awards.

The Innovation Award is sponsored by Balmoral, whose managing director and chairman, Jim Milne, is quoted by Energy Voice describing the judging process:

“Once again, the judging session was very exciting. Choosing between all of the entries was a difficult job because there were so many good ones, but there has to be a winner.

“I am looking forward to seeing cheerful winners getting their prizes on the night.”

If you would like to read the complete article by Rebecca Buchan and discover the finalists in all the categories click here.

We are all looking forward to finding out the winners on September 7th... fingers crossed for a Cognitive win!

Being an Economically-savvy Explorer

Our Senior Geologist Kirsty returns to the Cognitive Whiteboard, this time she's sharing some thoughts about why economic factors are for geologists too!

To view a close-up of the whiteboard click on the image below:

Kirsty whiteboard 2 - savvy explorer.png

TRANSCRIPT

Hello, and welcome back to the Cognitive Whiteboard. My name's Kirsty Simpson. Here at Cognitive Geology, one of the things we want our tools to do is to enable geoscientists to visualize the economic impact of their decisions. This got me thinking about how often, when I was working as a New Ventures geologist, I was asked to be an economically savvy explorer. Why was I being asked to think about economics, you ask? Well, fundamentally my job was to make money for the company, just like everybody's is.
 
So, when you're bringing opportunities into your New Ventures team as a geoscientist, if you see one that's technically interesting, it's so tempting to just get stuck straight into your subsurface workflow: build your regional framework, do your play fairway evaluation, talk to the petroleum systems team and get a model built, and then work through your prospect workflow. Get your volumetrics, calculate your risks, and then go and speak to your economist. That's a really long workflow before you get to discussing the economics of your opportunity. What if that economist says to you, "This is never going to meet the thresholds. What are you doing? What are you bringing me?" Well, I would argue there's another way of starting your workflow, and that's by thinking slightly more economically.
 
What you can do is start by thinking, "What are my company's economic drivers? Am I looking for production in the near to mid term? If so, do I need to find drill-ready opportunities or discoveries that are ready for development? Or am I looking for prospects and leads because we need to fill our prospective resources inventory hopper?" Once you've decided what kind of opportunity you're looking for, that's when you want to start thinking, "Well, where am I going to look for it?" One thing you might start with is a graph like this, which tells you what the percentage contractor take per country is. That way, you might identify a number of countries, or regions, or basins where you want to start looking for opportunities. You also need to consider the geopolitics of those regions, because you want to be able to actually get in there safely. And you need to think about the market, because there's no point finding eight billion barrels of oil if you can't sell it to anybody.
 
Once you've overcome those economic stumbling blocks or hurdles, that's when you can start your subsurface workflow, build that framework, and do that play fairway evaluation. Out of that play fairway evaluation, you can identify your yet-to-find, whether you're looking for oil or gas, and whether you're looking in deepwater, on the shelf or onshore. That's the sort of information you can take to your economist and ask her to run some scoping economics on. As a result of that, she can give you an MEV, a minimum economic volume, that you need to meet in order to clear your company's economic threshold. And that way, after just a few weeks, you might be able to throw out some or all of the opportunities you've identified. If you throw them all out, great, just go back and start again. Otherwise, you take the few that do meet that threshold into your detailed subsurface workflow. This way, you screen more opportunities, and the only ones that you actually take through the full process to your final economic decision are the ones that are likely to make it over your threshold and that you're going to get in front of the bid committee. That way, your management are going to see you as an economically savvy explorer.
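As a minimal sketch of that screening step, assuming you have a yet-to-find estimate per opportunity and an MEV from scoping economics; every name and number below is invented purely for illustration:

# Illustrative screening of opportunities against a minimum economic volume (MEV).
# Volumes in MMbbl; both the opportunities and the MEV are made-up examples.
opportunities = {
    "Basin A shelf play": 40,
    "Basin B deepwater lead": 250,
    "Basin C onshore discovery": 15,
}
mev_mmbbl = 60  # hypothetical MEV returned by the scoping economics

carry_forward = {name: v for name, v in opportunities.items() if v >= mev_mmbbl}
print("Take into detailed subsurface work:", carry_forward)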
 
I think that's all I've probably got time for. Thank you for joining me at the Cognitive Whiteboard and I hope to see you again soon.

The Bigger Picture: Uncertainty from Subsurface to Separator

Our new Hutton Product Owner, Jim Ross, shares some insights from his point of view as a petroleum engineer carrying out production modelling. Very interesting for the geos among us to hear how our models might be applied.

Click on the image below to see the whiteboard in more detail.

 

Jim whiteboard photo.png

TRANSCRIPT

Hello and welcome back to the Cognitive Whiteboard. My name is Jim Ross. I'm the new product owner of Hutton here at Cognitive Geology, and today I'm going to be talking to you about "The Bigger Picture: Uncertainty from Subsurface to Separator." My background is in chemical engineering, and most recently I've been working as a petroleum engineer in the field of integrated production modeling. What that means is taking models of all the different parts of our oil and gas system and constructing one model that represents the behavior of the entire system: how all the parts are going to interact and affect each other once we start producing from that asset.
 
And what you notice when you're doing this is that everybody in the oil industry kind of works in their own little silo. You might have production engineers who are responsible for the operational design of the wells; a drilling team who decide where and when we're actually going to drill those wells; a facilities team who look at things such as the process plant or refinery required to support that production, and anything else required to actually make it happen; and then the reservoir team, who of course would be looking at how the fluids are actually going to move through the porous medium of the reservoir rock. However, they're all working towards a common goal, which is usually something like: how much oil am I going to produce, and when am I going to produce it?
 
And in creating these models, I would often get asked, "How do I know this is correct? How do I know that what we've predicted is what will actually happen?" And the short answer is, we don't. We don't know that it's entirely correct, and the reason for that is all the various uncertainties we might have in the process of building these models. For instance, do I have a reliable lab report? Do I know what my fluid density, my API gravity and my gas gravity are going to be? Do I know what the ratio of gas to oil is going to be, and even how much water I'm going to produce over time? Do I know what the drainage region of my well is going to be, how much of it I'm actually going to contact during production, and what the pressure decline is going to be? Do I even know how many wells I'm going to have, what type they're going to be, and how the system is going to behave if we're not operating at the design capacity of those facilities?
 
And what that leads to is some very anxious engineers, which is why I have sympathy with this guy here. So, something that we've tried to move towards is to look less at precisely what these inputs will be and more at what impact they have, by looking at the different possibilities on the production side. For instance, a production engineer might sensitize on tubing size to see how much the well is going to produce and what effect that will have on the production. If we're looking at a perforation, we might look at the perforation efficiency, how well we're making those perforations, and what impact that would ultimately have on any wells that we drill. If we undertake a stimulation job, then we might look at what the stimulated PI would be afterwards.
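A minimal sketch of that kind of one-at-a-time sensitivity, using a deliberately crude stand-in for the production model; the parameters, ranges and formula are illustrative assumptions only:

# One-at-a-time sensitivities on production inputs (illustrative stand-in model).
def cumulative_oil(tubing_id_in=4.5, perf_efficiency=0.8, stimulated_pi=10.0):
    # Placeholder for a real integrated production model; returns MMbbl.
    return 0.8 * tubing_id_in * perf_efficiency * stimulated_pi

base = cumulative_oil()
sensitivities = {
    "tubing_id_in": [3.5, 4.5, 5.5],
    "perf_efficiency": [0.5, 0.8, 0.95],
    "stimulated_pi": [5.0, 10.0, 20.0],
}
for name, values in sensitivities.items():
    deltas = [round(cumulative_oil(**{name: v}) - base, 1) for v in values]
    print(name, deltas, "MMbbl relative to base")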
 
There's a range of possibilities for what we might end up with there. What we're trying to build here at Cognitive Geology is something that takes into account the geological possibilities: what are the different possibilities for filling my rock properties across the entire grid, and what impact does each have on the process? What that allows us to do is move away from something which is precise but which we don't necessarily have a lot of confidence in, towards something that is approximately accurate and tells us what the impact of our decisions and of our unknowns would be, so we can have greater confidence in what we've predicted going forward. That's all I want to talk to you about today, but I look forward to seeing you at the Cognitive Whiteboard again in the future.

Frontier Exploration: Avoiding Interpretation Bias

We welcome our new Senior Geologist Kirsty as she presents her first Cognitive Whiteboard.

Click on the image below to see the whiteboard in more detail.

whiteboard 1.png

TRANSCRIPT

Hello and welcome back to the Cognitive Whiteboard. I'm Kirsty, one of the new geos here at Cognitive Geology. My background is as an explorationist, and I've joined the company to bring some of those exploration workflows into the products that we're developing. In preparation for today's video, I was watching some of Luke's old blogs, and one of the things that struck me was Luke talking about the difference between accuracy and precision. That really struck a chord with me and with some of the lessons I had to take on board as a young geologist learning my trade in exploration.

So, early on, when I was a youthful geologist at the cliff face with my hand lens, I was looking at the really fine detail, getting excited about the bedforms that I could see, walking along the whole outcrop, seeing all the contacts and mapping them really accurately. When I joined BG Group in 2009 and started working in exploration, I discovered that we very rarely have that kind of luxury of data density. In fact, if we start working in frontier basins, we might have a basin here maybe 400 or 500 kilometres in diameter with only a handful of wells that we can use to learn about it. So, we've got very fine vertical detail with big horizontal spacing, tens or hundreds of kilometres between wells. From those, we're very comfortable with our vertical detail, and it allows us to build depositional models. We can use those depositional models to show the depositional environments in map form, and this really looks like the kind of map that I would have drawn early in my career at BG Group. It looks really nice and geological; it looks like a snapshot of something we might see today in a depositional system.

It's very appealing. But it is also a precise view of one single possible interpretation of the data feeding my depositional model, and it doesn't encompass the full range of time and geological processes happening in that basin. So, if you draw something like this, you risk becoming really anchored to that one interpretation very early on, with very little data to tie it back to. I would go back and tell myself 10 years ago that what I really need to be is accurate while encompassing the full range of possibilities, and therefore what I should be drawing is not lovely geological facies maps like this, which look great, but things that really represent what I actually know right now. That would be a play-fairway-style map showing where I believe the sand-dominated facies to be and where I believe the mud-dominated facies to be. That still allows me to focus my exploration and recommend which blocks we should be bidding on in a license round, but it doesn't tie me to an interpretation that I can't really back up with my data at the moment. There's also a risk that, because this looks so nice, it gets used again and again within the company; we've all seen this happen. If this map persists 10 years down the line, when I'm no longer involved in that asset, it can really bias people's decision-making, because they don't know how little information we had when it was drawn, and we could miss some really good prospectivity as a result of tying ourselves to one specific interpretation in time. Well, that's all I've got time for right now, but I look forward to seeing you back at the Cognitive Whiteboard soon.

Dragon Eggs and Unicorn Tails

Luke introduces the new series of whiteboard videos by telling us about the myth of hard data.

Click on the image below for a closer look at the whiteboard.

TRANSCRIPT

Hello and welcome back to the Cognitive Whiteboard. It's been a while, but we have a new cast of characters that we will be introducing shortly. I'm going to kick off the first of this series of videos with an attack on the hardness of oilfield data sets. To begin with, let's do some mathematics, not a place I normally start, but if we look at a grid cell in a geological model, let's have a look at the reality of how well we've sampled that single grid cell, let alone the rest of the field.
 
By the time we get down to the reservoir, we're usually drilling with around a seven-inch bit. The exact size doesn't really matter, but let's assume seven inches, and a pretty common grid cell size might be 50 by 50 meters. If we do the mathematics on the sample rate, our wellbore area is about 0.02 square meters once converted to metric, and the grid cell is around two and a half thousand square meters of rock area, so the sample rate is about 1 in 125,000. Question: does that wellbore represent the perfect average of that grid cell? Let's just leave that question there for now.
 
But let's have a look at an oil field, for example. Let's take Britain's biggest oil field, the Forties: 103 wells across about 90 km² of area. Do the same mathematics and we are at 1 in 45 million as a sample rate for that oil field. So even in this well-developed field, we have a pretty big challenge in claiming we have statistics here; perhaps that's the reason we use the term 'geostatistics'. The question is whether we want to explicitly honor all the mathematics, or be a little bit pragmatic and accept that our sample rates are a bit spurious. I would argue on the side of using a little bit of geological intelligence rather than just mathematics, which is often where we start. But let's look, even for a single wellbore, at just how confident we are that we know where that wellbore is.
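For anyone who wants to check the arithmetic, here is the same calculation written out, using the rounded 0.02 m² wellbore area quoted above:

import math

bit_diameter_m = 7 * 0.0254                        # a seven-inch bit is about 0.178 m across
bit_area_m2 = math.pi * (bit_diameter_m / 2) ** 2  # ~0.025 m²; rounded to ~0.02 m² above

cell_area_m2 = 50 * 50                             # 2,500 m² for a 50 m x 50 m grid cell
print(f"grid cell: 1 in {cell_area_m2 / 0.02:,.0f}")                       # ~125,000

field_area_m2 = 90e6                               # the Forties, roughly 90 km²
print(f"field: 1 in {field_area_m2 / (103 * 0.02) / 1e6:.0f} million")     # roughly the 1 in 45 million quoted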
 
I was involved in a peer review where we had an issue that one of the wells was off by more than 150 m at the bottom-hole location, and that was proven because the velocity anomaly required to tie that well was just unheard of. It turned out the well was actually on the downthrown side of a fault where it had previously been assumed to be on the upthrown side. That was discovered because we ran a gyro survey over these wells to try to explain some of the issues. We found that about 30% of the wells were off by more than 50 meters, and when we corrected all of those, we added about 90 million barrels of oil back into that field, and suddenly all the production history, the general behavior of that field, started making a lot more sense.
 
Let's talk about that production history though. On a single-well basis, how confident are we that the production is what we say it is? This is probably some of the softest data that we have in the oil industry. Production data, particularly when you're looking at a downhole zonal allocation, can be very, very subject to uncertainty and inaccuracy. The wellbore itself is often, in practical terms, not perfect: cement bonds can create leakage points behind pipe, the downhole jewelry wears over time, and control of the flow can become difficult. Most of the time, wells are being produced through a cluster, so the allocation back to the single well, let alone the zone, can be really problematic.
 
When we look at these production allocations, it's just worth bearing that in mind. A really hilarious point on that: we had a 28-day cycle in one oil field that turned out to be due to the work hitches of the operations guys. One of the blokes was measuring the production data accurately, the other guy was just kind of eyeballing it from a distance, and that ended up putting a 28-day cycle into our production data that we thought was tidal to start with. In reality, it was just inaccuracy in the measurement method. When the question comes, "Do I honor all of my data?", I do feel a little bit like Gandalf going up against the Balrog, because the reality is I can't match all of it. Most of the time, there is going to be inaccuracy somewhere in the puzzle, and I can't always be confident where that lies. What I'm always trying to do is develop the most coherent story I can within the realms of uncertainty that these data provide. Just a little bit of a story there; I hope it's helpful to you. If you've come across any other strangeness in your fields that turned out to be part of this, I'd love to hear about it in the comments below. That's all for now from the Cognitive Whiteboard. I'll see you back here again next time.

Growth Lies in Innovation

Luke and Gemma Noble, Audit Director at EY, feature in an opinion piece discussing growth and innovation in the oil and gas sector. 

Times have been challenging in the oil and gas sector for the last few years and companies are looking closely at how they can continue to adapt and grow.

Gemma addresses the need to focus on knowledge capture, talent development and innovating across processes, systems and technologies.

Luke talks about recognising and addressing the key drivers of the operating companies whilst acting as a technology innovator.

To read the full article click here.  

Cognitive Geology wins two awards in one night

Cognitive Geology was nominated for awards at both the Digital Tech Awards and The Made In Scotland Awards 2018. The awards were presented last night (26th April) in Glasgow.

Our COO Eileen McLaren was representing us at the Digital Tech Awards and we are very honoured to have been named the Digital Technology Business of the Year (Emerging, less than 10 staff and less than 4 years of age).

Meanwhile across town Luke and Alex were sporting their kilts at the Made in Scotland ceremony, where they were on the spot to accept the Exporter of the Year award. 

Click through the images to see the full lists of winners.

Eileen at the Digital Tech Awards 

Luke and Alex at the Made In Scotland Awards 

Luke Nominated for EY 'Rising Star' Award

More exciting news for Cognitive Geology today, our founder and CEO Luke Johnson has been announced as a finalist in the 'Rising Star' category of Scotland's EY Entrepreneur of the Year!

 

Luke is quoted as saying “Discovering that I’ve been selected as a finalist in the EY Entrepreneur of the year awards tops off what has been an amazing first quarter of 2018 – we’ve just moved into our brand new offices, have grown the team to 25 and are working on some really exciting new contracts.”

To read the full article about Luke and the other nominees, click here.

 

 

Eileen Joins the Board

Eileen-M-CROP.jpg

More exciting news here at Cognitive Geology! We are really excited to announce that our VP of Software Engineering, Eileen McLaren, has agreed to take on the role of Chief Operating Officer and has also joined the board as a director.

Here's an excerpt of the article from DIGIT magazine by Brian Baglow. See the original article here.

 

One of the key people in both Skyscanner and FanDuel, Eileen McLaren, joins the ambitious geo-technology company’s board – aiming for a trifecta of tech unicorns.

Eileen McLaren, one of Scotland’s most respected digital technology executives, having been instrumental in scaling tech unicorns Skyscanner and FanDuel, has joined the board of Cognitive Geology.

Having helped Skyscanner and FanDuel scale from start-up to global success, Eileen joined Cognitive Geology as VP of Software Engineering in July 2017. She has now been promoted to Chief Operating Officer (COO) and appointed a director, taking a seat on the company’s board.

Founded in 2014, Cognitive Geology designs and builds specialist software for geoscientists in the oil and gas industries, improving accuracy in finding, appraising and developing oil and gas reserves. In 2017 the company secured £2 million from Maven Capital Partners and Enso Ventures to develop their range of product solutions for the oil and gas industry.

Cognitive Geology’s Hutton software claims to help companies in the oil and gas sector be ‘approximately accurate, rather than precisely wrong’. Hutton reduces oilfield investment uncertainty by extracting progressive trends from complex geological datasets. Geologists can identify and sequentially remove the effects of the various geological processes, resulting in a vastly reduced range of uncertainty.

The geoscience software industry is estimated to be worth an annual $4.5 billion and is forecast to double over the next several years as antiquated software is replaced by next-generation technology solutions. In the recent past, Cognitive Geology has secured a seven-figure contract with one of the world’s largest energy companies.

Eileen McLaren, COO, told DIGIT:  “I’m delighted to join the board of Cognitive Geology as I see it as one of the most exciting early stage tech businesses in Scotland at the moment. I’ve been fortunate enough to have worked at some of the most successful tech companies around and the planning, drive and the quality of the execution I see at Cognitive Geology has the same hallmarks I saw in the early days at Skyscanner & FanDuel.”

 

Cognitive Geology Raises Investment To Fuel Growth

(Back): Brian Corcoran, Eileen McLaren, Mark Collins, Kelli Buchan. (Front): Fiona Johnson, Luke Johnson, Alex Shiell.

We interrupt our regular geology programming with a special announcement: we're delighted to report that Cognitive Geology has recently raised a £2 million investment round to help us scale our business and get our first product Hutton into the hands of more geologists around the globe.

As regular visitors to this blog will know, at Cognitive Geology we have been developing the next generation of technical software for geoscientists and reservoir engineers since the beginning of the industry downturn – helping geologists do their jobs more effectively, and thus helping oil companies in their quest for more cost effective solutions under the new price regime. Now with several major customers on board, at this point in our journey we're very pleased to announce two investment partners in the form of UK private equity house Maven Capital Partners and Enso Ventures, a London-based early-stage venture capital firm.

Here's what our CEO Luke Johnson had to say about the news: “The oil industry, one of the first to adopt 3D visualisation and high performance computing, once led software innovation but today it has fallen behind gaming & cloud technologies. As geologists and technologists ourselves, our mission is to take advantage of modern software architectures to improve the efficiency of our peers around the globe by giving them the best software possible, for the complex tasks they face each day.

To that end, the feedback from our customers so far has been overwhelmingly positive. Today’s low oil prices increase the requirement for better technology. Enso & Maven, both forward-thinking investors, were quick to recognise the fortuitous timing of our company growth. This investment round will provide us the capital to scale our Edinburgh headquarters and position ourselves as innovators of 3rd generation software solutions for the energy services sector."

David Milroy, Investment Director at Maven said: “Luke is a true entrepreneur having created Hutton to tackle the challenges he encountered himself when working as a geologist responsible for high value upstream oil and gas projects. Cognitive has a differentiated approach that provides geologists with the most robust geological explanations for the data they observe and in doing so helps them make more informed decisions. Luke has assembled a very capable technical team as well as an experienced advisory board and we look forward to helping the team execute their exciting plans.”

Enso's CFO, Oleg Mukhanov, added: “We are excited to support this cutting-edge business, which is set to revolutionise the way geological data is analysed and interpreted. We strongly believe that Cognitive’s high calibre team is capable of delivering on this task, which will put them at the forefront of their industry.”

Thanks to everyone in the geology community who has supported us so much thus far: hugely appreciated - you are the folks we are building for!

Ok: that's all on the investment side for now - back to the geology!

Simulation Sprints: Minimal Cost, Maximum Value

We're at the Cognitive Whiteboard again with Luke, discussing simulation sprints, and how a lean and iterative approach can provide robust business recommendations, in a fraction of the time taken to build a “perfect” model.

Click on the image below to see a larger version.

Cognitive+Whiteboard+simulation+sprints.jpg

Transcription

Simulation Sprints: Minimal Cost, Maximum Value

Hello and welcome back to the Cognitive Whiteboard. My name's Luke and today we're talking about simulation sprints. This isn't a technical workflow, it's a project management one. But it's something that's completely revolutionised the way I do my work and I'd like to share with you how this can help you make much more cost-effective analyses of your reservoirs in a vastly shorter period of time.

Now, the method is not something I claim credit for, because it was put to me as a challenge by Michael Waite at Chevron, who once came to me with a new field to look at, a mature field with a lot of injection and production, and asked me to give him a model by the end of the day. My reaction to that was shock - I guess that would be the polite way of putting it - because that's obviously an impossible ask: to build a reservoir model and understand what's going on in a complex brownfield in just a single day.

But Michael wasn't being silly. He was challenging me to use a methodology that would allow us to make quick and effective decisions when we clearly knew what the upcoming business decision was going to be. What we had in this particular case was a mature field with only about 12 or so locations left for us to drill, and six well slots that we needed to fill. We had wells that were declining in production and we were going to replace them. So there really wasn't any choice other than optimizing those wells' bottom-hole locations.

And so, in that context, you can come back and say: well, we don't necessarily need to answer the ins and outs of the entire reservoir; we need to rank and understand those locations so that we can drill the optimum ones during the campaign. And he introduced to me this concept of simulation sprinting. What it is, is a quick-loop study that you can repeat numerous times, iterating through progressive cycles of increasing precision until you get to a point of accuracy that allows you to make a valid and robust business recommendation.

In the first one, a single day, we were not going to be able to build a realistic reservoir model by any means. What we were able to do in a single day was some pretty decent production mapping. So, taking the last six months of production and looking at the water cut, we got together with the production mapping team and designed a workflow we could complete that day, one that would start to address the bigger objective by telling us which targets should have the lowest water cut, because that's another value measure we could use to understand these wells.

Importantly, because we're gonna do this a lot, even though we call it sprints, the key is to work smart, not hard, because it's gonna keep going on over time. So, you wanna be able to do this within the normal working hours. Don't burn the midnight oil, otherwise, you'll burn out before you get to make your robust business decision.

The really important piece in this cycle, though, is step number five. When we come to assess the outcomes of any one of the experiments that we've done, we need to rank the wells in order of economic value. So, whatever way we were trying to devise it, we needed to have those well targets ranked from best to worst at the end of each one of these simulation sprints. That's what Mike was asking me for at the end of the day.

When we do this assessment, we also spend time looking at the weakest part of the technical work we've done, because that's going to form the basis of the objective for the next sprint cycle. And we could come back and progressively increase the length of this sprint loop: the first one was done in a day, the second in two days, then four and eight and so on. But we can adjust this as needed to suit the experiments we have to run.

But as we came back through this loop and constantly re-ranked the targets, what was fascinating is that after only four weeks, the answer never changed again. The order of the top six wells was always the same top six. And what that shows you is that, with some quite simple approaches, you can get to the same decision that you could with a full Frankenstein model. You can get to that recommendation without having to do years' worth of work.
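A minimal sketch of that convergence check, with invented rankings: after each sprint, re-rank the candidate targets and see whether the set filling the available well slots has stopped changing.

# Illustrative check that the top-N ranking has stabilised across sprint cycles.
sprint_rankings = [
    ["W3", "W7", "W1", "W9", "W2", "W4", "W5", "W8"],  # sprint 1 (1 day)
    ["W7", "W3", "W1", "W2", "W9", "W5", "W8", "W4"],  # sprint 2 (2 days)
    ["W7", "W3", "W9", "W1", "W2", "W5", "W4", "W8"],  # sprint 3 (4 days)
    ["W7", "W3", "W9", "W1", "W2", "W5", "W8", "W4"],  # sprint 4 (8 days)
]
n_slots = 6
top_sets = [set(r[:n_slots]) for r in sprint_rankings]
for i in range(1, len(top_sets)):
    print(f"sprint {i + 1}: top {n_slots} changed = {top_sets[i] != top_sets[i - 1]}")
# Once the top-N set stops changing sprint after sprint, the recommendation is robust.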

So, we were able to make that recommendation. It was an expensive campaign, so we didn't stop at four weeks; we ended up stopping at about four months. But really importantly, it would normally have taken six months, perhaps a year, to get to that kind of stage, and we were already significantly ahead of that at the end of this routine. So, it's a method that has really changed the way I do my work, and it's something I really recommend you give a go. Hopefully you enjoyed this, and I'll see you back here next time at the Cognitive Whiteboard.

MANAGING UNCERTAINTY: ROBUST RANGES USING TRENDS

Warming to his heretical theme, Luke is back to discuss using trend analysis to drive uncertainty in geostatistical models.

If you'd like a demo of Hutton - our product that Luke mentioned - just drop him a note: luke@cognitivegeology.com.

I think this is my favourite whiteboard drawing yet - enjoy it in glorious full screen by clicking below!

TRANSCRIPTION

REPLACING THE VARIOGRAM: ROBUST RANGES USING TRENDS.

Hello, and welcome back to the Cognitive Whiteboard. My name's Luke, and wow, did that last video generate a lot of interest in the industry! What we did was we talked about how variograms and trend analysis can work hand in hand to try to investigate how your properties are being distributed in three dimensional space. Today I want to show you how we can use the trend analysis to drive the uncertainty in your models as well. In doing so, I think I'll officially be promoted from geostatistical heretic to apostate. But let's see how we go.

What I want to do today is really run you through how I used to go about doing geological uncertainty management and how I do it today. I started by thinking about shifting histograms. I think a lot of us do this: if we wanted to get a low case, asking what if the data were worse than what we observed, we could shift the histogram down, or for a high case we could shift it upwards in the other direction. I've done this many times in the early parts of my career. It's not a particularly valid way of doing it in many cases. When you just shift the histogram and still fit to the same well data, you'll generate pimples and dimples around your wells, which is undesirable. But if you shift the observed data as well, by saying, "Well, the petrophysicist has some uncertainty in their observations," what we're really beginning to invoke is that the greatest degree of uncertainty is at the wells. And I think we can all agree that the greater degree of uncertainty is away from the wells. There are important uncertainties here, but we have bigger ones to deal with up front.

The other way of trying to manage our uncertainty is in the structure of how we distribute that data. Different variogram models are useful for doing this. We can fairly say that the interpretation of a geological variogram - the experimental data that you get - is usually uncertain, particularly in the horizontal direction. We don't have enough information there to be confident in how that variogram structure should look, so it's fair to test different geological models and see what will happen. What's interesting is, of course, that if you vary the histogram, you'll change STOIP with a similar recovery factor, just generally better or worse; whereas if you change this, you'll vary the connectivity, but you won't really change the STOIP very much. And it's often difficult to link this variogram structure back to a conversation you can have at the outcrop.

So over the last, I guess, five or six years now, I've been focusing on addressing uncertainty by saying, "Actually, the sampling of our data - the biased, directional drilling with which we've typically gone out and sought the good spots in the reservoir - is really what we need to investigate." How much does that bias our understanding of what could exist away from the well control?

I've got an example here, a top structure map of a field, a real field, with five appraisal wells along the structure, and the question is: at these two other locations that are going to get drilled, is it going to be similar, different, better, worse? And how could we investigate the uncertainty of the outcome at those locations?

We could, for example, investigate: is the variation that we observe in this data set primarily a function of depth? Or perhaps it's a function of some depositional direction - in this case, the Y position, as a theory. We don't know at this stage which one it is, and depending on which one we do first, we end up with different correlations. In fact, you can see in this sequence that after taking out the Y position, if I analyze the residual against depth, we end up with not two porosity units per 100 meters of burial, but only one porosity unit being lost as a function of depth. So you can see that, fundamentally, we're invoking a different burial relationship as a result.
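As a minimal sketch of those two orders of trend fitting, run here on synthetic data rather than the real field Luke is describing; it fits a Y-position trend first and then analyses the residual against depth:

import numpy as np

# Synthetic well samples where burial depth partly tracks the Y position,
# so the two candidate trends overlap (purely illustrative).
rng = np.random.default_rng(0)
y_pos = rng.uniform(0, 10_000, 200)
depth = 2_000 + 0.03 * y_pos + rng.normal(0, 30, 200)
porosity = 0.30 - 1e-5 * y_pos - 1e-4 * (depth - 2_000) + rng.normal(0, 0.005, 200)

# Raw depth trend (both effects mixed together).
raw_slope_z, _ = np.polyfit(depth, porosity, 1)

# Take out the Y-position trend first, then analyse the residual against depth.
slope_y, intercept_y = np.polyfit(y_pos, porosity, 1)
residual = porosity - (slope_y * y_pos + intercept_y)
resid_slope_z, _ = np.polyfit(depth, residual, 1)

# The depth relationship you infer changes substantially depending on which trend you remove first.
print(f"porosity units lost per 100 m of burial: raw {abs(raw_slope_z) * 1e4:.1f}, "
      f"after removing the Y trend {abs(resid_slope_z) * 1e4:.1f}")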

What's really interesting is that these are highly uncertain interpretations, but they're valid ones. And they give us different answers, not just in terms of the absolute values at those two well locations, but for the entire model. And this is very representative of your uncertainty in three-dimensional space. This input histogram, with that particular shape, has more to do with the sampling of your wells in the particular locations they were drilled, whereas the trend model behind it is helping you understand whether there is any variation in three-dimensional space going on. So we can end up with very different plumbing and in-place volumes by really investigating how these trends could go.


You can do all of this manually one after the other through these routines. Our product Hutton does it for you automatically. We run about 300 different paths through your data set to try to investigate how that could go, and we find it's a very powerful way of really developing robust but different geological interpretations to address your uncertainty in your reservoir. If you're interested, drop me an email: I'll let you know more about it. But for now, that's all from the Cognitive Whiteboard. Thank you very much.

Replacing the Variogram

This week, Luke nails his sacrilegious thesis to the door - that the variogram can be replaced by quality data analysis. 

We're building a product called Hutton to do exactly what has been outlined above. If you'd like a sneak peek, drop Luke a line at: luke@cognitivegeology.com

To dig deeper into this week's whiteboard, click on the image below to get a fullscreen version.


TRANSCRIPTION

Replacing The Variogram

Hello. Welcome back to the Cognitive Whiteboard. My name is Luke, and today I'm nailing my thesis to the door. I am going up against the institution and committing geostatistical heresy by showing you that variograms are not an essential element of our geological modeling processes. In fact, I want to show you that with careful trend analysis, you can do a better job of producing geological models that represent your geology in a much more powerful way and give you much more predictive answers.

To do this, I've built a data set. It's a complex, quite geologically realistic data set, with some tilted horst blocks through here into which we have modeled some complex geological properties. We vary those over x and y by some unknown function, we have some very complex sequence cyclicity of fining-upwards and coarsening-upwards trends, and we've overprinted all of this with a burial trend. What we want to do is see how we go about representing that with this perfectly patterned drilled data set, either with trends or with variogram analysis, and determine which one we think is better.

So, let's first have a look at the variogram of the raw data set, and we can see immediately that some of those structures, some of those trends that we imposed on the model, are showing up in our variograms. We have some obviously low sills in some of the data sets that have some structure, some correlation over distance, before they flatten out into our sill. But we do have some weird cyclicity happening, and we should wonder what's going on there. In truth, we know that this is just the complexity of having lots of different processes creating nested variograms and various sequences of cyclicity. And all geologists familiar with this kind of routine will know to try to take some of these trends out.

One way we could start is by subtracting the stratigraphic trend. This isn't often done, but it's very, very powerful. You could take, for example, a type log and remove that from your data set, or you could do what I've done here and essentially subtract the midpoint or the mean value from every one of your K layers and see what you get after you take that out. You're basically representing sequence cyclicity when you do this. You want to keep that trend, this black line here, because you have to add it back into your model afterwards. But when you do it, you see a reduction in the vertical variogram, as you would expect: we have described a lot of the variation occurring in this direction as a function of sequence cyclicity, and it's not just random. Typically, you'll see a reduction of probably half of the variation in the vertical sense. But it won't have any impact on the major and minor directions, because the same columns exist everywhere in x and y space.

Once we take that trend out, we'll have a new property - porosity given that trend - and we can do a trend analysis on that property. So now we're doing it against depth. And what's interesting is that as you take out trends progressively, you start to see the second- and third-order effects that might not have been obvious in the raw data set. In this case, it really tightened up our observation of the depth trend. We can subtract that next; take that trend out, because it's not random, and see what it does to our variograms. Now, this one changes the major and minor variograms, not the vertical one, even though that seems counterintuitive, and it does so because your burial depth varies as a function of map position. That's why it changes those two variograms.
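As a minimal sketch of those first two subtractions, assuming each upscaled well cell carries a K-layer index, a depth and a porosity; the file and column names are illustrative, and both trends would need to be stored so they can be added back when the model is rebuilt:

import pandas as pd
import numpy as np

cells = pd.read_csv("upscaled_well_cells.csv")   # hypothetical export: k_layer, depth, porosity

# 1) Stratigraphic trend: subtract the mean of each K layer (sequence cyclicity).
layer_mean = cells.groupby("k_layer")["porosity"].transform("mean")
cells["resid_strat"] = cells["porosity"] - layer_mean

# 2) Burial trend: fit the residual against depth and subtract that too.
slope, intercept = np.polyfit(cells["depth"], cells["resid_strat"], 1)
cells["resid_final"] = cells["resid_strat"] - (slope * cells["depth"] + intercept)

# Whatever structure remains in resid_final is what the variogram still has to describe;
# with the trends doing the work, it should be close to pure nugget.
print(cells[["resid_strat", "resid_final"]].describe())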

And again, we can keep diving deeper and deeper into our data set, removing progressive trends, linking them to geological processes, and pulling them out of the data. In the end, with this perfect data set, if we had described all of those trends, you would see no structure left in your data, or next to no structure. And that's because you have done a pretty good job of describing geology, not just random processes. Your nugget and your sill are almost identical. That means the observation tells you just as much about a point immediately next to it as it does about one at some great distance. That's great: you no longer need a variogram. You have done it instead with trends.

Now, this is obviously a perfect data set with an unrealistically perfectly sampled series of wells. Let's imagine what we would do with a realistic sample set, with much sparser and more biased samples. Well, most gross geological trends are obvious even in small numbers of samples. But these horizontal variograms are something that we basically never get to measure in the real world. And so we often spend our time in peer reviews defending whatever settings we have chosen in these major and minor directions, with no basis or outcrop that we can link them to.

So, if you want to do something in this space, you can make your models much more predictive, because you end up driving geology into them and removing the dependence upon your random seed. You can do all of this in pretty much any commercial package today, but it's not particularly easy or intuitive. So, we've gone ahead and built a product for you that will do this in a much more powerful way. We call it Hutton, named after the father of geology, because with these small observations, we can make interpretations that can change our understanding of the world.

Hutton comes to the market in March 2017. It will help guide you through this trend analysis process, it even has some intelligence that can help you automate that. And if you're interested in finding out how to throw away the variogram and bring geology back into your geological models, please drop me an email, and I'll happily show it to you. But for now, in the meanwhile, that's all from us, and I'll see you next time at the Cognitive Whiteboard.

Learning to Speak Cajun: Geomodeling for Well Planning

Luke is back at the Cognitive Whiteboard, looking at the importance of learning the language and processes of your colleagues when planning wells.

Click on the image below for a detailed view of this week's whiteboard.

download (3).jpg

TRANSCRIPTION

Hello, welcome back to the Cognitive Whiteboard. My name's Luke, and today I'm going to use my outrageous Australian accent to try to speak Cajun. I want to talk you through a drilling history that I've been involved with, a six-well campaign where something went wrong and we used geomodeling methodologies to make sure that it wouldn't happen again. 

So, let me talk you through the field first. It's a salt raft on the West African coast that has a number of different reservoir units. The main productive interval was this yellow one through here, where the lower two units in particular were of much better quality, and we'd had a lot of commingled production across those zones. The field had initially been exploited on primary decline and then water injection, so it's quite a complex reservoir management story.

What we needed to do was make a better job of exploiting the shallowest unit of that main reservoir. We had, however, a reservoir above us that had been under production for some time, and that production had changed the associated pressure regime in ways we hadn't accounted for properly. So, in the pre-drill pressure prediction, we essentially said that the pressure was going to be relatively consistent across the field. The fracture gradient would therefore also be pretty similar, our mud weight window required only one casing depth, and we would drill through both the shallow unit and the target reservoir with the same open-hole section.

When we drilled it, however, we encountered a problem, and the problem was this. Above that target we had had the biggest producer in that shallow reservoir, and it had been causing a lot of pressure depletion locally in that area. We had a subsurface blowout at that point. We took a kick that we hadn't accounted for - we didn't anticipate it, it hadn't been observed before - a small kick that would have been easily contained by the well design. However, with the depletion above, its impact on the fracture gradient of the shallow reservoir, combined with that additional kick weight, resulted in a fracture in here and massive losses of our mud system into that reservoir unit. We even ended up producing quite a lot of that drilling mud from the reservoir later on.
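For a sense of scale, here is a rough, generic poroelastic estimate of how depletion pulls down the fracture pressure in a zone like that shallow reservoir; the numbers are illustrative assumptions, not figures from the field Luke describes:

# Rough stress-path estimate: change in minimum horizontal stress (and so, roughly,
# in fracture pressure) per unit of pore pressure depletion. Illustrative values only.
biot = 1.0
poisson = 0.25
stress_path = biot * (1 - 2 * poisson) / (1 - poisson)   # ~0.67 psi per psi of depletion

depletion_psi = 1500            # assumed local drawdown around the big producer
frac_pressure_drop = stress_path * depletion_psi
print(f"fracture pressure reduced by roughly {frac_pressure_drop:.0f} psi")
# A mud weight and kick tolerance designed for virgin pressures can then sit above the
# local fracture pressure in the depleted zone, giving exactly the losses described above.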

So, we had a situation where that well design didn't work. The problem was we had five other wells, designed identically, that needed to do the same job, and from the drillers' perspective, they were not going to touch that design with a bargepole. We were just post-Macondo, so everyone was even more sensitive than normal around drilling parameters; we didn't want anything going wrong. What they wanted to do was redesign with an additional casing string, which would have required us to case off between the two units, which are quite close together, preventing us from being able to get a horizontal section into the target reservoir.

That sub-vertical production would have resulted in around 60 million barrels of lost reserves. So, we really needed to check whether that change was truly necessary. This was a field in late production life, so it was probably the last campaign that would ever exploit that particular reservoir, and we really wanted to be sure that this was what had to be done.

 

We spent some time with the drillers and realized that they do a lot of their benchmarking in 1D. They're doing that in these kinds of pressure-elevation plots: they essentially dump all of the wells onto these diagrams, look out for the red spots where there's some crossover, and say, "Well, there's a one-in-X chance of that happening, and therefore that's not an acceptable risk."

What we were able to do was build a geomodel that went all the way up to the surface, incorporating all of the basic rock properties of these reservoirs but actually copying in the current-day pressures from our history-matched simulation models. We were able to show that the three-dimensional relationships of these pressures today, under the current pore pressure regime, meant that the other wells were going to be safe with that design. In fact, surprisingly, they are all placed underneath injectors, and the effect was actually reversed.

So, by putting all of this together and inverting our model into drilling mud weight, we were able to generate, using geomodeling technologies, a case that communicated very, very clearly to the drillers that the wells in the remaining campaign would be safe. So, we executed the plan as we had initially designed the wells, and they all came in safely, and they're now producing very nicely. Without doing this kind of work and linking it all together into a single story, using those three-dimensional models and the current pore-pressure-matched simulation models, we would have missed some 60 million barrels of additional reserves.

I hope you liked this little example. I would love to hear your stories about similar problems in a production setting. Thank you very much.


Tetris in the third dimension: object-based modeling

Luke is back at the Cognitive Whiteboard to take a look at how object-based modeling is a lot like playing your old Game Boy favourite in 3D.

Click on the image below for a detailed view.

tetris+in+3d.jpg

TRANSCRIPTION:

Tetris in the third dimension: object-based modeling

Hello, and welcome back to The Cognitive Whiteboard. My name's Luke, and today we're playing Tetris in 3D.

Geological model from a different perspective

I want to talk today about how I like to use object-based modeling for some of the more complex facies environments that I encounter. I'm going to use as an example this complex fluvial system that Boyan, from the Wave Consortium, recently highlighted in a fantastic LinkedIn post, where he showed how you can use Google Earth to understand the true complexity that can occur at the geocellular scale of a geological model.

The original article by Boyan is here

So, this trace of his study through here has 50 by 50 meter grid cells overlain on it - a typical size, perhaps, for a geological model. What we can see, when we look at it, is that each individual cell within this system has radically different potential permeability relationships. We might have cells that are mostly sand but, with a mud plug along one edge, would have no permeability in an east-west direction. Likewise, it could be the north-south orientation that gets cut off by a mud plug. What it means is that when we try to represent the facies inside this system, it's not suitable just to think about rock types at one point and then continuous properties in isolation: we need to pay attention to the relationships of both the facies and the continuous properties of those facies in 3D.

And so, with object-based modeling we can do this. But, importantly, before we get in there: you can't model this system with one of these. The fluvial environment models that you see in object-based modeling typically look like these channel systems, often just a simple sinusoid with some levee banks. I can't see that anywhere here; I don't ever see that. In fact, what I usually see in these kinds of systems are things that look more like an aeolian dune or an oxbow lake. So my recommendation: I'm usually using shapes like the aeolian dunes, one after the other, to represent the development of those point bars that come across these systems.

 

Connecting the pieces: object-based rather than pixel-based modeling

We do have levees. We do have overbank deposits. But by far the dominant elements are those depositional features that occur because of channel migration - so we want to represent that in our geology. So, how could we then use a shape like this and still get that kind of behavior in the permeability vector? What we can do with object-based modeling - and this is somewhat unique to object methods, as opposed to pixel-based methods - is preserve individual orientation attributes of each one of the objects. In particular, directional trends, depth trends, and distance and curvature are properties that you can carry out of any one of your objects, and you can use these when you come to do your correlations for porosity and permeability.
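To make that concrete, here is a hedged sketch in Python of how those per-object attributes might feed a permeability correlation. The attribute names, distributions and anisotropy ratio are all hypothetical, not any particular package's output:

# Sketch: turn per-object attributes into directional permeability.
# Attribute names and numbers are invented for illustration only.
import numpy as np

n_cells = 10_000
rng = np.random.default_rng(42)

# Per-cell attributes that an object-based model can carry out of each object
azimuth_deg = rng.uniform(0.0, 360.0, n_cells)   # local object (flow) direction
rel_depth = rng.uniform(0.0, 1.0, n_cells)       # 0 = top of object, 1 = base
perm_iso = 10 ** rng.normal(2.0, 0.5, n_cells)   # isotropic permeability, mD

# Fining-upward point bar: coarser (higher perm) at the base of each object
perm_iso *= 0.3 + 0.7 * rel_depth

# Strong anisotropy: high permeability along the object, low across it
k_along = perm_iso
k_across = 0.05 * perm_iso

# Project onto the grid's I/J axes using the local object direction
az = np.radians(azimuth_deg)
kx = k_along * np.cos(az) ** 2 + k_across * np.sin(az) ** 2
ky = k_along * np.sin(az) ** 2 + k_across * np.cos(az) ** 2

print(f"median kx/ky ratio: {np.median(kx / ky):.2f}")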

download (2).jpg

So, how could I represent, perhaps, this feature here using those outputs? Well, if I knew that I had an oxbow lake coming around the edge, then I could preserve that behaviour: permeability is high along the direction of the object but low across it. Likewise, I might want to say that these point bar systems have coarser sediment at the base and fine upwards, so that depth trend could come in very, very handy. It's very difficult, when you're doing your data analysis, to have hard data to correlate this against, so you're gonna have to model it based upon your understanding of the geology. But this method, together with that third dimension of context, is really the only approach I've ever seen that works to deliver the relationship between rock types and continuous properties - except for one other method I've seen developing, coming out of Geneva, which uses multi-point statistics to simulate rock types and continuous properties simultaneously. As far as I know, that's not commercially available yet. Anyway, I hope you enjoyed this. Thanks very much, and I'll see you again at The Cognitive Whiteboard next time.

My 24 Million Dollar Mistake

At the Cognitive Whiteboard this week Luke describes a 24 million dollar mistake that he made, and how it changed the way he thinks as a geologist.

Click on the image below to get a more detailed view of this week's Cognitive Whiteboard.

My+24+Million+Dollar+Mistake.jpg

TRANSCRIPTION:

My 24 Million Dollar Mistake

Hello, welcome back to The Cognitive Whiteboard. My name's Luke and today we're not going to talk about technical best practices. I'm going to share with you an example of why I think communication is at least half the job that we do. I'm gonna illustrate that with an example from my history where I think I made a 24 million dollar mistake in an appraisal well.

Firstly, the well was drilled safely, with no environmental impact, and we achieved all of our appraisal objectives on time and on budget. So it wasn't a mistake in that regard - but I will explain to you why I think it was. We had a setting where we were drilling for a lowstand sandstone. It was an unusual target for the region: typically we were looking for something much deeper, but this lowstand sat essentially within marine shales. It was in 1,500 meters of water - so quite deep for us to drill from - and it lay beneath a very complex overburden of submarine canyons, cuts and fills, filled with various claystones and calcilutites, making depth conversion very, very difficult.

download.jpg

Unusual Reservoir

The reservoir itself was also quite unusual. What we had in this reservoir was structural clay. If you haven't seen this before: it's common for us to see dispersed clays, typically authigenic cements occurring at the grain boundaries, and laminated clays, which are commonly depositional. Structural clays, though, few of us had ever encountered. Here, bioturbation had been so pervasive that these little creatures had essentially concentrated all the clay into fecal pellets, which were providing framework support for the reservoir. So despite a 30% to 40% clay content, we had fantastic porosity and permeability.

However, those pellets were relatively ductile, and what we observed in the core was a dramatic reduction in porosity and permeability with increasing external stress. So the theory was that if we went deeper, particularly deeper below the mud line, we would probably expect to see poorer reservoir quality.

"What do you mean, you drilled it anyway?"

We Were Right - But We Were Wrong

And so I went to my mentor and explained what had happened. To my surprise, he put his head in his hands and said, "If you knew the answer, why did you drill that well?" That really put me back on my heels; it was a turning point in my career. And this is where I think my 24 million dollar mistake came in. If we knew this so well, and had such good technical justification, then had we worked harder on our "Rosetta Stone" of translating technical jargon into business speak before that conversation - had we built a bridge between those two divides - we might have postponed a 24 million dollar well.

Now, this was a major capital project, so the well was always going to get drilled, and it was a data point that we needed. But could it have been delayed? I don't know that for sure. I look back on that moment and refer to it throughout my career - and it's the reason why I spend so long on these boards - because communication is at least half of the job that we should be doing as geologists.

Thanks very much. I'll see you again here next time.

Juggling Multiple Vertical Facies Proportional Curves

Responding to feedback from our viewers, this week Luke pulls an old trick from up his sleeve: how he blends multiple vertical curves together to produce a coherent three-dimensional trend model. Enjoy!

If you'd like to explore this week's whiteboard, click the image below to get a closer look.

How+to+create+a+three-dimensional+facies+model.jpg

TRANSCRIPTION: 

Juggling Multiple Vertical Facies Proportional Curves

Hello, and welcome back to The Cognitive Whiteboard. My name is Luke and today we're filming the third video in our series on facies modelling. I was planning on talking more about methodology at this point, but there was so much interest in how I blend multiple vertical curves together into a coherent three-dimensional trend model that I thought I'd show you exactly how I do it. So let's get started.

WELL DATA AND VERTICAL CURVES

1476796763617.png

What we want to do is have a conceptual depositional model in mind. This is something you have built up from your field data and analogous nearby data sets, so that you understand how the facies fit together in the overall system of that formation. Then we'll take a look at the wells we've acquired and see where they cluster inside that depositional model. Because what we're going to do is say that these clusters of data give us some observed information, but it's an incomplete set; we need it to represent that depositional suite in a more robust fashion.

So we're going to cluster our wells together. In this case, I've got three particular groups that I've used that represent my depositional model, and I'm also concerned that there might be another set of facies that hasn't been sampled by my wells.

So when I start with my data, I've upscaled the facies logs three separate times, once for each group of wells, and now I'm going to build some vertical proportion curves. I'm going to use some detective work to take this observed set, which has a terrible sample rate, and construct something that represents that depositional region of the model. We're going to do that for each one of the scenarios that we want to carry, including the one that we haven't sampled, which we'll base entirely upon depositional concepts. So we build vertical curves four times over, and every IJ column of your grid will then have these percentages of these particular facies types.
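If you want to see the mechanics, here is a minimal Python sketch of building one observed vertical proportion curve from upscaled facies logs. The facies codes and well data are invented, and in practice this curve is only the starting point you then reshape by hand to honour the depositional concept:

# Sketch: one vertical proportion curve from upscaled facies logs.
# The facies codes and well logs below are invented for illustration.
import numpy as np

n_layers = 40
facies_codes = [0, 1, 2]                       # e.g. 0=shale, 1=sand, 2=carbonate

# Upscaled facies per layer for three wells in one cluster (-1 = no data)
rng = np.random.default_rng(1)
wells = rng.choice(facies_codes + [-1], size=(3, n_layers), p=[0.4, 0.35, 0.15, 0.1])

# Proportion of each facies per layer, ignoring undefined cells
vpc = np.zeros((n_layers, len(facies_codes)))
for k in range(n_layers):
    column = wells[:, k]
    column = column[column >= 0]
    if column.size:
        for f in facies_codes:
            vpc[k, f] = np.mean(column == f)

# Each row sums to 1 where data exist: the raw curve you would then smooth
# and adjust by hand so it reflects the depositional concept, not just the samples
print(vpc[:5])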

Now, obviously, this is tremendously uncertain at this stage, so it's well worth considering whether it needs to be analysed multiple times so that you're representing your uncertainty appropriately. But let's say for now that these are the trends we want to carry. So, essentially, we have picked up the vertical trends from our depositional model, anchored to the wells, and now we come down and ask, "How are we going to distribute this geological behaviour onto the grid?"

MAPS AND EQUATIONS

1476797266971.png

What I do then is construct some simple two-dimensional maps that overlay the reservoir. I can now map out where vertical curve one should exist, where two, where three, and where four should be in a map sense. This is simply a surface that goes literally from value one to value four, and I've sculpted it myself using geological intuition.

So, for example, I want to say, "What if this carbonate facies exists?" Perhaps it could be in this corner of the map, or the other corner, or perhaps along the entire edge, and I want to see whether this changes the answer for the particular business decision I'm facing next. And I can blend these four properties - each the same everywhere in three-dimensional space - into a model that has a different vertical curve at every point in space by using this equation here.

You have to increment that for each one of the trends that you're doing and, obviously, you could combine one and four if you wanted to - think about how you could do that mathematically, and you can change the numbers around a little bit. But in doing so, you can blend these curves together so that if you're 30% of the way between one and two on the map, the vertical curve you get will have 70% of this facies assemblage and 30% of that facies assemblage, with those same vertical patterns. In doing so, you can really take control of the way your facies models appear, and I find this a fantastic way to generate very different geological scenarios and test what's going to change my next business decision.
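Here is a minimal Python sketch of that blending equation, with hypothetical arrays throughout: a hand-sculpted control map valued from one to four linearly interpolates between four vertical proportion curves, so a map value of 1.3 returns 70% of curve one and 30% of curve two in that column:

# Sketch of the map-based blend: a control map valued 1..4 linearly
# interpolates between four vertical proportion curves. Numbers are invented.
import numpy as np

n_layers, n_facies = 40, 3
rng = np.random.default_rng(2)

# Four vertical proportion curves (layers x facies), each row summing to 1
vpcs = rng.dirichlet([2.0] * n_facies, size=(4, n_layers))   # shape (4, n_layers, n_facies)

# Hand-sculpted control map: value 1 in one corner grading to 4 in the other
ni, nj = 50, 50
control = 1.0 + 3.0 * np.linspace(0.0, 1.0, ni)[:, None] * np.ones((ni, nj))

# Blend: 30% of the way between curve 1 and curve 2 -> 70%/30% mix, and so on
lower = np.clip(np.floor(control).astype(int) - 1, 0, 2)   # 0-based index of the lower curve
w = np.clip(control - (lower + 1), 0.0, 1.0)               # fractional distance to the next curve

blended = (1.0 - w)[..., None, None] * vpcs[lower] + w[..., None, None] * vpcs[lower + 1]
# blended has shape (ni, nj, n_layers, n_facies): a different vertical curve in
# every column of the grid, ready to use as a facies probability trend
print(blended.shape, blended.sum(axis=-1).round(3).max())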

I hope this is helpful, and I'd love to hear some critique on the method. So please, feed back in the comments and let me know what you think.

Removing The Blindfold: Taking Control in Facies Analysis

Luke is back at the Cognitive Whiteboard again for the second video in our series on facies - this week he looks at data analysis in facies modelling.

Below is a still image of this week's whiteboard, for your further perusal!

Data+analysis+in+facies+modelling.jpg

TRANSCRIPTION

 

Removing The Blindfold: Taking Control in Facies Analysis

 

Hello! Welcome back to The Cognitive Whiteboard. My name's Luke and today we're filming the second video in our series on facies modelling. In this video we're going to focus on how we do the data analysis behind the facies model. It's arguably one of the most difficult parts of the workflow, so it's a place to take your time and get it right.

First things first

Really, before we get into any of this analysis, we should have had the conversation with the stratigrapher, the geophysicist, and the rest of the geologists in the region to understand what the depositional environment is likely to be in your particular area, and to make sure you understand how best practices reflect that kind of depositional system. The job we're going to do as modellers is to implement that theory in three-dimensional space.

A common starting point is to look at the vertical proportions that we see in the wells, and this isn't an easy task by any means. It's not easy for a couple of reasons. Firstly, we're sampling a discrete property, and we're usually sampling it with a handful of wells, so it's quite difficult to see a lot of character in it. Secondly, we have essentially thrown away a lot of information: we made the choice of binning a bunch of rock types together into a particular facies class. So it's important that we take that observational data, tie it back into the depositional concept, and produce a conceptual vertical curve that mirrors what you're trying to invoke inside the model.

1474908296432.jpg

Worthwhile extra work

Now, I, personally, am not smart enough to get the right curve for a full field in one go. Quite frankly, I don't know what the right vertical proportion curve would be for that model there, in that grid, because it varies across space. Essentially, over here it's 100% of the orange facies and over here there's none of it. So what's the right vertical proportion curve? It really depends upon the grid structure as well as the depositional model. I find a much simpler routine is to generate more than one vertical curve across my model - I do this manually - then draw some polylines and use a little bit of mathematics to blend them together into a combined vertical proportion concept: essentially a three-dimensional model that blends together both my map-based theory and my vertical proportion curves. It's a little bit more work, but I find it gives me a lot more control.

And then, of course, it's very important that we see if we can derive something from the seismic to give us an insight into the reservoir. When we do so, just be aware of the vertical resolution that you get from seismic; there may be a lot of tuning effects. This reservoir is thinning off to the left here, so I would be quite skeptical about what I'm seeing there. The seismic wavelet may be several times thicker than the reservoir target, so it's important to understand: is this an average effect over the whole reservoir, or a particular stratigraphic zone? Any seismic image sees not only the target of interest but also the rock adjacent to it.

And of course, the seismic interpreter, the geophysicist, is going to be seeing the same rocks binned in a different way. They're going to be binning it by acoustic properties and the sedimentologist might be binning it by other methods. So it's quite important, referring back to the first video, that we have that clear and calm conversation so that we can relate these properties to the facies that we have observed in the wells and bring it all together.

1474908320811.jpg

Adding mathematics

And finally, on that point of bringing it together: most tools, when we come to execute the facies modelling routines, allow you to blend more than one trend into a single prior that goes into the geostatistical routine. Most of the time, commercial software gives you slider bars for these various kinds of properties, essentially saying "I love" or "I hate" this property - it works or it doesn't - and the software tries to blend them together. That's arguably popular because it's quite easy to control mathematically in an uncertainty workflow at the end, but I'm not a big fan of it, because I don't get to see the outcome in three-dimensional space before it goes into the modelling routine. So again, it's relatively simple mathematics: a couple of scripts, and you can combine these together yourself and have a nice three-dimensional concept that you get to QC yourself before it goes into the modelling routine.
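As an illustration of that "couple of scripts" idea - with invented arrays standing in for the real trends - here is a minimal Python sketch that combines a vertical proportion trend and a map-based trend into one 3D facies probability cube, renormalised per cell, which is the object you then get to QC before it feeds the geostatistical routine:

# Sketch: combine a vertical trend and a map trend into a single facies
# probability prior and renormalise it. All inputs are invented stand-ins.
import numpy as np

ni, nj, nk, n_facies = 50, 50, 40, 3
rng = np.random.default_rng(3)

vertical_trend = rng.dirichlet([2.0] * n_facies, size=nk)         # (nk, n_facies)
map_trend = rng.dirichlet([2.0] * n_facies, size=(ni, nj))        # (ni, nj, n_facies)

# Broadcast both to the full grid, combine them multiplicatively, then
# renormalise so each cell's facies probabilities sum to one
prior = vertical_trend[None, None, :, :] * map_trend[:, :, None, :]
prior /= prior.sum(axis=-1, keepdims=True)

# This 3D cube is what you get to QC yourself before it goes into the
# facies simulation as a prior probability / trend property
print(prior.shape)                        # (50, 50, 40, 3)
print(prior.sum(axis=-1).round(6).max())  # 1.0 everywhere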

 

I hope this is all helpful. We'll use these concepts in the next video, when we start talking about executing specific routines.