Red Pill or Blue Pill?: The Impact of Fluid Separation Processes

Jim is at the Cognitive Whiteboard again, inviting you to take the red pill, leave The Matrix and question everything you thought was true about production data used for modelling.

For a closer look at the whiteboard see below:

boardstill.PNG

 

TRANSCRIPT

Hello, and welcome back to the Cognitive Whiteboard. My name is Jim Ross and today I'm going to be talking about the impact of fluid separation processes. And I have gone for a Matrix theme with today's whiteboard because I'm going to be asking you to make a choice, the blue pill or the red pill. If you take the blue pill, you wake up at your desk and you believe whatever your production technologists or reservoir engineers want you to believe. You take the red pill, you stay at the Whiteboard and I show you how deep the rabbit hole goes. 

Now that I'm done with quoting The Matrix, what I'm actually going to talk about today is something which I and a number of clients have learned the hard way across our careers, and that's that not all barrels of oil are created equal. 

And to demonstrate that, let's start off with a fixed mass, a fixed volume of reservoir-conditions oil. Now, if I flash that to standard conditions, I will end up with a certain set of properties associated with the two fluids that I get: there will be a gas-oil ratio and then the densities of those two phases. However, if I take the very same oil and put it through a different separation process - I have got an extra stage here - I will end up with a different set of properties: a different gas-oil ratio and different densities for the two phases. Now, that seems reasonable enough: if I start with the same thing but put it through a different process, I will end up with a different result. But it's not the only thing that can affect it. If we haven't got tight temperature control, the result can be affected by the time of day as the temperature changes or, on a longer time scale, by seasonal effects - whether it's summer or winter. 

Now, being a chemical engineer by training, I usually thought about things in terms of mass balance. But when I joined the oil industry, that's not how we operate: we think of things in terms of volumes - in particular, standard-conditions volumes. That's what we use for reporting, and modelling and fiscal allocation tend to be done on the basis of standard-conditions oil rates. But what we've seen here is that, depending on the process I follow, I will end up with different properties and, therefore, in this case, a different rate off the back of that.

So what that means is that the rate we often hold to be gospel is anything but: the rate through path one and the rate through path two are not the same. Now, it's quite common to have different paths - we might have a well-test path and we might have a field process path - and the percentage difference might be quite small. But when we're dealing with things on the scale of the reservoir, that can have quite a profound knock-on impact: on the fluid properties we have already spoken about, on the fluid saturations and how they'll respond to production and pressure changes, on well performance and multi-phase flow, and on field development planning and well design. Are we designing for the correct path to surface? Are we using the correct rates? 

History matching itself can be really complicated by this. If this separation process is changing over the life of the field, how is that being accounted for in the historical rates that we are now trying to match to? Basically, it leads to a very Matrix-like awakening where we suddenly find ourselves questioning everything we thought was real, every piece of production data that we have come across. Not all hope is lost, however. It is easy enough to convert between two paths just by looking at the shrinkage factor through both and then looking at that ratio and correcting appropriately. Now, the most robust way to do that is an equation of state. It's also the most pernickety, and I could do a whole series of videos just on that process. So I'm not proposing that we go through how you do that but it's more to question where those numbers have come from, and it's okay to question them once we are armed with this knowledge. How are those rates measured? Were they measured at all? Were they done using allocation factors? If so, how were they calculated? Has this correction already been attempted? And if so, what was the physical basis for doing that? 
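
As a minimal sketch of that shrinkage-factor correction (the shrinkage values and rate below are invented; in practice they would come from separator tests or an equation of state for each path):

```python
# Minimal sketch: converting a standard-conditions oil rate reported through
# one separation path into the rate the same reservoir stream would report
# through another path, using the ratio of the two shrinkage factors.
shrinkage_path_1 = 0.78   # stb of stock-tank oil per reservoir barrel, path 1 (e.g. well test)
shrinkage_path_2 = 0.81   # same fluid through the field process with the extra separator stage

rate_path_1 = 12_000.0    # stb/d reported through path 1

# The same reservoir-conditions stream, expressed as a path-2 rate:
rate_path_2 = rate_path_1 * (shrinkage_path_2 / shrinkage_path_1)
print(f"{rate_path_2:.0f} stb/d equivalent through path 2")   # ~12,462 stb/d
```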

So, basically, once we have this knowledge, we can take on any modelling challenge. We can use that sceptical attitude about where the numbers have come from to help us narrow down any difficulties we might have with our modelling. Once we do that, we can take anything on and make full use of our awakening to these possibilities. 

I hope that's been helpful. I look forward to seeing you at the Cognitive Whiteboard again in the future. Until then, take care.

Reservoir Dogs: Sequential vs. simultaneous modelling solutions

Jim is back at the Cognitive Whiteboard, trying to avoid a standoff between modelling solutions that is unlikely to end well...

For a closer look at the whiteboard, just take a look below:

TRANSCRIPT

Hello, and welcome back to the Cognitive Whiteboard. My name is Jim Ross, and today we're going to be talking about sequential versus simultaneous modelling solutions. In particular, how to avoid an ugly, confrontational standoff between the two - not unlike this iconic scene from Quentin Tarantino's "Reservoir Dogs".

When we're modelling things in oil and gas, we have a lot of variables in play at any one time. And, broadly speaking, we have two ways that we can address this. We can look at a multi-variable regression, which will try and tweak all the variables simultaneously to minimise our error against the observed data, or we can look at a set of sequential functions of each variable that tries to reduce the error as we go along. So which one should we use? I'm going to say both.


Mathematically speaking, [simultaneous solving] is the preferable solution: if we give it the same starting points and the same search algorithm, we will end up with the same solution. However, we know that physics and, in particular, geology rarely work like that: rarely is everything happening at once, and quite often we have different behaviours imprinting upon each other as we look at the different facets of our modelling. To illustrate this, I'm going to look at a simple mass-balance-based example from earlier in my career. It's just a simple tank - mass in, mass out - and how the pressures and saturations respond to that coming and going of fluid.
 
So, if we look at the reservoir pressure history, we'll see that it has a gentle enough decline to start, then there's a sudden spike, and then it levels off a little bit before it completely drops off a cliff and the production starts to peter out. If we try and use this multi-variable, simultaneous solution on that, we're not going to get a very good history match with this simple model. And that's because we're trying to match fundamentally different periods of behaviour all at the same time. The answer is not to lump more complexity into the model, but actually to take a step back and look at what's happening across the history of this model, this actual field, with this sequential behaviour. In the first period, really, all we've got is fluid expansion, which will depend upon the value of the stock tank oil in place, and the aquifer drive - what's the strength of the aquifer in this field? In the second period, if we look at the production history of the field, we can see that that's the point at which water injection starts. So naturally, that's going to have an impact. But if we can nail down values for the variables [in period 1], we can carry those forward into the second part. Once we've done that, and maybe tuned the transient aquifer response, we can then look at the third period. And it was thought that there had been some sort of fracture event, so fluid was being lost to another tank and, with it, pressure in the field. 
 
So what we can do is take each solution and move it forward to inform the next one. We can take the stock tank oil in place and aquifer strength determined here and carry them forward, where we can then tune the transient response of that aquifer. And we can then take all of that forward and look purely at characterising that fluid loss - typically through a transmissibility factor. In other words, we've simplified the problem by still using a simultaneous solution, but considering each period of time in turn and applying some sequential thought to it. If we can successfully combine these approaches, the final result is more robust: we can not only recreate our past behaviour, but be confident that the future behaviour will be predicted well by our model. The alternative is one of those self-defeating standoffs that were so common in "Reservoir Dogs", where not only are we not going to be able to reliably recreate past behaviour, we're going to have no hope of predicting the future behaviour.
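
As a rough illustration of combining the two approaches, here is a minimal Python/SciPy sketch on a toy tank model. The pressure function, parameter names and synthetic "observed" history are invented stand-ins, not the actual material-balance model Jim describes:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
t = np.linspace(0.0, 30.0, 120)          # months of history
late = t >= 20.0                         # period 3: after the inferred fracture event

def pressure(t, stoiip, aquifer, trans):
    """Toy pressure response: depletion moderated by aquifer support,
    plus a leak term after month 20 representing loss to another tank."""
    base = 300.0 - 50.0 * t / (stoiip + aquifer * t)
    leak = np.where(t >= 20.0, -trans * (t - 20.0), 0.0)
    return base + leak

obs = pressure(t, 40.0, 2.0, 3.0) + rng.normal(0.0, 1.0, t.size)   # synthetic "history"

# Sequential: fit STOIIP and aquifer strength on the early periods, freeze
# them, then fit only the loss term on the late period.
early_fit = least_squares(
    lambda x: pressure(t[~late], x[0], x[1], 0.0) - obs[~late], x0=[20.0, 1.0])
stoiip, aquifer = early_fit.x
late_fit = least_squares(
    lambda x: pressure(t[late], stoiip, aquifer, x[0]) - obs[late], x0=[1.0])
print("sequential fit:  ", stoiip, aquifer, late_fit.x[0])

# Simultaneous: all three parameters regressed at once, for comparison.
full_fit = least_squares(lambda x: pressure(t, x[0], x[1], x[2]) - obs, x0=[20.0, 1.0, 1.0])
print("simultaneous fit:", full_fit.x)
```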

I hope that's been helpful. I look forward to seeing you at the Cognitive Whiteboard again in the future, and I hope that’s soon. I'll see you then.

Communications in Large Organisations: Supply Chain Management of Information

Kirsty is back with a lighthearted look at how to grease the wheels of communication and knowledge transfer in your organisation…and - yes - she does appear to have a new hair colour in this video too!

For a closer look at those personas click on the image below:

Communications pic.png

TRANSCRIPT

Hello, welcome back to the Cognitive Whiteboard. My name is Kirsty. And today I'm going to talk to you about something a little bit different: communication in an organisation. Why is communication and the flow of information so important? Well, it's the lifeblood of the organism. It's the fuel that drives the machinery of any business and the business decisions that need to be made. When I think about the relationships that I have in the company and how I like to communicate with them, I divide them into personas. I find it helps me understand what people's roles are in the company. And once I've understood what their role is, it allows me to understand what I need to give them to be able to do their role. 

And here, I've just put some of them, examples of these personas, up on the board. They might be technical, or they could be more about helping you understand the way the business functions, the organisation that you're working in. But once you've identified the different personas, it makes it really easy to identify what their role is and, therefore, what you need to give them so that they can do their role. And one really good illustration of this is that as part of this subsurface workflow, we always end up having to go through the assurance stage. And it can really feel like the assurance team's job is to stop you getting where you need to go. 

But when you think about their persona - their role in the organisation - you see that their job is to ensure portfolio consistency. Their job is to make sure that every opportunity and development across the organisation has been assessed to the same standards. And the reason they need to be assessed to the same standards is so the decision-making team can understand that they are evaluating everything at the same level, so that the top-down decisions - the flow of money - can happen appropriately. And once you understand that's their role, then it's really easy to see what you need to give them: you need to show them the evidence that you've applied the company's standards to your opportunity and, where you haven't been able to, what other solution you've used and the reasons why you've used it. 

So when you start to think about the personas, you start to see the company more as a machine, where every role is a different cog and the flow of information - the communication - is the fuel that drives those cogs and the oil that lubricates them. And that's a much better way, I think, of visualising your company than the traditional top-down organisational structures or even the more matrix-style organisational structures, because it really helps you understand what relationships you need to build. And I think that's really key. 

If you see a new executive come into a company, they'll spend quite a lot of time wandering around talking to people. And that's what they're doing: they're getting to understand how the machinery of their organisation works. They're not just coming straight in and using the chain of command to make big decisions and changes. They get to know the organisation - because every one is individual - before they make those big decisions. Well, I like to use personas to help understand my organisation; I'd be really interested to see how you like to do it. Thanks for joining me at the Cognitive Whiteboard. And I look forward to seeing you again soon.

Creaming Curves and Exploration: Use through the Basin Lifecycle

Our Senior Geologist Kirsty is back at the Cognitive Whiteboard. This week, inspired by Jacob's analysis of North Sea production, she's taking a look at creaming curves and how we can use them to add value in exploration.

Click on the image below for a closer look at the whiteboard.

Picture1.png

TRANSCRIPT

Hello and welcome back to the Cognitive Whiteboard. My name's Kirsty and today I want to talk to you about creaming curves and how we in the oil industry use them for exploration. First off, what is a creaming curve? Well, a creaming curve is when we plot the cumulative discovered volumes in a basin against the number of new-field wildcat wells. We do that to minimize the effect of oil price fluctuation on our interpretation.

When we start plotting these curves for basins around the world, we start to see common factors and common patterns occurring. These patterns are these hyperbolic style functions that we see here, where we're getting large discoveries and then as we drill them out, we see a tail forming on the play. 

Picture3.png

We see inflection points as well where new large discoveries were made in the basin. And, in the case of the Norwegian North Sea like I've got plotted up here, these tend to be geological. But, they can be other factors as well. For example, as we all know, the technology in the oil industry has moved on so much over the last 30 or 40 years and we've moved from being able to drill on-shore and shallow marine to deep water. Or in the case of on-shore U.S., we've opened up really interesting new plays by being able to access unconventional oil and gas.

 We can also find that some of these upticks and fluctuations can be because of the economics. For example, we've got the infrastructure in the basin, so plays which weren't accessible before have now become economic. Or even political factors opening up new basins in new countries.
 
So, how can we use these creaming curves as explorationists to make money for our company? Well, we do have to be a bit careful. Here I've plotted the creaming curve for the deep-water Gulf of Mexico. And, if I look at it in this gross function like this, I can see inflection points, I can draw some hyperbolic functions, and I can start to make some inferences from those.
 

Picture2.png

For example, I might look and say, "Well, this is great, this basin's still growing. There's more exploration to be done here." But, looking at it, I don't think my most recent play is still adding very much volume. Actually, from my interpretation, it looks like these older plays are the ones that are adding the volume. But, can I really make that assumption from this dataset? Well, no, I can't. As an explorationist, I probably need to go in and look at the data in more detail and actually find out what discoveries are adding these volumes and which plays they come from. And if I separate out the plays, that's where I can add real value to my company.
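
As a rough sketch of what separating out the plays can look like in practice, here is a minimal pandas example; the column names and the tiny inline discovery table are hypothetical:

```python
# Minimal sketch: building basin-wide and per-play creaming curves from a
# discovery table ordered by new-field wildcat count.
import pandas as pd

discoveries = pd.DataFrame({
    "wildcat_number": [1, 2, 3, 4, 5, 6, 7, 8],
    "play":           ["A", "A", "B", "A", "B", "B", "A", "B"],
    "volume_mmboe":   [800, 0, 1500, 120, 0, 300, 40, 60],
})

# Basin-wide curve: cumulative volume against wildcat count.
basin = discoveries.sort_values("wildcat_number")
basin["cum_volume"] = basin["volume_mmboe"].cumsum()

# Per-play curves: the same cumulative sum, split by play, which is where
# the transcript argues the real insight (and value) sits.
per_play = basin.assign(cum_volume=basin.groupby("play")["volume_mmboe"].cumsum())
print(per_play[["wildcat_number", "play", "cum_volume"]])
```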

Looking forward, where in the world am I looking to see really interesting exploration happening? Well, I mentioned political factors opening things up. The Mexican side of the Gulf of Mexico has opened up recently and some big experienced players from the U.S. side of the Gulf of Mexico are in there exploring. So, I'm really expecting to see some big new discoveries and a really interesting uptick in the creaming curve for that basin.

 I'm also really excited to see the basins that are in the early part of exploration, where they've seen their first big discoveries recently, and how they're going to develop. For instance, Guyana and Suriname, where Kosmos are drilling this year, and also over in West Africa. And other places I'm interested in are those mature basins. There's going to continue to be lots of interesting exploration in the North Sea and also in Atlantic Brazil, where Petrobras are looking to open a new play in the really mature Espírito Santo Basin.
 
So, those are the places I'm looking. I'd be really interested to hear what you think about it and how you use creaming curves for exploration, and also where you're excited to see the next wells be drilled. Thanks for joining me at the Cognitive Whiteboard. I look forward to seeing you here again soon.

Another Change of the Guard? Could Micro-oil Dominate the North Sea?

The recent 30th licensing round in the UK North Sea has had geologist Jake thinking about where the next sustainable growth in the basin might come from.

To see a closer view of the whiteboard click on the picture below.

board image.png

TRANSCRIPT

Hello, and welcome back to the Cognitive Whiteboard. My name's Jake, and I'm a geological intern here at Cognitive Geology. Today, I would like to talk to you about how we could potentially add a new sustainable wedge onto the life cycle of the North Sea. Firstly, as in most basins, the giant discoveries were made in the early stages of the North Sea. In the case of the North Sea basin, these discoveries were made by the supermajors, who were capable of developing customized facilities to handle and cope with the hostile environment. Into the '90s and the early 2000s, these facilities became very common in our industry, and we saw mid-sized IOCs become capable of developing them and operating assets offshore, which actually led to our production rates picking back up to the highs we saw in the '80s. Now, there are two more wedges here at the end. The independents joined the party around the late 2000s, bringing more assets online. But what I'm really interested in is this blue wedge here at the end. As the oil price picked up, we saw production massively increase as well, which is what we would expect. As the price dropped to around $30 a barrel, we saw production just completely stop. Just completely stop. If this shows anything, it's that this is a very unsustainable way of operating. So why did this happen? What led to this?
 
In the early days, these huge fields were built and developed with customized facilities; they were volume-driven, with large NPVs and huge scale, and the recovery was also driven and extended by enhanced recovery methods - water injection and gas injection. But if this has shown us anything, maybe we need to change how we do things, and maybe we need to move to a primary depletion or primary production method. This would allow us to operate more quickly, more swiftly, and also at a much lower operating cost and upfront cost. Now, it's going to require changing our economic focus, and it's going to require both companies and government to be on board with it. These big fields at the start were initially NPV driven: they had huge volumes of oil being produced, and gas as well, and they had their life extended by enhanced recovery. But if this graph has shown us anything, it's that production declines throughout the life cycle, and we don't want to end up with a wedge like this blue one at the end, where, when prices drop, production completely drops as well.
 
So maybe, for a more sustainable wedge to be developed at the end, we need to move towards a discounted profitability index. What that would involve is governments accepting this primary depletion and ultimately lower recovery factors, which would bring down our initial upfront costs and also push the production earlier into the life cycle of the project. We'd be developing and producing oil much earlier, which would see our revenues being generated much sooner. Now, why would we go about this? Why would governments accept this? Maybe, if we could convince the government that, as an industry, this is the right thing to do, then the UK North Sea is the perfect place to do it. There are over 140 fallow fields with seismic, well data and, for the most part, production tests that prove there are hydrocarbons in place. So right there, we've got no exploration cost upfront, which is going to bring down our initial costs. It's just a case of convincing our government that this is the right way to go about it. These resources will stay in the ground under the current regime, where we're enhancing recovery and spending lots of money to extract these high volumes of oil, but if we accept primary depletion and move forward with this, then maybe we could add a sustainable wedge onto the end of our life cycle. I'm really interested to hear your thoughts. Do you think this is the way to go? Do you think we could convince the government, our government, as an industry, that this is the best thing to do? Please leave your comments in the box below.
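
For readers who want to see the change of yardstick in numbers, here is a minimal sketch comparing NPV with a discounted profitability index (taken here as the present value of income divided by the present value of investment) for two hypothetical developments; all cash flows and the 10% discount rate are invented:

```python
# Minimal sketch: NPV versus a discounted profitability index for two
# hypothetical developments. All cash flows are made up for illustration.
def pv(cashflows, rate=0.10):
    """Present value of a list of yearly cash flows starting at year 0."""
    return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cashflows))

# Large enhanced-recovery development: heavy upfront spend, long plateau.
big_capex  = [-900.0, -400.0]
big_income = [0.0, 0.0] + [400.0] * 8

# Primary-depletion development: small spend, earlier but smaller production.
small_capex  = [-150.0]
small_income = [0.0, 80.0, 120.0, 100.0, 70.0, 40.0]

for name, capex, income in [("large EOR project", big_capex, big_income),
                            ("primary depletion", small_capex, small_income)]:
    npv = pv(capex) + pv(income)
    dpi = pv(income) / -pv(capex)
    print(f"{name}: NPV = {npv:.0f}, DPI = {dpi:.2f}")
# The big project wins on NPV; the small, front-loaded one wins on DPI.
```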

2001: A Geomodelling Odyssey

We welcome friend of Cognitive Geology, geomodelling guru, and creator of Geo2Flow, Dan O'Meara to the Cognitive Whiteboard, to share some of his insights into 3D fluid saturations.

If you would like to visit the Geo2Flow website please follow this link: Geo2Flow

Click on the image below to see a larger version of the whiteboard.

dan.png

TRANSCRIPT

Hello, and welcome back to the Cognitive Whiteboard. My name is Dan O'Meara, and I'm here to talk to you about a way to evaluate your 3D saturation models, and along the way this plot that I show you will stimulate a lot of discussion within your interdisciplinary asset teams. So I've been in the industry for quite some time, and I was actually there at the dawn of geomodeling, when we started to put together geomodels. We had some simple expectations. One was that our models were going to honor our observations, and the main observations were the well data, the well logs. Another was that our models should honor the physics. When it comes to porosity, there is no physics, so some people tend to use just geostats for porosity. But when it comes to saturation, we know there's physics to be honored, and that's the physics of capillarity. We have plenty of data from the laboratory, and that's why we get capillary pressure curves. So the physics becomes very important when it comes to saturations, and the physics tells us that as you go up in a reservoir, it's not just an easy thing where the oil saturation uniformly increases; the oil saturation is going to depend upon the heterogeneity in both the porosity and the permeability - they're all connected.
 
So what we're going to do here is evaluate models that are driven by the physics, and I put together this kind of plot to try and evaluate them. Here's how the plot goes. We're looking at 3D models, and we're only going to consider cells that are penetrated by wells, because the wells are where we have observations. Along the x-axis here, we're going to plot the properly pore-volume-weighted water saturation from the logs. And on this axis, the y-axis, we're going to plot the water saturation that's sitting out there in the 3D model. And all we're going to do is look and see how we're doing, how they compare. You would expect to get a 45-degree line - at least I did - with all of the data generally sitting on the 45-degree line, because, after all, if you honor the physics, you should be honoring the observations. But what I found was kind of strange, and one of the things that jumped out right away is that you see data up here along the top. What that's saying is that the water saturation coming out of the model is 100%, yet the log is contradicting it, saying we have as much as 80 or 90% oil - so what could possibly be going on? So I turned back to the geological models here and realized, "Hey, there are faults in those models." With the faults comes the possibility of compartmentalization. So, for instance, in this model here we've got four free water levels, and you start to realize that this kind of data is a signature of compartmentalization.
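
As a minimal sketch of how that comparison could be computed for a single penetrated cell (the log samples and the model value below are invented):

```python
# Minimal sketch: pore-volume-weighted log water saturation for one cell
# penetrated by a well, compared with the value held in the 3D model.
import numpy as np

phi    = np.array([0.21, 0.18, 0.25, 0.12])   # porosity of each log sample in the cell
h      = np.array([0.5, 0.5, 0.5, 0.5])       # sample thickness, m
sw_log = np.array([0.35, 0.60, 0.20, 0.95])   # water saturation from the logs

weights = phi * h                              # pore-volume weighting
sw_log_cell = np.sum(weights * sw_log) / np.sum(weights)

sw_model_cell = 0.55                           # saturation sitting in the 3D model cell
print(f"log Sw (PV-weighted) = {sw_log_cell:.2f}, model Sw = {sw_model_cell:.2f}")

# Repeating this for every penetrated cell and cross-plotting model Sw (y)
# against log Sw (x) should hug the 45-degree line if both the physics and
# the observations are honored; points pinned at model Sw = 1.0 against
# oil-bearing logs are the compartmentalization signature described above.
```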
 
If you look at this plot more, you'll see that usually I was not seeing nice straight lines even when I discounted for this. If you're in this area here, the water saturation in 3D is higher than the water saturation in the wells, so here you're underestimating the reserves, and conversely you're overestimating reserves in this part of the plot. So in going around the industry over the last 20 years or so, I would see people who were struggling: they had computer programs that basically said, "Hey, I can do the physics," but when it came to observations, when we put together these kinds of plots, we were repeatedly seeing things like this, where we had lots of scatter here and what seemed to be evidence of multiple compartments there. So what I did was come up with a methodology that honors both the physics, which is the first thing that people were doing, and the observations. So that every time we put together a model in 3D for saturation, we get plots like this, which is what you'd expect from porosity. You should honor the observations and you should honor the physics, because after all that's what Mother Nature does. If you want to learn more about how to get there, then Luke will provide you with links to our website below. So that's all from the Cognitive Whiteboard today, see you again next time.

Cognitive Geology are shortlisted for Innovation Award

We're very excited to have been shortlisted alongside Air Control Entech, Rig Deluge Ltd, and Tendeka for the Innovation Award at this year's Press and Journal Gold Awards.

The Innovation Award is sponsored by Balmoral, whose managing director and chairman, Jim Milne, is quoted by Energy Voice describing the judging process: 

“Once again, the judging session was very exciting. Choosing between all of the entries was a difficult job because there were so many good ones, but there has to be a winner.

“I am looking forward to seeing cheerful winners getting their prizes on the night.”

If you would like to read the complete article by Rebecca Buchan and discover the finalists in all the categories click here.

We are all looking forward to finding out the winners on September 7th... fingers crossed for a Cognitive win!

Being an Economically-savvy Explorer

Our Senior Geologist Kirsty returns to the Cognitive Whiteboard. This time she's sharing some thoughts about why economic factors are for geologists too!

To view a close-up of the whiteboard click on the image below:

Kirsty whiteboard 2 - savvy explorer.png

TRANSCRIPT

Hello, and welcome back to the Cognitive Whiteboard. My name's Kirsty Simpson. Here at Cognitive Geology, one of the things we want our tools to do is to enable geoscientists to visualize the economic impact of their decisions. This led me to think about how often, working as a New Ventures geologist, I was asked to be an economically savvy explorer. Why was I being asked to think about economics, you ask? Well, fundamentally my job was to make money for the company, just like everybody's is. 
 
So, when you're bringing opportunities into your New Ventures team as a geoscientist, if you see one that's technically interesting, it's so tempting to just get stuck straight into your subsurface workflow: build your regional framework, do your play fairway evaluation, talk to the petroleum systems team and get a model built, and then go and work through your petroleum prospect workflow - get your volumetrics, calculate your risks, and then go and speak to your economist. That's a really long workflow before you get to discussing the economics of your opportunity. What if that economist says to you, "This is never going to meet the thresholds. What are you doing? What are you bringing me?" Well, I would argue there's another way of starting your workflow, and that's by thinking slightly more economically. 
 
What you can do is start by thinking, "What are my company's economic drivers? Am I looking for production in the near to mid term? So, do I need to find drill-ready opportunities or discoveries that are ready for development? Or am I looking for prospects and leads because we need to fill our prospective resources inventory hopper?" Once you've decided what kind of opportunity you're looking for, that's when you want to start thinking, "Well, where am I going to look for it?" And one thing you might start with is a graph like this, which tells you the percentage contractor take per country. That way, you might identify a number of countries, or regions, or basins that you want to start looking for opportunities in. You also need to consider the geopolitics of those regions, because you want to be able to actually get in there safely. And you need to think about the market, because there's no point finding eight billion barrels of oil if you can't sell it to anybody. 
 
Once you've overcome those economic stumbling blocks or hurdles, that's when you can start your subsurface workflow, build that framework, and do that play fairway evaluation. Out of that play fairway evaluation, you can identify your yet-to-find, whether you're looking for oil or gas, and whether you're looking in deepwater, on the shelf or onshore. And that's the sort of information you can take to your economist and ask her to do some scoping economics on. As a result of that, she can give you an MEV - a minimum economic volume - that you need to meet in order to clear your company's economic threshold. And that way, after just a few weeks, you might be able to throw out some or all of the opportunities you've identified. If you throw them all out, great, just go back and start again. Otherwise, you take the few that do meet that threshold into your detailed subsurface workflow. And this way, you screen more opportunities, and the only ones that you actually take through this process to your final economic decision are the ones that are likely to make it over your threshold and the ones that you're going to get in front of the bid committee. That way, your management are going to see you as an economically savvy explorer. 
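
As a trivial sketch of that screening step, the snippet below keeps only the opportunities whose volumes clear the MEV; the MEV and the opportunity volumes are invented:

```python
# Minimal sketch: screen opportunities against the minimum economic volume
# (MEV) returned by the scoping economics. All numbers are hypothetical.
mev_mmbbl = 150.0
opportunities_mmbbl = {"Block A": 90.0, "Block B": 210.0, "Block C": 160.0}

shortlist = {name: vol for name, vol in opportunities_mmbbl.items() if vol >= mev_mmbbl}
print(shortlist)   # only these go forward into the detailed subsurface workflow
```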
 
I think that's all I've probably got time for. Thank you for joining me at the Cognitive Whiteboard and I hope to see you again soon.

The Bigger Picture: Uncertainty from Subsurface to Separator

Our new Hutton Product Owner, Jim Ross, shares some insights from his point of view as a petroleum engineer carrying out production modelling. Very interesting for the geos among us to hear how our models might be applied.

Click on the image below to see the whiteboard in more detail.

 

Jim whiteboard photo.png

TRANSCRIPT

Hello and welcome back to the Cognitive Whiteboard. My name is Jim Ross. I'm the new product owner of Hutton here at Cognitive Geology. And today I'm going to be talking to you about "The Bigger Picture: Uncertainty from Subsurface to Separator." My background is chemical engineering and most recently, I've been working as a petroleum engineer in the field of integrated production modeling. What that means is taking models of all the different parts of our oil and gas system and constructing a model that represents the entire behavior of the system and how they're going to interact and affect each other once we start producing from that asset. 
 
And what you notice when you're doing this is that everybody in the oil industry kind of works in their own little silos. You might have production engineers who are responsible for the operation and design of the wells, a drilling team who decide where and when we're actually going to drill those wells, a facilities team who look at things such as the process or refinery required to support that production and anything else required to actually make it happen, and then the reservoir model which, of course, looks at how the fluids are actually going to move through the porous medium of the reservoir rock. However, they're all working towards a common goal, which is usually something like: how much oil am I going to produce, and when am I going to produce it? 
 
And in creating these models, I would often get asked, "How do I know this is correct? How do I know that what we've predicted is what will actually happen?" And the short answer is, we don't. We don't know that it's entirely correct. And the reason for that is all the various uncertainties we might have in the process of building these models. For instance, do I have a reliable lab report? Do I know what my fluid density, API gravity and gas gravity are going to be? Do I know what the ratio of gas to oil is going to be, and even how much water I'm going to produce over time? Do I know what the drainage region of my well is going to be, how much of that I'm actually going to contact during production, and what the pressure decline is going to be? Even how many wells I'm going to have, what type they're going to be, and how the system is going to behave if we're not operating at the design capacity of those facilities? 
 
And what that leads to is some very anxious engineers, which is why I have some sympathy with this guy here. So, something that we've tried to move towards is to look less at precisely what these inputs should be and more at what impact they have, by looking at the different possibilities on the production side. So, for instance, a production engineer might sensitize on tubing size to see how much the well is going to produce and what effect that will have on the production. If we're looking at a perforation, we might look at the perforation efficiency - how well we're making those perforations - and what impact that would ultimately have on any wells that we drill. If we undertake a stimulation job, then we might look at what the stimulated PI would be afterwards. 
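
As a minimal sketch of that idea of sensitising on inputs rather than fixing them (the inflow relationship and all numbers are toy stand-ins, not an integrated production modelling workflow):

```python
# Minimal sketch: sweep a plausible range of productivity index and a crude
# tubing factor and look at the spread of outcomes, rather than committing
# to a single "correct" value for each input.
def well_rate(pi, tubing_factor, reservoir_pressure=250.0, flowing_bhp=150.0):
    """Toy rate estimate: PI-driven inflow throttled by a tubing factor."""
    return pi * (reservoir_pressure - flowing_bhp) * tubing_factor

for pi in (2.0, 4.0, 6.0):                  # e.g. unstimulated vs stimulated PI
    for tubing_factor in (0.7, 0.85, 1.0):  # e.g. smaller vs larger tubing
        print(f"PI = {pi:.0f}, tubing factor = {tubing_factor:.2f}: "
              f"{well_rate(pi, tubing_factor):,.0f} stb/d")
```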
 
There's a range of possibilities for what we might end up with there. What we're trying to build here at Cognitive Geology is something that takes into account the geological possibilities. So what are the different possibilities for filling my rock properties across the entire grid? And what impact does that have on the process? What that allows us to do is to move away from something which is precise, but we don't necessarily have a lot of confidence in, to something that is approximately accurate that then tells us what the impact of our decisions and the impact of our unknowns would be so we can have greater confidence in what we've predicted going forward. That's all I want to talk to you about today but I look forward to seeing you at the Cognitive Whiteboard again in the future, and I'll see you then.

Frontier Exploration: Avoiding Interpretation Bias

We welcome our new Senior Geologist Kirsty as she presents her first Cognitive Whiteboard.

Click on the image below to see the whiteboard in more detail.

whiteboard 1.png

TRANSCRIPT

Hello and welcome back to the Cognitive Whiteboard. I'm Kirsty, one of the new Geos here at Cognitive Geology. My background is as an explorationist, and I've joined the company to bring some of those exploration workflows into the products that we're developing. In preparation for today's video, I was watching some of Luke's old blogs, and one of the things that struck me was Luke talking about the difference between accuracy and precision. That really struck a chord with me and with some of the lessons I had to take on board as a young geologist learning my trade in exploration.

So, early on, when I was a youthful geologist at the cliff face with my hand lens, I was looking at the really fine detail, getting excited about the bedforms that I could see, walking along the whole outcrop, seeing all the contacts and mapping them really accurately. When I joined BG Group in 2009 and started working in exploration, I discovered that we very rarely have that kind of luxury of data density. In fact, if we start working in frontier basins, we might have a basin 400 or 500 kilometres in diameter with only a handful of wells that we can use to learn about the basin. So, we've got very fine vertical detail with big horizontal spacing - tens or hundreds of kilometres between wells. From those wells, we're very comfortable with our vertical detail, and it allows us to build depositional models. We can use those depositional models to show the depositional environments in map form, and this really looks like the kind of map that I would have drawn early in my career at BG Group. It looks really nice and geological; it looks like a snapshot of something we might see today in a depositional system.

 It's very appealing. But what it also is, is a precise view of one single possible interpretation of the data that we're seeing in my depositional model, and it doesn't encompass the full range of time and geological processes that are happening in that basin. So, if you draw something like this, you risk becoming really anchored to that one interpretation very early on, with very little data to tie it back to. I would go back and tell myself 10 years ago that what I really need to be is accurate, encompassing the full range of possibilities, and therefore what I should be drawing is not a lovely geological facies map like this, which looks great, but something that really represents what I actually know right now. That would be a play fairway style map, showing where I believe the sand-dominated facies to be and where I believe the mud-dominated facies to be. That still allows me to focus my exploration and recommend which blocks we should be bidding on in a licence round, but it doesn't tie me to an interpretation that I can't really back up with my data at the moment. There's another risk too: because a map like this looks so nice, it tends to get used again and again within the company - we've all seen this happen. And if it persists 10 years down the line, when I'm not involved in that asset anymore, it can really bias people's decision-making, because they won't appreciate how little information we had when it was drawn, and we could miss some really good prospectivity as a result of tying ourselves to one specific interpretation in time. Well, that's all I've got time for right now. But I look forward to seeing you back at the Cognitive Whiteboard soon.

Dragon Eggs and Unicorn Tails

Luke introduces the new series of whiteboard videos by telling us about the myth of hard data.

Click on the image below for a closer look at the whiteboard.

TRANSCRIPT

Hello and welcome back to the Cognitive Whiteboard. It's been a while but we have a new cast of characters that we will be introducing shortly. But I'm going to kick off the first of this series of videos with an attack on the hardness of the oil field data sets. To begin with, let's do some mathematics, not a place I normally start with, but if we look at a grid cell in a geological model, let's have a look at the reality of how well we've sampled that single grid cell, let alone the rest of the field.
 
By the time we get down to the reservoir, we usually have around a seven-inch bit size - the exact number doesn't really matter, but let's assume seven inches - and a pretty common grid cell size might be 50 by 50 metres. If we do the mathematics on the sample rate, our wellbore area is about 0.02 of a square metre in metric units, and the grid cell is around two and a half thousand square metres of rock area, so the sample rate is about 1 in 125,000. Question: does that wellbore represent the perfect average of that grid cell? Let's just leave it there for now.
 
But let's have a look at an oil field, for example - let's take Britain's biggest oil field, the Forties. We have a hundred and three wells in it over 90 km² of area. Do the same mathematics and we are at 1 in 45 million as a sample rate for that oil field. So even in this well-developed field, we have a pretty big challenge in trying to say we have statistics here - perhaps that's the reason why we use the term ‘geostatistics’. The question is whether we want to explicitly honour all of the mathematics here, or whether we want to be a little bit pragmatic and understand that our sample rates are a bit spurious. I would argue on the side of using a little bit of geological intelligence rather than just mathematics, which is often where we start. But let's even look at a single wellbore and ask just how confident we are that we know where that wellbore is. 
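
For anyone who wants to check the arithmetic, here is a minimal sketch reproducing the back-of-envelope numbers above (the rounded 0.02 m² wellbore area follows the transcript; an exact seven-inch bit gives roughly 0.025 m² and a cell ratio nearer 1 in 100,000):

```python
# Minimal sketch: sample-rate arithmetic for a single grid cell and for the
# Forties field, using the rounded figures quoted in the transcript.
wellbore_area_m2 = 0.02            # ~7-inch bit, rounded
cell_area_m2 = 50.0 * 50.0         # 50 m x 50 m grid cell

print(f"grid cell: 1 in {cell_area_m2 / wellbore_area_m2:,.0f}")               # 1 in 125,000

field_area_m2 = 90.0e6             # Forties: ~90 km^2
n_wells = 103
print(f"Forties:   1 in {field_area_m2 / (n_wells * wellbore_area_m2):,.0f}")  # ~1 in 45 million
```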
 
I was involved in a peer review where we had an issue that one of the wells was off by more than 150 m at the bottom-hole location, and that was proven because the velocity anomaly required to tie that well was just unheard of. It turned out the well was actually on the downthrown side of a fault where it had previously been assumed to be on the upthrown side. That was discovered because we did a gyro survey over these wells to try to explain some of the issues. We found that about 30% of the wells were off by more than 50 metres, and when we corrected all of those we added about 90 million barrels of oil back into that oil field, and suddenly all the production history - you know, the general behaviour of that field - started making a lot more sense.
 
Let's talk about that production history though. On a single-well basis, how confident are we that the production is what we say it is? This is probably some of the softest data that we have in the oil industry. The production data, particularly when you're looking at a downhole zonal allocation, can be very, very subject to uncertainty and inaccuracy. The wellbore itself is often, in practical terms, not perfect. Cement bonds can create leakage points behind pipe, the jewellery itself wears over time, and control of the flow can become problematic. And most of the time, wells are being produced through a cluster, so the allocation back to the single well, let alone the zone, can be really problematic. 
 
When we look at these production allocations, it's just worth bearing that in mind. A really hilarious example of that: we had a 28-day cycle in one oil field that turned out to be due to the hitches of the operational guys. One of the blokes was measuring the production data accurately; the other guy was just kind of eyeballing it from a distance, and that ended up with this 28-day cycle in our production data that we thought was tidal to start with. In reality, it was just inaccuracy in the measurement method. When the question comes - do I honour all of my data? - I do feel a little bit like Gandalf going up against the Balrog, because the reality is I can't match all of it. Most of the time, there is going to be inaccuracy somewhere in the pieces of the puzzle, and I can't always be confident where that lies. What I'm always trying to do is develop the most coherent story I can within the realms of uncertainty that these data provide. Just a little bit of a story there - I hope that's helpful to you. If you've come across any other strangeness in your fields that turned out to be part of this, I'd love to hear about it in the comments below. That's all for now from the Cognitive Whiteboard. I'll see you back here again next time.

Growth Lies in Innovation

Luke and Gemma Noble, Audit Director at EY, feature in an opinion piece discussing growth and innovation in the oil and gas sector. 

Times have been challenging in the oil and gas sector for the last few years and companies are looking closely at how they can continue to adapt and grow.

Gemma addresses the need to focus on knowledge capture, talent development and innovating across processes, systems and technologies.

Luke talks about recognising and addressing the key drivers of the operating companies whilst acting as a technology innovator.

To read the full article click here.  

Cognitive Geology wins two awards in one night

Cognitive Geology was nominated for awards at both the Digital Tech Awards and The Made In Scotland Awards 2018. The awards were presented last night (26th April) in Glasgow. 

Our COO Eileen McLaren was representing us at the Digital Tech Awards and we are very honoured to have been named the Digital Technology Business of the Year (Emerging, less than 10 staff and less than 4 years of age).

Meanwhile across town Luke and Alex were sporting their kilts at the Made in Scotland ceremony, where they were on the spot to accept the Exporter of the Year award. 

Click through the images to see the full lists of winners.

 Eileen at the Digital Tech Awards 


 Luke and Alex at the Made In Scotland Awards 


Luke Nominated for EY 'Rising Star' Award

More exciting news for Cognitive Geology today, our founder and CEO Luke Johnson has been announced as a finalist in the 'Rising Star' category of Scotland's EY Entrepreneur of the Year!

 

Luke is quoted as saying “Discovering that I’ve been selected as a finalist in the EY Entrepreneur of the year awards tops off what has been an amazing first quarter of 2018 – we’ve just moved into our brand new offices, have grown the team to 25 and are working on some really exciting new contracts.”

To read the full article about Luke and the other nominees click here

 

 

Eileen Joins the Board

Eileen-M-CROP.jpg

More exciting news here at Cognitive Geology! We are really excited to announce that our VP of Software Engineering, Eileen McLaren, has agreed to take on the role of Chief Operating Officer and has also joined the board as a director.

Here's an excerpt of the article from DIGIT magazine by Brian Baglow. See the original article here.

 

One of the key people in both Skyscanner and FanDuel, Eileen McLaren, joins the ambitious geo-technology company’s board – aiming for a trifecta of tech unicorns.

Eileen McLaren, one of Scotland's most respected digital technology executives, having been instrumental in scaling tech unicorns Skyscanner and FanDuel, has joined the board of Cognitive Geology.

Having helped Skyscanner and FanDuel scale from start-up to global success, Eileen joined Cognitive Geology as VP of Software Engineering in July 2017. She has now been promoted to Chief Operating Officer (COO) and appointed a director, taking a seat on the company’s board.

Founded in 2014, Cognitive Geology designs and builds specialist software for geoscientists in the oil and gas industries, improving accuracy in finding, appraising and developing oil and gas reserves. In 2017 the company secured £2 million from Maven Capital Partners and Enso Ventures to develop their range of product solutions for the oil and gas industry.

Cognitive Geology’s Hutton software claims to help companies in the oil and gas sector be ‘approximately accurate, rather then precisely wrong’. Hutton reduces oilfield investment uncertainty by extracting progressive trends from complex geological datasets. Geologists can identify and sequentially remove the effects of the various geological processes, resulting in a vastly reduced range of uncertainty.

The geoscience software industry is estimated to be worth an annual $4.5 billion and is forecast to double over the next several years as antiquated software is replaced by next-generation technology solutions. In the recent past, Cognitive Geology has secured a seven-figure contract with one of the world’s largest energy companies.

Eileen McLaren, COO, told DIGIT:  “I’m delighted to join the board of Cognitive Geology as I see it as one of the most exciting early stage tech businesses in Scotland at the moment. I’ve been fortunate enough to have worked at some of the most successful tech companies around and the planning, drive and the quality of the execution I see at Cognitive Geology has the same hallmarks I saw in the early days at Skyscanner & FanDuel.”

 

Cognitive Geology Raises Investment To Fuel Growth

 (Back): Brian Corcoran, Eileen McLaren, Mark Collins, Kelli Buchan. (Front): Fiona Johnson, Luke Johnson, Alex Shiell.


We interrupt our regular geology programming with a special announcement: we're delighted to report that Cognitive Geology has recently raised a £2 million investment round to help us scale our business and get our first product Hutton into the hands of more geologists around the globe.

As regular visitors to this blog will know, at Cognitive Geology we have been developing the next generation of technical software for geoscientists and reservoir engineers since the beginning of the industry downturn – helping geologists do their jobs more effectively, and thus helping oil companies in their quest for more cost effective solutions under the new price regime. Now with several major customers on board, at this point in our journey we're very pleased to announce two investment partners in the form of UK private equity house Maven Capital Partners and Enso Ventures, a London-based early-stage venture capital firm.

Here's what our CEO Luke Johnson had to say about the news: “The oil industry, one of the first to adopt 3D visualisation and high performance computing, once led software innovation but today it has fallen behind gaming & cloud technologies. As geologists and technologists ourselves, our mission is to take advantage of modern software architectures to improve the efficiency of our peers around the globe by giving them the best software possible, for the complex tasks they face each day.

“To that end, the feedback from our customers so far has been overwhelmingly positive. Today’s low oil prices increase the requirement for better technology. Enso & Maven, both forward-thinking investors, were quick to recognise the fortuitous timing of our company growth. This investment round will provide us the capital to scale our Edinburgh headquarters and position ourselves as innovators of 3rd generation software solutions for the energy services sector."

David Milroy, Investment Director at Maven said: “Luke is a true entrepreneur having created Hutton to tackle the challenges he encountered himself when working as a geologist responsible for high value upstream oil and gas projects. Cognitive has a differentiated approach that provides geologists with the most robust geological explanations for the data they observe and in doing so helps them make more informed decisions. Luke has assembled a very capable technical team as well as an experienced advisory board and we look forward to helping the team execute their exciting plans.”

Enso's CFO, Oleg Mukhanov, added: “We are excited to support this cutting-edge business, which is set to revolutionise the way geological data is analysed and interpreted. We strongly believe that Cognitive’s high calibre team is capable of delivering on this task, which will put them at the forefront of their industry.”

Thanks to everyone in the geology community who has supported us so much thus far: hugely appreciated - you are the folks we are building for!

Ok: that's all on the investment side for now - back to the geology!

Simulation Sprints: Minimal Cost, Maximum Value

We're at the Cognitive Whiteboard again with Luke, discussing simulation sprints, and how a lean and iterative approach can provide robust business recommendations, in a fraction of the time taken to build a “perfect” model.

Click on the image below to see a larger version.

Cognitive+Whiteboard+simulation+sprints.jpg

TRANSCRIPTION

Simulation Sprints: Minimal Cost, Maximum Value

Hello and welcome back to the Cognitive Whiteboard. My name's Luke and today we're talking about simulation sprints. This isn't a technical workflow, it's a project management one. But it's something that's completely revolutionised the way I do my work and I'd like to share with you how this can help you make much more cost-effective analyses of your reservoirs in a vastly shorter period of time.

Now, the method is not something that I claim credit for, because it was set as a challenge to me by a person called Michael Waite at Chevron, who once came to me with a new field to look at - a mature field with a lot of injection and production - and asked me to give him a model by the end of the day. My reaction to that was shock, I guess, would be the polite way of putting it, because that's obviously an impossible ask: to try to build a reservoir model and understand what's going on in a complex brownfield in just a matter of a single day.

But Michael wasn't being silly. He was challenging me to use a methodology that would allow us to make quick and effective decisions when we clearly knew what business decision was coming up. What we had in this particular case was a mature field with only about 12 or so locations left for us to drill, and six well slots that we needed to fill. We had wells that were declining in production and we were going to replace them. And so, there really wasn't any choice other than optimising those well bottom-hole locations.

And so, in that context, you can come back and say, well, we don't necessarily need to answer the ins and outs of the entire reservoir, but rather rank and understand those locations so that we can drill the optimum ones during the drilling campaign. And he introduced me to this concept of simulation sprinting. What it is, is a quick-loop study that you can do numerous times, iterating through progressive cycles of increasing precision until you get to a point of accuracy that allows you to make a valid and robust business recommendation.

In the first one - a single day - we were not going to be able to build a realistic reservoir model by any means. What we were able to do in a single day was some pretty decent production mapping. So, taking the last six months of production and looking at the water cut, we got together with the production mapping team, and we were able to design a workflow that we could do that day that would give us an idea of what was going to address this bigger objective - trying to say which locations would have the lowest water cut, because that's another value measure that we could use to understand these wells.

Importantly, because we're going to do this a lot - even though we call them sprints - the key is to work smart, not hard, because it's going to keep going on over time. So, you want to be able to do this within normal working hours. Don't burn the midnight oil; otherwise, you'll burn out before you get to make your robust business decision.

The really important piece in this cycle, though, is number five. When we come to assess the outcomes of any one of the experiments that we've done, we need to rank the wells in order of economic value. So, whichever way we were trying to devise it, we needed to have those well targets ranked from best to worst at the end of each one of these simulation sprints. That's what Mike was asking me for at the end of the day.

And when we do this assessment, we also spend time having a look at what's the weakest part of the technical work that we've done, because that's going to form the basis of the objective of the next sprint cycle. And we could come back and progressively increase the length of this sprint loop. So, the first one was done in a day, the second in two days, then four and eight and so on. But we can adjust this as needed to suit how we want to address these experiments.

But as we come back through this loop and constantly re-rank those targets, what was fascinating is that after only four weeks, the answer never changed again. The order of the top six wells was always the same top six. And what that shows you is that really, with some quite simple approaches, you can get to the same decision that you could with a full Frankenstein model. You can get to that recommendation without having to do years' worth of work.
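
As a minimal sketch of the stopping check implied here - re-rank the targets after each sprint and note when the top-six ordering stops changing - with made-up scores:

```python
# Minimal sketch: track whether the top-N ranking of well targets has
# stabilised from one sprint to the next. All scores are invented.
def top_n(scores, n=6):
    return sorted(scores, key=scores.get, reverse=True)[:n]

sprint_scores = [                       # one dict of target -> value per sprint
    {"W1": 9, "W2": 8, "W3": 4, "W4": 7, "W5": 6, "W6": 5, "W7": 3, "W8": 2},
    {"W1": 9, "W2": 7, "W3": 5, "W4": 8, "W5": 6, "W6": 4, "W7": 3, "W8": 2},
    {"W1": 9, "W2": 7, "W3": 5, "W4": 8, "W5": 6, "W6": 4, "W7": 3, "W8": 2},
]

previous = None
for i, scores in enumerate(sprint_scores, start=1):
    ranking = top_n(scores)
    print(f"sprint {i}: {ranking}" + (" (unchanged)" if ranking == previous else ""))
    previous = ranking
```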

So, we were able to make that recommendation. It was an expensive campaign, so we didn't stop at four weeks; we ended up stopping at about four months. But really importantly, a conventional study would have taken six months, perhaps a year, to get to that stage, and we were already significantly ahead of that at the end of this routine. It's a method that has really changed the way I work, and it's something I recommend you give a go. Hopefully you enjoyed this, and I'll see you back here next time at the Cognitive Whiteboard.

MANAGING UNCERTAINTY: ROBUST RANGES USING TRENDS

Warming to his heretical theme, Luke is back to discuss using trend analysis to drive uncertainty in geostatistical models.

If you'd like a demo of Hutton - our product that Luke mentioned - just drop him a note: luke@cognitivegeology.com.

I think this is my favourite whiteboard drawing yet - enjoy it in glorious full screen by clicking below!

TRANSCRIPTION

REPLACING THE VARIOGRAM: ROBUST RANGES USING TRENDS.

Hello, and welcome back to the Cognitive Whiteboard. My name's Luke, and wow, did that last video generate a lot of interest in the industry! What we did was we talked about how variograms and trend analysis can work hand in hand to try to investigate how your properties are being distributed in three dimensional space. Today I want to show you how we can use the trend analysis to drive the uncertainty in your models as well. In doing so, I think I'll officially be promoted from geostatistical heretic to apostate. But let's see how we go.

What I want to do today is really run you through how I used to go about managing geological uncertainty and how I do it now. I started out by thinking about shifting histograms, and I think a lot of us do this: for a low case - what if the data were worse than what we observed? - we shift the histogram down, and for a high case we shift it up in the other direction. I've done this many times in the early part of my career, and in many cases it's not a particularly valid way of doing it. If you just shift the histogram but condition to the same well data, you'll generate pimples and dimples around your wells, which is undesirable. But if you shift the observed data as well, by saying, "Well, the petrophysicist has some uncertainty in their observations," what we're really invoking is that the greatest degree of uncertainty is at the wells. And I think we can all agree that the greatest degree of uncertainty is away from the wells. There are important uncertainties here, but we have bigger ones to deal with up front.
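
Purely for illustration, here is a tiny sketch of what the shifted-histogram style of low and high case amounts to, using a synthetic porosity distribution and an assumed shift of two porosity units.

```python
# Sketch of the "shifted histogram" style of uncertainty case. The porosity
# distribution and the size of the shift are made up for illustration.
import numpy as np

rng = np.random.default_rng(42)
base_case_porosity = rng.normal(loc=0.18, scale=0.03, size=1000)  # stand-in histogram

shift = 0.02  # an assumed two porosity units
low_case = base_case_porosity - shift
high_case = base_case_porosity + shift

# The well observations stay fixed, which is exactly why conditioning a shifted
# field back to unshifted well values produces pimples and dimples at the wells.
print(base_case_porosity.mean(), low_case.mean(), high_case.mean())
```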

The other way of trying to manage our uncertainty is in the structure of how we distribute that data, and different variogram models are useful for doing this. We can fairly say that the interpretation of a geological variogram - the experimental data you get, particularly in the horizontal directions - is usually uncertain. We don't have enough information to be confident about how that variogram structure should look, so it's fair to test different geological models and see what happens. What's interesting is that if you vary the histogram, you change STOIP with a similar recovery factor - just generally better or worse - whereas if you change the variogram, you vary the connectivity but don't really change the STOIP very much. And it's often difficult to link variogram structure back to a conversation you can have at the outcrop.

So over the last, I guess, five or six years, I've been focusing on addressing uncertainty by saying: actually, the sampling of our data - the biased, directional drilling with which we've typically gone out and sought the good spots in the reservoir - is really what we need to investigate. How much does that bias our understanding of what could exist away from the well control?

I've got an example here: a top structure map of a field - a real field - with five appraisal wells along the structure. And the question is: at these two other locations that are going to get drilled, will it be similar, different, better, worse? And how could we investigate the uncertainty in the outcome at those locations?

We could, for example, investigate whether the variation we observe in this data set is primarily a function of depth, or perhaps a function of some depositional direction - in this case, the Y position, as a theory. We don't know at this stage which one it is, and depending on which one we fit first, we end up with different correlations. In fact, you can see in this sequence that after taking out the Y-position trend, if I analyze the residual against depth, we end up with not two porosity units lost per 100 meters of burial, but only one porosity unit reduced as a function of depth. So fundamentally, we're invoking a different relationship with burial.
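
To show how the order of trend fitting changes the relationship you invoke, here is a small sketch with invented well data (not the field on the whiteboard): fitting a depth trend directly on the raw porosity versus fitting it on the residual after a Y-position trend has been removed.

```python
# Sketch of order-dependent trend fitting with made-up well data
# (porosity as a fraction, depth in m, y as a northing coordinate in m).
import numpy as np

porosity = np.array([0.21, 0.19, 0.17, 0.16, 0.14])
depth    = np.array([2100., 2150., 2220., 2300., 2380.])
y_pos    = np.array([1000., 3000., 5000., 7000., 9000.])

# Path 1: depth trend fitted directly on the raw porosity.
slope_depth_raw, _ = np.polyfit(depth, porosity, 1)

# Path 2: remove a y-position trend first, then fit depth on the residual.
slope_y, intercept_y = np.polyfit(y_pos, porosity, 1)
residual = porosity - (slope_y * y_pos + intercept_y)
slope_depth_resid, _ = np.polyfit(depth, residual, 1)

# Factor 1e4 converts fraction-per-metre into porosity units per 100 m.
print(f"{slope_depth_raw * 1e4:.1f} porosity units per 100 m of burial (raw)")
print(f"{slope_depth_resid * 1e4:.1f} porosity units per 100 m of burial (after y-trend removal)")
```

With this toy data the depth gradient almost disappears once the Y-position trend has been removed first; the point is not the numbers themselves but that the order of fitting changes the relationship you carry away from well control.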

What's really interesting is that these are highly uncertain interpretations, but they're valid ones. And they give us different answers, not just at the positions of those two wells: the answer is different for the entire model. And this is very representative of your uncertainty in three-dimensional space. The input histogram, with that particular shape, has more to do with the sampling of your wells at the particular locations they were drilled, whereas the trend model behind it helps you understand whether there is any variation going on in three-dimensional space. So we can end up with very different plumbing and in-place volumes by really investigating how these trends could go.


You can do all of this manually one after the other through these routines. Our product Hutton does it for you automatically. We run about 300 different paths through your data set to try to investigate how that could go, and we find it's a very powerful way of really developing robust but different geological interpretations to address your uncertainty in your reservoir. If you're interested, drop me an email: I'll let you know more about it. But for now, that's all from the Cognitive Whiteboard. Thank you very much.

Replacing the Variogram

This week, Luke nails his sacrilegious thesis to the door - that the variogram can be replaced by quality data analysis. 

We're building a product called Hutton to do exactly what has been outlined above. If you'd like to get a sneak peek, drop Luke a line at: luke@cognitivegeology.com

To dig deeper into this week's whiteboard, click on the image below to get a fullscreen version.


TRANSCRIPTION

Replacing The Variogram

Hello. Welcome back to the Cognitive Whiteboard. My name is Luke, and today I'm nailing my thesis to the door. I am going up against the institute and committing geostatistical heresy by showing you that variograms are not an essential element of our geological modeling processes. In fact, I want to show you that, with careful trend analysis, you can do a better job of producing geological models - ones that represent your geology in a much more powerful way and give you much more predictive answers.

To do this, I've built a data set - a complex, quite geologically realistic one, with some tilted horst blocks through here into which we have modeled some complex geological properties. We vary those over x and y by some unknown function, we have some very complex sequence cyclicity of fining-upwards and coarsening-upwards trends, and we've overprinted all of this with a burial trend. What we want to do is see how we go about representing that with this perfectly patterned drilled data set, either with trends or with variogram analysis, and determine which one we think is better.

So, let's first have a look at the variogram of the raw data set. We can see immediately that some of those structures - some of the trends we imposed in the model - are showing up in our variograms. We have some obvious sills in some of the data, with structure - correlation over distance - before they flatten out at the sill. But we also have some weird cyclicity happening, and we should wonder what's going on there. In truth, we know this is just the complexity of lots of different processes creating nested variograms and various sequences of cyclicity. And all geologists familiar with this kind of routine will know to try to take some of these trends out.

One way to start is by subtracting the stratigraphic trend. This isn't often done, but it's very powerful. You could, for example, take a type log and remove that from your data set, or you could do what I've done here and subtract the midpoint or mean value from every one of your K layers and see what you get after you take that out. You're basically representing sequence cyclicity when you do this. You want to keep that trend - this black line here - because you have to add it back into your model afterwards. When you do it, you see a reduction in the vertical variogram, as you would expect: we have described a lot of the variation in this direction as a function of sequence cyclicity rather than randomness. Typically you'll see a reduction of probably half of the variation in the vertical sense. But it won't have any impact on the major/minor directions, because the same columns exist everywhere in x and y space.
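
As a sketch of that layer-by-layer subtraction - assuming a simple porosity array indexed [i, j, k] rather than any particular modelling package - it might look like this.

```python
# Sketch of removing a stratigraphic (K-layer) trend from a 3D porosity array.
# The layer means are kept so the trend can be added back after modelling.
import numpy as np

rng = np.random.default_rng(0)
porosity = 0.15 + 0.05 * rng.random((20, 20, 10))  # stand-in model grid [i, j, k]

layer_mean = porosity.mean(axis=(0, 1))            # one value per K layer
residual = porosity - layer_mean[None, None, :]    # "porosity given that trend"

# ... model the residual, then restore the trend afterwards:
reconstructed = residual + layer_mean[None, None, :]
assert np.allclose(reconstructed, porosity)
```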

Once we take that trend out, we'll have a new property - porosity given that trend - and we can do a trend analysis on that property, this time against depth. What's interesting is that as you take out trends progressively, you start to see second- and third-order effects that might not have been obvious in the raw data. In this case, it really tightened up our observation of the depth trend. And we can subtract that next - take that trend out, because it's not random, and see what it does to our variograms. This one changes the major and minor variograms, not the vertical one, even though that seems counterintuitive. It does that because your burial depth varies as a function of map position, and that's why it changes those two variograms.
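
A similar sketch covers this next step, removing a depth trend from that residual property; the cell burial depths and the size of the trend are invented purely for illustration.

```python
# Sketch of fitting and removing a burial-depth trend from the layer-residual.
# burial_depth is a hypothetical [i, j, k] array of cell depths in metres.
import numpy as np

rng = np.random.default_rng(1)
burial_depth = 2000.0 + 5.0 * np.arange(10)[None, None, :] + 100.0 * rng.random((20, 20, 1))
residual = -1e-4 * (burial_depth - 2200.0) + rng.normal(0.0, 0.005, (20, 20, 10))

slope, intercept = np.polyfit(burial_depth.ravel(), residual.ravel(), 1)
depth_trend = slope * burial_depth + intercept
residual_after_depth = residual - depth_trend

# Because depth varies with map position, removing this trend is what changes
# the major/minor (horizontal) variograms rather than the vertical one.
print(f"fitted gradient: {slope * 1e4:.2f} porosity units per 100 m")
```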

And we can keep diving deeper and deeper into our data set, removing progressive trends, linking each one to a geological process and pulling it out of the data. In the end, with this perfect data set, once we have described all of those trends, you see no structure - or next to no structure - left in your data. And that's because you have done a pretty good job of describing geology, not just random processes. Your nugget and your sill are almost identical. That means an observation point carries just as much information immediately as it does at some great distance. That's great: you no longer need a variogram. You have done it instead with trends.
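
If you wanted to check that on your own de-trended residuals, a bare-bones experimental semivariogram along one direction might look like the sketch below - synthetic data, and the classical estimator rather than any particular package's implementation.

```python
# Sketch of a 1D experimental semivariogram to check whether any structure
# remains in a de-trended residual. Purely illustrative data.
import numpy as np

def semivariogram(values, coords, lags, tol):
    """Classical estimator: gamma(h) = half the mean squared difference of pairs near lag h."""
    gamma = []
    for h in lags:
        pairs = []
        for i in range(len(values)):
            for j in range(i + 1, len(values)):
                if abs(abs(coords[i] - coords[j]) - h) <= tol:
                    pairs.append((values[i] - values[j]) ** 2)
        gamma.append(0.5 * np.mean(pairs) if pairs else np.nan)
    return np.array(gamma)

rng = np.random.default_rng(3)
coords = np.sort(rng.uniform(0, 1000, 200))   # sample positions along a line, m
residual = rng.normal(0, 0.02, 200)           # de-trended porosity: pure noise here

lags = np.arange(25, 500, 25)
gamma = semivariogram(residual, coords, lags, tol=12.5)
# For pure noise the curve is flat: the short-lag value (nugget) is already at
# the sill, i.e. no spatial structure is left for a variogram to model.
print(np.round(gamma, 5))
```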

Now, this is obviously a perfect data set with an unrealistically perfect sampling of wells. Let's imagine what we would do with a realistic sample set - much sparser and more biased. Well, most gross geological trends are obvious even in small numbers of samples. But horizontal variograms are something we basically never get to measure in the real world. So we often spend our time in peer reviews defending whatever settings we have chosen in the major and minor directions, with no outcrop basis we can link them to.

So, if you want to do something in this space, you can make your models much more predictive, because you end up driving geology into them and removing the dependence on your random seed. You can do all of this in pretty much any commercial package today, but it's not particularly easy or intuitive. So we've gone ahead and built a product that will do this in a much more powerful way. We call it Hutton, named after the father of geology, because with small observations we can make interpretations that change our understanding of the world.

Hutton comes to market in March 2017. It will guide you through this trend analysis process, and it even has some intelligence to help you automate it. If you're interested in finding out how to throw away the variogram and bring geology back into your geological models, please drop me an email and I'll happily show it to you. But for now, that's all from us, and I'll see you next time at the Cognitive Whiteboard.