A Bayesian look at Star Trek's 'Redshirts'

Jim is back at the Whiteboard, and is using Bayes’ Theorem to see if the redshirts in Star Trek are really the most likely to meet an unfortunate end…


Hello, welcome back to the Cognitive Whiteboard. My name's Jim. And today we're going to be taking a Bayesian look at Star Trek's red shirts. And in particular, the idea that the red-shirt-wearing characters in Star Trek are the perfect example of a disposable character who's unlikely to make it to the end of the episode without dying. And if you look at the numbers from the original Star Trek series where William Shatner was Captain Kirk, that would certainly seem to be the case. The majority of the deaths are people wearing a red shirt. However, I thought it'd be interesting to use Bayes' theorem to actually test this hypothesis and figure out who of these three people should be the most worried. Should it be myself wearing the science and medical team blue, our CEO, Luke wearing the leadership command gold, or Eileen, our Chief Operations Officer wearing the operations and engineering red?

So to do that we're going to need our information here about who died and also some information about the total breakdown of the crew on the Starship Enterprise. So, if we take that information and Bayes' theorem, which I'll explain in a moment, we can then try and calculate how concerned each of these people should be. So, what we actually want to look at is what is the probability that someone will die given they are wearing a red shirt. And that's what we can use Bayes' theorem for. Testing a hypothesis based on some known observation that we have made. And to do that, we need to combine several probabilities. The first one is the likelihood function, and that's telling us what is the probability that somebody was wearing a red shirt given that they died. 
And that would be our original data here. So that would be our 26 red shirts out of 45.

We've then got the prior, and that is the prior probability of dying in the first place. So that's a total of 45 deaths out of the total Enterprise crew of 429. And then finally, we've got the marginal probability, which is our probability of wearing a red shirt regardless of whether we live or die during the series. And that would be our 240 red shirts out of the total 429. So, if you put those numbers in, we actually come out with a surprisingly low 11%. And the reason for that really is because yes, the majority of the characters who die are wearing a red shirt, but the majority of the crew are wearing a red shirt too, so they're not necessarily more likely to actually meet their end on the show.
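The arithmetic above can be sketched in a few lines of Python, using the numbers quoted on the whiteboard. Note that the crew totals cancel, so the posterior conveniently simplifies to 26/240.

```python
def bayes(likelihood, prior, marginal):
    """Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)."""
    return likelihood * prior / marginal

# Figures quoted on the whiteboard for the original series.
deaths_total = 45    # on-screen crew deaths
deaths_red = 26      # of which wore red
crew_total = 429     # total Enterprise crew
crew_red = 240       # crew members wearing red

likelihood = deaths_red / deaths_total   # P(red shirt | died)
prior = deaths_total / crew_total        # P(died)
marginal = crew_red / crew_total         # P(red shirt)

p_die_given_red = bayes(likelihood, prior, marginal)
print(f"P(die | red shirt) = {p_die_given_red:.1%}")   # the ~11% quoted above
```

The same function, fed the counts for the other shirt colours, gives the blue and gold figures discussed below.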

If you calculate for the other colours, the blue shirt comes out at about 7% and the command gold comes out at 19%. The gold shirts are actually nearly twice as likely to meet their end compared to the red shirts. So, Eileen has no real need to be any more concerned than me, but Luke maybe has something to worry about if he gets sent off to an unknown planet. However, we all know that the best example of a disposable character is someone who isn't given a name. We don't have room for all the maths, but you can actually calculate this for somebody who doesn't have a name and use that to update the prior in Bayes' theorem. And that's something else you can do: based upon more observations that you make, you can update the probabilities that you calculate, updating this prior information here and the marginal as well.

So if we do that and actually go through the process of updating that prior, if the red-shirt-wearing character does not have a name, then that probability shoots up to 33%. So that was quite a good example of the use of Bayes' theorem, but also of how we can go about updating the prior information. So, Luke should be a little more concerned than before, but we know Eileen's name, so she is absolutely fine. Hopefully we'll come up with some more examples like these that we can look at as the Whiteboard series goes on. But until then, I hope that was interesting and I'll see you back here in the future. Take care.

Virtual Metering & Time Travel

Jim is back at the Whiteboard to talk about how we can use virtual metering to get more out of our production data and travel back in time. Where we’re going, we don’t need well tests…!

Hello. Welcome back to the Cognitive Whiteboard. My name is Jim and today we're going to be talking about virtual metering and time travel. In other words, where we're going, we don't need well tests.


Now, to start off with, I'm going to talk a little bit about the sort of measurements we might have in our well. We might not have all of these - we might even have very few of them - but this is basically a way we can make use of these measurements to tell us what's happening in the well. So, for instance, I've got an ESP (electric submersible pump) well here, so I might have pressure readings around the pump itself, a wellhead pressure reading, and I'm then going to have certain assumptions about the fluid that is passing through that well, and also the reservoir itself. So, I will have an estimate of water cut, GOR (gas-oil ratio), the oil density in API, and also what the reservoir pressure is. Those are not things we would typically measure in an ongoing operation, but we'd have our understanding of what we might expect them to be.

Now, based upon all these various measurements, I can then calculate a rate through the well. The traditional method of doing this, and the one that is the most common, is a VLP/IPR intersection (vertical lift performance versus inflow performance relationship). Based upon the fluid properties, what I understand the reservoir pressure to be, and the wellhead pressure, I can calculate those two curves, and the intersection will give me my rate. That's just one way of calculating the rate. If I have any of these other measurements, I can introduce other methods. So, for instance, if I have the dP across the pump, then I can use a pump calculation to determine what the rate should be. If I've got a choke, and I have measurements of the flowline pressure and the wellhead pressure, those two measurements can be used to produce a choke rate. All these different methods involve sets of physics that overlap but are partly independent, and they're all based on the idea of what the fluid properties are: what my water cut is, what the GOR is, and so forth.
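As an illustration of the intersection idea, here is a deliberately simplified sketch. The linear IPR, the quadratic tubing curve, and every number in it are assumptions for illustration only; a real virtual meter would use proper multiphase flow correlations and PVT behaviour.

```python
import numpy as np

# Hypothetical, simplified well models (all numbers assumed):
# a linear IPR and a friction-dominated VLP.
p_res = 250.0    # reservoir pressure, bar
pi = 20.0        # productivity index, m3/day/bar
p_wh = 20.0      # wellhead pressure, bar
head = 90.0      # hydrostatic head term, bar
fric = 0.002     # friction coefficient, bar/(m3/day)^2

q = np.linspace(0.0, 3000.0, 3001)    # candidate rates, m3/day
ipr = p_res - q / pi                  # flowing BHP the reservoir can deliver
vlp = p_wh + head + fric * q**2       # flowing BHP the tubing requires

i = int(np.argmin(np.abs(ipr - vlp)))  # closest approach = operating point
print(f"operating rate ~ {q[i]:.0f} m3/day at BHP ~ {ipr[i]:.0f} bar")
```

The pump dP and choke methods would each add another curve through this same operating point; when the curves stop agreeing, something in the assumed fluid or reservoir description has changed.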

So, if they disagree, I can use that to determine what might have actually changed. If they diverge over time, that can tell us something, because if we've got everything correct - reservoir pressure, water cut, everything - then they will agree perfectly, and that tells us we have the correct rate. However, when they start to diverge, they'll do it in different ways depending on what has changed. What that allows us to do is fire up the oilfield equivalent of a flux capacitor, travel back in time, and see what changed at that point. For instance, if the water cut goes up, then the choke calculation will respond differently from the reservoir calculation, because the choke has very little interaction with the reservoir: those sets of physics are largely independent. So, depending on how the methods start to diverge, we can use that to interpret what's happened. In other words, we can go back in time and basically pretend that we did a well test. We can determine what the fluid properties are without needing to intervene directly in the well.

So, what I'm saying is, virtual metering - which is basically what this technology is called - can then be used to help us understand what is happening in the well even if we have less data, if we don't have the data point of the well test. So, we can do that, determine what's going on. We can even use that as part of our future prediction. If the reservoir pressure's going down, that's a trend that might continue, same with the water cut. And of course, we might see an interaction between these different things that are going on, and that will be reflected in the data that we have. If we don't have all of it, we can maybe be less certain, but we can still home in on a rough answer of what we expect to be happening. So that'll be an ongoing theme in these whiteboards, making better use of our data and seeing what we can get from it if it's particularly sparse. So, I hope that's been useful. I hope to see you back at the Cognitive Whiteboard soon. Until then, take care.

Correlation vs Causation

The Cognitive Whiteboard is back! To kick off this new series, Luke talks about Correlation vs Causation for physical properties, and whether the concept means Gangnam Style is the song of the century…



Hello, and welcome back to the Cognitive Whiteboard. My name is Luke, and today, we're filming the first of Series 3. We're going to talk about correlation and causation and really address some of the differences between them. I'm going to highlight the point with a correlation between the number of songs that appear in Rolling Stone's greatest songs of the 20th century against the production of oil from the lower 48 states of America.

What we see if we put those two data sets together is an apparent relationship, and you could argue that the number of songs that came through in the '60s predicted the oil boom that came a little bit later. Take another step forward, and we might say that the shale boom that's on right now tells us that songs from the beginning of the 2000s are going to appear prominently in Rolling Stone's Top 500 of the 21st century. Now, what I think is quite honestly an obnoxious song, "Gangnam Style", would be right at the beginning of that. It was the top song of its year, so is that going to be in that list?

Obviously, this is not predictive at all. It's complete rubbish, but it's amazing how often we make correlations and assume causal relationships. A great example of that in the geosciences is porosity versus permeability. Porosity is dominated by the pore volumes; permeability is dominated by the pore throats. You can work through this for yourself, and you'll find that porosity and permeability have a correlation but not a causal link between the two. In a depositional system like a shoreface system, the porosity is going to be heavily affected by sorting, so in the upper regions of the system you're going to have essentially consistent porosity. If you logged through here, you wouldn't see anything different on neutron density. However, the grain size is going to radically change the permeability, so within this system of apparently static-looking porosity, you should see quite a significant variation in permeability.

Those kinds of differences between the correlation in those regions can really affect how you predict fluids are going to flow, so it's important we retest it. How could we do that? Well, we could accept that there is a correlation, not a causal link, between these two properties, and try to find the causality that's creating the association. In this case, porosity and permeability are both quite strongly linked to their depositional position and their burial history, and both of them have similar kinds of relationships in terms of where and in what direction they start to degrade.
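A quick synthetic illustration of this point: if two properties are both driven by a common cause, they will correlate strongly even though neither causes the other. Here both porosity and log-permeability are made up as simple functions of burial depth plus independent noise; all coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic data: both properties degrade with burial depth (the
# common cause), plus independent noise. Neither causes the other.
depth = rng.uniform(1500.0, 3500.0, n)                       # m
porosity = 0.35 - 6e-5 * depth + rng.normal(0, 0.01, n)      # fraction
log_perm = 4.0 - 1e-3 * depth + rng.normal(0, 0.2, n)        # log10 mD

r = np.corrcoef(porosity, log_perm)[0, 1]
print(f"porosity vs log-permeability correlation: r = {r:.2f}")
```

A cross-plot of these two would look convincingly predictive, yet conditioning on depth would reveal that the within-depth relationship is pure noise, which is exactly the scepticism being argued for here.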

The thing is, though - and we use the illite transition zone here to highlight it - that the variation doesn't remain the same for both of the properties. If we take a reservoir that has a little bit of smectite in it at deposition and start burying it, once we get beyond the smectite zone, that smectite is going to turn to illite. And illite, as we all know, is terrible for our permeability because it blocks up our pore throats. So we're going to suddenly see a rapid divergence in the relationship between porosity and permeability, and it's going to happen because of the relationship with depth. It might not be important in your reservoir, I'm not saying it is. It depends entirely upon the shape of your structure, but what we want to make sure we're always doing as geoscientists is throwing a bit of scepticism on any of these correlations that we can't associate with a direct causal relationship, and retesting it as we go.

It's one of the things that I think will remain, for a very long time, a core requirement for a professional helping to make these interpretations. I think it's good news, because I don't think a machine is replacing us yet. When you look at the way that machine learning works, it's essentially these kinds of correlations on steroids. We're talking about many dimensions of analysis, but machine learning can find correlations that aren't necessarily going to be predictive, because it could develop a chart very much like this one. Now, if you have enough data, the theory is that you eventually get beyond that, but geoscience isn't necessarily in that space. So, this is one of the points that allows me to sleep comfortably at night and feel like there's still a need for a geologist going forward.

Interested to hear what your thoughts are. This one's going to probably raise a little bit of a question, but happy to have that conversation as well. Let me know your comments below, and until next time, I'll see you back at the Cognitive Whiteboard.

Red Pill or Blue Pill?: The Impact of Fluid Separation Processes

Jim is at the Cognitive Whiteboard again, inviting you to take the red pill, leave The Matrix and question everything you thought was true about production data used for modelling.

For a closer look at the whiteboard see below:




Hello, and welcome back to the Cognitive Whiteboard. My name is Jim Ross and today I'm going to be talking about the impact of fluid separation processes. And I have gone for a Matrix theme with today's whiteboard because I'm going to be asking you to make a choice, the blue pill or the red pill. If you take the blue pill, you wake up at your desk and you believe whatever your production technologists or reservoir engineers want you to believe. You take the red pill, you stay at the Whiteboard and I show you how deep the rabbit hole goes. 

Now that I'm done with quoting The Matrix, what I'm actually going to talk about today is something which I and a number of clients have learned the hard way across our careers, and that's that not all barrels of oil are created equal. 

And to demonstrate that, let's start off with a fixed mass, a fixed volume, of reservoir-conditions oil. Now, if I flash that to standard conditions, I will end up with a certain set of properties associated with the two fluids that I get: there will be a gas-oil ratio, and then the densities of those two phases. However, if I take the very same oil and put it through a different separation process - I have got an extra stage here - I will end up with a different set of properties: a different gas-oil ratio and different densities for the two phases. Now, that seems reasonable enough. If I start with the same thing but put it through a different process, I will end up with a different result. But it's not the only thing that can affect it. If we haven't got tight temperature control, it can be affected by the time of day and how the temperature changes there or, on a longer timescale, we can also look at seasonal effects due to it being summer or winter.

Now, being a chemical engineer by training, I used to think about things in terms of mass balance. But when I joined the oil industry, I found that's not how we operate. We think of things in terms of volumes, and in particular, standard-conditions volumes. That's what we use for reporting and modelling, and fiscal allocation tends to be done on the basis of standard-conditions oil rates. But what we've seen here is that, depending on the process I follow, I will end up with different properties and, therefore, in this case, a different rate off the back of that.

So what that means is that the rate we often hold to be gospel is anything but. The rate through path one and the rate through path two are not the same. Now, it's quite common to have different paths. We might have a well-test path and we might have a field process path. That's not uncommon. And the percentage difference might be quite small. But when we're dealing with things on the scale of the reservoir, it can have quite a profound knock-on impact: on the fluid properties we have already spoken about; on the fluid saturations and how they'll respond to production and pressure changes; on well performance and multi-phase flow; and on field development planning and well design. Are we designing for the correct path to surface? Are we using the correct rates?

History matching itself can be really complicated by this. If this separation process is changing over the life of the field, how is that being accounted for in the historical rates that we are now trying to match to? Basically, it leads to a very Matrix-like awakening where we suddenly find ourselves questioning everything we thought was real, every piece of production data that we have come across. Not all hope is lost, however. It is easy enough to convert between two paths just by looking at the shrinkage factor through both and then looking at that ratio and correcting appropriately. Now, the most robust way to do that is an equation of state. It's also the most pernickety, and I could do a whole series of videos just on that process. So I'm not proposing that we go through how you do that but it's more to question where those numbers have come from, and it's okay to question them once we are armed with this knowledge. How are those rates measured? Were they measured at all? Were they done using allocation factors? If so, how were they calculated? Has this correction already been attempted? And if so, what was the physical basis for doing that? 
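The shrinkage-factor conversion mentioned above can be sketched as follows. The rates and shrinkage factors here are purely illustrative assumptions; in practice they would come from flash calculations for each separation path, ideally with an equation of state.

```python
def convert_rate(rate_path1, shrinkage_path1, shrinkage_path2):
    """Convert a stock-tank oil rate measured through one separation
    path into the equivalent rate through another path.

    Shrinkage = stock-tank volume / reservoir volume, so the same
    reservoir withdrawal yields rates in proportion to the shrinkages.
    """
    reservoir_rate = rate_path1 / shrinkage_path1   # back to reservoir conditions
    return reservoir_rate * shrinkage_path2         # forward through path 2

# Illustrative numbers only, e.g. the same fluid flashed through a
# one-stage test separator vs. a two-stage field process.
rate_test_sep = 5000.0    # stb/day via path 1
s1, s2 = 0.78, 0.81       # assumed shrinkage factors for each path

rate_field = convert_rate(rate_test_sep, s1, s2)
print(f"equivalent field-process rate ~ {rate_field:.0f} stb/day")
```

Even this small difference in shrinkage moves the rate by a few percent, which is exactly the kind of discrepancy that quietly distorts allocation and history matching at field scale.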

So, basically, once we have this knowledge, we can take on any modeling challenge. We can use that sceptical attitude about where the numbers have come from to try and help us narrow down any difficulties we might have with our modelling. Once we do that, we can take anything on and we can make full use of our awakening to these possibilities. 

I hope that's been helpful. I look forward to seeing you at the Cognitive Whiteboard again in the future. Until then, take care.

Reservoir Dogs: Sequential vs. simultaneous modelling solutions

Jim is back at the Cognitive Whiteboard, trying to avoid a stand off between modelling solutions that is unlikely to end well...

For a closer look at the whiteboard, just take a look below:


Hello, and welcome back to the Cognitive Whiteboard. My name is Jim Ross, and today we're going to be talking about sequential versus simultaneous modelling solutions. In particular, how to avoid an ugly, confrontational standoff between the two - not unlike this iconic scene from Quentin Tarantino's "Reservoir Dogs".

When we're modelling things in oil and gas, we have a lot of variables in play at any one time. And, broadly speaking, we have two ways that we can address this. We can look at a multi-variable regression, which will try and tweak all the variables simultaneously to minimise our error against observed data, or we can look at a set of sequential functions of each variable that tries to reduce the error as we go along. So which one should we use? I'm going to say both.

Mathematically speaking, [simultaneous solving] is the preferable solution. If we give it the same starting points and the same search algorithm, we will end up with the same solution. However, we know that physics and, in particular, geology, rarely works like that: rarely is everything happening at once, and quite often, we have different behaviours imprinting upon each other as we look at our different facets of our modelling. To illustrate this, I'm going to look at a simple mass balance-based example from earlier in my career. And it's just a simple tank, so mass in, mass out, and how the pressures and saturations respond to that coming and going of fluid.
So, if we look at the reservoir pressure history, we'll see that it has a gentle enough decline to start, then a sudden spike, and then it levels off a little before it completely drops off a cliff and the production starts to peter out. If we try and use the multi-variable, simultaneous solution on that, we're not going to get a very good history match with this simple model. And that's because we're trying to match fundamentally different periods of behaviour all at the same time. The answer is not to lump more complexity into the model, but actually to take a step back and look at what's happening across the history of this model, this actual field, with this sequential behaviour. In this first period, really, all we've got is fluid expansion, which will be dependent upon the value of the stock tank oil in place, and the aquifer drive: what's the strength of the aquifer in this field? In the second period, if we look at the production history of the field, we can see that that's the point at which water injection starts. So naturally, that's going to have an impact. But if we can nail down the values of the variables [in period 1], we can carry those forward into the second part. Once we've done that, and maybe tuned the transient aquifer response, we can then look at this third period. And it was thought that there was some sort of fracture event, and the field was then losing fluid to another tank, and thus losing pressure.
So what we can do is take each solution and move it forward to inform the next one. We can take the stock tank oil in place and aquifer strength determined here and carry them forward, where we can then tune the transient response of that aquifer. And we can then take all of that forward and look purely at characterising that fluid loss, and typically that would be a transmissibility factor we come up with. In other words, we've simplified the problem by applying the simultaneous solution within each period of time in turn, with some sequential thought on top. If we can successfully combine these approaches, the final result is more robust. Not only can we directly recreate our past behaviour, but we can be confident that the future behaviour will be predicted well by our model. The alternative is one of those self-defeating standoffs that were so common in "Reservoir Dogs": not only are we not going to be able to reliably recreate past behaviour, we're going to have no hope of predicting future behaviour.
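A minimal sketch of that sequential idea, using a made-up toy tank model. The model form and every number here are assumptions for illustration, not the real field: fit the oil in place on the injection-free period first, then carry it forward and fit the injection response on the second period.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy tank model, purely illustrative: pressure drop is proportional
# to net withdrawal, p(t) = p0 - (Np(t) - e*Wi(t)) / (ct * N).
# Period 1 (years 0-9): production only, so solve for N (oil in place).
# Period 2 (years 10-19): water injection at uncertain efficiency e,
# solved with N carried forward from period 1.
p0, ct = 300.0, 1e-4          # initial pressure (bar), compressibility (1/bar)
N_true, e_true = 5.0e7, 0.8   # "unknowns" used only to generate the data

t = np.arange(0, 20)                                # years
Np = 4.0e4 * t                                      # cumulative production, m3
Wi = np.where(t >= 10, 5.0e4 * (t - 10.0), 0.0)     # injection from year 10
p_obs = p0 - (Np - e_true * Wi) / (ct * N_true) + rng.normal(0.0, 0.5, t.size)

# Step 1: fit N on the injection-free period only.
N_grid = np.linspace(1e7, 1e8, 2001)
err1 = [np.sum((p_obs[:10] - (p0 - Np[:10] / (ct * N)))**2) for N in N_grid]
N_fit = N_grid[int(np.argmin(err1))]

# Step 2: carry N forward, fit the injection efficiency on period 2.
e_grid = np.linspace(0.0, 1.0, 501)
err2 = [np.sum((p_obs[10:] - (p0 - (Np[10:] - e * Wi[10:]) / (ct * N_fit)))**2)
        for e in e_grid]
e_fit = e_grid[int(np.argmin(err2))]

print(f"N ~ {N_fit:.2e} m3 (true {N_true:.1e}), e ~ {e_fit:.2f} (true {e_true})")
```

Each fit is still a simultaneous minimisation within its own period, but the sequencing means each period only has to explain the physics that is actually active in it.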

I hope that's been helpful. I look forward to seeing you at the Cognitive Whiteboard again in the future, and I hope that’s soon. I'll see you then.

Communications in Large Organisations: Supply Chain Management of Information

Kirsty is back with a lighthearted look at how to grease the wheels of communication and knowledge transfer in your organisation…and - yes - she does appear to have a new hair colour in this video too!

For a closer look at those personas click on the image below:



Hello, welcome back to the Cognitive Whiteboard. My name is Kirsty. And today I'm going to talk to you about something a little bit different. I'm going to talk about communication in an organisation. Why is communication and the flow of information so important? Well, it's the lifeblood of the organism. It's the fuel that drives the machinery of any business, drives those business decisions to be made. When I think about the relationships that I have in the company and how I like to communicate with them, I divide them into personas. I find it helps me understand what people's roles are in the company. And once I've understood what their role is, it allows me to understand what I need to give them to be able to do their role. 

And here, I've just put some of them, examples of these personas, up on the board. They might be technical, or they could be more about helping you understand the way the business functions, the organisation that you're working in. But once you've identified the different personas, it makes it really easy to identify what their role is and, therefore, what you need to give them so that they can do their role. And one really good illustration of this is that as part of this subsurface workflow, we always end up having to go through the assurance stage. And it can really feel like the assurance team's job is to stop you getting where you need to go. 

But when you think about their persona, their role in the organisation, you see that their job is to ensure portfolio consistency. Their job is to make sure that every opportunity and development across the organisation has been assessed to the same standards. And the reason everything needs to be assessed to the same standards is so the decision-making team can understand that they are evaluating everything at the same level, so that the top-down decisions, the flow of money, can happen appropriately. And once you understand that's their role, it's really easy to see what you need to give them. You need to show them the evidence that you've applied the company's standards to your opportunity and, where you haven't been able to, what other solution you've used and the reasons why you've used it.

So when you start to think about the personas, you start to see the company more as a machine where every role is a different cog, and the flow of information, the communication, is the fuel that drives those cogs and the oil that lubricates them. And that's a much better way, I think, of visualising your company than the top-down organisational structures, or even the more matrix-style organisational structures, because it really helps you understand what relationships you need to build. And I think that's really key.

If you see a new executive come into a company, they'll spend quite a lot of time wandering around talking to people. And that's what they're doing: they're getting to understand how the machinery of their organisation works. They're not just coming straight in and using the chain of command to make big decisions and changes. They get to know the organisation first, because every organisation is individual, before they make those big decisions. Well, I like to use personas to help understand my organisation; I'd be really interested to see how you like to do it. Thanks for joining me at the Cognitive Whiteboard, and I look forward to seeing you again soon.

Creaming Curves and Exploration: Use through the Basin Lifecycle

Our Senior Geologist Kirsty is back at the Cognitive Whiteboard. This week, inspired by Jacob's analysis of North Sea production, she's taking a look at creaming curves and how we can use them to add value in exploration.

Click on the image below for a closer look at the whiteboard.



Hello and welcome back to the Cognitive Whiteboard. My name's Kirsty and today I want to talk to you about creaming curves and how we in the oil industry use them for exploration. First off, what is a creaming curve? Well, a creaming curve is a plot of the cumulative discovered volumes in a basin against the number of new-field wildcat wells. We plot against well count rather than time to minimise the effect of oil price fluctuation on our interpretation.

When we start plotting these curves for basins around the world, we start to see common factors and common patterns occurring. These patterns are these hyperbolic style functions that we see here, where we're getting large discoveries and then as we drill them out, we see a tail forming on the play. 


We see inflection points as well where new large discoveries were made in the basin. And, in the case of the Norwegian North Sea like I've got plotted up here, these tend to be geological. But, they can be other factors as well. For example, as we all know, the technology in the oil industry has moved on so much over the last 30 or 40 years and we've moved from being able to drill on-shore and shallow marine to deep water. Or in the case of on-shore U.S., we've opened up really interesting new plays by being able to access unconventional oil and gas.

 We can also find that some of these upticks and fluctuations can be because of the economics. For example, we've got the infrastructure in the basin, so now plays which weren't accessible have now become economic. Or even political factors opening up new basins in new countries.
So, how can we use these creaming curves as explorationists to make money for our company? Well, we do have to be a bit careful. Here I've plotted the creaming curve for the deep-water Gulf of Mexico. And, if I look at it at this gross level, I can see inflection points, I can draw some hyperbolic functions, and I can start to make some inferences from those.


For example, I might look and say, "Well, this is great, this basin's still growing. There's more exploration to be done here." But, looking at it, I don't think my most recent play is still adding very much volume. Actually, from my interpretation, it looks like these older plays are the ones that are adding the volume. But, can I really make that assumption from this dataset? Well, no, I can't. As an explorationist, I probably need to go in and look at the data in more detail and actually find out what discoveries are adding these volumes and which plays they come from. And if I separate out the plays, that's where I can add real value to my company.

Looking forward, where in the world am I looking to see really interesting exploration happening? Well, I mentioned political factors opening things up. The Mexican side of the Gulf of Mexico has opened up recently and some big experienced players from the U.S. side of the Gulf of Mexico are in there exploring. So, I'm really expecting to see some big new discoveries and a really interesting uptick in the creaming curve for that basin.

I'm also really excited to see how the basins that are in the early part of exploration, where they've seen their first big discoveries recently, are going to develop. For instance, Guyana and Suriname, where Kosmos are drilling this year, and also over in West Africa. Other places I'm interested in are the mature basins. There's going to continue to be lots of interesting exploration in the North Sea, and also in Atlantic Brazil, where Petrobras are looking to open a new play in the really mature Espírito Santo Basin.
So, those are the places I'm looking. I'd be really interested to hear what you think about it and how you use creaming curves for exploration, and also where you're excited to see the next wells be drilled. Thanks for joining me at the Cognitive Whiteboard. I look forward to seeing you here again soon.

Another Change of the Guard? Could Micro-oil Dominate the North Sea?

The recent 30th licensing round in the UK North Sea has had geologist Jake thinking about where the next sustainable growth in the basin might come from.

To see a closer view of the whiteboard click on the picture below.



Hello, and welcome back to the Cognitive Whiteboard. My name's Jake, and I'm a geological intern here at Cognitive Geology. Today, I would like to talk to you about how we could potentially add a new sustainable wedge onto the life cycle of the North Sea. Firstly, like most basins, the giant discoveries were made in the early stages of the North Sea. In the case of the North Sea basin, these discoveries were made by the supermajors, who were capable of developing customized facilities to handle and cope with the hostile environment. Into the '90s and the early 2000s, these facilities became very common in our industry, and we saw mid-sized IOCs become capable of developing them and operating assets offshore, which actually led to our production rates picking back up to the highs we saw in the '80s. Now, there are two more wedges here at the end: the independents joining the party roughly around the late 2000s and bringing more assets online. But what I'm really interested in is this blue wedge here at the end. As the oil price picked up, we saw production massively increase, which is what we would expect. Then, as the price dropped to around about $30 a barrel, we saw production just completely stop. And if that shows anything, it's that this is a very unsustainable way of operating. So why did this happen? What led to this?
In the early days, these huge fields were built and developed with customized facilities that were volume-driven, with large NPVs and huge scale, and recovery was also enhanced by water injection and gas injection. But if this has shown us anything, maybe we need to change how we do things, and maybe we need to move to a primary depletion, or primary production, method. This would allow us to operate more quickly, more swiftly, and also at a much lower operating cost and upfront cost. Now, it's going to require changing economic focus, and it's going to require both companies and government to be on board with this. These big fields at the start were initially NPV-driven. They had huge volumes of oil and gas being produced, and they had their lives extended by enhanced recovery. But if this graph has shown us anything, it's that throughout the life cycle production is declining, and we don't want to end up with a wedge like this blue one at the end, where when prices drop, production completely drops as well.
So maybe, for a more sustainable wedge to be developed at the end, we need to move towards a discounted profitability index. What that would involve is governments accepting primary depletion, and ultimately lower recovery factors, which would bring down our initial upfront costs and also push production earlier into the life cycle of the project. We'd be developing and producing oil much earlier, which would see our revenues being generated much sooner. Now, why would we go about this? Why would governments accept this? Well, if we could convince the government that, as an industry, this is the thing to do, then the UK North Sea is the perfect place to do it. There are over 140 fallow fields with seismic, well data and, for the most part, production tests that prove there are hydrocarbons in place. So right there, we've got no exploration cost upfront, which is going to bring down our initial costs. It's just a case of convincing our government that it is the right way to go about it. These resources will stay in the ground under the current regime, where we're enhancing recovery and spending lots of money to extract high volumes of oil, but if we accept primary depletion and move forward with this, then maybe we could add a sustainable wedge onto the end of our life cycle. I'm really interested to hear your thoughts. Do you think this is the way to go? Do you think we could convince our government, as an industry, that this is the best thing to do? Please leave your comments in the box below.
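Jake's proposed switch of metric, from raw NPV to a discounted profitability index (NPV per unit of discounted investment), can be illustrated with a toy calculation. All cash flows below are invented; they simply show how a small, cheap primary-depletion project can rank above a big EOR-style development once you divide by upfront cost:

```python
# Hypothetical illustration: ranking projects by NPV vs. discounted
# profitability index (DPI = NPV / discounted investment).
# All cash flows are invented numbers, not real field economics.

def npv(cashflows, rate=0.10):
    """Discount a list of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Big customized EOR development: huge upfront cost, long plateau.
big = [-2000] + [350] * 15
# Small primary-depletion development: cheap, quick, early decline.
small = [-150] + [90, 70, 50, 35, 25, 15]

for name, cf in [("big EOR project", big), ("small primary project", small)]:
    invest = -cf[0]                  # capex, all spent at year 0 here
    value = npv(cf)
    print(f"{name}: NPV = {value:7.1f}, DPI = {value / invest:.2f}")
```

On these made-up numbers the big project wins comfortably on NPV (about 662 vs 75) but loses on DPI (about 0.33 vs 0.50), which is exactly the change of ranking the whiteboard argument relies on.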

2001: A Geomodelling Odyssey

We welcome friend of Cognitive Geology, geomodelling guru, and creator of Geo2Flow, Dan O'Meara to the Cognitive Whiteboard, to share some of his insights into 3D fluid saturations.

If you would like to visit the Geo2Flow website please follow this link: Geo2Flow

Click on the image below to see a larger version of the whiteboard.



Hello, and welcome back to the Cognitive Whiteboard. My name is Dan O'Meara, and I'm here to talk to you about a way to evaluate your 3D saturation models; along the way, the plot that I show you will stimulate a lot of discussion with your interdisciplinary asset teams. So I've been in the industry for quite some time, and I was actually there at the dawn of geomodeling. When we started to put together geomodels, we had some simple expectations. One was that our models were going to honor our observations, and the main observations were the well data, the well logs. Another was that our models should honor the physics. Now, when it comes to porosity there is no physics, so some people tend to use just geostats for porosity. But when it comes to saturation, we know there's physics to be honored, and that's the physics of capillarity. We have plenty of data in the laboratory, and that's why we get capillary pressure curves. So the physics becomes very important when it comes to saturations, and the physics tells us that as you go up in a reservoir, it's not just an easy thing where the oil saturation uniformly increases; the oil saturation is going to depend upon the heterogeneity in both the porosity and the permeability. They're all connected.
If you put together models that are just driven by the physics, what we're going to do here is evaluate those models, and I put together this kind of plot to do that. Here's how the plot goes. We're looking at 3D models, and we're only going to consider cells that are penetrated by wells, because the wells are where we have observations. Along the x-axis, we're going to plot the properly pore-volume-weighted water saturation log. On the y-axis, we're going to plot the water saturation that's sitting out there in the 3D model. And all we're going to do is look and see how they compare. You would expect to get a 45-degree line, at least I did, with all of the data generally sitting on that 45-degree line, because after all, if you honor the physics, you should be honoring the observations. But what I found was kind of strange, and one of the things that jumped out right away is that you see data up here along the top. What that's saying is that the water saturation coming out of the model is 100%, yet the log is contradicting it, saying we have as much as 80 or 90% oil. So what could possibly be going on? I turned back to the geological models and realized, "Hey, there are faults in those models." With the faults comes the possibility of compartmentalization. So, for instance, in this model here we've got four free water levels, and you start to realize that this kind of data is a signature for compartmentalization.
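The x-axis quantity, the pore-volume-weighted well saturation, is worth spelling out, because an unweighted average of the log gives a different (wrong) answer. A minimal sketch, with invented log values; only the weighting formula matters:

```python
# Sketch of the crossplot's x-axis: the pore-volume-weighted water
# saturation over a well's penetration of one grid cell.
# All log values below are invented for illustration.

# Hypothetical half-metre log samples: (thickness m, porosity, Sw).
samples = [
    (0.5, 0.25, 0.30),
    (0.5, 0.10, 0.80),
    (0.5, 0.22, 0.35),
    (0.5, 0.05, 0.95),
]

# Each sample counts in proportion to its pore volume, phi * h.
pore_vol = [h * phi for h, phi, _ in samples]
sw_well = sum(pv * sw for pv, (_, _, sw) in zip(pore_vol, samples)) / sum(pore_vol)

# The naive unweighted average, for contrast.
sw_naive = sum(sw for _, _, sw in samples) / len(samples)

sw_model = 0.55  # hypothetical water saturation sitting in the 3D cell

print(f"pore-volume-weighted Sw from logs: {sw_well:.3f}")  # 0.451
print(f"unweighted (wrong) average Sw:     {sw_naive:.3f}")  # 0.600
print(f"Sw in the model cell:              {sw_model:.3f}")
# Here sw_model > sw_well: this cell would plot above the 45-degree
# line, i.e. the model is too wet and underestimates reserves there.
```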
If you look at this plot more, you'll see that I was usually not seeing nice straight lines even when I discounted for this. If you're in this area here, the water saturation in 3D is higher than the water saturation in the wells, so you're underestimating the reserves; conversely, you're overestimating reserves in this part of the plot. In going around the industry over the last 20 years or so, I would see people who were struggling: they had computer programs that basically said, "Hey, I can do the physics," but when it came to observations, when we put together these kinds of plots, we were repeatedly seeing things like this, where we had lots of scatter here and then what seemed to be evidence of multiple compartments there. So what I did was come up with a methodology that both honors the physics, which is the first thing people were doing, and also honors the observations. So that every time we put together a 3D saturation model, we get plots like this, which is what you'd expect from porosity. You should honor the observations and you should honor the physics, because after all, that's what Mother Nature does. If you want to learn more about how to get there, then Luke will provide you with links to our website below. So that's all from the Cognitive Whiteboard today, see you again next time.

Cognitive Geology are shortlisted for Innovation Award

We're very excited to have been shortlisted alongside Air Control Entech, Rig Deluge Ltd, and Tendeka for the Innovation Award at this year's Press and Journal Gold Awards.

The Innovation Award is sponsored by Balmoral, whose managing director and chairman, Jim Milne, is quoted by Energy Voice describing the judging process:

“Once again, the judging session was very exciting. Choosing between all of the entries was a difficult job because there were so many good ones, but there has to be a winner.

“I am looking forward to seeing cheerful winners getting their prizes on the night.”

If you would like to read the complete article by Rebecca Buchan and discover the finalists in all the categories click here.

We are all looking forward to finding out the winners on September 7th... fingers crossed for a Cognitive win!

Being an Economically-savvy Explorer

Our Senior Geologist Kirsty returns to the Cognitive Whiteboard, this time sharing some thoughts about why economic factors are for geologists too!

To view a close-up of the whiteboard click on the image below:



Hello, and welcome back to the Cognitive Whiteboard. My name's Kirsty Simpson. Here at Cognitive Geology, one of the things we want our tools to do is to enable geoscientists to visualize the economic impact of their decisions. This led me to think about how often, when working as a New Ventures geologist, I was asked to be an economically savvy explorer. Why was I being asked to think about economics, you ask? Well, fundamentally, my job was to make money for the company, just like everybody's is.
So, when you're bringing opportunities in to your New Ventures team as a geoscientist, if you see one that's technically interesting, it's so tempting to just get stuck straight into your subsurface workflow: build your regional framework, do your play fairway evaluation, talk to the petroleum systems team and get a model built, and then work through your petroleum prospect workflow. Get your volumetrics, calculate your risks, and then go and speak to your economist. That's a really long workflow before you get to discussing the economics of your opportunity. What if that economist says to you, "This is never going to meet the thresholds. What are you doing? What are you bringing me?" Well, I would argue there's another way of starting your workflow, and that's by thinking slightly more economically.
What you can do is start by thinking, "What are my company's economic drivers? Am I looking for production in the near to mid term? So, do I need to find drill-ready opportunities or discoveries that are ready for development? Or am I looking for prospects and leads because we need to fill our prospective resources inventory hopper?" Once you've decided what kind of opportunity you're looking for, that's when you want to start thinking, "Well, where am I going to look for it?" One thing you might start with is a graph like this, which tells you the percentage contractor take per country. That way, you might identify a number of countries, regions, or basins that you want to start looking for opportunities in. You also need to consider the geopolitics of those regions, because you want to be able to actually get in there safely. And you need to think about the market, because there's no point finding eight billion barrels of oil if you can't sell it to anybody.
Once you've overcome those economic hurdles, that's when you can start your subsurface workflow: build that framework and do that play fairway evaluation. Out of that play fairway evaluation, you can identify your yet-to-find, whether you're looking for oil or gas, and whether you're looking in deepwater, on the shelf, or onshore. That's the sort of information you can take to your economist and ask her to do some scoping economics on. As a result, she can give you an MEV, a minimum economic volume, that you need to exceed in order to clear your company's economic threshold. And that way, after just a few weeks, you might be able to throw out some or all of the opportunities you've identified. If you throw them all out, great, just go back and start again. Otherwise, you take the few that do meet that threshold into your detailed subsurface workflow. This way, you screen more opportunities, and the only ones that you actually take through this process to your final economic decision are the ones that are likely to make it over your threshold and that you're going to get in front of the bid committee. That way, your management are going to see you as an economically savvy explorer.
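That MEV screen is simple enough to express in a few lines. Everything below, the opportunity names, yet-to-find volumes and MEVs, is invented purely to illustrate the filtering step:

```python
# Hypothetical early screen: keep only opportunities whose yet-to-find
# (YTF) volume exceeds the minimum economic volume (MEV) your economist
# has scoped for that setting. All numbers are invented for illustration.

mev_mmbbl = {"deepwater": 250, "shelf": 80, "onshore": 30}

opportunities = [
    {"name": "Block A", "setting": "deepwater", "ytf_mmbbl": 400},
    {"name": "Block B", "setting": "deepwater", "ytf_mmbbl": 120},
    {"name": "Block C", "setting": "shelf",     "ytf_mmbbl": 95},
    {"name": "Block D", "setting": "onshore",   "ytf_mmbbl": 20},
]

# Only the keepers go forward into the detailed subsurface workflow.
keepers = [o for o in opportunities
           if o["ytf_mmbbl"] >= mev_mmbbl[o["setting"]]]

for o in keepers:
    print(f"{o['name']}: {o['ytf_mmbbl']} MMbbl vs "
          f"MEV {mev_mmbbl[o['setting']]} MMbbl -> detailed workflow")
```

On these made-up figures, only Block A and Block C clear their MEVs; the other two are thrown out after a few weeks rather than after a full prospect workflow.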
I think that's all I've probably got time for. Thank you for joining me at the Cognitive Whiteboard and I hope to see you again soon.

The Bigger Picture: Uncertainty from Subsurface to Separator

Our new Hutton Product Owner, Jim Ross, shares some insights from his point of view as a petroleum engineer carrying out production modelling. Very interesting for the geos among us to hear how our models might be applied.

Click on the image below to see the whiteboard in more detail.




Hello and welcome back to the Cognitive Whiteboard. My name is Jim Ross. I'm the new product owner of Hutton here at Cognitive Geology. And today I'm going to be talking to you about "The Bigger Picture: Uncertainty from Subsurface to Separator." My background is chemical engineering and most recently, I've been working as a petroleum engineer in the field of integrated production modeling. What that means is taking models of all the different parts of our oil and gas system and constructing a model that represents the entire behavior of the system and how they're going to interact and affect each other once we start producing from that asset. 
And what you notice when you're doing this is that everybody in the oil industry kind of works in their own little silos. You might have production engineers who are responsible for the operation and design of the wells; a drilling team who decide where and when we're going to actually drill those wells; a facilities team who look at things such as the process plant or refinery required to support that production, and anything else required to actually make that happen; and then the reservoir modelers who, of course, look at how the fluids are actually going to move through the porous medium of the reservoir rock. However, they're all working towards a common goal, which is usually something like: how much oil am I going to produce, and when am I going to produce it?
And in creating these models, I would often get asked, "How do I know this is correct? How do I know that what we've predicted is what will actually happen?" And the short answer is, we don't. We don't know that it's entirely correct. And the reason for that is all the various uncertainties we might have in the process of building these models. For instance, do I have a reliable lab report? Do I know what my fluid density, my API gravity, my gas gravity are going to be? Do I know what the ratio of gas to oil is going to be, and even how much water I'm going to produce over time? Do I know what the drainage region of my well is going to be, how much of that I'm actually going to contact during production, and what the pressure decline is going to be? Even how many wells I'm going to have, what type they're going to be, and how the system is going to behave if we're not operating at the design capacity of those facilities?
And what that leads to is some very anxious engineers, which is why I have sympathy with this guy here. So, something that we've tried to move towards is to look less at precisely what these inputs would be, and to look instead at what impact they have by looking at the different possibilities on the production side. So for instance, a production engineer might sensitize on tubing size to see how much the well is going to produce and what effect that will have on the production. If we're looking at perforation, we might look at the perforation efficiency, how well we're making those perforations and what impact that would ultimately have on any wells that we drill. If we undertake a stimulation job, then we might look at what the stimulated PI would be afterwards.
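Sensitizing on ranges rather than point values like this amounts to Monte Carlo sampling of the inputs. A minimal sketch, with an entirely invented rate model and invented input ranges (a real integrated production model would of course be far richer):

```python
import random

random.seed(42)  # deterministic for illustration

# Toy initial-rate model: rate scales with permeability-thickness,
# a PI multiplier and drawdown. The functional form and all ranges
# below are invented purely to show the sampling idea.
def initial_rate(kh, pi_factor, drawdown):
    return kh * pi_factor * drawdown / 100.0

rates = []
for _ in range(10_000):
    kh = random.uniform(500, 2000)        # mD.m, subsurface uncertainty
    pi_factor = random.uniform(0.5, 1.5)  # post-stimulation PI multiplier
    drawdown = random.uniform(30, 80)     # bar, operating uncertainty
    rates.append(initial_rate(kh, pi_factor, drawdown))

# Summarize the impact of the input ranges as a P10/P50/P90 spread.
rates.sort()
p10, p50, p90 = (rates[int(n * len(rates))] for n in (0.1, 0.5, 0.9))
print(f"initial rate P10/P50/P90: {p10:.0f} / {p50:.0f} / {p90:.0f}")
```

The point is not the numbers themselves but the shape of the answer: a spread that shows which outcomes are plausible, rather than one precise forecast nobody fully believes.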
There's a range of possibilities for what we might end up with there. What we're trying to build here at Cognitive Geology is something that takes into account the geological possibilities. So, what are the different possibilities for filling my rock properties across the entire grid, and what impact does that have on the process? What that allows us to do is move away from something which is precise, but which we don't necessarily have a lot of confidence in, to something that is approximately accurate and that tells us what the impact of our decisions and of our unknowns would be, so we can have greater confidence in what we've predicted going forward. That's all I want to talk to you about today, but I look forward to seeing you at the Cognitive Whiteboard again in the future, and I'll see you then.

Frontier Exploration: Avoiding Interpretation Bias

We welcome our new Senior Geologist Kirsty as she presents her first Cognitive Whiteboard.

Click on the image below to see the whiteboard in more detail.



Hello and welcome back to the Cognitive Whiteboard. I'm Kirsty, one of the new Geos here at Cognitive Geology. My background is as an explorationist, and I've joined the company to bring some of those exploration workflows into the products that we're developing. In preparation for today's video, I was watching some of Luke's old blogs, and one of the things that struck me was Luke talking about the difference between accuracy and precision. That really struck a chord with me and some of the lessons I had to take on board as a young geologist learning my trade in exploration.

So, early on, when I was a youthful geologist at the cliff face with my hand lens, I was looking at the really fine detail, getting excited about the bedforms that I could see, walking along the whole outcrop, seeing all the contacts and mapping them really accurately. When I joined BG Group in 2009 and started working in exploration, I discovered that we very rarely have that kind of luxury of data density. In fact, if we start working in frontier basins, we might have a basin maybe 400 or 500 kilometres in diameter with a handful of wells that we can use to learn about the basin. So, we've got very fine vertical detail with big horizontal spacing, tens or hundreds of kilometres between wells. From those, we're very comfortable with our vertical detail, and it allows us to build depositional models. We can use those depositional models to show the depositional environments in map form, and this really looks like the kind of map that I would have drawn early in my career at BG Group. It looks really nice and geological; it looks like a snapshot of something we might see today in a depositional system.

It's very appealing. But what it also is, is a precise view of one single possible interpretation of the data feeding my depositional model, and it doesn't encompass the full range of time and geological processes that are happening in that basin. So, if you draw something like this, you risk becoming really anchored to that one interpretation very early on, with very little data to tie it back to. I would go back and tell myself 10 years ago that what I really need to be is accurate, encompassing the full range of possibilities. Therefore, what I should be drawing is not lovely geological facies maps like this, which look great, but things that really represent what I actually know right now: a play-fairway-style map where I show where I believe the sand-dominated facies to be and where I believe the mud-dominated facies to be. That still allows me to focus my exploration and recommend which blocks we should be bidding on in a license round, but it doesn't tie me to an interpretation that I can't really back up at the moment with my data. There's another risk too: because this looks so nice, it tends to get used again and again within the company, and we've all seen this happen. If this map persists 10 years down the line, when I'm not involved in that asset anymore, it can really bias people's decision-making. Later users don't know how little information we had when it was drawn, and we could miss some really good prospectivity as a result of tying ourselves to one specific interpretation in time. Well, that's all I've got time for right now, but I look forward to seeing you back at the Cognitive Whiteboard soon.

Dragon Eggs and Unicorn Tails

Luke introduces the new series of whiteboard videos by telling us about the myth of hard data.

Click on the image below for a closer look at the whiteboard.


Hello and welcome back to the Cognitive Whiteboard. It's been a while, but we have a new cast of characters that we will be introducing shortly. I'm going to kick off this series of videos with an attack on the hardness of oilfield data sets. To begin with, let's do some mathematics, not a place I normally start, and look at a grid cell in a geological model: what is the reality of how well we've sampled that single grid cell, let alone the rest of the field?
By the time we get down to the reservoir, we usually have around a seven-inch bit. The exact size doesn't really matter, but let's assume seven inches, and a pretty common grid cell size might be 50 by 50 meters. If we do the mathematics on the sample rate, our wellbore area is about 0.02 of a square meter, and the grid cell is around two and a half thousand square meters of rock area, so the sample rate is 1 in 125,000. Question: does that wellbore represent the perfect average of that grid cell? Let's just leave it there for now.
But let's have a look at an oil field, for example. Let's take Britain's biggest oil field, the Forties. We have a hundred and three wells in it across 90 km² of area. Do the same mathematics and we are at about 1 in 45 million as a sample rate for that oil field. So even in this well-developed field, we have a pretty big challenge in claiming we have statistics here; perhaps that's the reason we use the term 'geostatistics'. The question is whether we want to be explicitly honoring all the mathematics, or to be a little bit pragmatic and understand that our sample rates are a bit spurious. I would argue on the side of using a little bit of geological intelligence rather than just mathematics, which is often where we start. But let's look at even a single wellbore: just how confident are we that we know where that wellbore is?
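The sampling arithmetic above is easy to reproduce. Note that the talk rounds the seven-inch hole area down to 0.02 m² (the exact figure is nearer 0.025 m²), which is where the 1 in 125,000 comes from; the Forties figure then lands close to the quoted 1 in 45 million:

```python
import math

# Reproducing the whiteboard arithmetic, using the values as quoted.
bit_diameter_m = 7 * 0.0254               # seven-inch bit
hole_area = math.pi * (bit_diameter_m / 2) ** 2
print(f"exact hole area: {hole_area:.4f} m^2")   # ~0.0248 m^2

hole_area_rounded = 0.02                  # the rounded figure used on the board
cell_area = 50 * 50                       # 50 m x 50 m grid cell
print(f"grid-cell sample rate: 1 in {cell_area / hole_area_rounded:,.0f}")
# -> 1 in 125,000

# Forties field: 103 wells over 90 km^2.
field_area = 90e6                         # m^2
wells = 103
rate = field_area / (wells * hole_area_rounded)
print(f"field sample rate: roughly 1 in {rate / 1e6:.0f} million")
```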
I was involved in a peer review where we had an issue that one of the wells was off by more than 150 m at the bottom-hole location, and that was proven because the velocity anomaly required to tie that well was just unheard of. It turned out the well was actually on the downthrown side of a fault where it had previously been assumed to be on the upthrown side. That was discovered because we ran a gyro survey over these wells to try to explain some of the issues. We found that about 30% of the wells were off by more than 50 meters, and when we corrected all of those, we added about 90 million barrels of oil back into that oil field, and suddenly all the production history, the general behavior of that field, started making a lot more sense.
Let's talk about that production history, though. On a single-well basis, how confident are we that the production is what we say it is? This is probably some of the softest data we have in the oil industry. Production data, particularly when you're looking at a downhole zonal allocation, can be very subject to uncertainty and inaccuracy. The wellbore itself is often, in practical terms, not perfect. Cement bonds can create leakage points behind pipe, the jewelry itself wears over time, and control of the flow can become problematic. And most of the time, wells are being produced through a cluster, so the allocation back to the single well, let alone the zone, can be really problematic.
When we look at these production allocations, it's just worth bearing that in mind. A really hilarious point on that: we had a 28-day cycle in one oil field that turned out to be due to the hitches, the work rotations, of the operational guys. One of the blokes was measuring the production data accurately; the other was just kind of eyeballing it from a distance, and that ended up giving a 28-day cycle to our production data that we thought was tidal to start with. In reality, it was just inaccuracy in the measurement method. When the question comes, do I honor all of my data, I do feel a little bit like Gandalf going up against the Balrog, because the reality is I can't match all of it. Most of the time, there is going to be inaccuracy somewhere in the puzzle, and I can't always be confident where that lies. What I'm always trying to do is develop the most coherent story I can within the realms of uncertainty that these data provide. Just a little bit of a story there; I hope it's helpful to you. If you've come across any strangeness in your fields that turned out to be part of this, I'd love to hear about it in the comments below. That's all for now from the Cognitive Whiteboard. I'll see you back here again next time.

Growth Lies in Innovation

Luke and Gemma Noble, Audit Director at EY, feature in an opinion piece discussing growth and innovation in the oil and gas sector. 

Times have been challenging in the oil and gas sector for the last few years and companies are looking closely at how they can continue to adapt and grow.

Gemma addresses the need to focus on knowledge capture, talent development and innovating across processes, systems and technologies.

Luke talks about recognising and addressing the key drivers of the operating companies whilst acting as a technology innovator.

To read the full article click here.  

Cognitive Geology wins two awards in one night

Cognitive Geology was nominated for awards at both the Digital Tech Awards and The Made In Scotland Awards 2018. The awards were presented last night (26th April) in Glasgow.

Our COO Eileen McLaren was representing us at the Digital Tech Awards and we are very honoured to have been named the Digital Technology Business of the Year (Emerging, less than 10 staff and less than 4 years of age).

Meanwhile across town Luke and Alex were sporting their kilts at the Made in Scotland ceremony, where they were on the spot to accept the Exporter of the Year award. 

Click through the images to see the full lists of winners.

Eileen at the Digital Tech Awards 


Luke and Alex at the Made In Scotland Awards 


Luke Nominated for EY 'Rising Star' Award

More exciting news for Cognitive Geology today, our founder and CEO Luke Johnson has been announced as a finalist in the 'Rising Star' category of Scotland's EY Entrepreneur of the Year!


Luke is quoted as saying “Discovering that I’ve been selected as a finalist in the EY Entrepreneur of the year awards tops off what has been an amazing first quarter of 2018 – we’ve just moved into our brand new offices, have grown the team to 25 and are working on some really exciting new contracts.”

To read the full article about Luke and the other nominees click here



Eileen Joins the Board


More exciting news here at Cognitive Geology! We are really excited to announce that our VP of Software Engineering, Eileen McLaren, has agreed to take on the role of Chief Operating Officer and has also joined the board as a director.

Here's an excerpt of the article from DIGIT magazine by Brian Baglow. See the original article here.


One of the key people in both Skyscanner and FanDuel, Eileen McLaren, joins the ambitious geo-technology company’s board – aiming for a trifecta of tech unicorns.

Eileen McLaren, one of Scotland’s most respected digital technology executives, having been instrumental in scaling tech unicorns Skyscanner and FanDuel, has joined the board of Cognitive Geology.

Having helped Skyscanner and FanDuel scale from start-up to global success, Eileen joined Cognitive Geology as VP of Software Engineering in July 2017. She has now been promoted to Chief Operating Officer (COO) and appointed a director, taking a seat on the company’s board.

Founded in 2014, Cognitive Geology designs and builds specialist software for geoscientists in the oil and gas industries, improving accuracy in finding, appraising and developing oil and gas reserves. In 2017 the company secured £2 million from Maven Capital Partners and Enso Ventures to develop their range of product solutions for the oil and gas industry.

Cognitive Geology’s Hutton software claims to help companies in the oil and gas sector be ‘approximately accurate, rather than precisely wrong’. Hutton reduces oilfield investment uncertainty by extracting progressive trends from complex geological datasets. Geologists can identify and sequentially remove the effects of the various geological processes, resulting in a vastly reduced range of uncertainty.

The geoscience software industry is estimated to be worth an annual $4.5 billion and is forecast to double over the next several years as antiquated software is replaced by next-generation technology solutions. In the recent past, Cognitive Geology has secured a seven-figure contract with one of the world’s largest energy companies.

Eileen McLaren, COO, told DIGIT:  “I’m delighted to join the board of Cognitive Geology as I see it as one of the most exciting early stage tech businesses in Scotland at the moment. I’ve been fortunate enough to have worked at some of the most successful tech companies around and the planning, drive and the quality of the execution I see at Cognitive Geology has the same hallmarks I saw in the early days at Skyscanner & FanDuel.”


Cognitive Geology Raises Investment To Fuel Growth

(Back): Brian Corcoran, Eileen McLaren, Mark Collins, Kelli Buchan. (Front): Fiona Johnson, Luke Johnson, Alex Shiell.


We interrupt our regular geology programming with a special announcement: we're delighted to report that Cognitive Geology has recently raised a £2 million investment round to help us scale our business and get our first product Hutton into the hands of more geologists around the globe.

As regular visitors to this blog will know, at Cognitive Geology we have been developing the next generation of technical software for geoscientists and reservoir engineers since the beginning of the industry downturn – helping geologists do their jobs more effectively, and thus helping oil companies in their quest for more cost effective solutions under the new price regime. Now with several major customers on board, at this point in our journey we're very pleased to announce two investment partners in the form of UK private equity house Maven Capital Partners and Enso Ventures, a London-based early-stage venture capital firm.

Here's what our CEO Luke Johnson had to say about the news: “The oil industry, one of the first to adopt 3D visualisation and high performance computing, once led software innovation but today it has fallen behind gaming & cloud technologies. As geologists and technologists ourselves, our mission is to take advantage of modern software architectures to improve the efficiency of our peers around the globe by giving them the best software possible, for the complex tasks they face each day.

To that end, the feedback from our customers so far has been overwhelmingly positive. Today’s low oil prices increase the requirement for better technology. Enso & Maven, both forward-thinking investors, were quick to recognise the fortuitous timing of our company growth. This investment round will provide us the capital to scale our Edinburgh headquarters and position ourselves as innovators of 3rd generation software solutions for the energy services sector."

David Milroy, Investment Director at Maven said: “Luke is a true entrepreneur having created Hutton to tackle the challenges he encountered himself when working as a geologist responsible for high value upstream oil and gas projects. Cognitive has a differentiated approach that provides geologists with the most robust geological explanations for the data they observe and in doing so helps them make more informed decisions. Luke has assembled a very capable technical team as well as an experienced advisory board and we look forward to helping the team execute their exciting plans.”

Enso's CFO, Oleg Mukhanov, added: “We are excited to support this cutting-edge business, which is set to revolutionise the way geological data is analysed and interpreted. We strongly believe that Cognitive’s high calibre team is capable of delivering on this task, which will put them at the forefront of their industry.”

Thanks to everyone in the geology community who has supported us so much thus far: hugely appreciated - you are the folks we are building for!

Ok: that's all on the investment side for now - back to the geology!