
Improving performance with intelligent computing at the wellhead

With production from US oilfields growing, the drive to increase efficiency and boost output from the hundreds of thousands of wells dotted across the country is gaining pace, and one of the most promising solutions is the digital wellhead.

Take any flight from Texas to California, and you will fly over West Texas and New Mexico. With a quick glance out of the window, you will see the land pockmarked with a seemingly boundless patchwork of square sandy clearings. What your eyes won’t be able to tell you from that height is that each one of those squares contains an oil wellhead. Welcome to the Permian Basin, which supplies almost a third of US domestic crude oil.

The case for digital wellheads

The oil fields in the basin contain tens of thousands of wells, and the number grows every year; over the past five years, more than 5,000 wells have been added to the inventory. What you will also notice is that, aside from the wells, there is very little infrastructure in the region. Each of these wells is unmanned and, more often than not, located in areas with poor access. Despite these challenges, the wells need to be inspected for gas leaks and structural damage, which can be a costly and time-consuming operation.

Aside from the need to avoid falling foul of compliance or health and safety rules, there is the matter of data. In modern oil production, data is vital for planning and assessment, and wellheads are a significant and abundant source of data that, at present, is mostly untapped. Although some newer wellheads may have smart digital sensors with built-in wireless communications, the vast majority are legacy installations with analog gauges. Even the new smart sensors require a technician to be in close proximity to download the data to a handheld device.

What is a digital wellhead?

“Operators are trying to digitize these fields and wells so that they can detect all the important production parameters from a remote location,” says Jane Ren, founder and CEO of San Jose, California-based industrial software company Atomiton. The answer is a digital wellhead that provides integrated functionality at the edge. This will allow real-time and predictive analytics of wellhead integrity, well performance and environmental risk.

The idea of the digital wellhead is to give it a brain, or more accurately, an edge computing device. “The plan is to bring everything onto a level playing field,” Ren adds. “Whether the well instrumentation is analog or digital, we would like all the wellhead information to be available on the same platform even though it comes from different devices and vendors.”

That is the ultimate vision, but the defining requirement of a digital wellhead, whether or not it has remote connectivity, is multipurpose computing intelligence on site. This will have a positive impact on three areas in particular: the first is automation, the second is precision, and the third is prediction. “These are the three big advantages that were not able to be achieved before we added edge intelligence,” Ren says.

Delivering wellhead automation

A typical example of automation is gas leak detection. This is a significant concern at wellheads, some of which may even be abandoned wells for which the operator is still liable for any leak from the structures. The challenge with gas detection in the absence of automation is that it always requires people to visit the site. Inspection teams attend the wellhead site with infrared sensors or cameras, move around the area and try to detect any possible gas leaks. Because these wellheads are remote, there is travel involved and inspections are infrequent, and when teams do visit the wellheads they may be exposed to health and safety risks.

“The idea behind having automation and intelligence at the wellhead is to be able to put gas sensors at the wellhead that can constantly monitor for any gas in the area,” Ren explains. “With an automated drive and the sensor located on a rotating mount, it can turn in various directions controlled by the edge computer. This reduces the need for people to visit the site.”
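As a rough illustration of that kind of automated survey, the sketch below shows an edge-side loop that steps a gas sensor through a set of mount angles and raises an alert when a reading crosses a threshold. The mount and sensor drivers, the sweep angles and the alarm level are assumptions for illustration, not details of any particular vendor's system.

```python
# Minimal sketch of an edge-side gas survey loop, assuming hypothetical
# drivers for a motorized rotating mount and a gas sensor.
import time

GAS_ALARM_PPM = 50.0               # assumed alert threshold, parts per million
SWEEP_ANGLES = range(0, 360, 30)   # survey the area in 30-degree steps

class RotatingGasMonitor:
    def __init__(self, mount, sensor, notify):
        self.mount = mount      # driver for the motorized mount (hypothetical)
        self.sensor = sensor    # driver for the gas sensor (hypothetical)
        self.notify = notify    # callback to raise an alert upstream

    def sweep(self):
        """Rotate through the survey angles and check each reading."""
        for angle in SWEEP_ANGLES:
            self.mount.rotate_to(angle)
            time.sleep(1.0)                    # let the reading settle
            ppm = self.sensor.read_ppm()
            if ppm > GAS_ALARM_PPM:
                self.notify({"angle": angle, "ppm": ppm})

    def run_forever(self, interval_s=600):
        """Repeat the sweep continuously, for example every ten minutes."""
        while True:
            self.sweep()
            time.sleep(interval_s)
```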

There are several philosophies when it comes to digitizing the wellhead, broadly a device strategy or an edge strategy. One option is to replace the dumb, analog sensor with a smart sensor that has embedded computing and communications capabilities. This can be expensive and often only delivers a point-to-point solution, with each device providing its information in isolation. Aside from that lack of integration, this method can also be challenging to scale.

“Edge-based digitization is about being open to the low-level sensors,” Ren explains. “The sensor can be dumb, but it is given intelligence by the edge computing device. You are adding intelligence to sensors inside one computing device that can include gas detection, pressure or flow monitoring and even structural monitoring.”
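A minimal sketch of that edge-aggregation pattern might look like the following: one edge device polls several otherwise dumb sensors and normalizes their readings into a single record for a common platform. The sensor names, units and the publish() stand-in are assumptions, not Atomiton's actual interfaces.

```python
# Illustrative sketch: one edge device polls several "dumb" sensors (analog
# gauges read via an ADC, simple digital transmitters) and normalizes them
# into a single record for a common platform.
import json
import time

def read_all(sensors):
    """Poll every attached sensor and tag readings with a common schema."""
    record = {"well_id": "demo-well-001", "ts": time.time(), "readings": {}}
    for name, sensor in sensors.items():
        record["readings"][name] = {
            "value": sensor.read(),   # each driver hides analog vs. digital details
            "unit": sensor.unit,
        }
    return record

def publish(record):
    """Stand-in for sending data to a common platform (e.g. over MQTT or HTTP)."""
    print(json.dumps(record))

# Example wiring: gas, pressure and flow drivers would be supplied by the
# integrator; here they are placeholders.
# publish(read_all({"gas_ppm": gas_driver, "pressure_psi": pressure_driver}))
```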

Adding precision and prediction

Precision is another benefit of edge computing capabilities. There are two ways to detect issues such as leaks or corrosion. One is threshold-based anomaly detection, such as watching for pressure changes; with edge computing, these detections can be much more precise. The other builds on the relatively low-cost gas detection sensors now on the market that can be attached to seams or seals between the flanges. These can detect even small leaks; however, it takes a high level of data processing to filter out the noise of the baseline fluctuation. The software needs to look at trends and continuously compare the anomalies before it can confirm a leak. It is not a single-point detection, and that makes it much more feasible at the wellhead, where computing resources are available.
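That trend-based confirmation could be sketched along the lines below: readings are compared against a rolling baseline, and a leak is only flagged after the deviation persists. The window sizes and deviation factor are illustrative assumptions, not parameters from any deployed system.

```python
# Minimal sketch of trend-based leak confirmation, as opposed to a single
# threshold check: compare against a rolling baseline and confirm only after
# a sustained deviation.
from collections import deque
from statistics import mean, pstdev

class LeakDetector:
    def __init__(self, baseline_window=120, confirm_samples=10, sigma=4.0):
        self.baseline = deque(maxlen=baseline_window)  # recent "normal" readings
        self.confirm_samples = confirm_samples
        self.sigma = sigma
        self.anomalous = 0

    def update(self, ppm):
        """Return True once an elevated reading persists long enough to confirm."""
        if len(self.baseline) < self.baseline.maxlen:
            self.baseline.append(ppm)          # still warming up the baseline
            return False
        mu, sd = mean(self.baseline), pstdev(self.baseline) or 1e-6
        if ppm > mu + self.sigma * sd:
            self.anomalous += 1                # candidate leak; keep counting
        else:
            self.anomalous = 0
            self.baseline.append(ppm)          # baseline noise; update the model
        return self.anomalous >= self.confirm_samples
```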

The third area is prediction, which encompasses two separate things. The first is incidents, such as a leak or pressure issues in the well. The second is risk factors. “Having a risk factor does not mean there will be an incident. For example, there may be some structural changes at the flanges that are not bad enough to create an incident yet, or there may be heavy corrosion around the wellhead structure that could lead to a future leak,” Ren says. “The prediction is to use the risk factor monitoring to derive the probability of a wellhead incident so that maintenance and inspection visits can be targeted and condition based.”

So why would prediction require edge processing? “One of the newer ways of monitoring corrosion or erosion of the wellhead is by using advanced, AI-based image analytics,” Ren says. “It will continuously monitor patches of color change on the pipes in the well structure to detect the color and pattern changing as the corrosion and erosion advance. Then it needs to integrate the risk factors of humidity and temperature in that particular well.

“All those risk factors need to be monitored and, combined with the prediction, the algorithm must be able to tell better which wells could be more exposed to incidents or quality issues. All that sort of prediction requires computing and is not something that can be done by a single sensor or device. And that’s what we mean by putting the multipurpose compute on the well. It’s also called a digital wellhead.”
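As a hedged sketch of how such risk factors might be combined, the example below maps per-well factors (a corrosion score from image analytics, humidity, temperature) to a single score and ranks wells for condition-based inspection. The weights, bias and logistic mapping are illustrative assumptions, not a published model.

```python
# Illustrative sketch: combine normalized risk factors into a single score
# and rank wells so the riskiest are inspected first.
import math

WEIGHTS = {"corrosion": 2.5, "humidity": 1.0, "temperature": 0.5}  # assumed weights

def incident_probability(factors):
    """Map weighted risk factors (each normalized to 0..1) to a 0..1 score."""
    z = sum(WEIGHTS[k] * factors.get(k, 0.0) for k in WEIGHTS) - 2.0  # assumed bias
    return 1.0 / (1.0 + math.exp(-z))

def rank_wells(wells):
    """Sort wells so the highest-risk ones come first."""
    return sorted(wells, key=lambda w: incident_probability(w["factors"]), reverse=True)

# Example: two hypothetical wells with normalized risk factors.
wells = [
    {"id": "well-17", "factors": {"corrosion": 0.8, "humidity": 0.6, "temperature": 0.4}},
    {"id": "well-42", "factors": {"corrosion": 0.2, "humidity": 0.3, "temperature": 0.5}},
]
for w in rank_wells(wells):
    print(w["id"], round(incident_probability(w["factors"]), 2))
```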

Additional benefits

There are two types of resources that cost money at the wellhead. The first is people: the wellhead maintenance crews that visit the sites number in the hundreds. Then there is the cost of any regulatory infractions. By using intelligence at the wellhead these visits can be dramatically reduced, not down to zero, but they would become condition based rather than schedule based. Every well is different and can be exposed to different risk factors, so which wells are visited, at which time, and for which inspection will vary, and that will reduce the cost.

There is also an additional, secondary effect of the digital wellhead, and that is the increased data capturing and integrated processing. “The reality is that there is more data generated at a wellhead than there is data processed,” Ren explains. “Wellhead data is very valuable in monitoring and diagnosing the health of the well, the productivity of the well in addition to the quality of the well. You would have to aggregate multiple wellhead data together to be able to make better conclusions about the reservoir.”
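A simple sketch of that aggregation step, with hypothetical per-well summaries pushed up from edge devices, might look like the following; the field names and aggregates are placeholders for whatever analytics an operator would actually run centrally.

```python
# Illustrative sketch of aggregating per-wellhead summaries into a
# reservoir-level view.
from statistics import mean

def reservoir_summary(well_summaries):
    """Combine per-well edge summaries into one reservoir-level record."""
    return {
        "wells": len(well_summaries),
        "avg_pressure_psi": mean(w["pressure_psi"] for w in well_summaries),
        "total_flow_bpd": sum(w["flow_bpd"] for w in well_summaries),
        "wells_at_risk": [w["id"] for w in well_summaries if w["risk"] > 0.7],
    }

# Example with hypothetical per-well summaries pushed up from edge devices.
print(reservoir_summary([
    {"id": "well-17", "pressure_psi": 1450, "flow_bpd": 320, "risk": 0.81},
    {"id": "well-42", "pressure_psi": 1510, "flow_bpd": 290, "risk": 0.22},
]))
```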

Most of the digitization efforts at wellheads are currently centered on creating digital data without being able to integrate this information to undertake analytics. By putting computing power at the wellhead, you are not only putting it at the edge but aggregating multiple edge devices into a center that can create more valuable information on the reservoir and its long-term health and productivity. “That is why we selected the wellhead as a crucial strategic point because it is a data-rich point,” Ren concludes.
