Model-based, predictive analytics lends itself to crunching multiple data sources to pinpoint risks for pipeline integrity management, while analytics for process monitoring and measurement evolve to better discover crucial variances.

Analyst firm IDC Energy Insights expects that by 2016, 50 per cent of industry companies will have advanced analytics in place, with common business cases being predictive analytics and optimisation of drilling, production, and asset integrity.

Analytical software, now at the disposal of midstream engineers and operators, is a far cry from the trending tools and process historians that have long been relied upon to monitor and fine-tune operations. Current software is more predictive and model-based, and may pull in data from multiple control-level sources and present it through Web dashboards. SCADA and plant historians remain important, and are changing to extract more relevant information for users and to serve as key data sources for the new breed of predictive analytics.

Major technology and consulting providers to the industry also see opportunity in predictive analytics. GE Oil & Gas and technology consulting company Accenture, for example, announced a partnership in September 2014 to apply predictive analytics to pipeline integrity.

The industry is seeking ways to move data up a level for advanced analytics, while looking at ways to improve on solutions at the measurement and SCADA levels. “The idea is that you have all these data from the control and sensor level and so why not model some scenarios and make some predictions?” said Chris Niven, research director for IDC Energy Insights.

Pipeline integrity is the midstream area perhaps ripest for predictive analytics, but companies also face compliance and customer service pressures to keep close, accurate tabs on production flow and product quality. “The thought of analytics is great, but you need to have a problem to solve before analytics adds value. The main thing is to look at analytics in terms of the outcomes it can help you achieve,” added Niven.

Unique midstream challenges
The midstream faces unique challenges – not only must companies gather hydrocarbons from upstream producers, they may also operate processing facilities to ‘sweeten’ the gas and remove impurities. The core challenge is to distribute product to customers via pipelines and storage facilities while keeping close track of how much product is flowing to each customer.

Pipeline networks may span hundreds of miles, and a wide mix of meters, gas chromatographs and SCADA systems is often in use within a single network. In fact, the majority of natural gas pipelines in the US were built prior to 1970, according to a 2012 report commissioned by the Interstate Natural Gas Association of America. With proper maintenance, these older pipelines can remain safe.

“Complexities such as the linear nature of the assets, buried pipe, the age of the assets, the difficulties to replace them, and also the changeover in ownership of pipelines, have created an environment that is challenging from a data perspective. The ability of midstream companies to digitise information and understand what is going on within their networks is a major challenge,” said Brad Smith, product line leader at GE Oil & Gas. The emerging remedy is to gather data from multiple sources and feed it into an analytics platform to make predictions on risk and how best to allocate maintenance spending, Smith said. The data sources for pipeline integrity would include inline inspection or smart ‘pigging’ systems, repair records from enterprise asset management systems, leak detection systems, and SCADA, metering, and instrumentation.

A predictive analytics platform can take in disparate data, apply it against a model, and identify asset sections or conditions of highest risk. According to Smith, “We are trying to build a better real-time view of risk and where it might occur in a pipeline. The idea is not necessarily to reduce pipeline integrity costs, but to drive better effectiveness of that spending.”
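To make the approach concrete, here is a minimal sketch of how such a platform might compute a composite risk score per pipeline segment from the kinds of data sources described above. The record layout, field names, and weightings are illustrative assumptions, not any vendor’s actual model.

```python
# Illustrative sketch: scoring pipeline segments for integrity risk by
# combining disparate data sources. Fields and weights are assumptions.
from dataclasses import dataclass

@dataclass
class SegmentData:
    segment_id: str
    max_metal_loss_pct: float  # worst wall loss found by inline inspection ('pigging')
    repairs_last_5yr: int      # repair count from the enterprise asset management system
    leak_alarms_last_yr: int   # events from the leak detection system
    age_years: int             # asset age from pipeline records

def risk_score(seg: SegmentData) -> float:
    """Weighted composite risk score; higher means riskier."""
    return (0.5 * seg.max_metal_loss_pct / 100
            + 0.2 * min(seg.repairs_last_5yr / 10, 1.0)
            + 0.2 * min(seg.leak_alarms_last_yr / 5, 1.0)
            + 0.1 * min(seg.age_years / 50, 1.0))

segments = [
    SegmentData("MP-0120", max_metal_loss_pct=35.0, repairs_last_5yr=4,
                leak_alarms_last_yr=1, age_years=48),
    SegmentData("MP-0450", max_metal_loss_pct=8.0, repairs_last_5yr=0,
                leak_alarms_last_yr=0, age_years=12),
]

# Rank segments so maintenance spending goes to the highest-risk sections first.
for seg in sorted(segments, key=risk_score, reverse=True):
    print(f"{seg.segment_id}: risk={risk_score(seg):.2f}")
```

In practice the weights would come from a trained model or engineering judgment rather than fixed constants, but the pattern of reducing disparate inputs to a ranked view of risk is the same.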

Columbia Pipeline Group (CPG), with operations based in the Marcellus and Utica shale plays, is the first customer to implement predictive analytics from GE and Accenture. The platform is being used to analyse pipeline integrity across CPG’s 15,000-mile network of interstate natural gas pipelines.

Engineers typically don’t take part in the set-up of a predictive analytics engine, according to Smith, but they do typically get involved in detailing the type of dashboards and key performance indicators (KPIs) they want the platform to generate. Engineers also need to work with analytics vendors to identify missing data that could improve predictions.

While there is work involved in helping vendors establish a predictive analytics solution, the result is a dashboard and geospatial view of pipeline risk that cuts the time that would otherwise be spent gathering data. “With an analytics platform, the users can concentrate on driving efficiencies and improvements,” said Smith.

Scott Strandberg, Canadian midstream product manager with IHS, also sees predictive, model-based analytics gaining favour in the midstream. Configuring these analytics starts with establishing good data: it is important to feed the model accurate asset information such as pipeline diameters and materials, as well as the history of the product types and volumes that have moved through a section of pipe.

There might be more than 100 data points that an analytics engine crunches to come up with a prediction of pipeline integrity risk. “You start to look at things such as leak detection data and data coming off of pigging systems. The analytics software is now able to pinpoint which sections of pipeline are more susceptible to risk,” said Strandberg.
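A hedged sketch of the modelling step follows, using a generic machine-learning classifier over a handful of the feature types Strandberg describes; the features, training data, and choice of algorithm are assumptions for illustration, standing in for the 100-plus inputs a production engine would use.

```python
# Illustrative sketch: training a classifier on per-segment features to
# flag sections susceptible to integrity risk. Data are hypothetical
# placeholders for the 100+ inputs a real analytics engine would crunch.
from sklearn.ensemble import RandomForestClassifier

# Each row: [diameter_in, wall_thickness_in, age_years, sour_service (0/1),
#            pig_anomaly_count, leak_alarms_last_yr]
X = [
    [24, 0.375, 45, 1, 12, 2],
    [36, 0.500, 10, 0,  1, 0],
    [16, 0.250, 52, 1,  9, 1],
    [30, 0.469,  8, 0,  0, 0],
]
y = [1, 0, 1, 0]  # 1 = segment later required an integrity repair

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Estimated probability that a 20-inch, 40-year-old sour-service segment
# with recent pig anomalies is at risk.
print(model.predict_proba([[20, 0.312, 40, 1, 7, 1]])[0][1])
```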

Operational analytics evolve
Aside from predictive platforms, SCADA solutions also are evolving to become better at spotting bad data, flagging deviations, and providing a better foundation for production reporting, according to Strandberg. Vendors are using monikers such as ‘intelligent’ SCADA to denote the ability of these systems to track more data sources and spot deviations. “Some SCADA systems are starting to become almost the production volume record as well, and generally, they are getting more sophisticated than the SCADA systems of the past,” said Strandberg.

SCADA vendors are starting to use rules and heuristic techniques to spot deviations impacting the accurate understanding of flow, production, and gas quality. “On the SCADA side of it, the vendors are starting to enable rules that manage more stringent data values so that you can’t have a fluid analysis that is out of range. It comes down to better variance reporting by being able to establish what your tolerances are,” Strandberg said.
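As a rough illustration of such rules, the sketch below applies a range check and a variance tolerance to incoming readings; the tag names, limits, and tolerance are hypothetical.

```python
# Illustrative sketch of 'intelligent' SCADA-style validation rules.
# Tag names, ranges, and the tolerance are assumed values.
RULES = {
    "fluid_analysis.h2s_ppm": {"lo": 0.0, "hi": 16.0},  # reject out-of-range lab values
    "meter_01.flow_mmscfd":   {"lo": 0.0, "hi": 250.0},
}
TOLERANCE_PCT = 5.0  # flag a variance when a value drifts this far from baseline

def validate(tag, value, baseline):
    """Return any rule violations for a single reading."""
    issues = []
    rule = RULES.get(tag)
    if rule and not (rule["lo"] <= value <= rule["hi"]):
        issues.append(f"{tag}: {value} outside [{rule['lo']}, {rule['hi']}]")
    if baseline and abs(value - baseline) / baseline * 100 > TOLERANCE_PCT:
        issues.append(f"{tag}: variance from baseline exceeds {TOLERANCE_PCT}%")
    return issues

print(validate("fluid_analysis.h2s_ppm", 22.0, baseline=4.0))
```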

SCADA and plant historians are essential for monitoring real-time process data, but tapping into data from electronic flow meters (EFMs), and keeping track of measurement and flow data is equally important in the midstream, according to Steve May, president of Computerised Processes Unlimited. “Because you are moving so much in a midstream environment, if your measurement is off by a little, you could be losing a lot of money. Therefore, in midstream, you also need to be able to monitor and analyse measurement data, not just the real-time process data,” he added.

Measurement software taps into data from EFMs, storing the data for analysis and production reporting. Through the ability to set thresholds on deviations, measurement software can help spot problems such as missing data, suspect data, or uncollected data. For example, May noted that repeating data on a differential meter could indicate the meter is stuck or frozen.
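A simple version of that stuck-meter check might look like the sketch below; the repeat threshold is an assumed value.

```python
# Illustrative sketch: flag a stuck or frozen differential meter when the
# same reading repeats across consecutive polls. Threshold is an assumption.
def looks_stuck(readings, repeat_limit=6):
    """True if the most recent readings are identical for too many polls."""
    if len(readings) < repeat_limit:
        return False
    recent = readings[-repeat_limit:]
    return all(r == recent[0] for r in recent)

hourly_dp = [41.2, 40.8, 39.9, 39.9, 39.9, 39.9, 39.9, 39.9]
if looks_stuck(hourly_dp):
    print("Suspect data: differential pressure unchanged for 6 polls; meter may be frozen.")
```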

Midstream companies also use measurement software for government reporting and compliance with standards such as American Petroleum Institute (API) 21.1 for gas measurement and 21.2 for liquid measurement. Customers of midstream companies also typically want periodic reports that show them how much gas they are consuming under monthly contracts. Without measurement software that can quickly generate consumption reports, operators and control room engineers must scramble to compile such reports by analysing EFM data. “Automatic reports make life a lot easier. What used to happen is that customers would call the control room and say, ‘how much gas have I taken?’ Compiling those reports distracts the control room operators,” May pointed out.
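The report itself can be a straightforward roll-up of EFM records by contract, as in the sketch below; the record layout and contract names are hypothetical.

```python
# Illustrative sketch: roll hourly EFM volumes up into a month-to-date
# consumption figure per customer contract. Record layout is an assumption.
from collections import defaultdict

efm_records = [
    {"contract": "ACME-2015-07", "volume_mcf": 1210.5},
    {"contract": "ACME-2015-07", "volume_mcf": 1188.0},
    {"contract": "BETA-2015-07", "volume_mcf": 640.2},
]

month_to_date = defaultdict(float)
for rec in efm_records:
    month_to_date[rec["contract"]] += rec["volume_mcf"]

# Answers 'how much gas have I taken?' without tying up the control room.
for contract, total in month_to_date.items():
    print(f"{contract}: {total:,.1f} Mcf month to date")
```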

Measurement software also can keep track of lost and unaccounted-for product, and this forms the basis for knowing the balance of product entering and leaving pipelines and other assets. “Midstream organisations want to know that the overall system is balanced across all of the pipelines, and they want to know if any meters or measurements are out of calibration or have issues. They need solutions that can spot deviations, quickly re-calculate volumes, and basically can turn data into information, rather than just capturing data so that you can figure it out on your own,” May said.
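At its core, the balance calculation compares metered receipts against metered deliveries, as in the sketch below; the volumes and the imbalance tolerance are assumed values.

```python
# Illustrative sketch: a simple pipeline balance. Receipts minus deliveries
# gives lost-and-unaccounted-for (LAUF) product; tolerance is an assumption.
receipts_mcf = 182_450.0    # metered volume entering the system
deliveries_mcf = 180_610.0  # metered volume leaving the system

lauf_mcf = receipts_mcf - deliveries_mcf
lauf_pct = lauf_mcf / receipts_mcf * 100
print(f"LAUF: {lauf_mcf:,.0f} Mcf ({lauf_pct:.2f}%)")

if lauf_pct > 0.5:  # hypothetical tolerance
    print("Imbalance above tolerance: check meter calibration and recalculate volumes.")
```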

Basics still apply
Major technology providers such as GE and Honeywell are using some of the same ‘big data’ analytics they use to monitor assets such as jet engines to comb through data generated by pipeline compressors. Meanwhile, business intelligence software packages that have rich data visualisation capabilities are being used by integrators in the midstream.

According to Munsoor ur-Rahmaan, a business development lead with INTECH Process Automation, much of INTECH’s work in midstream analytics is to create dashboards for business intelligence tools with data visualisation and mapping capability to highlight and analyse trends derived from multiple process-level systems. One of INTECH’s recent projects was for a large oil and gas producer in the Middle East, pulling in data from dozens of sources spanning upstream and midstream operations into a Web-based dashboard. The dashboard can also be accessed via tablets. “The users want to look at their data uniformly, and analyse it accordingly. They want a single view of exceptions and issues,” ur-Rahmaan said.

IHS’s Strandberg said he sees a mix of approaches in midstream. Some large organisations are pursuing enterprise-class dashboards for SCADA and plant intelligence, while others continue to use more of a point solution approach, with separate SCADA systems dedicated to particular types of equipment. Regardless of the approach, the end-user organisation still must pay close attention to variances and data quality, and determine if it makes sense to invest in better instrumentation, such as replacing analog flow recorders with EFMs.

Setting a solid foundation for analytics should be part of the facility design process, according to Lindel R. Larison, COO and founding partner with Tall Oak Midstream, an Oklahoma City-based midstream company. Tall Oak has two natural gas gathering and processing systems in Oklahoma, the Tall Oak CNOW system and the Tall Oak STACK system.

“If you look at the total cost of a new plant - whatever that cost might be - a very small percentage of that is going to go into instrumentation. I think the key thing is to ensure you have enough instrumentation - enough measurement devices - so that when the plant and gathering system is up and running, you have the ability to monitor everything very closely, warehouse the data, and make the adjustments as necessary to ensure that you can continually improve from an operational and customer-focus perspective,” said Larison.

While predictive analytics for pipeline integrity is still emerging, engineers have a clear role to play: advising analytics experts on the underlying data sources and helping to configure KPIs. Measurement remains a key aspect of midstream analysis, so software that can track trends and generate reports automatically can prove equally useful to analysts in the field.

Getting the basics right remains key. The foundation for analytics is laid in plant and asset design, and installing appropriate instrumentation is what ultimately yields accurate predictions and clearer results.