
Using big data and machine learning for prediction of corrosion in pipelines

[Editor’s note: DNV GL, advisors to the oil, gas and energy industries, recently conducted a five-day hackathon on how machine learning techniques could help find pipeline corrosion. They wrote about it on the company blog.]

We were curious about one of the key concepts in our current strategy – could we manage to become a bit more “data smart” about integrity management and maintenance planning for pipelines? We wanted to learn more about the opportunities and maturity of technologies like big data, machine learning, artificial intelligence and the internet of things. How easy were they to apply, and how could they potentially fit into our current product portfolio?

Scenario and use case of our hackathon

In our hackathon, we set up a mixed team of business representatives, experienced developers, data scientists and domain (pipeline) experts. In total, around 10 people were involved. We had good support from Microsoft, with some of their experts within Azure and machine learning. The approach of having an interdisciplinary team turned out to be crucial to the success of the hackathon.

Our dataset came from an onshore pipeline system with a total length of 1,455 miles. We had both pipe state data (depth of cover, coating, casings, welds and much more) and condition data. In machine learning, you would normally create a “training data set”. Our training set was defined from roughly 59,000 rows of data, of which around 3,000 had measured corrosion. We applied specialized tools for the data management, data cleaning and machine learning.
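The post does not detail the tooling, but a training set like this amounts to a heavily imbalanced binary classification problem (roughly 5% positive labels). As a minimal sketch in Python of how such a set might be assembled and split – the file and column names, such as depth_of_cover and measured_corrosion_depth, are hypothetical, not taken from the actual dataset:

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical pipe-state and condition data, one row per pipeline segment.
df = pd.read_csv("pipeline_segments.csv")

# Binary label: 1 for the ~3,000 rows with measured corrosion, 0 otherwise.
df["corroded"] = (df["measured_corrosion_depth"] > 0).astype(int)

# Illustrative pipe-state features; one-hot encode categorical attributes.
features = ["depth_of_cover", "coating", "has_casing", "weld_count"]
X = pd.get_dummies(df[features])
y = df["corroded"]

# Stratified split preserves the ~5% positive rate (3,000 of 59,000)
# in both the training and held-out sets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)
```

The stratified split matters here: with only around 5% positive rows, a naive random split could leave too few corroded segments in the held-out set to evaluate against.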

We wanted to investigate whether we could create predictive algorithms for corrosion in pipelines. In this case, we wanted to be able to predict susceptibility to corrosion in areas of a pipeline system that could not be inspected using inline inspection devices, typically referred to as “unpiggable” pipelines. Assessment of such areas of a pipeline has to date been both expensive and, in many instances, ineffective in preventing serious pipeline failures.
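The post does not name the learning algorithm used; a gradient-boosted tree classifier is one common choice for tabular data like this. Continuing from the split in the previous sketch, training such a model and ranking unpiggable segments by predicted susceptibility might look like the following (file and column names remain hypothetical):

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

# Train on the inspected (piggable) segments where corrosion labels exist.
model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# ROC AUC is more informative than accuracy when only ~5% of rows are positive.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"held-out AUC: {auc:.3f}")

# Score the unpiggable segments: predicted probabilities rank locations by
# corrosion susceptibility. `unpiggable_segments.csv` is a hypothetical file
# with the same pipe-state columns as the training data.
unpiggable = pd.get_dummies(pd.read_csv("unpiggable_segments.csv")[features])
unpiggable = unpiggable.reindex(columns=X.columns, fill_value=0)
susceptibility = model.predict_proba(unpiggable)[:, 1]
```

Ranking by predicted probability rather than a hard yes/no label fits the use case: it tells an operator which unpiggable stretches to excavate and inspect first.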

Background info – pipeline “pigs” and big data

Pipeline “pigs” are devices that are inserted into onshore and offshore oil and gas pipelines to perform different types of tasks without stopping the fluid flow. These devices were initially designed to remove residues, plugs or anything else that could prevent or slow down the flow inside the pipelines. Today, pigging is used throughout the life of a pipeline, for many different reasons. For instance, pigs are used for internal inspections to detect corrosion or cracks that could lead to failure, and to provide a basis for repair decisions that can help prevent leaks and ruptures, which can be explosive and dangerous to people, property and the environment. There are basically two technologies used on pigs to assess pipe wall condition: ultrasound and magnetism. The choice is based on the objectives of the inspection.

Read the source article at blogs.dnvgl.com.
