Fundamentals of Wireless Sensor Networks Using Machine Learning Approaches: Advancement in Big Data Analysis Using Hadoop for Oil Pipeline System With Scheduling Algorithm

E. B. Priyanka (Kongu Engineering College, India), S. Thangavel (Kongu Engineering College, India) and D. Venkatesa Prabu (Kongu Engineering College, India)
DOI: 10.4018/978-1-7998-5068-7.ch012

Abstract

Big data and analytics may be new to some industries, but the oil and gas industry has long dealt with large quantities of data to make technical decisions. Oil producers can now capture more detailed data in real time, at lower cost, and from previously inaccessible areas to improve oilfield and plant performance. Stream computing is a new way of analyzing high-frequency data for real-time complex event processing and for scoring data against a physics-based or empirical model for predictive analytics, without having to store the data. Hadoop MapReduce and other NoSQL approaches offer a new way of analyzing massive volumes of data used to support reservoir, production, and facilities engineering. Hence, this chapter describes an IoT architecture for smart applications that aggregates real-time oil pipeline sensor data as big data and subjects it to machine learning algorithms on the Hadoop platform.

Introduction

IoT refers to the use of multiple connected devices to capture and use data generated by embedded sensors, actuators, and other physical objects over a common network. IoT has grown quickly and will continue to do so in the coming years. The evolving technological environment will open new horizons and service aspects, improving quality of life for customers and boosting productivity for companies. IoT applications are capable of sensing and transmitting data at the same time (Alnasir & Shanahan, 2020). Moving into the future, there will be huge demand for these IoT devices, which will appear in our homes, in our workplaces, while we travel, and in virtually every setting we can imagine. The growing number of IoT devices will also bring new opportunities and challenges. Cloud computing is a popular choice for processing and analyzing large data volumes: organizations can easily deploy and manage powerful clusters that support distributed processing across different software environments. Scheduling is an important part of distributed computing that allows users to utilize the available resources for faster processing times (Zhang et al., 2020).
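To make the scheduling idea concrete, the following is a minimal sketch of one common heuristic for assigning tasks to distributed compute resources, the min-min algorithm, which repeatedly gives the next task to whichever machine would finish it earliest. The task names, machine names, and execution times are illustrative assumptions, not values from the chapter.

```python
def min_min_schedule(task_times, machines):
    """Min-min scheduling heuristic (illustrative sketch).

    task_times: dict mapping task name -> execution time (assumed equal on
                every machine for simplicity).
    machines:   list of machine names.
    Returns (assignment, makespan), where assignment maps each machine to
    the list of tasks it runs, and makespan is the overall finish time.
    """
    ready = dict.fromkeys(machines, 0.0)        # next-free time per machine
    assignment = {m: [] for m in machines}
    pending = dict(task_times)
    while pending:
        # Pick the (task, machine) pair with the smallest completion time.
        task, machine = min(
            ((t, m) for t in pending for m in machines),
            key=lambda tm: ready[tm[1]] + pending[tm[0]],
        )
        ready[machine] += pending.pop(task)
        assignment[machine].append(task)
    return assignment, max(ready.values())

# Hypothetical workload: four sensor-processing tasks on two cluster nodes.
plan, makespan = min_min_schedule({"a": 4, "b": 2, "c": 3, "d": 1}, ["m1", "m2"])
print(plan, makespan)
```

In a real cluster, per-machine execution-time estimates would replace the uniform times assumed here, but the selection loop stays the same.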

Offshore oil and gas pipelines are environmentally vulnerable: any leak or burst causes an oil/gas spill with severe negative impacts on marine life. Breakdown maintenance of these pipelines is also cost-intensive and time-consuming, resulting in huge tangible and intangible losses to pipeline operators (Alves et al., 2018). Pipeline health monitoring and integrity analysis have been researched extensively for successful pipeline operation, and the risk-based maintenance model is one outcome of that research. The submarine pipeline is the major means of transporting subsea oil and gas after exploration and exploitation. However, because of scouring caused by currents and waves, third-party damage, seaquakes, or design defects, submarine pipelines have a relatively high probability of leakage failure (Baranowski et al., 2019). Once a leakage occurs, it may escalate into severe fire and explosion, posing a threat to human safety, the environment, assets, and reputation (J. Chen et al., 2013). The leakage failure risk of a submarine pipeline cannot be eliminated, but preventive and mitigative measures can reduce both the occurrence probability and the consequence severity of a leakage accident. Risk analysis is an efficient tool for identifying risk factors and developing strategies to prevent an accident; it consists of three steps: hazard identification, frequency analysis, and consequence analysis. An integrated risk-based assessment method developed by Priyanka et al. (2020b) is used to predict the failure probability and consequences of submarine oil pipelines, focusing mainly on corrosion failure.
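The three steps above culminate in a risk estimate that is conventionally the product of event frequency and consequence severity. The sketch below ranks hazards by that product; the hazard names, frequencies, and severity values are illustrative placeholders, not field data from the chapter.

```python
def leakage_risk_index(hazards):
    """Rank identified hazards by risk = frequency * consequence.

    hazards: list of (name, annual_frequency, severity) tuples, with
    frequency in events/year and severity on an arbitrary 1-10 scale.
    Returns [(name, risk_score), ...] sorted from highest risk to lowest.
    """
    ranked = sorted(hazards, key=lambda h: h[1] * h[2], reverse=True)
    return [(name, round(freq * sev, 4)) for name, freq, sev in ranked]

# Hypothetical hazard register for a submarine pipeline segment:
#   hazard identification -> names, frequency analysis -> events/year,
#   consequence analysis  -> severity score.
register = [
    ("corrosion",          0.02,  7),
    ("third-party damage", 0.005, 9),
    ("design defect",      0.001, 6),
]
print(leakage_risk_index(register))
```

A dynamic risk model would update the frequency column as new sensor evidence arrives, which is precisely the limitation of static methods discussed next.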
A probabilistic and numerical model has also been developed for dropped-object risk assessment of submarine pipelines, in which the collision probability of a dropped object is estimated by scenario sampling while the accident consequence is simulated through a finite element approach. However, the studies above mainly adopt traditional risk analysis methods or focus on the risk of a single cause; risk analysis covering the comprehensive causes and consequences of submarine pipeline leakage is not addressed (Ye, 2020). In light of the above, an integrated approach is needed for risk analysis of leakage failure in submarine oil and gas pipelines. Conventional risk analysis methods are static: they cannot capture the variation of risk as the operation and environment change. Moreover, conventional techniques use generic failure data, which makes them non-case-specific and introduces uncertainty into the results. The present work focuses on applying the Hadoop platform for big data analytics to real-world oil pipeline pressure data in order to analyze transportation performance (Chen et al., 2010). In addition, a scheduling algorithm is incorporated with cloud computing techniques and a data allocation cluster group to reach the final decision in a faster and more robust manner.
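The scenario-sampling step mentioned above is essentially a Monte Carlo estimate: draw many plausible landing positions for the dropped object and count the fraction that strike the pipeline footprint. The sketch below assumes a normal lateral-drift model with invented parameters (pipe half-width, drift spread); it is a simplified illustration of the sampling idea, not the cited model.

```python
import random

def dropped_object_hit_probability(n_samples=100_000, pipe_x=0.0,
                                   pipe_half_width=0.3, drift_sigma=2.0,
                                   seed=42):
    """Estimate dropped-object collision probability by scenario sampling.

    Each scenario draws a lateral landing offset from an assumed normal
    drift model N(pipe_x, drift_sigma) and counts it as a hit when the
    offset falls within the pipe's half-width. All parameter values are
    illustrative assumptions, not survey data.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    hits = sum(
        1
        for _ in range(n_samples)
        if abs(rng.gauss(pipe_x, drift_sigma)) <= pipe_half_width
    )
    return hits / n_samples

print(dropped_object_hit_probability())
```

In the full assessment, each sampled hit would feed a finite element consequence simulation; here the sampling loop alone conveys how the collision probability is formed.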
