
Big Data and Analytics Applied to Oil and Gas

How Agile Companies Thrive in an Internet-Connected World
Quorum Whitepaper

Visit us at www.qbsol.com to learn more.


Table of Contents
Executive Summary
Introduction
Technology Components
    Cloud and IoT
    Integration and Storage
    Database
    Data Models
    Machine Learning
The Connected Energy Value Chain
    Land Acquisition and Management
    Drilling and Completions
    Production Operations
    Back Office
    Midstream
Conclusion
References
About Quorum



Executive Summary
Informational value is the most important commodity throughout the oil and gas industry—
from upstream operators to the pipelines that deliver products to world markets. Newly
embedded devices and sensors make it easier to collect data at an ever-increasing rate,
while cloud storage technologies make it simpler to store that data. Real value, however,
is created when informational context and associations are established both inside and
outside an organization.

Current market volatility provides proof that optimization is needed across the oil and gas
value chain. Companies that leverage data to optimize business processes will not only
survive, but thrive during market downturns and dominate the industry on the next upswing.

Introduction
It is well known that energy producers live and die by the decisions they make. E&Ps
take incredible risks to find, drill, complete, and produce oil and gas. Midstream companies
invest billions to gather, process, and send that energy to market.

Specialized applications are available to solve specific and complex problems in the energy
industry. However, good decision making in select areas of the business is insufficient to
thrive in today’s energy market. Companies must execute in all areas of the business, on
top-performing projects, with impeccable timing, and be ready to shift resources as market
conditions change – all while continuing to operate their existing assets.

In the current market environment, companies are cutting budgets for capital projects.
These decreased investments could have a material impact on future production and on the
accumulation of reserves needed to enable accelerated growth when prices recover. Agile
companies making these decisions are not developing a single model; they are generating
numerous models that must factor in many complex, data-driven, and often interdependent
elements.



Additionally, models are not created and then filed away; they are continually tested and
retested to validate business decisions. Predictive analytics allows companies to minimize
the impact of bad decisions through early detection and, ultimately, to reduce how often
they occur.

In the real world, an E&P must consider many factors that are all dependent on moving
targets, especially given the volatility of the markets today. First, companies must focus
on what they have control over, and then move beyond to areas where the only guarantee
is change.

Examples of real-world variables include:

• Land: drilling obligations and lease expirations
• Capital Outlay: drilling and completion cycle times
• Production and Expenses: gas pipeline capacity, field optimization, and complete water management
• Risk Management: commodity pricing, net revenue, existing and future hedges
• Analysis: completion design, well spacing, initial production rates, production decline curves, and equipment failures
• Financial: cash flow, reserves, credit terms



Technology Components
Quorum refers to the ability for oil and gas companies to identify relationships, understand
context, and analyze data across the energy value chain as “connected energy intelligence”.
Growth in Internet-connected devices that collect and exchange data, commonly referred to
as the Internet of Things (IoT), makes connected energy intelligence possible. Leveraging
the data from all areas of the oil and gas business with IoT and cloud computing then
bringing them together in new ways with analytics, big data, and machine learning—these
technology components are the basis for implementing connected energy intelligence.

Any big data system delivers one or more of the five Vs:i

• Volume: the amount of data
• Velocity: the rate at which data is received
• Variety: the different types of data
• Veracity: the quality of the data
• Value: the analytics performed on the data

Source: Big Data: Using SMART Big Data, Analytics and Metrics to Make Better Decisions
and Improve Performance (Marr)

Cloud and IoT


The cloud and IoT go hand-in-hand, since the former enables the latter. The cloud delivers
scalability on demand, without the fixed overhead of computing, storage, and communication
capacity. The result is ubiquitous computing: a world where computing is no longer
distinguishable from one thing to the next, or from one decision to the next. The cloud is a
necessary ingredient of connected energy intelligence for one reason: scale.

In 2015, nearly half of the drilling rigs in North America were idled in less than 12 months.
Decisions made during this time shifted hundreds of billions of dollars as companies went
from drill mode to survival mode. When the market shifts upward, how quickly will the
industry be able to pivot? Cloud computing and Internet-connected devices deliver the
business agility and insight the oil and gas industry requires to adapt quickly to
changing conditions.



The landscape for cloud and big data storage technology continues to change rapidly, with
new and improved technologies—many of them open source—building on related software.
Open source vendors deliver value-added software and services as commercial add-ons to
open source software. The use of open source software makes the adoption of new big data
technology economically feasible for industries like oil and gas that have historically been
slow to innovate in areas not directly tied to commercial operations. Moreover, open source
solutions are typically available on a subscription basis, which allows for a predictable total
cost of ownership (TCO) and makes it possible to transition IT from a capital expense to an
operating expense. However, IoT involves more than just deploying data collectors and
gathering information in the cloud; companies must also consider how all data streams will
be ingested and stored for later consumption.

Integration and Storage


Considerations for data ingestion and integration are akin to the midstream
sector of oil and gas: large, volatile flows of hydrocarbons must be conditioned
and balanced into steady flows of product to downstream markets. The technology
strategy must address communication protocols, telecommunication limitations,
stream analytics, and system integration.

To satisfy these requirements, cloud platform offerings provide a toolbox of
components that, when assembled together, offer a robust set of integration,
protocol, and storage options. Example offerings include:

• IBM Bluemix: (www.ibm.com) cloud platform as a service (PaaS) that supports several programming languages and services
• Amazon Web Services: (aws.amazon.com) collection of cloud computing services that comprise the on-demand computing platform
• Microsoft Azure: (azure.microsoft.com) cloud platform and infrastructure for building, deploying, and managing applications and services
• Google Cloud Platform: (cloud.google.com) set of modular cloud-based services with a host of development tools
• Dell Boomi: (www.boomi.com) integration platform for connecting cloud and on-premises applications and data
• MuleSoft: (www.mulesoft.com) integration platform as a service (iPaaS) for connecting applications, data sources and APIs
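As a simple illustration of the ingestion step, the sketch below pushes a wellhead sensor reading to a cloud ingestion endpoint over HTTPS. The endpoint URL, credential, and payload fields are hypothetical and stand in for whichever platform above is chosen.

# Hedged sketch: publish a field sensor reading to a cloud ingestion endpoint.
# The URL, API key, and payload fields are illustrative assumptions, not any
# specific vendor's API.
import time

import requests

reading = {
    "well_id": "WELL-0001",          # hypothetical asset identifier
    "timestamp": time.time(),        # seconds since epoch
    "tubing_pressure_psi": 1243.7,
    "casing_pressure_psi": 881.2,
    "gas_rate_mcfd": 950.4,
}

response = requests.post(
    "https://ingest.example.com/v1/readings",      # hypothetical endpoint
    json=reading,
    headers={"Authorization": "Bearer <api-key>"},  # placeholder credential
    timeout=5,
)
response.raise_for_status()

In practice, a local gateway would batch and buffer readings when telecommunication links drop, which is one of the limitations the integration strategy must address.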



Database
Integration platforms give companies the flexibility to change the format and
destination of any piece of data at any time. Many companies struggle initially
with uncertainty about where all the data will be stored, and this hesitation results
in a failure to collect enough data to make analysis meaningful. Analytics
platforms address this problem by enabling connectivity between disparate
data sources, making it unnecessary to choose one—and only one—database
technology, as in building a single enterprise data warehouse. That approach is
simply no longer required.

Additionally, non-relational (NoSQL) databases provide a mechanism for storage
and retrieval of data at scale. Non-relational databases are required for big data
initiatives because relational database management systems (RDBMS) are harder
to scale and are by definition much more rigid. It is important to note that no
single database system, unstructured or structured, fits every scenario.
A few of the many options available include:

• MongoDB: (www.mongodb.org) a NoSQL, cross-platform, document-oriented database
• Apache Cassandra: (cassandra.apache.org) an open source distributed database management system designed to handle large amounts of data across many commodity servers, providing high availability with no single point of failure
• Apache HBase: (hbase.apache.org) an open source, non-relational, distributed database modeled after Google's BigTable
• Redis: (redis.io) an open source, networked, in-memory data structure server that stores keys with optional durability
• Basho Riak: (www.basho.com) a distributed NoSQL key-value data store that offers high availability, fault tolerance, operational simplicity, and scalability
• Apache CouchDB: (couchdb.apache.org) an open source NoSQL, document-oriented database implemented in the concurrency-oriented language Erlang
• Amazon DynamoDB: (aws.amazon.com) a fully managed proprietary NoSQL database service
• Azure DocumentDB: (azure.microsoft.com) a fully managed, multi-tenant distributed database service for managing JSON documents at Internet scale
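As an illustration of the document-oriented approach, the sketch below stores and retrieves time-stamped well readings with MongoDB's Python driver. The connection string, database, and field names are assumptions for the example, not a recommended schema.

# Hedged sketch: storing and querying time-stamped sensor documents in MongoDB
# with pymongo. Connection string, database, and field names are illustrative.
from datetime import datetime, timezone

from pymongo import DESCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")      # assumed local instance
readings = client["field_data"]["well_readings"]
readings.create_index([("well_id", 1), ("timestamp", DESCENDING)])

readings.insert_one({
    "well_id": "WELL-0001",
    "timestamp": datetime.now(timezone.utc),
    "tubing_pressure_psi": 1243.7,
    "gas_rate_mcfd": 950.4,
})

# Retrieve the latest reading for a given well
latest = readings.find({"well_id": "WELL-0001"}).sort("timestamp", DESCENDING).limit(1)
for doc in latest:
    print(doc["timestamp"], doc["gas_rate_mcfd"])

Because the documents are schemaless, new sensor fields can be added later without a migration, which is one reason document stores suit evolving IoT payloads.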



Data Models
Data acquisition, storage, and integration are links along the energy information
value chain. Data models define the relationships between many structured and/
or unstructured data elements. It is important to understand that data models
can and should change as new discoveries are made, but they do have to start
with a project and a reason to justify their creation. Adequate documentation
of the model and of the metadata around each data element should be at the
core of any new project.
Models should be built around each area of the business through a process of:

1. Defining the project
2. Collecting the data
3. Analyzing the data
4. Building a model
5. Deploying the model and evaluating it
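A minimal sketch of that five-step process, using synthetic data and the open source scikit-learn library, might look like the following; the feature names and target are illustrative assumptions only.

# Hedged sketch of the five-step modeling process using scikit-learn.
# The data here is synthetic; a real project would replace step 2 with reads
# from the integration and storage layers described earlier.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# 1. Define the project: predict a daily production rate from operating inputs.
rng = np.random.default_rng(42)

# 2. Collect the data (synthetic stand-in for historian/ERP extracts):
#    columns are tubing pressure, separator temperature, and choke opening.
X = rng.uniform([500, 50, 0.2], [1500, 250, 0.9], size=(400, 3))
y = 0.4 * X[:, 0] - 0.8 * X[:, 1] + 300 * X[:, 2] + rng.normal(0, 20, 400)

# 3. Analyze the data: hold out a test set for honest evaluation.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 4. Build a model.
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# 5. Deploy the model and evaluate it against data it has not seen.
print(f"Mean absolute error: {mean_absolute_error(y_test, model.predict(X_test)):.1f}")

Documenting what each feature means and where it came from, as recommended above, is what keeps a model like this maintainable as it is retested over time.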

Machine Learning
Today, most companies would gain tremendous benefits simply from connecting data
silos together, providing visibility into basic relationships that are already
known to exist, such as vendor and cost. It is not surprising that the technology
leaders of the last decade dominate machine learning and artificial intelligence
platforms. IoT, and the potential value created by the petabytes of data streaming
from billions of devices, is why these leaders are advancing platforms and tools
that lower the barriers to entry into this cutting-edge field. A few of the
platforms and toolkits to consider:

• IBM Watson: (www.ibm.com/Outthink) a technology platform that uses natural language processing and machine learning to reveal insights from large amounts of unstructured data
• Azure Machine Learning: (azure.microsoft.com) a cloud platform and toolset that includes complex algorithms and technology to apply against custom data models
• Amazon Machine Learning: (aws.amazon.com) a cloud platform based on the same proven, highly scalable ML technology used by Amazon's internal data scientist community
• TensorFlow: (www.tensorflow.org) an open source software library, developed by Google, for numerical computation using data flow graphs
• DMTK: (www.dmtk.io) a framework for training models on multiple servers, a topic modeling algorithm, and a word-embedding algorithm for natural language processing
• Torch: (torch.ch) a scientific computing framework with wide support for machine learning algorithms that puts GPUs first, with open source contributions from Facebook



The Connected Energy Value Chain
Land Acquisition and Management
An E&P's reserves and the land they sit under, as well as a transporter's capacity and rights-
of-way, are the largest assets held by energy companies. From the beginning of the land
acquisition phase in a new basin, companies must efficiently document, catalog, prioritize,
and value their leaseholds and mineral rights. As market conditions change, exploratory
wells are tested and field development begins, creating data silos that prove difficult to
overcome in many corporate software ecosystems.

With disparate and disconnected systems, market volatility masks the problem when prices
increase and leaves executives scrambling for answers when prices rapidly decline. A
number of analytics and data services available today use public and internal data sources
to provide visibility into land availability and drilling activity.

Once the assets have been acquired and field development begins, a real-time field
development engine can continually evaluate and refine reserves, re-value assets, and even
alter development plans as conditions are met and milestones are achieved. In the world of
IoT, data streams are already available that provide each department with the real-time data
it needs but rarely provide value to any other organizational unit.

During the land acquisition phase in a newly discovered basin, landmen descend and
acquire as much land as possible for the lowest cost. Each mineral owner and each lease
can be different. Amid the confusion of deal negotiations, there is an emotional component
to land deals, and time is typically the only means of placing true value on each one.

By using artificial intelligence technologies such as cognitive computing and natural
language interpretation, emotion can be removed from the process by applying an objective
score to each deal. Scoring can be done by gathering inputs, scanning the lease with a
mobile device, and processing the content of each page. Some of the inputs in the list below
will continue to change over time as the field is developed and reserves are proven; a
simplified scoring sketch follows the list.

• Lease terms, including signing bonus and royalties
• Location of each lease
• Lease type and total included acreage
• Location relative to the nearest producing well
• Expected production rating
• Relation to other leases and competitive risk
• Joint venture opportunities
• Expected development cost
• Production output capacity
• Mineral owner demographics
• Service provider performance
• Resource costs
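A deliberately simplified scoring sketch is shown below. The attribute names, weights, and normalization are illustrative assumptions; a production system would derive them from historical deal outcomes and the inputs listed above.

# Hedged sketch: turning lease attributes into an objective 0-100 score.
# Weights, attribute names, and the normalization are illustrative assumptions.
WEIGHTS = {
    "royalty_burden": -0.20,                 # higher royalties reduce the score
    "bonus_per_acre": -0.10,
    "distance_to_producing_well": -0.15,
    "expected_production_rating": 0.30,
    "expected_development_cost": -0.15,
    "joint_venture_potential": 0.10,
}

def score_lease(attributes_0_to_1):
    """Attributes are pre-normalized to a 0-1 scale before weighting."""
    raw = sum(WEIGHTS[name] * value for name, value in attributes_0_to_1.items())
    # Shift the weighted sum into a 0-100 range for ranking across leases
    worst = sum(w for w in WEIGHTS.values() if w < 0)
    best = sum(w for w in WEIGHTS.values() if w > 0)
    return 100.0 * (raw - worst) / (best - worst)

lease = {
    "royalty_burden": 0.19,
    "bonus_per_acre": 0.40,
    "distance_to_producing_well": 0.25,
    "expected_production_rating": 0.70,
    "expected_development_cost": 0.35,
    "joint_venture_potential": 0.60,
}
print(f"Objective lease score: {score_lease(lease):.1f} / 100")

Because the inputs change as the field is developed, the score is recomputed whenever new data arrives rather than being assigned once at signing.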



When commodity prices drop rapidly, as they have done twice in the last decade, companies
that make better and faster decisions will be positioned to widen the gap between themselves
and the rest of the market, especially when capital budgets get cut. With the scoring system
described above in place, the assets with the lowest objective scores should be the first sent
to the data room. Objectivity also benefits executives who now face the tough decision of
selling assets that were expensive to acquire.

Connected energy intelligence answers the questions below in real time, all the time:

• What assets should be divested?
• What assets have the highest value?
• What assets have the lowest risk?
• What divestiture will have the least impact on future shareholder value?

Most of these questions could be answered by one division, but they become more difficult
to answer when considering:

• Internal factors
  Field development backlog
  Expected decline curves versus actual (a decline-curve sketch follows this list)
  Delay rentals and lease expirations
  Seismic data and existing analysis
• Downstream activities
  Processing facility construction
  Gathering systems
  Pipeline capacity
• External factors
  Service provider performance and cost
  Resource and equipment costs
  Hedging and risk
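For the expected-versus-actual decline comparison, a minimal sketch using the classic Arps hyperbolic decline is shown below; the type-curve parameters and monthly rates are synthetic and purely illustrative.

# Hedged sketch: comparing an expected Arps hyperbolic decline with actual rates.
# The monthly rates are synthetic; qi, Di, and b values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def arps_hyperbolic(t, qi, di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*di*t)**(1/b)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

months = np.arange(0, 24)
# Synthetic "actual" production with a little measurement noise
actual = arps_hyperbolic(months, 950.0, 0.30, 1.1)
actual = actual * np.random.default_rng(1).normal(1.0, 0.03, months.size)

expected_params = (1000.0, 0.25, 0.9)       # the type curve assumed at sanction
fitted_params, _ = curve_fit(arps_hyperbolic, months, actual, p0=expected_params)

t = 12
print(f"Expected rate at month {t}: {arps_hyperbolic(t, *expected_params):.0f} boe/d")
print(f"Fitted (actual) rate at month {t}: {arps_hyperbolic(t, *fitted_params):.0f} boe/d")

The gap between the two curves is one of the interdependent inputs that feeds asset valuation and divestiture decisions.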

Companies with solid balance sheets look to distressed companies for acquisition targets
that can add long-lasting shareholder value. Similar to a land acquisition where each
lease is objectively scored, the entire data set for a potential deal must be processed,
analyzed, and scored. The resulting information can be used to provide an acquisition and
development roadmap given current and expected market conditions.



Drilling and Completions
Finding, drilling, and completing oil and gas wells is extremely capital intensive. Therefore,
as companies strive to improve production output and extend the life of their assets, it
is not surprising that big data and analytics are already leveraged for these activities.
Unfortunately, when oil is $100 per barrel, everyone is an expert. Bad decisions tend to
result from gut feelings rather than data. Improvements are made, some through trial and
error and others via brute-force analysis. Industry-leading producers have been employing
analytics for years.

Data models use hundreds, sometimes thousands, of controllable inputs: drill location,
depth, frac stages, lateral length, proppant type and amount, pump pressures and rates,
and so on. Outside economic factors can also play a significant role from one basin to
another and even from one pad to another. At today's lower prices, the frac water source
and the location of nearby flowback water available for blending and reuse will drive
whether or not a well gets drilled at all.
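As a small-scale illustration of such a model, the sketch below fits early production against a handful of controllable completion inputs using ordinary least squares; the well data is synthetic, and a real model would use far more inputs and far more history.

# Hedged sketch: relating controllable completion inputs to early production with
# an ordinary least-squares fit. The well data below is synthetic and illustrative.
import numpy as np

# Columns: lateral length (ft), frac stages, proppant (million lb)
designs = np.array([
    [7500, 30, 10.0],
    [9000, 38, 13.5],
    [6200, 24,  8.0],
    [10000, 42, 15.0],
    [8100, 33, 11.2],
    [8800, 36, 12.8],
])
ip90 = np.array([780, 960, 610, 1050, 840, 930])   # 90-day average boe/d (synthetic)

# Fit ip90 ~ b0 + b1*lateral + b2*stages + b3*proppant
X = np.column_stack([np.ones(len(designs)), designs])
coef, *_ = np.linalg.lstsq(X, ip90, rcond=None)

candidate = np.array([1.0, 9500, 40, 14.0])        # a proposed completion design
print(f"Predicted 90-day rate: {candidate @ coef:.0f} boe/d")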

The drilling process no longer consists of a tool-pusher capturing drilling depth on a paper
sheet. Data streaming is the new normal: fiber optics are used to sample hundreds of
sensors every few seconds, gathering rates, pressures, weight, torque, the molecular
composition of captured gas, and more. It is now common for a drilling engineer in Houston
to monitor drilling activity in Pennsylvania remotely. However, these systems are still very
much point to point and lack enterprise integration.

An advanced form of artificial intelligence being used by leading unconventional E&Ps
is called Prescriptive Analytics®, a term coined by Atanu Basu, the founder and CEO of
big data analytics software company Ayata (www.ayata.com).ii According to Basu, "It
[Prescriptive Analytics] uses any kind of data to predict, prescribe, and automatically adapt.
The more it sees, the smarter it gets."

Prescriptive Analytics compresses learning curves to help operators arrive at better
answers, faster, with far less risk and financial exposure compared to current practices.
The figure below shows Prescriptive Analytics in its simplest form.

[Figure: Prescriptive Analytics in its simplest form. Source: Ayata (www.ayata.com)]



Companies that continue to push for improvements are building their field development
plans and designing their drill sites and completions long in advance. If the wrong decision
is made while the bit is in the ground, additional costs and delayed revenue are the best
outcome a company can hope for out of a well; the worst case is complete loss. And sadly,
companies typically have more than enough data to drive better decisions. Companies with
a culture of continuous improvement through analytics find that integrating 3D seismic
data with their completion design improves the long-term economics of every well drilled by
increasing production and reducing the risk associated with well design.

The figure below illustrates how Ayata Prescriptive Analytics software enables completion
designers to get the recipe for an optimized well design. As improvements are made, more
data is collected, and the system gets smarter, yielding additional improvements.

[Figure: Optimized completion design recipe. Source: Ayata (www.ayata.com)]

Production Operations
For any company that expects to make it through this or any future downturn, the times
when data is not used on a day-to-day, hour-by-hour basis for oil and gas production are
long gone. Technology and information have been lockstep with capital expenditures for
decades, and production operations and the corresponding operating expenses were largely
ignored. However, companies seek efficiency when commodity prices decline. Efficiency is
not a luxury. It is mandated for survival.



It has never been easier for companies to collect, process, and analyze production and
operational data. Oil and gas producers are able to make incremental improvements
that compound over the entire productive life of a well. The historical omission of real
continuous improvement strategies during production is why most leading experts and
technology innovators are particularly focused on production optimization that uses
process control automation and preventative equipment maintenance.

Production optimization does not depend solely on producing more oil and gas all the time;
to truly optimize production, companies must produce smarter with less overhead. Efficient
E&Ps have already introduced mobility into every aspect of operations, and accuracy
improves when manual data is collected at the source using smart devices such as tablets.
However, capturing a set of important data points once per day by sending a field technician
is incredibly inefficient.

Utilizing devices and sensors to capture pressures, rates, and equipment control data at
high frequencies yields far more accurate data sets, improving an engineer's ability to
make production decisions. The scarcity of engineering knowledge increases the time
between receiving event information and realizing a decision's positive outcome or negative
impact. Intelligent systems must also deliver information back to the field technician,
providing actionable information rather than just a better means of collecting data.

According to a study conducted by McKinsey & Company, automation and optimization will
yield the most substantial results for any upstream company.iii Process control automation
of oil and gas assets has been a part of localized production operations for decades at the
level of an individual well, facility, or platform, driven by safety and maximizing output. With
the limiting factors of computing and storage infrastructure removed, the industry can move
beyond observational analysis to continuous improvement via data models with many input
variables associated with data points sourced outside the production system. Warm bodies
in front of a monitor and keyboard do not move the needle far enough to enable companies
to thrive in extremely low commodity markets.

Artificial lift was an early form of automation for production operations and, in nearly all
cases, is limited to control automation at a very narrowly focused level. As part of a connected
analytical system that leverages sensor, business, and third-party data, artificial lift should
be controlled by systems that produce the most for the company—not in terms of volume,
but revenue.

Consider an unconventional pad of eight natural gas wells and the effects of equipment
failures on one or two of them. In this situation, a control system can manage total productive
output by enabling the remaining wells to continue to produce longer or with more frequent
cycles while the others are unavailable. It can even allow the wells coming back into production
to have more pipeline and production availability to 'make up' for lost production.



Companies wielding big data across the enterprise, combining back-office ERP data
with sensor data collected throughout the entire natural gas field, can use analytics to
optimize the artificial lift of wells across multiple well pads. Connected intelligent platforms
go beyond the control panel to optimize artificial lift across the entire field, making changes
to output across multiple wells based on factors such as those below (a simplified allocation
sketch follows the list):

• Downstream pipeline capacity
• Expected rate of volume gain
• Contractual obligations to trading partners and royalty owners
• Impact on future production declines
• Wear on production equipment
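The sketch below frames the problem as a simple linear program that honors pipeline capacity and per-well limits while accounting for equipment wear. The prices, capacities, and wear costs are illustrative assumptions, not field data.

# Hedged sketch: allocating target rates across a well pad with scipy's linear
# programming solver. All numbers are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

gas_price = 2.50                      # $/Mcf, assumed
wear_cost = np.array([0.15, 0.15, 0.40, 0.25, 0.15, 0.20, 0.15, 0.35])  # $/Mcf per well
max_rate = np.array([900, 850, 700, 800, 950, 875, 820, 760])           # Mcf/d per well
pipeline_capacity = 5500.0            # Mcf/d available to the pad

# Maximize (price - wear) * rate  ->  minimize the negative of the margin
c = -(gas_price - wear_cost)
A_ub = np.ones((1, len(max_rate)))    # total flow must fit within the pipeline
b_ub = [pipeline_capacity]
bounds = [(0, m) for m in max_rate]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
for i, q in enumerate(result.x, start=1):
    print(f"Well {i}: {q:6.0f} Mcf/d")
print(f"Net daily margin: ${-result.fun:,.0f}")

A linear program is only a starting point; in practice the objective would also carry contractual penalties and the expected impact on future decline, as listed above.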

The biggest source of lost revenue throughout oil and gas is unplanned downtime, most
of which is preventable through proper maintenance programs. An intelligent optimization
platform goes beyond the current expectation of using human capital to schedule and
deploy human resources. The manufacturers of large and expensive production equipment
are leading the charge in this space, and rightly so. It is no longer good enough to sell a
good piece of equipment that works on day one. The expectation is added value through
instruction, training, and outright packaged solutions that leverage the sensor and control
data from each component to determine the likelihood of failure based on more than just
elapsed time.

Preventative maintenance is an immediate driver for the Industrial Internet of Things (IIoT),
largely because manufacturers of large capital equipment understand the value of data and
how much control they have over their products after they go into production. A natural gas
compressor can have thousands of moving or controlled parts, each with its own useful life.
Data models can aid in determining and extending the life of each part. Some of these
data models analyze actual life based on real recorded conditions in order to arrive at
more accurate representations of useful life and of which maintenance programs extend it.
This approach also reduces downtime.
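A minimal sketch of such a failure-risk model is shown below, training a logistic regression on synthetic compressor sensor history; the features, thresholds, and data are illustrative assumptions, not any manufacturer's method.

# Hedged sketch: estimating compressor failure risk from recorded run conditions.
# Feature names and the synthetic training data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
# Features: vibration (in/s), discharge temperature (deg F), hours since last service
X = np.column_stack([
    rng.normal(0.2, 0.08, n),
    rng.normal(230, 25, n),
    rng.uniform(0, 4000, n),
])
# Synthetic labels: failures grow more likely with vibration, heat, and run hours
risk = 8 * X[:, 0] + 0.01 * (X[:, 1] - 200) + 0.0005 * X[:, 2] - 3.0
y = (risk + rng.normal(0, 0.5, n) > 0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a unit that is running hot and is overdue for service
unit = np.array([[0.35, 255.0, 3200.0]])
print(f"Estimated failure probability: {model.predict_proba(unit)[0, 1]:.2f}")

Ranking units by a score like this is what lets maintenance crews be scheduled before a failure rather than dispatched after one.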

The gains realized by keeping large, critical equipment in service longer without unexpected
failures are obvious. When sensor and control data is pushed from individual control
panels into data models that leverage data from the rest of the enterprise, the result is
operational excellence. Excellence is never achieved; it is the target. There is always some
incremental improvement to make, another competitive advantage to gain.

Improvement can come in the form of intelligently distributing the scarce knowledge of
field operators. Some would describe this distribution as route optimization. However,
route optimization is often used to describe reactionary mechanisms that address unplanned
downtime and lost production.



Route optimization goes beyond downtime and includes the factors below; a simplified route-building sketch follows the list:

• Geology and geoscience characteristics
• Financial models, including contractual obligations and the weighted impact of each decision on a company's bottom line
• Location, location, location
  Of the oil and gas well
  Of the personnel in the field
  Of the personnel in the field in relation to other potential issues and impactful changes
• Availability of resources, coupled with the success rate and average execution time for task completion
• Preventative maintenance data models and prescribed actions to direct improvements
• Conditional scheduling to maximize every mile of ground covered by remote personnel
• Well sites that have not been visited in N days or devices that have failed communication in X hours
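The sketch below greedily orders field visits by a priority score per mile of travel. The site data, weights, and scoring formula are illustrative assumptions; a real scheduler would weigh the full list of factors above.

# Hedged sketch: a greedy, priority-weighted route builder for field visits.
# Site data, weights, and the scoring formula are illustrative assumptions.
import math

sites = [
    # (name, x_miles, y_miles, days_since_visit, failure_risk, production_boed)
    ("Pad A", 4.0, 2.5, 9, 0.10, 450),
    ("Pad B", 12.5, 8.0, 2, 0.55, 900),
    ("Pad C", 7.0, 14.0, 15, 0.05, 120),
    ("Pad D", 1.5, 9.5, 6, 0.30, 600),
]

def priority(days, risk, boed):
    # Weight overdue visits, likely failures, and high-value production
    return 0.5 * days + 40.0 * risk + 0.01 * boed

def build_route(start=(0.0, 0.0)):
    remaining, route, here = list(sites), [], start
    while remaining:
        # Score each remaining site as priority gained per mile of travel
        def value(site):
            dist = math.dist(here, (site[1], site[2])) + 0.1  # avoid divide-by-zero
            return priority(site[3], site[4], site[5]) / dist
        nxt = max(remaining, key=value)
        remaining.remove(nxt)
        route.append(nxt[0])
        here = (nxt[1], nxt[2])
    return route

print(" -> ".join(build_route()))

Because the inputs change during the day, the route can be rebuilt and redistributed to field personnel as new events arrive.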

Connected energy intelligence allows for all of the above to be leveraged in order to build
and distribute optimized routes throughout the day. Improving logistics for proper resource
allocation in a 24-7, commodity-based industry will lead to future compounded benefits.

Back Office
Back-office cost centers of oil and gas companies will become the controllers of innovation
in the future. Why? Because they sit in the middle of big data for the oil and gas industry
and they put a dollar value on all of it.

It is often surprising how much effort goes into keeping the data moving between data
silos. Data models exist today in other industries such as retail and manufacturing that are
very similar to what energy companies require. Energy producers need to manage inputs
through vendor performance and optimize output through hedging and risk management,
the same way an automobile manufacturer has to manage its suppliers and the demands
that are influenced by commodity markets.

Hundreds of vendors are used to locate, drill, complete, and produce oil and gas, which
makes the business very similar to manufacturing. The energy industry is built on
relationships and an assumption of value. Commodity downturns test the business
relationships between E&Ps and service providers as every invoice is scrutinized at a
granular level. Decisions historically made on relationships alone are why E&Ps were able
to achieve immediate efficiency gains through service provider cost reductions. As in
manufacturing, a big data solution can identify performance and quality trends for every
vendor, allowing decisions on the merits of quality and reliability rather than cost or location.



All commodity-based companies should inherently see themselves as big data companies
that must effectively navigate risks. It is no longer sufficient to look out over months at
macro-level trends alone. The shift that is underway, allowing companies to thrive in the
most volatile of markets, has moved beyond transactional systems that can take weeks
or months to identify macro trends occurring in the market. A predictive model that is
connected to a company's ERP system, real-time production data, capital projects, and
downstream partners provides increased visibility into opportunities that can be exploited.
The data model and analytical results should be open and available within an organization,
and possibly even shared externally with trading partners. The need for coordination between
a commodity trading platform and production operations is easy to see in the following
example.

Production operations exceed expectations with efficiency improvements and a reduction
in reservoir decline rates, resulting in more product to deliver to the market. However,
limited pipeline capacity in a particular area of the field results in penalties and associated
fees. Models focused solely on reservoir optimization in terms of net revenue to the
company would miss marginal availability on a particular gathering system. Predictive
models incorporating pipeline capacity and the costs associated with delivery would indicate
a greater need for operations to focus on legacy areas of a developed field with plenty of
spare capacity. Even with a stellar hedging program, the spot market provides companies
plenty of opportunity to increase revenue by tapping into secondary macroeconomic forces
such as social media and weather. In short, the ability to make up pennies in tight markets
is the difference between profitability and mere viability. Inputs to such models include:

• Macro market trends: pricing, demand
• Secondary macro driving forces: weather and social media
• Production operations and real-time rates
• Trading partner and downstream availability and pricing
• Regulatory compliance visibility



Midstream
As hydrocarbons flow from upstream production sites to their downstream destinations,
midstream processes and conditions the products for market. Oil and gas processors and
transporters suffer from many of the same operational issues, such as downtime caused
by equipment failures and field logistics inadequacies. Unconventional production growth
has added volatility to one of the more stable segments of the energy industry as
producers expand into new basins not equipped for massive increases in natural gas and
liquids production.

Consider a natural gas gathering and processing company with operations in a rapidly
expanding basin. New plants are being constructed, compressor stations are brought
online, sales meters are installed at each well pad, and everything is connected through
SCADA. The device and sensor data is also streamed to a big data analytics platform
that is connected to the back-office ERP, measurement, contract, and financial systems,
as well as commodity prices fed from the NYMEX. The company has also worked out terms
with its trading partners on data enrichment agreements that allow information to flow
between organizations.

Since companies in this sector make the most profit by maintaining a consistent flow of
product at a target capacity, big data solutions have clear application value. Consider an
example where it is simply not possible for a single human being to process, analyze, and
interpret where and how to adjust flow rates, and then send those notifications and control
alerts to upstream customers every hour of the day. By tapping into real-time data streams
from remote devices and sensors, and applying machine learning, opportunities can be
identified that improve the flow of gas through the system, resulting in more favorable
terms for customers. Acting on this sort of intelligence, once identified, may simply be a
matter of distributing actionable notifications. To increase the likelihood of acceptance,
real-time analytics can apply financial terms and provide upstream customers not only with
the desired rate but also with the anticipated revenue value of the change.

The interesting aspect of this scenario is that it is continuous and repeatedly testable,
with market forces providing the incentive. Offering customers a five percent reduction in
fees to maintain a consistent 15 percent increase in total volume speaks for itself.
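A quick, purely illustrative calculation makes the point, assuming a gathering fee of $0.40 per MMBtu on 100,000 MMBtu per day:

# Illustrative only: a 5% fee reduction traded for a 15% volume increase.
base_fee, base_volume = 0.40, 100_000          # $/MMBtu and MMBtu/day, assumed
before = base_fee * base_volume                # fee revenue before the offer
after = (base_fee * 0.95) * (base_volume * 1.15)
print(f"Daily fee revenue: ${before:,.0f} -> ${after:,.0f} (+{after/before - 1:.1%})")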

The same data streams can feed other neural networks, enhancing other systems in
order to:

• Identify imbalances in the system and feed the opportunity costs and offers for adjustments
• Push gas conditioning control parameters back to process control systems so that all processing equipment is performing optimally
• Estimate line pack
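As one concrete example of the last item, the sketch below gives a simplified line pack estimate for a single pipe segment by converting the physical gas volume to standard conditions. The diameter, pressures, and single average compressibility factor are illustrative simplifications.

# Hedged sketch: rough line pack estimate (standard cubic feet) for one segment.
# Inputs are illustrative; real estimates use segmented pressure/temperature profiles.
import math

def estimate_line_pack_scf(diameter_ft, length_ft, p_avg_psia, t_avg_rankine,
                           z_avg, p_std_psia=14.696, t_std_rankine=519.67):
    """Convert the physical gas volume in the pipe to standard conditions."""
    geometric_volume_cf = math.pi / 4.0 * diameter_ft ** 2 * length_ft
    return (geometric_volume_cf
            * (p_avg_psia / p_std_psia)
            * (t_std_rankine / t_avg_rankine)
            / z_avg)

# Example: 24-inch pipe, 10 miles, 800 psia average, 60 deg F, Z = 0.88
print(round(estimate_line_pack_scf(2.0, 10 * 5280, 800.0, 519.67, 0.88)), "scf")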



Conclusion
As connected systems for oil and gas expand through the cloud, with data flowing
seamlessly from drill bit to burner tip, upstream production activity can account for
changes in market conditions and energy demand without any human intervention.
Consider, for a moment, the global stock markets, which are powered by sophisticated
computing systems. Stock market computing services execute thousands of transactions
per second, and gaining fractions of a second is a competitive advantage over other trading
systems. Why should the energy industry neglect to capitalize on similar efficiencies, even
the most nominal?

Additional opportunities for improvement are available through the application of cloud
computing, machine learning, and predictive analytics against a continual ingestion of
data that affects the results of each model. Cloud computing encourages the efficient use
of capital while expanding computing and storage capacity. And with today's capabilities
in machine learning, companies can expand beyond the one scenario that engineers and
management conceptualize, and instead run through many potential scenarios to produce
the best path forward.

Companies will continue to have better access to data from their trading partners and
service providers, including devices in the field that stream real-time production data, real-
time market and commodity pricing, and risk and predictive models. Results from each
of these business areas play an important role in the recommendations that executives
will use to direct their organizations through the ups and downs.



References
i Marr, B. (2015). Big Data: Using SMART Big Data, Analytics and Metrics to Make Better Decisions and Improve Performance. Wiley.

ii Ayata. (2016, March). Atanu Basu and Daniel Mohan. Retrieved from http://ayata.com/prescriptive-analytics/

iii McKinsey & Company. (2014, August). Digitizing Oil and Gas Production. Retrieved March 15, 2016, from http://www.mckinsey.com/industries/oil-and-gas/our-insights/digitizing-oil-and-gas-production

All product names, logos and brands are property of their respective owners.



Quorum makes innovative software for hydrocarbon and energy
business management. Our platform of integrated solutions is
designed with deep industry expertise using next-generation
technology. It delivers advanced functionality, improved
efficiency, and enhanced regulatory compliance. And it is proven
to maximize profit throughout the energy value chain to drive
customer success.

For additional information about Quorum or to request a demo,


please contact Quorum Sales at 713.430.8612 or visit qbsol.com.

qbsol.com | © 2016 Quorum Business Solutions, Inc. All Rights Reserved.
