
Validating Reservoir Models to Improve Recovery

Valid and worthwhile reservoir simulation hinges on careful preparation of a reservoir model and clear understanding of model uncertainty. Creative integration and comparison of seemingly disparate data and interpretations elevate reservoir models to new heights of reliability. Through this process, all data incorporated into a model and ultimately fed into a simulator become more valuable, resulting in increases on the order of 10% in predicted ultimate recovery.

Jack Bouska
BP Amoco plc
Sunbury, England

For help in preparation of this article, thanks to Ian Bryant,


Schlumberger-Doll Research, Ridgefield, Connecticut, USA;
Henry Edmundson, Schlumberger Oilfield Services, Paris,
France; Omer Gurpinar, Holditch-Reservoir Technologies,
Denver, Colorado, USA; Steve McHugo, Geco-Prakla,
Gatwick, England; Claude Signer and Lars Sønneland,
Geco-Prakla, Stavanger, Norway; and James Wang,
GeoQuest, Houston, Texas. We thank the Bureau of
Economic Geology, The University of Texas at Austin, for
permission to use the Stratton field 3D Seismic and Well
Log Data Set, and BP Amoco and Shell UK E&P for
permission to use the Foinaven area data.
ARI (Azimuthal Resistivity Imager), FMI (Fullbore
Formation MicroImager), MDT (Modular Formation
Dynamics Tester), OFA (Optical Fluid Analyzer), RST
(Reservoir Saturation Tool) and TDT (Thermal Decay Time)
are marks of Schlumberger.

Summer 1999

Mike Cooper
Andy O'Donovan
BP Amoco plc
Aberdeen, Scotland

Chip Corbett
Houston, Texas, USA

Making and testing predictions are part of our everyday existence and basic to most industries. Safety equipment, medical treatments, weather forecasts and even interior designs are evaluated by simulating situations and predicting the results. Similarly, the oil and gas industry makes predictions about hydrocarbon reservoirs to decide how to improve operations.
Reservoir optimization requires carefully constructing a reservoir model and performing simulations. Interpreting and integrating quality-controlled data from a variety of sources and vintages, and at different scales, are prerequisites for preparing a comprehensive reservoir model. Most computer simulators take the reservoir model and represent it as three-dimensional blocks through which fluids flow (previous page). The data, models and simulations provide a more complete understanding of reservoir behavior.

Alberto Malinverno
Michael Prange
Ridgefield,
Connecticut, USA

Sarah Ryan
Cambridge, England

Working together, skilled interpreters use a simulator to predict reservoir behavior over time and optimize field development strategies accordingly. For instance, the effectiveness of infill drilling locations and trajectories can be determined through simulations of multiple scenarios or assessment of the impact of the uncertainty of specific parameters. Reservoir simulation is also useful in evaluating different completion techniques as well as deciding whether to maximize production rate or ultimate recovery. In this article, we consider how the integration of all available data to validate and constrain reservoir models leads to more realistic reservoir simulation (next page).


Traditional Approach → Leading-Edge Approach
- Distributed disciplines → multidisciplinary teamwork.
- Use of data in isolation, obscuring relationships between data (for example, seismic and core data) → integration of data and interpretations to confirm reservoir models.
- Inconsistent or poorly documented interpretation techniques → archiving of interpretations and consistent methods.
- Overdependence on simple reservoir maps → seismic-guided reservoir property mapping.
- Simulation dependent on computer availability and capability → simulations run on personal computers or using massively parallel processing.
- Unlimited modification of simulation input values to achieve a match with production history → reservoir models constrained by integrated data and interpretations and prudent adjustment of inputs.
- Limited use of simulation to guide data acquisition → modeling and simulation to determine optimal timing for data gathering, such as 4D seismic surveys.

> Simulation approaches. In the past, single-discipline interpretation and lack of computing capability limited the use of reservoir simulation. Now, a more sophisticated approach to simulation makes the most of multidisciplinary teams and nearly ubiquitous computers.

Reservoir simulation is a tool for reservoir management and risk reduction.1 Although the first simulations were performed during the 1950s, for a long time limited computer availability and slow speed confined their use to only the most significant projects.2
At present, reservoir simulation is performed most commonly in high-risk, high-profile situations, but could improve virtually any project. The list of typical applications is varied and extensive (next page):
- New discoveries, to determine the number of wells and the type and specification of facilities needed for production. Particular attention is paid to the reservoir's drive mechanism and the development of potential oil, gas and water profiles. All assessments are subject to the risk of limited data, sometimes from only a single well.
- Deepwater exploration and other areas where initial test wells are expensive. Estimates draw on restricted data, such as seismic data and results from a single well.
- Fields in which production surprises occur and development expenditures have already been incurred. New measurements or production strategies might be advisable.
- Secondary recovery implementation. Appropriate decisions are essential because of the expense of enhanced production startup.
- Divestment and abandonment decisions. Simulation can help determine whether a field has reached the end of its economic life or how remaining reserves might be exploited.


These applications of simulation are made possible by new programs and computers that are faster and easier to use. (A full review of the advances in simulation software that have occurred in the last few years is beyond the scope of this article, but will be covered in a future article in Oilfield Review.) The new simulators run on less expensive computers and allow rapid studies to rank opportunities. Along with these capabilities, however, arises the possibility that simulations might be performed indiscriminately or before a validated reservoir model has been built, potentially prompting misleading or erroneous results and poor decision-making. There is also the risk of performing simulations based on limited data.
Developing a first-rate reservoir model from limited data at a variety of scales is difficult. In its most basic form, model validation is achieved through integrating different types of data. Researchers are investigating the best way to integrate some new types of data, such as multicomponent seismic data, into reservoir models. A more sophisticated approach involves uncertainty analysis (see "Validating Models Using Diverse Data," page 24).

In some cases, it is best to begin with the simplest model that fits the data and the objectives of
the project and reproduces reservoir behavior. In
all cases, the starting point should be an evaluation of what answers are required from reservoir
simulation, the accuracy needed and the level of
confidence or the acceptable range of quantitative predictions. The model complexity might be
increased as more data become available. The
reward for increasing model complexity can be
evaluated after each simulation run to decide
whether more complex simulation is justified.
Estimates of well flow rates and predictions
of reservoir performance from simulations affect
design of production facilities and should be
believed, even if they seem unlikely. For example,
a deepwater Gulf of Mexico field required expansion and de-bottlenecking of facilities soon after
initial production because the initial reservoir
model was compromised by a pessimistic view of
the interpreted reservoir continuity and flow
rates. Better predictions allow operators to size
facilities correctly the first time rather than having
to re-engineer them.
The quality of predevelopment reserve estimates, field appraisals and development strategies relates closely to reservoir architecture and
structural complexity; reserve estimates tend to
be underestimated in large, less complex fields,
whereas reserves in smaller, more complex fields
are commonly overstated. Poor reservoir models
and resultant incorrect calculations of reserves,
whether too high or too low, have negative economic consequences. In the North Sea, deficient
reservoir models have led to improper facilities
sizing and suboptimal well placement, even in
fields where simulation studies were carried out.3
Better validation of models, particularly using 3D
seismic data, might have averted over- or undersizing production facilities or drilling unnecessary
wells in some cases. In other cases, reservoir simulation has allowed identification of the key drivers of reservoir performance so that data-gathering efforts can be targeted to reduce uncertainty in those areas. Alternatively, facilities can be designed to be flexible within a given reservoir uncertainty.

Oilfield Review

Increasing the Value of Data

Operating companies spend considerable time and money acquiring data, from multimillion-dollar seismic surveys and cores from costly exploratory wells to sophisticated well logs and production tests during and after drilling. Data acquisition presents potential risks, to both project economics and the well itself, such as logging or testing tools becoming stuck, a core barrel malfunctioning or having to shut in or kill a producing well. One would expect, then, that data would be analyzed and incorporated into models as fully as possible or not collected in the first place. Most reservoir simulations rely heavily on production data from wells and only four types of geological or geophysical reservoir maps: structure of the top of the reservoir, reservoir thickness, porosity and the ratio of net pay to gross pay. These maps are often constructed from seismic and well log data alone. Incorporating all available data, such as core analyses, seismic-guided reservoir property distributions and fluid analyses, is a cost-effective way to strengthen and validate reservoir models across disciplines.
A reservoir model usually combines production rates and volumes with geological and geophysical maps of subsurface strata derived from
well logs and seismic data. Aquifers are often
included in the model and sealing rocks are typically treated as zero-permeability layers. The
subsurface maps take into account well locations and trajectories. The reservoir model is strengthened if a geological map of permeability values is
created by applying a porosity-to-permeability
transform to the porosity map according to permeability values interpreted from well tests, well
logs or cores. Even more rigorous results are
obtained when, in addition to inclusion of well
rates and produced or producible hydrocarbon
volumes, all available production data are input
into the model. These include pressure, gas/oil
ratios, and fluid densities, saturations, viscosities
and compressibilities.
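The porosity-to-permeability step described above can be sketched in a few lines. The power-law form and the coefficients below are hypothetical stand-ins for a transform that would, in practice, be calibrated to core or well-test permeabilities at the wells:

```python
import numpy as np

def porosity_to_permeability(phi, a=1.0e5, b=6.0):
    """Map a porosity fraction to permeability in millidarcies with a
    simple power-law transform k = a * phi**b.  The coefficients a and b
    are invented here; real values come from fitting core or well-test
    permeabilities at the wells."""
    return a * np.asarray(phi) ** b

# Apply the transform cell by cell to a small porosity map.
porosity_map = np.array([[0.08, 0.12],
                         [0.15, 0.20]])
perm_map = porosity_to_permeability(porosity_map)
print(perm_map.round(2))
```

The same cell-by-cell idea applies to a full field-scale porosity grid; only the calibration of the coefficients changes.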
In many instances, though, reservoir models
fail to encompass the full diversity of reservoir
data because only a few basic geological and
geophysical maps, constructed from a subset of
the data available, are used to describe variations in the data. Additional data and interpretations are needed to make reservoir models
more robust. For example, core data can serve as
calibrators for geological, petrophysical and
engineering data and interpretations, but are
often used only as guides to permeability. Core
analysis refines model values of porosity, permeability, capillary pressure and fluid saturation.
Whole cores, while not necessarily representative of the entire reservoir, offer tangible
evidence of grain size and composition, sorting,
depositional environment and postdepositional
reservoir history, such as bioturbation, cementation or diagenesis.


Seismic and well test data enable mapping of


permeability barriers, but are rarely used in tandem. For example, horizon dip, azimuth, coherency
or other seismic attributes might indicate fault
patterns.4 Such information is especially useful
when contemplating the addition of directionally
drilled or multilateral wells. These types of interpretations are just the beginning; all other data
types should be similarly scrutinized.
As mentioned earlier, the reliance of simulators on four simple subsurface maps has impaired simulation effectiveness. Simulation becomes more realistic as additional data are incorporated into the reservoir model; reconciling all available data tends to rule out some interpretations. For example, permeability values can be inferred from well logs and confirmed by core and well test data, and possibly related to seismic attributes, rather than merely computed from an empirical transform of a porosity map and well test data. Reconciling conflicting data requires acceptance of a hierarchy of data confidence. This hierarchy might be developed on the basis of probable measurement errors.
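One simple way to encode such a hierarchy, sketched here with invented numbers, is to weight each permeability estimate by the inverse of its expected error variance, so that the most trusted measurements dominate the reconciled value:

```python
import numpy as np

# Hypothetical permeability estimates (millidarcies) for one reservoir zone,
# each paired with a rough standard error reflecting confidence in the source.
estimates = {
    "well test":     (120.0, 15.0),  # most trusted: small error
    "core":          (140.0, 30.0),
    "log transform": (90.0,  45.0),  # least trusted: large error
}

values = np.array([v for v, _ in estimates.values()])
errors = np.array([e for _, e in estimates.values()])

# Inverse-variance weighting: low-error sources dominate the result.
weights = 1.0 / errors**2
reconciled = np.sum(weights * values) / np.sum(weights)
print(f"reconciled permeability: {reconciled:.1f} mD")
```

The weights here are only one possible encoding of a confidence hierarchy; any scheme that downweights the less reliable sources serves the same purpose.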
1. For a general introduction to reservoir simulation: Adamson G, Crick M, Gane B, Gurpinar O, Hardiman J and Ponting D: "Simulation Throughout the Life of a Reservoir," Oilfield Review 8, no. 2 (Summer 1996): 16-27.
2. Watts JW: "Reservoir Simulation: Past, Present, and Future," paper SPE 38441, presented at the SPE Reservoir Simulation Symposium, Dallas, Texas, USA, June 8-11, 1997.
3. Dromgoole P and Speers R: "Geoscore: A Method for Quantifying Uncertainty in Field Reserve Estimates," Petroleum Geoscience 3, no. 1 (February 1997): 1-12.
4. Key SC, Nielsen HH, Signer C, Sønneland L, Waagbø K and Veire HH: "Fault and Fracture Classification Using Artificial Neural Networks: Case Study from the Ekofisk Field," Expanded Abstracts, 67th SEG Annual International Meeting and Exposition, Dallas, Texas, USA, November 2-7, 1997: 623-626.

Situation: New discoveries
  Desired results: determine optimal number of infill wells; size and type of production facilities; drive mechanism; decide whether to maximize production rate or ultimate recovery.
  Pitfalls or other considerations: limited data, sometimes from only a single well; terms of operating license or lease.

Situation: Deepwater exploration
  Desired results: prospect evaluation; scenario planning.
  Pitfalls or other considerations: limited data, no wells available.

Situation: Mature fields
  Desired results: answers to sudden production problems.
  Pitfalls or other considerations: relatively inexpensive way to extract maximum value from development costs.

Situation: Implementation of secondary recovery
  Desired results: determine appropriate recovery method.

Situation: Divestment or abandonment
  Desired results: determine future production volumes.
  Pitfalls or other considerations: unanticipated future production problems might reduce property value.

> Simulation uses. Reservoir simulation is useful during all phases of the life of a reservoir and in both high- and low-risk projects.


Validating Models Using Diverse Data

A shared earth model is a model of the geometry and properties of the reservoir constrained
by a variety of measurements. To be predictive,
the model should approximate the key features
of the actual reservoir as closely as possible
(right). In a valid reservoir model, predictions
from the model agree with the measured data. A
good fit between predictions and measurements
is not sufficient, though. Several models might
agree equally well with the data. The best model
is the one that agrees best with the data and
with prior information on the model parameters. The uncertainty of the model is defined as
the range of model parameters consistent with
the measurements.
Consider a thin bed imaged by seismic data
(below right). The uncertainty of the shared
earth model in this case is described by the
range of thickness and impedance values that
satisfy the data. This range defines a probability
density function (PDF).
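A toy numerical version of this idea, with invented bed parameters and a deliberately crude one-number forward model, shows how gridding thickness and impedance and scoring each pair against the measurement maps out the acceptable region:

```python
import numpy as np

# Toy forward model: the seismic response of a thin bed is approximated
# here as proportional to thickness times impedance contrast.  The model
# and all numbers below are illustrative only.
def predicted_amplitude(h, impedance, background=5.0e6):
    return h * (impedance - background)

h_true, z_true = 8.0, 6.5e6        # "actual" bed thickness and impedance
observed = predicted_amplitude(h_true, z_true)
noise_sd = 0.2 * abs(observed)     # assumed measurement noise

# Evaluate a Gaussian likelihood on a grid of (thickness, impedance) pairs.
h_grid = np.linspace(2.0, 20.0, 200)
z_grid = np.linspace(5.5e6, 8.0e6, 200)
H, Z = np.meshgrid(h_grid, z_grid)
misfit = (predicted_amplitude(H, Z) - observed) / noise_sd
likelihood = np.exp(-0.5 * misfit**2)

# The "uncertainty ellipse" is the region of pairs that fit acceptably:
# a thin, high-contrast bed and a thick, low-contrast bed can fit equally.
acceptable = likelihood > 0.6
print("fraction of grid consistent with the data:",
      round(float(acceptable.mean()), 3))
```

The trade-off visible in the acceptable region is exactly the thin-bed ambiguity: many thickness-impedance pairs reproduce the same measured response.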


> Model inputs. A shared earth model begins with seismic, well log and dynamic data from the
actual reservoir. In this example, the reservoir is represented by a model (top left). Measured
seismic data (top right) are compared with data predicted by the model (bottom right),
which can be adjusted to improve the fit. The final three-dimensional shared earth model
(bottom left) incorporates all available data.


> Quantifying uncertainty. The thin bed shown as a red layer (left) has thickness h
and acoustic impedance VP. The plot to the right displays the posterior probability
density function. Thickness and impedance values within the red uncertainty ellipse
satisfy the data, and within that ellipse are red circles denoting a good fit and the
best fit. The red circle outside the uncertainty ellipse does not satisfy the model.


Researchers at Schlumberger-Doll Research, Ridgefield, Connecticut, USA, are using a Bayesian approach to quantify the uncertainty of reservoir models (right). A prior PDF represents initial information on model parameters, the vector m. This prior PDF can be combined with a likelihood PDF, which quantifies information provided by additional data, to obtain a posterior PDF. When new data become available, the posterior PDF is used as the initial or prior PDF, and the model is again refined. As additional measurements are incorporated in the model, the uncertainty decreases and a better reservoir description follows.
Effective model validation using a Bayesian
approach requires three modes of operation:
interactive, optimization and uncertainty. In the
interactive mode of the prototype application
in development, the user modifies the reservoir
model and observes the consequences of interpretation decisions on the data (below right).
In the example shown, predicted seismic data
are compared with measured data. In the optimization mode, the user selects the model
parameters to optimize. The software finds the
best local fit of the model to the data. Finally,
in the uncertainty mode, the uncertainty ellipse
of selected reservoir properties is computed and
displayed. The uncertainty ellipse represents
the range of acceptable models.
The prototype application has been used to
test a Bayesian validation approach against
diverse data types, including seismic data, well
logs, while-drilling data and production information. By validating a reservoir model against
all available data before beginning the historymatching phase, the range of admissible models
can be reduced substantially. The result is a
more predictive reservoir model.
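For the common case of Gaussian prior and likelihood, the Bayesian update has a closed form. The sketch below, with invented porosity numbers, mirrors the sequential refinement described above: each posterior becomes the next prior, and the variance shrinks as data sets d[1] and d[2] arrive:

```python
# Sequential Bayesian updating of one model parameter m (for example, a
# layer's porosity) with a Gaussian prior and Gaussian likelihoods.
# All numbers are illustrative.
def gaussian_update(prior_mean, prior_var, data_mean, data_var):
    """Combine a Gaussian prior p(m) with a Gaussian likelihood;
    inverse variances add, so uncertainty can only decrease."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / data_var)
    post_mean = post_var * (prior_mean / prior_var + data_mean / data_var)
    return post_mean, post_var

mean, var = 0.20, 0.05**2            # prior: porosity 0.20 +/- 0.05

# First data set d[1] arrives; the posterior becomes the new prior.
mean, var = gaussian_update(mean, var, 0.16, 0.03**2)

# Second data set d[2] shrinks the uncertainty further.
mean, var = gaussian_update(mean, var, 0.17, 0.02**2)

print(f"posterior: {mean:.3f} +/- {var ** 0.5:.3f}")
```

In the multiparameter case the same arithmetic applies to covariance matrices, which is what produces the shrinking uncertainty ellipse in the figure.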


> Reducing uncertainty. In a Bayesian approach, a prior PDF quantifies the initial
information on model parameters, expressed as vector m. The prior PDF (top left) is
refined by the inclusion of new data (top center) to create the posterior PDF (top right).
The uncertainty of the model is shown in the red uncertainty ellipse. The blue circle
represents the best model. The posterior PDF then becomes the prior PDF (bottom left)
when more data become available (bottom center). The next posterior PDF
(bottom right) has a smaller uncertainty ellipse and a slightly different optimal model.

> Validation modes. Prototype software developed by Schlumberger includes an interactive mode in which the
user assesses the effects of interpretation decisions on reservoir models. In this case, the center of the upper
panel shows predicted seismic data as dotted lines and measured data as solid lines after the upper horizon,
shown in green to the left, has been moved. The lower panel shows a better fit between the predicted and
measured data (center) and the model uncertainty in the ellipse to the right.


Limitations of Reservoir Models

Generating and fine-tuning the model entail close collaboration by the reservoir team. As in other phases of exploration and production, such as geological and geophysical interpretation or drilling preparations, handing off results from one team member to the next along a chain is less effective than working together from the outset.5 Reservoir teams analyze data and perform simulations more rapidly as their experience increases. Working as a team also ensures that no one gets bogged down in endless tinkering with input parameters to try to obtain a match with production history.
In addition to working interactively, the team must employ consistent methods to ensure that normalization and interpretation are performed properly. If data are not normalized and interpreted consistently, relationships between data might be obscured, such as that between porosity and seismic attributes (see "Model Validation," next page).6
Any uncertainty in the data limits confidence
in reservoir models and reservoir simulations.
Permeability barriers, pinchouts, faults and other
geological features are not always apparent from
well, seismic and production data. Their exact
locations might be off by tens or hundreds of
meters and their effectiveness as flow barriers or

conduits might be miscalculated. Formation thicknesses are usually defined by integrating
seismic and averaged well log data, although
the resolution of seismic data is on the order of
tens to hundreds of feet, whereas well logs show
variations at the scale of inches. Under- or overestimating pay thicknesses directly impacts
simulation reliability.
Averaging techniques also affect simulation
results, especially when reservoir properties are
highly variable. Also, problems may occur when
averaging fine detail, such as interpretations
from well logs, to integrate with data of lower
resolution, such as seismic data. For example, a
reservoir that consists of several distinct layers
with different properties might not behave like a
single layer of the same overall thickness and
average properties. The uncertainty of many
measurements increases dramatically with distance from the wellbore. Even though there is a
different level of uncertainty with each data type,
proper model validation forces comparisons of
independent data and interpretations.
Upscaling, or representing the data at a common scale, coarsens the fine-scale reservoir description in the shared earth model to the degree that a computer can cope with it (below). This step usually reduces the number of cells, or subdivisions, of the reservoir model. Horizontal upscaling in the absence of horizontal wellbores is typically simpler because there is generally less fine detail in seismic data, whereas vertical upscaling is complicated by the greater amount of detail available at the wellbores. Thickness and porosity, whose variations typically follow simple, linear averaging laws, are less prone to upscaling problems than permeability. A two-layer system in which one layer has zero permeability does not behave like a single layer with the average permeability of the two layers. The reservoir model must be built around such impermeable layers.
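The averaging pitfall can be made concrete with a small sketch (synthetic layer values): thickness-weighted arithmetic averaging suits flow along the layers, harmonic averaging suits flow across them, and a single impermeable layer drives the series average to zero:

```python
import numpy as np

def arithmetic_average(k, h):
    """Thickness-weighted arithmetic mean: appropriate for flow
    parallel to the layers."""
    return np.sum(k * h) / np.sum(h)

def harmonic_average(k, h):
    """Thickness-weighted harmonic mean: appropriate for flow
    across the layers in series."""
    if np.any(k == 0):
        return 0.0  # one impermeable layer blocks series flow entirely
    return np.sum(h) / np.sum(h / k)

k = np.array([200.0, 50.0, 0.0])   # layer permeabilities, mD (one seal)
h = np.array([3.0, 5.0, 2.0])      # layer thicknesses, m

print("parallel flow:", arithmetic_average(k, h))  # nonzero
print("series flow:  ", harmonic_average(k, h))    # zero: keep the seal
```

The two averages disagree even without a seal, which is why a coarsened cell cannot carry one permeability that is correct for every flow direction.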
No matter how carefully a model is prepared
and simulation performed, the dynamics of production might affect the reservoir in ways that
reservoir simulation might not predict. History
matching, or comparing actual production volumes and measured pressures with predictions
from simulations, is the most common method
for judging the quality of the reservoir model. The
assumption is made that if the model yields a
simulation that matches past production, then
the model is more likely to be a useful tool for
forecasting.7 Certainly, a model that does not
match past production history or reservoir
response to past production is unlikely to correctly predict future production.


> Shared earth model. A numerical representation of the subsurface, housed in a database shared by multidisciplinary team members, allows constant
access to and updating of the reservoir model used for simulation. As databases and software improve, the simulation model and the shared earth model,
which now must be upscaled before being used as a simulation model, will be the same.



> Model construction workflow. Once data from the Stratton field were loaded, the team worked together from the outset to correlate well logs,
vertical seismic profile (VSP) data and seismic data. Interpreted seismic horizons, depth conversion results and extracted attributes were compared with
normalized well log porosities and geologic log correlations. The consistent relationship between the weighted average porosity and seismic amplitude
prompted generation of a reservoir property distribution map, a seismic-guided map of porosity distribution in this case, to complete the reservoir model.

Obtaining a good match between the production history and predictions from simulations is
inexpensive in some cases, but can become time
consuming when the model is continuously
refined and simulated. In certain situations, such
as waterfloods, tracers in the form of chlorides,
isotopes or brines are introduced into injected
water to reveal patterns in the reservoir.
Comparisons of these patterns with expected patterns can be used to reevaluate input values, for example, porosity, permeability and transmissibility (the ease with which fluid flows from one model cell to another), to improve the history
match. Whenever a new well is drilled, it offers
an opportunity to check the quality of a reservoir
simulation, principally by comparison of observed
pressure with the pressure predicted by the
model at the drilling location.
The difficulty of simulating a reservoir
underscores the need to constrain the reservoir
model with all available data. A reservoir model
constrained and validated by geological, geophysical and reservoir data before initiating
simulation extracts as much information as possible from the data and provides a better result.
Also, understanding the range and impact of
reservoir uncertainty allows a quantitative and
qualitative judgment of the accuracy or range of
model predictions.


Model Validation
A data set from the Stratton field of south Texas
(USA) demonstrates the value of cross-disciplinary
interpretation and model validation in calculating
in-situ gas reserves in the Frio formation (above).8
The data include 3D seismic data, logs from nine
wells, correlations of geological markers and a
vertical seismic profile (VSP). Resistivity, neutron
porosity, bulk density, and spontaneous potential,
gamma ray or both curves were available for the
nine wells. Preliminary examination of the well
logs and VSP data guided selection of horizons in
the Frio formation for seismic horizon tracking.
The VSP provided a good tie between the well
and the seismic data along with good understanding of the phase of the seismic data.9
A thin, clean Frio sand that is easy to correlate
and ties to a mappable seismic event was
selected for both well-by-well analysis and multiwell petrophysical interpretation that ensured
consistent analysis of all the logs. The interpreters
observed that porous zones seemed to correspond

5. Galas C: "The Future of Reservoir Simulation," Journal of Canadian Petroleum Technology 36, no. 1 (January 1997): 5, 23.
6. Corbett C: "Improved Reservoir Characterization Through Cross-Discipline Multiwell Petrophysical Interpretation," presented at the SPWLA Houston Chapter Symposium, Houston, Texas, USA, May 18, 1999.
7. This assumption does not always hold. For example, a reservoir model might match the production history even when there is bypassed oil. Additional seismic data might reveal undrained reservoir compartments in this case.
8. Corbett C, Plato JS, Chalupsky GF and Finley RJ: "Improved Reservoir Characterization Through Cross-Discipline Multiwell Petrophysical Interpretation," Transactions of the SPWLA 37th Annual Logging Symposium, New Orleans, Louisiana, USA, June 16-19, 1996, paper WW.
9. Phase refers to the motion of, or means of comparison
of, periodic waves such as seismic waves. Waves that
have the same shape, symmetry and frequency and that
reach maximum and minimum values simultaneously
are in phase. Waves that are not in phase are typically
described by the angular difference between them, such
as 180 degrees out of phase. Zero-phase wavelets
are symmetrical in shape about zero time whereas nonzero-phase wavelets are asymmetrical. Non-zero-phase
wavelets are converted to zero-phase wavelets to
achieve the best resolution of the seismic data. Known
(zero) phase well synthetics and vertical seismic profiles
(VSPs) can be compared with local surface seismic data
to determine the relative phase of the surface seismic
wavelets. Such knowledge allows the surface seismic
data to be corrected to zero phase.
For more on combining vertical seismic profiles with other geophysical data: Hope R, Ireson D, Leaney S, Meyer J, Tittle W and Willis M: "Seismic Integration to Reduce Risk," Oilfield Review 10, no. 3 (Autumn 1998): 2-15.



> Single well versus multiwell interpretation. Well-by-well petrophysical analysis (top) obscures the relationship between porosity and seismic amplitude. In this example from the Stratton field, the plot of effective porosity versus seismic amplitude shows considerable scatter around the line of best fit because the well logs were not analyzed consistently. The relationship between seismic amplitude and porosity is clear when the logs are normalized and consistent analytical methods are used (bottom). The observed relationship between seismic amplitude and effective porosity allowed interpreters to use the seismic data to generate a map of effective porosity.


to high seismic amplitude. To confirm this observation, crossplots of effective porosity and amplitude were prepared. The crossplots of the
well-by-well petrophysical analysis showed significant scatter, whereas the multiwell analysis
demonstrated a clear relationship between seismic amplitude and effective porosity (left).
Next, an equation that related effective porosity to amplitude was used to generate a map of effective porosity. The mathematical relationship between the weighted average porosity values at the wellheads and the seismic amplitudes at those locations guided the mapping. Combining carefully integrated core porosity, log-derived porosity and seismic attributes (in this case, amplitude) produced a single, validated porosity map constrained by several independent sources of porosity information (next page).
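The mapping step can be sketched as a least-squares calibration between well porosities and the amplitudes extracted at the well locations, then applied to amplitudes away from the wells; all values below are invented for illustration:

```python
import numpy as np

# Hypothetical weighted-average effective porosities at nine wells and the
# seismic amplitudes extracted at the same locations.
amplitude = np.array([32.0, 35.0, 41.0, 44.0, 47.0, 51.0, 55.0, 58.0, 62.0])
porosity  = np.array([0.095, 0.10, 0.11, 0.12, 0.125, 0.135, 0.145, 0.15, 0.16])

# Least-squares fit of porosity = slope * amplitude + intercept.
slope, intercept = np.polyfit(amplitude, porosity, 1)

# Apply the calibration to amplitudes away from the wells to build a
# seismic-guided porosity map.
map_amplitudes = np.array([30.0, 45.0, 60.0])
predicted_porosity = slope * map_amplitudes + intercept
print(predicted_porosity.round(3))
```

This only works when the crossplot shows a consistent trend, which is precisely why the normalized multiwell analysis mattered in the Stratton field.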
Using each type of data in isolation in the
Stratton field example obscured relationships
between data and probably would have resulted
in a set of incompatible subsurface maps that
were not physically realistic.
The difference between the single-well analytical approach and the consistent, normalized
petrophysical analysis in the Stratton field
affects the economic evaluation of the reservoir.
The single-well approach precluded integrating
the well logs with the seismic data to generate a
seismic-guided porosity map because the crossplot of effective porosity and amplitude indicated
no consistent relationship between the well logs
and seismic data. The in-situ gas volume calculated by single-well petrophysical analysis is
12% greater than that calculated from the validated, seismic-guided porosity distribution. An
overstated gas volume might lead to unnecessary
infill drilling.
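The sensitivity of in-place volume to the porosity estimate can be illustrated with the standard volumetric gas equation. The field parameters below are hypothetical; only the porosity contrast is chosen so that the result mirrors the roughly 12% difference described above.

```python
# Standard volumetric gas-in-place equation (field units):
#   G = 43,560 * A * h * phi * (1 - Sw) / Bg   [scf]
# with A in acres, h in ft, Bg in res ft3/scf. All inputs are hypothetical.
AREA_ACRES = 2000.0   # drainage area
NET_PAY_FT = 40.0     # net pay thickness
SW = 0.35             # water saturation
BG = 0.005            # gas formation volume factor

def gas_in_place(phi):
    """In-situ gas volume in standard cubic feet for a given porosity."""
    return 43560.0 * AREA_ACRES * NET_PAY_FT * phi * (1.0 - SW) / BG

g_single_well = gas_in_place(0.134)  # porosity from isolated-well analysis
g_validated = gas_in_place(0.120)    # porosity from seismic-guided mapping

# Fractional overstatement of gas in place (~12% for this porosity contrast)
overstatement = (g_single_well - g_validated) / g_validated
```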
In another case offshore Malaysia, 3D seismic data, well logs, wellbore image logs and
core data enabled generation of time-depth
relationships and synthetic seismograms to tie
logs to seismic data.10 The relationship between
effective porosity, seismic amplitude and acoustic impedance, expressed as a calibration function, allowed prediction of effective porosity
throughout the 3D seismic data, similar to the
previous Stratton field example. Additional
data, such as pressure measurements from
wireline tools or well tests, make the reservoir
model more robust and improve confidence in
the predictions from simulation.11

Oilfield Review

> Seismic-guided porosity distribution. In the Stratton field of south Texas, USA, a clear relationship between effective
porosity and seismic amplitude permitted seismic-guided mapping of effective porosity. This map could not have been
created without consistent, multiwell petrophysical analysis. Yellow and orange represent areas of high seismic
amplitude; blue represents low amplitude.

Model Manipulation
Because simulation inputs are subject to revision
by the project team to improve the match between
the simulation and production history, it is important to restrict the input model as much as the
data permit and avoid unnecessary adjustments of
input values. Simulation software typically allows
interpreters to change not only the geological and
geophysical maps used to build a reservoir model,
but also variables such as pressure, temperature,
fluid composition and saturation, permeability,
transmissibility, skin, productivity index and rock
compressibility. Seasoned interpreters have different opinions about what changes to simulation
inputs are acceptable, but prudently adjusting
simulation input parameters often improves the
history match.
10. Corbett C, Solomon GJ, Sonrexa K, Ujang S and Ariffin T: "Application of Seismic-Guided Reservoir Property Mapping to the Dulang West Field, Offshore Peninsular Malaysia," paper SPE 30568, presented at the SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, October 22-25, 1995.
11. For another example of seismic-guided property mapping: Hart BS: "Predicting Reservoir Properties from 3-D Seismic Attributes With Little Well Control: Jurassic Smackover Formation," AAPG Explorer 20, no. 4 (April 1999): 50-51.


Simulation experts use a three-stage approach to fine-tune a reservoir model, beginning with the energy balance, then an adjustment for multiple fluid phases, and finally the well productivity. The energy balance stage accounts for the reservoir pressure. The relative permeabilities of different fluid phases are adjusted in the second stage. The final step uses recent productivity test data, such as bottomhole flowing pressure, tubing surface pressure and total fluid production rate, to further improve the history match.
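The staged philosophy can be sketched as a loop that adjusts one family of inputs at a time, keeping a change only if it improves the match. The toy decline model below merely stands in for a real simulator; it is not how any particular simulator works.

```python
import math

# "History": a field whose production happens to follow a 0.08 decline rate
observed = [5000.0 * math.exp(-0.08 * t) for t in range(10)]

def simulate(decline_rate):
    # Stand-in for a reservoir simulator run
    return [5000.0 * math.exp(-decline_rate * t) for t in range(10)]

def mismatch(params):
    sim = simulate(params["decline_rate"])
    return sum((s - o) ** 2 for s, o in zip(sim, observed))

params = {"decline_rate": 0.05}  # initial model input
best = mismatch(params)

# Stage 1 (energy-balance analogue): scan a constrained range of the
# pressure-controlling parameter, keeping a change only if it helps
for candidate in (0.05, 0.06, 0.07, 0.08, 0.09):
    trial = dict(params, decline_rate=candidate)
    err = mismatch(trial)
    if err < best:
        params, best = trial, err
# Stages 2 and 3 would repeat the pattern for relative permeabilities
# and then for well-productivity inputs.
```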
Reservoir thickness values typically are constrained by seismic data and well logs, but are wrong if the interpreter tracks seismic horizons incorrectly, if logs and seismic data are not tied properly, or if well logs are off-depth or miscorrelated. Poor-quality seismic data, a common problem in structurally complex areas, can hamper horizon tracking. Reprocessing can improve seismic data quality.
The depth to the reservoir should also be well
constrained if log and seismic data are interpreted diligently. Comparing well logs or synthetic seismograms generated from logs with
seismic data improves depth conversion.
Additional data, such as VSPs, also tend to
improve depth conversion. In structurally complex areas, however, depth-based processing
from the outset is preferable to depth conversion.
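A minimal sketch of depth conversion, assuming a single average velocity calibrated at one well; real workflows use layered or laterally varying velocity models, and all numbers here are illustrative.

```python
# Two-way traveltime (s) of the horizon at a well, and its known depth (m)
twt_at_well = 2.0
depth_at_well = 2500.0

# Calibrated average velocity, from depth = v_avg * twt / 2
v_avg = 2.0 * depth_at_well / twt_at_well  # 2500 m/s

def time_to_depth(twt_s):
    """Convert two-way time (s) to depth (m) using the average velocity."""
    return v_avg * twt_s / 2.0

# Depths of the horizon at three map locations picked in time
horizon_depths = [time_to_depth(t) for t in (1.8, 2.0, 2.2)]
```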
The enhanced integrity that validation brings to a depth-converted structure map, that is, a map displayed in units of depth rather than the seismic unit of time, is demonstrated by integrating dipmeter data with the structure map or by comparing depth-converted seismic sections to dip


> Confirming depth conversion. Dipmeter data reduce interpretive contouring options for structure maps
if the mapper honors the data (top). Dipmeter data from the depth of interest, plotted at each well, show
reasonable conformity with structure contours in the upper right and lower sections of the map, but
refute the contouring of the upper left area in this fictitious example. Dip interpretation from an image log,
tied to an actual depth-converted seismic section, confirms dip direction and magnitude at horizons of
interest (bottom). The color variation in the seismic section represents acoustic impedance.

interpretations from wellbore images, such as FMI Fullbore Formation MicroImager or ARI Azimuthal Resistivity Imager logs (left). By interpreting seismic data, well logs and wellbore images together rather than independently, the interpreter ensures that the final structure map has been rigorously checked.
Though typically not introduced in the large-scale model-construction phase, information
about reservoir fluids can offer important insight.
Formation tester data, such as MDT Modular
Formation Dynamics Tester results, indicate the
location of a fluid contact. This information, combined with well log and seismic data, yields a
more constrained starting model for simulation.
Other fluid information may be used to constrain the fine-scale model in the vicinity of the
wellbore. Perforation locations are considered
known, but the effectiveness of the perforations
may be evaluated with production logs, and
changes in fluid saturations monitored with RST
Reservoir Saturation Tool or TDT Thermal Decay
Time data.
The ratio of net pay to gross pay can vary
widely across a reservoir, but like other simulation input values, should not be altered during
the history-matching stage without good reason.
The net-to-gross ratio might be adjusted if supported by drilling results, such as well logs and
cores, or production logs.
Permeability values are obtained in several
ways, including core and log analysis and well
tests, so comparisons of the values from each of
these approaches can limit the range of input values, at least at well locations. Effective assimilation of wellbore image logs, probe permeability
data and core data allows characterization of
horizontal permeability near the wellbore and
prediction of vertical permeability.12 Saturation
values, established through well log analysis, are
verified by capillary pressure data from special
core analysis, wireline formation tester results or
RST measurements. All of these input parameters, within reason, are considered adjustable by
simulation experts.
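A sketch of one such comparison at a well: an upscaled geometric mean of core-plug permeabilities checked against the well-test value. The measurements and the factor-of-two consistency criterion are illustrative assumptions, not a standard threshold.

```python
import math

core_perms_md = [45.0, 120.0, 80.0, 15.0, 200.0]  # core-plug permeabilities, md
welltest_perm_md = 75.0                           # from pressure-transient analysis

# Geometric mean: a common upscaled estimate from small-volume plug data
geo_mean = math.exp(sum(math.log(k) for k in core_perms_md) / len(core_perms_md))

# Agreement within roughly a factor of two is treated as consistent here;
# a larger gap would flag fractures, thin high-permeability streaks or
# unrepresentative coring.
ratio = geo_mean / welltest_perm_md
consistent = 0.5 < ratio < 2.0
```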
12. Thomas S, Corbett P and Jensen J: "Permeability and Permeability Anisotropy Characterisation in the Near Wellbore: A Numerical Model Using the Probe Permeameter and Micro-Resistivity Image Data," Transactions of the SPWLA 37th Annual Logging Symposium, New Orleans, Louisiana, USA, June 16-19, 1996, paper JJJ.
13. Crombie A, Halford F, Hashem M, McNeil R, Thomas EC, Melbourne G and Mullins OC: "Innovations in Wireline Fluid Sampling," Oilfield Review 10, no. 3 (Autumn 1998): 26-41.
14. For more on skin: Hegeman P and Pelissier-Combescure J: "Production Logging for Reservoir Testing," Oilfield Review 9, no. 2 (Summer 1997): 16-20.



Some reservoir engineers minimize adjustments to PVT samples, which indicate reservoir
fluid composition and behavior at the pressure,
volume and temperature conditions of the reservoir. In cases of surface recombination of the
sample or sample collection long after initial production, however, the engineer might decide to
adjust PVT values. At the other extreme, production rates and volumes, and pressure data from
wells are considered inalterable by some
experts, although exceptions are made at times,
such as when production measurement equipment fails. Many experts choose to honor the
most accurate representation of production data.
Placing restrictions on the alteration of input
values makes a good history match from simulation more elusive, but many input values may be
adjusted during simulation. Transmissibility is
computed by the simulator using the input porosity and permeability. A high computed transmissibility value can be overridden if well tests,
formation tester data or seismic data provide evidence of separate sand bodies, stratigraphic
changes, faults or other types of reservoir compartmentalization. Differences in fluid chemistry
or pressure from one well to another also suggest reservoir compartmentalization. In-situ fluid
samples obtained from the OFA Optical Fluid
Analyzer component of the MDT tool are uncontaminated and can be brought to surface without
changing phase for chemical analysis.13
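The transmissibility calculation, and the kind of override described above, can be sketched as follows. The harmonic (series) averaging shown is the usual finite-difference form, and all values and units are illustrative.

```python
def transmissibility(k1, k2, area, dx1, dx2):
    """Harmonic (series) combination of two half-cell conductances."""
    t1 = k1 * area / (dx1 / 2.0)
    t2 = k2 * area / (dx2 / 2.0)
    return 1.0 / (1.0 / t1 + 1.0 / t2)

# Two equal neighboring grid cells (consistent but arbitrary units)
t_computed = transmissibility(k1=100.0, k2=100.0, area=50.0, dx1=100.0, dx2=100.0)

# Evidence of a sealing fault between the cells justifies an override:
fault_multiplier = 0.0          # fully sealing in this sketch
t_used = t_computed * fault_multiplier
```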
Production logs, well tests and pressure transient analyses indicate skin, which is a dimensionless measure of the formation damage
frequently caused by invasion of drilling fluids or
perforation residue.14 When the location, penetration and effectiveness of perforations are of
concern, production logs provide information that
may affect the model input for skin. If a field is
located in a geological trend of similar accumulations, skin values in the trend might be a useful
starting assumption if data within the field are
initially scarce.



Adjustment of the productivity index, another input parameter, affects the quality of a history match. The productivity index, often expressed in units of B/D/psi or Mcf/D/psi, is a measure of how much a well is likely to produce. If the skin value is known, the productivity index, usually computed from model inputs that include the skin, can be computed more accurately. When stimulation or completion techniques differ among field wells, productivity index values often vary from well to well. For example, hydraulic fracturing of a single well in a field might enhance permeability and, therefore, productivity of that well alone.
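A sketch of how skin enters the productivity index through the steady-state radial-flow equation in oilfield units; the well and fluid parameters below are illustrative, not from any field described here.

```python
import math

def productivity_index(k_md, h_ft, mu_cp, bo, re_ft, rw_ft, skin):
    """J = k*h / (141.2 * mu * Bo * (ln(re/rw) + skin)), in B/D/psi."""
    return (k_md * h_ft) / (141.2 * mu_cp * bo * (math.log(re_ft / rw_ft) + skin))

base = dict(k_md=50.0, h_ft=40.0, mu_cp=1.2, bo=1.3, re_ft=1000.0, rw_ft=0.35)

j_damaged = productivity_index(skin=5.0, **base)     # damaged well
j_stimulated = productivity_index(skin=0.0, **base)  # e.g. after stimulation

improvement = j_stimulated / j_damaged  # >1: removing skin raises J
```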
The options available to change a reservoir model to improve the match between one simulation run and a field's production history might appear endless. At some point, practical limits on data collection, computational power and time for modifying input parameters curtail simulation iterations. Independent analyses that support the interpretations of other team members increase confidence in reservoir simulation. A proper simulation workflow helps accomplish this goal. Working as a team ensures that all data are used to validate the reservoir model.
Validating interpretations and models across
disciplines addresses complex problems that are
difficult to solve within the confines of a single
discipline. A multidisciplinary team sharing a
database and iteratively validating and updating
shared earth models, or geomodels, achieves this
goal.15 Operating companies report increases on
the order of 10% in predicted ultimate recovery
through proper data integration, simulation and
reservoir development.16 Cycle time also
decreases in many cases, probably because of
ready access to data and interpretations for
everyone involved in the project.

Multidisciplinary validation of reservoir models increases the value of data beyond the cost of
data-gathering activities alone. In 1998, for
example, Geco-Prakla acquired 3D multicomponent seismic data for Chevron in the Alba field in
the North Sea. The objectives of the survey were
to better image the sandstone reservoir, identify
intrareservoir shales that affect movement of
injected water and map waterflood progress.
After integration of the new shear-wave data to
improve the reservoir model, two additional
wells were drilled in the field. The first well is producing up to 20,000 B/D [3200 m3/d]; the second well is being completed and has resulted in the discovery of Alba's highest net sand. Both wells have confirmed some of the features observed on the converted-wave data. Because the first well was drilled less than a year after seismic acquisition started, Chevron felt the new data arrived in time to make a significant commercial impact on the field's development.17
15. For more on the shared earth model and integrated interpretation: Beardsell M, Vernay P, Buscher H, Denver L, Gras R and Tushingham K: "Streamlining Interpretation Workflow," Oilfield Review 10, no. 1 (Spring 1998): 22-39.
16. Major MJ: "3-D Gets Heavy (Oil) Duty Workout," AAPG Explorer 20, no. 6 (June 1999): 26-27.
O'Rourke ST and Ikwumonu A: "The Benefits of Enhanced Integration Capabilities in 3-D Reservoir Modeling and Simulation," paper SPE 36539, presented at the SPE Annual Technical Conference and Exhibition, Denver, Colorado, USA, October 6-9, 1996.
Sibley MJ, Bent JV and Davis DW: "Reservoir Modeling and Simulation of a Middle Eastern Carbonate Reservoir," SPE Reservoir Engineering 12, no. 2 (May 1997): 75-81.
17. For more on the Alba field survey and shear-wave seismic data: MacLeod MK, Hanson RA, Bell CR and McHugo S: "The Alba Field Ocean Bottom Cable Seismic Survey: Impact on Development," paper SPE 56977, prepared for presentation at the 1999 Offshore European Conference, Aberdeen, Scotland, September 7-9, 1999.
Caldwell J, Christie P, Engelmark F, McHugo S, Özdemir H, Kristiansen P and MacLeod M: "Shear Waves Shine Brightly," Oilfield Review 11, no. 1 (Spring 1999): 2-15.


Modeling for Data Acquisition


In addition to the general question of determining how best to produce a reservoir, simulation
can demonstrate the best time to acquire additional data. Time-lapse (4D) seismic surveys,
which are repeated 3D surveys, can be acquired
at optimal times predicted by careful model construction and simulation.18 As oil and gas are produced from a reservoir, the traveltime, amplitude
and other attributes of reflected seismic waves
change; simulation can demonstrate when these
changes become visible.19 Because 4D seismic
data acquisition, processing and interpretation
can cost millions of dollars, it is critical to determine from modeling studies whether reservoir
variations will be discernible in the new survey.
This type of prediction differs from routine simulation techniques.
Traditionally, reservoir engineers have been interested in matching simulated well production with actual production data. These data come from one point in space, the wellhead, but are nearly continuous in time. There are other types of history matching, though. Seismic data represent a single point in time but offer almost complete spatial coverage of the reservoir. Also, the modeled and observed parameters are different: seismic amplitudes are of concern rather than fluid pressures, for example.
To perform seismic history matching, first the seismic response to a saturated reservoir is modeled. After some period of production, the seismic response to the depleted reservoir is calculated. The seismic response might be complex and include a combination of changes in amplitude, phase, attenuation and traveltime. The initial and depleted reservoir responses differ because the composition of the pore fluids and the reservoir pressure change during production, both of which affect the seismic velocity of the reservoir. The synthetic responses are compared with recorded seismic data: the initial 3D survey and the subsequent repeated survey. The difference in seismic character of the reservoir from the initial survey to the later survey is a function of the compressional (P) and shear (S) seismic velocities and is interpreted as a change in fluid content and pressure.
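The reason a fluid change alters the seismic response can be sketched with Gassmann fluid substitution, a standard rock-physics relation (not necessarily the one used in the studies described here). The moduli, porosity and densities below are typical textbook values, not field measurements.

```python
import math

K_MIN, K_DRY, MU = 36.0, 9.0, 7.0  # mineral, dry-rock and shear moduli, GPa
PHI = 0.25                         # porosity

def gassmann_ksat(k_fluid):
    """Saturated bulk modulus from the Gassmann equation."""
    num = (1.0 - K_DRY / K_MIN) ** 2
    den = PHI / k_fluid + (1.0 - PHI) / K_MIN - K_DRY / K_MIN ** 2
    return K_DRY + num / den

def vp(k_sat, rho):
    # Vp in km/s when moduli are in GPa and density in g/cm3
    return math.sqrt((k_sat + 4.0 * MU / 3.0) / rho)

vp_oil = vp(gassmann_ksat(1.0), 2.25)   # oil-filled rock, K_fluid ~ 1 GPa
vp_gas = vp(gassmann_ksat(0.04), 2.10)  # gas-filled rock, K_fluid ~ 0.04 GPa
# vp_gas < vp_oil: the saturation change alone shifts traveltime and
# amplitude, while the shear modulus is unaffected by the pore fluid.
```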
18. Gawith DE and Gutteridge PA: "Seismic Validation of Reservoir Simulation Using a Shared Earth Model," Petroleum Geoscience 2, no. 2 (1996): 97-103.
19. Pedersen L, Ryan S, Sayers C, Sonneland L and Veire HH: "Seismic Snapshots for Reservoir Monitoring," Oilfield Review 8, no. 4 (Winter 1996): 32-43.
20. In the case of the Magnus field, located on the UK continental shelf, the effect of pressure changes on time-lapse seismic data is greater than the effect of changes in pore fluids. For more information: Watts GFT, Jizba D, Gawith DE and Gutteridge P: "Reservoir Monitoring of the Magnus Field through 4D Time-Lapse Seismic Analysis," Petroleum Geoscience 2, no. 4 (November 1996): 361-372.
21. Pedersen et al, reference 19: 42.



> Forward modeling to optimize data acquisition. Predicted properties of seismic data at
time t2 (top left) are used to predict the appearance of seismic data (middle left). These
predictions are revisited after acquisition of actual seismic data at time t2 (bottom left).
Seismic properties at time t3 (top right) are predicted next from actual t2 seismic data.
By considering fluid changes in the reservoir and their effects on seismic waves, and
then modeling the seismic data that would result from surveying at time t3 (middle right),
additional seismic surveys for reservoir monitoring will be acquired at the optimal time
t3 (bottom right).

Multicomponent seismic data and amplitude variation with offset (AVO) analysis of compressional-wave data both reduce the ambiguity of distinguishing the effects of pressure changes from the effects of pore fluids. Without AVO processing, ordinary marine 3D seismic data are fundamentally ambiguous because they respond to compressional waves only, but compressional waves respond to both pressure and saturation. Multicomponent seismic data separate compressional and shear components, allowing the interpreter to separate saturation effects and pressure effects that influence porosity, because shear waves do not respond to pore fluids.20

Interpretation of a seismic response change from an initial seismic survey to a repeated survey enables detection and spatial calibration of additional faults, movement of oil-water contacts and gas coming out of solution. Small faults in the reservoir section are often visible as linear features of decreased amplitude in a seismic amplitude or coherency plot. Sealing faults also appear as patches of undrained hydrocarbons whose well-defined edges represent the fault. Movements of oil-water contacts are visible as changes in amplitude and possibly as "flat spots." When the reservoir pressure drops below


bubblepoint and gas comes out of solution, P-wave seismic data typically show a significant brightening of seismic amplitude. Multicomponent seismic data, which include shear-wave data, show no brightening because the shear waves do not respond to pore fluids. Such a response confirms the presence of gas.
Seismic history matching has benefited reservoir management decisions in the Gullfaks field, where interpretation of 4D seismic data indicated the existence of previously unseen sealing faults and the presence of bypassed oil. The faults themselves were not seen on the new data, but were interpreted where fluid content changed abruptly in geophysical maps showing the time-lapse results. After fluid transmissibility across faults was reevaluated, the potential for bypassed pay supported drilling an additional well.21 That well, drilled by Statoil, initially produced 12,000 B/D [1900 m3/d] from a formerly undrained compartment and confirmed the presence of a gas cap predicted by seismic data.


Statoil also reentered and sidetracked an abandoned well and produced 6300 B/D [1000 m3/d].
Currently, the scientists at Schlumberger
Cambridge Research, Cambridge, England, are
combining information on the distribution of fluids,
pressure and other properties from the reservoir
simulator with rock properties from the geomodel, or earth model, to generate a forward
model of seismic response (previous page). In
particular, the porosity, bulk modulus and shear
modulus from the geomodel are combined with
saturation and pressure information from the
simulator. The forward model provides information about the elastic moduli and density of the
reservoir, from which P-wave velocity, density
and acoustic impedance, or other properties, can
be derived.
Next, the synthetic seismic data are compared with actual seismic data. Numerous comparisons can be made between vintages of seismic data, such as maps of seismic attributes, prestack gathers for AVO studies and so on, but each approach has the common goal of determining the area of mismatch between predicted and recorded seismic data and analyzing the reasons for the differences.
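The mismatch-area idea can be sketched on a pair of toy attribute maps; the grid, tolerance and bin size below are illustrative assumptions, not values from any survey mentioned here.

```python
import numpy as np

# Toy 3x3 maps of a seismic attribute, predicted versus recorded
predicted = np.array([[1.0, 1.2, 0.9],
                      [1.1, 2.0, 1.0],
                      [0.9, 1.0, 1.1]])
recorded = np.array([[1.0, 1.1, 1.0],
                     [1.0, 1.0, 1.0],
                     [1.0, 1.1, 1.2]])

# Flag cells where the relative difference exceeds a tolerance
tolerance = 0.25
mismatch_mask = np.abs(predicted - recorded) / np.abs(recorded) > tolerance

cell_area_km2 = 0.04  # e.g. 200 m x 200 m bins
mismatch_area = float(mismatch_mask.sum()) * cell_area_km2
```

The flagged cells are then examined for the cause of the disagreement, such as a fluid, pressure or structural feature missing from the model.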
One major challenge in interpreting an observed time-lapse seismic response is its non-uniqueness. For example, an observed change in amplitude might represent a change in saturation of oil or free gas, a change in the amount of gas dissolved in the oil, a change in pressure, or, most likely, a combination of these. Clearly, it is important to know which of these factors are significant in the reservoir, and seismic modeling can help determine this.
In the Foinaven study area, West of
Shetlands, UK, normal-faulted, layered turbidite
reservoirs form separate reservoir compartments
(below). A preproduction baseline 3D survey was
> Foinaven field. Located West of Shetlands (left), the Foinaven field produces from four main turbidite reservoirs. The reservoir map (top right) shows gas caps in red and the strike of the normal faults as black lines. The platforms and well locations are shown in black. The Foinaven study area is indicated by the blue box. The cross section (bottom right), which extends from south of platform DC1 to north of platform DC2, shows the layered reservoirs, compartmentalized by normal faulting, that must be drained by carefully constructed directional wells.




> Visible changes in repeated surveys. Cross-sectional synthetic seismic displays for the baseline survey and repeated survey (left)
show the development of a gas cap. The actual seismic sections confirm the predictions from seismic modeling (right).


acquired in 1995 and a repeat survey was shot in 1998 after 10 months of oil production. The amplitude brightening in the synthetic seismic data indicated that development of gas caps would be visible as the reservoir pressure dropped and gas came out of solution during production (above). The repeat survey verified the existence of the expanded gas caps as high-amplitude events (left). Because seismic property modeling predicted a good match between the baseline survey and the repeat survey (synthetic seismic sections from the simulator model matched real seismic data at the appropriate times), the reservoir team had more confidence in the reservoir simulator model, both in this area and, by extrapolation, elsewhere in the field. As a result, additional 4D seismic data will be acquired across Foinaven and neighboring Schiehallion field to validate the reservoir models and ultimately support the placement of a number of infill production and injection wells.
Correct timing of data acquisition maximizes
the value of the data. Careful study before repeat
3D survey acquisition can ensure optimal acquisition timing, which is field-dependent. In the
Foinaven case, timely acquisition of 4D seismic
data helped determine where the reservoir was
connected or segmented and where flow barriers
existed so the team could select optimal infill and
water-injection well placement.

> Amplitude changes. Map views of the 1995 baseline survey and the 1998 repeat survey clearly
display the changes in seismic amplitude that result from gas cap development. The small gas
caps in the original survey and the synthetic data shown above it (top) enlarged significantly
after 10 months of production (bottom). The oil-water contact (OWC) remains consistent
between the two surveys.



[Diagram: a reservoir optimization loop linking Reservoir Characterization, Development and Production Planning, Field Implementation, and Reservoir Monitoring and Control, with reservoir modeling, flow simulation, history matching, development scenarios, production and reserves forecasting, and uncertainty analysis and risk management as the connecting activities.]

> Future reservoir management. Reservoir optimization is an iterative process that normally begins with reservoir characterization of a new discovery, but can be implemented at any stage in an existing field. Reservoir management will rely increasingly on
monitoring and modeling reservoir performance to optimize oil and gas production. The key additional element will be ongoing
collection of data at the reservoir scale, including seismic data and wellbore measurements, so that the development plan can
be assessed and, where necessary, modified. Monitoring the reservoir closely will overcome the current problems of history
matching using only the loose constraints of production data.

Future Possibilities
Reservoir simulation has already helped oil and
gas producers increase predicted ultimate recovery, and further improvement is likely. In addition
to ongoing software and shared earth model
enhancements, reservoir monitoring with downhole sensors, 4D seismic surveys or other methods is becoming increasingly cost-effective,
particularly when new data are acquired at optimal times (above).22
22. Watts et al, reference 20.


A prudent, efficient team that works together to develop a field reduces cycle time and expense. Sharing data and interpretations allows the team to maximize the value of its achievements for more realistic reservoir simulation and improved understanding of the reservoir. These in turn advance reservoir management, reserve recovery and project economics. Currently, this method relies heavily on the team's motivation to work together. Software provides strong support, but is not yet fully integrated to handle the complete spectrum of oilfield data simultaneously.

In the future, new software that validates the shared earth model will incorporate measured uncertainties of data and interpretations. As the model is refined with the capture of new data, any change in uncertainty will be addressed automatically. Forward modeling will further reduce uncertainty and risk, and maximize the value of additional data. If the shared earth model is consistently updated and new data and interpretations are incorporated, project team members will have another tool to better cope with increases in both the volume of data and the productivity expectations of their companies.
GMG

