SPE 38441
Society of Petroleum Engineers
Reservoir Simulation: Past, Present, and Future
J.W. Watts, SPE, Exxon Production Research Company
Copyright 1997, Society of Petroleum Engineers, Inc.
This paper was prepared for presentation at the 1997 SPE Reservoir Simulation Symposium.
Contents of the paper, as presented, have not been reviewed by the Society of Petroleum Engineers and are subject to correction by the author.
Abstract
Reservoir simulation is a mature technology, and nearly all
major reservoir development decisions are based in some way
on simulation results. Despite this maturity, the technology is
changing rapidly. It is important for both providers and users
of reservoir simulation software to understand where this
change is leading. This paper takes a long-term view of
reservoir simulation, describing where it has been and where it
is now. It closes with a prediction of what the reservoir
simulation state of the art will be in 2007 and speculation
regarding certain aspects of simulation in 2017.
Introduction
Today, input from reservoir simulation is used in nearly all
major reservoir development decisions. This has come about
in part through technology improvements that make it easier
to simulate reservoirs on one hand and possible to simulate
them more realistically on the other; however, although
reservoir simulation has come a long way from its beginnings
in the 1950's, substantial further improvement is needed, and
this is stimulating continual change in how simulation is
performed.
Given that this change is occurring, both developers and
users of simulation have an interest in understanding where it
is leading. Obviously, developers of new simulation
capabilities need this understanding in order to keep their
products relevant and competitive. However, people who use
simulation also need this understanding: how else can they be
confident that the organizations that provide their simulators
are keeping up with advancing technology and moving in the
right direction?
In order to understand where we are going, it is helpful to
know where we have been. Thus, this paper begins with a
discussion of historical developments in reservoir simulation.
Then it briefly describes the current state of the art in terms of
how simulation is performed today. Finally, it closes with
some general predictions.
The Past
This paper views the past as a progression, a series of
advances. Sometimes this takes the form of a graph, such as
CPU speed versus time. In other cases, it is a listing of events
grouped by decade. In either case, the intent is to convey an
impression of how rapidly development has occurred, with the
hope that that will assist in estimating the future rate of
change.
Computing. The earliest computers were little more than
adding machines by today's standards. Even the fastest
computers available in the 1970's and early 1980's were
slower and had less memory than today's PC's. Without
substantial progress in computing, progress in reservoir
simulation would have been meaningless.
Figure 1 gives the computing speed in millions of floating
point operations per second for the three fastest CPU's of their
times: the Control Data 6600 in 1970, the Cray 1S in 1982,
and a single processor on a Cray T94 in 1996. The
performance figures are taken from Dongarra's compilation of
LINPACK results! and should be reasonably representative of
reservoir simulation computations.
Figure 1 shows single processor performance. Today,
high-performance computing is achieved by using multiple
processors in parallel. The resulting performance varies
widely depending on problem size and the number of
processors used. Use of, for example, 32 processors should
lead to speedup factors of 15-25 in large reservoir models. As
a result, although Figure 1 shows rate of performance
improvement to be slowing, if parallelization is factored in, it
may actually have accelerated somewhat. No attempt is made
to incorporate parallel performance into Figure 1 because of
its wide variability.
Consider Table 1, which compares the Cray 1S to today's
Intel Pentium Pro. The 1S was the second version of Cray's
first supercomputer. Not only was it the state of the art of its
time, but it represented a tremendous advance over its
contemporaries. As such, it had the standard supercomputer
price, a little under $20 million. It comprised large, heavy
pieces of equipment whose installation required extensive
building and electrical modifications. The Pro, on the other
hand, costs a few thousand dollars, can be purchased by mail
order or from any computer store, and can be plugged into a
simple wall outlet. The 200 MHz Pro is over twice as fast as
the Cray (according to Dongarra's numbers) and is commonly
available with up to four times as much memory.
Model Size. Over time, model size has grown with computing
speed. Consider the maximum practical model size to be the
largest (in terms of number of gridblocks) that could be used
in routine simulation work. In 1960, given the computers
available at the time, this maximum model size was probably
about 200 gridblocks. By 1970, it had grown to about 2000
gridblocks. In 1983, it was 33,000; Exxon's first application
of the MARS program was on a model of this size running on
the Cray 1S. In 1997, it is roughly 500,000 gridblocks for
simulation on a single processor. Figure 2 plots these values.
This semi-log plot is a nearly straight line, indicating a fairly
constant rate of growth in model size. The growth in model
size is roughly consistent with the growth in computing speed
shown in Figure 1.
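The near-straight semi-log trend can be checked directly from the four model sizes quoted above. The following sketch (pure Python, written here for illustration; the fitting routine is not from the paper) fits log10(model size) against year by least squares:

```python
import math

# Maximum practical model sizes quoted in the text (year, gridblocks).
history = [(1960, 200), (1970, 2_000), (1983, 33_000), (1997, 500_000)]

def fit_loglinear(points):
    """Least-squares fit of log10(size) = a + b*year, i.e. exponential growth."""
    xs = [year for year, _ in points]
    ys = [math.log10(size) for _, size in points]
    n = len(points)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
    a = ybar - b * xbar
    return a, b

a, b = fit_loglinear(history)
annual_factor = 10.0 ** b             # multiplicative growth per year
doubling_years = math.log10(2.0) / b  # years for model size to double
```

For these data the fitted slope corresponds to roughly a tenfold increase per decade, i.e. practical model size doubling about every three to three and a half years, consistent with the computing-speed growth in Figure 1.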
Technical Advances. The first "reservoir simulator" modeled
single-phase flow in one dimension. Today, the norm is three-
phase flow in three dimensions with many gridblocks and
complex fluid representation. Many advances in
computational methods made this transition possible. Table 2
lists some of these by decade. In general, the advances in
Table 2 are chosen because they are still in use today or they
paved the way for methods used today. Also, they are
methods that are used in many types of simulation, as opposed
to techniques for modeling specific phenomena such as
relative permeability hysteresis or flow in fractured reservoirs.
The high points in the table are discussed below.
Reservoir simulation began in 1954 with the radial gas
flow computations of Aronofsky and Jenkins’. The first work
to receive notice outside the field of reservoir engineering was
Peaceman and Rachford's development of the alternating-
direction implicit (ADI) procedure. More than 40 years later,
ADI is still being used, though seldom in reservoir simulation.
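The core of ADI is that each timestep is split into two sweeps, each implicit in only one coordinate direction, so every linear solve is tridiagonal. The sketch below is a minimal pure-Python illustration of the Peaceman-Rachford scheme on the model heat equation u_t = u_xx + u_yy with zero boundary values; it is an assumption of this note for illustration, not code from the paper or from any simulator.

```python
import math

def thomas(a, b, c, d):
    """Solve a tridiagonal system; a = sub-, b = main, c = super-diagonal."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def adi_step(u, dt, h):
    """One Peaceman-Rachford step for u_t = u_xx + u_yy on an n x n
    interior grid with zero Dirichlet boundary values."""
    n = len(u)
    r = dt / (2.0 * h * h)
    sub, diag, sup = [-r] * n, [1.0 + 2.0 * r] * n, [-r] * n
    # Half-step 1: implicit in x, explicit in y.
    ustar = [[0.0] * n for _ in range(n)]
    for j in range(n):
        d = [u[i][j] + r * ((u[i][j + 1] if j + 1 < n else 0.0)
                            - 2.0 * u[i][j]
                            + (u[i][j - 1] if j > 0 else 0.0))
             for i in range(n)]
        col = thomas(sub, diag, sup, d)
        for i in range(n):
            ustar[i][j] = col[i]
    # Half-step 2: implicit in y, explicit in x.
    unew = []
    for i in range(n):
        d = [ustar[i][j] + r * ((ustar[i + 1][j] if i + 1 < n else 0.0)
                                - 2.0 * ustar[i][j]
                                + (ustar[i - 1][j] if i > 0 else 0.0))
             for j in range(n)]
        unew.append(thomas(sub, diag, sup, d))
    return unew

# Demo: u0 = sin(pi x) sin(pi y) should decay roughly as exp(-2 pi^2 t).
n, h, dt = 19, 1.0 / 20.0, 0.0025
u = [[math.sin(math.pi * (i + 1) * h) * math.sin(math.pi * (j + 1) * h)
      for j in range(n)] for i in range(n)]
for _ in range(40):          # advance to t = 0.1
    u = adi_step(u, dt, h)
```

At t = 0.1 the center value is close to the analytic factor exp(-2*pi^2*0.1), and each step costs only 2n tridiagonal solves rather than one large 2D solve, which is what made the method attractive on 1950's hardware.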
Today, it is hard to appreciate how little was known at the
beginning of the 1960's. Even concepts that today seem
obvious, such as upstream weighting, were topics of debate.
Yet, by the end of the decade, the first true general-purpose
simulators had come into being.
One of the difficulties in the 1960's was solving the matrix
equations. Today, a solver must be fast and easy to use.
Then, there were problems that could not be solved at all. The
first effective solver was SIP. Though it was sometimes
troublesome to use, it nearly always could be made to work.
Another mathematical breakthrough of the time was
development of implicit-in-time methods, which made it
practical to solve high flow velocity problems such as well
coning.
The 1970's saw publication of Stone's three-phase relative
permeability models. These continue to be widely used.
Another innovation that has stood the test of time was the two-
point upstream method. Despite widespread efforts since
then, only incremental improvements to it have been found.
Also having tremendous lasting impact was the development
of solvers that used approximate factorizations accelerated by
orthogonalization and minimization. These made possible
methods that were largely parameter-free. Finally,
Peaceman's well correction for determining bottom-hole
pressure from gridblock pressure and well rate is almost
universally used today.
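Peaceman's result is that the computed gridblock pressure equals the flowing pressure at an equivalent radius of about 0.2 times the block width for an isotropic square block, so radial Darcy flow from that radius to the wellbore gives the correction. A minimal sketch (consistent SI units assumed; the function name and argument list are illustrative, not from the paper):

```python
import math

def peaceman_bhp(p_block, q, k, h, mu, dx, rw, skin=0.0):
    """Bottom-hole pressure from gridblock pressure and well rate.

    Units are consistent SI: Pa, m, m^3/s, Pa.s. The equivalent
    (well-block) radius r_o ~= 0.2*dx applies to an isotropic square
    gridblock; q > 0 denotes production, which lowers the BHP.
    """
    ro = 0.2 * dx
    dp = q * mu / (2.0 * math.pi * k * h) * (math.log(ro / rw) + skin)
    return p_block - dp
```

In a simulator the same relation, rearranged as a well index multiplying the pressure difference, couples the well rate to the block pressure in the flow equations.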
In the early 1980's, a significant advance occurred with
the development of nested factorization. Both fast and very
robust, nested factorization may be today's most widely used
matrix solver method. Another major step occurred in
compositional simulation. Although the first compositional
simulators were developed in the late 1960's, their
formulations included an inherent inconsistency that hurt their
performance. The volume balance and Young-
Stephenson formulations solved this problem and, at the
same time, made it practical to write a simulator that can
efficiently solve both black-oil and compositional problems.
Development of cornerpoint geometry made it possible to use
non-rectangular gridblocks, providing a capability that was
useful in a variety of applications.
In the late 1980's, efforts shifted to issues related to
geologic modeling, geostatistics, upscaling, flexible grids, and
parallelization, and this emphasis has continued in the 1990's.
The list of accomplishments for the 1990's is much shorter
than those for the preceding three decades. This is, of course,
partly because the decade is not finished yet, but it may also
be partly because not enough time has elapsed to make lasting
contributions recognizable. Perhaps it also stems in part from
the diversion of effort from general computational work into
the nuts and bolts of interactive software and parallelization.
Finally, it also may be, unfortunately, a result of the
reductions in research effort that began in the mid-1980's.
Simulator Capabilities. The preceding section discusses
technical advances. They would not be of interest had they
not led to improvements in capabilities from the user's
standpoint. Table 3 lists, again by decade, the state of the art
capabilities that were available in simulators of the time. No
attempt is made to cite the literature in this discussion. These
capabilities tended to become available in several simulators
at about the same time, and often there was no external
publication describing them.
The computers of the 1950's permitted use of only the
crudest models. Three-dimensional simulation was out of the
question, and only small two-dimensional models were
possible. Everything had to be kept simple, so single-phase
flow or incompressible two-phase flow and very simple
geometry were used.
The more powerful computers of the 1960's enabled more
realistic description of the reservoir and its contents. Three-
phase, black-oil fluid treatment became the norm. It became
possible to run small three-dimensional models. Multiple
wells were allowed for, they could be located where desired,
and their rates could be varied with time. Gridblock sizes
could vary, and gridblocks could be "keyed out," or
eliminated from the system. By the end of the decade,
implicit computational methods were available, permitting
practical well coning modeling.
The 1970's seems to have been the enhanced oil recovery
(EOR) decade. The first compositional simulators were
developed. Computing limitations forced these to use large
gridblocks, so numerical dispersion was a problem. Also,
there were weaknesses in the initial formulations used.
Nonetheless, they made it possible to begin to model
phenomena that had up to then been ignored. Because of
heavy interest in EOR, much effort went into modeling
miscible and chemical recovery. Finally, advances in implicit
computational methods provided the solution stability
required to model thermal processes.
In the 1980's, it became no longer adequate for the user to
tell the simulator where to put the wells and how much to
produce from them. In prediction runs, the simulator became
responsible for making such decisions on its own. This led to
development of complex well management software that,
among other things, determined when to work over existing
wells or drill new ones, adjusted rates so as to adhere to
constraints imposed by separator capacities, maintained
computed reservoir pressures at desired values, and allocated
produced gas to injection and gas lift. Other developments led
to approaches for modeling fractured reservoirs. The normal
restriction of topologically rectangular grid connectivity was
lifted to allow taking into account shifting of layers across
faults. Finally, work began on interactive data preparation
and display and on graphical user interfaces in general.
The dominant efforts of the 1990's have been on various
ways to make simulators easier to use. These have included
continuation of the work on graphical user interfaces,
widespread attempts at data integration, and development of
automatic gridding packages. The use of numerical geologic
models, often depicting fine-scale property variation,
generated statistically, has become widespread. There has
been considerable work on methods for "upscaling" reservoir
properties from these models' cells to the much larger
gridblocks that simulators can use. Gridding flexibility has
increased through use of local grid refinement and more
complex grid geometries. A current thrust is integration of
simulation with non-reservoir computational tools such as
facilities models and economics packages.
Evolution of Software and Support. As simulators became
more powerful and flexible, their user communities changed.
As this happened, developers and supporters had to change the
way they did their jobs. As a result, software development
and support practices progressed through several stages. The
progression is still under way, with one stage left to be
accomplished.
1. Developer use. Initially, a very small team of developers
   devised computational methods, implemented them in a
   simulator, and applied this simulator themselves. The
   first applications were intended to test the simulator and
   its concepts and the next ones to demonstrate its
   usefulness on real problems. After these tests, the
   developers made production runs intended to benefit the
   corporation. Portability of this simulator was not an
   issue, because there was only one computer for it to run
   on.
2. Team use. By the next stage, the team had grown and
   developed internal specialization. Within it, one group
   developed the simulator and another applied it. The
   simulator required customization for each new
   application. Despite the group's specialization,
   developers were still frequently involved in applications
   because of the simulator's need for continual support.
   The simulator frequently failed, and it was essentially
   undocumented. The simulator ran on a single computer,
   and portability was still not an issue.
3. Local use. In this stage, the simulator was used by people
   who were located near its developers but worked in
   other parts of the organization. The simulator still
   required customization for most new studies. Other
   support was required frequently but not continually.
   Failures still occurred, but less frequently than before.
   Documentation was adequate to permit use of the
   simulator with some assistance from developers. The
   simulator ran on several computers, but they were all of
   the same type.
4. Widespread use. In this stage, the simulator first began to
   receive use by people at remote locations. It seldom
   required customization, but it still needed occasional
   support. It rarely failed. Documentation was thorough,
   but training was required for effective use of the
   simulator. Most applications were performed by
   specialists in the use of the simulator. The simulator ran
   on a small variety of computers.
5. General use. By this stage, the simulator will have
   become widely used by people with varying expertise. It
   rarely will need customization, will require support only
   infrequently, and seldom will fail. Its documentation will
   be thorough and easily understood. Its user interfaces
   will be intuitive and standardized. Little training will be
   required to use the simulator. The user will need
   knowledge of reservoir engineering, but he will not need
   to be a simulation expert.
Each transition to a new stage changes what is required of the
simulator and those who support it. Each transition has been
more difficult than the ones that preceded it. A transition that
was surprisingly difficult was from local use to widespread
use. In the local use stage, the simulator was being used
frequently and for the most part was functioning correctly. As
a result, it seemed safe to send it to remote, perhaps overseas,
locations. Doing so led to many more problems than
expected. The remote computer differed slightly from the
developer's, its operating system was at a different release
level, and it was configured differently. The remote users
tried the simulator once or twice, and it did not work. These
users did not know the developers personally, and they had no
really convenient way to communicate with them. Typically
the users abandoned attempts to use the simulator, and the
developer was slow to learn of this failure.
Interestingly, vendors were forced by their business to
make the transition to widespread use before petroleum
companies' in-house developers had to. The vendors have
been dealing with the related problems since the early 1970's;
in-house developers began addressing them later, with varying
degrees of success. This transition is made difficult by the
high standards in usability, functionality, and robustness that
must be met. Developing software and documentation of the
quality needed is very time-consuming and requires different
sets of skills than those traditional to reservoir simulator
development.
Vendor History and Role. Researchers working at major oil
company laboratories developed the first reservoir simulators.
It was not until the middle 1960's that vendors started to
appear. Following is a brief history of certain of these
vendors. Inclusion of a particular vendor in this discussion is
not intended to imply endorsement, and exclusion is not
intended to imply criticism. The firms discussed are those
with which the author is familiar; for this reason North
American-based firms are more likely to be included than
those based elsewhere.
The first to market a reservoir simulator was D. R.
McCord and Associates in 1966. Shortly thereafter, Core
Laboratories also had a reservoir simulator that they used in
consulting work.
The year 1968 saw the founding of INTERCOMP and
Scientific Software Corporation, two companies that
dominated the reservoir simulation market in the 1970's.
Despite their market success, a number of new companies
were formed in the 1970's and early 1980's. The first was
INTERA, which was formed in 1973 by merging an
INTERCOMP spinoff with the environmental firm ERA.
INTERA initially focused on environmental work, but
eventually got into reservoir simulation as discussed below. In
1977, the Computer Modelling Group was formed in Calgary
with support from the province of Alberta. J. S. Nolen and
Associates and Todd, Dietrich, and Chase were formed in
1979. In 1981, a reservoir simulation business was formed at
the existing exploration-related firm Exploration Consultants
Limited (ECL). Finally, SimTech was founded in 1982 and
Reservoir Simulation Research Corporation in 1984.
The founding of these firms was followed by a series of
mergers and acquisitions. These began in 1977 with the
acquisition of INTERCOMP by Kaneb Services. In 1983,
Kaneb Services sold INTERCOMP to Scientific Software
Corporation, the two firms merging to form Scientific
Software-Intercomp, or SSI.
In the middle 1980's, J. S. Nolen and Associates was
acquired by what came to be Western Atlas. In 1996,
Landmark Graphics acquired Western Atlas' reservoir
simulation business. Since then, Landmark was in turn
acquired by and became a division of Halliburton.
In the middle 1980's, INTERA acquired ECL's reservoir
simulation business. A few years later, INTERA split into two
completely separate companies, one based in the United States
and the other in Canada. The reservoir simulation business
went with the Canadian INTERA. In 1995, Schlumberger's
GeoQuest subsidiary acquired INTERA's reservoir simulation
business. Shortly thereafter, the United States-based INTERA,
which had become part of Duke Engineering and Services,
reentered the reservoir simulation business by acquiring
SimTech.
Finally, in 1996, the Norwegian firm Smedvig acquired
Reservoir Simulation Research Corporation.
As of the second half of the 1990's, vendor simulators are
very widely used. As vendor products have improved, some
petroleum companies have reduced or dropped altogether their
in-house efforts. Those companies found that vendors could
provide tools that were at least as good as those that they
could develop themselves, and that they could lower their
development and support costs by using the vendors'
products. On the other hand, several large companies
continue to find it in their best interests to develop proprietary
tools, perhaps incorporating certain vendor components into
their systems.
The Present
The following discussion briefly reviews today's common
practices in reservoir simulation. It is intended to depict what
people actually do when performing state of the art
simulation. It focuses on large, field-scale models.
Geologic Modeling. The reservoir is defined using a
numerical geologic model. Often the fine detail of this model
is generated statistically. The model typically has on the order
of one to ten million cells. It is based on geologists' beliefs
about the reservoir's depositional environments and data from
a variety of sources such as seismic, well logs, and cores. At
times, several randomized versions of the model, called
"realizations," are simulated in order to quantify uncertainty in
the simulation results.
Gridding. The user defines major reservoir units to which the
simulation grid must conform. Within these units, the grid is
generated automatically with some guiding input from the
user. The gridblocks are generally rectangular or nearly so.
Local refinement of the grid in key parts of the model is used.
Special computations account for flow across faults between
gridblocks that are not topologically neighbors.
Upscaling. The typical simulation gridblock consolidates
several tens to several hundreds of geologic model cells.
Effective permeabilities on the simulation grid are determined
from the geologic model permeabilities using either an
appropriate numerical average or flow-based upscaling.
Sometimes finely-gridded segment models are run prior to
field-scale simulation, with the results of these runs being used
to determine pseudo relative permeabilities, thus effectively
accomplishing multiphase upscaling.
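For single-phase permeability, the "appropriate numerical average" is usually one of three classical means: the harmonic mean (exact for flow in series across layers), the arithmetic mean (exact for flow in parallel along layers), and the geometric mean (a common compromise for spatially random media). A small illustrative sketch (function name hypothetical):

```python
import math

def upscale_perm(cells, method="geometric"):
    """One effective permeability for a block of fine-grid cell
    permeabilities (e.g. in mD).

    harmonic   : lower bound, exact for layers normal to flow
    arithmetic : upper bound, exact for layers along flow
    geometric  : common choice for uncorrelated random media
    """
    n = len(cells)
    if method == "arithmetic":
        return sum(cells) / n
    if method == "harmonic":
        return n / sum(1.0 / k for k in cells)
    if method == "geometric":
        return math.exp(sum(math.log(k) for k in cells) / n)
    raise ValueError(method)
```

Flow-based upscaling, mentioned above, replaces these closed-form means with small single-phase flow solves over the fine cells, which captures connectivity that simple averages miss.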
Data Import. Well data exist in databases. Historical
production and injection rates are exported from these
databases in a form that can be read by the simulator or
converted by utilities into simulator input. Most of this
process is essentially automatic in the sense that it requires
little intervention by the user. On the other hand, the user
must work with historical pressure data and data describing
the wells (such as completion intervals and tubing sizes) to get
them into the form needed by the simulator.
Fluid property data are also available in databases.
Laboratory-measured results vary from sample to sample,
however, and the user must manually average them in some
way. Similar manual averaging is also needed for relative
permeability and capillary pressure data. In addition,
pseudofunctions are often used. Use of pseudofunctions is
declining as finer grids become possible.
Simulation. The following characterizes today's reservoir
simulation based on certain aspects of how it is performed.
Type. The large majority of simulations are of
conventional recovery processes, and most of these
simulations use black-oil fluid treatment with variable
bubblepoint pressure. However, use of compositional fluid
representation is growing and has become significant. When a
compositional representation is used, fluid behavior is
normally computed using an equation of state.
Full compositional fluid representations are also used in
simulating miscible recovery, particularly at laboratory scale.
For field-scale processes, the norm is to use simpler
approaches, particularly in modeling carbon dioxide flooding.
Simulation of steam injection processes is common, but
simulation of in situ combustion is rare, partly because use of
in situ combustion is rare and partly because simulating it is
difficult and expensive.
Most simulation effort probably goes into modeling entire
reservoirs or substantial portions of them, but in terms of
number of models run there may be more simulations of
single wells. This is being driven largely by the growth in use
of horizontal wells.
Model Size. 100,000-gridblock field-scale black-oil
models are common, and models several times larger are not
unheard of. Compositional models tend to be smaller by
perhaps a factor of three to five.
Typical vertical-well coning models have about a thousand
to several thousand gridblocks, while models of horizontal
wells are a factor of ten larger.
Most steam process models are of a single well and have at
most a few thousand gridblocks.
Computing Platform. Most reservoir simulation is
performed on Unix workstations and servers. Simulation
performed on other platforms is probably split fairly evenly
between PC's and supercomputers. Currently very little is
done in parallel, but that is changing rapidly. Simulation pre-
and post-processing are done predominantly on Unix
workstations, but use of high-end Windows-based PC's is
growing.
History Matching. History matching is still essentially a
manual process, assisted perhaps by software that organizes
and displays results of history matches. Fully automatic
history matching is still not feasible. Some optimization of
small sets of parameters, such as regional permeability
multipliers, is performed automatically.
Predictive Well Management. During prediction, the
simulator in effect operates the reservoir. It drills new wells,
recompletes existing ones, restricts rates to satisfy facility
constraints, switches wells to low pressure gathering systems,
etc. It decides when to perform these actions based on rules
and other information provided by the user.
However, modeling of flow in the tubing and surface
network is less sophisticated. Pressure drops are normally
determined only in the wellbore, and these computations are
typically made using tables. Most network computations are
based entirely on material balance. In effect, physics is
largely ignored after the fluid leaves the wellhead. This is
changing, and some simulators can model these flows more
rigorously.
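The table-based wellbore computation mentioned above amounts to interpolating a precomputed hydraulics table. A minimal sketch (bilinear interpolation; the two-dimensional table layout and function name are assumptions for illustration, and real tables usually carry more dimensions such as water cut and gas-liquid ratio):

```python
import bisect

def interp_bhp(rates, thps, table, q, thp):
    """Bilinear interpolation in a wellbore hydraulics table:
    table[i][j] = bottom-hole pressure at rate rates[i] and
    tubing-head pressure thps[j] (both axes sorted ascending).
    """
    def bracket(axis, x):
        # Index of the upper bracketing node and the local coordinate.
        i = bisect.bisect_left(axis, x)
        i = min(max(i, 1), len(axis) - 1)
        t = (x - axis[i - 1]) / (axis[i] - axis[i - 1])
        return i - 1, t
    i, tx = bracket(rates, q)
    j, ty = bracket(thps, thp)
    p00, p01 = table[i][j], table[i][j + 1]
    p10, p11 = table[i + 1][j], table[i + 1][j + 1]
    return ((1.0 - tx) * ((1.0 - ty) * p00 + ty * p01)
            + tx * ((1.0 - ty) * p10 + ty * p11))
```

A lookup of this kind replaces a full multiphase pipe-flow calculation at every well and every timestep, which is exactly the economy the text describes.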
Use of Results. Reservoir simulation is most often used in
making development decisions. These begin with initial
development and continue with decisions such as those
relating to in-fill drilling and implementation of enhanced oil
recovery processes. Simulation is not commonly used in day-
to-day operations, but it is used at times in making major
operating decisions.
The Future
Before attempting to predict the future, it is good to be aware
of the business drivers, the most important of which are
discussed below. These lead into current technical objectives,
also discussed below. These objectives then form the basis of
several predictions.
Business Goals. In the view of petroleum company
management, research and development related to reservoir
simulation should address the four goals discussed below.
The order in which they are listed does not indicate priority,
and which is most important varies from situation to situation.
Compress calendar time. Time is money, and the more
quickly a study can be performed, the more valuable it is.
Deadlines associated with possible property acquisitions are
getting tighter, and competitive pressures are growing. In
addition, a study that does not take long probably will not cost
much.
Decrease cost. Petroleum companies continue to be under
pressure to reduce costs. Despite reservoir simulation’s
computing-intensive nature, its major cost comes from the
engineering labor it requires.
Reduce expertise required. In a sense, this goal is closely
related to the preceding one, since maintaining expertise is
expensive. However, it also relates to the widespread push to
decentralize and flatten organizations. As this occurs, more
simulation is being done locally by generalist reservoir
engineers, rather than by centralized groups of specialists.
These reservoir engineers cannot be expected to be simulation
experts.
Improve accuracy and realism. These are the traditional
goals of simulation research and development. New
challenges have been created by the grids coming out of
geologic models and by more complex well geometries.
Technical Objectives and Efforts to Address Them. A good
technical objective must meet two criteria: it must address one
or more of the business goals, and it must be achievable with
reasonable effort in the intended time frame. Substantial
progress on the following objectives should be possible within
10 years, and achievement of them should be possible within
20 years. The following states each objective, expands upon it
very briefly, and describes current work addressing it.
Require the user to provide only data describing the
physical system, its fluids, and its history. Do not require him
to specify computational data such as solver selection,
iteration parameters, and timestep controls. Create the
simulation grid for him with no intervention on his part.
Several organizations are working on gridding, and some
of their work relates to automating the gridding process.
Current linear equation solver work relates primarily to
parallelization, but within this work there is an ongoing
attempt to improve robustness and reduce the effort
required by the user.
Automate history matching. Determine the best possible
fit to existing historical data, consistent with the presumed
depositional environment and other characteristics of the
geologic model.
Recent work has led to an economical way to compute
derivatives for use in gradient-based optimization
methods. These will become more commonly applied, but
much more is needed for truly automatic history
matching.
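In outline, gradient-based matching minimizes a least-squares misfit between simulated and observed data. The sketch below shows the shape of such a loop for a single parameter, using a Gauss-Newton update; it is a toy stand-in, and a real application would call a reservoir simulator and use adjoint- or gradient-code-computed sensitivities in place of the perturbed run shown here.

```python
def history_match(simulate, observed, m0, tol=1e-8, max_iter=50):
    """Fit one scalar parameter m (e.g. a regional permeability
    multiplier) by Gauss-Newton on J(m) = sum((simulate(m) - obs)^2).

    `simulate(m)` stands in for a simulator run returning predicted
    data at the observation times.
    """
    m = m0
    for _ in range(max_iter):
        pred = simulate(m)
        resid = [p - o for p, o in zip(pred, observed)]
        # Sensitivities dpred/dm from one extra perturbed run; an
        # adjoint/derivative code would supply these more cheaply.
        eps = 1e-6 * max(abs(m), 1.0)
        pert = simulate(m + eps)
        sens = [(p2 - p1) / eps for p2, p1 in zip(pert, pred)]
        g = sum(s * r for s, r in zip(sens, resid))   # 0.5 * dJ/dm
        h = sum(s * s for s in sens)                  # Gauss-Newton Hessian
        if h == 0.0 or abs(g) < tol:
            break
        m -= g / h
    return m
```

Even this one-parameter loop shows why derivative cost dominates: each iteration needs sensitivities of every observation to every parameter, which is what the economical derivative computations mentioned above address.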
Minimize the time and effort required to access
information required to perform the study and to generate
the results that are the study's objective. Integrate data when
doing so is practical and provide efficient import capabilities
when it is not. Likewise, integrate downstream computations
when practical and provide efficient export capabilities when
not.
The Petrotechnical Open Software Corporation (POSC) and its
members are addressing the data integration issue.
Recent acquisitions of reservoir simulation vendors by
larger service organizations are leading to integration with
computing tools both upstream and downstream of
simulation.
Predictions. Following are predictions regarding the year
2007 state of the art of reservoir simulation and related
technologies.
1. The dominant high-end computing platform for
   simulation calculations will be a Unix server
   comprising multiple nodes, with each node having a
   small number of processors. Where high performance
   is not needed, the dominant platform will be a
   multiprocessor PC running NT.
2. The dominant pre- and post-processing and
   visualization platform will be the top of the line version
   of whatever the PC has become.
3. Integration of reservoir simulators with the software
   and databases that provide their input data will be much
   better than it is today.
4. Reservoir simulation will be essentially automatic,
   given the geologic model, fluid data, and rock (i.e.,
   relative permeability and capillary pressure) data.
5. The largest black-oil simulations will use at least 10
   million gridblocks.
6. Most simulators will be based on unstructured grids.
7. Integrated reservoir-surface network calculations will
   be common.
8. Use of history matching tools will be widespread, but
   the history matching process will not be automatic.
The above predictions are made with some confidence. It
seems reasonable to expect most of them to be generally
correct, with perhaps one or two turning out wrong.
Attempting to predict further into the future is more
problematic. Nonetheless, it may be instructive to consider