Geostatistical reservoir modeling


Geostatistical reservoir-modeling technologies depart from traditional deterministic modeling methods by considering spatial statistics and uncertainties. Geostatistical models typically examine closely the many solutions that satisfy the constraints imposed by the data. Using these tools, we can assess the uncertainty in the models, the unknown that inevitably results from never having enough data.

Reservoir characterization overview

Reservoir characterization encompasses all techniques and methods that improve our understanding of the geologic, geochemical, and petrophysical controls on fluid flow. It is a continuous process that begins with field discovery and continues through the last phases of production and abandonment.

Reservoir modeling is the final step in the reservoir-characterization process, and consists of building an upscaled geologic model for input to the fluid-flow numerical simulator. Dynamic reservoir simulation is used to forecast ultimate hydrocarbon recovery on the basis of a given production scheme, or to compare the economics of different recovery methods. Conducting a dynamic flow simulation requires several input data types.

The high-resolution geologic model (HRGM), for example, provides:

  • A grid-size specification
  • A geometric description of bounding surfaces, faults, and internal bedding geometries
  • A 3D distribution of permeability and porosity
  • Relative permeability and capillary pressure/saturation functions or tables

Other necessary information could include the following (a combined sketch of these inputs appears after the lists):

  • Fluid pressure/volume/temperature (PVT) properties
  • Well locations
  • Perforation intervals
  • Productivity indices
  • Production or injection rates
  • Limiting production or injection pressures
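
A minimal sketch of how these inputs might be gathered into one structure before being handed to a simulator is shown below. The field names, units, file names, and values are illustrative assumptions for this sketch only, not any particular simulator's input format.

    # Illustrative grouping of flow-simulation inputs; names, units, and values
    # are assumptions for this sketch, not a specific simulator's deck format.
    simulation_inputs = {
        "grid": {"nx": 100, "ny": 160, "nz": 400, "dx_m": 50.0, "dy_m": 50.0, "dz_m": 1.0},
        "geometry": {"horizons": "depth_converted_surfaces.dat", "faults": [], "bedding": "proportional"},
        "properties": {"porosity": "phi_3d.npy", "permeability": "k_3d.npy"},  # 3D arrays from the HRGM
        "saturation_functions": {"relative_permeability": "kr_tables.dat", "capillary_pressure": "pc_tables.dat"},
        "pvt": "fluid_pvt.dat",
        "wells": [{"name": "W-1", "location_m": (2500.0, 4000.0),
                   "perforation_interval_m": (2105.0, 2140.0),
                   "productivity_index": 12.0,
                   "max_rate_m3_per_day": 800.0,
                   "min_bhp_bar": 150.0}],
    }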

Geostatistical reservoir modeling workflow

The final step in the reservoir-characterization process, reservoir modeling, consists of building multiple HRGMs, upscaling them, and performing flow simulations.

The HRGM integrates multidisciplinary data. The reservoir architecture is built using depth-converted seismic horizons and stratigraphic data, and the geometries and facies of the depositional environments are simulated within this framework, using information from boreholes, cores, seismic lines, and outcrops. Petrophysical properties (porosity φ, permeability k, and water saturation Sw) are distributed within the appropriate facies. The high-resolution models may contain tens of millions of grid cells and require upscaling before flow simulation.

Part of the reservoir-modeling process can use geostatistical methods that consider the spatial nature of geologic data. Specifically, geostatistical reservoir characterization allows for valid construction of a probability density function (pdf) of hydrocarbon volumes and other key reservoir properties. From such distributions, proven, probable, and possible scenarios (e.g., P10, P50, and P90) can be selected and upscaled for presentation to full-field fluid-flow simulators for engineering analysis.

The data requirements and steps necessary to create an HRGM with geostatistical technology, for input to a fluid-flow simulator, are discussed in Basic elements of a reservoir characterization study. Creating such a model involves integrating the structural, stratigraphic, and petrophysical models into a 3D numerical representation of the reservoir. The high-resolution model typically must be upscaled before importing it to the fluid-flow simulator.

Geologic and reservoir issues

Reservoir modeling involves several geologic and engineering elements, though these are difficult to categorize strictly as either geologic or engineering because of the cause/effect relationship they have with one another. For example, the modeling scale traditionally is thought of as a geologic element, but it affects the amount of upscaling required and so becomes an engineering element as well. Likewise, stochastic-modeling methods provide many plausible images of the reservoir, thus generating multiple realizations and scenarios, an operation generally performed by the geoscientist. Ranking and selecting these realizations and scenarios are the final steps before going to the flow simulator and are performed as a joint effort, but a stochastic-modeling study places on the reservoir engineer the additional burden of history-matching multiple models, which can be a major undertaking. Thus, the modeling team would be wise to select a limited, appropriate set of models for this effort.

Modeling scale

Geologists want the highest-resolution geologic model possible, often to the dismay of the reservoir engineer who must use it for flow simulation. Consider, for example, a 5 × 8 km reservoir that is 400 m thick. If the geologist decides to create grid cells that are 50 × 50 m horizontally and 1 m vertically, the resultant 3D grid will have 6.4 million cells. Although this is not an especially large stochastic model, it is larger than most reservoir engineers are willing to flow-simulate. Thus, the high-resolution 3D grid is coarsened and the petrophysical properties upscaled to a few-hundred-thousand-cell dynamic model whose size is more compatible with the flow simulator.
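
The cell count in this example follows directly from the stated dimensions; a quick arithmetic check in Python:

    # Cell count for the example: 5 x 8 km areal extent, 400 m thick,
    # gridded at 50 x 50 m horizontally and 1 m vertically.
    nx = 5000 // 50   # 100 columns
    ny = 8000 // 50   # 160 rows
    nz = 400 // 1     # 400 layers
    print(nx * ny * nz)  # 6,400,000 cells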

Stochastic modeling at a coarser scale often is suggested by reservoir engineers, who tend to consider such an approach equally valid and far more practical than creating an HRGM and then upscaling it before flow simulation. The argument for coarsening embraces the idea that upscaling decimates the geologic and petrophysical detail, and so questions the need to model at a scale finer than that of the flow-simulation grid to begin with. Furthermore, the upscaling process is fraught with assumptions, and because not all upscaling techniques are equal, they can bias the results toward the selected method.

The results of these two approaches are not equivalent, and the volume-support issue can at least partly explain the concerns about performing a conditional simulation at too coarse a scale. A coarse-scale simulation may save time, but it relies on a priori knowledge about vertical layering (e.g., predefined flow units), the optimum horizontal cell size, and the petrophysical property distributions, parameters that should be neither predefined arbitrarily nor based solely on volumetric and material-balance calculations. Unfortunately, the high-resolution stochastic-modeling approach usually will increase the cycle time of a reservoir study because there is more work to be done. Constructing a stochastic model at too coarse a resolution, however, often has proved inaccurate; it can imply a blind assumption that the geologic detail in a higher-resolution model is unnecessary. That said, there is a limit to the capabilities of a flow simulator, and an overly high-resolution model serves no one’s interest. The key is to strike a balance that keeps the project objectives clearly in mind.

The most advantageous workflow uses an appropriate fine-scale model as a guide when defining the flow units and constructing the flow-simulation grid. Both approaches undoubtedly will decrease or “smooth” the existing heterogeneity, but modeling first at a finer scale can produce a more informative, upscaled grid that preserves the critical heterogeneity. Considering the example above regarding the size of the model, a typical flow-simulation grid cell easily could contain 450 000 m3 (300 × 150 × 10 m) of rock. It is unrealistic to think that such a volume of heterogeneous rock could be represented adequately by a single value of porosity and one permeability value in each of the x, y, and z domains. It would be prudent to optimize the upscaled flow grid on a detailed geologic model where coarser cells could be used for nonreservoir layers and finer cells used in key reservoir layers, where the effects of heterogeneity are important. Note, however, that this does not mean that finer-scale models are the norm—detailed models must be justified.
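
To put the smoothing in perspective, the snippet below counts how many of the fine cells from the earlier example would fall inside one such flow-simulation cell (assuming the same 50 × 50 × 1 m fine cells):

    # Fine HRGM cells (50 x 50 x 1 m) contained in one coarse
    # flow-simulation cell (300 x 150 x 10 m) from the example above.
    coarse_cell_volume = 300 * 150 * 10   # 450,000 m^3 of rock
    fine_cell_volume = 50 * 50 * 1        # 2,500 m^3
    print(coarse_cell_volume // fine_cell_volume)  # 180 fine cells per coarse cell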

Regridding and upscaling

Regridding and upscaling generally are considered part of the current workflow for reservoir characterization as a way of coarsening a 3D grid for numerical reservoir simulation, which makes the flow-simulation computation achievable in a reasonable time frame; however, with increased computer power and innovative approaches to flow simulation, upscaling may not be an issue in the near future. During the 1990s, the model size of flow-simulation projects grew from 50,000 cells to more than five million because of the availability of faster computers and parallel-processing technology, and there is little doubt that this trend will continue. Additionally, fast streamline simulators capable of handling models with a million or more nodes are becoming very popular. Though they are somewhat more limited than full-field flow simulators, they are sufficient for resolving many reservoir questions.
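
As a concrete illustration of what the upscaling step does to the petrophysical properties, here is a minimal, single-phase averaging sketch: arithmetic averaging of porosity and, as one common simple choice, geometric averaging of permeability over a block of fine cells. Real upscaling workflows (flow-based, directional, tensor) are considerably more involved; this is only a toy example with synthetic values.

    import numpy as np

    # Toy upscaling of one coarse cell: average a 6 x 3 x 10 block of fine cells
    # (matching the 300 x 150 x 10 m coarse cell in the earlier example).
    rng = np.random.default_rng(42)
    phi_fine = rng.uniform(0.05, 0.30, size=(6, 3, 10))     # porosity, fraction (synthetic)
    k_fine = 10.0 ** rng.normal(1.0, 0.8, size=(6, 3, 10))  # permeability, md (synthetic, lognormal-like)

    phi_coarse = phi_fine.mean()              # arithmetic average (equal cell volumes assumed)
    k_coarse = np.exp(np.log(k_fine).mean())  # geometric average, one simple upscaling choice

    print(f"phi: {phi_coarse:.3f}, k: {k_coarse:.1f} md")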

Multiple simulations and scenarios

Stochastic-modeling methods provide many plausible images of the reservoir. Recall that realizations are the result of sampling the uncertainty by changing only the seed number from simulation to simulation, whereas scenarios reflect major changes in the assumptions about the depositional model or the structural framework. Thus, each scenario can have multiple realizations, with the possibility of generating hundreds of models that honor the available data.
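
The bookkeeping can be made explicit in a short workflow loop. The simulate_porosity function below is a hypothetical stand-in for a geostatistical conditional-simulation routine (it simply draws uncorrelated values on a small toy grid); what matters here is that each scenario fixes a set of geologic assumptions, while realizations within a scenario differ only by the random seed.

    import numpy as np

    def simulate_porosity(mean, std, seed, shape=(50, 80, 40)):
        """Hypothetical stand-in for a conditional simulation on a small toy grid."""
        rng = np.random.default_rng(seed)
        return np.clip(rng.normal(mean, std, size=shape), 0.0, 0.4)

    # Scenarios capture major changes in assumptions (here caricatured as different
    # porosity statistics); realizations within a scenario differ only by the seed.
    scenarios = {"channelized": (0.18, 0.05), "sheet_sand": (0.22, 0.03)}
    realizations = {}
    for scenario, (mean, std) in scenarios.items():
        for seed in range(10):
            realizations[(scenario, seed)] = simulate_porosity(mean, std, seed)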

Ranking the stochastic models

Obviously, no company can afford the time or expense to history-match all the realizations generated in a stochastic-modeling study, nor is it necessary to do so. The primary reason for creating all these models is to quantify the uncertainty in the geologic model so that better reservoir-management decisions can be made. Fast streamline simulators offer a means to screen and rank realizations relatively quickly on the basis of some agreed-upon criteria. Once the realizations are ranked, those most closely corresponding to, for example, the P10, P50, and P90 are upscaled and imported to the flow simulator, so that flow simulation and production forecasting no longer are based only on the most likely (P50) scenario. The P10 and P90 results provide a measure of uncertainty in future production performance and serve as error bars on the P50 model. Narrow error bars offer more confidence in the predicted performance, whereas wide error bars indicate more uncertainty and more potential risk.
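
A minimal sketch of this ranking step follows, continuing from the realizations dictionary in the previous sketch. It assumes total pore volume is the agreed-upon ranking criterion (in practice, connected volumes or streamline-derived measures are often preferred) and picks the realizations whose pore volumes lie closest to the P10, P50, and P90 of the ensemble.

    import numpy as np

    # realizations: dict mapping (scenario, seed) -> 3D porosity array, from the previous sketch.
    cell_volume_m3 = 50.0 * 50.0 * 1.0
    pore_volume = {key: phi.sum() * cell_volume_m3 for key, phi in realizations.items()}

    keys = list(pore_volume)
    values = np.array([pore_volume[k] for k in keys])

    def closest_to_percentile(p):
        """Return the realization whose pore volume is nearest the p-th percentile."""
        target = np.percentile(values, p)
        return keys[int(np.abs(values - target).argmin())]

    selected = {p: closest_to_percentile(p) for p in (10, 50, 90)}
    print("Candidates for upscaling and flow simulation:", selected)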

Volume support

Data in the petroleum industry come from a variety of sources and are measured at many different scales, e.g., core permeability vs. well-test permeability, or seismic data vs. well data. In practice, such data often are integrated without regard to the vast differences in their measurement scales, which is problematic. An excellent example is the traditional calibration of core porosity to log-derived porosity. Core-plug measurements of porosity often are aligned with log data over a common interval by using a mathematical adjustment, such as some form of linear or nonlinear regression. The assumption is that the core data are more precise because porosity is measured in the laboratory on a small rock volume. Although the procedure is mathematically possible, it is not necessarily appropriate because it ignores the issue of support, the rock volume on which porosity is measured, and this should make any such comparison suspect, particularly when data are being interpolated. In this case, the mathematical calibration procedure is tantamount to shifting, stretching, and squeezing the data to achieve a better fit. In other physical sciences, such as ore mining, computations on variables measured on different supports are not performed unless an adjustment is made for volume support, because not doing so can lead to very costly errors in ore-reserve estimates. In the petroleum industry, though, the change of support typically is not addressed.
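
In code, the traditional calibration described above is nothing more than a regression of one measurement on the other. The sketch below uses synthetic porosity pairs purely for illustration; it is exactly the step the paragraph cautions about, because it says nothing about the difference in support between a core plug and a log reading.

    import numpy as np

    # Synthetic core-plug vs. log-derived porosity pairs over a common interval (illustrative values).
    core_phi = np.array([0.12, 0.18, 0.21, 0.15, 0.25, 0.09])
    log_phi = np.array([0.10, 0.17, 0.19, 0.16, 0.22, 0.11])

    # Ordinary least-squares line core_phi ~ a*log_phi + b, then apply it to the log curve:
    a, b = np.polyfit(log_phi, core_phi, 1)
    calibrated_log_phi = a * log_phi + b   # the log data "shifted, stretched, and squeezed" toward the core data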

Consider another example of the volume-support effect when estimating porosity in a typical grid cell with a common computer gridding algorithm. The size of a 2D grid cell often is determined using a rule of thumb of one well per grid cell. A grid mesh consisting of 50 × 50 m grid cells would be considered fine, and interpolating porosity values from boreholes over such a mesh would not be given a second thought. The depth of investigation of a neutron-porosity log, however, is approximately 0.08 m, and the area of resolution around the borehole is approximately 0.02 m². When a porosity measurement representing roughly 0.02 m² of rock is interpolated over such a mesh, its value is implicitly assumed to apply to an area of 2,500 m². With a grid cell of 300 × 150 m, the assumption extends over an area of 45,000 m². The problem becomes even more dramatic in 3D.
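
The jump in support in this example is easy to quantify (assuming a circular region of investigation around the borehole):

    import math

    log_support_area = math.pi * 0.08 ** 2   # ~0.02 m^2 around the borehole
    fine_cell_area = 50 * 50                 # 2,500 m^2 (50 x 50 m cell)
    coarse_cell_area = 300 * 150             # 45,000 m^2

    print(f"fine cell / log support:   {fine_cell_area / log_support_area:,.0f}x")
    print(f"coarse cell / log support: {coarse_cell_area / log_support_area:,.0f}x")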

Geostatistics attempts to combine appropriately data that have been measured at different scales, using a calibration method that categorizes covariables as hard data and soft data. These terms often are used informally, their difference generally being relative, tied to the degree of interpretation required to derive the data values and their scale of measurement. In the earlier example regarding core-plug measurements of porosity, the core porosity is the hard datum and the log porosity is the soft datum. Well data, too, are considered hard data, whereas seismic data are soft data. There are two good reasons for calibration[1]: First, it forces the proponent of any piece of information to document its origin and its relevance to the modeling effort, and second, it allows the impact of that information on the final reservoir forecast to be assessed through sensitivity analysis or by using geostatistical stochastic simulation.

In practice, the hard data are honored exactly in the numerical model, ignoring measurement error, whereas the soft data are honored less precisely and serve as a guide during the interpolation or simulation process outside the range of influence of the hard-data values. The degree to which the soft data are honored depends partially on the strength of their correlation to the hard data. The support of the soft data is assumed the same as in the traditional linear (or nonlinear) regression method. The degree of influence by the soft data affects only the weights used in the estimation or simulation procedure and is a function of a cross-covariance model that considers correlation, distance, and orientation. The scale estimated or simulated through the calibration process is that of the hard data.

Most geostatistical software packages can take into account hard-data measurement errors, but soft-data errors typically are much more difficult to measure and quantify. For example, calibration data often are so sparse that a proper calibration is impossible; for these cases, the calibration is borrowed from an analog. Any a priori decision to freeze the uncalibrated data must be made by all members of the reservoir modeling team.

Summary

Geostatistics is a powerful technology toolbox for reservoir characterization, and stochastic modeling clearly is not a simple game of tossing a coin to predict what is present in the interwell space. Furthermore, numerical flow simulation and production forecasting need no longer be based only on the most likely (P50) scenario; geostatistical methods allow us to test several scenarios and to select realizations representing, for example, the P10, P50, and P90 outcomes.

A good reservoir model is invaluable in selecting well locations and well designs (vertical, horizontal, and multilateral) and in assessing not only the number of wells needed to produce the reservoir economically, but also the bypassed pay potential and the value of infill drilling. A model of sufficient detail is required to make the best reservoir-management decisions, accounting for uncertainty, for the most efficient recovery of hydrocarbons.

Developing an integrated 3D reservoir model meets this requirement because it provides a reliable way to estimate gross rock volume and original hydrocarbons in place, determine the economics of producing the reservoir, determine production-facility requirements, rank development opportunities among alternative reservoirs, and allocate equity shares with partners. Modern portfolio management includes risk assessment, and a stochastic-modeling method helps to quantify the uncertainty in the HRGMs. Using geostatistical reservoir-modeling technologies to integrate all static and dynamic data in a consistent framework ensures a better model.

Nomenclature

φ = porosity, fraction or percent
k = permeability, md or darcies
Sw = water saturation, the percentage of the total fluid that is attributable to water; fraction or percent

References

  1. Macini, P. and Mesini, E. 1994. Rock-bit wear in ultra-hot holes. Presented at the Rock Mechanics in Petroleum Engineering, Delft, Netherlands, 29-31 August 1994. SPE-28055-MS. http://dx.doi.org/10.2118/28055-MS

See also

Geostatistics

Geostatistical conditional simulation

Geology in reservoir models

Basic elements of a reservoir characterization study

PEH:Geologically_Based,_Geostatistical_Reservoir_Modeling
