Dynamic Graphics is committed to responding to the challenges our clients face in their day-to-day work.
Backed by strong client partnerships, DGI is well positioned to capitalize on the current trend toward more intensive application of high technology in the petroleum, environmental, and earth-related sciences, ensuring the company's leading role as a geospatial software solutions provider far into the 21st century.
Time-lapse seismic monitoring is producing huge data volumes at an increasing rate, and Permanent Reservoir Monitoring is amplifying this stream of unique and valuable information. Any potential 4D seismic signal is acknowledged to be of greatest use when integrated with the rest of the reservoir management information for unified analysis (Calvert, 2005). True understanding—and hence better decision-making based on the 4D seismic response—can only come with the full integration of these temporal data with all the other dynamic data from the field. We suggest a method for this temporal data fusion.
Assuming that a detectable 4D seismic result has been observed, we concentrate on the visualization and analysis of that response, with particular emphasis on integration with reservoir simulation data and other dynamic field observations. Waggoner (2001) discusses four specific areas in which time-lapse seismic results can be used:
In the following discussion we examine methods for the temporal data fusion from some of these interpretive areas.
Whilst visualization solutions for subsurface 3D data are numerous, challenges arise when integrating diverse data with a strong temporal/dynamic component. A flexible solution for this 4D visualization is especially required for quantitative comparisons, for example, comparing a 4D seismic response with production history or a reservoir simulation prediction. In fact, dynamic data are often the rule, rather than the exception, in hydrocarbon development situations. Temporal data could include:
As elaborated by van Gestel et al. (2008), when all these data, and more, are fully integrated together in a dynamic environment accessible to the entire multi-disciplinary asset team then decision-making can be vastly accelerated and improved.
Moreover, there are abundant additional opportunities for data QC in a 4D visualization space—for instance comparing the reservoir simulation realization of completion status with that recorded in the geologic database.
The image below, as well as that above right, shows an integrated viewing environment with temporal well data (status and completions changing with time) integrated with dynamic reservoir simulation and 4D seismic data. The production and cell-history plots shown above are also dynamically linked into this same visual environment.
Despite various tools at our disposal for the rapid visual integration of diverse reservoir management data, obstacles remain when spatiotemporal data density is high. In particular, the typical 2D workstation screen is essentially limited to displaying, at any one time, only one of a potentially large number of spatially and temporally overlapping data volumes.
Moreover, the problem is compounded by the "fifth dimension" of analysis: model uncertainty. Multiple 4D models need to be rapidly screened and assessed—often in a visual fashion—to grade their suitability or need for adjustment.
Proposed solutions to the "visual data density" challenge include, of course, temporal animation on the workstation or other display device, played and captured at a variety of speeds scaled to real calendar time.
Other possible solutions include the virtual duplication of the workspace, but in a coordinated and "linked" fashion that is intuitive, even transparent, to the end-user.
Shown below are two possible solutions. On the left is a series of vertically stacked model spaces where a duplicated vertical dimension allows for easier data comparison, including data with dissimilar vertical units (cf. time and depth). On the right is an example of a visualization environment using multiple locked viewing windows, allowing simultaneous manipulation and comparison of a range of different properties or time steps from a single simulation run. In a similar manner, geologic uncertainty, such as fault transmissibility, can be rapidly reviewed across multiple realizations, with consequent changes in reservoir initial conditions fed back to the simulator.
Whereas some observed 4D seismic responses are robust enough only for qualitative interpretation (of the types discussed above), in many instances a quantitative Seismic History Match can be attempted, thus correlating in a statistical sense the observed changes from the 4D seismic data with the predicted changes in the reservoir simulation model. To be fully useful and wholly integrated into the quantitative interpretive workflow, some key functionalities should be available. Moreover, these functions should ideally be accessible seamlessly within an environment providing access to all relevant and supporting data.
Proposed steps in a fully-integrated quantitative analysis workflow could include the following:
Calculated differences between an assumed “base” condition and a later “monitor” date are fundamental to the 4D analysis. At a minimum this should include the ability to compute differences between any two time points in a simulation run, any two vintages of a 4D seismic campaign, and likewise differences between seismic attributes from different dates. Also desirable would be full arithmetic control of the input products, including temporal manipulations of secondary dynamic data such as production and microseismic data.
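As an illustrative sketch of this differencing step, using hypothetical NumPy arrays rather than any particular vendor's data model, a monitor-minus-base difference volume and a normalized variant might be computed as:

```python
import numpy as np

# Hypothetical example: two seismic vintages on the same sample grid.
# "base" stands for the pre-production survey, "monitor" for a later acquisition.
rng = np.random.default_rng(0)
base = rng.normal(size=(50, 50, 100))            # inline x crossline x samples
monitor = base + 0.1 * rng.normal(size=base.shape)

# The fundamental 4D product: the monitor-minus-base difference volume.
delta = monitor - base

# Full arithmetic control also allows derived products, e.g. a
# normalized difference (small epsilon guards against division by zero).
norm_delta = delta / (np.abs(base) + 1e-9)
```

The same element-wise arithmetic applies equally to any two simulation time steps or any two attribute vintages, provided they share a common grid.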
One often-essential step in the quantitative workflow is the ability to easily extract/compute averages and sums across a vertical interval of interest—usually the reservoir itself. This function needs to be available both for the seismic data (in effect allowing for the extraction of seismic attributes from a volume of data) and in cellular space, so that a comparable “interval” average (or sum) can be extracted from the simulation results for any particular property or delta (difference).
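The interval extraction described above might be sketched as follows, again with hypothetical arrays and sample-index horizons standing in for real reservoir tops and bases:

```python
import numpy as np

rng = np.random.default_rng(1)
attribute = rng.normal(size=(50, 50, 100))   # e.g. an impedance volume
top = np.full((50, 50), 30)                  # reservoir top (sample index per trace)
bottom = np.full((50, 50), 60)               # reservoir base (sample index per trace)

# Build a mask that is True inside the interval of interest at each trace.
samples = np.arange(attribute.shape[-1])
inside = (samples >= top[..., None]) & (samples < bottom[..., None])

# Interval sum and average, trace by trace, over the reservoir window.
interval_sum = np.where(inside, attribute, 0.0).sum(axis=-1)
interval_avg = interval_sum / inside.sum(axis=-1)
```

With varying (rather than constant) top and bottom horizons the same broadcasting trick follows a structurally conformable interval across the volume.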
Perhaps the most critical step in the quantitative analysis is the numerical sampling of one property attribute alongside a second spatially overlapping but inherently dissimilar attribute. This sampling, otherwise called back-interpolation, can have manifestations in both the spatial and temporal domains across a wide variety of data sets. Typical examples could include the back-interpolation of seismic impedance data into a well log; reservoir simulation data into a well log; seismic attributes into an interval-averaged simulation extraction; or, as shown in the image at right, seismic attribute volumes into the reservoir simulation grid geometry. Each of these cases can potentially have a dynamic component leading, for instance, to back-interpolation of a 4D seismic volume into the time and space structure of a reservoir simulation grid for further numerical comparison.
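A minimal back-interpolation sketch, assuming a seismic attribute on a regular (x, y, z) grid and hypothetical simulation cell-centre coordinates; SciPy's RegularGridInterpolator stands in here for whatever resampling engine is actually used:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical seismic attribute volume on a regular (x, y, z) grid.
x = np.linspace(0.0, 1000.0, 51)       # easting (m)
y = np.linspace(0.0, 1000.0, 51)       # northing (m)
z = np.linspace(2000.0, 2500.0, 101)   # depth (m)
rng = np.random.default_rng(2)
attr = rng.normal(size=(51, 51, 101))

sampler = RegularGridInterpolator((x, y, z), attr)

# Back-interpolate the attribute at simulation cell centres (n_cells, 3),
# here generated at random within the grid bounds for illustration.
cell_centres = np.column_stack([
    rng.uniform(0.0, 1000.0, 500),
    rng.uniform(0.0, 1000.0, 500),
    rng.uniform(2000.0, 2500.0, 500),
])
attr_on_cells = sampler(cell_centres)  # one attribute value per simulation cell
```

The same pattern applies along a well trajectory (cell centres replaced by log sample coordinates) and, repeated per vintage, gives the temporal manifestation described above.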
In the image above a seismic discontinuity attribute ("coherency cube") from around a North Sea salt dome (in color) is displayed. The discontinuity attribute has been back-interpolated into the well data. This would allow, for example, the direct comparison of fault/fracture information from the seismic attribute with similar information recorded in image/well logs.
The quantitative computations and extractions culminate with the statistical comparison of properties from the different sources: in the current example this would be the cross-plotting of the interval-averaged sum of the differenced reservoir simulation attribute against the seismic attribute extraction differenced across the same, or very similar, temporal interval.
The image below shows one potential such cross-plot. In this example the data points are further colored by a third attribute and all the statistics are linked back in a spatial and temporal sense to the 3D/4D viewing space. This means, for example, that when the 4D viewing space is animated in calendar time, the cross-plot changes to reflect the new seismic and simulation time-points. Moreover, when points or regions are selected in the cross-plot, those space points/regions are highlighted in the 4D viewing space, and vice versa.
Linear regression can be used to further elaborate on the match, or lack thereof, between the datasets. Points and areas can be selected based on the correspondence to a regression line.
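As a sketch of this regression step, using synthetic differenced attributes in place of real field data, the fit and a residual-based point selection might look like:

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical paired samples: differenced simulation attribute vs.
# differenced seismic attribute, one value per cell or map location.
rng = np.random.default_rng(3)
sim_delta = rng.normal(size=2000)
seis_delta = 0.8 * sim_delta + 0.2 * rng.normal(size=2000)

fit = linregress(sim_delta, seis_delta)
predicted = fit.slope * sim_delta + fit.intercept
residual = seis_delta - predicted

# Select points that deviate strongly from the regression line, e.g.
# beyond two standard deviations of the residual: candidate locations
# where the simulation model disagrees with the observed 4D response.
outliers = np.abs(residual) > 2.0 * residual.std()
```

In a linked environment those flagged points would be highlighted back in the 3D/4D viewing space for spatial inspection.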
The quantitative analysis and interactive cross-plotting will likely lead to new insights regarding the seismic history match and result in changes and adjustments to be communicated back to the simulator or other elements within the process. The quantitative workflow should include adequate outputs and product generation in mutually acceptable formats to make this feedback as rapid and easy as possible.