This publication has been provided pursuant to an agreement containing restrictions on its use. The
publication is also protected by Federal copyright law. No part of this publication may be copied
or distributed, transmitted, transcribed, stored in a retrieval system, or translated into any human
or computer language, in any form or by any means, electronic, magnetic, manual, or otherwise,
or disclosed to third parties without the express written permission of Landmark Graphics Corporation.
Trademark Notices
Landmark, OpenWorks, SeisWorks, ZAP!, PetroWorks, and StratWorks are registered trademarks
of Landmark Graphics Corporation.
Pointing Dispatcher, Log Edit, Fast Track, SynTool, Contouring Assistant, TDQ, RAVE, 3DVI,
SurfCube, SeisCube, VoxCube, Z-MAP Plus, ProMAX, ProMAX Prospector, ProMAX VSP,
MicroMAX, DepthTeam and Landmark Geo-dataWorks are trademarks
of Landmark Graphics Corporation.
Technology for Teams is a service mark of Landmark Graphics Corporation.
ORACLE is a registered trademark of Oracle Corporation.
IBM is a registered trademark of International Business Machines, Inc.
AIMS is a trademark of GX Technology.
Motif, OSF, and OSF/Motif are trademarks of the Open Software Foundation.
UNIX is a registered trademark of UNIX System Laboratories, Inc.
SPARC, SPARCstation, Sun, SunOS and NFS are trademarks of Sun Microsystems.
X Window System is a trademark of the Massachusetts Institute of Technology.
SGI is a trademark of Silicon Graphics Incorporated.
All other brand or product names are trademarks or registered trademarks of their respective
companies or organizations.
Note
The information contained in this document is subject to change without notice and should not be
construed as a commitment by Landmark Graphics Corporation. Landmark Graphics Corporation
assumes no responsibility for any error that may appear in this manual. Some states or jurisdictions
do not allow disclaimer of expressed or implied warranties in certain transactions; therefore,
this statement may not apply to you.
ProMAX 2D Seismic Processing
and Analysis
Preface
    Conventions
        Mouse Button Help
        Exercise Organization
        Manual Organization
Agenda
    Day 1
        Introductions, Course Outline, and Miscellaneous Topics
        ProMAX 2D Geometry - Manual
        ProMAX 2D Geometry - Full Extraction
        ProMAX 2D Geometry - Extraction with Editing
        Trace Editing using Trace Statistics and DBTools
        System Overview
    Day 2
        Parameter Selection and Analysis
        Elevation Static Corrections
        Brute Stack
        Neural Network First Break Picking
        Refraction Static Corrections
        Stack Comparisons
        Velocity Analysis and the Volume Viewer
    Day 3
        Residual Statics Corrections
        Dip Moveout (DMO)
        PostStack Signal Enhancement
        Velocity: QC, Editing, Modeling
        PostStack Migration
        Additional Topics
Appendices
Preface
About The Manual
This manual is intended to accompany the instruction given during the
standard ProMAX 2D course. Because of the power and flexibility of
ProMAX, it is unreasonable to attempt to cover all possible features and
applications in this manual. Instead, we provide key examples and
descriptions, using exercises directed toward common uses of the
system. For more advanced training, please take the Advanced 2D course.
The manual is designed to be flexible for both you and the trainer.
Trainers can choose which topics to present, and in what order, to
best meet your needs. You will find it easy to use the manual as a
reference document for identifying a topic of interest and moving
directly into the associated exercise or reference. You are encouraged to
copy the exercise workflows and optimize them to your personal
situation.
Conventions
MB1 refers to an operation using the left mouse button. MB2 is the
middle mouse button. MB3 is the right mouse button.
Shift-Click: Hold the shift key while depressing the mouse button.
Drag: Hold down the mouse button while moving the mouse.
Mouse buttons will not work properly if either Caps Lock or Num Lock
is on.
Exercise Organization
Each exercise consists of a series of steps that will build a flow, help
with parameter selection, execute the flow, and analyze the results.
Many of the steps give a detailed explanation of how to correctly pick
parameters or use the functionality of interactive processes.
The flow examples list key parameters for each process of the exercise.
As you progress through the exercises, familiar parameters will not
always be listed in the flow example.
The exercises are organized so that your dataset is used throughout the
training session. Carefully follow the instructor's directions when
assigning geometry and checking the results of your flow. An
improperly generated dataset or database may cause a subsequent
exercise to fail.
Manual Organization
The manual will take you through a typical workflow of a geoscientist
processing a land 2D seismic dataset. The processing functions of
ProMAX will be introduced and discussed as they appear in the
workflow.
Processing Workflow
1. Geometry Assignment (Field Data)
2. Trace Editing
3. Parameter Selection
5. Brute Stack
6. Velocity Analysis
7. Residual Statics
Agenda
Day 1
System Overview
Directory Structure
Program Execution
Ordered Parameter Files
Parameter Tables
Disk Datasets
Tape Datasets
Day 2
Brute Stack
Refraction Statics
Refraction Statics Calculation - coordinate based
Apply Refraction Statics
Stack with Refraction Statics
Stack Comparisons
Compare Stacks
Day 3
PostStack Migration
Additional Topics
Geometry is clearly one of the most important aspects of processing. The next three chapters give
examples of a difficult, an easy, and the most common approach to geometry assignment.
Chapter Goals
Geometry Assignment Map
Land Geometry
View Shot Gathers
Load Geometry in Spreadsheet and Database
View Database Attributes
Load Geometry to the Trace Headers
Graphical Geometry QC
Chapter Summary
Chapter Objectives
[Workflow diagrams: geometry information from a UKOOA ASCII file and
the observer's field data and notes is entered into the Geometry
Spreadsheet, by import or by manual input; SEG-Y Input reads the field
data into a ProMAX seismic dataset; Extract Database Files builds the
database ordered parameter files; Inline Geom Header Load uses valid
trace numbers to overwrite the trace headers with geometry; and Disk
Data Output writes the seismic dataset with geometry loaded.]
Land Geometry
If the input seismic data has pertinent geometry information in the trace
headers, you can extract this information using the process Extract
Database Files prior to working with the spreadsheet.
Give your area a descriptive name that has meaning to you. You
might want to use your name in this case.
2. When the Line menu appears, add a new line named Watson
Rise.
SEG-Y Input
Type of storage to use: ------------------------------Disk Image
Enter DISK file path name: -----------------------------------------
-----------/misc_files/2d/segy_0_value_headers
----Default the remaining parameters----
Automatic Gain Control
----Default all parameters for this process----
Trace Display
Number of ENSEMBLES (line segments)/screen: -------2
----Default the remaining parameters----
Use the Next Ensemble icon to move through all 20 shots for this
line. Notice how the shot rolls onto the spread and that there is a
discontinuity between channels 60 and 61.
[Spread diagrams: channels 1-60 occupy stations 387-446 and channels
61-120 occupy stations 449-508, with the source at station 448.5 in the
two-station gap; a second diagram shows the spread rolled along by one
station, with channels 1-60 at stations 388-447 and channels 61-120 at
stations 450-509.]
20 Sources
120 Channels
55 ft. Receiver Interval
220 ft. Source Interval
2 Second Record Length
4 ms Sample Rate
Dynamite Source
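A few derived numbers follow directly from the acquisition parameters above. This is a minimal sketch for reference only, not part of the exercise; the variable names are illustrative:

```python
# Acquisition parameters from the Observers Report above.
RECEIVER_INTERVAL_FT = 55.0
SOURCE_INTERVAL_FT = 220.0
RECORD_LENGTH_S = 2.0
SAMPLE_RATE_MS = 4.0

# Shots advance 4 receiver stations per shot (220 ft / 55 ft).
stations_per_shot = SOURCE_INTERVAL_FT / RECEIVER_INTERVAL_FT

# Samples per trace: record length divided by sample interval,
# plus the sample at t = 0.
samples_per_trace = int(RECORD_LENGTH_S * 1000.0 / SAMPLE_RATE_MS) + 1
```

So each 2-second trace at a 4 ms sample rate carries 501 samples, and consecutive shots sit 4 stations apart.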
Observers Report
The second flow will load the geometry from the database to the
trace headers.
3. Select Setup, and fill out the menu with information from the
observers log.
5. Enter source and receiver station interval, and leave the survey
azimuth blank as it will be calculated later.
6. Enter the first and last live station numbers, select Yes to base
source station numbers on receiver station numbers. Set source type
to shot holes, and units are feet. You may also enlarge the font.
Receivers spreadsheet
1. Select Receivers from the main spreadsheet window.
Select Edit Insert, and insert the proper number of rows after
the last marked block. Scroll to the bottom of the spreadsheet. If
you created more than 139 blocks, mark the excess blocks by
selecting block 140 with shift-MB2. This will select all blocks
numbered 140 and greater. Select Edit Delete, and OK.
After you are certain that you have exactly 139 rows in the
spreadsheet, mark all rows active with MB3 again, so that you
can easily work with the entire spreadsheet.
This is an old land line, for which there were no XY values recorded.
We will make up some fake XYs assuming that the line is straight,
runs from West to East, and has a nominal receiver spacing of 55ft.
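The fake coordinates described above can be sketched as follows. This is an illustration only, assuming the first receiver station is 387 and placing the origin at that station; both assumptions are for demonstration, not values ProMAX requires:

```python
# Fake XYs for a straight west-to-east line with a 55 ft nominal
# receiver spacing. First station and origin are illustrative.
SPACING_FT = 55.0
FIRST_STATION = 387      # assumed first receiver station
N_STATIONS = 139         # receiver stations in this exercise

# x increases eastward along the line; y stays constant.
coords = {
    FIRST_STATION + i: (i * SPACING_FT, 0.0)
    for i in range(N_STATIONS)
}
```

With these assumptions, station 387 sits at the origin and station 525, the last of the 139 stations, sits 7590 ft to the east.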
When working with ASCII file import there are three required
steps:
Open the ASCII file.
In the Filter box of the File Import Selection window, enter the
directory path to your ASCII file (.../misc_files/2d) followed by
/*, then select Filter. Select the ASCII filename and OK.
10. Highlight the columns that contain the numbers for the attribute
you selected by holding down MB1 and dragging from left to right.
13. Click MB3 with the cursor positioned over the word Station or
one of the other columnar attributes.
NOTE:
Look at the Mouse Button help descriptions at the bottom of the ASCII text window.
Note that they now reflect block selection and deletion options.
14. Use MB1 to select the first row to exclude, and MB2 to select the
last row to exclude, and press Ctrl-d. You will want to exclude title
rows, blank rows, and rows with information that you do not want
to import.
This writes an Ignore Record for Import message on all the defined
rows.
15. There are also rows at the bottom of this file containing source
information that need to be ignored.
This will check for any cards with inappropriate information, and
allows you to interactively delete them.
18. Select Merge existing station values with matching station data
and click OK.
20. Make sure you have 139 stations defined in your receiver
spreadsheet, and the information looks correct.
Sources spreadsheet
1. Select Sources from the main spreadsheet window.
Notice that you did not input 388.5 as the observers report states.
This is because the spreadsheet will only accept integer
numbers. You will specify this half station difference using the
skid column later.
Also notice that the x, y, and z values updated. Because you told
the spreadsheets that the source and receiver station numbers
were linked, the Sources spreadsheet uses the x, y, and z values
entered in the Receivers spreadsheet. Therefore, the source
elevations are the elevations of the previous receiver location. In
our case, you need to interpolate elevations between receiver
locations. We will do this later from the Database tool.
Finally, you can see from the Observers Report that a few of the
shot station numbers do not increment by four. Fix the station
numbers for those shots in the spreadsheet now. Notice that the
x, y, and z values change as you change the Station number.
[Diagram: shot (x,y) and source azimuth shown relative to the direction
of increasing station numbers.]
7. Scroll the spreadsheet to the right, and fill the Skid column with
27.5. This is where you specify the inline offsets that move the
shots from integer station numbers to half station numbers.
ProMAX uses the following sign convention: a positive inline skid moves the source toward increasing station numbers.
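The arithmetic behind the 27.5 ft skid can be sketched as follows. This is a minimal illustration, not a ProMAX call; the function name is hypothetical:

```python
# The skid is half the 55 ft receiver interval, which shifts the
# source from an integer station to the half station.
RECEIVER_INTERVAL_FT = 55.0
skid_ft = RECEIVER_INTERVAL_FT / 2.0   # 27.5 ft

def effective_station(integer_station, skid_ft, interval_ft):
    """Hypothetical helper: station location after an inline skid."""
    return integer_station + skid_ft / interval_ft
```

A shot entered at station 388 with a 27.5 ft skid therefore sits at station 388.5, matching the half-station locations in the Observers Report.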
8. Import the Uphole time and Hole Depth information from the
ASCII file using the same procedure as described in the Receivers
spreadsheet.
Patterns spreadsheet
At this point, leave the Sources spreadsheet, and fill in the patterns
spreadsheet. After filling out the pattern, you will finish the remainder
of the Sources spreadsheet.
There are two methods of defining patterns. If the shot gap stays in a
constant location, use the Static Gap Method. This method is only
available if you chose to assign midpoints by matching pattern numbers
using first live chan and station in the setup menu. If your shot gap
changes locations, use the Dynamic Gap Method. This method is
available if you chose either to assign midpoints by matching pattern
numbers using first live chan and station, or matching pattern number
using pattern station shift.
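The channel-to-station relationship that a pattern encodes can be sketched for this survey's spread (channels 1-60 at stations 387-446, a two-station gap, channels 61-120 at stations 449-508). The helper below is illustrative, not a ProMAX function:

```python
# Pattern values taken from this exercise's first-shot spread.
FIRST_STATION = 387        # station of channel 1
GAP_AFTER_CHANNEL = 60     # gap falls after this channel
GAP_SIZE_STATIONS = 2      # stations skipped in the gap

def channel_to_station(channel, first_station=FIRST_STATION):
    """Hypothetical helper: receiver station recorded by a channel."""
    station = first_station + (channel - 1)
    if channel > GAP_AFTER_CHANNEL:
        station += GAP_SIZE_STATIONS   # jump the shot gap
    return station
```

Channel 60 maps to station 446 and channel 61 to station 449, which is exactly the discontinuity you observed in the shot displays.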
[Diagrams: a 120-channel spread with the shot gap between channels 60
and 61, shown for each method.]

Static Gap Method
Sources Spreadsheet: Gap Chan = 0 and Gap Size = 0
Patterns Spreadsheet:

Pat  Min Chan  Max/Gap Chan  Chan Inc  Rcvr MinChan  Rcvr MaxChan  Rcvr Inc
  1         1            60         1           387           446         1

Dynamic Gap Method
Sources Spreadsheet: Gap Chan = 60 and Gap Size = 2
Patterns Spreadsheet:
In this method, you specify the first and last channels and stations in
the Pattern spreadsheet. The shot gap size and location is specified
in the Sources spreadsheet.
You will now define your cable configuration, that is the relationship
of channels to receiver locations. When you enter the Pattern
spreadsheet for the first time, a window will appear that asks you to
enter some information about the number of channels.
4. Select File Exit to save the information, and exit the Patterns
spreadsheet.
With the default column order, you cannot see the Station
column after scrolling the spreadsheet to the right.
To change the displayed order of the columns, select
Setup Order from the menu bar.
Follow the mouse button help, and click MB1 in the column
heading for Station, Pattern, Num Chn, Shot Fold, 1st Live Sta,
1st Live Chn, Gap Chan Dlt, Gap Size Dlt, and Static.
This tells the Sources spreadsheet to use pattern number 1 from the
Patterns spreadsheet. Recall that you only defined one pattern for
this survey.
This specifies that there are 120 channels for each shot on this
survey.
This column will be calculated and filled when you assign midpoints
later in the exercise.
9. Fill the 1st Live Sta column with information from the Observers
Report.
Notice that the first live station for this survey is 387 for all but the
last five shots.
This specifies that the first live channel for each shot is 1.
11. Leave the Gap Chan Dlt column blank, and leave the Gap Size Dlt
column filled with zeros.
14. Display a basemap of both the shots and receivers, and measure the
station azimuth.
Notice that the receivers are displayed as a plus + sign, and the
shots are displayed as an asterisk *. Also notice the two offset
shots. To get a better view of the shots select Display Sources
Control Points White.
Now select the Cross Domain icon to allow you to measure the
station azimuth. Press MB3 (notice the mouse button help) near the
first shot on the line, and drag the mouse to the end of the line.
While still holding down MB3, make note of the azimuth (Azi)
readout in the mouse button help. For this line, the azimuth should
be 90 degrees.
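The Azi readout reports the bearing between the two drag points. The same number can be computed from endpoint coordinates, measured in degrees clockwise from north; the sketch below uses illustrative coordinates and is not a ProMAX call:

```python
import math

def azimuth_deg(x1, y1, x2, y2):
    """Bearing from point 1 to point 2, degrees clockwise from north."""
    # atan2(easting difference, northing difference) gives the bearing.
    return math.degrees(math.atan2(x2 - x1, y2 - y1)) % 360.0

# A west-to-east line (x increasing, y constant) bears 90 degrees,
# matching the readout for this exercise's line.
line_azimuth = azimuth_deg(0.0, 0.0, 7590.0, 0.0)
```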
Select File Exit Confirm in the XYGraph display.
15. From the main Land Geometry window, select Setup, and enter 90
for the Nominal Survey Azimuth. Select OK to save the information
and close the window.
16. Make sure that you only have 20 rows in the Sources spreadsheet.
TraceQC spreadsheet
1. The information in the traces spreadsheet will be calculated by the
binning process. You cannot edit this information.
Binning
1. Select Bin from the main window. There are three steps to be
completed in order:
Assign Midpoints
Finalize database
Computes the SIN and SRF for each trace and populates the
TRC OPF.
This step calculates CDP numbers for each trace by adding source and
receiver station numbers. The first CDP will be 775 (387 + 388), and
the last CDP will be 989 (464 + 525). This step also creates the OFB
ordered parameter file.
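The CDP numbering rule described above can be sketched as follows (a hypothetical helper, not a ProMAX routine):

```python
def cdp_number(source_station, receiver_station):
    """CDP number as the sum of source and receiver station numbers."""
    return source_station + receiver_station

# Endpoints of this exercise's survey, from the text above.
first_cdp = cdp_number(387, 388)   # first source-receiver pair
last_cdp = cdp_number(464, 525)    # last source-receiver pair
```

This reproduces the range quoted above: CDPs run from 775 to 989.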
6. The binning step filled in the data in the Traces spreadsheet. You
can QC this information from a basemap. From the Receivers
spreadsheet, select View View All Basemap.
7. Highlight the Cross Domain icon. Click and hold MB1 near a
source location to see which receivers contributed to that shot. Drag
your mouse to the end of the line to see the receiver range change.
Click and hold MB2 near a receiver location to see which shots
contributed to that receiver.
1. Select Exit from the Flow Editing menu of the User Interface.
CDP (Common Depth Point): Contains information varying by CDP location, such as CDP x,y
coordinates, CDP elevation, CDP fold, and nearest surface location.
By projecting the SRF elevations into the SIN elevations you will
correct for the skid of the elevation being on the half station. For
example, compare the land geometry database for receiver and shot
elevations at station number 428. You see that they both read an
elevation of 842 feet. Looking at the elevation for station number
429, however, you see an elevation of 845.3. From the observer notes
and geometry assignment you remember that the shot is actually at
station location 428.5, and therefore at an elevation around 843.6.
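The interpolation described above can be sketched as a simple linear blend between the two bracketing receiver elevations (a hypothetical helper, not the DBTools implementation):

```python
def interp_elevation(station, station_to_elev):
    """Linearly interpolate elevation at a fractional station number."""
    lo = int(station)          # bracketing receiver stations
    hi = lo + 1
    frac = station - lo        # 0.5 for a half-station shot
    e_lo = station_to_elev[lo]
    e_hi = station_to_elev[hi]
    return e_lo + frac * (e_hi - e_lo)

# Receiver elevations quoted in the text above (feet).
elev = {428: 842.0, 429: 845.3}
shot_elev = interp_elevation(428.5, elev)   # about 843.6 ft
```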
In the popup window, type in ELEV for the new attribute name,
then click on OK. Your new attribute will be plotted. Notice how
station 428 has been corrected.
5. You can verify the source elevation was corrected by going back
into the source spreadsheet.
6. There are several useful QC plots that can be made from the
DBTools or from the XDB Database Display. Some examples are
listed below.
SEG-Y Input
Type of storage to use: ------------------------------Disk Image
Enter DISK file path name: -----------------------------------------
-----------/misc_files/2d/segy_0_value_headers
----Default the rest of the parameters----
Inline Geom Header Load
Primary header to match database: ---------------------FFID
Secondary header to match database: ----------------None
Match by valid trace number?: -------------------------------No
Drop traces with NULL CDP headers?: --------------------No
Drop traces with NULL receiver headers: ----------------No
Verbose diagnostics?: --------------------------------------------No
Disk Data Output
Output Dataset Filename: -----------Shots-with geometry
New, or Existing, File?: ----------------------------------------New
Record length to output: ------------------------------------------0.
Trace sample format: ----------------------------------------16 bit
Skip primary disk storage?: -----------------------------------No
2. In SEG-Y Input, select Disk Image and enter the path given to you
by your instructor for the raw shot dataset.
3. In Inline Geom Header Load, select FFID as the Primary and None
as the Secondary headers to match the database.
6. Edit your flow 1.1-View Shots to check the trace headers of your
dataset.
<SEG-Y Input>
Disk Data Input
Select dataset: ----------------------------Shots-with geometry
Trace read option: -----------------------------------------------Sort
Select Primary trace header entry:--------------SIN
Select secondary trace header entry:---OFFSET
Select order list for dataset----------------------------------*:*
Automatic Gain Control
----Default all parameters for this process----
Trace Display
Number of ENSEMBLES (line segments)/screen: -------2
Do you want to use variable trace spacing?------------Yes
----Default the remaining parameters----
9. While viewing the data in Trace Display, use the dx/dt icon to
measure the first break velocity of a few shots. Write down this
value as it will be used later in the Graphical Geometry QC section.
Graphical Geometry QC
For a quick check of all the data, you could input all 20 shots instead
of 4.
5. Set the maximum number of traces per screen to 139. This will
cover the full spread of 120 channels, plus 5 extra shots spaced
4 channels apart.
6. Select Individual for Trace scaling option, if you have any spikes
in your data.
The spikes will bias the entire screen scaling scalar and cause many
of the traces to appear to have zero amplitude.
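The scaling bias can be illustrated with a small sketch. The functions are hypothetical stand-ins for the two scaling options, not the Screen Display implementation:

```python
def screen_scaled(trace_maxima):
    """One scalar for the whole screen: spikes dominate everything."""
    g = max(trace_maxima)
    return [m / g for m in trace_maxima]

def individually_scaled(trace_maxima):
    """Each trace normalized to its own maximum."""
    return [m / m for m in trace_maxima]

normal_traces = [1.0, 1.2, 0.9]
with_spike = normal_traces + [1000.0]   # one spiky trace on screen
```

With whole-screen scaling, the ordinary traces plot at roughly a thousandth of full scale and look dead; Individual scaling keeps each trace at full amplitude.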
This process uses Screen Display for displaying your data, instead
of Trace Display. When you execute with MB2, the data is
automatically displayed.
NOTE:
If you find any mistakes, you must go back to the spreadsheets and correct them.
Then you will need to rebin. Finally, to get the proper trace headers loaded, you
need to rerun the inline header load flow.
Chapter Summary
In this chapter we will cover the steps necessary to assign geometry to a line if all of the required
information is present in the trace headers of the input dataset. This approach can be very quick
and effective for reprocessing data.
Chapter Objectives
[Workflow diagram: SEG-Y Input reads the field data into a ProMAX
seismic dataset; Extract Database Files builds the ordered parameter
files from the trace headers; and Disk Data Output writes the seismic
dataset.]
If the input SEG-Y headers are fully populated, then you are done: the
data should be ready to be processed without touching the spreadsheet
or having to run an Inline Geometry Header Load. A fully populated
trace header must have the following valid values:
Make sure you are in your Area. Go to the Line level of the ProMAX
User Interface and click on Add. Type in the line name,
Database Full Extraction, and then press Enter.
SEG-Y Input
Type of storage to use: ----------------------------- Disk Image
Enter DISK file path name: -----------------------------------------
-----------------------------------/misc_files/2d/segy2d_remap
Remap SEGY header values?: -------------------------------Yes
Input/override trace header entries: ---------------------------
sou_sloc,,4I,,181/srf_sloc,,4I,,185/
cdp_sloc,,4I,,189/cdp_x,,4I,,193/
cdp_y,,4I,,197/cdp_elev,,4I,,201/
Extract Database Files
Is this a 3D survey: ------------------------------------------------No
Data Type: --------------------------------------------------------LAND
Source index method: -----------------------------------------FFID
Receiver index method: ------------------------------STATIONS
Mode of operation: ----------------------------------OVERWRITE
Pre-geometry extraction?: ---------------------------------------No
Extract CDP binning?: -------------------------------------------Yes
Minimum cdp bin in survey: ----------------------775
Calculate trace midpoints coordinates?: -----------------Yes
Extract OFB binning?: --------------------------------------------No
Disk Data Output
Output Dataset Filename: --------raw shots w/ extract
New, or Existing, File?: ----------------------------------------New
Record length to output: ------------------------------------------0.
Trace sample format: ----------------------------------------16 bit
Skip primary disk Storage?: -----------------------------------No
This input SEG-Y file already has most geometry information in its
trace headers. The remap option allows information in non-standard
or extended header locations to be accessed and assigned to a
ProMAX header word. The ProMAX spreadsheets use the values for
SOU_SLOC, SRF_SLOC, CDP_SLOC, CDP_X, CDP_Y, and CDP_ELEV.
NOTE:
If no receiver information exists in the input trace headers and you answer No to
Pre-Geometry Initialization, the job will fail. If no receiver information exists in the
input trace headers and you answer Yes to Pre-Geometry Initialization, the SRF
OPF will be built anyway. You must then enter the missing information into the
Receivers spreadsheet, as well as define pattern information in the Sources and
Patterns spreadsheets.
7. Now confirm that the SEG-Y headers were complete by doing some
QC plotting from the Database to check that the trace, receiver,
shot, and CDP OPF files look proper.
Chapter Summary
In this chapter we will cover the steps necessary to assign geometry to a line if some of the
required information is present in the trace headers of the input dataset.
Chapter Goals
[Workflow diagram: SEG-Y Input reads the field data into a ProMAX
seismic dataset; Extract Database Files builds ordered parameter files
from the trace headers; missing information is completed in the
Geometry Spreadsheet; Inline Geom Header Load uses valid trace numbers
to overwrite the trace headers; and Disk Data Output writes the seismic
dataset with geometry loaded.]
Make sure you are in your Area. Go to the Line level of the ProMAX
User Interface and click on Add. Type in the line name,
Database Partial Extraction, and then press Enter.
SEG-Y Input
Type of storage to use: ----------------------------- Disk Image
Enter DISK file path name: -----------------------------------------
-----------------------------------/misc_files/2d/segy2d_remap
Remap SEGY header values?: -------------------------------Yes
Input/override trace header entries: ---------------------------
-----------------sou_sloc,,4I,,181/srf_sloc,,4I,,185/
Extract Database Files
Is this a 3D survey: ------------------------------------------------No
Data Type: --------------------------------------------------------LAND
Source index method: -----------------------------------------FFID
Receiver index method: ------------------------------STATIONS
Mode of operation: ----------------------------------OVERWRITE
Pre-geometry extraction?: ---------------------------------------No
Extract CDP binning?: -------------------------------------------Yes
Minimum cdp bin in survey: ----------------------775
Calculate trace midpoints coordinates?: -----------------Yes
Extract OFB binning?: --------------------------------------------No
Disk Data Output
Output Dataset Filename: --------raw shots w/ extract
New, or Existing, File?: ----------------------------------------New
Record length to output: ------------------------------------------0.
Trace sample format: ----------------------------------------16 bit
Skip primary disk Storage?: -----------------------------------No
This input SEG-Y file already has most geometry information in its
trace headers. The remap option allows information in non-standard
or extended header locations to be accessed and assigned to a
ProMAX header word. The ProMAX spreadsheets use the values for
SOU_SLOC and SRF_SLOC. These are not standard SEG-Y
headers, and therefore must be stored in the extended header section
of the SEG-Y data. Choose the remap option to read in these values.
NOTE:
If no receiver information exists in the input trace headers and you answer No to
Pre-Geometry Initialization, the job will fail. If no receiver information exists in the
input trace headers and you answer Yes to Pre-Geometry Initialization, the SRF
OPF will be built anyway. You must then enter the missing information into the
Receivers spreadsheet, as well as define pattern information in the Sources and
Patterns spreadsheets.
Since you used Extract Database Files, the default option in setup is
to Assign midpoints by existing index number mappings in the TRC.
Reset the units to feet, leave the rest of the Setup window blank and
select OK.
All of this information should be correct. You may notice that some
of the receivers are not in sequential order. You can sort these by
selecting Setup Sort Ascending. Choose OK in the warning
window that appears, and then select the Station column with MB2.
This will sort the spreadsheet by ascending station number. Check
for incorrect information, and select File Exit. Choose Proceed
and then OK to the following messages.
5. Select Sources.
7. Select Bin.
NOTE:
You must execute all three options available in this window. Each of these options
may be time consuming in the case of 3D data, so they are separated out in this
menu.
Select only one of the three Bin midpoints options. In this case,
select Using previously assigned CDP numbers, user defined
OFB parameters, since our input SEGY trace headers included
CDP numbers. Use a Binning bias of 0 and an offset bin center
increment of 55. Click OK. Select OK when successfully
completed.
8. Select Finalize Database, then OK. This step fills in the LIN ordered
parameter file.
From the Flows window, access the database with the Database
global command option, and check various attributes for
correctness.
All traces in the dataset are described in the geometry. If there are
any missing traces in the input file, the job will fail.
The Inline Geom Header Load uses the valid trace number found on
each trace of each ensemble to assign geometry.
If the existing HDR files are not large enough to accept the data to
write out, you must:
Chapter Summary
This chapter will serve as your introduction to the real power of the DBTools program.
Chapter Objectives
2. Trace Editing
3. We first need to pick a time gate that will be used by the Trace
Statistics process. On the first shot select Picking Pick
Miscellaneous Time Gate... Trace Stats Gate by
AOFFSET.
Pick the top of the gate following the first break times. Use MB3 to
add NEW LAYER for the bottom gate. Track the end of the reflection
data, in this case near 2 seconds.
When you are done picking choose File Exit/Stop Flow. Select Yes
to save edits before exiting.
You should get the IDA window and the trace display window. Make
sure IDA is working by using the forward and reverse arrows.
5. Leave the Trace Display running, but exit from the flow menus and
press Database on the User Interface.
Use the View → Predefined → Source fold map pull-down menu.
For this example you may elect to change the background to white
and then change to a monochrome color using the Options White
Using MB1, drag the cursor across the anomalous range of the plot.
The points will turn red and all the others will turn black.
Notice that a few points will also turn red on the other displays. This
is the power of the summary statistics plot. This demonstrates that
the high-amplitude traces are distributed amongst the shots and
receivers (i.e., there does not appear to be any single high-amplitude
shot or receiver; these are randomly placed traces).
The points that were highlighted red will turn pink, indicating that
they are now selected.
Notice that some of the shots on the shot location map turned black:
These are the shots that have the high amplitude traces.
4. PD these shots to the trace display using the bow and arrow PD
icon so that the display will only show you the shots that contain
the high amplitude traces.
This way you are only presented with a few shots to examine instead
of the entire data volume of shots. You should only have three shots
available to page through in the display.
5. Open a Trace Kill table using the Picking Kill Traces... pull
down. Assign this table a name such as Kill list from
DBTools interactive and choose CHAN as the secondary
sort key for the list.
7. To check that you killed the proper traces, select the Paintbrush
icon, which toggles the kills on and off.
1. Highlight the one line of the histogram that represents all of the
traces except for the highest amplitude on the TRC-AMPL plot.
Notice that almost the entire plot remains red except for a few traces
that are marked in black.
5. Select these new points using the Select All highlighted pull
down.
6. Project these points to the shot map using the Project SIN pull
down.
7. PD these shots to the Trace Display using the bow and arrow PD
icon.
You can always reset the range of points displayed on the histogram by
using the Focus On All pull down.
8. After all traces of interest have been selected to the edit list, Exit
from the Trace Display, saving the results. Exit from the main
DBTools window with the Database → Exit pull down, and select
Commit to save the LOG_AMP attribute you created to the
database.
Chapter Summary
Directory Structure
Program Execution
Ordered Parameter Files
Parameter Tables
Disk Datasets
Tape Datasets
Chapter Objectives
Directory Structure
/ProMAX
    /sys                     operating-system-specific software (a symbolic link)
    /port                    portable software
        /help
            /promax          *.lok - Frame help, *.help - ASCII help
            /promax3d
            /promaxvsp
        /lib/X11/app-defaults    application window manager defaults
        /menu
            /promax          *.menu (Processes)
            /promax3d
            /promaxvsp
        /misc                *_stat_math, *.rgb - colormaps, ProMax_defaults
        /plot
    /lib                     lib*.a
    /bin                     start-up executable
    /etc                     config_file, product, install.doc, pvmhosts, qconfig, license.dat
    /scratch
    /queues
/ProMAX/sys
Software that is Operating System Specific resides in /ProMAX/sys
which is actually a symbolic link to subdirectories unique to a given
hardware platform, such as:
/ProMAX/port
Software that is Portable across all Platforms is grouped under a
single subdirectory, /ProMAX/port. This includes menus and Processes
(/ProMAX/port/menu), helpfiles (/ProMAX/port/help), and miscellaneous files (/ProMAX/port/misc).
/ProMAX/etc
Files unique to a particular machine are located in the /ProMAX/etc
subdirectory. Examples of such files are the config_file, which contains
peripheral setup information for all products running on a particular
machine, and the product file, which assigns unique pathnames for
various products located on the machine.
/ProMAX/scratch
The scratch area defaults to /ProMAX/scratch. This location can be
overridden with the environment variable
PROMAX_SCRATCH_HOME. We recommend pointing it to the
largest file system to which you have write permission. The DMO,
Migrations, and Spreadsheets are heavy users of this file system. We
also recommend that you periodically clean it.
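The scratch-area lookup described above can be sketched as follows. The default path and variable name come from the text; the helper function itself, and the /bigdisk path, are illustrative, not part of ProMAX:

```python
import os

def promax_scratch_dir():
    """Resolve the ProMAX scratch area: PROMAX_SCRATCH_HOME overrides
    the default /ProMAX/scratch location (illustrative helper)."""
    return os.environ.get("PROMAX_SCRATCH_HOME", "/ProMAX/scratch")

# With the variable unset, the default location is returned.
os.environ.pop("PROMAX_SCRATCH_HOME", None)
print(promax_scratch_dir())            # /ProMAX/scratch

# Pointing the variable at a big file system overrides the default.
os.environ["PROMAX_SCRATCH_HOME"] = "/bigdisk/promax_scratch"
print(promax_scratch_dir())            # /bigdisk/promax_scratch
```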
/Area                        Area subdirectory and its files
    DescName
    /Line                    Line (project) subdirectory
        DescName
        17968042TVEL         1) Parameter Table files
        31790267TGAT
        36247238TMUT
        12345678CIND         Index and Map Dataset files
        12345678CMAP
        /12345678            2) Dataset subdirectory and Header and Trace Dataset files
            HDR1
            HDR2
            TRC1
            TRC2
        /Flow1               3) A Flow subdirectory and its files
            DescName
            TypeName
            job.output
            packet.job
        /OPF.SRF             Database subdirectory and a span file
            #s0_OPF60_SRF.GEOMETRY.ELEV
Program Execution
control of the Executive, and handle their own data input and output by
directly accessing external datasets. In these instances, the Super
Executive is responsible for invoking the stand-alone programs and, if
necessary, multiple calls to the Executive in the proper sequence.
The Packet File, packet.job, defines the processes and their type for
execution. The Super Executive concerns itself with only two types of
processes:
Executive processes
Stand-alone processes
Example Executive flow:

    Disk Data Input → AGC → F-K Filter → Trace Display → Disk Data Output

Traces pass from tool to tool through InterProcess Communication along the processing pipeline.
Each individual process will not operate until it has accumulated the
necessary traces. Single-trace processes run on each trace as the
traces come down the pipe. Multi-channel processes wait until an
entire ensemble is available. For example, in the example flow the F-K
filter will not run until one ensemble of traces has passed through the
DDI and AGC. If we specify that Trace Display should display 2
ensembles, it will not make a display until two shots have been
processed through the DDI, AGC, and F-K filter. No additional traces
will be processed until Trace Display releases the traces it has
displayed and is holding in memory, either by clicking on the traffic
light icon or by terminating its execution (while continuing the flow).
Note: All the processes shown are Executive processes and thus operate
in the pipeline. An intermediate dataset and an additional input tool
process would be needed if a stand-alone process were included in this flow.
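The buffering behavior described above can be mimicked with ordinary generators. The tool names echo the example flow, but the code is a conceptual sketch, not the ProMAX Executive:

```python
def agc(traces):
    """Single-trace tool: processes each trace as it comes down the pipe."""
    for trace in traces:
        yield f"agc({trace})"

def fk_filter(traces, ensemble_size):
    """Multi-channel tool: waits until a full ensemble is available.
    Traces in an incomplete final ensemble are simply held in this sketch."""
    buffer = []
    for trace in traces:
        buffer.append(trace)
        if len(buffer) == ensemble_size:
            # The whole ensemble is processed at once, then released.
            for t in buffer:
                yield f"fk({t})"
            buffer = []

# Example flow: Disk Data Input -> AGC -> F-K Filter, ensembles of 3 traces.
disk_data_input = iter(["t1", "t2", "t3"])
pipeline = fk_filter(agc(disk_data_input), ensemble_size=3)
print(list(pipeline))   # ['fk(agc(t1))', 'fk(agc(t2))', 'fk(agc(t3))']
```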
Example flow with two pipes:

    Pipe 1: Disk Data Input → AGC → F-K Filter → Decon → Disk Data Output
    Pipe 2: Disk Data Input → NMO → CDP Stack → Bandpass Filter → Disk Data Output

One pipe must complete successfully before a new pipe will start processing.
complex tools: Accept and return a variable number of seismic traces,
such as stack. This type of process actually controls the flow of
seismic data.
Organization
The Ordered Parameter Files contain information applying to a line and
its datasets. For this reason, there can be many datasets for a single set
of Ordered Database Files.
CDP (Common Depth Point): Contains information varying by CDP location,
such as CDP x,y coordinates, CDP elevation, CDP fold, and nearest surface location.
OPF Matrices
The OPF database files can be considered matrices or flat files; they
are not a relational database. Each OPF is indexed against the OPF
counter, with various single numbers per index. Note the relative size
of the TRC OPF compared to the other OPF files: the TRC is by far the
largest contributor to the size of the database on disk.
Database Structure
The ProMAX database was restructured for the 6.0 release to handle
large 3D land and marine surveys. The features of the new database
structure are listed below:
Each order is contained within a subdirectory under Area and Line. For
example, the TRC is in the subdirectory OPF.TRC.
Index: Holds the list of parameters and their formats. There is only
one index file in each OPF subdirectory. The exception to this is the
LIN OPF. The LIN information is managed by just two files, one
index and one parameter, named LIN.NDX and LIN.REC.
Span: These files are denoted by the prefix #s; non-span files lack
this prefix. The TRC, CDP, SIN, and SRF OPF parameters are span
files. The first span of 10 MB for each parameter file is always
written to primary storage. Newly created spans are written to the
secondary storage partitions listed in the config_file under the OPF
keyword, in a round-robin fashion, until the secondary storage is
full; then subsequent spans are created in primary storage. Span files
may be moved to any disk partition within the secondary storage list
for read purposes. Span file size is currently fixed at 10 megabytes, or
approximately 2.5 million 4 byte values per span file.
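The figure of roughly 2.5 million values per span follows directly from the fixed 10 MB span size and the 4-byte values:

```python
SPAN_BYTES = 10 * 1024 * 1024   # span file size, fixed at 10 MB
VALUE_BYTES = 4                 # each database value is 4 bytes

values_per_span = SPAN_BYTES // VALUE_BYTES
print(values_per_span)          # 2621440, i.e. ~2.5 million values
```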
A separate export from the geometry database to the ProMAX database
files is no longer required, as it was prior to the 6.0 release.
Database append is allowed. Data can be added to the database via the
OPF Extract tool or the geometry spreadsheet. This allows for the
database to be constructed incrementally as the data arrives.
For example, the x coordinate for a shot in the SIN has the following
name: #s0_OPF60_SIN.GEOMETRY.X_COORD. Here, #s0_OPF60
indicates the first span file for the parameter; _SIN denotes the Order;
GEOMETRY describes the information type of the parameter; and
X_COORD is the parameter name.
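A parameter file name like the one above can be decomposed mechanically. This parser is an illustrative sketch built only on the naming convention described in the text; it is not a ProMAX API:

```python
def parse_opf_name(name):
    """Split a span parameter file name into its documented parts,
    e.g. '#s0_OPF60_SIN.GEOMETRY.X_COORD' (illustrative only)."""
    prefix, infotype, parameter = name.split(".")
    span_tag, version, order = prefix.split("_")   # '#s0', 'OPF60', 'SIN'
    return {
        "span": int(span_tag.lstrip("#s")),  # 0 means the first span file
        "version": version,                  # database version tag
        "order": order,                      # Order name (SIN, SRF, TRC, ...)
        "infotype": infotype,                # information type (GEOMETRY)
        "parameter": parameter,              # parameter name (X_COORD)
    }

print(parse_opf_name("#s0_OPF60_SIN.GEOMETRY.X_COORD"))
```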
Index file names contain the three letter Order name. For example, the
index file for the TRC is called OPF60_TRC.
NOTE:
The index file for each Order must remain in the primary storage partition. Span
parameter files may be moved and distributed anywhere within primary and
secondary storage.
Within each Order, there are often multiple attributes, with each
attribute being given a unique name.
Parameter Tables
WARNING:
Remember, you name and store the parameter tables in their specific Area/Line
subdirectory. Therefore, you can inadvertently overwrite an existing parameter
table by editing a parameter table in a different processing flow.
1. Select File: Select a file to import. If the text file does not contain
valid line terminators, use Width to set the line width and then
re-read the file.
5. Filter the File for Invalid Text: Search the marked columns and
rows for any invalid text. Text may be excluded or replaced within
this interactive operation.
When the application is initialized, the main ASCII File Export window
will appear. After a file and format have been selected, the ASCII
text is displayed and the Apply button is activated. The steps involved
in performing a file export are as follows:
1. Select File: Select a file for export within the File Export Selection
dialog.
6. Cancel the Export Operation: Press the Cancel button to close the
export windows and return to the calling spreadsheet.
Disk Datasets
/ProMAX/data/usertutorials/landexample/12345678CIND
/ProMAX/data/usertutorials/landexample/12345678CMAP
/ProMAX/data/usertutorials/landexample/12345678/TRC1
/ProMAX/data/usertutorials/landexample/12345678/HDR1
Map File (....CMAP): Keeps track of trace locations, even if data flows over
many disks. Given a particular trace number, it finds the
sequential trace number within the dataset, allowing rapid
access to traces during processing. Because the map file may grow
during processing, it is a separate file and is always held in the
line directory.
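The role of the map file can be pictured as a simple lookup. The structure below is hypothetical and only mirrors the behavior described; the real ....CMAP format is not documented here:

```python
# Hypothetical in-memory stand-in for a ...CMAP file: trace number ->
# (disk file, sequential position), letting data span many disks.
trace_map = {
    1: ("TRC1", 0),
    2: ("TRC1", 1),
    3: ("TRC2", 0),   # the dataset has flowed over onto a second disk file
}

def locate_trace(trace_number):
    """Find which TRCx file holds a trace and where it sits in that file."""
    return trace_map[trace_number]

print(locate_trace(3))   # ('TRC2', 0)
```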
(Figure: a disk dataset comprises the ...CIND and ...CMAP files together with the HDRx and TRCx files.)
Secondary Storage
In a default ProMAX configuration, all seismic dataset files reside on a
single disk partition. The location of this disk partition is set in the
$PROMAX_HOME/etc/config_file with the entry:
In addition to the actual trace data files, the primary storage partition
will always contain your flow subdirectories, parameter tables, ordered
parameter files, and various miscellaneous files. The ...CIND and
...CMAP files which comprise an integral part of any seismic dataset are
always written to primary storage.
Since the primary storage file system is of finite size, ProMAX provides
the capability to have some of the disk datasets, such as the ...TRCx and
...HDRx files, and some of the ordered parameter files span multiple
disk partitions. Disk partitions other than the primary disk storage
partition are referred to as secondary storage.
WARNING:
If the primary file system fills up, ProMAX will crash and will not be able to launch
until space on primary storage has been cleaned up.
Under the default configuration, the initial TRC1 and HDR1 files are
written to the primary storage partition. It is possible to override this
behavior by setting the appropriate parameter in Disk Data Output. If the
parameter Skip primary disk partition? is set to Yes, then no TRC or
HDR files will be written to the primary disk partition. This can be
useful as a means of maintaining space on the primary storage partition.
(To make this the default situation for all users, have your ProMAX
system administrator edit the diskwrite.menu file, setting the value for
Alstore to t instead of nil).
Tape Datasets
Although the index and map files still reside on disk, copies of them are
also placed on tape(s), so that the tape(s) can serve as a self-contained
unit(s). If the index and map files are removed from disk, or never
existed, as in the case where a dataset is shipped to another site, the tapes
can be read without them. However, access to datasets through the index
and map files residing solely on tape must be purely sequential.
Tape datasets are written by the Tape Data Output process, and can be
read using the Tape Data Input or Tape Data Insert processes. These
input processes include the capability to input tapes by reel, ensemble
number, or trace number. Refer to the relevant helpfile for a complete
description of the parameters used in these processes.
The use or non-use of the tape catalog in conjunction with the tape I/O
processes is determined by the tape catalog type entry in the appropriate
$PROMAX_HOME/etc/config_file. Setting this variable to full
activates catalog access, while an entry of none deactivates catalog
access. An entry of external is used to indicate that an external tape
catalog, such as the Cray Reel Librarian, will be used. You can override
the setting provided in the config_file by setting the environment
Getting Started
The first step in using the ProMAX tape catalog is to create some labeled
tapes.
The following steps are required to successfully access the tape catalog:
1. Label tapes.
0 AAAAAA 0 1 4
1 AAAAAB 0 1 4
2 AAAAAC 0 1 4
3 AAAAAD 0 1 4
4 AAAAAE 0 1 4
The fields are: volume serial number (digital form), volume serial
number (character form), tape rack slot number, site number, and media
type, respectively. You can manually edit these fields.
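Parsing such a label listing can be sketched line by line. The field order comes from the description above; the function itself is illustrative, not part of ProMAX:

```python
# The five documented columns of a tape-catalog label line, in order.
FIELDS = ("vsn_digital", "vsn_character", "rack_slot", "site", "media_type")

def parse_label_line(line):
    """Split one tape-catalog label line into its five documented fields."""
    return dict(zip(FIELDS, line.split()))

entry = parse_label_line("0 AAAAAA 0 1 4")
print(entry["vsn_character"], entry["media_type"])   # AAAAAA 4
```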
Chapter Summary
Can you explain how data passes through single-trace tools,
ensemble tools, interprocess communication, and stand-alone tools?
Data analysis tools and the resulting filtering processes, such as F-K analysis and filtering, are
good examples for parameter testing. In this chapter, one of the exercises is to design filters in the
F-K domain and compare the F-K filtered data to your input seismic data.
Chapter Objectives
3. Parameter Selection
This chapter gives the processor a framework of how to define and test
parameters, gates, windows and processing flows. Upon completion of
this chapter you should:
4. In Trace Display, use variable trace spacing. This will use the
secondary sort key of OFFSET to variably space the traces. Also,
set the number of ensembles per screen to 2.
Decon
Gate
Parameter tables
6. If you did not save your trace kill table from chapter 4, go ahead
and pick the bad traces here: Picking → Kill traces... → Kill
list from Trace Display.
7. Pick a top mute to get rid of first break and refracted energy:
Picking → Pick Top Mute... → FB Mute by AOFFSET. Use the
Paintbrush icon to see the effects of your current picks. In this case
you should see only hyperbolas after Paintbrush applies the top
mute.
10. If you desire, you can pick the reversed traces: Picking → Reverse
traces... → reverse traces by AOFFSET. In general, reversed
traces will be flagged in the field in the observer's log. The
statics routines will also detect the reversed traces for you.
11. Select File → Save Picks, then select File → Exit/Stop Flow.
Parameter Test
Parameter Test creates two header words. The first is called REPEAT
data copy number and is used to distinguish each of the identical copies
of input data. The second is called PARMTEST and is an ASCII string,
uniquely interpreted by the Screen Display processes as a label for the
traces.
NOTE:
Entering five nines (99999) is a flag that tells the process to use the values
found in Parameter Test for this parameter.
After viewing the tests and deciding on the most appropriate value
for the dB/sec correction, select File → Exit/Stop Flow.
10. Select View from the flow builder menu and look at the processes
that were actually executed in your flow.
11. Edit your flow again, and change the following Trace Display
parameters:
13. Use the Next ensemble icon to display the four tests, then use the
Animation tool to review the tests. Check to see if you would still
use the same value for dB/sec as you chose before.
Branch your processing stream so that each copy of the data may be
processed with different parameters.
Finally, you may use a process called Trace Display Label to generate a
header word for posting a label on the display.
1. Copy your previous flow and edit it to look like the following:
2. Use the same parameters as the previous flow for the first four
processes.
3. In True Amplitude Recovery, set the dB/sec to the value you chose
in the previous flow.
The ELSE process selects all traces not previously selected with IF
or ELSEIF. In our case, having selected two of the three copies of
data for filtering leaves only the third data copy (REPEAT=3) for
the ELSE branch. In this example, you will apply deconvolution and
a filter to it.
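The branching logic amounts to a plain conditional on the REPEAT header word created by Parameter Test. This is a conceptual sketch of the IF/ELSEIF/ELSE selection, not ProMAX code, and the branch names are hypothetical:

```python
def branch_for(repeat):
    """Route each identical data copy to its own processing branch,
    keyed on the REPEAT trace header created by Parameter Test."""
    if repeat == 1:        # IF branch: first copy
        return "filter_A"
    elif repeat == 2:      # ELSEIF branch: second copy
        return "filter_B"
    else:                  # ELSE branch: everything not selected above
        return "decon_then_filter"

print([branch_for(r) for r in (1, 2, 3)])
```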
After viewing the data in this mode, you may choose to display each
copy on a different screen, and use the screen swap mode.
Separating trace data into signal and noise is often possible in F-K space.
This separation can be exploited by defining a filter to reject the noise
or accept the good data. There are ProMAX tools to view data in F-K
space, design filters and subsequently apply filters to enhance your data.
F-K Analysis
In this exercise you will bring in one shot with some slow linear noise.
After inspection in both the time domain and F-K space, design a filter
to reject the noise. You will want to try a polygon filter as well as a fan
filter to attenuate the noise.
NOTE:
This is not a real processing flow, since you would normally do the F-K filter
before the deconvolution. For class purposes we are using the deconvolution
to enhance the ground roll so that we can demonstrate how powerful F-K filters
are at attenuating ground roll.
2. For all processes prior to F-K Analysis, use the same parameters as
the previous flow.
3. In F-K Analysis, enter 122 for the panel width to account for the
shot gap in the transform.
4. Set 55 ft. for the distance between traces (do not let this default to
0).
7. With the default display you will see four panels. View only the
TX and FK panels by selecting Configuration → TX-and-FK.
8. Use the dx/dt icon. You should identify the ground roll energy in
the F-K domain by the velocity you measure in T-X space.
10. With the F-K data displayed, select the Picking tool icon to build a
table for interactively picking a reject zone.
(Figure: 1) start with a rectangle, 2) move the control points, 3) the desired polygon.)
11. Pick a polygon to include all the noise to filter. It is best to start with
a square or rectangle and then use MB1 to add new control points
and MB3 to move the control points to customize the shape of the
polygon as illustrated on the previous page.
12. After building the desired polygon, examine the response of the
data to the filter by selecting FilterResponse → FilteredOutput.
FK Filtered Output
13. You may also want to view the impulse response of the filter by
selecting FilterResponse → ImpulseResponse. To better view the
operator, select Controls → TX Display..., set Clip by
amplitude to .008, and then select OK.
14. After using the Interactive Data Access option to view other shots,
select File → Exit/Stop Flow, and then select Yes to save your
polygon.
2. Use the same parameters as before for the first seven processes,
except turn off the Interactive Data Access.
4. Use the Repeat option in IF to send one copy of the shot to the F-K
Filter process.
Spectral Analysis
In this exercise you will run Interactive Spectral Analysis in all three
modes, and then compare the results of running deconvolution on the
data. Deconvolution testing may become very involved in certain
situations. One criterion that you may use to help decide on decon
parameters is to look at amplitude (or power) spectra of the trace data
before and after decon. If the decon has worked properly, you should see
some flattening, or whitening of the spectrum after decon relative
to before. In this exercise we will look at such a comparison on a single
shot record.
1. Build the following flow to run the ISA in its simplest configuration:
4. Exit from the display using the File → Exit and Stop Flow pull-down
menu.
In this mode you can select a Single Subset of the available data for
the purposes of computing the average power and phase spectra.
7. Click on the Select Rectangular Region icon and then draw a box
around an area of interest. The data window and spectral windows
will change configuration to match your data selection.
You can move or redraw this window as many times as you wish.
8. Exit from the display using the File → Exit and Stop Flow pull-down
menu.
11. Click on the Select Rectangular Region icon, draw a box
around an area of interest, and then select the
Options → Spectral Analysis pull-down menu.
14. Copy your flow to compare a shot before and after deconvolution
with an IF-ELSEIF loop.
16. You can use the Slope icon to calculate the dB roll on/off of the
amplitude spectrum.
17. Click on the Next ensemble icon to display the data after decon.
18. Select the Options → Spectral Analysis pull-down menu again to
show the spectral estimate for the data after decon. Observe the
flattened amplitude spectrum and the change in the dB scale.
19. When done, select File → Exit and Stop Flow from each of the display
windows.
Chapter Summary
ProMAX offers three methods of applying datum-static corrections, depending on whether or not
the sources are on the surface. All of these options are within the Datum Statics Calculation and
the Datum Statics Apply processes, which actually calculate and apply the static corrections. This
process utilizes a database_math file to create and manipulate related database entries. (This file
can be found in the $PROMAX_HOME/port/misc directory.) These database values are then used
to create trace header entries and apply appropriate static shifts to traces.
You can also use refraction statics to calculate and apply datum statics. Refraction statics will be
covered in a later chapter.
Chapter Objectives
This chapter explains how to calculate and apply elevation statics. Upon
completion of this chapter you should:
Elevation Statics
Compute static time shifts to take the seismic data from their
original recorded times, to a time reference as if the data were
recorded on a final datum F_DATUM (usually flat) using a
replacement velocity (usually constant).
Partition the total statics into two parts, the Pre (before) NMO term
and Post (after) NMO terms relative to N_DATUM.
Apply the Pre (before) NMO portion of the statics and write the
remainder to the trace header.
In Datum Statics Calculation* you have the option to shift prestack data
to a floating datum or a final datum. You supply a final datum elevation
and a replacement velocity. The elev_stat_math file then establishes
values in the database for F_DATUM, N_DATUM, S_STATIC,
R_STATIC, and C_STATIC. Details of this process can best be
understood by examining the contents of the elev_stat_math file. This
file typically resides in $PROMAX_HOME/port/misc.
Datum Statics Calculation* then creates four new header entries for
statics: NMO_STAT, FNL_STAT, TOT_STAT and NA_STAT. The
integer multiple of the sample period (usually a multiple of 2 or 4 ms)
portion of NMO_STAT is automatically applied by Datum Statics
Apply, shifting traces to the floating datum. The fractional sample
period portion is written to the NA_STAT header entry and applied
later. Normally the NA_STAT is applied during NMO, which will
interpolate the data to the fractional static properly.
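The split of a static into a sample-period multiple (applied now) and a fractional remainder (stored in NA_STAT and applied during NMO) is simple arithmetic. The helper below assumes one plausible convention, rounding to the nearest sample; it is illustrative, not the ProMAX implementation:

```python
def partition_static(nmo_stat_ms, sample_period_ms):
    """Split a static into the integer multiple of the sample period,
    applied immediately, and the fractional remainder (NA_STAT),
    applied later during NMO. Rounding convention is assumed."""
    applied = round(nmo_stat_ms / sample_period_ms) * sample_period_ms
    na_stat = nmo_stat_ms - applied
    return applied, na_stat

# e.g. a -13 ms total static with a 4 ms sample period
applied, na_stat = partition_static(-13.0, 4.0)
print(applied, na_stat)   # -12.0 -1.0
```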
(Figure: elevation statics geometry. The shot and receiver lie at the surface
elevation, above the base of weathering (Vweathering). NMO_STAT shifts traces
to the floating datum N_DATUM, and FNL_STAT shifts them on to the final datum
F_DATUM using Vreplacement. S_STATIC, C_STATIC, and R_STATIC are the
component statics.)
Database Attributes:
N_DATUM = floating datum
For this dataset, use a final datum elevation of 800 ft. and a
replacement velocity of 8000 ft/sec.
3. Select the Database Math Method - Shot Hole Using Uphole Info.
Shot Hole Using Uphole Info: If you want to honor the shot depth
and uphole information, use the elev_stat_math file.
Shot Holes Ignoring Uphole Times: If you do not trust the uphole
information, then you can override the weathering velocities
calculated with uphole times and shot depths and supply your own
weathering velocity. This option will use the noup_stat_math file.
8. When the job completes exit the flow, and select the Database
menu.
9. From the DBTools window select the SRF tab (order), and then by
double clicking view the following attributes: R_STAT01,
F_DATUM, DATUMVEL, and ELEV (receiver elevation). Notice
the inverted relationship between the static and the elevation.
Select the SIN tab, and view the following attributes: S_STAT01,
and ELEV (elevation of surface at the shot locations).
10. Why are the source and receiver statics of opposite sign? Perhaps the
shots are buried beneath the final datum?
From the CDP order, view ELEV, and N_DATUM (floating datum).
Notice the effect of the 51 point CDP smoother you applied.
Data output from this flow will later be input to velocity analysis.
3. Once the job finishes view the shots with flow 1.1-View Shots.
Examine the trace headers for NMO_STAT, FNL_STAT,
TOT_STAT, and NA_STAT using the Header icon.
If shot and receiver statics to a final datum have been calculated outside
of ProMAX, the statics can be incorporated into a processing flow. Use
the ASCII file import option in XDB Database Display to create entries
which may be accessed by Datum Statics Apply. Datum Statics Apply
creates the necessary database entries, and partitions these imported
statics into NMO_STAT and FNL_STAT. The sample period multiple
portion of NMO_STAT is applied to the traces by Datum Statics Apply,
and the remainder is stored in NA_STAT to be applied later. Recall:
NMO_STAT = S_STATIC + R_STATIC + C_STATIC. Therefore,
Datum Statics Apply will recalculate NMO_STAT using the
N_DATUM and C_STATIC previously calculated by Datum Statics
Calculation*.
When these statics are imported to the SIN and SRF Ordered Database
Files, they must both be of type Geometry and the Attribute names must
be USERSTAT.
For this class, no ASCII-format statics file is available; therefore, you
will use the XDB Database ASCII Save functionality to output an
ASCII file of shot and receiver statics created in the previous exercise.
You will then import these statics back into the database. This will allow
you to see both the ASCII import and export portions of the database.
Caution:
Apply User Statics is an alternate method for applying datuming type statics. Only
one of the datuming processes should be run on a dataset. Use either Datum Statics
Apply, Apply User Statics, or Apply Refraction Statics, but only one. Refer to
the helpfiles for additional statics related information.
3. Click on File and enter the full path and filename (including
extension) of the ASCII file. Click OK and the contents of the
ASCII file are displayed. The ASCII/CLIENT path is a generic
ASCII file import functionality.
4. Once the ASCII file is displayed, select the Order (SRF or SIN),
Infotype (GEOMETRY), and Attribute (USERSTAT).
5. Click on Location Index and then define the rows and columns to
import.
6. Click on Display.
Be sure you complete the ASCII Import steps for both shot and
receiver ASCII files.
9. Datum Statics Apply will know to use the user_stat_math file for
the Database Math Method.
10. Execute the flow. The trace headers are updated and the traces are
shifted to the floating datum.
11. Once the job finishes view the shots with flow 1.1-View Shots.
Examine the trace headers for NMO_STAT, FNL_STAT,
TOT_STAT, and NA_STAT using the Header icon.
Chapter Summary
Chapter Objectives
5. Brute Stack
This chapter creates your first QC stack of the data. Upon completion of
this chapter you should:
5. Enter the description name for your imported velocity. Use a name
similar to imported from ascii file.
This opens two new windows, an empty viewing window and a File
selection window.
7. Input the absolute path name to the directory where the velocity file
is stored and append a /* to the end of the pathname (for example,
/misc_files/2d/*). Click on Filter.
The ASCII file is opened, and the contents displayed in the Import
viewing window.
9. Click on Format.
10. Enter a new format definition name Vels Import Format or select
a previously defined format (you probably do not have any yet).
12. Click on CDP and then drag the mouse over the appropriate
columns on the import file window to define the correct columns
for the CDP value.
NOTE:
You do not have to select the rows to import since the database will search for valid
CDP numbers with associated velocities.
15. Select Overwrite ALL existing values with new import values
and OK.
16. The XCOOR and YCOOR columns are ignored for 2D.
17. Click on File → Exit to save the parameter table and exit from the
editor.
18. Check the table for correctness by going back to the list of tables
from the User Interface and selecting Edit for the table.
CDP/Ensemble Stack
You will now use the CDP/Ensemble Stack process to create a stacked
section of the data with elevation statics.
2. In Disk Data Input, select your shots with elevation statics applied,
and sort by CDP.
Display Stack
5. You may also stack and display the user statics dataset STK-user
statics as a QC.
Chapter Summary
For First-break picking and trace editing, ProMAX uses a Cascade-Correlation Learning
Architecture. Advantages of this algorithm include decreased network learning time and the
ability to incrementally add to an existing network. The neural network compares various
attributes of the correct pick to other possible picks within a window. The network recognizes the
ability of an attribute to predict the correct pick and accordingly weights the network connection
to that attribute.
Chapter Objectives
The first break picker in Trace Display gives you the opportunity to
interactively create and train a neural network to pick first breaks. You
will manually pick some first breaks and use these picks to train a neural
network. The neural network will then try to pick first breaks on selected
shots, and you can QC these picks using Trace Display.
NOTE:
The NN First Break Picker menu in Trace Display only appears if geometry is
defined and your dataset matches the database. You can check whether geometry
matches the database via MB2 under the Dataset listing from the Flows menu.
Interactive Training
1. Copy your flow 4a.2-Apply Datum Statics and add/delete/edit
processes so that it looks like the following:
Select the pick polarity and the signal/noise gate length. The neural
network works well with peaks and a gate length of 100 ms. Select
OK to accept these parameters. The neural network itself, however,
may key off of instantaneous phase/frequency, amplitude before or
after the first break, or any other pattern it can recognize.
6. Select the nn first break gate table from the Pick Layers window,
and pick the top of the gate.
Manually pick first breaks using MB1. Pick first breaks on 20-30
traces. Because training is interactive, you can train the network
incrementally: you do not need many picks to begin training, and
more picks can be added in later training runs. More picks mean
longer training times.
The One time Recall option applies the neural network to the
currently displayed gather. A First Break NN Recall window
appears.
10. If the picks are bad, modify your FB Training Data and retrain the
network.
To modify training picks, click on the Picking tool icon. Your new
table of picks appears in the Pick Layers window. Remove the table
from the list and activate the FB Training Data. Modify or add to
these training picks, select First Break NN Training, and use the
same weight table. Iterate through steps 6, 7, and 8 until you are
satisfied with the results. If you still cannot get satisfactory results,
try purging the Neural Network (FirstBreakPicker →
Purge Neural Net) and starting over.
11. Set Neural Net Recall to Continuous and click the Next ensemble
icon to go to the next shot.
You can retrain if necessary, or if you think the picks are close
enough, select File → Exit/Stop Flow, and choose to save edits
before exiting.
The weight table and time gates are saved and can be used in the
batch NN First Break Picker process to pick the entire dataset.
This step uses the neural network weight matrix to pick first breaks on
all shots. In the case of first-break picking, neural network picks are
stored in the ordered database and can be accessed for various uses
including refraction static analysis.
You must specify a starting offset for the picker. Specify an offset
with good S/N and no shingling of refractors. For this data, an offset
value of about 1000 ft. is adequate.
Edit the same flow, toggle NN First Break Picker inactive and
Trace Display active, and execute the flow. From the menu bar in
the Trace Display window, select Picking → Edit Database Values
(first breaks)... Select NN_PICK as the Infotype, and PICK0001
(the 12345678 picks are from the interactive picker) from the OPF
File Selector, and use the same name to save edits.
Don't spend too much time editing picks here. The easiest way to
view and edit your picks is to use the first break editing capabilities
of the Refraction Statics process in the next chapter. Also do not
worry about zero picks on the dead traces.
Chapter Summary
The refraction statics processes expect R_STATIC and S_STATIC to be present in the database.
Once these attributes are in the database, the refraction statics processes can fill them in with more
accurate static values than simple elevation static calculations. The recommended method to
create R_STATIC and S_STATIC database entries is to run the process Datum Statics
Calculation*, before running the refraction statics processes.
Refraction Statics
Refraction Statics Calculation - coordinate based
Apply Refraction Statics
Stack with Refraction Statics
Chapter Objectives
Refraction Statics
NOTE:
First breaks must be picked and written to the database prior to this exercise. Please
refer to the Neural Network First Break Picking exercise earlier in this manual.
Refraction Statics - 2D
In this exercise you will use the Refraction Statics* process and first-
break pick times to calculate a near-surface model and travel-time
corrections.
NOTE:
This process does not use XY values; therefore it is not applicable to crooked lines. A
crooked line may be defined as any line with a bend greater than 15 degrees. If you are
calculating refraction statics on a crooked line, refer to the Refraction Statics
Calculation* process described later in this chapter.
This process calculates shot and receiver refraction statics to shift to the
final datum and updates the database. Results of this exercise will be
used by Datum Statics Apply in a later exercise.
Refraction Statics*
Select display DEVICE: -----------------------------This Screen
Select First Break Times file: -TRC:NN_PICK:PICK0001
Get LAYER Picks from DATABASE: -------------------------No
Get Refractor Velocities from DATABASE: ----------------No
Select TRACE data file: ---------------Shots-with geometry
Compute V0 from UPHOLE data?: -------------------------Yes
Number of layers: ----------------------------------------------------1
Use Delay Times in velocity/depth model?: ------------Yes
Use Deep Hole delay time algorithm?: ---------No
Use GRM in velocity/depth model?: -----------------------Yes
Specify GRM minimum XY distance: -------------0.
Specify GRM maximum XY distance: ------------0.
Specify GRM XY distance increment: ----------55.
Final datum Elevation: -----------------------------------------800
Replacement Velocity: ----------------------------------------8000
Use Uphole Time in source statics algorithm?: ---------No
Select your first break pick file. Picks are typically in the database in
the TRC order and NN_PICK Infotype. Select the batch PICK0001
file for this exercise. Input trace data will be the raw shots. Enter a
final datum of 800 ft. and a replacement velocity of 8000 ft/sec.
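As a rough check on these numbers, a simple one-way elevation static through the replacement velocity can be sketched as follows. The sign convention is illustrative and may differ from ProMAX's internal one:

```python
# Hedged sketch of a simple elevation (datum) static: a one-way shift
# through the replacement velocity. Sign conventions vary between
# packages, so treat this as illustrative only.

def datum_static_ms(elevation_ft, datum_ft=800.0, v_repl_ft_s=8000.0):
    """One-way static (ms) that shifts a trace from its surface
    elevation to the final datum."""
    return (datum_ft - elevation_ft) / v_repl_ft_s * 1000.0

# A receiver 40 ft above the 800 ft datum:
print(datum_static_ms(840.0))  # -5.0 (ms, shift up to datum)
```

The refraction statics solution replaces this simple calculation with delay times through the near-surface model, but the magnitudes should be broadly comparable.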
Use the Edit Picks option for final editing of first-break picks prior
to inversion. Use the mouse button help to guide your editing; use
the options on the right side of the screen to edit your data. To guide
your editing you may want to turn on the seismic by toggling on Add
Traces. Click MB2 below the data to move to the next set of shots,
or MB3 to move backwards. Select Done to go back to the main
menu. Select Yes to Output Updated Picks to the Database, and
provide a name RefrEdit for the pick file.
Warning:
The editing in this function currently snaps to a sample and not necessarily the true
peak. This could lead to up to a 4 ms pick error. Residual statics, however, should
correct for these slight errors.
This option displays pick times for both sides of the spread, as in the
case of split spread shooting. Define the offset range for each layer
by holding down and dragging MB1 over the corresponding range,
then releasing MB1. This is an interpretive process. Note: The
displayed velocity is only a guide; you are not assigning a velocity
for the layer. Avoid inflection points where refractors are shingling.
Also avoid low S/N areas. The velocity you get should be on the
order of 7500 ft/s. Select Done and then Yes to Output Refractor
Picks to Database.
This option allows viewing the shot and receiver statics calculated
from the model data. Source statics from the elevation of the shot
through the model to the final datum are displayed with the character
"s". Receiver statics from the elevation of the receiver through the
model to the final datum are displayed with the character "r". Select Yes
to Output STATICS to the DATABASE.
NOTE:
In the main menu, click MB2 on any previous box to view its current values or MB1
to re-edit those values. If you choose to re-edit, be sure to step through all
subsequent options to correctly recalculate your final statics.
11. Exit the current flow. From the Flows window, access the database
with the Database global command option.
The main disadvantage is that there is no graphical interface for
editing. The source and receiver static solutions are applied to the data
in a future step, Apply Refraction Statics.
NOTE:
First break times must be picked and written to the database prior
to this exercise. Please refer to the Neural Network First Break Picking
exercise earlier in this manual.
As a part of this exercise you will see that there are two ways to enter
the refractor offset ranges. These are:
Manually.
In this exercise you will use first-break pick times to calculate a near-
surface model and travel-time corrections. This process calculates shot
and receiver refraction statics to shift to the final datum and updates the
database. Results of this exercise will be used by Apply Refraction
Statics in the next exercise.
>Refraction Statics*<
Refraction Statics Calculation*
Select first break time file: -----TRC:NN_PICK:PICK0001
Number of layers: ----------------------------------------------------1
Identification number: ----------------------------------------------1
Minimum fold: ---------------------------------------------------------1
Shooting Geometry: -----------------------------2D split spread
V0 options: -------------------Compute V0 from uphole data
INPUT REFRACTOR OFFSET?: ----------------------------- Yes
Refractor Offset specification: --------User typein
Enter SIN and refractor offsets: -----------------------
-------------------------------10:-1800--500,500-1800/
COMPUTE REFRACTOR VELOCITIES?: -------------- Yes
Type of INITIAL velocity computation: ---- MEAN
Smooth INITIAL velocities before output?: ---Yes
Length of INITIAL velocity smoother: ----------201
Edit first break times (median velocity)?: ------No
COMPUTE DELAY TIMES?: ----------------------------------- Yes
TYPE of delaytime ALGORITHM: -Gauss-Seidel
Number of iterations: ------------------------------------5
TYPE of delay time computation: ----------- MEAN
Iterate refractor velocity?: -------------------------- No
COMPUTE REFRACTOR DEPTH MODEL?: ------------- Yes
First refractor smoothing: -----------No smoothing
COMPUTE SOURCE and RECEIVER STATICS?: ----- Yes
Final datum Elevation: -----------------------------800
Replacement method: -----------Refractor Velocity
COMPUTE RESIDUAL STATICS?: ----------------------------No
Select the first break time to use for the statics decomposition. These
time picks will be in the TRC OPF and will normally be of the type
NN_PICK. Select the PICK0001 file. If you have output an edited
pick file, it will be stored with an infotype of FBPICK. Enter the
number of layers to model, in this case use one layer. The
identification number will be 1 for the first run through the process.
The shooting geometry is 2D split spread.
Once CDP velocity is available, delay times for shots and receivers
may be computed. This is done by iteration, starting with source
delay time estimates, followed by receiver delay time estimates, and
(optionally) finalized by CDP velocity updating. Values are not
computed for any SIN, SRF or CDP that does not meet the minimum
fold (menu parameter) criterion. Once the decomposition is
complete for each refractor, these missing values are interpolated
based on X and Y.
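The iterative decomposition described above can be sketched with a toy Gauss-Seidel style loop, assuming the single-refractor model t_pick ≈ d_src + d_rcv + |offset|/V_refr. All names and the data layout here are hypothetical; this is not the ProMAX implementation:

```python
# Illustrative Gauss-Seidel style delay-time decomposition, assuming
# the single-refractor model  t_pick ≈ d_src + d_rcv + |offset|/v_refr.
# Hypothetical data layout; NOT the ProMAX implementation.
from collections import defaultdict

def decompose_delays(picks, v_refr, n_iter=5):
    """picks: list of (src, rcv, offset_ft, t_pick_ms).
    v_refr in ft/s. Returns (src_delays, rcv_delays) in ms."""
    d_src = defaultdict(float)
    d_rcv = defaultdict(float)
    for _ in range(n_iter):
        # Update source delays with receiver delays held fixed ...
        sums, counts = defaultdict(float), defaultdict(int)
        for s, r, x, t in picks:
            sums[s] += t - d_rcv[r] - abs(x) / v_refr * 1000.0
            counts[s] += 1
        for s in sums:
            d_src[s] = sums[s] / counts[s]
        # ... then receiver delays with source delays held fixed.
        sums, counts = defaultdict(float), defaultdict(int)
        for s, r, x, t in picks:
            sums[r] += t - d_src[s] - abs(x) / v_refr * 1000.0
            counts[r] += 1
        for r in sums:
            d_rcv[r] = sums[r] / counts[r]
    return d_src, d_rcv
```

Each pass distributes the residual pick times between sources and receivers, which is why low-fold SINs and SRFs (below the minimum-fold criterion) are excluded and later interpolated.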
The depth model stage inputs delay times and refractor velocities in
CDP, interpolates refractor velocity into SIN and SRF, and computes a
depth model for sources and another for receivers. Optionally, the
first refractor depth in SRF may be projected into CDP, smoothed,
projected back into SRF, V0 recomputed in SRF based on the
smoothed depths, new V0 projected from SRF to SIN, and finally
SIN and SRF depth models computed.
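Under the single-refractor assumptions above, the standard delay-time-to-depth conversion can be sketched as follows (illustrative textbook formula; ProMAX's implementation may differ):

```python
import math

# Standard single-refractor delay-time-to-depth conversion, assuming
# weathering velocity V0 over a refractor with velocity V1:
#   delay = z * cos(theta_c) / V0,  cos(theta_c) = sqrt(V1^2 - V0^2)/V1
#   =>  z = delay * V0 * V1 / sqrt(V1^2 - V0^2)
# Illustrative only; not ProMAX's exact code.

def depth_from_delay(delay_s, v0, v1):
    """delay_s in seconds, velocities in consistent units; returns depth."""
    return delay_s * v0 * v1 / math.sqrt(v1 ** 2 - v0 ** 2)

# A 20 ms delay with V0 = 2500 ft/s and V1 = 7500 ft/s:
print(round(depth_from_delay(0.020, 2500.0, 7500.0), 1))  # 53.0 (ft)
```

This is why an accurate V0 (from uphole data, here) matters: the computed refractor depth, and hence the statics, scale directly with it.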
It is important to note that the Datum Statics Apply process first checks
to see if other statics have been applied to the traces by an earlier
processing step. If statics are applied, Datum Statics Apply first removes
these statics, returning the traces to their original recorded time
reference. Also, if previous statics contained any hand statics or shot
delay corrections, these statics are also removed and should be
reapplied.
NOTE:
You do not have to recalculate the datum statics (...C_STATIC...) unless you want to
change the smoother of N_DATUM, the datum elevation, or the replacement
velocity. Datum Statics Apply will back out the elevation statics before it applies
the refraction statics.
For Source statics, the order is SIN and the Infotype is Geometry.
You will have an available list of parameters files, saved in
Refraction Statics*. Select one of the following statics files:
For Receiver statics, the order is SRF and the Infotype is Geometry.
Select one of the following statics files:
Chapter Summary
This flow is used throughout the rest of the class to compare stack sections.
Compare Stacks
Chapter Objectives
5. Brute Stack
In this chapter you learn a slick way to compare stack datasets. This
technique is quite valuable in testing processing flows and parameters.
Upon completion of this chapter you should:
Compare Stacks
The stack with elevation statics will appear first. Use the Next
ensemble icon to display the stack with refraction statics. After both
stacks have been displayed, use the animation tool to compare the
stacks.
You may want to execute this flow again, and display both stacks on
a single screen.
Chapter Summary
Precomputing data at predefined locations is also supported to speed the interactive session. When
used in the precomputed mode, the process reads in precomputed analysis data, as opposed to
standard CDP-ordered data. This precomputed data is generated using Velocity Analysis
Precompute. Preprocessing of data must be performed at the precomputing step.
Chapter Objectives
6. Velocity Analysis
Supergather Formation*
Read data from other lines/surveys?: ---------------------No
Select dataset: -----------------------Shots-decon/refr statics
Presort in memory or on disk?: -----------------------Memory
Maximum CDP fold: ---------------------------------------------180
Minimum center CDP number:--------------------------------825
Maximum center CDP number:-------------------------------950
CDP increment:-------------------------------------------------------25
CDPs to combine:------------------------------------------------------9
Bandpass Filter
Ormsby frequency filter values: -------------------3-6-50-60
----Default all remaining parameters----
Automatic Gain Control
----Default all parameters for this process----
Velocity Analysis Precompute
Disk Data Output
Supergather Formation*
Bandpass Filter
Automatic Gain Control
Velocity Analysis Precompute
Number of CDPs to sum into gather: --------------------------9
Apply partial NMO-to-binning:--------------------------------Yes
Apply differential CDP mean statics?:---------------------Yes
Absolute offset of first bin center: -------------------------27.5
Bin size for vertically summing offsets: -------------------55
Maximum offset: ---------------------------------------------6572.5
Use absolute value of offset for stacking?: --------------Yes
Minimum semblance analysis value: -------------------7000
Maximum semblance analysis value: ----------------20000
Number of semblance calculations:--------------------------50
Semblance sample rate (in ms): ------------------------------20
Semblance calculation window (in ms): -------------------40
Number of stack velocity functions: -------------------------17
Number of CDPs per stack strip:-------------------------------5
Scale stacks by number of live samples summed:---Yes
Method of computing stack velocity functions:--------------
----------------------------------------------Top/base range
Velocity variation at time 0: ---------------------1000
Velocity variation at maximum time:---------3000
Velocity guide function table name:------------------------------
------------------------------------imported from ascii file
Maximum stretch percentage for NMO: --------------------30
Long offset moveout correction?:-------------------------NONE
Disk Data Output
Output Dataset Filename: ------------------------------------------
------------------------Precomputed Velocity Analysis
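Semblance is the coherence measure scanned by the parameters above. Assuming the 7000-20000 values are the scanned stacking velocities (in ft/s), each trial velocity NMO-corrects the gather and semblance is measured in a short window; a minimal sketch of the measure itself:

```python
# Minimal semblance sketch (hypothetical helper; not the ProMAX code):
# for each trial velocity in the scan, the gather is NMO-corrected and
# coherence is measured over a short window like this.
import numpy as np

def semblance(window):
    """window: 2D array (n_samples_in_window, n_traces) of
    NMO-corrected amplitudes. Returns a value in [0, 1]."""
    num = np.sum(np.sum(window, axis=1) ** 2)   # energy of the stack
    den = window.shape[1] * np.sum(window ** 2)  # total trace energy
    return float(num / den) if den > 0 else 0.0

# Perfectly flat (identical) traces give semblance 1:
flat = np.tile(np.array([[1.0], [2.0], [-1.0]]), (1, 12))
print(semblance(flat))  # 1.0
```

Correct trial velocities flatten events, pushing semblance toward 1; that is what the bright spots on the semblance panel represent.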
Set the number of CDPs to sum into gathers as 9, and set the bin
sizes.
One | Group | Interval
Partial NMO and SUM: moves the traces to the NMO of the bin centers
Full NMO and SUM: flattens the traces to the zero-offset time of the gather
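The two summing options above can be sketched with the hyperbolic moveout equation t(x) = sqrt(t0^2 + x^2/v^2); the function names are hypothetical:

```python
import math

# Hedged sketch of full vs. partial NMO shifts, per the options above.
# t0 in s, offsets in ft, v in ft/s. Hypothetical names; illustrative only.

def t_nmo(t0, x, v):
    """Hyperbolic traveltime at offset x."""
    return math.sqrt(t0 ** 2 + (x / v) ** 2)

def full_nmo_shift(t0, x, v):
    """Flatten to zero offset: remove all of the moveout."""
    return t_nmo(t0, x, v) - t0

def partial_nmo_shift(t0, x, x_bin, v):
    """Move the trace from its own offset x to the bin-center offset."""
    return t_nmo(t0, x, v) - t_nmo(t0, x_bin, v)

# e.g. t0 = 1.0 s, offset 2000 ft, bin center 1800 ft, v = 8000 ft/s:
print(round(full_nmo_shift(1.0, 2000.0, 8000.0), 4))  # 0.0308
```

Partial NMO applies only the small differential shift to the bin center, which is why it is preferred before DMO: the data keep most of their offset-dependent moveout.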
Velocity Analysis
In this flow we will set the parameters for velocity analysis to use the
precomputed data from the previous flow.
2. Set the Disk Data Input parameters as shown. Make sure to sort the
input data by the user-defined header word SG_CDP.
Next, select Yes for Set semblance scaling and autosnap parameters
to display the semblance submenu. The default settings will work
fine, so turn off the semblance submenu by clicking No for Set
semblance scaling and autosnap parameters. The submenu
parameter settings will be retained and used even though they are
not visible.
The parameter Set which items are visible works the same way. Both
the visibility and semblance parameters can also be changed
interactively from within the velocity analysis tool.
NOTE:
The Velocity Analysis parameters are only our initial guesses. Once inside the
Velocity Analysis Viewer we can change any of the parameters interactively.
Activate the picking icon, and begin picking a function with MB1.
You can pick in either the semblance display, or the velocity stack
strips display. As you pick velocities on the semblance plot, the
picks are also displayed on the velocity strips, and vice versa. Use
the Next ensemble icon to move to the next analysis location.
After you pick the first location and move to the second you may
want to overlay the function that you just picked as a second guide.
You can do this by clicking on View → Object visibility...
Average of all CDPs (blue). This will display the average of all of
the functions that have been picked in the output table to date.
Once you have determined your favorite settings, you can set the
flow parameters so your Velocity Analysis display is automatically
configured that way.
NOTE:
Your velocity picks are automatically saved to an RMS velocity ordered parameter
file when you move from one location to the next or Exit the program. You also have
the option to save picks using the Table/Save Picks option.
2. Return to the ProMAX User Interface. Toggle off all processes and
add Volume Viewer/Editor to the flow.
Make sure you use the same velocity table that you are currently
using in Velocity Analysis.
Also, make sure you select Yes to Interact with other processes
using PD? This will allow the PD (point dispatcher) to
communicate with Velocity Analysis.
If you have not picked any velocities, the display will contain zero
values, the screen will be all blue, and the velocity scale will be
very large. If you have picked at least one velocity function, you will
only see a vertical color variation in the Cross Section window.
When you are finished picking this new analysis location, click on
the Next ensemble icon again. This will not only move you to the
next analysis location, but will automatically send the velocity picks
just made to the Volume Viewer/Editor displays.
With the PD icon activated, position the mouse cursor over a node.
The cursor should change from an "x" to an "o". Click MB1 to
retrieve that velocity function into the Velocity Analysis display.
Clicking MB2 deletes that analysis location.
11. Continue picking velocities in Velocity Analysis until you finish all
of the locations on this project.
12. To finish picking, first make sure that the Point Dispatcher PD
icon in Volume Viewer is deactivated. Then select File → Exit/Stop
Flow from the Velocity Analysis menu bar and File → Exit in the
Volume Viewer/Editor.
Chapter Summary
Autostatics Flowchart
Data Preparation for Input to Residual Statics
Calculation of Residual Statics
QC and Application of Residual Statics
External Model Autostatics Overview
External Model Autostatics Flowchart
Chapter Objectives
7. Residual Statics
Autostatics Flowchart
[Flowchart: 1. Pre-Process (geometry, gain recovery, noise reduction,
deconvolution, refraction or elevation statics, NMO, BPF, AGC) →
2. CDP Stack → 3. Pick Autostatics Horizon]
All of the residual statics processes are standalone and require that all
preprocessing be applied to the data and output to a disk dataset prior to
executing the residual statics processes. At this point in the processing
sequence, the input to autostatics should have geometry information,
gain recovery, noise reduction, deconvolution, refraction or elevation
statics, and NMO applied.
4. Apply an AGC and bandpass filter to clean the data going into
residual statics calculation.
This data is input into the residual statics process in a later flow.
10. From the menu bar in Trace Display, select Picking → Pick
Autostatics Horizons...
Enter a gate width=100 (ms). The gate width should be bigger than
twice the maximum residual static expected. In swampy/marshy
areas this may be a large value. Click on OK when finished.
12. Pick a horizon using MB1. This identifies the center of the time
gate. Horizons may extend across the entire dataset or cover only a
portion of the data. CDPs not included in a horizon will not be
included in residual statics calculations for that horizon.
NOTE:
Autostatics horizons are picked from stacked data that has been shifted to the final
datum. The residual statics processes automatically shift these time horizons to the
processing datum, the same datum input CDP gathers are referenced to. This
process of applying C_STATIC to the horizons is automatic and transparent to the
user.
You will be prompted to enter a new smash value and time gate for
each horizon. Notice also the new horizon is represented in the Pick
Layers window with a number in parentheses.
The residual statics process will average the static solutions in areas
of overlapping windows. About a 10-trace overlap should provide a
smooth transition between static solutions. Too much overlap can
lead to abrupt edges in the static solution.
14. To quit and save the autostatics horizon parameter table, select
File → Exit/Stop Flow.
Correlation Autostatics
Differential Autostatics
Autostatics calculation
In this exercise, you will calculate residual statics using Maximum
Power and Correlation Autostatics. An additional exercise at the end of
this section describes the external model routines, Gauss-Seidel
External Model Autostatics and Cross Correlation Sum External Model
Autostatics.
Correlation Autostatics*
Select Trace data file: -------------CDP-input to res. statics
Select Autostatics HORIZON file: --------------------horizon1
Select Autostatics VELOCITY file: vels from precompute
Maximum velocity error (percent): ----------------------------5.
Number of CDPs for velocity smoothing: -----------------51
Minimum # of traces for vel. estimate: ---------------------36
Minimum % of offset range for vel. estimate: ------------25
Maximum statics allowed (milliseconds): -----------------20
Statics partitioning iterations: -----------------------------------4
Minimum live samples in a gate (percent): ---------------60
Seek/report reversed sources/receivers/channels: Yes
Create a NEW database entry for each run?: -----------No
>2D/3D Max. Power Autostatics*<
2. Select your NMO corrected CDP gathers as the input trace data to
Correlation Autostatics. Select your autostatic horizon and RMS
velocity tables.
Upon completion, click on View from the Flow menu and look at the
contents of the job.output file. Check the range of source and
receiver statics values. Do you have any reversed traces?
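Both autostatics methods rest on cross-correlating each trace against a pilot (model) trace inside the horizon gate. A minimal sketch of that core step, with hypothetical names and the lag search limited to the maximum static allowed:

```python
import numpy as np

# Sketch of the core cross-correlation step in residual statics:
# estimate the time shift between a trace and a pilot (model) trace
# from the lag of the correlation peak, restricted to the maximum
# static allowed. Hypothetical helper; illustrative only.

def xcorr_shift(trace, pilot, max_lag):
    """Return the lag (in samples) that best aligns trace with pilot."""
    lags = range(-max_lag, max_lag + 1)
    corrs = [np.dot(np.roll(trace, -lag), pilot) for lag in lags]
    return lags[int(np.argmax(corrs))]

# A pilot delayed by 3 samples is detected as a +3-sample shift:
pilot = np.sin(np.linspace(0, 6 * np.pi, 200))
trace = np.roll(pilot, 3)
print(xcorr_shift(trace, pilot, 20))  # 3
```

Limiting max_lag is the code analogue of the "Maximum statics allowed" parameter: it keeps the solver from jumping a leg of the wavelet (cycle skipping).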
>Correlation Autostatics*<
2D/3D Max. Power Autostatics*
Select Trace data file: -------------CDP-input to res. statics
Select Autostatics horizon file: -----------------------horizon1
RMS statics change convergence criteria: -------------0.05
Maximum number of iterations: ------------------------------10
Minimum live samples in a gate (percent): ---------------60
Maximum static allowed (ms): -------------5,7,10,15,20(6)
Correlation accept percent: -------------------------------------10
Compute Statics for whole line?: ----------------------------Yes
Use envelope of correlations?: --------------------------------No
Apply previously computed residuals?: -------------------No
Restrict offsets?: ----------------------------------------------------No
Final minimum static: ------------------------------------------- -20
Final maximum static: --------------------------------------------20
Run ID: --------------------------------------------------------------0000
Report static values after each iteration?: ----------------No
Upon completion, click on View from the Flow menu and look at the
contents of the job.output file. Check to see if your solution has
converged. Also check the range of source and receiver statics
values.
1. From the Flows menu select the Database option, and then select
Database → XDB Database Display from the main DBTools
menu. Select Database → Get from the XDB display.
2. Select SRF order, Statics infotype, and the two statics files
RCOR0000 & RPWR0000.
The source (SIN) and receiver (SRF) statics for Maximum Power
Autostatics are SPWR0000 and RPWR0000.
2. Execute (with MB2) and use the screen swap feature to compare
stacks.
3. Execute this flow again (with MB2) using the Max. Power
Autostatics.
[Flowchart: 1. Pre-Process (geometry, gain recovery, noise reduction,
deconvolution, refraction or elevation statics, NMO, BPF, AGC) →
2. Apply NMO (RMS Vels) and Sort to CDPs → 3. Eigen Stack (with eigen
matrix time gate) → 4. External Model Correlation (autostatics horizon;
correlations trace data; TRC:STATICS:TRM0001) → 5a. EMC Gauss-Seidel
(SIN:STATICS:SGEMxxxx, SRF:STATICS:SGEMxxxx) or 5b. EMC Xcor Sum
(SIN:STATICS:SPEMxxxx, SRF:STATICS:SPEMxxxx)]
[Figure: a conventional stack model trace compared with an Eigen Stack
model trace, the principal component of the input traces]
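One way to picture the Eigen Stack model trace is as the first principal component of the NMO-corrected gather. The sketch below uses an SVD and is illustrative only, not the ProMAX algorithm:

```python
import numpy as np

# Sketch of a principal-component ("eigen") model trace: the first
# left singular vector of the NMO-corrected CDP gather, scaled by its
# singular value. Flat, coherent energy dominates this component, while
# events with trace-to-trace moveout contribute little. Illustrative
# only; not the ProMAX Eigen Stack implementation.

def eigen_model_trace(gather):
    """gather: 2D array (n_samples, n_traces). Returns one model trace."""
    u, s, _ = np.linalg.svd(gather, full_matrices=False)
    return u[:, 0] * s[0]
```

A gather of identical (perfectly flat) traces returns a scaled copy of that trace, while misaligned energy is attenuated, which is the property that makes it a good pilot for residual statics.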
You first need to pick a time gate that will be used in the Eigen Stack
process:
1. Build the following flow to pick an eigen matrix time gate on NMO
corrected CDP gathers.
5. Select a secondary key of CDP, and pick a window from a data area
that has a high Signal/Noise ratio. Make sure that your window
includes the area of interest. Use MB3 inside of the Trace Display
area to select a new layer for the bottom of the window. This
display is also a good QC to check your velocities. If the CDP
gathers are not flat you may have a problem with your velocities.
The Eigen Stack process stacks flat events in a CDP gather. Events
with large trace to trace moveout will not be included in the output
Eigen Stack.
10. Edit the flow to pick autostatics horizons on your Eigen Stack.
12. From the menu bar in Trace Display, select Picking → Pick
Autostatics Horizons...
Enter smash=1 (in traces) and the gate width=300 ms and click on
OK.
Pick a horizon using MB1. This identifies the center of the time gate.
Horizons may extend across the entire dataset or cover only a
portion of the data. CDPs not included in a horizon are not included
in residual statics calculations for that horizon.
NOTE:
Autostatics horizons are picked from stacked data that has been shifted to the final
datum. The residual statics processes automatically shift these time horizons to the
processing datum, the same datum input CDP gathers are referenced to. This
process is automatic and transparent to the user.
You will be prompted to enter a new smash value and time gate for
each horizon. Notice also the new horizon is represented in the Pick
Layers window with a number in parentheses.
15. Save your autostatics horizon picks and exit Trace Display.
17. In Disk Data Input, input the NMO corrected CDP gathers.
Select Eigen Stack for the model trace dataset and select your
autostatics horizons for the Horizon file.
21. Build the following flow to calculate your residual statics using
EMC Autostat: Gauss-Seidel*:
These time shifts will be decomposed into the source (SIN), receiver
(SRF), and structure (CDP) statics.
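The surface-consistent decomposition named above can be sketched as a small least-squares problem. The names and the dense-matrix approach are illustrative only; the batch processes use iterative solvers on much larger systems:

```python
import numpy as np

# Illustrative surface-consistent decomposition of picked time shifts:
#   shift ≈ S[src] + R[rcv] + G[cdp]
# solved here as a tiny dense least-squares system. Hypothetical data
# layout; NOT the ProMAX implementation.

def decompose(shifts):
    """shifts: list of (src, rcv, cdp, dt_ms). Returns (S, R, G) arrays."""
    srcs = sorted({s for s, _, _, _ in shifts})
    rcvs = sorted({r for _, r, _, _ in shifts})
    cdps = sorted({c for _, _, c, _ in shifts})
    ns, nr = len(srcs), len(rcvs)
    A = np.zeros((len(shifts), ns + nr + len(cdps)))
    b = np.zeros(len(shifts))
    for i, (s, r, c, dt) in enumerate(shifts):
        A[i, srcs.index(s)] = 1.0            # source term
        A[i, ns + rcvs.index(r)] = 1.0       # receiver term
        A[i, ns + nr + cdps.index(c)] = 1.0  # structure term
        b[i] = dt
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:ns], x[ns:ns + nr], x[ns + nr:]
```

The system has a null space (a constant can be traded between the source, receiver, and structure terms), which is one reason the job.output ranges are worth checking after each run.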
Upon completion, click on View from the Flow menu and look at the
contents of the job.output file. You can graphically check the range
of source and receiver statics values in the database with DBTools or
XDB Database Display.
24. Edit the previous flow to use EMC Autostat: Xcor Sum*.
25. In Xcor Sum, input the correlation trace data that was output from
the External Model Correlations flow.
Upon completion, click on View from the Flow building menu and
look at the contents of the job.output file. Check the range of source
and receiver statics values.
In the database you will see SPEM0000 for both source and receiver
statics calculated using the Xcor Sum and SGEM0000 for the source
and receiver statics from Gauss-Seidel.
29. Fill in the parameters as listed above, and then execute the flow.
30. When you are finished viewing the stack in Trace Display, select
File → Exit/Continue Flow.
Chapter Summary
Can you build a Model Stack to pilot some of the statics routines?
Before DMO is applied to the data, the trace data are typically grouped into offset bins using the
Trace Binning process. Once the data is binned, processes using Kirchhoff and F-K
implementations of DMO are available to perform the prestack partial migration.
Chapter Objectives
Trace Binning requires a list of bin centers and bin increments. For off-
end shooting, the minimum offset bin increment is typically specified as
either twice the shot interval or the nominal change in offset from trace
to trace within a CDP. For symmetric split spread geometries, the shot
interval should work. For asymmetric split spreads, the data will
determine which interval is appropriate. In general the rules of thumb
for DMO offset bin spacing are:
Since the shot interval of 200 m yields a bin increment of 400 m, the
center of your first offset bin would lie midway between 400 and 500,
or 450 m. Your DMO offset bins would now be 450 +/- 200, 850 +/-
200, 1250 +/- 200, etc. See diagram below:
300 400 500 600 700 800 900 1000 1100 1200 1300 1400........
Bin centers may be based on either the signed offset or the absolute
value of the offset. If absolute value of offset is used, traces with the
same magnitude offset are combined in the same ensemble. You also
have the opportunity to vary the width of the offset bins (the bin
increment) as a function of trace offset. This may become important for
lines that were collected with a regular, but asymmetric split spread
geometry.
1. Build the following flow to view the first offset bin in your survey:
2. Use Disk Data Input to sort the data first by offset bin number (from
the Alternate List of header words) and then sort by CDP.
For this first exercise, we will only display the first offset bin, so set
the number of input ensembles/output ensemble to 1. We will use
this process later to combine more than one offset bin for display.
This process will insert a dead trace anytime the spacing between
CDPs is greater than 1.
Notice that there are very few live CDPs for this single offset bin.
Since DMO operates in the offset domain, it would be desirable to
have offset bins that contain live traces for nearly all CDPs. This is
the reason that we merge several offset bins prior to performing
DMO.
7. Compute a first guess at the bin width and the center of the first bin.
For this geometry, the average shot interval is 220 ft. so the first
guess at a DMO offset bin width (using the off-end assumption)
would be 220 * 2 = 440 ft.
The near offset of this data is 27.5 ft. and the traces are 55 ft. apart,
so the source-receiver offsets would be:
27.5 82.5 137.5 192.5 247.5 302.5 357.5 412.5 467.5 ...
(bin centers at 220, 660, ...)
For an offset bin width of 440 ft., the center of the first bin would lie
halfway between 192.5 and 247.5 or at 220 ft.
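The arithmetic above can be sketched with a small helper. The function is hypothetical, so verify the counts against your own geometry:

```python
# Hedged sketch of the DMO bin arithmetic above: width = 2 * shot
# interval (off-end rule of thumb), centers spaced one width apart.
# Hypothetical helper; check against your own geometry.

def bin_centers(first_center, width, max_offset):
    """Bin centers from first_center outward until the bin's near edge
    passes max_offset."""
    centers = []
    c = first_center
    while c - width / 2.0 <= max_offset:
        centers.append(c)
        c += width
    return centers

# For this line: width = 2 * 220 = 440 ft, first center at 220 ft,
# offsets out to 6572.5 ft:
print(len(bin_centers(220.0, 440.0, 6572.5)))  # 15
```

This reproduces the counts quoted later in the exercise: 15 absolute-offset bins with the far center at 6380 ft, and the signed-offset case (-3300 to +6380 in 440 ft steps) similarly gives 23 bins.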
8. Modify the flow to display the data with a bin width of 440 ft.
9. The offset bin spacing for this line is 55 ft, so 8 offset bins would
equal 440 ft.
Notice that most of the CDP locations are filled by live traces. This is
what we want for DMO binning.
If you were to display the data with a bin width of 220 ft. (4 offset bins)
you would see that a width of 220 ft. is adequate for the near offsets, but
too small at the farther offsets.
11. Next we will view the DMO binning parameters in the database.
First we need to transfer the AOFFSET header to the database:
Database → Edit Attribute → Apply a Function...
12. Choose the abs function, and the OFFSET attribute. Type in
AOFFSET GEOMETRY for the result attribute and then select
OK.
The first will be TRC: OFFSET, CDP, SRF and the second will be
TRC: AOFFSET, CDP, SRF. The XYGraph using AOFFSET will
look similar to the following:
Check the offset bin centers by looking at the graphs and verifying
that each offset bin is evenly populated with CDPs. Also determine
if it is appropriate to combine the traces by absolute offset or if the
negative offsets should be processed separately from the positive
offsets. A general rule is to simultaneously process like offsets.
Use the Grid tool to analyze your bins on the display. Select
Grid → Display. This will generate new icons to rotate and move
the grid, modify the cell size, and generate spider or histogram plots
of the cells. Now select Grid → Parameterise, and fill in the values
as displayed.
For this data, if OFFSET is used, with a bin increment of 440, the
near offset bin centers are -220 and +220, the far offset bin centers
are -3300 and +6380, and we have 23 bins. If AOFFSET is used, the
near offset bin center is 220, the far offset bin center is 6380, and we
have 15 offset bins.
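A quick script (again, just an illustrative sketch, not a ProMAX process) reproduces these bin counts from the bin increment and the near and far bin centers:

```python
# Verify the OFFSET (signed) and AOFFSET (absolute) bin counts above.

width = 440.0

def bin_centers(near, far, width):
    """Bin centers from near to far inclusive, stepped by the bin width."""
    n = int(round((far - near) / width)) + 1
    return [near + i * width for i in range(n)]

signed = bin_centers(-3300.0, 6380.0, width)   # OFFSET: signed offsets
absolute = bin_centers(220.0, 6380.0, width)   # AOFFSET: absolute offsets

print(len(signed), len(absolute))   # 23 and 15 bins
```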
Notice that the near offsets would only need a bin width of 220 ft.
for continuous CDP coverage, but the far offsets need a wider bin.
From this display you can zoom in and QC that each offset bin has
cells populated with continuous CDPs.
If you plan to further process the DMO gathers, you may want to set
the OFFSET header word equal to DMOOFF values.
7. Now let's QC the DMO offset bin centers in the database. Select
Database → XDB Database Display → 3D XYGraph →
TRC:AOFFSET,CDP,DMOOFF. You may have to use the mouse
button help on the bottom left of the window to help you locate
which database entry is DMOOFF, since its label will be an 8-digit
UNIX-parsed name. When it is selected, click Display.
8. From the XYGraph menu, select Color → Edit. From the color
editor menu select File → Open, and select the contrast.rgb color
file. Each DMO offset bin will now be displayed in a different
color.
9. Edit the 8.1 flow to display the results of your DMO binning:
Use pad traces to insert a dead trace whenever the spacing between
CDPs is greater than 1.
This display will show how many CDP traces exist per bin and will
also show any gaps or unpopulated CDPs in the offset plane.
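The pad-traces idea can be sketched in a few lines of plain Python. This is a toy illustration of the concept, not the ProMAX implementation:

```python
# Insert a dead (marker) trace wherever consecutive CDP numbers differ
# by more than 1, so that gaps in the offset plane show on the display.

def pad_cdps(cdps):
    """Return (cdp, status) pairs with gaps filled by 'dead' markers."""
    out = []
    for i, cdp in enumerate(cdps):
        out.append((cdp, "live"))
        if i + 1 < len(cdps):
            # Fill every missing CDP between this trace and the next
            for missing in range(cdp + 1, cdps[i + 1]):
                out.append((missing, "dead"))
    return out

padded = pad_cdps([10, 11, 14])   # CDPs 12 and 13 are padded dead
```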
12. Are the trace gaps in the DMO offset panels reasonable? If not, you
will need to adjust your offset binning parameters and re-run flow
8.2.
DMO
The DMO velocity analysis loop: NMO → DMO → Inverse NMO →
Velocity Analysis.
Re-iterate this process until the difference between input and output
velocities in velocity analysis is small.
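The NMO and inverse NMO steps in this loop are exact inverses of each other when the same velocity field is used. A sketch of the textbook hyperbolic moveout relation (not ProMAX internals; the offset and velocity values are illustrative):

```python
import math

# A reflection at zero-offset time t0 arrives at offset x at
# t(x) = sqrt(t0**2 + (x / v)**2). NMO removes the moveout t(x) - t0;
# inverse NMO adds it back before the next velocity analysis pass.

def moveout(t0, offset, velocity):
    """Moveout (s) removed by NMO and restored by inverse NMO."""
    return math.sqrt(t0 ** 2 + (offset / velocity) ** 2) - t0

t0 = 1.0       # zero-offset time, s
x = 2200.0     # source-receiver offset, ft (illustrative)
v = 8000.0     # RMS velocity, ft/s (illustrative)

dt = moveout(t0, x, v)
# NMO followed by inverse NMO is a round trip back to the input time:
t_round_trip = (t0 + dt) - dt
```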
Input the same velocity field used to NMO correct the input gathers.
6. Execute the flow, and view the common offset planes after DMO.
8. Optional: After the DMO job finishes, build a flow that sorts the
data to CDP and views the gathers.
Toggle off the first three processes in the above flow. Change the sort
to CDP in Disk Data Input, change the Trace Display ensembles per
screen to 215 and execute.
Final Stack
1. Build the next flow to stack results.
2. In Disk Data Input, input your DMO data in CDP sort order.
If you notice any large artifacts from the DMO process, it probably
resulted from a bad trace. You could either go back and kill the bad
trace, or apply an AGC prior to DMO.
Chapter Summary
In this chapter we will cover F-X Decon and Dynamic S/N Filtering as well as techniques to
subtract adjacent traces (Trace Math) and to add back (BLEND) a proportion of the original data
to the processed data.
Chapter Objectives
To further clean up and optimize the stack, some type of poststack signal
enhancement is almost always applied. This chapter explores some of
ProMAX's techniques of reducing noise and enhancing signal in
poststack data. Upon completion of this chapter you should:
Signal Enhancement
In this exercise, you will compare the results of your residual statics
stack with stacks that are processed with signal enhancement
techniques.
3. Make four copies of your stack with Reproduce traces and choose a
trace grouping of All Data.
Enter the Repeat number to pass through this portion of the flow.
Please refer to the chapter on parameter analysis if you are not
familiar with the IF-ENDIF conditional logic.
6. Output the four copies of the dataset with Disk Data Output. This
dataset will be used in the next flow.
View the 2D filtered data, and compare the stacks using the
animation tool. Which dataset looks the most mixed?
Trace Math
Trace Math will allow you to add, subtract, multiply or divide adjacent
traces, or apply a scalar to the traces. We will use this process to subtract
stacks created using different processing techniques.
2. In Disk Data Input, input the file you just created, and select the FX
Decon copy, and the Original Input copy.
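The trace subtraction and BLEND-style add-back described at the start of this chapter can be sketched conceptually in plain Python. This is not the ProMAX Trace Math implementation, and the sample values are made up:

```python
# Subtract one stack from another to see what a filter removed, and
# add a proportion of the original data back to the processed data.

def subtract(trace_a, trace_b):
    """Sample-by-sample difference of two traces of equal length."""
    return [a - b for a, b in zip(trace_a, trace_b)]

def blend(original, processed, fraction):
    """Processed trace with a proportion of the original added back."""
    return [p + fraction * o for o, p in zip(original, processed)]

original = [1.0, 2.0, 3.0, 4.0]   # made-up stack trace
fx_decon = [0.9, 1.8, 3.1, 3.9]   # made-up filtered version

difference = subtract(original, fx_decon)   # what the filter removed
blended = blend(original, fx_decon, 0.25)   # add back 25% of original
```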
Chapter Summary
Are you comfortable with F-X Decon and Dynamic S/N Filtering?
Manipulation of velocity fields is necessary for other purposes such as seismic trace inversion,
finite difference modeling and time to depth conversion.
In this chapter, we will discuss how to edit and modify velocity fields, using two different velocity
tools.
Chapter Objectives
Edit
Edit Vel
Function
To edit the velocity field, you must edit the control points that define
the velocity. A velocity control point generally consists of a vertical
group (or function) of velocity-time pairs at a certain CDP location.
To view these control points, click on the Edit Vel Function icon
and move your mouse into the velocity field.
Move the mouse pointer from location to location and watch as the
blue function changes in the edit window. You will also notice that
the function nearest the mouse pointer changes from a solid line to a
dashed line. Click MB2 near one of the locations to freeze the
function in the edit window; the function does not change as the
mouse moves across the velocity locations.
Click MB1 near a different velocity location. You should now have
a blue line and a pink line in the edit window. Click the Edit icon on
the right of the edit window, and follow the mouse button help to edit
the pink function. After editing, the velocity field can be updated by
clicking on the Update button on the top of the edit window.
5. Once your velocity field has been edited to your satisfaction, apply
a general smoothing. From the menu bar select Modify → Smooth
Velocity field. This brings up the Smoothing Parameters window.
8. After you have saved your smoothed velocity field, you may
compute and display Interval Velocities by selecting Modify →
Convert RMS to Interval Velocity → Smoothed gradients Dix
equation.
If there are large anomalies in the interval velocity field, you may
need to select Modify → Undo last change, perform more editing
on the RMS field, and then convert to interval velocities again. You
can also directly edit and smooth this interval velocity field in the
same manner as described above for the RMS velocities.
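The RMS-to-interval conversion above is based on the Dix equation. A minimal sketch of that formula (the velocity values are illustrative; ProMAX also applies gradient smoothing, which this sketch omits):

```python
import math

# Dix equation: the interval velocity between times t1 and t2 is
# Vint = sqrt((Vrms2**2 * t2 - Vrms1**2 * t1) / (t2 - t1)).
# Small time intervals or decreasing RMS velocities can make this
# blow up or go imaginary, which is why editing the RMS field first
# (as described above) matters.

def dix_interval(t1, v1, t2, v2):
    """Interval velocity between (t1, v1) and (t2, v2) RMS picks."""
    return math.sqrt((v2 ** 2 * t2 - v1 ** 2 * t1) / (t2 - t1))

# Illustrative RMS function: 8000 ft/s at 1.0 s, 9000 ft/s at 2.0 s
v_int = dix_interval(1.0, 8000.0, 2.0, 9000.0)   # roughly 9900 ft/s
```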
9. Once you are satisfied with your interval velocity field, select
File → Save table to disk and exit. This will save the Converted
Stacking Vels file to disk for use in the FD Migration.
Velocity Manipulation
2. Select Stacking (RMS) velocity for both the input and output types.
Once you have created the new velocity table, you could QC the file
by inputting the field into the Velocity Viewer/Point Editor.
2. Select Interval Velocity in time as the input and output table types.
Read in your interval velocity file, and assign an output file name.
Once you have created the new velocity table, you could QC the file
by inputting the field into the Velocity Viewer/Point Editor.
2. Select Interval Velocity in time as the input and output table types.
Chapter Summary
Chapter Objectives
PostStack Migrations
Migration Name          Category        Type  Velocity   V(x)  V(t/z)  Steep Dip  Rel Times
Memory Stolt F-K        F-K             Time  VRMS(x,t)  Poor  Poor    Fair       0.2
Phase Shift             Phase Shift     Time  VINT(t)    None  Good    Good       1.0
Steep Dip Explicit FD   FD (70 deg)     Time  VINT(x,t)  Fair  Good    Good       21.0
Time                    FD (50 deg)     Time  VINT(x,t)  Fair  Good    Fair       10.0
Reverse Time T-K        Reverse Time    Time  VINT(t)    None  Good    Good       2.5
NOTE:
These tests were run on an IBM RS/6000 Model 370 system. Your times will depend on
your specific environment, workload, dataset, and processing parameters.
Tapering
Upper edge taper: default is 2 traces
Bottom taper: default is 200 ms
Lower edge taper: default is 20 traces
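A linear ramp is the simplest way to picture these edge tapers. This sketch assumes a linear window shape, which is an assumption about the taper form, not a statement about ProMAX internals:

```python
# Edge taper weights ramping from near 0 up toward 1 over n points,
# as might be applied to the first 2 traces (upper edge) and last 20
# traces (lower edge) of a migrated section. Linear shape assumed.

def linear_taper(n):
    """n weights strictly between 0 and 1, increasing toward the data."""
    return [(i + 1) / float(n + 1) for i in range(n)]

upper = linear_taper(2)    # applied to the first 2 traces
lower = linear_taper(20)   # applied (reversed) to the last 20 traces
print(upper)
```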
Poststack Migration
At this point, you should have your best stacked dataset with statics and
velocities applied, a pre-processed input velocity parameter table
(edited, smoothed, shifted to datum), and an idea of the types of
migrations you would like to run.
Apply FK migration
In this exercise, you will run a FK migration on your data.
Set the velocity scaling factor. Normal ranges are 85-100 percent.
Apply FD Migration
1. Copy your previous flow, and add FD Migration:
Compare Migrations
1. Use your previous flow 5.3-Compare Stacks, to compare the
various migration datasets to the input stack.
Chapter Summary
[Decision diagram: choosing a geometry path. The questions asked are:
Does shot and receiver X, Y, and station information exist in the
headers, and do you want to use it? Do you want to minimize the
number of times that you have to read the data? Do I have valid trace
numbers? Depending on the answers, the path is Pre-Initialization,
Full Extraction, From Field Notes and Survey, or Partial Extraction.]

Table Diagram

Do you want to minimize the number    Yes    From Field Notes and Survey
of times to read the data?            No     Partial Extraction
Inline Geom Header Load is the main program used to assign geometry
values to individual trace headers from the OPF database files. One of
the main issues related to this geometry assignment procedure is to
define how a trace in a data file will be identified in the Trace Ordered
Parameter file. One of the options is to use a specific trace header word
called the "valid trace number". In order to utilize the "valid trace
number" we will have to spend some time discussing its origin and how
it can be used.
This means that every trace in the output data file exists in the
database and there is a one to one correspondence in all values in
the trace header to those in the database.
After a successful run each trace will also be assigned the "valid
trace number" if it was not pre-assigned using Extract Database
Files.
Inline Geom Header Load has two options:
1. to read the "valid trace number" from the input trace header, or
2. to match the trace by a unique combination of other header words.
Once a trace in a data file has been identified in the Trace OPF, the
information in all of the OPFs for that trace is copied to the trace
header.
The Extract Database Files program writes this trace header word
after it reads and counts a trace that it is entering into the TRC
database. In this case the "valid trace number" is pre-assigned.
The "valid trace number" is a unique number for every trace and is
stored in the trace header as TRACE_NO.
This trace header word continues to exist ONLY if you write a new trace
file after the extraction procedure.
writes the trace count number and SIN to the trace header
Full Extraction is used when you want to extract the shot and receiver
location and coordinate information from the incoming headers.
writes the trace count number and SIN to the trace header
IF you have run the extraction in either mode, AND written a new trace
data file, AND have not altered the number of traces in the database, you
now have valid trace numbers in the headers of the output data set
which you can use to map a trace in a data file to a trace in the database.
This mapping will be performed by Inline Geom Header Load after the
database is completed.
The extraction only partially populates the database. More work will
generally need to be done in the Spreadsheets to input the remaining
information.
After the Spreadsheets are complete, the next step would be to complete
the CDP binning procedures and then finalize the database.
With the database complete, you can continue with the next step of
loading the geometry information from the databases to the trace
headers. If you elect to address a trace by its "valid trace number",
Inline Geom Header Load performs three steps:
1) it identifies the TRACE_NO of the incoming trace and finds that trace
in the TRC database.
2) it copies the appropriate TRC order values to the trace header and
then
3) finds the shot, receiver, cdp, inline, crossline, and offset bin for that
trace. The appropriate values from those orders are then copied to the
trace headers as well.
In the second option, Inline Geom Header Load does not know exactly
which TRACE_NO it is looking for. It does know which channel and
shot to look for based on the header word(s) that you selected. Given
that this mapping is unique, the program now knows which SIN and
CHAN to look for in the TRC database. Once the entry is found, the
TRACE_NO is copied to the headers and the steps outlined in the first
option are performed.
Again, the key to the second option is that you need to identify which
shot a trace came from by a "unique" combination of header words for
that shot.
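The second option amounts to a unique lookup from a shot/channel pair to a TRACE_NO. A toy illustration with a dictionary standing in for the TRC database (the values are made up; this only shows the mapping idea):

```python
# Map a (SIN, CHAN) pair found in the trace headers to the unique
# TRACE_NO recorded in the TRC database. If two traces shared the
# same (SIN, CHAN) key, the mapping would not be unique and this
# lookup scheme would break down, which is why the header word
# combination must uniquely identify the shot.

trc_database = {
    # (SIN, CHAN): TRACE_NO   -- made-up example entries
    (1, 1): 1,
    (1, 2): 2,
    (2, 1): 3,
    (2, 2): 4,
}

def lookup_trace_no(sin, chan):
    """Return the TRACE_NO for a shot/channel pair."""
    return trc_database[(sin, chan)]

trace_no = lookup_trace_no(2, 1)
```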
This option may be appropriate for relatively small datasets which only
have FFID and CHAN in the input trace headers. This option should be
used when reading the field data and writing the data to disk for the first
time. In so doing, information such as FFID, number of shots, and
number of channels is written to the database, and is then available
when the geometry is completed. Selecting this option will also stamp the output
dataset with valid trace numbers, which allows you to process with
trace headers only and overwrite the dataset with updated geometry
from the database files. This is an important concept for the Inline Geom
Header Load process.
In the following example, you will assume that only the FFID and
recording channel number exist in the incoming trace headers. This
information will be extracted, using the perform pre-geometry database
initialization option in Extract Database Files.
SEGY Input
    Type of storage:                 Disk Image
    Enter DISK file path name:       /misc_files/2d/segy_0_value_headers
    MAXIMUM traces per ensemble:     120
    Remap SEGY header values:        NO

Extract Database Files
    Is this a 3D survey:             No
    Data Type:                       LAND
    Source index method:             FFID
    Receiver index method:           STATIONS
    Mode of operation:               OVERWRITE
    Pre-geometry extraction?:        Yes

Disk Data Output
    Output Dataset Filename:         Shots-raw data
    New, or Existing, File?:         New
    Record length to output:         0.
    Trace sample format:             16 bit
    Skip primary disk Storage?:      No
Enter the full path name to the SEGY input dataset as described by
the instructor.
This initializes the SIN and TRC domains of the Ordered Parameter
Files, stamps the dataset with valid trace numbers, and allows for the
use of overwrite mode when performing the Inline Geom Header
Load step later.
6. In Disk Data Output, enter the name for a new output file, such as
Shots-raw data.
9. Check the OPFs, verifying the number of records in the dataset, the
number of channels/record, and the FFID range.
The only OPF files that should exist are LIN, SIN, and TRC. If SRF
exists, this means that you identified traces for receivers by
coordinates. You will also find that the SRF OPF has 1 value in it.
In this sequence, we ran the Extract Database Files process in the
pre-initialization mode. Here, we will read the output data from the
pre-initialization step and identify a trace relative to its valid
trace number with respect to the database.
3. In Inline Geom Header Load, match the traces by their valid trace
numbers.
Since the traces were read and counted with Extract Database Files,
you have a valid trace number to identify a trace. You have binned
all traces; therefore, do not drop any traces. Unless you have a
problem, there is no need for verbose diagnostics.
In the Extract Database Files path, the Inline Geom Header Load
process operates on a sequential trace basis, and includes a check to
verify that the current FFID and channel information described in the
OPFs matches the FFID and channel information found on each trace of
each ensemble. The Inline Geom Header Load process will fail if these
numbers do not correspond. You must then correct the situation by
changing the geometry found in the OPFs, or possibly by changing the
input dataset attributes.
ProMAX incorporates the functionality to create supergathers in a number of analysis and quality
control processes. Examples include: Velocity Analysis, Interactive Velocity Analysis, and
Velocity Quality Control. This exercise is useful to help understand the mechanism employed in
creating supergathers in these various processes.
Create Supergather
Create Supergather and Horizontally Stack
Create Supergather
7. Execute the flow and compare your results to the original. It should
look similar to the following:
The velocity function you use is not too critical. The reason for
NMO will be clear in the next step.
After making this selection you will see a new parameter called
Secondary Key Bin Size which was previously hidden. Set this value
to 350.
Notice the difference between this display and your last. Why are
they different this time?
One observation that should jump out is that there are fewer traces
on the screen. This is due to the summation of adjacent traces
performed by the Stack portion of the Combine and Stack option.
The summation is dependent on which header word you select as a
secondary key, and by the secondary key bin size.
You might use this type of operation to reduce the amount of data
going into a Prestack Migration.
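The Combine-and-Stack behavior described above can be sketched conceptually: traces are grouped by which secondary-key (offset) bin they fall in, then summed. This is a toy model, not the ProMAX macro; the bin size of 350 matches the exercise, and the trace values are made up:

```python
from collections import defaultdict

# Group traces by offset bin (secondary key) and sum traces that
# share a bin, which is why fewer traces appear on the screen.

def bin_and_stack(traces, bin_size):
    """traces: list of (offset, samples). Sum traces sharing a bin."""
    bins = defaultdict(lambda: None)
    for offset, samples in traces:
        key = int(offset // bin_size)
        if bins[key] is None:
            bins[key] = list(samples)
        else:
            bins[key] = [a + b for a, b in zip(bins[key], samples)]
    return dict(bins)

# Offsets 100 and 300 fall in the same 350-wide bin and are summed
traces = [(100, [1, 1]), (300, [2, 2]), (400, [3, 3])]
stacked = bin_and_stack(traces, 350)
```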
CVS Analysis
A Disk Data Input step is required since it is not included within the
macro. Data should be preprocessed gathers without NMO and
should have a bandpass filter and scaling function applied. Sort to
CDP and include the range of CDPs to be stacked. For this exercise,
use all the CDPs in the line.
The display will appear in the old Stack Display tool. The last
constant velocity panel will appear along with 16 screen swap boxes
in the upper right of the display.
Move the cursor to the screen swap boxes in the upper right hand
corner of the display and use MB2 to enable picking of the CVS
panels.
NOTE:
You should see all the icons disappear except for the scroll icon and you should be
able to move your cursor into the data area. If you don't see the scroll icon and your
cursor remains in the screen swap boxes, you are not in CVS/CVM picking mode.
To correct this error, click MB1 in the screen swap boxes and then click MB2 in the
screen swap boxes. You should see the scroll icon remaining and you should be able
to move your cursor into the data area.
Move your cursor into the displayed stacked section. While holding
down MB2, move the cursor back and forth within the stacked
section. This will enable the screen swapping.
Once you have found a velocity panel that stacks an event best, use
MB1 to make a pick and to display it on-screen.
9. To finish picking, click MB1 on any of the screen swap boxes in the
upper right hand corner of the display.
NOTE:
Upon exiting the CVS Analysis display, two velocity tables are written to disk. One
file contains just the picks you made in the CVS Analysis display. The second file
is a fully interpolated velocity table based on the sparse picks you made on-screen.
The input velocity table and the output velocity table can be the same
file or a new output table can be added. The output table is
continuously updated as each new velocity function is picked.
7. Enter the number of velocity functions and number of CDPs for the
stack panels.
quickly jump to a new location on the isovel plot. The size of the box
in the isovel plot is controlled by the Horizontal and Vertical
Enlargement Factor.
NOTE:
The mouse button helps are very important in this process because they change
according to where the cursor is located on the screen.
This option appears below the semblance plot and allows the
reconfiguration of the display. Eliminate the isovel plot to allow
more room for the semblance, stack and gather.
13. Changing the velocity bounds of your fan is possible by using the
Vbound option located to the right of Pick.
14. Mute Analysis can be run at any CDP location. Click on your
analysis location.
When the Analysis Mode menu pops up, select Mute Analysis. Wait
for the computations to complete. Follow the same instructions as
picking a function, click on the Pick option and select Top Mute
from the menu. You will notice in the gather display that mute points
have already been selected. To choose your own mute, use MB1 to
select time/aoffset points. When finished, use MB2 to save the
output. Gathers and stacks are recalculated and you are prompted to
Update the Semblance. A mark is displayed on the isovel where the
analysis was done. Your mute is saved in the Parameter File menu
for Mute Gates and is automatically labeled as IVA with a time/date
stamp.
To restack your line with the new velocities, click on Action and
select Restack Line from the popup menu. The notification window
informs you that your CDPs are being restacked.
16. Exit.
When you are ready to exit IVA, use the Exit located at the bottom
of the screen. Select from the menu to either save to the database or
to abort the IVA session. Your velocity table can be found in the
Parameter Files for RMS (stacking) Velocity menu.
2. In Disk Data Input, input your raw shots with applied geometry.
100 is a bulk shift to move the trace samples away from time zero by
100 ms.
Except for the near offsets, the final LMO corrections are fairly large
negative numbers.
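The behavior described here is consistent with an LMO static of the form bulk shift minus offset over velocity. The sketch below assumes that form and an illustrative refractor velocity; neither is taken from the exercise:

```python
# LMO static (ms) = bulk shift - 1000 * offset / velocity.
# Near offsets stay slightly positive; far offsets go strongly
# negative, matching the observation above. The 10000 ft/s velocity
# is an assumed, illustrative value.

def lmo_static_ms(offset_ft, velocity_ft_per_s, bulk_shift_ms=100.0):
    """Linear moveout static in milliseconds for a given offset."""
    return bulk_shift_ms - 1000.0 * offset_ft / velocity_ft_per_s

near = lmo_static_ms(220.0, 10000.0)    # small positive shift
far = lmo_static_ms(6380.0, 10000.0)    # large negative shift
```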
Add the LMO header entry, created in the Trace Header Math, to the
previous statics.
You may find that setting this display to four panels, and limiting the
time range from 0 to 500 ms is useful.
Reference Tables
Reference Graphs
Flows and Data Summaries
Reference Tables
LIN (Line)                  Contains constant line information, such as final
                            datum, type of units, source type, total number
                            of shots.
CDP (Common Depth Point)    Contains information varying by CDP location,
                            such as CDP x,y coordinates, CDP elevation, CDP
                            fold, nearest surface location.
Migration Name          Category        Type  Velocity   V(x)  V(t/z)  Steep Dip  Rel Times
Memory Stolt F-K        F-K             Time  VRMS(x,t)  Poor  Poor    Fair       0.2
Phase Shift             Phase Shift     Time  VINT(t)    None  Good    Good       1.0
Steep Dip Explicit FD   FD (70 deg)     Time  VINT(x,t)  Fair  Good    Good       21.0
Time                    FD (50 deg)     Time  VINT(x,t)  Fair  Good    Fair       10.0
Reverse Time T-K        Reverse Time    Time  VINT(t)    None  Good    Good       2.5
To help you decide on the optimal migration for a given situation, the
above table is a summary of the poststack migrations and how they
handle changes in velocity and dip.
5) Apply the PRE NMO term NMO_STAT
ProMAX uses the above logic when applying datum statics. Refer to the
following Datum Statics Terminology graph for a further description
of the statics variables.
Reference Graphs
[Datum Statics Terminology graph: shot, receiver, and CDP positions
along the surface elevation, with the base of weathering below; the
floating datum (N_DATUM) and final datum (F_DATUM); the velocities
Vweathering and Vreplacement; and the statics terms NMO_STAT,
S_STATIC, C_STATIC, R_STATIC, and FNL_STAT.]
Database Attributes:
N_DATUM = floating datum
[Flow diagram: geometry information from UKOOA ASCII files and the
observer's field notes enters through UKOOA Import, Spreadsheet
Manual Input, and Database Import; seismic data enters through SEG-?
Input to become a ProMAX dataset. Extract Database Files and the
Geometry Spreadsheet populate the database. Inline Geom Header Load
then moves the geometry to the trace headers, either by overwriting
the trace headers in place using valid trace numbers or by writing a
new ProMAX seismic dataset.]
/lib                     lib*.a
/plot
/port
/help
    /promax              *.lok - Frame help
    /promax3d            *.help - ASCII help
    /promaxvsp
/lib/X11/app-defaults    Application window managers
/menu
    /promax              *.menu - Processes
    /promax3d
    /promaxvsp
/misc                    *_stat_math
                         *.rgb - colormaps
                         ProMax_defaults
/bin                     start-up executable
/etc                     config_file
                         product
                         install.doc
                         pvmhosts
                         qconfig
                         license.dat
/scratch
/queues
PROMAX_DATA_HOME or /Data
    /Area
        DescName                    Area subdirectory and its files
        Project
        /Line
            DescName
            17968042TVEL
            31790267TGAT            1) Parameter Table files
            36247238TMUT
            12345678CIND            Index and Map Dataset files
            12345678CMAP
            /12345678
                HDR1                2) Dataset subdirectory and
                HDR2                Header and Trace Dataset files
                TRC1
                TRC2
            /Flow1
                DescName
                TypeName            3) A Flow subdirectory and its files
                job.output
                packet.job
            /OPF.SRF                Database subdirectory and a span file
                #s0_OPF60_SRF.GEOMETRY.ELEV
Flows
Upon completion of the course your flows menu should look similar to
the above.
Datasets: Seismic
Upon completion of the course your processing should have created the
above datasets. Note how the naming convention gives clues to the
datasets' contents.
Datasets: OPF-TRC
The TRC trace database is the largest of the Ordered Parameter Files
since it contains information varying by trace, such as FB Picks, trim
statics, source-receiver offsets. Note: the format in the database table is
variable name, variable/info type, and variable description.
Datasets: OPF-SRF
Datasets: OPF-SIN
Datasets: OPF-CDP
Datasets: OPF-CHN
Datasets: OPF-OFB
The OFB (offset bin) OPF contains information varying by offset bin
number, such as surface consistent amplitude analysis results. OFB is
created when certain processes are run, such as surface consistent
amplitude analysis.
Datasets: OPF-PAT
The End
ProMAX 2D
Seismic Processing and
Analysis
I hope the class was beneficial, wlf.