By Alastair MacLeod

Resolving Multiple Pain, for VMB's Gain...

Solving the Demultiple Challenge

Depth-migrated seismic should be the default option when integrating seismic for reservoir characterisation or prospect modelling.


BUT! Remember, fellow geoscientist, that to achieve the best quality depth-migrated seismic you need clean gathers going into velocity model building. It doesn’t matter how well imaged a multiple appears after PSDM when it shouldn’t be there in the first place!


The processing geophysicist therefore needs to ensure that key pre-processing steps such as denoise, deghost and demultiple are expertly parametrised and thoroughly tested.


As a small company, RockWave guarantees you an experienced and highly skilled seismic processing team that is actively engaged and motivated by your geoscience challenges. Read on to hear how we recently solved a particularly complex multiple problem in the Dutch North Sea for our client, ONE-Dyas.

Vintage 3D seismic from the Dutch North Sea.

The vintage dataset above shows dipping Jurassic sediments just beneath the high-velocity chalk layer, which was generating strong multiple trends of all types (surface-related peg-legs and interbeds within the chalk). The image below shows the data immediately prior to the demultiple stage of the RockWave project, with five gathers corresponding to the vertical blue lines on the stack. The data have been migrated (for illustration purposes only) and flattened to base chalk.


Input to demultiple workflows, flattened to base chalk.

When presented with such a challenge, it’s natural for a client to include in their invitation to tender a processing sequence that reflects their assumptions about which tools will solve it. Those assumptions are based on the tools most commonly applied in similar geological settings. However, without actually testing them on the dataset first, how can you know whether they are the optimal tools for solving the challenge?


The figure below shows what the outcome might have been had a ‘conventional’ processing sequence, designed ahead of the project start, been followed.


Conventional demultiple workflow results, flattened to base chalk.

This dataset has undergone least-squares (LS) matching of shallow-water demultiple (SWME) models, followed by LS matching of an interbed demultiple (IMA) model. The chosen parameters were tested and optimised as if this were to be the only method employed here, and why not? These are advanced demultiple processes, appropriate for the challenge, and in the highlighted zone the workflow delivers a reasonable result to the left of the well, where a broad-bandwidth dipping primary is clearly visible. However, issues remain:

  1. Where the chalk thickens, there is still considerable uncertainty when interpreting the underlying dipping primary.

  2. In some places the SWME struggled to attenuate the second surface-related multiple bounce (red arrows on the gathers).
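For readers curious about the mechanics, the LS matching referred to above boils down to adaptive subtraction: estimate a short filter that best shapes the predicted multiple model to the recorded data, then subtract the shaped model. Below is a minimal single-trace sketch in NumPy; this is not RockWave's production code, and the function name, filter length and trace length are illustrative assumptions.

```python
import numpy as np

def ls_adaptive_subtract(data, model, filt_len=11):
    """Estimate a short matching filter f minimising ||data - f * model||_2
    (where * is convolution), then subtract the matched model.
    Illustrative 1-D trace version of least-squares adaptive subtraction."""
    n = len(data)
    pad = filt_len // 2
    m = np.pad(model, (pad, pad))
    # Columns of A are lagged copies of the multiple model,
    # so A @ f is the model convolved with the filter f.
    A = np.column_stack([m[i:i + n] for i in range(filt_len)])
    # Least-squares solve for the matching filter.
    f, *_ = np.linalg.lstsq(A, data, rcond=None)
    return data - A @ f, f
```

In practice this is done in short overlapping space-time windows so the filter can adapt along the gather, but the windowed problem is exactly this linear solve repeated per window.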


At RockWave, we understand that optimised seismic processing requires a data-driven approach. You need to put the data through a set of well-designed tests that start revealing answers about which direction to take the workflow. This is where the experience and skill of the people performing the seismic processing begin to make the difference. There are so many variables to test that, to maintain efficiency, you need to know which ones are significant.

The image below shows the result of Technical Director Nick Woodburn’s experience, skill and enthusiasm. Improvements over the 'conventional' approach were found by simultaneously LS matching both SWME models and a Tau-P decon model, enhanced by matching the multiple model in the curvelet domain. Additional gains were then achieved by following the first pass with an IMA to attenuate interbeds generated within the chalk layer. The dipping primary events are now interpretable across the full highlighted target zone, much to the delight of the ONE-Dyas geoscientists.
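The key difference from a sequential approach is that several multiple models are matched simultaneously: solving one joint least-squares problem lets each model claim only the energy it explains best, instead of an early pass distorting primaries that a later model would have handled. A hedged single-trace NumPy sketch of that idea follows (function names and the filter length are illustrative assumptions; the curvelet-domain enhancement is omitted):

```python
import numpy as np

def conv_matrix(model, filt_len):
    """Columns are lagged copies of the model, so A @ f is f convolved with model."""
    n = len(model)
    pad = filt_len // 2
    m = np.pad(model, (pad, pad))
    return np.column_stack([m[i:i + n] for i in range(filt_len)])

def joint_ls_subtract(data, models, filt_len=11):
    """Simultaneously estimate one short matching filter per multiple model
    (e.g. SWME and Tau-P decon predictions) in a single least-squares solve,
    then subtract the summed, matched models."""
    # Stack the per-model convolution matrices side by side so the
    # solver partitions the data's multiple energy between the models.
    A = np.hstack([conv_matrix(m, filt_len) for m in models])
    f, *_ = np.linalg.lstsq(A, data, rcond=None)
    return data - A @ f
```

Because all filters are estimated against the same residual, a model that predicts an event well takes it, and the others are not forced to over-fit it at the expense of primaries.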


The output from the RockWave demultiple workflow, flattened to base chalk.

To us, your project matters.


No matter the challenge, no matter the scale, no matter the budget, RockWave treat every project with the same care and attention.

Being fully independent, we have no large organisational goals or stakeholders to dictate a change of focus – it will always remain firmly on your processing challenge, as demonstrated by this example.

The images below compare the final RockWave stack with the vintage seismic. Not only is the client's target much better imaged for improved rock physics calculations, but by paying attention to the details in pre-processing, velocity model building in depth was more accurate and a superior tie of the seismic to the available well data was achieved.


To read more about ONE-Dyas' experience using RockWave, click the link below.



