Time-lapse monitoring using neural networks

Shang Huang, Daniel O. Trad

Time-lapse seismic monitoring quality is degraded by near-surface noise, weak amplitudes of reservoir changes, and poor subsurface illumination. In addition to conventional geophysical approaches, deep learning can address these challenges with high efficiency and accuracy. This project uses a stacked bidirectional long short-term memory neural network (SD-Bi-LSTM) to predict near-surface noise from baseline seismic data. Furthermore, surface multiples are included in the forward modeling to generate baseline and monitor data with expanded subsurface illumination. Results show that the SD-Bi-LSTM network can predict and mitigate noise in the monitor data. After combining the SD-Bi-LSTM prediction with surface multiples, significant noise is suppressed in the final difference between the baseline and monitor models, and the images show improved accuracy and quality.
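To illustrate the type of network described above, the following is a minimal sketch (not the authors' code) of a stacked bidirectional LSTM that maps windows of baseline traces to a per-sample near-surface noise estimate; the layer sizes, window length, optimizer, and loss are assumptions for demonstration only.

```python
# Minimal stacked bidirectional LSTM sketch for noise prediction.
# All hyperparameters below are illustrative assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

n_timesteps = 256   # assumed samples per input trace window
n_features = 1      # single-component seismic amplitude

model = models.Sequential([
    layers.Input(shape=(n_timesteps, n_features)),
    # Two stacked bidirectional LSTM layers; return_sequences=True keeps a
    # per-time-sample output so noise can be predicted at every sample.
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
    # One output value per time sample: the predicted near-surface noise.
    layers.TimeDistributed(layers.Dense(1)),
])
model.compile(optimizer="adam", loss="mse")

# Synthetic placeholder data: noisy baseline windows as input,
# a noise component as the training target (random here, for illustration).
x_train = np.random.randn(32, n_timesteps, n_features).astype("float32")
y_train = np.random.randn(32, n_timesteps, 1).astype("float32")
model.fit(x_train, y_train, epochs=1, batch_size=8, verbose=0)

# At inference, the predicted noise could be subtracted from monitor traces.
noise_pred = model.predict(x_train, verbose=0)
```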