Calibration of the orientation of borehole geophones has direct consequences for the accuracy of subsequent measurements made with these tools. Using synthetic data generated from a simple layer-cake geological model, the effects of signal-to-noise ratio, source-receiver offset, and receiver depth on this calibration were examined. A signal-to-noise ratio of 1 or better generally produced mean orientation angles within 0.5° of the true value; however, even a noise-free signal produced small errors in the calibration. Increasing offset and decreasing receiver depth were both found to improve the accuracy of azimuth calculations. The effects of the three variables were judged difficult to separate from one another, although a quantitative relationship between them and azimuth calibration would be useful to develop.
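The abstract does not specify the calibration method, but geophone azimuth is commonly estimated from the polarization of a direct P-wave arrival on the two horizontal components, e.g. via a principal-component (hodogram) analysis. A minimal sketch of that common approach, with a hypothetical true azimuth and a simple definition of signal-to-noise ratio as the ratio of signal to noise standard deviations (both assumptions, not taken from the article):

```python
import numpy as np

def estimate_azimuth(h1, h2):
    # Principal-component (hodogram) estimate of polarization azimuth
    # from two horizontal geophone components: the dominant eigenvector
    # of the 2x2 covariance matrix gives the particle-motion direction.
    cov = np.cov(np.vstack([h1, h2]))
    evals, evecs = np.linalg.eigh(cov)
    v = evecs[:, np.argmax(evals)]
    # Polarization has a 180-degree ambiguity, so fold into [0, 180).
    return np.degrees(np.arctan2(v[1], v[0])) % 180.0

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500)
wavelet = np.exp(-((t - 0.5) ** 2) / 0.002)  # simple Gaussian pulse

true_az = 30.0  # hypothetical true source azimuth, degrees
snr = 2.0       # signal std / noise std (an assumed SNR definition)
noise = rng.standard_normal((2, t.size)) * (wavelet.std() / snr)

# Project the radially polarized P-wave onto the two horizontal components.
h1 = wavelet * np.cos(np.radians(true_az)) + noise[0]
h2 = wavelet * np.sin(np.radians(true_az)) + noise[1]

est = estimate_azimuth(h1, h2)
```

Consistent with the abstract's observation, even this idealized setup recovers the azimuth only approximately once noise is present, and the residual error grows as the signal-to-noise ratio falls.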