We present a technique for dense 3D reconstruction of objects using an imaging sonar, also known as forward-looking sonar (FLS). Compared to previous methods that model the scene geometry as point clouds or volumetric grids, we represent the geometry as a neural implicit function. Additionally, given such a representation, we use a differentiable volumetric renderer that models the propagation of acoustic waves to synthesize imaging sonar measurements. We perform experiments on real and synthetic datasets and show that our algorithm reconstructs high-fidelity surface geometry from multi-view FLS images at much higher quality than was possible with previous techniques and without suffering from their associated memory overhead.
Our overall reconstruction approach comprises two main components: a neural implicit function representing the scene geometry, and a differentiable volumetric renderer that models the propagation of acoustic waves to synthesize imaging sonar measurements.
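To illustrate the rendering component, the toy sketch below renders a sonar-style (range, azimuth) image from an analytic signed distance function. It is a simplified stand-in for the paper's differentiable renderer: the SDF here is a hand-written sphere rather than a neural network, the SDF-to-density mapping is a generic logistic bump, and transmittance/occlusion along the arc is ignored for brevity. The key sonar-specific idea it does capture is that an imaging sonar collapses the elevation dimension, so each (range, azimuth) pixel integrates contributions along its entire elevation arc.

```python
import numpy as np

def sphere_sdf(p, center, radius):
    # Signed distance to a sphere: negative inside, positive outside.
    return np.linalg.norm(p - center, axis=-1) - radius

def sdf_to_density(d, beta=0.1):
    # Map signed distance to a volume density via a logistic bump peaked
    # at the zero level set; beta controls how sharp the surface is.
    s = 1.0 / (1.0 + np.exp(d / beta))
    return s * (1.0 - s) / beta

def render_sonar_image(n_range=64, n_azimuth=48, n_elev=16,
                       r_max=2.0, azimuth_fov=np.pi / 3, elev_fov=np.pi / 12):
    # For each (range, azimuth) pixel, integrate density over the elevation
    # arc: the sonar cannot resolve elevation, so every point on the arc
    # contributes to the same pixel.
    ranges = np.linspace(0.05, r_max, n_range)
    azimuths = np.linspace(-azimuth_fov / 2, azimuth_fov / 2, n_azimuth)
    elevs = np.linspace(-elev_fov / 2, elev_fov / 2, n_elev)
    img = np.zeros((n_range, n_azimuth))
    center = np.array([1.0, 0.0, 0.0])  # toy scene: one sphere in view
    for i, r in enumerate(ranges):
        for j, az in enumerate(azimuths):
            # 3D points on the elevation arc at fixed (r, az).
            pts = np.stack([r * np.cos(az) * np.cos(elevs),
                            r * np.sin(az) * np.cos(elevs),
                            r * np.sin(elevs)], axis=-1)
            dens = sdf_to_density(sphere_sdf(pts, center, 0.3))
            img[i, j] = dens.mean()
    return img
```

Because every operation is a smooth function of the SDF values, the same construction remains differentiable when the analytic SDF is replaced by a neural network, which is what allows surface geometry to be optimized directly against observed sonar images.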
We use HoloOcean, an underwater simulator, to collect datasets of objects of various shapes and sizes. Our method outputs 3D reconstructions of much higher quality than existing state-of-the-art methods.
We also evaluate our method on real-world datasets of a test structure submerged in a test tank, captured using a Sound Metrics DIDSON imaging sonar mounted on a Bluefin Hovering Autonomous Underwater Vehicle (HAUV). The sonar supports three different elevation apertures (1°, 14°, 28°). We test our method on three datasets, one per aperture, and show that our method generates superior reconstructions.
Robot and water tank.
3D Reconstructions for a Test Object Submerged Underwater.
@article{qadri2022neural,
  title={Neural Implicit Surface Reconstruction using Imaging Sonar},
  author={Qadri, Mohamad and Kaess, Michael and Gkioulekas, Ioannis},
  journal={arXiv preprint arXiv:2209.08221},
  year={2022}
}