

Surfel-Based Next Best View Planning

Monica, Riccardo; Aleotti, Jacopo
2018-01-01

Abstract

Next best view (NBV) planning is a central task for automated three-dimensional (3-D) reconstruction in robotics. The most expensive phase of NBV computation is the view simulation step, where the information gain of a large number of candidate sensor poses is estimated. Usually, information gain is related to the visibility of unknown space from the simulated viewpoint. A well-established technique is to adopt a volumetric representation of the environment and to compute the NBV from ray casting by maximizing the number of visible unknown voxels. This letter explores a novel approach for NBV planning based on a surfel representation of the environment. Surfels are oriented surface elements, such as circular disks, without explicit connectivity. A new kind of surfel is introduced to represent the frontier between empty and unknown space. Surfels are extracted during 3-D reconstruction, with minimal overhead, from a KinectFusion volumetric representation. Surfel rendering is used to generate images from each simulated sensor pose. Experiments in a real robot setup are reported. The proposed approach achieves better performance than volumetric algorithms based on ray casting implemented on a graphics processing unit (GPU), with comparable results in terms of reconstruction quality. Moreover, surfel-based NBV planning can be applied in larger environments, since a volumetric representation is limited by GPU memory.
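
To make the volumetric baseline mentioned in the abstract concrete, the following minimal Python sketch counts the unknown voxels visible from each candidate sensor pose by marching rays through a simple occupancy grid and returns the pose with the highest count. The grid layout, voxel states, ray sampling, and pose format are illustrative assumptions, not the implementation used in the paper.

import numpy as np

# Voxel states of a coarse occupancy grid (illustrative values, not the paper's).
UNKNOWN, FREE, OCCUPIED = 0, 1, 2

def count_unknown_visible(grid, origin, directions, step=0.5, max_range=50.0):
    """Count UNKNOWN voxels seen from `origin` along unit `directions`
    before each ray is blocked by an OCCUPIED voxel or leaves the grid."""
    seen = set()
    for d in directions:
        t = 0.0
        while t < max_range:
            p = origin + t * d
            idx = tuple(np.floor(p).astype(int))
            if any(i < 0 or i >= s for i, s in zip(idx, grid.shape)):
                break                      # ray left the grid
            state = grid[idx]
            if state == OCCUPIED:
                break                      # ray blocked by a known surface
            if state == UNKNOWN:
                seen.add(idx)              # unknown voxel is visible from this pose
            t += step
    return len(seen)

def next_best_view(grid, candidate_poses, sensor_rays):
    """Return the candidate (position, rotation) pose that maximizes the
    number of visible unknown voxels, used here as the information gain."""
    best_pose, best_gain = None, -1
    for position, rotation in candidate_poses:
        world_rays = (rotation @ sensor_rays.T).T   # rotate sensor rays into the world frame
        gain = count_unknown_visible(grid, np.asarray(position, dtype=float), world_rays)
        if gain > best_gain:
            best_pose, best_gain = (position, rotation), gain
    return best_pose, best_gain

if __name__ == "__main__":
    # Toy scene: the lower slice has been observed as free, the rest is unknown.
    grid = np.full((20, 20, 20), UNKNOWN, dtype=np.uint8)
    grid[:, :, :5] = FREE
    rays = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, 1.0], [-0.1, 0.0, 1.0]])
    rays /= np.linalg.norm(rays, axis=1, keepdims=True)
    poses = [(np.array([10.0, 10.0, 0.0]), np.eye(3))]
    print(next_best_view(grid, poses, rays))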
Surfel-Based Next Best View Planning / Monica, Riccardo; Aleotti, Jacopo. - In: IEEE ROBOTICS AND AUTOMATION LETTERS. - ISSN 2377-3766. - 3:4(2018), pp. 3324-3331. [10.1109/LRA.2018.2852778]
Files in this item:
File: LRA2852778.pdf
Access: open access
Type: Post-print document
License: Creative Commons
Size: 1.05 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11381/2849407
Citations
  • PMC: not available
  • Scopus: 27
  • Web of Science (ISI): 23