Paper
27 March 2024 Knowledge distillation-based point cloud registration method
Linjun Jiang, Yinghao Li, Yue Liu, Zhiyuan Dong, Mengyuan Yao, Yusong Lin
Proceedings Volume 13105, International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2023); 131052L (2024) https://doi.org/10.1117/12.3026745
Event: 3rd International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2023), 2023, Qingdao, China
Abstract
Point cloud registration aligns two point clouds captured from different spatial perspectives, a task common in computer vision and artificial intelligence. Traditional methods lack robustness and accuracy, while deep learning approaches require many parameters, making them computationally expensive and time-consuming. This study develops a compact, efficient model using knowledge distillation to address challenges such as large initial pose differences and incomplete overlap, improving registration accuracy. Experiments on the ModelNet40 dataset with noisy and partially overlapping point clouds show that the distilled small model achieves favorable registration results with fewer parameters and less training time, effectively addressing the overlap problem.
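The abstract does not specify the distillation objective used; a common starting point for compressing a large registration network into a small one is Hinton-style soft-label distillation, where the student matches the teacher's temperature-softened output distribution. The sketch below is a generic illustration of that loss (the function names, temperature value, and use of NumPy are illustrative assumptions, not details from the paper):

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T produces a softer distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between softened teacher and student distributions.
    # The T*T factor keeps gradient magnitudes comparable across temperatures
    # (as in Hinton et al.'s formulation). This is a generic sketch, not the
    # specific loss used in this paper.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q)))) * T * T
```

In practice this term would be combined with the registration task loss (e.g. a weighted sum), so the student learns both from ground truth and from the larger teacher's behavior.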
© (2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Linjun Jiang, Yinghao Li, Yue Liu, Zhiyuan Dong, Mengyuan Yao, and Yusong Lin "Knowledge distillation-based point cloud registration method", Proc. SPIE 13105, International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2023), 131052L (27 March 2024); https://doi.org/10.1117/12.3026745
KEYWORDS
Point clouds, Performance modeling, Reverse modeling, Education and training, Data modeling, Transformers, Visual process modeling