Paper
27 March 2024 Visual object tracking for mobile robots based on SAM-TRACK
Yiming Liu, Guowei Zhang
Proceedings Volume 13105, International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2023); 131052V (2024) https://doi.org/10.1117/12.3026754
Event: 3rd International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2023), 2023, Qingdao, China
Abstract
Visual object tracking with SAM-TRACK has a wide range of applications. For robot localization and navigation, a single laser sensor provides only limited information; adding a depth camera to supply visual information improves the robustness of the system. We design a visual object tracking system built on laser SLAM. Targets are identified and segmented by clicking, drawing edges, or entering text, and the target position is computed by comparing the segmentation result with the depth map. The target is also tracked in real time so that the robot can be controlled to reach the target position. Experiments show that the multiple input methods achieve effective segmentation, and a comparison with SiamRPN demonstrates that the system is more efficient.
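The step described above, comparing the segmentation result with the depth map to obtain the target position, can be illustrated with a minimal sketch. The function name and the camera intrinsics below are assumptions for illustration, not the authors' implementation: a target mask is back-projected through a pinhole model and its 3D centroid in the camera frame is returned.

```python
import numpy as np

def mask_to_target_position(mask, depth, fx, fy, cx, cy):
    """Back-project a segmented target into the camera frame.

    mask  : (H, W) boolean array from the segmenter (e.g. a SAM-Track mask)
    depth : (H, W) depth map in meters, aligned with the mask
    fx, fy, cx, cy : pinhole intrinsics of the depth camera (assumed values)

    Returns the 3D centroid (X, Y, Z) of the target in the camera frame.
    """
    v, u = np.nonzero(mask)                # pixel coordinates inside the mask
    z = depth[v, u]
    valid = z > 0                          # drop pixels with no depth reading
    u, v, z = u[valid], v[valid], z[valid]
    if z.size == 0:
        raise ValueError("No valid depth inside the target mask")
    x = (u - cx) * z / fx                  # pinhole back-projection
    y = (v - cy) * z / fy
    return np.array([x.mean(), y.mean(), z.mean()])

# Example with synthetic data and assumed RealSense-like intrinsics.
if __name__ == "__main__":
    h, w = 480, 640
    depth = np.full((h, w), 2.0)           # flat scene 2 m in front of the camera
    mask = np.zeros((h, w), dtype=bool)
    mask[200:280, 280:360] = True          # pretend the segmenter found a target here
    print(mask_to_target_position(mask, depth, fx=615.0, fy=615.0, cx=320.0, cy=240.0))
```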
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Yiming Liu and Guowei Zhang "Visual object tracking for mobile robots based on SAM-TRACK", Proc. SPIE 13105, International Conference on Computer Graphics, Artificial Intelligence, and Data Processing (ICCAID 2023), 131052V (27 March 2024); https://doi.org/10.1117/12.3026754
KEYWORDS
Image segmentation, Visualization, Mobile robots, Optical tracking, Cameras, Information visualization, Laser soldering
