Real-Time Lidar-Inertial-Visual Odometry, Object Detection, And Mapping

Samawi, Taib Izzat (2025) Real-Time Lidar-Inertial-Visual Odometry, Object Detection, And Mapping. Other thesis, Institut Teknologi Sepuluh Nopember.

Text: 5025221085-Undergraduate_Thesis.pdf - Accepted Version (2MB; restricted to repository staff only)

Abstract

Advanced autonomous mobility is limited by the latency gap between environmental perception and action execution, especially in high-precision three-dimensional (3D) object detection systems. Current 3D detection models, despite being highly accurate, impose a significant computational load and rely on non-real-time processing. This research addresses these challenges by introducing a novel edge-deployable framework for Real-Time Lidar-Inertial-Visual Odometry, Object Detection, and Mapping (RT-LIVO2DM). By designing an architecture that synergistically integrates data from LiDAR, IMU, and visual camera sensors, RT-LIVO2DM targets computational efficiency on resource-constrained embedded systems. Furthermore, the sensor integration presented by the RT-LIVO2DM framework facilitates high-precision 3D environmental mapping with color data, paving the way for further research on Level 4/5 autonomous systems. The system's performance is evaluated on the nuScenes dataset to validate its accuracy, latency, and computational load. RT-LIVO2DM achieved a 2D bounding box mAP50 of 44–77% and a 3D bounding box mAP50 of 32–47% while running at 6–10 FPS on the NVIDIA Jetson AGX Orin embedded platform, using modified versions of the nuScenes and nuImages datasets. The framework demonstrates real-time 3D perception on resource-constrained hardware through the fusion of LiDAR-Inertial Odometry with Visual-Inertial Odometry, TensorRT-optimized YOLOv11 inference, and adaptive parameter configuration. The system provides colored 3D environmental mapping with flexible single-camera and omni-camera modes, supporting both high visual fidelity and 360-degree coverage for autonomous navigation applications.
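The colored 3D mapping the abstract describes relies on a standard LiDAR-camera fusion step: projecting each LiDAR point into a camera image and sampling the pixel color at the projected location. The following NumPy sketch is not taken from the thesis; the function name, the extrinsic matrix `T_cam_lidar`, and the intrinsic matrix `K` are illustrative assumptions showing the general technique, not RT-LIVO2DM's actual implementation.

```python
import numpy as np

def colorize_points(points_lidar, T_cam_lidar, K, image):
    """Assign image colors to LiDAR points via pinhole projection.

    points_lidar : (N, 3) points in the LiDAR frame
    T_cam_lidar  : (4, 4) homogeneous transform from LiDAR to camera frame
    K            : (3, 3) camera intrinsic matrix
    image        : (H, W, 3) color image

    Returns the subset of points visible in the image and their colors.
    """
    # Transform points into the camera frame using homogeneous coordinates.
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Discard points behind the camera (non-positive depth).
    in_front = pts_cam[:, 2] > 0
    pts_cam = pts_cam[in_front]

    # Perspective projection: pixel = K @ point, then divide by depth.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]

    # Keep only projections that land inside the image bounds.
    h, w = image.shape[:2]
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    visible = (u >= 0) & (u < w) & (v >= 0) & (v < h)

    # Sample colors and map back to the original point indices.
    colors = image[v[visible], u[visible]]
    kept = np.flatnonzero(in_front)[visible]
    return points_lidar[kept], colors
```

In an omni-camera mode such as the one the abstract mentions, this projection would simply be repeated per camera with each camera's own extrinsics and intrinsics, merging the resulting colored subsets into one map.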

Item Type: Thesis (Other)
Uncontrolled Keywords: Real-Time, 3D Object Detection, 3D Mapping, Perception System, Sensor Fusion, LiDAR, IMU, Computer Vision
Subjects: T Technology > TJ Mechanical engineering and machinery > TJ211 Robotics.
Divisions: Faculty of Intelligent Electrical and Informatics Technology (ELECTICS) > Informatics Engineering > 55201-(S1) Undergraduate Thesis
Depositing User: Taib Izzat Samawi
Date Deposited: 30 Jan 2026 09:09
Last Modified: 30 Jan 2026 09:09
URI: http://repository.its.ac.id/id/eprint/131263
