Simultaneous Localization and Mapping (SLAM) in Autonomous or Self Driving System
Background
Simultaneous Localization and Mapping (SLAM) is the process by which a mobile robot carrying appropriate sensors builds a map of the environment and, at the same time, uses this map to estimate its pose (see the probabilistic formulation sketched at the end of this section)
SLAM is a challenging problem in mobile robotics and AI that has been studied for over two decades to enhance the autonomy and self-navigation of robots.
Over 25 years of research has made SLAM a practical technology with various indoor and outdoor applications.
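For reference, the underlying estimation problem has a standard probabilistic statement in the general SLAM literature (not specific to any one system): estimate the joint posterior over the robot's trajectory x_{1:t} and the map m, given all observations z_{1:t} and control inputs u_{1:t}.

```latex
% Full SLAM: joint posterior over the whole trajectory and the map
p(x_{1:t}, m \mid z_{1:t}, u_{1:t})

% Online SLAM keeps only the current pose by marginalizing out past poses
p(x_t, m \mid z_{1:t}, u_{1:t})
  = \int \!\cdots\! \int p(x_{1:t}, m \mid z_{1:t}, u_{1:t}) \, dx_1 \cdots dx_{t-1}
```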
Concept
Autonomous robots need a good understanding of their environment and accurate location tracking.
Robot location is described as a state, including position and orientation, within a map containing environmental features such as walls and landmarks (see the minimal state sketch after this section).
SLAM is crucial when no prior map is available, making it essential for robots exploring unknown environments or operating where GPS is inaccessible.
SLAM allows robots to build maps and simultaneously localize themselves without human intervention.
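To make the "state within a map" idea concrete, here is a minimal Python sketch (all names are illustrative, not from any library): a planar robot pose (x, y, θ) plus a map of point landmarks, which together form the quantities SLAM must estimate.

```python
from dataclasses import dataclass, field

@dataclass
class Pose2D:
    """Robot state: 2D position plus heading (orientation)."""
    x: float = 0.0
    y: float = 0.0
    theta: float = 0.0  # radians

@dataclass
class LandmarkMap:
    """Map as a dictionary of landmark id -> (x, y) position."""
    landmarks: dict[int, tuple[float, float]] = field(default_factory=dict)

    def add(self, lid: int, x: float, y: float) -> None:
        self.landmarks[lid] = (x, y)

# SLAM jointly estimates both of these from sensor data:
robot = Pose2D()
world = LandmarkMap()
world.add(0, 4.0, 1.5)  # e.g., a wall corner or reflective marker
```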
Principle
Radio SLAM
Radio-Based Simultaneous Localization and Mapping
Real-Time 3D LiDAR SLAM
LiDAR SLAM is
more robust and widely used in outdoor scenarios for practical applications
relies on LiDAR's active measurement of the environment's geometry rather than on texture information
includes pose-estimation odometry in the front end and global pose-graph optimization in the back end (a minimal back-end sketch follows below).
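The front-end/back-end split can be illustrated with a toy 1D pose graph (a simplified sketch under assumed values, not any particular LiDAR pipeline): odometry supplies relative constraints between consecutive poses, a loop closure adds one more constraint, and the back-end solves the resulting least-squares problem.

```python
import numpy as np

# Poses p0..p3 along a line; p0 is fixed at 0 as the anchor.
# Front-end (odometry) constraints: p[i+1] - p[i] = odom[i] (noisy).
odom = [1.05, 0.98, 1.03]          # measured steps, true value 1.0 each
# Loop-closure constraint from re-observing the start: p3 - p0 = 3.00.
loop = 3.00

# Linear system A p = b for the unknowns p1, p2, p3.
A = np.array([
    [ 1.0,  0.0,  0.0],   # p1 - p0 = odom[0]
    [-1.0,  1.0,  0.0],   # p2 - p1 = odom[1]
    [ 0.0, -1.0,  1.0],   # p3 - p2 = odom[2]
    [ 0.0,  0.0,  1.0],   # p3 - p0 = loop
])
b = np.array([odom[0], odom[1], odom[2], loop])

# Least-squares pose-graph optimization (the "back-end" step).
p, *_ = np.linalg.lstsq(A, b, rcond=None)
print("optimized poses:", np.round(p, 3))
```

The loop closure pulls the drifted odometry chain back toward a consistent trajectory; real back-ends solve the same kind of problem over full 3D poses.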
ORB-SLAM
A Versatile and Accurate Monocular SLAM System
a feature-based monocular simultaneous localization and mapping (SLAM) system that operates in real time, in small and large indoor and outdoor environments.
robust to severe motion clutter, allows wide-baseline loop closing and relocalization, and includes full automatic initialization (an ORB feature-matching sketch follows below).
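The feature-based front end can be previewed with OpenCV's ORB detector, the same feature type ORB-SLAM builds on. A minimal sketch; the image paths are placeholders you would replace with two overlapping camera frames.

```python
import cv2

# Placeholder paths: substitute two overlapping frames from your own camera.
img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)
assert img1 is not None and img2 is not None, "provide two input frames"

# Detect ORB keypoints and compute binary descriptors in each frame.
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Match descriptors with Hamming distance (appropriate for binary ORB).
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} tentative correspondences between frames")
```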
Problem
Correspondence (data association)
the process of associating currently observed landmarks with previously observed ones in the map
Association errors occur when the robot mistakenly matches a landmark to a different one observed elsewhere.
Such errors result from inaccuracies in landmark recognition, since similar features in the environment are hard to tell apart.
Imperfect movements, drift, and motion uncertainty further compound data-association ambiguity (a gating sketch follows below).
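A common mitigation is to gate candidate associations by Mahalanobis distance before accepting a match. The sketch below uses illustrative numbers, assuming a 2D measurement with innovation covariance S: an observation is accepted as a re-sighting of a mapped landmark only if it falls inside a chi-square gate.

```python
import numpy as np

def mahalanobis_gate(z, z_pred, S, gate=5.991):
    """Accept an association if the squared Mahalanobis distance of the
    innovation (z - z_pred) is inside the chi-square gate.
    5.991 is the 95% threshold for 2 degrees of freedom."""
    nu = z - z_pred                      # innovation
    d2 = nu @ np.linalg.solve(S, nu)     # nu^T S^-1 nu
    return d2 <= gate, d2

# Illustrative values: predicted vs. observed landmark position.
z_pred = np.array([4.0, 1.5])
z_obs  = np.array([4.2, 1.4])
S = np.diag([0.05, 0.05])                # assumed innovation covariance

accepted, d2 = mahalanobis_gate(z_obs, z_pred, S)
print(f"d^2 = {d2:.2f}, associate: {accepted}")
```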
Uncertainty
location uncertainty
arises when multiple paths through the environment make it hard to determine the robot's precise location; the more such pathways exist, the greater the positional uncertainty
hardware uncertainty
results from errors and noise in the robot's components, leading to inaccurate information extraction (see the drift sketch after this list)
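How hardware noise turns into location uncertainty can be seen by simulating dead reckoning: integrating noisy odometry alone, with no map corrections, makes the position estimate spread further at every step. A minimal Monte Carlo sketch with made-up noise levels:

```python
import numpy as np

rng = np.random.default_rng(0)
runs, steps = 500, 50
step_noise = 0.02            # assumed odometry noise per step (std dev)

# Each run integrates the same commanded motion with independent noise.
positions = np.zeros(runs)
spread = []
for _ in range(steps):
    positions += 1.0 + rng.normal(0.0, step_noise, size=runs)
    spread.append(positions.std())

print(f"spread after  1 step : {spread[0]:.3f}")
print(f"spread after {steps} steps: {spread[-1]:.3f}  (grows ~ sqrt(steps))")
```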
Time complexity
An algorithm's time complexity is determined by the number of operations executed at runtime, which depends on the input size.
In SLAM, input size is usually approximated by the number of landmarks and the computation required for each landmark.
Efficient time complexity is crucial for large SLAM instances, since navigation, mapping, and localization must run simultaneously.
Time and computational complexity are interrelated: more landmarks mean longer execution times (see the timing sketch below).
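The landmark-count/runtime link is concrete in EKF-SLAM, for example, where each measurement update touches the full joint covariance, so cost grows roughly quadratically with the number of landmarks n. A rough timing sketch (the update here is a stand-in matrix operation, not a full filter):

```python
import time
import numpy as np

for n_landmarks in (100, 400, 1600):
    dim = 3 + 2 * n_landmarks            # robot pose + 2D landmarks
    P = np.eye(dim)                      # joint covariance
    K = np.random.rand(dim, 2)           # gain for one 2D measurement

    t0 = time.perf_counter()
    for _ in range(20):
        P = P - K @ (K.T @ P)            # O(dim^2) covariance update
    dt = time.perf_counter() - t0
    print(f"n = {n_landmarks:5d}: {dt * 1e3:7.1f} ms for 20 updates")
```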
Sensor noise / observation error
Environmental factors like changes in climate and lighting conditions can affect sensor outputs, especially in the case of visual sensors like cameras.
Sensor measurements are related to landmark states through mathematical models, and these models may themselves introduce inaccuracies (see the measurement-model sketch after this list).
Small errors in perception can accumulate over long-term navigation, potentially leading to system failure.
The use of low-quality sensors exacerbates the issue of observation errors in SLAM.
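The "mathematical model relating sensory signals to landmark states" is typically a noisy observation function. For a range-bearing sensor it is commonly written as follows (standard textbook form, not tied to any specific system):

```latex
z_t = h(x_t, m_j) + v_t, \qquad v_t \sim \mathcal{N}(0, R)

% Range-bearing example for robot pose (x, y, \theta) and landmark (m_x, m_y):
h(x_t, m_j) =
\begin{pmatrix}
\sqrt{(m_x - x)^2 + (m_y - y)^2} \\
\operatorname{atan2}(m_y - y,\; m_x - x) - \theta
\end{pmatrix}
```

Any mismatch between h and the real sensor physics, plus the noise v_t, is exactly the observation error described above, and it compounds over long-term navigation.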