Authors: Kiziloz, Dursun Alp; Peker, Murat
Date accessioned: 2026-02-08
Date available: 2026-02-08
Date issued: 2025
ISBN: 9798331597276
DOI: https://doi.org/10.1109/ASYU67174.2025.11208308
Handle: https://hdl.handle.net/20.500.12885/5296
Conference: 2025 Innovations in Intelligent Systems and Applications Conference (ASYU 2025) -- 2025-09-10 through 2025-09-12 -- Bursa -- 214381
Title: Comparative Analysis of ORB-SLAM2 and ORB-SLAM3 Under Visual Image Degradations
Type: Conference Object
Language: English
Access: info:eu-repo/semantics/closedAccess
Keywords: image degradation; simultaneous localization and mapping (SLAM); visual SLAM
Scopus ID: 2-s2.0-105022453872

Abstract: This paper presents a comprehensive performance comparison between the ORB-SLAM2 and ORB-SLAM3 systems under various image degradation scenarios. The freiburg3_sitting_halfsphere sequence from the TUM RGB-D dataset, which includes dynamic scenes, served as the experimental baseline. Three controlled variations of this sequence were generated by independently applying Gaussian blurring, image cropping, and contrast reduction to the original dataset. The algorithms were evaluated in a controlled Docker-based environment using RGB-D inputs. The evaluation focused on several key performance metrics, including FPS, map point density per keyframe (MP/KF), Absolute Trajectory Error (ATE), and the x, y, and z components of the estimated trajectory. The findings reveal that ORB-SLAM3 generally outperformed ORB-SLAM2 in tracking accuracy and FPS in most scenarios, demonstrating its enhanced robustness, especially under blur and low-contrast conditions. However, in the cropped dataset, both systems suffered a notable drop in performance. ORB-SLAM2 exhibited slightly better localization accuracy along the x-axis in the cropped dataset, indicating that severe field-of-view reduction can diminish the advantages offered by ORB-SLAM3's architectural enhancements. This study uniquely provides an axis-wise trajectory evaluation on degraded visual data, offering insights into the robustness and limitations of modern feature-based SLAM systems in visually degraded environments. © 2025 IEEE.
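
The abstract names three independently applied degradations (Gaussian blurring, image cropping, contrast reduction) but does not state the parameters used. A minimal sketch of how such variants could be generated with OpenCV, with purely illustrative kernel size, crop fraction, and contrast factor:

```python
import cv2
import numpy as np

def degrade(img: np.ndarray, mode: str) -> np.ndarray:
    """Apply one of the three degradations described in the abstract.

    The kernel size, crop fraction, and contrast parameters below are
    illustrative guesses; the paper's exact values are not given here.
    """
    if mode == "blur":
        # Gaussian blurring with a hypothetical 7x7 kernel.
        return cv2.GaussianBlur(img, (7, 7), 0)
    if mode == "crop":
        # Central crop to 80% of the field of view, then resize back to the
        # original resolution. This is one plausible handling; the paper does
        # not specify whether frames were resized or intrinsics adjusted.
        h, w = img.shape[:2]
        dh, dw = int(0.1 * h), int(0.1 * w)
        cropped = img[dh:h - dh, dw:w - dw]
        return cv2.resize(cropped, (w, h), interpolation=cv2.INTER_LINEAR)
    if mode == "contrast":
        # Contrast reduction: scale pixel values toward mid-gray (alpha < 1).
        return cv2.convertScaleAbs(img, alpha=0.5, beta=64)
    raise ValueError(f"unknown degradation mode: {mode}")
```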
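For TUM RGB-D sequences, ATE is conventionally reported as the RMSE of position differences after a rigid (SE(3)) alignment of the estimated trajectory to the ground truth. A sketch of that standard computation, assuming time-associated position arrays; this is not necessarily the authors' exact tooling:

```python
import numpy as np

def ate_rmse(gt: np.ndarray, est: np.ndarray) -> float:
    """RMSE Absolute Trajectory Error after least-squares rigid alignment.

    gt, est: (N, 3) arrays of time-associated ground-truth and estimated
    camera positions. Mirrors the standard TUM RGB-D evaluation protocol.
    """
    # Horn/Umeyama closed-form alignment (rotation + translation, no scale).
    mu_g, mu_e = gt.mean(axis=0), est.mean(axis=0)
    H = (est - mu_e).T @ (gt - mu_g)
    U, _, Vt = np.linalg.svd(H)
    # Guard against reflections in the recovered rotation.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = mu_g - R @ mu_e
    aligned = est @ R.T + t
    err = aligned - gt
    # err[:, 0], err[:, 1], err[:, 2] give the per-axis (x, y, z) residuals
    # underlying the axis-wise comparison the abstract highlights.
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))
```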