Multisensor-Based Algorithm for Removing Blind Spots and Enhancing Positioning in Industrial Mobile Robots
Abstract
Industrial mobile robots that rely on a single standard LiDAR sensor often suffer from blind spots that can compromise worker safety, particularly in dynamic environments where robots and humans share a workspace, and sensor suites that eliminate blind spots entirely are prohibitively expensive. This paper presents a novel multisensor-based approach to addressing the critical blind spots inherent in LiDAR systems on industrial mobile robots. By fusing data from the robot's existing LiDAR sensor with additional RADAR and IMU sensors mounted on the robot, the proposed system fills coverage gaps, improves tracking accuracy, and provides robot position and velocity estimates. Because LiDAR and RADAR report overlapping information, an Extended Kalman Filter (EKF) is used to fuse their measurements into a single estimate for the robot controller, to continue estimating the state when data from one sensor is missing, and to filter noise from the IMU. Simulations on the Gazebo platform, modeling real-world human-robot interactions, demonstrate that the proposed method significantly improves tracking reliability in the presence of sensor noise and blind spots. The results demonstrate the effectiveness of the proposed method in preventing collisions and improving worker safety in automated workplaces.
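The fusion scheme described in the abstract can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the constant-velocity state model, the noise values, and the position-only sensor models are assumptions. With this linear measurement model the EKF update reduces to the standard Kalman update; a per-sensor measurement covariance lets the filter weight LiDAR and RADAR differently, and the update step is simply skipped when a sensor is blind, so the prediction step carries the estimate through the gap.

```python
import numpy as np

class FusionEKF:
    """Minimal constant-velocity filter fusing LiDAR and RADAR position
    fixes (illustrative sketch; state model and noise values assumed)."""

    def __init__(self, dt=0.1):
        self.x = np.zeros(4)          # state: [px, py, vx, vy]
        self.P = np.eye(4)            # state covariance
        self.F = np.eye(4)            # constant-velocity transition
        self.F[0, 2] = self.F[1, 3] = dt
        self.Q = np.eye(4) * 0.01     # process noise (assumed)
        self.H = np.array([[1., 0., 0., 0.],
                           [0., 1., 0., 0.]])  # both sensors measure position

    def predict(self):
        # Propagate the state through a blind interval or to the next step.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x

    def update(self, z, meas_var):
        # Fuse one position fix; skip this call when the sensor is blind.
        R = np.eye(2) * meas_var
        y = z - self.H @ self.x                    # innovation
        S = self.H @ self.P @ self.H.T + R         # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x

# Usage: target moves at 1 m/s along x; LiDAR is "blind" on odd steps,
# while the noisier RADAR keeps reporting on every step.
np.random.seed(0)
ekf = FusionEKF(dt=0.1)
for k in range(50):
    truth = np.array([0.1 * k, 0.0])
    ekf.predict()
    if k % 2 == 0:  # LiDAR fix available (accurate, intermittent)
        ekf.update(truth + np.random.normal(0, 0.05, 2), 0.05**2)
    ekf.update(truth + np.random.normal(0, 0.2, 2), 0.2**2)  # RADAR fix
print(np.round(ekf.x, 2))  # estimated [px, py, vx, vy]
```

Because both sensors share one measurement model here, fusion is just a sequence of updates with sensor-specific covariances; a real RADAR would report range and bearing, which is where the EKF's measurement linearization would come in.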