

Terrabot: An AI Terrestrial Surveillance Rover
Abstract
Terrabot is designed to enhance rescue and surveillance operations in regions where drones and other traditional monitoring methods are ineffective. The rover operates in dangerous, hard-to-reach locations such as dense woodlands, manholes, and disaster debris, where direct human engagement would be risky. At the core of the system is a Raspberry Pi, which serves as the central processing unit and synchronizes all computational and sensory operations. The rover can safely traverse difficult terrain thanks to its ultrasonic sensor, which enables it to avoid obstacles in real time. To identify individuals even in poorly lit areas, the rover uses a thermal camera whose feed is processed by a YOLO (You Only Look Once) model trained for human detection; the model overlays the video stream with information about the presence of humans. Terrabot's navigation system uses GPS data to autonomously follow pre-established routes, with the option of remote manual control when necessary. The thermal video and real-time GPS data are transmitted over Wi-Fi to a remote server, which processes and presents them on a web-based interface accessible to the operator, guaranteeing real-time monitoring and control. Its ability to carry out complex tasks in difficult environments makes the project novel and a vital tool for rescue and surveillance operations. Owing to its adaptability and robustness, the system can be used for a variety of purposes, including search-and-rescue operations, disaster assistance, and environmental monitoring, making it a reliable choice when human safety is of the utmost importance.
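
As a rough illustration of how such a pipeline might fit together, the sketch below combines ultrasonic ranging, YOLO-based person detection on the camera feed, and transmission of annotated frames to a remote server over Wi-Fi. It assumes a Raspberry Pi with RPi.GPIO, OpenCV, the Ultralytics YOLO package, and the requests library; the GPIO pin numbers, the yolov8n.pt weights, the SERVER_URL endpoint, and the stubbed GPS coordinates are illustrative assumptions rather than the authors' actual configuration.

"""
Minimal sketch of a perception-and-telemetry loop for a rover like Terrabot.
Pin numbers, weights, camera index, and server URL are assumptions for
illustration, not the authors' exact setup.
"""
import time
import cv2
import requests
import RPi.GPIO as GPIO
from ultralytics import YOLO

TRIG_PIN, ECHO_PIN = 23, 24                      # assumed ultrasonic sensor wiring
SERVER_URL = "http://192.168.1.10:5000/frame"    # assumed remote server endpoint
SAFE_DISTANCE_CM = 40                            # assumed obstacle-avoidance threshold

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG_PIN, GPIO.OUT)
GPIO.setup(ECHO_PIN, GPIO.IN)

model = YOLO("yolov8n.pt")                       # stand-in for the rover's person-detection weights
camera = cv2.VideoCapture(0)                     # thermal camera assumed at index 0


def ultrasonic_distance_cm():
    """Trigger an HC-SR04-style sensor and convert the echo time to centimetres."""
    GPIO.output(TRIG_PIN, True)
    time.sleep(0.00001)                          # 10 microsecond trigger pulse
    GPIO.output(TRIG_PIN, False)
    start = stop = time.time()
    while GPIO.input(ECHO_PIN) == 0:
        start = time.time()
    while GPIO.input(ECHO_PIN) == 1:
        stop = time.time()
    return (stop - start) * 34300 / 2            # speed of sound: 343 m/s


def detect_and_overlay(frame):
    """Run YOLO person detection and draw boxes on the thermal frame."""
    results = model(frame, verbose=False)
    for box in results[0].boxes:
        if int(box.cls[0]) == 0:                 # class 0 is 'person' in COCO-trained weights
            x1, y1, x2, y2 = map(int, box.xyxy[0])
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 0, 255), 2)
            cv2.putText(frame, f"person {float(box.conf[0]):.2f}", (x1, y1 - 5),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 1)
    return frame


try:
    while True:
        if ultrasonic_distance_cm() < SAFE_DISTANCE_CM:
            pass                                 # motor-stop / re-route logic would go here

        ok, frame = camera.read()
        if not ok:
            continue
        frame = detect_and_overlay(frame)

        # Stream the annotated frame and position to the web interface.
        _, jpeg = cv2.imencode(".jpg", frame)
        requests.post(SERVER_URL, files={"frame": jpeg.tobytes()},
                      data={"lat": "0.0", "lon": "0.0"})  # GPS values stubbed out
finally:
    camera.release()
    GPIO.cleanup()

In practice the obstacle-avoidance branch would command the motor driver and the GPS stub would be replaced by readings from the rover's receiver, but the overall flow (sense, detect, overlay, transmit) matches the architecture described in the abstract.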