
ENHANCING MOBILITY AND ACCESSIBILITY: A WHEELCHAIR WITH MULTI-MODE OPERATION

Raghu Ram Chowdary Velevela, Dr. Suresh Babu Chandanapalli

Abstract


According to the World Health Organization's World Report on Disability, approximately 15% of the global population lives with some form of disability, and 2–4% experience significant difficulties in functioning. Mobility impairment is one of the most common disabilities, often requiring assistive devices such as wheelchairs for independent movement. Over the years, various types of wheelchairs have been developed to enhance accessibility and mobility for individuals with disabilities, including joystick-controlled, voice-controlled, and head-gesture-controlled wheelchairs. Despite these advances in assistive technology, many individuals still struggle with such devices because they offer only a single mode of control. Furthermore, the high cost of advanced wheelchairs, typically between ₹80,000 and ₹1,50,000, puts them out of reach for a significant portion of the disabled population.

To address these challenges, a multi-modal control wheelchair system is proposed that improves usability, accessibility, and affordability. The proposed wheelchair integrates three distinct control modes, allowing the user to choose whichever suits their preference and physical ability: Gesture-Controlled Mode, Joystick-Controlled Mode, and Mobile-Controlled Mode. The system is built around a Raspberry Pi 4 single-board computer, which serves as the central processing unit integrating the three control modes; it was chosen for its processing capability, cost-effectiveness, and compatibility with a wide range of sensor modules. The design is intended to be affordable, easy to use, and adaptable to diverse user needs, making it a practical solution for individuals with mobility impairments. This paper presents a detailed analysis of the proposed wheelchair design, the working mechanism of each control mode, and the implementation challenges encountered. Performance evaluations and user feedback are also presented to assess the feasibility and effectiveness of the system in real-world scenarios. By introducing a multi-modal control system, this work aims to enhance mobility, promote inclusivity, and empower individuals with disabilities to navigate their surroundings with greater ease and independence.
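To make the multi-mode idea concrete, the sketch below shows one possible way the three input sources could share a single drive interface on a Raspberry Pi 4 in Python. It is a minimal illustration under stated assumptions, not the authors' implementation: the GPIO pin numbers, the command vocabulary, and the read_gesture/read_joystick/read_mobile helpers are hypothetical placeholders.

import RPi.GPIO as GPIO

# Hypothetical motor-driver input pins (e.g. an L298N H-bridge); not the authors' wiring.
LEFT_FWD, LEFT_REV, RIGHT_FWD, RIGHT_REV = 17, 27, 22, 23
PINS = [LEFT_FWD, LEFT_REV, RIGHT_FWD, RIGHT_REV]

GPIO.setmode(GPIO.BCM)
GPIO.setup(PINS, GPIO.OUT, initial=GPIO.LOW)

def drive(command):
    """Translate a high-level command into motor-driver pin states."""
    states = {
        "forward": (1, 0, 1, 0),
        "reverse": (0, 1, 0, 1),
        "left":    (0, 0, 1, 0),
        "right":   (1, 0, 0, 0),
        "stop":    (0, 0, 0, 0),
    }
    for pin, level in zip(PINS, states.get(command, states["stop"])):
        GPIO.output(pin, level)

# Placeholder readers for the three modes; each would return one of the command
# strings above (gesture: e.g. accelerometer tilt thresholds, joystick: ADC
# channels, mobile: commands received over Bluetooth or Wi-Fi).
def read_gesture():  return "stop"
def read_joystick(): return "stop"
def read_mobile():   return "stop"

READERS = {"gesture": read_gesture, "joystick": read_joystick, "mobile": read_mobile}

def control_loop(mode="joystick"):
    """Poll the selected input source and forward its commands to the motors."""
    try:
        while True:
            drive(READERS[mode]())
    finally:
        GPIO.cleanup()

With this structure, switching modes amounts to calling control_loop with a different key, which is what makes a single processing unit practical for all three inputs.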


References


World Health Organization. World Report on Disability. Geneva, Switzerland: WHO, 2011.

Chowdhury, S. M. M. H. Smart Wheelchair for Disable People. Dissertation, Jahangirnagar University, 2019.

Warad, S.; Hiremath, V.; Dhandargi, P.; Bharath, V.; Bhagavati, P. B. Speech and Flex Sensor Controlled Wheelchair for Physically Disabled People.

Nasif, S.; Khan, M. A. G. Wireless Head Gesture Controlled Wheelchair for Disable Persons. Department of EEE, Rajshahi University.

Lee, J.; Ahn, B. Real-Time Human Action Recognition with a Low-Cost RGB Camera and Mobile Robot Platform. Sensors 2020, 20, 2886.

Islam, M.M.; Sadi, M.S.; Braunl, T. Automated Walking Guide to Enhance the Mobility of Visually Impaired People. IEEE Trans. Med. Robot. Bionics 2020, 2, 485–496.

Kamal, M.M.; Bayazid, A.I.; Sadi, M.S.; Islam, M.M.; Hasan, N. Towards Developing Walking Assistants for the Visually Impaired People. In Proceedings of the 2017 IEEE Region 10 Humanitarian Technology Conference (R10-HTC), Dhaka, Bangladesh, 21–23 December 2017; pp. 238–241.

Islam, M.M.; Tayan, O.; Islam, M.R.; Islam, M.S.; Nooruddin, S.; Nomani Kabir, M.; Islam, M.R. Deep Learning Based Systems Developed for Fall Detection: A Review. IEEE Access 2020, 8, 166117–166137.

Wan, J.; Wang, Y. The Human Face Recognition Algorithm Based on the Improved Binary Morphology. In Proceedings of the 2018 2nd IEEE Advanced Information Management, Communicates, Electronic and Automation Control Conference, IMCEC 2018, Xi’an, China, 25–27 May 2018; pp. 2625–2628.

Islam, M.M.; Islam, M.R.; Islam, M.S. An Efficient Human Computer Interaction through Hand Gesture Using Deep Convolutional Neural Network. SN Comput. Sci. 2020, 1, 211.

