
Self-E

School: School of Engineering

Project Incharge: Dr. R. Rajesh Kannan

The wheelchair features an autonomous navigation system that lets the user move without any external help. A LiDAR laser sensor scans the surroundings and builds a detailed map of the space, which is sent to an Android-powered device (smartphone or tablet) through a dedicated mobile app. The user selects a destination with a simple touch on the screen, and the wheelchair then travels to that point. Self-E operates in three modes: automated, fixed-automated, and manual. In automated mode, once the map is loaded on the Android device, the user can navigate to any place on the map by touching the destination. In fixed-automated mode, touch navigation is disabled; instead, the wheelchair takes the user to a preset reference point. In manual mode, the user steers the wheelchair by tapping directional controls on the phone, which is useful in unmapped rooms.

Related Projects

Development of Amperometric Glucose Biosensors
Nutrition – Awareness Program on Micronutrient Deficiencies
Security Event Processing Acceleration using GPGPU
Development of an On-board Integrated Switched Reluctance Motor (SRM) traction drive for Electric Vehicles (EVs)
Mu-share: Multi-user Realtime Shared Access platform for Remote Experimentation