
A significant proportion of the world's population lives with some form of disability. Persons with severe speech and motor impairment (SSMI) belong to an especially isolated part of the disability spectrum. The complexity of their physical condition makes natural interaction with their environment challenging, even for activities of daily living (ADL). Advances in robotics and technology have created opportunities to design systems that enable and support persons with disabilities in everyday activities, communication, education, and recreation. This video showcases our work on an affordable, inclusive, eye-gaze-controlled human-robot interaction (HRI) system for persons with SSMI. We developed augmented reality-based video see-through interfaces that let users control a cyber-physical agent with their eye gaze. We designed a multimodal joystick controller mechanism that actuates the thumb-sticks and buttons of the physical gaming controller of any commercially available generic toy (toy car, drone, robot arm), driven by triggers (icons and buttons) that the user activates by eye gaze on the interface. We gamified the whole system to make the user experience more engaging and playful. The system has been tested with both commercially available eye trackers and a bespoke webcam-based eye tracker. Persons with SSMI can use the system to practice and train eye-gaze pointing and to operate cyber-physical agents, which can support their education and improve their employment prospects.
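To make the gaze-to-actuation pipeline concrete, the following is a minimal sketch, not the authors' implementation: a dwell-based gaze trigger on on-screen icons is mapped to servo commands that physically push the toy controller's thumb-sticks. The icon layout, dwell threshold, and serial protocol to the servo driver board are all assumptions for illustration.

```python
# Sketch: dwell-based eye-gaze triggers mapped to servo commands that
# mechanically actuate a toy's gaming controller. Icon positions, the dwell
# threshold, and the serial command format are illustrative assumptions.
import time
import serial  # pyserial, assumed link to the servo driver board

DWELL_MS = 800           # gaze must rest on an icon this long to trigger it
ICONS = {                # screen-space bounding boxes of on-screen triggers
    "forward":  (100, 100, 200, 200),
    "backward": (100, 300, 200, 400),
    "left":     (300, 100, 400, 200),
    "right":    (300, 300, 400, 400),
}
SERVO_CMD = {            # servo angles that push the thumb-stick in each direction
    "forward":  b"S1:120\n",
    "backward": b"S1:60\n",
    "left":     b"S2:60\n",
    "right":    b"S2:120\n",
}

def icon_at(x, y):
    """Return the icon name under the gaze point, or None."""
    for name, (x0, y0, x1, y1) in ICONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def run(gaze_source, port="/dev/ttyUSB0"):
    """gaze_source yields (x, y) gaze points from any eye tracker."""
    link = serial.Serial(port, 115200, timeout=0.1)
    current, since = None, None
    for x, y in gaze_source:
        name = icon_at(x, y)
        if name != current:
            current, since = name, time.monotonic()
        elif name and (time.monotonic() - since) * 1000 >= DWELL_MS:
            link.write(SERVO_CMD[name])   # actuate the physical controller
            since = time.monotonic()      # re-arm so sustained gaze repeats the command
```

A dwell-time trigger of this kind is one common way to avoid the "Midas touch" problem, where every glance would otherwise be interpreted as a command; the same loop works with either a commercial eye tracker or a webcam-based one, as long as it yields screen-space gaze points.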
The COVID-19 pandemic resulted in heavy casualties among medical professionals. This video demonstrates a low-cost, flexible robotic solution that helps medical professionals maintain a safe distance from patients while checking vitals such as temperature and oxygen saturation and visually inspecting body parts. We developed a robotic system with powered grippers and end effectors that can operate legacy medical instruments. The human-robot interface uses video see-through augmented reality for collocated operation and virtual reality for remote teleoperation. The robotic system comprises several types of modules built from mechanical linkages with cam and gear profiles, driven by servo motors, to operate the medical devices. Following a user-centric design process, we stayed in constant contact with medical professionals, demonstrated the user interface and the robotic system to doctors, and incorporated their feedback.
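As an illustration of how a servo-driven cam module might press the button of a legacy instrument on command from the teleoperation interface, here is a minimal sketch under stated assumptions; the GPIO pin, timing, and command names are hypothetical and not taken from the system described above.

```python
# Sketch: a servo-driven cam/linkage module presses the measurement button of
# a legacy medical instrument (e.g., an IR thermometer) when the remote VR/AR
# interface issues a command. Pin number, hold time, and command strings are
# illustrative assumptions.
from gpiozero import Servo
from time import sleep

TRIGGER_SERVO_PIN = 17    # GPIO pin driving the cam servo (assumed)
PRESS_TIME_S = 0.5        # how long the cam holds the button down

servo = Servo(TRIGGER_SERVO_PIN)

def press_instrument_button():
    """Rotate the cam so the linkage depresses the button, then release."""
    servo.max()           # cam rotates to the 'pressed' position
    sleep(PRESS_TIME_S)
    servo.min()           # cam returns to rest, releasing the button
    sleep(PRESS_TIME_S)

def handle_commands(command_queue):
    """command_queue yields command strings sent from the teleoperation UI."""
    for cmd in command_queue:
        if cmd == "measure_temperature":
            press_instrument_button()
```

Keeping the actuation logic this simple reflects the appeal of cam-and-linkage modules: the robot only needs to reproduce the single button press or squeeze a clinician would perform, so unmodified off-the-shelf instruments can be used.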