Dr. David McNaughton, Susmita Sanyal, Ikenna Okafor, Darshan Shah, David Sun-Chu, Nathan Scribano, Hui Xu
Computer access for disabled people is an emerging issue as the internet and connected devices become more ubiquitous in society. People who cannot move their arms and legs have an especially difficult time interacting with computers because traditional input peripherals simply do not work without arm movement. The main options for this population are eye tracking and head tracking, but current head and eye tracking cameras are expensive. Considering that disabled users are often already encumbered by medical bills, it is no surprise that they do not purchase a thousand-dollar camera just to check their email. Our proof of concept aims to provide a low-cost solution and hopefully increase the number of people who can interface with a computer. We used a combination of head and eye tracking to move a cursor. An issue arises when the program enters head tracking mode: the eye tracking data is still being reported, so we implemented a switching mechanism between the two tracking modes. Users can intuitively switch between both tracking methods and use a wide range of gestures to interface with the computer. We integrated head and eye tracking into one functioning solution, with the RealSense camera costing $150. This proof of concept shows that intelligent software can serve as an alternative to expensive hardware. We hope that the disabled population can benefit from this progress.
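The abstract does not give implementation details for the switching mechanism, but the core idea of letting only the active tracker drive the cursor can be sketched as follows. This is a minimal illustration, not the authors' code; all names (`Mode`, `CursorController`, `update`) are hypothetical.

```python
from enum import Enum, auto

class Mode(Enum):
    """The two mutually exclusive tracking modes."""
    EYE = auto()
    HEAD = auto()

class CursorController:
    """Selects which tracker's data moves the cursor.

    Both trackers keep streaming data, so without an explicit
    mode switch the two streams would fight over the cursor.
    """

    def __init__(self):
        self.mode = Mode.EYE  # assume eye tracking is the default

    def switch_mode(self):
        # Toggle between the two tracking modes (e.g. triggered
        # by a user gesture).
        self.mode = Mode.HEAD if self.mode == Mode.EYE else Mode.EYE

    def update(self, eye_pos, head_pos):
        # Only the active tracker's coordinates are used; the
        # other stream is discarded to avoid conflicting movement.
        return eye_pos if self.mode == Mode.EYE else head_pos

# Example: same sensor readings, different active mode.
ctl = CursorController()
print(ctl.update((100, 200), (50, 60)))  # eye mode -> (100, 200)
ctl.switch_mode()
print(ctl.update((100, 200), (50, 60)))  # head mode -> (50, 60)
```

In a real system the gesture recognizer would call `switch_mode`, and `update` would run once per camera frame.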