Well, it's been a busy couple of weeks, but yet another piece of amazing news has come my way. My embedded entry for this year's Imagine Cup has been selected, and I am off to Warsaw at the start of July to represent the UK.
So what's my big idea?
Well, after seeing a visually impaired person struggling in a supermarket, I thought technology could help, so I designed Senses.
It is an augmented reality system for blind and partially sighted people, incorporating visual, tactile and audio interfaces. Utilising the latest Windows Embedded, mobile and cloud technologies, the project aims to improve overall quality of life by providing a means to better perform day-to-day tasks, such as reading text, identifying objects and people, and avoiding obstacles when walking.
This would be achieved by effectively adding a second set of eyes: an external wide-focus web camera attached to the user, plus the more precise camera in a pre-existing Windows mobile device. These cameras would support tasks such as Object Recognition, Facial Recognition and Optical Character Recognition (OCR). The user would interact with the device through speech recognition via a wearable microphone, and responses would come back through text-to-speech functionality in a set of headphones.
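To make the interaction loop concrete, here is a minimal sketch of how a recognised voice command might be routed to one of the device's tasks and turned into a sentence for the headphones. Everything here is illustrative: the function names, the command words and the `scene` dictionary are my own stand-ins, and a real system would plug actual OCR and recognition engines in behind each handler.

```python
# Hypothetical sketch of the Senses command loop: a spoken phrase from the
# microphone is mapped to a task, and the result is a sentence to be read
# back through the headphones via text-to-speech. All names are illustrative.

def read_text(scene):
    # Stand-in for OCR over the current camera frame.
    return "The label says: " + scene.get("text", "nothing readable")

def identify(scene):
    # Stand-in for object/facial recognition.
    objects = scene.get("objects", [])
    if not objects:
        return "I can't identify anything nearby."
    return "I can see " + ", ".join(objects) + "."

HANDLERS = {
    "read": read_text,
    "identify": identify,
}

def handle_command(phrase, scene):
    """Route a recognised voice command to the matching task handler."""
    verb = phrase.strip().lower().split()[0]
    handler = HANDLERS.get(verb)
    if handler is None:
        return "Sorry, I didn't understand that."
    return handler(scene)

if __name__ == "__main__":
    scene = {"text": "Orange Juice, 1 litre", "objects": ["a shelf", "a trolley"]}
    print(handle_command("read this for me", scene))   # The label says: Orange Juice, 1 litre
    print(handle_command("identify", scene))           # I can see a shelf, a trolley.
```

The same dispatch idea extends naturally to the other tasks (navigation, obstacle warnings) by adding handlers to the table.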
If you are interested, check out my promotional video on YouTube.