A few days ago I took part in Jugend forscht again, but this time I was placed in an older age group and a more serious competition. I teamed up with Paul Foltin (15), and together we developed a wheelchair that can be steered by commands you give with your eye movements. We won first prize at the regional level and will take part in the next round – state level – in a month!! Wish us luck!
With this project we want to give disabled people back the freedom to move independently, even if they are totally paralyzed (locked inside their body, as ALS does to people).
We modified a webcam to work with infrared light, 3D-printed a custom case that fits onto cheap safety glasses as a frame, and used this setup to track the position of the iris in the video frames. The video frame is divided into sections, and each section corresponds to a movement command: you can trigger four different directions by looking into the corresponding zone. Control dials let you adjust the parameters of these zones.
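The zone idea above can be sketched in a few lines of Python. This is a minimal illustration, not our actual code: it assumes the frame is split into a 3×3 grid where the centre cell means "no command" and the corner cells are ignored, and the hypothetical `zone_command` helper maps a detected pupil position to a direction. In our setup, the thresholds that define these zones are what the control dials adjust.

```python
def zone_command(x, y, width, height):
    """Map a pupil position (x, y) in a frame of the given size to
    'forward', 'back', 'left', 'right', or None (no command).

    The frame is divided into a 3x3 grid; only the four edge-centre
    cells trigger a command. Grid proportions here are illustrative.
    """
    col = 0 if x < width / 3 else (2 if x > 2 * width / 3 else 1)
    row = 0 if y < height / 3 else (2 if y > 2 * height / 3 else 1)
    if (row, col) == (1, 1):
        return None            # looking at the centre: no command
    if col == 1:
        return "forward" if row == 0 else "back"
    if row == 1:
        return "left" if col == 0 else "right"
    return None                # corner zones ignored to avoid ambiguity

# For a 640x480 frame, looking up-centre triggers "forward":
print(zone_command(320, 50, 640, 480))   # → forward
```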
The processing is done in Python with OpenCV on a fast Odroid, since a Raspberry Pi is far too slow (we tried that first). An additional Arduino board handles the control dials and the two H-bridge drivers for the motors. It all sits on a board we built ourselves…
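With two independently driven wheels, steering works as a differential drive: each direction command becomes a pair of motor directions that the Arduino turns into H-bridge pin states. The sketch below (an assumption for illustration, with a hypothetical `motor_signs` helper, not our exact protocol) shows the mapping:

```python
def motor_signs(command):
    """Map a steering command to (left, right) motor directions:
    +1 = forward, -1 = reverse, 0 = stop. These are the values the
    Arduino translates into H-bridge input pin states."""
    table = {
        "forward": (1, 1),
        "back": (-1, -1),
        "left": (-1, 1),    # turn left: left motor back, right forward
        "right": (1, -1),   # turn right: the mirror image
        None: (0, 0),       # no command: both motors stopped
    }
    return table[command]

print(motor_signs("left"))   # → (-1, 1)
```

On the real board the Odroid would send these commands to the Arduino over a serial link; the exact wire protocol is a detail of the firmware.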
At the moment the wheelchair is just a model (built with the mechanics from the qfix robot platform), but for the next round we want to advance our project by constructing a real wheelchair. Tomorrow I will buy a used wheelchair, and then we'll try to attach motors to it.
I’ll try to keep you posted 🙂
You can find the PDF of our written report here (in German):