Summer holidays are great. The sun is shining, and I'll spend the next three weeks at the seaside … so I had to keep my promise.
Here is the github repository for the project:
I changed the code from the version we used in the competition and made it more responsive. If your eye is in one of the "target direction" fields, the video capture keeps running, which makes adjustments easier (previously you had to wait about 2 seconds for the direction to be confirmed). And there is now a TTS engine integrated, a speech synthesizer that announces the direction commands extracted from the eye movements by voice; these then have to be confirmed. The confirmation is still a simple switch at the moment, but I am experimenting with electrical muscle activity as a trigger [I can already use electrical muscle activity to detect left/right eye movement, but you have to "glue" electrodes to the sides of your face and neck] 🙂
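The detect → announce → confirm flow could be sketched roughly like this (a minimal sketch with hypothetical names of my own; the real code uses OpenCV for the video capture and a TTS engine such as pyttsx3 for the spoken prompt, and the trigger would be the switch or, later, the EMG signal):

```python
from dataclasses import dataclass, field
from typing import Optional, List

@dataclass
class DirectionConfirmer:
    """Hypothetical confirm-before-execute logic: the gaze detector keeps
    running every frame, a newly detected direction is announced, and the
    command is only executed once an external trigger confirms it."""
    pending: Optional[str] = None
    executed: List[str] = field(default_factory=list)

    def on_gaze(self, direction: str) -> None:
        # Called on every captured frame with the detected direction;
        # announce only when the pending direction changes.
        if direction != self.pending:
            self.pending = direction
            self.announce(direction)

    def announce(self, direction: str) -> None:
        # Stand-in for the TTS call (e.g. engine.say(...) in pyttsx3).
        print(f"Detected direction: {direction} - please confirm")

    def on_trigger(self) -> None:
        # Called when the switch (or an EMG spike) fires: execute the
        # pending command and clear it.
        if self.pending is not None:
            self.executed.append(self.pending)
            self.pending = None

c = DirectionConfirmer()
c.on_gaze("left")    # announced, but not executed yet
c.on_gaze("left")    # same direction again: no re-announcement
c.on_trigger()       # now the "left" command is executed
```

Because the gaze handler runs on every frame instead of blocking for a confirmation timeout, the user can keep adjusting their gaze freely until the trigger fires.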
The comments are in German, but I'll translate them soon.