I am working at the IRIT laboratory in Toulouse, France, to design, develop, and test a neuroprosthesis for blind people. We are in the early part of the design phase, and some choices need to be tested by modelling the neuroprosthesis and using this model to perform real-world tasks in simulated-blindness experiments. In these experiments, the object position is determined in real time from camera images by Spikenet, a visual recognition software developed at the Cerco lab. The virtual phosphenes are generated in a virtual reality helmet using both the computed object position and the gaze direction measured by an eye tracker.
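The core mapping in these experiments can be sketched roughly as follows: the object direction from the recognition software is expressed relative to the tracked gaze direction, then quantized onto a coarse simulated phosphene grid. All names, the grid size, and the field-of-view value below are illustrative assumptions, not the actual prosthesis design.

```python
# Hedged sketch of gaze-relative phosphene placement.
# GRID and FOV values are assumptions for illustration only.

GRID_W, GRID_H = 10, 10   # assumed phosphene grid resolution
FOV_DEG = 40.0            # assumed field of view covered by the grid

def phosphene_cell(obj_az, obj_el, gaze_az, gaze_el):
    """Map an object direction (azimuth/elevation, degrees) into the
    gaze-centred phosphene grid; return None if outside the field."""
    rel_az = obj_az - gaze_az   # object direction relative to gaze
    rel_el = obj_el - gaze_el
    half = FOV_DEG / 2.0
    if abs(rel_az) > half or abs(rel_el) > half:
        return None             # object outside the simulated visual field
    col = int((rel_az + half) / FOV_DEG * GRID_W)
    row = int((rel_el + half) / FOV_DEG * GRID_H)
    return min(row, GRID_H - 1), min(col, GRID_W - 1)

# An object slightly right of and below the gaze direction lights a
# phosphene just off-centre in the grid.
print(phosphene_cell(5.0, -3.0, 0.0, 0.0))   # → (4, 6)
```

In a real simulation loop, `obj_az`/`obj_el` would come from the Spikenet detection and `gaze_az`/`gaze_el` from the eye tracker at each frame, so the phosphene pattern stays anchored to the scene as the eyes move.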
The output from another set of experiments, on the construction of spatial representations in blind people, will be combined with the results obtained from the modelling and prototyping of the system. Together, they will determine the best way to represent visual information and drive the final design and first implementations of the neuroprosthesis.
I am also involved in NAVIG, a large project federating several laboratories and firms together with the city council and an institute for blind people, to create a device that will guide blind people through cities via spatialized sounds.