Proposal
There are a number of people in the lab who suffer from RSI. One
solution is to avoid typing by using a speech interface instead.
Simon Crosby uses the Dragon Dictate speech recogniser on a PC, which
sends phrases to a UNIX box where they are converted into mouse
movements and key presses by the public domain program a2x.
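Whatever representation is sent over the link (a2x commands today, or
plain English phrases as suggested below), the PC side needs some way
of delivering it to the UNIX box. The following is a minimal sketch of
a UNIX-side listener in Python; the port number and the
newline-terminated, one-phrase-per-line wire format are assumptions of
mine, not a description of the existing setup.

import socketserver

class PhraseHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Assumed wire format: one newline-terminated phrase per line.
        for raw in self.rfile:
            phrase = raw.decode("utf-8", "replace").strip()
            if phrase:
                # Hand off to the translation stage here.
                print("received:", phrase)

if __name__ == "__main__":
    # Port 5555 is arbitrary; anything both machines agree on will do.
    with socketserver.TCPServer(("", 5555), PhraseHandler) as server:
        server.serve_forever()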
In the current system, Dragon Dictate translates phrases into a
primitive command language which specifies X events. I think it would
be better if Dragon Dictate simply transmitted English phrases, which
could then be translated by the UNIX box. This would allow a better
configuration interface, and it would make it easier to adapt commands
to the active application. For example,
the phrase "exit program" might send ctrl-x ctrl-c if emacs were in
use, or alt-q if netscape were in use.
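As an illustration of the translation step, the sketch below maps
spoken phrases to key sequences according to which application has
focus. The table contents beyond the example above, and the send_keys
helper, are hypothetical; the real system would emit whatever command
language a2x expects.

import sys

# Per-application command tables: spoken phrase -> key sequence.
# The emacs and netscape bindings come from the example above; the rest
# is purely illustrative.
COMMANDS = {
    "emacs": {
        "exit program": "ctrl-x ctrl-c",
        "save file": "ctrl-x ctrl-s",
    },
    "netscape": {
        "exit program": "alt-q",
    },
}

def send_keys(sequence):
    """Stand-in for handing a key sequence on to a2x (or any X event injector)."""
    print("sending:", sequence)

def translate(phrase, active_app):
    table = COMMANDS.get(active_app, {})
    sequence = table.get(phrase.strip().lower())
    if sequence is None:
        print("no binding for %r in %s" % (phrase.strip(), active_app),
              file=sys.stderr)
    else:
        send_keys(sequence)

if __name__ == "__main__":
    # For demonstration, read phrases from stdin and assume emacs has focus.
    for line in sys.stdin:
        translate(line, active_app="emacs")

Making the interface programmable then amounts to editing these
per-application tables rather than a2x's low-level command language.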
Extensions
A simple version of this project is not difficult (hence the rating).
However, making the interface more easily programmable and context
sensitive adds further complexity. A more intelligent interface could
also be provided to allow access to menu-based programs. For example,
when using netscape it would be nice to be able to say 'options
general' to open the general preferences from the options window. This
will probably require additions to a window manager, e.g. fvwm.
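Context sensitivity also needs some way of discovering which
application currently has focus. One possibility, sketched below, is to
query the X server with the standard xprop utility; note that the
_NET_ACTIVE_WINDOW property assumes an EWMH-aware window manager, which
a stock fvwm may not provide, so this is only one option.

import re
import subprocess
from typing import Optional

def active_window_class() -> Optional[str]:
    """Return the WM_CLASS of the focused window, or None if it is unknown."""
    root = subprocess.run(["xprop", "-root", "_NET_ACTIVE_WINDOW"],
                          capture_output=True, text=True).stdout
    match = re.search(r"window id # (0x[0-9a-fA-F]+)", root)
    if not match:
        return None
    info = subprocess.run(["xprop", "-id", match.group(1), "WM_CLASS"],
                          capture_output=True, text=True).stdout
    # Typical output: WM_CLASS(STRING) = "navigator", "Netscape"
    classes = re.findall(r'"([^"]+)"', info)
    return classes[-1].lower() if classes else None

if __name__ == "__main__":
    print(active_window_class())

The class name returned here could then index directly into the
per-application command tables sketched earlier.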
Previous Work
This project has been proposed and completed before.
Special resources
None. The speech recogniser may be simulated by the student typing
phrases into another X-terminal. It would, however, obviously be nice
to test the system connected to Dragon Dictate.
Possible supervisors
I think I'll be too busy to supervise this project. However, it is an
easy project to supervise.