
3.4.5 Gesture

  A hand-gesture interface is being developed in collaboration with Dr. Thomas Huang and Dr. Rajeev Sharma of the University of Illinois. The current implementation uses two cameras to determine the position and orientation of a finger on a person's hand. The gesture information is sent to VMD as a text command, which overrides the Text tracker coordinates. If a tool is connected to this tracker, the result is identical to using one of the UNC tracker input devices.
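The pathway above could be sketched roughly as follows. This is a minimal illustration only: the command name `tracker`, the field layout, and the port are hypothetical placeholders, since the actual text-command protocol is defined by VMD's text interface rather than shown here.

```python
import socket

def format_tracker_command(x, y, z, yaw, pitch, roll):
    """Pack one finger position and orientation into a single text command.

    The command name "tracker" and the field order are hypothetical
    stand-ins for whatever VMD's text interface actually expects.
    """
    return (f"tracker {x:.3f} {y:.3f} {z:.3f} "
            f"{yaw:.1f} {pitch:.1f} {roll:.1f}\n")

def send_gesture(host, port, command):
    """Deliver one gesture update to a listening text interface."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall(command.encode("ascii"))
```

In this sketch the camera-tracking process would call `format_tracker_command` for each video frame and forward the result with `send_gesture`, so that VMD receives a steady stream of coordinate updates for the Text tracker.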

Future work will progress along two lines. In the near future we will improve the pointing interface to allow users to select molecules by pointing to them. After that we will try to recognize specific hand positions and gestures in order to develop a control language, and to synchronize these gestures with spoken commands.



Sergei Izrailev
Fri Jul 25 17:07:27 CDT 1997