This is an oldie…
The video shows my application for controlling a mobile robot at a distance using arm gestures.
It was recorded in 2002 at Tec de Monterrey Campus Cuernavaca, México.
The system is composed of a Nomad Scout II robot with a cheap video camera,
and a Silicon Graphics R5000 computer with a webcam.
Many features of the running system can be seen on the computer’s monitor.
There are three main windows: the top-left window shows the images taken with the robot’s camera;
the window on the right shows the visual tracking of the user’s right hand;
and the blue window behind the other two shows the recognition results.
For gesture recognition we use dynamic naive Bayesian classifiers, a variant of hidden Markov models that considers
a factored representation of the attributes or features that compose each observation. This representation
requires fewer iterations of the Expectation-Maximization algorithm for training while keeping recognition performance competitive.
To characterize gestures we use posture and motion features, two sets of features that are not commonly combined,
for historical reasons :S
We have shown empirically that these kinds of features are useful for recognizing similar gestures.
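To make the idea concrete, here is a minimal sketch of the key computation in a dynamic naive Bayesian classifier: the forward algorithm of an HMM in which the emission probability factors as a product over the individual features, i.e., the features are assumed independent given the hidden state. All names, array shapes, and parameter values below are illustrative assumptions, not the actual implementation from the papers.

```python
import numpy as np

def sequence_log_likelihood(obs, pi, A, B_list):
    """Log-likelihood of an observation sequence under a dynamic
    naive Bayesian classifier (illustrative sketch).

    obs    : (T, K) int array, K discrete feature values per time step
    pi     : (S,) initial hidden-state distribution
    A      : (S, S) transition matrix, A[i, j] = P(s_t = j | s_{t-1} = i)
    B_list : list of K arrays, B_list[k][s, v] = P(feature_k = v | state = s)
    """
    def emission(o_t):
        # Naive Bayes factorization: multiply per-feature likelihoods
        p = np.ones(len(pi))
        for k, v in enumerate(o_t):
            p *= B_list[k][:, v]
        return p

    # Scaled forward recursion to avoid numerical underflow
    alpha = pi * emission(obs[0])
    log_lik = 0.0
    for t in range(1, len(obs)):
        c = alpha.sum()
        log_lik += np.log(c)
        alpha = (alpha / c) @ A * emission(obs[t])
    return log_lik + np.log(alpha.sum())
```

A gesture would then be classified by evaluating this likelihood under one trained model per gesture class and picking the highest-scoring model, as is standard for HMM-based recognizers.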
More information on this work:
H. Avilés and L.E. Sucar, "Dynamic Bayesian networks for visual recognition of dynamic gestures," Journal of Intelligent and Fuzzy Systems, vol. 12, no. 3-4, 2002, pp. 243–250.

H.H. Avilés-Arriaga, L.E. Sucar, C.E. Mendoza, and B. Vargas, "Visual recognition of gestures using dynamic naive Bayesian classifiers," Proceedings of the 12th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN 2003), 31 Oct.–2 Nov. 2003, pp. 133–138.

H.H. Avilés-Arriaga, L.E. Sucar, and C.E. Mendoza, "Visual Recognition of Similar Gestures," International Conference on Pattern Recognition, 2006.
Available at: http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1699081
Any suggestions and comments are welcome.
Duration: 0:01:41