Abstract
This paper proposes a bottom-up approach to static hand gesture recognition. By extending the local orientation histogram feature, we make it applicable to the human hand, an object with very little texture. The key steps are augmenting each local orientation histogram feature vector with its relative image coordinates, and clustering the augmented vectors to obtain a compact yet descriptive representation of the hand shape. The recognition result is given by the collective voting of all local orientation histogram features extracted from the hand region of the test image; the matching score is evaluated by retrieval from a feature database built from the training hand gestures. Locality-sensitive hashing is used to reduce the computational cost of retrieval. Experimental results on labelled real-world images demonstrate the superior accuracy and efficiency of our algorithm.
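To make the pipeline concrete, the following is a minimal Python sketch (not the authors' code) of the three ingredients named above: coordinate-augmented local orientation histograms, a toy random-hyperplane locality-sensitive hash index over the training features, and collective voting over the test image's features. The patch size, number of orientation bins, hash length, and all function and class names are illustrative assumptions; the clustering step that compacts the training features (e.g. with k-means) is omitted for brevity.

```python
import numpy as np

def local_orientation_histograms(gray, patch=16, bins=8):
    """Per-patch gradient-orientation histograms, each augmented with the
    patch's relative (x, y) coordinates -- the key step in the abstract."""
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)          # unsigned orientation
    h, w = gray.shape
    feats = []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            a = ang[y:y + patch, x:x + patch].ravel()
            m = mag[y:y + patch, x:x + patch].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, np.pi), weights=m)
            hist /= (hist.sum() + 1e-8)
            # Augment the histogram with relative image coordinates.
            feats.append(np.concatenate([hist, [x / w, y / h]]))
    return np.array(feats)

class HyperplaneLSH:
    """Toy random-hyperplane LSH index over the training feature database."""
    def __init__(self, dim, n_bits=12, seed=0):
        self.planes = np.random.default_rng(seed).normal(size=(n_bits, dim))
        self.buckets = {}

    def _key(self, vec):
        return tuple((self.planes @ vec > 0).astype(int))

    def add(self, vec, gesture_label):
        self.buckets.setdefault(self._key(vec), []).append(gesture_label)

    def query(self, vec):
        return self.buckets.get(self._key(vec), [])

def recognize(test_gray, index):
    """Collective voting: every local feature from the test hand region
    retrieves candidate gesture labels; the majority label wins."""
    votes = {}
    for f in local_orientation_histograms(test_gray):
        for label in index.query(f):
            votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get) if votes else None
```

In use, one would build the index by calling `add` on the (clustered) augmented features of each training gesture image and then call `recognize` on a segmented test hand region; in the paper's setting the clustered cluster centers, rather than raw patch features, would populate the database.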