A new 2D static hand gesture colour image dataset for ASL gestures

dc.contributor.author: Barczak, A.L.C.
dc.contributor.author: Reyes, N.H.
dc.contributor.author: Abastillas, M.
dc.contributor.author: Piccio, A.
dc.contributor.author: Susnjak, T.
dc.date.accessioned: 2013-05-22T03:45:12Z
dc.date.available: 2013-05-22T03:45:12Z
dc.date.issued: 2011
dc.description.abstract: It usually takes a fusion of image processing and machine learning algorithms to build a fully functioning computer vision system for hand gesture recognition. Fortunately, the complexity of developing such a system can be alleviated by treating it as a collection of multiple sub-systems working together, so that each can be dealt with in isolation. Machine learning algorithms need to be fed thousands of exemplars (e.g. images, features) to automatically establish recognisable patterns for all possible classes (e.g. hand gestures) that apply to the problem domain. A good number of exemplars helps, but the efficacy of these exemplars also depends on the variability of illumination conditions, hand postures, angles of rotation and scaling, and on the number of volunteers from whom the hand gesture images were taken. These exemplars are usually subjected to image processing first, to reduce the presence of noise and to extract the important features from the images. These features serve as inputs to the machine learning system. The different sub-systems are then integrated to form a complete computer vision system for gesture recognition. The main contribution of this work is the production of the exemplars. We discuss how a dataset of standard American Sign Language (ASL) hand gestures, containing 2425 images from 5 individuals with variations in lighting conditions and hand postures, is generated with the aid of image processing techniques. A minor contribution is a specific feature extraction method, moment invariants, for which the computation method and the values are furnished with the dataset.
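The abstract mentions moment invariants as the feature extraction method supplied with the dataset, but the record itself does not show how they are computed. As a minimal illustrative sketch only, assuming the widely used Hu moment invariants computed with OpenCV (the paper furnishes its own computation method and values, which may differ), the extraction step could look roughly like this in Python:

import cv2
import numpy as np

def hu_moment_features(image_path):
    # Illustrative sketch, not the authors' exact procedure:
    # binarise a gesture image and return its seven log-scaled Hu moment invariants.
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # Otsu thresholding stands in for whatever segmentation the paper applies.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    moments = cv2.moments(binary)
    hu = cv2.HuMoments(moments).flatten()
    # Log scaling brings the seven invariants into a comparable numeric range.
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

# Hypothetical usage; the file name is an assumption, not taken from the dataset:
# features = hu_moment_features("asl_gesture_sample.png")

Rotation-, scale- and translation-invariant features of this kind are one common way to make a classifier robust to the posture and scaling variations the abstract describes.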
dc.identifier.citation: Barczak, A.L.C., Reyes, N.H., Abastillas, M., Piccio, A., Susnjak, T. (2011), A new 2D static hand gesture colour image dataset for ASL gestures, Research Letters in the Information and Mathematical Sciences, 15, 12-20
dc.identifier.issn: 1175-2777
dc.identifier.uri: http://hdl.handle.net/10179/4514
dc.language.iso: en
dc.publisher: Massey University
dc.subject: Hand gesture recognition
dc.subject: Machine learning
dc.subject: American Sign Language (ASL)
dc.subject: Computer vision
dc.subject: Feature extraction
dc.subject: Image processing
dc.subject: Image dataset
dc.subject: Hand image database
dc.title: A new 2D static hand gesture colour image dataset for ASL gestures
dc.type: Article
Files
Original bundle
Name: GestureDatasetRLIMS2011.pdf
Size: 1012.86 KB
Format: Adobe Portable Document Format