Gesture-based communication is gaining importance in Human-Computer Interaction, and the spread of smartphones has made new modes of communication possible. Gesture-based interfaces are increasingly popular for communicating in public places, and they offer an effective communication medium for deaf and mute people: gestures convey the thoughts of these users directly, eliminating the need for a human interpreter. Such gestures are stored in datasets, so efficient dataset design is essential. Many datasets have been developed for languages such as American Sign Language (ASL); for other sign languages, such as Pakistani Sign Language (PSL), little work has been done. This paper presents a technique for storing datasets of static and dynamic signs for sign languages other than American or British Sign Language; many datasets for those two languages are already publicly available, whereas other regional sign languages lack public datasets. PSL is taken as a case study: more than 5,000 gestures have been collected, and they will be made part of a public database as part of this research. The research is a first initiative towards building a Universal Sign Language, since every region has a different sign language. Its second focus is to explore methodologies in which a signer communicates without constraints such as data gloves or a particular background. Thirdly, the paper proposes the use of spelling-based gestures for easier communication. The proposed dataset design is therefore not affected by constraints of any kind.
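To make the abstract's two central ideas concrete, the sketch below shows one possible way a repository could distinguish static from dynamic signs and support spelling-based lookup. The paper does not specify a schema; the class and field names here (`SignEntry`, `SignDataset`, `kind`, `frames`) are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SignEntry:
    """One gesture record. Field names are hypothetical, not from the paper."""
    label: str         # the PSL letter or word the gesture denotes
    kind: str          # "static" (single pose) or "dynamic" (motion sequence)
    frames: List[str]  # image file paths: one for static, several for dynamic

@dataclass
class SignDataset:
    language: str                              # e.g. "PSL"
    entries: List[SignEntry] = field(default_factory=list)

    def add(self, entry: SignEntry) -> None:
        # Enforce the static/dynamic split the paper's repository design implies.
        if entry.kind not in ("static", "dynamic"):
            raise ValueError("kind must be 'static' or 'dynamic'")
        self.entries.append(entry)

    def spell(self, word: str) -> List[SignEntry]:
        # Spelling-based communication: map each letter of a word to its
        # stored static gesture, skipping letters with no entry.
        index = {e.label: e for e in self.entries if e.kind == "static"}
        return [index[ch] for ch in word.upper() if ch in index]

# Usage sketch
ds = SignDataset(language="PSL")
ds.add(SignEntry(label="A", kind="static", frames=["A/pose.png"]))
ds.add(SignEntry(label="B", kind="static", frames=["B/pose.png"]))
print([e.label for e in ds.spell("ab")])  # ['A', 'B']
```

A design along these lines keeps spelling-based gestures as ordinary static entries, so a fingerspelled word is just a sequence of dataset lookups rather than a separate storage mechanism.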
CITATION STYLE
Saqib, S., & Asad, S. (2017). Repository of Static and Dynamic Signs. International Journal of Advanced Computer Science and Applications, 8(11). https://doi.org/10.14569/ijacsa.2017.081113