Abstract
Indian Sign Language (ISL) is the natural language of the Indian deaf community for conveying thoughts, feelings, and ideas to other people. Many ISL dictionaries exist as collections of real human sign videos, but our aim is to build an ISL dictionary from synthetic animations performed by a virtual character, or avatar (a computer-generated cartoon), instead of human videos, because synthetic animation offers advantages over recorded video in uniformity, cost, memory usage, and adaptability. As part of this project, an English-text-to-ISL dictionary was constructed for use in a translation system. Using third-party software, the eSign Editor, English words are transcribed into the Hamburg Notation System (HamNoSys). The HamNoSys code is then converted into a SiGML script, which a 3D computer-generated avatar renders as an animated sign. The resulting dictionary contains 3051 words, making it suitable for a translation system that converts spoken or written sentences into sign language motions for use in deaf schools. Building it is a time-consuming and challenging task, but it is a vital step toward machine translation of English text into ISL with 3D animations.
1 Introduction
Sign language is a visual, gesture-based language used by deaf and mute people for conversation [1]. It consists of manual and non-manual [15] parameters. Manual parameters [18] include hand shapes, orientations, locations, and straight and circular movements. Non-manual parameters include eye gaze, shoulder movements, body movements, head nods, facial expressions, lip movements, etc. Every word has its own sign. As with spoken languages, there are numerous varieties of sign language [12]. Sign languages are not the same all over the world: they vary from region to region and country to country owing to cultural and geographical variation, historical context, environmental factors, etc., so each nation has its own sign language. For example, British Sign Language (BSL) and American Sign Language (ASL) are quite different and mutually unintelligible, even though the hearing people of the United Kingdom and the United States share the same spoken language. Sign language is a complete and natural language, and every sign language has its own signs and grammar rules corresponding to words and sentences [13]. It is estimated that more than 200 sign languages are used by deaf and mute people around the world [20].
2 Dictionaries of Sign Language
Much work has been done on the construction of sign language dictionaries around the world, covering countries including China, Korea, the United States, and Vietnam. These dictionaries were created using either real human videos or 3D computer-generated avatars. Suzuki et al. [21] constructed a multilingual sign language dictionary covering Japanese, Korean, and American sign languages; the user can control the speed of the signing motion and select the target language. Fuertes et al. [6] developed a bilingual electronic dictionary, DISLE, for Spanish Sign Language and Spanish, aimed at the hearing-impaired community; it lets users search either by sign or by Spanish word. Franc Solina et al. [19] developed a multimedia dictionary that translates Slovenian text into Slovenian Sign Language (SSL); it includes 1800 individual signs as video clips, with finger-spelling [11, 16] used for words absent from the dictionary and for personal names. This dictionary is useful not only for reference but also for teaching in special educational institutions for hearing-impaired people, and for hearing people who want to learn sign language. Cormier et al. [5] prepared the BSL SignBank dictionary, with 2528 signs as video clips, all recorded by deaf BSL signers. Martin et al. [14] constructed an ISL dictionary for disaster warnings: messages are translated into 2D avatar animations, giving deaf people disaster-domain information in sign language. It includes 600 sentences and 2000 words related to disaster messages. Goyal et al. [8] developed an ISL dictionary using synthetic animations, containing 1478 word signs as 3D avatar animations.
3 Techniques and Methods
The Hamburg Notation System (HamNoSys) was developed by Siegmund Prillwitz in 1984 at the University of Hamburg, Germany [10]. It is a phonetic transcription system, based on Stokoe notation, that can be applied to any sign language. The notation was developed for constructing SL dictionaries and for machine translation. It includes about 200 symbols covering the manual signing parameters: hand shapes, orientations, locations, and movements of the hands [22].
3.1 eSIGN Editor Software
The eSIGN software consists of two components: the eSIGN Editor and the JASigning player [2]. The eSIGN Editor was developed by the eSIGN project team to let users compose signs as HamNoSys code and convert them into SiGML scripts [24].
3.2 SiGML
SiGML (Signing Gesture Markup Language) is an XML-based language used to represent HamNoSys symbols [3]. The HamNoSys symbols for each word are described as XML tags. Tools are available to display the sign for an input HamNoSys code as animated avatars and animated videos for communication purposes.
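To make the XML-tag representation concrete, the following is a minimal sketch of how a SiGML-style document could be assembled in Python. The tag and attribute names (`sigml`, `hns_sign`, `gloss`, `hamnosys_manual`) follow the structure described above but should be checked against the actual SiGML schema; the HamNoSys symbol tags used here are illustrative.

```python
# Hedged sketch: building a minimal SiGML-style XML document.
# Tag/attribute names are assumptions based on the description above.
import xml.etree.ElementTree as ET

def make_sigml(gloss, manual_symbols):
    """Wrap a sequence of HamNoSys symbol tags in a SiGML document."""
    root = ET.Element("sigml")
    sign = ET.SubElement(root, "hns_sign", gloss=gloss)
    manual = ET.SubElement(sign, "hamnosys_manual")
    for symbol in manual_symbols:
        ET.SubElement(manual, symbol)  # one empty tag per HamNoSys symbol
    return ET.tostring(root, encoding="unicode")

print(make_sigml("show", ["hamflathand", "hamextfingeru"]))
```

Using `xml.etree.ElementTree` rather than string concatenation guarantees the resulting script is well-formed XML, which the SiGML player requires.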
3.3 Synthetic Animations
Avatars are 'virtual bodies' used to display the input [7]. HamNoSys [23] notation for sign language is readable by three-dimensional (3D) rendering software, which takes a SiGML file or XML tags as input and generates synthetic animations of the avatar for the corresponding word or text. Animation frames are supplied to the avatar in sequence for each input text; when these frames are arranged as a sequence of poses, the rendering software produces synthetic sign animations [17] matching the frame definitions. Synthetic animations can be generated in real time, greatly reducing time [9] and storage requirements.
3.4 Construction of Proposed ISL Dictionary
The proposed steps for constructing the ISL dictionary are depicted in Fig. 1.
-
English Text: The system takes English text as input. The English words can be days of the week, month names, basic phrases, dish names, etc.
-
HamNoSys Code: This module converts the English word into HamNoSys code, following the structural rules of the HamNoSys notation.
-
SiGML Code: This module converts the HamNoSys code into SiGML code, following the SiGML conversion rules.
-
Synthetic Animations: Finally, the SiGML player plays the SiGML file and generates the synthetic animation corresponding to the SiGML script.
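The four stages above can be sketched as a small pipeline. This is only an illustrative sketch: the dictionary entries, HamNoSys transcriptions, and tag names are hypothetical, and the final rendering stage is external (the SiGML player), so it is not implemented here.

```python
# Hedged sketch of the four-stage pipeline; entries are illustrative only.
HAMNOSYS_DICT = {                      # hypothetical word -> HamNoSys transcription
    "monday": "hamflathand,hammoveu",
    "happy": "hamfist,hammover",
}

def word_to_sigml(word):
    """English word -> HamNoSys lookup -> SiGML script (None if unknown)."""
    ham = HAMNOSYS_DICT.get(word.lower())          # stage 2: HamNoSys code
    if ham is None:
        return None                                # unknown word: no entry yet
    symbols = "".join(f"<{s}/>" for s in ham.split(","))
    return (f'<sigml><hns_sign gloss="{word}">'    # stage 3: SiGML wrapping
            f"<hamnosys_manual>{symbols}</hamnosys_manual></hns_sign></sigml>")

sigml = word_to_sigml("Monday")   # stage 4 would hand this script to the player
```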
4 Comparative Analysis of Current System with Existing Systems
Various systems exist for representing sign language, and much work has been done worldwide. Several earlier systems were built from static pictures, which is now old-fashioned. In the modern era of digitization and computerization, some systems have been designed using human videos, but these require considerable recording time and storage space. We designed our system using synthetic animations, which need limited storage space and execution time, although the result does not look as natural as human video (Table 1).
For accuracy, the signs were matched against ISLRTC videos and also verified by an ISL interpreter and students of deaf schools. Screenshots of some signs, such as show, keep, and happy, are shown in Table 2.
5 Result and Discussion
We developed a bilingual English-text-to-ISL dictionary using synthetic animations. Unlike spoken languages such as English, French, Hindi, Punjabi, and Tamil, sign language has no conventional written form. HamNoSys is a notation system that can be used to write any sign language and can represent both manual and non-manual [4] parameters of signs. The eSIGN Editor software is used to convert HamNoSys code into SiGML code, and the SiGML player then generates animations from the SiGML file. Currently, we have a collection of 3051 words used in daily life, of which 2748 have been transcribed into HamNoSys symbols. Personal names and unknown words are finger-spelled. The words are classified into categories such as alphabets, numbers, nouns, verbs, pronouns, interrogatives, and prepositions, as shown in Table 1.
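The finger-spelling fallback for names and out-of-vocabulary words can be sketched as follows. The file names and dictionary entries are hypothetical; the point is the dispatch logic: a known word maps to one stored sign, anything else decomposes into per-letter signs.

```python
# Hedged sketch of the finger-spelling fallback; all file names are hypothetical.
SIGN_FILES = {"happy": "happy.sigml", "show": "show.sigml"}   # illustrative entries
LETTER_FILES = {c: f"letter_{c}.sigml" for c in "abcdefghijklmnopqrstuvwxyz"}

def signs_for(word):
    """Return the list of SiGML files the player should animate for a word."""
    w = word.lower()
    if w in SIGN_FILES:
        return [SIGN_FILES[w]]                    # known word: one dictionary sign
    return [LETTER_FILES[c] for c in w if c in LETTER_FILES]  # else finger-spell

signs_for("happy")   # -> ["happy.sigml"]
signs_for("Ria")     # -> per-letter signs for r, i, a
```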
6 Conclusion
In this work, an English-words-to-ISL dictionary has been constructed for Indian signs using various resources, including ISLRTC (Indian Sign Language Research and Training Centre) dictionary videos and some videos recorded by expert ISL interpreters. Using computer-generated avatar technology is much superior to using real human video: it consumes less storage space, takes less time to translate, SiGML files can be uploaded and downloaded without delay, and the virtual character can be customised to the user's preferences. The dictionary can be used in a translation system that converts English text into sign language and can be used in deaf schools for education. In future, more signs will be added to extend this dictionary.
References
Aliwy AH, Ahmed AA (2021) Development of Arabic sign language dictionary using 3d avatar technologies. Indonesian J Electr Eng Comput Sci 21(1):609–616
Ayadi K, Elhadj YO, Ferchichi A (2018) Prototype for learning and teaching arabic sign language using 3d animations. In: 2018 International conference on intelligent autonomous systems (ICoIAS). IEEE, pp 51–57
Bangham JA, Cox S, Elliott R, Glauert JR, Marshall I, Rankov S, Wells M (2000) Virtual signing: capture, animation, storage and transmission - an overview of the ViSiCAST project. In: IEE seminar on speech and language processing for disabled and elderly people (Ref. No. 2000/025). IET, pp 6-1
Crasborn OA (2006) Nonmanual structures in sign language
Fenlon J, Cormier K, Schembri A (2015) Building BSL SignBank: the lemma dilemma revisited 1. Int J Lexicograp 28(2):169–206. https://doi.org/10.1093/ijl/ecv008
Fuertes JL, González ÁL, Mariscal G, Ruiz C (2006) Bilingual sign language dictionary. In: Miesenberger K, Klaus J, Zagler WL, Karshmer AI (eds) Computers helping people with special needs. Springer, Berlin Heidelberg, Berlin, Heidelberg, pp 599–606
Glauert J, Kennaway J, Elliott R, Theobald BJ (2004) Virtual human signing as expressive animation. In: Symposium on language, speech and gesture for expressive characters. University of Leeds, pp 98–106
Goyal L, Goyal V (2016) Development of Indian sign language dictionary using synthetic animations. Indian J Sci Technol 9(32):1–5
Goyal L, Goyal V (2017) Tutorial for deaf–teaching Punjabi alphabet using synthetic animations. In: Proceedings of the 14th international conference on natural language processing (ICON-2017), pp 172–177
Hanke T (2004) Hamnosys-representing sign language data in language resources and language processing contexts. In: LREC, vol 4, pp 1–6
Joy J, Balakrishnan K, Sreeraj M (2019) Signquiz: a quiz based tool for learning fingerspelled signs in Indian sign language using ASLR. IEEE Access 7:28363–28371
Karpov A, Kipyatkova I, Zelezny M (2016) Automatic technologies for processing spoken sign languages. Procedia Comput Sci 81:201–207
Kumar P, Kaur S (2020) Sign language generation system based on Indian sign language grammar. ACM Trans Asian Low-Resource Lang Inf Process (TALLIP) 19(4):1–26
Martin PM, Belhe S, Mudliar S, Kulkarni M, Sahasrabudhe S (2013) An Indian sign language (ISL) corpus of the domain disaster message using avatar. In: Proceedings of the third international symposium in sign language translations and technology (SLTAT-2013), pp 1–4
Mukushev M, Sabyrov A, Imashev A, Koishibay K, Kimmelman V, Sandygulova A (2020) Evaluation of manual and non-manual components for sign language recognition. In: Proceedings of the 12th language resources and evaluation conference. European Language Resources Association (ELRA)
Nanaware T, Sahasrabudhe S, Ayer N, Christo R (2018) Fingerspelling-indian sign language training tool. In: 2018 IEEE 18th international conference on advanced learning technologies (ICALT). IEEE, pp 330–334
Pina A, Cerezo E, Serón FJ (2000) Computer animation: from avatars to unrestricted autonomous actors (a survey on replication and modelling mechanisms). Comput Graph 24(2):297–311
Sanaullah M, Ahmad B, Kashif M, Safdar T, Hassan M, Hasan MH, Aziz N (2022) A real-time automatic translation of text to sign language. CMC Comput Mater Continua 70(2):2471–2488
Solina F, Krapez S, Jaklic A, Komac V (2001) Multimedia dictionary and synthesis of sign language. In: Design and management of multimedia information systems: opportunities and challenges. IGI Global, pp 268–281
Sugandhi PK, Kaur S (2018) Online multilingual dictionary using Hamburg notation for avatar-based Indian sign language generation system. Int J Cogn Lang Sci 12(8):120–127
Suzuki E, Suzuki T, Kakihana K (2006) On the web trilingual sign language dictionary to learn the foreign sign language without learning a target spoken language. In: Proceedings of the fifth international conference on language resources and evaluation (LREC’06)
Thomas G, Whitten J (2012) Learning support for students with learning difficulties in India and Australia: similarities and differences. Int Educ J Comp Persp 11(1)
Wachsmuth I, Fröhlich M (1998) Gesture and sign language in human-computer interaction: international gesture workshop, Bielefeld, Germany, 17–19 Sept 1997, Proceedings, vol 1371. Springer Science & Business Media
Zwitserlood I, Verlinden M, Ros J, Van Der Schoot S, Netherlands T (2004) Synthetic signing for the deaf: Esign. In: Proceedings of the conference and workshop on assistive technologies for vision and hearing impairment. Citeseer
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Rani, A., Goyal, V., Goyal, L. (2023). Synthetic Animations Based Dictionary for Indian Sign Language. In: Goar, V., Kuri, M., Kumar, R., Senjyu, T. (eds) Advances in Information Communication Technology and Computing. Lecture Notes in Networks and Systems, vol 628. Springer, Singapore. https://doi.org/10.1007/978-981-19-9888-1_29
Publisher Name: Springer, Singapore
Print ISBN: 978-981-19-9887-4
Online ISBN: 978-981-19-9888-1