It is language that distinguishes Homo sapiens from animals: a complex system in which smaller units combine into larger ones, into sentences, into statements. Language is spoken, written down – and it is signed. “In sign language”, says Franz Dotter, “we find all the phenomena that we know from spoken languages, just expressed visually.” Sign language is not a haphazard support tool; it is a full language in its own right, a means of communication. And yet some areas of sign language remain terra incognita. How, for instance, can you emphasise something in sign language? How are texts segmented into structures when instruments such as pitch, tone of voice and loudness are not available? “By pauses”, notes Dotter, “pauses and deliberate indicators (cues), such as blinking, palm position, changes in movements, looks, movements of the head and the body.”
Gestures – the evolutionary neighbour of sign language
This is the core of the project Segmentation and Structuring of Austrian Sign Language (ÖGS) Texts, supported by the Austrian Science Fund FWF and led by Franz Dotter at the Centre for Sign Language and Deaf Communication at the Alpen-Adria-Universität Klagenfurt. With the help of two methods for identification and analysis, the project team determined manual and non-manual elements in signed texts. “We asked both ÖGS signers and non-signers to identify segmentation units and to indicate the cues they perceived.” Both groups proved able to perceive cues given by the hands and by pauses; people with no signing skills discerned a remarkable 40% of these cues. The situation is different for non-manual cues, such as looks or head and body movements, which were understood almost exclusively by the native signers. “Sign languages have always been a means of communication, even for hearing people”, relates Dotter, “as we know from Australia and America, where signs were used for taboo words that one was not allowed to say aloud, but also for communication between different tribes.” The gestures that accompany spoken languages are in some sense an evolutionary ‘neighbour’ of sign languages: this so-called ‘body language’ of hand and body movements and facial expressions is found universally. Walking two fingers across the back of your hand to signal discreetly that it is time to go, putting a finger to your lips, hand swiping and waving, the ‘conducting’ that unconsciously accompanies the act of speaking.
Sign language – not an artificial product
It is not known exactly how the sign language of deaf people developed, explains Dotter. “Presumably it has long been used among deaf communities. From about 1770, starting in France, deaf people began to receive systematic education in sign language. Teaching in sign language is a product of the era of enlightenment – and even then it was recognised that no new sign needed to be invented if a deaf community already had one for something.” A caesura occurred in 1880: according to the “oralists”, the deaf were supposed to speak and not use sign language. The renaissance of sign language came about only in the 1970s and 1980s, when it was recognised as a full language. This is one of the reasons why the general public today perceives it as a recent phenomenon – and why research on it started a relatively short time ago. “It is highly interesting to gain insights into the development of a language”, is Dotter’s comment on the situation.
Non-manual cues – instruments for coding special information
For the project the researchers used dialogues and monologues, short stories, jokes, free narratives, trains of thought and summaries. While the manual signs are discernible and interpretable for many non-signers, non-manual boundary signals are a very different story. They involve movements of the head, such as nodding, shaking, or tilting it up, down or to the side; movements of the upper body, including weight shifts from one leg to the other; movements of the eyebrows; changes in gaze direction; and blinking. Such non-manual cues can express negation, conditionality, hypothetical thoughts or alternatives, and temporal or causal relationships. Hence they correspond to aspects which in spoken language are mainly expressed by intonation, pitch and loudness.
Eye direction – the grammar of sign language
“A team around Andrea Lackner investigated the non-manual elements”, Dotter explains. “In order to do that, the team first had to identify all changes in gaze direction before being able to analyse them.” Whereas speakers may occasionally gaze off in another direction without that being of any significance, in sign language the eyes serve as an anchor. “If I first localise an absent person by an index (a word denoting a certain direction) in space and then look in this direction later on in the dialogue, my partners always know I am speaking about that particular person”, says Dotter as an illustration. “With speakers the eyes can give a signal; in sign language they are part of the grammar.” The results of the project are essential for sign language grammar books and for the typological comparison between spoken and sign languages. For Austrian sign language teaching, the results thus represent a significant contribution, although this is not all there is to it. Dotter: “Sign languages have the whole range, from everyday conversations and concrete terms to metaphors, abstraction and specialised academic vocabulary.” A wide field indeed.
Personal details
Franz Dotter is an associate professor emeritus at the Faculty of Cultural Studies at the Alpen-Adria-Universität Klagenfurt. Between 1996 and 2013 he served as head of the Centre for Sign Language and Deaf Communication. Andrea Lackner is an independent scholar. She was project coordinator and head of the linguistic research work of the FWF project.
Project website: http://signnonmanuals.aau.at/