A cross-national research project provides the basis for support systems that translate non-verbal communication in real time for visually impaired people. A first prototype is already available.

Blind or visually impaired people miss out on important information in everyday interactions with sighted people. This includes body language and facial expressions that indicate the other person's emotions: a smile when greeting someone, an approving nod or a look of confusion when something is unclear. An attentive interlocutor will adjust to this situation and try to convey as many of these non-verbal elements as possible by means of verbal expressions or touch when dealing with visually impaired people.

This gets more challenging when communicating in a group. Consider a professional meeting with twelve people in the room, several speakers, perhaps heated discussions, and information presented on flipcharts or other visual displays: participants with impaired vision have a hard time in such a setting. How can they perceive the gesture of a colleague pointing to a certain area of the flipchart? How can they identify an unknown speaker? How can they read the room for non-verbal signals of agreement or disagreement after someone has spoken?

In the future, support systems for visually impaired people should be able to provide such information in order to enable barrier-free participation in conferences. The development of technical aids that can interpret human actions in communication situations in real time is, however, complex and requires considerable effort. For this reason, researchers from Johannes Kepler University (JKU) Linz are working together with teams from ETH Zurich and TU Darmstadt in a project intended to provide the scientific foundations on which future applications in this field can build.

Tracking, collecting and providing access

“The challenge is to track and collect facial expressions, gestures and other non-verbal forms of communication in a conference situation and to make them accessible to visually impaired people in alternative ways by means of an intelligent system,” explains Klaus Miesenberger, principal investigator and head of the Institute for Integrated Studies at JKU Linz. The information can be communicated via an earpiece for a visually impaired person or via a Braille terminal – an output device for Braille text. Smartwatches and smartphones could also be used as aids, as their sensors can detect movements and positions in the room. In addition to audio output, they can also transmit haptic signals to their users via vibration mechanisms.
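To make the idea of multiple output channels concrete, here is a minimal Python sketch of how detected non-verbal events might be routed to the outputs mentioned above (earpiece audio, Braille terminal, haptic vibration). All event kinds, channel names and routing rules are invented for illustration and are not taken from the project's actual system.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Channel(Enum):
    AUDIO = auto()    # spoken output through an earpiece
    BRAILLE = auto()  # Braille terminal (refreshable Braille text)
    HAPTIC = auto()   # vibration on a smartwatch or smartphone

@dataclass
class NonVerbalEvent:
    kind: str         # e.g. "nod", "smile", "pointing" (invented labels)
    actor: str        # who produced the signal
    description: str  # human-readable summary for output

def route(event: NonVerbalEvent, available: set[Channel]) -> list[tuple[Channel, str]]:
    """Pick output channels for a detected non-verbal signal."""
    out: list[tuple[Channel, str]] = []
    # Quick, unobtrusive haptic cue for short signals such as a nod.
    if Channel.HAPTIC in available and event.kind in {"nod", "pointing"}:
        out.append((Channel.HAPTIC, "short_vibration"))
    # The full description goes to audio if possible, otherwise to Braille.
    if Channel.AUDIO in available:
        out.append((Channel.AUDIO, f"{event.actor}: {event.description}"))
    elif Channel.BRAILLE in available:
        out.append((Channel.BRAILLE, f"{event.actor}: {event.description}"))
    return out
```

The point of such a split is that a brief vibration can signal *that* something happened without interrupting the conversation, while the spoken or Braille description carries the detail.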

The aim is also to give visually impaired users tools that enable them to engage in targeted gestures within the context of a meeting. They should be able to point to a person or to screen content or even to manipulate displays on a virtual flipchart. Again, smartphones and wearables can be useful in this context. The partner university in Darmstadt is also working on its own output device – a table that can move objects via a magnetic mechanism and display spatial relationships in a haptic way.
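As a rough illustration of how a pointing gesture aimed at a person or a display might be resolved from wearable sensor data, the following Python sketch maps a wrist orientation (as a smartwatch's inertial sensors could supply it) to the nearest known target in the room. The target list, positions and angular tolerance are hypothetical assumptions for the example; the project's actual sensing pipeline is not described in this article.

```python
import math

# Hypothetical targets in the meeting room, with 2D positions in metres.
TARGETS = {"flipchart": (4.0, 0.0), "colleague_A": (2.0, 3.0), "screen": (0.0, 4.0)}

def resolve_pointing(user_xy: tuple[float, float], yaw_deg: float,
                     tolerance_deg: float = 15.0) -> str | None:
    """Return the room target the user is most likely pointing at,
    given the user's position and wrist yaw angle, or None."""
    best, best_err = None, tolerance_deg
    for name, (tx, ty) in TARGETS.items():
        bearing = math.degrees(math.atan2(ty - user_xy[1], tx - user_xy[0]))
        err = abs((bearing - yaw_deg + 180) % 360 - 180)  # smallest angular difference
        if err < best_err:
            best, best_err = name, err
    return best
```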

Developing basic rule systems

Basic research in this area presents many challenges. “Non-verbal communication is a big research topic, but there are hardly any formal descriptions of it that one could follow when developing an automated analysis of facial expressions and gestures,” says Miesenberger, citing one example. The researchers work with members of their target group to define information needs and interaction possibilities and to design systems for the required data processing. An important aspect is to assess the relative level of importance of incoming information and create a user interface that ensures both an overview and quick access. “We are developing a so-called reasoner for this purpose, a complex rule system that can decide, for example, what information needs to be delivered immediately and what should be available on demand,” Miesenberger notes. 
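The following Python fragment sketches one way such a triage rule could look: an event is delivered immediately, kept available on demand, or discarded. The event categories and the confidence threshold are purely illustrative assumptions, not the project's actual reasoner rules.

```python
# Invented event categories; a real reasoner would distinguish many more.
IMMEDIATE = {"addressed_directly", "pointing_at_user", "speaker_change"}

def triage(event_kind: str, confidence: float) -> str:
    """Classify a detected event as 'immediate', 'on_demand' or 'discard'."""
    if confidence < 0.5:          # assumed threshold: drop unreliable detections
        return "discard"
    if event_kind in IMMEDIATE:   # interrupt the user right away
        return "immediate"
    return "on_demand"            # everything else waits until the user asks
```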

Joint brainstorming on the flipchart

Reinhard Koutny, a project collaborator and one of Miesenberger's doctoral students, has developed one approach for a concrete support system: it not only makes content accessible via audio output, but also allows manipulation through gestures. “Meetings often involve joint brainstorming using postings arranged on a flipchart,” Koutny explains. “Our system enables visually impaired individuals to participate in these activities too.”

A virtual flipchart on a monitor interacts with the user's smartphone. “The device serves as a sensor with which the user can scan the display. The mobile phone vibrates when you move it over a posting, for example, and a description is provided via an audio output device in your ear,” explains Koutny. “Moreover, the postings can also be moved or rotated by simply pointing at the element, pressing a mobile phone button and releasing it again when the element has reached the desired new position.” Physical objects and people in the meeting room itself can also be explored following a similar principle. In this case, a smartwatch helps to capture information and directions. The application only requires modern devices equipped with the corresponding sensor technology. A first prototype of the system is already available.
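For illustration, the interaction Koutny describes could be sketched roughly as follows in Python: hit-testing the pointer position swept by the phone against postings, giving a haptic and audio cue, and repositioning a posting on button release. All names here (Posting, scan, move, the vibrate/speak stubs) are hypothetical placeholders, not the prototype's real API.

```python
from dataclasses import dataclass

def vibrate() -> None:
    """Placeholder for the phone's vibration API."""

def speak(text: str) -> None:
    """Placeholder for text-to-speech through the earpiece."""

@dataclass
class Posting:
    label: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def scan(postings: list[Posting], px: float, py: float) -> "Posting | None":
    """Called as the phone's pointer position moves across the virtual flipchart."""
    for posting in postings:
        if posting.contains(px, py):
            vibrate()             # haptic cue: the pointer is over a posting
            speak(posting.label)  # audio description in the user's ear
            return posting
    return None

def move(posting: Posting, new_x: float, new_y: float) -> None:
    """Button pressed over a posting, released at the desired new position."""
    posting.x, posting.y = new_x, new_y
```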

One of the tasks Klaus Miesenberger and the project team still need to tackle is to connect the rule systems they have developed with the world of artificial intelligence. “In the future, we will record meetings using tracking technologies in order to create a database – anonymised, of course,” explains Miesenberger. “This data can then be used to create improved models using machine learning methods. We want the systems to learn to interpret situations in a more precise way in order to offer the user the appropriate options for interaction.”


Personal details

Klaus Miesenberger is a business informatics specialist with a research focus on human-machine communication for people with disabilities. Since 1992, he has been working at Johannes Kepler University (JKU) Linz on information systems that support blind and visually impaired students. In 2017 he was appointed head of the Institute for Integrated Studies at JKU.

Reinhard Koutny completed his master’s degree in software engineering at JKU Linz and is now a doctoral student and research assistant at the Institute. The project “Barrier-free meeting rooms for visually impaired people” received EUR 210,000 in funding from the Austrian Science Fund FWF and will run until the end of 2021.


Publications and contributions

Koutny R., Günther S., Dhingra N., Kunz A., Miesenberger K., Mühlhäuser M.: Accessible Multimodal Tool Support for Brainstorming Meetings, in: K. Miesenberger, R. Manduchi, M. C. Rodriguez, P. Peňáz (eds.): ICCHP 2020: Computers Helping People with Special Needs, LNCS Vol. 12377, Springer 2020

Dhingra N., Koutny R., Günther S. et al.: Pointing Gesture Based User Interaction of Tool Supported Brainstorming Meetings, in: ICCHP 2020: Computers Helping People with Special Needs, 17th International Conference, Lecco, Italy, September 9–11, 2020, Proceedings, Part II, 2020

Günther S., Koutny R., Dhingra N. et al.: MAPVI: Meeting Accessibility for Persons with Visual Impairments, in: Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments (PETRA 2019)