Accidents involving semi-autonomous cars are currently making headlines. Tesla and Uber have developed such powerful driver-assistance systems – lane-keeping assist combined with distance-keeping systems – that drivers have the impression their car is driving autonomously, even though their hands are actually supposed to remain on the steering wheel. Recent accidents have provided a foretaste of the problems we will face with the next generation of assistance systems. A group of researchers from the Centre for Human-Computer Interaction in Salzburg is currently investigating these obstacles in a project financed by the Austrian Science Fund FWF.
“Every traffic situation is different. We humans can adapt to different situations very easily, but it is very difficult to devise a system that covers all eventualities,” explains project team member Alexander Meschtscherjakov in an interview with scilog. Autonomous cars are divided into levels depending on their state of development. “Manual driving is level zero. Level one is when a car can either keep to the correct lane or maintain its distance from the car in front. If a car can do both, we call it level two – the recent accidents involved cars from this group. What we’re talking about now is level three cars. These vehicles don’t require the driver’s hands to be on the wheel, leaving them free to do other things.” This carries the risk that people lose their driving skills to a certain extent, explains Meschtscherjakov: “In special situations, when sensors fail or the system is overwhelmed by bad weather, the driver will need to take over, but may not have the necessary training to do so.”
Self-assessment of people with little driving experience
Meschtscherjakov’s team approached the problem from various angles. They started by conducting surveys to find out how people with little driving experience assess their own skills. “We concentrated on two levels of losing control”, notes Meschtscherjakov. “On the one hand, we asked people how confident they were about complying with traffic regulations. According to our results, people feel relatively secure in this respect. The second question was about how confident they felt about their ability to react in dangerous situations. People who have not driven for a while believe that their sensorimotor skills tend to decline. They feel uncomfortable, for example, when they need to overtake a lorry.”
A second approach used computer-based driving simulations focusing on the handover procedure, in which the software returns control to the human driver. “Two groups of people practiced this situation. One group then stopped practicing and the other continued”, explains the scientist. After six weeks, both groups were given a comparative test in the laboratory. The results of these tests are currently being processed for publication.
Comparisons with aviation
The researchers also investigated a professional group that is familiar with a very similar process, namely pilots. When the autopilot hands over the controls, pilots are required to follow a precise protocol. “People who fly planes need to pass tests regularly and fly a certain number of hours manually. Adopting this procedure for autonomous driving would mean making sure people drive manually for a certain amount of time.” This is to be achieved through low-threshold incentives – the keyword here being “gamification”. Meschtscherjakov describes a point-scoring model in which points can be collected for night drives or driving in heavy rain. “Another aspect is situational awareness”, adds Meschtscherjakov. “When pilots get an error message, they have to follow an exact procedure. We are trying, in a similar way, to heighten situational awareness in handover situations, for instance by means of something like a checklist.” This has already been partially implemented, and the team is now working on a simulator version. In any case, users need to focus on practicing these handovers. “It’s easier in aviation, because it’s a professional activity”, comments Meschtscherjakov. He has doubts about whether car drivers will accept these requirements as readily as pilots do.
Social acceptance uncertain
One important issue is social acceptance: “An autonomous car has to be programmed defensively. As a result, such cars behave differently and sometimes wait a great deal longer in uncertain situations.” Nevertheless, Meschtscherjakov can imagine that some years from now, level-three cars will be operating in designated lanes or limited inner-city areas. The researcher considers the latter feasible because of the low driving speeds involved. In principle, however, autonomous driving – at least at the level three described here – does not tolerate weaker driving skills. “Driving skills have to be even higher”, emphasises the researcher.
Alexander Meschtscherjakov is an Assistant Professor and the Deputy Director of the Centre for Human-Computer Interaction at the University of Salzburg. The computer scientist’s research interests include persuasive technologies for interaction, user interfaces for cars and the user experience in a technological context.