
Designing a new language for cars

Lights, sounds or screens with messages: How can self-driving cars interact with humans to make their intentions clear?

Autonomous driving promises to bring many benefits, ranging from greater safety to increased mobility. However, the introduction of self-driving cars also means that we have to find new ways to communicate the intentions of these vehicles when there is no longer a driver to look at. Debargha Dey is one of the researchers at Eindhoven University of Technology (TU/e) who is developing a new language for cars – something that is much more difficult than it seems.

Picture this: it’s 2030. You’re sitting in your self-driving car on a wide road. While minding your own business, the car – programmed to bring you to your destination without any interaction on your part – wants to turn left into a narrow road. However, you’re stuck: the narrow road is blocked by a car that is waiting for you, as you have right of way. In a fully automated world, with no humans ready to intervene, this would be a possibly unending impasse.

Fortunately, there are so-called external Human-Machine Interfaces coming our way that use light, sound or any other means to indicate to the other road user that they can go first. Impasse solved!

eHMIs, as they are called in the world of smart mobility, are part of a relatively new and exciting research field that helps automated vehicles negotiate the outside world. One of the researchers involved in this field is Indian-born Debargha Dey, part of the Future Everyday group in the department of Industrial Design. Dey, who is just about to finish his PhD and will stay on as a postdoc at TU/e, has a very personal motivation for his involvement in this field.

Car crash

“I’ve always been fascinated by cars, making sketches of them as a child, but I’d never expected that working with cars would one day be a full-time job. One of the reasons that I decided to get involved in this field is a car accident I had some eight years ago, when I was working in the US as a software engineer. A car crashed into me at a T-junction. Although I suffered only slight injuries, the incident had a deep effect on me. Not only was I afraid to drive for some time, but I also found out that the other driver had been distracted by his satnav. This kindled in me a deep interest in the interaction between humans and cars”, he explains.

“After attending a conference on Automotive UI, I decided to apply for a PDEng research position here at TU/e with professor Jacques Terken, and I never looked back. It was one of the best decisions of my life!”

Understanding the self-driving car

Until a few years ago, most researchers in smart mobility were primarily interested in the interaction between drivers, or between car and driver. Dey and his colleagues in the NWO-funded iCAVE program, on the other hand, focus on situations where the driver is no longer in control, and the person in the car no longer represents the intention of the car.

“Of course, we are some way from a situation where automated cars can do most of the driving (known as levels 3, 4 and 5 of car automation), but we believe that eHMIs can play a crucial role in getting there. Once autonomous cars are able to automatically and effectively signal their intentions to other road users, drivers can finally sit back and fully enjoy the benefits of autonomous driving.”

Bumper lights and more

A good example of Dey’s research is the award-winning work he and his colleagues did on distance-dependent eHMIs. “Here we looked at the way people interact with a car when it approaches. From eye-tracking tests we did, we observed that people tend to look at a car not necessarily to make eye contact with the driver, but to assess its intentions.”

“As a car approaches, they first look at the car’s bumper, then move up to the grille and the hood, and finally to the windshield, looking for information about the situational awareness or intention of the driver. This inspired us to design an interface that dynamically communicates the car’s intentions through lights on the bumper and the windshield, depending on how far away it is from the pedestrian”.

As we speak, Dey is involved in exciting new research that looks into using more abstract, animal-based signals to communicate intention. “When a cat or a hedgehog is angry, or aggressive, it puffs up, or makes itself bigger. Animal behavior like this shows the way to a more nuanced interactive language that goes beyond directive symbols like stop signs, or text, which may not be visible or understandable to everyone.”

 

© Eindhoven University of Technology (TU/e) | Concept eHMI for self-driving car

 

Proof-of-concept

Based on an inventory Dey and his colleagues recently made of the research field, there are currently some 70 eHMI concepts for automated vehicles. They range from text messages on external displays and laser projections on the street to personalized messages sent to smart and wearable devices. Often, these concepts are limited to demonstrations for promotional purposes, proof-of-concept prototypes, or purely virtual solutions.

One of the few eHMI concepts that has gone beyond the drawing board comes from Drive.ai, a small company founded in 2015 by students at Stanford University and now owned by Apple. As a pilot in 2018, it tested a fleet of brightly colored self-driving vans that use four external screens to communicate messages to other road users. Another example is Volvo’s 360c concept, which includes a standardised communication system enabling other road users and autonomous vehicles to understand what it will do next.

Challenges

Standardisation is one of the main challenges of eHMIs for self-driving cars. Automated cars will only be able to express themselves if they speak a language that is understood by all other road users. It is one of the reasons why Dey’s supervisor at TU/e, professor Marieke Martens, is actively involved with industry to develop common standards.

Standardisation will also help the public accept eHMI-equipped cars, which Dey sees as one of the main goals of his work. There appears to be considerable hesitation and mistrust in the public perception of driving automation technology.

“Up till now, experiments with self-driving cars have met with quite some pushback from other road users, with some people even throwing stones at them. It shows that people need to understand a new technology before they can accept it. I believe an effective eHMI that is clear and understandable can help in this.”

Finding the right balance

But, warns Dey, there are trade-offs. “Automated vehicles will never be perfect. Nor should they be. A self-driving car that is 100% safe won’t be able to drive. It will forever be condemned to stand still, waiting for the elusive guarantee that nobody will be hurt by its actions. Of course, that would defeat the whole purpose of self-driving cars, which are supposed to bring more mobility for more people. Finding the ideal balance between perfect safety and calculated risk is critical, and good communication is key in this interaction. This is where eHMIs can help.”

This also means that at times automated vehicles should be able to ignore traffic rules, to avoid being stuck at a busy crossing, waiting for all the priority traffic to pass by. “Navigating a self-driving car means negotiating. Everyone has to give and take. Part of our work is designing an eHMI that helps express not only the intention of the car (“I want to pass”) but also the urgency of that wish (“I have been waiting here for 5 minutes now, so please make some room”).”

Smart mobility revolution

And there is another challenge for Dey and his colleagues. “Traditionally, research has focused on one person and one car, but in a real situation, it’s never like that. There will be plenty of people, with different intentions and different rights of way,” says Dey. “If the car says ‘please feel free to go’, it’s not clear whom the car is talking to. It’s very difficult to direct the message to one person, or only talk to the person who has right of way.”

Still, the young researcher is convinced that science, together with industry, will be able to solve these problems. “While there is no golden egg, and it’s still early stages, I’m hopeful that human-centered design of car interfaces can play an essential role in bringing about the smart mobility revolution.”

 

The original article was published on www.sonnenseite.com on 5 October 2020