Trend #1: Lower cost of development, shorter TTM
Trend #2: HMI mixing, unlocked
Trend #4: Your car soon becomes a mind reader
Trend #5: The voice replaces the touch
If the first time round offered sneak peeks and insights into how we at siili_auto have embraced the trends, now it’s time to dig deeper: not (only) to boast about our achievements, but to actually share what we saw at CES 2020 and how it confirmed that our predictions just might prove right.
Seamless, immersive experiences through multi-OS HMIs
We see it and it’s there: software HMI platforms are maturing, providing more out of the box and enabling content to flow freely between them. The next step is seamlessly blending platforms and ecosystems to bring together the best of both worlds.
This means the software-tools ecosystem has matured to the point where new services can be built on top of existing building blocks with less effort and a shorter time to market (TTM) than ever before.
Proof at CES 2020: The Qt Company presented a three-screen HMI demo built on top of Android Automotive OS, Embedded Linux, and the Alexa Automotive SDK, with content flowing freely between all of them.
Micro-HMIs to provide contextual information and feedback
We expect screens in and around cars to appear in places where they can provide contextual feedback or information to the people using them. It could be a screen showing a welcome message on a B-pillar in a car-sharing scenario, or an HMI for a rear-seat HVAC control. Since some of these would have to work when the car is off, a power-efficient micro-HMI would be perfect for filling these screens with nice-looking interactive content.
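To make the idea concrete, here is a minimal sketch of how such a micro-HMI might behave: a low-power display that sleeps while the car is off and wakes only when a user approaches. The class, states, and trigger names are purely illustrative assumptions, not any real automotive API.

```python
class MicroHMI:
    """Sketch of a power-aware micro-HMI, e.g. a B-pillar welcome screen.

    States and triggers are illustrative assumptions, not a real API.
    """

    def __init__(self):
        self.state = "sleep"   # display off; only wake sources stay armed
        self.message = ""

    def on_user_approach(self, user_name: str):
        # A proximity sensor or BLE key could wake the display even when
        # the car itself is off; only this tiny HMI powers up.
        self.state = "awake"
        self.message = f"Welcome, {user_name}! Your car is ready."

    def on_timeout(self):
        # Drop back to the low-power state to spare the 12 V battery.
        self.state = "sleep"
        self.message = ""


hmi = MicroHMI()
hmi.on_user_approach("Alex")
print(hmi.state, "-", hmi.message)
```

The point of the sketch is the lifecycle: the screen spends almost all of its time asleep, which is what makes always-available contextual content feasible on battery power.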
Proofs at CES 2020: An emergence of minimalistic displays on car interiors and exteriors could be seen; here are a few of the awesome examples we spotted:
Contextual personalization of the connected services in cars
It’s a great start that the user data collected in cars can be used to optimize the experience, simply making it more effective. But the next step is contextual personalization, so that the car’s actions are optimized for the specific person in the car at that specific moment. We’ve already seen this in the mobile-phone voice assistant space over the past few years, and we can expect a similar approach to reach cars in the coming year. No strong proof yet though, so let’s keep our eyes open.
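The idea can be sketched in a few lines: combine who is in the car with the current context (time of day, day of week) to pick settings for that person at that moment. Everything here, the profile fields, the commute heuristic, the returned settings, is a hypothetical illustration, not a description of any actual in-car system.

```python
from dataclasses import dataclass
from datetime import time


@dataclass
class Profile:
    """Hypothetical occupant profile; in practice this might come from
    driver identification (key fob, phone pairing, biometrics)."""
    name: str
    preferred_temp_c: float
    commute_playlist: str
    leisure_playlist: str


def personalize(profile: Profile, now: time, weekday: bool) -> dict:
    """Pick cabin settings for this specific person in this context."""
    # Crude context signal: rush-hour on a weekday counts as commuting.
    commuting = weekday and (
        time(6) <= now <= time(9) or time(16) <= now <= time(19)
    )
    return {
        "temperature_c": profile.preferred_temp_c,
        "playlist": profile.commute_playlist if commuting
                    else profile.leisure_playlist,
        "suggest_route": "work" if commuting else None,
    }


alice = Profile("Alice", 21.5, "Morning News", "Road Trip Mix")
print(personalize(alice, time(8, 15), weekday=True))
```

The same profile yields different behavior on a Sunday afternoon than on a Tuesday morning, which is exactly the step beyond plain per-user preferences that contextual personalization describes.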
Merge of the voice assistants into a seamless experience
The first step was getting the leading voice assistants into cars; the next one is to enable seamless switching between assistants, so that people can use the ones they prefer in a way that’s familiar from home or from their mobile phones. This should be possible thanks to the recently announced Voice Interoperability Initiative, although we’ll have to see how it goes, as not all the big players are on board yet.
And as a bonus? We see that advances in hand-gesture tracking, paired with in-air haptic feedback, provide another way of interacting with the car without having to touch anything, as shown by Ultraleap:
Convergence of screens and contextual information display
As the tech progresses, HUDs become wider and displays translucent, so there might be an opportunity to merge the two: we’d be able to provide contextual information right in the driver’s line of sight. And heck, with the micro-HMIs from Trend #3 we can imagine a dashboard with next to no screens. Yep, you heard us.
Proofs at CES 2020: Audi presented a concept utilizing a wide translucent screen that pops up out of the dashboard. Meanwhile, Wayray showed a working AR HUD: