Here’s our Head of Technology Konrad Kabaja’s pick of the top four HMI technologies from CES 2020:
The Qt Company showed a three-screen HMI built on top of Android Automotive OS, Embedded Linux and the Alexa Automotive SDK. We saw content and 3D UI elements flowing freely between these operating systems. We saw Alexa working in both offline and online modes, controlling the car and providing the familiar voice assistant experience known from other devices. That means the software is ready to build experiences that go beyond a single ecosystem and seamlessly bring together the best of each world. It also means the software tools ecosystem has matured to the point where new services can be built on top of existing building blocks with less effort: more time spent building immersive experiences, less hassle getting the basics covered. As a software development company, that makes us very happy!
Ultraleap presented in-air hand gesture recognition with haptic feedback. Hand gestures in cars have been a somewhat hit-and-miss experience so far; the addition of haptic feedback takes them to the next level, though. The feedback works like this: an ultrasonic device, controlled by software, produces a sensation like a gentle puff of air on your skin. All of it happens in mid-air, no touching involved. The feedback is not static: it can move, change intensity and project various shapes onto your hand. We tried it in several demos, with 2D user interfaces, 3D stereoscopic screens, AR and VR. In every scenario it closed the missing feedback loop for hand gestures: we could feel on our hands that something had happened, or how to interact with certain elements. We are excited by this technology because hand gestures, like voice assistants, offer a distraction-free way of interacting with the car, and with haptic feedback we believe this can finally work in practice. Best of all, the haptics can also work independently of hand gestures, unlocking entirely new ways of interacting with the car.
Wayray presented an augmented reality head-up display that projects the image 15 meters ahead of the car. That means whatever is projected appears to sit on the street itself rather than on your windshield. This has two great benefits: you don’t have to refocus your eyes when switching between the road and the HUD, and you can present contextual information right next to objects on the street, similar to AR headsets but without the need for a headset. The result is better situational awareness and increased safety. We can see this working with navigation systems, displaying guiding maneuver arrows, POIs, landmarks and so on, but we’re curious and excited to see what use cases it will enable next!
Audi presented a concept built around a wide translucent screen that pops up out of the dashboard. The screen’s transparency could be controlled anywhere from opaque to fully transparent. What amazed us was that transparency could be controlled not just for the whole screen but for individual sections of it: a narrow opaque bar at the bottom showed telltales, a center section was made opaque in real time to show a movie, and so on. That allows full flexibility over what content is displayed and where, and the HMI can even be displayed within the driver’s view of the street. In other words, a single device could blend an HMI display with an ultra-wide HUD, contextually switching UI content between the two modes as needed. We can see this reducing the number of displays in the car, replacing them with one large universal display and leading to a less fragmented experience. The remaining challenge is that the screen was not 100% transparent: it was slightly dim, so some work is still needed to get this technology into production cars.