We focused on building a modern automotive HMI on a single MCU. Automotive hardware has rapidly become more powerful, and in parallel, new tools now enable rapid development of production-ready interfaces.
Teaming up with Qt and NXP, we created an HVAC demo for a real-world automotive scenario. The demo uses a dual-core architecture, dedicated GPU acceleration, and shared memory, delivering a constant 60 fps on the newest NXP board with Qt for MCUs technology.
Looking more closely at the demo, we focused on two main goals:
1. Create a concept for a fully digital HVAC with rich, smooth UI graphics for the passengers.
The aim here was to achieve a high level of rendering performance with a mix of texture types: static textures, transformed textures, and other animations. The demo integrates a dedicated 2D GPU through VGLite on the new NXP i.MX RT1170 board with Qt for MCUs.
2. Define a cost-effective and straightforward automotive safety-critical architecture.
We used the NXP i.MX RT1170's dual-core architecture, leveraging the secondary Cortex-M4 core running alongside the Cortex-M7. This approach distributes application responsibilities, including potentially safety-related parts, across a single MCU. The demo architecture also includes CAN communication and inter-core communication over shared memory. As a next step, we plan to use hardware layers with Qt for MCUs.
With the HVAC demo's dual-core architecture and new technology enablers, we proved that cost-effective automotive HMIs can be available to everyone!
The current state of the demo is the first step toward our vision for innovative HMIs. Using the dual-core architecture and CAN communication, we defined a dedicated controller for the HVAC. The primary function of the KNOB controller is advanced passenger comfort configuration.
Navigation of the KNOB controller is based on swiping between screens, pushing, and rotating the unit. All screens adjust to the current context, which opens interesting possibilities for passengers. The early concept below shows how we see it.
This is an example of how we envision these gestures in the next-gen cars that will appear on the market very soon, so stay in touch for our next steps!
To learn more, try it for free, or get started, check out the demo images for Qt for MCUs on NXP hardware, and join us at our upcoming events and Qt Training: