Imagine slipping on a pair of AR glasses so light you forget you're wearing them. That's the magic of Meta-Bounds' AR glasses, unveiled at CES 2025. While Meta-Bounds isn't the first company to venture into all-day-wear AR glasses, it is the first to break the 35-gram barrier with a built-in advanced near-eye display system.
But make no mistake. What truly sets them apart is not just comfort and advanced immersive features, but their ability to deliver a myriad of new AI-powered experiences for 10 hours on a single charge.
The secret? The STM32N6, a microcontroller tailored for companies looking to seamlessly deliver the benefits of AI to our daily lives.
Before the STM32N6, achieving this level of performance on such a limited energy budget would have been impossible. So, what makes this microcontroller so special? Two highly specialized units: an image signal processor (ISP) and a neural processing unit (NPU), branded the Neural-ART Accelerator. These homegrown ST units, embedded in the STM32N6, process massive amounts of data and images using pre-trained AI models and convert them into valuable actions. This leaves the microcontroller's main processing core free to focus on other tasks, which is how Meta-Bounds' AR glasses can deliver so many experiences on such a minimal power budget.
“The STM32N6 microcontroller has been a breakthrough for our products. Its neural processing unit and image signal processor enabled us to deliver advanced features in ultralight AR glasses.”
Dr. Zhou Xing, Founding Partner of Meta-Bounds
Creating stylish yet powerful AR glasses was, until recently, a tricky task. Engineers had to juggle multiple components, including image sensors, ISPs, memory, and application processors. The STM32N6 simplifies everything. How? First, there's no need to send data back and forth to an external memory, as the embedded ISP and extensive 4.2-megabyte memory handle it all. This all-in-one approach translates into both lower latency and greater energy efficiency. Plus, this microcontroller doesn't require a cooling system, saving weight.
Put simply, the STM32N6 creates a virtuous cycle, allowing designers to choose thinner image sensors without an ISP. When ST presented the STM32N6, Meta-Bounds instantly saw an opportunity to bring their vision to life—always-on, all-scenario, all-day wear AR glasses.
“With ST's edge AI solution, we have redefined the next generation of information display and AR interactions.”
Dr. Zhou Xing, Founding Partner of Meta-Bounds
But concretely, what can AI do for AR glasses? AI enables advanced functions such as speech recognition and object classification. As a key example, consider a conversation between two people who don't speak the same language. With Meta-Bounds' AR glasses, real-time translation appears before your eyes: the AI-powered glasses capture and translate the audio, while advanced micro-projection technology displays the translation seamlessly. The same principle applies to images. Imagine driving in a foreign country and encountering an unfamiliar traffic sign. The AR glasses can analyze the sign and display the translation or instructions right in your field of view. This is just one example. Brands adopting Meta-Bounds' AR glasses can create countless AI-powered experiences.