In-Person Event

Embedded Vision Summit

May 21 - 23, 2024, at the Santa Clara Convention Center, Santa Clara, CA

The premier conference for innovators incorporating computer vision and edge AI in their products.
100+ sessions and talks focused on computer vision and AI.
4 tracks focused on practical computer vision.
1,500+ attendees sharing insights on new AI technologies.

Join us at Booth #623, where we will cover the latest ST technical insights, business trends, and vision technologies, all with a focus on practical, deployable computer vision and visual/perceptual AI.


About the Summit

The Summit attracts a global audience of 1,500+ technology professionals from companies developing computer vision and edge AI-enabled products, including embedded systems, cloud solutions, and mobile applications. We look forward to seeing you there!

Explore our demos

RGB depth fusion (KEA camera)

Using ST's premium indirect Time-of-Flight (iToF) sensor and an RGB camera, ST's partner Chronoptics achieves outstanding performance. By combining the data from these two sensors, the demo creates a real-time embedded 3D camera well suited to industrial vision systems and AR/VR applications. This technology allows for precise depth sensing and accurate object recognition while providing colorization, making it ideal for a variety of use cases.
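For readers curious about the fusion step itself, here is a minimal NumPy sketch of the general idea: back-projecting an aligned depth map into a colored point cloud. It is purely illustrative and not Chronoptics' pipeline; the pinhole intrinsics (fx, fy, cx, cy) and the assumption that the RGB frame is already pixel-aligned with the depth map are placeholders.

```python
import numpy as np

def fuse_rgb_depth(depth_m, rgb, fx, fy, cx, cy):
    """Back-project an aligned depth map into a colored 3D point cloud.

    depth_m : (H, W) float array of depth in meters (0 = invalid)
    rgb     : (H, W, 3) uint8 array, pixel-aligned with depth_m
    fx, fy, cx, cy : assumed pinhole intrinsics of the depth camera
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth_m > 0

    z = depth_m[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy

    points = np.stack([x, y, z], axis=-1)   # (N, 3) XYZ in meters
    colors = rgb[valid]                     # (N, 3) RGB color per point
    return points, colors

# Example with synthetic stand-ins for the iToF depth map and RGB frame
depth = np.full((480, 640), 1.5, dtype=np.float32)
image = np.zeros((480, 640, 3), dtype=np.uint8)
pts, cols = fuse_rgb_depth(depth, image, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
print(pts.shape, cols.shape)
```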

Event-based-like camera

This demo showcases a state-of-the-art global shutter sensor with event-based-like capabilities. This sensor redefines image processing by outputting only frame-to-frame differences, significantly reducing data volume while retaining critical detail. Its rapid response to scene changes is ideal for high-speed applications. The sensor's low power consumption, small size, and high dynamic range make it perfect for embedded systems in automotive, robotics, and surveillance.
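To give a feel for why difference-only output saves bandwidth, here is an illustrative Python sketch of frame-to-frame differencing. It is not the sensor's actual readout format; the change threshold and the synthetic frames are assumptions made for the example.

```python
import numpy as np

def frame_delta(prev, curr, threshold=12):
    """Return only the pixels that changed between two grayscale frames.

    Mimics the idea of an event-based-like readout: instead of a full frame,
    keep (row, col, new_value) triplets for pixels whose change exceeds a threshold.
    """
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    changed = diff > threshold
    rows, cols = np.nonzero(changed)
    events = np.stack([rows, cols, curr[changed]], axis=-1)
    return events, changed.mean()  # events plus the fraction of pixels that changed

# Two synthetic frames: only a small moving patch differs
prev = np.zeros((480, 640), dtype=np.uint8)
curr = prev.copy()
curr[100:120, 200:220] = 255

events, activity = frame_delta(prev, curr)
print(f"{len(events)} events, {activity:.2%} of pixels changed")
```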

Multimodal Occupancy Sensor

The STEVAL-PDETECT1 board, combined with the STEVAL-STWINBX1 and the FP-SNS-DATALOG2 function pack, provides advanced multimodal sensing for human occupancy detection. It uses ToF, ambient light (ALS), infrared, microphone, motion, and environmental sensors for occupancy analytics. FP-SNS-DATALOG2 enables data storage on an SD card, wireless control via BLE, or streaming to a PC via USB, with a plug-and-play example for various applications.
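As a rough illustration of how several modalities can be combined on the host side, the sketch below fuses a few cues into a single occupancy score. It is a hypothetical example only: the field names, thresholds, and weights are invented for illustration and do not reflect the FP-SNS-DATALOG2 API or the demo's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    tof_distance_mm: float   # nearest target reported by the ToF ranging sensor
    ir_delta: float          # change in infrared signal (proxy for warm bodies)
    mic_level_db: float      # ambient sound level from the microphone
    motion_mg: float         # accelerometer activity in milli-g

def occupancy_score(s: SensorSample) -> float:
    """Combine independent cues into a 0..1 occupancy score (simple weighted vote)."""
    cues = [
        1.0 if s.tof_distance_mm < 2000 else 0.0,   # someone within ~2 m
        1.0 if s.ir_delta > 0.5 else 0.0,           # IR signature changed
        1.0 if s.mic_level_db > 45 else 0.0,        # audible activity
        1.0 if s.motion_mg > 20 else 0.0,           # vibration / movement nearby
    ]
    weights = [0.4, 0.3, 0.2, 0.1]                  # illustrative weights
    return sum(w * c for w, c in zip(weights, cues))

sample = SensorSample(tof_distance_mm=1350, ir_delta=0.8, mic_level_db=52, motion_mg=5)
print("occupied" if occupancy_score(sample) >= 0.5 else "empty")
```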


STM32MP2 Machine Vision

The STM32MP257F-EV1 evaluation kit from our new STM32MP2 MPU series is combined with a raw camera to perform real-time multi-person detection and pose estimation. The STM32MP2's Neural Processing Unit (NPU) runs a TinyYOLOv8 pose neural network at high FPS.

Possible applications include in-store analytics, smart buildings, fall detection, driver and passenger monitoring in the automotive space, and even motion-controlled gaming.
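For a sense of what such an inference loop looks like, here is a hedged Python sketch that runs a YOLOv8-pose-style TFLite export and filters candidates by confidence. The model file name, the input/output layout, and the use of tflite_runtime are assumptions made for illustration; the actual demo dispatches the network to the STM32MP2 NPU through ST's own tooling, which is not shown here.

```python
import cv2
import numpy as np
import tflite_runtime.interpreter as tflite  # use tensorflow.lite.Interpreter on a PC

# Hypothetical model file: a quantized YOLOv8n-pose export used only for this sketch.
interpreter = tflite.Interpreter(model_path="yolov8n_pose_int8.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def detect_poses(frame_bgr, conf_thresh=0.5):
    """Run one frame through the pose model and return raw candidates above a threshold."""
    _, in_h, in_w, _ = inp["shape"]                      # assumes an NHWC input tensor
    blob = cv2.resize(frame_bgr, (in_w, in_h))
    blob = np.expand_dims(blob, axis=0).astype(inp["dtype"])
    interpreter.set_tensor(inp["index"], blob)
    interpreter.invoke()
    preds = interpreter.get_tensor(out["index"])[0]
    # Assumed output layout: one row per candidate,
    # [cx, cy, w, h, score, 17 x (kpt_x, kpt_y, kpt_conf)]
    return [row for row in preds if row[4] >= conf_thresh]

cap = cv2.VideoCapture(0)                                # any camera as a stand-in
ok, frame = cap.read()
if ok:
    poses = detect_poses(frame)
    print(f"{len(poses)} person candidate(s) detected")
```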

Register for the Embedded Vision Summit

Order your free tickets for Embedded Vision Summit 2024 using the link below. See you at Booth #623, Halls A and B.

On: May 21 - 23, 2024

At: Santa Clara Convention Center, 5001 Great America Pkwy, Santa Clara, CA 95054

Register now!
