The GenX320 event-based Metavision® sensor is the world's smallest and most power-efficient event-based sensor on the market, enabling a new level of intelligence, autonomy and safety for integrated vision devices. It is the first vision sensor optimized for the breakthrough neuromorphic event-based vision paradigm on low-power edge vision systems. It directly addresses the power (microwatt range), latency (microsecond time resolution) and dynamic range (>120 dB) requirements of edge devices, which are often battery-powered, must operate autonomously in challenging lighting conditions, and must process and analyze data locally. Operating in the microwatt-to-milliwatt power range with microsecond time resolution, the GenX320 improves the integrability and usability of event-based vision in embedded, at-the-edge vision systems.
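To make the event-based data format concrete, the sketch below shows one way a stream of contrast-detection events (pixel coordinates, polarity and a microsecond timestamp) can be represented in software. The field names, widths and the 320×320 array size are assumptions of this illustration, not the sensor's native wire format.

```python
import numpy as np

# Illustrative software representation of an event stream: each event carries
# pixel coordinates, a polarity (ON/OFF contrast change) and a microsecond
# timestamp. Field names and widths are assumptions of this sketch, not the
# GenX320's native output format.
EVENT_DTYPE = np.dtype([
    ("x", np.uint16),   # column index, 0..319 assuming a 320x320 array
    ("y", np.uint16),   # row index, 0..319
    ("p", np.int8),     # polarity: +1 = positive contrast change, -1 = negative
    ("t", np.int64),    # timestamp in microseconds
])

def make_event(x, y, p, t):
    """Pack a single event into the structured dtype above."""
    return np.array([(x, y, p, t)], dtype=EVENT_DTYPE)

# Example: three events from two pixels within a 150 microsecond window.
events = np.concatenate([
    make_event(12, 40, +1, 1_000),
    make_event(12, 40, -1, 1_080),
    make_event(200, 17, +1, 1_150),
])
print(events["t"])  # microsecond-resolution timestamps: [1000 1080 1150]
```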
Prophesee has successfully commercialized the neuromorphic vision approach over multiple sensor generations, including an HD-resolution sensor co-developed with CIS market leader Sony, and has now introduced the present fifth-generation sensor family. The explicit design goal for the GenX320 was to improve integrability and usability in embedded at-the-edge vision systems, which, beyond size and power improvements, means addressing the challenges of event-based vision's unconventional data format, non-constant data rates and non-standard interfaces so that the technology becomes usable for a wider range of applications.

Compared with previous generations, the new sensor has a reduced array size and is optimized for ultra-low-power operation, featuring a hierarchy of low-power modes and application-specific modes of operation. On-chip power management and an embedded microcontroller core further improve sensor flexibility and usability at the edge.

To minimize external processing overhead, an on-chip Event Signal Processing (ESP) pipeline provides timestamping, filtering, throughput regulation and data formatting functions. An Event Rate Controller (ERC) caps the output event rate to a programmable limit, a Spatio-Temporal Contrast (STC) filter detects and removes redundant bursts and trails of events triggered by high-contrast visual features, and an Anti-Flicker (AFK) filter detects and removes events generated by flickering lights. In addition, industry-standard MIPI and CPI data output interfaces offer low-latency connectivity to embedded processing platforms, including low-power microcontrollers and modern neuromorphic processor architectures. To meet aggressive size and cost requirements, the chip is fabricated in a stacked CMOS process with pixel-level Cu-Cu bonding interconnects, achieving a 6.3 µm pixel pitch.
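The ERC and STC stages are implemented in sensor hardware; the following is only a minimal host-side sketch of the kind of processing they perform, operating on structured event arrays like the one defined above, to make the pipeline's behavior concrete. The windowing scheme, drop policy and refractory period are assumptions of this sketch, the AFK flicker filter is omitted for brevity, and none of this is Prophesee's implementation or the Metavision SDK API.

```python
import numpy as np
# Operates on structured arrays using the illustrative EVENT_DTYPE defined earlier.

def event_rate_controller(events, max_events_per_window, window_us=1_000):
    """ERC-like throughput regulator (sketch): within each fixed time window,
    pass at most `max_events_per_window` events and drop the overflow."""
    kept = []
    window_id = events["t"] // window_us
    for w in np.unique(window_id):
        in_window = events[window_id == w]
        kept.append(in_window[:max_events_per_window])  # drop events above the cap
    return np.concatenate(kept) if kept else events[:0]

def stc_filter(events, width, height, refractory_us=10_000):
    """STC-like trail filter (sketch): after a pixel emits an event, drop
    subsequent same-polarity events from that pixel for `refractory_us`
    microseconds, suppressing the redundant bursts and trails that a
    high-contrast edge can trigger. The on-chip STC differs in details."""
    last_t = np.full((height, width), np.iinfo(np.int64).min, dtype=np.int64)
    last_p = np.zeros((height, width), dtype=np.int8)
    keep = np.zeros(len(events), dtype=bool)
    for i, ev in enumerate(events):
        x, y, p, t = int(ev["x"]), int(ev["y"]), int(ev["p"]), int(ev["t"])
        redundant = (p == last_p[y, x]) and (t - int(last_t[y, x]) <= refractory_us)
        keep[i] = not redundant
        if not redundant:                 # restart the window on each kept event
            last_t[y, x], last_p[y, x] = t, p
    return events[keep]
```

Following the pipeline order listed above (filtering before throughput regulation), a host-side emulation would apply `stc_filter` first and then `event_rate_controller`, for example `event_rate_controller(stc_filter(events, width=320, height=320), max_events_per_window=8)`.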