STURDeCAM31: The future of vision systems for autonomous vehicles

Autonomous Vehicles (AVs) redefine transportation, enhancing safety, traffic flow, and accessibility. They revolutionize industries and enable innovative mobility solutions for diverse users.

Photo courtesy of author

Autonomous Vehicles (AVs) include cars, trucks, tractors, drones, etc., all designed to perform tasks that traditionally required human drivers. They leverage technologies like computer vision, sensor fusion, and machine learning algorithms to interpret data and make real-time navigation decisions.

The impact of such vehicles is far-reaching since:

1. They improve road safety by reducing accidents caused by human errors such as distraction or impaired driving.

2. They enhance traffic flow and reduce congestion through optimized route management and coordinated driving behaviors, which can also help lower emissions and fuel consumption.

3. AVs can transform the transportation sector by providing new mobility solutions for those unable to drive, such as older adults and people with disabilities.

Also, their business impact is multifaceted, as they can help reinvent industries like logistics and delivery services.

How do autonomous vehicles work?
As you can imagine, embedded vision is an integral part of enabling autonomous vehicles, as it provides the system with the capability to perceive and interpret its environment. These vehicles use high-resolution cameras and powerful processors to capture and analyze visual data, enabling tasks like object recognition, pattern analysis, and obstacle detection.

The cameras are carefully selected based on factors like sensor type, resolution, and frame rate, while processors provide the computational power needed for real-time data handling. This combination allows the vehicles to capture, process, and interpret visual data, creating a nuanced understanding of their surroundings.

For instance, autonomous vehicles can perform critical tasks like Adaptive Cruise Control. Cameras capture the scene from various angles, enabling the system to monitor surrounding traffic. Combining the information from multiple cameras improves object detection, increases accuracy in tracking the position and speed of neighboring vehicles, and widens the field of view.
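To make this concrete, here is a minimal Python sketch of how range estimates from multiple cameras might be fused to derive the closing speed an adaptive cruise controller acts on. The Detection class, the averaging strategy, and the thresholds are illustrative assumptions, not part of any shipped ADAS stack.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Detection:
    """A lead-vehicle detection reported by one camera (illustrative)."""
    camera_id: str
    distance_m: float    # estimated range to the vehicle ahead
    timestamp_s: float   # capture time of the frame

class LeadVehicleTracker:
    """Fuses per-camera range estimates and derives the closing speed."""

    def __init__(self):
        self.last_range = None
        self.last_time = None

    def update(self, detections):
        # Average the ranges from every camera that saw the lead vehicle.
        fused_range = mean(d.distance_m for d in detections)
        fused_time = max(d.timestamp_s for d in detections)

        closing_speed = 0.0
        if self.last_range is not None:
            dt = fused_time - self.last_time
            if dt > 0:
                # Positive closing speed means the gap is shrinking.
                closing_speed = (self.last_range - fused_range) / dt

        self.last_range, self.last_time = fused_range, fused_time
        return fused_range, closing_speed

def acc_command(range_m, closing_speed, target_gap_m=30.0):
    """Very rough policy: slow down if the gap is short or shrinking fast."""
    if range_m < target_gap_m or closing_speed > 2.0:
        return "decelerate"
    return "hold_speed"

if __name__ == "__main__":
    tracker = LeadVehicleTracker()
    tracker.update([Detection("front_left", 42.0, 0.00),
                    Detection("front_right", 41.6, 0.00)])
    rng, closing = tracker.update([Detection("front_left", 40.8, 0.10),
                                   Detection("front_right", 40.4, 0.10)])
    print(acc_command(rng, closing))  # gap is shrinking -> "decelerate"
```

A production controller would replace the simple average with a proper tracker (for example a Kalman filter) and combine the camera data with other sensors, but the overall structure is the same.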

STURDeCAM31 – a state-of-the-art camera solution for autonomous vehicles
e-con Systems has designed and developed STURDeCAM31, a state-of-the-art 3MP 120dB HDR camera for autonomous vehicles. It is compatible with NVIDIA Jetson AGX Orin by default, with readily available hardware support for all other boards.

What sets the STURDeCAM31 apart? Let's look at its competitive advantages.

High Dynamic Range (HDR) and sub-pixel technology
In contrast to traditional HDR modules, the STURDeCAM31 distinguishes itself with an automotive-grade Sony® ISX031 sensor incorporating sub-pixel technology. This approach divides each pixel on the sensor into two sub-pixels: one captures a short exposure to preserve detail in the highlights, while the other captures a long exposure to retain detail in the shadow areas. Specialized algorithms then merge these exposures into a final HDR image that blends the properly exposed elements of each.

This elevates HDR performance, particularly in demanding low-light conditions, while avoiding the motion blur that sequential multi-exposure HDR can introduce under varied lighting.
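For illustration only, the NumPy sketch below shows the general principle of merging a short and a long exposure. The real merge happens on-sensor in the ISX031; the exposure ratio, weighting, and tone mapping here are assumptions chosen purely to demonstrate the idea.

```python
import numpy as np

def merge_exposures(short_exp, long_exp, exposure_ratio=16.0):
    """Blend short- and long-exposure frames into one HDR-like frame.

    Inputs are linear float arrays in [0, 1]. This sketch only shows the
    principle: scale the short exposure to the long exposure's brightness,
    then favor the long exposure in shadows and the short one in highlights.
    """
    # Bring the short exposure onto the long exposure's radiance scale.
    short_scaled = short_exp * exposure_ratio

    # Where the long exposure approaches clipping, trust it less.
    w_long = np.clip(1.0 - long_exp, 0.0, 1.0)
    w_short = 1.0 - w_long

    merged = w_long * long_exp + w_short * short_scaled
    # Compress back to a displayable range (simple global tone mapping).
    return merged / (1.0 + merged)
```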

Durability certified with an IP69K rating
STURDeCAM31's IP69K certification means it maintains a dust-tight seal and withstands close-range, high-pressure, high-temperature spray-downs. This certification underscores the camera's operational reliability, making it well suited to the demanding conditions autonomous vehicles operate in.

Synchronized 8-camera solution with AGX Orin
The STURDeCAM31 makes it easy to configure a synchronized 8-camera setup on the NVIDIA® Jetson AGX Orin platform. All eight cameras can capture and transmit data simultaneously, providing a full 360-degree field of vision and imagery from diverse viewpoints. This synchronized data is extremely useful for intelligent navigation, improving the performance of autonomous vehicles. It also improves the accuracy of the visual data, so image stitching proceeds smoothly with better spatial perception.
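As a rough sketch of what the capture side might look like, the Python snippet below grabs one frame from each of eight V4L2 video nodes using OpenCV. The device paths are placeholders (the actual nodes depend on the board and driver package), and frame-level synchronization itself is handled by the GMSL hardware and driver, not by this code.

```python
import cv2

# Each GMSL2 camera typically appears as a /dev/videoN node on the Jetson;
# the exact indices below are illustrative and board-specific.
CAMERA_NODES = [f"/dev/video{i}" for i in range(8)]

def open_cameras(nodes):
    caps = []
    for node in nodes:
        cap = cv2.VideoCapture(node, cv2.CAP_V4L2)
        if not cap.isOpened():
            raise RuntimeError(f"Could not open {node}")
        caps.append(cap)
    return caps

def grab_frame_set(caps):
    """Grab one frame from every camera, then retrieve them.

    grab()/retrieve() keeps the per-camera capture calls close together;
    true frame-level sync is done by the GMSL link and driver.
    """
    for cap in caps:
        cap.grab()
    frames = []
    for cap in caps:
        ok, frame = cap.retrieve()
        frames.append(frame if ok else None)
    return frames

if __name__ == "__main__":
    cameras = open_cameras(CAMERA_NODES)
    frame_set = grab_frame_set(cameras)
    print(f"Captured {sum(f is not None for f in frame_set)} of 8 frames")
    for cap in cameras:
        cap.release()
```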

GMSL2 interface with cable support
STURDeCAM31 uses a GMSL2 interface and comes with a 3-meter coaxial cable and a FAKRA connector; an optional 15-meter cable is available for extended reach. The camera is also compatible with multiple interfaces, such as USB, MIPI, and FPD LINK III, offering a range of choices for data transmission and system integration within autonomous vehicles.

GMSL health monitoring capabilities
STURDeCAM31 offers health monitoring over the GMSL link to keep data transmission between the sensor and the Microcontroller Unit (MCU) uninterrupted. Regular checks of sensor functionality and GMSL link status help minimize transmission errors, ensuring dependable control-signal reception and accurate data processing. Automated diagnostics also evaluate data integrity, signal strength, and link stability in real time, establishing a robust fail-safe mechanism that safeguards the streaming process.
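The monitoring itself lives in the driver and firmware, but an application can react to it. The sketch below is a hypothetical watchdog loop: read_link_status() is a stand-in for whatever vendor-specific query the driver exposes and is not a real API.

```python
import time

def read_link_status(camera_id):
    """Hypothetical stand-in for a GMSL link-health query.

    In a real system this information would come from the driver, for
    instance through a vendor-specific V4L2 control or a sysfs attribute;
    the exact mechanism is not shown here.
    """
    return True  # placeholder: assume the link is healthy

def monitor_links(camera_ids, poll_interval_s=1.0, max_failures=3):
    """Poll each camera's link status and flag degraded links."""
    failures = {cam: 0 for cam in camera_ids}
    while True:
        for cam in camera_ids:
            if read_link_status(cam):
                failures[cam] = 0
            else:
                failures[cam] += 1
            if failures[cam] >= max_failures:
                # A real application would re-initialize the stream or
                # fall back to the remaining cameras here.
                print(f"camera {cam}: GMSL link degraded")
                failures[cam] = 0
        time.sleep(poll_interval_s)
```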

Other competitive advantages of STURDeCAM31 include:

· Interface compatibility: USB, MIPI, and FPD LINK III

· LED Flicker Mitigation (LFM)

· Two lens options: A 70-degree DFOV narrow-angle lens and a 170-degree DFOV wide-angle lens

· Camera streaming resilience feature

· Compact form factor: Minimized surface and seal vulnerabilities during high-pressure wash-downs

How STURDeCAM31 empowers autonomous vehicles
The applications of STURDeCAM31 are as diverse as its features. For example, it can be integrated as a front camera, a rearview camera, or a surround-view camera.

Let's see how the STURDeCAM31 works as a surround-view camera in an autonomous vehicle.

The STURDeCAM31 offers placement flexibility, allowing strategic positioning at key points around the vehicle to create a 360-degree surround view. This heightens situational awareness and counters blind spots, offering a high level of spatial insight. Equipped with a high-performance 3MP CMOS sensor and robust 120dB HDR technology, STURDeCAM31 captures detailed visual data with superior clarity and keeps the camera feeds consistent across different lighting conditions.
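Here is a minimal sketch of the surround-view idea, assuming each camera has already been calibrated: every frame is warped onto a common ground-plane canvas with a pre-computed homography and composited into a single top-down mosaic. The calibration step that produces those homographies is not shown.

```python
import cv2
import numpy as np

def build_surround_view(frames, homographies, canvas_size=(1000, 1000)):
    """Warp each camera frame onto a common ground-plane canvas.

    `frames` and `homographies` are ordered front, right, rear, left.
    The 3x3 homographies come from a one-time extrinsic calibration
    (e.g. checkerboards placed on the ground around the vehicle).
    """
    canvas = np.zeros((canvas_size[1], canvas_size[0], 3), dtype=np.uint8)
    for frame, H in zip(frames, homographies):
        warped = cv2.warpPerspective(frame, H, canvas_size)
        # Simple composite: copy warped pixels wherever they are non-black.
        mask = warped.any(axis=2)
        canvas[mask] = warped[mask]
    return canvas
```

In practice, overlapping regions would be blended rather than overwritten, and the homographies refreshed whenever a camera is re-mounted or recalibrated.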

Now, let's look at how the STURDeCAM31 performs as a front camera.

STURDeCAM31's sub-pixel technology provides the autonomous vehicle with a stream of highly accurate and reliable visual data. This data is instrumental in real-time path analysis, obstacle recognition, and dynamic path adjustment, optimizing the vehicle's navigation capabilities. Its rugged IP69K-rated enclosure is engineered to withstand harsh conditions, ensuring peak performance as a front camera in demanding environments such as dusty warehouses or outdoor settings with ever-changing weather.
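As an illustration of how a front-camera feed might be consumed, the sketch below filters detections to the driving corridor and flags anything that gets too close. The pinhole distance estimate and all thresholds are illustrative placeholders; a production system would use calibrated intrinsics and a trained detector.

```python
def estimate_distance(box_height_px, real_height_m=1.5, focal_px=1400.0):
    """Rough pinhole-model range estimate from a bounding-box height.

    real_height_m and focal_px are illustrative values; a production
    system would use calibrated intrinsics and a proper depth cue.
    """
    return (real_height_m * focal_px) / max(box_height_px, 1)

def plan_action(detections, corridor=(0.35, 0.65), stop_distance_m=8.0):
    """Decide whether to brake or continue based on corridor obstacles.

    `detections` are (x_center_norm, box_height_px) pairs produced by an
    upstream detector (not shown). Only objects whose horizontal center
    falls inside the normalized driving corridor are considered.
    """
    for x_center, box_height in detections:
        if corridor[0] <= x_center <= corridor[1]:
            if estimate_distance(box_height) < stop_distance_m:
                return "brake"
    return "continue"

# Example: a vehicle dead ahead with a tall (close) bounding box triggers a brake.
print(plan_action([(0.5, 320.0), (0.1, 90.0)]))  # -> "brake"
```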

Conclusion
e-con Systems' STURDeCAM31 is a standout camera solution for autonomous vehicles. Leveraging advanced features, it provides thorough visual data for these systems to perceive and interact with their environment while eliminating blind spots. In a nutshell, it enables these futuristic vehicles to better navigate, avoid obstacles, and interact with dynamic surroundings.