Ambarella Joins Forces with Hella in ADAS

Article By: Junko Yoshida, EE Times

Goes head-to-head with companies like TI, NXP, and Renesas in a crowded market

PARIS — After walking back their high expectations for autonomous vehicles in 2018, the tech and automotive industries are eagerly shifting gears to Plan B: a renewed emphasis on advanced driver assistance systems (ADAS).

Among the variety of sensors designed for ADAS features, the race is most competitive and most concentrated in computer vision, the most broadly used advanced sensing modality.

One crucial question in the ADAS market is who will emerge in 2019 as an alternative to Intel/Mobileye, the dominant player in computer vision-based ADAS. Candidates include NXP, Texas Instruments, and Renesas. New to the club is Ambarella, Inc., a Santa Clara, California-based developer of high-resolution video processing and computer vision chips.

Ambarella revealed this week a collaboration with HELLA Aglaia, a tier-two developer focused on computer vision software for the automotive market. HELLA Aglaia is a wholly owned subsidiary of HELLA, a German tier one based in Lippstadt.

Hardware differentiation
Although Ambarella’s exposure to the automotive market thus far has been limited mostly to dashcams, Chris Day, vice president of marketing and business development at Ambarella, is hopeful that the partnership with HELLA Aglaia will change that picture.

Ambarella is confident that it can go head to head with TI, NXP, and Renesas on ADAS, based on its “high-performance computer vision processor CV22AQ CVflow that runs at an extremely low power,” said Day. Running typically below 2.5 W, “the next generation of ADAS camera and computer vision processor can fit in a single, tiny box placed at a windshield, for example. OEMs or tier ones have no need to worry about thermal limitations in that box.”

In contrast, competing ADAS vision processors that run as high as 15 W would require a two-box solution, according to Day. While the camera fits in one box, a second box would be required to house the computer vision processor, he added.

Why consider Ambarella?
Asked why OEMs and tier ones should consider Ambarella with other vision platforms already out there, Kevin Mak, senior analyst at Strategy Analytics, told us, “Their CVflow architecture and small die (10 nm in CV22AQ) enables a highly efficient performance-per-watt.”

Ambarella is no novice in the world of ADAS and autonomous vehicles. Since 2015, when it acquired VisLab, the Vision and Intelligent Systems Laboratory at the University of Parma, Ambarella has had its sights set on the highly automated vehicles sector.

The VisLab team has been spearheading efforts to develop both deep neural network and stereovision processing, and to optimize that software for Ambarella’s automotive-grade CV22 vision processor. The vision processor offers an image signal processor (ISP) and massive artificial intelligence (AI) computing performance.

CV2 block diagram (Source: Ambarella)

But for Ambarella to bring its CV22 chips to the automotive market, the key is a partner like HELLA Aglaia that can deliver a complete computer vision software stack all the way up to the applications, explained Day.

The HELLA Aglaia/Ambarella partnership isn’t exclusive, however. Strategy Analytics’ Mak told EE Times that HELLA Aglaia already has “the open vision platform with NXP and the front windshield camera partnership with Renesas.”

Why do OEMs want ‘open’ solutions?
The most oft-cited criticism of Mobileye, despite its chips’ high performance and its leadership role in the ADAS market, is that its solution is a “black box.”

But if the computer vision software inside the black box works, who cares if it’s a closed system?

Strategy Analytics’ Mak observed, “Competition is getting tougher between OEMs to make the safest cars on the market.” He stressed, “If the processing SoC is not ‘open,’ then OEMs are stuck with a processor that cannot be modified as their rivals seek to enhance the performance of their ADAS features.”

OEM and tier-one customers want an open camera platform with the flexibility to add software features, explained Kay Talmi, managing director at HELLA Aglaia, in a statement. This needs to be combined with the performance necessary to run the next generation of deep neural network algorithms.

Without question, OEMs are increasingly worried about a host of new requirements thrown at them by organizations like the New Car Assessment Program (NCAP) and Euro NCAP, a European car safety program.

Mak pointed out, “If you look at the Euro NCAP safety reports, they measure the effectiveness of each automatic emergency braking (AEB) requirement — including City and Interurban — and each situation including the pedestrian walking and the cyclist cycling alongside the host vehicle or across the path of the host vehicle on a crosswalk.”

Euro NCAP also looks at low-light conditions. Mak added, “NCAP AEB requirements are getting tougher more quickly over time, and test protocols are published at short notice.”

Euro NCAP roadmap (Source: Euro NCAP)

In short, OEMs are under the gun. They must keep updating their software as new features are demanded by organizations like Euro NCAP. Mak noted, “Euro NCAP, for example, will implement AEB Crossing, Junction & Head-On, and AEB Reversing in 2020. It will also implement Evasive Steering or Autonomous Emergency Steering in 2020. J-NCAP has already implemented Pedal Misapplication in 2018 and will review performance requirements in 2020.”

Who competes with HELLA Aglaia?
Besides HELLA Aglaia, who else competes in the computer-vision software-stack space? Mak said that there are many.

“Mobileye, for example, will be the main competitor,” he said. But there is a wide range of software developers in ADAS. Mak noted that they include the OEMs themselves; their Tier 0.5 consultants, such as FEV and IAV; tier ones and their in-house subsidiaries (e.g., Continental with Elektrobit, Visteon with AllGo, and, of course, HELLA with Aglaia); plus other specialist developers.

— Junko Yoshida, Global Co-Editor-In-Chief, AspenCore Media, Chief International Correspondent, EE Times
