Top chipmakers are putting a sharp focus on self-driving vehicles at CES 2022 this week, with Intel-owned Mobileye introducing new processors for the burgeoning market and both Mobileye and Qualcomm expanding partnerships with automakers.
In addition, Qualcomm — which has made autonomous vehicles (AVs) a key growth area for its Snapdragon chip platform — unveiled the Snapdragon Ride Software Development Kit (SDK) to enable OEMs and developers to build applications for such capabilities as perception and drive policy that can run atop the chipmaker’s Snapdragon Ride Platform.
The announcements at CES highlight growing momentum in the autonomous vehicle market, a space that Statista analysts predict will grow from about $105.7 billion last year to almost $400 billion by 2025. Intel and Qualcomm are part of an expanding tech sector, one that also includes companies like Nvidia, focused on autonomous vehicles.
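Those Statista figures imply a compound annual growth rate of roughly 40% over the span. A back-of-the-envelope check (treating the forecast as a clean four-year window from 2021 to 2025, which is an assumption on our part):

```python
# Implied CAGR for the cited Statista forecast:
# ~$105.7B in 2021 growing to ~$400B by 2025 (assumed 4-year span).
start, end, years = 105.7, 400.0, 4

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 39.5% per year
```

That annualized rate is in line with the rapid growth figures the chipmakers themselves cite for the segment.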
For its part, Mobileye — which Intel bought for $15.7 billion in 2017 to quickly expand its presence in connected and self-driving cars — introduced two versions of the latest generation of its EyeQ system-on-chip (SoC) as well as plans for EyeQ Ultra, a SoC the company called a supercomputer that can deliver the capabilities needed to manage Level 4 autonomy in self-driving cars.
Level 4 essentially means the vehicle can drive itself, though humans can still take control.
Mobileye’s ‘Crown Jewel’
In a virtual presentation at CES, Mobileye President and CEO Amnon Shashua said EyeQ Ultra will be the “crown jewel” of the company’s portfolio and touted its expected performance and efficiency. It won’t require the cost and power-consumption tradeoffs that come with integrating multiple SoCs, and, like earlier EyeQ SoCs, it will be developed in lockstep with the company’s software, driving power efficiency without reducing performance.
EyeQ Ultra will be built on a 5-nanometer process and deliver the performance of 10 EyeQ5s — the current silicon offering from Mobileye — in a single package. It also will use four different proprietary accelerators, each addressing specific tasks, paired with CPU cores, GPUs and an image signal processor (ISP). That mix will enable the platform to process data from a camera-only subsystem as well as from a second subsystem that combines radar and lidar input.
In addition, EyeQ Ultra will process input from the vehicle’s central computing system as well as the high-definition map and driving policy software. All this will be done at 176 TOPS (trillions of operations per second), a common way of measuring the performance of artificial intelligence (AI) chips. The ability to process all that data at a 176 TOPS level shows how efficient the platform will be, Shashua said.
“Not everything is about TOPS,” he said. “The fact that we have very, very diverse cores allows us to be very, very efficient.”
It will have 12 RISC-V cores and 64 accelerator cores, as well as a power envelope under 100 watts.
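Taken together, the 176 TOPS figure and the sub-100-watt power envelope imply an efficiency floor of about 1.76 TOPS per watt. A quick sanity check (Mobileye states only “under 100 watts,” so using 100 W as the upper bound is an assumption here):

```python
# Efficiency floor implied by Mobileye's stated EyeQ Ultra figures:
# 176 TOPS within a power envelope under 100 W.
tops = 176          # trillions of operations per second
power_watts = 100   # assumed worst case; actual draw is stated only as "under 100 W"

efficiency = tops / power_watts
print(f"At least {efficiency:.2f} TOPS/W")  # prints "At least 1.76 TOPS/W"
```

Since the real power draw is below 100 W, the actual TOPS-per-watt figure would be somewhat higher than this floor.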
It will take a while for Mobileye to get there. The industry won’t see the first silicon for the EyeQ Ultra until the end of 2023, and it will take another two years to reach full automotive-grade production, according to the CEO.
New EyeQ6 Chips
In the meantime, Mobileye is readying the next generation of EyeQ processors for advanced driver assist systems (ADAS). EyeQ6 will come in two flavors, with the EyeQ6L succeeding the EyeQ4 in a package that will be 55% smaller and will be attached to windshields in Level 2 applications, where the vehicle can control the steering as well as speed but humans do the rest. Mobileye began testing it last year, and it is expected to reach production by mid-2023.
The EyeQ6H will support top-line ADAS and partial AV capabilities. The company said the chip will have the compute power of two EyeQ5 chips but support visualization and perform better when processing heavy AI workloads. It also will offer a range of features, including all ADAS Level 2+ functionalities — such as processing data from multiple cameras, hands-free highway driving and centering the vehicle in lanes — and the ability to host third-party apps for such jobs as driver monitoring and parking.
EyeQ6H will begin sampling this year and be in production toward the end of 2024. Both of the EyeQ6 chips will be manufactured on a 7nm process.
On the Rise
Mobileye is seeing its fortunes rise as the autonomous vehicle space expands. Revenue has grown 40% year-over-year, now standing at $1.4 billion, and Shashua — who co-founded the company in 1999 — touted 41 new design wins that represent 50 million new cars on the road. The company announced last year the shipping of its 100 millionth chip and Intel late in 2021 said it plans to take the company public.
The company counts more than a dozen of the top carmakers as customers. At CES, Volkswagen said it will use Mobileye’s crowd-sourced Roadbook mapping software to provide such ADAS features as lane keeping and lane centering in its Travel Assist 2.5 software in VW, Skoda and Seat branded vehicles. Roadbook leverages “swarm data” gathered from Mobileye-equipped devices and aggregated in the cloud to create precise, high-definition maps.
Longtime partner Ford will put Mobileye’s Road Experience Management (REM) mapping software into its BlueCruise system, which lets users operate their vehicles hands-free while monitoring them via a driver-facing camera. The REM software will enable BlueCruise-equipped vehicles to drive hands-free on certain divided highways and on roads without visible lane markings. The companies also are developing an open platform that will enable Ford to integrate its own solutions.
Mobileye and electric mobility tech maker Zeekr are working to build an all-electric Level 4 vehicle that will combine Mobileye’s REM, True Redundancy and Responsibility-Sensitive Safety-based driving policy with the SEA architecture from Zeekr parent Geely Holding Group, which enables redundant braking, steering and power. The technologies will be integrated under an Open EyeQ approach. The consumer version of the vehicle is expected to launch in China in 2024 and then hit other parts of the world.
Qualcomm’s Snapdragon Ride SDK

Qualcomm’s Snapdragon Ride Platform, introduced at CES in 2020, includes SoCs with built-in high-performance computing and AI engines, vision solutions for front and surround cameras, and a toolset for simulation and learning frameworks. The new SDK software package is aimed at such areas as AI and camera imaging systems and includes FastADAS APIs.
In addition, there is a developer toolkit and portal.
“The rich tooling support includes enhanced diagnostic and tracing capabilities to identify any bottlenecks quickly; profiling support to analyze and benchmark CPU, GPU, memory, power, thermal and network performance; scheduling tools to determine the best compute engines to run the applications [and] calibration tools to configure various sensors,” Tharakram Krishnan, director of product management at Qualcomm, wrote in a blog post.
The company also unveiled the Snapdragon Ride Vision System, an addition to its Snapdragon Ride Platform. The new open and modular software stack is built atop a 4nm Snapdragon SoC designed to handle front and surround cameras for ADAS and automated driving systems, and it is slated for vehicle production in 2024.
The system offers “a more open, adaptable and scalable platform for computer vision solutions,” Nakul Duggal, senior vice president and general manager of automotive at Qualcomm, said in a statement, adding that it “can offer automakers the opportunity to customize more advanced driving experiences for every vehicle class.”
Qualcomm also announced expanded partnerships with a range of carmakers, including Renault, Volvo and Honda, and unveiled the opening of an engineering software office in Berlin that will deliver the latest capabilities of its Snapdragon Digital Chassis to European automakers and their customers.