ADAS and Autonomous Driving’s Evolution after the Advent of High Performance and Quantum Computing

By Naresh Neelakantan, Senior Architect-Automotive Portfolio, Sasken Technologies Ltd.

A History of ADAS and Computing Platforms' Role

During the modern era of automation in in-vehicle electrical and electronic architectures, the American Big Three (GM, Ford, and Chrysler, now Fiat Chrysler Automobiles) began introducing features that made driving easier: electronic power steering, automatic gearshifts, disc brakes, and power windows. In the late 90s, the US Department of Defense (DoD), led by the Defense Advanced Research Projects Agency (DARPA), began a paradigm shift towards radical automation, leading to the creation of ADAS and Autonomous Driving features. At the turn of the new century, features like Lane Departure Warning, Blind Spot Detection, Automatic High Beam Control, and Traffic Sign Recognition were introduced in new cars. Early ADAS systems depended primarily on cameras as their sensor source. However, the first generation of image-processing algorithms demanded far greater computational power from processors and controllers, as designs shifted away from raw computation towards a structured image-processing pipeline. The resulting chips needed MAC (multiply-accumulate) units, image co-processor accelerators, and cache pipelines, along with higher memory density to accommodate more data.

Processor and controller companies that understood the system-level demands of intensive computation, speed, and memory overhead latched onto this disruption. Different chip manufacturers adopted different approaches, albeit with the common goal of realising ADAS features. A select few felt that widening the computational pipeline, fed by faster memory, would accelerate convolution and MAC operations. Others bet on an ecosystem of image co-processors, shifting programming from simple operations to more complex, compute-intensive operations running in parallel. Still others increased memory capacity to store the essential features extracted by data-intensive operations. Regardless of approach, all were working in parallel towards a revolution that arrived decades later with the rise of autonomous vehicles. These became the essential built-in chip components for what are now dubbed Level 1 Autonomous Driving vehicles, or vehicles with Advanced Driver Assistance Systems (ADAS).
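The multiply-accumulate (MAC) operation those early chips were built around can be sketched in a few lines. This is a purely illustrative example, assuming a toy frame and a standard Sobel edge-detection kernel; in hardware, the inner accumulator loop is what a dedicated MAC unit executes in a single cycle per tap.

```python
# Illustrative sketch: the MAC loop at the heart of early image-processing
# pipelines. The frame and kernel below are hypothetical toy data.

def convolve2d(image, kernel):
    """Naive 2D convolution: one multiply-accumulate per kernel tap per output pixel."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = ih - kh + 1, iw - kw + 1
    out = [[0] * ow for _ in range(oh)]
    for y in range(oh):
        for x in range(ow):
            acc = 0  # the accumulator a hardware MAC unit keeps in a register
            for ky in range(kh):
                for kx in range(kw):
                    acc += image[y + ky][x + kx] * kernel[ky][kx]
            out[y][x] = acc
    return out

# A 3x3 Sobel kernel applied to a tiny synthetic frame with a vertical edge:
frame = [[0, 0, 255, 255],
         [0, 0, 255, 255],
         [0, 0, 255, 255],
         [0, 0, 255, 255]]
sobel_x = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]
edges = convolve2d(frame, sobel_x)  # strong responses along the edge
```

Every output pixel costs kh x kw MACs, which is why dedicated accelerators and wider memory feeds mattered so much for these workloads.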

Autonomous Driving's Middle Age: ADAS Features' Maturity and Signal Processing Chip Advancement

The rapid influx of data into vehicles led internet and smartphone giants like Google, Samsung, and Apple to step into the battle around data ecosystems. However, none of them started independent vehicle production. The proven battleground was high-end vehicles from BMW, Mercedes, and Volvo, each developing its ecosystem of vision sensors around chips from companies like Mobileye, Texas Instruments, and Freescale. This also sparked interest among other OEMs and car and truck manufacturers. Tier-1s focused on ADAS and safety, such as Autoliv and Takata, expanded the sensor suite beyond cameras, incorporating ultrasonic sensors, radar, and LiDAR. Applications matured as well, with Freeway Assist, Intersection Merge Assist, Adaptive Cruise Control, Reverse Assist, and Auto-Parking gaining research momentum once the aforementioned algorithms matured.

Chip manufacturers like Mobileye became disruptors in the space by offering a computing ecosystem that satisfied the computational, sensor, and data complexity of in-vehicle ADAS features. RF also gained strength as radar usage grew, making DSP computation on the receiver side a necessity. Other chip manufacturers, such as STMicroelectronics, NXP, Analog Devices, and Infineon, added the RF-processing side of the ecosystem to ease feature realisation for vehicle manufacturers. Stringent norms on vehicular emissions, traffic regulation, and vehicle insurance also picked up, enhancing the safety and convenience of passengers and road users. Hardware components in ECUs built around such chips therefore became aligned with these goals, as seen in Level 1 and Level 2 Autonomous Driving between roughly 2009 and 2018.

Current Situation of Level 2+ Autonomous Vehicles with High Performance Computing

In recent years, OEMs and Tier-1s have realised that consumers want comfort, convenience, and safety in their driving experience, for both routine and casual transits. This led the ecosystem into a newer paradigm of connected, autonomous, electric vehicles with shared mobility, led by disruptors like Tesla in the Bay Area.

Connectivity revolved not only around infotainment and telematics applications but also around over-the-air (OTA) updates of the ECUs within, as seen in Teslas and other high-end cars. Connectivity brought a drastic makeover for chip manufacturers and for Tier-1s manufacturing TCUs and infotainment systems. They had to adopt strategies for controlled, cyber-safe implementation of updates, building filters, permissions, and firewalls so that update-related critical infrastructure was not exposed. Gateway implementation gained prominence in chips, limiting unauthorised access to critical vehicle infrastructure and actuators such as brakes and acceleration. A prominent example is the Jeep Cherokee vulnerability identified by cyber-security researchers, which provided unauthorised access to the braking infrastructure.
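The gateway idea can be reduced to a routing whitelist: a frame crosses domains only if that route is explicitly permitted. The sketch below is a hedged illustration; the domain names and the policy table are hypothetical assumptions, not any OEM's actual gateway design.

```python
# Hypothetical gateway policy: which source domain may reach which target.
# Real automotive gateways enforce this in hardware/firmware per bus and per
# message ID; this sketch only shows the whitelist principle.
ALLOWED_ROUTES = {
    ("telematics", "infotainment"),   # OTA payloads may reach the head unit
    ("infotainment", "telematics"),
    ("chassis", "braking"),           # actuators reachable only from in-vehicle domains
    # Deliberately no route from connected domains to "braking" or "powertrain".
}

def gateway_forward(src_domain, dst_domain, payload):
    """Forward a frame only if the (src, dst) pair is explicitly allowed."""
    if (src_domain, dst_domain) not in ALLOWED_ROUTES:
        return None  # dropped: unauthorised cross-domain access
    return payload

blocked = gateway_forward("telematics", "braking", b"\x01")  # dropped
allowed = gateway_forward("chassis", "braking", b"\x01")     # forwarded
```

A default-deny table like this is precisely what was missing in the attack path the Jeep Cherokee research exposed: the connected domain could reach safety-critical actuators.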

Connectivity is also important in the autonomous context. It was not just sensors and algorithms that matured, but also the ability to build fail-safe ecosystems, as seen in autopilot functionality for planes and bullet trains. Functional-safety requirements arose for handling multiple sensors and sensor fusion around the detection of potential crash scenarios. Hence, algorithms needed to shift more data analysis between ECUs or Domain Controller Units and Cloud computing, through the connectivity gateway built into vehicles. A great deal of V2X information exchange also occurred, supplying information beyond the range of the sensors described earlier. Security patches and other software patches were needed to constantly update the in-vehicle infrastructure against the risks present. The map infrastructure also required constant updates for re-routing, for precision in 3D-mapped and non-mapped GPS regions, and for constantly changing information such as weather. Autonomous vehicles beyond the consortia's Level 2 classification built themselves around the ideology that 'data is the new oil'.
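One simple form of the sensor fusion mentioned above is inverse-variance weighting: each sensor's range estimate is weighted by how much it can be trusted. The sketch below is minimal and illustrative; the variance figures for the camera and radar are assumptions, not real sensor specifications.

```python
# Minimal sensor-fusion sketch: combine independent range estimates by
# inverse-variance weighting. Lower variance = more trusted sensor.

def fuse(estimates):
    """estimates: list of (value, variance) pairs. Returns (fused_value, fused_variance)."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

camera = (42.0, 4.0)  # range to obstacle in metres; cameras are noisier at distance
radar = (40.0, 1.0)   # radar ranging is typically tighter (assumed figures)
fused_range, fused_var = fuse([camera, radar])
# The fused estimate lies between the two, pulled toward the lower-variance radar,
# and its variance is smaller than either input's.
```

This is the one-dimensional core of what a Kalman filter does per update step; production fusion stacks extend it to full state vectors, time propagation, and cross-checks that support the fail-safe requirements described above.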

This change in the architecture of the processing and computing units went beyond the actual control of such Level 2+ autonomous vehicles. The data collection needed for the Artificial Intelligence and Deep Learning algorithms realised on ECUs, DCUs, and edge computing called for high-performance computers and a Cloud computing environment. Graphics-chip vendors like AMD, Nvidia, and MediaTek realised that the parallel pipelines in their platforms were readily repurposable. Nvidia immediately started shipping chips that repurposed its graphics pipelines for AI/ML computation, enabling Autonomous Driving intelligence. Soon the chip industry realised a revolution was underway, with Mobileye and Nvidia dominating these platforms. NXP and Qualcomm announced chipsets at the latest CES to close in on the lead held by Mobileye and Nvidia, Intel having already acquired Mobileye.

The Entry of Quantum-era Computing

Around this time, another revolution was underway in the Cloud ecosystem, with Google releasing its Tensor Processing Units. Quantum computing had started gaining traction in the backyard of computing's front runner, IBM, followed shortly by Microsoft. This ecosystem raised the possibility of attacking NP-hard and NP-complete problems by brute force, through sheer computational throughput beyond even that of supercomputers. Volkswagen, as announced in 2020, is trying to solve traffic bottlenecks using such systems. Some suppliers are also considering quantum edge platforms as a replacement for ECUs and DCUs, along with quantum sensors capturing the maximum information from their surroundings.
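Traffic-flow optimisation of the kind described above is commonly cast as a QUBO (quadratic unconstrained binary optimisation), the problem form quantum annealers accept; Volkswagen's published traffic experiments reportedly used this formulation. The toy instance below (two cars, two routes, with hypothetical reward and penalty coefficients) is solved by classical brute force purely to illustrate the objective a quantum machine would minimise.

```python
# Illustrative QUBO sketch: minimise sum of Q[i,j]*x[i]*x[j] over binary x.
# Brute force works only for toy sizes; the search space doubles per variable,
# which is exactly why annealing hardware is attractive for such problems.
from itertools import product

def solve_qubo(Q, n):
    """Exhaustively minimise the QUBO objective over all 2^n binary vectors."""
    best_x, best_e = None, float("inf")
    for x in product((0, 1), repeat=n):
        e = sum(coef * x[i] * x[j] for (i, j), coef in Q.items())
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# Toy instance: x = (car0_routeA, car0_routeB, car1_routeA, car1_routeB).
# Coefficients are made-up for illustration.
Q = {
    (0, 0): -1, (1, 1): -1, (2, 2): -1, (3, 3): -1,  # reward: car takes a route
    (0, 1): 4, (2, 3): 4,  # penalty: one car assigned two routes at once
    (0, 2): 2, (1, 3): 2,  # penalty: two cars congesting the same route
}
assignment, energy = solve_qubo(Q, 4)
# Optimum: each car on a different route, avoiding congestion (energy -2).
```

The congestion penalties make the lowest-energy assignment spread the cars across routes, which is the traffic-bottleneck objective in miniature.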

The automobile industry is transforming itself by constantly scaling up and catering to the comfort and convenience required not only in high-end vehicles but also in the medium and low-end segments. This ecosystem will play a crucial role in the smart-city revolution that will run parallel to advancements in vehicle ecosystems.

About the Author

Naresh Neelakantan is currently a Senior Architect for the Automotive Portfolio at Sasken Technologies Limited. With 15+ years of experience working with various automotive companies, Naresh is enabling Sasken to navigate the next revolution in automotive technologies. He is currently helping Sasken make strategic decisions in the areas of AUTOSAR and Autonomous Driving, while aligning the automotive ECU-to-DCU market with Sasken's Chip-to-Cognition vision. He is an expert in research and technology management of embedded software architectures combining analog sensors with digital technologies, and has successfully delivered more than 25 products (including microcontrollers, ECUs, BSPs, design models, architectures, and applications). These have been deployed in over 1 million on-road vehicles around the globe, and he has participated in more than 20 Model Year programs.

About Sasken

Sasken is a specialist in Product Engineering and Digital Transformation providing concept-to-market, chip-to-cognition R&D services to global leaders in Semiconductor, Automotive, Industrials, Smart Devices & Wearables, Enterprise Grade Devices, SatCom, and Transportation industries. For over 30 years and with multiple patents, Sasken has transformed the businesses of 100+ Fortune 500 companies, powering more than a billion devices through its services and IP.

The CEO Magazine India