The Path to a Driverless Future – How Helm.ai is Building the World’s Safest, Most Scalable Autonomous Technologies

Helm.ai

July 23, 2024 – Level 4 (L4) fully autonomous driving is achievable, but the best way to get there is to start with assisted driving. That’s the belief of Vlad Voroninski, CEO and Co-Founder of Helm.ai, a company focused on the software side of autonomous driving. It is building full-stack AI software for OEMs’ high-end advanced driver assistance systems (ADAS), also known as L2+.

L4 is defined as fully automated driving, where the system takes over the driving completely. However, the vehicle’s autonomy is still restricted to certain conditions, such as highways and parking.

Helm.ai provides a scalable approach to developing AI software for autonomy that unifies L2+ and L4.

“We leverage large-scale unsupervised learning, a method we call Deep Teaching, which we have been developing internally since 2016. We began with perception, but nowadays, we are expanding our capabilities to include intent prediction, path planning, and generative simulation,” explains Voroninski.

The company differentiates itself from other autonomous vehicle (AV) technology developers by being a software-only provider, licensing its technology to customers.

“Technologically, we have a lot of unique IP that lends itself to novel product capabilities that aren’t possible with other approaches, particularly when it comes to scalability, cost-effectiveness, pace of development, and addressing rare corner cases.”

Its AI technology is trained unsupervised on real driving data at scale, learning to exhibit human-like driving behaviors and handle unexpected situations. In effect, it’s teaching itself.

“We’re particularly excited about the ability of our models to learn directly from driving data, leading to AI systems with an inherent understanding of the world in which they operate and natural human-like driving behaviors, which are impossible to hand-craft at scale,” he adds.
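As a rough illustration of this kind of self-supervised learning, the sketch below trains a toy model to predict the next frame of driving video, so the recorded footage itself serves as the supervision signal. It is a hypothetical PyTorch example, not Helm.ai’s proprietary Deep Teaching method, whose details are unpublished.

# Hypothetical sketch of unsupervised learning from driving video via
# next-frame prediction. NOT Helm.ai's Deep Teaching method; only an
# illustration of a model "teaching itself" from unlabeled recordings.
import torch
import torch.nn as nn

class NextFramePredictor(nn.Module):
    """Toy convolutional model: given one frame, predict the next one."""
    def __init__(self, channels: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, kernel_size=3, padding=1),
        )

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        return self.net(frame)

def train_step(model, optimizer, frame_t, frame_t1):
    """One self-supervised step: the 'label' is just the next recorded frame."""
    optimizer.zero_grad()
    prediction = model(frame_t)
    loss = nn.functional.mse_loss(prediction, frame_t1)
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage with random stand-in tensors (real training would stream recorded drives).
model = NextFramePredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
frame_t = torch.rand(4, 3, 64, 64)    # batch of frames at time t
frame_t1 = torch.rand(4, 3, 64, 64)   # corresponding frames at time t+1
print(train_step(model, optimizer, frame_t, frame_t1))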

Utilizing Generative AI

Founded in 2016, the California-based startup is financially backed by Goodyear, Honda, and the South Korean Tier 1 supplier Mando Corporation. It closed a $55 million Series C financing round in August 2023.

“The capital we’ve raised allows us to advance our AI-first autonomous driving software, including our foundation models for intent and path prediction, as well as our generative simulation technology. We’ve ramped up hiring in research and engineering and have invested heavily in infrastructure (compute) to accelerate model training,” says Voroninski.

As well as Honda, Helm.ai works with a number of global automakers and Tier 1 suppliers across different markets.

“Our AI models are naturally robust to a wide range of corner cases and geographies, allowing deployment in any market. Additionally, OEMs can generate any number of training and validation scenes using our AI-based simulation technology to improve the models for weather and illumination, geographic locations, and rare corner cases.”

Meanwhile, the company continues to move forward. It has just launched a generative AI model, VidGen-1, which produces video sequences of driving scenes for autonomous driving development and validation.

The model simulates realistic video footage of various scenarios across multiple cities internationally, encompassing urban and suburban environments, a variety of vehicles, pedestrians, bicyclists, intersections, turns, and weather conditions.

VidGen-1 offers automakers significant scalability advantages compared to traditional non-AI simulations.

“Generating and predicting video sequences of a driving scene represents the most advanced form of prediction, as it includes both intent prediction and path planning. This capability is crucial for autonomous driving because, fundamentally, driving is about predicting what will happen next,” states Voroninski.
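To illustrate how such generated footage might slot into a development workflow, the hypothetical sketch below asks a stand-in video generator for scenes covering specific cities, weather, and corner cases, then replays them through a driving stack to flag failures. The VideoGenerator interface and SceneSpec fields are invented for this example; they are not VidGen-1’s actual API, which has not been published.

# Hypothetical validation loop driven by a generative simulation model.
# The VideoGenerator class is a stand-in, not VidGen-1's real interface.
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class SceneSpec:
    """Conditions an OEM might want covered in validation."""
    city: str
    weather: str          # e.g. "rain", "fog", "clear"
    time_of_day: str      # e.g. "night", "dusk"
    corner_case: str      # e.g. "jaywalking pedestrian"

@dataclass
class Frame:
    pixels: bytes         # placeholder for image data

class VideoGenerator:
    """Stand-in for a generative video model (assumed interface)."""
    def generate(self, spec: SceneSpec, num_frames: int) -> List[Frame]:
        return [Frame(pixels=b"") for _ in range(num_frames)]

def validate_stack(drive_stack, generator: VideoGenerator,
                   specs: Iterable[SceneSpec]) -> dict:
    """Run the driving stack on each generated scene and record pass/fail."""
    results = {}
    for spec in specs:
        frames = generator.generate(spec, num_frames=300)
        results[spec.corner_case] = all(drive_stack(f) for f in frames)
    return results

# Usage: a trivial stand-in "stack" that accepts every frame.
specs = [SceneSpec("Tokyo", "rain", "night", "jaywalking pedestrian"),
         SceneSpec("Berlin", "fog", "dusk", "stalled vehicle")]
print(validate_stack(lambda frame: True, VideoGenerator(), specs))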

Seeking Safety and Reliability

Looking to the future, Voroninski predicts we will continue to see more ADAS over the coming years, with scalable L4 deployments following within a number of years after that.

“There will be an intermediate regime where an L2 system is L4-capable but still deployed as an L2 system from a product perspective. This means that activating the ADAS in a car can be even safer and more reliable than human driving. However, the driver will still be liable for what happens, so it won’t let you sleep in the back seat, but it will offer tremendous safety and convenience advantages,” he says.

“Eventually, when an automaker decides to enable L4 capability, it will be like having a private AI driver that’s much cheaper and more reliable than a human operator.”

He also predicts that innovations in the self-driving space will intersect with advancements in robotics.

“There are markets where L4 already exists, like mining, but it will go through its own renaissance period. I am particularly optimistic about humanoid robotics, which represents an entirely new paradigm. From an AI perspective, the approach we take to L2 through L4 naturally extends to robotics. In a few years, we will see an explosion of such applications,” concludes Voroninski.
