Argo AI teamed up with advocacy group the League of American Bicyclists (LAB) to come up with guidelines for how self-driving vehicles should identify and interact with cyclists. The goal is to set a standard for other AV companies to follow, particularly as the self-driving industry moves from testing toward commercialization and autonomous vehicles become more commonplace in the coming years.
The World Health Organization estimates that 41,000 cyclists are killed in road traffic-related incidents every year. While self-driving vehicles are expected to reduce collisions significantly, much of that anticipated safety depends on how the systems are built and trained from the start. Self-driving cars learn from massive databases that categorize and identify objects and situations they might encounter, and Argo's guidelines emphasize training models in a way that specifically accounts for cyclists, cycling infrastructure and cycling laws.
“The creation of these guidelines is part of Argo’s dedication to building trust with community members and developing a self-driving system that provides a level of comfort to cyclists, by behaving consistently and safely,” Peter Rander, president and co-founder of Argo AI, said in a statement. “We encourage other autonomous vehicle developers to adopt them as well to further build trust among vulnerable road users.”
Argo, which currently operates self-driving test vehicles throughout the U.S. and parts of Germany, said it collaborated with LAB’s community to hear about common cyclist behaviors and interactions with vehicles. Together, Argo and LAB came up with six technical guidelines for self-driving systems to detect cyclists, predict cyclist behavior and drive consistently.
Cyclists should be a distinct object class
Treating cyclists as a distinct class and labeling them as such gives a self-driving system a dedicated, diverse set of bicycle imagery to learn from. Systems should be trained on images of cyclists from a variety of positions, orientations, viewpoints and speeds. Argo said this will also help the system account for the different shapes and sizes of bikes and riders.
“Due to the unique behaviors of cyclists that distinguish them from scooter users or pedestrians, a self-driving system (or ‘SDS’) should designate cyclists as a core object representation within its perception system in order to detect cyclists accurately,” according to a statement from Argo.
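As a rough illustration of what a "core object representation" could look like in practice, here is a minimal sketch of a perception label taxonomy that treats cyclists as their own class rather than folding them into pedestrians or scooter riders. The class names and fields are hypothetical assumptions for illustration, not Argo's actual schema.

```python
from dataclasses import dataclass
from enum import Enum, auto


class ObjectClass(Enum):
    """Hypothetical perception taxonomy: cyclists get their own class,
    separate from pedestrians and scooter riders."""
    VEHICLE = auto()
    PEDESTRIAN = auto()
    SCOOTER_RIDER = auto()
    CYCLIST = auto()          # distinct class, not a pedestrian subtype
    UNKNOWN = auto()


@dataclass
class LabeledDetection:
    """One labeled training example for the perception model."""
    object_class: ObjectClass
    bbox: tuple[float, float, float, float]  # x, y, width, height in image coords
    orientation_deg: float                   # rider/bike heading relative to the camera
    speed_mps: float                         # captures the wide range of cyclist speeds


# Example: a cyclist seen side-on, rolling at roughly 5 m/s
example = LabeledDetection(
    object_class=ObjectClass.CYCLIST,
    bbox=(412.0, 230.0, 96.0, 140.0),
    orientation_deg=90.0,
    speed_mps=5.0,
)
```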
Typical cyclist behavior should be expected
Cyclists can be pretty unpredictable. They might lane split, walk their bikes, make quick, jerky movements to avoid obstacles in the road, yield rather than stop at stop signs, or hop off the sidewalk and into the street. A good self-driving system should not only be able to predict their intentions but also be prepared to react accordingly.
“A SDS should utilize specialized, cyclist-specific motion forecasting models that account for a variety of cyclist behaviors, so when the self-driving vehicle encounters a cyclist, it generates multiple possible trajectories capturing the potential options of a cyclist’s path, thus enabling the SDS to better predict and respond to the cyclist’s actions.”
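To make the "multiple possible trajectories" idea concrete, here is a simplified sketch of a multi-hypothesis forecast: for one detected cyclist, the planner receives several candidate paths with probabilities rather than a single prediction. The maneuver set, constant-velocity motion model and hand-set probabilities are illustrative assumptions, not Argo's forecasting approach.

```python
import math
from dataclasses import dataclass


@dataclass
class CyclistState:
    x: float        # position, meters
    y: float
    heading: float  # radians
    speed: float    # m/s


@dataclass
class TrajectoryHypothesis:
    points: list[tuple[float, float]]  # future (x, y) positions
    probability: float                 # likelihood assigned to this maneuver


def forecast_cyclist(state: CyclistState, horizon_s: float = 3.0, dt: float = 0.5):
    """Return several candidate trajectories for one cyclist.

    Illustrative maneuvers: keep going straight, swerve left (e.g. around a
    parked car), or slow to a stop. A real forecaster would be learned from
    data; the probabilities here are placeholders.
    """
    maneuvers = [
        ("straight", 0.60, 0.0, 0.0),     # (name, prob, yaw_rate rad/s, accel m/s^2)
        ("swerve_left", 0.25, 0.4, 0.0),
        ("slow_down", 0.15, 0.0, -1.5),
    ]
    hypotheses = []
    for _, prob, yaw_rate, accel in maneuvers:
        x, y, heading, speed = state.x, state.y, state.heading, state.speed
        points = []
        steps = int(horizon_s / dt)
        for _ in range(steps):
            speed = max(0.0, speed + accel * dt)
            heading += yaw_rate * dt
            x += speed * math.cos(heading) * dt
            y += speed * math.sin(heading) * dt
            points.append((x, y))
        hypotheses.append(TrajectoryHypothesis(points=points, probability=prob))
    return hypotheses
```

The planner can then check every hypothesis for conflicts, rather than betting on the single most likely path.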
Map cycling infrastructure and local laws
Self-driving systems often rely on high-definition 3D maps to understand their surrounding environment. Part of that environment should be cycling infrastructure and local and state cycling laws, Argo said. This will help the self-driving system to anticipate cyclists’ movements – like merging into traffic to avoid parked cars blocking the bike lane or running red lights if there’s no traffic – and keep a safe distance from the bike lane.
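A hedged sketch of how cycling infrastructure and local rules might be attached to an HD map: each lane segment carries attributes the planner can query, such as whether it is a bike lane, whether adjacent parking is likely to push cyclists out of it, and what passing margin local law requires. The field names and the stop-as-yield example are assumptions for illustration, not Argo's map schema.

```python
from dataclasses import dataclass, field


@dataclass
class LocalCyclingLaws:
    """Jurisdiction-level rules the planner should respect (illustrative)."""
    min_passing_margin_m: float = 1.0    # e.g. a local "3-foot" passing law
    stop_as_yield_allowed: bool = False  # whether cyclists may treat stops as yields


@dataclass
class LaneSegment:
    segment_id: str
    is_bike_lane: bool = False
    adjacent_parking: bool = False       # parked cars may force cyclists to merge out
    laws: LocalCyclingLaws = field(default_factory=LocalCyclingLaws)


def expected_merge_out(segment: LaneSegment) -> bool:
    """Flag segments where cyclists are likely to leave the bike lane,
    e.g. a bike lane beside street parking that is often blocked."""
    return segment.is_bike_lane and segment.adjacent_parking


# Example: a bike lane next to street parking in a city with a 1.5 m passing law
segment = LaneSegment(
    segment_id="elm_st_0042",
    is_bike_lane=True,
    adjacent_parking=True,
    laws=LocalCyclingLaws(min_passing_margin_m=1.5, stop_as_yield_allowed=True),
)
assert expected_merge_out(segment)
```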
The system should act in a consistent, understandable and extra safe manner around cyclists
Self-driving technology should operate in a way that feels natural, so that cyclists can clearly understand the AV's intentions. That includes behaviors like using turn signals and adjusting the vehicle's position within its lane when preparing to pass, merge or turn.
In addition, if driving near cyclists, the system should “target conservative and appropriate speeds in accordance with local speed limits, and margins that are equal to or greater than local laws, and only pass a cyclist when it can maintain those margins and speeds for the entire maneuver,” Argo said.
The self-driving system should also give cyclists a wide berth so that, if a rider falls, the vehicle has room to swerve or stop.
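One way to read the "only pass when margins can be held for the entire maneuver" rule is as a gate in the planner: before committing to a pass, check every step of the candidate trajectory against the required lateral margin and a conservative speed cap. The thresholds and function below are a simplified, hypothetical check, not Argo's planner logic.

```python
def can_pass_cyclist(planned_gaps_m, planned_speeds_mps,
                     min_margin_m=1.5, max_pass_speed_mps=11.0):
    """Approve a passing maneuver only if every step of the planned trajectory
    keeps at least `min_margin_m` of lateral clearance to the cyclist and stays
    at or below a conservative passing speed.

    planned_gaps_m:     lateral distance to the cyclist at each future step
    planned_speeds_mps: planned vehicle speed at each future step
    Thresholds are illustrative placeholders, not regulatory values.
    """
    margins_ok = all(gap >= min_margin_m for gap in planned_gaps_m)
    speeds_ok = all(speed <= max_pass_speed_mps for speed in planned_speeds_mps)
    return margins_ok and speeds_ok


# Example: abort the pass because clearance dips below the margin mid-maneuver
print(can_pass_cyclist(planned_gaps_m=[2.0, 1.6, 1.2, 1.8],
                       planned_speeds_mps=[9.0, 9.5, 9.5, 9.0]))  # False
```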
Prepare for uncertain situations and proactively slow down
Self-driving systems should account for uncertainty in a cyclist’s intent, direction and speed, Argo said. The company gave the example of a cyclist traveling in the opposite direction of the vehicle, but in the same lane, suggesting that the vehicle be trained to slow down in that circumstance.
In fact, in most uncertain circumstances, the self-driving system should lower the vehicle's speed and, when possible, leave more space between the vehicle and the cyclist. Slowing down when the system is uncertain is already fairly standard practice among AV developers, even if it's not always targeted specifically at cyclists.
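A minimal sketch of the "slow down under uncertainty" idea: scale the vehicle's speed cap down as the spread of the cyclist forecast grows. The uncertainty measure and all constants here are assumptions for illustration only.

```python
def speed_cap_near_cyclist(base_cap_mps: float,
                           forecast_spread_m: float,
                           max_spread_m: float = 3.0,
                           min_cap_mps: float = 4.0) -> float:
    """Reduce the speed cap in proportion to how spread out the cyclist's
    predicted trajectories are (a simple stand-in for forecast uncertainty).

    forecast_spread_m: e.g. the maximum lateral distance between trajectory
    hypotheses at the planning horizon. All constants are placeholders.
    """
    spread = min(forecast_spread_m, max_spread_m)
    scale = 1.0 - spread / max_spread_m  # 1.0 = confident, 0.0 = very uncertain
    return max(min_cap_mps, min_cap_mps + (base_cap_mps - min_cap_mps) * scale)


# Example: a wrong-way cyclist in the vehicle's lane with a wide forecast spread
print(speed_cap_near_cyclist(base_cap_mps=13.0, forecast_spread_m=2.5))  # 5.5 m/s
```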
Continue to test cycling scenarios
The best way to make the safety case for AVs is to keep testing them. Argo and LAB suggest developers of self-driving tech should continue both virtual and physical testing that’s specifically dedicated to cyclists.
“A virtual testing program should be made up of three main test methodologies: simulation, resimulation, and playforward to test an exhaustive permutation of autonomous vehicle and cyclist interactions on a daily basis,” said the company. “These scenarios should capture both varying vehicle and cyclist behavior as well as changes in social context, road structure, and visibility.”
Physical testing, which is usually done first on closed courses and then on public roads, allows developers to validate simulation results and ensure the technology behaves the same in the real world as it did in the virtual one. Argo says developers should test AVs on likely scenarios as well as "edge cases," or rare situations. Testing on public roads across many cities gives the system a diverse set of urban environments to learn from and can surface both common and rare cases.
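To illustrate the "exhaustive permutation" part of virtual testing, here is a small sketch that enumerates combinations of cyclist behavior, road structure and visibility into scenario descriptions a simulator could run. The scenario dimensions and names are hypothetical, not Argo's test catalog or its simulation, resimulation and playforward tooling.

```python
from itertools import product

# Hypothetical scenario dimensions; a real catalog would be far richer.
CYCLIST_BEHAVIORS = ["ride_in_bike_lane", "merge_around_parked_car",
                     "roll_through_stop", "wrong_way_in_lane"]
ROAD_STRUCTURES = ["protected_bike_lane", "painted_bike_lane", "shared_lane"]
VISIBILITY = ["day_clear", "night", "rain"]


def generate_cyclist_scenarios():
    """Yield one scenario description per combination of the dimensions above."""
    for behavior, road, visibility in product(CYCLIST_BEHAVIORS, ROAD_STRUCTURES, VISIBILITY):
        yield {"cyclist_behavior": behavior,
               "road_structure": road,
               "visibility": visibility}


scenarios = list(generate_cyclist_scenarios())
print(len(scenarios))  # 4 * 3 * 3 = 36 scenarios to simulate
```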
Chasing public acceptance … and safety, of course
Social acceptance is one of the key hurdles to bringing more AVs to the roads, and many people are not yet convinced of the safety of autonomous vehicles. In fact, nearly half of those polled by market research firm Morning Consult said AVs are either somewhat less safe or much less safe than cars driven by humans.
Making a vehicle safe for all road users is only half the battle. Companies like Argo AI also have to convince people that their vehicles are safe, and standardizing safety practices across the industry might be one way to do that.