Driver-assist systems that can keep pace with the traffic flow and keep a car centered in its lane are becoming more commonplace, says Jake Fisher, director of auto testing at Consumer Reports.

"As these systems become more capable, they're actually becoming more dangerous," he says. "Once a driver and vehicle make several trips using the features without incident, it's human nature for them to stop watching the road and other cars as intently as they should."

Lessons Learned
  • Drivers must pay attention: People using advanced driver-assist systems like Autopilot must always pay attention to the road. The National Transportation Safety Board report released this week about a fatal Model X crash in California in March shows that Autopilot can't be relied upon to stop, turn or accelerate when appropriate because of the limitations of its programming. Despite its name, Autopilot operates only as a suite of driver-assist features.
  • It's not just Tesla: Cadillac, Infiniti, Mercedes-Benz, Nissan, and Volvo offer systems similar to Autopilot, under various names. These systems, such as Volvo's Pilot Assist, can maintain a vehicle's place in the flow of traffic and keep it within its lane lines, and that could lull drivers into complacency. Autopilot isn't the only system with these limitations, and all of them should be used only with the driver's full attention on the road. Only Cadillac's Super Cruise has a driver-facing camera that will issue warnings if the driver stops looking at the road.
  • Pedestrian detection still needs work: This important technology is still in its nascent phase, as evidenced by the Arizona crash in which a self-driving Uber test vehicle killed a woman pushing her bike across the road. Uber's software reportedly identified the woman first as an object, then as a vehicle, and finally as a bicycle. Even though the modified Volvo SUV's systems identified an object ahead, they did not alert the human test driver to the situation, and they didn't stop the vehicle on their own.
  • Automatic Emergency Braking (AEB) has limits: Though effective in important ways, this feature can't save drivers in every situation. AEB typically won't keep a car from crashing at high speeds; it works to slow the vehicle and lessen the force of impact. That's still a potentially life-saving difference, but it's not a magic bullet for avoiding a collision. Crashes every day, both minor and serious, show that drivers can put too much faith in AEB. (A rough worked example follows this list.)
  • Sudden changes can put drivers at risk: Several Tesla crashes follow a common scenario. A Tesla operating on cruise control is following another vehicle. The lead vehicle suddenly leaves the lane to avoid something ahead that's stationary or moving slowly. The driver-assist systems don't have time to react to the object suddenly in their path, such as a stopped fire truck, and there's a collision.
  • Adaptive cruise control will do what drivers ask of it: Cars using this driver-assist system often accelerate back to the driver's preset speed when a slower lead vehicle veers out of the way, even if there's an object ahead, unless and until the system detects that object (see the control-loop sketch after this list). The NTSB reported this week that the Tesla Model X in the fatal March crash in California accelerated before it struck the road barrier, killing the driver.
  • There may be a test car on the road with you: The Uber crash in Arizona underscores how few standards there are for the testing of self-driving cars, and how states and the federal government are currently giving companies license to determine whether their technology is safe enough to test on public roads.
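
Since AEB's benefit at highway speed comes from shaving velocity rather than stopping the car, a quick back-of-the-envelope calculation makes the point concrete. This is a minimal Python sketch; the deceleration figure and one-second braking window are illustrative assumptions, not measured values from any vehicle.

# Why AEB reduces harm even when it can't prevent the crash: a short window
# of hard braking cuts the impact speed, and crash energy scales with the
# square of speed. The 7.8 m/s^2 deceleration and 1 s window are assumptions.

def impact_speed(initial_speed, braking_time, decel=7.8):
    """Speed (m/s) remaining at impact after braking at `decel` m/s^2."""
    return max(0.0, initial_speed - decel * braking_time)

v0 = 31.0                               # ~70 mph, in m/s
v = impact_speed(v0, braking_time=1.0)  # one second of hard braking
energy_left = (v / v0) ** 2             # kinetic energy goes as v^2

print(f"impact speed: {v:.1f} m/s (down from {v0:.1f} m/s)")
print(f"crash energy remaining: {energy_left:.0%}")
# Prints 23.2 m/s and about 56% of the original crash energy: a meaningful
# reduction, but nowhere near a full stop from highway speed.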
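
To see why "doing what drivers ask" can steer a car into a stopped object, here is a minimal sketch of the resume-to-set-speed logic common to adaptive cruise control. It is an invented illustration, not any manufacturer's code; the function name, the speeds, and the assumption that stationary radar returns get filtered out of the target list are all illustrative.

# Minimal adaptive-cruise-control sketch (illustrative only). Radar-based
# trackers often discard stationary returns to avoid false positives, so a
# stopped fire truck can register as "no target ahead".

SET_SPEED = 29.0  # driver's preset speed in m/s (about 65 mph)

def acc_target_speed(lead_vehicle, set_speed=SET_SPEED):
    """Return the speed the controller aims for this cycle.

    lead_vehicle is None when the sensors report no trackable target."""
    if lead_vehicle is None:
        # Lane looks clear: resume the driver's preset speed.
        return set_speed
    # Target present: don't exceed the lead vehicle's speed
    # (a real controller would also manage the following gap).
    return min(set_speed, lead_vehicle["speed"])

# Following a slower lead vehicle:
print(acc_target_speed({"speed": 22.0}))  # 22.0: matching the lead
# The lead swerves away from a stopped truck; the stationary truck is
# filtered out, so the controller sees an empty lane and speeds up:
print(acc_target_speed(None))             # 29.0: accelerating at the obstacle
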
Limits of Technology
Many automakers working to create effective self-driving car technology are banking on the redundancy of multiple types of sensors (cameras, radar, and laser-based lidar) to weed out false positives (indications of obstacles that aren't there) and arrive at correct decisions as these cars move about on public roads.
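
As a concrete illustration of how redundancy weeds out false positives, here is a minimal 2-of-3 voting sketch in Python. The sensor names and the majority threshold are illustrative assumptions; production fusion stacks use probabilistic tracking and per-sensor confidence, but the voting intuition is the same.

# Redundancy-based sensor fusion, reduced to a majority vote: an obstacle
# is accepted only when at least two independent sensors agree.

def fused_obstacle(camera_hit, radar_hit, lidar_hit):
    """True when at least two of the three sensors report an obstacle."""
    votes = sum([camera_hit, radar_hit, lidar_hit])
    return votes >= 2

# A radar ghost return (e.g., an overhead sign) with no camera or lidar
# confirmation is outvoted:
print(fused_obstacle(camera_hit=False, radar_hit=True, lidar_hit=False))  # False
# A stopped truck seen by camera and lidar is accepted even if the radar
# tracker drops it as stationary clutter:
print(fused_obstacle(camera_hit=True, radar_hit=False, lidar_hit=True))   # True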

Tesla is in the minority among these companies, betting that cameras and radar will be enough. Tesla CEO Elon Musk has been outspoken about this, arguing that lidar (a laser-based sensor that can create detailed 3-D maps of roads) is expensive and that its effectiveness is overrated. That's one reason he says Tesla vehicles sold today have the hardware needed for future full autonomy, with the main remaining obstacles being software refinement and regulatory approval.

The crashes involving Tesla vehicles striking stationary objects (the crash attenuator in California, a stopped fire truck in Utah in May, and another fire truck in California in January) show the limitations of relying on just cameras and radar, says Raj Rajkumar, director of the Connected and Autonomous Driving Collaborative Research Lab at Carnegie Mellon University in Pittsburgh. Camera-based systems have to be trained to recognize specific images, and if they encounter something in the real world that doesn't match their expectations, the radar has to pick it up, Rajkumar said.

Tesla's system missed the fire trucks, and an incident was also reported in China in which a Tesla crashed into a stopped garbage truck. The company's technology appears to work well with moving objects but not with stationary ones, Rajkumar said.
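
Rajkumar's point about cameras and radar having to cover for each other can be made concrete with a toy sketch. Everything here (the function names, the class list, the speed threshold) is a hypothetical simplification of how such a gap can open, not a description of Tesla's actual pipeline.

# The two-sensor gap: radar trackers commonly suppress returns that aren't
# moving relative to the road, and a camera classifier fires only on object
# types it was trained on. A stationary, unfamiliar object can miss both.

def radar_detects(obj):
    # Stationary returns are often discarded as clutter, because overhead
    # signs and overpasses would otherwise cause constant false alarms.
    return obj["speed_ms"] > 0.5

def camera_detects(obj, trained_classes=("car", "truck", "pedestrian", "cyclist")):
    # A toy stand-in for a vision model's fixed set of recognizable classes.
    return obj["class"] in trained_classes

crash_attenuator = {"class": "crash_attenuator", "speed_ms": 0.0}
print(radar_detects(crash_attenuator) or camera_detects(crash_attenuator))
# False: stationary AND outside the camera's training, so neither sensor
# flags it, and with only two sensors there is no third vote to break the tie.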

"Consumers need to be extremely cautious about the claims being made," Rajkumar said. "There's a lot of hype."

The entire article is at https://www.consumerreports.org/cars-what-weve-learned-from-tesla-and-self-driving-crashes/