
Registered · 2,907 Posts · Discussion Starter · #1
Tempe police released photographs from the pedestrian death involving an Uber self-driving car. A 49-year-old woman was hit and killed by a self-driving Volvo operated by Uber while crossing a street in Tempe.

[Tempe police photos: the damaged Uber Volvo and Herzberg's bicycle]
 

Registered · 2,907 Posts · Discussion Starter · #5
SAN FRANCISCO (Reuters) - Police in Tempe, Arizona said evidence showed the "safety" driver behind the wheel of a self-driving Uber was distracted and streaming a television show on her phone right up until about the time of a fatal accident in March, deeming the crash that rocked the nascent industry "entirely avoidable."

A 318-page report from the Tempe Police Department, released late on Thursday in response to a public records request, said the driver, Rafaela Vasquez, repeatedly looked down and not at the road, glancing up just a half second before the car hit 49-year-old Elaine Herzberg, who was crossing the street at night.

According to the report, Vasquez could face charges of vehicle manslaughter. Police said that, based on testing, the crash was "deemed entirely avoidable" if Vasquez had been paying attention.

Police obtained records from Hulu, an online service for streaming television shows and movies, which showed Vasquez's account was playing the television talent show "The Voice" the night of the crash for about 42 minutes, ending at 9:59 p.m., which "coincides with the approximate time of the collision," the report says.

It is not clear if Vasquez will be charged, and police submitted their findings to county prosecutors, who will make the determination. The Maricopa County Attorney's Office referred the case to the Yavapai County Attorney's office because of a conflict and that office could not be reached late Thursday.

Vasquez could not immediately be reached for comment and Reuters could not locate her attorney.

The Uber car was in autonomous mode at the time of the crash, but Uber, like other self-driving car developers, requires a back-up driver in the car to intervene when the autonomous system fails or a tricky driving situation occurs.

Vasquez looked up just 0.5 seconds before the crash, after keeping her head down for 5.3 seconds, the Tempe Police report said. Uber's self-driving Volvo SUV was traveling at just under 44 miles-per-hour.
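For scale, a quick back-of-the-envelope check of those reported figures (simple kinematics only; the speed and the 5.3 s / 0.5 s intervals are the approximations above, and the calculation is illustrative, not from the police report):

```python
# Back-of-the-envelope distances implied by the reported figures:
# just under 44 mph, 5.3 s looking down, 0.5 s looking up before impact.
MPH_TO_MS = 0.44704  # miles per hour -> meters per second

speed_ms = 44 * MPH_TO_MS          # ~19.7 m/s
head_down_m = speed_ms * 5.3       # road covered while her head was down
look_up_m = speed_ms * 0.5         # road covered after she looked up

print(f"head-down interval: ~{head_down_m:.0f} m of road")   # ~104 m
print(f"final half second:  ~{look_up_m:.0f} m of road")     # ~10 m
```

In other words, a half-second glance up leaves roughly a car-length and a half of road, far less than any human reaction requires at that speed.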

Uber declined to comment.

Last month, an Uber spokeswoman said the company was undergoing a "top-to-bottom safety review," and had brought on a former federal transportation official to help improve the company's safety culture. The company prohibits the use of any mobile device by safety drivers while the self-driving cars are on a public road, and drivers are told they can be fired for violating this rule.

Police said a review of video from inside the car showed Vasquez was looking down during the trip, and her face "appears to react and show a smirk or laugh at various points during the times that she is looking down." The report found that Vasquez "was distracted and looking down" for close to seven of the nearly 22 minutes prior to the collision.

Tempe Police Detective Michael McCormick asked Hulu for help in the investigation, writing in a May 10 email to the company that "this is a very serious case where the charges of vehicle manslaughter may be charged, so correctly interpreting the information provided to us is crucial." Hulu turned over the records on May 31.

According to a report last month by the National Transportation Safety Board, which is also investigating the crash, Vasquez told federal investigators she had been monitoring the self-driving interface in the car and that neither her personal nor business phones were in use until after the crash. That report showed Uber had disabled the emergency braking system in the Volvo, and Vasquez began braking less than a second after hitting Herzberg.

Herzberg, who was homeless, was walking her bicycle across the street, outside of a crosswalk on a four-lane road, the night of March 18 when she was struck by the front right side of the Volvo.

The police report faulted Herzberg for "unlawfully crossing the road at a location other than a marked crosswalk."

In addition to the report, police released on Thursday a slew of audio files of 911 calls made by Vasquez, who waited at the scene for police, and bystanders the night of the crash; photographs of Herzberg's damaged bicycle and the Uber car; and videos from police officers' body cameras that capture the minutes after the crash, including harrowing screams in the background.

The crash dealt Uber a major setback in its efforts to develop self-driving cars, and the company shuttered its autonomous car testing program in Arizona after the incident. It says it plans to begin testing elsewhere this summer, although in some cities it will have to first win over increasingly wary regulators.

https://www.reuters.com/article/us-...driving-car-crash-police-report-idUSKBN1JI0LB
 

Premium Member · 4,773 Posts
I doubt Uber told the "safety driver" it is okay to sit in the driver's seat and watch Hulu all night.

When Arizona allowed the testing here, the car wasn't the one with a driver's license. The person in the left front seat was. In my view, the status of "autonomous mode" is the same as cruise control. The driver has a personal responsibility to take over as necessary, just as he would in any car with only cruise engaged.

About testing on a closed track... there is only so much that can be tested at a proving ground. Eventually they have to test out on public roads, with a driver ready to take over at any moment. There are just too many real-world driving conditions that can't be recreated in a controlled environment.

We all like to criticize Uber. Heck, I'm sure I'm not the only one here who doesn't mind the setback to self-driving cars. I would encourage us to look at the facts objectively, and not just jump to bagging on Uber by default.

Like many tragedies, there were several failures that led to it. Avoiding any one of them could have prevented this. The pedestrian failed to yield when crossing the road. The autostop failed because it was disabled. Why was it disabled? Well, that IS a good question, and it needs to be answered! But the final factor, the most important one, is that there was a driver in that car who was supposed to take over in any hazard. Uber was paying someone to fill that role, and she failed. Failed because she was watching Hulu.
 

Premium Member · 2,371 Posts
According to the report, Vasquez could face charges of vehicle manslaughter. Police said that, based on testing, the crash was "deemed entirely avoidable" if Vasquez had been paying attention.
People need to understand those MONITORS are not there to actively drive the cars, but to intervene in case the so-called SELF-driving system FAILS. Doing her job for Uber, of course Rafaela failed, because the software failed and killed a woman; but as a driver, you need to remember: the monitor is NOT an active driver. It is Uber and all the other culprits who are not clarifying this for everybody to understand. If that person is hired to drive, there is NO self-driving testing, only driving.

The Uber car was in autonomous mode at the time of the crash, but Uber, like other self-driving car developers, requires a back-up driver in the car to intervene when the autonomous system fails or a tricky driving situation occurs.
There is no "tricky situation". It is ONLY the system FAILING to navigate the roads as the developers promised it would.

I doubt Uber told the "safety driver" it is okay to sit in the driver's seat and watch Hulu all night.
Uber told that MONITOR to intervene and COVER UP any system failures during testing, not to actively drive that vehicle.

When Arizona allowed the testing here, the car wasn't the one with a driver's license.
You are completely WRONG.
"The Self-Driving Oversight Committee will advise the Department of Transportation, the Department of Public Safety, universities and other public agencies on how best to advance the testing operation of self-driving vehicles on public roads."
They refer to the system, not to the human inside. The word SELF is simply self-explanatory.

The person in the left front seat was. In my view, the status of "autonomous mode" is the same as cruise control.
Your view is wrong again. When a vehicle is in cruise control, the driver actively keeps the car on the correct path (the road), while in self-driving mode the monitor is there JUST IN CASE.

About testing on a closed track... there is only so much that can be tested at a proving ground. Eventually they have to test out on public roads, with a driver ready to take over at any moment. There are just too many real-world driving conditions that can't be recreated in a controlled environment.
You are wrong again. Testing should not interfere with the general public. When pharmaceutical companies are testing their drugs, they DO NOT sell them to the general public, but get volunteers to take them. During the third phase of testing, the advanced clinical trial, the drug is taken by patients who previously AGREED to be part of the testing.

The autostop failed because it was disabled. Why was it disabled? Well, that IS a good question, and it needs to be answered!
You really show you never paid enough attention to this tragedy to understand the details, because you are wrong again.

Volvo's factory-installed Active Driver Assist was/is NOT part of the self-driving system Uber additionally installed on that SUV. Because these two different systems have different ways of detecting obstacles in the car's path, they could also generate conflicting decisions, and that is the reason the Volvo FACTORY system was disabled.

Self-driving car developers' goal is to ELIMINATE any conflicting readings in their systems, because those could generate errors. You need to learn what a false positive is and why developers want to eliminate all of them from their sensor readings.
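To illustrate the conflict described above: a toy sketch (invented thresholds and function names, not Uber's or Volvo's actual software) of how two independent detection systems watching the same scene can return contradictory braking decisions, where every disagreement is a phantom brake or a missed obstacle for one of them:

```python
# Toy illustration of two independent obstacle detectors disagreeing.
# All thresholds and names are made up for the sketch.

def factory_adas(range_m: float) -> bool:
    """Stand-in for a factory driver-assist rule: brake under 30 m."""
    return range_m < 30.0

def sdc_stack(range_m: float, classified_as_obstacle: bool) -> bool:
    """Stand-in for an SDC stack: brake only for a classified obstacle."""
    return classified_as_obstacle and range_m < 40.0

scenes = [(25.0, False),   # return the classifier rejects: ADAS brakes, SDC doesn't
          (35.0, True),    # classified obstacle at 35 m: SDC brakes, ADAS doesn't
          (50.0, False)]   # clear road: both agree

for range_m, is_obstacle in scenes:
    a, s = factory_adas(range_m), sdc_stack(range_m, is_obstacle)
    if a != s:
        print(f"{range_m} m: conflicting decisions (ADAS brake={a}, SDC brake={s})")
```

Running two such systems at once means arbitrating every disagreement in real time, which is one plausible reading of why only one was left active.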
 

Premium Member · 4,773 Posts
Whoa, whoa, honestly we agree on more points than you realize. Except I think they were testing a system, and you think they were done.
People need to understand those MONITORS are not there to actively drive the cars, but to intervene in case the so-called SELF-driving system FAILS.
So if the SDC being tested is good enough that the (monitor or driver) doesn't have to intervene for days, he's more of a monitor? Is an airplane pilot a monitor while the autopilot is on? Nope, he's still a pilot. Should the autopilot go wonky, it's his responsibility to intervene.
The Uber SDC was under development.
Testing.
Read: it's not ready yet!
I watched these Volvos in person! They literally drive around in circles on the same route between Scottsdale and Tempe, day and night. You don't seem to understand that they were testing them with a driver inside who intervenes as necessary.

There is no "tricky situation". It is ONLY the system FAILING to navigate the roads as the developers promised it would.
"The developers" were not finished developing it! Thats why there was a driver in every one of the tester mules, ready to take over.

Uber told that MONITOR to intervene and COVER UP any system failures during testing, not to actively drive that vehicle.
Cover up? That's not good. To my knowledge they were looking for any shortcomings so they could tweak the software code and make adjustments.

The word TESTING is self-explanatory.
Ducey set up a committee of a few guys who talk to other guys in the DOT and the state police and the universities. What do they talk about? Maybe they meet up at Hooters once a month and talk about how to roll out the red carpet for Uber, since Uber donated a lot of $$ to his election campaign! Nothing there gave SDC software any legal driving license.

Your view is wrong again. When a vehicle is in cruise control, the driver actively keeps the car on the correct path (the road), while in self-driving mode the monitor is there JUST IN CASE.
Legally, the driver is the one responsible for the car. Cruise control is one-dimensional compared to an SDC, but you can buy aftermarket cruise control and install it on a car that doesn't have one. I could try to test it in a parking lot, I suppose, but the only way to know for sure if it is roadworthy is to take it on a public road. I may find out that it needs adjustment. Or I might have to intervene if it gets stuck. Either way, it would ultimately be my responsibility as the driver. I can't just relax and watch Hulu!

You are wrong again. Testing should not interfere with the general public. When pharmaceutical companies are testing their drugs, they DO NOT sell them to the general public, but get volunteers to take them. During the third phase of testing, the advanced clinical trial, the drug is taken by patients who previously AGREED to be part of the testing.
It was rumored Uber took pax in some of these tester mules. I thought that was a liability for them.
When General Motors, VW, Ford, or Toyota design a car from scratch, they first test it at their proving grounds. After a while, there is only so much that can be done at the proving grounds! So then they test on real-world public roads. The Prius, for example, introduced electric steering, electric throttle, and electric brakes all in one car. Two electric motors and one gas engine, constantly doing a dance between three motors. All software. The public didn't "agree" to share the roads with that vehicle, but the drivers testing those vehicles assumed personal responsibility.

You really show you never paid enough attention to this tragedy to understand the details, because you are wrong again.
Wrong! I have been paying attention. I find it interesting.

Volvo's factory-installed Active Driver Assist was/is NOT part of the self-driving system Uber additionally installed on that SUV. Because these two different systems have different ways of detecting obstacles in the car's path, they could also generate conflicting decisions, and that is the reason the Volvo FACTORY system was disabled.
Of course.
 

Premium Member · 1,973 Posts
Funny, they blame the human driver for being inattentive and say she could have prevented the accident... True.

But the car was in autonomous mode and ran the bicyclist down like rodent roadkill.

The car F’ed up.
 

Premium Member · 2,371 Posts
Whoa, whoa, honestly we agree on more points than you realize
Yes, I agree, and those "wrongs" could look and sound harsh, but that was not my intention. Sorry if it felt that way.

The TESTING (the term you focus on) is done for a SELF (the term you seem to ignore) driving car. Because none of those cars are privately owned, the permit is given to the company that owns them. The company hires the MONITORS, who are required to have a driver's license in order to take over/cover up IN CASE of a failure.

A self-driving car, as its name says, drives by itself. Usually, testing is not supposed to be done on public roads, and you mentioned it very well yourself:
When General Motors, VW, Ford, or Toyota design a car from scratch, they first test it at their proving grounds.
NO car manufacturer tests its projects on public roads, because there is NO reason to do that. The only system that deals with active driving variables is the self-driving software. Avoiding pedestrians, stopping at red lights, making right and left turns, or avoiding obstacles doesn't require testing on public roads if the car is designed to have a driver, because drivers are tested every 4 years in order to renew their driver's licenses. On the other hand, only a self-driving car system lacks experience, mileage, and adjustment on public roads, so the SDC developers pushed for testing on public roads, which is TOTALLY unnecessary and highly dangerous for the general public, who never voluntarily agreed to be part of it. Do you agree, or should I just assume you already agreed and ignore your opinion, like the local authorities did by allowing companies to use people as guinea pigs?

Your comparison with an airplane's autopilot is quite exaggerated, because (and I am going to use your words from your previous comment) "there are just too many real world driving conditions" that make driving a lot more complex and difficult than flying.

"The developers" were not finished developing it! Thats why there was a driver in every one of the tester mules, ready to take over.
You are making very dangerous statements, and if you can prove the SDCs on the roads are NOT well-developed, finished products, I will nicely ask you to post your information right here. I WANT TO SEE YOUR SOURCE ON THIS, PLEASE! If you only assume that, it is another story. Your perception could be wrong. That individual sitting in the car is hired to watch the systems, monitor them, take over IF necessary, and report back.

"Vasquez had previously told investigators from the NTSB that she had been "monitoring the self-driving system interface," which is displayed on an iPad mounted on the vehicle's center console, at the time of the crash." so they have an Ipad in the car displaying system parameters in real time.

In May this year, a Mobileye self-driving car went through a red light while carrying a TV crew, and they recorded it. "Nobody was hurt, and Channel 10's video seems to show a Mobileye safety driver monitoring the vehicle, but allowing the car to proceed without trying to stop it."

Also "Uber expects that a driver may sometimes need to take control of the vehicle, but the specific circumstances in which that's the case are somewhat unclear." "The navigation and self-driving tech in Uber's vehicles is also a lot more advanced than Tesla's semi-autonomous Autopilot mode, which isn't meant to completely replace the need for a driver, and is still in beta according to the company. Uber uses LIDAR, a system that creates a 3D map of the areas surrounding the car using lasers, as well as a typical radar system, and cameras to detect objects before collisions."
As you can easily understand, Uber doesn't want to specifically explain what the monitor's duties are, particularly because, in case of an accident, the company wants to get away with it and ignore its clear responsibility for deciding to hire that person (suddenly unfit after a tragedy) and for putting the robots (unfinished, as you say) on the roads, intentionally endangering people's lives.
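Circling back to the sensor mix in that last quote (LIDAR, radar, cameras): here is a toy sketch of how detections from several sensors can be merged into one decision per object. The function name, confidence numbers, and threshold are all invented for illustration; this is one crude fusion rule among many, not Uber's actual stack:

```python
# Toy multi-sensor confirmation: average the per-sensor confidences and
# compare against a threshold. Made-up numbers; illustration only.
from statistics import mean

def confirmed_obstacle(lidar: float, radar: float, camera: float,
                       threshold: float = 0.5) -> bool:
    """Declare an obstacle when the mean sensor confidence clears the
    threshold -- one crude way to suppress single-sensor false positives."""
    return mean([lidar, radar, camera]) >= threshold

# Pedestrian walking a bicycle at night: strong LIDAR, moderate radar,
# weak camera classification.
print(confirmed_obstacle(lidar=0.9, radar=0.6, camera=0.2))  # True  (mean 0.57)
# Windblown plastic bag: only the camera briefly "sees" something.
print(confirmed_obstacle(lidar=0.1, radar=0.0, camera=0.6))  # False (mean 0.23)
```

The tension is exactly the one debated in this thread: tune the rule to suppress false positives and you risk missing a real pedestrian; tune it the other way and the car phantom-brakes.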

Ducey set up a committee of a few guys who talk to other guys in the DOT and the state police and the universities. What do they talk about? Maybe they meet up at Hooters once a month and talk about how to roll out the red carpet for Uber, since Uber donated a lot of $$ to his election campaign! Nothing there gave SDC software any legal driving license.
What do they talk about? It is the first sentence under that link: "Governor Doug Ducey has announced the members of the Arizona Self-Driving Vehicle Oversight Committee - a team of transportation, public safety and policy experts who will support the state in the research and development of new "self-driving technology" that will allow vehicles to drive without direct or active human operation." So NO driving license whatsoever, only permits to operate given to the companies doing the research and development.

Wrong! I have been. I find it interesting.
Again, I know it looks and sounds harsh. It was not intentional, and I am sorry for making it look and sound that way. I am sure that if you had had more information about the "autonomous cars" oxymoron, you would have been able to immediately understand the scam.

Politicians going to Hooters once a month to enjoy Uber's golden shower and roll out the red carpet for them could be more realistic than you think. I think you're onto something big here...
 

Premium Member · 2,371 Posts
The driver needs to do some time for this. Unfortunately, an example needs to be made so all of these joke SDC "safety drivers" know that these things will never work, and that they should never take their eyes off the road for even a second.
IMO it is the self-driving car software developers who are responsible for their system's failure. This tragedy proves that if the driver is removed/not present/not paying attention, SDC software developed by incompetents kills people on the roads.
 

Banned · 4,953 Posts
Funny, they blame the human driver for being inattentive and say she could have prevented the accident... True.

But the car was in autonomous mode and ran the bicyclist down like rodent roadkill.

The car F'ed up.
False.

The car detected the pedestrian and, if it had been allowed to, could have avoided the accident. UBER F'ed up by disabling the programming that allowed the car to react. The driver KNEW they were 100% responsible in a situation like this.

You can blame Uber, you can blame the driver (the police have), but you can't blame a SDC that was prevented from doing its job.

The driver needs to do some time for this. Unfortunately, an example needs to be made so all of these joke SDC "safety drivers" know that these things will never work, and that they should never take their eyes off the road for even a second.
Except, of course, as you well know, the car could have avoided this if it had been allowed to and thus, did work as intended.
 

Premium Member · 2,371 Posts
The car detected the pedestrian and, if it had been allowed to, could have avoided the accident. UBER F'ed up by disabling the programming that allowed the car to react. The driver KNEW they were 100% responsible in a situation like this.
You can blame Uber, you can blame the driver (the police have), but you can't blame a SDC that was prevented from doing its job.
This guy here has no clue whatsoever, and he clearly shows he doesn't know what he is talking about.

"The car detected the pedestrian and, if it had been allowed to" -

[reaction image]


"UBER F'ed up by disabling the programming that allowed the car to react."

[reaction image]


" but you can't blame a SDC that was prevented from doing its job"-

[reaction image]


people in the vehicle are still responsible for the actions of the car
So if something goes wrong with this rope railway, is it the fault of the people inside?

 

Registered · 6,557 Posts
False.

The car detected the pedestrian and, if it had been allowed to, could have avoided the accident. [...] Except, of course, as you well know, the car could have avoided this if it had been allowed to and thus, did work as intended.
lol, the only way it worked as intended is if it was specially designed to mow down people on the road

and this article only seems to confirm what I was one of the few people to say when this originally happened: this driver is liable for manslaughter, for those paying attention

while Elaine contributed to the accident, the driver had a duty to avoid hitting her
 

Registered · 6,557 Posts
This guy here has no clue whatsoever, and he clearly shows he doesn't know what he is talking about. [...] So if something goes wrong with this rope railway, is it the fault of the people inside?

nope, just as it relates to SDCs

it already happened somewhere; there was an article about it

a rider got a ticket when the SDC he/she was riding in committed an infraction
 

Premium Member · 2,371 Posts
nope, just as it relates to SDCs

it already happened somewhere; there was an article about it

a rider got a ticket when the SDC he/she was riding in committed an infraction
Yes, that person (identified by the TV station reporting on that story as "the driver") got a ticket.
And no, getting a ticket doesn't mean the person inside is responsible for the software's behavior (sensor readings and actuator actions upon the environment).

There are 2 stories
Cruise Self-driving car passenger slapped with ticket in San Francisco, police say
British Tesla driver banned after caught in the passenger seat while Autopilot was engaged


You probably referred to the first one, where "The ticketing officer believed that the car was in self-driving mode, however the person inside was cited for failing to yield to a pedestrian, Linnane said. That individual, whether they were driving or not, "is still responsible for the vehicle," she added."

I will go through this step by step.

1. The motorcycle officer is following traffic on a street in SF.
2. A car fails to yield to a pedestrian that is crossing the street.
3. The officer turns the lights showing intention to stop that vehicle.
4. The monitor inside sees the cop following, disengages the self-driving mode, takes control, and pulls over. The monitor is the only person who knows the car was in self-driving mode.
5. The officer approaches the car, asking for a driver's license and proof of insurance/registration. While the monitor grabs the license and proof of insurance, the cop continues, asking if the monitor knows why he got pulled over.
6. The monitor mentions the vehicle is in testing, has an operating permit from the city of SF, and was in self-driving mode.
7. The cop takes the driver's license and proof of insurance and goes to his motorcycle to check the individual in the car against police records. He doesn't know if the car was in self-driving mode as the monitor told him, but needs to make a decision according to the law: "California law requires the vehicle to yield the right of way to pedestrians, allowing them to proceed undisturbed and unhurried without fear of interference of their safe passage through an intersection."
8. The cop issues a traffic violation citation/ticket in the monitor's name, for not yielding to a pedestrian, and lets him go.

Now, that issued citation doesn't mean the individual in the car did something wrong. The citation states that the traffic officer witnessed a traffic violation and proceeded as required.

There is no law in the US, or anywhere in the world, that penalizes an infraction committed by a robot. And there was no way to check the system's engage/disengage logs on the spot to verify whether the monitor was telling the truth or not.

The cops are not establishing fault. That is the court's job, but people, by not going to court over their tickets, are effectively admitting fault and choosing to pay rather than make their case.

By issuing that citation, the officer gave that monitor the opportunity to go to court, see a judge, make his case, and prove he was not actively driving that car, so he did nothing wrong. The judge is the one to analyse the information presented by both parties and decide.

The monitor's only wrongdoing was regarding the job Cruise hired him to do: monitor the robot, take over/cover up in case something goes wrong, and report back. He failed to cover up/take over.

In the second article, that idiot already admitted fault. Tesla is not testing; there are well-defined software limitations described by the company in the legal documentation given to its customers, and once you have signed to acknowledge that, it is almost impossible to come out and say you never knew about it. "Patel has pled guilty to the offense, and has been banned from driving for 18 months, and will be required to pay a £1,800 fine, carry out 10 days rehabilitation, and to perform 100 hours of community service."
 

Banned · 4,953 Posts
lol, the only way it worked as intended is if it was specially designed to mow down people on the road

and this article only seems to confirm what I was one of the few people to say when this originally happened: this driver is liable for manslaughter, for those paying attention

while Elaine contributed to the accident, the driver had a duty to avoid hitting her
The car was NOT in autonomous mode. Its avoidance features were disabled. It detected the woman 6 seconds before impact and would have reacted, according to data retrieved from the system. Blaming the car is wholly inaccurate. The driver was 100% in charge of avoidance. But you knew that, didn't you?
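For what it's worth, the arithmetic behind that claim is easy to sketch. Using the just-under-44 mph reported earlier and an assumed typical ~7 m/s² emergency deceleration (my assumption, not a figure from any report):

```python
# Rough stopping-distance check for a detection 6 s before impact.
# The 7 m/s^2 deceleration is an assumed typical emergency-braking value.
MPH_TO_MS = 0.44704

speed_ms = 44 * MPH_TO_MS              # ~19.7 m/s
gap_m = speed_ms * 6.0                 # distance to pedestrian at detection
stop_m = speed_ms**2 / (2 * 7.0)       # v^2 / (2a) braking distance

print(f"gap at detection ~{gap_m:.0f} m, braking needs ~{stop_m:.0f} m")
# ~118 m available vs ~28 m needed -- ample margin, IF braking had been
# enabled and commanded at detection time.
```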
 

Registered · 6,557 Posts
The car was NOT in autonomous mode.
Yet again, you lie. I mean, nothing comes out of your mouth that isn't a lie.

https://www.google.com/search?q=ube...rome..69i57.6183j0j9&sourceid=chrome&ie=UTF-8

The vehicle was traveling in autonomous mode at the time of the crash.
company's sensing system, was in autonomous mode with a human
was in autonomous mode when it struck Elaine Herzberg around 10 p.
I can go on and on.

Yes, that person (identified by the TV station reporting on that story as "the driver") got a ticket.
And no, getting a ticket doesn't mean the person inside is responsible for the software's behavior. [...] There is no law in the US, or anywhere in the world, that penalizes an infraction committed by a robot. [...] By issuing that citation, the officer gave that monitor the opportunity to go to court, see a judge, make his case, and prove he was not actively driving that car.
When the cop issues an infraction, he's obviously issuing it to the person in the car. You don't issue citations to robots. Lol

I mean, come on, man. Get real.
 