- cross-posted to:
- usnews@lemy.lol
A Tesla was in its self-driving mode when it crashed into a parked patrol vehicle responding to a fatal crash in Orange County Thursday morning, police said.
The officer was on traffic control duty blocking Orangethorpe Avenue in Fullerton for an investigation into a suspected DUI crash that left a motorcyclist dead around 9 p.m. Wednesday when his vehicle was struck.
A Fullerton Police Department spokesperson said the officer was standing outside his vehicle around midnight when he saw a Tesla driving in his direction and not slowing down.
It really doesn’t help that the media isn’t putting “Self-Driving” Mode in quotes since it isn’t fucking self-driving.
It is though: self driving into objects
“We never said it was good self-driving.”
Isn’t that what Tesla called it?
Tesla calls it “Full Self Driving” and it’s a lie. So capitalize it and put it in quotes, rather than call it self-drive mode like that’s an actual thing.
The actual name, “Full Self Driving (Supervised),” is so shady. “Supervised” is just a less crappy-sounding way to indicate that you will have to take over and drive sometimes. So sometimes the car drives itself and sometimes you drive. So partial self driving, partial human driving. I’m surprised they didn’t call it “Partial Full Self Driving”. That would certainly amp up the trolling factor and really separate the true believers, who would come out defending it with Olympic-level mental gymnastics.
It is an actual thing, just not on Teslas. It must’ve chapped Musk’s ass something fierce that Mercedes-Benz got the DOT approval before him.
https://www.caranddriver.com/reviews/a45326503/mercedes-benz-drive-pilot-review/
It’s “self-driving”, not “self-stopping”. Luckily the police were able to assist with cruiser-based rapid deceleration.
Technically it is self-driving but just in the sense that it doesn’t need any external power sources like horses to pull it.
It’s self-driving, not self-driving-well.
“SLAMS”
Finally, some real journalism
And again, I thank rich people for being test subjects on new technology.
The victims involved in crashes aren’t always rich. People in other cars or pedestrians and cyclists can be injured by these mistakes.
If only it were that simple. WE are all the test subjects in this case whether we like it or not.
No we’re not. They’re the rats, we’re the maze.
I’m the cheese
The cheese stands alone
Yeah, I’m lonely. Rub it in.
I mean, testing new technology wasn’t really the issue here, the name of Tesla’s ‘self driving’ mode is just a lie. This is an idiot driver who should have been paying attention and wasn’t.
But we already knew he was an idiot, he did buy a Tesla.
“ACCAB” - That Tesla
*The extra C is for Cars. All Cop Cars Are Bastards.
FTP - It’s not just a protocol.
Sure it is. All the coders these days grew up with it.
Are we about to see self driving cars finally getting some of that truck-doggy-style-action with the popo mobiles?
I feel like if it was “r/carsfuckingcars”, the self driving cars are a little… premature (one and done) in this scenario.
Flat cars theory
That must have been SO scary for the cop! He wouldn’t know whether to shoot the car or the passenger!
Must have been a white Tesla. If it was a black one, it would have been a no brainer.
So my Tesla is pink. I guess the cop would’ve thrown a flashbang.
Or a rubber bullet to your face.
¿porque no los everyone in line of sight including bystanders?
They give so much lenience to Tesla.
Yet Cruise was kicked out of California after someone else hit a pedestrian into the path of the Cruise vehicle and fled, even though Cruise provided dashcam footage capturing the assailant.
TBH if this process could work a little faster then maybe evolution could remove all the ai tech bros from the gene pool.
At least the ones that can’t follow directions and pay attention to the road.
Maybe I’ve been too harsh on self-driving Teslas…
Tesla… Back (into) the Blue.
Fuck Elon, and to a lesser extent, Tesla and all. But this seems like yet another user error on several counts. I thought “autopilot” was only supposed to be used on freeways. And obviously assisted by a human who should have seen a fucking parked cop car coming and interceded anyway.
But that said, fuck Elon and his deceptive naming of a fucking primitive tech that’s really only good at staying in a lane at speed under ideal conditions.
I thought “autopilot” was only supposed to be used on freeways. And obviously assisted by a human who should have seen a fucking parked cop car coming and interceded anyway.
It depends on which system they actually had on the vehicle. It’s more complicated than random people seem to think. But even with the FSD beta, it specifically tells the driver every time they activate it that they need to pay attention and are still responsible for the vehicle.
Despite what the average internet user seems to think, not all Teslas even have the computer capable of Full Self Driving installed. I’d even say most don’t. Most people seem to think that Autopilot and FSD are the same, they’re not, and never have been.
There have been 4+ computer systems in use over the years as they’ve upgraded the hardware and added capabilities in newer software. Autopilot, Enhanced Autopilot, and Full Self Driving BETA are three different systems with different capabilities. Anything bought prior to the very first small public closed beta of FSD a couple years ago would need to be replaced with a new computer to use FSD. Installation cost is included if someone buys FSD outright, or they have to pay for the upgrade if they instead want the subscription. All older Teslas however would be limited to Autopilot and Enhanced Autopilot without that computer upgrade.
The AP and FSD systems are not at all the same, and they use different code. Autopilot is designed and intended for highways and doesn’t require the upgraded computer. Autopilot is and always has been effectively just Traffic Aware Cruise Control and Autosteer. Enhanced Autopilot added extra features like Summon, Auto Lane Change, Navigate on Autopilot (on-ramp to off-ramp navigation), etc., but has never been intended for city streets. Autopilot itself hasn’t really been updated in years; almost all the updates have been to the FSD beta.
The FSD beta is what is being designed for city streets, intersections, etc. and needs that upgraded computer to process everything for that in real time. It uses a different codebase to process data.
The spokesperson said that the Tesla was in self-drive mode and **the driver admitted to being on a cellphone at the time of the crash.**
That seems to answer all the questions about this accident.
Not really. “Self drive mode” isn’t the name of either of the driving systems, that could be either Autopilot or the Full Self Driving beta. The spokesperson was for the police department, not Tesla. They’re unlikely to know there is a difference between the two systems, like most people.
It’s at best, secondhand info from the driver, and most likely him saying he was on the phone and the car was driving, which would be the same thing they likely said with either system running. I doubt the driver is explaining differences between AP and FSD to the police.
I think the above commenter was just pointing out that the driver admitted to not paying attention to the road
My Subaru with adaptive cruise control is smart enough not to zoom into the back of a parked car. If my car with a potato for a CPU can figure it out, then why can’t a Tesla with its significantly more advanced computer?
It isn’t that simple; it depends on what the vehicle is actually using to detect other vehicles and maintain distance from them.
These systems process a lot of information, and a lot of it is pretty bad data that needs to be cleaned to remove erroneous readings before it can be processed. Sensors stream a lot of info, and not all of it is perfectly accurate. The same is true for a Tesla or any other vehicle, and filtering that data accurately means a better experience.
Say your vehicle has a forward facing radar, and you’re driving along the highway and the radar gets a return for a large object in front of the car 100 feet ahead when the returns immediately before were showing a 300 foot clear zone. Is it more likely that a large object suddenly appeared in front of the car, or that this return is erroneous and the next few returns after will show a clear zone again? Overhead signs and overpasses can show similar returns to a large truck in your lane for instance. This is one advantage lidar has over radar, more accurate angle measurements at all distances.
So say the vehicle acts on that return and slams on the brakes because the “object” is only 100 feet ahead at highway speeds. Then the erroneous return goes away and there’s a clear road again. That’s the “phantom braking” I’m sure you’ve seen various people talk about: the system reacting to an erroneous return instead of filtering it out as a bad reading. Now, random braking in the middle of a highway is dangerous as well, so you need to minimize that. Is it more likely a massive wall suddenly appeared directly in front of the car, or that it’s a couple bad readings? The car has to determine that to make a decision on what to do. And different types of sensors will detect things differently. To some sensors, materials like paper are essentially invisible, for instance, but metal is clear as day. If the sensor can’t detect something, it won’t react.
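That keep-or-reject tradeoff can be sketched as a toy temporal-consistency filter (entirely hypothetical; the class name, thresholds, and logic here are invented for illustration, and real systems are vastly more sophisticated):

```python
from collections import deque


class RadarReturnFilter:
    """Toy filter: hold back a sudden 'object appeared' reading
    until it persists across several consecutive returns."""

    def __init__(self, jump_threshold_ft=150.0, confirm_count=3):
        self.jump_threshold = jump_threshold_ft   # how big a drop counts as suspect
        self.confirm_count = confirm_count        # readings needed to believe it
        self.history = deque(maxlen=10)           # recent accepted distances
        self.pending = 0                          # consecutive suspect readings so far

    def accept(self, distance_ft):
        """Return the clear-distance the planner should act on."""
        if not self.history:
            self.history.append(distance_ft)
            return distance_ft
        last = self.history[-1]
        if last - distance_ft > self.jump_threshold:
            # Sudden large object where the road was just clear:
            # likely an overpass/sign return, so wait for confirmation.
            self.pending += 1
            if self.pending < self.confirm_count:
                return last  # keep reporting the prior clear distance
        self.pending = 0
        self.history.append(distance_ft)
        return distance_ft
```

A single spurious “wall” return gets held back, while an object that persists for several returns is eventually acted on. The `confirm_count` knob is exactly the tension described above: raising it reduces phantom braking but delays reaction to real obstacles.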
Note that these readings do not involve a camera at all. They inherently work differently than a human driver does by looking at the road. So many people online want to point out that sensors are more “reliable” or “trustworthy” compared to vision since there’s little processing, you just get a data point, yet sensors provide bad data often enough that a filter is needed to remove it. A camera works like a person, it can see everything, you just need to teach it to identify what it needs to pay attention to, and what it can ignore, like the sky, or power lines, or trees passing by on the side of the road. But not the human on the side of the road, need to see that.
Then we get into the fact that various sensors exist on older vehicles that have been removed from newer ones. Things like radar and ultrasonic sensors have been removed in favor of using computer vision via the cameras directly, like a human driver watching the road. Going frame by frame to categorize what it sees for vehicles, people, cones, lanes, etc. and comparing to previous frames to extrapolate things like motion, movement, and relative speed. But cameras have their own issues: a bright light shining directly into the lens blinds them, just like it blinds a person, and it takes a little time for the camera to adjust exposure to compensate.
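The frame-to-frame extrapolation mentioned above boils down to differencing distance estimates over time. A toy sketch (hypothetical; real trackers associate detections across frames and smooth heavily):

```python
def relative_speed_fps(distances_ft, frame_dt_s):
    """Estimate relative speed (ft/s) of one tracked object from
    per-frame distance estimates. Negative means it's getting closer.
    Toy example only: function name and approach are illustrative."""
    if len(distances_ft) < 2:
        return 0.0  # can't estimate motion from a single frame
    # Per-frame change in distance, averaged to damp per-frame noise.
    deltas = [b - a for a, b in zip(distances_ft, distances_ft[1:])]
    return (sum(deltas) / len(deltas)) / frame_dt_s
```

A car 100 ft ahead whose estimated distance drops by 2 ft every 0.1 s frame is closing at about 20 ft/s; that closing rate, not the raw distance, is what decides whether to brake.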
You might suggest using as many sensors as possible then, but that makes it nearly impossible to actually make a decision. Sensor integration is a huge data processing issue: how do you determine what data to accept and what to ignore when you get conflicting results from different types of sensors? This is why Tesla is trying to just do it all via vision. One type of sensor, roughly equivalent to a human but with wider visual spectrum sensitivity. Just classify what’s in each frame and act on it. Simple implementation, it just needs A LOT of data to train it in as many situations as possible.
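The conflicting-sensors problem can be made concrete with a toy fusion rule (hypothetical, invented for illustration, not any vendor's actual policy): when two sensors agree you can average them, but when they conflict there is no ground truth available at runtime, so you have to pick a policy, and every policy has failure modes:

```python
def fuse_distance(radar_ft, camera_ft, tol_ft=30.0):
    """Toy fusion of two distance estimates for the same object.
    Agreement within tolerance -> average the readings.
    Conflict -> fall back to the closer (more conservative) one."""
    if abs(radar_ft - camera_ft) <= tol_ft:
        return (radar_ft + camera_ft) / 2.0
    # Conflict: no way to know which sensor is wrong right now,
    # so err toward the reading that implies braking sooner.
    return min(radar_ft, camera_ft)
```

Note the catch: picking the closer reading is “safe” against collisions, but it is exactly the policy that produces phantom braking whenever the closer reading is the erroneous one.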
And that camera is where we get to emergency vehicles specifically. In my opinion, these emergency vehicle accidents are likely the camera being blinded repeatedly by the emergency lights rotating and the camera shifting exposure up and down every second or so to try and maintain an image it can actually process. As a human, at night, those lights make it hard for even me to see the rest of the road.
It’s not like regular drivers never crash into emergency vehicles either, they just don’t make national news, just like the 33 car fires every hour in the US alone.
It’s not a simple thing, and even your “simple” car by comparison is doing a lot to filter the data it gets. It could be using completely different kinds of data than another vehicle for that cruise control, so given the right circumstances it may react differently.
For what it’s worth, my Model 3 has rarely had issues with Autopilot acting in any sort of dangerous manner. A few phantom braking issues back when I got it in 2018, but I haven’t had a single one of those in maybe 4 years now, even in areas where it would almost always react that way back when I got it. Sometimes a little lane weirdness with old, poorly marked lane lines, or even old lane lines visible in addition to the current ones in some areas. It’s pretty easy to tell the situations AP might have issues with once you’ve used it just a few times.
All cars since mid 2019 have the computer required for FSD.
At this point that includes the majority of all Teslas ever sold. Somewhere between 750k and 800k of 6 million don’t have the hardware. And of those 100-200k are upgradeable, maybe more but the research time isn’t worth it.
That being said, it still could have been AP and not FSD as the media gets it confused all the time.
Is that the actual time cutoff? My 2018 model 3 that came with Enhanced Autopilot was originally said to have the hardware necessary for FSD (Computer 2.5 the car says), but there were updates before FSD became actually available.
I never considered buying it so I never paid more than cursory attention to all of the different hardware revisions, only major ones like Computer 3, removing radar during parts shortages around COVID, the Ultrasonic sensors, etc.
Also I hadn’t realized that it had actually been that long since I bought it, without most of the regular time-based car maintenance like oil changes time has flown by with it. Or that production had ramped up so significantly since I got my Model 3. I knew it had ramped obviously and that the Model Y launched, but I didn’t realize how significant all of that actually was when added together.
Ya, it’s been that long and they’ve made a lot of cars since heh.
I got mine in H1 2019 and it was HW 2.5, and sometime shortly after that HW3 came out. At the time I knew that was the situation but I wasn’t concerned since they said they’d upgrade it.
It took awhile after HW3 came out to be offered the upgrade though. By the time I was eligible, we were in the peak of early covid lockdowns and I wasn’t traveling to the not so close service center for the upgrade.
Eventually they did it via mobile service and I got it.
Despite what the average internet user seems to think, not all Teslas even have the computer capable of Full Self Driving installed.
Uh.
http://tesla.com/blog/all-tesla-cars-being-produced-now-have-full-self-driving-hardware The Tesla Team, October 19, 2016
Be sure to read the part where it says that as of today, ALL Tesla cars will have the hardware needed for full self-driving at a safety level SUBSTANTIALLY GREATER than that of a human driver.
Also, can people cut the bullshit? Elon is on video suggesting, year after year, that FSD will take you hundreds of miles without a single intervention.
How does that make sense, that something that performs SUBSTANTIALLY GREATER than a human driver needs constant supervision by the inferior driver? Elon sold FSD as a magical autopilot since 2016, adding “Supervised” to the label like 3 months ago doesn’t suddenly put the blame on the people who believed the lie.
You seem to have missed the simple fact that there are cars produced prior to that. And the fact that the hardware announced there also wasn’t enough on its own. There were additional changes in 2019 to a newer computer necessary for FSD. My 2018 Model 3 has that hardware, but I cannot get FSD without a computer upgrade. Not to mention even older vehicles that can’t even accept the new computer hardware.
So yeah, not ALL Teslas in the road are capable of FSD.
Dude, talk about missing the point. I wasn’t the one who claimed all Teslas after 2016 were capable of Autopilot. ELON MUSK made that claim… you aren’t saying I’m Elon, right, because that’s pretty insulting.
Elon made the claim. He called his product Autopilot and then Full Self Driving. He got up on stage and released videos claiming the cars could drive hundreds of miles without an intervention. He released faked videos. He’s truly a con man for the ages.
Dude, no, you’re trying to change the point. My claim wasn’t time-specific; your link has nothing to do with the actual conversation in this comment thread. I said that the average person thinks ALL Teslas are capable of FSD, and that is factually incorrect. There are Teslas on the road that cannot do FSD.
Your link is about cars made after X date, you’re changing the conversation to fit the narrative you actually want. Even if we go by that date though, that’s what they thought at the time, it was later determined that yet another computer upgrade would actually be required, hence the 2019 date for actual FSD compatibility as it was rolled out to the closed public beta. Vehicles made in that middle ground get the computer upgrade for free if the owner purchases FSD.
My car, made in 2018, cannot do FSD without a computer upgrade, so it is not capable of FSD as is despite being made after 2016. The rest of the hardware is compatible but it needs a newer brain. Some vehicles older than mine (older Model S/X) aren’t even able to get an upgraded computer and will NEVER be FSD capable. So factually, not ALL Teslas on the road can do FSD. End of conversation.
He called his product Autopilot and then Full Self Driving.
Autopilot and Full Self Driving are two different systems, and always have been. I get that you want them to be the same, but they aren’t. The vehicle makes it clear they aren’t the same in the settings, the website’s purchase pages over the years have shown the differences, and the support pages make it clear these are different; they are referenced as different products all the time. While Autopilot now comes on every Tesla as standard with no additional purchase, FSD is an additional cost and always has been.
you aren’t saying I’m Elon, right, because that’s pretty insulting.
I never claimed you were Elon, no idea where you seemed to get that idea. But at this point it’s clear you have some sort of vendetta and are incapable of actually following the conversation due to your bias, so I’ll leave it here. Have a good day.
Last year, Tesla announced that they had improved their autonomous emergency braking system to go ‘beyond standard AEB functionality’. And yet, here we have a story where a Tesla drove straight into a stationary vehicle and, according to the cop, didn’t slow down.
Yes, the driver should have been paying attention, but why did the AEB do nothing?
I just heard from Enron Musk that it crashed into the patrol car way more safely than a human would have done.
Also, according to Enron Musk, Full Self Driving has been working since 2017, and is in such a refined state now that you wouldn’t believe how gracefully it crashed into that patrol car. It was almost like a car ballet, ending in a small, elegant pirouette. As Enron Musk recently stated, in a few months we should have Tesla Robotaxis in the streets, and you will be able to observe these beautiful events regularly yourself.
Others say that’s ridiculous, he is just trying to save Enron, but that’s too late.
All I do at night is open my garage door to let my car out. A few months later, here I am, a millionaire. Thank you, Full Self Driving Roboenron 😍
Yes I remember that, and then he has repeated every year since, that they will be ready next year. But THIS year he changed his tune somewhat, and claimed it was a matter of months.
How is this con man not in jail? How is he able to get a $40? $50? billion bonus?
He is very good at what he does, being a con man.
But I think step one is to be a malignant narcissist, having such an oversized ego and disregard for others that you actually believe you deserve it. Apart from that I can’t really explain it, except IMO the stockholders that gave him that bonus are morons who let themselves be exploited.
Jesus Elon is my co-pilot. Thoughts and prayers.
ai being based again
GTA 2024
Don’t play with my emotions
Based robot car
Uh oh tesla self drive is woke
Everyone tried to warn Elon not to use Fury Road as training data.