- cross-posted to:
- [email protected]
Judge finds ‘reasonable evidence’ Tesla knew self-driving tech was defective::Ruling clears way for lawsuit brought against company over fatal crash in 2019 in which Stephen Banner was killed near Miami
It’s asinine that Tesla is trying to do full self driving without actually using some sort of LiDAR. Using video/photos to judge distance is just unreliable and stupid.
But it’s MUCH cheaper, so, in keeping with every other shitty idea he’s ever had, Musk was REALLY banking on Tesla engineers making a crazy breakthrough so he could reap billions in reward.
It worked at SpaceX because of a perfect concoction: all the best rocket scientists and engineers wanted to work at SpaceX, since it was one of the only space programs not owned by a government and could push the boundaries; the technology was possible and wildly practical to implement; and there were massive government subsidies.
Tesla is in the car market, which is notoriously competitive. While they do have massive government subsidies, they don’t have the best engineers, and Musk’s insistence that they “figure out” how to shove autonomous driving into a medium that simply doesn’t provide enough information drives even the better engineers away.
I really wish my government would stop funding his ego and let his fantasy projects die already.
The tech has gotten so cheap now that there is no reason to skimp out on it.
Oh there definitely is, marginally higher profits at the cost of public safety
A tale as old as capitalism: short term profit first, who gives a shit about later
And Tesla is the car company with the highest profit margin per car sold.
We’re talking about a company that got rid of the ultrasonic sensors used to make the car reverse park.
With two offset cameras, depth is reliable, especially using a wide-angle and narrow-angle lens offset (a rough sketch of the idea is below). This is what OpenPilot does with the Comma 3 (FOSS self-driving).
Radar is better, but some automotive radar seems to only be great at short ranges (from my experience with my fork of OP in combination with radar built into a vehicle).
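For anyone curious how two offset cameras yield depth at all, here is a minimal sketch of depth-from-stereo, assuming an already rectified, horizontally offset camera pair; the focal length and baseline are made-up values for illustration, not OpenPilot’s or Tesla’s actual calibration or pipeline.

```python
# Minimal depth-from-stereo sketch. Assumes the two frames are already
# rectified; FOCAL_PX and BASELINE_M are illustrative assumptions.
import cv2
import numpy as np

FOCAL_PX = 910.0    # focal length in pixels (assumed)
BASELINE_M = 0.12   # distance between the two camera centres in metres (assumed)

def depth_map(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
    """Per-pixel depth estimate in metres from a rectified grayscale stereo pair."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # StereoBM returns disparity as int16 in 1/16-pixel units.
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan          # no match -> unknown depth
    return FOCAL_PX * BASELINE_M / disparity    # depth = f * B / d

# Usage with two already-rectified grayscale frames:
# left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
# right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
# depth_m = depth_map(left, right)
```

The wider the baseline and the longer the focal length, the better the depth resolution at range, which is part of why a wide-angle plus narrow-angle pair helps.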
depth is reliable
What if one of them is dirty? What if you are driving with the sun right in front of you? What about a foggy winter day? The big problem here isn’t even what the cameras are or aren’t capable of, but that there is little to no information on all the situations Tesla’s autopilot will fail in. We are only really learning that one deadly accident at a time. The documentation of the autopilot’s capabilities is extremely lacking; it’s little more than “trust me bro” and “keep your hands on the wheel”.
The fact that it can’t even handle railroad crossings and thinks trains are a series of trucks and buses that blink in and out of existence and randomly change directions does not make me wanna blindly trust it.
Do you work in the field? Sun/fog/etc. are all things that can be handled with exposure adjustments (a rough exposure-fusion sketch is below). It’s one place a camera is more versatile than our eyes.
All that being said, my experience is from indirect work on OpenPilot, not from Tesla. So a system that’s not commonly used by the average person, and that does not make claims of commercial FSD.
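To make the exposure point concrete, here is a rough sketch of exposure fusion: bracketing several exposures of the same scene and merging them so neither glare nor shadow detail is lost. It is a generic OpenCV technique shown with placeholder file names, not a claim about what Tesla or OpenPilot actually do.

```python
# Generic exposure-fusion sketch (Mertens), not any carmaker's actual pipeline.
# File names are placeholders.
import cv2
import numpy as np

def fuse_exposures(paths):
    """Merge differently exposed frames of the same scene into one 8-bit image."""
    frames = [cv2.imread(p) for p in paths]
    fused = cv2.createMergeMertens().process(frames)   # float image in [0, 1]
    return np.clip(fused * 255, 0, 255).astype(np.uint8)

# Usage:
# result = fuse_exposures(["under.jpg", "normal.jpg", "over.jpg"])
# cv2.imwrite("fused.jpg", result)
```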
depth is reliable
No it’s not. The world is filled with optical illusions that even our powerful brains can’t process, and yet you expect two webcams to. And depth is not the only thing that’s needed when it comes to autonomous driving. Distance is an absolute factor. Case in point: two bikers (if not more) were killed because their bikes had 2 tail lights instead of one and Tesla thought it was a car far away instead of a motorcycle close by. Ran them over as if they were not there. We as humans would see that rider and realize it’s a motorcycle, first because of sound and second because our brains are better at reasoning. And we’d avoid the situation. This is why cars MUST have more sensors, because the processing is lacking so much.
Sitting in a Tesla and watching it try to understand anything other than highway driving is so unnerving. It gets so much wrong about other cars’ direction of travel that it’s not too shocking one occasionally is plowed into or plows into someone else.
Using video/photos to judge distance is just unreliable and stupid.
It all depends on how powerful the computer managing the data is. A human brain does the job, for example.
Humans don’t have the best track record when it comes to safe driving, might not be the best idea to imitate them when there is better tech around.
But how many of those bad records are due to their eyes?
Most of those bad records are due to bad decisions (checking the phone, speeding, cutting lanes, etc.). It is rarely due to bad sight.
The issue with lidar is bad weather. If it rains, or it’s foggy, it doesn’t work, or gives weird results.
Apparently there is some radar which can see through bad weather.
I thought the whole point was to overcome human shortcomings, not just make a worse version of a human driver. Humans don’t even rely purely on visual cues.
Humans don’t even rely purely on visual cues.
When you drive, besides a few exceptions, all your cues are visual.
No dude. Sound and air pressure are cues as well.
If audio cues were indispensable, deaf people wouldn’t drive.
What? A cue being solely sufficient to make a decision and a cue being used in conjunction with other things are two separate things.
Tesla FSD… Coming
2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022, 2023, 2024… At this rate Ferrari will win another championship before FSD comes out.
The sun will explode before Tesla succeeds in making full self-driving work with only basic cameras.
Are there at least two front facing cameras for depth perception?
Which means pretty much nothing. Perception is just perception, not reliable absolute data.
I think they quietly reversed that decision and cars now have lidar
Tesla does not have a radar. Just two cameras, and they removed the radar. So it’s blind right now.
This is the best summary I could come up with:
A judge has found “reasonable evidence” that Elon Musk and other executives at Tesla knew that the company’s self-driving technology was defective but still allowed the cars to be driven in an unsafe manner anyway, according to a recent ruling issued in Florida.
The lawsuit, brought by Banner’s wife, accuses the company of intentional misconduct and gross negligence, which could expose Tesla to punitive damages.
The ruling comes after Tesla won two product liability lawsuits in California earlier this year focused on alleged defects in its Autopilot system.
“It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers was acutely aware of the problem with the ‘Autopilot’ failing to detect cross traffic,” the judge wrote.
Bryant Walker Smith, a University of South Carolina law professor, told Reuters that the judge’s summary of the evidence was significant because it suggests “alarming inconsistencies” between what Tesla knew internally, and what it was saying in its marketing.
“This opinion opens the door for a public trial in which the judge seems inclined to admit a lot of testimony and other evidence that could be pretty awkward for Tesla and its CEO,” Smith said.
The original article contains 462 words, the summary contains 195 words. Saved 58%. I’m a bot and I’m open source!
“It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers was acutely aware of the problem with the ‘Autopilot’ failing to detect cross traffic,” the judge wrote.
If that’s what this centers on, I don’t think this is necessarily a correct ruling.
That semi was moving fairly slowly as it was crossing the road, as any semi would from a stop.
Radar does not detect stationary objects at high speeds, and this slow-moving cross-traffic vehicle could look like one. I imagine there’s some limit where a cross-traffic object moving very slowly appears, for all intents and purposes, stationary as it fills the entire roadway horizontally and not just a portion of it.
The car explicitly warns you that Radar won’t detect stationary objects at high speeds. Other manufacturers explicitly warn about this very same problem as well.
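To make that concrete, here is a back-of-the-envelope sketch of why a slowly crossing truck and a parked object produce nearly the same Doppler return; all numbers are assumptions for illustration, not measurements from the actual crash.

```python
# Back-of-the-envelope: a Doppler radar measures range rate, i.e. the relative
# velocity projected onto the line of sight. Numbers below are assumptions.
import numpy as np

def range_rate(rel_pos, rel_vel):
    """Closing speed along the line of sight, as seen by the radar."""
    return float(np.dot(rel_pos, rel_vel) / np.linalg.norm(rel_pos))

ego_speed = 30.0    # m/s, roughly 68 mph (assumed)
truck_speed = 3.0   # m/s, a semi slowly crossing from a stop (assumed)

# Truck 80 m ahead, near the lane centre, crossing left to right.
truck = range_rate(rel_pos=np.array([80.0, 2.0]),
                   rel_vel=np.array([-ego_speed, truck_speed]))

# A parked car or road sign at the same spot.
parked = range_rate(rel_pos=np.array([80.0, 2.0]),
                    rel_vel=np.array([-ego_speed, 0.0]))

print(f"crossing truck:    {truck:.2f} m/s")   # ~ -29.92 m/s
print(f"stationary object: {parked:.2f} m/s")  # ~ -29.99 m/s
# Both returns close at essentially the ego speed, so a filter that discards
# "stationary" returns (to avoid phantom braking for signs and overpasses)
# tends to discard the crossing truck too.
```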
It’ll be interesting to see what happens with this case, but if that’s what it hinges on, IMO, it doesn’t look good for the plaintiff.
Doesn’t Tesla only use cameras and image processing though? As in no radar at all?
This was in 2019, when radar was still more of a primary sensor, rather than vision-first or vision only.
Here’s the basis of the finding:
Palm Beach county circuit court judge Reid Scott said he had found evidence that Tesla “engaged in a marketing strategy that painted the products as autonomous” and that Musk’s public statements about the technology “had a significant effect on the belief about the capabilities of the products”.
Judge Scott also found that the plaintiff, Banner’s wife, should be able to argue to jurors that Tesla’s warnings in its manuals and “clickwrap” were inadequate. He said the accident is “eerily similar” to a 2016 fatal crash involving Joshua Brown in which the Autopilot system failed to detect crossing trucks.
The bot that parses the articles creates a worse summary than you’d get by just reading random sentences.
In any case, we should note that this finding was reached after the recent media disclosures that Musk and Tesla deliberately created a false impression of the reliability of their Autopilot capabilities. They were also deceptive about the capabilities of vehicles like the Cybertruck and their semi, as well as things like range estimation, which might show a pattern of deliberate deception - demonstrating that it is a Tesla company practice across product lines. The clickthrough defense, compared to what the CEO says on stage in massively publicized announcements, sounds to me a bit like Trump’s defense that he signed his financial statements but noted that by doing so he wasn’t actually confirming anything, and the people who believed him are the ones to blame.
Given his groundless lawsuit against media matters and his threats against the ADL, I think Elon might have started circling the drain.
Ah, that makes a lot more sense. Shouldn’t trust the bot.
I’d only add that the “click through” is actually a well-laid-out screen with an infographic showing the problem and a few lines of text.
They’d be hard pressed to say that warning was difficult or hard for anyone to read or understand.
Unrelated but relevant: like GDPR, where privacy explanations need to be short, concise, and easy to understand, I’d say the click-through was more than adequate and would exceed those requirements.
But as you point out, that’s only a part of it.
Edit: trying to find an image of it for reference, but my GoogleFu is failing me :(
Edit: to further clarify, I’m only talking about the radar and stopped vehicles in terms above. The whole agreement I think was larger in some places, but my memory is a little foggy on that without images.