Tesla Autopilot Recall: Crashes into Emergency Vehicles
- Added 4 Jul 2024
- Uncover the shocking details behind Tesla's Autopilot nightmare in this in-depth video. After more than two years of investigation, the National Highway Traffic Safety Administration (NHTSA) has issued a recall due to vehicles crashing into emergency vehicles, alongside other incidents resulting in multiple injuries and fatalities, including a Tesla that drove around a school bus with its stop sign deployed and seriously injured a student.
Join the Crew! shop.stachedtraining.com
Affiliate Links - Helps me to continue to create content!
Fire Dept. Coffee Veteran Owned, Firefighter Run
bit.ly/42mOHXi
Moditech Crash Recovery System
stachedtraining.com/moditech-crs
*I may earn a commission should you choose to sign up for a program or make a purchase using my links.
@realdawnproject
WSJ video - • Tesla Dashcam Footage ...
00:00 - Introduction
00:28 - What is Autopilot
01:30 - Crashes into Emergency Vehicles
02:33 - Dawn Project Data
04:26 - Nonstandard Roads & Night Driving
04:45 - Autopilot & School Buses
05:27 - Dawn Project Testing
05:40 - Tesla Recall
I would bet if there was no auto pilot, and no airbags, but a six inch metal spike coming out of the centre of the steering wheel - everyone would pay attention and drive safely 😊
Or metal shrapnel that's ejected?
That's not a bad idea. 🤔
@@StacheDTraining Takata style
You never met Florida man 😁
"Pay attention and be prepared to take over" is, mentally, far harder than just driving.
Good point, like watching your teenage driver 😂
Absolutely 100%. It is also way easier to stay focused if you are driving fast on a winding mountain road than if you are driving on a road with no turns for 100 miles. It's the perceived danger that keeps us attentive.
And staying on high alert is more exhausting.
You are absolutely correct.
@@susanpetropoulos1039 It's not that exhausting with enough adrenaline going through your veins. Hence why race car drivers can easily keep super focused for hours at a time and why keeping very focused, even for a small amount of time, is nearly impossible if you are just waiting for something to happen without doing anything.
We live in a time where humans can't even open a door. The less we use our body and mind, the more it deteriorates... It's like a horror film going on before our eyes and it feels like nobody cares....
Ironically, you are absolutely correct about humans struggling with the concept of opening a door. I volunteer for a heritage railway in the UK, and the railway now has to send a safety video to people who have booked tickets online showing them how to operate the doors on the old carriages, because most people can no longer understand the operation of a door handle now that most new trains have push buttons to open the doors.
@@bentullett6068 Really? Damn, that is quite disturbing. I'm not an old timer (last of Gen X, as they call it) but I'm glad I grew up around the kind of folks who taught me how to figure things out by myself. I'm not trying to diss anyone by any means, but it looks like the people in charge don't trust us anymore with even the simplest things, like the one you explained. Still, have a wonderful new year, and thanks for the little story. Best regards from Norway. (PS: sorry about my writing, English is not my first language)
I am slow to adjust to new technology, for sure, but I still can't get over the insane idea that we have cars out there that are "self-driving." What madness is this?! I wanted to say something like, "lol it can't be worse than drivers already are," but it's a new kind of horror that you can be plowed under by an errant robot at any time without warning.
The drivers are supposed to be paying attention and be ready to take over. This is people being lazy and complacent.
I've still never used regular cruise control in a car. My car has the feature, but the control has never been touched since I don't trust it - sounds extreme, but that's me. I can't imagine using entire automation.
@@ptonpc He is talking about self-driving cars with no people in them; there are driverless taxis now in several cities.
@@MattExzy Same here.
@@MattExzy You don't trust cruise control? That's pretty extreme IMO.
I use it every day on my commute and couldn't imagine not using it
It sets your vehicle at a speed you determine so you travel at a consistent speed; you don't unknowingly slow down or speed up from the user error of having a light foot.
When my roommate would drive she would regularly get distracted and slow down a good 10 mph, and her speed would fluctuate all over the place.
I would have to remind her about cruise control and then suddenly we are traveling at a consistent speed
I wouldn't call a kid walking in front of your car a nonstandard situation. Anyone who injures or kills another person driving in autopilot should be charged.
No kidding, it's why I don't drive on Halloween night.
1st, I would like to say that your coverage is fair, balanced, and factual.
2nd, the computer told the driver 150 TIMES to put his hands on the steering wheel. THAT IS UNACCEPTABLE! The "auto-drive system" should have a safety override program that pulls the car over and shuts the power off until the driver is compliant.
Fantastic idea, could be used for all sorts of things like "non payment" of taxes, traffic fines etc. Just disable the guilty party's vehicle till they pay up. Excellent idea!
It's supposed to eventually just come to a stop in the road, on the basis that if you're having a medical emergency, you'll get noticed more quickly. Honestly, there isn't really a good solution to the problem of "I've told someone else to do something dangerous, and now they can't do that, and I won't take control back."
@@cayminlast How kind of you to assume the person being forcibly stopped in their vehicle is guilty of something first.
Because we all know that power won't ever be abused.
The computer will only warn you five times and then disables itself. Not sure how this person got 150 warnings.
Autopilot drives into the back of motorcycles & under trailers too. It's murder but nothing will be done because of Teflon Musk.
The 'night' problem comes from Tesla only using cameras (and obviously their ability to 'see' is reduced at night, just as ours is).
The 'non-standard' situation problem, is that when the camera 'sees' something, it must check in the car's database to see if that is something it already 'knows'. If it is, the database tells it what to do. If it isn't in its database (the 'non-standard' bit), a prudent safety strategy says 'stop'. But Elon doesn't want his customers being stopped for things that 'probably aren't a problem' ... so he told his programmers to tell the car "if you don't specifically know that something you see is a problem, then presume it isn't and just keep going".
Oh, and to keep the processing time to search through the database as short as possible, they purposely delete items that are not common (like an Amish horse-drawn buggy).
And the couple of motorcyclists killed at night, by the Tesla running into the back of them? That is because Teslas only use cameras, so have no way of telling how far away something is, other than comparing the apparent size with the size of the 'identified object' in its database. It is believed that in each case, the Tesla identified the motorcycle directly in front of it, as a car way off in the distance (and so did not slow down).
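The apparent-size method described in the comment above can be sketched with the standard pinhole-camera model. The focal length and class widths below are illustrative assumptions, not Tesla's actual values; the point is only that plugging in the wrong assumed width inflates the estimated distance.

```python
# Monocular distance estimation from apparent size (pinhole model):
#   distance = focal_length_px * real_width / width_in_pixels
# If a classifier mistakes a narrow motorcycle for a wide car, it uses
# the wrong real_width and overestimates how far away the object is.

FOCAL_LENGTH_PX = 1000                              # illustrative focal length, pixels
ASSUMED_WIDTH_M = {"car": 1.8, "motorcycle": 0.8}   # typical real-world widths, metres

def estimate_distance(pixel_width: float, object_class: str) -> float:
    """Estimate range from the on-screen width of a detected object."""
    return FOCAL_LENGTH_PX * ASSUMED_WIDTH_M[object_class] / pixel_width

# A motorcycle 20 m ahead appears 1000 * 0.8 / 20 = 40 px wide.
px = 40
print(estimate_distance(px, "motorcycle"))  # correctly classified: 20.0 m
print(estimate_distance(px, "car"))         # misclassified as a car: 45.0 m
```

Under these toy numbers, the same 40-pixel blob reads as a motorcycle 20 m ahead or a car 45 m ahead, which is one plausible mechanism for the "distant car" misjudgement the comment describes.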
There are many issues due to using cameras without other redundant systems.
With radar ACC (adaptive cruise control), motorcycles are not always detected when they are hugging the lane lines.
But you notice this once and then you are aware of it.
Stationary objects are a pain for radar. So with camera+radar in foggy weather it’s still not 100% to detect objects in time.
Search through the database? I think it's a vision system at play, with a pre-trained model that gives you an answer on what it sees, usually with a confidence score for how sure the model is, e.g. "Dog at position x, 20% confident". The problem with using cameras is that something can sometimes look like something completely different from what it is. Say at sunset a grey car looks like the asphalt on the road; the car will probably crash into it. Also, from my tests using dual cameras for depth instead of a laser: it is not at all accurate compared to a laser.
@@roybm3124 Radar needs a Doppler shift (relative motion; I'm not sure about the English word) to be able to detect objects, so you are right. That also means if an object is moving at the same speed as you, there is no shift and it might be hard to see. I would say these self-driving cars should be required by law to have all systems: laser (lidar), radar, ultrasound, and cameras. And if any sensor notices something weird in the data, the system needs to slow the car down and ask the driver to take over directly.
What's crazy is that if you have a Tesla that came with radar, that system has since been deactivated: something you paid for.
A car crashing head on into a Tesla, and you say the autopilot not to blame. If the owner was driving it themselves, I think they would be steering out of the way most of the time, so I do blame the autopilot, because there is doubt that the crash would have happened with a human at the wheel.
The only thing that matters is who is behind the wheel.
@@GF-mf7ml Yes, that's right. In 49 years of driving I have taken evasive action to avoid head-on crashes many times. Two I will not forget: first, in a 30 mph zone on a road curving to the left, a car came at me at very high speed. He braked and ended up in a spin; I ended up fully on the pavement as he skidded sideways past me. I got his reg and reported him to the police, and he got done for it. Second, turning off a main roundabout into a road with a long queue of cars waiting to get on, an ambulance with lights flashing came at me very fast on the wrong side of the road, and I ended up off the road on the grass. He endangered my life; if he'd had his siren going I would have heard it. Both times I'm sure that if I had been on that Tesla autopilot, I would have been dead.
How does insurance work for this? I mean, if I were an insurance company, I don't think I would want to insure a self-driving car at all.
The HAL 9000 should be a reminder of what can happen if we allow computer technology to be totally in control. Wait until the kill switch is installed in new cars in 2026, which is already mandated by the Infrastructure bill, and which you were not allowed to VOTE on.
Just like taxation without representation you will not have any say in your safety behind the wheel of any new car.
That would be a great number plate HAL-9000.
Elon claimed straight out the software would take you from Cali to New York hands free. He encouraged unsafe use of the software.
In our modern vocabulary it's called "Misinformation", used to be called 'Bull Sh!t'
It also shouldn't be called "Full Self Driving". I own a Tesla and there are three levels of autopilot. I hardly use it because it's more of a pain in the butt than just driving. Enhanced Auto Pilot and Full Self Driving are in beta testing. You pay $12,000 for it to "test" it. I'm willing to bet a lot of these crashes are due to morons trying to trick the safeguards. They hang weights off the steering wheel so the system thinks someone is holding the wheel. They even tape photos over the camera to try and trick the system into believing they are alert. This is more of a moronic Tesla owner issue than a Tesla issue.
He said that would happen in the future. Not now. I think you probably knew that when you wrote this comment.
How in hell do the governments allow this ‘autopilot’ mode to be used in these and other vehicles. Did the vehicle with autopilot have to sit a license test and who’s going to be responsible for any damages and deaths involved. I don’t remember voting for this on the same roads that I use.
The Governments usual response is "Trust the Science"!
We're in an experiment without giving consent. Autopilot tries to solve a non-existent problem and makes for lax drivers.
I can't think of anything more stressful than babysitting your car's computer, except for..... babysitting a kid running loose outdoors.
It's actually not bad if you use it in the situations where it performs well, like stop-and-go traffic on a freeway, or cruising in light traffic on a freeway. You're still paying attention, but you can look down to change the radio channel, or look to the side to see what that new store is advertising. And it's convenient cruising behind someone who is letting their speed drift up and down, because the car will slow to match and you don't have to be staring ahead to see if the guy just dropped 2MPH off his speed and now you're getting close.
Occasionally it'll do something stupid like slam the brakes for a quarter second, or notice it's on a bridge going over a much slower road and turn down the cruise control speed without mentioning it, but you get used to the quirks.
@@darrennew8211 If the controls for the vehicle's accessories were not on a tablet screen, you wouldn't need to look down.
With the heavier weight of EVs, if they run into a pedestrian there is a higher chance of causing a fatality.
A pedestrian doesn't care about a few extra hundred pounds. Any vehicle hitting a pedestrian will do serious damage. If anything, the lack of an engine will likely improve HIC (head impact criteria)
@@StacheDTraining 9,000lb Hummer EV begs to differ...
Yes, but if you look carefully you will notice that the pedestrians are also a few hundred pounds heavier than they used to be.
Twaddle, a solid metal object at a certain speed will do the same damage regardless of weight, weight only affects stopping distance.
@@StacheDTraining Have you seen the Cybertruck? You'll get sliced in half. And that angle is reinforced, so you're not bouncing off that hood.
Saving the environment and universe one crash and fire at a time.
I wonder if all the bright flashing and strobe lights blind or confuse the car's cameras and sensors.
That's what I've thought for a while.
If it’s confused it will tell the driver to take over controls.
But ... that's kind of the point. Flashing strobe lights should mean "slow or stop", not "ignore these things."
Autopilot is blind to construction props, stopped school buses and more. Battery repair/replacement is reported at $20,000 or more. Tire treads wear out very quickly due to the weight. Putting out a fire in one of these vehicles is a nightmare, as it has been reported that once out, the fire can re-ignite... days later. Maybe one day the kinks will get worked out.
Current versions of FSD beta can handle construction zones pretty well. Still years to go till it's ready for no human behind the wheel, but slowly improving.
Tyres too are very expensive, and currently only premium-brand tyres are available. That may change. That is, IF EVs are still around: it does seem insurance costs will kill them, along with crash repairers who do not want them because their insurers will not cover them.
So many, MANY accidents were prevented by the driver paying attention and overruling the autopilot. Simply what I always say: never trust a computer, and do not use a computer when there is a wide array of variables and conditions. That is why autopilot in a plane works: no curbs, no pedestrians, no streetlights, no trees, no animals, no crowded random vehicles around, no holes in the road, no blind corners, no buses stopping, no train crossings, etc.
A school bus is hardly a “non-standard situation”.
They are ubiquitous.
What happened to common sense? These cars are not ready for prime time! Autopilot is dangerous. The risk is too great.
I think that is a good question.
belief in Musk's promises is strong enough in his cultists to overshadow any common sense
It's a driver-assist system; the driver is responsible for the autopilot system, not the other way around. So the driver has to learn its flaws and when to take action.
@@roybm3124 I know, right? What idiot would think that FULL SELF DRIVING would actually mean it could drive itself.
Who wants to constantly babysit a flawed and dangerous driving system and take on the moral and legal liability? Just F'n drive the damned car, as is your responsibility. Our highways and streets should not be subject to public beta testing of this dangerous technology. @@roybm3124
Imagine turning on cruise control and actually thinking that you suddenly aren't responsible for driving your own vehicle anymore.
Don't forget that people who are this hopelessly regarded only need to take three turns in a parking lot to get a driver's license in the US...
I use adaptive cruise control on my Subaru Forester all the time. It improves my safety. But I have to keep my hands on the steering wheel, eyes on the road, and steer the car. Full Self Driving is not ready for public roads.
They really need to get this autopilot working properly so when the time comes after mandating everyone have an EV, “they” can log in remotely and control everyone’s personal vehicles.
It will either be that or a foreign actor will hack in and cause a major incident.
Think of even now with OTA access to the vehicle computers, companies can shut you out of your personal property.
Someone should use that car takeover event as the idea for a movie script. Oh wait. Netflix already has.
You think they want everyone in EVs? That's impossible. But they do want the lower class (no middle) with nothing but public transit for their 15-minute-city commute.
Good luck trying to turn my steering wheel. It has a physical lock. I won't drive by wire anyway.
Heaven help you if you're a cop directing traffic in the road and a Tesla on "full self driving" shows up.
That's the beauty of Tesla. Every single sensor in a tesla records data all the time. This data can and will be analyzed by authorities, when in doubt. That's why almost every case of "Autopilot failure" turns out to be human failure.
To get a proper baseline, you have to compare non-autopilot accident rates to autopilot accident rates to see if it is a benefit or an added hazard. Regardless of whether it is better or worse, you will have accidents with autopilot, so until you do a proper study, you can't say that it increases the risk of driving.
Most of these accidents and deaths are caused by people not paying attention, by drivers not doing the thing they are supposed to do: *pay attention and be ready to take over*.
At least there's an emergency vehicle on scene when you crash into an emergency vehicle. What better "safety" "feature" can you ask for?
More than ever, people have less common sense and responsibility, and autopilot only adds to this growing issue. Maybe in the next 100 years everything will be done for people and they won't be able to think for themselves at all!
It's going to be like that movie Wall-E.
Although these crashes show the weaknesses of the Tesla Auto Pilot, all of these are the fault of the driver. No matter how good the Auto Pilot is, YOU are still THE PILOT.
I just rely on knowing how to drive without hitting stuff.
In South Carolina it's common to have to yield to left-turning traffic while making a right turn on GREEN 😮, which I have never seen anyplace else.
In the past, I was a drone pilot. After a few erratic automated flight incidents, put through grief being told each incident was my fault even though I was just monitoring what the aircraft was doing by itself, I learned to never trust automated features. You, the user, are responsible for anything that goes wrong. Since this is the case, I only trust myself. I’m glad I experienced that with a half pound drone and not a 5,000 lb. vehicle.
That really is the crux of the situation. You can be the pilot in command and have full control, or you can be a passenger and have zero responsibility. There can be no gray area in the middle. I am 50 years old and do not even use cruise control. When I am driving, I am driving.
I couldn’t agree more. I only use cruise control on wide open roads without anyone around. I had a Toyota Sienna with radar cruise control years back and it was amazing technology but I never trusted it because of its rudimentary flaws. This was an ‘08 Toyota and 15 years on, I still don’t trust it. I am in control.
Recall the DRIVER.
First time here, but man, that's a 'stache you can definitely trust!
Driver assistance systems (or whatever you call them) should be "passive", i.e. the driver is always the primary controller of the vehicle, but assistance steps in if the driver is drifting over the side lines, or becomes unconscious, and things like that. Well, my opinion anyway; driver complacency is not a good thing.
Human environments should be for humans, autonomous robots belong in back rooms.
Every major automaker has an ADAS system that has the same capabilities of basic Autopilot (traffic aware cruise control and lane keeping). But police don’t ask unless it’s a Tesla. The media doesn’t broadcast it unless it’s a Tesla. Do you really think that other ADAS system haven’t hit emergency vehicles or other objects? How would those manufacturers have trained their vehicles on relatively rare situations like emergency vehicles without being able to get video from their fleet?
Most manufacturers don't only use cameras. I'll do a video on the topic in the future. Also, this isn't media driven; it's driven by an NHTSA investigation.
Two comments: 1. - If someone can ignore 150 warnings (which is designed to protect the occupants of the vehicle AND all the vehicles around it), then the system should essentially shut the car down after a pre-determined number of warnings e.g. 10. That last warning should include the statement that the car will be immobilised and authorities called (i.e. whether they are drunk or medically incapacitated). It should be able to find a suitable place to bring the vehicle to a safe stop away from the road. 2. - About the bus incident, and this is a serious question. Are children not taught to check the road before crossing it, regardless of the circumstances? I know the whole purpose of the warning lights and STOP signs is to stop the traffic, but if they get off the bus and cross the road immediately in front of the bus, they are crossing blind, and any vehicle (such as that Tesla mentioned) isn't going to see them.
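The escalation policy proposed in point 1 above could be sketched as a simple counter. This is a hypothetical illustration of the commenter's suggestion, not Tesla's actual logic; the cutoff of 10 is the number the comment suggests.

```python
# Hypothetical sketch of the proposed escalation policy (NOT Tesla's
# actual behaviour): count ignored hands-on-wheel warnings and escalate
# from nagging, to a final warning, to a controlled stop with a call out.

MAX_WARNINGS = 10  # cutoff suggested in the comment, an assumption

def next_action(ignored_warnings: int) -> str:
    """Map the running count of ignored warnings to an escalating response."""
    if ignored_warnings < MAX_WARNINGS - 1:
        return "warn"                  # audible/visual nag
    if ignored_warnings == MAX_WARNINGS - 1:
        return "final_warning"         # announce imminent immobilisation
    return "safe_stop_and_call"        # pull over safely, disable, call authorities

print(next_action(0))    # warn
print(next_action(9))    # final_warning
print(next_action(10))   # safe_stop_and_call
```

The design choice worth noting: the final warning is a distinct state so the driver is told what will happen before the car acts, which also covers the medical-incapacitation case the comment raises.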
School bus is not a non-standard situation, it's so standard that it is in the law. Shame!
The automation is, honestly, capable of a lot. It’s not however a living, thinking being, which is where you start to approach ‘enough’. It sounds like it’s failing to recognize when it’s outside standard scenarios and fails to apply extra caution - ‘there are stopped emergency vehicles in my lane with lights flashing - let’s slow down in case I need to stop or switch lanes’
The algorithm should be - I see flashing lights, disengage autopilot and scream loudly at the driver.
I read that tesla may be instructed to recall cars due to suspension issues.
"The Dawn project"!?? How is he doing now!?
Recall or over the air update
It's one and the same these days.
We need ZERO auto pilot
My husband used to work in engineering for companies that made car parts, and there is a phrase, 'Risk Compensation', i.e. the safer you feel because of all your shiny in car gadgets, the more you drive like a dick.
auto pilot or full self driving for any vehicle is banned in Australia
Cheers john
At the end of the day, it's not the fault of autopilot or Tesla's technology. It is 100% the fault of the drivers. How about a fair comparison between accidents that have been avoided thanks to autopilot.
Sir, I have over 50K miles of personal experience driving with Tesla's FSD. I'm also professionally very active in the AI space. Respectfully, you got some fundamental things wrong here. As an owner of the vehicle, I can confirm from my own personal testing and experience that some of what you said simply isn't true. I'm curious about your sources, because what you're saying is so incorrect it's almost a crime to say it. I'm assuming you're not just Tesla bashing and you want to put out factual content. I'd be happy to connect with you to review my experiences with both my Teslas driving me around. After 50K miles I can promise you one thing: my car won't hit an emergency vehicle just sitting out on an open road with its lights on, but I can tell you exactly why some do. No pressure, let me know if you want to have a quick chat.
Thank you. It's only Tesla owners that can see through all these negative BS posts. Unfortunately no one will believe us. LOL. People love to click on hype and add fuel to their hate.
If you are too lazy to drive the damn thing, ask a friend for a ride!
correction - people think they are smart.
not - people are smart.
Not sure why you would have a car and not drive it yourself. Half the fun of having a car.
My whole life I have heard people complain about their computer, phone and countless other computerized devices not working properly... why in the name of God's green earth would you trust one to drive?
Ooh, that's right: you're modern and smart.
My Ford Everest, 3 times in 2 years, decided the car in front was slowing down too quickly; my car auto-braked and then went into low-power mode.
Had to roll to a side road, turn it off... on... All good.
Ford dealer advised nothing found wrong.
If software engineers can't get this right, why would I trust a fully automated vehicle?
I don't like public airplanes for this reason.
We have been manually driving cars for well over a hundred years; if it ain't broke, why fix it, is my thought.
Can't blame the vehicle if the culprit is a loose nut between the steering wheel and driver seat. 🤔😉
Autopilot is no different from using cruise control on older cars: you don't just drive into the back of things with cruise control on. How many of you drive a car with cruise control on and don't pay attention?
I’ll bet $100 that they’ll be putting heart monitoring systems in steering wheels.
So the cars don't respect authority? Lmfao 😂
Autopilot shouldn't have been put in the Tesla. The new update might not work
Autopilot makes sense in an airplane, because they operate very differently to a car, perhaps they shouldn't have called it autopilot.
It's only a glorified level 2. "Auto pilot" is only in the name.
Well if droids could think, where would we be?
If it works really well, but has flaws, it's not working very well......
We are living in Idiocracy.
I don't know... when the computer is driving, one should be permanently as careful as one would be if driving oneself.
That will not happen, ever. If the car drives itself, one will not be as careful.
So the software would need to be perfect, with no errors, and that is a utopia.
No. Just no.
They should just ban it.
Seems the owners and other road users here are just lab rats.
beta testers
Judging from this video, a Tesla is not in my immediate future. They still have work to do.
You are out of your wheelhouse on this one.
Look at the data, a Tesla on Autopilot is less likely to run into emergency vehicles than a person.
Autopilot is SAFER for emergency workers, not more dangerous.
So NHTSA didn't require a recall on the autopilot?
It's not needed; get rid of it.
NTSB knows more about autopilot crashes than Tesla? A cynical person would say Tesla knew and did nothing until forced to.
Cars, even Teslas, don't "crash into" people; they run over them (not sure about the word). I would never say a car crashed into a person.
Umm, how about NO, never. They're nuts. Yeah, it's called a cell phone.
In the moment just before a crash, the driver doesn't react in time because they expect the car to, by which point it's too late. 😮
The whole joy of driving a car , is ? Actually Driving the Car ! WTF
Isn't this the sort of problem you figure out before you release the vehicle onto the road?
Good deal for Tesla. You pay THEM over $10k to die beta testing their software.
It's outrageous that any company is allowed to 'develop' safety-critical technology on open public roads using actual people - drivers, cyclists, motorcyclists, pedestrians, traffic, emergency, and maintenance workers - as crash-test subjects.
You think that’s bad, wait till you find out about MRNA medicines.
It's the reckless marketing often used by Tesla. Waymo has an almost clean safety record with no known fatalities.
Waymo is a bit different. It only functions in a limited area, and speed is capped. Waymo has had around 150 accidents with 3 injuries. It still has issues driving into construction zones.
@StacheDTraining Yes, it functions as a driverless taxi in designated zones. But the experience reported by users is generally very positive. The injuries involved are not serious, and the accident rate is far lower than that of human drivers.
Why is auto pilot legal for fuck sake?
Technology will ALWAYS fail; it's only a matter of time.
But how can Electric Jesus get it so wrong? After all, he told everyone he "...knows more about manufacturing than anyone else on the planet..." (Which I would assume includes.... God?)
Maybe yes, maybe no? Sorry, but if you want to use an autopilot, it must be a requirement to record all video and data and provide it to the police.
Privacy and patents? OK, then it is not allowed on public roads, easy as that. And what about the privacy of everyday people being recorded by those cars? Lots of those recordings even end up on YouTube. But if I get killed by a car, can the police access that same recording to find out who or what killed me?
I remember seeing a video where a Tesla using autopilot literally just stopped because it got confused on a busy city road, which then caused a huge pile-up behind it. The driver was apparently asleep (possibly using those weight things on the steering wheel to trick the system) and only woke up when he was hit. He got out of the car assuming that the pickup truck behind him was the only vehicle involved, and it wasn't until he looked up from seeing where the truck had hit him that he noticed there were multiple vehicles involved. I think he was going to have a hard time explaining that to the insurance company.
It's really amazing how much better the human brain is compared to the most state-of-the-art computers even today. Autopilot is not safe now and won't be safe anytime soon.
Eyes are a whole lot better too.
At least you said "soon"
At this rate, we're going to dumb ourselves down enough that even an Autopilot is smarter and more decisive.
All joking aside; I'm sure improvements in AI are going to help self-driving cars adapt to more conditions, faster. But I do still worry about the mental devolution of humans in the last 30 years... Or maybe it's not that we're getting dumber, but access to the internet is just showing us how many idiots have lived among us since forever?
@@lunasakara7306 One of the problems is that humans can use social skills to figure out other humans. Example: you come to a flashing yellow arrow, and you've never seen a flashing yellow arrow before, so you have to figure out what it means. I can think "I know what a green arrow and a red arrow means, and I know what a flashing solid circle means, so I can figure out what the designer of this traffic signal would have intended me to understand this signal to mean." A car is likely never going to do that.
Similarly with (say) a pedestrian standing on the side of the road. I can look at him and guess pretty confidently whether they're waiting to cross or not, just from body language, or even seeing the stuff they're carrying (like, car keys in hand? A shopping bag logo'ed with the name of the store behind them?)
I refuse to drive a rolling mobile device on four wheels.
I love moustache, I have moustache but yours is a disaster
Simply not ready to be passed off as safe for use.
Why does society NEED autopilot on vehicles? So people can go to sleep? So they can play with their phones? To start work early? All bad reasons.
It's handy on freeways with traffic that keeps changing speed. It's nice to cruise along and be able to (say) change sunglasses, change the radio station, look at the car beside you, etc. It makes it much safer to take your eyes off the road for three or four seconds. IME, driving 200 freeway miles on AP is like driving 50 miles without it, in terms of effort and fatigue.
@@darrennew8211 You're addicted to the Kool-Aid. It's not needed. If you're too tired to drive without an autopilot, you're too tired to drive with one. And 3 or 4 seconds is more than enough time to get killed. What is already happening is that people, some of whom have already lost their lives, are relying on the technology too much. It isn't reliable 100% of the time and never will be. That's obvious to all but the fan boys.
Musk (I'm NOT defending/supporting auto-driving systems) had an observation of "autopilot just needs to be better than the average american driver". And good luck with perfect software, not going to happen.
The average is lowered by selling touch-only cars and driver assistants that fail when really needed.
It not only has to be better than the average driver, it also has to not get involved in accidents that humans would never even begin to get involved in. If it had half as many accidents, but all of them involved running into the back of a police car stopped on the side of the road with lights on, it's not going to fly.
That has an easy fix: just program the car to brake and stop if the driver doesn't keep his hands on the steering wheel for more than a few seconds. That would force the driver to pay attention.
That's what it's supposed to already do. Did you see the pictures of people putting artificial machines on the wheel to fool the car?
It actually does that but idiot owners always find a way to “trick” the safeguards.
Elon is insane
Complexity is the enemy of reliability.
The most important thing to learn about EV's, is not to buy them.
Digi-computeletronic navigational A.I. (autopilot) disturbs me. Call me a paranoid Pamela, but I'd be too scared to EVER take my hands off the wheel, autopilot or no… 😓
EM has got to go to jail for naming it FSD (IMO). It's still not there, and you have to pay $$$ for it. What a scam.
Why is Tesla still in business? Their false range predictions, disgusting build quality and countless faults mark them out as the biggest con job in automotive history.