Tesla Autopilot Recall: Crashes into Emergency Vehicles

  • Added Jul 4, 2024
  • Uncover the shocking details behind Tesla's Autopilot nightmare in this in-depth video. After more than two years of investigation, the National Highway Traffic Safety Administration (NHTSA) has issued a sweeping recall due to vehicles crashing into emergency vehicles, alongside other incidents resulting in multiple injuries and fatalities, including a Tesla driving around a bus with its stop sign activated and seriously injuring a student.
    Join the Crew! shop.stachedtraining.com
    Affiliate Links - Helps me to continue to create content!
    Fire Dept. Coffee Veteran Owned, Firefighter Run
    bit.ly/42mOHXi
    Moditech Crash Recovery System
    stachedtraining.com/moditech-crs
    *I may earn a commission should you choose to sign up for a program or make a purchase using my links.
    ‪@realdawnproject‬
    WSJ video - • Tesla Dashcam Footage ...
    00:00 - Introduction
    00:28 - What is Autopilot
    01:30 - Crashes into Emergency Vehicles
    02:33 - Dawn Project Data
    04:26 - Nonstandard Roads & Night Driving
    04:45 - Autopilot & School Buses
    05:27 - Dawn Project Testing
    05:40 - Tesla Recall

Comments • 241

  • @davidrobert2007 • 6 months ago +39

    I would bet if there was no auto pilot, and no airbags, but a six inch metal spike coming out of the centre of the steering wheel - everyone would pay attention and drive safely 😊

  • @smorris12 • 6 months ago +26

    "Pay attention and be prepared to take over" is, mentally, far harder than just driving.

    • @snorman1911 • 6 months ago +2

      Good point, like watching your teenage driver 😂

    • @kirkjohnson6638 • 6 months ago +2

      Absolutely 100%. It is also way easier to stay focused if you are driving fast on a winding mountain road than if you are driving on a road with no turns for 100 miles. It's the perceived danger that keeps us attentive.

    • @susanpetropoulos1039 • 6 months ago +1

      And staying on high alert is more exhausting.

    • @MikesProjectsandHobbiesMC • 6 months ago +1

      You are absolutely correct.

    • @skitidet4302 • 3 months ago

      @@susanpetropoulos1039 It's not that exhausting with enough adrenaline going through your veins. Hence why race car drivers can easily keep super focused for hours at a time and why keeping very focused, even for a small amount of time, is nearly impossible if you are just waiting for something to happen without doing anything.

  • @notsure7060 • 6 months ago +27

    We live in a time where humans can't even open a door. The less we use our bodies and minds, the more they deteriorate... It's like a horror film playing out before our eyes, and it feels like nobody cares....

    • @bentullett6068 • 6 months ago +1

      Ironically, you are absolutely correct about humans struggling with the concept of opening a door. I volunteer for a heritage railway in the UK, and the railway now has to send a safety video to people who have booked tickets online to show them how to operate the doors on the old carriages, as most people can no longer work a door handle; most new trains have push buttons to open the doors.

    • @notsure7060 • 6 months ago +1

      @@bentullett6068 Really? Damn, that is quite disturbing. I'm not an old-timer (last of Gen X, as they call it), but I'm glad I grew up around the kind of folks who taught me how to figure things out by myself. I'm not trying to diss anyone, but it looks like the people in charge no longer trust us with even the simplest things, like the one you just described. Still, have a wonderful new year, and thanks for the little story. Best regards from Norway (PS: sorry about my writing, English is not my first language)

  • @Salmon_Rush_Die • 6 months ago +43

    I am slow to adjust to new technology, for sure, but I still can't get over the insane idea that we have cars out there that are "self-driving." What madness is this?! I wanted to say something like, "lol it can't be worse than drivers already are," but it's a new kind of horror that you can be plowed under by an errant robot at any time without warning.

    • @ptonpc • 6 months ago +3

      The drivers are supposed to be paying attention and be ready to take over. This is people being lazy and complacent.

    • @MattExzy • 6 months ago +3

      I've still never used regular cruise control in a car. My car has the feature, but the control has never been touched since I don't trust it - sounds extreme, but that's me. I can't imagine using entire automation.

    • @metasaurus3233 • 6 months ago

      @@ptonpc He is talking about self-driving cars with no people in them; they have driverless taxis now in most cities.

    • @ptonpc • 6 months ago

      @@MattExzy Same here.

    • @Supercon57 • 6 months ago

      @@MattExzy You don't trust cruise control? That's pretty extreme, IMO. I use it every day on my commute and couldn't imagine not using it. It sets your vehicle at a speed you determine so you travel at a consistent speed; you don't unknowingly slow down or speed up from the user error of having a light foot. When my roommate would drive, she would regularly get distracted, slow down a good 10 mph, and her speed would fluctuate all over the place. I would have to remind her about cruise control, and then suddenly we would be traveling at a consistent speed.

  • @Walter-wo5sz • 6 months ago +10

    I wouldn't call a kid walking in front of your car a nonstandard situation. Anyone who injures or kills another person while driving on autopilot should be charged.

    • @linusa2996 • 6 months ago +2

      No kidding, it's why I don't drive on Halloween night

  • @johnecker4217 • 6 months ago +13

    First, I would like to say that your coverage is fair, balanced, and factual.
    Second, the computer telling the driver 150 TIMES to put his hands on the steering wheel IS UNACCEPTABLE! An "auto-drive system" should have a safety override program that pulls the car over and shuts the power off until the driver is compliant.

    • @cayminlast • 6 months ago +2

      Fantastic idea, could be used for all sorts of things like "non-payment" of taxes, traffic fines etc. Just disable the guilty party's vehicle till they pay up, excellent idea!

    • @darrennew8211 • 6 months ago +1

      It's supposed to eventually just come to a stop in the road, on the basis that if you're having a medical emergency, you'll get noticed more quickly. Honestly, there isn't really a good solution to the problem of "I've told someone else to do something dangerous, and now they can't do that, and I won't take control back."

    • @lunasakara7306 • 6 months ago

      @@cayminlast How kind of you to assume the person being forcibly stopped in their vehicle is guilty of something first.
      Because we all know that power won't ever be abused.

    • @MikesProjectsandHobbiesMC • 6 months ago +2

      The computer will only warn you five times and then disables itself. Not sure how this person got 150 warnings.

  • @papalegba6796 • 6 months ago +11

    Autopilot drives into the back of motorcycles & under trailers too. It's murder but nothing will be done because of Teflon Musk.

  • @gomezgomezian3236 • 6 months ago +23

    The 'night' problem comes from Tesla only using cameras (and obviously their ability to 'see' is reduced at night, just as ours is).
    The 'non-standard' situation problem, is that when the camera 'sees' something, it must check in the car's database to see if that is something it already 'knows'. If it is, the database tells it what to do. If it isn't in its database (the 'non-standard' bit), a prudent safety strategy says 'stop'. But Elon doesn't want his customers being stopped for things that 'probably aren't a problem' ... so he told his programmers to tell the car "if you don't specifically know that something you see is a problem, then presume it isn't and just keep going".
    Oh, and to keep the processing time to search through the database as short as possible, they purposely delete items that are not common (like an Amish horse-drawn buggy).
    And the couple of motorcyclists killed at night, by the Tesla running into the back of them? That is because Teslas only use cameras, so have no way of telling how far away something is, other than comparing the apparent size with the size of the 'identified object' in its database. It is believed that in each case, the Tesla identified the motorcycle directly in front of it, as a car way off in the distance (and so did not slow down).

    • @StacheDTraining • 6 months ago +4

      There are many issues due to using cameras without other redundant systems.

    • @roybm3124 • 6 months ago

      With radar ACC, motorcycles are not always detected when they are hugging the lane lines. But you notice this once and then are aware of it. Stationary objects are a pain for radar. So even with camera + radar in foggy weather, it's still not 100% certain to detect objects in time.

    • @AndrewTSq • 6 months ago +1

      Search through a database? I think it's a vision system at play, with a pre-trained model that gives you an answer about what it sees, usually with a confidence score for how sure the model is about that answer. So it could say "Dog at position x, I'm 20% confident". The problem with using cameras is that something can sometimes look like something completely different from what it is. Let's say that in some sunset, a gray car looks like the asphalt on the road: the car will probably crash into it. Also, from my tests using dual cameras for depth instead of a laser, it's not at all accurate compared to a laser.
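The classify-then-act pipeline this comment describes can be sketched in a few lines. Everything here is invented for illustration (the labels, the scores, the threshold, the `plan_action` name); it is not Tesla's actual code, only a minimal sketch of how a confidence threshold can gate a driving decision.

```python
# Illustrative sketch (NOT real autonomy code): a vision classifier
# emits (label, confidence) pairs per frame, and a policy decides
# what to do based on a confidence threshold.

def plan_action(detections, confidence_threshold=0.5):
    """Decide whether to brake, given classifier output for one frame.

    `detections` is a list of (label, confidence) pairs, as a
    pre-trained model might emit. A permissive policy like this one
    ignores low-confidence detections ("if unsure, keep going");
    a conservative policy would brake instead.
    """
    for label, confidence in detections:
        if label == "obstacle" and confidence >= confidence_threshold:
            return "brake"
    return "continue"

print(plan_action([("obstacle", 0.2)]))  # low confidence -> "continue"
print(plan_action([("obstacle", 0.9)]))  # high confidence -> "brake"
```

The safety argument in the surrounding comments is exactly about which branch handles the uncertain case: "continue" on low confidence trades false stops for missed obstacles.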

    • @AndrewTSq • 6 months ago +1

      @@roybm3124 Radar needs a Doppler ("phase") shift to detect objects easily, so you are right. That also means that if an object in the radar's view has the same speed as you, it might be hard to see. I would say these self-driving cars should be required by law to have all the systems: laser (lidar), radar, ultrasound, and cameras. And if any sensor notices something weird in the data, the system needs to slow the car down and ask the driver to take over directly.
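The Doppler relationship behind this exchange can be checked with a few lines. The 77 GHz carrier is a typical automotive radar band; the closing speeds are illustrative numbers, not data from any incident.

```python
# Two-way Doppler shift seen by a radar: f_d = 2 * v * f_carrier / c.
# A target closing on you produces a measurable shift; a target moving
# at exactly your own speed (zero closing speed) produces none, which
# is one reason such targets are harder to separate from the return.

def doppler_shift_hz(closing_speed_mps, carrier_hz=77e9, c_mps=3.0e8):
    """Two-way Doppler shift for a target closing at the given speed
    (positive = approaching). 77 GHz is a typical automotive carrier."""
    return 2.0 * closing_speed_mps * carrier_hz / c_mps

print(doppler_shift_hz(30))  # closing at 30 m/s -> 15400.0 Hz
print(doppler_shift_hz(0))   # same speed as you -> 0.0 Hz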

    • @supertec2023 • 6 months ago +1

      What's crazy is that if you have a Tesla that came with radar, that system has been deactivated: something you paid for.

  • @SuperBartet • 6 months ago +9

    A car crashes head-on into a Tesla, and you say the autopilot is not to blame? If the owner had been driving it themselves, I think they would have steered out of the way most of the time, so I do blame the autopilot, because there is doubt that the crash would have happened with a human at the wheel.

    • @GF-mf7ml • 6 months ago

      The only thing that matters is who is behind the wheel.

    • @SuperBartet • 6 months ago

      @@GF-mf7ml Yes, that's right. In 49 years of driving I have taken evasive action to avoid a head-on crash many times. Two I will not forget. First: in a 30 mph zone, on a road curving to the left, a car came at me at very high speed; he braked and ended up in a spin, and I ended up fully on the pavement as he skidded sideways past me. I got his reg and reported him to the police, and he got done for it. Second: turning off a main roundabout into a road with a long queue of cars waiting to get on, an ambulance with lights flashing came at me very fast on the wrong side of the road, and I ended up off the road on the grass. He endangered my life; if he had had his siren going, I would have heard it. Both times I'm sure that if I had been on that Tesla autopilot, I would have been dead.

  • @AndrewTSq • 6 months ago +6

    How does insurance work for this? I mean, if I were an insurance company, I don't think I would want to insure a self-driving car at all.

  • @charlesslack8090 • 6 months ago +12

    The HAL 9000 should be a reminder of what can happen if we allow computer technology to be totally in control. Wait until the kill switch is installed in new cars in 2026, already mandated by the infrastructure bill that you were not allowed to VOTE on.
    Just like taxation without representation, you will not have any say in your safety behind the wheel of any new car.

    • @RoverIAC • 6 months ago

      That would be a great number plate HAL-9000.

  • @bobbybishop5662 • 6 months ago +12

    Elon claimed straight out the software would take you from Cali to New York hands free. He encouraged unsafe use of the software.

    • @cayminlast • 6 months ago +3

      In our modern vocabulary it's called "Misinformation", used to be called 'Bull Sh!t'

    • @MikesProjectsandHobbiesMC • 6 months ago

      It also shouldn't be called "Full Self Driving". I own a Tesla, and there are three levels of autopilot. I hardly use it because it's more of a pain in the butt than just driving. Enhanced Autopilot and Full Self Driving are in beta testing; you pay $12,000 to "test" it. I'm willing to bet a lot of these crashes are due to morons trying to trick the safeguards. They hang weights off the steering wheel so the system thinks someone is holding the wheel. They even tape photos over the camera to try to trick the system into believing they are alert. This is more of a moronic Tesla-owner issue than a Tesla issue.

    • @juliahello6673 • 6 months ago +1

      He said that would happen in the future. Not now. I think you probably knew that when you wrote this comment.

  • @markcummings150 • 6 months ago +5

    How in hell do governments allow this 'autopilot' mode to be used in these and other vehicles? Did the vehicle with autopilot have to sit a license test, and who's going to be responsible for any damages and deaths involved? I don't remember voting for this on the same roads that I use.

    • @cayminlast • 6 months ago +1

      The Governments usual response is "Trust the Science"!

    • @luigig6256 • 6 months ago

      We're in an experiment without giving consent. Autopilot tries to solve a non-existent problem and makes for lax drivers.

  • @HuFlungDung2 • 6 months ago +5

    I can't think of anything more stressful than babysitting your car's computer, except for..... babysitting a kid running loose outdoors.

    • @darrennew8211 • 6 months ago

      It's actually not bad if you use it in the situations where it performs well, like stop-and-go traffic on a freeway, or cruising in light traffic on a freeway. You're still paying attention, but you can look down to change the radio channel, or look to the side to see what that new store is advertising. And it's convenient cruising behind someone who is letting their speed drift up and down, because the car will slow to match and you don't have to be staring ahead to see if the guy just dropped 2MPH off his speed and now you're getting close.
      Occasionally it'll do something stupid like slam the brakes for a quarter second, or notice it's on a bridge going over a much slower road and turn down the cruise control speed without mentioning it, but you get used to the quirks.

    • @happyjoyjoy6976 • 6 months ago

      @@darrennew8211 If the controls for the vehicle's accessories were not on a tablet screen, you wouldn't need to look down.

  • @jeffbroders9781 • 6 months ago +13

    With the heavier weight of EVs, if they run into a pedestrian there is a higher chance of causing a fatality.

    • @StacheDTraining • 6 months ago +3

      A pedestrian doesn't care about a few extra hundred pounds. Any vehicle hitting a pedestrian will do serious damage. If anything, the lack of an engine will likely improve HIC (head impact criteria)

    • @doublebackagain4311 • 6 months ago +2

      @@StacheDTraining 9,000lb Hummer EV begs to differ...

    • @RoverIAC • 6 months ago

      Yes, but if you look carefully you will notice that the pedestrians are also a few hundred pounds heavier than they used to be.

    • @altvamp • 6 months ago

      Twaddle; a solid metal object at a certain speed will do the same damage regardless of weight. Weight only affects stopping distance.

    • @darrennew8211 • 6 months ago +2

      @@StacheDTraining Have you seen the Cybertruck? You'll get sliced in half. And that angle is reinforced, so you're not bouncing off that hood.

  • @martinr8278 • 6 months ago +9

    Saving the environment and universe one crash and fire at a time.

  • @RC-wu6gm • 6 months ago +4

    I wonder if all the bright flashing and strobe lights blind or confuse the car's cameras and sensors.

    • @StacheDTraining • 6 months ago +5

      That's what I've thought for a while.

    • @roybm3124 • 6 months ago +1

      If it’s confused it will tell the driver to take over controls.

    • @darrennew8211 • 6 months ago

      But ... that's kind of the point. Flashing strobe lights should mean "slow or stop", not "ignore these things."

  • @pchelloo • 6 months ago +11

    Autopilot is blind to construction props, stopped school buses and more. Battery repair/replacement reported at $20,000 dollars or more. Tire treads wear out very quickly due to weight. Putting out a fire in one of these vehicles is a nightmare as it has been reported that once out, the fire can re-ignite.....days later. Maybe one day the kinks will get worked out.

    • @garychlastawa8277 • 6 months ago

      Current versions of FSD beta can handle construction zones pretty well. Still years to go till it's ready for no human behind the wheel, but it's slowly improving.

    • @ldnwholesale8552 • 6 months ago

      Tyres too are very expensive, and currently only premium-brand tyres are available. That may change. That is, IF EVs are still around; it does seem insurance costs will kill them, along with crash repairers who do not want them because their insurers will not cover them.

  • @basbass429 • 6 months ago +2

    So many, MANY accidents were prevented by the driver paying attention and overruling the autopilot. It's simply what I always say: never trust a computer, and do not use a computer where there is a wide array of variables and conditions. That is why autopilot in a plane works: no curbs, no pedestrians, no streetlights, no trees, no animals, no crowded random vehicles around, no holes in the road, no blind corners, no buses stopping, no train crossings, etc. etc.

  • @jeremyashford2145 • 6 months ago +2

    A school bus is hardly a “non-standard situation”.
    They are ubiquitous.

  • @davidsoom1551 • 6 months ago +22

    What happened to common sense? These cars are not ready for prime time! Autopilot is dangerous. The risk is too great.

    • @ecsolha • 6 months ago +1

      I think that is a good question.

    • @marianpazdzioch6632 • 6 months ago +5

      Belief in Musk's promises is strong enough in his cultists to overshadow any common sense.

    • @roybm3124 • 6 months ago +1

      It's a driver-assist system; the driver is responsible for the autopilot system, not the other way around. So the driver has to learn its flaws and when to take action.

    • @jebes909090 • 6 months ago +1

      @@roybm3124 I know, right? What idiot would think that FULL SELF DRIVING would actually mean it could drive itself.

    • @davidsoom1551 • 6 months ago +1

      @@roybm3124 Who wants to constantly babysit a flawed and dangerous driving system and take on the moral and legal liability? Just F'n drive the damned car, as is your responsibility. Our highways and streets should not be subject to the public beta testing of this dangerous technology.

  • @anonanon1604 • 6 months ago +1

    Imagine turning on cruise control and actually thinking that you suddenly aren't responsible for driving your own vehicle anymore.
    Don't forget that people who are this hopelessly regarded only need to take three turns in a parking lot to get a driver's license in the US...

  • @softwarephil1709 • 6 months ago +2

    I use adaptive cruise control on my Subaru Forester all the time. It improves my safety. But I have to keep my hands on the steering wheel, eyes on the road, and steer the car. Full Self Driving is not ready for public roads.

  • @jamesready5 • 6 months ago +8

    They really need to get this autopilot working properly so when the time comes after mandating everyone have an EV, “they” can log in remotely and control everyone’s personal vehicles.
    It will either be that or a foreign actor will hack in and cause a major incident.
    Think of even now with OTA access to the vehicle computers, companies can shut you out of your personal property.

    • @chrisfallis5851 • 6 months ago

      Someone should use that car takeover event as the idea for a movie script. Oh wait. Netflix already has.

    • @YouTubeDeletesComments • 6 months ago

      You think they want everyone in EVs? That's impossible. They want the lower class (no middle) with nothing but public transit for their 15-minute-city commute.

    • @GF-mf7ml • 6 months ago

      Good luck trying to turn my steering wheel. It has a physical lock. I won't drive by wire anyway.

  • @darrennew8211 • 6 months ago +2

    Heaven help you if you're a cop directing traffic in the road and a Tesla on "full self driving" shows up.

  • @user-zo4yi2vc1j • 6 months ago +2

    That's the beauty of Tesla. Every single sensor in a tesla records data all the time. This data can and will be analyzed by authorities, when in doubt. That's why almost every case of "Autopilot failure" turns out to be human failure.

  • @yodaiam1000 • 6 months ago +1

    To get a proper baseline, you have to compare non-autopilot accident rates to autopilot accident rates to see if it is a benefit or an added hazard. Whether it is better or worse, you will still have accidents with autopilot, so until a proper study is done you can't say that it increases the risk of driving.
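The baseline comparison this comment calls for is just crash counts normalized by exposure. All figures below are invented placeholders for illustration, not real Tesla or NHTSA statistics.

```python
# Sketch of a rate comparison: crashes per million miles driven.
# The counts and mileages here are made-up numbers for demonstration.

def crashes_per_million_miles(crashes, miles):
    """Normalize a raw crash count by exposure (miles driven)."""
    return crashes / (miles / 1_000_000)

# Hypothetical fleets with equal mileage but different crash counts.
autopilot_rate = crashes_per_million_miles(crashes=5, miles=10_000_000)
manual_rate = crashes_per_million_miles(crashes=9, miles=10_000_000)

print(autopilot_rate)  # 0.5 crashes per million miles
print(manual_rate)     # 0.9 crashes per million miles
```

Even with rates in hand, the comparison is only fair if road mix is controlled for: driver-assist miles skew toward highways, which have lower crash rates per mile than city streets to begin with.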

  • @ptonpc • 6 months ago +2

    Most of these accidents and deaths are caused by people not paying attention, by drivers not doing the thing they are supposed to do: *pay attention and be ready to take over*.

  • @peterwexler5737 • 6 months ago +1

    At least there's an emergency vehicle on scene when you crash into an emergency vehicle. What better "safety" "feature" can you ask for?

  • @practicalguy973 • 6 months ago +5

    More than ever, people lack common sense and responsibility, and autopilot only adds to this growing issue. Maybe in the next 100 years everything will be done for people and they won't be able to think for themselves at all!

    • @RoverIAC • 6 months ago +1

      It's going to be like that movie Wall-E.

  • @ericmathena • 6 months ago +1

    Although these crashes show the weaknesses of the Tesla Auto Pilot, all of these are the fault of the driver. No matter how good the Auto Pilot is, YOU are still THE PILOT.

  • @johnjriggsarchery2457 • 6 months ago +2

    I just rely on knowing how to drive without hitting stuff.

  • @tommays56 • 6 months ago

    In South Carolina it's common to have to yield to left-turning traffic while making a right turn on GREEN 😮, which I have never seen anyplace else

  • @airborneadventurer • 6 months ago +1

    In the past, I was a drone pilot. After a few erratic automated flight incidents, put through grief being told each incident was my fault even though I was just monitoring what the aircraft was doing by itself, I learned to never trust automated features. You, the user, are responsible for anything that goes wrong. Since this is the case, I only trust myself. I’m glad I experienced that with a half pound drone and not a 5,000 lb. vehicle.

    • @anordenaryman.7057 • 6 months ago +1

      That really is the crux of the situation. You can be the pilot in command and have full control, or you can be a passenger and have zero responsibility. There can be no gray area in the middle. I am 50 years old and do not even use cruise control. When I am driving, I am driving.

    • @airborneadventurer • 6 months ago +1

      I couldn’t agree more. I only use cruise control on wide open roads without anyone around. I had a Toyota Sienna with radar cruise control years back and it was amazing technology but I never trusted it because of its rudimentary flaws. This was an ‘08 Toyota and 15 years on, I still don’t trust it. I am in control.

  • @dontask8979 • 6 months ago +2

    Recall the DRIVER.

  • @TovarasSanders • 6 months ago

    First time here, but man, that's a 'stache you can definitely trust!

  • @micke3035 • 6 months ago +1

    Driver-assistance systems (or whatever you call them) should be "passive", i.e. the driver is always the primary controller of the vehicle, but assistance steps in if the driver is drifting over the side lines, becomes unconscious, and things like that. Well, my opinion anyway; driver complacency is not a good thing.

  • @gormenfreeman499 • 6 months ago +2

    Human environments should be for humans, autonomous robots belong in back rooms.

  • @juliahello6673 • 6 months ago +1

    Every major automaker has an ADAS system with the same capabilities as basic Autopilot (traffic-aware cruise control and lane keeping). But police don't ask unless it's a Tesla. The media doesn't broadcast it unless it's a Tesla. Do you really think that other ADAS systems haven't hit emergency vehicles or other objects? How would those manufacturers have trained their vehicles on relatively rare situations like emergency vehicles without being able to get video from their fleet?

    • @StacheDTraining • 6 months ago

      Most manufacturers don't rely on cameras alone. I'll do a video on the topic in the future. Also, this isn't media-driven; it's driven by an NHTSA investigation.

  • @seanswilson • 6 months ago

    Two comments: 1. If someone can ignore 150 warnings (which are designed to protect the occupants of the vehicle AND all the vehicles around it), then the system should essentially shut the car down after a pre-determined number of warnings, e.g. 10. That last warning should state that the car will be immobilised and the authorities called (whether the driver is drunk or medically incapacitated). It should be able to find a suitable place to bring the vehicle to a safe stop away from the road. 2. About the bus incident, and this is a serious question: are children not taught to check the road before crossing it, regardless of the circumstances? I know the whole purpose of the warning lights and STOP signs is to stop the traffic, but if they get off the bus and cross the road immediately in front of it, they are crossing blind, and any vehicle (such as that Tesla mentioned) isn't going to see them.

  • @iAPX432 • 6 months ago +1

    School bus is not a non-standard situation, it's so standard that it is in the law. Shame!

  • @Relkond • 6 months ago

    The automation is, honestly, capable of a lot. It’s not however a living, thinking being, which is where you start to approach ‘enough’. It sounds like it’s failing to recognize when it’s outside standard scenarios and fails to apply extra caution - ‘there are stopped emergency vehicles in my lane with lights flashing - let’s slow down in case I need to stop or switch lanes’

  • @dorisatkinson7259 • 6 months ago

    The algorithm should be - I see flashing lights, disengage autopilot and scream loudly at the driver.

  • @OM617a • 6 months ago

    I read that Tesla may be instructed to recall cars due to suspension issues.

  • @dstr1 • 5 months ago

    "The Dawn project"!?? How is he doing now!?

  • @CaptainProton1 • 5 months ago

    Recall, or over-the-air update?

  • @deansapp4635 • 6 months ago +2

    We need ZERO auto pilot

  • @cuddlepaws4423 • 6 months ago

    My husband used to work in engineering for companies that made car parts, and there is a phrase, 'Risk Compensation', i.e. the safer you feel because of all your shiny in car gadgets, the more you drive like a dick.

  • @johnclark290 • 6 months ago

    Autopilot or full self-driving for any vehicle is banned in Australia.
    Cheers, John

  • @ensignbodybag • 6 months ago +2

    At the end of the day, it's not the fault of autopilot or Tesla's technology. It is 100% the fault of the drivers. How about a fair comparison between accidents that have been avoided thanks to autopilot.

  • @sc0572 • 6 months ago +2

    Sir, I have over 50K miles of personal experience driving with Tesla's FSD. I'm also professionally very active in the AI space. Respectfully, you got some fundamental things wrong here. As an owner of the vehicle, I can confirm from my own personal testing and experience that some of what you said simply isn't true. I'm curious about your sources, because what you're saying is so incorrect it's almost a crime to say it. I'm assuming you're not just Tesla-bashing and that you want to put out factual content. I'd be happy to connect with you to review my experiences with both of my Teslas driving me around. After 50K miles I can promise you one thing: my car won't hit an emergency vehicle just sitting out on an open road with its lights on, but I can tell you exactly why some do. No pressure; let me know if you want to have a quick chat.

    • @MikesProjectsandHobbiesMC • 6 months ago +1

      Thank you. It's only Tesla owners that can see through all these negative BS posts. Unfortunately no one will believe us. LOL. People love to click on hype and add fuel to their hate.

  • @robertahrens9481 • 6 months ago +1

    If you are too lazy to drive the damn thing, ask a friend for a ride!

  • @AlaskanInsights • 6 months ago +2

    Correction: people think they are smart.
    Not: people are smart.
    Not sure why you would have a car and not drive it yourself; that's half the fun of having one.
    My whole life I have heard people complain about their computers, phones, and countless other computerized devices not working properly... why in the name of God's green earth would you trust one to drive?
    Ooh, that's right: you're modern and smart.

  • @Ted...youtubee
    @Ted...youtubee 6 months ago

    Three times in two years, my Ford Everest has decided the car in front was slowing down too quickly, auto-braked, and then gone into low-power mode.
    I had to roll onto a side road and turn it off and on again... then all was good.
    The Ford dealer advised that nothing was found wrong.
    If software engineers can't get this right, why would I trust a fully automated vehicle?

  • @picobyte
    @picobyte 6 months ago

    I don't like public airplanes for this reason.

  • @happyjoyjoy6976
    @happyjoyjoy6976 6 months ago

    We have been manually driving cars for well over a hundred years; if it ain't broke, why fix it is my thought.

  • @Zodliness
    @Zodliness 6 months ago +1

    Can't blame the vehicle if the culprit is a loose nut between the steering wheel and the driver's seat. 🤔😉

  • @ibretus
    @ibretus 6 months ago

    Autopilot is no different from using cruise control on older cars: you don't just drive into the back of things with cruise control on. How many of you drive a car with cruise control on and don't pay attention?

  • @Mattisttam
    @Mattisttam 6 months ago

    I’ll bet $100 that they’ll be putting heart monitoring systems in steering wheels.

  • @tylerdurden4006
    @tylerdurden4006 6 months ago

    So the cars don't respect authority? Lmfao 😂

  • @bobjohnston5527
    @bobjohnston5527 6 months ago

    Autopilot shouldn't have been put in the Tesla. The new update might not work.

  • @michaelwebber4033
    @michaelwebber4033 6 months ago

    Autopilot makes sense in an airplane because airplanes operate very differently from cars; perhaps they shouldn't have called it Autopilot.

  • @EmilioBaldi
    @EmilioBaldi 6 months ago +2

    It's only a glorified Level 2 system; "autopilot" is only in the name.

  • @sheilaolfieway1885
    @sheilaolfieway1885 6 months ago

    Well if droids could think, where would we be?

  • @andrewgraham7659
    @andrewgraham7659 6 months ago

    If it works really well but has flaws, it's not working very well...

  • @hargobindsingh2012
    @hargobindsingh2012 6 months ago

    We are living in Idiocracy.

  • @jerrymcrie
    @jerrymcrie 6 months ago

    I don't know... when the computer is driving, one should be permanently as careful as if one were driving oneself.
    That will never happen: if the car drives itself, one will not be as careful.
    So the software would have to be perfect, with no errors, and that is a utopia.

  • @martinsoelby5902
    @martinsoelby5902 6 months ago

    No. Just no.

  • @ahorton6786
    @ahorton6786 6 months ago

    They should just ban it.

  • @johnanthonycolley3803
    @johnanthonycolley3803 6 months ago +1

    Seems the owners and other road users here are just lab rats.

  • @thomask4836
    @thomask4836 6 months ago

    Judging from this video, a Tesla is not in my immediate future. They still have work to do.

  • @williammeek4078
    @williammeek4078 6 months ago +1

    You are out of your wheelhouse on this one.
    Look at the data: a Tesla on Autopilot is less likely to run into emergency vehicles than a person is.
    Autopilot is SAFER for emergency workers, not more dangerous.

    • @StacheDTraining
      @StacheDTraining  6 months ago

      So the NHTSA didn't require a recall on Autopilot?

  • @damon1957ful
    @damon1957ful 6 months ago

    It's not needed; get rid of it.

  • @xiro6
    @xiro6 5 months ago

    The NTSB knows more about Autopilot crashes than Tesla? A cynical person would say Tesla knew and did nothing until forced to.

  • @xiro6
    @xiro6 5 months ago

    Cars, even Teslas, don't "crash into" people; they run over them (not sure about the word). I never said a car crashed into a person.

  • @craigbasoco4886
    @craigbasoco4886 6 months ago

    Umm, how about NO, never. They're nuts. Yeah, it's called a cell phone.

  • @dn744
    @dn744 6 months ago +1

    In the moments just before a crash, the driver doesn't react in time because they expect the car to, and by that point it's too late. 😮

  • @joelaichner3025
    @joelaichner3025 6 months ago +4

    The whole joy of driving a car is actually driving the car! WTF

  • @andrewgraham7659
    @andrewgraham7659 6 months ago +1

    Isn't this the sort of problem you figure out before you release the vehicle onto the road?

  • @RavTokomi
    @RavTokomi 6 months ago +2

    Good deal for Tesla. You pay THEM over $10k to die beta testing their software.

  • @EleanorPeterson
    @EleanorPeterson 6 months ago +2

    It's outrageous that any company is allowed to 'develop' safety-critical technology on open public roads using actual people - drivers, cyclists, motorcyclists, pedestrians, traffic, emergency, and maintenance workers - as crash-test subjects.

  • @jk35260
    @jk35260 6 months ago +2

    It's the reckless marketing often used by Tesla. Waymo has an almost clean safety record with no known fatalities.

    • @StacheDTraining
      @StacheDTraining  6 months ago +3

      Waymo is a bit different: it only functions in a limited area, and its speed is capped. Waymo has had around 150 accidents with 3 injuries, and it still has issues driving into construction zones.

    • @jk35260
      @jk35260 6 months ago +1

      @StacheDTraining Yes, it functions as a driverless taxi in designated zones, but the experience reported by users is generally very positive. The injuries involved were not serious, and the accident rate is also far lower than that of human drivers.

  • @tmarsalek36
    @tmarsalek36 6 months ago +4

    Why is Autopilot legal, for fuck's sake?

  • @steventurner8428
    @steventurner8428 6 months ago

    Technology will ALWAYS fail; it's only a matter of time.

  • @Paul-li9hq
    @Paul-li9hq 6 months ago +6

    But how can Electric Jesus get it so wrong? After all, he told everyone he "...knows more about manufacturing than anyone else on the planet..." (Which I would assume includes.... God?)

  • @xiro6
    @xiro6 5 months ago

    Maybe yes, maybe no? Sorry, but if you want to use an autopilot, recording all video and data and providing it to the police must be a requirement.
    Privacy and patents? OK, then it's not allowed on public roads, easy as that. And what about the privacy of everyday people being recorded by those cars? Lots of those recordings even end up on YouTube, but if I got killed by a car, could the police access that same recording to know who or what killed me?

  • @bentullett6068
    @bentullett6068 6 months ago

    I remember seeing a video where a Tesla using Autopilot literally just stopped because it got confused on a busy city road, which then caused a huge pile-up behind it. The driver was apparently asleep (possibly using those weights on the steering wheel to trick the system) and only woke up when he was hit. He got out of the car assuming that the pickup truck behind him was the only vehicle involved, and it wasn't until he looked up from seeing where the truck had hit him that he noticed there were multiple vehicles involved. I think he was going to have a hard time explaining that to the insurance company.

  • @jdgvee9313
    @jdgvee9313 6 months ago +2

    It's really amazing how much better the human brain is than even the most state-of-the-art computers today. Autopilot is not safe now and won't be safe anytime soon.

    • @darrennew8211
      @darrennew8211 6 months ago +1

      Eyes are a whole lot better too.

    • @lunasakara7306
      @lunasakara7306 6 months ago

      At least you said "soon"
      At this rate, we're going to dumb ourselves down enough that even an Autopilot is smarter and more decisive.
      All joking aside; I'm sure improvements in AI are going to help self-driving cars adapt to more conditions, faster. But I do still worry about the mental devolution of humans in the last 30 years... Or maybe it's not that we're getting dumber, but access to the internet is just showing us how many idiots have lived among us since forever?

    • @darrennew8211
      @darrennew8211 6 months ago

      @@lunasakara7306 One of the problems is that humans can use social skills to figure out other humans. Example: you come to a flashing yellow arrow, and you've never seen a flashing yellow arrow before, so you have to figure out what it means. I can think "I know what a green arrow and a red arrow means, and I know what a flashing solid circle means, so I can figure out what the designer of this traffic signal would have intended me to understand this signal to mean." A car is likely never going to do that.
      Similarly with (say) a pedestrian standing on the side of the road. I can look at him and guess pretty confidently whether they're waiting to cross or not, just from body language, or even seeing the stuff they're carrying (like, car keys in hand? A shopping bag logo'ed with the name of the store behind them?)

  • @THX--nn5bu
    @THX--nn5bu 6 months ago +1

    I refuse to drive a rolling mobile device on four wheels.

  • @BadThrusher
    @BadThrusher 6 months ago

    I love moustaches, I have a moustache, but yours is a disaster.

  • @ghunt9146
    @ghunt9146 6 months ago +1

    Simply not ready to be passed off as safe for use.

  • @mikemccormick8115
    @mikemccormick8115 6 months ago +2

    Why does society NEED autopilot on vehicles? So people can go to sleep? So they can play with their phones? To start work early? All bad reasons.

    • @darrennew8211
      @darrennew8211 6 months ago

      It's handy on freeways with traffic that keeps changing speed. It's nice to cruise along and be able to (say) change sunglasses, change the radio station, look at the car beside you, etc. It makes it much safer to take your eyes off the road for three or four seconds. IME, driving 200 freeway miles on AP is like driving 50 miles without it, in terms of effort and fatigue.

    • @mikemccormick8115
      @mikemccormick8115 6 months ago +1

      @@darrennew8211 You're addicted to the Kool-Aid. It's not needed. If you're too tired to drive without an autopilot, you're too tired to drive with one. And 3 or 4 seconds is more than enough time to get killed. What is already happening is that people, some of whom have already lost their lives, are relying on the technology too much. It isn't reliable 100% of the time and never will be. That's obvious to all but the fanboys.

  • @tedsaylor6016
    @tedsaylor6016 6 months ago

    Musk (I'm NOT defending/supporting auto-driving systems) made the observation that "autopilot just needs to be better than the average American driver." And good luck with perfect software; that's not going to happen.

    • @larslrs7234
      @larslrs7234 6 months ago +1

      The average is lowered by selling touch-only cars and driver assistants that fail when really needed.

    • @darrennew8211
      @darrennew8211 6 months ago +1

      It not only has to be better than the average driver, it also has to not get involved in accidents that humans would never even begin to get involved in. If it had half as many accidents, but all of them involved running into the back of a police car stopped on the side of the road with lights on, it's not going to fly.
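The statistical point this thread is arguing over can be sketched numerically. All figures below are invented purely for illustration (they are not real Tesla, Waymo, or NHTSA statistics), and `crashes_per_million_miles` is a hypothetical helper, not anyone's published methodology:

```python
# Sketch of the thread's argument: a lower aggregate crash rate does not by
# itself settle the safety question if the crash *types* differ.
# All numbers are made up for illustration.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a crash count to crashes per million miles driven."""
    return crashes / (miles / 1_000_000)

# Two hypothetical fleets driving the same 50M miles:
human_rate = crashes_per_million_miles(crashes=200, miles=50_000_000)      # 4.0
autopilot_rate = crashes_per_million_miles(crashes=100, miles=50_000_000)  # 2.0

# Half the aggregate rate -- yet if those 100 crashes clustered in scenarios
# humans almost never fail at (e.g. a parked emergency vehicle with its
# lights on), the aggregate comparison alone would be misleading.
print(autopilot_rate < human_rate)  # True
```

This is why "better than the average driver" and "acceptable to regulators" are different tests: the mix of failure modes matters as much as the headline rate.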

  • @itabiritomg
    @itabiritomg 6 months ago

    That has an easy fix: just program the car to brake and stop if the driver doesn't keep his hands on the steering wheel for more than a few seconds. That would force the driver to pay attention.

    • @darrennew8211
      @darrennew8211 6 months ago

      That's what it's supposed to already do. Did you see the pictures of people putting weighted devices on the wheel to fool the car?

    • @MikesProjectsandHobbiesMC
      @MikesProjectsandHobbiesMC 6 months ago

      It actually does that, but idiot owners always find a way to “trick” the safeguards.
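The hands-on-wheel timeout the commenters are debating amounts to a simple escalation timer. The sketch below is an assumption-laden illustration of that idea only: the thresholds, the `monitor_step` function, and the "controlled stop" action are all hypothetical, not Tesla's actual driver-monitoring logic:

```python
# Hypothetical hands-off-wheel escalation timer (illustration only; not
# Tesla's real implementation). Thresholds are invented for the example.

WARN_AFTER_S = 5.0    # warn the driver after this many hands-off seconds
STOP_AFTER_S = 15.0   # begin a controlled stop after this many

def monitor_step(hands_on: bool, hands_off_time: float, dt: float) -> tuple[str, float]:
    """Advance the monitor by dt seconds; return (action, updated hands-off timer)."""
    if hands_on:
        return "ok", 0.0          # torque detected on the wheel: reset the timer
    hands_off_time += dt
    if hands_off_time >= STOP_AFTER_S:
        return "controlled_stop", hands_off_time
    if hands_off_time >= WARN_AFTER_S:
        return "warn", hands_off_time
    return "ok", hands_off_time

# Simulate a driver who lets go of the wheel for 20 seconds:
t = 0.0
for _ in range(20):
    action, t = monitor_step(hands_on=False, hands_off_time=t, dt=1.0)
print(action)  # escalates to "controlled_stop"
```

As the replies note, this only works if the hands-on signal itself is trustworthy: a weight hung on the wheel defeats a torque-based sensor, which is why newer monitoring approaches add camera-based gaze tracking.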

  • @fredrik3685
    @fredrik3685 6 months ago

    Elon is insane

  • @russingersoll5761
    @russingersoll5761 6 months ago

    Complexity is the enemy of reliability.

  • @stevenwithanS
    @stevenwithanS 6 months ago +5

    The most important thing to learn about EVs is not to buy them.

  • @chrismayer3919
    @chrismayer3919 6 months ago

    Digi-computeletronic navigational A.I. (autopilot) disturbs me. Call me a paranoid Pamela, but I'd be too scared to EVER take my hands off the wheel, autopilot or no… 😓

  • @RealButcher
    @RealButcher 6 months ago +1

    EM has got to go to jail for naming it FSD, IMO. It's still not there, and you have to pay $$$ for it. What a scam.

  • @blxtothis
    @blxtothis 4 months ago

    Why is Tesla still in business? Their false range predictions, disgusting build quality, and countless faults mark them out as the biggest con job in automotive history.