Autopilot vs Humanpilot: A Dark Reality
- Published 18 Jan 2023
- Sometimes truth can be... Uncomfortable.
Consider supporting this channel on Patreon: / aidrivr
Equipment used for filming:
GoPro Hero 10 Black (Visualizations): amzn.to/3kGze2I
GoPro Hero 11 Black (Interior): amzn.to/3H7Xsuq
GoPro Enduro Batteries: amzn.to/3wrzhSK
GoPro Dual Battery Charger: amzn.to/3XBxnLi
Fat Gecko Triple Mount (Exterior): amzn.to/3R8J0GQ
Fat Gecko Double Suction Mount (Interior): amzn.to/3HyW0Tj
Sony A6400 w/ 18-135mm Lens: amzn.to/3RauluT
Sigma 30mm F1.4 Lens: amzn.to/3HvOU1C
Insta360 One RS: www.insta360.com/sal/one_rs?i...
Insta360 X3 (free selfie stick!): www.insta360.com/sal/x3?insrc...
Insta360 X2 (free selfie stick!): www.insta360.com/sal/one_x2?i...
Equipment used for editing:
16” MacBook Pro M1: amzn.to/3j71X04
Magic Trackpad: amzn.to/403P8UV
Roost V3 Laptop Stand: amzn.to/3JgqBX4
Phillips Fidelio X2HR: amzn.to/3wvuVd7
Dragonfly Cobalt: amzn.to/3kHLPCR
Shure SM7B Microphone: amzn.to/3wxz9kK
Gator Frameworks Mic Arm: amzn.to/3XGGE4E
Elgato Wave XLR Preamp: amzn.to/3XEZpW2
I earn from purchases you make via Amazon Associates & Insta360 affiliate links - thank you! - Science & Technology
It's almost like cars are an inherently dangerous and deadly form of transportation.
+1 to this. I wonder how many deaths per year are related to trams or trains
All this effort and tech for a fundamentally destructive and wasteful form of transportation. It's a bummer
It's almost like it would be best to invest in public transport and probably just abandon cars entirely.
@Almarca The CIA website has a Fact Book that records information from around the world and provides it publicly. If you wanna check, it's in there. From what I've seen it's far, far less than cars.
Almost like a system where all the transport is organized to run within a set time frame, on rails away from roads to ensure safety, in train cars designed for human safety, with accommodations such as food, water, and backup generators in the rare case the train gets stranded for some reason. (Won't ever happen.) In the end the train is safer. And vehicle deaths are on the rise along with pedestrian deaths, due to a lack of government intervention and a lack of proper driving from the average driver who woke up late for work and is now speeding through a red light because they can't be any later than 5 minutes.
In the end, a society without cars but with a great public transport system would function better than one with them. If we had two bullet trains that go all across the states, or two that connect southern California with northern, then every major city could have a good train network to move people around, and those cities could get rid of the highways and roads that aren't really necessary. Things would be safer.
You would get to work faster, or be able to visit family, for a smaller price than owning a car.
This would reduce the strain on the population and actually fund a city properly with minimum risk.
@@playstation8779 Unfortunately this will never happen in the US. I think the only reasonable and likely way to accomplish a car-less system in the US is with an entirely new city. I'm almost certain that city would be in California; no other state cares enough. Maybe when World War 3 comes and all the big US cities get nuked, we will rebuild them car-less.
Professional level production here. 👍
I put my Model 3 in drive the other day when I meant to put it in reverse to back out of a parking space. The car didn't allow me to drive forward into a bad situation.
That is amazing! And provided by an OTA software update, which is awesome. Thank you for the kind words :]
Statistically, you currently save the Tesla much more often than it saves you.
Humans are still safer than FSD Beta by itself.
Human supervised FSD Beta seems to be safer than both.
@@LightAndShaddow5 Source?
@@TrueFerret I think what he’s referring to are disengagements and it’s true.
Watching some videos you’ll see a couple of disengagements per ride. How often did an almost accident happen during those rides in which cases Tesla Autopilot did save the situation? Close to 0.
Let’s say 5 disengagements and 0 scenarios in which Autopilot had to take action in a 30min drive through a city.
So he’s technically right, the driver just saved the Tesla 5 times, the car didn’t save him once
@@LightAndShaddow5 Maybe. It is a difficult one to quantify, for the simple reason that disengagements happen long before they're necessary. In other words, a self-driving car is disengaged long before there's a problem, simply because the human behind the wheel feels uncomfortable, not because the situation is unrecoverable or has become unsafe. Tesla specifically chose testers for their beta program who are more cautious than the average driver.
I have made serious mistakes on the road where I was saved by other drivers paying attention. I have a feeling that in 90% of the disengagements that looked SERIOUS, something similar would have happened.
Also to consider: If one person learns to be a better driver after a mistake (if they survive it), it's just that one driver. If a self-driving car makes a mistake and that mistake is sent to the software developers to investigate, *all* cars with that software will learn from it after an update.
This topic aside, if this weren't about driving and avoiding deaths, it would sound like an endorsement campaign a superintelligence would make on why it's okay to replace humans on Earth with robots. I just thought that was funny; btw I totally agree with you.
Yes, you're right, but on the other hand humans are much faster at learning, so it's hard to say if this is really an advantage for self-driving cars.
If a human makes a mistake, he learns from it and applies it to various scenarios that look similar, while a self-driving car has to make that mistake x times across all similar scenarios in order not to do it again.
Not to mention humans can also learn from the mistakes of others, e.g. watching dashcam videos or hearing advice from other drivers. So they also benefit from a crowd-sourced database of ways to improve.
@@polosh100 Nvidia made an AI that learned to play Minecraft by watching tutorial YouTube videos, so maybe the standards agencies could make a central edge-case video database for AI to learn from.
Last year I was driving between northern Norway and Sweden in my Model 3, just using traffic-aware cruise control.
During winter, we have no sunlight for about 1 month, so it's all night driving. This can really get to you. I didn't really perceive how sleepy I was becoming.
I suddenly sprung back to awareness when I realized the car was beeping the alarm and had corrected my course back onto the road, as I was about to drive straight off.
Gave me a chance to learn something about driving and rest without earning a horrible injury in the process.
My Volvo does that too. It also detects when you are driving in a sleepy driving pattern.
@@Xanthopteryx Volvo and others have "features". Tesla has hardware and software to be L4. And they will achieve it in 1-2 years.
@@archigoel Yeah I doubt that.
@@archigoel well they've been saying that it will be next year for the past 7 years.
My car has these systems too. It doesn't let me do anything stupid if I fall asleep, stops me when I'm about to crash into something, and prevents collisions when someone is being stupid around me. It's saved me like 2 times because people on the highway are morons. I still wouldn't use full self-driving; I like to pay attention to everything around me, and I don't like driving when I wouldn't be able to react immediately. I also enjoy driving, so I want to keep doing that. But those safety systems are awesome and all cars should have them. It would prevent tons of crashes. I also almost never use cruise control, only when I need to drink some water, just so I don't spill it all over me while still trying to drive.
I drive every day as a courier, 300-600 km a day. All I can say is that there is no way humans could ever design an AI that drives worse than humans do.
As a tesla driver in a rural area, they did and its called FSD
@@geoffzephyrus9849 rural drivers are usually better (in their areas), as it's easier to pay attention when the alternative is driving off a soft shoulder and down a cliff. Most drivers I pass in cities nowadays are on their phones.
"The next time you see autopilot allegedly messing up on the news causing an 8 car pile up" just ask yourself how many of those 7 other cars were maintaining a safe following distance and paying attention to their surroundings.
Zero. And they are all at fault.
@@scottgaree7667 Not necessarily. If six of those are vehicles stopped with room to spare and the seventh one plowed into them at full speed, well, then it's on that one!
@@freman true… At least one of them was at fault. Though in that particular example, it looks like there were several bad drivers 😂
This is one of your best videos yet!
What people forget is that humans are great drivers, BUT ONLY when well trained, attentive and rested, not affected by drugs or alcohol, not on the phone, not suffering a medical condition, not involved in a heated discussion with your passenger, etc etc
If all the cars behind the Tesla in the tunnel accident had been Teslas driving on Autopilot, there would not have been an accident.
Exactly!
You hit the nail on the head by calling out the "emotional response" as a lot of progress (not just regarding self-driving) is held up by the very same thing.
I absolutely agree wtih the points made in this!
The goal is not 'no crashes', it's 'fewer crashes than a human'. As you and your clips pointed out, we are horrible drivers who can be thinking about everything but driving while piloting a 2-ton metal brick along at 70 mph.
I mean... the pile-up with that Tesla, it's amazing how many people hit it. Damn, it's a real shame no one told them to keep a safe following distance in case the car in front suddenly stops or hits a stationary object, but I mean... that would never happen, of course! :)
And it looks like the Tesla was not at full emergency braking. In the event of a crash ahead, the pile-up would have started out much worse.
nope. Incorrect
I know, right? I honestly can’t tell how or why that particular Tesla stopped (considering how dishonest most media reports are when it comes to Autopilot/FSD)
But, it literally should not matter why they stopped. If I need to stop for any reason and you rear-end me, then it’s literally your fault for following too closely. Unless I was literally “brake checking” you and trying to cause an accident on purpose (which obviously an AI could never do - it does not have emotions, and therefore cannot choose to brake check you out of “anger” over something you did)
From the perspective of a Tesla owner using FSD, THANK YOU! You have accurately portrayed the reality that is normally buried by the click-bait media in the U.S. Truly love this objective video. Hope to see more👍
“Objective video” lol
This is cope. This guy made this video cause he’s upset he paid $15,000 for something that will still run down children.
@@wyattnoise
Dan O'Dowd faked his ad campaign by first running down the mannequin on regular Autopilot (old code), then on FSD while not showing the prompt at the bottom of the display indicating that the accelerator was pressed, overriding the system. Oh wait... they did show the prompt in another clip, but it was far too blurry to read the text. The whole running-over-children narrative is a lie.
Shram, you said exactly what I wanted to say! I too am a Tesla driver, and I too can recognize that it isn't "perfect" yet. But wow, it is several times better than a significant number of the bad drivers out there... and as several good drivers have had the courage to point out, even they have made mistakes or briefly fallen asleep at the wheel. But Tesla is ALWAYS pretty good, and ALWAYS better than many bad drivers.
Best video I've seen in a long time.
Not sure how objective it is. He claims that lives would be saved if every car today were replaced with a Tesla without a steering wheel. Well, what about the number of interventions needed to prevent Teslas from crashing today? What percentage of rides won't make it? I think the video is as biased as it can get.
@@wyattnoise nice FUD attempt, but that was already thoroughly debunked
Gotta love people who have never used FSD, but seem to know all about it after hearing a scary rumor 😂
I so appreciate your effort at setting the record straight! Keep up the good work.
Set the record straight in regards to a human/AI cyborg car being safer than a human-driven car, but not regarding a human car vs an AI car. In the latter case, the human is safer, despite the video using human-supervised AI data interchangeably with AI-only data.
When you make good content like this, imma just end up watching the whole video on accident.
UH OHHHHHHHHHHHHHHHHHHHHHH!!!;)
It’s always the most reckless drivers complaining about how “dangerous” autopilot is
I'm not a reckless driver and your comment doesn't make sense.
@@pruthvinedunuri2983 When did he say you're a reckless driver?
I’m a reckless driver and I love autopilot, even tho I don’t have any ability to use it 💀
@@Twistiry what I mean is I'm not a reckless driver and still believe autopilot is dangerous.
@@pruthvinedunuri2983 I do agree Autopilot is still dangerous, but it has learned a lot within the past few years
Great video and insight! Framing the accidents prevented on a macro level is something often missed by the masses.
I sometimes lose sight of how this isn't known to some people, or I guess even commonly understood. You probably have to deal with a great many comments that probably aren't so kind. Hope you're doing okay, and that people will continue to learn from you. Love the videos!
Thanks for posting this. Really solid... an objective look at human vs. machine safety statistics.
It’s not objective though, it’s a look at human vs human/machine cyborg stats.
Not human vs machine stats.
Without the human, autopilot and FSD beta stats would look horrible.
THANK YOU, THANK YOU, THANK YOU! I've been shouting that message for a very long time, but people look at me like I am crazy when I point out that machines are better drivers than human. I finally got a Bolt with Supercruise, and I am blown away by how much more relaxed I can be when I don't have to be making every every calculation and figuring every angle constantly during a trip. Makes me a better driver too.
Humans are currently better though.
If your friend said “if I drove you to the shopping center, in your car, would you trust me enough to close your eyes?” You would probably say yes.
You surely wouldn’t currently trust your car enough, so that you would close your eyes while it drove you to the shopping center.
Until you can sincerely say that, with your eyes closed, you would trust the car more than the average human to take you through a typical drive, you don't believe it either.
@@LightAndShaddow5 Well said. I only know FSD from the videos, and while it's good and cautious, every now and then it makes an incredibly stupid mistake no human would ever do. It's not even close yet.
@@peter.g6 After using FSD for 3 years, I see more average human drivers make mistakes on the road next to me than my car does.
Remember that this is only Level 2 autonomous software, NOT Level 4-5, and it can already do more than most average people can. That's why we still see far more crashes caused by average humans than by FSD cars.
@@mavinhuynh2042 Yeah, I believe you.
My main source is AI DRIVR's YT channel. FSD can do great, but then it's not even capable of making a hard turn when there is no other car around. It also gets confused around construction. I've also noticed it sometimes doesn't yield to cars that are going slowly, as if it thinks those cars are standing still.
That being said, there was a huge leap with 10.69 and I cannot wait for 11.
I'm a big time car enthusiast, even was testing cars for a good part of my life, but I really appreciate this video and couldn't agree more with you! Greetings from SOCAL.
This can really help educate people. Thank you! Awesome video!
FSD and AP save lives TODAY. It drives me absolutely crazy seeing hit pieces against this amazing technology.
People aren't going to mention that it stopped in the middle of the lane because the person fell asleep, or had something happen that made them unable to take over after 3 loud warnings. Instead of drifting into the wall with other cars spinning out, killing way more wallets and people and blocking the whole road, it made a bad pile-up, yeah, but that's it. Cars were still moving around it and going on, not dying. It's just a fact: Tesla's AI is already better than humans.
Phantom braking is real. Tunnels are a common location. I'm always prepared when coming up on overpasses. The other drivers were responsible for hitting the car in front of them. It doesn't matter why the Tesla slowed unexpectedly.
@Scott Garee Phantom braking was fixed, and it doesn't brake to 0 mph anyway, just a sudden slowdown, which is FIXED
@@lavaphoenix753 It may not slow to 0, but it still occurs. I'm not trying to diagnose that particular incident.
This is the most important video on this channel. More people need to know this
Wow... all I can do is think back to my own lobbying efforts, when I knowingly told the story of one poor individual, knowing how that pulled a politician more than another set of statistics would.
It is sad how poorly our politicians think things through and get pulled by emotional responses.
Well spoken. I never understood why computers need to be 100x better than a human to be considered safe. Even if the system is just twice as good, that's half as many dead people.
It’s literally immoral to NOT use it at that point!
I'm someone who loves driving. Especially old cars. But even a car enthusiast like myself appreciates the value and opportunity to save lives that comes with the expansion of self driving technology. Sure it isn't perfect yet, but it is way better that most people behind the wheel in most scenarios.
I would seriously like to see some form of self driving only highway lane in high congestion areas, much like how many HOV lanes are currently set up. It is unrealistic to expect a widescale mass transition to self driving vehicles over night, but self driving only lanes are something that could be made reality right now.
I have never been in a self-driving or semi-autonomous vehicle, but I have owned plenty with adaptive cruise control. It is amazing how upset people get when the car leaves a safe following distance behind the car in front, especially in very heavy traffic. We truly do need to change our mindset behind the wheel and realize not every car on the road today has an aggressive human at the controls.
Seriously! I always have to change my AutoPilot follow distance in traffic just so people don’t get mad at me for being safe 😂
(Though even its closest follow distance is still safe, especially with instant reaction time. Some people still get upset behind me lol)
Such a refreshing analysis. Too often anecdotal evidence is used to support superficial reporting. Thank you for diving deeper and presenting data from a broader perspective.
Thank you so much for the kind words, gbrailsford!
@@AIDRIVR Are you okay? I lowkey miss your videos a lot... hope everything's alright
If I'm being honest though, my Model S Plaid (on beta) causes at least one completely pointless phantom braking event per drive, causing a massive unexpected adrenaline stress dump in me and any passengers. Happens on our Model Y as well.
I almost rear-ended a Model 3 on the freeway. I thought for sure the driver had brake-checked me. I rolled my windows down, and before I said anything the driver of the M3 started apologizing, saying the car braked on its own for no reason. At that point everything I thought about Tesla changed.
So, thanks to the Tesla you now learned it was a good idea to keep that safe distance?
Thank you for making this, the problem with many humans is they’re not willing to understand. I hope this video gets seen by lots of people so there can be less hate for an obviously good thing.
Also love the content btw, really appreciate what you do.
Absolutely great analysis of the issues at hand.
Perfect summary. Thanks for putting this important piece out there.
I agree with you 100%, and as a side note, when I studied this topic in the early 1980s and received my master's degree on this very topic, the data at the time supported everything you have said. Keep doing what you are doing. And last night I received an update on my Tesla Model 3, and it simply keeps on getting better.
Great video. Thank you for creating it.
7:16 perfect reaction from the motorcycle cop
Great episode. We all needed that.
I love this video... It really puts things in perspective. 👍
I remember a recent interview with Elon where he was asked something along the lines of, even if Tesla is involved in several fatalities using FSD tech, but saved hundreds of thousands of lives, are you still going to do it? He was like, hell yeah. Unfortunately the media will spin those accidents so negatively, that it will be tough to weather the storm. But he will do what is right for human kind.
Seat belts have saved countless lives but are also responsible for some deaths. Are we supposed to remove seat belts because of those deaths? The same applies to FSD once it matures.
@@sylvaing1EXACTLY
Very well put together video. Hit the nail right on the head.
Always good to put things in perspective.
Unintended pedal misapplication: I suspect one-pedal driving. Your foot only ever hovers over the accelerator pedal. Happened to me once too; luckily nothing bad happened. But since then I have set regen to low and train the instinct to press the brake every day.
The projected accident statistic for replacing all traditional vehicles can't account for circumstances where the road infrastructure is too poor for sensors to navigate reliably. Also, bad weather can potentially block them completely. The data collected for autonomous vehicles was largely taken in locations where people are comfortable handing off control to the machine, which would skew the results. I don't have a hard time believing that accident rates are higher for people, though, since we are idiots. I feel like that's more of an argument for higher standards in driver's ed than for autonomous driving in the present. Public transit will always reign king in safety.
You are such a nice guy! I love watching your videos.
Outstanding presentation. Appreciate it.
Maybe your single most important video ever. Thanks a lot.
Well done. Great analysis.
Props to you for trying to educate people who do not question their beliefs. It might not change their mind this time but like you said in your intro, it might if we can restore the balance of coverage to real life events.
Loved the points you made here
Great video; right message! Thank you!
that “jokes on you, buddy, it’s a Tesla” was sooo satisfying for some reason omg
I was driving my 1 month old model 3 the other day, without autosteer engaged, and was looking over to my left to keep an eye on a car that seemed to be drifting over the road a lot (they literally had their phone on a holder directly in front of them LOL)
Next thing I knew my Tesla was beeping and swerved, as I was approaching a somewhat sharp turn at 100kmh that I hadn't noticed. Saved me from a potentially deadly crash. Autopilot is still shit in Australia, but it definitely is saving lives.
Excellent video! If only I had a following, I'd pass this on to everyone. Thank you.
Excellent video. Thank you
We also appreciate your video !!! Keep up to good work !
The numbers are rather amazing….. good job
Fantastic video!
Autopilot needs a person next to them constantly making sure they are awake.
Idea for a future video: more examples of Tesla ADAS features preventing accidents. I'm very interested in understanding what capacity the various levels of software (e.g. Autopilot, Enhanced AP, FSD, and FSD Beta) have to actively prevent collisions and other accidents. I'm having a very difficult time understanding this based on watching Teslacam footage.
Aside from the things you mentioned, Tesla has another really cool feature coming up (in regards to preventing accidents)
I’m not sure exactly what they will call it, but it’s like that unintended acceleration prevention example on steroids
Right now (while under manual driving), Teslas can prevent a limited set of accidents from happening. But they are currently working on a feature that will automatically avoid all accidents no matter what you do (unless you somehow put it in a situation where all options are bad, or someone were to instantly jump out in front of you, etc.)
There was a demo of this at Tesla AI Day 2022, I believe. Basically, you could literally floor the accelerator and let go of the steering wheel, and it would still find a safe path that avoids all obstacles (this is completely separate from FSD, so it would not be actively following your intended route or otherwise take over normal driving, just preventing an accident until you were safe). Of course, it would not only steer, but also make acceleration/deceleration decisions (so the fact that you happen to be flooring it would just be ignored)
@@Muhahahahaz I'll have to rewatch AI Day 2022.
All points well taken. As a Tesla owner w/o FSD or Enhanced Autopilot, I'm happy to have a performance car that allows me to drive in my usual defensive mode but is able to help me avoid erratic drivers. It will make sense for all to use FSD or equivalent software to drive, but we are not there yet. I greatly appreciate all you FSD Beta testers for doing the hard work needed to achieve fully autonomous driving.
In its current state, would you really trust autopilot by itself, over a human driver by itself?
Autopilot needs human interventions; without human supervision, it's still way below the human level of safety.
FWIW, data point of 1; in a year I’ve put about 30k highway miles on AP driving between SF, LA, and LV (traveling nurse) and literally never had a problem. It must have saved my butt a hundred times. However FSD and especially navigate on autopilot are an absolute cluster f*ck mess of garbage programming. It’s bad. It’s really really bad. I know the software is not single-stack, but since enabling FSD I’ve noticed AP has started misbehaving, making mistakes and acting unusual where it didn’t used to. Again, I’m just 1 person, but that’s been my experience.
@@youtubesucks8024 I totally agree. I’ve had FSDb for 3 months now. It simply cannot handle city driving without endangering me and those around me. As I say, FSDb is about as good as a teenager with 5 hours of driving experience.
Thanks for making a video about this.
Would I be right to suggest that the statistics don’t show that autopilot is safer than a human driver but rather that autopilot combined with human supervision is safer than a human driver?
In the data from 2:50, it is only counted with Autopilot engaged
@@0topon But even when it is engaged there is a human monitoring it which makes it way safer.
@@0topon Read his post again
Very reasonable, thank you for making this video.
You created the best video on the internets! Wow. It might be the 1st video I ever share
Really well put
Great video. Thanks.
The stats, however, don't include cases where the human has taken over to stop a crash happening. There have been many occasions where my car would have crashed if I hadn't intervened.
Yes, true, and I think those are also relatively high numbers (I would love to know). Even on this channel on a good day, intervening happens more than once; not for dangerous situations per se, but still. Of course, overall it's very likely Autopilot and FSD are saving lives; humans are just so bad at driving it's not even funny.
Great Video! Numbers do not lie. 👌
Thank you for the unique video
Great video. Captures the insanity that is human drivers!
I wonder how the statistics would compare with a human pilot that has safety aids engaged. Like automatic braking when on course for a collision.
I don't think it has to be full self driving to get most of the benefits. Especially considering how much things like ABS have improved safety.
Tesla does compare these statistics. It’s approximately a 2x or so improvement in safety for each level (yes, the average Tesla is involved in fewer accidents than the average US car, even without any special features at all! They are easy to drive and require basically zero maintenance for safe driving conditions, though I wouldn’t be surprised if driver demographics were also a factor):
Human baseline (US avg): 1x
Tesla average (zero assistance): 2x
Tesla automated safety (manual driving): 4x
Tesla AutoPilot/FSD: ~8x
With AutoPilot/FSD going as high as 10x (as cited in the video), depending on which quarter of statistics you look at (NHTSA releases US nationwide stats every quarter, as does Tesla for their own fleet)
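The 2x-per-level claim above is easy to sanity-check numerically. A minimal sketch, using only the commenter's 1x/2x/4x/8x multipliers; the baseline miles-per-crash figure is an assumed round number for illustration, not an official NHTSA or Tesla statistic:

```python
# Relative-safety arithmetic from the comment above.
# ASSUMPTION: baseline of one crash per ~0.65 million miles is made up
# for illustration; only the multipliers come from the comment.
BASELINE_MILES_PER_CRASH = 0.65e6

multipliers = {
    "US average (human baseline)": 1,
    "Tesla average, no assistance": 2,
    "Tesla automated safety, manual driving": 4,
    "Tesla AutoPilot/FSD engaged": 8,
}

for label, x in multipliers.items():
    # "x times safer" here means x times as many miles between crashes
    miles = BASELINE_MILES_PER_CRASH * x
    print(f"{label}: one crash per {miles / 1e6:.1f}M miles")
```

The point of the sketch is just that each doubling compounds: under the assumed baseline, the top tier works out to one crash per 5.2M miles vs 0.65M for the baseline, an 8x gap regardless of which baseline you pick.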
FSD saved me from a head-on with a large construction truck that came over the centerline yesterday morning.
Thanks for this.
Excellent video
If you can't drive a car, don't drive.
Great vid thanks
Tesla Autopilot is a godsend. I get easily fatigued driving, even on highways, and in case I lose focus the Tesla beeps at me hard enough to jolt me awake
I just don't understand why people can't actually admit to their mistakes. I trust Teslas being around me more than any other car. That's why I drive defensively.
People: humans.
Good work
I'm still in awe of the 40 pedal misapplications per day statistic from Tesla. You'd think someone who buys a $50k+ vehicle would be able to differentiate between only TWO pedals. That means if a car had only one pedal, there would be 40 people a day pressing their floor mat wondering why the car isn't going. Insane.
They can. There isn't a similar number of crashes like this happening with ICE vehicles because it ISN'T HUMAN ERROR IN MOST CASES…
It's a glitchy car flooring you into danger. Period.
@@wyattnoise There are crashes like this constantly in all sorts of vehicles. Stop spewing unbacked BS
The moral of the story seems to be that Tesla drivers are worse than regular drivers and they need software aids to shore up their lack of ability.
Great video.
I once accidentally tapped the wrong pedal when I was parking my friend's Model 3. I immediately switched to the brake pedal and nothing bad happened. I don't understand how one could "accidentally" keep pushing the wrong pedal down. They must have been on drugs.
THIS!
I fail to understand how you can press the wrong pedal and just keep pressing it expecting the car to do something different.
Thank you sir!
Fantastic Video
Well said! Thank you.
Thank you!
I agree all the way bro keep spreading the idea!!
Yes, humans are definitely not always the best drivers: when they are distracted, sleepy, or on something they shouldn't be, or in a hurry. And when you mix those, it gets even worse; in a hurry and distracted can make for the worst accidents.
I like driving my car, and I try to drive well and with traffic, but I've had way too many people in a hurry driving far too fast or recklessly, and I am glad when they disappear off the horizon without causing any mayhem near me, while also kinda wishing there were an officer nearby to see it and have a word with them.
I think there could be a happy medium somewhere. For those that want to drive, the automation could keep an eye on them, and if they are doing things well, just let them drive, maybe nudge them a little when they are near the line. And if the driver doesn't keep it together, or is distracted, sleepy, on something, going too fast, or failing to keep their lane, the automated system could take over and park the car on the side of the road for emergency services to come have a talk with them. Or they can become an observer and let the automation take them home. Future forecasting much?
Tesla is already working on this feature, in a way :)
At AI Day 2022, they previewed a feature that would basically prevent all accidents even in manual driving mode. You could literally floor it and let go of the steering wheel, and the car would automatically avoid all obstacles until you were safe (and of course, it would ignore the fact that you're flooring the accelerator and make its own decisions).
It’s basically a huge upgrade to some of the accident prevention they already have (such as the example in the video above, where it stops the driver from flooring it into that pedestrian, or that shop window)
The ironic thing here is, if all the cars around the Tesla were using FSD, the pile-up would most likely not have happened.
As Elon said, all the thousands of people who will not die because of Autopilot won't know their lives were saved because of Autopilot, and the few people who do die because of an Autopilot mistake will sue Tesla.
What a great video. Well done. I would also add that if every car on the road drove as cautiously as Teslas do while on Autopilot, traffic accidents could drop by even more than 90%.
Thanks for the video!
Vehicle safety seems so underrated. As a father of a two-year-old, and knowing the US safety stats, I just can't help but buy a Tesla as our next family vehicle (aside from the many other perks).
I was wondering: How long had you considered releasing a video like this? Did something in particular spur you to make it? Just curious more than anything.
I've had the idea for a while, but it wasn't until a critical mass of ignorant comments set me off lol. Got tired of reading how dangerous AP is
@@AIDRIVR You paid $15,000 for something the second-richest person on earth has assured y'all for like 7 years in a row now would be able to take you coast to coast with zero engagements…
People making comments about how dumb you Tesla fan boys are aren’t the ignorant ones.
Also, you are seriously such a scumbag for not playing the entirety of the clip where that Model Y ran down several people as the driver desperately applied the brakes. I know it doesn't fit your narrative though, and it certainly won't get you on the short list to have Elon buy you a horse anytime soon.
@@wyattnoise Sounds like you have never used FSD before. Watching videos and reading about it doesn't make you knowledgeable about it. Just rent a Tesla with FSD, drive a 1,000-mile road trip, and see for yourself. I have done it 5 times from Cali to Chicago, and I didn't even feel tired after 2 1/2 days of driving while using the system. Best $6,000 I ever spent (cuz I got it early, 3 years ago 😊)
Comma.ai is still cheaper than the initial FSD prices and doesn't regularly run down little kids soooooooo @@mavinhuynh2042
@@wyattnoise Average drivers run down children more often than FSD does, and you are one of those average drivers. You don't hear about it cuz it happens too often.
Great video
Well said. Title should be "A Bright Reality" or "An Unrecognized Reality."
As a Tesla FSD driver: the stats are skewed, because Tesla Autopilot only handles easy situations; drivers have to take over for any tricky parts. There's a huge chasm to cross for FSD Beta to become true FSD, probably 5-10 years, because in construction zones, parking lots, and anything unknown that comes up, it just quits on you or makes a catastrophic error.
FSD can handle almost everything by now, except the few cases it's not trained for yet (like parking). It handles construction decently too, just not as well as it should yet. And it doesn't make catastrophic errors; 99% of the time it just doesn't move when it's confused.
So no, it won't take 5-10 years; 1-2 is more likely. In 10 years, AI will likely reach general human-level intelligence.
Thanks, this is a very good piece. Hope to see news reports at this level someday.
This "pedal misapplication" problem really raises the question: do we need cars on public roads that can do 0-60 mph in 3 seconds?
Thanks for reporting the truth!
💯% on point. I had a lady pull out in front of me after that accident. It's the other drivers, the unsafe ones behaving badly, that you have to worry about. I wish there were some kind of OBD2-port communication device that could talk to other cars to help prevent accidents. In my case, it would have told her car to stop before she pulled out in front of me while making a left onto a highway.
Pedal misapplication. Never heard of anything like that before. It seems so trivial that I thought it almost never occurs, but some drivers....
Thank you