Tesla FSD (Supervised) 12.3.4 get extra credits with my referral ts.la/vitaliy15844 #elonmusk #fsd #fsdbeta #autopilot #magic #tesla #technology #innovation
Just shows how difficult a problem this is to solve! Thanks for making this video. Keep it up. 👍
🫡
Some experts say Full Autonomy Level 5 is 10 years away, at a minimum.
Will see, version 12 was a big leap🤞
@@FSDEvolution Version 12 still needs frequent interventions/disengagements on any mildly challenging drive.
Version 12 remains a simple minded Level 2 assistant. Level 5 full autonomy is *science* *fiction* in 2024.
Fun segment w/good edge cases. Thanks.
😊
Great format
Thanks😁
Tbh, I would not call this a fail. Yes, there are issues that could cause damage if you are not supervising, and that is because you have neural networks in your onboard computer. Think of them like the Borg: each car is a Borg drone and the Dojo server is the hive, and they are still very early in their development. On my drive yesterday, I noticed it chose the right lane on a certain road even though it was narrow near parked cars, and I knew it was dangerous, but a thought hit me: "what if the neural networks just don't know the danger because they have never seen it?" One thing is for sure: it was very good at controlling speed, slowing down when appropriate, and it would put on the turn signal to inform the cars in the adjacent lane of its intention to partially enter their lane until its own lane had more room, and it would slow down if a pedestrian showed up or a door suddenly opened. But it does not yet seem to account for the idiot who does not check the mirror before opening the door, because it may not have that event in its collective experience yet... I was nervous and probably should not have allowed it to go through that, and someone might have died because of it. But it is equally the city's responsibility to only provide lanes wide enough for vehicles to fit; I even see buses using that lane 🤷‍♂️
Progress is incredible, I’m having a lot of fun watching it go through edge cases😅
An easy drive for a human, but
😅
You dare not drive in India. Indians: An easy drive for us.
India is next level😂
@@FSDEvolution My American friend went to India on vacation. When he returned, he told us: Don't ever go to India!
I've had the same issue with cones in the road: Tesla will drive right into them. It seems to do fine in actual construction zones with cones, though, so go figure.
Hopefully will be fixed soon🤞
@@FSDEvolution They have had ten years, so do not hold your breath...
It worked well on v11
It can't see stuff on the ground
Sometimes😅
What are the official Tesla FSD (Supervised) v12.3.4 pass/fail criteria you used to determine that v12.3.4 has failed, as so prominently advertised in your video?
Driving over cones and garbage is a fail to me😅 v11 was able to avoid that a lot better
@@FSDEvolution While each of us has our own personal likes/dislikes (e.g. a Tesla doesn't understand a certain dialect of a language), a failure to perform a certain task has to be judged against the manufacturer's stated product capabilities. So, it wouldn't be a "fail" if a 2024 Model 3 can't follow voice commands in Cantonese, unless Tesla has explicitly stated that the ability to follow voice commands in Cantonese is one of its capabilities.
👍
@@TuneupMagic Tesla have said "Self driving next year" for almost ten years. Every year, for almost TEN YEARS!
They have also shown a video claiming it was self driving, several years ago.
And they call the system Full Self Driving.
And they build all their advertising around Self Driving.
So, yes. Each time it is NOT capable it is definitely a BIG fail.
This is no "Full Self Driving", and it does not matter that they call it Beta or Supervised.
"Supervised" is in fact really funny: they contradict themselves!
Self Driving, Supervised?! Then it is obviously NOT Self Driving.
They should rename this shit to something else, like Supervised Driving or Supervised Driving Assistant.
😅
FSD going over double solid yellow lines to pass a pedestrian is not a "good job".
Why not?😂 definitely better than stopping
I think any human driver would have done the same, considering the circumstances.
Better than killing the pedestrian.
💯
I think it's fine if nobody is in the other lane. This is what most people would do.