IS IT BAD to participate in the TESLA FSD BETA program?

  • Published 20. 08. 2024
  • One of the commenters on my last video tried to persuade me that full self-driving is dangerous, and that Tesla is setting back the cause of full self-driving by many years. He suggested that Elon was just trying to grab as much money as possible and that he was deceiving us about the true outcome. This is a very important video to watch. If you have doubts about full self-driving (the beta program that tens of thousands are participating in) and whether there can be a successful outcome, or whether it is just a dangerous scam, I urge you to watch the video from start to finish.

Comments • 80

  • @davidalton7378 · 1 year ago · +12

    Excellent commentary! The first time I tried using standard Autopilot, it was quite strange and unnerving. Having driven manually for 60 years, giving up control to the car was difficult. Now, after a year's experience in my Tesla, I realize the car drives better than I do. Having FSD when you are older will be a godsend. There are many reasons people should cheer Tesla for the FSD endeavor. Lives will be saved and life improved for many.

    • @jestrjk · 1 year ago · +4

      I am in exactly the same place you are, @davidalton. At first I was nervous, but once I became comfortable and settled more into the "supervisor" role of FSD, and not so much the "driver," I realized what you said: "Crap, this car might drive better than I do."

  • @greggoryjohnson156 · 1 year ago · +7

    Peter, excellent video! I have FSD Beta on my 2022 Model X and I love it to pieces. I believe that the best is yet to come.

  • @fredericakarcay8943 · 1 year ago · +6

    Even though I don't have FSD, my car saved me in a situation where I would have made a mistake, because I hadn't seen the other car. So, thank you Tesla and Elon for the good work.

  • @73av8r5 · 1 year ago · +3

    I bought a used M3 back in October. It came with FSD and now I'm in the beta test program. I've only used it a few times as it's done some wacky stuff, and I consider myself a good driver and enjoy driving the car manually. It's a start, and it continues to get better.

  • @superstring101 · 1 year ago · +1

    Thanks, Peter, great video! (cheers from a neighbour & Model3 owner in Victoria)

    • @model3man · 1 year ago

      Hey neighbour :) Glad to see more and more Teslas on the island whenever I visit!

  • @Lee-sx5np · 1 year ago · +2

    Great commentary! Loved it!!!!

  • @codemonkey2k5 · 1 year ago · +2

    Great video! Absolutely agree with every point you made.
    I've been using FSD Beta for a couple years now and it has improved to the point where I almost always use it.

  • @rctezluh42069 · 1 year ago · +1

    Well said, Upvoted!!!!!

  • @eileendover3938 · 1 year ago · +1

    Happy birthday!!!

  • @millin-avp-educationav-kin9128

    Great video! I was looking back over your content to find the video you did about Tesla calling you after a crash. Is that video still up?

    • @model3man · 1 year ago · +1

      It sure is! czcams.com/video/aijuAjxCZ7A/video.html

  • @Lee-sx5np · 1 year ago · +1

    Great Video Loved it!!!

  • @BigBen621 · 1 year ago · +1

    To me, a very good analogy is thinking of a Tesla with FSD beta as a student driver in driver ed. Like a student driver, FSD beta frequently makes mistakes (although the frequency of these is decreasing with every release); and like a student driver, there is a skilled operator watching every move, and interceding when the driver makes a mistake.
    But there are some key differences. First, FSD beta software is improving rapidly. On the other hand, although each *individual* student driver's skills improve over time, as a group the average skill stays about the same; because new, unskilled student drivers are constantly replacing more skilled student drivers as they move out of student status. So unlike FSD beta, there will continue to be hundreds of thousands of student drivers on the road indefinitely, and at the same, fairly low average skill level. Second, especially with the latest version of FSD beta, the skill level of a Tesla with FSD beta is far higher than the average skill level of student drivers. And third, the public FSD beta program will likely last no more than a couple of years; while the student driver program will be with us as long as teenagers are learning to drive.
    Communityband1 presumably accepts the risk of hundreds of thousands of low-skilled student drivers on the road, collectively for decades, as a necessary part of developing more skilled drivers; why shouldn't he accept a similar risk for Tesla FSD beta testers, using FSD beta software that is already more skilled than an average student driver, for the relatively short period while the FSD software is being developed for full release, especially since cars with the final release of FSD are virtually certain to be much safer than typical drivers?

    • @communityband1 · 1 year ago

      First of all, this video started with a bit of a strawman argument. If you read my original comment that this video was a response to, what I actually objected to was the idea of letting FSD behave badly on the road instead of intervening. I didn't actually say that people shouldn't be allowed to test FSD. Model3man made some comments in that previous video to the effect that FSD's behavior was probably annoying other drivers. And I asked him to not let it do that. I see that as a risk in the sense that it annoys other drivers and potentially makes them less safe.
      However, since we're on this topic now, I can't pretend that I don't have some big concerns with this particular approach to achieving self-driving cars. Please understand - I am highly supportive of autonomy replacing human driving in time, for exactly the reasons you named. Humans make mistakes. Technology has the potential to make far fewer mistakes. However...
      The problem with achieving autonomy by gradually improving it over time while humans act as backups is that _humans suck even more at playing backup._ It is intuitive and has been documented that the more a system seems trustworthy to us, the more trouble we have staying attentive. Right now, FSD does enough awkward things that drivers mostly stay on task. They're not completely comfortable with it, and this helps them pay attention. But as we saw with Autopilot, when the technology improves, people zone out. There are a couple of ways we can treat this. We can say, "We told you to pay attention, so it's your fault if an accident occurs." Or we can say, "We know that humans struggle with this, and if we do it this way, some people are going to die." And the thing is, it's not the case that we can expect things to just get better as the technology gets better. The reverse may in fact be what happens. When the technology improves, we're going to get worse in our human backup role. We're not going to pay perfect attention for hours, days, weeks or months on end. We're going to start trusting it, and that's when there will be problems.
      The student driver comparison you bring up is actually very relevant here. Unlike a human who is learning to drive for the first time, FSD has a huge wealth of knowledge stored about the rules of the road. But unlike a human, FSD doesn't understand the world the same way we do. Humans are incredible at encountering something they've never seen before and categorizing it into something that makes intuitive sense. Machine learning AI is not. Think about how you can see a single tree once as a child, be told it's a tree, and instantly identify any other kind of tree as a tree. You can even deduce concepts like "plant," "living," "solid," "immobile," etc. without being told just because you easily relate it to these things you've seen elsewhere. You understand that it would be better to hit a ball than to hit that tree. And you understand it would be better to hit that tree than to hit a human. You simply don't have to be taught how to think of these concepts, because they all are natural to you. FSD can't compare to you in that way. If it encounters something that seems new, it may act very unintuitively. Even in terms of something as basic as how we see the world, FSD shows that it is far below any person who has never even driven a car. We can find videos with the latest versions of FSD that show it visualizing objects which don't exist, such as cars. It speaks to just how difficult the problem of perception is that one of the object categories FSD focuses on most is something it still struggles with identifying.
      I believe strongly in self-driving cars making a better future. But I think the safer approach is to do it in an all-or-nothing fashion that doesn't place human weakness in the spotlight. I also think that cost should not be a dominant factor in deciding what technology goes into these vehicles. When it comes down to it, Tesla is bypassing technology that their engineers have asked for to make the system more reliable. This allows them to offer a competitive commercial product. But it's not the safest or quickest way to achieve autonomy.
      In any case, I hope you can see that my thinking here is not black and white.

    • @BigBen621 · 1 year ago

      ​@@communityband1 I agree that FSD beta should be interrupted when it's misbehaving; not only for the reason you state, but for the perhaps more important reason that the only time misbehavior is reported back to the mothership for correction is when FSD is disengaged.
      FSD will inevitably cause some accidents; some will simply be unavoidable due to limitations of automobile dynamics and computational limitations; and others because the autonomous car will encounter a situation it hasn't been trained on, as you discussed above. But it will also avoid many accidents that would otherwise have occurred, and arguably it's doing so now. The real measure is not whether it causes any accidents, which of course it will, but the ratio of accidents avoided to accidents caused. Theoretically, we'd be better off (in terms of overall safety) whenever the ratio is >1.0. In fact, if/when we get to that point, it could plausibly be argued that failing to license FSD, even though FSD causes some accidents, is more dangerous than licensing it; because failure to license it will cause more accidents and deaths than licensing it would. The actual ratio at which we make that decision remains to be seen; but we may already be fairly close to it.
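      (As a back-of-the-envelope illustration of this ratio argument, here is a minimal Python sketch. The function and every number in it are invented for illustration; none of this is real accident data.)

```python
# Toy illustration of the avoided-vs-caused ratio argument above.
# All numbers are invented; they are not real FSD statistics.

def net_safety(avoided: int, caused: int, baseline: int) -> None:
    """Compare total accidents for a fleet with and without the system."""
    ratio = avoided / caused
    with_system = baseline - avoided + caused
    print(f"ratio avoided/caused = {ratio:.2f}")
    print(f"accidents without system: {baseline}")
    print(f"accidents with system:    {with_system}")

# If a fleet would otherwise have had 1,000 accidents, and the system
# avoids 150 while causing 50 new ones, the ratio is 3.0 (> 1.0) and
# the fleet is net safer: 900 accidents instead of 1,000.
net_safety(avoided=150, caused=50, baseline=1000)
```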
      I don't see how you implement an all-or-nothing approach; others have stated plausibly that it'll take tens or hundreds of billions of miles of training to achieve an acceptable level of safety, and I don't see how you do that without real-world testing, which then injects human weakness into the process. But let me point out a few things. First of all, the FSD beta test group has expanded incrementally by a factor of 130 or so since late fall of 2021, from about 2,000 then to about 260,000 now. Presumably each expansion has been preceded by a careful review of the accident history of the FSD beta test group to date, and has not proceeded until accident data made it clear that it was safe to do so. And, unlike student drivers, at any point where accident rates spike or exceed some pre-defined limit, the entire FSD beta test program can be terminated literally on a moment's notice.
      You state that humans do not do well in the backup mode, and theoretically this is true; but in the FSD beta program there's no evidence that we're at that point yet, and much evidence that we aren't. Here's why: according to NHTSA statistics, among a group of 260,000 average drivers there'd be 480 accidents a month. And we also know that essentially every accident a Tesla with FSD has makes the news; so if there were anything like that number of accidents among the FSD beta test group, it'd be all over the news. But in fact, there's only been the one claimed accident on the Bay Bridge; it's dubious whether it was on FSD at all, and if it was it would have been on FSD NOA, and not FSD beta. But where are the other 479 for November, and the 480 for December, and ... ?
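      (The arithmetic behind a figure like that 480-a-month estimate can be reconstructed roughly as follows; the national totals below are ballpark assumptions for illustration, not NHTSA's exact numbers.)

```python
# Rough reconstruction of the "480 accidents a month" estimate above.
# The two national figures are ballpark assumptions, not official data.

LICENSED_DRIVERS = 230_000_000  # approx. US licensed drivers (assumption)
CRASHES_PER_YEAR = 5_250_000    # approx. police-reported crashes/year (assumption)
FLEET = 260_000                 # size of the FSD beta test group cited above

per_driver_per_month = CRASHES_PER_YEAR / LICENSED_DRIVERS / 12
expected = FLEET * per_driver_per_month
print(f"expected accidents per month in a {FLEET:,}-driver group: {expected:.0f}")
# With these inputs the result is ~495 a month, the same ballpark as
# the 480 figure quoted above.
```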
      It's clear that FSD beta on its own would cause a substantial number of accidents. And it's clear that 260,000 Tesla drivers without FSD would have a significant number of accidents; perhaps not the same 480 as national average drivers because of demographics; but still a substantial number. But for now at least, there's no actual evidence of accidents among the FSD beta test group. And to me the only way to interpret this is that FSD beta on its own does most things right, although it sometimes misbehaves; but when it does misbehave, the driver is consistently able to step in and save the day. Do you have any different explanation for the apparent near complete absence of reportable accidents among the FSD beta test group?

    • @communityband1 · 1 year ago

      ​@@BigBen621 Addressing your last question first, yes, there is a different answer on that. We've heard in the past year that more FSD accidents were known to regulators than were known publicly. I personally know of at least one example of an accident that may have gone unreported, simply because a video was uploaded of it for a very short time before the owner took it down, apparently second guessing the wisdom of posting it. So we don't really know how many accidents there have been. There's an article called "Musk said not one self-driving Tesla had ever crashed. By then, regulators already knew of 8" which you can check out.
      However, I'm inclined to agree that even if we had the true data, we'd still find fewer accidents than the average statistics. The issue here, though, is that the miles driven by FSD tend to be selective. It's just as if we had a child steering from our lap and we took over whenever we foresaw a tricky situation; that is what users are doing with FSD. We have relatively few reports of children crashing their parents' cars, and I think if we could see the data, we'd probably see a statistic that would point to them being safer drivers than adults. But it's pretty obvious why that doesn't make much sense. In the end, there's just no viable way to compare the statistics when we are selectively choosing which miles are driven by FSD.
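      (The selection effect described here can be sketched with a small simulation. Every rate below is invented purely for illustration; the point is the shape of the bias, not the numbers.)

```python
import random

# Sketch of the selection effect above: if the human takes over on the
# "tricky" miles, those miles never enter the system's safety record.
random.seed(0)

MILES = 1_000_000
P_TRICKY = 0.02        # fraction of miles that are tricky (invented)
P_CRASH_TRICKY = 1e-3  # crash risk on a tricky mile (invented)
P_CRASH_EASY = 1e-7    # crash risk on an easy mile (invented)

logged_miles = logged_crashes = 0
for _ in range(MILES):
    if random.random() < P_TRICKY:
        continue  # human foresees trouble and takes over; mile not logged
    logged_miles += 1
    if random.random() < P_CRASH_EASY:
        logged_crashes += 1

# If the system instead drove every mile, tricky ones included:
expected_alone = MILES * (P_TRICKY * P_CRASH_TRICKY + (1 - P_TRICKY) * P_CRASH_EASY)

print(f"supervised record: {logged_crashes} crashes in {logged_miles:,} logged miles")
print(f"driving alone, expected: ~{expected_alone:.0f} crashes in {MILES:,} miles")
```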
      Another thing to keep in mind is that even though the testing has expanded to 200,000+ drivers, the number of them using the software, or using it regularly, is likely to be much smaller. Even among those posting videos, we hear that some are only engaging it occasionally to run their test routes. Others have decided to put it aside until they feel it's more ready. I'm sure there are many thousands of people testing this. But I can't really make a guess at how the number compares to 260,000 people driving their cars regularly. And we do also have statistics that point to the average Tesla getting into fewer accidents even when not running FSD. Some of this is technology, and some of it is likely demographics. Most 16 year old first-time drivers don't own Teslas. 🙂
      With cars operating with Level 2 functionality, even if the human driver is paying attention, their response is likely to be less effective on average than if they were the one driving. Imagine you're driving down the highway, and your car brakes suddenly. You didn't see anything. But does that mean you should instantly step on the accelerator? What if you were checking a side mirror when it happened? Could you have missed something that the car saw? In many situations, there's going to be some extra doubt that will influence how quickly the human will react.
      _I don't see how you implement an all-or-nothing approach;_
      What I mean by this is going straight for Level 4, taking the human driver completely out of the mix. This is the approach taken by companies like Waymo, Cruise, and some in China. The bar is set higher initially in many respects. First, redundancy becomes very important. You need to have hardware that backs up the hardware, running software that backs up the software. And you need to regularly verify it's all working. With perception, you can no longer settle for good guesses with the expectation that a human will intervene. And so you double up on sensor technologies to make up for the weaknesses of each type. You define the places and scenarios where you will operate so that you can test and verify that things are working in each of those places. You train test drivers to obtain your real world test data, and you rely more heavily on simulation than real world miles, because simulation allows you to generate more specific data that can be used for training. Simulation is also a key to verifying that changes don't break something and cause accidents.
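      (A toy sketch of the cross-checking idea in that paragraph; this is not any company's actual stack, and the sensor names, units, and the 2-meter agreement threshold are all invented for illustration.)

```python
# Toy redundancy check: two independent estimates of the distance to an
# obstacle are compared, and the vehicle fails safe on disagreement.

def fused_distance(camera_m: float, radar_m: float, max_gap_m: float = 2.0) -> float:
    """Return a distance both sensors agree on, or fail safe on disagreement."""
    if abs(camera_m - radar_m) <= max_gap_m:
        return min(camera_m, radar_m)  # agreement: keep the more cautious value
    # A Level 4 stack would degrade to a minimal-risk maneuver here
    # (e.g. slow down and pull over) rather than guess which sensor is right.
    raise RuntimeError("sensor disagreement: initiate minimal-risk maneuver")

print(fused_distance(42.0, 41.2))  # sensors agree -> prints 41.2
```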
      It took a long time for companies to get their robotaxi services running in a single city. But what we're seeing now is that they're starting to expand more quickly. It took Waymo many years to introduce a service in Chandler, Arizona. They accomplished San Francisco in about a year from when they first announced testing, and then Phoenix seemed to only take a few months. LA appears to be next for them. It's picking up pace, showing that scaling isn't linear. I do think this is likely the avenue by which most of us will be introduced to cars that operate with no driver (Level 4).
      I understand the practical nature of what you're saying. More lives saved is more lives saved. But the thing is, we don't actually have to settle for that. We can do better. Level 4 approaches are one path. And even Level 2 systems can be safer if better technology is integrated. It's not an A/B choice here. And I think that there's a reasonable chance that FSD could do more harm than good. As it improves, it may cause more accidents as humans trust it too much. And if it makes mistakes, it could set the industry back through regulation that blocks its usage. Already today I think Tesla is running this risk. If you go and read the article that was mentioned in this video, about Tesla telling regulators something different than customers, you'll see that they are in fact playing a bit of a game here. They want regulators to accept that FSD City Streets is only Level 2 and _will_ only be Level 2 in its final release form. And they need that because otherwise they can't release their software to public testers like they're doing. California law bars them from doing this with anything about Level 2. Meanwhile, Elon Musk really is telling customers that this product is nearly Level 4 and soon will be.

    • @communityband1 · 1 year ago · +1

      @@BigBen621 Btw, thank you for the discussion. I appreciate your thoughts.

    • @communityband1 · 1 year ago

      oops - "California law bars them from doing this with anything _above_ Level 2." If you want to check out one of those articles, you can search for "A Tesla regulatory counsel told the California DMV that it will remain a hands-on system."

  • @communityband1 · 1 year ago · +2

    Hello! Thank you for making this video. I can tell from your comments here that we did not always understand each other. I'll try to address each point here that you raise:
    1. Your initial summary of my message, that I am saying you should not be doing beta testing on this technology, is an incorrect summary of my point. What I actually object to is the method of testing that allows FSD to misbehave. During that video which I responded to, you noted that FSD was doing things that felt awkward and might be frustrating other drivers around you. I argued that you should intervene when that is the case rather than allow FSD to go forward with behavior you would not do yourself and which you think distracts or annoys other drivers. My argument is that it is not enough for you to know the behavior won't directly lead to an accident. You should also consider that other drivers being distracted or annoyed makes them potentially worse drivers for a time. Later in my comments on that video, I also explained that allowing FSD to behave this way does not provide Tesla with the best feedback for its training purposes, and I also said that videos showing drivers allowing FSD to misbehave are of interest to regulators who are now considering legal action to block Tesla from this testing. When I stated that nobody is doing more than Tesla to threaten an autonomous driving future, regulatory action is what I was referring to. Tesla's actions have been drawing far more regulator scrutiny than other companies', to the point that there are numerous investigations happening. It is not unreasonable to think that laws may be introduced in response. And this is why I believe that people making YouTube videos that show them letting FSD behave in awkward ways are not helping. If you are an FSD tester, your best path towards helping Tesla is to intervene when the vehicle makes a mistake of any kind. There is no real benefit to letting it continue other than appealing to curiosity.
    2. You felt that if I rode in a Tesla, it would give me a different impression and show me the excitement you feel when using this technology. And I told you I did not intend to do this, because I felt I gained better insight through watching a greater selection of videos posted by other people. Later in this video, you talk about how Tesla could not train its vehicles effectively with only a few cars giving it input. You are essentially making the same point there as I was here. I don't feel it's better to try to skew my sense of this technology by putting so much focus on a single data point - a test drive that I have in one of these vehicles. I have watched hundreds of these videos, and I do believe that this gives me a better view of things. Keep in mind that the technology itself is only one of the things I'm interested in. I'm also interested in the human behavior that the technology leads to. I do not value my own emotional response as being worth more than the sum of experiences I can witness from others.
    Also keep in mind that the point I addressed when I first made the comment about not owning a Tesla was that drivers who do not own Teslas are still being made to become participants in this test. If an FSD tester allows the Tesla to do something distracting or annoying, other drivers who aren't voluntary participants become unwitting ones.
    3. You did not pull out the portions of the letters Tesla sent to regulators which state that FSD, in its final form, will be a Level 2 system. You can find the letters in their entirety. "As such, a final release of City Streets will continue to be an SAE Level 2, advanced driver-assistance feature." The situation here was that California was asking Tesla for clarification about FSD. California has much stricter testing policies for driving systems that operate at Level 3 and above, which would require Tesla to gain approval. Tesla would not be able to have the general public participate in testing of a Level 3 or 4 system. And so what Tesla wrote back to regulators was that FSD, in its current _and final_ forms, would be SAE Level 2. I stated that I could not include a link to the article. I said that after my comment was removed by YouTube's automatic filtering system. This isn't something that you did. It's a YouTube feature that hides comments. Upon viewing the comment thread as a guest user, I was able to see that my comment was only visible to me. So I elected to go back and give you the title of an article you could read on this subject. "Tesla Admits Current ‘Full Self-Driving Beta’ Will Always Be a Level 2 System: Emails"
    4. Elon Musk saw dollar signs. I stand by this, because what Elon did here shows in sharp contrast to the decisions made by another company on moral grounds. Google was the first company to have functionality like Autopilot. They pioneered this technology, and for a time, they had employees riding in cars that could drive themselves down the highway. But Google found that its employees failed to stay attentive. They released a video years later showing their employees not paying attention. One was even sleeping. They noted that this realization is what led them to abandon the approach of gradually improving autonomous driver assistance technology in a way that would replace humans while requiring that humans act as the backups. It put people at too much risk. Elon Musk made the opposite decision when presented with the same choice. And we have seen some people die as a result. Tesla, led by Musk, puts the blame for this on the customer, saying that they're not following the rules. And while that's perfectly true, it overlooks the fact that humans, by our nature, don't make good backups to technology. You talk about how Tesla cars have the highest safety statistics. This is a testament both to the drivers who operate them, who are typically wealthier and more mature, and to the technology and design that Tesla has put into improving the safety of these vehicles. There are some really good things Tesla is doing. But the idea that we can rely on humans to play backup and be ready in emergency situations as the technology improves is a gamble at best. People zone out, and the better the technology becomes, the more this happens. MIT has done studies that show this. And there is no statistic Elon Musk can point to that says that drivers, after riding in a car that makes no mistakes for a week, a month or a year, will generally be ready to respond when a critical mistake happens. In this regard, this is all an experiment. It involves human life, and it involves regulatory scrutiny that may delay the progress towards autonomy.
    Elon Musk has also made decisions about the design of these cars that do in fact make autonomy less safe. Cost and appearance have been driving factors in the design, and this has affected the sensors they have included. Teslas that have hit emergency vehicles did so because the cars could not "see" stationary objects at that time. That was a money decision, not a safety-based one. The technology was available to prevent that. When it comes to camera-based object and scene perception, we have still not reached human-level capability. Even Google, for all of its billions of users feeding it data, still shows these limitations in its applications. Tesla has not solved this problem either. We see this in the videos that show FSD hitting low-lying objects or braking in response to objects which don't exist. If Tesla had advancements over the rest of the industry in this regard, it's quite likely they would seek to take advantage of them by expanding into the many other areas where this is valuable. We've really seen no indication of this, and so right now, it seems safe to guess that Tesla is largely employing the same types of machine learning techniques as others. And as such, they're facing the same limitations which other companies are solving through additional sensor technologies.
    5. As a music teacher, I would say that if you want to learn to become good on recorder, you need to pick up a recorder and play it! If you want to know the merits of playing recorder, vs. playing another instrument, you would do better to watch as many videos as you can about various instruments to gain insight from others. Listen to what a recorder sounds like when played by a professional. Look up its restrictions and benefits, both technically and in its usage, and decide if this is really something that you want to learn. If you're trying to choose an instrument to learn, there's no need to try every instrument when making your selection. As someone who plays many instruments, I will tell you that I make decisions about instruments to learn not by trying them but rather by listening to what accomplished players sound like playing them. It's unrealistic that I will gain a good feel for a new instrument by trying it for a short time. It's better for me to hear the potential and then decide if that's something I want to work towards.

    • @model3man · 1 year ago · +3

      Thank you for responding to the video (since it was primarily a response to positions you've taken and comments you've previously made). I will BRIEFLY respond to your comments, but I have to say that the biggest problem I have with the positions you take is that they are one-sided; very biased against Tesla/Elon; show an unwillingness to actually investigate FIRST-HAND (not via completely inadequate videos); and seem determined to "throw shade" on every single aspect of Tesla and its creator. That seems a strange position to take on a channel devoted to the GOOD that is being done for the EV revolution by Tesla; to the development of safer cars and safer driving technologies. Now, very BRIEF responses to your 5 points:
      1. NONE of us Beta testers allow our cars to actually endanger other drivers, any more than an old, slow, erratic driver might do because, well, they simply are no longer as sharp as they used to be. We as fellow road users treat drivers like that with responses that range from tolerance and understanding to perhaps a level of impatience, but NEVER does that cause drivers in the vicinity to get into an accident or fly into a fit of road rage against the 'poor or slow' driver. That argument is simply a 'crock'.
      2. You still try to maintain that you have a BETTER overall perception of FSD by watching videos that may be biased, may be inaccurate, may be poorly filmed, etc., than if you actually spent a few days IN an FSD-enabled vehicle alongside an intelligent, alert owner who would be happy to show you first hand. That is a major mistake on your part, and shows that you are predisposed to dislike / reject anything at ALL about Tesla's autonomous driver assist technologies. And that's fine - you can have all of the predispositions you like, but please don't come into this forum and tell us we are wrong about something you've NEVER actually tried. I say again: Get out there, be brave, and TRY it out for yourself. Once you've done that I will start to pay more serious attention to your conclusions.
      3. You need to update yourself regarding the situation between Californian regulators (one state in 50) and the Tesla company. There are MANY indicators that Tesla is pushing on beyond Level 2 (in my opinion they already have) and that they intend it to be a mature self-driving solution. Every drive I take after a software update has occurred shows DRAMATIC improvements in how the car deals with previously difficult situations. You are living in a period from 3 years ago and thinking it still applies today. It simply doesn't.
      4. You say Elon "saw dollar signs" and adopted a technology that Google has (on "moral grounds") turned away from. First, if by "dollar signs" you mean that Elon sees the need to make a PROFIT, so that his company can go from strength to strength; NOT fail and go bankrupt like so many other well-meaning startups have done; and have enough cash reserves for product development and technology advances, then YES - and we are all SO glad that he is making Tesla profitable and removing the previous worries about going bankrupt.
      Second, however, you are completely incorrect when you attempt to make out that it is "immoral" to advance the self driving technology that Google somehow "righteously" abandoned in some selfless high-minded act of philanthropy. And here is what you IGNORE: dozens of other cars have freeway assist / pilot mode / lane-guidance strategies - ALL of which allow the driver to relax and leave certain things to the car. So by your argument, ALL of these manufacturers create the danger that their drivers will become complacent after months of the technology working well, and will end up causing some horrific accident when they, too, stop paying attention to the road. What's sauce for the goose is sauce for the gander!
      To attempt to say (as you do) that, because Tesla owners must (somehow) be the well-off careful types who therefore have fewer accidents, is complete nonsense. Teslas are being bought by EVERYONE who values safety in a vehicle, loves what Tesla is doing, and is STRETCHING their budget to make it happen. Ordinary people see quality, safety, lack of maintenance, and technological advancement as big PLUSES and are willing to pay more.
      5. Finally, you can watch all the videos on recorder playing (or any other instrument) but all that watching is purely of academic interest until you start PLAYING and you start PRACTISING, and you determine not to give up on the instrument no matter how hard it seems at the start. So videos are great for INVESTIGATING but if that is not followed up by acquisition of the instrument, and countless hours of practice, it will remain FOREVER simply an academic interest where you never actually get to know that instrument.
      In conclusion: your hostility to Tesla and to Elon is what makes me suggest that you no longer spend the countless hours you claim to have spent watching videos on Tesla and self driving, since you clearly do not believe in it; do not SEEM to value what Tesla or Elon are doing; seem to be strongly influenced by the negative anything-but-Tesla lobby; and clearly do NOT have that most necessary of requisites - an open mind.
      Let's draw a line under this comment thread. You've expressed your opinions over numerous posts (including this one); we are not going to see eye to eye; you are not going to persuade anyone on this forum of your views, but you have made them as well as it is possible to express the sentiments you've shared. So thanks for the participation but no need to respond with an endless "tit-for-tat".

    • @communityband1 · 1 year ago · +1

      ​@@model3man
      _So thanks for the participation but no need to respond with an endless "tit-for-tat"._
      I'm sorry, sir, but you created a video about me, and I do feel I want to respond to what you're saying. I do not feel you've represented my positions as I've stated them, and I don't think it's reasonable for you to both create this video and then ask that I stop responding. I don't ask you to respond if you're tired of doing so.
      _1. NONE of us Beta testers allow our cars to actually endanger other drivers, any more than an old, slow, erratic driver might do because, well, they simply are no longer as sharp as they used to be._
      I've seen people allow FSD to act in ways that cause other drivers to become quite nervous, such as nearly entering an intersection when it is unsafe to go. Sometimes testers delay long enough that we actually see the other drivers steer quickly to avoid them. We also have accidents apparently caused by FSD or Autopilot which are atypical of a human driver, such as braking hard in heavy traffic when there is no vehicle in front (phantom braking).
      Beyond this, though, it doesn't make sense in my opinion to say, "It's okay for me to be a worse driver, because there are already some worse drivers on the road. So it's okay to add more bad drivers, and I shouldn't feel badly about not doing my best to be a good driver." Regardless of where the system stands today, if you elect to change your driving habits to something that's less respectful and more annoying to other drivers, you are making the situation worse.
      This is how our argument began. I believe you felt upset when I asked you to not allow FSD to do things which you felt would annoy other drivers. I asked you to intervene.
      _2. You still try to maintain that you have a BETTER overall perception of FSD by watching videos that may be biased, may be inaccurate, may be poorly filmed, etc., than if you actually spent a few days IN an FSD-enabled vehicle alongside an intelligent, alert owner who would be happy to show you first hand._
      The videos I'm watching are very much biased in favor of Tesla. I am not cherry-picking content with titles that suggest problems. I'm watching complete drives posted by people who are excited about this technology. That's how I found yours.
      If you want to have a conversation with me about FSD, I think it would be quite interesting. You are using the system, but because you have spent less time than me watching how it behaves in a wide selection of videos, analyzing its visualizations, and observing the drivers as they use it, I think you haven't picked up the details I have. I also work as a software engineer (making music education software), and I think this has helped give me a better understanding of the machine learning topics you discuss. Some of the ways you discuss it make me think you misunderstand how it's really being applied to train the system. I certainly can't claim to know how it works either. But in general, I think it's applied much more selectively than how you've represented it.
      _That is a major mistake on your part, and shows that you are predisposed to dislike / reject anything at ALL about Tesla's autonomous driver assist technologies._
      I am very keen on the technologies Tesla develops which operate to assist the driver, such as impending collision detection. There is technology here that can save lives. But there is also some that threatens life and has in fact resulted in deaths.
      _Once you've done that I will start to pay more serious attention to your conclusions._
      I don't ask you to. You may choose to recognize that there is value in hearing feedback from people who aren't interested in participating in this testing and don't agree with how you are allowing it to make your driving less friendly. This has value to you as someone who wants this technology to succeed. Public perception and regulation are obstacles for your goal.
      _3. You need to update yourself regarding the situation between Californian regulators (one state in 50) and the Tesla company._
      You continue to not respond to the words Tesla stated. Tesla stated that the final release of FSD will be a level 2 system. This does still apply, because the fact that Tesla is handing this technology out to users in California is something they would not be able to do today if the technology was seen as level 3 or 4. It's simply true that Tesla is being dishonest with at least one of these groups - the users or the regulators.
      4. _...then YES - and we are all SO glad that he is making Tesla profitable and removing the previous worries about going bankrupt._
      Other companies are taking on losses to develop this technology in order to do it in a safer way. And in general, I don't support the approach of any company that's trying to slowly replace human driving while requiring humans to act as backups. I won't limit that criticism to Tesla. That's not something I ignored. It's a topic that hadn't come up. But we must note that nobody has pushed forward as far as Tesla with ADAS features, and other companies have prioritized safety more. Sensor selection is one example. Another is driver awareness tracking.
      _that Google somehow "righteously" abandoned in some selfless high-minded act of philanthropy._
      This is as stated by them. You may look it up. They were years ahead of Tesla on this, starting long before Tesla was in the game. And this is the reasoning they gave for abandoning it. If you choose not to believe it, that's up to you. But what is your alternative explanation? They were clearly the furthest along and had the evidence to prove it. They had vehicles that could drive down the highway with a human sleeping in the car seat. Do you think they didn't see profit in that? It was much lower hanging fruit.
      _To attempt to say (as you do) that, because Tesla owners must (somehow) be the well-off careful types who therefore have fewer accidents, is complete nonsense._
      It's quite easy to verify what I've said. The average household income of Tesla Model 3 owners is $133,879 per year. The average US household income (in 2020) was $67,521. And there are plenty of studies that show people who are better educated, have better credit scores and often have families get into fewer accidents. I'm not trying to tell you that Teslas don't have technology that helps prevent accidents. They do. And I think that's awesome. But the statistics are skewed because of demographics. Even in your response here, you talk about how Tesla owners pick Teslas because they are safe. That in itself suggests a skewed demographic.
      _5. Finally, you can watch all the videos on recorder playing (or any other instrument) but all that watching is purely of academic interest until you start PLAYING and you start PRACTISING, and you determine not to give up on the instrument no matter how hard it seems at the start._
      As I indicated, it depends on your goal. Did you note that the first thing I said was that if you want to learn to play recorder, you need to play recorder? It seems you are inventing something to argue against here. But if I want to learn what a recorder is capable of, what value it has and what it really sounds like, my best bet is to do some research. And that analogy is closer to what my goals are with self-driving cars. And by the way, music teachers teach instruments they don't play. That's a job requirement if you teach orchestra, or as was the situation for me, orchestra/band/piano/guitar.
      _In conclusion: your hostility to Tesla and to Elon is what makes me suggest that you no longer spend the countless hours you claim to have spent watching videos on Tesla and self driving, since you clearly do not believe in it; _
      I think I've shown that I am very much interested in self-driving! But as I've shown here, in articles you've not responded to, there are issues with the way Elon Musk is approaching it.
      I _am_ openly critical of Elon Musk. But I am hardly alone in this. I suggested an article to you before on Business Insider called "Elon's stale playbook."

    • @model3man · 1 year ago · +1

      @@communityband1 - Clearly there is NO point in arguing with you. We completely disagree with each other. I am extending an offer for you to come to Vancouver and spend time actually DRIVING around in my FSD-equipped vehicle. If you're interested, let me know. Otherwise - over and out. Thank you.

    • @communityband1 · 1 year ago · +1

      @@model3man Thank you for the offer!

  • @casperhansen826 · 1 year ago · +1

    Hope it comes to Europe soon

  • @iowa_don · 1 year ago · +2

    I'm kind of bummed about FSD Beta. I once had the camera report button but a release or two ago I lost it so I don't really feel that I am "participating" any longer. I feel that they no longer want my input. Model S Plaid owner in the lower 48.

    • @jameshoffman552 · 1 year ago · +2

      The report function being pulled just means that Tesla can assess the incidents themselves. It doesn't mean your participation isn't appreciated.

    • @iowa_don · 1 year ago

      @@jameshoffman552 I'm not 100% sure that is true. I suppose they can get a report if I take over, but there are other times when it does something dumb and I don't take over, where I would have pressed the button to let them know it wasn't behaving in a good manner. Dirty Tesla and Black Tesla often push the button but don't take control. I did the same.

    • @model3man · 1 year ago · +1

      Hey Don. It's not that they don't appreciate the input from testers, but that they have moved beyond the data-gathering phase, having gleaned sufficient information from the hundreds of thousands of individual reports drivers have submitted (via that button). At a certain point they move from data gathering (not exclusively) to solution implementation. They may well bring the button back if they discover that more feedback is needed after they have made certain changes and improvements.

    • @iowa_don · 1 year ago · +1

      @@model3man I hope so. As far as I can tell they have done absolutely nothing about school zones, and I would never let it go on its own through the roundabouts near me (based on past experience). Dirty Tesla has had luck with his roundabouts; me, not so much, with the car totally ignoring a posted roundabout speed of 15 MPH.

    • @model3man · 1 year ago · +1

      @@iowa_don - Yes, that is a question I also have! Is the car noticing, interpreting and acting on posted street signs (as opposed to using information in the map data)? I think the answer is: not yet. I'll do some digging into that important aspect.

  • @johngannon · 1 year ago · +2

    I've had FSD beta for months and have never turned it on once. Total waste of money for me.

    • @model3man · 1 year ago

      Out of interest, what made you pay all that money to purchase the full self driving feature and never even use it once? Did you try to gain admittance into the FSD beta program? Did you not achieve the safety score required? I can’t imagine anyone paying that money and not even trying it out.

    • @johngannon · 1 year ago · +3

      @@model3man Wasn't willing to drive a certain way just to raise my safety score (shouldn't have to). Also, the tech just isn't there; MKBHD has the perfect video detailing why I won't be using FSD personally. Add to that my total distaste for Elon and his antics, and it's the perfect recipe for me to not be interested in my Tesla or its tech.

    • @johngannon · 1 year ago · +1

      Also, I had Enhanced Autopilot with my 2018 Model 3 and upgraded to FSD for just $2k, so it wasn't a huge amount of cash compared to today's pricing.

    • @model3man · 1 year ago

      @@johngannon - Awfully sad to see that you cannot separate Elon's (in my view) brave attempt to bring freedom of speech to the world's most influential social platform - driven by his strong convictions about ALL people's freedom of expression (something we should ALL share) - from his genius and his amazing pioneering of a revolution in the electric car industry. Every person has political beliefs. I respect that. I respect those who disagree with me politically. I respect those who politically agree with me. It stuns me how someone (AOC is an example) can LOVE her vehicle (a Tesla), talk about it, Instagram about it, and then, suddenly, when she realizes that Elon is on the other side of the political divide, HATE her Tesla and say she is going to give it up! That is a stunning example of idiocy. One may not like Elon's political positions - a perfect right - but to reject the incredible products he is responsible for developing and pioneering seems to me a case of "throwing out the baby with the bathwater". Apologies for using old English sayings, but.. well... I'm old and English.

    • @johngannon · 1 year ago · +1

      @@model3man I'm not selling my Tesla, but I'm certainly not buying another. Elon sold us all a dream that he doesn't believe in. If you think his Twitter takeover is about freedom of expression and not a petulant child lashing out, I can't help you either. Unsubbed!

  • @Resist4 · 1 year ago · +1

    Did you address what the issue was with your yoke install that caused a Service Center visit? I know you said you were going to tell us in another video. Did I miss it?

    • @model3man · 1 year ago

      Hi Dan. I haven't yet - but it's no secret, so I'll share it here. The problem basically had to do with the fact that the new steering wheel (yoke) was not pushed down far enough and didn't fully engage with the "clock-style" connector that controls the horn and the wheel buttons. That ultimately caused the failure of that unit, requiring its replacement. An expensive lesson for me - but not one that will deter me from other mods to the car where they are warranted. Just a good lesson in doing the job properly and doing sufficient research prior to embarking on the project.

    • @Resist4 · 1 year ago

      @@model3man So what had to be replaced?

    • @model3man · 1 year ago

      @@Resist4 The clock-style rotating connector ribbon that links the steering wheel buttons and the horn to the main system.

    • @Resist4 · 1 year ago · +1

      @@model3man Ah. Thanks for the clarification.

  • @jimbercik · 1 year ago

    In your next FSD educational video, please first define what full self-driving means to you, and what you think it means to Tesla.
    Also, it is my understanding that you insure drivers, not necessarily the vehicle.
    Where is the profit to cover the liability of full self-driving? Will we still be 100% responsible for what this car does at the next autonomous level?
    I care very much about the liability of a 5,000-pound killing machine. Would you allow a 19-year-old (unrelated to you) to drive your new Corvette Z06 and take all the liability, since he has no insurance?
    I thought part of that big margin Tesla was getting per vehicle was to cover some of this liability. But it appears they have no liability for the fitness, performance, or suitability of their software, hardware, and sensor combination. Or do you believe that as soon as they get past the beta stage, they will be accepting 100% liability? Or is this just more fantasy, uncertainty, and doubt?
    Please educate me, because I'm just an old electrical engineer who spent his life designing and implementing control systems. I wish I had the luxury of doing that while putting all the liability on my customers. That's just not the way the world I lived in ever worked. But then, I never looked at it like learning to play the recorder.
    Oh, I did see that ecological paradise in Texas on fire yesterday. According to Elon, it's better to blow stuff up on the ground, especially in Texas where nobody gives a crap. But he's going to save humanity with reusable rockets that cost as much as or more than disposable ones. Or do you still believe the one-tenth-the-price claim, and "I'm going to Mars for $100,000"?

  • @jimbercik · 1 year ago · +1

    I am saying it is impossible to achieve Level 4 in my 2018 with the current hardware, cameras and sensors. Impossible, impossible. No effort has been made toward regulator approval. Who has liability for your Tesla when you are allowed to not pay attention (even for 1 minute)? This basic question has never been answered.

    • @BigBen621 · 1 year ago

      Tesla has committed that all Teslas for which FSD was purchased will be capable of FSD, and if necessary will upgrade the hardware and sensor suite to achieve that. In fact, if I'm not mistaken your 2018 Tesla will already have been upgraded from HW 2 to HW 3; and an upgrade to HW 4 is reportedly just around the corner.
      You are never "allowed to not pay attention"; I don't know where you got that idea. And you are always liable, at least until Level 4 is achieved.

    • @jimbercik · 1 year ago

      @@BigBen621 You're wrong. I was promised a full self-driving car. I paid for it, and after almost 5 years I see no progress and no hope. You're saying that I should've understood that I would always be responsible and have to pay 100% attention, with no forward display. That is not what I was sold by the huckster, and if you think that's what you bought, all I can say is you are a fool or a shill, or a foolish shill. And Tesla is not adding features and sensors. They are removing them. I was actually promised and sold level five; after four years, all I got was crappy cruise control, crappy infotainment, terrible headlights, and stupid windshield wipers. Also, the climate control sucks, but it has an OCTOvalve. Keep riding that d

    • @BigBen621 · 1 year ago · +1

      ​@@jimbercik "after almost 5 years I see no progress and no hope" There's been huge progress, to the point where going two hours and more on FSD beta without interventions or disengagements. If you don't see any progress, you're not looking.
      "You’re saying that I should’ve understood that I would always be responsible and have to pay 100% attention with no forward display" No, I'm saying that at the current Level 2 status, you're responsible and have to pay 100% attention. Obviously that won't be the case when/if Tesla achieves Level 4 autonomy. I don't know what "no forward display" means.
      "A Tesla is not adding features and sensors." I didn't say they're adding features and sensors. I'm saying they'll upgrade them if that's necessary to achieve higher levels of autonomy.
      If you think it's so crappy, why didn't you unload it six months ago when used values peaked, and buy something more to your liking? But of course then you wouldn't have anything to complain about.

    • @MindzEnt · 1 year ago

      @@BigBen621 Elon Musk himself said that FSD will be able to drive without a human inside from an LA parking lot to a NY parking lot. He promised full self-driving, with the human in the car just for legal reasons. Musk is a fraud.

    • @jimbercik · 1 year ago

      @@BigBen621 Yes, that's the reason I haven't sold my Tesla: I like to complain. That's the reason I purchased it - I like to complain. Maybe I'm trying to protect other fools and gullible people from thinking that the car is an investment. I didn't sell my Pinto for seven years. Does that mean I was a big fan of it? Keep riding Elon's D

  • @MindzEnt · 1 year ago · +1

    100%. Tesla should not be allowed to experiment with such dangerous technology out in public. I get it if Tesla owners want to, but we the public did not sign up for it.

    • @BigBen621 · 1 year ago

      What actual evidence do you have that it's dangerous?

    • @MindzEnt · 1 year ago · +1

      @@BigBen621 Evidence? Sure, just get on YouTube and search "FSD dangerous," "Tesla Autopilot dangerous," or "Tesla phantom braking." Hundreds of videos with Tesla owners telling you how dangerous it is. Most compare it to a teenager driving for the first time. NHTSA has received thousands of complaints from Tesla owners who reported Teslas driving into oncoming traffic, running red lights, suddenly braking in the middle of high-speed traffic, not stopping for pedestrians or objects in the road, running stop signs, and driving full speed over speed bumps. Tesla actually tops the list of cars using driver assistance with the most crashes. The evidence is there; look for yourself, though. Don't believe me, just take a look for yourself.

    • @BigBen621 · 1 year ago

      @@MindzEnt Those are not evidence of FSD beta being dangerous; they are anecdotes about FSD beta appearing to do something that's potentially dangerous, but in *every* case, the driver intercedes to avoid an accident. Phantom braking occurs only on highways, where the NOA ("Navigate on Autopilot") component of FSD operates, and not FSD beta, which operates only on city streets. I've experienced occasional episodes of this, but it's always been on NOA, and not FSD beta.
      Comparisons to a teenager driving for the first time were common a year ago, much less so now; and certainly not by "most". Much more common are YT videos of drives up to and even exceeding two hours, with no interventions or disengagements.
      The NHTSA Customer Complaint database contains self-reports of claimed incidents, with no verification of the identity of the reporter or the validity of the report. The last time I looked (last May), it had only 15 reports containing the terms "FSD" or "Full Self Driving". Of the accidents claimed in the database, some didn't claim the car was operating under FSD beta, only that the car was equipped with it. Others claimed the accidents occurred on freeways (Long Island Expressway, I-405), while FSD beta, AKA "Autosteer on City Streets", of course operates only on city streets and not highways; so these claimed accidents did not involve FSD beta. And one reported a crash because "an alien landed on the road in front of me".
      Several people claimed difficulty taking control back from FSD. Anyone who's driven a Tesla can attest to the fact that it only takes a slight torque on the steering wheel to take back control. In fact, many of us have inadvertently disengaged FSD or Autopilot, just by holding the steering wheel a little too closely while it's making a turn; so these claims are not plausible.
      Some claims were more plausible; but the only way to know for sure whether the claimed incident or accident occurred (few are reported to police, which itself casts doubt on the veracity of these claims), and if so whether FSD beta was at fault, is by the logs in the cars, which Tesla freely shares for accident investigations. Since NHTSA is now investigating Tesla ADAS crashes, this will likely end up in their final report.
      You are correct that Tesla leads all other car companies in *reports* of accidents involving (but not necessarily caused by) ADAS, and reported to NHTSA under SGO 2021-01. But a little deeper dive into the data will show why. This report is much more a measure of various car companies' ability and willingness to report accidents involving ADAS, than a measure of the relative accident frequency of these companies' ADAS. For example, while they have relatively similar numbers of cars on the road with relatively similar ADAS, Honda has 82 reports in the latest edition of the database, and Toyota has only 5. Do you really think Toyotas are >16 times as safe as Hondas?
      Looking at Tesla, the reason they top the list is that Tesla, and only Tesla, reports almost all of their accidents via their telematics, which report back to the mothership the instant there's an accident. Other than Tesla, there are only two Cadillacs, one Honda, one Lucid Motors, and three Subarus with accidents reported via telematics; every other ADAS-involved accident from every other company is reported to NHTSA *only* if someone reports the accident to the manufacturer to be reported to NHTSA, or, in a very small percentage of incidents, by law enforcement. Since the rate and method of reporting varies hugely between manufacturers, no conclusions can be drawn about relative safety based on this data; and the report makes this clear.
      It is possible that there are some incidents involving FSD beta within the SGO data, but there's no way to tell from the public database. So the fact still remains; there are no verified instances of accidents caused by FSD beta. That is, none of the claims you've made constitutes actual *evidence* of a reportable accident caused by FSD beta, rather than being simply anecdotes about behavior that *appears* dangerous, but does not lead to a history of verifiable accidents.

  • @davidpearn5925 · 1 year ago · +2

    Why would anyone pay sooo much for something that doesn't work and requires you to prequalify in order to work for nothing? Only the world's greatest salesman could pull this off.

    • @mrthemoo · 1 year ago · +3

      Elon is a genius when it comes to EVs, but he is far from the world's greatest salesman. Part of me wishes that many individuals don't buy a Tesla, so I can enjoy the speed and the features of the car without competition. If people knew how awesome the car is, they would find a way to have one.

    • @AlainFattal · 1 year ago · +3

      Teslas are cheap for what they have to offer: ultimate safety & pleasure. That's what life is about. If you don't own a Tesla, you cannot understand. FSD will be much more expensive when it is complete. We have the opportunity to buy it early for a fraction of the final price. Take it as an investment.

    • @jameshoffman552 · 1 year ago · +2

      $15k is absolutely worth having robotic driving IMO. I have it on my two Model Ys. It's not perfect (although it seems to be in crash avoidance) but well worth $15k.

    • @codemonkey2k5 · 1 year ago

      This, my friends, is a troll. ;)

    • @davidpearn5925 · 1 year ago

      @@AlainFattal I have had one for the last 3 years. I only have to apologise for my choice... now.

  • @jimbercik · 1 year ago

    1. Users take all liability for this.
    2. How much data do they need? Elon said it would take much less data.
    3. More people taking the liability speeds up the process (impressed wives are not a good benchmark).
    4. Yelling out the window does just as well; that's why they removed the button.
    5. Machine learning is just tuning a model. The model is still not close to being tuned.
    6. Or paid shills (through sponsorship). I never scored above 88.
    The safety rating only comes from having no engine and a large crumple zone in front (this is common to all EVs); this also goes away if the Cybertruck is made out of SpaceX stainless. Tesla and Elon are not humanitarians, but Kickstarter hucksters.

    • @BigBen621 · 1 year ago

      1) Users take all liability for driving any automobile; it's no different for Teslas. Financial liability is usually covered by insurance; and since Tesla now sells collision and liability insurance in 12 states with more to come, it is perfectly willing to take on that liability. At a cost, of course, but again, no different from any other automobile manufacturer. And like any automobile manufacturer, Tesla can be found liable in a lawsuit if their product is found defective.
      2) They need lots of data. Tesla obviously underestimated how much data it would take to achieve higher levels of autonomy in the beginning. This is not surprising, because they were taking on perhaps the most difficult task ever attempted in AI; and there was no a priori basis for making an accurate estimate of the time and effort that would involve.
      3) Everyone takes liability when they drive a Tesla or any other automobile, so presumably the process is going as fast as possible.
      4) Although the button was removed, your Tesla still reports back to the mothership whenever you disengage FSD beta.
      5) Frequent YouTube videos of Teslas driving two hours or more on FSD beta without intervention (e.g. czcams.com/video/3gZJWWfyCqE/video.html) suggest that it is fairly close to being tuned.
      6) If you never scored above 88% on the safety score, I'm perfectly happy that you didn't get early access to FSD beta (safer for the rest of us); but you have it now.

    • @jimbercik · 1 year ago

      @@BigBen621 Where is the score for Tesla's Autopilot/FSD? I have not had an accident in 40 years of driving; can Tesla say that? When I purchased my Tesla, I thought Tesla would need to test their software for suitability, not test the customer for suitability. Who gives a crap what you think about whether I should have the product I paid for? I did not invest in a GoFundMe. I purchased a product.