Humans did not evolve to drive cars. ML did. It drives consistently with no distractions. It is never tired or drunk, and it doesn't experience road rage. It has superhuman reaction time and can see a full 360 degrees. It is not about being a lazy fatass; it is about safety. Hundreds of people in the US were killed in car accidents just today, and none of them were killed by self-driving cars.
Also, please provide an example of a life-threatening accident caused by FSD.
The article listed 2 life-threatening near-accidents that were only prevented because the person behind the wheel took over and kicked out FSD. Read the article and then comment.
Hilarious telling them to read the article first when you couldn’t even be bothered to read their question before replying.
I read it just fine. He asked for an example of a life-threatening accident caused by Full Self Driving. I noted that 2 examples were listed in the article. The ONLY difference was that the driver prevented the accidents by being aware. The FSD was going to cause accidents without intervention. I guess in your world people are supposed to do nothing to avoid a major accident. Hilarious that you love FSD so much that you're willing to defend a billionaire who wouldn't piss on you if you were on fire. Billionaires are not your friends. FSD is a BETA feature that doesn't work properly. Take your love somewhere else and away from my comment, because you read it, didn't understand it, and fired off a reply claiming I didn't do something I did because you can't understand me. The next time you want to have a discussion, come prepared, or don't come at all!
Ah, the "only difference" in your two examples of life-threatening accidents occurring is that no accident occurred in either example? That's quite the difference if you ask me… This isn't a level 4 or 5 system, so driver intervention is required. These systems can't improve without real-world testing; meanwhile, a hundred people die on the road every single day. I guess you'd prefer more people die on the road from drunk or distracted drivers than have manufacturers roll out solutions that aren't absolutely 100% perfect, even if they're better than human drivers most of the time.
Your obsession with Musk is clouding your judgment. I made no mention of him, nor do I like or defend him. This tech wasn't built by Musk, so who gives a shit about him in this discussion?
I am not obsessed with Musk in any form, but the fact of the matter is, when you have FSD systems that fail to do the thing they are supposed to do, then maybe it's not the best idea to roll them out to the entire world. Maybe it's better to continue with more limited testing. You act as if all drunk/distracted driving will stop when FSD is used, and that simply isn't the case. Many people still drive gasoline-powered cars and drink and drive even though it's dangerous to do so. Furthermore, FSD will lead to more distracted driving, because people will assume self driving means the car will take care of everything and there is no need to be vigilant.
The plain truth is that while FSD can be the future, rolling it out despite knowing that it isn't ready is not the solution; it's irresponsible and will cause harm. The almost-accidents that you aren't concerned with would most likely have killed the driver and probably other people too. Our difference of opinion here is that you believe it's okay if people die as long as the testing shows there's a chance they won't die in the future, and I think if anyone dies it's too many. The feature clearly isn't ready for prime time and needs more limited real-world testing, but the fact of the matter is testing doesn't bring in money.
Your inability to ever consider that a worldwide rollout might not be the best idea right now, since the testing shows the car isn't ready, shows that you really aren't arguing in good faith. You have chosen the position that FSD is good and ready, even when confronted with articles like the one above showing it isn't. I would wager that a lot of people want the era of FSD; they just want it when it works. Keep the rollout more limited and do further testing. When mistakes happen, take the time to figure out why and how they can be prevented in the future. You argue testing is needed, but are in favor of a rollout now even though we need lots more limited real-world testing. Both can't be true. Time to think about what you really want, because I don't think you know… And accusing any person who doesn't want a complete rollout of FSD today of having a bias against Musk shows that.
They themselves clearly didn't bother to read the rest of the thread.
Teslas have 360-degree dashcams that are recording all the time. Why didn't they upload the video? I promise you they have it.
Such a video would go viral pretty easily. It would light a fire under Tesla engineering to fix such a dangerous and life-threatening situation. Where is it? Why is there never any footage attached to these articles? Why can't I find a video ANYWHERE of such a thing? Why can nobody in this thread bashing the tech over and over produce any justification for their fear?
If I were Tesla and I wanted to cover up the dangers of FSD trying to kill people, I wouldn't give everyone a constantly running dashcam. It would really make them look bad.
Could it possibly be, just maybe, that the video disagrees with the "journalist's" opinion that it was performing dangerously? Could it be that an article that says "Tesla FSD performs admirably, swerves to avoid obstacle that would have caused a blowout" might not get nearly as many clicks and ad revenue? Maybe?
FSD is aware of where barriers and medians are. If it needs to swerve to avoid an obstacle it will go in whatever direction is safest. Sometimes that means towards a barrier. Sometimes the driver panicking and disengaging and taking over interrupts the maneuver and causes danger that wasn’t otherwise present. We will never know what actually happened because there is no evidence. Evidence that I promise you exists but for whatever reason was omitted.
If a cop said something outrageous and dangerous happened to them, and they say they are completely clear of fault and wrongdoing, would it not be reasonable to want to see the bodycam footage? If for whatever reason the police department says "we don't have it," "it's corrupted," or whatever other excuse, would that not raise eyebrows? The same situation applies here.
There are plenty of YouTube channels out there like dirtytesla, whole mars catalog, AI Driver, Chuck Cook, and many others that show and even livestream FSD. None of them have been in an accident, even in very early releases of the beta software. These people are comfortable with the beta and often don't take over control of the vehicle under any circumstances, even in their torture-test scenarios.
Is it at all possible, just maybe, that FSD isn’t as dangerous as you might think? Fear is often a result of ignorance.
I am extremely open to changing my mind here; just show me some convincing evidence. Every Tesla is recording all the time, so it should be really easy to find some, no?
I'm sure I'm just a Tesla shill or fanboy, whatever. The truth is I'm just looking for facts. I would like to know why people feel this way and are so afraid of new technology despite overwhelming evidence that it is saving lives.
Wow that’s sure a lot of text for someone that didn’t read the article.
The author states that despite having storage plugged in, he was not given the option to save a recording.
That's because it's a rolling recording. If you explicitly want to save a clip long-term, you honk the horn. This is clearly laid out in the manual and is a setting right on the screen where the dashcam is enabled. That line is a pure cop-out. They had the footage; they just refused to upload it. Possibly they never bothered to check for it, but that would be incredibly irresponsible for anything resembling "journalism."
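The "rolling recording" behavior being argued about here is just a ring buffer: old footage is continuously overwritten unless an event (like a honk) flags the current window to be copied to long-term storage. A minimal sketch of that idea; the class and method names are hypothetical illustrations, not Tesla's actual implementation:

```python
from collections import deque

class RollingDashcam:
    """Keeps only the last `capacity` frames; older footage is overwritten.

    Illustrative sketch only -- hypothetical names, not Tesla's code.
    """
    def __init__(self, capacity):
        self.buffer = deque(maxlen=capacity)  # oldest frames drop off automatically
        self.saved_clips = []

    def record(self, frame):
        self.buffer.append(frame)

    def honk(self):
        # An explicit event copies the current window to permanent storage;
        # without it, the footage is eventually overwritten.
        self.saved_clips.append(list(self.buffer))

cam = RollingDashcam(capacity=3)
for frame in ["f1", "f2", "f3", "f4", "f5"]:
    cam.record(frame)
cam.honk()
print(cam.saved_clips[0])  # only the 3 most recent frames survive: ['f3', 'f4', 'f5']
```

This is why "the footage wasn't there" and "the footage was never saved" are different claims: the buffer always holds the recent past, but only an explicit save survives.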
So you are saying that since the author of this article didn't upload a video of the events he detailed, FSD has absolutely no issues and is completely safe for every person on the road to use all the time? Seems like quite a leap to me, but what do I know? It seems to me that people here want FSD when it's ready. You want it now, ready or not. I guess that's where we disagree. And I don't really think you are open to anyone changing your mind. I think you picked your position and, come hell or high water, you're sticking to it.
FSD is not without issues, but yes, lots of people in this thread are implying that FSD is unsafe and causes tons of accidents, and there is absolutely no evidence to back that up. It's just a "feeling" they have. They believe that it is irresponsible of anyone to use it and that doing so puts others at unnecessary risk. All I have asked for the entire time is just some kind of evidence of that. Anything. Help me understand your view. Please.
The reason I defend it so much at this point is that it has already been demonstrated to be far safer than the average human driver, and it is getting better with every release. With the new V12 and full neural net, it is expected to get far smoother and drive even more human-like with less code while consuming less power. We have seen massive improvements in the tech just in the past year, and the rate at which it gets better continues to accelerate. It is impossible to count how many lives it has saved already through accident avoidance. We don't need misinformed people bashing and trying to cast doubt on and hold back this technology just because they "feel" a certain way about it. You should absolutely criticize valid concerns, but FFS please bring some facts and evidence to the table.
The reason I am confident FSD is safe despite "feelings" is how it's programmed. For several years prior to even the earliest public beta, the camera and AI system learned how to correctly identify everything on the road: other cars, pedestrians, dogs, cats, babies, telephone poles, traffic cones, whatever. It is in a state now where it is as accurate as it can be, and any remaining issues are mislabelings of one thing as another (a car as a truck, etc.). That doesn't actually matter for self driving, because the first rule of FSD is essentially: "This is a car, this is a truck, this is a pedestrian, this is a dog; here is where it is, where it is going, and how far away it is. Don't hit those." And it doesn't. Everything else comes second.

It drives like a robot. It obeys traffic laws to a T, and that pisses off other drivers, or freaks out whoever is behind the wheel because the car didn't do exactly what they would have done in that situation, so it must be wrong and they had to take over. It is sometimes unnecessarily cautious around pedestrians (but honestly, how would you want it to behave?). It might suddenly detect a hazard and swerve to avoid it, possibly moving the car into another unoccupied space. It is fully aware of the space it is occupying and of the space it is about to occupy. And it doesn't hit anything. There are lots of YouTube channels that demonstrate this; they upload regularly, stress-test FSD, and try to get it into trickier and trickier situations, and it never hits anything. It acts indecisively sometimes, and waits for overly large gaps out of an abundance of caution, but those are the issues that are getting better over time. At no point does it do anything "unsafe." Imagine, if you would, a world where all cars are like this. The most dangerous part of driving right now, FSD or not, is other drivers.
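The claim that mislabeling "doesn't actually matter" boils down to this: the planner keys off each detected object's position and trajectory, not its class label. A toy sketch of that idea, assuming a simplified lane model; the function and the detection format are entirely hypothetical, not real FSD logic:

```python
def choose_lane(current_lane, detections, lanes=("left", "center", "right")):
    """Pick a lane with no detected object within 50 m ahead.

    Each detection is (label, lane, distance_m). The label is ignored:
    whether the object is a 'car' or mislabeled as a 'truck', the rule
    is the same -- don't occupy the space it occupies.
    Hypothetical sketch, not Tesla's actual planner.
    """
    blocked = {lane for (_label, lane, dist) in detections if dist < 50.0}
    if current_lane not in blocked:
        return current_lane          # path is clear: stay put
    for lane in lanes:               # otherwise move to a free lane
        if lane not in blocked:
            return lane
    return current_lane              # nowhere free: would brake instead (not modeled)

detections = [("truck", "center", 30.0),   # mislabeled? the planner doesn't care
              ("car", "right", 20.0)]
print(choose_lane("center", detections))   # -> 'left'
```

The point of the sketch is only that avoidance depends on geometry (where, how far, how fast), so a car-vs-truck confusion changes nothing about the "don't hit it" decision.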
The more people we have using it who understand it and are comfortable with it, the better it gets, and our roads get safer and safer. I really don't care how you feel about Elon; he deserves every bit of hate that is sent his way, but FFS please take a look at FSD for what it is and what it is becoming. If it helps you feel any better, he was not personally responsible for writing a single line of code or designing any of the components of the system.
All I've gotten to "back up" the claims here is 3 different articles referencing the exact same incident (the bay bridge pile-up). The video clearly shows the car coasting (regen) to a stop and just sitting there. Had emergency braking been engaged, the hazards would have been turned on and the car would have stopped a lot quicker. FSD has never, ever had an incident of completely stopping in a lane. There is no evidence of this ever happening anywhere else. 500k of these cars on the road and no other similar reports. Is that a fault of the software, or is it more likely some kind of user error? From my standpoint, having actually used FSD for several years, I can tell you with complete certainty that the car would never behave like that, and there are far too many red flags in that video to cast blame on the software. Of course, we will see what plays out once the court case is completed, but in my opinion the driver clearly disengaged FSD, allowed the car to come to a complete stop on its own, and did nothing to move the car out of the way; it had absolutely nothing to do with the software. I'm 100% open to disagreement on that and am curious what a civilized discussion on it would sound like and what someone else thinks is happening here, but so far it just turns into a flame war and I get called a deluded fanboy, even a liar and other names. No evidence, no discussion, only anger.
Again, here is my point. If FSD is as dangerous as others are implying then we should see tons of accidents. Given that every single one of these cars has a constantly running 360 degree dashcam, we should see some evidence, right? Maybe not from this specific case, maybe there’s a valid reason why they couldn’t upload it. But, surely with half a million cars on the road and many millions of miles traveled collectively, we should at least see something, right? There are tons and tons of videos of teslas avoiding accidents, but nobody wants to mention or talk about those. People are focusing all of their energy into one highly suspect negative with nothing to back it up, holding back technology and safety and refusing to have any sort of civilized discussion around it.
It's clear from what you wrote that you want FSD to be as good as it can be, and I think we can get there, but we aren't there yet. You say there haven't been any reports of accidents with FSD save for one, but I don't know if that's true, and evaluating it would require some serious research on my behalf. First, I don't know the number of people that have a car capable of FSD driving; from your reply you said 500k on the road, but provided no evidence, so I can't say that's true without independent evaluation. Second, I have no knowledge of how many of those cars use FSD. It may be a bunch, but it may not. You don't say and I don't know. Now, there may be far fewer accidents with FSD, but if the number of vehicles on the road in Q1 is 286 million just in the US (https://www.statista.com/statistics/859950/vehicles-in-operation-by-quarter-united-states/), and the number of vehicles using FSD every single day, all the time, for every single drive is only a fraction of that, it would stand to reason there are far fewer accidents simply because there are far fewer cars. You also mention that it has become good at detecting objects, and I think it has, but being able to detect objects and being able to avoid accidents when there are 286 million cars on the road that use FSD exclusively every single time the vehicle is in use are two different things.
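The fleet-size point above is really about comparing rates rather than raw counts: a small FSD fleet will naturally produce fewer total accidents than 286 million conventional vehicles, so only accidents per mile driven supports the comparison. A quick sketch with made-up placeholder figures (these numbers are not real statistics):

```python
def accidents_per_million_miles(accidents, miles):
    """Normalize a raw accident count by exposure (miles driven)."""
    return accidents / (miles / 1_000_000)

# Placeholder figures purely to show why raw counts mislead:
fsd_rate   = accidents_per_million_miles(accidents=5,      miles=50_000_000)
human_rate = accidents_per_million_miles(accidents=30_000, miles=100_000_000_000)

print(f"FSD:   {fsd_rate:.2f} accidents per million miles")    # 0.10
print(f"Human: {human_rate:.2f} accidents per million miles")  # 0.30

# The small fleet has 6,000x fewer total accidents here, but that says
# nothing by itself; only the per-mile rate makes the two comparable.
```

Until both the FSD fleet's mileage and its accident count are published, neither side of this thread can actually make the safety comparison it is asserting.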
The fact is, I do want FSD to be a thing, but when I see an article written by someone who says that two times they had to take over for the car so it didn't kill the driver or others, I start to worry that FSD isn't ready. And frankly, although there are YouTube channels about electric vehicles that haven't ever brought up accidents, I wonder if they have a reason not to. I'm not sure. Also, I can't say the big YouTube channels have never talked about this, because I haven't watched every video they've ever posted. And I would have to do that to know if you're correct.
I see that you are passionate about FSD, and I think your passion makes you overlook the real discussion going on. People, though certainly not all people, generally want FSD to be a thing for the reasons you stated, but they want to make sure the cars are safe before that happens. And I get that you take a risk every time you drive a car, but the fact of the matter is, from reading this article I get the sense that FSD isn't ready to be implemented for every person with a driver's license to use. It sounds like the author knew what to do because he had been driving for some time. If he hadn't, I think the situation could have been very different.
You talk about the car not doing exactly what the driver would have done, but in the article's case it was going to crash. I don't think anyone would have done that. If the car was able to detect the object, why was it going to crash into it? That is something that would need to be investigated. You argue that people talk about FSD being removed/cancelled because they have a feeling it isn't good, but I haven't seen that in droves. I've seen several people say that they think FSD needs more testing and a more limited rollout.
I know I didn’t hit all your points, but they were quite numerous. I want full self driving, but I want it to be reliable. And I think if articles like this are written we just aren’t there yet. Yes, keep it coming, but be real about its current limitations.
Self driving is not there, and it may never get there, but you are right. We can save so many lives if we get this right.
I don't know if Musk is responsible enough to be the one to get us there, though.