A fan of Tesla might think that the automaker just can’t catch a break when it comes to its autonomous driving tech. It’s already subject to several federal investigations over its marketing and deployment of technologies like Autopilot and Full Self-Driving (FSD), and as of last week, we can add another to the list involving around 2.4 million Tesla vehicles. This time, regulators are assessing the cars’ performance in low-visibility conditions after four documented accidents, one of which resulted in a fatality.
The National Highway Traffic Safety Administration (NHTSA) says the new probe is examining instances where FSD was engaged in fog, airborne dust, or sun glare that blinded the car’s cameras, and a problem resulted.
What the car can “see” is the big issue here. It’s also what Tesla bet its future on.
I didn’t realize they used other sensors in the past and dropped them on newer models.
🤦‍♂️
Didn’t want to develop two different versions of the software, I guess?
The problem was the different sensors could sometimes disagree. Like, vision sees an obstacle but radar isn’t picking it up…which one does the software believe?
And if you think vision has problems with things like rain and fog, try radar or lidar!
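The disagreement scenario described above can be sketched as a toy confidence-weighted vote. This is purely illustrative: the function name, confidence scores, and threshold are hypothetical, not Tesla’s actual fusion logic.

```python
# Toy sketch of the sensor-disagreement problem: two sensors report whether
# an obstacle is ahead, each with a confidence score, and the software must
# commit to one answer. All names and numbers here are made up for
# illustration.

def fuse_obstacle_reports(vision_detected: bool, vision_conf: float,
                          radar_detected: bool, radar_conf: float,
                          threshold: float = 0.5) -> bool:
    """Naive confidence-weighted vote: treat each sensor's report as a
    weighted opinion and report an obstacle only if the combined,
    normalized score clears the threshold."""
    score = (vision_conf * (1.0 if vision_detected else 0.0)
             + radar_conf * (1.0 if radar_detected else 0.0))
    total = vision_conf + radar_conf
    return (score / total) >= threshold

# Vision sees an obstacle (0.9 confident), radar sees nothing (0.6 confident):
# weighted score = 0.9 / 1.5 = 0.6, so this fusion believes vision.
print(fuse_obstacle_reports(True, 0.9, False, 0.6))  # True
```

With a low-confidence vision hit and a confident radar “all clear” (say 0.3 vs. 0.9), the same rule sides with radar, which is exactly the kind of judgment call the commenter is pointing at.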
Not mentioning the downsides of the other sensors always makes me suspicious of an article.
The key point of going vision-only is this: it’s what humans do every day. Articles that leave that out also disappoint me.
Aren’t vision cameras the only sensors we have that can recognize lane markings? This article is bunk, making it seem like that’s not industry standard. RADAR can’t see paint on the road, and my understanding is that LiDAR can’t either, at least not well enough for real-time lane detection at highway speeds.
I thought they canceled a contract for an outsourced system