OK, it’s just a deer, but the future is clear. These things are going to start killing people left and right.
How many kids is Elon going to kill before we shut him down? What’s the number of children we’re going to allow Elon to murder every year?
Is there video that actually shows it “keeps going”? The way that video loops, I know I can’t tell what happens immediately after.
For the 1000th time, Tesla: don’t call it “autopilot” when it’s nothing more than a cruise control that needs constant attention.
It is an autopilot (a poor one, but still one) that legally calls itself cruise control so Tesla doesn’t have to take responsibility when it inevitably breaks the law.
the deer is not blameless. those bastards will race you to try and cross in front of you.
Finally someone else familiar with the most deadly animal in North America.
yeah well I’ve hit about $15k worth of them over the years
Driving is full of edge cases. Humans are also bad drivers who get edge cases wrong all the time.
The real question isn’t whether Tesla is better or worse in any one situation in particular, but how Tesla compares overall. If a Tesla is better in some situations and worse in others, and so overall just as bad as a human, I can accept it. If a Tesla is overall worse, then they shouldn’t be driving at all (if they can identify those situations, they can stop and make a human take over). If a Tesla is overall better, then I’ll accept a few edge cases where they are worse.
Tesla claims overall they are better, but they may not be telling the truth. One would think regulators have data for the above - but they are not talking about it.
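To put a shape on that “overall” question: it’s a per-mile rate comparison, not a tally of incidents. A minimal sketch, with completely made-up numbers standing in for data that only Tesla and the regulators actually have:

```python
# Illustrative only: the crash and mileage figures below are invented placeholders,
# not real Tesla, FSD, or NHTSA statistics.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize crash counts by miles driven so fleets of different sizes compare fairly."""
    return crashes / (miles / 1_000_000)

human_rate = crashes_per_million_miles(crashes=6_000_000, miles=3_000_000_000_000)
fsd_rate = crashes_per_million_miles(crashes=2_500, miles=1_000_000_000)

print(f"human drivers (made-up): {human_rate:.1f} crashes per million miles")
print(f"FSD (made-up):           {fsd_rate:.1f} crashes per million miles")
```

Even then it isn’t automatically apples to apples: FSD miles skew toward conditions where drivers choose to enable it, so road type, weather, and visibility would need to be matched before calling either rate better.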
Humans are also bad drivers who get edge cases wrong all the time.
It would be so awesome if humans only got the edge cases wrong.
Tesla claims overall they are better, but they may not be telling the truth. One would think regulators have data for the above - but they are not talking about it.
The agency is asking if other similar FSD crashes have occurred in reduced roadway visibility conditions, and if Tesla has updated or modified the FSD system in a way that may affect it in such conditions.
Between this and being threatened with fines last year for not providing the data, it sure seems like they aren’t being very forthcoming. That makes me suspect they still aren’t telling the truth.
Between this and being threatened with fines last year for not providing the data, it sure seems like they aren’t being very forthcoming. That makes me suspect they still aren’t telling the truth.
I think their silence is very telling, just like their alleged crash test data on Cybertrucks. If your vehicles are that safe, why wouldn’t you be shoving that into every single selling point you have? Why wouldn’t that fact be plastered across every Gigafactory and blaring from every Tesla that drives past on the road? If Tesla’s FSD is that good, and Cybertrucks are that safe, why are they hiding those facts?
If the cybertruck is so safe in crashes they would be begging third parties to test it so they could smugly lord their 3rd party verified crash test data over everyone else.
But they don’t, because they know it would be a repeat of smashing the bulletproof window on stage.
It doesn’t have to not kill people to be an improvement, it just has to kill fewer people than people do
The autopilot knows deer can’t sue
What if it kills the deer out of season?
Tesla’s approach to automotive autonomy is a unique one: Rather than using pesky sensors, which cost money, the company has instead decided to rely only on the output from the car’s cameras. Its computers analyze every pixel, crunch through tons of data, and then apparently decide to just plow into deer and keep on trucking.
I mean, to be honest… if you are about to hit a deer on the road anyway, speed up. Higher chance the scrawny fucker will get yeeted over you after meeting your car, rather than get juuuuust perfectly booped into the air to crash through the windshield and into your face.
Official advice I heard many times. Prolly doesn’t apply if you are going slow.
Edit: Read further down. This advice is effing outdated, disregard. -_- God I am happy I’ve never had to put it to the test.
I know a lot of people here are/will be mad at Musk simply for personal political disagreement, but even putting that aside, I’ve never liked the idea of self-driving cars. There’s just too much that can go wrong too easily, and in a 1-ton piece of metal and glass moving at speeds up to near 100 mph, you need to have enough control to respond within a few seconds if the unexpected happens, like a deer jumping into the middle of the road. Computers don’t, and may never, have the benefit of contextual awareness to make the right decision as often as a human would in those situations. I’m not going to cheer for the downfall of Musk or Tesla as a whole, but they do severely need to reconsider this idea, or else a lot of people will be hurt and/or killed and a lot of liability will land on them when it happens. That’s a lot of risk to take on for a smaller automaker like them, just thinking in business terms.
An FSD car that makes perfect decisions would theoretically be safer than a human driver who also makes perfect decisions, if for no other reason than the car could do it faster.
Personally, I would love to see autonomous cars see widespread use. They don’t have to be perfect, just safer mile-for-mile than human drivers. (Which means that Teslas, with Musk’s gobsmackingly stupid insistence on only using cameras, will never reach that threshold).
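To put rough numbers on the “could do it faster” point (the reaction times below are typical textbook figures I’m assuming, not measurements from this incident):

$$d_{\text{react}} = v \cdot t_{\text{react}}, \qquad 70\ \text{mph} \approx 31\ \text{m/s}$$
$$d_{\text{human}} \approx 31 \times 1.5 \approx 47\ \text{m}, \qquad d_{\text{computer}} \approx 31 \times 0.3 \approx 9\ \text{m}$$

Tens of meters of extra stopping margin is the theoretical win; whether a camera-only system actually spots the hazard in time to use that margin is exactly what this incident calls into question.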
I mean we do let humans drive cars and some of them are as dumb as bricks and some are malicious little freaks.
Not saying we are anywhere near FSD, and Elon is a clown, but I would support a future with this technology if we ever got there. The issue is we would have to go all or nothing. Like you can’t have a mix of robots and people driving around.
The problem is that with dumb drivers you can easily place the blame on the driver and make him pay for his idiocy. FSD is a lot more complicated. You can’t really blame the driver, since he wasn’t driving the car, but neither was the engineer or the company itself. We’d have to draw up entirely new frameworks in order to define and place criminal negligence, if any should exist. Is the company responsible for a malicious developer? Is the company responsible for a driver who ignores a set guideline and sits impaired behind the emergency stop? Is the driver responsible for a software fault?
All of these questions and many more need to be answered. Some probably can’t be, and must remain a so-called “act of God” with no blame to place. And people are not fond of blaming just the software; they’re out for blood when an accident happens, and software doesn’t bleed. Of course, the above questions might be the easiest to answer, but the point still stands.
Full self-driving should only be implemented when the system is good enough to completely take over all driving functions. It should only be available in vehicles without steering wheels. The Tesla solution of having “self driving” but relying on the copout of requiring constant user attention and feedback is ridiculous. Only when a system is truly capable of self-driving 100% autonomously, at a level statistically far better than a human, should any kind of self-driving be allowed on the road. Systems like Tesla’s FSD officially require you to always be ready to intervene at a moment’s notice. They know their system isn’t ready for independent use yet, so they require that manual input. But of course this encourages disengaged driving; no one actually pays attention to the road like they should, ready to intervene at a moment’s notice. Tesla’s FSD imitates true self-driving, but it pawns off the liability to drivers by requiring them to pay attention at all times. This should be illegal. Beyond mere lane-assistance technology, no self-driving tech should be allowed except in vehicles without steering wheels. If your AI can’t truly perform better than a human, it’s better for humans to be the only ones actively driving the vehicle.
This also solves the civil liability problem. Tesla’s current system has a dubious liability structure designed to pawn liability off to the driver. But if there isn’t even a steering wheel in the car, then the liability must fall entirely on the vehicle manufacturer. They are after all 100% responsible for the algorithm that controls the vehicle, and you should ultimately have legal liability for the algorithms you create. Is your company not confident enough in its self-driving tech to assume full legal liability for the actions of your vehicles? No? Then your tech isn’t good enough yet. There can be a process for car companies to subcontract out the payment of legal claims against the company. They can hire State Farm or whoever to handle insurance claims against them. But ultimately, legal liability will fall on the company.
This also avoids criminal liability. If you only allow full self-driving in vehicles without steering wheels, there is zero doubt about who is in control of the car. There isn’t a driver anymore, only passengers. Even if you’re a person sitting in the seat that would normally be the driver’s seat, it doesn’t matter. You are just a passenger legally. You can be as tired, distracted, drunk, or high as you like; you’re not getting any criminal liability for driving the vehicle. There is such a clear bright line - there is literally no steering wheel - that it is absolutely undeniable that you have zero control over the vehicle.
This actually would work under the same theory as existing drunk-driving law. People can get ticketed for drunk driving for sleeping in their cars. Even if the cops never see you driving, you can get charged with drunk driving if they find you in a position where you could drunk drive. So if you have your keys on you while sleeping drunk in a parked car, you can get charged with DD. But not having a steering wheel at all would be the equivalent of not having the keys to a vehicle - you are literally incapable of operating it. And if you are not capable of operating it, you cannot be criminally liable for any crime relating to its operation.
The poster, who pays Tesla CEO Elon Musk for a subscription to the increasingly far-right social media site, claimed that the FSD software “works awesome” and that a deer in the road is an “edge case.” One might argue that edge cases are actually very important parts of any claimed autonomy suite, given how drivers check out when they feel the car is doing the work, but this owner remains “insanely grateful” to Tesla regardless.
How are these people always such pathetic suckers?
I grew up in Maine. Deer in the road isn’t an edge case there. It’s more like a nightly occurrence.
Same in Kansas. I was in a car that hit one in the ’80s, and I see them often enough that I had to avoid one crossing a busy interstate highway last week.
Deer are the opposite of an edge case in the majority of the US.
It’s no different in Southern Ontario where I live. Saw a semi truck plow into one, it really wasn’t pretty. Another left a huge dent on my mom’s car when she hit one driving at night.
Putting these valid points aside, we’re also all just taking for granted that the software would have properly identified a human under the same circumstances… This could very easily have been a much more chilling outcome.
I’m not taking that for granted. If it can’t tell a solid object is in the road, I would guess that would be true for a human that is balled up or facing away as well.
Same, hit one just south of Lyndon at night.
I drove through rural Arkansas at sundown once. I’ve never seen so many deer in my life.
Being a run of the mill fascist (rather than those in power) is actually an incredibly submissive position, they just want strong daddies to take care of them and make the bad people go away. It takes courage to be a “snowflake liberal” by comparison
Edge cases (NOT features) are the thing that keeps them from reaching higher levels of autonomy. The differences between the levels are roughly “most circumstances”, “nearly all circumstances”, and “really all circumstances”.
Since Tesla cares so much more about features, they will remain at level 2 for a very long time.
Deer on the road is an edge case that humans cannot handle well. In general every option other than hitting the deer is overall worse - which is why most insurance companies won’t increase your rates if you hit a deer and file a claim for repairs.
The only way to not hit/kill hundreds of deer (thousands? I don’t know the number) every year is to reduce rural speed limits to unreasonably slow speeds. Deer jump out of dark places right in front of cars all the time. The only options that might avoid a hit are to swerve into the other lane (which sometimes means into an oncoming car) or into the ditch (you have no clue what might be there; if you are lucky the car just rolls, but there could be large rocks or strong fence posts and the car stops instantly). Note that this all happens fast: you can’t think, you only get to react. Drivers in rural areas are taught to hit the brakes and maintain their lane.
Drivers in rural areas are taught to hit the brakes and maintain their lane.
Which the Tesla didn’t do. It plowed full speed into the deer, which arguably made the collision much much worse than it could have been. I doubt the thing was programmed to maintain speed into a deer. The more likely alternative is that the FSD couldn’t tell there was a deer there in the first place.
Braking dips the hood, making it easier for the deer to come over it and into the windshield. You should actually speed up right before impact to make your hood come up, so the deer hopefully goes under or, better, stays in the grille.
Doesn’t this all depend on the height of your car and the condition of your shocks? Doesn’t seem like a hard and fast rule. Also, you’re assuming rear-wheel drive. FWD does not “raise the hood” like you’re playing Cruis’n USA.
Please show me that guideline, anywhere.
/Swede living in the deer countryside
Wear gloves when they hand you that guideline because they might be pulling it out of their ass.
Maybe, but it’s still the case that slowing down will impart less energy to the collision. Let up on the brake before impact if you want, but you should have been braking once you first saw the deer in the road.
Sometimes those fuckers just jump out at you at the last minute. They’re not smart. But if you click the link, this one was right in the middle of the road, with that “Deer in the headlights” look. There was plenty of time to slow down before impact.
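For what it’s worth, the energy argument is just the v-squared term. A back-of-envelope sketch (assuming a roughly 2,000 kg vehicle, which is my ballpark for a Model Y, not a figure from the article):

$$E_k = \tfrac{1}{2} m v^2$$
$$60\ \text{mph} \approx 27\ \text{m/s}: \quad E_k \approx \tfrac{1}{2}(2000)(27)^2 \approx 730\ \text{kJ}$$
$$40\ \text{mph} \approx 18\ \text{m/s}: \quad E_k \approx \tfrac{1}{2}(2000)(18)^2 \approx 320\ \text{kJ}$$

Shedding even 20 mph before impact cuts the collision energy by more than half, which is why the standard advice is to brake hard and hold your lane; any hood-angle effect from easing off the brake at the last instant is arguably a second-order tweak on top of that.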
Conditions matter, and your reaction should always be for the worst possible scenario (moose and snow). Braking removes your ability to maneuver as well, and locking the brakes up, which will almost always happen when you panic brake, would be the worst scenario. If there’s snow or rain, braking again is right out.
If it jumps out and you can’t do anything but brake, you shouldn’t do that; you grip the wheel and maintain speed, and if you can, punch the gas to raise the hood. But people panic and can’t think. So maintain speed, don’t panic, and don’t lock your brakes up.
You should know how to brake without causing maneuvering problems (including not locking up the wheels). It is a basic skill needed for many situations. Just keep slowing down; the “accelerate just before impact” move is something that can only be done in movies, and any real-world attempt will be worse. Remember, if you keep braking you lose momentum, so the acceleration needs to be perfectly timed or it makes things worse.
This sounds made up
If you think physics is made up, sure…
I don’t think hitting more gas is going to gently slide the 300 pound buck under my car. It’s just going to increase the impact force.
Sliding the deer under your car is also really bad for you. It’s going to do a lot of damage under there, such as ripping brake lines, destroying ball joints, or fragging your differentials. You need to safely shed as much speed as possible while maintaining your lane when about to hit a deer.
Considering suspension, if you accelerate there’s a lowering of the back of the car/raising of the front.
Conversely, braking has the opposite effect, increasing the chances of the deer rolling over your hood and through your windshield.
You’ll want to minimize that, hence the acceleration.
“Right before hitting” being the key phrase. If you can stop before hitting it, yes, that’s ideal, but there are situations where it jumps out and you can’t react. Braking during impact is the worst thing you can do.
If you think I’m saying to line it up and accelerate for 200 meters, I don’t know what to say about that.
Dude, the article just said to hit the brakes “if you can’t avoid hitting a deer”, the exact scenario you described… Did you even open it?
Braking during impact is the worst thing you can do.
This is not correct, where are you getting this from?
aight what’s your strategy for hitting a giraffe, then?
I don’t know, where I live giraffes are only in the zoo and thus never on the road. I’m not aware of any escaping the zoo.
I’m sure if I lived around wild giraffes, my training would include that, but since I don’t, I was able to save some time by not learning that.
Same for a moose? Speed up so you clear it before gravity caves your car roof.
You maintain speed, you can’t maneuver well if braking, and as stated your hood dips while braking too which can cause worse issues.
The whole premise of ABS brakes, which all cars made in North America since 2012 have, is specifically to allow you to maintain control when you fully apply the brakes. Unless you are a professional driver or have a car without ABS, you should just fully apply the brakes in an emergency stop. Please stop telling people that fully applying the brakes will reduce maneuverability when it won’t for the majority of drivers in the developed world.
And if someone’s vehicle doesn’t have ABS, they should know how to properly brake without locking their tires, and know when hard braking isn’t appropriate.
That’s a good strategy to ensure you die: a moose’s torso is already higher than the hood of a lot of SUVs, so you’re taking a moose to the face.
Troll comment.
You do that - you die.
No, for moose you are actually supposed to swerve and risk the ditch.
Damn right. Stomp the brakes and take it to the face.
The problem is not that the deer was hit, a human driver may have done so as well. The actual issue is that the car didn’t do anything to avoid hitting it. It didn’t even register that the deer was there and, what’s even worse, that there was an accident. It just continued on as if nothing happened.
Yeah, the automated system should be better than a human. That is the whole point of collision detection systems!
Deer on the road is an edge case that humans cannot handle well.
If I’m driving at dawn or dusk, when they’re moving around in low light, I’m extra careful. I’m scanning the treeline, the sides of the road, the median, etc., because I know there’s a decent chance I’ll see them, and I can slow down in case they make a run across the road. So far I’ve seen several hundred deer and I haven’t hit any of them.
Tesla makes absolutely no provision in this regard.
This whole FSD thing is a massive failure of oversight. No car should be doing self-driving without using cameras and radar, and Tesla should be forced to refund the ~~suckers~~ customers who paid for this feature.
Sure, I do that too. I also have had damage because a deer I didn’t see jumped out of the trees onto the road. (Though as others pointed out, in this case the deer was on the road with plenty of time to stop, or at least greatly slow down, but the Tesla did nothing.)
Deer jump out of dark places
that one was just standing there, yo
If Tesla also used radar or other sensing systems instead of limiting themselves to only cameras, then being in the dark wouldn’t be an issue.
In general every option other than hitting the deer is overall worse
You’re wrong. The clear solution here is to open suicide-prevention clinics for the depressed deer.
Sunk cost? Tech worship?
I’m so jaded, I question my wife when she says the sun will rise tomorrow so I really don’t get it either.
I roll my eyes at the dishonest, bad-faith takes people have in the comments about how people do the same thing behind the wheel, like that’s going to make autopiloting self-driving cars an exception. At least a person can react, can slow down, or do anything that an unthinking, going-by-the-pixels computer can’t do at a whim.
How come human drivers have more fatalities and injuries per mile driven?
Musk can die in a fire, but self-driving car tech seems to be vastly safer than human drivers when you do apples-to-apples comparisons. It’s like wearing a seatbelt: you certainly don’t need to have one to go from point A to point B, but you’re definitely safer with it, even if you are giving up a little control. Like a seatbelt, you can always take it off.
I honestly think it shouldn’t be called “self driving” or “autopilot” but should work more like the safety systems in Airbuses, by simply not allowing the human to make a decision that would create a dangerous situation.
People are well known for never ever running over anything or anyone.
Only keeping the regular cameras was a genius move to hold back their full autonomy plans
The day he said that “ReGULAr CAmErAs aRe ALl YoU NeEd” was the day I lost all trust in their implementation. And I’m someone who’s completely ready to turn over all my driving to an autopilot lol
I believe we can make a self-driving car with only optical sensors that performs as well as a human someday. I don’t think today is that day, or that we shouldn’t aim for self-driving to be far better than human drivers.
You can’t understand his Iron Man levels of genius because of your below-billionaire mind
Hardware 4 models have a radar on the front as well.
It was an illegal deer immigrant, it recognised it, added it to the database on Tesla servers, and mowed it down before it took any jobs or whatever the hate-concern was.
/s
… but some actual technically human people do the same when they see an animal, don’t they?
… but some actual technically human people do the same when they see an animal, don’t they?
Not deer…
I thought the deer would be running or something, but no, it’s just straight on from the car, doesn’t move at all! How the fuck does a deer standing dead center in front of you not get caught by the camera!
That deer was pushing the woke agenda!
It was using the doe restroom.
The car was pushing the fast asleep (at the wheel) agenda.