Tesla Self Crashing Car has First Fatality

That's unlike you not to provide some commentary Dauntless!

Never fear, I'll have a go: I didn't care for the flavor of the article. How can the creators/regulators/drivers be surprised that these things will have wrecks too? In some ways the potential for large-scale catastrophe is actually greater!

So imo, as long as humans have accidents, so will their creations. Heck even nature itself seems to have 'accidents' (I'm told I'm an accident :roll: ), which leads me to wonder if we really understand things we generally consider 'bad' / 'accidental', etc.

I am curious why radar/laser guidance didn't pick up the obstruction though. Sure, visual sensors can be fooled easily, but usually not in conjunction with sound and/or directed light? Or maybe it was *just* a software glitch.
Imo, the technical failings should be the main focus of any article I'd consider worthy. Instead it was just typical 'news'.

Well, peace to the driver's family. He died doing what he loved!
 
In a statement, Tesla said it appeared the Model S was unable to recognise "the white side of the tractor trailer against a brightly lit sky" as the trailer drove across the car's path.

On Thursday, Tesla stressed that cars being controlled by Autopilot had travelled 130 million safe miles to date.

The company said in a statement: "The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."

"Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents."

I don't know whether the open space between the axles of the truck momentarily fooled the car's sensors. On European trucks, at least, this space has to be filled by a crash structure (increasingly it's completely faired off for aerodynamic reasons) to prevent cars from entering underneath.
 
"Numerous similar accidents."

I like to spare commentary when I don't really know. The only thing I'm sure of is IT'S INEVITABLE. Hence the title. The speculation was that the side of the truck matched the sky and the driver was inattentive, not noticing the brakes weren't kicking in until it was too late. Today they said the guy had 8 speeding tickets in less than 3 years.

The self-driving Google cars have had accidents; the driver inside prevented more. We've already been bombarded with how safe this is and how it's better than a regular car. Basically that's the bias that will cancel out honest, objective reporting. THAT is the only part I'd want to comment on. Reporting that leans on confirmation bias PROVES the bias, and biased reporting discredits itself.

"He died doing what he loved!" Being careless, reckless, etc. That needs to be recognized. He could have taken someone with him. That too needs to be recognized.

Wednesday afternoon I headed into the Hades of all traffic: Los Angeles. I did so with the notion that I had to directly prevent even the most minor bump into other objects. Blaming the driver to let the system off the hook PROVES it's a bad system, because he's the one who will be in the car. You don't find a lot of people like me with no tickets or accidents since god knows when. But even I might be complacent when the car drives itself.

Commentary enough for you?
 
I read this on Yahoo news homepage last night.

The foolish driver did not even do a shoulder check.

A quick one-second shoulder check when changing lanes is a basic driving skill learned when you are 14 years of age.

8 speeding tickets in less than 3 years.
The last speeding ticket I got was in high school, many decades ago.
Wasn't this guy like 60 years old or something? I can only guess from the picture I saw of him in the Yahoo article I read.
 
Good linked article, thanks :)

Aside from a potential blind-spot, it does highlight what will be a difficult job for self-driving systems: anticipation. For a human it's quite easy to assess whether a non-hazard might be about to become one based on a range of subtle cues and start slowing down or looking for an escape route in case that thing decides to pull out/across. I'm guessing it'll be a long time before the Tesla Autopilot will be able to judge the intention of another driver based on whether they've made eye-contact...
 
Reporting is that he was the 40-year-old owner of a 'technology company.' If he's a technology type, then he knew better than to trust it. I'm not sure where the shoulder check or changing lanes come in.

Based on the history of stories like this, I'm sure the 'Harry Potter' tale is bunk. People LOVE coming up with crap like that.

But what about when Hugh Grant's self driving car gets in an accident while he has a prostitute in the car taking care of him? What if she's the type who bites when startled? Poor Hugh.
 
I've been of the opinion that this technology shouldn't ever be trusted more than an attentive driver. I don't quite trust that it will always recognize pedestrians and cyclists, nor do I trust its ability to judge what functions to execute in the case that an accident is unavoidable.
 
The Toecutter said:
I've been of the opinion that this technology shouldn't ever be trusted more than an attentive driver. I don't quite trust that it will always recognize pedestrians and cyclists, nor do I trust its ability to judge what functions to execute in the case that an accident is unavoidable.

Around 105 people will die today in accidents on American roads; should we ban everybody from driving because they can't be trusted?
 
LSBW said:
should we ban everybody from driving, because they can't be trusted?

No. (Although, from this standpoint, a more compelling argument could certainly be made for banning cars than for banning guns.)

That being said, there is a big difference between a human being, who is prone to fatigue, altered states of consciousness, inattentiveness, and other variables but is well capable of making a judgement based on a sense of ethics, and a computer algorithm, which has none of those shortcomings but is utterly incapable of making an ethical judgement, and which is moreover vulnerable to tampering, hacking, and incorrect or unexpected sensory inputs.

Personally, I'd rather take my chances with idiot drivers.
 
The accident-per-trip, and also accident-per-passenger-mile are both better on the self-driving Tesla.

If someone wants to do this, I think they should have the option. If I don't want to do this, I don't think I should restrict other drivers' option to exercise this choice. The safety of the general population is not worse because of self-driving Teslas, so...why not?
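The per-mile claim can be sanity-checked with simple arithmetic. Tesla's statement cited 130 million Autopilot miles before this first fatality, and contemporary reporting cited a US average of roughly one fatality per 94 million vehicle miles; treat both figures as illustrative rather than definitive. A sketch of the comparison:

```python
# Back-of-the-envelope fatality-rate comparison.
# Both inputs are period figures from news reporting, not verified data.
autopilot_miles_per_fatality = 130e6  # Tesla's claimed Autopilot record
us_miles_per_fatality = 94e6          # approximate US fleet average

# How many times more miles per fatality Autopilot claims vs the average
ratio = autopilot_miles_per_fatality / us_miles_per_fatality
print(f"Autopilot miles-per-fatality is ~{ratio:.2f}x the US average")

# Fatalities per 100 million miles, the unit NHTSA statistics use
autopilot_rate = 100e6 / autopilot_miles_per_fatality
us_rate = 100e6 / us_miles_per_fatality
print(f"Fatalities per 100M miles: Autopilot ~{autopilot_rate:.2f}, US ~{us_rate:.2f}")
```

On these numbers the claimed advantage is closer to 40% than to an order of magnitude, and a single fatality makes the Autopilot figure statistically fragile either way.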

When I was a truck driver, my elevated position allowed me to see into other vehicles. I saw police and bus drivers texting, and not just at stop lights...Common citizen drivers? Bitch, please...
 
spinningmagnets said:
The accident-per-trip, and also accident-per-passenger-mile are both better on the self-driving Tesla.

True. In fact, the accident rate of self-driving cars is greatly reduced versus the norm, by like an order of magnitude.

If someone wants to do this, I think they should have the option. If I don't want to do this, I don't think I should restrict other drivers option to exercise this choice. The safety of the general population is not worse because of self-driving Teslas, so...why not?

I can agree with this. However, I've seen politicians bandying about proposals for making self-driving cars mandatory, and I think that such a proposal is overall a bad idea.

Personally, I'd rather be in control of the vehicle, than for the vehicle to be in control of me.

My concern arises from a scenario where the algorithm is faced with the prospect of being in an accident, no matter what it decides to do. Does it decide to slam head-on into the oncoming 18-wheeler that has veered into its lane, or instead plow over a lone cyclist at the shoulder of the road to protect the driver and their children?

I would rather a human be making this decision... simply for the fact that it's hard to hold an algorithm responsible for the consequences, and completely unfair to hold a driver responsible for the split-second decision of a computer.

When I was a truck driver, my elevated position allowed me to see into other vehicles. I saw police and bus drivers texting, and not just at stop lights...Common citizen drivers? Bitch, please...

I've seen worse than that, but your point is well made.
 
I'm of the opinion that people's estimation of their own driving and their estimation of acceptable risk to impose on others are both wildly faulty, bordering on criminal at the median.

As soon as self-driving cars can establish that they're categorically safer and more reliable than human drivers, human drivers should be phased out. Using the sensor suites of self-driving cars to detect and report moving violations by human drivers, who could then be fined for every violation detected, would greatly expedite the transition (while generating a great deal of public revenue from those who most deserve to pay).
 
spinningmagnets said:
The accident-per-trip, and also accident-per-passenger-mile are both better on the self-driving Tesla.

If someone wants to do this, I think they should have the option. If I don't want to do this, I don't think I should restrict other drivers option to exercise this choice. The safety of the general population is not worse because of self-driving Teslas, so...why not?

Since there is no driving around without accidents, you could just count on something like this happening. And 'luckily' the guy choosing to use the self-drive option is the one that got killed. Because, mark my words, at a certain point in time a self-driving car is going to kill an innocent (i.e. not choosing to use the self-drive option) cyclist, pedestrian or other car driver.

And I am afraid of what will happen then... Mr Musk is going to scream that there should be regulation aimed at making life easier for self-driving vehicles. You as a cyclist will be forced by law to wear a yellow jersey with radar or infrared or whatever reflective material, just so that a self-driving vehicle will spot you and be able to deal with you. The blame in cases where a self-driving vehicle kills someone outside of the vehicle will shift to the victim...
 
From a historical point of view, people who are against self-driving cars should look back 50 years.
Cars then had no seatbelts, period.
Fast-forward 50 years: seatbelts are mandatory, as are ABS and 10 airbags.
As soon as there is enough data to prove that self-driving cars are 100 times better at not killing humans, they will be mandatory.
A couple of generations later, our grandkids won't even know how to drive cars.
 
Self driving cars will continue to improve in sensory input and processing speed.

Humans will not improve at the rate self driving cars will evolve, and self driving cars are already in a comparable range.
 
liveforphysics said:
Self driving cars will continue to improve in sensory input and processing speed.

Humans will not improve at the rate self driving cars will evolve, and self driving cars are already in a comparable range.

isn't this akin to stating that in the future artificial intelligence will be smarter than human beings?

i can imagine this will happen, but would you want to live in a world like that?
 
Lebowski said:
liveforphysics said:
Self driving cars will continue to improve in sensory input and processing speed.

Humans will not improve at the rate self driving cars will evolve, and self driving cars are already in a comparable range.

isn't this akin to stating that in the future artificial intelligence will be smarter than human beings ?

i can imagine this will happen, but would you want to live in a world like that ?


Most car wrecks I've been involved in were due to me falling asleep while driving. I welcome cars that don't fall asleep.
 
liveforphysics said:
Lebowski said:
liveforphysics said:
Self driving cars will continue to improve in sensory input and processing speed.

Humans will not improve at the rate self driving cars will evolve, and self driving cars are already in a comparable range.

isn't this akin to stating that in the future artificial intelligence will be smarter than human beings ?

i can imagine this will happen, but would you want to live in a world like that ?


Most car wrecks I've been involved in were due to me falling asleep while driving. I welcome cars that don't fall asleep.

never been in a car wreck... been in motorcycle crashes a few times though... but always been lucky.

talking about self-driving cars, I would love to have a self-driving car in which i can fall asleep (self-driving bed?) Friday evening and wake up 1000 km away in Barcelona (for a nice weekend) the next morning.
 
Chalo said:
I'm of the opinion that people's estimation of their own driving and their estimation of acceptable risk to impose on others are both wildly faulty, bordering on criminal at the median.

Agreed.

As soon as self-driving cars can establish that they're categorically safer and more reliable than human drivers, human drivers should be phased out. Using the sensor suites of self-driving cars to detect and report moving violations by human drivers, who could then be fined for every violation detected, would greatly expedite the transition (while generating a great deal of public revenue from those who most deserve to pay).

Such a scheme would place special interests even more in control of transportation than they are today. It's already into the realm of the dystopian that police and other entities constantly scan passing cars with ALPRs, traffic cameras, toll-booth cameras, and other screening in order to track, monitor, control, and collect money from people who would mostly rather go about their business from point A to point B unimpeded.

I've seen many types of vehicle operators. IMO, the safest vehicle operators are the most attentive, conscientious, and engaged ones. Not the slowest. Not the sober. Not the "responsible" that got licensed and insured and have all of their paperwork in order. Not even the most skilled.

All but two of the twenty-plus near-misses I've had were from other drivers not paying attention to the task of driving. The two that weren't were my own fault for not paying attention, but all of the others I avoided a collision precisely because I was paying attention.

One example was driving my grandmother's Mustang through East St. Louis. I was doing roughly the speed limit, going through an intersection on a green light, and checking my mirrors every few seconds, so I knew no car was behind or beside me. All of a sudden, an old Mercedes that had a red light on my right came past the crosswalk at 15+ mph at a 90-degree angle. I floored it and swerved into the opposing lanes, very narrowly avoiding being broadsided by this old Benz, which continued through the intersection on my green light without ever attempting to brake, and both of us narrowly missed two children crossing the crosswalk on my left. I saw them and was aware of them when I made the move I did, but I doubt the red-light runner was, and I know I broke a few traffic laws in the process. Had I been hit from the right side, I could have been thrown their way. When I looked at the speedometer after getting back into my lane, I was doing about 8 mph over the speed limit, on top of having just veered into a lane with oncoming traffic a hundred feet away moving at 30+ mph towards their green light.

I've had lots of such incidents from idiot drivers not paying attention.

Given that I once received a red light camera ticket for a violation that I didn't commit(I never paid it or showed up to court because of its dubious legal status in Missouri, a fine of which later turned out in court to be unenforceable on any of the recipients), I would question the ability of an automated system to understand when I broke a few laws for the purpose of safety.

During my bouts of reckless driving in the country for fun at triple-digit speeds (using my own cars, of course, and being well aware of the rules being broken and the risks involved), my number of narrow misses or accidents has thus far been zero. I've spent well over one hundred hours of my life engaged in this sort of driving. Most of my driving has been of the mundane A-to-B type, though, where safety is the primary concern and I don't normally drive like a jackass; but when I do, I pay very close attention to everything I do. When I drive recklessly, I do so in a manner that maximizes the possible safety of doing so, with the inherent danger in mind and the goal of not wrecking second only to speed. I do it when there are no other drivers or pedestrians around, where the risk to anyone but myself is bordering on zero.

I don't have a tendency to get into wrecks, don't have a single paid ticket to my name, never had any points on the license I used to hold, and when I had it, I had extremely low insurance rates as a result of avoiding wrecks. I've traveled well over 100,000 miles operating my own cars thus far, with only 2 narrow misses that were my own fault.

I have ridden with or seen on the roads lots of "bad" vehicle operators, and I would consider myself in the "not bad" category considering my experiences when either an operator or passenger of an automobile. I've been in cars with operators who were following the rules to the best of their ability, but were not at all paying attention to the road itself, and I have witnessed plenty of narrow misses that they caused, in which the other driver avoided it. I was once in a car where the driver was so concerned about maintaining a speed limit and watching the speedometer through a well-known speed trap that she almost rear-ended a motorcycle stopped at a stop sign in front of her. Police were known for getting people doing 1 mph over, even though the flow of traffic was commonly 5+ mph over in the slow lane and it was a downhill section.

The most dangerous thing about me operating a motor vehicle has been other vehicle operators, usually ones who didn't have their lights on, were talking or texting on a cell phone while veering into my lane, were distracted in some form, or were even sleeping. A self-driving car is a solution that allows these bad habits to continue, but at great expense to those of us whose default mode of operating a vehicle is to be concerned with avoiding a collision of any sort, rules only secondary to that.

I'm far from the "best" operator out there, but I have managed quite well.

A self-driving car should still have an operator that is attentive and awake for the purpose of safety, and in order for that to be viable, a person would still have to know how to operate a car, a skill which requires practice, something one will not acquire being shuttled around by a driver-less automobile. Banning the operation of a car on the roads without having this technology in use makes this additional safety factor impossible. The computer can't be 100% trusted to know better what to do than a human operator 100% of the time, precisely because it was created by the same flawed human operators in the first place. If the technology is to be used, IMO the vehicle operator should still be required to treat the task of vehicle operation seriously, and making the technology mandatory could actually serve to work to the detriment of that goal.

Also, the more rules you have to concern yourself with, the less attention you can pay to what is on the road in front of you, lest you get a fine backed by threat of force. Rigorous enforcement of every minor offense could drive operators to pay attention to the rules to an extreme degree, even at the expense of safe vehicle operation.

Of course, Joe Sixpack with their i-Crap in hand, radio on loud, eating during their morning commute, is probably the largest risk factor with regard to automobile accidents. Inattentive drivers frighten me far more than intoxicated or reckless drivers. All of these risk factors have the potential to be dramatically reduced without the need for self-driving tech. Self-driving tech encourages this behavior on the part of the operator not in the name of safety, but in the name of convenience.

Driverless cars as mandatory would insert a computer algorithm, additional special interests, and more expense and bureaucracy between you and personal transportation. It would also remove the autonomy that came with operating a vehicle, that used to exist in history for a significant period. The inconvenient fact about the self driving computer is that you do not control it. It also controls your transportation, and therefore within some facets of your existence, it controls you. It chooses for you because someone somewhere agreed to the notion that it always knows best and happened to have a say in the rules established and backed by the threat of force.

Massive reductions in accidents could be made if we had a cultural change in how we behaved behind the wheel. What happens if the norm becomes for drivers to pay attention to the task of driving? Where do accident rates go from there? Some cultures (such as Germany's) are much more rigorous about safety than the U.S., as a result of their cultural mores and not just the way their rules are set up. I don't necessarily agree with all of their rules, but in many respects their rules are more about competence and safety than those in the U.S., which are largely used for wealth extraction from motorists more than anything, and drivers in Germany are seldom seen texting while driving compared to the U.S. Part of the reason for this cultural difference is a greater understanding of what can go wrong when operating a vehicle, instilled as a condition of having permission to do so. Really it should just be plain common sense, but currently in the U.S. it isn't.

As proof of our "rules" being exploited to extract revenue today, often to the detriment of safety, one example is the red light cameras established in the metropolitan area of St. Louis, MO. The number of rear-end collisions at intersections went up as a result of the traffic cameras being programmed to reduce the duration of their "yellow" light time sequence by 2 seconds, in order to increase the number of violations and therefore tickets generated. American Traffic Solutions, the owner of the cameras, received the majority of the generated revenue from the tickets, and paid a few bureaucrats money to install them and change the yellow light timing. There is a cause and effect relationship there, along with special interests with their hand in the cookie jar. Some of these cameras have been removed since, and the tickets have been deemed by the courts as unenforceable.

Driver's licenses are handed out like candy here as well, but only after a series of bureaucrats give the person permission following unnecessary and arduous paperwork, together with a complete and irreversible degradation of, and lack of respect for, the privacy and autonomy of the person to whom the license is issued, all the result of rules which have no real bearing on a person's safety or competence as a vehicle operator. Judging by the sheer number of fatalities that currently occur, I don't think safety is the overriding concern in the minds of the rule makers and enforcers; yet great expense is incurred by the taxpaying public, and great fortunes are made by the special interests that helped write the rules.

Self-driving technology has the potential to exacerbate all of these issues if made mandatory, or to reduce their severity if used and implemented responsibly. I for one don't want to live in a society where I am barred from control over my personal transportation; I don't even agree with or like all of the rules that currently exist, and I am certainly not alone in those notions.

https://www.youtube.com/watch?v=uukZgfHZIoc
 