Tesla Autopilot issues

TheBeastie

1 MW
Joined
Jul 27, 2012
Messages
2,095
Location
Melbourne Australia
I thought I would create a dedicated topic on this because it's probably going to keep hitting headlines long-term.

Personally I just couldn't believe it when Tesla came out with Autopilot without the lidar technology that every other carmaker says is crucial.
To me it's insanely obvious that Elon dismissed lidar in his self-driving cars because it doesn't make the car look good, it's expensive, and he needs everything he can to boost his stock price.
To me it was obvious he doesn't care if some people die, and I can't help but believe he knew it would be a miracle if no one died due to his technology specifically.
When discussing these issues with other people, I found myself being bitten on the neck, harder than a hungry zombie could manage, for supposedly being completely wrong.
I realized that if you've got stock in Tesla or just like his stuff, the bias is something that is not worth arguing against; it's the tribal part of the brain https://youtu.be/S74C-XF9kYY .

I was thinking about this report where the son of a Tesla owner was killed while the car was reportedly in Autopilot, and like a lot of Tesla incidents the data that was supposed to record the incident was lost.
https://www.bloomberg.com/news/articles/2016-09-15/tesla-investigates-potential-autopilot-link-in-fatal-china-crash

Typically, Bloomberg has turned more clickbaity to the left over the last year. Instead of being a boring business-news site, it has a constant stream of articles about how renewable energy is going to replace everything and so on, which is what people love to hear, but what news outlet doesn't do this these days?

Thought the article below was an interesting glimpse of what goes on under the surface in the industry.
Uber’s Former Self-Driving Chief Urged Kalanick to Criticize Musk
The former engineer at the center of Uber’s self-driving car legal troubles urged ex-Chief Executive Officer Travis Kalanick to criticize Tesla Inc.’s Elon Musk and several of his claims about autonomous vehicles.

Anthony Levandowski, whom Uber fired in May, sent a text to Kalanick in September that criticized Musk for saying Tesla was unlikely to use lidar sensors for its cars.

https://www.bloomberg.com/news/articles/2017-08-15/uber-texts-show-former-self-driving-chief-urged-calling-out-musk?utm_content=tv&utm_campaign=socialflow-organic&utm_source=twitter&utm_medium=social&cmpid%3D=socialflow-twitter-tv
 
What benefit does LIDAR offer over the systems already in a Tesla?

I have driven one, and the ability of its sensors to pick up vehicles outside the line of sight was impressive. I think Musk's thinking is possibly that adding another method of detecting objects is unnecessary, since the car already has good abilities in that space.

Where all of these technologies need to improve is the ability to interpret the data (regardless of the source of the data).

While there have been deaths in Teslas, the fact remains that statistically you are much less likely to be in a major accident while you have Autopilot engaged. But I am only referring to the driver-assistance mode, not the fully autonomous capability, for which we don't yet have enough experience or data.
 
I use LIDAR for laser-pulse "time of flight" ground distance measurement. Even a small amount of fog makes it return garbage data.
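For anyone unfamiliar with how time-of-flight ranging works, the core of it is just the pulse's round-trip time and the speed of light. This is a minimal illustrative sketch (my own numbers, not any vendor's API):

```python
# Minimal sketch of lidar time-of-flight ranging.
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_seconds: float) -> float:
    """Distance to target from a laser pulse's round-trip time.

    The pulse travels out and back, so divide by two. Fog ruins this
    because scattered light produces early (false) returns, which is
    exactly the garbage-data problem described above.
    """
    return C * round_trip_seconds / 2.0

# A pulse returning after ~66.7 ns corresponds to roughly 10 m:
print(tof_distance_m(66.7e-9))  # ≈ 10.0 m
```

The nanosecond timescales involved are why lidar units need very fast timing electronics, and why any scattering medium (fog, rain, dust) corrupts the measurement so easily.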

For vehicles that will be entering and exiting fog banks without warning, I'm personally very grateful they went with radar.
 
Lidar is an effective addition to radar sensor arrays in dense urban environments, where autonomous vehicles have many more environmental inputs to process. For scenarios involving autonomous assistance on highways or interstates, much less information is required to maintain position.

galderdi said:
Where all of these technologies need to improve is the ability to interpret the data (regardless of the source of the data).

In addition to requiring far more powerful processing devices, the communication networks involved (CAN, Ethernet, etc.) are all carrying huge amounts of data at 2-3 times the rate of a typical CAN bus. It is one of the technology "races within the race" to full autonomy.
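To put some rough numbers on the bandwidth point, here is a back-of-the-envelope calculation for a classical high-speed CAN bus. The frame-size figure is an approximation (an 11-bit-ID frame with 8 data bytes is roughly 130 bits once stuffing and interframe space are included), not an exact spec value:

```python
# Back-of-the-envelope classical CAN throughput (assumed figures).
BIT_RATE = 500_000   # typical high-speed CAN, bits/s
FRAME_BITS = 130     # ~11-bit-ID frame, 8 data bytes, incl. stuffing/interframe (approx.)
PAYLOAD_BITS = 8 * 8 # 8 data bytes of actual payload per frame

frames_per_s = BIT_RATE / FRAME_BITS
payload_kbits_per_s = frames_per_s * PAYLOAD_BITS / 1000

print(f"~{frames_per_s:.0f} frames/s, ~{payload_kbits_per_s:.0f} kbit/s of payload")
```

Even before overhead, half a megabit per second is nowhere near enough for raw camera, radar, or lidar streams, which is why automotive Ethernet keeps coming up in the "race within the race".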

Len
 
The optical cameras Tesla cars use can't see through fog either; it's one of those Chewbacca defenses that makes me question people's ability to think for themselves. I've been trying to work out where people get these ideas from, and I can only assume it's mostly Elon Chewbaccaing folks. https://www.youtube.com/watch?v=clKi92j6eLE

It's also known that objects like small aluminum soda cans can show up on Tesla's radar as big as garbage bins when they're on the road, and that its radar can't see wood or plastics and even struggles with people.

I think if just some of this technology could be used to stop vehicles going above the speed limit, for example, it would do a lot to reduce deaths on the roads.
It's crazy that cars can typically go three times faster than the maximum speed limit on most roads. If just a tiny bit of modern technology were used to enforce basics like speed, it would go a very long way for road safety. But folks seem to miss these simple facts in favour of more easily absorbed ideas from Elon, such as self-driving cars having to take over the roads completely instead, which will work out very well for the stock price of these companies.
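The speed-governor idea above is simple enough to sketch in a few lines. Everything here is made up for illustration (the function name, the tolerance value, and the assumption that the car knows the posted limit from a map or sign recognition):

```python
# Toy sketch of a speed governor: clamp the driver's requested speed
# to the posted limit plus a small tolerance. Illustrative only.

def governed_speed(requested_kph: float, posted_limit_kph: float,
                   tolerance_kph: float = 2.0) -> float:
    """Return the speed the controller is allowed to target."""
    return min(requested_kph, posted_limit_kph + tolerance_kph)

print(governed_speed(140.0, 100.0))  # capped to 102.0
print(governed_speed(90.0, 100.0))   # unchanged: 90.0
```

The hard engineering problem isn't this clamp, of course; it's reliably knowing the current speed limit, which is exactly the kind of mapping/vision problem the rest of this thread is arguing about.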

Unfortunately, truck driving is the single biggest employer in the USA, and the community in general seems to miss that fact. Uber etc. has proven people love these jobs, while at the same time Uber is desperately trying to automate driving to push up its market capitalization for its investors.
 
TheBeastie said:
The optical cameras Tesla cars use can't see through fog either; it's one of those Chewbacca defenses that makes me question people's ability to think for themselves. I've been trying to work out where people get these ideas from, and I can only assume it's mostly Elon Chewbaccaing folks. https://www.youtube.com/watch?v=clKi92j6eLE

It's also known that objects like small aluminum soda cans can show up on Tesla's radar as big as garbage bins when they're on the road, and that its radar can't see wood or plastics and even struggles with people.

I think if just some of this technology could be used to stop vehicles going above the speed limit, for example, it would do a lot to reduce deaths on the roads.
It's crazy that cars can typically go three times faster than the maximum speed limit on most roads. If just a tiny bit of modern technology were used to enforce basics like speed, it would go a very long way for road safety. But folks seem to miss these simple facts in favour of more easily absorbed ideas from Elon, such as self-driving cars having to take over the roads completely instead, which will work out very well for the stock price of these companies.

Unfortunately, truck driving is the single biggest employer in the USA, and the community in general seems to miss that fact. Uber etc. has proven people love these jobs, while at the same time Uber is desperately trying to automate driving to push up its market capitalization for its investors.

I accept much of what you say here.

I think there are two different objectives regarding sensors.

One is the cruise control and safety augmentation end of the spectrum, where the sensors add capability to the human driver. I am all for this objective, as it will save lives, and in this context I think the Tesla system is fine: it generally improves the time to react to most common obstacles, and when it isn't able to handle a situation the driver should react.

The other objective is driverless automated driving. I am dead against this for the moment, and it's not about the sensors for me. Just for a moment, let's assume we live in the future and we have sensors that can detect 100% of objects 100% of the time with 100% accuracy. I have been around computers and programming long enough to know that corners are always cut. Sure, we will program in all the obvious scenarios. But there is an infinite number of less predictable scenarios that would confuse the program logic and possibly result in undesirable outcomes. Even with AI and machine learning, the reactions can only be taught based on scenarios fed into the system. So what happens when it is faced with a freak scenario?

I know the time will come quite quickly when the safety of a totally driverless system will exceed the safety of a human driver. But I think it will take some other assurance (in addition to the stats) before I would be comfortable without a human involved. What happens if the system fails? No system is immune to an actual failure. Do we have totally redundant backup systems? I don't think so. It brings new meaning to the term "Blue Screen of Death".
 