Fox T-Bird/Cougar Forums


Title: on this day 20june2017
Post by: jcassity on June 21, 2017, 01:31:54 PM
What I said years ago on this board: phase one of many to come has passed.

The car crash could have been prevented if the car had had the latest firmware and software upgrades to drive itself.

I am watching closely to see how quickly this news from yesterday, the very first documented case, gets buried.

Soon you will not be able to drive until your car is issued its mandatory MAC and IP address (retrofitted with the equipment if needed) so you can be closely monitored, and your particular car may become illegal to drive because it has too much metal in it.

"Tesla has long insisted that drivers must keep their hands at the wheel, ready to take over at any time. The company, which declined to comment, has previously defended its system. But software upgrades since the accident would likely have prevented it, CEO Elon Musk has said."

https://www.usatoday.com/story/money/cars/2017/06/20/tesla-self-driving-car-crash/411516001/
Title: on this day 20june2017
Post by: Thunder Chicken on June 21, 2017, 04:44:12 PM
You've misunderstood Musk's statement. The updates he refers to were to require the driver to take the wheel after the car has driven a certain distance, then refuse to go back into "autopilot" until the car is stopped and put in park. Had the car insisted a little more aggressively that the driver take the wheel, instead of him only taking it for something like 25 seconds on a 35-minute trip, the accident "might" have been avoided. The car warned the driver seven times to take the wheel, and he ignored the warnings. The new software makes it impossible to ignore the warnings, because Autopilot will simply shut down if the driver ignores them, and will not be usable again until the car is stopped and put in park. The accident did not happen because of a flaw in Autopilot; it happened because the driver was using Autopilot in a way that was expressly prohibited (and, with the new software, not even possible).

Although, come to think of it, the accident didn't happen because of autopilot at all, it happened because a highway tractor turned left in front of oncoming traffic. The trucker was at fault. People seem to forget about that in their rush to shiznit on Tesla.

That being said, I wouldn't be too worried about autonomous vehicles taking over. Driver aids will become more sophisticated, but I don't think you'll see fully autonomous cars go mainstream any time soon. Two reasons: first, it's a huge leap of faith to get into a car with no steering wheel, one I don't think many people will be willing to take, and second, automakers would be signing their own death warrants. There's too much profit in individualization and options at stake. Who would buy a V8 Mustang over a 4-cyl when it drives itself anyway? Who would buy a Mustang at all when a basic autonomous blob is cheaper and drives itself anyway?

There are also ethical and legal questions that have to be answered. If an autonomous car is driving down the road and a child runs out in front of it, does the car steer itself into the ditch, potentially injuring or killing its occupants, or does it hit the child? If the child is chasing a dog, does the car hit the dog or the kid? Or does it go into oncoming traffic to avoid both, and cause a head-on?

And if nobody is driving, who is responsible in the case of an accident? Could you be held liable even though you can't control the car (no steering wheel), or would the automaker be liable? And would insurance even be necessary if individuals can't be liable? And if the owner IS held liable, who is going to buy a car that might crash and result in a lawsuit through no fault of their own?

Then there are the insurance companies themselves. They make a LOT of money insuring cars against human error. Insurance companies claim they want safer vehicles, but they don't want foolproof ones, because then they would be obsolete. They're not going to give up that revenue stream willingly.
Title: on this day 20june2017
Post by: Haystack on June 22, 2017, 02:38:17 AM
Really good post.

Autopilot may become the next big thing, but you can still legally drive an old '60s car with no seat belts, or our cars without ABS and airbags. My biggest fear with autonomous cars was what happens when a plastic bag or a bug gets smashed over a sensor. They now have so many sensors, and of different types, that it's pretty much impossible for that scenario to come to fruition.

Then again, I said the same thing about electric steering and brakes with no physical connection to the driver, and look where modern cars are going now.
Title: on this day 20june2017
Post by: jcassity on June 24, 2017, 09:00:52 AM
TC,

"But software upgrades since the accident would likely have prevented it, CEO Elon Musk has said"

You're saying I am misunderstanding what Tesla said.
Perhaps you happen to know what he was thinking when he said what he said, and in your opinion that becomes the facts instead of the facts of the quoted/printed word.

I didn't quote the whole article or explain it, because the end result is the same as it pertains to my claim, no matter who spoke the words.
The truck caused the problem, but the revealing part that you fail to mention is the 7 seconds that elapsed between when the car's software noticed the truck and the moment of impact. That's 7 seconds at 74 mph. You make it sound like the truck was half a second ahead of the Tesla, or something of a closer call than it actually was. 7 seconds is a very long time and offers a lot of distance.

A human can be 2 1/2 or more car lengths behind another car and be safely oriented without being accused of tailgating.
You might as well admit that 7 seconds of distance may well have been a whole football field of time to react...
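
To put rough numbers on it (a back-of-the-envelope check, assuming a steady 74 mph for the full 7 seconds):

74 mph = 74 x 5280 ft / 3600 s ≈ 108.5 ft/s
108.5 ft/s x 7 s ≈ 760 ft

That works out to roughly two and a half football fields of travel before impact.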

7 seconds is a whole bunch of distance to react at 74 mph. Yes, the wreck happened, yet internally Tesla also knew that the car was being released onto public roadways with this known issue in the software. So, is the truck still at fault?
I will answer for you: yes.
Would this have been a collision if we trusted his software? Yes, because it happened.
Would this have been a collision if software was not involved? Not a 100% no, but I am going to say NO with 95% assurance.
 
If you say Chevy released cars to the public knowing about a particular flaw, and it is a gas engine, it would be the fault of Chevy.
Because it's Tesla, and because of other unrelated things about Tesla that people endorse, you say any wrongdoing is not on his back; it's on the back of anyone who says a single peep about it.

This is a very common theme down here in the Lower 48.

I know you say that Tesla did not say software updates could have prevented the wreck, and that instead I am the problem and am bashing Tesla.