S-Class (W222) 2014-2020

Why One Should Not Buy A Tesla Model S

Old 07-03-2016, 05:09 PM
  #276  
Junior Member
 
terrain's Avatar
 
Join Date: Sep 2009
Posts: 36
Likes: 0
Received 0 Likes on 0 Posts
Swell -
Originally Posted by syswei
They do have steering wheel sensors and will seemingly randomly remind users to hold the wheel if they aren't doing so. But whether that info gets recorded I have no idea.
During a demo at the Tesla factory (Fremont) I recall being told that there is a black box that records everything that happens with the car and can be used, just like a black box in an airplane, post-accident. I'd be surprised if investigators are not able to piece this together 1 for 1.
Old 07-03-2016, 05:14 PM
  #277  
MBWorld Fanatic!

 
El Cid's Avatar
 
Join Date: Nov 2009
Location: Southeastern USA
Posts: 2,572
Received 143 Likes on 102 Posts
2010 E350 Luxury Sedan, Engine 272 (V6)
More technology

Originally Posted by terrain
During a demo at the Tesla factory (Fremont) I recall being told that there is a black box that records everything that happens with the car and can be used, just like a black box in an airplane, post-accident. I'd be surprised if investigators are not able to piece this together 1 for 1.
Don't most cars have some kind of "black box" now? Regardless, it only records what it was programmed to record, so it will never have the complete story.
Most importantly, the Tesla technology failed and someone died. What if it had been a school bus or a minivan full of kids that he crashed into?
How fast was he going at the time of the crash versus the speed limit?
I recall a case where a Tesla was recorded vastly exceeding the speed limit in a mountainous area because Autopilot had been programmed by Tesla to drive at the handling extremes, not the posted speed limit.
Old 07-03-2016, 09:43 PM
  #278  
MBWorld Fanatic!
 
syswei's Avatar
 
Join Date: Dec 2007
Location: FL & CT
Posts: 2,755
Received 796 Likes on 752 Posts
2015 S550 Palladium/Deep Sea Blue, 2016 Tesla Model S 70D, 2015 Volvo XC70
Originally Posted by terrain
During a demo at the Tesla factory (Fremont) I recall being told that there is a black box that records everything that happens with the car and can be used, just like a black box in an airplane, post-accident. I'd be surprised if investigators are not able to piece this together 1 for 1.
Understood, I know that data was pulled from the black box for the summon incident with the truck...I'm just saying that I don't know if the steering wheel "touch" would be recorded...maybe it is. But if it is and the guy wasn't holding the wheel at the time, you'd think Tesla would have put that in their press release...the accident happened in May.
Old 07-06-2016, 03:14 PM
  #279  
MBWorld Fanatic!
Thread Starter
 
MTrauman's Avatar
 
Join Date: Feb 2010
Posts: 1,435
Received 313 Likes on 214 Posts
‘19 AMG S63
About an hour ago the online edition of the Wall Street Journal ran the headline "Tesla's Autopilot Vexes Some Drivers, Even Its Fans--Self-Driving System is Flawed and Can Lull Owners Into Danger, Some Claim; Automaker Says AutoPilot Works as It Should"




This headline alone tells the story about Tesla and its dysfunctional culture. In the world of autos, the AutoPilot system is, at the very least, defective from a warning standpoint. In law school I learned that this is the type of case automakers almost always lose, because the warnings from the company are insufficient to keep people safe.


Musk is so bold that he tweeted after the accident that it was immaterial. The company culture, and the fact that Tesla appears to want to advance "their cause" at the expense of human beings, is another reason not to buy a Tesla, on top of its dysfunctional corporate governance.
The following users liked this post:
El Cid (07-06-2016)
Old 07-06-2016, 04:26 PM
  #280  
MBWorld Fanatic!

 
El Cid's Avatar
 
Join Date: Nov 2009
Location: Southeastern USA
Posts: 2,572
Received 143 Likes on 102 Posts
2010 E350 Luxury Sedan, Engine 272 (V6)
Will be interesting to see how this plays out with NHTSA, civil courts, safety groups, etc.
Unsafe At Any Speed ring a bell?
Old 07-06-2016, 11:37 PM
  #281  
MBWorld Fanatic!
 
C280 Sport's Avatar
 
Join Date: Jan 2009
Location: Saratoga Springs, New York & Sarasota, Florida.
Posts: 3,462
Received 407 Likes on 336 Posts
MB’s
Originally Posted by TheTeslaDude
Oil-leaking, stinky-gasoline-burning, noise-polluting, maintenance-heavy, 20th-century tech vehicles. You may keep them. 14 more years for MB EVs. Easy choice for me.
This comment is mind-blowing in so many ways. You Tesla fanboys just have way too much time on your hands. You live in a world where you want everything to be your way and everyone to drive (excuse me, ride) in a car that drives itself. You are not a car person at all. You are a tech guy who wants automotive equality.
Old 07-06-2016, 11:41 PM
  #282  
MBWorld Fanatic!
 
C280 Sport's Avatar
 
Join Date: Jan 2009
Location: Saratoga Springs, New York & Sarasota, Florida.
Posts: 3,462
Received 407 Likes on 336 Posts
MB’s
Originally Posted by WEBSRFR
By real world I mean as most sane people drive their vehicles in the real world. Any vehicle will go fast given enough time and then it becomes a game of who is stupid enough to drive triple digit speeds for prolonged periods on public roads and risk jail time if caught.

The point being when you are driving a P85D or P90D Tesla and you are driving in normal traffic and you want to pass someone, there is nothing they can do to prevent you from passing them.

We are not talking about track racing here, and I'd never race anyone on public streets into triple-digit speeds. We are talking about 1/8-mile acceleration or 0-60 type day-to-day driving scenarios. Basically, if I'm stopped at a red light, no one in an AMG or Hellcat making silly farting exhaust sounds can get in front of me if I don't want them to. Conversely, I can out-accelerate and make a safe lane change in front of them with plenty of room to spare if I decide to do so, and there is nothing they can do about it, as the Tesla goes to warp speed in an instant.

Performance is not an area where you will see a Tesla lose in day-to-day driving. I'd stick to talking about the interior, as that's where the Tesla can improve.

Hates the sound of an engine, yet owns an E550 and ML350.
Old 07-07-2016, 02:58 PM
  #283  
MBWorld Fanatic!
Thread Starter
 
MTrauman's Avatar
 
Join Date: Feb 2010
Posts: 1,435
Received 313 Likes on 214 Posts
‘19 AMG S63
Here is another article from MIT Technology Review that once again shows that Musk and Tesla appear to be a "loose cannon" and overstate accident/death stats.




MIT TECHNOLOGY REVIEW
Tesla’s Dubious Claims About Autopilot’s Safety Record

Figures from Elon Musk and Tesla Motors probably overstate the safety record of the company’s self-driving Autopilot feature compared to humans.


Tesla Motors’s statement last week disclosing the first fatal crash involving its Autopilot automated driving feature opened not with condolences but with statistics.


Autopilot’s first fatality came after the system had driven people over 130 million miles, the company said, more than the 94 million miles on average between fatalities on U.S. roads as a whole.
Soon after, Tesla’s CEO and cofounder Elon Musk threw out more figures intended to prove Autopilot’s worth in a tetchy e-mail to Fortune (first disclosed yesterday). “If anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available,” he wrote.
Tesla Motors cofounder and CEO Elon Musk.
Tesla and Musk's message is clear: the data proves Autopilot is much safer than human drivers. But experts say those comparisons are worthless, because the company is comparing apples and oranges.
“It has no meaning,” says Alain Kornhauser, a Princeton professor and director of the university’s transportation program, of Tesla’s comparison of U.S.-wide statistics with data collected from its own cars. Autopilot is designed to be used only for highway driving, and may well make that safer, but standard traffic safety statistics include a much broader range of driving conditions, he says.
Tesla’s comparisons are also undermined by the fact that its expensive, relatively large vehicles are much safer in a crash than most vehicles on the road, says Bryant Walker Smith, an assistant professor at the University of South Carolina. He describes comparisons of the rate of accidents by Autopilot with population-wide statistics as “ludicrous on their face.” Tesla did not respond to a request asking it to explain why Musk and the company compare figures from very different kinds of driving.
Google has in the past drawn similar contrasts between the track record of its self-driving cars and accident statistics for humans, says Smith. He, Kornhauser, and other researchers argue that companies working on autonomous driving technology need to drop such comparisons altogether. In April, a RAND Corporation report concluded that fatalities and injuries are so rare that it would require an automated car to drive as many as hundreds of billions of miles before its performance could be fairly compared with statistics from the much larger population of human drivers.
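To put the RAND point in numbers, here is a quick back-of-the-envelope sketch (mine, not the article's): one fatality observed in roughly 130 million miles gives an extremely wide 95% confidence interval for the true miles-per-fatality rate, computed with the standard exact (Garwood) Poisson method using only the Python standard library.

```python
import math

def poisson_ci(k, conf=0.95):
    """Exact (Garwood) confidence interval for a Poisson count k.

    Uses the chi-square relationship; for even degrees of freedom the
    chi-square CDF has a closed form that is easy to invert by bisection.
    """
    alpha = 1 - conf

    def chi2_cdf_even(x, df):
        # For even df: P(X <= x) = 1 - exp(-x/2) * sum_{i < df/2} (x/2)^i / i!
        s = sum((x / 2) ** i / math.factorial(i) for i in range(df // 2))
        return 1 - math.exp(-x / 2) * s

    def ppf(p, df, lo=0.0, hi=100.0):
        for _ in range(200):  # bisection; plenty accurate for this range
            mid = (lo + hi) / 2
            if chi2_cdf_even(mid, df) < p:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    lower = 0.0 if k == 0 else 0.5 * ppf(alpha / 2, 2 * k)
    upper = 0.5 * ppf(1 - alpha / 2, 2 * k + 2)
    return lower, upper

miles = 130e6                          # Autopilot miles at the time of the crash
lo_events, hi_events = poisson_ci(1)   # one observed fatality
# Convert the event interval into a miles-per-fatality interval
print(f"95% CI: one fatality per {miles / hi_events:,.0f} "
      f"to {miles / lo_events:,.0f} miles")
```

Running this gives an interval of roughly 23 million to 5 billion miles per fatality, which easily contains the 94-million-mile U.S. average. In other words, one event cannot statistically distinguish Autopilot driving from ordinary driving, which is exactly the RAND report's point.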
Instead researchers say that Tesla and others need to release more data on the limitations and performance of automated driving systems if self-driving cars are to become safe and understood enough for mass market use.
The tragic crash disclosed by Tesla last week, to a chorus of negative headlines, provides a case study, says Smith. The company’s Autopilot branding and dubious comparisons with figures on human safety have encouraged people to think of it as fully competent, he says. If the company had spent the months since the feature’s October release talking about its development process and how it was refining the technology to deal with difficult situations, the reaction to the crash could have been different, he says.
“The crash could have been a continuation of an established narrative about the costs and benefits, not a surprise event,” says Smith. That more cautious approach may have sold fewer cars in the short run, but helped the prospects of Tesla and others banking on self-driving technology in the long term, he says. “Companies need to start saying what safety means, how they define and measure that safety and how they will monitor it,” he says.
Autopilot and the more sophisticated systems in testing by Google and others collect huge volumes of data on conditions around them and the actions they take at all times. In California and some other states that permit testing of autonomous cars, companies must report accidents or technology failures that required a human to take over. But the data is generally sparse, and collected in the form of letters, making it hard to scrutinize. Companies like Tesla that have put novel automated driving features on the market aren’t generally obligated to report special data on their performance.
Smith suggests that by releasing more of their data trove, companies could accelerate development of self-driving cars, better prove their worth, and inform efforts to develop ways to hold them to account from a safety perspective.
Elon Musk has said that Tesla would share some of its data on Autopilot with the U.S. Department of Transportation and other manufacturers, although the company has not released details about what will be handed over or when. And with Tesla, Google, established automakers such as GM, and newer startups all working on autonomous driving technologies, competitive pressures could make meaningful coöperation seem unlikely. Kornhauser at Princeton hopes that companies such as Google and Tesla will understand that they and society have more to gain if they do work together.
“They may not want to help the competition, but we’re dealing with other people’s lives,” he says. “We should have more public spiritedness in the effort to do this thing.”
The following 2 users liked this post by MTrauman:
El Cid (07-07-2016), syswei (07-07-2016)
Old 07-07-2016, 07:34 PM
  #284  
MBWorld Fanatic!
 
syswei's Avatar
 
Join Date: Dec 2007
Location: FL & CT
Posts: 2,755
Received 796 Likes on 752 Posts
2015 S550 Palladium/Deep Sea Blue, 2016 Tesla Model S 70D, 2015 Volvo XC70
Originally Posted by MTrauman
Here is another article from MIT Technology Review that once again shows that Musk and Tesla appear to be a "loose cannon" and overstate accident/death stats.
Good article, thanks for posting. But they could have brought up another point: because Autopilot is supposed to supplement the driver, i.e., not be fully autonomous, if it has anything greater than 0% effectiveness, properly measured accident rates (which we don't have) are bound to look better than human-only accident rates, as long as humans are indeed using it to supplement their full attention to the road.
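syswei's supplement argument can be made concrete with a toy model (my illustration, with made-up effectiveness numbers): if a crash occurs only when the attentive driver would have crashed AND the assist system also fails to prevent it, then any system effectiveness above zero lowers the combined rate below the human-only rate.

```python
def combined_crash_rate(human_rate, system_effectiveness):
    """Crash rate when an assist system backs up an attentive driver.

    Toy model: the system independently prevents a fraction
    `system_effectiveness` of the crashes the human alone would have had.
    """
    assert 0.0 <= system_effectiveness <= 1.0
    return human_rate * (1.0 - system_effectiveness)

human_only = 1 / 94e6  # ~1 fatality per 94M miles (US average cited in thread)
for eff in (0.0, 0.1, 0.5):
    rate = combined_crash_rate(human_only, eff)
    print(f"effectiveness {eff:.0%}: one fatality per {1 / rate / 1e6:,.0f}M miles")
```

Even a modest 10% effectiveness stretches the interval from 94M to about 104M miles per fatality, so a properly measured supplement-plus-driver rate is bound to beat the human-only baseline, exactly as the post argues. The catch, of course, is the "as long as humans are indeed using it to supplement their full attention" clause.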
The following users liked this post:
WEBSRFR (07-08-2016)
Old 07-08-2016, 03:21 PM
  #285  
MBWorld Fanatic!
 
WEBSRFR's Avatar
 
Join Date: Apr 2006
Posts: 2,136
Received 40 Likes on 34 Posts
Tesla Model S P100D
Originally Posted by El Cid
Don't most cars have some kind of "black box" now? Regardless, it only records what it was programmed to record, so it will never have the complete story.
Most importantly, the Tesla technology failed and someone died. What if it had been a school bus or a minivan full of kids that he crashed into?
How fast was he going at the time of the crash versus the speed limit?
I recall a case where a Tesla was recorded vastly exceeding the speed limit in a mountainous area because Autopilot had been programmed by Tesla to drive at the handling extremes, not the posted speed limit.
No, genius, Tesla technology did not magically fail; the driver apparently wasn't paying attention as he should have. There's only so much that can be done when an 18-wheeler does not yield and decides to cut across oncoming traffic.

This accident could just as easily have happened in a Mercedes S-Class if the driver had activated lane keeping assist and didn't pay attention to the road as he should have.

Both the systems offered by Tesla and Mercedes are NOT fully autonomous and were never meant to relieve the driver of the responsibility to drive the car.

And what is that about the speed limit? The car drives at the speed set by the driver, as it should. You must live in a monastery if you drive the speed limit. When we travel long distances we set Tesla Autopilot to about 10 mph over the speed limit so it keeps pace with traffic in the upper Northeast.

Astounding how our culture increasingly lacks any sense of self-responsibility. A driving aid that is not autonomous is used to the point where someone does not pay attention, and when an accident is caused by a negligent truck driver cutting across oncoming traffic, it is the fault of the car manufacturer, because God forbid anyone be held responsible for their actions.
Old 07-08-2016, 03:24 PM
  #286  
Out Of Control!!
 
PeterUbers's Avatar
 
Join Date: Jun 2004
Posts: 11,403
Received 1,884 Likes on 1,321 Posts
2014 E63S; AMS 100 octane ecu tune; edok tcu tune; BB intakes; dyno tuned
Originally Posted by WEBSRFR
No genius, Tesla technology did not magically fail but the driver apparently wasn't paying attention as he should have. There's only so much that can be done when an 18 wheeler does not yield to oncoming traffic and decides to cut across oncoming traffic.

This accident could have just as easily have happened in a Mercedes S Class if the driver activated lane keeping assist and didn't pay attention to the road as he should have.

Both the systems offered by Tesla and Mercedes are NOT fully autonomous and never meant to abdicate the responsibility of the driver to drive the car.

And what is that about the speed limit? The car drives at the speed set by the driver, as it should. You must live in a monastery if you drive the speed limit. When we travel long distances we set Tesla Autopilot to about 10 mph over the speed limit so it keeps pace with traffic in the upper Northeast.

Astounding how our culture is increasingly lacking any sense of self responsibility. A driving aid that is not autonomous is used to the point where someone does not pay attention and when an accident is caused by a negligent truck driver cutting across oncoming traffic it is the fault of the car manufacturer because god forbid anyone be held responsible for their actions.
This exact scenario happened in my E-Class and I stopped using Distronic.
Old 07-08-2016, 03:25 PM
  #287  
MBWorld Fanatic!
 
WEBSRFR's Avatar
 
Join Date: Apr 2006
Posts: 2,136
Received 40 Likes on 34 Posts
Tesla Model S P100D
Originally Posted by syswei
Good article, thanks for posting. But they could have brought up another point: because Autopilot is supposed to supplement the driver, i.e., not be fully autonomous, if it has anything greater than 0% effectiveness, properly measured accident rates (which we don't have) are bound to look better than human-only accident rates, as long as humans are indeed using it to supplement their full attention to the road.
Absolutely. This is the point that many people fail to grasp, and it is really the most fundamental matter: does it increase safety, and does it make it less likely you will die or be in an accident? The answer is an unequivocal yes.

Whether in a Tesla or a Mercedes, I would like the added benefit of safety systems watching over my shoulder, ASSISTING me in avoiding accidents. There will still be accidents and people will still die. But if the rate of accidents is cut by 50% or more, that is good, because it means fewer dead people.

This is the first known fatality in just over 130 million miles where Tesla's SEMI AUTONOMOUS system was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.

Already the Tesla system is better than a human and similar systems are offered by Mercedes, Audi, Hyundai, and other car companies. And these systems will get better by leaps and bounds and will SAVE lives and REDUCE accidents, as they already are doing.

You'd have to lack a brain to process simple math to proclaim that we should remove these systems from public roads when they are ALREADY reducing the rate of accidents.

As Churchill said, perfection is the enemy of progress. If these systems are 2X better than a human at reducing the rate of accidents what point is there in withholding or blaming this technology?

When used as directed you are far safer in a car with these systems and a vigilant driver than you are in a car with just a vigilant driver who is subject to all the distractions, tiring, anger, and other issues humans are subject to that reduce our effectiveness.
Old 07-08-2016, 03:30 PM
  #288  
MBWorld Fanatic!
 
WEBSRFR's Avatar
 
Join Date: Apr 2006
Posts: 2,136
Received 40 Likes on 34 Posts
Tesla Model S P100D
Originally Posted by PeterUbers
This exact scenario happened in my E-Class and I stopped using Distronic.
And the automatic braking happens in a Tesla too. An accident was avoided in the video below when someone turned in front of a Tesla without yielding.

However, the important thing is that these are driver ASSIST systems. Both Tesla and Mercedes describe these systems as such. While they will reduce the possibility of accidents, as they do now, they are not infallible. This is why a vigilant driver is required to monitor the safety of the vehicle at all times.

None of these driving aids are designed to relieve the driver of responsibility for paying attention and driving the car. They are not perfect, but they will reduce accidents and save lives, as they are doing now. These systems will get substantially better in the coming years.

This is what the driver had to say:

"Was travelling a little under 45 mph. There was some rain, but roads were pretty dry. I was watching stopped traffic to my right. I did not touch the brake. Car did all the work."

Old 07-08-2016, 03:38 PM
  #289  
MBWorld Fanatic!
 
WEBSRFR's Avatar
 
Join Date: Apr 2006
Posts: 2,136
Received 40 Likes on 34 Posts
Tesla Model S P100D
Originally Posted by C280 Sport
Hates the sound of a engine. Yet owns a E550 and ML350.
I was a Mercedes fan for over a decade before Tesla offered a vehicle I could buy.

Yes I hate the stupid farting sounds a combustion engine makes along with the vibration and the stupid gear shifts that are sometimes not where I need them to be. None of these deficiencies are needed for the uncompromised instant acceleration I get in the Tesla.

I still own the E550 because it is pretty much all depreciated and we like having a third vehicle around for when we have friends or family visiting so they have a vehicle to use. The ML350 is what I bought for my girlfriend a while back and we are likely going to replace it with a Model X once they ramp up production and fix some of the production issues.

So yes, I happen to have two Mercedes vehicles in our household but I happen to think those two combustion vehicles have more in common with a gasoline lawnmower than a modern premium car. When I drive either the E550 or the ML350 it is like I'm driving something from a past era. Quite quaint actually.
Old 07-08-2016, 05:09 PM
  #290  
MBWorld Fanatic!
Thread Starter
 
MTrauman's Avatar
 
Join Date: Feb 2010
Posts: 1,435
Received 313 Likes on 214 Posts
‘19 AMG S63
Originally Posted by syswei
Good article, thanks for posting. But they could have brought up another point: because Autopilot is supposed to supplement the driver, i.e., not be fully autonomous, if it has anything greater than 0% effectiveness, properly measured accident rates (which we don't have) are bound to look better than human-only accident rates, as long as humans are indeed using it to supplement their full attention to the road.

Good point about the supplement.


This is the reason the Tesla AutoPilot will likely be deemed defective in a court of law. When one pulls together all the evidence that can be admitted into court, it is likely to be determined that there is a product defect in the AutoPilot due to an ineffective warning. Part of the problem is Musk himself and the way the company has conducted itself. Through the company's and Musk's actions, they have let people believe that the AutoPilot is more than a supplement. AND this is where the courts (if this ever ends up there, or NHTSA) would determine that the system is defective due to improper product warnings as to the exact use of the system. And yes, when you turn on the system it gives warnings, but US courts will look at more than this, and Tesla as a company has not conducted itself to the standards of a proper product warning. I am not an expert on this subject, but I know enough of the case law to speculate about what could happen.
Old 07-08-2016, 05:36 PM
  #291  
MBWorld Fanatic!
Thread Starter
 
MTrauman's Avatar
 
Join Date: Feb 2010
Posts: 1,435
Received 313 Likes on 214 Posts
‘19 AMG S63
Originally Posted by WEBSRFR
No genius, Tesla technology did not magically fail but the driver apparently wasn't paying attention as he should have. There's only so much that can be done when an 18 wheeler does not yield to oncoming traffic and decides to cut across oncoming traffic.

This accident could have just as easily have happened in a Mercedes S Class if the driver activated lane keeping assist and didn't pay attention to the road as he should have.

Both the systems offered by Tesla and Mercedes are NOT fully autonomous and never meant to abdicate the responsibility of the driver to drive the car.

And what is that about the speed limit? The car drives at the speed set by the driver, as it should. You must live in a monastery if you drive the speed limit. When we travel long distances we set Tesla Autopilot to about 10 mph over the speed limit so it keeps pace with traffic in the upper Northeast.

Astounding how our culture is increasingly lacking any sense of self responsibility. A driving aid that is not autonomous is used to the point where someone does not pay attention and when an accident is caused by a negligent truck driver cutting across oncoming traffic it is the fault of the car manufacturer because god forbid anyone be held responsible for their actions.
WEBSRFR,


Careful whom you call a genius. Since you work for Tesla, you may be called as a witness regarding the defect in the AutoPilot system.


The AutoPilot seems to be defective from a product standpoint due to improper warnings.


As to the AutoPilot system. This system should have detected the trailer of the truck---PERIOD.


AND NO, to WEBSRFR, the genius that he is: this could not have happened in a Mercedes-Benz. I remember looking at some early tests of MB's Distronic cruise control system. Before they released the Distronic system to the public, they showed an S-Class rear-ending another S-Class during a test. At first they had difficulty figuring out what happened. MB finally figured out that in some tunnels there was interference with their radar system, and they corrected this BEFORE unleashing the product into the marketplace, because they conducted the proper R&D. Again, another reason I would choose an MB product over a Tesla product any day of the week.


Tesla simply is not testing their systems appropriately. They put the AutoPilot system into the public's hands without the proper R&D, and the accident that happened is the result. Now they are dealing with the backlash, and the backlash will have broad implications, because Musk appears to be a "loose cannon" or a "cowboy" advancing his worldly causes without the company acting like an auto manufacturer. In the US, Tesla will be held to the standard of a typical auto company in the courts.


WEBSRFR is again showing his lack of intelligence by blaming Joshua Brown for the accident that occurred in his Tesla. There are many factors, I am sure, that caused the accident and Mr. Brown's death, but one factor may certainly be a defective product due to an improper warning with the Tesla AutoPilot system.


It is interesting that WEBSRFR stated the truck driver was negligent. How many court proceedings has WEBSRFR been involved in? Clearly he does not have a legal background. One needs to prove all the elements of negligence before one can say someone is negligent. This is different for product manufacturers: auto manufacturers do not need to be found negligent in distributing their products. They simply need to have sent into the stream of commerce a product found to be defective from a warning, manufacturing, or design standpoint to be strictly liable.


Clearly WEBSRFR is the genius here
Old 07-08-2016, 05:42 PM
  #292  
MBWorld Fanatic!
Thread Starter
 
MTrauman's Avatar
 
Join Date: Feb 2010
Posts: 1,435
Received 313 Likes on 214 Posts
‘19 AMG S63
Originally Posted by WEBSRFR
Absolutely. This is the point that many people fail to grasp and this is really the most fundamental matter. Does it increase safety and does it make it less likely you will die or be in an accident. The answer is an unequivocal yes.

Whether in a Tesla or a Mercedes I would like the added benefit of safety systems watching over my shoulder in ASSISTING me avoid accidents. There will still be accidents and people will still die. But if the rate of accidents is cut down by 50% or 300% that is good because it means fewer dead people.

This is the first known fatality in just over 130 million miles where Tesla's SEMI AUTONOMOUS system was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.

Already the Tesla system is better than a human and similar systems are offered by Mercedes, Audi, Hyundai, and other car companies. And these systems will get better by leaps and bounds and will SAVE lives and REDUCE accidents, as they already are doing.

You'd have to lack a brain to process simple math to proclaim that we should remove these systems from public roads when they are ALREADY reducing the rate of accidents.

As Churchill said, perfection is the enemy of progress. If these systems are 2X better than a human at reducing the rate of accidents what point is there in withholding or blaming this technology?

When used as directed you are far safer in a car with these systems and a vigilant driver than you are in a car with just a vigilant driver who is subject to all the distractions, tiring, anger, and other issues humans are subject to that reduce our effectiveness.

Holy hell! Is WEBSRFR serious?


If anyone reads WEBSRFR's posts, they can figure out that he is being paid by Tesla and is the software developer responsible for the AutoPilot.


Continually repeating Tesla's 130-million-mile BS--does WEBSRFR not get it? Clearly he does not.
The following users liked this post:
MDMercedesGuy (07-11-2016)
Old 07-08-2016, 07:42 PM
  #293  
MBWorld Fanatic!
Thread Starter
 
MTrauman's Avatar
 
Join Date: Feb 2010
Posts: 1,435
Received 313 Likes on 214 Posts
‘19 AMG S63
Originally Posted by WEBSRFR
And the automatic braking happens in a Tesla too. An accident was avoided in the video below when someone turned in front of a Tesla without yielding.

However the important thing is that these are driver ASSIST systems. Both Tesla and Mercedes describes these systems as such. While they will reduce the possibility of accidents as they do now, they are not infallible. This is why a vigilant driver is required to monitor the safety of the vehicle at all times.

None of these driving aids are designed to abdicate responsibility for paying attention and driving the car. They are not perfect but they will reduce accidents and save lives as they are doing now. These systems will get substantially better in the coming years.

This is what the driver had to say:

"Was travelling a little under 45 mph. There was some rain, but roads were pretty dry. I was watching stopped traffic to my right. I did not touch the brake. Car did all the work."

Tesla Autopilot saves the day - YouTube

Excellent! If WEBSRFR does work for Tesla, he had better prepare for termination, since republishing this video, which will help Brown's estate, may not be looked upon with favor by Musk and his new management team, given that everyone is leaving Tesla.


This will help Joshua Brown's estate recover damages in a law suit.


This type of video helps prove that the Tesla AutoPilot has a defect in design.


At the very least, the AutoPilot system should have braked before the moment of impact. Since it did not read "the white side of the trailer," the Brown estate will allege a defect in design. This video will be helpful to Joshua Brown's estate against Tesla.
Old 07-08-2016, 07:53 PM
  #294  
MBWorld Fanatic!
Thread Starter
 
MTrauman's Avatar
 
Join Date: Feb 2010
Posts: 1,435
Received 313 Likes on 214 Posts
‘19 AMG S63
Originally Posted by WEBSRFR
I've been a Mercedes fan for over a decade before Tesla offered a vehicle I could buy.

Yes I hate the stupid farting sounds a combustion engine makes along with the vibration and the stupid gear shifts that are sometimes not where I need them to be. None of these deficiencies are needed for the uncompromised instant acceleration I get in the Tesla.

I still own the E550 because it is pretty much all depreciated and we like having a third vehicle around for when we have friends or family visiting so they have a vehicle to use. The ML350 is what I bought for my girlfriend a while back and we are likely going to replace it with a Model X once they ramp up production and fix some of the production issues.

So yes, I happen to have two Mercedes vehicles in our household but I happen to think those two combustion vehicles have more in common with a gasoline lawnmower than a modern premium car. When I drive either the E550 or the ML350 it is like I'm driving something from a past era. Quite quaint actually.

The stupidity of this post is amazing.


"Stupid farting sounds": this is what a two-year-old says.


"Combustion vehicles have more in common with a gasoline lawnmower than a modern premium car." Really?

This post is so full of sh-- I cannot stop laughing!
The following users liked this post:
MDMercedesGuy (07-11-2016)
Old 07-08-2016, 09:40 PM
  #295  
MBWorld Fanatic!
Thread Starter
 
MTrauman's Avatar
 
Join Date: Feb 2010
Posts: 1,435
Received 313 Likes on 214 Posts
‘19 AMG S63
I found this explanation of Tesla AutoPilot by a Tesla Model S owner who is a computer programmer and writer, via a link on the Tesla Forum website:


Understanding Tesla Autopilot
July 6, 2016 (https://marco.org/2016/07/06/tesla-autopilot)
A few Tesla vehicles have had accidents with Autopilot enabled recently, and I’ve gotten countless questions about these incidents and the nature of Autopilot from people who aren’t Tesla owners. Tesla and the media haven’t clearly communicated what these features do (and don’t do) to the public, so I’ll try to help in whatever small way I can as a Model S owner for a few months so far.
I apologize in advance if I get any technical details wrong about these features. Authoritative information is hard to find, and these features change and evolve often.
Tesla’s autonomous features today, all somewhat grouped under or involved in “Autopilot”:
Automatic emergency braking: This always-on feature will sense if you’re approaching another car or obstacle too quickly and loudly alert you. If you don’t apply the brakes yourself, the car will automatically brake to some degree. This is a common feature in luxury cars today and seems to be a clear safety win.
Autopark: Reverses into parking spots on demand. This is also becoming a common feature on other cars, and seems reasonably safe as long as you watch out for pedestrians. I use it regularly for parallel parking and it works well.
Summon: This feature lets you command the car, from outside of it, to very slowly drive itself into or out of a garage or parking space. It’s disabled by default and requires multiple steps to enable and engage (nobody could do this accidentally). The potential damage from failures is likely limited to car body or garage damage, not major bodily harm, due to the very slow movement and ultrasonic parking sensors. I haven’t used it yet — I don’t think the small benefit is worth the risk.
Adaptive cruise control: Like normal cruise control, but with a forward radar (augmented by the camera) to maintain a safe distance from the car ahead of you, automatically slowing down or even stopping as necessary. It’s almost like automated driving, but you still steer, and you’re responsible for obeying signs and signals. This feature is also available on many luxury cars today, and Tesla’s is the best one I’ve used yet, so I use it all the time. It bears most of the same risks as any cruise control, but the chances of rear-ending the car ahead of you are greatly reduced, and it may even be safer than manual driving in low-speed stop-and-go traffic. I’m a huge fan of this feature.
Autosteer, which people probably mean by “Autopilot”: Really just one significant addition to adaptive cruise control: the car also steers itself, using the camera to detect lane markings painted on roads (a feature offered by many other cars on its own) and automatically steer to keep you roughly centered in the lane.
Autosteer is a strange feeling in practice. It literally turns the steering wheel for you, but if you take your hands off for more than a few minutes, it slows down and yells at you until you put your hands back on the wheel. It’s an odd sensation: You’ve given up control of the vehicle, but you can’t stop mimicking control, and while your attention is barely needed, you can’t safely stop paying attention.
It’s automated enough that people will stop paying attention, but it’s not good enough that they can. You could say the same about cruise control, but cruise control feels like an acceptable balance to me, whereas Autosteer feels like it’s just over the line. History will probably prove me wrong on that, but it feels a bit wrong today.
Tesla, Elon Musk, and a lot of media coverage have set expectations too high for these features. People expect Autosteer to be fully autonomous, but today’s Tesla vehicles simply don’t have the hardware or software to safely and reliably self-drive on all roads, and such an advance doesn’t feel imminent.
There’s a huge gap between Autosteer and what most people expect from a “self-driving car”. For instance, Autosteer doesn’t see signs or traffic signals, so it will happily drive through red lights or stop signs if you let it.
Most critically, Autosteer has simply not been reliable enough yet for me on anything but wide-laned, gently turning, intersection-free highways with clearly painted lines in dry weather. In my experience, using it on any other type of road — even New York’s highway-like parkways — is dangerous and unsettling, often requiring manual corrections to avoid crossing center lines or getting dangerously close to lane edges and concrete barriers.
The most reliable, useful, and defensible parts of Tesla’s “Autopilot” features today are emergency braking, Autopark, and adaptive cruise control. I’d be just as happy with my Model S if it only had those, without Summon or Autosteer.
While I like using Autosteer on long highway trips, frankly, I’m amazed that it’s legal. I don’t think it’s a big enough advance over adaptive cruise control to be worth the risks in its current implementation. I’m scared for what will happen to Tesla and the progress of autonomous driving as more people use Autosteer in situations it’s not good at, or as a complete replacement for paying attention.
If Tesla updates the software to restrict Autosteer only to interstate highways, the yelling (and possible lawsuits) from existing owners would cause short-term pain, but I think it may save a lot of reputation damage — and possibly even people’s lives — in the long run.

© 2006–2016 Marco Arment
The following users liked this post:
El Cid (07-09-2016)
Old 07-08-2016, 09:54 PM
  #296  
MBWorld Fanatic!
Thread Starter
 
MTrauman's Avatar
 
Join Date: Feb 2010
Posts: 1,435
Received 313 Likes on 214 Posts
‘19 AMG S63
Mercedes Owners:


You just have to read some of the things Tesla owners are saying about the accident that claimed Mr. Brown's life. It is very sad. They are suggesting that this accident, and people being killed by Tesla's AutoPilot in lieu of proper R&D by the company, will actually help Tesla sell more cars.


Here it is:


Did AutoPilot Accident(s) Actually Help Tesla?

Submitted by jinx on July 8, 2016
After hearing about the accident, a few associates who knew I drive a Tesla said that they "never knew the car could actually drive itself, wow!"
I would explain to them how it works, and that it is really more of a driver assist if anything. But I do really love the feature and used it on a recent business trip to Chicago (300+ miles). The car did 80% of the driving, I would add.
I would also state that when I first bought the car, it couldn't "drive itself"; that feature, along with many others, came via a software update, kind of like your phone. "Wow!" again...
The stock price actually went up the next day, and continues to hold steady.
How the accident helps Tesla:
1. Tesla can enhance Autopilot going forward in future models, or possibly with the current hardware, by making sure the front radar looks a little higher.
2. More people are now aware of what the Model S can do, and are actually somewhat impressed.
3. Other automakers will not release their versions of "auto-pilot", or will keep them scaled down, because they won't want to take the risk.
What do you all think? Could this sad event actually have helped Tesla?
Old 07-09-2016, 12:39 AM
  #297  
MBWorld Fanatic!
Thread Starter
 
MTrauman's Avatar
 
Join Date: Feb 2010
Posts: 1,435
Received 313 Likes on 214 Posts
‘19 AMG S63
Here is information directly from Tesla:

We may become subject to product liability claims, which could harm our financial condition and liquidity if we are not able to successfully defend or insure against such claims.
Product liability claims could harm our business, prospects, operating results and financial condition. The automobile industry experiences significant product liability claims and we face inherent risk of exposure to claims in the event our vehicles do not perform as expected resulting in personal injury or death. We also may face similar claims related to any misuse or failures of new technologies that we are pioneering, including autopilot in our vehicles and our Tesla Energy products. A successful product liability claim against us with respect to any aspect of our products could require us to pay a substantial monetary award. Our risks in this area are particularly pronounced given the limited number of vehicles and energy storage products delivered to date and limited field experience of our products. Moreover, a product liability claim could generate substantial negative publicity about our products and business and would have material adverse effect on our brand, business, prospects and operating results. We self-insure against the risk of product liability claims, meaning that any product liability claims will have to be paid from company funds, not by insurance.
Old 07-09-2016, 08:06 AM
  #298  
Newbie
 
Tré's Avatar
 
Join Date: Jul 2016
Posts: 5
Likes: 0
Received 0 Likes on 0 Posts
15 S65, 11 550i, 05 E55, 99 F355
Originally Posted by TheTeslaDude
Oil-leaking, stinky-gasoline-burning, noise-polluting, maintenance-heavy, 20th-century tech vehicles. You may keep them. 14 more years for MB EVs. Easy choice for me.
Easy choice for you; take your EV love to a forum where people actually care. True car enthusiasts will never stop loving combustion engines, and while some of us like electric cars as an idea, that doesn't mean they are the greatest.
Old 07-09-2016, 08:10 AM
  #299  
Newbie
 
Tré's Avatar
 
Join Date: Jul 2016
Posts: 5
Likes: 0
Received 0 Likes on 0 Posts
15 S65, 11 550i, 05 E55, 99 F355
MTrauman, while there is a fault with Tesla's system to a point... you have to bet that most of the radar and sensors in the Tesla are probably Mercedes-sourced. It's almost, if not completely, the same system as in the W222, except with more capabilities unlocked.
Old 07-09-2016, 11:30 AM
  #300  
Junior Member
 
JayinToronto's Avatar
 
Join Date: Jun 2016
Posts: 45
Likes: 0
Received 15 Likes on 8 Posts
S550LWB 4matic
Just did a Google search on 2017 cars; found it interesting to see which publications were writing about each. Ford, Mercedes, etc.: car magazines. Tesla: computer magazines.


