Old 04-27-2019, 09:52 PM   #44
clyde
Chief title editor
Join Date: Oct 2003
Posts: 26,599
Quote:
Originally Posted by rumatt View Post
I'm using my judgement based on my experiences, and the types of scenarios where I choose to use it.

What happens if the car suddenly swerves hard left and/or does not let me override it? The same thing that happens if the cruise control accelerates out of control and/or does not let me override it. I crash. True statements of any automatic driver control.
Quote:
Originally Posted by rumatt View Post
Quote:
I didn't ask for a comparison of the differences between cruise control and autopilot. I asked only about cruise control and why you felt it was safe to use.

I never said cruise and autopilot are equivalent. They obviously are not. Autopilot is a much more extreme form of cruise control. But the analogy points out the flaws in your line of questioning. If you can't answer them about cruise control, they're not very good questions, right?
Autopilot is not a more extreme version of cruise control. It's an entirely different thing. I explained why I don't think they are similar enough to warrant comparison and why I don't accept your analogy as valid. Because it is not. But, whatever.

Quote:
So the obvious followup question is: what about the first 1, 5, 10 years of use? Who told you it was safe then? If you didn't know it was safe why did you use it? Why did you endanger the lives of my family?
As I described, I did not use it in the first 1, 5, or 10 years it was available. I waited about 30 years. Even then, when I did use it, I nearly killed myself and could easily have killed anyone nearby. That it didn't end that way was dumb, fortuitous luck. Hopefully, if you live another 30 years, you won't have a similar story to tell. As I also described, I gave no thought to the safety of cruise control at the time, much like it appears you're giving no thought to the safety of using Autopilot today.

Quote:
And if years of experience by end users is the measure we use to define safety, how are we supposed to get there without using it?
It can be a measure, sure. But I'm not saying that's the only way. Not at all. For some reason, Musk and Tesla thought it was safe enough to unleash on the world a few years ago and also safe enough to add all the things they've added since. What were those reasons?

I would like to think they have a sound reason (ideally "reasons," plural) to believe it's safe: some combination of simulations, real-world data gathered under controlled circumstances, maybe other things. I'd like to think that, but I don't. I'm highly skeptical for a few reasons. They:
  • won't show us their data
  • have given the government incomplete and misleading data on the subject, wrapped in substandard, self-interested analysis
  • regularly mislead, deceive, and lie to the public and government agencies
  • operate beyond the bounds of law and regulation (see the SEC and labor violations for starters)

It's all about their self-interest, whatever effect it may have on the rest of us. They're kind of too Trumpian to be trusted, in my view. Of course, 60 million people in the US think he's a truthteller, so...

Would anyone be all that surprised if Musk went on a "fuck you world" Twitter rant and confessed to unleashing Autopilot and its enhanced features to drive sales and media interest and goose the stock price, without any sound reason suggesting or showing it was safe? Would anyone have trouble believing it? More directly, would it prompt you to reevaluate your use of it? Given what you've told us in this thread, the answer is no.

None of that means Autopilot isn't safe enough to use. We just don't have any way to know. You made your decision about why you want to believe them. ZBB made his. JST saw all the same things you did and made a different decision.

Of course, there's a point where, if you're going to advance anything, you need to take risks. The choice is between taking those risks blindly, like many Tesla drivers are doing (and, unfortunately, taking those of us who must share the roads with them along for their joyride), or refusing to take those risks until we have an opportunity to objectively evaluate them and thereby make an informed choice about which risks we're willing to take and under what circumstances.

Once you cross that threshold and make it available for general use, opening the data about how, when, and where Autopilot and competing technologies are used (and, perhaps more importantly, how, when, where, and under what circumstances they fail) would let us make informed decisions about whether the technology is safe enough to keep using widely or whether it needs more development work.

There was a time when the only choice, for just about everything, was to start doing things in public and see what happened. Today there are still some things we have little choice but to turn loose in public and see what happens. But for many other things, we have better options available: ways to get a fair preliminary sense of what's likely to happen, and to debate what constitutes acceptable risk in different scenarios, before any public introduction.
__________________
OH NOES!!!!!1 MY CAR HAS T3H UND3R5T33R5555!!!!!!1oneone!!!!11

Team WTF?!
What are you gonna do?