Old 04-26-2019, 07:43 AM   #27
JST
Join Date: Oct 2003
Posts: 24,611
The question Clyde is posing reflects a pretty fundamental issue at the heart of various legal systems.

There are basically two ways of dealing with new things. Either you prohibit them until people can show they are safe, or you let people do them until they are shown to be dangerous, and require that compensation be paid to anyone who is injured.

The American legal system (generally) works on the second principle. Other, code-based systems are (generally) closer to the first. The problem with the first approach is that it's hard to prove a negative, so innovation tends to be harder in a system like that.

Even in America there are exceptions, of course, for things where we feel like innovation and experimentation are just too risky to proceed without approval. Aircraft type certification and drug approval are two big ones.

But cars have generally not fallen into that category. The govt sets a minimum safety standard for some things, but beyond that you’re free to innovate. If you fuck up and kill people, they can sue you.

I don’t wholly trust autopilot. I don’t like it and I don’t use it. But the answer to Clyde’s question about how we as a society let people use it lies in the tort system. If it’s defective, if it causes crashes, people will sue and the problem will get fixed. The threat of that (hopefully) means that Tesla has done its due diligence to make sure the system works.