Well, Cruise is offering a full self-driving taxi service where they don’t require you, as a passenger, to pay attention to traffic or take control if needed, so it isn’t fair to say that they don’t trust it and that you therefore shouldn’t either.
With Tesla, however, that is the case, but despite their rather aggressive marketing they still make it very clear that the system isn’t finished yet: you’re allowed to use it, but you’re still the driver, and using it safely is your responsibility. That’s the case with the beta version of any software; you get it early, which is what early adopters like, but you’re expected to encounter bugs, and that’s the trade-off you have to accept.
Is the company legally liable for the actions of the self-driving car? If not, then they don’t trust the vehicles.
What charges would apply to a human driver who delayed an emergency vehicle and caused someone to die?
There are several court cases ongoing about this stuff, and I’d be surprised if these companies didn’t have any liability.
That’s a moved goalpost, and you know it.
If liability is forced on them, that is very different from them voluntarily accepting responsibility. Voluntarily accepting it is what would indicate that they trusted the service they provided.
I think the issue here is that you, like many other people, seem to imagine that because a system is called “full self driving” it literally means that, as if it’s either fully human-controlled or fully AI-controlled and there’s no in-between. No, this is just overly simplified black-and-white thinking that misses all the nuance in the subject.
This is utter nonsense. These companies aren’t immune from liability for the accidents they cause. Of course they don’t want to be liable and would rather sweep these incidents under the rug, but that’s just not going to happen. There just isn’t a precedent yet, however. This is brand-new technology that no one has seen before, and what the liability of these companies will be in the end is still under debate. It’s just a blatant lie at this point to claim they have no liability, as if that’s something that has been settled.