

A robot can theoretically drive better than a human, because emotions and boredom don’t have to be involved. But we aren’t there yet, and Tesla is attempting the hard mode of the problem: pure vision, with no range-finding sensors like radar or lidar.
Also, I suspect that the systems we have are set up as pure NNs, where all the behaviour is determined by the training data. That likely means some random-ass behaviour in rare edge cases, where the model “thinks” slamming on the accelerator is as good an option as anything else. And because it’s a black box that nobody really understands, there’s no way to tell until someone actually ends up in that situation.
The tech still belongs in universities, not on public roads as a commercial product/service. And certainly not in the hands of the type of people who would, at some point, say “fuck it, good enough, ship it”, which seems to be most of the tech industry these days.
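
To make the black-box point concrete, here’s a toy Python sketch (nothing to do with Tesla’s actual stack; the network, weights, and action names are all made up): a purely learned policy has no explicit rule against flooring it, so for an input far outside its training distribution, whichever action comes out on top is an artifact of the weights rather than of any safety constraint.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-action policy head: 0 = brake, 1 = coast, 2 = accelerate.
W1 = rng.normal(size=(16, 8))
W2 = rng.normal(size=(8, 3))

def policy(observation):
    """Tiny MLP: feature vector in, action probabilities out."""
    hidden = np.tanh(observation @ W1)
    logits = hidden @ W2
    exp = np.exp(logits - logits.max())  # softmax
    return exp / exp.sum()

# A contrived "edge case": an input far outside anything resembling training data.
weird_scene = rng.normal(loc=50.0, scale=20.0, size=16)
probs = policy(weird_scene)
print(dict(zip(["brake", "coast", "accelerate"], probs.round(3))))
# Whichever action "wins" here is an artifact of the weights, not of any
# rule that says "don't floor it" -- and nothing in the network can tell
# you why it preferred that action.
```

There’s no unit you can inspect that encodes “accelerating into a stopped object is bad”; the behaviour is whatever the training happened to bake in.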




Fwiw, just because a dumb phone doesn’t give you access to “smart” features doesn’t mean the capabilities aren’t present on the phone. It’s a question of what can be hidden on the circuit board (a lot can be tucked inside a chip) and what can be hidden in the usual, expected traffic: if the bandwidth requirements are low, even the timing of packets can be used to encode hidden data that would never show up in any logs (rough sketch below).
Plus, basic location tracking of cellphones is necessary for them to function at all; the network has to know roughly where you are to route calls to you.
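
For the packet-timing idea, here’s a minimal sketch (purely illustrative; the gap lengths, threshold, and payload are made-up parameters): the data rides entirely on *when* otherwise-ordinary packets are sent, so the packet contents themselves are completely ordinary.

```python
# Illustrative timing covert channel: bits are encoded as the gaps between
# otherwise-ordinary packets (e.g. keep-alives). Gap lengths and threshold
# below are made-up parameters.

SHORT_GAP = 0.05   # seconds: gap encoding a 0 bit
LONG_GAP = 0.15    # seconds: gap encoding a 1 bit
THRESHOLD = 0.10   # decoder boundary between "short" and "long"

def encode(payload: bytes) -> list[float]:
    """Turn each bit of the payload into an inter-packet delay."""
    delays = []
    for byte in payload:
        for i in range(8):
            bit = (byte >> (7 - i)) & 1
            delays.append(LONG_GAP if bit else SHORT_GAP)
    return delays

def decode(observed_gaps: list[float]) -> bytes:
    """Recover the payload from the gaps between observed packets."""
    bits = [1 if gap > THRESHOLD else 0 for gap in observed_gaps]
    out = bytearray()
    for i in range(0, len(bits) - len(bits) % 8, 8):
        byte = 0
        for bit in bits[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

schedule = encode(b"id:42")          # ~0.1 s per bit, so roughly 10 bits/s
assert decode(schedule) == b"id:42"  # the secret round-trips through timing alone
```

At roughly 10 bits per second that’s useless for bulk data, but it’s plenty for leaking an identifier or a coarse location fix over a day of “normal” traffic.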