So it begins.
It totally sucks. Teslas use a long-range radar which, under normal circumstances, should've detected the vehicle, but my educated guess is that the trailer was just too high above the ground for the radar to catch it. It also has an infrared camera, but that isn't meant for long distances. Incidentally, and somewhat ironically, the guy who was killed posted a video of Autopilot saving him in April, which Musk retweeted.
That seems so odd. I can imagine a single frame image of a high trailer looking similar to a single frame image of an overhead sign. But given the distance to an object, as the car moves forward it seems like it should be easy to tell the difference between an object 1.5 meters off the ground and one 15 meters off the ground.
I think all of us computer geeks are scratching our heads over that description of Tesla's vision system. It seems... nearsighted, dare I say. In a single still frame I could see the system getting confused. But as soon as you take in the relative motion of the object in question, and the fact that the gap between the object and the road is not increasing, it seems like warning bells should be going off. And... think about the accident... the first thing to touch the truck was the windshield. The top of the car (and the guy's head) just got sheared off by the trailer bed, like a knife. I wonder how long the car continued past the accident scene... Oof. Frightening.
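To put some rough numbers on that intuition: the elevation angle to an overhead sign climbs steeply as you approach it, while the angle to a trailer's underside stays near the horizon the whole way in. Here's a minimal sketch of that geometry; the camera height (~1.2 m) and the two object heights are just illustrative assumptions, not anything from Tesla's actual system.

```python
import math

def elevation_angle_deg(height_m, distance_m, camera_height_m=1.2):
    """Elevation angle (degrees) from a camera to an object's underside.

    height_m: height of the object's underside above the road.
    distance_m: forward distance from camera to object.
    camera_height_m: assumed mounting height of the camera.
    """
    return math.degrees(math.atan2(height_m - camera_height_m, distance_m))

# Compare an overhead sign (~15 m up) with a trailer bed (~1.5 m up)
# as the car closes the distance.
for d in (100, 50, 25, 10):
    sign = elevation_angle_deg(15.0, d)
    trailer = elevation_angle_deg(1.5, d)
    print(f"{d:>3} m out: sign at {sign:5.1f} deg, trailer at {trailer:5.1f} deg")
```

By the time you're 10 m out, the sign sits more than 50 degrees above the horizon while the trailer is still under 2 degrees, so tracking that angle over even a few frames should separate the two cases cleanly.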
That makes a lot more sense, and I kind of got the feeling I was wrong when I started digging in a bit. Still sucks for everyone, though. Around 300 people a month are killed in accidents involving tractor-trailer trucks, but this one accident, because its nature is slightly different, will get the media hook. BTW, I saw that the driver did not have his hands on the wheel at the time of the accident and may have been watching a movie on a portable DVD player. Which makes me think he was way overconfident in the tech.
I'm more concerned about smartphones than self-driving cars. Self-driving car manufacturers are held responsible for crashes like these. Like, I could see someone making the case from this that the technology isn't ready to be released to the public. I'm not sure who would profit from that happening, but the argument could be made.
It isn't ready - it's in beta, and Teslas are not equipped with a full suite of sensors so accidents of the whoops-didn't-see-ya type are much more likely. Users are supposed to always keep an eye on the road and keep their hands on the steering wheel, which of course nobody does because it drives like it doesn't need that.