Following another fatal Tesla crash, accident investigators have announced that they have stopped working with the company. Self-driving cars urgently need ‘ethical black boxes’ so that we can all learn from their mistakes.

Self-driving cars are learning to drive. The algorithms that control them need to be fed vast quantities of real world data in order to improve. Cities and freeways, particularly in the US, are the laboratories in which they are being trained. Companies like Waymo, Uber and Tesla would argue that this real-world experience is vital for machine learning. Others would say that it creates an experiment in which other road users are unwitting test subjects. When technologies fail and people die, as happened with the Uber crash in Tempe last month, everyone, not just self-driving car companies, needs to learn what happened and why. Social learning must take precedence over machine learning.

For this reason, we should be worried by the news that the National Transportation Safety Board has thrown Tesla out of its investigation into the fatal crash of a Tesla Model X that was in Autopilot mode. The NTSB has announced that Tesla is no longer party to the investigation because the company broke the rules on speaking out, in effect prejudicing the conclusions of the inquiry.
