It's rare that academic terms so perfectly capture the issue: humans in the loop of an automated system risk becoming the 'moral crumple zone', like a car bonnet 'designed to absorb the force of impact in a crash', suffering the 'moral and legal penalties when the system fails'.