Description (in English)

Rather than being presented as a conventional narrative, "STET" takes the form of a short section of a scientific paper analysing the principles by which self-driving cars make decisions. The paper is interspersed with editorial suggestions to remove or change content the journal editor finds inappropriate, to each of which the paper's author responds "STET", a proofreader's mark meaning "let it stand".

The story is built from three linked texts: the body of the scientific paper, its footnotes, and the editorial comments with the author's responses. On a first reading the scientific paper seems impersonal and academic, but as we read the footnotes and comments we slowly learn that the author's child was killed in a road accident involving a self-driving car. The author blames the AI's programming, which prioritised the life of an endangered bird above that of a human child, for the child's death.

The stark contrast between the technical, emotionless text of the scientific paper and the increasingly emotional tone of the footnotes and comments is highly effective, highlighting the very real consequences of apparently objective decisions.

Pull Quotes

It was murder, the car had a choice, you can’t choose to kill someone and call it manslaughter.

Situation machine vision is used in

Authored by