Taster Post: Cry Me A Driver: Why Computers Fail At Detecting Emotions


Richard Firth-Godbehere is a Wellcome Trust-funded doctoral candidate in the medical humanities. This is an excerpt from a recent article on Gizmodo. Richard’s thesis – ‘Understanding the Opposites of Desire and the Prehistory of Disgust, c.1604 – c.1755’ – explores the varying ways people tried to make sense of aversions as ‘the opposite of desire’ across a time of huge intellectual, political, religious, social, and cultural change, finally resulting in something akin to the modern notion of ‘disgust’. Twitter @AbominableHMan


According to proponents of Affective Computing, the days when machines will have emotions are much closer than the twenty-fourth century … These gadgets and platforms all claim to be able to recognise emotions through facial recognition and speech analysis.

All this might conjure up visions of a terrible dystopian future, in which our slave-like vacuum cleaners buzz around the floor sucking up the stale crumbs from last night’s Quattro Formaggi pizza with more enthusiasm than is comfortable. A world in which Siri sulks, refusing to talk to us for days every time we suggest that our iPhone might work better as a projectile, and our car autopilots wake up in a particularly bad mood. Thankfully, Affective Computing is a long way from creating machines that feel the way humans do, for three main reasons.

To read the whole article, click here.