What if Your Personal Digital Assistant Defames Somebody?

Author: John Gregory
Date: March 15, 2017

We recently had a discussion about police access to the recordings made by in-home digital assistants like Amazon’s Alexa and its (her?) ilk.

Now our focus turns to the actions of these devices when they do bad things themselves. This story reports that Siri, Apple’s version, routinely answered requests in Toronto for prostitutes by referring the inquirer to an “eSports bar”, one where clients play electronic sports games. Apparently the word may be too close to “escorts” for Siri’s sense of discrimination. It is clear (take it as established for the present discussion) that the bar is NOT a hangout for prostitutes.

So: Has a tort been committed? Even in these days of largely legal prostitution, it can lower the estimation of a bar in public opinion if it is known as a hangout for prostitutes. So: defamation.

Who is liable, if anyone? Canada’s rules of intermediary liability are a bit vague, but they tend to impose liability once the intermediary knows of the defamation. Apple has been told.

Even in the US, where s. 230 of the Communications Decency Act gives very broad immunity to intermediaries, that immunity is for content provided by others. Siri’s algorithms for finding answers to...
