“Alexa, tell me, in your own words, what happened on the night in question.” Actually the request is more like, “Alexa, please replay the dialog that was recorded at 9:05 PM for the jury.” The case is in Bentonville, Arkansas, and the charge is murder. Since an Echo unit was present, Amazon has been asked to disclose whatever information might have been captured at the time of the crime. CNN reports that the defendant has agreed to the disclosure of that recording.
Amazon indicates that the Echo retains less than sixty seconds of recorded sound, so it may not have captured that level of detail; presumably, though, a larger log of requests and responses for the night in question exists as well. Amazon has provided some purchase-history data, but is waiting for a formal court document before releasing any additional information.
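Amazon has not published how that short retention window works. As a rough illustration only, the behavior described — keeping just the most recent minute of sound and silently discarding everything older — resembles a fixed-capacity ring buffer. The class and parameter names below are hypothetical, not Amazon's actual design.

```python
from collections import deque

class RollingAudioBuffer:
    """Hypothetical sketch: hold only the most recent audio frames.

    A deque with a maxlen silently drops the oldest frame whenever a
    new one would exceed capacity, so the buffer never holds more than
    max_seconds worth of audio. Illustrative only; not Amazon's code.
    """

    def __init__(self, max_seconds=60, frames_per_second=50):
        self.capacity = max_seconds * frames_per_second
        self.frames = deque(maxlen=self.capacity)

    def push(self, frame):
        # Appending to a full deque evicts the oldest frame automatically.
        self.frames.append(frame)

# Simulate ~100 seconds of audio flowing through a 60-second buffer.
buf = RollingAudioBuffer(max_seconds=60, frames_per_second=50)
for i in range(5000):
    buf.push(i)
print(len(buf.frames))   # only the newest 3000 frames survive
```

On such a device, anything older than the window is simply gone, which is why the recording itself may tell investigators very little; the server-side request log is where a longer history would live.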
This raises the question of how Alexa (or her sisters: Siri, OK Google, Hello Barbie) might respond to apparent sounds of a crime in progress. “Alexa, call 911!” is pretty clear, but “Don’t shoot!” (or other phrases that might be either “real” or overheard from a movie playing in the background) may not be. An interesting future awaits us.