Have you seen the Geico ad with the talking parrot? A 19th-century ship is boarded, the captain surrounded by pirates. The leader shouts, “Let’s feed him to the sharks,” (pirates cheer and swords are held high) “and take all his gold” (more cheers). The parrot repeats these lines, and adds, “and hide it from the crew. They’re all morons anyway.”
Voice-activated Internet of Things (IoT) devices (which, for this piece, include smartphones and televisions) are always there, just like that pirate’s parrot. You know the services: Siri, Google Assistant, Cortana and Amazon’s Alexa. Mostly, these fine-featured friends are waiting for their activation command—listening, not recording. When activated, they gather the particulars of your life and beam them to a cloud server, where your day-to-day existence is, at least in some basic ways, made better, the improvement generally taking the form of convenience or efficiency. The voiceover at the end of the Geico ad explains, “If you’re a parrot, you repeat things. That’s what you do.” If you’re a voice-activated IoT device, you don’t repeat things, but you may transmit them.
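For readers who want the “listening, not recording” distinction made concrete, it can be sketched as a toy simulation. The wake word, buffer size and logic below are illustrative assumptions for this column, not any vendor’s actual implementation: audio cycles through a small local buffer that is continually overwritten, and nothing leaves the device until the wake word is detected.

```python
from collections import deque

WAKE_WORD = "alexa"   # hypothetical wake word
BUFFER_SECONDS = 3    # hypothetical size of the local rolling buffer

def run_assistant(audio_frames):
    """Simulate the listen-vs-record distinction.

    `audio_frames` stands in for one-second chunks of audio (here, words).
    Frames cycle through a small on-device buffer that is continually
    overwritten; nothing is sent anywhere until the wake word is heard,
    after which subsequent frames are 'transmitted' to the cloud.
    """
    local_buffer = deque(maxlen=BUFFER_SECONDS)  # overwritten, never sent
    transmitted = []                             # what the cloud receives
    awake = False
    for frame in audio_frames:
        if awake:
            transmitted.append(frame)            # streamed upstream
        else:
            local_buffer.append(frame)           # ages out, stays local
            if frame == WAKE_WORD:
                awake = True
    return transmitted

# Everything before the wake word stays on the device; only the request
# that follows it goes to the server.
print(run_assistant(["private", "chat", "alexa", "play", "music"]))
# → ['play', 'music']
```

The point of the sketch is simply that “always listening” and “always transmitting” are different things—until, as the Bentonville case below shows, someone wants what the device may have kept.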
See also: How Internet of Things Puts Industry at Risk
Is the cost worth the benefit?
But the value of having a digital assistant comes at a personal price that many privacy advocates—including me—worry may be far higher than the cost of, say, the device you need to access the service.
The price is your privacy.
Unfortunately, it is a murder case in Bentonville, Arkansas, that most forcefully highlights one of the more complex privacy issues connected to digital assistant IoT technology these days.
In November 2015, a former Georgia police officer named Victor Collins was found floating face down in a hot tub owned by Bentonville resident James Andrew Bates. There were traces of blood at the scene, and a coroner later determined that Collins had died of strangulation and partial drowning.
The smart water meter installed at Bates’ house indicated that 140 gallons of water—much more than usual—had been used on the night of Collins’ death. That pointed to post-murder cleaning. There was physical evidence at the scene, but the prosecutor wanted to know if there was more information hiding on the Amazon Echo that had been streaming music when Collins died. There was the possibility that the device had stored up to 60 seconds of audio, as it is designed to do, and that the recording might still be on the physical device. Amazon declined to help with the investigation. (Amazon did not immediately respond to Credit.com’s request for comment.)
Why this raises questions
It should be said that the producers of digital assistants aren’t trying to create a better pirate parrot. They aren’t in the business of mindless repetition. They are in the business of learning about you so they can sell you things, help others sell you things, or sell what they know about you to third parties that can turn it into money.
The potential trove of information is vast. Consumers use digital assistants to help with travel, email and messages; they listen to music and check sports scores and the weather. The devices can keep a calendar in order, post to social media, translate documents and search the internet. (When it comes to criminals, these devices could be seen as the digital equivalent of a stupid accomplice.)
Murder isn’t the best backdrop for discussions about privacy, but unfortunately the protections guaranteed by our courts are nowhere in evidence at the consumer level, so it is often the mise-en-scène for this kind of article.
How much privacy can be expected?
If you’re a parrot, you repeat things. If you’re an Amazon Echo at a murder scene, you give rise to serious questions about the expectation of privacy in a consumer landscape that has turned personal preference into a commodity. A digital assistant, increasingly geared toward the conveniences of radical personalization, knows how you like things in your home. But given the inevitability of hacking and data compromises, all that information could, at least potentially, be used against you—and not just in your personal battle to resist temptation in the marketplace and save money.
Without a doubt, it would be easier to talk about the cost of convenience when it comes to digital assistants were we dealing with a case revolving around hacked information used to burglarize a home, or the purloined daily schedule of a popular celebrity who was (supply your own verb) as a result of leaked data. For that matter, it would be easier to talk about plug-and-play cameras that can’t be made secure no matter what you do. But until there’s a body, it seems, no one pays attention, and so it is often outlier situations like these that make privacy a topic for discussion.
See also: Why Buy Cyber and Privacy Liability. . .
The digital assistant as a privacy issue may not be a problem for you—some people feel they have nothing to hide—but it is certainly something consumers need to think about before transmitting their lives to the cloud, where it may be only a matter of time, or bad luck, before a hacker streams them for laughs or loot.
Full disclosure: CyberScout.com sponsors ThirdCertainty. This story originated as an Op/Ed contribution to Credit.com and does not necessarily represent the views of the company or its partners.