Ten years later, after the horrors of World War II, George Orwell published 1984, which described a dystopian future far less comforting than Huxley’s, and was positively terrifying in many ways.
Just as we once assumed all Facebook knew was the information we willingly gave it, we’re unprepared for the myriad ways smart speakers tuned into our surroundings could someday be exploited, whether for our convenience or not. We’re not equipped to fully appreciate the trade-offs we’re making. We don’t know what we need to do to protect ourselves, how much protection we even need, or whether the tools to do so are even available.
Mentally, we’re equally unprepared for what’s to come. As long as smart speakers remain a visible, external piece of furniture, we can keep them separate from our lives. They are not yet seamlessly integrated into our days, but what happens when they are? What happens when we go from merely interacting with Alexa to living with her? When we go from inviting Alexa into our home to accepting the program as a necessary feature of life?

“We envision a world where the consumer devices around us are more helpful, more intelligent, more… human,” says Audio Analytic, a company that has created software capable of recognizing a range of sounds. It reportedly hopes this technology will soon be able to detect the sound of consumer products when they’re being used around smart speakers.
Anyone who’s chatted with Alexa knows the feeling of wanting artificial intelligence programs to feel more human. And despite their stilted speech and limited range of responses, they already do feel somewhat human. That is, of course, by design. Manufacturers want us to feel connected to this tech, not just pragmatically but emotionally. Alexa and Siri can’t just be computer programs. They need to be trusted. They need to be friends we invite into our homes, or that we allow to sleep beside us.
But when voice-activated computer assistants eventually become a necessary feature of our lives, we may notice a profound irony.
When every noise in our lives is a search prompt, the sounds of our homes, the symphony of life — laughing, crying, talking, shouting, sitting in silence — will no longer be considered memories, but data. The more we humanize technology — the more it becomes not just part of the furniture, but part of the family — the less human our lives will become.