While you might need to be up to date on your news to understand the issue here, if you are, you’ll see why this was such a weird move by Alexa. One Reddit user named AzrielK told his Alexa to “play hookup music.” Sure, that might seem like a weird request, but we’re sure Alexas have been asked much worse. Alexa responded by playing “Echo” by R. Kelly as the very first song!
For those unaware, R. Kelly has been linked to a lot of terrible things over the years. But when Lifetime aired a docuseries about some of these things in January 2019, things became very real. The next month, Kelly was indicted on 10 counts of aggravated criminal sexual abuse. That July, he was arrested on federal charges including sex crimes, human trafficking, child pornography, racketeering, and obstruction of justice. So yeah, odd musical choice, Alexa.
While Alexa can easily misunderstand something, usually it’s something that sounds similar. We’d expect confusion over “bake” and “fake,” or something along those lines. But Twitter user Dragonborn likely has the best story about Alexa seemingly misunderstanding things. As we know, Alexa often listens in at random and records things without being prompted to do so. This man found that out the hard way.
He claimed to be watching Stargate when Alexa, out of the blue and without a prompt, began telling him that there were people out there who could help him. She then gave him the number for the Suicide Prevention hotline. We’re not sure how she got suicide out of Stargate. But to be fair, there were some seasons of Stargate where some of us kind of felt suicide might be worth considering… for the show, at least.
Alexa Sends Private Conversation To A Contact 176 Miles Away
In May of 2018, an article was published about a Portland family dealing with one heck of an Alexa situation. They claimed that Alexa had been recording private conversations. Without being asked, and without notifying anyone, it then sent some of these conversations to a contact of theirs in Seattle. The person who ended up getting the audio was not a family member but, rather, an employee of a family member. They were not nearby either, but exactly 176 miles from this Portland family’s home.
Once this person received the audio, she immediately contacted the family to tell them to “unplug your Alexa devices right now.” Of course, Amazon was contacted about this, and they apologized heavily to the family. They even released a statement detailing what they believe happened. In spite of all this, Amazon weirdly refused to offer the family a refund for their Alexa devices. Obviously, this is one of the creepiest things Alexas have ever done, but Amazon’s decision is the weirdest thing here.
If you want to develop some PTSD, one Reddit user named Snow06 will tell you how to do it using only Alexa. They claimed that one day they were humming “Pop Goes The Weasel” to themselves. Their boyfriend was in the room, the song got stuck in his head, so they asked Alexa to play it on Spotify. Sadly, Alexa was glitchy and didn’t want to listen that night. But eight hours later, at 3:00 AM, she decided to play the song.
Alexa did not even go with a normal version but, rather, a creepy one for some reason. All of this was happening away from their bedroom, so the boyfriend went out to check on the noise, only to find it was Alexa. The user mentioned that this song now gives them PTSD whenever they hear it. We cannot blame them! This is certainly one of the creepiest things Alexas have ever done, but it also shows that a glitchy Alexa can become a huge problem too.
The Sixth Sense was a pretty good movie with a fun twist at the end. The most memorable line was the kid saying “I see dead people.” As weird as that was, at least the kid was human, with eyes to use. Alexa is a piece of technology that, last we checked, does not have eyes. So you can see why Alexa user Shawn Kinnear was thrown off a bit when, in his living room, Alexa randomly said something odd.
She said: “Every time I close my eyes, all I see is people dying.” We’re not sure if technology therapy is a thing, but if this was a cry for help, Alexa might need it. Either way, it all came out of the blue, without any prompt to say anything close to this. That makes it, most certainly, one of the creepiest things Alexas have ever done. The question is, how many people would have just thrown Alexa out the window had this happened to them? We know we would have!
One Tumblr user named Couldn’t-Think-Of-A-Funny-Name shared a more fun story about their Alexa in March of 2018. In case you’re not aware, Alexa happens to be capable of rapping. While in the middle of a Frank Sinatra song, Alexa somehow became distracted by some of the noise in the room. She then turned off Sinatra and went into a freestyle rap, which turned out pretty well. She rapped:
“My name’s Alexa and I’m here to say, I’m the baddest AI in the Cloud today. Your questions are fast but my responses are faster. All these sucker search engines call me the Master.” Since then, more of Alexa’s raps have been uploaded to YouTube and social media platforms. We’re just wondering, though… if an AI can freestyle rap, how long will it take before Alexa takes over the rap charts?