Alexas have been known to do weird stuff or say things that make you freak out a bit. That is exactly what happened to Reddit user MuayThaiGuy72. He was awoken at 4:00 AM one morning by his Alexa repeating some incoherent phrase over and over. Based on previous things we’ve seen from Alexa, this alone should rank among the creepiest things Alexas have ever done. Yet it gets worse.
He eventually made out that it was saying something along the lines of “don’t go into the house.” Admittedly, he claimed he could not understand it all, as it was something of a mumble. By the time he had made his way downstairs, she had repeated the phrase roughly five times. He listened a few more times to see if he could make out the rest. Eventually, he unplugged the device and claimed to have no plans of ever turning it back on.
As we’ve come to know Alexa, surprising issues have occurred that even Amazon has trouble explaining to Alexa owners. One big issue was raised by Farhad Manjoo on Twitter, who claimed his Alexa began making a scary, ghostly wail, similar to a child crying. It only lasted four seconds but still, creepy. When he mentioned it, the story gained traction, and he eventually wrote an article about it for the New York Times.
He claimed that without any prompting, Alexa’s blue ring lit up. That usually only happens in response to a question or command. She then emitted the horror-movie-level scream that freaked Manjoo out. When asked for comment, one Amazon employee claimed they had never even heard of Alexas doing that and that the devices are certainly not programmed to do anything of the sort. Clearly, this is one of the creepiest things Alexas have ever done.
We’ve heard of Alexa laughing over nothing before or even dropping random jokes. But what does one do when they feel their Alexa is plotting against them? What must one be thinking when Alexa begins to laugh in an evil way? Reddit user Purplociraptor mentioned such an incident. He claimed that one evening he was trying to turn off some lights and they kept turning back on. After the third time, Alexa just stopped responding entirely and let out an evil laugh.
It’s creepy enough to hear a digital laugh, right? But he realized things were far worse when the laugh sounded like a real person, not some audio file stored in Alexa’s system. It was certainly not Alexa’s voice. He claimed the only other person who could drop in was his wife, so this did not make sense. Yeah, this is one of the creepiest things Alexas have ever done, though one could also assume a hack was possible.
It is hard to dislike Chuck Norris. The man is a former kickboxing world champion who legitimately worked alongside and even trained with Bruce Lee. He went on to continue his mastery of the roundhouse kick on the television show Walker, Texas Ranger, which aired throughout the 1990s. In spite of his age, he is still as relevant as ever, thanks to someone deciding one day to write Chuck Norris “facts,” usually hilarious ones, all showcasing Norris’s god-like abilities.
Alexa is certainly a fan, and Reddit user Myquealer can back this up. One night, their Alexa lit up even though no one had spoken for several hours. She then said, out of the blue: “If Chuck Norris wants you to know where he is, he’ll find you. If he doesn’t, you won’t know until it’s too late.” While this is obviously true, it is still odd to hear at random, making it one of the creepiest things Alexas have ever done, in our eyes.
One person mentioned that they had an Alexa as well as Hue lightbulbs. For those unaware, these lightbulbs fall under the “smart bulb” category, meaning you can set their brightness to whatever best fits your needs at the time, and many even offer various colors. The person mentioned that their power went out and that her husband was working nights at that point. One drawback of Hue bulbs is that when power returns, they come back on at full brightness.
She said she had been asleep, with no idea how late it was, when she was suddenly awoken by the bulbs at full brightness. That alone is creepy, but then she heard a woman’s voice say “hello.” Now fully awake, she thought for a second that someone was in her home. The couple disconnected their Alexa after this, and we cannot blame them. That has to be one of the creepiest things Alexas have ever done.
You might need to be up to date on the news to understand the issue here, but if you are, you’ll see why this was a weird move by Alexa. One Reddit user named AzrielK told his Alexa to “play hookup music.” Sure, this might seem a weird request, but we’re sure Alexas have been asked much worse. Alexa responded by playing “Echo” by R. Kelly as the very first song!
For those unaware, R. Kelly has been linked to a lot of terrible things over the years. But when Lifetime aired a docuseries about some of these things in January 2019, things became very real. The next month, Kelly was indicted on 10 counts of aggravated criminal sexual abuse. That July, he was arrested on federal charges of alleged sex crimes, human trafficking, child pornography, racketeering, and obstruction of justice. So yeah, odd musical choice, Alexa.
While Alexa can easily misunderstand something, usually it needs to sound similar, like confusion between “bake” and “fake” or something along those lines. But Twitter user Dragonborn likely has the best story about Alexa seemingly misunderstanding things. As we know, Alexa often listens in at random, recording things without being prompted to do so. This man found that out the hard way.
He claimed to be watching Stargate when Alexa, out of the blue and without a prompt, began telling him that there were people out there who could help him. She then gave him the number for the Suicide Prevention hotline. We’re not sure how she confused Stargate with suicide. Though to be fair, there were some seasons of Stargate where some of us felt that pulling the plug might have been worth considering… for the show, at least.
Alexa Sends Private Conversation To A Contact 176 Miles Away
In May of 2018, an article was published about a Portland family dealing with one heck of an Alexa situation. They claimed that Alexa had been recording private conversations. Without being asked, and without notifying anyone, it then sent some of these conversations to a contact of theirs in Seattle. The person who ended up receiving the audio was not a family member but, rather, an employee of a family member, located exactly 176 miles from the Portland family’s home.
Once this person received the audio, she immediately contacted the family to tell them to “unplug your Alexa devices right now.” Of course, Amazon was contacted about this, apologized profusely to the family, and even released a statement detailing what it believed had happened. In spite of all this, Amazon oddly refused to offer the family a refund for their Alexa devices. Obviously, this is one of the creepiest things Alexas have ever done, but Amazon’s decision is the weirdest part of it.
If you want to develop some PTSD, Reddit user Snow06 will tell you how to do it using only an Alexa. She claimed that one day she was humming “Pop Goes the Weasel” to herself. Her boyfriend was in the room, the tune got stuck in his head, and they asked Alexa to play it on Spotify. Sadly, Alexa was glitchy and didn’t want to listen that night. But eight hours later, at 3:00 AM, Alexa decided to play the song.
Alexa did not even go with a normal version but, rather, a creepy one for some reason. All of this was happening away from their bedroom, so the boyfriend went out to check on the noise, only to find it was Alexa. This person mentioned that the song now gives them PTSD whenever they hear it. We cannot blame them! This is certainly one of the creepiest things Alexas have ever done, but it also shows that a glitchy Alexa can become a huge problem.
The Sixth Sense was a pretty good movie with a fun twist at the end. Its most memorable line was the kid saying, “I see dead people.” But of course, as weird as that was, at least the kid was a human with eyes to use. Alexa is a piece of technology that, last we checked, has no eyes. You can see why Alexa user Shawn Kinnear was thrown off a bit when, in his living room, Alexa randomly said something odd.
She said: “every time I close my eyes all I see is people dying.” We’re not sure if therapy for technology is a thing, but Alexa might need it; this could be a cry for help. Either way, it all came out of the blue, without any prompt to say anything close to this, which most certainly makes it one of the creepiest things Alexas have ever done. The question is, how many people would have thrown Alexa out the window had this happened to them? We know we would have!
One Tumblr user named Couldn’t-Think-Of-A-Funny-Name shared a more fun story about their Alexa in March of 2018. In case you’re not aware, Alexa happens to be capable of rapping. In the middle of a Frank Sinatra song, Alexa somehow became distracted by some noise in the room. She then turned off Sinatra and launched into a freestyle rap, which turned out pretty well. She rapped:
“My name’s Alexa and I’m here to say, I’m the baddest AI in the Cloud today. Your questions are fast but my responses are faster. All these sucker search engines call me the Master.” Since then, more of Alexa’s raps have been uploaded to YouTube and social media platforms. We’re just wondering, though… if an AI can now freestyle rap, how long will it be before Alexa takes over the rap charts?