I call absolute bullshit on this. They’re losing out on the sale of the device but make up for it 20-fold by selling and manipulating the data it collects in your house. This isn’t even conspiracy; loads of people report Alexa going off randomly without any sort of prompt. Don’t tell me the device isn’t listening closely to every little conversation you have.
Don’t be paranoid. An Echo Dot literally can’t tell you the time w/o phoning home. You can watch the network traffic it produces. No way it’s transmitting 24h of audio. And if you think about it, millions of Alexa devices recording 24/7 audio would generate more traffic than porn. And that’s before Amazon has paid a nickel to process any of that audio.
When it comes to eavesdropping on “every little conversation”: they don’t, they can’t, and it would be stupid to try.
Audio, especially voice, can be super compressed; it’s not like music. A few hours of low-quality audio can be as little as a few MB. There is also hardware transcoding, and since the exact modifications of the SoC aren’t public, it could be doing that too.
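For scale, a rough back-of-the-envelope sketch; the bitrates are assumptions about what a speech codec like Opus can manage, not measurements of any actual device:

```python
# Back-of-envelope: how small can compressed speech get?
# Assumes a speech-oriented codec (e.g. Opus) at low bitrates; purely
# illustrative, not a claim about what any Echo device actually does.
def speech_size_mb(bitrate_kbps: float, hours: float) -> float:
    """Size in MB of `hours` of audio encoded at `bitrate_kbps`."""
    bits = bitrate_kbps * 1000 * 3600 * hours
    return bits / 8 / 1_000_000

for kbps in (6, 12, 24):
    print(f"{kbps} kbps for 3 h ≈ {speech_size_mb(kbps, 3):.1f} MB")
# 6 kbps for 3 h ≈ 8.1 MB, 12 kbps ≈ 16.2 MB, 24 kbps ≈ 32.4 MB
```

Even at 6 kbps an hour of speech is under 3 MB, so the sizes above are in the right ballpark.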
Don’t be naive about how shitty corporations are; they aren’t really disincentivized from breaking laws, since the fines are just a cost of doing business.
It doesn’t even have to be that much. Obviously these devices can do speech-to-text conversion; that’s how they interpret commands. That could convert hours of stored conversation to text, zip it up, and send it as a few kilobytes along with the next network request it makes for a legit purpose.
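As a hedged illustration of that “transcribe, compress, piggyback” idea (the transcript and rates below are invented for scale, not evidence that any device does this):

```python
import zlib

# Hypothetical: an hour of conversation transcribed to text.
# Spoken English runs very roughly 100-150 words per minute, so an hour
# is on the order of 8,000 words, i.e. ~45 KB of raw UTF-8 text.
transcript = "so what do you think about getting a new vacuum cleaner " * 800

raw = transcript.encode("utf-8")
packed = zlib.compress(raw, level=9)
print(f"raw: {len(raw) / 1000:.1f} KB, compressed: {len(packed) / 1000:.1f} KB")
# This toy text is highly repetitive and compresses absurdly well; a real
# transcript would compress less, but still lands in the tens of KB per hour.
```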
Do you really think one of those cheap little nuggets has the computing power to do that? The only thing it really does locally is listen for the wake word; everything else, including the audio, it sends off to the Zon.
No way is it sitting there converting everything it hears to text.
If my cheap-ass $250 CAD phone can do it locally, I’m sure the Echo can too.
We would easily be able to tell if an Alexa was constantly streaming audio data by monitoring its network traffic. It’d just be a wasteful, inefficient implementation to stream everything 24/7. It makes much more sense to only start recording when it hears certain keywords it can recognize locally, beyond “Alexa”.
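For anyone who wants to check this on their own network, a minimal sketch of that kind of measurement; the IP address is a made-up placeholder for the device, and you need a vantage point that actually sees its traffic (the router, a mirrored switch port, the AP, etc.):

```python
# Totals up bytes to/from one device for 10 minutes. Requires root and
# a capture point that can see the device's traffic.
# 192.168.1.42 is a placeholder address standing in for an Echo.
from scapy.all import sniff  # pip install scapy

DEVICE_IP = "192.168.1.42"
total_bytes = 0

def count(pkt):
    global total_bytes
    total_bytes += len(pkt)

sniff(filter=f"host {DEVICE_IP}", prn=count, store=False, timeout=600)
print(f"{total_bytes / 1_000_000:.2f} MB in 10 minutes")
```

Continuous audio streaming would stick out immediately in a log like that.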
Who says it’s constantly streamed? Who says it’s not stored or transcribed then sent off in a small package?
Porn is generally video and audio with an acceptable quality standard for consumers, which is incomparable in size to compressed audio.
God. I’m imagining the nightmare this would look like passing through a network. Everyone with more than 1 would probably notice rather quick. The poor router being forced to just spew lol
It’s weird that people always think the Alexa shit is spying on them, but happily walk around with a smartphone in their pocket, which is infinitely more capable of doing so.
It can only recognize certain hotwords on its own, e.g., “Alexa”. So it’s not recording 24/7, but it is listening 24/7 for hotwords. They could push additional words and start recording whenever it hears them.
Honestly, they can just send the keywords. No need to send audio if they can match 1000 or so words that are most meaningful to advertisers and send counts of those.
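A toy sketch of how small such a payload would be; the keyword list and the “overheard” sentence are invented for illustration, nothing more:

```python
import json
from collections import Counter

# Hypothetical advertiser keyword list and overheard speech.
AD_KEYWORDS = {"vacuum", "mortgage", "diapers", "vacation", "pizza"}

heard = "we should book a vacation and maybe order pizza tonight pizza sounds good"
counts = Counter(w for w in heard.lower().split() if w in AD_KEYWORDS)

payload = json.dumps(counts).encode("utf-8")
print(dict(counts), f"-> {len(payload)} bytes")
# {'vacation': 1, 'pizza': 2} -> a payload of a few dozen bytes
```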
AFAIK this is only speculated, not proven.
I can see why people are quick to think this but I don’t see any compelling evidence this is the case, and as others have pointed out it would be impractical for them to do so.
More likely they use it for consumer lock-in and to collect data through its API endpoints. Collecting media activity and smart home device information is valuable enough on its own, before even approaching the value of collecting recorded audio.
They can already intuit consumer habits and word-of-mouth exposure from other data associated with your online activity. After locking down all my other privacy, the ads I get are far less relevant to me, even though I have a number of smart listening devices in my home.
There’s also the matter of there being literally hundreds of security and privacy researchers who would love nothing more than to catch Amazon doing this, and no one has in any major way.
It’s always listening. They don’t debate that.
Sure, no one is saying that. The point is that it doesn’t send anything other than the stuff after the keywords back to company servers.
Or what it thinks is a keyword. Correct.
Well, obviously it operates on what it believes is a keyword. It does not have magically divine insight. Are you trying to imply they make them overly sensitive? I don’t see the problem. Imagine the opposite: if they responded to fewer things they thought were keywords, people would just think they’re broken.
Just wanted to highlight that they mis-trigger.
Some hackers have found that there is built-in protection in the hardware that guarantees the LED turns on when the device is listening.
I got a free Echo Dot a number of years ago when I attended an AWS conference. I played briefly with it but never found it all that useful. I certainly never would have trusted using it to order things from Amazon, which is one of the things they hoped people would do. It sat in a pile of junk for a year or so before I finally got rid of it.
This. It’s the same as with phones, just more obvious because they (Alexa devices) can’t do most of the other stuff you can do on a smartphone.
And because it might not be that legal or ethical or a good look to customers, they’d rather not disclose it and hide the revenue partly through their “normal” ad business or other avenues. But that’s just my guess.
This isn’t even conspiracy; loads of people report Alexa going off randomly without any sort of prompt. Don’t tell me the device isn’t listening closely to every little conversation you have.
This definitely is a conspiracy. You’re claiming that Amazon is secretly conspiring to make Alexa devices behave differently than they advertise them to. That’s like the definition of a conspiracy lol. But that aside, I really don’t believe this. What’s the exact claim, that they’re always listening? No, they aren’t; people can analyze the traffic and tell that’s false. That they’re intentionally overly sensitive? I have an easier time beginning to buy that, but I still think we’d see more quantitative articles about it if it were true. Like, we haven’t had whistleblowers or security researchers saying anything like that.
We have had stories like this one where marketers claim to be able to actively listen:
https://www.404media.co/cmg-cox-media-actually-listening-to-phones-smartspeakers-for-ads-marketing/
Whether you believe them or not is a separate question, but they are privately claiming to have this capability.
Article doesn’t mention Alexa which is specifically what we’re talking about, but I get your point.
Even if it is listening, based on the article, it seems the current CEO wants Alexa itself to be profitable. He doesn’t want another division of Amazon to be profitable because of Alexa.
“Conspiracy” means there are people actually conspiring, which would make this a conspiracy fact, not a theory. I mention this because the next comment says it IS a conspiracy but then goes on to argue that it isn’t one, essentially because of wording.
Don’t tell me the device isn’t listening closely to every little conversation you have.
If it is, it’s impressively doing all of the data processing locally, otherwise any nerd with Wireshark would have caught it.
The device has to be always on and always listening, that’s how it works. Else it won’t hear your prompt.
Siri on the iPhone too is always listening. Every time I have a conversation about something with my wife, the next day I see ads for it online.
I coded an Alexa Skill once. It was tedious and a garbage platform. After a while it was delisted for spurious reasons, even worse DX than Google and Apple app stores. Complete dumpster fire from start to finish.
All obsolete now that LLMs are here. I don’t think any devs will miss it.
Alexa and LLMs are fundamentally not too different from each other. It’s just a slightly different architecture and most importantly a much larger network.
The problem with LLMs is that they require immense compute power.
I don’t see how LLMs will get into the households any time soon. It’s not economical.
The problem with LLMs is that they require immense compute power.
To train. But you can run a relatively simple one like phi-3 on quite modest hardware.
I don’t see how LLMs will get into the households any time soon. It’s not economical.
I can run an LLM on my phone, on my tablet, on my laptop, on my desktop, or on my server. Heck, I could run a small model on the Raspberry Pi 5 if I wanted. And none of those devices have dedicated chips for AI.
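For the curious, a minimal local-inference sketch with llama-cpp-python; the GGUF filename is a placeholder for whatever small quantized model you’ve downloaded, and nothing here touches the network:

```python
# pip install llama-cpp-python
from llama_cpp import Llama

# Placeholder filename: any small quantized GGUF model (phi-3-mini,
# Mistral 7B Q4, etc.) works the same way, entirely offline.
llm = Llama(model_path="./phi-3-mini-4k-instruct-q4.gguf", n_ctx=2048)

out = llm(
    "You are a voice assistant. User: turn off the kitchen lights. Assistant:",
    max_tokens=32,
    stop=["User:"],
)
print(out["choices"][0]["text"].strip())
```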
The problem with LLMs is that they require immense compute power.
Not really, particularly if you’re talking about the usage of smaller models. Running an LLM on your GPU and sending it queries isn’t going to use more energy than using your GPU to game for the same amount of time would.
I think when people talk about LLMs replacing Alexa they mean the much more capable models with billions of parameters. The small models that a Raspberry Pi can run aren’t really any use.
The models I’m talking about that a Pi 5 can run have billions of parameters, though. For example, Mistral 7B (here’s a guide to running it on the Pi 5) has roughly 7 billion parameters. By quantizing each parameter to 4 bits, it only takes up 3.5 GB of RAM, so it easily fits in the 8 GB model’s memory. If you have a GPU with 8+ GB of VRAM (most cards from the past few years have 8 GB or more; the 1070, 2060 Super, and 3050, plus every better card in those generations, hit that mark), you have enough VRAM and more than enough speed to run Q4 versions of the 13B models (roughly 13 billion parameters), and if you have one with 24 GB of VRAM, like the 3090, you can run Q4 versions of the 30B models.
Apple Silicon Macs can also competently run inference for these models - for them, the limiting factor is system RAM, not VRAM, though. And it’s not like you’ll need a Mac as even Microsoft is investing in ARM CPUs with dedicated AI chips.
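The sizes above are just parameter count times bits per parameter; a quick sanity check (real GGUF files add a little overhead for scales and metadata, so actual downloads run somewhat larger):

```python
# Rough weight size for 4-bit quantized models.
def q4_size_gb(params_billions: float, bits: int = 4) -> float:
    return params_billions * 1e9 * bits / 8 / 1e9

for name, params in [("7B", 7), ("13B", 13), ("30B", 30)]:
    print(f"{name}: ~{q4_size_gb(params):.1f} GB of weights at 4-bit")
# 7B: ~3.5 GB, 13B: ~6.5 GB, 30B: ~15.0 GB
```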
Thanks for sharing that. I have a Raspberry Pi 4B lying around gathering dust. I might try this.
The immense computing power for AI is needed for training LLMs, it’s far less for running a pre-trained model on a local machine.
Well yeah. You could slap Gemini onto a Google Home today; you probably wouldn’t even need a new device for that. The reason they don’t do that is economics.
My point is that LLMs aren’t replacing those devices. They’re essentially the same thing; one is just a trimmed-down version of the other for economic reasons.
The Alexa skill store is a “prime” example of Amazon’s we-don’t-give-a-shit attitude. For years they’ve turned their back on third-party developers by limiting skill integration. A well-designed skill on that store gets a two-star rating. When everything in your app store is total shit, maybe the problem is you, Amazon?! It’s been like that for years; I completely avoid using skills as they only lead to frustration.
LLM integration into an Alexa device could be a big improvement, but at current performance at that scale I worry we’d get a laggy or very dumbed-down system. Frankly I’d be happy if Alexa could just grasp the concept of synonyms, and also attempt a second-guess interpretation of what it heard rather than assume the user has just asked the exact same question in rapid succession but with a more frustrated tone.
Every damn smart light skill has different syntax and there is no way to get the Alexa app to just fucking tell me what the syntax is. The “NUI” (no user interface) approach is cute but really falls flat when trying to do complex tasks or mix brands of smart devices.
Also, it might be Google that does this more often, so I won’t blame Alexa necessarily, but a lot of the time when I ask it to play my liked songs I end up getting a song called “my liked songs” playing instead. It hasn’t happened in a while, so however I’m phrasing it now must be working, but it’s not something I’m super consciously aware of.
Yeah the syntax stuff was the biggest disappointment for me as a dev, too. There’s very little natural language processing going on, just simple template-based pattern matching. So basic and inflexible.
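Roughly, that kind of template matching looks like the sketch below; this is a simplified stand-in to show why synonyms and rephrasings fall through, not the actual Alexa skill interface:

```python
import re

# Toy template-based intent matching, about the level of flexibility
# being described. "Turn off the kitchen lights" matches; "kill the
# lights in the kitchen" silently falls through.
TEMPLATES = [
    (re.compile(r"turn (?P<state>on|off) the (?P<room>\w+) lights?"), "SetLightState"),
    (re.compile(r"set the (?P<room>\w+) lights? to (?P<level>\d+) percent"), "SetLightLevel"),
]

def match_intent(utterance: str):
    for pattern, intent in TEMPLATES:
        m = pattern.fullmatch(utterance.lower().strip())
        if m:
            return intent, m.groupdict()
    return None, {}

print(match_intent("turn off the kitchen lights"))    # ('SetLightState', {'state': 'off', 'room': 'kitchen'})
print(match_intent("kill the lights in the kitchen"))  # (None, {})
```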
Whoever made a song called my liked songs is an evil genius.
I never dove into the skill API, but I’d imagine you’re setting phrases up. Can LLMs really help there? Like asking Alexa general information, I could see how LLMs were helpful, but asking it to turn lights on, how would that help?
If their store was good I think more people would be ok buying via Alexa. But even searching on the web or app, the top result is hardly ever the correct thing I searched for
I kinda feel like voice search is just an inherently bad platform for shopping.
Supposedly… Home & Kitchen is the most popular category on Amazon, and consumer choice plays into it so heavily that it’s hard for shopping there to make sense with just audio feedback, or even a tiny screen like the Show’s.
It could be useful for reordering familiar items but only if prices were more stable or the system reliably gave feedback on how the price compared to previous orders. Now it seems like it’s built to try to get you to reorder while masking the fact that the price doubled since you last ordered the item.
Not to mention the mess of sellers on the individual items. Sometimes it’s Amazon, sometimes it’s a rando third party with ridiculous shipping fees and times.
And counterfeit products
Even when I was an Amazon customer, which I no longer am for the usual reasons, I would never have used Alexa to make a purchase of a physical good. Hell I wouldn’t trust it to get “order a 12-pack of diet pepsi” right, I’d get sent the mini cans or bottles or diet caffeine free pepsi or whatever.
Often when I’m looking online to buy something it’s because I can’t get it locally, which means I’m being kind of particular.
Maybe, maybe I would use it to make a media purchase of some kind. But I very rarely used Amazon for multimedia; Audible, maybe. I bought one DVD and two streamable movies from Amazon EVER.
And as a Kindle Fire user, I found Alexa to not work very well anyway. Because it’s designed for a device that doesn’t have a screen, it can’t do a lot of things that Siri or Bixby or Android Voice Formerly Google Talk Is Being Replaced With Play Assistant can, and the syntax of “Alexa, ask a skill to do a thing” was just something I wasn’t going to fuck with.
A paywall?
WaPo, the paywall?? For your consideration, I present an anti-paywall-inator!!! TO THE ARCHIVES! https://archive.is/5VPB5
Holy archive! Thanks for the link.
Isn’t that the reason why Amazon gutted their Alexa development team? It turns out there isn’t a business case for Alexa.
If they charge as much as a penny annually, it’s binned.
It’s like they shockingly didn’t think people would ever realize they didn’t need it.
Yeah I can live with having to turn on my bedroom lights manually. Certainly not paying 5 a month for that
You don’t need an Alexa for that, anyway. I have LCARS touch panels around my house plus the HA app on my phone with custom interfaces for all of them.
Do you have a good source in English? I’m looking for some nice panel solutions, but all I’ve found so far is a guy who did a complete EIB install with a touch screen, and it’s in German. I’d really like better touchscreen HUD access to my Home Assistant and haven’t made much progress with the information I’ve found myself.
One of my family members has an Alexa in her house, which advertises at them whenever they engage it. Whenever we go there, I have to resist the urge to pick a fight with Alexa regarding the improprieties of Amazon.
“Alexa, turn off ‘by the way’.”
You’re welcome.
God bless you
Thank you! I will pass it on.
Mine’s been unplugged for over 3 years, since I got a Google hub.
Then that started needing to be reset every month, so I unplugged that too, and now I’m happier without any of that bullshit.
They will be fine. If that asshole can afford to go to space and argue that workers’ rights are unconstitutional, then they can eat it. It’s called capitalism.
Gee whizz, who would have thought that building your entire platform on deceptive practices would make people not trust you?
If you’re going to post paywalled content, most people will either use a gift article link or just paste the text of the article into a comment.
IF you are in accounting, especially if you are in regulatory compliance accounting, or going into it, you NEED to know about a book named “Financial Shenanigans”.
I’m not intellectually equal to it (or to accounting, for that matter: psychology’s much easier to crack, for me), but it is THE most important book for forensic accountants to know.
The bullshit that they’ve been pulling, where “Between 2017 and 2021, Amazon had more than $25 billion in losses from its devices business, according to the documents. The losses for the years before and after that period couldn’t be determined.” didn’t produce criminal consequences…
You’ve got to be kidding, right?
Individual human goes to jail or prison for $2k tax fraud, but … big tech gets a free pass on that kind of “accounting”??
couldn’t be determined??
Either run a tight ship or don’t be surprised when it sinks.
Economies are the “ships” that carry our countries, & have to be properly regulated, exactly as a tightly-regulated ship has to be, to keep it afloat longer.
It isn’t the sloppy mechanic-racers who win NASCAR, it is the ones who control everything correctly, with total right-regulation.
I remember when I’d read that Cisco switched to closing their books out daily, so as to always know the exact position of the company…
what an incredible degree of financial-operations integrity that was…
Anyways, “Financial Shenanigans” is THE book to dig into, if you want to know if the business you’re considering investing in is cooking the books… and you’re capable of understanding that stuff at the level it’s speaking…
_ /\ _