Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec
In an interview with Bloomberg, Dave Limp said that he "absolutely" believes that Amazon will soon start charging a subscription fee for Alexa
“By the way, you can now pay for Alexa AI option if you want me to reply in a slightly smarter way, but I will still cut you off with ads and other useless things. To activate AlexaAI say activate”
"I heard 'activate'. Thank you! Your credit card will be charged $129 annually. To cancel, please log on to the website because there's no way we're letting you get out of this mess the same way we got you into it."
Siri was always shit but somehow managed to devolve even further lately. I never trusted her to do more than turning lights on or off, but now this shit happens:
Me: Siri, turn off the lights in the living room
Siri: OKAY, WHICH ROOM? BATHROOM, BEDROOM, KITCHEN, HALLWAY, LIVING ROOM?
We need to move AI from the cloud to our own hardware running in our homes. Free, open source, privacy focused hardware. It'll eventually be very affordable.
That's already here. Anyone can run AI chatbots similar to, but not as intelligent as, ChatGPT or Bard.
Llama.cpp and koboldcpp allow anyone to run models locally, even with only a CPU if there's no dedicated graphics card available (although more slowly). And there are numerous open source models available that can be trained for just about any task.
Hell, you can even run llama.cpp on Android phones.
This has all taken place in just the last year or so. In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.
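If you want to try it, the llama-cpp-python bindings get you going in a handful of lines; a minimal sketch (the model path is a placeholder, point it at whatever GGUF file you've downloaded):

```python
# Minimal local-inference sketch with llama-cpp-python (pip install llama-cpp-python).
# The model path below is a placeholder; use any GGUF model you've downloaded.
from llama_cpp import Llama

llm = Llama(model_path="./models/mistral-7b-instruct.Q4_K_M.gguf", n_ctx=2048)

out = llm(
    "Q: Name three uses for a local voice assistant. A:",
    max_tokens=64,
    stop=["Q:"],
)
print(out["choices"][0]["text"])
```

On a CPU it's slow but it works, and the same bindings pick up a GPU automatically if the library was built with one.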
I have that with just my phone, using Wiz lights and IFTTT. It's the only home automation I even have because it's the only one I found that doesn't necessarily need a special base station like an Alexa or Google Home.
It's the year of the voice for Home Assistant. Given their current trajectory, I'm hopeful they'll have a pretty darn good replacement for the most common use cases of Google Home/Alexa/Siri in another year. Setting timers, shopping list management, music streaming, doorbell/intercom management. If you're on the fence about a Nabu Casa subscription, pull the trigger as it helps them stay independent and not get bought out or destroyed by commercial interests.
I haven't yet played with the local voice stuff but have been following it with interest. Actually, now that Raspberry Pis are starting to become available again, I'm on the fence between buying a few more vs. finding something with a little more power, specifically for voice processing.
I do wonder how much of those voice assistants could run on-device. Most of what I use Bixby for (I know. I KNOW.) is setting timers. I think simple things like that can run entirely on the phone. It's got a shocking amount of processing in it.
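The timer case really is trivial to do offline; a toy sketch (purely hypothetical, not how Bixby or Siri actually parse it):

```python
# Toy illustration that a "set a timer" intent can be parsed fully offline.
# Hypothetical sketch, not how any real assistant does it.
import re

def parse_timer(utterance: str):
    """Return the requested timer length in seconds, or None if it's not a timer request."""
    m = re.search(r"set (?:a )?timer for (\d+) (second|minute|hour)s?", utterance.lower())
    if not m:
        return None
    value, unit = int(m.group(1)), m.group(2)
    return value * {"second": 1, "minute": 60, "hour": 3600}[unit]

print(parse_timer("Set a timer for 10 minutes"))  # 600
print(parse_timer("What's the weather?"))         # None
```

The hard part on-device is the speech-to-text, not the intent handling, and modern phones have plenty of compute for small speech models too.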
While you may have points against Apple and how effective Siri may be, with this latest version kind of products, even the watch has enough processing power to do voice processing on device. No ads. No cloud services
Pretty much. If you want a voice assistant right now, Siri is probably the best in terms of privacy. I bought a bunch of echos early, then they got a little shitty but I was in, and now I just want them out of my house except for one thing - music. Spotify integration makes for easy multi-room audio in a way that doesn't really work as well on the other platform that I'll consider (Apple/Siri) and basically adds sonos-like functionality for a tiny fraction of the price. The Siri balls and airplay are just not as good, and of course, don't work as well with Spotify.
But alexa is so fucking annoying that at this point I mostly just carry my phone (iPhone) and talk to that even though it's a little less convenient because I'm really goddamned tired of hearing "by the way..."
I'm not going to buy into a subscription model for something I've already paid for. This subscription model crap is complete bullshit.
We even tried to do it with heated seats recently. Like install heated seats in your car, but disable them in software. It's crazy that companies think they can get away with this.
Alexa is more like a telemarketer disguised as an assistant. Every interaction is followed by a "by the way...". It's a shit experience, so I stopped using mine.
Alexa was designed explicitly for that purpose. They lose money on every Echo sold, the whole idea was they would make money selling you stuff. Turns out people would rather use their Echo to check the weather, get recipes, etc. rather than voice shop.
I just can’t see a use case for voice shopping. There are almost zero instances where I want to buy something without having a visual of that thing in front of me at time of purchase.
I could possibly see something like “buy another stick of deodorant”, but even then I want to see if there are deals or some other options and would want to check the price at a minimum.
Ha, I use mine almost exclusively as a light switch. I don't have to get out of bed to turn off my lights or turn on my fan. I'm sure they're losing a bunch of money on me
Setting all my Alexas to UK English got rid of all the marketing "by the ways." I still regret going with the Alexa ecosystem, but at least for now there is a workaround for the most rage-inducing part of it.
By the way, did you know that you can find out more about telemarketing with an audio book from audible on the subject. Would you like to hear a preview of that now?
I don't know about that. They never delivered on Smart Home promises and the only truly useful thing my Google AI does is to give me the forecast. Otherwise it's just a wifi speaker.
If they finally integrate Bard, I would actually consider paying for the service.
I think the data is probably less valuable than people think, especially if the users expect an AI response whenever a data point can be collected from them.
Alexa has a feature where you tell it you're leaving the house and it will listen for smoke detectors or breaking glass, alerting you through your phone if it detects something. Amazon is putting that behind a paywall next year.
Amazon has bet big on AI, with the company unveiling a new, AI-powered version of Alexa alongside updated versions of its Echo Frames and Carrera smart glasses last week.
Good luck, I guess? Got the first Google home, at first it was great, I was asking it tons of questions. Then the questions stopped, used it for turning on the lights and other automations. Then I installed Home Assistant and the only command Google Home got was to set a timer to know when to pull things out of the oven. Eventually I stopped doing that.
At the moment all Google/Nest Homes have their mic cut off, I only use them to stream music in my house from my NAS via Plex. So yeah..
I'll happily join in bashing Amazon for plenty of reasons, but training an AI is a step that adds significant value to the material. Just like any other product that is an effort that people are willing to pay for.
They had those "re-order product" physical buttons for a while which you were supposed to glue to your washing machine so you could reorder when your detergent ran out.
Besides legal issues (at least over here all they could do is put things in your shopping cart) apparently the primary customers of those buttons were hardware hackers, turning them into all kinds of stuff.
This is the killer for all this shit right now as far as I'm concerned. All of it lives squarely in "huh...neat" territory. I have yet to see anything I felt was truly necessary. Until that happens, paying is a non starter for me.
This is why I'm so confused by Amazon's approach. I know they've already sunk millions if not billions of dollars into this, so why has the user experience not improved in the last 8 years?
I'm not going to buy things with my voice when just getting the lights to turn off or music to play can be an infuriating endeavor. Speech recognition has stagnated.
The third party integrations are just so clunky too. They could have made money by selling licenses to businesses in order to access the service, but again, they haven't improved that experience at all.
The "Alexa, let me talk to dominos." or "Alexa, ask LG to turn off the TV" is just stupidly cumbersome. Why can't you set up preferred providers? I don't have to say "ask Spotify to play music" I just say "play music", so we know it's possible. It would be trivial to implement other preferred service providers compared to the overall scale of Alexa.
In a real free market, all the banks that destroyed the economy through fraud wouldn't have gotten bailouts, they would've had to "pull themselves up by their bootstraps" like everyone else had to.
All I want Alexa to do is turn my lights on and off, set timers, and show me my own pictures. And it can BARELY do that without fucking it up. Everyone I know wants the same; they expect nothing more from it. "AI" features of Alexa aren't needed or wanted by anyone I've talked to about it.
Same. I've already got an entire setup built around GPT with customizable system-level prompting, and it uses custom voice models I've trained over at ElevenLabs.
Now I just gotta slap my lil monsters phat ass into a raspberry pi and then destroy the fuck out of my Alexa devices and ship em to Jeff bozo
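If anyone's curious, the GPT side of a setup like that is just a few lines with the openai Python package; a rough sketch (the model name and system prompt here are made up):

```python
# Rough sketch of the "custom system prompt" half of a setup like this, using the
# openai Python package (pip install openai). Model name and prompt are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

reply = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are my home assistant. Be terse. No upsells."},
        {"role": "user", "content": "Set the living room lights to 40%."},
    ],
)
print(reply.choices[0].message.content)
```

The text-to-speech half is just piping that reply to whatever voice API you're using.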
You just typed that question on one. See: GPT4All
You can download many models and run them locally. They were about 5-16GB in size the last time I downloaded one. Pretty slow if you don't have a hefty GPU, but it works!
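For the curious, the gpt4all Python package makes this a few lines; a rough sketch (the model name is only an example, the library fetches it locally on first use):

```python
# Rough sketch using the gpt4all Python package (pip install gpt4all).
# The model name is only an example; the library downloads it locally on first use.
from gpt4all import GPT4All

model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")  # runs on CPU by default
with model.chat_session():
    print(model.generate("Why would someone run an LLM locally?", max_tokens=128))
```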
My Google Homes have gotten progressively worse over the years. Half the time it will say it's setting a timer but nope, no timer. Recently I'll tell it to play music and it will reply that I don't have any devices with that feature... they're all Google Homes or Chromecasts, which absolutely can play music. Really like the hardware, but the software is utter shit.
They also removed the ability to link third-party list applications so now saying "Add X to shopping list" just sends it into some nether realm where the item is never to be seen ever again.
I hate when it's playing music and I tell it to shut the fuck up, then it decides to turn off every actively going alarm in the house instead of turning off the goddamn music playing on the one it literally responded in. This happens most mornings.
Running AI may be currently expensive, but the hardware will continue to improve and get cheaper. If they institute a subscription fee and people actually pay for it, they'll never remove that fee even after it becomes super cheap to run.
That is sort of the issue when mixing good conscience with capitalism. Either goods are valued at what we're willing to pay, or they're valued at what we think the profit margin of the business should be; mixing the two ultimately leads us to fall for PR crap. Businesses are quick to gather sympathy when margins are low, and we fall for it, but as soon as they own a chunk of the market it turns into raising the price as much as they possibly can.
That being said, Amazon became what it is because Bezos was hell bent on not rug pulling customers, at least in the early years, so it is possible they would decrease prices eventually to gain market advantage, that's their whole strategy.
but the hardware will continue to improve and get cheaper.
Eh. I mean, sure, the likes of A100s will invariably get cheaper because they're overpriced AF, but there isn't really that much engineering going into those things hardware-wise: accelerating massive chains of FMAs is a much smaller challenge than designing a CPU or GPU. Meanwhile Moore's law is -- well, maybe not dead, but a zombie. In the past, advances in manufacturing meant lower price per transistor; that hasn't been true for a while now, and the physics of everything aren't exactly getting easier. They're now battling quantum uncertainty in the lithography process itself.
Where there might still be significant headway to be made is by switching to analogue, but, eeeh. Reliability. Neural networks are rather robust against small perturbations, but it's not like digital systems can't make use of that by reducing precision, and controlling precision is way harder in analogue. Everything is harder there; it's an arcane art.
tl;dr: Don't expect large leaps, especially not multiple. This isn't a noughties "buy a PC twice as fast at half the price two years later" kind of situation; AI accelerators are silicon like any other, and they already make use of the progress we made back then.
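To make the precision point concrete, here's a toy numpy sketch (illustrative only) of how little information int8 quantization throws away:

```python
# Toy numpy illustration of the reduced-precision point: int8 quantization only
# perturbs the weights slightly, which is why accelerators lean on low precision.
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal((256, 256)).astype(np.float32)

scale = np.abs(w).max() / 127.0
w_int8 = np.round(w / scale).astype(np.int8)   # quantize to 8-bit integers
w_back = w_int8.astype(np.float32) * scale     # dequantize

print("max abs error:", np.abs(w - w_back).max())  # small relative to the weights
```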
As someone at a company that's still using free AI credits in its commercial products and hasn't figured out how it's going to price the shit when the credits are up... this AI market looks a lot like the Uber subsidies.
Like a year or two from now, probably any AI stuff that isn't self-hosted is going to be 100% inaccessible to normal people due to cost. It's just a question of how hard they're going to fight to keep the currently free-to-download LLM models off the internet once this happens.
We're seeing this all over the tech and tech adjacent space. Can't grow forever at a loss, especially not with increased interest rates and a potential economic downturn.
My guess: if you want decent services, we're going to end up needing to pick a few (or a suite of the basics) to pay for on a monthly basis and cut out all the "free" stuff that is getting or will get enshittified.
In my eyes, they put themselves in an awkward position by garnering a reputation for always collecting more user data than justified, and at this point I assume they do the same with paid products, since it's an industry norm. However, I'm not OK with that and will never pay when the product doesn't respect privacy. The saying used to be "if you don't pay, you're the product", but it is increasingly shifting to: you're the product, and you also have to pay so that our shareholders can experience more infinite growth.
Well that's exactly what I was thinking when these companies were making these claims... like HOW could they possibly handle this locally on a CPU or GPU when there must be a massive database that (I assume) is constantly being updated? Didn't make sense.
EDIT: this entire website can go fuck off. You ask a simple question about some reasonably new tech, and you get downvoted for having the audacity to learn some new stuff. People on here are downright pathetic.
You can record and edit videos on your own devices, but that doesn't mean it's suddenly free for Netflix or YouTube to stream their videos to you.
Surely a local version of Alexa could be developed, but that development would come with its own costs.
Some things simply can't be done locally, such as a web search. Often your route calculations for a map application are also done in the cloud.
They know charging for total access will cause a riot, so instead they're enshitifying the whole experience and holding access to the current non-shit experience hostage with monthly fees.
At this point, with so many tech giants introducing ads to their services and increasing subscription prices, I think we can expect some kind of subscription fee to access assistants with AI/LLM capability. It would make sense to offer a 'basic' version of these services for free since people have already invested in the hardware, but I wouldn't be surprised if these companies suddenly block us from using the smart functionality unless we pay.
The emerging generation of "superhuman" AI models are so expensive to run that Amazon might charge you to use its Alexa assistant one day.
In an interview with Bloomberg, outgoing Amazon executive Dave Limp said that he "absolutely" believes that Amazon could start charging a subscription fee for Alexa, and pointed to the cost of training and running generative artificial intelligence models for the smart speaker's new AI features as the reason why.
Limp said that the company had not discussed what price it would charge for the subscription, adding that "the Alexa you know and love today is going to remain free" but that a future subscription-based version is "not years away."
Generative AI models require huge amounts of computing power, with analysts estimating that OpenAI's ChatGPT costs $700,000 a day or more to run.
Limp, Amazon's senior VP of devices and services, announced a month before the launch of the new products that he would step down from his role after 13 years at the company.
Insider's Ashley Stewart reported that former Microsoft exec Panos Panay is expected to replace Limp.
The original article contains 298 words, the summary contains 182 words. Saved 39%. I'm a bot and I'm open source!
I'm actually surprised companies haven't tried to charge for voice assistants already, considering pretty much everything you say to them gets sent to some service somewhere.
Yeah... The moment they do that is the moment I turn off and disconnect every smart speaker I own, take them to the electronics recycling place, and start building out an open-source smart home setup.
I use commercial smart speakers because they're easy and cheap. The moment they stop being one or the other is the moment they stop being in my home.
and start building out an open-source smart home setup.
Why are you waiting? Home Assistant is here now and it works pretty damn well for running a smart home.
At this point all Alexa does is act as a voice enabler for my HA setup, and even that will be going away soon. They're working hard on a localized voice assistant of their own. It's actually usable NOW; it just doesn't have wake-word support yet, so you have to PTT (push to talk) on something for HA to know you want to talk to it.
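If you want to roll your own push-to-talk front end, HA's REST conversation endpoint is enough; a rough sketch (the URL and token are placeholders for your own instance):

```python
# Rough sketch: Home Assistant's REST conversation endpoint lets you send text
# to Assist yourself (your own push-to-talk button, a phone shortcut, etc.).
# URL and token below are placeholders for your own instance.
import requests

HA_URL = "http://homeassistant.local:8123"
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

resp = requests.post(
    f"{HA_URL}/api/conversation/process",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"text": "turn off the living room lights", "language": "en"},
    timeout=10,
)
print(resp.json())
```

Pair that with any local speech-to-text you like and you've basically replaced the Echo's core use case.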
I shut mine off a while back when I was sure they were advertising stuff based on things we were talking about in the same room. We were discussing moving the chairs out of the office and the next time we went to play music she wanted to sell us new ones.
They could be handing those out for free and they'd still rot on the shelves. People just don't know what to do with them and I am not certain Amazon does.