Did you know that one simple tip for turning your lights on is to use a smart home device or smart light bulbs? You can control them with your smartphone or voice commands through a virtual assistant, making it easy to turn your lights on and off without needing to physically flip a switch!
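(For the record, the "control them with your smartphone" part the bot is padding out is usually just an app firing a request at the bulb or its vendor's cloud. A minimal sketch, assuming a hypothetical bulb with a local HTTP endpoint — the address, path, and payload below are made up for illustration, and real bulbs each have their own APIs:)

```python
# Hypothetical example: toggling a smart bulb over a local HTTP API.
# The IP address, the /state path, and the JSON payload are invented
# for illustration; real bulbs (Hue, Kasa, etc.) each do this differently.
import requests

BULB_URL = "http://192.168.1.50/state"  # assumed local endpoint for the bulb


def set_light(on: bool) -> None:
    """Send the desired on/off state to the bulb and fail loudly if it didn't take."""
    resp = requests.put(BULB_URL, json={"on": on}, timeout=2)
    resp.raise_for_status()


if __name__ == "__main__":
    set_light(True)  # "turn my lights on", minus the monologue
```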
Amazon hoped the unit would one day turn a profit by getting users to spend more money, somehow.
anybody that's used an alexa for more than about 5 minutes should have noticed that pretty much everything it can do that's more complicated than setting a timer to tell you when your chicken nuggets are done is designed to push you into the amazon services ecosystem
Amazon is releasing a new backend for its Alexa home assistant in October, to be called “Remarkable!” This will run on Anthropic’s Claude LLM chatbot. The new Alexa service will only cost you $5 to $10 a month — and not the present price of $0. [Reuters, archive]
oh this is going to go remarkably poorly. the only time I’ve seen some of my family effectively boycott a technology was when they realized all the Alexa shit they got for cheap was fucking terrible and wouldn’t stop pushing ads to them, actively trying to get them to order shit, and lighting up their bedroom with a bright notification light (for either a package status they didn’t care that much about or more ads) they couldn’t turn off. so they boxed up all their Alexa shit and asked if I needed any of it for scrap parts (I did not)
The new Alexa service will only cost you $5 to $10 a month — and not the present price of $0.
it’s also fucking remarkable how doomed this is. did they talk to none of the people responsible for frog boiling over at Prime Video? come on, we all know the steps:
you start your janky, subpar service at $0, as a value add to an existing Amazon product. this starts you with a massive userbase right off the bat.
you then implement a lock-in mechanism. for a streaming service that’s exclusives; for a voice assistant, that’s probably API integrations designed under an exclusivity contract
now that switching off of your service carries consequences and the lock-in mechanism has given it an aura of prestige (even though it’s probably still janky as fuck), increase its price to $1-2 a month to see how much of your userbase is pliant and willing to accept the introduction of a subscription mechanism. do what it takes to make the idea of a subscription tolerable; segregate your userbase into paid (which get exclusives) and free (which get ads) tiers if you need to.
now that your userbase has been primed (heh) to accept the idea of paying for a previously complementary service, boil the fuck out of that frog. increase subscription fees regularly. introduce ads even for paid tiers. run constant experiments to see how and where you can introduce ads relative to the amount you’re charging before subscriber numbers drop; use that to make the service just annoying and expensive enough that you’re still making a massive profit even after shedding what I’m very certain Amazon considers the dead weight of bad consumers.
the product is now in its final form under capitalism: some horseshit that’s functionally and economically indistinguishable from paying a cable company far too much for a premium TV channel, but with even more ads and customer data exfiltration enabled by the underlying technology. for some shit like Alexa, the value for the customer is even worse — the platform does so little other than push ads and steal data.
but amazon’s not doing the above obvious shit they always do in this case, and I think I know why: unlike streaming, people fucking hate voice assistants, so this $5-$10 fee might just be a desperate strategy to get true believers paying (or they’re fine with killing Alexa by making the subscription version mandatory)
I wonder what the Alexa backend costs relative to user base and data value. Seems like they aren't likely to get much more useful information than they already get from other sources, and even ignoring the forest-burning hell that is LLMs, earlier voice recognition technology wasn't free in terms of compute either.