Hot off the back of its recent leadership rejig, Mozilla has announced users of Firefox will soon be subject to a ‘Terms of Use’ policy — a first for the iconic open source web browser.
This official Terms of Use will, Mozilla argues, offer users ‘more transparency’ over their ‘rights and permissions’ as they use Firefox to browse the information superhighway — as well as Mozilla’s “rights” to help them do it, as this excerpt makes clear:
You give Mozilla all rights necessary to operate Firefox, including processing data as we describe in the Firefox Privacy Notice, as well as acting on your behalf to help you navigate the internet.
When you upload or input information through Firefox, you hereby grant us a nonexclusive, royalty-free, worldwide license to use that information to help you navigate, experience, and interact with online content as you indicate with your use of Firefox.
Also about to go into effect is an updated privacy notice (aka privacy policy). This adds a crop of cushy caveats to cover the company’s planned AI chatbot integrations, cloud-based service features, and more ads and sponsored content on the Firefox New Tab page.
The actual addition to the terms is essentially this:
If you choose to use the optional AI chatbot sidebar feature, you're subject to the ToS and Privacy Policy of the provider you use, just as if you'd gone to their site and used it directly. This is obvious.
Mozilla will collect light data on usage, such as how frequently people use the feature overall, and how long the strings of text are that are being pasted in. That's basically it.
The way this article describes it as "cushy caveats" is completely misleading. It's quite literally just "If you use a feature that integrates with third party services, you're relying on and providing data to those services, also we want to know if the feature is actually being used and how much."
I agree to a point, but I look at this much like I'd view any other feature in a browser. Sometimes there are features added that I don't use, and so I simply won't use them.
This would be a problem for me if it was an "assistant" that automatically popped up over pages I was on to offer "help," but it's not. It's just a sidebar you can click a button in the menu to pop out, or you can never click that button and you'll never have to look at it.
It's not a feature that auto-enables and starts sending data to any AI company; it's just an optional interface that you have to click a specific button to open, and that can then interface with a given AI model if you choose to use it. If you don't want to use it, you ideally won't even see it open during your use of Firefox.
A sidebar you can open from the hamburger menu that is basically just a tiny chat UI
Right-click to paste the selected text into the sidebar
If you don't want it, they don't seem to be pushing it any further than that. Just don't click the option in the menus and you'll be fine. (I believe you can also fully disable the option from appearing in settings.)
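For anyone who would rather not even see the menu entry, here is a minimal user.js sketch that, to my understanding, hides the chatbot UI entirely. The pref name browser.ml.chat.enabled is what I recall seeing in about:config on recent builds, so treat it as an assumption and double-check it on your own install:

// Hedged sketch: hide Firefox's AI chatbot sidebar and its menu entry.
// Pref name recalled from about:config on recent builds; verify before relying on it.
user_pref("browser.ml.chat.enabled", false);

The same value can be flipped directly in about:config if you don't maintain a user.js file.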
AI is quite possibly a more catastrophic technological development than nuclear weapons.
I wouldn't go that far. A technology that wastes a lot of energy and creates a lot of bad quality content isn't the same as a bomb that directly kills millions.
But nuclear weapons have only been used twice in 80 years for military purposes. They have arguably prevented more deaths than they have caused.
And you're drastically underselling the potential impact of AI. If anything, your reaction is a defense mechanism because you can't bear to stomach the potential consequences of AI.
One could have easily reacted the same way to the invention of the printing press, or the automobile, or the analog computer. They all wasted a lot of energy for limited benefit, at first. But if the technology develops enough, it can destroy everything that we hold dear.
Human beings engineering their own obsolescence while cavalierly disregarding the potential consequences. A tale as old as time
But nuclear weapons have only been used twice in 80 years for military purposes. They have arguably prevented more deaths than they have caused.
Nukes only "prevent" deaths by threatening to cause drastically larger numbers of deaths otherwise. If the nukes didn't exist, there wouldn't be any threat of nuclear death for more nukes to "prevent" in the first place.
If anything, your reaction is a defense mechanism because you can’t bear to stomach the potential consequences of AI.
"AI" is just more modern machine learning techniques that we've had for decades. Most implementations of it today are things that nobody actually wants, producing worse quality outputs than that of a human. Maybe it will automate some jobs, sure, that can happen. Just like how tons of automation historically has just pushed people from direct labor to management of machine labor.
Heck, if "AI" automated most of the work people did and put us out of a job, that would just accelerate our progress towards pushing for UBI/or an era of superabundance, which I'd welcome with open arms. It's a lot easier to convince people that centralized ownership of wealth and resources makes no sense if goods can be produced automatically by machines for free.
But sure, seeing matrix multiplication causing statistically probable sentences to be formed really has me unable to stomach the potential consequences. /s
One could have easily reacted the same way to the invention of the printing press, or the automobile, or the analog computer. They all wasted a lot of energy for limited benefit, at first. But if the technology develops enough, it can destroy everything that we hold dear.
And what did the printing press, automobile, and analog computer bring?
A rapid advancement in the spread of information and local news; faster individualized transport that later contributed to developments in rail and bus transit; and software solutions that can massively reduce workloads while accelerating human progress.
And all of those things either raised the standard of living without causing equivalent harm from job loss, or actively created substantially more jobs.
Human beings engineering their own obsolescence while cavalierly disregarding the potential consequences. A tale as old as time
Make human work obsolete so we can do what we care about and hang out with people we like instead of spending our days doing labor to produce goods we rely on? Sign me up.
Nukes only “prevent” deaths by threatening to cause drastically larger numbers of deaths otherwise. If the nukes didn’t exist, there wouldn’t be any threat of nuclear death for more nukes to “prevent” in the first place.
Okay? But war existed long before nuclear weapons, and it also causes a large number of deaths. If nukes didn't exist, there would potentially be more wars, and thus more death.
Heck, if “AI” automated most of the work people did and put us out of a job, that would just accelerate our push towards UBI or an era of superabundance, which I’d welcome with open arms.
I wouldn't be so sure about that. We have already automated essentially everything else, and yet people work more than ever. If goods can be produced automatically by machines for free, what's to stop the owners of the machines from simply eliminating what used to be the working class?
But sure, seeing matrix multiplication causing statistically probable sentences to be formed really has me unable to stomach the potential consequences. /s
Your defensiveness speaks volumes.
And what did the printing press, automobile, and analog computer bring?
An ever more powerful nucleus of mechanization that has resulted in the most devastating wars and the most widespread suffering in all of human history. Genocides, chattel slavery, famine, biochemical and nuclear weapons; mass extinction and the imminent destruction of the very planet on which we live.
Make human work obsolete so we can do what we care about and hang out with people we like instead of spending our days doing labor to produce goods we rely on? Sign me up.
Sweet summer child. Making human work obsolete makes human beings obsolete. I envy your naivety.
As a glorified search engine, after pretty much all search indexes were neutered on purpose... but even then it's mostly passable, yet always untrustworthy.
So phone-home telemetry that you can’t opt out of.
You can opt out of it. You've always been able to opt out of Mozilla's telemetry. Not to mention that if you actually read the Privacy Notice, there's an entire section detailing every single piece of telemetry that Mozilla collects, and if you read the section very clearly titled "To provide AI chatbots," you'll see what's collected:
Technical data
Location
Settings data
Unique identifiers
Interaction data
The consent required for the collection to even start:
Our lawful basis
Consent, when you choose to enable an AI Chatbot.
And links that lead to the page explaining how to turn off telemetry even if you're using the in-beta AI features.
They use the term "telemetry" in a special way. If they are collecting info from users, that is telemetry under a different name, fine. Not collecting info would mean they receive zero bits.
I truly don't understand what point you're trying to make here.
Mozilla defines telemetry as "data collection." Any collection of data by Mozilla is considered telemetry, as is described by the docs page that is cited on the Telemetry Collection & Deletion page.
If you deselect the "Allow Firefox to send technical and interaction data to Mozilla" option, this disables all telemetry, or in other words, all data collection by Mozilla.
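For completeness, here is a hedged user.js sketch of the preference that, as far as I know, sits behind that checkbox, plus the general telemetry switches that should already follow it. Pref names are from memory of about:config, so verify them on your own profile:

// Hedged sketch: turn off Mozilla data collection from user.js.
// datareporting.healthreport.uploadEnabled is, to my knowledge, the pref the
// "Allow Firefox to send technical and interaction data to Mozilla" checkbox toggles.
user_pref("datareporting.healthreport.uploadEnabled", false);
// Belt-and-suspenders telemetry switches; likely redundant once uploads are off.
user_pref("toolkit.telemetry.enabled", false);
user_pref("toolkit.telemetry.unified", false);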
Mozilla will collect light data on usage, such as how frequently people use the feature overall,
That says to me they want to know (among other things) how many browser users make zero use of the AI feature. To acquire that info, they have to collect it. You have to assume the worst when you see phrasing like that.