The problem still remains: why’s this thing “opt-out” and not “opt-in”? Why not make it an official, totally optional (as in voluntarily wanting to have it and, only then, proceeding to have it) plug-in or extension that the user (let us remember the meaning of “User Agent”: an agent acting on behalf of the user, not a piece of software who’s become “the user”) could install at any moment, out of their own will?
I’m far from being an anti-AI person, I myself use those clankers on a daily basis. However, I use them because I want to, while I still want to, not because they were pushed unto me.
Mechanisms of “opt-out” where there should be an “opt-in” is a form of dark pattern.
In fact, the very concept of “opting-out” is a dark pattern per se, because it implies something pushed unto a person, something from which they were “allowed” the “right to leave”.
Yeah, it’s awesome to have means of “opting-out” from something, but having an “opt-out” mechanism in place doesn’t mitigate the very fact that it was coercively pushed unto the person beforehand and didn’t require explicit consent from the person unto which the thing was pushed.
Speaking of “consent”, situations like these are not that much different from the dark pattern “Yes / Not now” we’ve been seen everywhere: in certain scenarious, this insistence and disregard for explicit consent would verge the criminal (e.g. harassment), but suddenly it’s “okay” when corporations (and the State itself) do it.
If, say, a situation where someone is being harassed and, only after having started to harass, the harasser offers the harassed a means to leave the harassment, does this make the harasser less of a harasser? Because that’s the same absurd logic behind the corporate advocacy whenever it’s said “oh, but Mozilla is offering an opt-out, you can always turn off ‘sponsored shortcuts’ (that is, after having been faced by the shortcut from a Jeff Bezos corp as you proceeded to open a new tab for accessing the opting-out settings, but that’s totally okay), ‘sponsored wallpapers’, and the ‘Anonym tracking’, and now you can, check this out, you can turn off the clankers, too! Wow, isn’t that such a cute corp, the corp with the cute fiery fox mascot?”.
Not to say how it’s gonna end up cluttering the upstream with (more) binary blobs, adding to the Sisyphean struggle that WaterFox, IronFox, LibreWolf, Fennec, among other Firefox forks, have been experiencing upon trying to de-enshittificate the enshittificated and de-combobulate the combobulated.
“Mozilla needs to make money”. Yeah, yeah, because the very fundamental, immutable principle of cosmic existence boils down to “there’s no such thing as a free lunch”, amirite? After all, “money” is clearly within the table of elementary particles alongside quarks and gluons, isn’t it? And Mozilla needs to make money… We had a tool for that: it’s called donations.
Just because if it were opt-in, people wouldn’t have chosen to activate it, and fewer people would use it and the graph line wouldn’t go up for the shareholders to appreciate? Then, maybe, just maybe, it would be quite a strong evidence that this isn’t really something that the users want, don’t ya think?
For whatever reason, they have decided it’s important.
There’s the reason, right above this paragraph: one can only achieve what people would certainly refuse, if they pushed it onto people by use of force (not necessarily physical force, but, for example, dark pattern is a technical means of “force”).
A fox can’t convince the roosters to become her food, if the roosters were to have a stake on deciding in this regard, less roosters would become a tasty dinner for the cute fox, because becoming a tasty dinner isn’t exactly a demand from roosters. Hence why the fox must grab the roosters, but in this case the fox gives them an option to escape from her paws.
Ah, notice your own phrasing: “They have decided”. Who have decided? Not the user, not the party interested in their own UX/UI, but the very archontic architects of a kind of digital apparatus we’ve been compelled to use for participating in this digital realm of society (risking social ostracism if we don’t), the World Wide Web.
And when a decision is made upon someone, without regard for the very someone upon which the decision is being made, even when there’s some kind of “opting out” from the object of decision, we had a name for that: it was called “non-consensual relationship”.
Because people overwhelmingly do not change any defaults whatsoever
Most roosters wouldn’t normally seek the paws of the fox to be hugged by, what an astonishing news!
You see, that’s exactly what plays favorably for things pushed with “opt-out” mechanisms, anything. If people are less likely to change the settings to better enhance their UX (be it due to a lack of knowledge, a lack of proactive pursuit or because they deem their current settings “good enough”), this means people would be more likely to have the clankers shoved down their throats if said clankers were to be part of default settings.
In fact, if settings would very likely go unchanged, then Mozilla could push anything, absolutely anything under they will, “shall be the whole of the Law” with the legally-required “opt-out” mechanisms in place.
In the foreseeable future, we’d have Firefox as a new “Agentic Browser” where a clanker does all the tiring and utterly boring effort of “browsing the web” as the user watches their credit card being depleted by prompt injections carefully placed amidst Unicode exploits across the web by scammers. But, hey, let us not worry, there’s always a button to turn it off! 😄
Most roosters wouldn’t normally seek the paws of the fox to be hugged by, what an astonishing news!
Whoosh. The point is “the roosters” don’t seek anything at all. It could be 50 lbs. of delicious cow shit, but if you don’t put it down in front of them, they’re not going to go looking for it.
Please read my comments in their entirety before replying.
I’m not referring only to the feature per se, I’m also referring to any pop-up designed to appear throughout the navigation to “remind the user about the superb features”.
Said pop-up is explicitly mentioned on their “confirmation dialog” upon turning off (screenshot attached below):
You won’t see new or current AI enhancements in Firefox, or pop-ups about them.
It speaks volumes about how much a dark pattern this is, the fact that the opt-off has a confirmation dialog, while the further proceeding with logging in with Anthropic/OpenAI/Google/Meta account doesn’t seem to have a confirmation dialog.
And the fact that the confirmation feels “menacing” and defaulted to cancelling the opting-off (i.e. pressing “esc” or clicking outside the window; one must click the primary-colored “block” button which, contrasted to a grayish “Cancel” button, may psychologically induce the user into thinking “block” is a dangerous action), quite similar to the about:config warning screen.
Ah, and the clanker options: notice the lack of alternative options for those who want a custom clanker, such as DeepSeek, Qwen, Z AI, Brazilian Maritaca IA and Amazônia IA (to mention some non-Chinese LLMs), or even something running locally through ollama. Seemingly no option for using a custom, possibly self-hosted LLM endpoint. The fact that all the options offered are all heavily corporate options (with Mistral being the “least corporate” of them all, but still Global Northern nonetheless) might tell us something…
All of these dark patterns, among others not mentioned, are the object of my critique, not just the fact that Mozilla is shoving clankers unto Firefox.
Whenever a feature needs an invasive pop-up and the opt-out brings up a second pop-up that requires further confirmation (but none seems to be offered upon actually using said feature), it is called a dark pattern, no matter if said feature requires further configuration.
And the fact that the confirmation feels “menacing” and defaulted to cancelling the opting-off (i.e. pressing “esc” or clicking outside the window; one must click the primary-colored “block” button which, contrasted to a grayish “Cancel” button, may psychologically induce the user into thinking “block” is a dangerous action), quite similar to the about:config warning screen.
I don’t think it’s menacing at all. It gives an informative list of features, which is nice to know. I could see a lot of people wanting to turn off all AI then realizing they actually want local translate instead of sending everything to google.
And you’ve got the button intents mixed up. Primary color is always the encouraged action in that kind of design. Dark pattern would be if the colors were flipped.
When we develop a system (I used to work as a DevOps for almost 10 years), the technical aspects aren’t the only aspects being accounted for: especially when it comes to the front-end (i.e. the UI the user sees, the UX how user interaction will happen and how it may be perceived by them), psychology (especially behaviorism) is sine qua non.
Shapes and colors often carry archetypal meanings: a red element feels “dangerous”, a window with a yellow triangle icon feels to be “warning” about something, a green button feels “okayish”. I mean, those are the exact same principles behind traffic lights.
And signs and symbols, ruling the world, don’t exist in a vacuum: a colored button besides a monochromatic button may, psychologically, lead to a feeling that the colored button is the proper way to proceed.
But… there’s a twist: imagine you have a light-gray “Cancel” and a colored (regardless of the color) “Block”. “Block” is a strong word. The length of the label text also does impart psychological effects. The human brain may see: “huh, I have this button which reads ‘block’ and it’s quite strong, and this other button which reads ‘cancel’ and it’s more easy to the eyes, maybe ‘block’ is dangerous”. Contrast matters: the comparison between a substrate and the substances is pretty much how we’re wired to navigate this world as living beings.
Now, corporations such as Apple (Safari), Google (Chromium), and very likely Mozilla (Firefox) as well, they have entire hordes of psychologists directly working for them, likely the same psychologists who’ll work together with their HR departments for evaluating the candidates who applied for a job position there. These psychologists, and/or psychoanalysts, they know about Jungian archetypes, they know about fight-or-flight response and other facets of our deeply-ingrained instincts, they know about how colors are generally perceived by the human brain. Those psychologists likely played a role when a brand was chosen, or when an advertisement pitch was made. They know what they’re doing.
UX/UI decisions are far from random choices from the leading team of project management engineers, it involved designers with psychologists. Again: they know what they’re doing, they know it pretty well. They know how the users are likely to keep the functionality. They know how the users, as Ulrich said, are very unlikely to touch the settings, likely to keep the defaults, no matter what those defaults are. Because they know humans are driven by the “least-effort” instinct, which is quite of a fundamental principle shared among living beings as a byproduct of the “lowest energetic point” (thermodynamic equilibrium) principle.
To me, a former full-stack developer, the newer Firefox interfaces don’t feel like Firefox is being psychologically fair and honest with the user’s mind. Dark patterns are often subtle, and they’re part of a purposeful, corporate decision.
@[email protected] @[email protected]
The problem still remains: why’s this thing “opt-out” and not “opt-in”? Why not make it an official, totally optional (as in voluntarily wanting to have it and, only then, proceeding to have it) plug-in or extension that the user (let us remember the meaning of “User Agent”: an agent acting on behalf of the user, not a piece of software who’s become “the user”) could install at any moment, out of their own will?
I’m far from being an anti-AI person, I myself use those clankers on a daily basis. However, I use them because I want to, while I still want to, not because they were pushed unto me.
Mechanisms of “opt-out” where there should be an “opt-in” is a form of dark pattern.
In fact, the very concept of “opting-out” is a dark pattern per se, because it implies something pushed unto a person, something from which they were “allowed” the “right to leave”.
Yeah, it’s awesome to have means of “opting-out” from something, but having an “opt-out” mechanism in place doesn’t mitigate the very fact that it was coercively pushed unto the person beforehand and didn’t require explicit consent from the person unto which the thing was pushed.
Speaking of “consent”, situations like these are not that much different from the dark pattern “Yes / Not now” we’ve been seen everywhere: in certain scenarious, this insistence and disregard for explicit consent would verge the criminal (e.g. harassment), but suddenly it’s “okay” when corporations (and the State itself) do it.
If, say, a situation where someone is being harassed and, only after having started to harass, the harasser offers the harassed a means to leave the harassment, does this make the harasser less of a harasser? Because that’s the same absurd logic behind the corporate advocacy whenever it’s said “oh, but Mozilla is offering an opt-out, you can always turn off ‘sponsored shortcuts’ (that is, after having been faced by the shortcut from a Jeff Bezos corp as you proceeded to open a new tab for accessing the opting-out settings, but that’s totally okay), ‘sponsored wallpapers’, and the ‘Anonym tracking’, and now you can, check this out, you can turn off the clankers, too! Wow, isn’t that such a cute corp, the corp with the cute fiery fox mascot?”.
Not to say how it’s gonna end up cluttering the upstream with (more) binary blobs, adding to the Sisyphean struggle that WaterFox, IronFox, LibreWolf, Fennec, among other Firefox forks, have been experiencing upon trying to de-enshittificate the enshittificated and de-combobulate the combobulated.
“Mozilla needs to make money”. Yeah, yeah, because the very fundamental, immutable principle of cosmic existence boils down to “there’s no such thing as a free lunch”, amirite? After all, “money” is clearly within the table of elementary particles alongside quarks and gluons, isn’t it? And Mozilla needs to make money… We had a tool for that: it’s called donations.
If it’s opt-in it may as well not exist. For whatever reason, they have decided it’s important.
@[email protected] @[email protected]
Just because if it were opt-in, people wouldn’t have chosen to activate it, and fewer people would use it and the graph line wouldn’t go up for the shareholders to appreciate? Then, maybe, just maybe, it would be quite a strong evidence that this isn’t really something that the users want, don’t ya think?
There’s the reason, right above this paragraph: one can only achieve what people would certainly refuse, if they pushed it onto people by use of force (not necessarily physical force, but, for example, dark pattern is a technical means of “force”).
A fox can’t convince the roosters to become her food, if the roosters were to have a stake on deciding in this regard, less roosters would become a tasty dinner for the cute fox, because becoming a tasty dinner isn’t exactly a demand from roosters. Hence why the fox must grab the roosters, but in this case the fox gives them an option to escape from her paws.
Ah, notice your own phrasing: “They have decided”. Who have decided? Not the user, not the party interested in their own UX/UI, but the very archontic architects of a kind of digital apparatus we’ve been compelled to use for participating in this digital realm of society (risking social ostracism if we don’t), the World Wide Web.
And when a decision is made upon someone, without regard for the very someone upon which the decision is being made, even when there’s some kind of “opting out” from the object of decision, we had a name for that: it was called “non-consensual relationship”.
Because people overwhelmingly do not change any defaults whatsoever, regardless of what they like or want.
If you put a button in the settings that did nothing but automatically generate a $5 bill, no one would click that either.
@[email protected] @[email protected]
Most roosters wouldn’t normally seek the paws of the fox to be hugged by, what an astonishing news!
You see, that’s exactly what plays favorably for things pushed with “opt-out” mechanisms, anything. If people are less likely to change the settings to better enhance their UX (be it due to a lack of knowledge, a lack of proactive pursuit or because they deem their current settings “good enough”), this means people would be more likely to have the clankers shoved down their throats if said clankers were to be part of default settings.
In fact, if settings would very likely go unchanged, then Mozilla could push anything, absolutely anything under they will, “shall be the whole of the Law” with the legally-required “opt-out” mechanisms in place.
In the foreseeable future, we’d have Firefox as a new “Agentic Browser” where a clanker does all the tiring and utterly boring effort of “browsing the web” as the user watches their credit card being depleted by prompt injections carefully placed amidst Unicode exploits across the web by scammers. But, hey, let us not worry, there’s always a button to turn it off! 😄
Whoosh. The point is “the roosters” don’t seek anything at all. It could be 50 lbs. of delicious cow shit, but if you don’t put it down in front of them, they’re not going to go looking for it.
Please read my comments in their entirety before replying.
Other than link previews all the features they are opt-in in the sense you’d have to actually use the feature.
@[email protected] @[email protected]
I’m not referring only to the feature per se, I’m also referring to any pop-up designed to appear throughout the navigation to “remind the user about the superb features”.
Said pop-up is explicitly mentioned on their “confirmation dialog” upon turning off (screenshot attached below):
It speaks volumes about how much a dark pattern this is, the fact that the opt-off has a confirmation dialog, while the further proceeding with logging in with Anthropic/OpenAI/Google/Meta account doesn’t seem to have a confirmation dialog.
And the fact that the confirmation feels “menacing” and defaulted to cancelling the opting-off (i.e. pressing “esc” or clicking outside the window; one must click the primary-colored “block” button which, contrasted to a grayish “Cancel” button, may psychologically induce the user into thinking “block” is a dangerous action), quite similar to the
about:configwarning screen.Ah, and the clanker options: notice the lack of alternative options for those who want a custom clanker, such as DeepSeek, Qwen, Z AI, Brazilian Maritaca IA and Amazônia IA (to mention some non-Chinese LLMs), or even something running locally through ollama. Seemingly no option for using a custom, possibly self-hosted LLM endpoint. The fact that all the options offered are all heavily corporate options (with Mistral being the “least corporate” of them all, but still Global Northern nonetheless) might tell us something…
All of these dark patterns, among others not mentioned, are the object of my critique, not just the fact that Mozilla is shoving clankers unto Firefox.
Whenever a feature needs an invasive pop-up and the opt-out brings up a second pop-up that requires further confirmation (but none seems to be offered upon actually using said feature), it is called a dark pattern, no matter if said feature requires further configuration.
I don’t think it’s menacing at all. It gives an informative list of features, which is nice to know. I could see a lot of people wanting to turn off all AI then realizing they actually want local translate instead of sending everything to google.
And you’ve got the button intents mixed up. Primary color is always the encouraged action in that kind of design. Dark pattern would be if the colors were flipped.
@[email protected] @[email protected]
When we develop a system (I used to work as a DevOps for almost 10 years), the technical aspects aren’t the only aspects being accounted for: especially when it comes to the front-end (i.e. the UI the user sees, the UX how user interaction will happen and how it may be perceived by them), psychology (especially behaviorism) is sine qua non.
Shapes and colors often carry archetypal meanings: a red element feels “dangerous”, a window with a yellow triangle icon feels to be “warning” about something, a green button feels “okayish”. I mean, those are the exact same principles behind traffic lights.
And signs and symbols, ruling the world, don’t exist in a vacuum: a colored button besides a monochromatic button may, psychologically, lead to a feeling that the colored button is the proper way to proceed.
But… there’s a twist: imagine you have a light-gray “Cancel” and a colored (regardless of the color) “Block”. “Block” is a strong word. The length of the label text also does impart psychological effects. The human brain may see: “huh, I have this button which reads ‘block’ and it’s quite strong, and this other button which reads ‘cancel’ and it’s more easy to the eyes, maybe ‘block’ is dangerous”. Contrast matters: the comparison between a substrate and the substances is pretty much how we’re wired to navigate this world as living beings.
Now, corporations such as Apple (Safari), Google (Chromium), and very likely Mozilla (Firefox) as well, they have entire hordes of psychologists directly working for them, likely the same psychologists who’ll work together with their HR departments for evaluating the candidates who applied for a job position there. These psychologists, and/or psychoanalysts, they know about Jungian archetypes, they know about fight-or-flight response and other facets of our deeply-ingrained instincts, they know about how colors are generally perceived by the human brain. Those psychologists likely played a role when a brand was chosen, or when an advertisement pitch was made. They know what they’re doing.
UX/UI decisions are far from random choices from the leading team of project management engineers, it involved designers with psychologists. Again: they know what they’re doing, they know it pretty well. They know how the users are likely to keep the functionality. They know how the users, as Ulrich said, are very unlikely to touch the settings, likely to keep the defaults, no matter what those defaults are. Because they know humans are driven by the “least-effort” instinct, which is quite of a fundamental principle shared among living beings as a byproduct of the “lowest energetic point” (thermodynamic equilibrium) principle.
To me, a former full-stack developer, the newer Firefox interfaces don’t feel like Firefox is being psychologically fair and honest with the user’s mind. Dark patterns are often subtle, and they’re part of a purposeful, corporate decision.