Attached: 1 image
WHAT the FUCK mastodon?!?!?!?!
do NOT do the fucking age verification bullshit what the fuck is wrong with you
https://blog.joinmastodon.org/2026/02/connecting-the-world-through-thriving-online-communities/
#AgeVerification #mastodon #MastoAdmin #FediAdmin #fediverse
I mean… They have to.
Countries are making it law, so sooner or later, fedi projects are going to have to deal with that crap.
Do they? It’s one thing to make something law, quite another to enforce it. The OSA in the UK has been in force since last July and has managed to do nothing other than pick a fight with 4chan and get nowhere. I seem to recall someone mentioned Lemmy to Ofcom in a discussion about the OSA and they were literally like “What’s a Lemmy?”
How on earth do you imagine a regulator is going to work out how to deal with 50+ federated instances (for instance)?
I mean if they can really just do nothing, then that is also something it would be good to be sure about.
Nintendo has shown that it is possible to attack open source projects at the repository level, and while that wouldn’t necessarily stop development, it would still be a setback to force development technically “underground”.
And if instances have to start being regularly replaced, that WILL cause attrition.
I just think this is a logistical dead-end for regulators, who may rely on the chilling effect of the thought of being targeted rather than on actually targeting anyone. Unless the Fediverse somehow becomes massive, I don’t see that it’ll ever enter their field of view. Especially as many instances will be based in the USA, which is the least likely country to implement these laws and the most hostile to threats from foreign regulators (see again the 4chan example).
uh, what?
Yes? USA is the least likely to do this. Porn laws in various states don’t apply to social media.
Other attempts have been stuck in legislative hell, gone unenforced, or face court cases challenging their legality (Mississippi).
Not even two weeks later, California is making OS level age verification a thing.
I’m not even sure how that is remotely enforceable, although this also is a somewhat different thing to what this thread is about.
You may want to look into what the legal requirements actually are, and how it changes who is liable. It is outright draconian.
Essentially, it requires the OS to determine the age of the user and then pass that signal, via an API, to all software that runs on it. Any software that theoretically could use the data, and still allows a child to see something they shouldn’t, will be liable.
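To make the liability mechanism concrete, here is a minimal sketch of what that flow might look like from an app’s side. Everything here is hypothetical — the bracket names, the `os_age_signal` function, and the gating logic are my own illustration of the described scheme, not any real OS API:

```python
from enum import Enum

class AgeBracket(Enum):
    # Hypothetical brackets; the actual law may define different ones.
    UNDER_13 = "under_13"
    TEEN = "13_17"
    ADULT = "18_plus"

def os_age_signal() -> AgeBracket:
    """Stand-in for the OS-provided age API described in the scheme:
    the OS determines the user's age once and exposes it to every app."""
    return AgeBracket.UNDER_13  # placeholder value for the sketch

def can_show_restricted_content(bracket: AgeBracket) -> bool:
    # The liability hook: because the app *could* read the signal,
    # showing restricted content to a non-adult bracket is on the app.
    return bracket is AgeBracket.ADULT

if __name__ == "__main__":
    bracket = os_age_signal()
    print(can_show_restricted_content(bracket))  # prints: False
```

The point of the sketch is the liability inversion: once the signal exists at the OS level, every app that fails to gate on it is exposed, whether or not it ever asked for age data.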
You claimed that the US was the least likely to do this sort of thing…
Instead, despite the incompetence, they are clearly spearheading this globally along with the UK. Making it most decidedly the first place that will have to deal with this crap.
Not the last.
US tech firms profit the most from it, and the verification data lands on some Palantir server — as the recent Discord fiasco implied.
Whether they do so optionally is a different thing entirely, to be fair.