A Bad Week for the Internet
It's been a shitter, really.
This one gets a bit rambly, but I think people need to be openly thinking less about "what just happened?" and more about "how do we prevent or mitigate this happening again?"
Last week, Steam started its way down the slippery slope of removing edgy adult games at the behest of payment processors, themselves responding to a reportedly pretty small amount of pressure from a far-right religious group targeting "incest" games. A few days ago, itch.io followed suit with an approach more extreme in its caution - completely removing all adult content until they can do a full review and remove anything considered "outside the terms of service of their payment processor."
A day later, the UK's Online Safety Act came into force, requiring (among other things) large sites involving user interaction or adult content (i.e. all of them) to collect extremely invasive amounts of information to verify ages and keep children away from dangerous content. We're talking highly identifiable information - Photo ID is being suggested unironically alongside bank details and credit card info (!!!), the sorts of things no child has ever fraudulently obtained from their parents.
These are two very superficially related problems with fundamentally different sources, which I want to explore, but I do believe the solutions end up horseshoeing back to being similar.
Banning Adults Buying Adult Media about Adults
But first, I want to be explicit about something - in art and media it has always been the case that we permit the depiction and exploration of acts that in reality would be definitively transgressive, immoral, illegal, or worthy of scorn. The idea that these depictions can be causally linked to increases in such behaviour among viewers is itself worthy of derision. All attempts to do so in the past are rightly looked back on as conservative, authoritarian, and reactionary - Video Nasties, Violence in Video Games, even the Hays Code - all are now considered at best laughable or at worst actively damaging to the causes they claimed to support.
This leads to the first, foundational rebuttal we should bear in mind discussing the Steam and itch.io purges - it doesn't matter if a game lets the player character fuck their fictional sister.
On a store that lets people buy games where they assault, kill, maim, and commit other acts of imaginative violence, pulling out the ban hammer because imaginary siblings are getting someone off somewhere is a reactionary conservatism, and equivocating on the degree to which this is unacceptable because it gives you the ick is reactionary cowardice.
Banning transgressive media and then linking it to queer media so they can ban that too is explicitly the gameplan of conservative activists, so it's extra important not to buy into their vision of what "responsible media" is, and to tell them to fuck off at the first step.
Platforms have the right to set limits on what they deem to be acceptable when they're the ones distributing the media, but they also have a responsibility to communicate and enforce those limits consistently and clearly.
That's not what's happening here. Mastercard and Visa are afraid of being found liable for enabling transactions that allow "illegal activity," because as I understand it a prior judgment ruled that way (desperately seeking citation here, could be misinformed). They are incentivised to be wildly conservative because they run a near-global duopoly on online payments, and will lose no business over stepping on such a small group.
Steam has historically been pretty decent on this stuff - a solid B+ to my recollection. They saw the difficulty of trying to draw the line between "adult" and "normal???" games and instead of coming down with a heavy hand, they accepted they'd never be able to solve it and let their store's users (both devs and players) take the wheel on categorising their own media. It was in everyone's interest to do so accurately, so devs could find their audiences and players could find their media.
I'll admit the store has been less pleasant to use since then, but overall I think this was the correct tack for Steam to take.
This time, I think Steam probably should have tried to fight it - it's possible an organisation as big and influential as them could have helped redefine the legal precedent and get payment processors out of the whole conversation. The idea that they're responsible for someone buying illicit 3D renders of fictionally related fictional characters is genuinely baffling when Steam operates as such a middleman.
Hell, Steam has its own wallet system - it would have been fascinating to see them move questionable media to be wallet-only purchases while they figured out the specifics. I don't know if they'd have been successful in fighting that the media is lawful in the current political climate, but I am confident that itch.io's capitulation was driven by the success this pressure had with Steam.
Won't The UK Please Think Of The Children
If you're not British, or you are but politics has you in that generally depressive spiral of scrolling past anything resembling a picture of the houses of parliament, you might need a small primer on the current ruling party.
They don't have policies, goals, or any particular platform. They won an election because the other major party imploded harder than them at an opportune moment, and they're terrified that they'll lose the next election to a bunch of far-right racists.
The current Labour party define themselves by insisting that they're "the adults in the room", and, when pressed on particulars, by cycling back to that phrase (or something similar) like an incantation.
Their premiership has been defined by doing relatively little, getting what they have done wrong, and kicking people out of the party for voting against their woefully unpopular ideas.
There's an additional note of context for the specifics of the Online Safety Act that just went into force. It wasn't the Labour Party's law.
Back in 2023 the Conservative Party was in charge and this was one of their laws, but in the process of pushing it through our parliamentary system a bunch of additional bits of language were added or changed - especially around extending the scope and vagueness of what might be considered "unsafe" - with all the precision and expertise you'd expect from the demographics populating a "House of Lords."
By the time we get to the present day we have a feckless government trying to taxonomise the entire internet, defining chairs while trying to dodge defining sofas, and putting unacceptable amounts of pressure on small site admins paying for passion projects from their own pockets.
They think they've sidestepped this by sectioning off websites by their traffic and content delivery mechanisms, but Wikipedia pointed out that they'd technically fall under those rules despite being expressly the kind of website the law tries not to harm!
The actual mechanisms by which this is going to be enforced are so comically broad and exploitable that I honestly expect they'll become as ubiquitously ignored as GDPR compliance pop-ups. I fully expect browser plug-ins (containing zero malware, I'm sure) to automate the process of bypassing them.
Or everyone in the UK is suddenly going to "move" to Rotterdam.
All of this is the result of overconfident lawmakers identifying a problem - a real fucking problem, it must be said - and being confident they have what it takes to solve it. They can define the exact contours by which a website becomes "unsafe for children" based on their massive political brains, and it's certain not to change over time. After all, it's not like the giant corporations being regulated have a history of exploiting loopholes and regulatory oversights...
What this called for was probably a new regulator, with the power to define and sanction improper moderation. The current powers are going to fall on Ofcom, I think, but we'd need some kind of Ofnet to interact directly with websites, companies, and site admins and ensure they're behaving responsibly within the bounds of their abilities. You cannot paint this map with a broad brush, and it's a full-time job in this ever-changing landscape.
The benefit of a regulator is that it can be staffed by people who understand the medium they're regulating, rather than careerist politicians trying to get Daddy Starmer's attention for a cabinet position.
Anyway we're not getting a sensible solution, we're getting a map of the territory drawn from the castle window, so what can we do about it?
What now?
I think the big lesson from all of this is that we can't trust platforms to protect us.
When Steam started banning things it was just assumed that this was the time for people to move to "the real ally platform" of itch.io, but it took literally a week for them to capitulate even worse, because as a smaller player they're more vulnerable.
But ultimately, itch.io is still just a faceless platform that stores, distributes, and takes and provides payments for media.
You can't fix this by just moving your transgressive smut to Patreon, they have the same incentives and pressures.
Ultimately, the closing of doors at Steam and itch.io is just going to push users to shadier websites to download their illicit media, and that opens those users up to massively more risk of compromise by bad actors.
The main potential I see for solving this is community driven navigation to more granularly hosted media. A kind of catalogue with clear guidelines about what is and isn't allowed (I'm sure no drama would come from those lines being drawn... cohost flashbacks intensify ), no on-site hosting of materials, and some kind of moderation of link quality and content relevance. Creator-owned pages and well-vetted data.
Such a thing probably already exists tbh - for all these moves have annoyed me, the porn game sphere hasn't really been in my wheelhouse since I was, ironically, young enough for this legislation to affect me. I follow some people who work in this space (Bigg's feed in particular was a highlight of Cohost for me) so I'm broadly aware of the scene, but I'm fundamentally an outsider to it and my thoughts probably come across that way.
This isn't coming from thinking I know better than the people being affected though, it comes from having seen this happen time and again and coming to the same conclusions - platforms don't work for transgressive spaces because platforms are too vulnerable to political pressure.
So when I describe a non-hosting catalogue/wiki with moderation and safety ratings, I'm really advocating for something portable. There's no one platform that can be pressured into nuking thousands of people's work from being accessed, because it's purely informational and information is easy to replicate. The reputation of such a catalogue would be the primary defence against duplication of the database for nefarious purposes - "why use this shady copycat when everyone trusts the links on xcat?" or similar. Plus, we're talking about a non-profit site here - there's not a lot of money in stealing its traffic.
This is admittedly a grey-area defence like people use for emulation websites - "it's not illegal if you own the cartridge!" To be fair this defence has functioned fairly well in non-profit non-distribution contexts, but I think it holds a bit more water here.
The illegality of whatever media is being catalogued is irrelevant to the catalogue itself - the responsibility for that would fall on the people distributing the files themselves, be it some external store or personal website. It's so much harder for a campaign like this most recent one to succeed when there's no single point of failure to attack.
In the worst case of attack or implied liability, links could be removed while keeping the informational page that describes and catalogues the media - leaving users in essentially the same position as the best-case scenario of something like the itch.io ban.
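To make that "links removable, record preserved" idea concrete, here's a minimal sketch of what such a catalogue entry's data model might look like - everything here (CatalogueEntry, redact_links, and so on) is hypothetical illustration, not a description of any existing site:

```python
from dataclasses import dataclass, field

@dataclass
class ExternalLink:
    """An outbound pointer to wherever the work is actually hosted/sold."""
    url: str
    kind: str            # e.g. "store", "patreon", "personal-site"
    vetted: bool = False  # has a moderator checked this link recently?

@dataclass
class CatalogueEntry:
    """Purely informational record - the catalogue never hosts files itself."""
    title: str
    creator: str
    description: str                                   # text-only, always safe to serve
    content_tags: list[str] = field(default_factory=list)
    links: list[ExternalLink] = field(default_factory=list)

def redact_links(entry: CatalogueEntry) -> CatalogueEntry:
    """Worst-case response to legal pressure: drop every outbound link
    but keep the informational page, so the work stays documented."""
    return CatalogueEntry(
        title=entry.title,
        creator=entry.creator,
        description=entry.description,
        content_tags=list(entry.content_tags),
        links=[],
    )
```

The point of the shape is that the valuable part (the record) and the risky part (the links) are separable by construction, so a takedown demand can be satisfied without destroying the catalogue's actual content.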
This is where I think it ties in with the UK Online Safety Act - this isn't about ban-evasion or engaging in bad-faith with regulations, it's about having a culture and community that's nimble and not complacent about what is expected of them - or what may be in future.
As far as how a website like this would be compliant with the act? That's tricky for me to judge. Not hosting any adult content itself, it may be able to rely on the age-gating of any linked websites for cover. It may also be able to default to a fully compliant layer - no images, no explicit descriptions - unless users consent to some minimal, compliant age verification.
A catalogue that can function purely on text is a great deal more robust to this kind of pressure anyway, so that's probably a win-win feature.
The biggest gap remaining around this is, like, the store part? The whole catalogue idea exists specifically to excise the vector of money from the domain of archival, so it actively shuns the question of payment for the work.
Maybe it links to a Patreon page for your favourite VN creator, and if Patreon pulls another Patreon on them it can be updated to link to some Dutch online storefront while they figure out a new revenue stream.
To be honest, from what I can tell most adult games on itch.io were already functionally links to a Patreon, because there's so much more money to be had from monthly subscriptions than a one-off payment.
Successful devs were already so hostile to itch.io as a platform that the transition to a catalogue that just sends users where they need to go might be more natural than I initially thought.
Closing Thoughts
Apologies for what is essentially a dump of every conversation that I've had with my partner over the last few days, condensed into something vaguely constructive.
I don't know what will really happen next, where the displaced sex-workers of the games industry will go or what they'll do.
I just firmly discard the notion that the future lies in trusting a different platform to not fuck them over.