
@soatok
Created October 19, 2022 21:53
Towards A Better FurAffinity

After witnessing FurAffinity get flooded with CSAM last night, and subsequently losing sleep because of it, I thought I'd enumerate the mechanisms that FurAffinity could implement to make this problem tractable.

  1. Report Button
  2. Asynchronous PhotoDNA Integration
  3. Account Suspension Automation
  4. Make the Block feature more powerful

I'll explain each of these four in detail.

Report Button

Right now, if you want to report an AUP violation to the FurAffinity staff, you must create a support ticket and provide a URL to the offending material. This requires manual staff intervention and is prone to human error.

A report button would be tied directly to a specific submission in the database. When a staffer takes down the offending submission, all related tickets could be auto-resolved. It would also make it possible to build metrics and automation for certain categories of user report (e.g. Child Sexual Abuse Material).
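
Here's a minimal sketch of what that could look like. I've reused the report_button table and content_id column names from the query in the next section; everything else (columns, types, statuses) is an illustrative guess, not FA's actual schema.

```sql
-- Hypothetical schema: every report is tied to a submission row,
-- so resolving the submission can resolve every related ticket at once.
CREATE TABLE report_button (
    id          BIGINT UNSIGNED PRIMARY KEY AUTO_INCREMENT,
    content_id  BIGINT UNSIGNED NOT NULL, -- FK to the submissions table
    reporter_id BIGINT UNSIGNED NOT NULL, -- FK to the users table
    report_type VARCHAR(64) NOT NULL,     -- e.g. 'CSAM', 'AUP_OTHER'
    status      VARCHAR(16) NOT NULL DEFAULT 'open',
    created_at  TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);

-- When a staffer takes the submission down, auto-resolve all of its tickets:
UPDATE report_button
   SET status = 'resolved'
 WHERE content_id = :taken_down_id
   AND status = 'open';
```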

Asynchronous PhotoDNA Integration

New submissions should be fed into a message queue, and a background worker should scan each one against PhotoDNA to catch likely CSAM. Combined with some automation of the report button (comparing SELECT count(*) FROM report_button WHERE content_id = :foo AND report_type = 'CSAM' against some minimum threshold), CSAM could be taken down almost instantly without false positives negatively impacting legitimate furry art.
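
One way to wire this up, without guessing at the PhotoDNA API itself, is a database-backed scan queue: the upload path inserts a row, and a separate worker process claims rows, calls PhotoDNA out-of-band, and writes the verdict back. A sketch under those assumptions (the photodna_queue table and the threshold of 3 reports are placeholders I made up):

```sql
-- Hypothetical scan queue: the upload handler INSERTs here, and an
-- out-of-band worker (the thing that actually calls PhotoDNA) polls it.
CREATE TABLE photodna_queue (
    submission_id BIGINT UNSIGNED PRIMARY KEY, -- FK to the submissions table
    enqueued_at   TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    scanned_at    TIMESTAMP NULL,
    verdict       VARCHAR(16) NULL -- 'clean' or 'match', set by the worker
);

-- Takedown candidates, requiring both signals (a PhotoDNA match AND
-- enough independent user reports) to keep false positives near zero:
SELECT q.submission_id
  FROM photodna_queue q
 WHERE q.verdict = 'match'
   AND (SELECT count(*)
          FROM report_button r
         WHERE r.content_id = q.submission_id
           AND r.report_type = 'CSAM') >= 3;
```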

Account Suspension Automation

Building on the previous two items, if a new account uploads CSAM, it should be automatically banned. Any information about its conduct (including IP addresses) should be quarantined in a space that can easily be retrieved for law enforcement without impacting the rest of FurAffinity's users.
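
As a sketch, the quarantine could be a separate table (or database) that the main application writes to but never reads, populated in the same transaction that bans the account. The law_enforcement_quarantine and login_log names here are hypothetical:

```sql
-- Hypothetical quarantine table: written on automated bans, only ever
-- read when responding to a law enforcement request.
CREATE TABLE law_enforcement_quarantine (
    account_id    BIGINT UNSIGNED NOT NULL,
    submission_id BIGINT UNSIGNED NOT NULL,
    ip_address    VARCHAR(45) NOT NULL, -- wide enough for IPv6
    recorded_at   TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);

-- Ban the account and quarantine its connection history atomically:
START TRANSACTION;
UPDATE users SET banned = TRUE WHERE id = :offender_id;
INSERT INTO law_enforcement_quarantine (account_id, submission_id, ip_address)
SELECT :offender_id, :submission_id, l.ip_address
  FROM login_log l
 WHERE l.user_id = :offender_id;
COMMIT;
```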

Make the Block feature more powerful

When we block someone, their content shouldn't be displayed on the FA front page or Browse section for us.

It's totally fine if this filtering is implemented client-side. It just needs to exist for the majority of FA users.
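
Server-side, this would be a one-clause change to the front page and Browse queries. A sketch assuming a hypothetical blocks table of (blocker_id, blocked_id) pairs:

```sql
-- Hypothetical Browse query: exclude anyone the viewer has blocked.
SELECT s.*
  FROM submissions s
 WHERE s.uploader_id NOT IN (
           SELECT b.blocked_id
             FROM blocks b
            WHERE b.blocker_id = :viewer_id
       )
 ORDER BY s.created_at DESC;
```

A client-side version would do the same thing: fetch the viewer's block list once, then filter those uploaders out of the front page and Browse results before rendering.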

How to Cross This Bridge

These ideas would reduce the number of times FurAffinity staff have to be woken up at crazy hours to stop CSAM from being posted on their website. That would help with staff morale and reduce burnout, so the incentives to add these features are already there.

I do not have access to the FurAffinity source code. If it were open source, I'd be happy to contribute to some of these ideas.

If FurAffinity wanted to raise money to pay for the additional database and server-side processing necessary to assure the safety of the platform, and to avoid retraumatizing abuse victims in our community, I think that would be acceptable. But I can't speak for the community on this one.

The only unacceptable outcome is nothing changing.
