Some of you may have noticed a lot of people freaking out about CSAM: a bunch of communities closing, instances restricting registrations, turning off image uploads, or shutting down completely. It’s a bit chaotic.

Fortunately, your admin has been fighting this fight for the past year, so I have already developed some tools to help me out. I repurposed one of them to cover Lemmy images.

Using this approach, I’ve now turned on automatic scanning of new uploads.

What this means for you is that occasionally you will upload an image for a post and it will stop working after a bit. C’est la vie. Just upload something else. Changing format or slightly altering the image won’t help you.

Also, sometimes you might see missing thumbnails on posts from other communities. Those were the cached thumbnails hosted by us. The original images should still work in those cases.

Unfortunately, this sort of AI scanning is not perfect; by the nature of the beast, it will catch some false positives, but at a rate I consider acceptable. I find that trade-off OK for a small social network site run as a hobby project.

Cool? Cool.

  • aldalire@lemmy.dbzer0.com · 10 months ago

Behavior like this baffles me. I wonder what they stand to gain by spamming CSAM? Purely destructive, psychopathic behavior :|

    • db0@lemmy.dbzer0.com (OP) · 10 months ago

Most likely 4channers. They do stuff like that constantly and laugh at the panic.

  • nyakojiru@lemmy.dbzer0.com · 10 months ago

This is the perfect moment/place for Interpol and other organizations to catch some pedo motherfuckers. Those people are absolute human trash. Also, expect heavy vigilance from big eyes over the fediverse.

  • dangblingus@lemmy.dbzer0.com · 10 months ago

    Defederate that tankie shit. Literally one of the main rules of this instance is no tankie shit, yet HB is here.

    • ArcaneSlime@lemmy.dbzer0.com · 10 months ago (edited)

      I happen to agree with this somewhat blunt individual here. They’re annoying, and though I’m not generally in favor of censoring even people as abhorrent as they, when it becomes so prevalent and aggressive that it is tantamount to spam, I think it is warranted. The genocide denial and all that is just icing on the cake to me, frankly.

  • SootySootySoot [any]@hexbear.net · 10 months ago (edited)

    This is dang impressive, nice! As someone entirely unappreciative and unaware of Lemmy’s backend, is it likely a tool that other instances could/would make use of? Or is it too faffy to redeploy for other instances?

  • hemko@lemmy.dbzer0.com · 10 months ago

Very cool! Personally, I’d even consider blocking any pictures of children on the website, as those are almost always posted by adults without real consent, and the rest of the time by children who don’t understand the consequences of uploading their photos to the internet.

    But this is very cool!

  • lambalicious@lemmy.sdf.org · 10 months ago

    and due to the nature of the beast, it will catch more false positives but to an acceptable degree

    1. What is this “acceptable degree”? Where is it documented?
    2. What is the recourse for the uploader in case of a false positive? And no, I don’t mean “upload something else”; I mean, what do you answer to “my legit content is being classified by a shared internet tool as CSAM, of all things”?

  • Dizzy Devil Ducky@lemm.ee · 10 months ago

    …will catch more false positives but to an acceptable degree.

I’d much rather see it catching plenty of false positives than none, because it at least shows it’s working as it should.