EDIT

TO EVERYONE ASKING TO OPEN AN ISSUE ON GITHUB, IT HAS BEEN OPEN SINCE JULY 6: https://github.com/LemmyNet/lemmy/issues/3504

Another issue has been open since June 24: https://github.com/LemmyNet/lemmy/issues/3236

TO EVERYONE SAYING THAT THIS IS NOT A CONCERN: Every country has different laws (in other words, not everyone is American), and whether or not an admin is liable for such content residing on their servers without their knowledge, don’t you think it’s still an issue anyway? Are you not bothered by the fact that somebody could be sharing illegal images from your server without you ever knowing? Is that okay with you? Or are you only saying this because you’re NOT an admin? Several admins have already responded in the comments and suggested ways to solve the problem, because they are as genuinely concerned about it as I am. Thank you to all the hard-working admins. I appreciate and love you all.


ORIGINAL POST

cross-posted from: https://lemmy.ca/post/4273025

You can upload images to a Lemmy instance without anyone knowing that the image is there if the admins are not regularly checking their pictrs database.

To do this, you create a post on any Lemmy instance, upload an image, and never click the “Create” button. The post is never created but the image is uploaded. Because the post isn’t created, nobody knows that the image is uploaded.

You can also go to any post, upload a picture in the comment, copy the URL and never post the comment. You can also upload an image as your avatar or banner and just close the tab. The image will still reside in the server.

You can (possibly) do the same with community icons and banners.

Why does this matter?

Because anyone can upload illegal images without the admin knowing and the admin will be liable for it. With everything that has been going on lately, I wanted to remind all of you about this. Don’t think that disabling cache is enough. Bad actors can secretly stash illegal images on your Lemmy instance if you aren’t checking!

These bad actors can then share these links around and you would never know! They can report it to the FBI and if you haven’t taken it down (because you did not know) for a certain period, say goodbye to your instance and see you in court.

Only your backend admins who have access to the database (or object storage or whatever) can check this, meaning non-backend admins and moderators WILL NOT BE ABLE TO MONITOR THESE, and regular users WILL NOT BE ABLE TO REPORT THESE.

Aren’t these images deleted if they aren’t used for the post/comment/banner/avatar/icon?

NOPE! The image actually stays uploaded! Lemmy doesn’t check whether the images are used! Try it out yourself. Just make sure to grab the URL, either by copying the link text or by clicking the image and choosing “copy image link”.

How come this hasn’t been addressed before?

I don’t know. I am fairly certain that this has been brought up before. Nobody paid attention but I’m bringing it up again after all the shit that happened in the past week. I can’t even find it on the GitHub issue tracker.

I’m an instance administrator, what the fuck do I do?

Check your pictrs images (good luck) or nuke it. Disable pictrs, restrict sign ups, or watch your database like a hawk. You can also delete your instance.

Good luck.

  • r00ty@kbin.life · 1 year ago

    I’m not using Lemmy. But I was thinking of making a process that periodically scans the object storage, checks for a reference from a post, comment, etc., and deletes the image if none is found. In most cases the images are deleted, but sometimes they don’t seem to be.

    Lemmy could probably have a similar process created.
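
    A minimal sketch of that sweep, assuming pict-rs is backed by S3-compatible object storage and Lemmy by Postgres. The table and column names are illustrative guesses rather than the real schema, so log matches first and only enable deletion once you trust the results:

    ```python
    import boto3      # S3-compatible object storage client
    import psycopg2   # Postgres client

    BUCKET = "pictrs"  # assumed bucket name

    def is_referenced(cur, key):
        """True if any post, comment, avatar or banner mentions this image key."""
        # Hypothetical tables/columns -- adjust to the actual schema.
        cur.execute(
            """
            SELECT 1 FROM post    WHERE url LIKE %(k)s OR body LIKE %(k)s
            UNION ALL
            SELECT 1 FROM comment WHERE content LIKE %(k)s
            UNION ALL
            SELECT 1 FROM person  WHERE avatar LIKE %(k)s OR banner LIKE %(k)s
            LIMIT 1
            """,
            {"k": "%" + key + "%"},
        )
        return cur.fetchone() is not None

    def sweep():
        s3 = boto3.client("s3")
        with psycopg2.connect("dbname=lemmy") as conn, conn.cursor() as cur:
            for page in s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET):
                for obj in page.get("Contents", []):
                    if not is_referenced(cur, obj["Key"]):
                        print("orphan:", obj["Key"])  # log first, delete once confident
                        # s3.delete_object(Bucket=BUCKET, Key=obj["Key"])

    if __name__ == "__main__":
        sweep()
    ```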

  • garrett@infosec.pub · 1 year ago

    Yeah, this is a big issue. I know Lemmy blew up a bit before it was truly ready for prime time, but I hope this gets cleaned up.

  • squiblet@kbin.social · 1 year ago

    It would not be difficult to use SQL to delete any images that are not associated with a post, active as an avatar, etc. Set that to run periodically and it would solve this problem.
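
    One possible shape for that cleanup, assuming uploads are tracked in a hypothetical image_upload table keyed by a pictrs alias (deleting the row would still need a follow-up purge of the file from pict-rs itself):

    ```python
    import psycopg2

    # Hypothetical schema: image_upload(pictrs_alias, published, ...).
    CLEANUP = """
    DELETE FROM image_upload i
    WHERE i.published < now() - interval '1 hour'  -- grace period for in-flight posts
      AND NOT EXISTS (SELECT 1 FROM post p
                      WHERE p.url  LIKE '%' || i.pictrs_alias || '%'
                         OR p.body LIKE '%' || i.pictrs_alias || '%')
      AND NOT EXISTS (SELECT 1 FROM comment c
                      WHERE c.content LIKE '%' || i.pictrs_alias || '%')
      AND NOT EXISTS (SELECT 1 FROM person u
                      WHERE u.avatar LIKE '%' || i.pictrs_alias || '%'
                         OR u.banner LIKE '%' || i.pictrs_alias || '%')
    """

    with psycopg2.connect("dbname=lemmy") as conn, conn.cursor() as cur:
        cur.execute(CLEANUP)  # run from cron; test with SELECT before DELETE
        print("removed", cur.rowcount, "orphaned image rows")
    ```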

    • Kaldo@kbin.social · 1 year ago

      Isn’t it more likely that paths are used to reference resources like images rather than a db fk?

      • squiblet@kbin.social · edited · 1 year ago

        Not familiar with Lemmy specifically, but usually in an app like this, while the files are of course stored on a filesystem, IDs and metadata are stored in the DB and associated with each other through relations. In this case, one way to express it would be: keep every image that is associated with a valid post or in-use avatar, but delete everything else.

        Take this random image for instance: https://lemmy.world/pictrs/image/ede63269-7b8a-42a4-a1fa-145beea682cb.jpeg
        associated with this post: https://lemmy.world/post/4130981

        Highly likely the way it works is that there is an entry for post 4130981 saying it uses ede63269-7b8a-42a4-a1fa-145beea682cb, or an image table with a relation to the post table, where the image entry (whatever its ID) for ede63269-7b8a-42a4-a1fa-145beea682cb says it is related to post 4130981. Whatever the specifics, it would be possible.
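
        Expressed as a query against that guessed structure (table and column names are speculative):

        ```python
        import psycopg2

        # Speculative table: image_upload(pictrs_alias, post_id).
        ALIAS = "ede63269-7b8a-42a4-a1fa-145beea682cb"

        with psycopg2.connect("dbname=lemmy") as conn, conn.cursor() as cur:
            cur.execute("SELECT post_id FROM image_upload WHERE pictrs_alias = %s",
                        (ALIAS,))
            row = cur.fetchone()
            print("referenced by post", row[0] if row else None)  # None = orphan
        ```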

    • bmygsbvur@lemmy.ca (OP) · 1 year ago

      I’m not knowledgeable about SQL. If you, or anyone else, know how to fix this with a script or build it into Lemmy, please share.

      • squiblet@kbin.social · 1 year ago

        I haven’t worked with Lemmy, but I certainly could craft a script to do that if I was familiar with the database structure. Perhaps I’ll try installing it and running an instance. In the meantime, surely there’s someone with an instance and SQL skills who could figure that out.

  • Swedneck@discuss.tchncs.de · 1 year ago

    seems like the solution to this should be to automatically remove images that haven’t been posted, after like 3 minutes

    • Venat0r@lemmy.world · 1 year ago

      Or make it like 1 hour, and don’t let the user know the URL of the uploaded image until they post it; that way it couldn’t be shared or reported.

      • squiblet@kbin.social · 1 year ago

        It’s difficult to display an image without the client knowing the URL, but it would be possible to use a temporary URL that only works for that signed-in user.
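
        A minimal sketch of such a temporary URL, using an HMAC token bound to the image, the user, and an expiry time; all of the names here are invented for illustration:

        ```python
        import hashlib, hmac, time

        SECRET = b"server-side secret"  # assumed per-instance signing key

        def make_token(image_id: str, user_id: int, ttl: int = 3600) -> str:
            expires = int(time.time()) + ttl
            msg = f"{image_id}:{user_id}:{expires}".encode()
            sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
            return f"{expires}:{sig}"

        def check_token(image_id: str, user_id: int, token: str) -> bool:
            expires, sig = token.split(":")
            if int(expires) < time.time():
                return False  # link has expired
            msg = f"{image_id}:{user_id}:{expires}".encode()
            expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
            return hmac.compare_digest(sig, expected)
        ```

        Since the user ID goes into the signature, a leaked link is useless to anyone who isn’t logged in as the uploader.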

    • KIM_JONG_JUICEBOX@lemmy.ml · 1 year ago

      Or you set a flag that says something like “incomplete image”, and only once the user completes whatever operation by hitting “submit” do you set it to complete.

      And maybe while an image is not yet complete, only the uploading user can view the image.
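
      A sketch of how that flag could work server-side, again assuming a hypothetical image_upload table:

      ```python
      import psycopg2

      conn = psycopg2.connect("dbname=lemmy")

      def mark_complete(image_id):
          # Called from the create-post handler once the user actually hits "submit".
          with conn, conn.cursor() as cur:
              cur.execute("UPDATE image_upload SET pending = false WHERE id = %s",
                          (image_id,))

      def purge_stale(max_age_minutes=60):
          # Cron job: remove uploads that were never attached to anything.
          with conn, conn.cursor() as cur:
              cur.execute(
                  "DELETE FROM image_upload "
                  "WHERE pending AND published < now() - %s * interval '1 minute'",
                  (max_age_minutes,),
              )
      ```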

    • cwagner@lemmy.cwagner.me · 1 year ago

      The idea is good, but 3 minutes is horribly short and would result in tons of posts with broken images. After one hour would be bad UX but workable. 3 minutes would just suck :D

        • russjr08@outpost.zeuslink.net · 1 year ago

          As in when you’d normally get automatically logged out? If so, I’m not sure that would work, since Lemmy seems to use JWTs that don’t expire (or if they do, not for a very long time).

      • Kitikuru@sh.itjust.works · 1 year ago

        The 3 minutes would only kick in if an image was uploaded but never posted, so nobody would have seen it in any case.

        This route would avoid the issue and also help save on storage space.

        • Racle@sopuli.xyz · 1 year ago

          What happens if user spends over 3 minutes to write the post after uploading image?

          Would the user end up creating a post with a broken image link? Or would there be some kind of “call home” API call on the create-post page so the image wouldn’t be removed? (Which carries the risk that the API call could be replicated by a bot.)

          • Swedneck@discuss.tchncs.de · 1 year ago

            Could allow for like one hour of keep-alive pings before it’s deleted and the client is told to notify the user of this

            Also: rate limits that gradually increase are good.
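
            Sketched against the hypothetical pending-upload table from the earlier pending-flag sketch, a keep-alive ping would just refresh the timestamp the purge job checks:

            ```python
            def keep_alive(cur, image_id):
                # Client pings while the compose window is open; the purge
                # window restarts from the last ping.
                cur.execute(
                    "UPDATE image_upload SET published = now() "
                    "WHERE id = %s AND pending",
                    (image_id,),
                )
            ```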

          • Kitikuru@sh.itjust.works · 1 year ago

            That is a good point. The image could potentially not be uploaded until the post is created, instead of at image selection, which would also alleviate the issue. But I’m not sure how that would work across web and mobile clients.

            • squiblet@kbin.social · 1 year ago

              I think that’s the best solution. I can’t see a reason any client couldn’t upload the image when the post is submitted. Currently the uploader is some fancy javascript deal and it’s unnecessary.

            • CptMuesli@artemis.camp · 1 year ago

              This could be handled by the client: get the ruleset for image uploads (max size, format, etc.), validate the image within the client, and only upload when the post is published.
              Then the delay between post and image only depends on your internet connection, and the user can still take 3 hours to write a post.
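
              A rough client-side sketch of that flow; the endpoint, field names, and limits are invented for illustration, not Lemmy’s real API:

              ```python
              import os
              import requests

              MAX_BYTES = 2 * 1024 * 1024                    # assumed server-advertised limit
              ALLOWED = {".png", ".jpg", ".jpeg", ".webp"}   # assumed allowed formats

              def submit_post(title, body, image_path):
                  ext = os.path.splitext(image_path)[1].lower()
                  if ext not in ALLOWED:
                      raise ValueError("unsupported image format")
                  if os.path.getsize(image_path) > MAX_BYTES:
                      raise ValueError("image too large")
                  # The image travels with the post itself, so nothing lands
                  # on the server unless the post is actually created.
                  with open(image_path, "rb") as f:
                      resp = requests.post(
                          "https://example.instance/api/v3/post/with_image",  # hypothetical
                          data={"name": title, "body": body},
                          files={"image": f},
                          timeout=30,
                      )
                  resp.raise_for_status()
                  return resp.json()
              ```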

  • CaptObvious@literature.cafe · 1 year ago

    I can’t be the only one getting bored with the 8-hr-old accounts spreading FUD.

    If you have a legitimate concern, post it from your proper account. Otherwise it looks like you’re just trolling for Spez. It’s pathetic, really.

    • bmygsbvur@lemmy.ca (OP) · 1 year ago

      You’re not concerned because you’re not an admin. Of course you only bothered to check my account profile and not the actual post. If the issue I stated above doesn’t bother you, then it’s only a matter of time until people start seriously abusing it. Or who knows, somebody already is and we just aren’t aware of it yet.

      • CaptObvious@literature.cafe · 1 year ago

        I’m not concerned because people smarter than us have said it isn’t a concern. So long as they preserve their safe harbor shield, instance admins are not generally liable for content posted by users.

        • bmygsbvur@lemmy.ca (OP) · edited · 1 year ago

          Who are these people that are smarter than us? Do you know them? What are their qualifications?

          Did you not consider that not everyone is subject to American law and that there are other nations who have different laws? Did you not consider how diverse the Lemmy instances are and most do not fall under American law?

          How come every Lemmy admin who replied to this post has expressed concern regarding this issue? Explain to me why admins like sunasaurus and db0 are working on tools and solutions to address this problem if, according to you, it is not a concern.

          Are you REALLY SURE that this is NOT a concern?

          • teawrecks@sopuli.xyz · 1 year ago

            Qualified person here. You’re spreading FUD.

            The fact that someone can upload illegal content to a lemmy server doesn’t change whether or not it is associated with a post. These are two separate issues:

            • moderation of user submitted content
            • moderation of abuse of hosting functionality (illegal or otherwise)

            Both are real issues that need to be addressed, obviously, but it’s simply not the case that a server admin’s only visibility into the content hosted on their server is that which a user associates with a post. If you know any admins like that, do them a favor and let them know they have no business running a lemmy server.

          • CaptObvious@literature.cafe · 1 year ago

            > Who are these people that are smarter than us? Do you know them? What are their qualifications?

            I don’t know and I don’t need to know. They are clearly capable of reading sources that are authoritative (e.g. EFF) and applying a modicum of logic and common sense. Their response is rational rather than breathless and dramatic moral panic.

            > Did you not consider that not everyone is subject to American law…?

            Of course there are countries other than the US. International safe harbor exists for a reason. Can you name one country that doesn’t have a safe harbor provision for web site hosts? Just one.

            > Did you not consider [that] most [instances] do not fall under American law?

            It would be interesting to know how many Lemmy instances don’t fall under US law. I don’t know. Do you? Based on which source?

            > How come every Lemmy admin who replied to this post has expressed concern regarding this issue? Explain to me why admins like sunasaurus and db0 are working on tools and solutions to address this problem if, according to you, it is not a concern.

            So that’s two who are working on tools. Not panicked and not viewing this as a giant problem. Two out of thousands. It’s an exceptionally low percentage and not even remotely statistically significant.

            > Are you REALLY SURE that this is NOT a concern?

            A concern? Sure. Is anyone going to prison if they don’t bow to your demands right now as you’ve suggested up and down this thread? Unlikely.

            What is your usual account? Why are you hiding? Do you plan to plant evidence and then call the authorities?

            • bmygsbvur@lemmy.ca (OP) · 1 year ago

              Again, you are assuming everything is based on American law. What is up with people always thinking that American laws apply everywhere in the world?

              “Do you plan to plant evidence and then call the authorities?” No but be very careful about statements like this.

              In the end, you admitted that this is a concern anyway. Congrats. Can’t believe it took so much to hammer it into your head.

              • CaptObvious@literature.cafe · 1 year ago

                I was going to just let it go, but it’s late and my patience is exhausted.

                > …be very careful about statements like this.

                Or what? You’ll have your dad beat up my dad?

              • CaptObvious@literature.cafe · 1 year ago

                So you’re just going to ignore any inconvenient points and glom onto my agreement that this issue is a small concern? You think that constitutes “winning”?

                TBH, if you need to win an argument with an internet stranger that badly, I’m happy to oblige?

    • sonstwas@sh.itjust.works · 1 year ago

      Additionally this isn’t the community where this needs to be addressed. Either contact the admins or open an issue on GitHub.

  • d3Xt3r@lemmy.nz · edited · 1 year ago

    Or just disable image uploads completely. We got by on Reddit without any built-in image hosting functionality for over a decade, so Lemmy should be fine without it as well - especially considering that we don’t really have many image-heavy communities, besides the NSFW instances. I mean, storage costs money you know, and with Lemmy being run by volunteers, it makes even more sense to get rid of image hosting to save costs.

    • cwagner@lemmy.cwagner.me · 1 year ago

      Just being able to paste images in comments (usually screenshots in my case, related to the comment in question) is super convenient. Images are for more than just memes and photos.

      • d3Xt3r@lemmy.nz · edited · 1 year ago

        This can be easily implemented client-side, like how third-party Reddit clients have been doing for years, by uploading to the likes of Imgur. Shift the responsibility away from the Lemmy server and onto dedicated image hosts, who have the storage to spare, plus manpower/policies to deal with illegal content.

          • d3Xt3r@lemmy.nz · edited · 1 year ago

            > Desktop users exist

            So do desktop tools like Flameshot, which can directly upload to image hosts and copy the URL to the clipboard, and there are also third-party desktop web clients such as Photon, which could be updated with that functionality as well. But with Lemmy itself being open source, it wouldn’t take much effort to modify the code to use a third-party image host.

            > have a history of deciding to forbid hotlinking

            There are plenty of hosts which do allow hotlinking though, like imgbb.com

            > history of suddenly deleting all (e.g. PhotoBucket) or some (e.g. Imgur) images.

            Not a big loss, IMO. Lemmy isn’t an image host or an image-centric site; it’s first and foremost a text-heavy forum, and anyone posting images is encouraged to provide text alts for the benefit of blind users, so images not persisting isn’t a big deal.

            If image persistence is really that important, there are other services better suited for that, such as Pixelfed. But in the first place, I wouldn’t rely on some random Lemmy server, which is vulnerable to DDoS and other attacks and could go down at any time (which is also why decentralization is important), as an assurance of persistence. I mean, when there’s no guarantee that a Lemmy instance will even be there tomorrow, is there really a need to worry about image persistence?

            • cwagner@lemmy.cwagner.me · 1 year ago

              I feel like you are saying “there are other issues, so we should not care and should forget about it”, which seems very defeatist to me.

  • Tetsuo@jlai.lu · 1 year ago

    I’m usually pretty relaxed when it comes to disclosure of vulnerabilities, but this is the kind of issue where I think it would have been better to privately report it to the Lemmy devs and wait (a long time, probably) for it to be fixed before disclosing.

    Especially since there are currently multiple people abusing the image-hosting feature.

    Not a big deal, but sometimes it is better practice to give the devs an opportunity to fix something before forcing them to do so in a hurry.

    • CaptObvious@literature.cafe · 1 year ago

      Nah. Where’s the drama and FUD in behaving like adults? Much better to make a brand new account and spam moral panic all over the fediverse. /smh

    • bmygsbvur@lemmy.ca (OP) · 1 year ago

      I’ve mentioned this before in a similar reply, but I’ll say it again: this was already publicly known months ago. People just forgot about it because they didn’t think it was a big deal. Now that they realize CSAM is a real issue, I made this post to remind everyone about it. Bad actors already know about this, and really, it isn’t hard to figure out how it works.

    • BreakDecks@lemmy.ml · 1 year ago

      I hate how everything is a double-edged sword, because this is now also the perfect tool for making sure your CSAM doesn’t trip the filter. Also, it uses CLIP, so a simple obfuscation overlay would render it useless.

        • BreakDecks@lemmy.ml · 1 year ago

          Any filter or image-processing technique that fools machine vision.

          Example: https://sandlab.cs.uchicago.edu/fawkes/

          At a high level, Fawkes “poisons” models that try to learn what you look like, by putting hidden changes into your photos, and using them as Trojan horses to deliver that poison to any facial recognition models of you.

          This could be done with any kind of image or detail, not just faces.

          • db0@lemmy.dbzer0.com · 1 year ago

            I don’t think random trolls like that would be that sophisticated, but in any case we can deal with that once we get to that point.

    • bmygsbvur@lemmy.ca (OP) · 1 year ago

      I’m not on GitHub. Someone else can submit this, and I’m very sure the Lemmy devs are aware. They just have different priorities.

  • Admiral Patrick@dubvee.org · edited · 1 year ago

    Just my two cents, but I feel it’s quite irresponsible to post a “how to exploit this platform” guide ON the platform.

  • Kool_Newt@lemm.ee · 1 year ago

    This is just like how someone could put printed CSAM behind a bush in my yard or something and some authorities could decide to hold me responsible.

    • lazynooblet@lazysoci.al · 1 year ago

      To take your analogy: it would be more like someone hosting a collection of material in your yard and inviting all the pedos to use your yard to view and share other material.

      • Kool_Newt@lemm.ee · 1 year ago

        These are interesting thought experiments.

        If I live in, say, Oregon, and own 20 acres in Montana, am I responsible for an hourly or daily sweep of my distant property to ensure no CSAM exists there, lest I be held responsible? Would I need to hire guards to ensure nobody uses a hole in a tree on my property to stash CSAM, or else be held responsible?

        IMHO, it is, or should be, about what’s reasonable. Obviously hourly sweeps of acres of property are ridiculous, but if you run a magazine stand then you should have some process to ensure that what is sold is legal. Similarly, so long as a Lemmy server operator is following current best practices and isn’t grossly negligent, it seems like a bad idea to hold them responsible. As time goes on, best practices will evolve better methods of keeping this stuff out.

        TBH, my worry is that the owners of Reddit could pay some bad actors to post CSAM to lemmy servers to take out their competition.

    • bmygsbvur@lemmy.ca (OP) · 1 year ago

      So you’re telling me you’re NOT bothered if CSAM was sitting on your server and shared with others without your knowledge? Do you think all countries have the same laws? You don’t think any of this is an issue?

      • Kool_Newt@lemm.ee · 1 year ago

        You’re a regarded one. I won’t bother to answer such dumb questions.

        • bmygsbvur@lemmy.ca (OP) · 1 year ago

          You’re not an admin so of course you don’t care. How come every admin in this thread has expressed their concern? Because it IS a concern. :)

  • apprehentice@lemmy.enchanted.social · 1 year ago

    Funny, I couldn’t even get pict-rs working on my instance. I don’t need it, either. I just upload to an FTP server when I need to share something.

  • BreakDecks@lemmy.ml · 1 year ago

    > the admin will be liable for it.

    > These bad actors can then share these links around and you would never know! They can report it to the FBI and if you haven’t taken it down (because you did not know) for a certain period, say goodbye to your instance and see you in court.

    In most jurisdictions, this is not how it would work. Even a less tech-savvy investigator would figure out that it was an online community not obviously affiliated with CSAM, and would focus on alerting you and getting the content removed.

    There’s this misunderstanding that CSAM is some sort of instant go-to-prison situation, but it really does depend on context. It’s generally not so easy to just plant illegal files and tip off the FBI, because the FBI is strategic enough not to be weaponized like that. Keep an eye on your abuse and admin email inboxes, and take action as soon as you see something, and nobody is going to shut you down or drag you to court.

    • bmygsbvur@lemmy.ca (OP) · 1 year ago

      Doesn’t change the fact that this is an issue that needs to be resolved.

      • koper@feddit.nl · 1 year ago

        It’s not. Image hosting sites have existed for decades. Websites are not liable unless they have actual knowledge of illegal content and ignore takedown requests. Stop fearmongering.

        • bmygsbvur@lemmy.ca (OP) · 1 year ago
          1 year ago

          Doesn’t change the fact that this issue needs to be addressed. Besides, do you think all countries’ laws are the same?