Not discrediting Open Source Software, but nothing is 100% safe.

    • andrew@lemmy.stuart.fun

      Man we would have been so much better with plaintext communications everywhere, right?

      You cite heartbleed as a negative but a) SSL would never have proliferated as it has without openssl and b) the fix was out in under a week and deployed widely even faster.

      The alternative, proprietary crypto, would have all the same problems including the current laggards, but likely without everyone understanding what happened and how bad it was. In fact, it probably wouldn’t have been patched because some manager would’ve decided it wasn’t worth it vs new features.
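
A minimal, hypothetical Python sketch of the class of bug behind Heartbleed: a length field supplied by the peer and trusted without a bounds check. This is a simplified model for illustration, not OpenSSL's actual code:

```python
# Hypothetical model: the server echoes back a "heartbeat" payload,
# trusting the length the client claims instead of the length it sent.
PROCESS_MEMORY = b"secret-private-key-material"  # stand-in for adjacent heap data

def heartbeat_vulnerable(payload: bytes, claimed_len: int) -> bytes:
    # Bug: no check that claimed_len <= len(payload), so the reply can
    # include bytes that sit past the payload in memory.
    buffer = payload + PROCESS_MEMORY
    return buffer[:claimed_len]

def heartbeat_fixed(payload: bytes, claimed_len: int) -> bytes:
    # The fix: discard records whose claimed length exceeds the actual payload.
    if claimed_len > len(payload):
        raise ValueError("length field exceeds actual payload")
    return payload[:claimed_len]

print(heartbeat_vulnerable(b"ping", 20))  # leaks bytes of the "secret"
```

The actual OpenSSL fix was essentially this shape: validate the record's stated payload length before copying from it.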

      • Muddybulldog@mylemmy.win

        I think the point more relevant to the original post is that, while the speed with which the fixes were rolled out was admirable, the flaw existed for years before anybody noticed it.

        • TheYang@lemmy.world

          It would have been way worse, because in closed-source software it would have been far less discoverable by anyone.

          • Muddybulldog@mylemmy.win

            Devil’s Advocate…

            Codenomicon, the company that actually named the flaw, didn’t find the bug via the source code. They were building a security product, and testing that product against their own servers exposed the flaw. Open Source was not a factor in this discovery.

            Google HAD discovered the flaw via the source code, exactly two days earlier.

            In this case, the bug was 0.267379679% more discoverable due to being open source versus being closed.

        • andrew@lemmy.stuart.fun

          I’m not mad, just disappointed.

          In all seriousness though, I just disagree. I think it’s important to note how inaccurate it is to frame a bug that is famous only because it was deliberately publicized in an open source project as anything but a huge win, compared to what would likely have played out had the most popular SSL library in the world been proprietary and closed.

          • bloodfart@lemmy.ml

            What do you disagree with? Heartbleed was a vulnerability in OpenSSL. It affected millions of computers.

            • stappern@lemmy.one

              That is a big problem, and it was quickly fixed. I don’t see how it does proprietary software any favors…

              • bloodfart@lemmy.ml

                The only person in the whole thread talking about proprietary software is that guy.

                This is a thread about how the accepted wisdom, that many eyes make open source software more secure, rests on the assumption that someone else is effectively auditing the code base, which has been proven over and over again not to be true.

                E: I just looked at this thread and now everyone is talking about proprietary software. It would be cool if the progression of time made fools of us all, but it looks like it’s just me this time.

      • damnthefilibuster@lemmy.world

        the fix was out in under a week

        I don’t disagree with this, but as for your point about automatic audits… there’s always a learning curve in preventing silly shit like Heartbleed from getting into the system. But the idea that there was no check against this when it was first PR’d seems almost absurd. This is why sticking hard to API and design specs, and building testing around them, is so important.

        I’m sure they learnt a valuable lesson there.

  • TheBeege@lemmy.world

    I had a discussion with a security guy about this.

    For software with a small community, proprietary software is safer. For software with a large community, open source is safer.

    Private companies are subject to internal politics, self-serving managers, prioritizing profit over security, etc. Open source projects need enough skilled people focused on the project to ensure security. So smaller companies are more likely to do a better job, and larger open source projects are likely to do a better job.

    This is why you see highly specialized software run by really enterprise-y companies. It just works better going private, as much as I hate to say it. For more general software, especially utilities like OpenSSL, it’s much easier to build large communities and ensure quality.

    • Zeth0s@reddthat.com

      Unfortunately that is not the case. Closed-source software for small communities is not safer. My company had an incredibly embarrassing data leak because they outsourced some work and trusted software that was also used by our competitors. Unfortunately the issue was found by one of our customers and ended up in the newspapers.

      Absolutely deserved, but still: closed-source stuff is not more secure.

    • andrew@lemmy.stuart.fun

      With all due respect, I have to strongly disagree. I would hold that all OSS is fundamentally better regardless of community size.

      Small companies go under with startling frequency, and even with an ironclad contract, there’s often nothing you can do but take them to court when they’ve gone bankrupt. Unless you’ve specifically contracted for source access, you’re completely SOL. Profitable niche companies lose interest too, and while you may not have the same problems if they sell out, you’ll eventually have very similar problems that you can’t do anything about.

      Consider any of my dozens of little OSS libraries that a handful of people have used, on the other hand. Maybe I lost interest a while ago, but it’s pretty well written still (can’t have people judging my work) and when you realize it needs to do something, or be updated (since things like dependabot can automatically tell you long after I’m gone), you’re free and licensed to go make all the changes you need to.

      I think you see highly specialized software being run by enterprisey companies because that’s just business, not because it’s better. It’s easiest to start in a niche and grow from there, but that holds true with open software and protocols too. Just look at the internet: it used to share research projects between a handful of universities, and now it serves petabytes of cat gifs. Or Linux: it started out as a hobby operating system for a handful of Unix geeks, and now it runs 96.3 percent of the top 1 million web servers.

      It always starts small and gets better if it’s good enough. This goes for OSS and companies.

    • stappern@lemmy.one

      No, just no. Proprietary software is always potential malware, small or big. That will never change.

    • Distributed@lemmy.ml

      prioritizing profit over security

      Laughs nervously while looking at my company’s auth db, which still uses bare SHA-256, lol…
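
For reference, the usual fix here is a salted, deliberately slow hash rather than a bare SHA-256. A minimal sketch using only the Python standard library; the iteration count is illustrative, not a vetted policy:

```python
import hashlib
import os
import secrets

ITERATIONS = 600_000  # illustrative work factor; tune for your hardware

def hash_password(password: str) -> tuple:
    # A unique random salt defeats rainbow tables; many iterations slow brute force.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return secrets.compare_digest(candidate, digest)  # constant-time compare
```

A memory-hard function like scrypt or Argon2 would be stronger still, but even this is a big step up from a single fast hash.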

  • bill_1992@lemmy.world

    Even audited source code is not safe. Supply-chain attacks are possible. A lot of times, there’s nothing guaranteeing the audited code is the code that’s actually running.
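
One partial mitigation is to pin the hash of the artifact that was audited and verify what you actually deploy against it. A minimal Python sketch (the artifact bytes here are made up); this narrows the gap but doesn't by itself solve reproducible builds:

```python
import hashlib

def verify_artifact(data: bytes, pinned_sha256: str) -> bool:
    # Compare what you received against the hash recorded at audit/release time.
    return hashlib.sha256(data).hexdigest() == pinned_sha256

artifact = b"pretend these are release tarball bytes"
pinned = hashlib.sha256(artifact).hexdigest()  # recorded when the code was audited

print(verify_artifact(artifact, pinned))                # True
print(verify_artifact(artifact + b"tampered", pinned))  # False
```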

  • Captain Beyond@linkage.ds8.zone

    Free software has only promised its users the Four Freedoms, which are the freedoms to use, share, modify, and share modified copies of the software. That is not an inherent guarantee that it is more secure.

    Even if you yourself don’t know how to work with code, you can always enlist the community or a trusted friend to exercise those freedoms on your behalf. This is like saying right to repair is meaningless because you don’t know how to repair your own stuff.

    • Lvxferre@lemmy.ml

      I have doubts about the Linux kernel being properly audited.

      Torvalds is doing it so he has more reasons to chain insults. “I SAID NO REGRESSIONS, YOU BUNCH OF %#$%%&#$@#$%#&%#!!!”

    • TheYang@lemmy.world

      I mean, what’s a “proper audit”? Most audits my company does are a complete smoke-and-mirrors sham, but they do get certifications. Is that “proper”?

      I’m pretty confident that the code quality of Linux is, on average, higher than that of the Windows kernel. And that’s because not only do other people read and review it, the programmer also knows his shit is there for everyone to see. So by and large, people are more ashamed to submit some stringy mess that barely works.

    • regeya@lemmy.world

      A little scary to contemplate since some of the code comes from the NSA

      • Hubi@feddit.de

        I’m pretty sure the code submitted by the NSA has had more people look over it than any other snippet in there.

        • lemminer@lemmy.world

          Probably there’s more to it. Who knows, maybe the active developers were contacted by secret services to add something kinky.

  • Cypher@lemmy.world

    Luckily there are people who do know, and we verify things for our own security and for the community as part of keeping Open Source projects healthy.

    • guy@lemmy.world

      Though one of the major issues is that people get comfortable with that idea and assume that for every open source project there is some other good Samaritan auditing it.

      • 𝕽𝖔𝖔𝖙𝖎𝖊𝖘𝖙@lemmy.world

        I would argue that even in that scenario it’s still better to have the source available than have it closed.

        If nobody has bothered to audit it then the number of people affected by any flaws will likely be minimal anyway. And you can be proactive and audit it yourself or hire someone to before using it in anything critical.

        If nobody can audit it that’s a whole different situation though. You pretty much have to assume it is compromised in that case because you have no way of knowing.

        • guy@lemmy.world

          Oh definitely, I fully agree. It’s just a lot of people need to stop approaching open source with an immediate inherent level of trust that they wouldn’t normally give to closed source. It’s only really safer once you know it’s been audited.

    • bill_1992@lemmy.world

      Have you seen the dependency trees of projects in npm? I really doubt most packages are audited on a regular basis.

      • AlexWIWA@lemmy.ml

        It’s safe because there’s always a loud nerd who will make sure everyone knows if it sucks. They will make it their life mission.

          • AlexWIWA@lemmy.ml

            I’ll listen to them because I love OSS drama. But you’re right that they may just get passed over at large

      • buckykat@lemmy.fmhy.ml

        Also because those people who can audit it don’t have a financial incentive to hide any flaws they find

      • andrew@lemmy.stuart.fun

        And to a large extent, there is automatic software that can audit things like dependencies. This software is also largely open source because hey, nobody’s perfect. But this only works when your source is available.
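
At their core, tools like Dependabot and Renovate compare a project's declared dependencies against a database of known-vulnerable versions. A toy Python sketch of that idea; the advisory data and package names below are invented:

```python
# package -> set of versions with known vulnerabilities (made-up data)
ADVISORIES = {
    "openssl-wrapper": {"1.0.1"},
}

def audit(dependencies: dict) -> list:
    # Flag any pinned dependency whose exact version appears in an advisory.
    findings = []
    for name, version in dependencies.items():
        if version in ADVISORIES.get(name, set()):
            findings.append(f"{name}=={version} has a known vulnerability")
    return findings

print(audit({"openssl-wrapper": "1.0.1", "left-padder": "2.0"}))
```

Real tools also match version ranges and pull advisories from live feeds such as the GitHub Advisory Database, but the core comparison is this simple, and it only works when your dependency manifest is visible.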

          • andrew@lemmy.stuart.fun

            See my comment below for more of my thoughts on why I think heartbleed was an overwhelming success.

            And you help make my point because openssl is a dependency which is easily discovered by software like dependabot and renovate. So when the next heartbleed happens, we can spread the fixes even more quickly.

            • 018118055@sopuli.xyz

              Enterprise software inventory can unfortunately be quite chaotic, and understanding the exposure to this kind of vulnerability can take weeks if not longer.

      • kbotc@lemmy.world

        My very obvious rebuttal: Shellshock was introduced into bash in 1989 and not found until 2014. It was incredibly trivial to exploit and gave attackers arbitrary command execution, which is insane.

        env x='() { :;}; echo vulnerable' bash -c "echo this is a test"

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net

    IDK why, but this had me imagining someone adding malicious code to a project, but then also being highly proactive with commenting his additions for future developers.

    “Here we steal the user’s identity and sell it on the black market for a tidy sum. Using these arguments…”

  • ichbinjasokreativ@lemmy.world

    The point is not that you can audit it yourself, it’s that SOMEBODY can audit it and then tell everybody about it. Only a single person needs to find an exploit and tell the community about it for that exploit to get closed.

    • theangryseal@lemmy.world

      Exactly! I wait on someone who isn’t an idiot like me to say, “ok, so here’s what’s up guys.”

    • butter@midwest.social

      I don’t know how to audit code. But I can generally get through. For example, I use Aegis for 2FA OTP. How do we know it’s secure? Because I can see very clearly that it doesn’t have network access on Android and that it hasn’t tried to get network access.

  • The Snark Urge@lemmy.world

    Ahh, the old motte-and-bailey doctrine.

    FOSS is superior even for an end user like me. It only fails when corporations are allowed to “embrace, extend, and extinguish” it.