Basically what the title says. I know many users here are trying to avoid giving Reddit any traffic, and I saw how grateful everyone was on another post where this was done.

A sticky making this a rule would be amazing!

    • Andojus@lemmy.world (OP) · 1 year ago

      That’s the end goal, but it’s a ways off. In the meantime, a good way to make sure we have lots of content here is scraping Reddit.

      This is also directed at those who are already posting Reddit content here.

      • intensely_human@lemm.ee · 1 year ago

        It’s a good policy to scrape content, at least for now.

        I volunteered for a non-profit that was basically a competitor to GoodGuide.

        What we intended to build was an app where you could scan a barcode and get a score for how well that company aligned with your values.

        Trouble is, our plan for generating those scores was all crowdsourced. People would fill out reviews based on research and provide their conception of how a company should score on environment, corruption, etc.

        It was stupidly naive of us to expect people to work hard to fill our site with content. What we should have done was scrape a bunch of data, come up with scores, and then let that review/research system build on top of it. An MVP required some data, any data, about how well companies adhered to those values.

  • eleitl@lemmy.world · 1 year ago

    I recommend mirroring whole /r/ subs to /c/ communities using automated tools. I don’t need to see Reddit content in /c/reddit – at all.
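
    For anyone curious what automated mirroring could look like in practice, below is a rough Python sketch of a mirror bot, not a definitive implementation. It assumes Reddit’s public JSON listing at /r/<sub>/new.json and Lemmy’s v3 HTTP API (/api/v3/user/login and /api/v3/post); the instance URL, community id, and credentials are placeholders, so check your own instance’s API docs before running anything like this.

    ```python
    # Hypothetical Reddit -> Lemmy mirror sketch. Endpoint paths, field names,
    # and all credentials/ids below are assumptions or placeholders.
    import requests

    REDDIT_SUB = "selfhosted"                  # example subreddit to mirror
    LEMMY_HOST = "https://lemmy.example"       # placeholder instance URL
    LEMMY_COMMUNITY_ID = 123                   # numeric id of the target /c/ community (assumed)
    HEADERS = {"User-Agent": "lemmy-mirror-sketch/0.1"}  # a custom User-Agent helps avoid Reddit throttling generic clients

    def fetch_new_reddit_posts(sub: str, limit: int = 10) -> list[dict]:
        """Pull the newest submissions from a subreddit's public JSON listing."""
        url = f"https://www.reddit.com/r/{sub}/new.json?limit={limit}"
        resp = requests.get(url, headers=HEADERS, timeout=30)
        resp.raise_for_status()
        return [child["data"] for child in resp.json()["data"]["children"]]

    def lemmy_login(username: str, password: str) -> str:
        """Log in and return a JWT (assumes Lemmy's /api/v3/user/login endpoint)."""
        resp = requests.post(
            f"{LEMMY_HOST}/api/v3/user/login",
            json={"username_or_email": username, "password": password},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["jwt"]

    def mirror_post(jwt: str, post: dict) -> None:
        """Cross-post one Reddit submission into the target Lemmy community."""
        payload = {
            "name": post["title"][:200],          # keep titles under Lemmy's length cap
            "community_id": LEMMY_COMMUNITY_ID,
        }
        if post.get("url"):
            payload["url"] = post["url"]          # link posts keep their original link
        if post.get("selftext"):
            payload["body"] = post["selftext"]    # self posts become the post body
        resp = requests.post(
            f"{LEMMY_HOST}/api/v3/post",
            headers={"Authorization": f"Bearer {jwt}"},  # newer Lemmy versions take a bearer token
            json=payload,
            timeout=30,
        )
        resp.raise_for_status()

    if __name__ == "__main__":
        token = lemmy_login("mirror_bot", "change-me")   # placeholder bot account
        for submission in fetch_new_reddit_posts(REDDIT_SUB):
            mirror_post(token, submission)
    ```

    A real mirror would also need to remember which posts it has already copied and respect both sites’ rate limits, otherwise it will spam duplicates.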

  • TwoFace211@lemmy.world · 1 year ago

    Why would anyone want to do this? If you want to read Reddit, just go to Reddit. I’m here because Reddit sucks.

  • Andojus@lemmy.world (OP) · 1 year ago

    I apologize for the duplicate posts; I got an error message and a timeout when I first submitted. Sorry!

    • abbadon420@lemm.ee · 1 year ago

      It seems you’re not the first to have that happen, and you probably won’t be the last 😀

      • Andojus@lemmy.world (OP) · 1 year ago

        Haha, thank you!

        Lesson learned: if it times out while posting, check whether it went through before trying again.

  • spaduf@lemmy.blahaj.zone · 1 year ago

    For every thread saying we should scrape Reddit, there’s another asking how to block the Reddit-scraping bots. I think the reality is that most lurkers want more content and see scraping as an easy way to get it, while most people who actually comment don’t want to touch it with a ten-foot pole. That’s because, unless the bots are fairly selective, the fire hose of scraped content is simply too much for these posts to get any interaction.

    If you really want to do this, you need to be very selective about what you choose to cross-post. The quality of the content has to outweigh the fact that you’re significantly less likely to get a response. Either way, I’m still in the no-bot camp and will actively avoid communities with lots of bot posters.