Sorry, but I can’t think of another word for it right now. This is mostly just venting, but if anyone has a better way to do it, I wouldn’t hate to hear it.

I’m trying to set up a home server for all of our family photos. We’re on our way to de-googling, and part of the impetus for the change is that our Google Drive is almost full. We have a few hundred gigs of photos between us. The problem with trying to download your data from Google is that the only reasonable way it lets you do so is through Google Takeout. First you have to order it. Then you have to wait anywhere from a few hours to a day or two for Google to “prepare” the download. Then you have one week before the takeout “expires.” That’s one week to the minute from the time of the initial request.

I don’t have some kind of fancy California internet, I just have normal home internet, and there is just no way to download a 50-gig (or even 2-gig) file in one go; there are always interruptions that require restarting the download. But if you try to download the files too many times, Google gives you another error and you have to start over and request a new takeout. Google doesn’t let you download the entire archive in one piece either; you have to select each file part individually.

I can’t tell you how many weeks I’ve spent trying to download all of the files before they expire or Google throws another error.

  • redxef@scribe.disroot.org · 5 months ago

    Honestly, I thought you were going to bitch about them separating your metadata from the photos, leaving you to re-merge them with a special tool before they work with any other program.
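
    For anyone in that boat: the sidecar JSON files Takeout produces can be merged back with exiftool. A rough sketch, not verified against a current Takeout dump, assuming the sidecars sit next to each photo as photo.jpg.json (adjust the pattern if yours differ):

    ```
    # Copy the capture date from each Takeout sidecar JSON back into the photo.
    # -d %s tells exiftool to parse the Unix-epoch timestamp stored in the JSON.
    exiftool -r -d %s -tagsfromfile "%d%f.%e.json" \
      "-DateTimeOriginal<PhotoTakenTimeTimestamp" \
      -ext jpg -overwrite_original Takeout/
    ```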

  • BodilessGaze@sh.itjust.works · 5 months ago

    There’s no financial incentive for them to make it easy to leave Google. Takeout only exists to comply with regulations (e.g. the Digital Markets Act), and as usual, they’re doing the bare minimum to not get sued.

    • Avid Amoeba@lemmy.ca · 5 months ago

      Or why is Google Takeout as good as it is? It’s got no business being as useful as it is in a profit-maximizing corpo. 😂 It could be way worse while still being technically compliant. Or expect Takeout to get worse over time as Google looks into under-maximized profit streams.

      • BodilessGaze@sh.itjust.works · 5 months ago

        Probably because the individual engineers working on Takeout care about doing a good job, even though the higher-ups would prefer something half-assed. I work for a major tech company and I’ve been in that same situation before, e.g. when I was working on GDPR compliance. I read the GDPR and tried hard to comply with the spirit of the law, but it was abundantly clear everyone above me hadn’t read it and only cared about doing the bare minimum.

        • Avid Amoeba@lemmy.ca · 5 months ago

          Most likely. Plus, Takeout appeared way before Google was showing any profit-maximization signs, back when it didn’t even hold the monopoly position it holds today.

  • Moonrise2473@feddit.it · 5 months ago

    Doesn’t it have an option to split it?

    When I did my Google Takeout to delete all my pics from Google Photos, there was an option to split it into something like “one zip every 2 GB”.

    • gedaliyah@lemmy.world (OP) · 5 months ago

      The first time, I tried it in the two-gigabyte blocks. The problem with that is that I have to download them one or two at a time, which is not very easy to do over the course of a week on a normal internet connection. Keep in mind, I also have a job.

      I got about 50 out of 60 files before the one week timer reset and I had to start all over.

      • habitualTartare@lemmy.world · 5 months ago

        Apparently you can save it to Google Drive, then install the Google Drive desktop program and make that folder available offline, so it downloads to your computer:

        1. When you set up the Google Takeout export, choose “Save in a Google Drive folder”.

        2. Install the Google Drive PC client (Drive for desktop).

        3. It will create a new drive (e.g. G:) in your file explorer. Right-click on the Takeout folder and select “Make available offline”. All files in that folder will be downloaded by Drive for desktop in the background, and you will be able to copy them to another location, as they will be local files.
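
        If you’d rather script that last copy than rely on the “Make available offline” toggle, rclone can pull the Drive folder directly. A minimal sketch, assuming you’ve already run `rclone config` and created a Drive remote named `gdrive` (the remote name and destination path are placeholders):

        ```
        # Copy the Takeout folder from Google Drive to local storage;
        # rclone retries failed chunks on its own, which helps on flaky links.
        rclone copy gdrive:Takeout /srv/photos/takeout --progress
        ```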

      • wizardbeard@lemmy.dbzer0.com · 5 months ago

        You could look into using a download manager. No reason for you to manually start each download in sequence if there’s a way to get your computer to automatically start the next as soon as one finishes.
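
        For a command-line version of that, aria2 (the engine behind Motrix, recommended below) can work through a list of links and resume partial files. A rough sketch, assuming you’ve pasted the signed Takeout URLs into urls.txt while logged in; if the links want session cookies, you’d add them with --header:

        ```
        # Fetch the queued parts one at a time (-j 1), resuming partial
        # files (-c) and retrying forever with a 30s pause between tries.
        aria2c -c -j 1 --max-tries=0 --retry-wait=30 -i urls.txt
        ```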

          • ubergeek77@lemmy.ubergeek77.chat · 5 months ago

            Definitely recommend Motrix:

            https://motrix.app/

            If the Google download link supports it, it should be fairly resistant to interruptions. If it doesn’t, this might not help much, but you should still use this instead of just a browser.

            I haven’t tried to download a Google takeout, so you might need to get clever with how you add the download link to it.

            If you just can’t get it to work, you can try getting the browser extension to automatically send all downloads to Motrix. There is some setup required, though:

            https://github.com/gautamkrishnar/motrix-webextension

            Good luck!

          • Kraiden@kbin.run · 5 months ago

            I used uGet on Windows, and it was fairly smooth. Not Google, but an equally annoying large download. I believe it’s available on Linux as well.

    • Dave@lemmy.nz · 5 months ago

      From a search, it seems photos are no longer accessible via Google Drive and photos downloaded through the API (such as with Rclone) are not in full resolution and have the EXIF data stripped.

      Google really fuck over anyone using Google Photos as a backup.

      • gedaliyah@lemmy.world (OP) · 5 months ago

        Yeah, with takeout, there are tools that can reconstruct the metadata. I think Google includes some JSONs or something like that. It’s critical to maintain the dates of the photos.

        Also I think if I did that I would need double the storage, right? To sync the drive and to copy the files?

        • Dave@lemmy.nz · 5 months ago

          From what I’ve read, I would not trust any process other than the takeout process. Do the album thing to split it up.

  • butitsnotme@lemmy.world · 5 months ago

    I know it’s not ideal, but if you can afford it, you could rent a VPS from a cloud provider for a week or two, do the Google Takeout download on that, and then use a sync tool (rsync or similar) to copy the files to your own server.
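
    A minimal sketch of that last copy step, assuming the VPS is reachable over SSH and the Takeout zips landed in ~/takeout on it (hostname and paths are placeholders):

    ```
    # Pull the archives down from the VPS, resuming partial transfers (-P)
    # and preserving timestamps (-a) so a dropped connection costs little.
    rsync -avP user@vps.example.com:takeout/ /srv/photos/takeout/
    ```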

    • Flax@feddit.uk · 5 months ago

      Use this. It’s finicky, but it works for me. You have to start the download on one device, then pause it, copy the command to your file server, and run it there. It’s slow and you can only do one at a time, but it’s enough to leave idling.

    • gedaliyah@lemmy.world (OP) · 5 months ago

      I don’t know how to do any of that, but it will help to know it anyway. I’ll look into it. Thanks.

      • Blue_Morpho@lemmy.world · 5 months ago

        Instead of having to set up an operating system with a cloud provider, maybe another cloud backup service would work. Something like Backblaze can receive your Google files, and then you can download from Backblaze at your leisure.

        https://help.goodsync.com/hc/en-us/articles/115003419711-Backblaze-B2

        Or use the date filters to limit the amount of takeout data that’s created, then repeat with different filters for the next chunk.

      • Avid Amoeba@lemmy.ca · 5 months ago

        Be completely dumb about it and install a desktop OS like Ubuntu Desktop. Then remote into it and use the browser just as normal to download the stuff onto it. We’ll help you with moving the data from it to your local server afterwards. Critically, the machine has to have as much storage as needed to hold your entire download.

  • stepan@lemmy.cafe · 5 months ago

    There was an option to split the download into archives of customizable size IIRC

    • gedaliyah@lemmy.world (OP) · 5 months ago

      Yeah, but that introduces the problem of queuing and monitoring dozens of downloads rather than just a few. I had similar results.

      And since my family keeps adding photos over the week, I see no way to verify that previously downloaded parts are identical to the same parts in another takeout, if that makes sense.
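
      One partial safeguard: you can at least check that each finished part is a complete, readable archive before the takeout expires. A small sketch (the filenames are a guess at the usual takeout-*.zip pattern; note this can’t prove two different takeout requests contain identical photos, since Google rebuilds the archives each time):

      ```
      # Test every downloaded part for truncation or corruption.
      for f in takeout-*.zip; do
        unzip -t "$f" > /dev/null && echo "$f OK" || echo "$f CORRUPT"
      done
      ```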

      • Willdrick@lemmy.world · 5 months ago

        You could try a download manager like DownThemAll on Firefox: set up a queue with all the links and a depth of one download at a time.

        DtA was a godsend back when I had shitty ADSL. It splits downloads into multiple parts and manages to survive micro-interruptions in the service.

        • gedaliyah@lemmy.world (OP) · 5 months ago

          I couldn’t get it working, but I didn’t try too hard. I may give it another shot. I’m trying a different approach right now.

        • gedaliyah@lemmy.world (OP) · 5 months ago

          DownThemAll seems to be helping. I’ll update the original post with the details once I have success. In this case, I had to first start each download in the browser, then copy the download link and add it to DtA using that link. Someone smarter than me will be able to explain why the extra step was necessary, or how to avoid it.

  • Dave@lemmy.nz · 5 months ago

    Can you do one album at a time? Select the albums you want to download, then get that file. Then do the next few albums. That way you have direct control over the data you’re getting in each batch, and you’ll have a full week to get each batch instead of having to start again because the whole thing didn’t finish within a week.

    • gedaliyah@lemmy.world (OP) · 5 months ago

      That may be a thought. I could organize the photos first and then do multiple takeouts. Thanks

  • K3CAN@lemmy.radio · 5 months ago

    I do occasional smaller “takeouts” and haven’t had any issues.

    I have an “automatic album” (or whatever they call it) where all the photos of friends and family (even pets) get automatically added to it. Then I can just request a “takeout” for that one album, since those are the photos I actually care about. It’s a much smaller download than the entirety of my Photos account.

  • weker01@sh.itjust.works · 5 months ago

    Google Takeout is the best GDPR-compliant platform of all the big tech giants. Amazon, for example, makes you wait until the very last day they legally can.

    They also do only minimal processing, as with the metadata (as others commented): the export is probably how they store it internally, and that’s all they need to deliver. The simple fact that you can select what you want to request, instead of having to download everything about you, makes it good in my eyes.

    I actually see good-faith compliance with the GDPR in the platform.

  • floofloof@lemmy.ca · 5 months ago

    Google Takeout exists so they are technically compliant with rules that say you must be able to download your personal data, but they make it so inconvenient that in practice it’s almost impossible to use. Google Photos isn’t a backup service so much as a way for Google to hold your photos hostage until you start paying for larger amounts of storage. And by the time you need that storage, downloading via Google Takeout has become impractical.

  • Railcar8095@lemm.ee · 5 months ago

    Not sure if somebody mentioned this already, but you can export to OneDrive. Get a 1 TB account on a free trial, or pay for a single month, and export everything there as plain files, with no large zips. Then download to your computer with the OneDrive app, and cancel the subscription.

    Pretend to be in California or the EU, then request full removal of all your data from both Microsoft and Google.

    • gedaliyah@lemmy.world (OP) · 5 months ago

      This route may be the answer. I haven’t had success so far in setting up a download manager that offers any real improvement over the browser. I wanted to avoid having my photos on two corporate services, but as you say, in theory everything is deletable.

  • Eager Eagle@lemmy.world · 5 months ago

    A 50 GB download takes less than 12 hours on a 10 Mbps connection (50 GB is roughly 400,000 megabits, and 400,000 Mb ÷ 10 Mbps = 40,000 s, about 11 hours). I had a 10 Mbps link 10 years ago in a third-world country, so maybe check your options with your ISP. 50 GB really should not be a problem nowadays.

    • gedaliyah@lemmy.world (OP) · 5 months ago

      It’s not the speed; it’s the interruptions. If I could guarantee an uninterrupted download for 12 hours, then I could do it over the course of 3-4 days. I’m looking into some of the download management tools that people here have suggested.

      • Eager Eagle@lemmy.world · 5 months ago

        That might work. I don’t know if you live in a remote area, but I’d also consider a coffee shop, library, university, or hotel lobby with wifi. You might be able to download it within an hour.

  • Possibly linux@lemmy.zip · 5 months ago

    You need a solid wired connection. Maybe phone a friend for help.

    Alternatively, you could use curl. I think it has a resume option.
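
    It does: curl’s `-C -` resumes from wherever the partial file left off. A rough sketch (the URL is a placeholder; you’d paste the real signed link from your browser, and it may expire or require your session cookies):

    ```
    # Keep retrying until the download completes, resuming each time.
    until curl -L -C - -o takeout-001.zip "https://takeout.google.com/..."; do
      sleep 30   # brief pause before picking up after an interruption
    done
    ```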

  • rambos@lemm.ee · 5 months ago

    I’m surprised that feature exists, tbh. It worked fine for my 20 GB split into 2 GB archives, if I remember correctly.

    • gedaliyah@lemmy.world (OP) · 5 months ago

      I used it for my music collection not that long ago and had no issues. The family’s photo library is an order of magnitude larger, so it’s putting me up against limitations I didn’t run into before.

  • YurkshireLad@lemmy.ca · 5 months ago

    Because Google doesn’t want you to export your photos. They want you to depend on them 100%.