Could it be an MTU issue? Networking can be weird if packets get fragmented unexpectedly, but I mostly see this with IKEv2 and other VPN services. Maybe try lowering the MTU on the WAN side?
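In case you want to verify it before changing anything: on Linux you can probe the path MTU by pinging with the don't-fragment bit set and shrinking the payload until it passes. The address, sizes and interface name below are just placeholders for your setup:

```
# Probe the path MTU: payload + 28 bytes of IP/ICMP headers = usable MTU
ping -c 3 -M do -s 1472 1.1.1.1   # fails if the path MTU is below 1500
ping -c 3 -M do -s 1452 1.1.1.1   # passes if the path MTU is at least 1480

# Then lower the MTU on the WAN interface accordingly (interface name is an example)
sudo ip link set dev eth0 mtu 1480
```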
I have been running Nextcloud for many, many years. For a very long time I hosted it on Hetzner's second-lowest webspace tier. It was not very fast there (you get what you pay for), but fast enough for our needs. Later I moved it to an Azure VM and after that to my home server, where it runs blazingly fast, especially since the last updates they pushed out.
In all that time I never reinstalled; I just upgraded to the newer versions as they came out. The only times I had problems upgrading were on that cheap Hetzner webspace instance, when the upgrade process took longer than the PHP timeout the plan allowed. So it was never the fault of Nextcloud, just that I hosted it on basically the cheapest plan I could find.
We use it for file sharing, calendar + contacts (+ sync with DAVx⁵), Notes and of course Talk. To make full use of Talk for voice + video calls you should have a TURN server (rough example below), but if you do not need that (if you just text), it ran great even on the Hetzner webspace instance.
Our family is very happy that it exists, that it is free, and that it has served us well for so many years.
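Regarding the TURN server for Talk: in case it helps, here is roughly how a dockerized coturn could look. Treat it as a sketch; the realm, secret and name are placeholders you would replace with your own values:

```
# Minimal coturn for Nextcloud Talk (realm and secret are example placeholders)
docker run -d --name coturn --network host coturn/coturn \
  --realm=turn.example.org \
  --use-auth-secret --static-auth-secret=change-me \
  --no-tls --no-dtls

# In the Nextcloud Talk admin settings, add turn.example.org:3478
# as the TURN server with the same shared secret.
```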
You would think so, yes. But to my surprise, my well over 60 containers so far consume less than 7 GB of RAM, according to htop. Also, containers can of course network with each other and share services. For external access, for example, I run only one instance of Traefik, and one coturn for both Nextcloud and Synapse.
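To illustrate the shared-services idea with a minimal made-up example (names, network and domain are placeholders, not my actual setup), one reverse proxy on a common Docker network can front any number of containers:

```
# One shared network that the reverse proxy and all apps join
docker network create proxy

# A single Traefik instance for external access (minimal example, no TLS)
docker run -d --name traefik --network proxy \
  -p 80:80 \
  -v /var/run/docker.sock:/var/run/docker.sock:ro \
  traefik:latest --providers.docker=true --entrypoints.web.address=:80

# Apps attach to the same network and get routed via labels
docker run -d --name whoami --network proxy \
  -l 'traefik.http.routers.whoami.rule=Host(`whoami.example.lan`)' \
  traefik/whoami
```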
I would absolutely look into it. Many years ago when Docker emerged, I did not understand it and called it "hipster shit". But a lot of the people around me who used Docker at that time did not really understand it either. Some lost data, some had services that stopped working, and they had no idea how to fix it.
Years passed and containers stayed, so I started to take a closer look and tried to understand it: what you can do with it and what you can't. As others here said, I also had to learn how to troubleshoot, because stuff now runs inside a container and you don't just copy a new binary or library into it to try to fix something.
Today, my homelab runs 50 containers and I am not looking back. When I rebuilt my homelab this year, I went full Docker. The most important reason for me: every application I run dockerized is predictable and isolated from the others (on the binary side; the network side is another story). When I ran everything directly on the Linux box, I kept hitting problems where, say, one application needs PHP 8.x while another, older one still only runs on PHP 7.x. Or multiple applications depend on the same library, and after updating it one app works while the other doesn't anymore because it would need an update too. Running an apt upgrade was always a very exciting moment… and not in a good way. With Docker I do not have these problems. I can update each container on its own, and if something breaks in one container, it does not affect the others.
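Just to make the isolation point concrete with a made-up example (image tags, names and paths are placeholders), two apps can each run against their own PHP version side by side:

```
# The old app keeps its PHP 7.x runtime...
docker run -d --name legacy-app -v /opt/legacy-app:/var/www/html php:7.4-apache

# ...while the new app gets PHP 8.x, and updating one never touches the other
docker run -d --name modern-app -v /opt/modern-app:/var/www/html php:8.3-apache
```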
Another big plus are the backups. I back up the docker-compose file + data for each container with Kopia. Since barely anything is installed directly in Linux, I can spin up a VM, restore my backups with Kopia, and start all containers again to test my backup strategy. Stuff just works. No fiddling with the Linux system itself, adjusting tons of config files or installing hundreds of packages to get all my services up and running again after a hardware failure.
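For anyone curious, the backup/restore loop can be as simple as this; the paths and service name are just examples, and your Kopia repository setup may differ:

```
# One-time: create (or later connect to) a repository on the backup target
kopia repository create filesystem --path /mnt/backup/kopia-repo

# Snapshot the folder holding a service's docker-compose.yml and its bind-mounted data
kopia snapshot create /opt/docker/nextcloud

# On a fresh machine: connect, restore, then bring the stack back up
kopia repository connect filesystem --path /mnt/backup/kopia-repo
kopia snapshot restore <snapshot-id> /opt/docker/nextcloud
docker compose -f /opt/docker/nextcloud/docker-compose.yml up -d
```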
I have really come to love Docker, especially in my homelab.
Oh, and you would think resource usage is high when everything is containerized? My 50 containers right now consume less than 6 GB of RAM, and that includes stuff like Jellyfin, Pi-hole, Home Assistant, Mosquitto, multiple Kopia instances, multiple Traefik instances with CrowdSec, Logitech Media Server, Tandoor, Zabbix and a lot of other things.
One reason is simply because I can. And because of that, I tend to host the things myself that I can. That puts the cost and maintenance work on my side instead of on others. A few fewer users from our household on a public instance means more room for people who are just not as tech-savvy and have no other choice than to rely on public instances. So it is a mix of respecting other people's time, effort and money, and partly just the nerd in me who wants to find out how it works and how it's done :-)
I was just looking for cheap backup space recently, and Hetzner's Storage Box BX21 is 13 € per month for 5 TB, 20 snapshots and unlimited traffic. I have not compared it with Backblaze yet, though.
Setting up the HMAC key for CouchDB was indeed the step I struggled with too. The first time, I think I either made a mistake or used a broken website to generate the Base64 value. The second time, my mistake was that I put the Base64-encoded value both into jwt.ini AND into docker-compose.yml. But COUCHDB_HMAC_KEY in docker-compose.yml has to be the unencoded value, while hmac:_default in jwt.ini has to be Base64 encoded. Maybe that is what went wrong for you too?
I bet you are close!
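In case it helps, this is roughly how I would generate a matching pair on the command line; whether your setup expects exactly these two forms is something to double-check against the docs:

```
# Generate a random secret; the raw value goes into COUCHDB_HMAC_KEY in docker-compose.yml
SECRET=$(openssl rand -hex 32)
echo "$SECRET"

# The Base64-encoded form of the same secret goes into hmac:_default in jwt.ini
echo -n "$SECRET" | base64
```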
On the other hand, if you are the only person using the shopping list and your current setup gives you what you need, maybe it is not worth it for you. For me it was (and updating it once it runs is super easy, I promise!). The instant sync across all devices is great, and it keeps working when I lose reception in a shop and syncs again instantly once I have internet again. But what really makes Groceries for me are:
Oh, and adding a photo to an item is super useful if you are like me and need very precise instructions on what to get for your partner when you are standing in front of a shelf with 100 different types of cheese that all look exactly the same to you… having a photo is sometimes a lifesaver for me :-)
Hmm, what does docker logs -f <container> tell you? I made myself a compose file and use Traefik. I am not at my PC atm, but when I had problems getting it running, I had made mistakes with the secrets. That should show up in the logs, though.
Maybe Tandoor for recipes and Groceries by David Shay for shopping lists of all kinds. So far the best multi-user shopping list app I have ever had.
I know nothing about PlexAmp, but could FinAmp be what you are looking for? It does music only and lets you grab your songs for offline usage.
We follow the principle of doing one thing well instead of all things mediocre, so we use two solutions for what you asked. Like others in the thread, we use Tandoor, but only for recipes and meal planning. It does this exceptionally well, but the shopping list part does not fit our style of shopping.
As a shopping list, we use David Shay's Groceries / Specifically Clementines. Why?
There is more, but this post got too long already. It also has user management, permissions and live sync. Yes, my partner can see live when I tick off items on the list and can add stuff to the list while I am shopping :-)
Everything in that software feels like it was created by a person who actually goes shopping.
It has a very good web interface (which also has an offline mode, AFAIK) and a very good Android app.
Does it look fancy? No. Does it have everything we ever wanted in a shopping list app? Absolutely!