I'm going to be moving soon, and that means taking the homelab offline for some period of time. Here's part 1 of how I'm preparing for it, by shrinking my Nextcloud storage footprint.
Like many Nextcloud users, I have a Google (Photos) Takeout and a Facebook backup dump stored at home on my Nextcloud instance, and I use the Nextcloud app on my phone to auto-backup new photos. I don't actually go through those old Google and Facebook photos, but it's much better to have them in what you might call my "ownCloud" than to have to trust Google and Meta to keep that data.
With that said, my Nextcloud instance is only as available as the server it's hosted on - which, as I've written about before, is an Intel NUC in my home. I'm planning a move soon, though, and Nextcloud hosts services that I don't want to do without during the logistics phase between this stable internet connection and the next one. So I have to temporarily move Nextcloud - among other services, which I may write about if anything interesting happens to them - to a rented server. I don't, however, need to pay for tens of gigabytes of expensive storage for data I'm not accessing, so for the first time I set up some External Storage to move these lesser-used files to.
But Nextcloud appears to be much better designed for the use-case where external storage is set up before you add any data - simply setting it up and moving the Google folder to the external storage volume in the web UI kicked off an incredibly slow process that I couldn't cancel (until I restarted the containers), which was quite frustrating.
So I started over, and here's what I set up. I'm integrating Backblaze B2 - a very cheap, S3-compatible object storage service I use for a number of bulk storage use-cases - as the External Storage in Nextcloud. I mostly followed Backblaze's guide for configuring a B2 bucket as a Nextcloud external storage, with the exceptions that I:
1) Set up buckets on a per-user basis, because I don't need to share these personal backup files across the instance, and
2) Unchecked the "enable previews" option in the three-dot menu, to ensure I don't generate a bunch of billable API calls from otherwise-unrelated Nextcloud storage utilization (see the sketch just after this list)
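I did all of this in the web UI per Backblaze's guide, but my understanding is the same mount could be set up with occ (run from a shell inside the app container, as described further down). The mount name, bucket, endpoint, keys, and username below are placeholders, and the mount ID (1 here) is printed when the mount is created:

$ ./occ files_external:create "B2 Archive" amazons3 amazons3::accesskey \
    -c bucket=my-archive-bucket -c hostname=s3.us-west-002.backblazeb2.com \
    -c use_ssl=true -c key=KEY_ID -c secret=APPLICATION_KEY
$ # scope the mount to a single user rather than the whole instance
$ ./occ files_external:applicable --add-user user1 1
$ # the occ equivalent of unchecking "enable previews"
$ ./occ files_external:option 1 previews false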
So now that I've got an empty B2 bucket available as External Storage, now what? Well, first I need to manually sync the data from the appropriate Nextcloud storage folders to the bucket. For this, I used a modified version of this sync script, which copies a directory into a B2 bucket. Once I confirmed that the files were correctly uploaded and visible in Nextcloud, I had to remove the duplicates from local storage. This is where Nextcloud - especially in a Docker Compose environment - doesn't seem well-designed for data migration use-cases.
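I won't reproduce the script here, but the approach boils down to the b2 CLI's sync command - this is my paraphrase rather than the script itself, the path, bucket name, and key variables are placeholders, and newer versions of the CLI spell the first command as "b2 account authorize":

$ b2 authorize-account "$B2_KEY_ID" "$B2_APP_KEY"
$ b2 sync /path/to/nextcloud/data/user1/files/Google_Takeout "b2://my-archive-bucket/Google_Takeout"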
What you have to do is manually delete the appropriate directories - which I did as a superuser from the host OS, as I'm bind-mounting the volumes out to the host, but the permissions are set to a specific UID/GID inside the container - and then shell into the container itself to trigger a rescan.
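The deletion itself is just a recursive remove from the host - the path here is illustrative, since your bind mount location and folder names will differ:

$ # run on the host; these files belong to UID/GID 33 (www-data) inside the container
$ sudo rm -rf /path/to/nextcloud/data/user1/files/Google_Takeout

Then get a shell inside the app container as that same user: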
$ docker exec -u 33 -it nextcloud-app-1 /bin/bash
Your container name may vary, but it's unlikely your UID will. This drops you into /var/www/html, which should contain the occ binary, and you can trigger a manual rescan of the directories with:
$ ./occ files:scan --path="user1/files/some_folder"
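If you'd rather rescan everything under a single user at once (assuming a username of user1 here), that's just:

$ ./occ files:scan user1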
I just went ahead and re-scanned my entire user directory, to make sure everything was set up properly, and this took about 3 minutes for my ~54GB of data. I didn't see any better way to move between "internal" and "external" storage in Nextcloud without these manual steps, but if I missed something, let me know!