
How to backup object storage for NextCloud

I'm experimenting with running Nextcloud (AIO) on a VPS with a B2 bucket as the primary storage. I want to compare its performance with running it on my home server (especially when I'm remote) and get an idea of the kinds of costs I'd rack up doing it.

As part of the setup I have configured the built-in Borg backup, but it has this caveat:

Be aware that this solution does not back up files and folders that are mounted into Nextcloud using the external storage app - but you can add further Docker volumes and host paths that you want to back up after the initial backup is done.

The primary storage is external, but I'm not using the "external storage" app. So, I have two questions.

  1. Does it back up object storage if it's the primary storage (my gut says no)?
  2. If not, what's a good way to back up the B2 bucket?

I've done some research on this topic and I'm mostly coming up empty. I would normally use restic, but restic doesn't work in that direction (B2 -> local backup).

It looks like rclone can be used to mount a B2 bucket. One idea I had was to mount it read-only and let AIO/borg back up that path along with the container backups.
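Roughly what I have in mind (remote name and paths are placeholders; I haven't verified this works with AIO's backup job):

```shell
# Hypothetical sketch: mount the bucket read-only so the backup job can
# crawl it without any risk of writing back to B2. The remote name
# "b2remote" and the mountpoint are placeholders, not a tested config.
mountpoint="$HOME/mnt/nextcloud-b2"   # a real setup would likely use /mnt
mkdir -p "$mountpoint"

if command -v rclone >/dev/null 2>&1; then
  rclone mount b2remote:mybucket "$mountpoint" \
    --read-only \
    --allow-other \
    --daemon
fi
```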

Has anyone done this before? Any thoughts?

6 comments
  • Be aware your backup is useless if you don't back up Nextcloud's database when using a bucket as primary storage ☝🏻

    I use Nextcloud with local storage, and I use rclone to back up to an S3 bucket (MinIO).

    You can use rclone to back up your bucket directly. There is no need for restic. It's as simple as rsync.

    An example would be:

    rclone sync b2:mybucket otherprovider:otherbucket

    You can use local storage too:

    rclone sync b2:mybucket /my/path/here

    Rclone can mount and back up almost everything. It's a Swiss Army knife and I love it.

    Performance is really good with Nextcloud if configured well. Look at this thread.

    • Be aware your backup is useless if you don't back up Nextcloud's database when using a bucket as primary storage ☝🏻

      Understood. My hope was to mount the bucket locally (read-only) and have it backed up with the container backups using the built-in Borg backup option.

      rclone sync b2:mybucket otherprovider:otherbucket

      I'd prefer to have proper incremental backups, not just a warm copy of the data.
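      For what it's worth, rclone's --backup-dir flag might get me partway there: it moves changed and deleted files into a dated archive directory instead of silently overwriting them. A rough sketch, untested in my setup (remote name and paths are placeholders):

      ```shell
      # Hypothetical sketch: keep a current mirror of the bucket plus dated
      # archives of anything that changed. "b2remote" and all paths are
      # placeholders.
      stamp="$(date +%Y-%m-%d)"
      dest="$HOME/backups/nextcloud/current"
      archive="$HOME/backups/nextcloud/archive/$stamp"
      mkdir -p "$dest"

      if command -v rclone >/dev/null 2>&1; then
        rclone sync b2remote:mybucket "$dest" --backup-dir "$archive"
      fi
      ```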

      Rclone can mount and back up almost everything. It's a Swiss Army knife and I love it.

      It seems to be very capable, but I couldn't make it work for my purposes. I fought with rclone/AIO for a few hours yesterday trying to get it going.

      I was able to mount the B2 bucket to a local path quite easily, using the --allow-other option to make it available to the whole system. Everything was accessible via the CLI, but the Nextcloud AIO admin refused to let me add that path to the backup job, and I couldn't find any logs indicating why. If I could get this working, I think it would be ideal, as the backups would be consistent.

      I also tried a couple of the serve options. The nfs option would launch, but mounts would fail with protocol errors. I couldn't get the docker plugin to sync up properly with Docker. I haven't tried the restic serve option yet. I can provide logs if requested.

      Thanks for the help.

      • Did you try to mount your bucket on your host system via rclone?

        rclone mount b2:bucket /path/on/host --daemon --vfs-cache-mode full

        I would mount it on the host system and add an additional volume in your docker-compose.yml:

        volumes:
          - /path/on/host:/myvolume

        You could give this a try, if you want to use it in your container 🤔
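        And if the container only needs to read that data for backups, the bind mount could also be made read-only; a sketch, with the service name being a guess:

        ```yaml
        # Sketch only: service name is an assumption, adjust to your compose file.
        services:
          nextcloud-aio:
            volumes:
              - /path/on/host:/myvolume:ro   # :ro prevents writes through the mount
        ```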
