DigitalOcean and Nextcloud

I've been listening to a few Linux podcasts lately and heard some rave reviews of Nextcloud. Dropbox recently introduced new charges for its services, and this prompted me to look at my options: continue using Dropbox and start paying for advanced features (more storage, more devices, etc.), or do something different. Nextcloud presents an alternative with many of the key features and benefits of Dropbox and similar services, with the added benefit that I own the service myself. That benefit is also a tradeoff, as the question becomes how much of a management burden it would be to run a Nextcloud instance myself, and which method I should go with. Use of Google Drive and other similar services also adds some privacy concerns.

As the title of the post indicates, I ultimately settled on DigitalOcean for hosting a Nextcloud instance. I did look at AWS Lightsail, which could provide a similar service at a similar monthly charge. Right now, with the smallest Nextcloud instance and 25 GB of SSD storage, I'm looking at $5/month. I'm also paying an additional $5/month for a 250 GB Spaces S3 bucket attached as external storage, which I use primarily as a backup location for my data. I did have an AWS t3 instance I was booting up on demand for server-type uses, but I'll probably go ahead and get rid of it in favor of the droplet running Nextcloud plus an OpenVPN or WireGuard service.

Droplet and Spaces setup

Sizing my droplet instance was relatively easy. I picked the basic 1 GB RAM, 25 GB SSD droplet that runs $5/month in DO's nyc3 region. I picked up a Spaces instance in the same region, which gives me 250 GB for an additional $5/month. I elected to go with the latest 19.04 ubuntu-server release image available from DigitalOcean. I did consider uploading my own image but decided to go with simplicity. The droplet provisioned quickly, and I launched my Ansible post-provisioning setup. This went as expected, and I had a ready and waiting system.

I then configured my Spaces instance and access keys so I could use them with Nextcloud as well.
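On the command-line side, s3cmd just needs to be pointed at the Spaces endpoint rather than AWS. A minimal sketch of the relevant `~/.s3cfg` settings for the nyc3 region (the keys are placeholders, and the rest of the file keeps s3cmd's defaults):

```
# ~/.s3cfg -- s3cmd configuration for DigitalOcean Spaces (nyc3 region)
access_key = YOUR_SPACES_ACCESS_KEY
secret_key = YOUR_SPACES_SECRET_KEY
host_base = nyc3.digitaloceanspaces.com
host_bucket = %(bucket)s.nyc3.digitaloceanspaces.com
use_https = True
```

A quick `s3cmd ls s3://mybucketname/` is enough to confirm the keys and endpoint are working before wiring anything else up.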

Install Nextcloud

DigitalOcean tutorials are excellent, and I followed this one to deploy Nextcloud as a snap. I haven't done too much with snaps in the past, but I like the concept and the isolation the snap installation provides, including all dependencies and tooling. This would have some implications for management that I'd have to work through.

Despite being written for an earlier version of Ubuntu, the tutorial worked perfectly, and installation was quick and painless. I did some experimentation with making Spaces the primary storage location but had issues. It turns out that DO was having some partial outages on Spaces in nyc3 the day I deployed, but perhaps that was for the best. I decided 25 GB of SSD storage would be better for Nextcloud, and I could always upgrade later. Spaces I would use for additional storage and backup capacity.

After setting up Nextcloud, I added TOTP multi-factor authentication to my instance within the console. I also configured my Spaces instance as a secondary storage location. Both of these tasks are well documented within the Nextcloud console itself and went off without a hitch. I then set up clients on my phone and laptop and started moving data over. Again, this couldn't have been smoother.
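The same external storage mount can also be created from the command line with the snap's occ wrapper. A hedged sketch — the mount name, bucket, and credentials here are illustrative placeholders, not my exact configuration:

```
# enable the External storage app, then mount the Spaces bucket via the S3 backend
sudo nextcloud.occ app:enable files_external
sudo nextcloud.occ files_external:create /spaces amazons3 amazons3::accesskey \
    -c bucket=mybucketname -c hostname=nyc3.digitaloceanspaces.com \
    -c port=443 -c use_ssl=true -c key=ACCESS_KEY -c secret=SECRET_KEY
sudo nextcloud.occ files_external:list
```

The web UI accomplishes the same thing; the occ route is handy if you ever want to script the setup.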

Backups

Nextcloud 16 is just around the corner, and in any case, while I am flinging data across multiple systems onto Nextcloud, I still want a reliable backup method. With a little searching I came across this post, which has a nice little script to back up the MySQL database and file contents to S3 or Backblaze B2. The challenge is that the commands it uses, such as occ and mysqldump, assume a standard install rather than a snap install of Nextcloud.

The script runs a Nextcloud command to put the product into maintenance mode, dumps the SQL database, and then copies it and the Nextcloud data repository to the backup target. It finishes by taking Nextcloud out of maintenance mode. I noticed in the initial installation tutorial that the snap has its own prefixed command names, such as sudo nextcloud.occ. A simple which nextcloud.occ revealed that the path for these is /snap/bin, and within that path there's a handy README file:

This directory presents installed snap packages.

It has the following structure:

/snap/bin                   - Symlinks to snap applications.
/snap/<snapname>/<revision> - Mountpoint for snap content.
/snap/<snapname>/current    - Symlink to current revision, if enabled.

DISK SPACE USAGE

The disk space consumed by the content under this directory is
minimal as the real snap content never leaves the .snap file.
Snaps are *mounted* rather than unpacked.

For further details please visit
https://forum.snapcraft.io/t/the-snap-directory/2817
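That `<revision>` layout means snapd retains old revisions alongside the current one, which gives you an escape hatch after an upgrade. A quick sketch (the revision numbers snapd reports will vary):

```
# show installed and retained revisions of the nextcloud snap
snap list --all nextcloud
# roll back to the previously installed revision if an upgrade goes badly
sudo snap revert nextcloud
```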

Note that bit about <snapname> and different revisions: snap allows for multiple versions. Thus I have a few options when Nextcloud 16 releases: I can provision a new droplet and deploy anew, or I can refresh the snap in place, with the old revision retained alongside. But for now I do need a backup and recovery method. Looking at /snap/bin, I quickly located the necessary commands to manage Nextcloud as well as its included MySQL instance. I then ran through the commands in the script using their nextcloud.-prefixed variants and was able to manually execute a backup. I decided to slightly modify the original script to allow timestamped copies of the MySQL database dump and ended up with the following:

  #!/bin/sh

  # Nextcloud ubuntu snap backup to S3 destination
  # Scott Harney <scotth@scottharney.com>
  # inspired by autoize https://autoize.com/Nextcloud-backup-to-s3-or-b2/
  # This script creates an incremental backup of your Nextcloud instance to Spaces S3.
  # Spaces S3 is a highly redundant object storage service with versioning and lifecycle management features.
  # Requirements:
  # - snap installation of Nextcloud
  # - s3cmd from python pip or package

  # Name of S3 bucket
  s3_bucket='mybucketname'
  s3_config='/home/sharney/.s3cfg'

  # path to Nextcloud installation
  data_dir='/var/snap/nextcloud/common/nextcloud/data'

  # %d (zero-padded day) avoids the embedded space that %e would produce
  # for single-digit days, which would end up in the dump filename
  today=$(date +'%a-%b-%d-%H:%M:%S-%Z-%Y')

  # Check if running as root
  if [ "$(id -u)" != "0" ]; then
      echo "This script must be run as root" 1>&2
      exit 1
  fi

  echo "Started\n$today"

  # Put Nextcloud into maintenance mode.
  # This ensures consistency between the database and data directory.
  /snap/bin/nextcloud.occ maintenance:mode --on

  # Dump database and backup to S3
  /snap/bin/nextcloud.mysqldump --single-transaction > "/tmp/nextcloud-$today.sql"
  /usr/bin/s3cmd -c "$s3_config" put "/tmp/nextcloud-$today.sql" "s3://$s3_bucket/NextCloudDB/"
  /usr/bin/rm "/tmp/nextcloud-$today.sql"

  # Sync data to S3 in place, then disable maintenance mode
  # Nextcloud will be unavailable during the sync. This will take a while if you added much data since your last backup.
  /usr/bin/s3cmd -c "$s3_config" sync --recursive --preserve --exclude '*/cache/*' "$data_dir" "s3://$s3_bucket/"

  /snap/bin/nextcloud.occ maintenance:mode --off

  echo 'Finished'
  date +'%a-%b-%d-%H:%M:%S-%Z-%Y'

A script run looks like this with a little snip for brevity:

root@celebratedsummer:~# ./bin/backup_Nextcloud.sh
Started
Tue-May-28-14:49:33-UTC-2019
Maintenance mode enabled
upload: '/tmp/nextcloud-Tue-May-28-14:49:33-UTC-2019.sql' -> 's3://mybucketname/NextCloudDB/nextcloud-Tue-May-28-14:49:33-UTC-2019.sql'  [1 of 1]
 327617 of 327617   100% in    0s     4.24 MB/s  done
upload: '/var/snap/nextcloud/common/nextcloud/data/appdata_oc3bfhyw1p23/css/core/6e81-be2e-ie.css' -> 's3://mybucketname/data/appdata_oc3bfhyw1p23/css/core/6e81-be2e-ie.css'  [1 of 35]
 532 of 532   100% in    0s    27.72 kB/s  done
upload: '/var/snap/nextcloud/common/nextcloud/data/appdata_oc3bfhyw1p23/css/core/6e81-be2e-ie.css.deps' -> 's3://mybucketname/data/appdata_oc3bfhyw1p23/css/core/6e81-be2e-ie.css.deps'  [2 of 35]
 213 of 213   100% in    0s    13.23 kB/s  done
upload: '/var/snap/nextcloud/common/nextcloud/data/appdata_oc3bfhyw1p23/css/core/6e81-be2e-ie.css.gzip' -> 's3://mybucketname/data/appdata_oc3bfhyw1p23/css/core/6e81-be2e-ie.css.gzip'  [3 of 35]
 303 of 303   100% in    0s    14.16 kB/s  done
Done. Uploaded 24180718 bytes in 5.8 seconds, 3.99 MB/s.
Maintenance mode disabled
Finished
Tue-May-28-14:49:46-UTC-2019

That can go in cron, and I'm done. I can browse the S3 space and find my versioned copies of files as well. Recovery of the database and/or file storage can be easily accomplished using the nextcloud.occ and nextcloud.mysqlclient commands along with s3cmd.
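A nightly run is plenty for my purposes. A crontab entry along these lines does the job (the 3 a.m. schedule and log path are just my choices; the script path matches where I ran it above):

```
# /etc/cron.d/nextcloud-backup -- nightly Nextcloud backup at 03:00 as root
0 3 * * * root /root/bin/backup_Nextcloud.sh >> /var/log/nextcloud-backup.log 2>&1
```

Redirecting stdout and stderr to a log file makes it easy to spot a failed run, since the script's Started/Finished timestamps bracket each backup.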

Wrap-Up

Like most people today, I use several different cloud-based services. I long ago gave up management of my own email server in favor of Gmail. That said, it's periodically worth looking at what's available and the tradeoffs between cost, management burden, and privacy. Cloud file storage and sharing services like Dropbox and Google Drive are fantastically useful, but this is some of your most sensitive data. For $10/month and a rather light management burden, I can get many of the same features and benefits that I would get paying a similar price to a third-party service. I also like supporting businesses like DigitalOcean that have a community focus and provide an alternative to the "big boys". That seems like a worthwhile tradeoff to me.

My next move will be to further lower my visibility to third-party tracking with a combination of the Pi-hole caching DNS blocker and a WireGuard VPN.
