Hosting your own file-sharing service like Dropbox or OneDrive is not only possible, it’s actually not too hard. One of the most widely used products for doing just this is an app called Nextcloud. In short, Nextcloud is an open source web app written in PHP that acts as a front end and sync server for a file repository. It supports multiple users, is highly extensible, and is downright awesome in what you get out of the box. With its sync clients, it can also sync files from your phones and laptops.
For a small installation (say, fewer than 20 users) that doesn’t need to be highly available, a single virtual machine is probably more than enough to handle the load.
While it’s possible to install Nextcloud on an Azure VM and just use a VHD to host the files, I wanted to explore using Azure Blob Storage as a backend for Nextcloud instead. The justification was cost: on a per-GB basis, blob storage is cheaper than a VHD, and with blob storage you only pay for the storage you actually use rather than for the full capacity of a VHD even if half the disk sits empty. Other advantages can be rationalized too, like geo-redundant storage and easy access to the files even if Nextcloud or the VM blows up. It seemed to make a lot of sense.
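The pay-for-what-you-use argument is easy to sketch with some back-of-the-envelope arithmetic. The rates below are hypothetical placeholders, not current Azure pricing; the point is the shape of the comparison, not the numbers.

```python
# Rough cost comparison (hypothetical example rates, not real Azure pricing):
# a managed disk bills for provisioned capacity, blob storage for actual usage.
disk_size_gb = 1024          # provisioned VHD capacity
used_gb = 400                # what you actually store
disk_rate = 0.05             # $/GB-month for the disk tier (assumed)
blob_rate = 0.02             # $/GB-month for blob storage (assumed)

disk_cost = disk_size_gb * disk_rate   # pay for the whole disk
blob_cost = used_gb * blob_rate        # pay only for what's used

print(f"disk: ${disk_cost:.2f}/mo, blob: ${blob_cost:.2f}/mo")
```

At those assumed rates the half-empty disk costs several times what the blob container does, which is the whole appeal.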
Nextcloud appears to have an in-progress plug-in for Azure Blob Storage, but there wasn’t anything in the way of documentation on how to use it. However, there is a little-known project from the Azure Storage team called Blobfuse. Blobfuse lets a user mount a Blob Storage container as a folder in a Linux filesystem. Behind the scenes, it uses the libfuse library, which lets users create filesystem mount points in Linux user mode. This creates challenges, but also opens up a lot of possibilities for tools like Blobfuse. After some fiddling, a recipe for creating an instance of Nextcloud that uses Azure Blob Storage emerged.
SSH into your Ubuntu box, get root access (sudo -i), and run the following recipe.
Several packages need to be installed first, some from Ubuntu’s default repositories and some from Microsoft.
wget https://packages.microsoft.com/config/ubuntu/18.04/packages-microsoft-prod.deb
dpkg -i packages-microsoft-prod.deb
apt update && apt upgrade
apt install nginx mariadb-server mariadb-client php7.2 php7.2-fpm php7.2-mysql php-common php7.2-cli php7.2-common php7.2-json php7.2-opcache php7.2-readline php7.2-mbstring php7.2-xml php7.2-gd php7.2-curl php7.2-zip unzip php-imagick php7.2-bz2 php7.2-intl blobfuse
MariaDB is a drop-in replacement for MySQL that is commonly used with applications that need a MySQL database to run, including Nextcloud.
mysql_secure_installation
mysql
Inside the MySQL prompt, create the database and a dedicated user. Replace your-password with a password for your database instance. Remember the password because you will need it later to configure Nextcloud.
create database nextcloud;
create user nextclouduser@localhost identified by 'your-password';
grant all privileges on nextcloud.* to nextclouduser@localhost identified by 'your-password';
flush privileges;
exit;
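If you need a strong value for your-password, one quick way to generate one (an illustrative helper, not part of the recipe itself) is Python’s secrets module:

```python
import secrets

# Generate a random, URL-safe password suitable for the nextclouduser account.
password = secrets.token_urlsafe(24)  # 24 random bytes -> 32 characters
print(password)
```

Paste the generated value into the create user and grant statements above in place of 'your-password'.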
NGINX is the web server responsible for serving the content for the Nextcloud installation.
rm /etc/nginx/sites-enabled/*
nano /etc/nginx/conf.d/nextcloud.conf
Paste in the configuration below, change the server_name value from example.com to the name of your intended host, and then press Ctrl + O to save the configuration.
server {
listen 80;
server_name example.com;
# Add headers to serve security related headers
add_header X-XSS-Protection "1; mode=block";
add_header X-Robots-Tag none;
add_header X-Download-Options noopen;
add_header X-Permitted-Cross-Domain-Policies none;
#This header is already set in PHP, so it is commented out here.
#add_header X-Frame-Options "SAMEORIGIN";
# Path to the root of your installation
root /usr/share/nginx/nextcloud/;
location = /robots.txt {
allow all;
log_not_found off;
access_log off;
}
# The following 2 rules are only needed for the user_webfinger app.
# Uncomment them if you're planning to use this app.
#rewrite ^/.well-known/host-meta /public.php?service=host-meta last;
#rewrite ^/.well-known/host-meta.json /public.php?service=host-meta-json last;
location = /.well-known/carddav {
return 301 $scheme://$host/remote.php/dav;
}
location = /.well-known/caldav {
return 301 $scheme://$host/remote.php/dav;
}
location ~ /.well-known/acme-challenge {
allow all;
}
# set max upload size
client_max_body_size 512M;
fastcgi_buffers 64 4K;
# Disable gzip to avoid the removal of the ETag header
gzip off;
# Uncomment if your server is built with the ngx_pagespeed module
# This module is currently not supported.
#pagespeed off;
error_page 403 /core/templates/403.php;
error_page 404 /core/templates/404.php;
location / {
rewrite ^ /index.php$uri;
}
location ~ ^/(?:build|tests|config|lib|3rdparty|templates|data)/ {
deny all;
}
location ~ ^/(?:\.|autotest|occ|issue|indie|db_|console) {
deny all;
}
location ~ ^/(?:index|remote|public|cron|core/ajax/update|status|ocs/v[12]|updater/.+|ocs-provider/.+|core/templates/40[34])\.php(?:$|/) {
include fastcgi_params;
fastcgi_split_path_info ^(.+\.php)(/.*)$;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
fastcgi_param PATH_INFO $fastcgi_path_info;
#Avoid sending the security headers twice
fastcgi_param modHeadersAvailable true;
fastcgi_param front_controller_active true;
fastcgi_pass unix:/run/php/php7.2-fpm.sock;
fastcgi_intercept_errors on;
fastcgi_request_buffering off;
}
location ~ ^/(?:updater|ocs-provider)(?:$|/) {
try_files $uri/ =404;
index index.php;
}
# Adding the cache control header for js and css files
# Make sure it is BELOW the PHP block
location ~* \.(?:css|js)$ {
try_files $uri /index.php$uri$is_args$args;
add_header Cache-Control "public, max-age=7200";
# Add headers to serve security related headers (It is intended to
# have those duplicated to the ones above)
add_header X-Content-Type-Options nosniff;
add_header X-XSS-Protection "1; mode=block";
add_header X-Robots-Tag none;
add_header X-Download-Options noopen;
add_header X-Permitted-Cross-Domain-Policies none;
# Optional: Don't log access to assets
access_log off;
}
location ~* \.(?:svg|gif|png|html|ttf|woff|ico|jpg|jpeg)$ {
try_files $uri /index.php$uri$is_args$args;
# Optional: Don't log access to other assets
access_log off;
}
}
systemctl reload nginx
With all the supporting components in place, install Nextcloud itself.
wget https://download.nextcloud.com/server/releases/nextcloud-13.0.2.zip
unzip nextcloud-13.0.2.zip -d /usr/share/nginx/
Give ownership of the extracted files to the www-data user.
chown www-data:www-data /usr/share/nginx/nextcloud/ -R
Lastly, configure and start Blobfuse.
First, clear out the contents of the data folder. These two commands remove the default files along with any hidden files.
rm /usr/share/nginx/nextcloud/data/*
rm -rf /usr/share/nginx/nextcloud/data/.*
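As an aside, rm -rf dir/.* can be error-prone because the .* glob may match . and .. on some shells. A find-based alternative clears a directory in one pass; the demo below runs against a throwaway temp directory rather than the real data folder.

```shell
# Demonstration: clear a directory, hidden files included, with find.
demo=$(mktemp -d)
touch "$demo/default.txt" "$demo/.hidden"
# -mindepth 1 keeps the directory itself; -delete removes everything inside it.
find "$demo" -mindepth 1 -delete
ls -A "$demo"   # prints nothing: the directory is empty
```

The same pattern would apply to /usr/share/nginx/nextcloud/data if you prefer it over the two rm commands above.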
Next, create a folder for the Blobfuse configuration and open a new file called blob.cfg.
mkdir /blobfuse
nano /blobfuse/blob.cfg
In the file, replace yourstorageaccount with the name of your storage account, the accountKey value with either the primary or secondary key for your storage account, and finally the containerName value with the name of the container you want to use in your storage account. Since this file holds your account key, it’s worth restricting its permissions (chmod 600 /blobfuse/blob.cfg).
accountName yourstorageaccount
accountKey youraccountkey
containerName thenameofthecontainer
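The file format is simply one space-separated key/value pair per line. A tiny illustrative parser (not part of Blobfuse itself) makes the expected structure explicit and could be used to sanity-check a blob.cfg before mounting:

```python
# Minimal parser for Blobfuse's "key value" config format (illustrative only).
cfg_text = """accountName yourstorageaccount
accountKey youraccountkey
containerName thenameofthecontainer
"""

def parse_blobfuse_cfg(text):
    cfg = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition(" ")
        cfg[key] = value.strip()
    return cfg

cfg = parse_blobfuse_cfg(cfg_text)
print(cfg["containerName"])  # thenameofthecontainer
```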
The last step is to configure Blobfuse to run at startup via the rc.local file. rc.local is a script file that executes when the system starts. Because Blobfuse needs to run as the user that will be using the mount (in this case, www-data), using fstab or other mounting conventions isn’t possible.
nano /etc/rc.local
#!/bin/sh -e
# Mount the container as www-data (uid/gid 33 on Ubuntu) so NGINX and PHP can use it.
sudo -u www-data blobfuse /usr/share/nginx/nextcloud/data --tmp-path=/tmp -o uid=33 -o gid=33 -o attr_timeout=240 -o entry_timeout=240 -o negative_timeout=120 --config-file=/blobfuse/blob.cfg --log-level=LOG_DEBUG --file-cache-timeout-in-seconds=120
Make rc.local executable, then run it to mount the container.
chmod +x /etc/rc.local
/etc/rc.local
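On systemd-based Ubuntu releases, rc.local is only honored through a compatibility service, so a dedicated unit file is another option. The sketch below reuses the paths and the www-data user from above; the unit name and exact option set are assumptions, not something Blobfuse ships, so adjust before relying on it.

```ini
# /etc/systemd/system/blobfuse-nextcloud.service (hypothetical unit name)
[Unit]
Description=Mount Azure Blob Storage for Nextcloud data via Blobfuse
After=network-online.target

[Service]
Type=forking
User=www-data
ExecStart=/usr/bin/blobfuse /usr/share/nginx/nextcloud/data --tmp-path=/tmp --config-file=/blobfuse/blob.cfg -o attr_timeout=240 -o entry_timeout=240 -o negative_timeout=120 --file-cache-timeout-in-seconds=120
ExecStop=/bin/fusermount -u /usr/share/nginx/nextcloud/data
RemainAfterExit=yes

[Install]
WantedBy=multi-user.target
```

With something like this in place, systemctl daemon-reload followed by systemctl enable --now blobfuse-nextcloud.service would replace the rc.local approach.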
Well, that’s it! Happy Nextclouding!