I’ve been running Nextcloud as my personal file server in ‘production’ for six months now and am more and more in love with it. I tried various installs over the last year – on a bare metal Ubuntu host, on a VMware virtual machine – but my current setup has Nextcloud running in a jail on my FreeNAS server. It seems very happy there, is easy to access from anywhere and seamlessly keeps my two work Macs, my two Linux laptops and my Windows laptop synced. I’m on the verge of deleting my files from Google Drive, which will complete one big thread of removing Google from my life.
Samuel Dowling’s super clear and comprehensive walkthrough was my key resource with this build. Thanks mate, as us Aussies say.
I’ve been uneasy about Google holding so many of my files, but the service Google provides in keeping things in sync between my several computers has been just too convenient. The Nextcloud project has been on my horizon for some time because it offers things I care about: self-sufficiency, privacy and security. It’s an amazing project, and my goals for using it are right at the simplest end of the spectrum of users. Surprisingly, I found it a little hard to get super basic advice – the admin guide is hundreds of pages long. I knew it couldn’t be hard, but I struggled to find an ‘Idiot’s guide’… so here it is.
Step 1: Build a host
My intent was to host my own files locally, in my home. A cloud in my house. I maintain a couple of servers and use virtual machines under VMware, but I could just as easily have used any old PC or laptop with a bit of storage space. I installed Ubuntu Server – I still use the 18.04 LTS release because of its generous maintenance period. It’s here.
Update and patch it:
sudo apt-get update
sudo apt-get upgrade
Step 2: Install
The next steps were too easy for anyone to have documented them – except u/Lognei on reddit, who put together a post on basic installations. Thanks! We will use the snap tool to install in one command. Lovely.
sudo apt-get install snapd
sudo snap install nextcloud
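As a quick sanity check before moving on, the standard snap commands will confirm the package installed and show the services it started:

```shell
# confirm the nextcloud snap is installed
snap list nextcloud
# list the services the snap runs (apache, mysql, php-fpm, etc.)
snap services nextcloud
```
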
So now we have an application which is automatically running, and all our further work can be done in the web interface using any browser. The only stumbling point is, ‘What do I type into the address bar?’ First try:
Type hostname.local into your browser (substituting the name of your host for ‘hostname’, obviously!). That didn’t help me in my environment, so I used an IP address directly. 192.168.4.99 did the trick for me; yours will be different (try ifconfig if you need help finding your IP address).
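If ifconfig isn’t installed (newer Ubuntu releases leave it out), either of these run on the server itself will print its addresses:

```shell
# print this host's IP addresses (run on the Nextcloud server)
hostname -I
# or the longer form:
ip addr show | grep "inet "
```
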
Step 3: Remote access
I want my Nextcloud to be accessible locally and from my laptop, office machine, phones and tablets. Basically from wherever. That will always bring security concerns, but the Nextcloud project itself takes security very seriously. Just take care!
I had a domain already registered – we’ll call it myname.net. So I logged into the company that manages the DNS for myname.net. That’s the directory that links the human-readable URL (myname.net) to the machine-readable IP address. For the record, I use Zoneedit for that and have for years. I added a subdomain called ‘cloud’, so now cloud.myname.net points to my home IP address. That’s a constant address for me, but yours may change each time your modem drops out – check out dynamic DNS if that problem applies to you.
Next I needed to look at my router and its firewall to allow incoming requests for the Nextcloud services to reach the server we set up in the steps above. These incoming requests arrive on port 443 (and possibly port 80). In my router I was able to add a port forwarding rule so that all incoming requests on port 443 are directed to the IP address of the server running Nextcloud. Good.
So now if I type cloud.myname.net into any browser, I get the log in page of my NextCloud server.
Step 4: Add users
The first user on a Nextcloud server is called ‘admin’. Not a very personalised choice. Once through the login screen it was trivial to add a new user with my proper name.
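For what it’s worth, the snap also exposes Nextcloud’s occ command line tool, so a user can be added without the browser. The username and display name below are stand-ins; occ will prompt for a password:

```shell
# add a user from the command line via the bundled occ tool
# ('myname' is a placeholder - use your own username)
sudo nextcloud.occ user:add --display-name="My Name" myname
```
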
Step 5: SSL
This got me for a bit and took some googling to get right. To keep your data safe when it is transmitted from your Nextcloud server across the internet to your laptop or phone, it needs to be encrypted in transit. This is part of the HTTPS protocol and is required by most browsers in 2020. If it’s not there you will get big warnings that your connections are insecure. To allow encryption your server needs a certificate, and we will use one from Let’s Encrypt (it’s free!).
When I initially set out to do this the process failed, but another kind blogger helped me out with an important first step:
sudo apt install resolvconf
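With resolvconf in place, the certificate itself comes from the helper built into the Nextcloud snap. This is the command the snap documents; it prompts for an email address and your domain:

```shell
# request and install a Let's Encrypt certificate for the snap
sudo nextcloud.enable-https lets-encrypt
```
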
When asked, enter the domain name that you plan to use to access Nextcloud (remember cloud.myname.net). Now your data will traverse the internet in a form that no one can read (unless they break into your server and steal your certificate).
Step 6: Download clients
Nextcloud runs a little client on each of your machines to keep your files synced. They’re here. I installed clients under Ubuntu Linux, Windows 10, Windows 7 and Android and connected my new user. I could now save a file on one machine and see it arrive on any of the others around a minute later. Cool.
Step 7: Encryption
Being able to manage the security of my data is a big deal to me. Not that I have anything to hide, but rather that I like to assert my right to keep my things to myself. Human rights stuff. Nextcloud has several features that extend security beyond HTTPS. All your files can be encrypted on the hard drive of the server, but this may be easier to achieve by allowing the operating system to encrypt the entire hard drive; that’s my preferred pathway. I have enabled end-to-end encryption after reading this TechRepublic post. There’s a bit of fiddling here, including adding two apps in the Nextcloud web interface – you need ‘Default encryption module’ and ‘End-to-End Encryption’. Once they are running you can set up folders to be encrypted before any files are transmitted over the internet. I’m still trying to work out how much that helps me. It does ensure that the files are encrypted at rest on the server as well as in transit. That’s a good thing.
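If you’d rather stay in the terminal, the same two apps can be switched on with occ. The app ids below are my understanding of what the app store uses – check yours with app:list if they differ:

```shell
# enable the server-side 'Default encryption module'
sudo nextcloud.occ app:enable encryption
# enable the 'End-to-End Encryption' app
sudo nextcloud.occ app:enable end_to_end_encryption
```
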
Step 8: TODO External storage
My Nextcloud runs as a virtual machine inside another host. It’s nice to keep VMs small so they can be moved and backed up and so on. I’ll need to add more storage to my cloud before it can store all of my documents. More research is my next step.
After many years it was time to move eek.io and my other sites to a new host. The reasons are not important but the process was scary.
There are lots of descriptions scattered about of how to export, transfer and reinstall a WordPress site, but they were all a little different and used widely different technologies. One of my reasons for moving involved PHP: my old host had been slow to update PHP and the new host was much more up to date. I was anxious that the difference might break the transfer – and, just to make things worse, so was the software I used. Anyway, we got there.
I ended up using these instructions from DreamHost which were not too hard to follow and got me home. Duplicator has free and paid versions and I came across no reason to use the paid version. The free plugin did everything I needed. If your site is huge then the pro paid version may be for you.
Only one plugin broke – my plugin – so I’ll be off to get that restored next.
Best of luck to anyone out to transfer a site… it can be done.
I have belatedly upgraded this blog to serve pages using the more secure SSL. I have no confidential content here and no real need for encrypted delivery, but one by one all the major browsers are putting up bigger warnings to users accessing insecure HTTP pages.
The process was fairly simple – my website host was able to provide a certificate with only a couple of clicks in the admin panel. I then used a plugin called ‘Really Simple SSL’ to make the changes required to the WordPress installation.
Now we’re off and away. If you could use a walkthrough of the process, try this at WordPress Beginner.
I have battled to get my Synology DiskStation to do something that I thought was pretty simple. It’s a DS214play running DSM 5.2. I use Surveillance Station to record motion-activated video of the driveway at the front of the house. This documents everyone arriving and leaving and has been pretty reliable for some time. However, I wanted to automatically back up this series of short video files to a cloud storage provider – I’m using Google.
The Synology has a tool called CloudSync which can do this sort of thing, but it failed to back up the security video because it is kept in a series of folders owned by root, and there is a permissions conflict. A less direct route was required, and I hit on the idea of copying the automatically recorded files to a part of the filesystem that is easier to access, changing the permissions, and then uploading them.
So I hit the terminal. Interestingly, SSH login as root is allowed but sudo is not – an unusual combination that took me some time to puzzle out. I created a directory owned by an ordinary user:
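The commands were along these lines – the path is a placeholder for wherever you want the staging area, and melba is the ordinary user on my DiskStation:

```shell
# create a staging directory and hand it to an ordinary user
# (path is a placeholder; melba:users is the DSM default group)
mkdir -p /volume1/DS/security
chown -R melba:users /volume1/DS/security
```
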
Now I set up a recurring task in the DiskStation front end to run the script at /volume1/DS/scripts/securitycopy.sh every minute to accomplish two things: firstly, to move the inaccessible security footage from a set of directories owned by root into an accessible area owned by melba; and secondly, to change the owner of all the files and directories to melba. This will allow me to back them up with CloudSync. The file securitycopy.sh therefore looks like:
#!/bin/sh
# FROM and TO are placeholders – point FROM at the root-owned
# Surveillance Station recording folder and TO at the staging area.
FROM=/path/to/surveillance/recordings
TO=/path/to/staging/area
rsync -ra "$FROM" "$TO"
chown -R melba:users "$TO"
/usr/syno/bin/synologset1 sys info 0x11100000 "Security footage has been synchronised."
The last line there was the result of a little googling, to log each run of the little script, largely for my own peace of mind. Thanks to the kind user whose post I’ve mislaid! The first time I ran the rsync command the system hummed away for about 10 minutes, but subsequently the sync takes a second or two.
The final stage is to set up a task with the CloudSync tool on the DiskStation. This is a one-way sync to my Google Drive, which makes all the footage available off site, and available in the event that the DiskStation is stolen. I’ll need to report back on the latency of the system.
I’ve been reading the 2018 history of reddit by Christine Lagorio-Chafkin this weekend, and I really enjoy this type of work. The history of reddit.com is a little different from that of many US IT startups, being initially East Coast and with founders who took an idea to product stage and indeed fleshed it out for several years before being hit by the realities of venture capital and the impact that has on the purity of a concept. The founding concept of ‘the front page of the internet’ being stripped down, user contributed and devoid of editorial control remained true for many years and still largely exists today. CL-C fleshes out the key characters with skill, making them interesting, distinct and human – and for Ohanian and Huffman, who spent 18 months alone in a Cambridge, MA apartment writing code and contributing to their own infant site, that’s quite an effort.
I like books in this ilk because of the insight into the qualities that allow young risk-taking upstarts to achieve so much. There are lessons in that for all of us. For me, We Are the Nerds ranks up there with some of my favourite reads: iWoz, The Smartest Guys in the Room and the old Zuckerberg movie The Social Network. The characters here are probably a bit more real and understandable than those of Steve Wozniak or Mark Zuckerberg, which may simply be because their financial success was more modest. The book is all the better for that, and I’m sure it will appeal to many readers. Enjoy.
As my kids walk from school to swimming they pass a 7-Eleven and a Boost Juice and often want a drink. A Slurpee costs around a dollar while a juice costs about six times as much. They came to me today requesting more pocket money so they could afford the more expensive, premium and ‘healthier’ Boost juice.
I’ve been working off this how-to put together by the Electronic Frontier Foundation to establish a working email client with the capacity to send and retrieve encrypted email. I’d recommend it as a usable, noob-friendly guide.