OK, so admittedly this is a bit of an experiment. The problem is that I have multiple machines in the house (Mac, Windows and KDE neon walk into a bar…) and I want to back them up automatically and be able to access the files. Plus, I wouldn’t mind being able to play music through the server (e.g., it is connected to the stereo), and it should have a public folder for file sharing.
This has now taken quite a few installs, including Openmediavault, Amahi Home Server, KDE neon and Ubuntu Server 19.04 (updated samba for Time Machine) and 18.04. And during this process I have set up Samba, Time Machine, Plex Media Server and god knows how many other programs trying to get what I wanted.
In the end I settled on,
- Ubuntu Server 18.04 as the base
- Nextcloud to receive backups and view files
- Rclone to sync files with Nextcloud via WebDAV
- Samba to share files between computers
- MPD to play music on the server
1) The Base – Ubuntu Server 18.04
I am not going to go into the installation of Ubuntu 18.04 too much as there are a lot of examples out there. The only important changes I made to the installation were,
- Setting a static IP
- Installing the SSH server
- Installing Nextcloud using the snap option
The ssh server installation is fairly standard but, for those in the know, I am no fond lover of snaps, so why did I use that option? Because it was easy. It stopped me having to install Apache, etc… Also, this is a bit of a test. If it works out I will probably put together a computer with RAID and no fans, but at the moment this is all running on a laptop, so easy is good.
Speaking of which, in order to stop the laptop going into suspend each time I shut the lid I edited logind.conf with,
sudo nano /etc/systemd/logind.conf
I then changed the line that controls what happens when the lid is closed.
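Assuming the stock logind.conf (where the defaults ship commented out), the change looks like this:

```ini
# /etc/systemd/logind.conf
# before (the commented default):
#HandleLidSwitch=suspend
# after (uncommented, set to ignore the lid):
HandleLidSwitch=ignore
```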
Saved it with ctrl+x and enter (if you don’t use nano then there are basic instructions here) and restarted logind.
sudo service systemd-logind restart
So that gave me the base to work from, and a working instance of Nextcloud as well. Time to close the lid, ssh in and run an update,
sudo apt update && sudo apt upgrade
2) Nextcloud
After waiting a while to make sure it had restarted, I went to the static IP address I set up in step one in a browser. This took me to the Nextcloud instance, where I was asked to enter a username and password for the Nextcloud admin.
I did, and hit enter, but it took a long time to do anything after that. Maybe ten minutes of swirling circle? It’s hard to say, as I thought I had done something wrong, so I refreshed the page (which broke everything), then reinstalled it, tried again, tried a different browser, went and had a coffee while I thought about it, and returned to find it had finished whatever it was doing.
After that it has had a few slow patches but it seems to be working those out as time goes on. The major things I have done in Nextcloud are,
- Create some users
- Change the login screen (see top of post)
- Enable plugin for external drives (to be used later)
The most important thing I did in Nextcloud was to log into one of my users and get the WebDAV path by clicking on the settings at the bottom left of the webpage. Mine is something like,
It is the same pattern for all users; only the username in the address changes where things end up. Having that address meant that I could start installing Rclone on my computers.
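If the settings link is hard to find, the address generally follows Nextcloud’s standard WebDAV pattern (server address and username here are placeholders, not my actual values):

```
http://<server-ip>/remote.php/dav/files/<username>/
```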
3) Rclone
Rclone is rsync for cloud storage, and I chose it because WebDAV is one of the protocols it can access. There are about twenty others. It is quite cool.
The first problem I ran into was after installing rclone on the server. The version in the package repos is so old it doesn’t include WebDAV, so I installed it from the website.
curl https://rclone.org/install.sh | sudo bash
After it is installed you run,
rclone config
which creates a configuration file for you. Basically it asks what you want to call your configuration, what protocol you want to use, the URL to use (insert WebDAV URL), username, password, etc… It is all fairly straightforward.
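The same setup can also be sketched non-interactively. The remote name "mbs", the URL and the username below are examples only, and the guard skips the call on machines where rclone is not installed:

```shell
# Non-interactive alternative to the `rclone config` wizard (example values).
REMOTE="mbs"
URL="http://192.168.1.10/remote.php/dav/files/squalidh/"
# Only attempt to create the remote if rclone is actually installed.
if command -v rclone >/dev/null 2>&1; then
    rclone config create "$REMOTE" webdav url="$URL" vendor=nextcloud user=squalidh
fi
```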
After that, instead of having to enter all the information each time, you can just type in Konsole,
rclone copy <sourcepath> <configname>:<destinationpath>
Which in English says: use rclone to copy the contents of the sourcepath, using the information contained in configname, to the destinationpath.
My destination path was basically a folder called Backup. This was created as it didn’t exist, and rclone happily continues to update it. Be aware there are commands other than copy. For instance, sync makes the destination match the source, deleting files that no longer exist at the source. To keep my Linux laptop backed up I created a shell script to execute,
rclone sync /archive/Archive mbs:Backup
on login using KDE’s Autostart. The Mac was slightly more complex.
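A sketch of that autostart script (the filename and location are just suggestions, and the rclone call is guarded so the script is harmless on machines without rclone):

```shell
#!/bin/sh
# Hypothetical autostart script, e.g. ~/.config/autostart-scripts/backup.sh,
# made executable with: chmod +x backup.sh
# Source path and remote name match the rclone command above.
SRC="/archive/Archive"
DEST="mbs:Backup"
# Only run the sync if rclone is actually installed on this machine.
if command -v rclone >/dev/null 2>&1; then
    rclone sync "$SRC" "$DEST"
fi
```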
On the Mac there is a program to handle autostart called launchd. To run a program after login as your user, you just have to place a .plist file in the ~/Library/LaunchAgents directory. Luckily there is an automated plist creator online at Zerowidth.
So basically, I repeated all the Rclone steps on the Mac, then created a .plist and put it into the LaunchAgents folder using the commands from Zerowidth. I did add caffeinate -i to the start of the command, though, as this stops the Mac from deciding to go into idle power mode.
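A minimal LaunchAgent along those lines might look like this (the label and paths are made up for illustration; rclone’s location depends on how it was installed):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.rclone-backup</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/bin/caffeinate</string>
        <string>-i</string>
        <string>/usr/local/bin/rclone</string>
        <string>sync</string>
        <string>/Users/example/Archive</string>
        <string>mbs:Backup</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
</dict>
</plist>
```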
4) Samba
The Samba setup is extremely easy. I only have one user, and there is only going to be one shared folder so that people can easily move things around, especially if they are visiting. The instructions for this are virtually identical to setting up Samba in KDE neon. So I ssh’d into the server and installed Samba with,
sudo apt install samba
This installs the server. The next step is to add yourself as a Samba user. You have to be an existing system user to become a Samba user, so use your existing username or it won’t work.
sudo smbpasswd -a <insert username>
It will then ask you to enter a password and confirm it. You don’t have to use your account password; in fact it is more secure to use a different one for networking, as you tend to have to give it out so others can access the share.
The next step is to backup the configuration file and then edit it. You can backup the file by typing,
sudo cp /etc/samba/smb.conf /etc/samba/smb.conf_backup
Then edit it using nano.
sudo nano /etc/samba/smb.conf
At the very bottom of the file I am going to add the share definition for a folder called Public in my home directory. This folder will probably already exist, but if it doesn’t then create it. Change the <partstochange> to suit yourself.
[<share name>]
path = /home/<username>/Public/
available = yes
valid users = <username>
read only = no
browseable = yes
public = yes
writable = yes
So if my user name was squalidh then the share would look like,
[Squalidh Share]
path = /home/squalidh/Public/
available = yes
valid users = squalidh
read only = no
browseable = yes
public = yes
writable = yes
After finishing save the file and restart the samba server with,
sudo systemctl restart smbd
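As an optional extra step (not part of the original flow), samba’s own testparm can sanity-check the config before or after the restart. The guard makes this a no-op on machines without samba installed:

```shell
# Validate smb.conf syntax; testparm ships with samba.
if command -v testparm >/dev/null 2>&1; then
    testparm -s /etc/samba/smb.conf || echo "smb.conf has errors"
    SMB_CHECK="ran"
else
    SMB_CHECK="skipped"
fi
```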
Then, as a test, I went to the Public folder from KDE neon, entered my username and password, created a folder called Music and copied some of my music there. This gives me a place to point the music player daemon at when I install it.
5) MPD – Music Player Daemon
The first thing to do is to make sure your system is capable of playing music from the command line. To do this I used mpg123 to play an mp3 file, but that was at the end of a long period of trying to get sound working. So long that I have forgotten most of what I actually did, although thinking back it may be as simple (after all the installs, configs and uninstalls) as installing ALSA, setting up alsamixer, saving the settings and adding my user to the audio group. Anyway, what I am trying to say is: make sure your sound is working before installing sound servers.
Eventually, I did get the sound working with ALSA and installed mpd with,
sudo apt install mpd
and edited the config file, which I am not going to post as it is too long, so if it is handy you can download it below.
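For a rough idea of what matters in that file, a stripped-down sketch might look like this (the music path assumes the samba Music folder created earlier; the full config has far more in it):

```
# Minimal /etc/mpd.conf sketch (illustrative only)
music_directory    "/home/squalidh/Public/Music"
db_file            "/var/lib/mpd/tag_cache"
bind_to_address    "0.0.0.0"   # listen on the network so clients can connect
port               "6600"

audio_output {
    type    "alsa"
    name    "ALSA output"
}
```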
After that, it was just a matter of installing Cantata on another computer, pointing it at the server’s IP address and port 6600, and waiting for it to update.
It sort of fell over for me at the end, and I prefer to be a little more sure of what I am doing, but as a proof of concept the entire effort has been largely a success and justifies doing it properly with a couple of drives in a RAID array, a silent motherboard, and a hell of a lot more security. But that is part two…