Hi, I’m starting a series of posts that will follow the upgrades I’ll be doing to a self-hosted machine that serves as a NAS and also runs all kinds of self-hosted software. I’m lazy, so it will probably take time; don’t expect me to post too often.
About me: I’ve been using Linux exclusively for personal use (both desktop and servers) for about 20 years now. I’ve used several distributions over the years, I’ve built my own stuff from source (including kernels) and I’ve done Linux From Scratch. I’m not a Linux expert or professional sysadmin but I know my way around it, and I can learn what I don’t know. So don’t be afraid to make any suggestions no matter how complicated.
I’ll start by describing the current state of the machine:
- It’s a PC with an i5-7400 CPU, 4 GB of RAM, and 6 HDDs, and it boots from an M.2 SSD.
- The OS is Ubuntu Server 16.04 LTS using Expanded Security Maintenance for updates.
- It’s currently running SSH, NFS, Samba, CUPS, OpenVPN, Emby and Deluge on bare metal, from distro packages.
- The HDDs are set up as 3 RAID 1 pairs. I’m limited to 6 HDDs because the PC case only has 6 drive bays.
- My ISP provides a public albeit dynamic IP, so I can use a dynamic DNS service to point a public hostname at it, and I can set up port forwards.
- There’s a router running open source firmware between the LAN and Internet, fwiw.
What I’d like to do:
- Increase the RAM to 32 GB.
- Stick with a Linux distro, as opposed to a NAS-tailored OS, Unraid etc.
- Install Debian Stable on an SSD, most likely via debootstrap from the Ubuntu system.
- Add a GRUB menu entry that boots the old system, so I can keep both around for a while.
- Use docker-compose, and possibly Portainer, for as many of the services as makes sense. Not sure it’s worth bothering to containerize things like SSH, NFS and Samba.
- Add more services. I’d like to try Jellyfin, Nextcloud and other stuff (trying to degoogle, for example).
- I’d like to find a better solution for accessing services from outside the LAN. I currently use OpenVPN, which is nice for individual devices but gets complicated when you want an entire remote LAN to have access (to let smart TVs or Chromecasts use Emby/Jellyfin, for example). I’m hoping Authelia + a reverse proxy will help with this.
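On the GRUB point: one way to keep both systems bootable is a custom entry in /etc/grub.d/40_custom on the new install that hands control to the old install’s own grub.cfg. A sketch, where the UUID is a placeholder for the old Ubuntu root partition:

```
menuentry "Ubuntu 16.04 (old install)" {
    insmod part_gpt
    insmod ext2
    # placeholder UUID -- substitute the old root partition's actual UUID
    search --no-floppy --fs-uuid --set=root 1234-ABCD
    # chain into the old system's own GRUB configuration
    configfile /boot/grub/grub.cfg
}
```

After editing, `update-grub` regenerates the menu with the new entry included.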
What I’m not interested in:
- Not interested in using Plex. I’ve used it, and it’s a fine piece of software, but I don’t like the direction they took with routing access through their servers. It was supposed to be an optional feature, not a lock-in mechanism.
- Not interested in changing the filesystem or the RAID setup for the HDDs. RAID 1 pairs give me enough redundancy. The HDD upgrades are very simple. I’m fine with losing 50% of capacity.
Any and all suggestions and comments are welcome! Even if they’re about things I said I’m not interested in. It’s always possible there are things I haven’t considered.
For SSO I use old-school LDAP (OpenLDAP) because it is mature and integrates with anything (reverse proxies, most web applications, various file sharing/VoIP services…).
As a general recommendation: use some kind of configuration management tool for your setup. It makes it easy to replicate the setup (in case it goes down), bring up/tear down test environments, store and version your configuration, and test and roll back changes… I use ansible [1] for this as it can manage any kind of infra or deployment method (bare-metal, VM/VPS, container-based…). I’m currently managing a few dozen servers with it.
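To give an idea of the shape of this, a minimal playbook sketch (the host group name and package list are assumptions, not from the thread):

```yaml
# site.yml -- hypothetical "nas" host group defined in your inventory
- hosts: nas
  become: true
  tasks:
    - name: Install Docker and docker-compose from distro packages
      apt:
        name:
          - docker.io
          - docker-compose
        state: present
        update_cache: true
```

You would run it with something like `ansible-playbook -i inventory site.yml`; rerunning it is idempotent, so the same playbook doubles as documentation of the machine’s state.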
Install Debian Stable on an SSD, most likely via debootstrap from the Ubuntu system
What an interesting way to install a new system. I’ve only ever done that for image building purposes. Why would you do that instead of just installing it from a flash drive?
Also: it sounds like you’re manually installing things. I would suggest Ansible or something similar, so that reinstalling isn’t so brittle and manual.
What an interesting way to install a new system. I’ve only ever done that for image building purposes. Why would you do that instead of just installing it from a flash drive?
It minimizes downtime for the system. You can run the debootstrapped system as a guest and take your time with the initial setup and configuration.
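Roughly, the approach looks like this (the device name and mount point are hypothetical, and everything runs as root):

```shell
# Bootstrap a minimal Debian Stable tree onto the new SSD's partition
# while the old Ubuntu system keeps running and serving.
mount /dev/sdb1 /mnt/debian
debootstrap --arch=amd64 stable /mnt/debian http://deb.debian.org/debian

# Boot the new tree as a container to configure it at leisure:
systemd-nspawn -D /mnt/debian --boot
```

Once the guest is configured, the switchover is just installing a bootloader entry and rebooting, so the downtime window shrinks to minutes.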
Interesting tidbit, I don’t use actual flash drives anymore: I download an ISO to my phone and use the DriveDroid app to make the phone look like a bootable flash drive to the PC.
it sounds like you’re manually installing things. I would suggest Ansible or something similar, so that reinstalling isn’t so brittle and manual.
If you mean the original install, that’s something that only happens once every 5-10 years (and only because I got taken with trying other distros instead of sticking with Debian continuously since 2003 like I should have).
I’m not sure I understand Ansible correctly, but attempting to replicate a system install 5-10 years later probably won’t yield the same result, the repos having moved on and so forth.
It’s also going to be a very basic Debian system + docker-compose, not a big chore.
If you mean for recovery purposes I usually take periodical system snapshots and can restore from those.
If you’re going to try Authelia and a reverse proxy, I recommend using SWAG. It’s a docker container that includes Authelia, nginx, fail2ban, geoip restrictions, and has premade config files for most of the selfhosted software that people run. The config files are especially useful since they include comments that describe the settings you need to change within the services you run, like changing the external domain in Emby for example.
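For reference, a minimal docker-compose sketch of the SWAG container (the domain and DuckDNS token are placeholders, and the environment values are the common linuxserver.io conventions):

```yaml
services:
  swag:
    image: lscr.io/linuxserver/swag
    cap_add:
      - NET_ADMIN          # needed for the bundled fail2ban
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Etc/UTC
      - URL=example.duckdns.org   # placeholder dynamic-DNS domain
      - VALIDATION=duckdns        # Let's Encrypt validation method
      - DUCKDNSTOKEN=changeme     # placeholder token
    volumes:
      - ./swag:/config            # certs, nginx confs, fail2ban config
    ports:
      - 443:443
      - 80:80
    restart: unless-stopped
```

The premade proxy confs mentioned above end up under the /config volume as `*.conf.sample` files; enabling one is just renaming it and reloading nginx.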
Do you think there’s any advantage to using SSO if all your external-facing services already have built-in 2FA (e.g. Nextcloud)? I use Vaultwarden, so it’s not like any passwords need to be remembered. It just seems like extra setup.
I think SSO is less important than having everything behind the reverse proxy. The importance of the proxy is that if there is a security hole in the web server component of your service, it cannot be exploited without a second flaw in the proxy. It’s an additional layer of abstraction and security that doesn’t add a ton of overhead.
An attacker would have to find an exploit in nginx, which is used by most of the big tech companies, so it is well secured compared to the services many of us selfhost.
Another advantage of using SWAG is being able to use fail2ban and geoip restrictions. Any ports open to the IPv4 internet get scanned by security services and malicious actors many times each day. It’s nice to have nginx refuse connections from any of them that repeatedly fail to log in, or that come from outside your geographic region.
What attracts me to Authelia is the ability to whitelist an IP for a limited time (2-3h) so everything in the LAN behind that IP can access, say, Emby. A person over there on their WiFi logs into Authelia and that’s it, they can stream my Emby to their Chromecast wherever they are…