I’ve never had so much fun self-hosting. A decade or so ago I was hosting things on Linode and running all kinds of servers for myself, but with the rise of cloud services I ended up just handing everything to Google. Then I noticed how popular this community was on Reddit/Lemmy, and now it’s my new addiction.

I’m a software engineer with plenty of experience deploying to AWS/GCP, so my head has been buried in the sand with these cloud providers. Now that I’m looking around, there are things like Nextcloud, Pi-hole, and Portainer, all set up behind Cloudflare Zero Trust… I feel like I’m living the dream: the convenience of deploying my own services with proper authentication. It’s so much fun.

Reviving old hardware to act as local infra is so badass; it feels great turning on old machines that were collecting dust. I’m now trying to convince my brother to do monthly hard-drive swaps with me so I have some off-site backup redundancy without needing to back up to the cloud.

Sorry if this feels ranty, but I just can’t get over how awesome this is and I feel like a kid again. Cheers to this awesome community!

EDIT: Also just found Fission, self-hosted serverless functions. I’m jumping with joy right now! https://github.com/fission/fission

  • DrWeevilJammer@lm.rdbt.no · 1 year ago

    The easiest way to think about Docker is to consider it a type of virtual machine, like something you’d use VirtualBox for.

    So let’s say you run Windows but want to try out Linux. You could install Ubuntu in a VirtualBox VM, then install software that works on Ubuntu inside that VM, keeping it separate from Windows.

    Docker is similar in that a Docker container for a piece of software often includes a whole operating system userland inside it, complete with all of the correct versions of the libraries and dependencies the software needs to function. This all lives in a sandbox/container that is mostly isolated from the host operating system.

    As to why this is convenient: let’s say you have a computer running Ubuntu natively/bare metal. It has a certain version of Python installed that the applications you rely on need. But there’s some new software you want to try that requires a later version of Python, and upgrading would break your other apps.
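
    As a concrete sketch of that isolation (the version tag here is just an example from Docker Hub, not tied to any particular software):

        # The system Python your existing apps depend on stays untouched:
        python3 --version
        # Meanwhile a container can run a newer Python side by side:
        docker run --rm python:3.12 python --version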

    The developer of the software you want to try makes a Docker version available. There’s a docker-compose.yml file that specifies things like the port the application will be available on, the time zone your computer is in, the image to pull from Docker Hub, etc. You can modify this file if you like, and when you’re done, you type docker compose up -d in the terminal (in the same directory as the docker-compose.yml file).
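
    A minimal docker-compose.yml along those lines might look something like this (the service name, image, and port are made up for illustration):

        services:
          myapp:
            image: example/myapp:latest   # hypothetical image on Docker Hub
            ports:
              - "5000:5000"               # host port : container port
            environment:
              - TZ=America/New_York       # the time zone setting mentioned above
            restart: unless-stopped       # keep it running across reboots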

    Docker will then read the compose file, download the required image from the repository, extract it, set up the network and the web server, and configure everything else specified in the compose file. Then you open a browser and type the address of the machine the compose file is on, followed by the port number from the compose file (e.g. http://192.168.1.100:5000), and boom, there’s your software.
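
    If you’d rather check from the terminal, assuming the same example address and port:

        docker compose ps                # list the containers compose started
        curl http://192.168.1.100:5000   # the app should answer on its port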

    You can use the new software with the newer version of Python at the same time as the old stuff installed directly on your machine.

    You can leave it running all the time, or bring it down by typing docker compose down. Need to upgrade to a new version? Bring the container down, type docker compose pull, which tells Docker to pull the latest image from the repository, then docker compose up -d to bring the updated version back up again.
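
    Put together, the whole upgrade cycle is just (run from the directory containing docker-compose.yml):

        docker compose down    # stop and remove the running container
        docker compose pull    # fetch the latest image from the repository
        docker compose up -d   # start the updated version in the background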

    Portainer is just a GUI that runs Docker commands “under the hood”.