Like the title says, I’m new to the self-hosting world. 😀 While researching, I found that many people dissuaded me from self-hosting an email server: just too complicated and hard to manage. What other services do you think we should leave to the providers currently available in the market, and why? 🙂 Thank you!

    • KN4MKB@alien.topB · 1 year ago

      Meh, been doing it for 5 years now with minimal issues. One issue came up where my domain was flagged as malicious, but it was resolved in a few days with some emails to security vendors.

      I think it’s important that those who can, and are educated enough to keep it running properly, do host their own. Hosting your own email should be encouraged if you’re capable, because it helps reduce the monopoly and keeps a little bit of power with those who want to retain email privacy.

      • rad2018@alien.topB · 1 year ago

        I agree with KN4MKB. I’ve been hosting my own mail server for decades, with not one issue. I use it in lieu of a mail service provider (Google immediately comes to mind), since their EULA will tell you that, because you’re using their service on their servers, anything goes. Read the fine print on Gmail and you’ll see. 😉

      • AdmiralPoopyDiaper@alien.topB · 1 year ago

        I did for years, quite successfully. Ultimately, blocklists did me in: I didn’t have the knowledge to resolve those delistings in a timely way, and it became a headache I couldn’t tolerate.

    • Zoenboen@alien.topB · 1 year ago

      They are not hard to set up and are easy to keep running (once going, they pretty much just work). If you follow the right steps you can avoid being undeliverable and keep people from abusing your sending server as an open relay.

      https://workaround.org/
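
      If you go the Postfix route (one common choice, and the one the guide above walks through), the "don’t become an open relay" part mostly comes down to a restriction setting like the one below. This is an illustrative fragment only, not a complete configuration:

```
# /etc/postfix/main.cf (Postfix >= 2.10) -- illustrative fragment only.
# Only the local networks and authenticated (SASL) users may relay mail out;
# everyone else may only deliver to domains this server is the destination for.
smtpd_relay_restrictions = permit_mynetworks,
    permit_sasl_authenticated,
    reject_unauth_destination
```

      The deliverability side is mostly DNS work on top of that: SPF, DKIM, and DMARC records, plus matching forward and reverse DNS for the server’s hostname.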

    • Im1Random@alien.topB · 1 year ago

      I did it anyway some time ago and I’m really happy with it. I’m using my own email addresses for absolutely everything by now.

  • bulletproofkoala@alien.topB · 1 year ago

    Okay, I understand that self-hosting email is bad for SENDING email, but what about only RECEIVING it? Isn’t it a good idea for keeping my stuff private? I rarely send personal emails, and I’d like to avoid my data being used for marketing purposes. Is it bad to have SMTP/IMAP open on a dynamic IP address? Just asking your opinion.

    • nekapsule@alien.topB · 1 year ago

      Self-hosted doesn’t mean hosted on your home connection. Even with a static IP I would recommend against hosting your mail server at home, because any outage means no mail (been there, done that). I have hosted my own IMAP/SMTP server for decades and couldn’t be happier with it, but yes, the SMTP part is tricky when it comes to evading blocks, especially from Microsoft, which will block entire networks (Linode, for example) without a real reason.

    • shrugal@lemm.ee · 1 year ago

      I’m doing exactly that, and it works like a charm. Get dynamic DNS, a backup MX, and an SMTP relay and you’re good, or get a domain provider like strato.de that already includes all three with the domain.

      Spam is also manageable. I get maybe 1-2 per day that make it past the filter, and I do have to add some custom keyword filters from time to time, but that’s about it. Fetching updated filter lists and self-learning from past errors keeps the filter up to date and is completely automated.
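
      The custom keyword filters mentioned above can be as simple as a list of patterns scored against the message text. Here is a minimal Python sketch of that idea; the patterns, scores, and threshold are made up for illustration (real setups usually lean on a proper filter like SpamAssassin or rspamd, which also handles the list fetching and self-learning):

```python
import re

# Hypothetical keyword rules: regex pattern -> spam score contribution.
KEYWORD_SCORES = {
    r"\bfree money\b": 3.0,
    r"\bact now\b": 2.0,
    r"\bunsubscribe\b": 0.5,
}
SPAM_THRESHOLD = 3.0  # made-up cutoff

def score_message(subject: str, body: str) -> float:
    """Sum the scores of all keyword patterns found in the message."""
    text = f"{subject}\n{body}".lower()
    return sum(score for pattern, score in KEYWORD_SCORES.items()
               if re.search(pattern, text))

def is_spam(subject: str, body: str) -> bool:
    """Flag the message when its total keyword score reaches the threshold."""
    return score_message(subject, body) >= SPAM_THRESHOLD
```

      Adding a new custom filter is then just one more pattern in the table, which matches the "add some keyword filters from time to time" workflow.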

  • No-Needleworker-9890@alien.topB · 1 year ago

    Passwords:
    -> You want immediate access to them, even if your house burns down

    Notes:
    -> You want to be able to read the documentation on how to fix your self-hosted services, even when those services are down

    Public reverse proxy:
    -> A reverse proxy is only as safe as the applications behind it. And NO, most self-hosted applications are not hardened and have not had security audits
    (a reverse proxy with a forward authentication proxy is something different)

  • Vogete@alien.topB · 1 year ago

    A password manager because if anything goes wrong, you’ll be completely screwed.

    What you SHOULD absolutely self host though is a password manager, so you can be in control of your most sensitive data.

    Regarding email, I think everyone should absolutely self-host it, but it’s less and less viable in this Google/Microsoft duopoly world. Ideally everyone would self-host it anyway. The reason people advise against it really comes down to the lack of real competition, and the two tech giants dictating how email works while violating every RFC possible.

    • pogky_thunder@alien.topB · 1 year ago

      A password manager because if anything goes wrong, you’ll be completely screwed.

      What you SHOULD absolutely self host though is a password manager, so you can be in control of your most sensitive data.

      Wot?

  • SwingingTheLamp@midwest.social · 1 year ago

    In my opinion, cloud storage for (zero knowledge) backup. Your backup strategy should include a diversity of physical locations. I had a house fire a few years ago. Luckily, my data drives survived, but if they hadn’t, my cloud backup would’ve been invaluable.

  • rgnissen202@alien.topB · 1 year ago

    I’d say backups. At the very least, they shouldn’t be only local. I follow the rule of threes: two local copies and one off-site with Backblaze. Yeah, it ties up a not-insignificant amount of disk space I could use for other things, but dammit, I’m not losing my wedding photos, important system configurations, etc.

  • GolemancerVekk@alien.topB · 1 year ago

    Don’t self-host email (SMTP) or public DNS. They’re hard to set up properly, hard to maintain, easy to compromise, and they end up being used in internet attacks.

    Don’t expose anything directly to the internet if you’re not willing to constantly monitor the vulnerability announcements, update to new releases as soon as they come out, monitor the container for intrusions and shenanigans, take the risk that the constant updates will break something etc. If you must expose a service use a VPN (Tailscale is very easy to set up and use.)

    Don’t self-host anything with important data that takes uber-geek skills to maintain and access. Ask yourself: if you were to die suddenly, how screwed would your non-tech-savvy family be, who can’t tell a Linux server from a hot plate? Would they be able to keep functioning (calendar, photos, documents, etc.) without constant maintenance? Can they still retrieve their files (docs, pics) with only basic computing skills? Can they migrate somewhere else when the server goes down for good?

  • shrugal@lemm.ee · 1 year ago

    People saying email, look into using external SMTP servers as relays. Your domain most likely comes with at least one email account with SMTP access. You can use that as a relay to send personal/business emails from your server using the provider’s reputable IP addresses.
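
    In practice that means pointing your server’s outgoing mail at the provider’s authenticated submission endpoint. A rough Python sketch of the idea, using the standard library; the hostname, addresses, and credentials are placeholders, not real endpoints:

```python
import smtplib
from email.message import EmailMessage

def build_message(sender: str, recipient: str,
                  subject: str, body: str) -> EmailMessage:
    """Assemble a plain-text message with the headers a relay expects."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

def send_via_relay(msg: EmailMessage, host: str,
                   user: str, password: str) -> None:
    """Submit the message through the provider's relay.

    Port 587 with STARTTLS is the usual submission setup; the relay then
    sends from the provider's reputable IPs instead of yours.
    """
    with smtplib.SMTP(host, 587) as smtp:
        smtp.starttls()
        smtp.login(user, password)
        smtp.send_message(msg)

# Example (placeholder relay -- would actually connect if run):
# msg = build_message("me@mydomain.example", "you@example.com", "Hi", "Hello!")
# send_via_relay(msg, "smtp.provider.example", "me@mydomain.example", "secret")
```

    The same host/user/password details can usually be dropped straight into a mail server’s relay settings instead (e.g. a relayhost with SASL authentication), so your whole server sends through the provider.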

  • JoeB-@alien.topB · 1 year ago

    Choosing a service NOT to self-host is a subjective decision.

    I host 18 Proxmox VMs and 20 Docker containers at home. I was also self-hosting a WebDAV server for synchronizing my Joplin notes between devices, and Vaultwarden for managing my Bitwarden vault, but decided to push the Joplin synchronization target to Dropbox [free] and to use Bitwarden’s free cloud solution for my passwords and secure notes. I did this because I will need immediate access to these two critical sources of information should my house burn down or get blown over by a tornado. I have extremely strong passcodes for these and trust the hosts.

    This was strictly a personal decision. YMMV.

      • JoeB-@alien.topB · 1 year ago

        Single host, just docker run + Portainer. I’m also using macvlans, so most containers have hostnames and static IPs on my LAN. K8s is cool, but I have no need for container orchestration.

    • Zoenboen@alien.topB · 1 year ago

      I’m running Ollama (the Llama 2 port for Mac). I hosted an LLM for a site that generated the next line of a story, with no issues.

      There’s no reason to shy away from running an LLM at home if you can. People should; the source is out there for a reason.

      • Diligent_Ad_9060@alien.topB · 1 year ago

        I’m not telling people to avoid running an LLM at home. I’m just saying that it wouldn’t be a general-purpose one close to what ChatGPT provides. The reason, I would guess, is primarily a lack of computational power.

    • Cart0gan@alien.topB · 1 year ago

      Not really an option when I’m providing file hosting services to a bunch of my friends.

    • KN4MKB@alien.topB · 1 year ago

      If your NAS is properly updated and SSL is used, then the login screen is just as safe as any other web app with regular updates. I would still ask why someone would want that.

      • Accomplished-Lack721@alien.topB · 1 year ago

        It’s not. SSL in itself doesn’t make an exposed service safe, just safer. And an updated service isn’t necessarily free of vulnerabilities.

        The difference between exposing your login page and most other services is the attack surface. If someone gets into your NAS administration, game over. You’re getting hit with ransomware or worse.

        If someone gets into my Calibre-Web server, for instance, my exposure is much more limited. It runs in a Docker container that only has access to the resources and folders it absolutely needs. The paths to doing harm to anything besides my ebook library are limited.

        I of course still use SSL, with Calibre-Web behind a reverse proxy and protected by long, complex passwords, and I’ll probably soon move it behind an OAuth login where I can use MFA (since it doesn’t support that natively). And there are more measures I could take beyond that, if I chose.

        • KN4MKB@alien.topB · 1 year ago

          I’ll leave with this: ANY service, exposed publicly or not, should not have vulnerabilities. If there is any hint that your NAS’s web server has vulnerabilities, it shouldn’t even be used internally. So to me, it does not matter. I don’t expose my NAS web server because I have no reason to widen my attack surface that much.

          But I’m comfortable exposing any of my internal services as needed, because I’ve personally checked the source code for vulnerabilities and have proper checks in place on top of regular security updates. I understand why others wouldn’t think the same way, as this takes a high level of confidence in your ability to assess the security posture of your systems and network. I’ve had penetration tests run against my network, and I conduct them myself for business.

          • Accomplished-Lack721@alien.topB · 1 year ago

            It would be nice if we, and apps’ developers, always knew what the vulnerabilities are. They generally exist because the developer doesn’t know about them yet, or hasn’t found a solution yet (though ideally has been transparent about that). Zero-day exploits happen. There’s always a first person or group discovering a flaw.

            If being up to date and using SSL was all it took, security would be a lot simpler.

            No one security measure is ever foolproof, other than taking everything offline. But multiple used in tandem make it somewhere between inconveniently and impractically difficult to breach a system.

    • Tivin-i@alien.topB · 1 year ago

      Any public-facing service that other services depend on should not be running on a public IP (especially one behind address translation, or one you have to update manually).

      You could run a “hidden” authoritative NS that only your secondary NS can reach for zone transfers. You could also avoid needing a stable public IP entirely if you configure rsync or scripts to update the secondary’s host files on every IP change.
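
      The "update on every IP change" part is just a check-and-push loop. A small Python sketch of the idea; the state-file path is arbitrary, and the commented-out push command is a placeholder for whatever rsync/scp invocation fits your setup:

```python
from pathlib import Path

def ip_changed(cached: str, current: str) -> bool:
    """True when the current public IP differs from the last one recorded."""
    return cached.strip() != current.strip()

def update_if_changed(current_ip: str, state_file: Path) -> bool:
    """Record the new IP and signal that the secondary's host file
    should be pushed. Returns True only when a push is needed."""
    cached = state_file.read_text() if state_file.exists() else ""
    if not ip_changed(cached, current_ip):
        return False  # nothing to do, IP is unchanged
    state_file.write_text(current_ip + "\n")
    # Placeholder for the actual push to the secondary NS, e.g.:
    #   subprocess.run(["rsync", "-az", "zones/", "ns2.example.net:zones/"])
    return True
```

      Run from cron (or a dhcp/ppp hook) with the current public IP, this only touches the secondary when the address actually changes.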

  • zfa@alien.topB · 1 year ago

    I don’t self-host anything whose downtime would impact me unduly while I’m on holiday, to the point where I’d have to break state and go fix stuff.

    I don’t want to have to leave my beer or the beach to go fix an email server, restore a password manager database, etc. So for anything critical enough that an outage would probably make me do that, I pay someone else.