• chandleya@alien.top · 1 year ago

    This looks like a problem that would be solved more cheaply, and with less mess, by a used Lenovo ThinkStation P720, a pair of Xeon Gold 6140s, a dozen 32GB DIMMs, a 4TB NVMe, and a copy of VMware.

    Why on earth do you have so many minis?
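
    For scale, here's a quick back-of-the-envelope split of that box across one VM per mini (a sketch in Python, using the specs listed above):

    ```python
    # Back-of-the-envelope split of the suggested P720 across the minis.
    # Assumptions: 2x Xeon Gold 6140 (18 cores each), 12x 32GB DIMMs,
    # and one VM per existing mini PC -- numbers from the comment above.
    cores = 2 * 18      # 36 physical cores (72 threads with hyper-threading)
    ram_gb = 12 * 32    # 384 GB total
    vms = 17            # one VM per mini being replaced

    print(f"{cores / vms:.1f} cores, {ram_gb / vms:.1f} GB RAM per VM")
    # -> 2.1 cores, 22.6 GB RAM per VM
    ```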

    • ghafri@alien.top (OP) · 1 year ago

      But can it be solved if I need to run graphics on each of those devices, and a single virtualized GPU isn't good enough?

    • NavySeal2k@alien.top · 1 year ago

      To mitigate terminal server cost, I guess. And no, virtualizing Windows is no solution, because to be legal you need expensive Open License Windows licenses and Software Assurance on those licenses …

  • DonkeyOfWallStreet@alien.top · 1 year ago

    10" racks.

    Consolidating power might not be possible at 12 V, and even 20 amps only covers 4-6 computers. Just get 10" PDUs; they have 3 plugs each. With 17 computers and 3 switches, that's about 7 PDUs (rough plug math sketched below)…

    20 shelves means 2x 10U 10" racks…

    You'll need to keep a 10U empty for expandability.
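
    The outlet count, as a quick sketch:

    ```python
    import math

    # Outlet-count math from the comment above: 10" PDUs with 3 plugs each.
    devices = 17 + 3                  # 17 mini PCs + 3 switches
    outlets_per_pdu = 3
    print(math.ceil(devices / outlets_per_pdu))  # -> 7 PDUs
    ```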

  • Chuffed_Canadian@alien.top · 1 year ago

    Low-hanging fruit: ditch the dumb switches and get a 48-port switch or something, and place it on the shelf above. If you're using the two switches to get more throughput per machine, consider putting in a 10 Gbit uplink (rough numbers below). Beyond that, consider mounting/stacking the monitors differently and swapping the wood shelves for metal ones for airflow. The trick is to know in which direction the machines dissipate heat: if they're fanless, the heat is going to go straight up, which will limit how high you can stack them.
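
    To put rough numbers on the uplink suggestion (a sketch, assuming all 17 machines are 1GbE clients pushing traffic upstream at once):

    ```python
    # Uplink oversubscription if all 17 minis (1GbE each) transmit at once.
    clients = 17
    client_gbps = 1.0

    for uplink_gbps in (1.0, 10.0):
        ratio = clients * client_gbps / uplink_gbps
        print(f"{uplink_gbps:g} Gbit uplink -> {ratio:g}:1 oversubscription")
    # -> 17:1 with a 1 Gbit uplink, 1.7:1 with a 10 Gbit uplink
    ```

    Even ~2:1 is usually fine for bursty traffic; 17:1 is not.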

    • ghafri@alien.top (OP) · 1 year ago

      I run certain AI bots that require integrated CPU graphics and have to run 24/7.

  • Professional-Fee2235@alien.top · 1 year ago

    Personally, I'd buy one big power supply at the voltage they need, get a distribution board with fuses, and connect everything to that.

    But it seems you need a few more shelves and a pack of zip ties first.

  • linerror@alien.top · 1 year ago

    you can replace the 17 power supplies with one server PSU… these should all be 12v DC input with standard 2.1x5.5mm barrel connectors… (rough sizing at the end of this comment)

    flip them up onto their faces, with the rear facing up. if you want to get fancy you could even make a base that integrates a little wedge and a rod for the power button.

    get at least a 24-port switch and micro ethernet cables.

    this could all be cleaned up to a single row on one shelf with no visible wires other than the ethernet and power lead running up and to the back. 1 switch under the machines… 1 PSU instead of 17… even one power cord…

    or a little cable management at the very least…
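
    a sizing sketch for that single PSU (the ~5A-per-brick figure below is my assumption, not from this thread -- check the labels on your actual adapters):

    ```python
    # Sizing one server PSU to replace 17 individual 12V bricks.
    # ASSUMPTION: each mini's brick is rated ~5A at 12V (check the label
    # on your actual adapters; this figure is a guess, not from the thread).
    minis = 17
    amps_each = 5.0
    headroom = 1.25               # 25% margin so the PSU never runs flat out

    total_amps = minis * amps_each * headroom
    print(f"~{total_amps:.0f} A at 12 V, i.e. a ~{total_amps * 12:.0f} W PSU")
    # -> ~106 A at 12 V, i.e. a ~1275 W PSU
    ```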

    • ghafri@alien.top (OP) · 1 year ago

      I didn't know about server PSUs, so I will check that out. 24-port switches are expensive; all 3 of my switches together are far cheaper than one of those big switches, though I'm not sure whether adding more 8-port switches means I'd be daisy-chaining.

      • linerror@alien.top · 1 year ago

        your clients are 1GbE… you can get a brand-new unmanaged 24-port GbE switch for $50/£56, a fraction of the cost of one of those machines… https://www.amazon.co.uk/Tenda-Ethernet-Internet-Splitter-TEG1024D/dp/B09DPLVLPY/ -- not to mention you can get a used managed switch for less than half that.

        if £56 is going to break the bank then get some double-sided velcro and clean up your mess. wish you had mentioned your obscenely limited budget…

        • ghafri@alien.top (OP) · 1 year ago

          Budget is not the issue; noise is. I sleep next to those machines; between them and me there's only the desk you see on the right. Looking at that large switch, I think it would have fans and such that would make a lot of noise.

  • gizzlyxbear@alien.top · 1 year ago

    I only have a passing interest in homelab stuff, but I just wanted to say that I got really excited because I thought those were a bunch of PS2s. Thought this was gonna be some weird FFXI or Battlefront LAN system.

    • ghafri@alien.top (OP) · 1 year ago

      Well, at least they all do run some graphics, so there's some truth to that.

  • theresmorethan42@alien.top · 1 year ago

    Get a cheap UCS or Supermicro server on eBay. You can get a stupid number of cores and RAM for next to nothing; then you have one box, and altogether probably a LOT less power draw.

  • freakierice@alien.top · 1 year ago

    Other than mounting the switch above and running the network cables up the rear, there doesn't seem to be an easy solution.

  • PopNo626@alien.top · 1 year ago

    Get some 2x2 boards, a crown stapler, and some window/door screen, and make 3 panels big enough to cover the exposed sides. I did that and everyone said it looked cool, and it only cost like $40 because I already had the crown stapler.