Wondering if anyone has a feel for the power efficiency of older server hardware. I'm reading posts from people who say their R710 with 8 hard drives IDLES at 160W. So if you take the hard drives out of the equation, it's probably still around 120W. Is that just how inefficient old servers are? Kind of like incandescent bulbs being less efficient than LED bulbs? And how efficient is the R730 compared to the R710?

My 6-year-old desktop computer idles at 60W with a GPU, and 30W without the GPU. That seems like a huge difference. With roughly 100W of extra draw running 24/7, it works out to something like $70 more per year to run an R710 than my old desktop with a GPU. Is that correct?
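
For reference, here's the back-of-the-envelope math behind that $70 figure. The $0.08/kWh rate is my own assumption (your utility rate may differ); the wattages are just the idle numbers above:

```python
# Rough annual-cost comparison. RATE_PER_KWH is an assumption,
# not something from the posts above -- adjust it for your utility.
RATE_PER_KWH = 0.08          # USD per kWh (assumed)
HOURS_PER_YEAR = 24 * 365

def annual_cost(watts: float) -> float:
    """Yearly electricity cost for a machine drawing `watts` 24/7."""
    kwh_per_year = watts / 1000 * HOURS_PER_YEAR
    return kwh_per_year * RATE_PER_KWH

r710_idle = 160    # W, reported R710 idle with 8 drives
desktop_idle = 60  # W, my desktop idle with GPU

print(f"R710:       ${annual_cost(r710_idle):.0f}/yr")       # ~$112
print(f"Desktop:    ${annual_cost(desktop_idle):.0f}/yr")    # ~$42
print(f"Difference: ${annual_cost(r710_idle) - annual_cost(desktop_idle):.0f}/yr")  # ~$70
```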

  • Piciunio91@alien.top · 10 months ago

    I'll give you a comparison I find interesting. I have an HP DL380 G7 server with 128GB of RAM and a 500GB SSD. I pulled one CPU to lower the power consumption; at idle it sits at 80W. When I run two Minecraft servers, Plex, XPenology (NAS), Windows 11, and Home Assistant, the server draws 85-90W once everything is loaded, all under Proxmox. For comparison, I moved all the machines to a Dell Optiplex with an i5-6500 and 32GB of RAM; there the CPU sits at about 85% load and the computer drew 95W the whole time with the same machines. You have to ask yourself whether the server is actually going to be doing something. If you'll only run things from time to time, a desktop PC is better, but if something will be running all the time, a server is better. Of course, I'm not talking about a file server, because then it isn't worth it.
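
    Plugging that reply's numbers into the same kind of estimate (again assuming $0.08/kWh, which is just a guessed rate) suggests the two machines cost nearly the same to run for an always-on workload:

    ```python
    # Hypothetical comparison using the reply's figures: ~90 W for the
    # DL380 G7 with one CPU vs ~95 W for the Optiplex, both 24/7.
    RATE_PER_KWH = 0.08  # USD per kWh (assumed)
    for name, watts in [("DL380 G7 (one CPU)", 90), ("Optiplex i5-6500", 95)]:
        kwh = watts / 1000 * 24 * 365
        print(f"{name}: {kwh:.0f} kWh/yr, ~${kwh * RATE_PER_KWH:.0f}/yr")
    # ~788 kWh vs ~832 kWh per year, only a few dollars apart.
    ```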