Wondering if anyone has a feel for the power efficiency of older server hardware. I'm reading posts from people who say their R710 idles at 160W with 8 hard drives. So if you take the hard drives out of the equation, it's probably still around 120W. Is that just how inefficient old servers are, kinda like incandescent bulbs being less efficient than LED bulbs? And how does the R730 compare to the R710 on efficiency?

My 6-year-old desktop idles at 60W with a GPU and 30W without one. That seems like a huge difference. By my math it's about $70 more per year to run an R710 than my old desktop with the GPU. Is that right?
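For what it's worth, that $70/year figure roughly checks out at a typical US electricity rate. A quick sanity check (the $0.13/kWh rate is my assumption; substitute your own):

```python
# Back-of-envelope annual cost of a machine idling 24/7.
HOURS_PER_YEAR = 24 * 365        # 8760 hours
RATE_USD_PER_KWH = 0.13          # assumed rate; varies a lot by region

def annual_cost_usd(idle_watts: float) -> float:
    kwh_per_year = idle_watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * RATE_USD_PER_KWH

r710 = annual_cost_usd(120)      # R710 idle, drives excluded
desktop = annual_cost_usd(60)    # desktop idle with GPU
print(f"R710:       ${r710:.0f}/yr")            # ~$137/yr
print(f"Desktop:    ${desktop:.0f}/yr")         # ~$68/yr
print(f"Difference: ${r710 - desktop:.0f}/yr")  # ~$68/yr, close to $70
```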

  • user3872465@alien.top · 10 months ago

    Further, be aware that PCIe devices also need to support sleep states (ASPM). If you add an old enterprise card, like a 10G NIC, to your server/machine, it can prevent the system from reaching deeper sleep states, and you might idle at twice the power or more than you otherwise would.

    So the cheapest 10G card or HBA is not always the best deal in the long run.
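    A quick way to check whether a card is holding the system out of deeper sleep states on Linux is to look at C-state residency in sysfs. A minimal sketch (Linux-only paths; the exact state names vary by CPU):

    ```python
    # Print how long cpu0 has spent in each idle (C-)state.
    from pathlib import Path

    for state in sorted(Path("/sys/devices/system/cpu/cpu0/cpuidle").glob("state*")):
        name = (state / "name").read_text().strip()
        usec = int((state / "time").read_text())  # total residency, microseconds
        print(f"{name:>10}: {usec / 1e6:10.1f} s")

    # If the deepest states show near-zero residency on an otherwise idle box,
    # some device (often an old NIC or HBA without ASPM support) may be pinning
    # the system in shallower states.
    ```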