Wondering if anyone has a feel for the power efficiency of older server hardware. I keep reading posts from people who say their R710 with 8 hard drives idles at 160W. Figuring roughly 5W idle per 3.5" drive, taking the drives out of the equation still leaves something like 120W. Is that just how inefficient old servers are, kind of like incandescent bulbs being less efficient than LED bulbs? How efficient is the R730 compared to the R710?

My 6-year-old desktop computer is 60W idle with a GPU, and 30W idle without it. Seems like a huge difference. That 100W gap works out to something like $70 more per year to run an R710 than my old desktop with a GPU. Is that correct?
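Here's the back-of-the-envelope math I used, in case I'm off somewhere. The $0.08/kWh electricity rate and the ~5W idle per 3.5" drive are both my assumptions, not measured or spec-sheet numbers:

```python
# Rough idle-power cost math (assumptions: ~5 W idle per 3.5" drive,
# electricity at ~$0.08/kWh -- both guesses, not measured values).

HOURS_PER_YEAR = 24 * 365       # 8760 hours
RATE_USD_PER_KWH = 0.08         # assumed electricity rate

def annual_cost_usd(watts):
    """Cost in USD to run a constant load for one year."""
    return watts / 1000 * HOURS_PER_YEAR * RATE_USD_PER_KWH

r710_idle = 160                 # reported R710 idle with 8 drives
drives = 8 * 5                  # 8 drives at ~5 W each
print(f"R710 minus drives: ~{r710_idle - drives} W")        # ~120 W

desktop_idle = 60               # my desktop, idle with GPU
extra = r710_idle - desktop_idle
print(f"Extra per year: ~${annual_cost_usd(extra):.0f}")    # ~$70
```

At $0.12/kWh the same 100W gap would be closer to $105/year, so the local rate changes the answer quite a bit.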

  • campr23@alien.top · 10 months ago

    Just to give you an idea, I’ve just built myself an AMD Ryzen 9 7900 (non-X) with: a P600 GPU, a SAS controller, a SAS expander, an Intel X710 10GbE card, a Samsung 3.84TB U.2 NVMe PCIe 4.0 drive (scratch space), a Samsung 2TB 980 Pro PCIe 4.0 drive (OS disk), and 64GB of DDR5 RAM.

    It idles at ~100W, though I am working on reducing that. But the power/performance ratio is nuts compared to those old ‘clunkers’ like the R710.

    I still have to migrate my storage drives from my old NAS, but those are 8TB Samsung 870 QVO drives that idle at around 30mW each, so even several of them add well under a watt. I don’t expect much extra idle usage from those.