I wonder if my system is good or bad. My server needs 0.1kWh.
Mate, kWh is a measure of a volume of electricity, like gallons are for a liquid. Also, 100 watt hours would be a much more sensible way to say the same thing. What you’ve said in the title is like saying your server uses 1 gallon of water: it’s meaningless without a unit of time. Watts are a measure of how fast the energy is flowing (the “current”, pun intended), similar to a measurement like gallons per minute.
For example, if your server uses 100 watts for an hour, it has used 100 watt hours of electricity. If your server uses 100 watts for 100 hours, it has used 10,000 watt hours of electricity, aka 10 kWh.
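Same math as a quick Python sketch, if that helps (the function name is just for illustration, and the numbers are the same example figures as above):

```python
def energy_kwh(power_watts: float, hours: float) -> float:
    """Energy used = power draw * time; watt hours / 1000 = kWh."""
    return power_watts * hours / 1000

print(energy_kwh(100, 1))    # 0.1  -> 100 W for 1 hour = 100 Wh = 0.1 kWh
print(energy_kwh(100, 100))  # 10.0 -> 100 W for 100 hours = 10 kWh
```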
My NAS uses about 60 watts at idle, and near 100 watts when it’s working on something. I use an old laptop for a Plex server; it probably uses around 50 watts at idle and maybe 150 or 200 when streaming a 4K movie, I haven’t checked tbh. I did just acquire a BEEFY network switch that’s going to use 120 watts 24/7 though, so that’ll hurt the pocketbook for sure. Soon all of my servers should be in the same place, with that network switch, so I’ll know exactly how much power the whole setup is using.
My home rack draws around 3.5 kW steady-state, but it also has more than 200 spinning disks.
For the whole month of November: 60 kWh. This is for all my servers and network equipment. On average, that works out to a little over 80 watts.
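For anyone who wants to sanity-check a number like that, here’s the reverse calculation as a rough sketch, assuming a full 30-day (720-hour) November:

```python
def average_power_watts(monthly_kwh: float, hours: float) -> float:
    """Average power = energy used / elapsed time; kW * 1000 = watts."""
    return monthly_kwh / hours * 1000

print(average_power_watts(60, 30 * 24))  # ~83.3 W average over 720 hours
```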
kWh is a unit of energy, not power
Wasn’t it stated as the usage during November? 60 kWh for November. Seems logical to me.
Edit: forget it, he’s saying his server needs 0.1 kWh, which is bonkers ofc.
I was really confused by that too, and by the fact that the units chosen weren’t just W (even 0.1 kW would be pretty weird).
I use Unraid with a 5950X, and it wouldn’t stop crashing until I disabled C-states.
So with that plus 18 HDDs and 2 SSDs, it sits at 200 watts 24/7.