I wonder if my system is good or bad. My server needs 0.1 kWh.
Do you mean 0.1 kWh per hour, so 0.1 kW, i.e. 100 W?
My N100 server needs about 11 W.
It’s the other way around. 0.1 kWh means 0.1 kW times 1 h. So if your device draws 0.1 kW (100 W) of power for an hour, it consumes 0.1 kWh of energy. And if a factory draws 360,000 W for one second, it consumes the same 0.1 kWh of energy.
Thank you for explaining it.
My computer uses 1 kWh per hour.
It does not yet make sense to me. It just feels wrong. I understand that you might normalize 4 Wh used in 15 minutes to a 16 W draw, because it would use 16 Wh per hour if it ran that long.
Why can’t you simply assume that I mean 1 kWh per hour when I say 1 kWh, and not 1 kWh per 15 minutes?