I wonder if my system is good or bad. My server needs 0.1 kWh.
It’s the other way around: 0.1 kWh means 0.1 kW times 1 h. So if your device draws 0.1 kW (100 W) of power for an hour, it consumes 0.1 kWh of energy. If your device (or a whole factory) draws 360,000 W for just one second, it consumes the same 0.1 kWh of energy.
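If it helps to see the arithmetic, here is a tiny Python sketch (the function name and values are just my own illustration): energy in kWh is power in kW times duration in hours, and both scenarios land on 0.1 kWh.

```python
def energy_kwh(power_watts: float, duration_seconds: float) -> float:
    """Energy (kWh) = power (kW) * time (h)."""
    return (power_watts / 1000) * (duration_seconds / 3600)

print(energy_kwh(100, 3600))    # 100 W for one hour -> 0.1 kWh
print(energy_kwh(360_000, 1))   # 360,000 W for 1 s  -> 0.1 kWh (up to float rounding)
```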
Thank you for explaining it.
My computer uses 1 kWh per hour.
It does not make sense to me yet. It just feels wrong. I understand that you may normalize 4 W in 15 minutes to 16 Wh, because it would use 16 W per hour if it ran that long.
Why can’t you simply assume that I mean 1 kWh per hour when I say 1 kWh, and not 1 kWh per 15 minutes?
A watt is 1 joule per second (1 J/s), i.e. every second your device draws 1 joule of energy. This energy per unit of time is called “power” and is a rate of energy transfer.
A watt-hour is (1 J/s) * (1 hr)
This can be rewritten as (3600 J/hr) * (1 hr). The “per hour” and the “hour” cancel out, which makes 1 watt-hour equal to 3600 joules.
1 kWh is 3,600 kJ or 3.6 MJ
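As a quick sanity check of those conversions (a throwaway Python snippet of my own, not from anyone in the thread):

```python
JOULES_PER_WATT_HOUR = 1 * 3600          # 1 J/s sustained for 3600 s
JOULES_PER_KILOWATT_HOUR = 1000 * 3600   # 1000 J/s sustained for 3600 s

print(JOULES_PER_WATT_HOUR)       # 3600 J
print(JOULES_PER_KILOWATT_HOUR)   # 3600000 J, i.e. 3,600 kJ or 3.6 MJ
```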
kWh is a unit of energy consumed. It doesn’t say anything about time, and you can’t assume any time period; that wouldn’t make sense. If you want to say how much power a device consumes, just state how many watts (W) it draws.
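For example, here is a minimal sketch of going the other way, from a power rating in watts to energy in kWh over some period (the 100 W figure and the 30-day month are made-up numbers):

```python
def kwh_over_period(power_watts: float, hours: float) -> float:
    """Energy (kWh) used by a constant load running for the given number of hours."""
    return power_watts / 1000 * hours

# Hypothetical 100 W server running around the clock for a 30-day month:
print(kwh_over_period(100, 24 * 30))   # 72.0 kWh
```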