I have a watt meter right on my PC plug next to my monitor so I can always see how much I consume. It's crazy how much the monitors alone take up, it's something like 40 kW/h each. I'm considering removing one of them.
Neither, hopefully. The former at least is a unit of power, but 40 kW is enough to heat a whole apartment building.
In reality, a large, older monitor might use a couple hundred watts. A small modern 24" will probably be closer to 50 W (guesstimating), which is still a decent chunk of the power draw of a budget build.
kWh is a measure of total energy, not instantaneous power. Your watt meter was saying that, since its last reset, it has measured 40 kWh of energy use. That's not an insignificant amount - a Chevy Bolt can go around 180 miles on 40 kWh.
Watts, or kilowatts, are instantaneous power. That same Bolt can easily pull 100 kW while accelerating, and if it could somehow do that for an hour, it would have used 100 kWh. It could never make it the whole hour, since it only has a 65 kWh battery, so it would run out after 39 minutes.
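For anyone who wants the arithmetic spelled out, here's a quick sketch (the Bolt figures are the rough ones quoted above, not official specs):

```python
# Power vs. energy, using the rough Bolt numbers from this thread (not official specs).
battery_kwh = 65      # battery capacity, kWh (energy)
draw_kw = 100         # sustained draw while accelerating, kW (power)

hours = battery_kwh / draw_kw                     # energy / power = time
print(f"{hours:.2f} h = {hours * 60:.0f} min")    # 0.65 h = 39 min

miles = 180                                       # range quoted above on 40 kWh
print(f"{miles / 40:.1f} miles per kWh")          # ~4.5 mi/kWh
```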
What you're describing is kWh, not kW/h. You multiply power by time to get energy: an appliance using 1 kW of power for 1 h "uses" 1 kWh of energy, and the same appliance running for 2 h uses 2 kWh.
kW/h doesn't really make sense as a unit here, although it could technically describe how fast power draw changes over time - say, a load ramping up by 40 kW every hour.
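If it helps, the conversion is just power times time. A throwaway sketch (the 50 W monitor figure is the guesstimate from earlier in the thread):

```python
def energy_kwh(power_w: float, hours: float) -> float:
    """energy (kWh) = power (kW) * time (h)"""
    return power_w / 1000 * hours

print(energy_kwh(1000, 1))   # 1.0 -> a 1 kW appliance running for 1 h uses 1 kWh
print(energy_kwh(1000, 2))   # 2.0 -> the same appliance for 2 h uses 2 kWh

# How long would a ~50 W monitor take to rack up the 40 kWh shown on the meter?
hours = 40 / (50 / 1000)
print(hours, hours / 24)     # 800 h, i.e. roughly 33 days of being on nonstop
```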
A typical wall outlet can only supply about 1800 W (1.8 kW), so there is no way it's drawing 40 kW (and kW/h is a nonsense unit in this context). If it's actually drawing 40 W, that's quite low; a typical monitor is closer to 80-100 W while powered on.
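Rough sanity check on that outlet limit, assuming a standard US 120 V / 15 A circuit (other countries differ):

```python
# Max continuous power from one outlet (assumption: US 120 V, 15 A circuit).
volts, amps = 120, 15
outlet_max_w = volts * amps        # 1800 W

claimed_w = 40_000                 # the "40 kW" from the original post
print(outlet_max_w)                # 1800
print(claimed_w / outlet_max_w)    # ~22x what the outlet can even supply
```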
Where I live electricity is about 10¢/kWh (cheap, I know), so a 100 W monitor costs me about a cent an hour. More than worth it imo, but you make your own decisions.
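Same cost math in code, using my ~10¢/kWh rate - swap in your own tariff and usage:

```python
# Cost of running a monitor: power (kW) * hours * rate ($/kWh).
rate = 0.10            # $/kWh, my local rate - plug in yours
monitor_w = 100        # typical draw while powered on, W

cost_per_hour = monitor_w / 1000 * rate
print(f"${cost_per_hour:.3f}/hour")                     # $0.010, about a cent

hours_per_month = 8 * 30                                # assume ~8 h/day of use
print(f"${cost_per_hour * hours_per_month:.2f}/month")  # $2.40
```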