Relative humidity is not a very useful measure because of its temperature dependence. Dewpoint is a much better quantity to work with. 90F with 90% relative humidity is not common, as it translates to a dewpoint of 87F, which is pretty rare (at least around here). Dewpoint is a measure of the actual amount of water vapor in the air. Relative humidity is "relative" to the maximum amount of water vapor the air can hold, and that maximum rises roughly exponentially with temperature.
For example, 95F with a relative humidity of 50% doesn't sound too bad, but it is actually quite miserable with a dewpoint around 74F. 70F with 90% relative humidity is pretty comfortable because the dewpoint is only in the mid-60s.
Dewpoint above 75F: pretty muggy
Dewpoint below 50F: very comfortable
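If you want to convert your own temperature/humidity readings, here is a minimal sketch in Python using the Magnus approximation (the constant choices and the function name are my own, not from any standard library; other constant sets differ slightly):

```python
import math

def dew_point_f(temp_f: float, rh_percent: float) -> float:
    """Approximate dewpoint (F) from air temperature (F) and relative
    humidity (%) using the Magnus formula. Good to roughly +/-0.5F
    over ordinary weather ranges."""
    b, c = 17.62, 243.12                        # Magnus constants (c in Celsius)
    t_c = (temp_f - 32.0) * 5.0 / 9.0           # convert to Celsius
    gamma = math.log(rh_percent / 100.0) + b * t_c / (c + t_c)
    dp_c = c * gamma / (b - gamma)              # dewpoint in Celsius
    return dp_c * 9.0 / 5.0 + 32.0              # back to Fahrenheit

# The examples above: 95F/50% RH -> ~73F (oppressive), 70F/90% RH -> ~67F,
# 90F/90% RH -> ~87F (rare), 90F/10% RH -> ~26F (bone dry).
print(round(dew_point_f(95, 50), 1))
print(round(dew_point_f(70, 90), 1))
print(round(dew_point_f(90, 90), 1))
print(round(dew_point_f(90, 10), 1))
```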
Quote:
Originally Posted by rustyp
To show a temperature graph without a humidity graph vs. how hot you feel is almost meaningless. I will illustrate. We used to environmentally test our products for certification worldwide. Three zones are pretty much considered within normal ranges for getting the certification:
1 - standard = 72 degrees F and 45% RH
2 - cool = 60 degrees F and 20% RH
3 - hot = 80 degrees F and 80% RH
We had operators paid to run the equipment usually in 10 - 12 hour shifts. When at 80/80 the operators got paid a differential like shift workers would get paid for a night shift.
90 degrees F at 10% RH is very comfortable; 90 degrees F at 90% RH is unbearable to me.