NEW YORK, NY - DECEMBER 13: A pedestrian walks through steam coming up from a manhole in Lower Manhattan, December 13, 2017 in New York City. According to the National Weather Service, the cold temperatures and up to 40 mph wind gusts are making it feel as cold as 15 degrees on Wednesday in New York City. (Photo by Drew Angerer/Getty Images)
Relative humidity and dew point are often mistaken for the same thing, and the two are a common source of confusion. But what is the difference between them, and which is more relevant in everyday life?
Relative humidity is defined by the National Weather Service as the amount of atmospheric moisture present, relative to the amount that would be present if the air were saturated. But what does that really mean? In short, relative humidity is a function of both the moisture content of the atmosphere and the actual air temperature. By itself, it does not directly indicate how much moisture is present in the atmosphere.
Relative humidity is calculated using meteorological variables such as vapor pressure, saturation vapor pressure, temperature and dew point. Without explaining the equation variable by variable, we can simply state that the relative humidity is determined by how close the air temperature is to the dew point temperature. The closer these values are, the higher the humidity. The further apart these values are, the lower the humidity.
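The relationship described above can be sketched in a few lines of code. This is a minimal illustration, not the National Weather Service's exact equation: it uses the Magnus approximation for saturation vapor pressure, a common textbook formula, and the constants (6.112, 17.62, 243.12) are one widely used set of Magnus coefficients.

```python
import math

def saturation_vapor_pressure(temp_c: float) -> float:
    """Approximate saturation vapor pressure in hPa (Magnus formula)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(temp_c: float, dew_point_c: float) -> float:
    """Relative humidity (%) from air temperature and dew point (both in C).

    RH is the ratio of the actual vapor pressure (saturation pressure
    at the dew point) to the saturation pressure at the air temperature.
    """
    return 100.0 * saturation_vapor_pressure(dew_point_c) / saturation_vapor_pressure(temp_c)

# The closer the dew point is to the air temperature, the higher the RH:
# relative_humidity(30, 25) is higher than relative_humidity(30, 15),
# and when dew point equals air temperature, RH is 100%.
```

Note how the function never takes a "moisture amount" directly: the same dew point yields a lower relative humidity on a hotter day, which is exactly why relative humidity alone does not tell you how much moisture is in the air.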
Dew point is the term most meteorologists use to describe the amount of moisture in the air. The National Weather Service defines dew point as the temperature to which air must be cooled in order to reach saturation, assuming air pressure and moisture content are constant. In short, the dew point is an accurate measurement of the moisture content of the air. When describing a day as feeling “muggy” or “sticky,” the dew point is the more accurate term to use.
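As a rough sketch of the definition above, the Magnus approximation can also be run in reverse to recover the dew point from an air temperature and relative humidity reading. The constants here are one common set of Magnus coefficients, and the ~65°F comfort threshold in the comment is a widely cited rule of thumb rather than an official cutoff.

```python
import math

def dew_point(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point (C) from air temperature (C) and RH (%).

    Inverts the Magnus formula: find the temperature whose saturation
    vapor pressure equals the actual vapor pressure of the air.
    """
    gamma = math.log(rh_percent / 100.0) + 17.62 * temp_c / (243.12 + temp_c)
    return 243.12 * gamma / (17.62 - gamma)

# At 100% RH the dew point equals the air temperature (saturated air).
# As a rule of thumb, dew points above roughly 18 C (~65 F) are what
# most people would describe as "muggy" or "sticky".
```

Because dew point tracks the actual moisture content, it stays meaningful across temperature changes: a dew point of 21°C (70°F) feels oppressive whether the afternoon high is 27°C or 35°C, even though the relative humidity differs sharply between those two cases.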