Basic Thermography Settings

What values should I set for my emissivity and reflected temperature in my Therm-App TH?


It all depends on the application you are using the device for. Are you interested in identifying thermal patterns or in measuring temperatures? For example, if you are just trying to locate thermal patterns such as missing insulation or air leakage in a building, measuring an exact temperature is not required. In this case we recommend leaving the values at their default settings (0.95 for emissivity and 20°C for reflected temperature).

However, if your application requires exact temperature measurements, then correctly setting emissivity and reflected temperature may be necessary to get the most accurate reading. If you are not experienced in thermography, we recommend consulting a professional thermographer before changing these settings.

If you are unable to consult with a thermographer, here are some general guidelines for taking simple measurements. Note that there are many other factors to consider, but this should give you the basics:

 

What is Emissivity?

Emissivity is a measure of how efficiently an object radiates heat. It is defined as the ratio of infrared energy emitted by the object to that emitted by an ideal blackbody at the same temperature, and it is expressed as either a percentage or a decimal.
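Written as a simple formula (the wording below is just shorthand for this article, not a setting in the camera):

emissivity = (infrared energy emitted by the object) / (infrared energy emitted by an ideal blackbody at the same temperature)

A perfect blackbody would have an emissivity of 1.0 (100%); every real surface falls somewhere below that.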

Surfaces exhibit emissivity values ranging anywhere from 0.01 to 0.99. For example, a highly polished metallic surface such as copper or aluminum is often below 0.10 and acts practically as an infrared mirror. Heavily oxidized metallic surfaces have a much higher emissivity (0.6 or greater, depending on the surface condition and the amount of oxidation). Most flat-finish paints are around 0.90 (in long-wave infrared), while human skin and water are about 0.98.

What is Reflected Temperature?


Reflected temperature is any thermal radiation originating from other objects that reflects off the target you are measuring. To obtain an accurate surface temperature reading with thermal imaging, this value (along with emissivity) must be quantified and programmed into the camera's object parameters so that the software can compensate for, and ignore, this radiation, which does not relate to the actual surface temperature of the object you are measuring.

For higher-emissivity objects, reflected temperature has less influence. For lower-emissivity objects, however, it is a critical factor that must be set carefully. As emissivity decreases, more of what you are measuring (and seeing thermally) comes from the surfaces of surrounding objects (including the camera and the operator), not from the target you are inspecting.
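To make the relationship concrete, here is a minimal sketch of the kind of compensation a radiometric camera performs. It is not the Therm-App software and the function names are ours; it assumes a simplified grey-body Stefan-Boltzmann model with no atmospheric losses, whereas a real camera works over a limited waveband with factory calibration data.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def apparent_radiance(t_surface_c, t_reflected_c, emissivity):
    # Radiation reaching the camera: the part emitted by the surface
    # plus the part of the surroundings reflected off it.
    t_s = t_surface_c + 273.15
    t_r = t_reflected_c + 273.15
    return emissivity * SIGMA * t_s ** 4 + (1.0 - emissivity) * SIGMA * t_r ** 4

def compensated_temperature_c(measured_radiance, t_reflected_c, emissivity):
    # Invert the same equation to recover the surface temperature.
    t_r = t_reflected_c + 273.15
    emitted = measured_radiance - (1.0 - emissivity) * SIGMA * t_r ** 4
    return (emitted / (emissivity * SIGMA)) ** 0.25 - 273.15

# Example: a 60 C surface with emissivity 0.95 in a 20 C room.
radiance = apparent_radiance(60.0, 20.0, 0.95)
print(compensated_temperature_c(radiance, 20.0, 0.95))   # about 60 C (correct settings)
print(compensated_temperature_c(radiance, 20.0, 0.60))   # about 78 C (emissivity set too low)

The second print shows why these settings matter for absolute measurements: the same incoming radiation that corresponds to a 60°C surface would be reported as roughly 78°C with the emissivity wrongly set to 0.60, and the lower the emissivity, the more the reflected term dominates the reading.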

How to Take a Basic Temperature Measurement


The easiest way to get an accurate measurement is to modify the surface with a material that has a high, known, and consistent emissivity value. Standard electrical tape, with its emissivity of 0.95, works well for this purpose.

Simply place a piece of electrical tape on the object and set the camera's emissivity value to 0.95. Next, set the reflected temperature to an appropriate value for the environment; a stable, room-temperature environment will give the best results.


Finally, measure the temperature of the tape with the Therm-App spot meter, making sure that the target completely fills the spot meter's circle.
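For example, if the spot meter reads 35°C on the tape with emissivity set to 0.95 and reflected temperature set to 20°C (the figures here are purely illustrative), that 35°C is already the compensated surface temperature; the camera applies the correction internally once both values are entered, so no further calculation is needed on your part.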

 
