In 1878, the American astronomer Samuel Pierpont Langley invented the bolometer, a device that measures infrared (thermal) radiation. In 1929, the Hungarian physicist Kálmán Tihanyi invented an infrared-sensitive electronic television camera capable of capturing thermal images.
Both infrared radiation and visible light are part of the electromagnetic spectrum, but unlike visible light, infrared radiation cannot be perceived directly by the human eye. This is why a thermal camera is unaffected by ambient light and can produce a clear image of an object even in a dark environment.
Thermal imaging is all about converting that infrared radiation into electrical signals and building an image from that information.
This technology was revolutionary at the time, but it is commonplace today. How do these devices manage to capture this invisible information? Let's take a look.
How Do Thermal Cameras Work?
The common convention today is for thermal cameras to show warmer objects with a yellow-orange hue that gets brighter as the object gets hotter, while colder objects are displayed in blue or purple.
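To make that color mapping concrete, here is a minimal sketch of how a false-color palette of this kind can be applied to temperature data. The temperature range, the color breakpoints, and the temperature_to_rgb helper are illustrative assumptions, not any particular camera's actual palette.

```python
# A sketch (not a standardized palette) of mapping temperatures to an
# "ironbow"-style gradient: cold -> purple/blue, hot -> orange/yellow/white.
import numpy as np

def temperature_to_rgb(temps_c, t_min=-20.0, t_max=120.0):
    """Map temperatures (deg C) to RGB triples in [0, 255].

    The breakpoints below are illustrative, not a vendor palette.
    """
    # Normalize temperatures to [0, 1] over the chosen display range.
    t = np.clip((np.asarray(temps_c, dtype=float) - t_min) / (t_max - t_min), 0.0, 1.0)

    # Piecewise-linear gradient: purple -> blue -> orange -> yellow -> white.
    stops = np.array([0.0, 0.25, 0.6, 0.85, 1.0])
    colors = np.array([
        [40,   0,  80],   # deep purple (coldest)
        [30,  60, 200],   # blue
        [230, 120, 20],   # orange
        [255, 220, 50],   # yellow
        [255, 255, 255],  # white (hottest)
    ], dtype=float)

    r = np.interp(t, stops, colors[:, 0])
    g = np.interp(t, stops, colors[:, 1])
    b = np.interp(t, stops, colors[:, 2])
    return np.stack([r, g, b], axis=-1).astype(np.uint8)

# Example: a hand at ~34 C stands out against a ~20 C background.
print(temperature_to_rgb([20.0, 34.0, 100.0]))
```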
Infrared energy has wavelengths starting at approximately 700 nanometers and extending to approximately 1 mm. Wavelengths shorter than this become visible to the naked eye. Thermal imaging cameras use this infrared energy to create thermal images. The camera's lens focuses the infrared energy onto an array of detectors, which record a detailed pattern called a thermogram. The thermogram is then converted into electrical signals and processed into a thermal image that we can see and interpret.
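As a rough illustration of that last step, the sketch below turns a simulated thermogram of raw detector counts into a viewable 8-bit image by normalizing the counts. The sensor resolution, the capture_thermogram stand-in, and the simple min-max scaling are assumptions for illustration; a real camera also applies radiometric calibration and noise correction before colorizing the frame as in the earlier sketch.

```python
# Sketch: raw readings from the focal-plane array (the "thermogram")
# are normalized into a displayable grayscale picture.
import numpy as np

SENSOR_SHAPE = (120, 160)  # assumed microbolometer resolution (rows, cols)

def capture_thermogram():
    """Stand-in for reading raw counts from the detector array (simulated data)."""
    rng = np.random.default_rng(0)
    frame = rng.normal(6000, 50, SENSOR_SHAPE)   # cool background scene
    frame[40:80, 60:100] += 3500                 # a warmer object in the frame
    return frame

def thermogram_to_grayscale(raw_counts):
    """Normalize raw detector counts to an 8-bit image for display."""
    lo, hi = raw_counts.min(), raw_counts.max()
    scaled = (raw_counts - lo) / (hi - lo + 1e-9)
    return (scaled * 255).astype(np.uint8)

image = thermogram_to_grayscale(capture_thermogram())
print(image.shape, image.min(), image.max())     # (120, 160) 0 255
```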