Invisible

Learn how I create these beautiful photos


Beyond visible

In school, we learned that many forms of light, or ‘electromagnetic radiation,’ exist, but we can only perceive the visible spectrum. The UV and IR portions are invisible to us. If this is the case, then how can I see an infrared or ultraviolet photograph? Additionally, how is it that there is color in those photographs? Lenses and cameras can be designed to see light beyond the range of human perception; take airport x-ray scanners or thermal cameras, for example.

The light that these devices “see” is converted to the visible spectrum (roughly 410-680 nanometers). If we were to capture an x-ray image but then display it in its native form, with x-rays, we would be unable to see it. Cameras, like the one on your phone, are designed to capture light with the same wavelengths, or ‘colors,’ that your eyes can. When that image is displayed with the same wavelengths, you have a color photograph. The cool part is that if you take photographs with a camera that can see into other portions of the electromagnetic spectrum, such as infrared or ultraviolet, and display those photographs using the visible portion of the spectrum, just as x-ray and thermal scanners do, you get other-worldly results.

Why do photographs of invisible light have color?

Color is just our perception of different wavelengths, or energies, of light. We can translate light from another part of the spectrum and display it with visible light so we can see it. Each wavelength of visible light corresponds to a color: light at 430 nanometers looks blue, 520 nm looks green, 570 nm yellow, 600 nm orange, and 640 nm red. Combine all of those wavelengths at once and you get white. The colors we perceive come from the sensitivities of our three types of color receptors: red, green, and blue. The red-ish receptor is most sensitive to light at 564 nm, the green-ish at 534 nm, and the blue-ish at 420 nm. If an object reflects light from 400 to 480 nm and from 600 to 700 nm but blocks the rest, we see pink. If it blocks all wavelengths, we see black. For more details on the specifics of how we see color, this Wikipedia article is a good read.
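
If you like to tinker, here is a rough Python sketch of the wavelength-to-color idea. The band boundaries are simplified cutoffs chosen only for illustration; real perception comes from the overlapping receptor sensitivities described above, not hard edges.

```python
# Rough sketch: map a visible wavelength (in nm) to an approximate RGB color.
# The boundaries below are illustrative approximations, not perceptual truth.

def wavelength_to_rgb(nm: float) -> tuple[float, float, float]:
    """Return an approximate (r, g, b) triple in the 0..1 range."""
    if 380 <= nm < 440:      # violet: mostly blue with a little red
        return ((440 - nm) / 60, 0.0, 1.0)
    if 440 <= nm < 490:      # blue fading toward cyan
        return (0.0, (nm - 440) / 50, 1.0)
    if 490 <= nm < 510:      # cyan fading toward green
        return (0.0, 1.0, (510 - nm) / 20)
    if 510 <= nm < 580:      # green fading toward yellow
        return ((nm - 510) / 70, 1.0, 0.0)
    if 580 <= nm < 645:      # yellow/orange fading toward red
        return (1.0, (645 - nm) / 65, 0.0)
    if 645 <= nm <= 750:     # red
        return (1.0, 0.0, 0.0)
    return (0.0, 0.0, 0.0)   # outside the visible range: no perceived color

# e.g. wavelength_to_rgb(520) gives a green, wavelength_to_rgb(640) a red
```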

So, where does the color in the photographs come from?

For each photograph, I carefully choose which spectral range to assign to each color. Below are images taken in different spectra to show how things appear different from what we are used to when they are captured with invisible light. Below each of my photographs, you will find a chart that indicates which color is assigned to each portion of the spectrum. Details about each part of the spectrum, and how it compares to what we see, follow below.
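
To make the idea of assigning spectral ranges to colors concrete, here is a minimal Python sketch of false-color mapping, assuming three already-captured single-band images. The band names and the particular channel assignment are hypothetical examples for illustration, not the mapping used in any specific photograph here.

```python
# Minimal false-color sketch: each captured spectral band is assigned to one
# of the display's red, green, or blue channels.
import numpy as np

def false_color(swir: np.ndarray, nir: np.ndarray, visible: np.ndarray) -> np.ndarray:
    """Stack three single-band images (2-D arrays scaled 0..1) into one RGB image."""
    rgb = np.stack([swir, nir, visible], axis=-1)  # R <- SWIR, G <- NIR, B <- visible
    return np.clip(rgb, 0.0, 1.0)

# Example with synthetic data standing in for captured bands:
h, w = 4, 4
rgb_image = false_color(np.random.rand(h, w), np.random.rand(h, w), np.random.rand(h, w))
```

Changing which band feeds which channel is exactly what the chart under each photograph describes: the scene stays the same, but the color assignment changes the mood of the result.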

What is the ultraviolet spectrum?

The ultraviolet spectrum is very familiar to bees and birds. Air scatters more UV light, so scenes appear hazier. Foliage typically absorbs more UV than visible light and therefore appears darker than it does in visible light.

Can you tell me about near infrared (NIR)?

The near infrared spectrum is very close to the visible spectrum, but it is still unseen by us. Things appear similar; however, landscapes look clearer and more vivid because the light scatters less in the atmosphere. Mountains have more contrast, and leaves reflect more light, so they appear brighter. This is the land of TV remote controls and night vision.

What is the shortwave infrared (SWIR) spectrum?

The shortwave infrared spectrum is far enough away from the visible spectrum that photographs appear less realistic and more like an illustration. Water absorbs SWIR light rapidly, and snow appears black. The sky scatters very little SWIR light and appears very dark. Weather satellites use this spectrum to determine water content in the atmosphere.

Tell me about Wide spectrum

Wide spectrum is just that: an expanded view of what we can see, extended to include ultraviolet, visible, and infrared light.

Have a question, or simply want to say hi?