You’ve probably heard the term Optical Low Pass Filter, commonly abbreviated as OLPF. The newly announced GH5 has no such filter in front of the sensor, so I thought it would be appropriate to revisit what they do when it comes to applications in photography and videography.
An OLPF's purpose is to reduce moiré and aliasing in images captured by digital sensors. Most shooters from the early DSLR revolution are all too familiar with aliasing and the problems it can create when capturing images. Many of us have had to embarrassingly ask a client to change their shirt for an interview to be sure that the resulting footage would be free of problems.
Almost all modern cinema and still cameras use a Bayer sensor to capture imagery. A Bayer sensor (or Bayer filter) is a grid of photosites with a layer of RGB color filters laid over top of it. Without the RGB filter layer, the sensor would only be able to capture black & white images. Since each photosite records only red, green, or blue light, the image is interpolated to full color through a process known as demosaicing. Demosaicing algorithms vary from camera to camera, but the important thing to know is that, on a base level, a certain degree of error has to be introduced in the process, because each pixel's full color value is a "best guess" arrived at through complex engineering. One such error introduced in the demosaicing process is moiré and/or aliasing.
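As a rough illustration of why demosaicing is a "best guess," here is a minimal sketch of the process. This assumes an RGGB filter pattern and naive bilinear interpolation; real cameras use far more sophisticated, proprietary algorithms, so treat this purely as a teaching toy:

```python
import numpy as np

def box3(a):
    """Sum of each pixel's 3x3 neighborhood (zero-padded at the edges)."""
    h, w = a.shape
    p = np.pad(a, 1)
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def bayer_mosaic(img):
    """Sample a full-RGB image through an RGGB Bayer pattern:
    each photosite keeps only one of the three channels."""
    h, w, _ = img.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = img[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = img[0::2, 1::2, 1]  # green sites
    mosaic[1::2, 0::2] = img[1::2, 0::2, 1]  # green sites
    mosaic[1::2, 1::2] = img[1::2, 1::2, 2]  # blue sites
    return mosaic

def demosaic_bilinear(mosaic):
    """Naive bilinear demosaic: fill in each channel's missing photosites
    by averaging the nearest recorded neighbors of that color."""
    h, w = mosaic.shape
    masks = np.zeros((h, w, 3))
    masks[0::2, 0::2, 0] = 1
    masks[0::2, 1::2, 1] = 1
    masks[1::2, 0::2, 1] = 1
    masks[1::2, 1::2, 2] = 1
    out = np.zeros((h, w, 3))
    for c in range(3):
        # Average the recorded samples of channel c in each 3x3 neighborhood;
        # everywhere else the value is a guess interpolated from neighbors.
        out[..., c] = box3(mosaic * masks[..., c]) / np.maximum(box3(masks[..., c]), 1e-9)
    return out
```

A flat gray frame survives this round trip exactly, but any detail finer than the filter pattern comes back with interpolation error, which is where false color and aliasing artifacts creep in.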
Moiré / aliasing occurs when the scene being captured contains a pattern fine enough to conflict with the grid of photosites on the Bayer sensor. The two patterns overlap and add a third, false pattern to the resulting captured image. Common subjects that create this phenomenon are hair, brick, chain link fences, tightly patterned shirts, and architecture with repeated vertical lines. With DSLR video this became a real problem, because as the subject or camera moved, the pattern would swim and shimmer through the footage, similar to the animation below.
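Underneath this is classic sampling-theory aliasing: once detail is finer than half the sensor's sampling rate, the recorded samples are indistinguishable from those of a much coarser pattern. A tiny numerical sketch (the frequencies here are made up for illustration):

```python
import numpy as np

fs = 100.0        # sampling rate, i.e. one sample per photosite pitch
f_detail = 90.0   # fine detail, well above the Nyquist limit of fs/2 = 50

n = np.arange(200)
samples = np.sin(2 * np.pi * f_detail * n / fs)

# After sampling, the 90-cycle pattern is indistinguishable from a
# 10-cycle pattern (with flipped phase) -- that coarse impostor is
# the false pattern you see as moiré.
alias = np.sin(2 * np.pi * (fs - f_detail) * n / fs)
print(np.allclose(samples, -alias))  # True
```

The sensor has no way to know which of the two patterns was actually in front of the lens, so the coarse one is what ends up in the image.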
To combat this, many manufacturers put an OLPF in the path of the light hitting the sensor. This filter typically contains layers of material that spread each point of light into multiple points, slightly blurring the image but reducing aliasing. This solution is inherently a compromise: as you fight aliasing with stronger filters, you lose the original sharpness of the image and have to process in post to recover some of the information. High-end large format cameras typically do not have these OLPFs because they aim to provide the photographer with the sharpest, most detailed image possible, and the photographer learns to shoot around it, or "fix it in post."
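The effect of the filter can be sketched numerically: blurring before sampling throws away a little sharpness, but stops fine detail from masquerading as a false pattern. Here is a toy one-dimensional example, where a 2:1 subsample stands in for the sensor's sampling grid and a two-tap average stands in for the OLPF's beam splitting (a real four-point birefringent OLPF works in two dimensions, but the principle is the same):

```python
import numpy as np

# Fine stripe pattern: alternating bright/dark columns, one photosite wide.
img = np.tile([1.0, 0.0], 64)

# Without an OLPF: sampling every 2nd pixel lands on the bright columns
# only, so the stripes alias into a solid bright tone -- a false result.
no_olpf = img[::2]

# With an OLPF: a small blur (two-tap average) is applied *before*
# sampling, so the stripes resolve to their true average gray instead.
blurred = 0.5 * (img + np.roll(img, 1))
with_olpf = blurred[::2]

print(no_olpf[:4])    # [1. 1. 1. 1.] -- aliased, falsely solid white
print(with_olpf[:4])  # [0.5 0.5 0.5 0.5] -- detail lost, but faithful
```

The fine stripes are unresolvable either way; the difference is that the filtered version fails gracefully to a plain average instead of inventing a pattern that was never there.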
To see the difference in sharpness and detail, here is an example from REDuser member Les Dittert, who took a RED 4K camera, removed the OLPF, and cropped 1:1 for a side-by-side comparison. The most striking details lost to the OLPF are the leaves of the palm tree. (Non-OLPF on the left, OLPF on the right.)
While the left image is clearly sharper, removing the OLPF from the optical path of a sensor designed to have one shifts the focal plane and leaves the cinematographer far more likely to encounter aliasing. Some DSLRs on the market ship with no OLPF, but all major cinema cameras have one in the optical path. (Black-and-white variants omit the OLPF, since they have no Bayer filter array to cause false color.) Manufacturers such as RED now sell swappable OLPFs tuned for skin tones or low light shooting.
One example of a video camera lacking an OLPF is Blackmagic Design's original Cinema Camera, released in 2013. Many users put the camera into scenarios that exposed the compromise of gaining a sharper image by omitting the filter, and third-party OLPFs were later designed to try to remedy this.
As pixel density has continued to increase over the years, and demosaicing techniques have grown more advanced, the need for OLPFs is diminishing.
Special thanks to reddit user drunk_caterpillar for this article, which demonstrates the importance of the algorithms used in processing the RAW image from the sensor and how they can reduce aliasing. Some of it is above my head, but you can see how the resulting quality differs between processing techniques.
I'm sure that in the coming years a new sensor design will come along and the Bayer filter will become obsolete altogether.