What Causes Lens Aberrations? An Insight on Types of Lens Aberrations and How to Minimize Them

Lens aberration occurs when the optical system (or lens) cannot focus all the incoming light rays to a single point on the sensor plane. But to understand why, it helps to look back at where it all began as far as photography is concerned!

In this blog, we’ll look at how and why lens aberration occurs, the common types of lens aberrations, and how to minimize them.

Pinhole Cameras – Where It All Began

The pinhole camera (Refer to Figure 1) is the most primitive type of camera. Let’s see how it works.

A pinhole camera is a dark box that admits no light except through an aperture (a small opening or hole) in front of it. The light passing through this aperture forms an image of the outside scene on the screen inside it (as shown in Figure 1).

Figure 1: Pinhole Camera and The Image Formed

Even though pinhole cameras could produce images of the outside world, they had fundamental flaws:

  1. Less light entering the system: The aperture through which light enters a pinhole camera is so small that very little light reaches the inside of the box. As a result, the image formed appears very dim.
  2. Long exposure times: The image takes a long time to form. For example, for an image like the one below (Refer to Figure 2) to form inside a pinhole camera, you would have to wait approximately 10s to 12s. This delay was a major drawback.

Figure 2: Image Formed in a Pinhole Camera

These defects resulted in the development of optical components such as lenses, which are now an essential part of today’s cameras.

The Role of Lenses in Photography

Lenses are refractive devices that gather a large bundle of light rays from each object point and bend them so that they converge again at a single point. This lets a lens collect far more light from a scene than a pinhole can, forming a brighter, better-focused image on the camera’s sensor.

As shown in Figure 3, lenses collect a large bundle of rays from each object point (points P and Q), converge these collected rays, and focus them onto the corresponding point on the imaging plane. This process helps to achieve a brighter image that is exposed more quickly on the photosensitive film.

Figure 3: Image Formation by a Pinhole Camera and Image Formation by a Lens Camera

Are Single Lens Systems Sufficient for Ideal Image Formation?

A single-lens system proves insufficient for ideal image formation. The three conditions for ideal image formation are:

  1. Point-to-point convergence: All light rays originating from a single point on the object should ideally converge to a single corresponding point on the image plane. This ensures a sharp and well-defined representation of the object.
  2. Planar object, planar image: If all object points lie in the same plane perpendicular to the optical axis (think of a flat picture), their corresponding image points should also form on a plane perpendicular to the axis. This ensures that the spatial relationships between objects are preserved in the image.
  3. Scaling and proportion: There should be a well-defined scaling relationship between the object and its image. This means that all linear dimensions (lengths, widths) of the object are magnified or reduced by a constant factor to form the image. This ensures that the image retains the overall proportions of the object, even if it’s larger or smaller.

Figure 4: Conditions for Ideal Image Formation

If the three conditions given above fail, the image formed will be distorted or imperfect.
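Condition (3) above can be made concrete with the ideal thin-lens model. The short sketch below uses illustrative values (a 50 mm focal length and a 2 m object distance, not figures from this article) to show that every point of a flat object at the same distance is scaled by one constant magnification:

```python
# Thin-lens sketch of condition (3): constant scaling between object and image.
# Assumes the ideal thin-lens equation 1/f = 1/d_o + 1/d_i and lateral
# magnification m = -d_i / d_o. All numbers are illustrative.

def image_distance(f_mm: float, object_distance_mm: float) -> float:
    """Solve the thin-lens equation for the image distance d_i."""
    return 1.0 / (1.0 / f_mm - 1.0 / object_distance_mm)

def magnification(f_mm: float, object_distance_mm: float) -> float:
    """Lateral magnification m = -d_i / d_o (negative: image is inverted)."""
    d_i = image_distance(f_mm, object_distance_mm)
    return -d_i / object_distance_mm

f = 50.0        # focal length in mm (illustrative)
d_o = 2000.0    # flat object 2 m away

m = magnification(f, d_o)
# Every point of the flat object at d_o is scaled by the same factor |m|,
# so a 100 mm feature maps to |m| * 100 mm on the sensor plane.
print(f"image distance: {image_distance(f, d_o):.2f} mm")
print(f"magnification:  {m:.4f}")
```

Because the factor m is the same for every object point in the plane, the image preserves the object’s proportions, which is exactly what condition (3) demands.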

To understand the above-given conditions better, let us look at an analogy.

Consider image formation with a rubber stamp. When the stamp is pressed on the same spot on paper more than once, it leads to smudging. The same thing happens when condition (1) above fails: if all the rays from a point on the object are not uniquely imaged onto their corresponding point on the imaging plane, the image smudges.

Lens aberrations are caused when the above conditions are not met properly, leading to poor image quality.

Types of Lens Aberrations in Optics

Lens aberrations can be broadly categorized into two kinds.

  1. Chromatic aberration: It occurs when light rays of different wavelengths (colors) are bent by different amounts as they pass through a lens. This is because the refractive index of the lens material varies with wavelength. This phenomenon creates a colored fringe around objects in the image, where different colors come to focus at slightly different points.
  2. Monochromatic aberration: It occurs when light rays passing through different parts of the lens focus at distinct points. Even if the light rays are of a single wavelength, the lens is incapable of focusing all the rays from an object point onto a single image point. It is caused by the geometry of the lens and is independent of the light’s wavelength.

Figure 5: Lens Aberrations

[Image Source]

Many changes can be made to the lens systems to reduce the above-mentioned aberrations. For example, a compound lens system (Refer to Figure 6) with multiple lenses serving different purposes can be used, the lenses’ aperture can be changed, etc.

Figure 6: Compound Lens System

[Image Source]

Let us look at each of the lens aberrations in detail.

1. Chromatic aberration

The refractive index of any medium varies with the wavelength of the light passing through it, so each wavelength bends by a slightly different amount. This wavelength-dependent bending (dispersion) is the primary cause of chromatic aberration.

Consider three wavelengths of light: red, green, and blue. Chromatic aberration happens when these three wavelengths of light from a single object point do not converge at a single point on the image plane (See Figure 7). In the figure below, you can see that the blue wavelength bends the most, green a little less, and red the least. This leads to colored smudges on the image, which is called chromatic aberration.

Figure 7: Chromatic Aberration
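The wavelength dependence can be sketched numerically. The snippet below assumes Cauchy’s empirical dispersion formula n(λ) = A + B/λ² with coefficients roughly in the range of a common crown glass (illustrative values, not measured data) and the thin-lens lensmaker’s equation:

```python
# Sketch of why chromatic aberration happens: the refractive index, and
# therefore the focal length, depends on wavelength. Coefficients and radii
# below are illustrative assumptions, not data from the article.

def refractive_index(wavelength_um: float, A: float = 1.5046, B: float = 0.00420) -> float:
    """Cauchy's empirical equation: index rises as wavelength falls."""
    return A + B / wavelength_um**2

def focal_length(n: float, R1_mm: float, R2_mm: float) -> float:
    """Thin-lens lensmaker's equation: 1/f = (n - 1)(1/R1 - 1/R2)."""
    return 1.0 / ((n - 1.0) * (1.0 / R1_mm - 1.0 / R2_mm))

# Symmetric biconvex lens: R1 = +100 mm, R2 = -100 mm
for name, lam in [("red", 0.656), ("green", 0.546), ("blue", 0.486)]:
    n = refractive_index(lam)
    print(f"{name:5s} n = {n:.4f}  f = {focal_length(n, 100.0, -100.0):.2f} mm")
```

Running this shows blue focusing a couple of millimeters closer to the lens than red, which is exactly the focal spread that appears as color fringing in the image.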

Chromatic aberration correction

Chromatic aberration can be corrected using an achromatic doublet (Refer to Figure 8). An achromatic doublet is a combination of a converging lens and a diverging lens with complementary dispersive properties, i.e., the dispersion introduced by one element is canceled out by the dispersion of the other. This helps reduce the aberration by reducing the overall dispersion of the object rays.

Figure 8: Achromatic Doublet

[Image Source]

However, one downside of using an achromatic doublet is that adding more lens elements can make the photography equipment heavy.
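The balancing act inside a doublet follows a standard first-order design rule (a textbook sketch, not e-con Systems’ design procedure): the two elements’ optical powers φ₁ and φ₂ must satisfy φ₁/V₁ + φ₂/V₂ = 0, where V is each glass’s Abbe number, while φ₁ + φ₂ gives the desired total power.

```python
# First-order thin-achromat design rule (textbook sketch, illustrative values):
#   phi1 + phi2 = total power      (focus where you want it)
#   phi1/V1 + phi2/V2 = 0          (chromatic focal shifts cancel)

def achromat_powers(total_power: float, V1: float, V2: float):
    """Split the total power between a crown (V1) and a flint (V2) element."""
    phi1 = total_power * V1 / (V1 - V2)
    phi2 = -total_power * V2 / (V1 - V2)
    return phi1, phi2

# A 100 mm doublet from a typical crown (V ~ 60) and flint (V ~ 36) glass
phi = 1.0 / 100.0  # total power in mm^-1
phi1, phi2 = achromat_powers(phi, 60.0, 36.0)
print(f"crown element: f1 = {1/phi1:.1f} mm (converging)")
print(f"flint element: f2 = {1/phi2:.1f} mm (diverging)")
```

Note that the solution always pairs a positive (converging) element with a negative (diverging) one, which is why the doublet in Figure 8 has that shape.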

Another, lighter-weight solution for chromatic aberration is the use of a diffractive element (See Figure 9). A diffractive element is a precisely engineered lens with concentric grooves cut into its surface to reduce aberrations. It helps reduce the dispersion of the light rays emerging from the converging lens.

Figure 9: Diffractive Element

[Image Source]

2. Monochromatic aberrations

There are five classical monochromatic aberrations: spherical aberration, coma, astigmatism, field curvature, and distortion. Let us look at the major ones and their correction techniques.

1. Spherical aberration

The bundle of rays emerging from an object point does not behave uniformly as it travels toward the optical system. Light rays entering the lens farther from the center (the periphery) bend more than those entering closer to the center, so the two groups converge at different distances along the axis. This uneven bending causes the light rays to come to focus at different points, resulting in a blurred or distorted image.

Figure 10: Spherical Aberration

[Image Source]

Light rays entering the lens farther from the center are called marginal rays, and the point at which they converge is called the marginal focus (Refer to Figure 11). Similarly, light rays entering close to the center are called paraxial rays, and the corresponding point is called the paraxial focus.

Even with spherical aberration, there is a point where the image appears sharpest, with the least spherical aberration. This point is called the circle of least confusion. It is the location where the overlapping out-of-focus light rays create the best possible image detail despite the aberration. The sensor can be placed at this point to reduce the effect of spherical aberration.

Figure 11: Marginal Focus, Paraxial Focus, and the Circle of Least Confusion
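The marginal/paraxial split can be reproduced with a toy exact ray trace through a single spherical refracting surface. The radius and index below are illustrative assumptions (air into glass with n = 1.5), not a real lens from this article:

```python
# Toy exact ray trace at one spherical refracting surface (vertex at x = 0,
# center of curvature at x = R), showing that marginal rays cross the axis
# closer to the surface than paraxial rays. Illustrative values only.
import math

def axis_crossing(h_mm: float, R_mm: float = 50.0, n: float = 1.5) -> float:
    """Where a ray parallel to the axis at height h crosses the axis after
    refracting into the glass, measured from the surface vertex."""
    theta_i = math.asin(h_mm / R_mm)             # incidence angle at the surface
    theta_r = math.asin(math.sin(theta_i) / n)   # Snell's law into the glass
    x_hit = R_mm - math.sqrt(R_mm**2 - h_mm**2)  # sag: where the ray meets the surface
    # After refraction the ray tilts toward the axis by (theta_i - theta_r)
    return x_hit + h_mm / math.tan(theta_i - theta_r)

paraxial = axis_crossing(1.0)   # near-axis ray; approaches n*R/(n-1) = 150 mm
marginal = axis_crossing(20.0)  # ray entering far from the axis
print(f"paraxial focus ~ {paraxial:.1f} mm, marginal focus ~ {marginal:.1f} mm")
```

The near-axis ray lands essentially at the paraxial prediction nR/(n − 1) = 150 mm, while the ray at h = 20 mm crosses several millimeters earlier, which is the longitudinal spread that the circle of least confusion splits the difference over.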

2. Coma

Unlike spherical aberration, which affects all incoming light, coma specifically impacts off-axis point sources, like stars viewed away from the center of a telescope’s field of view.

Rays from an off-axis object pass through the lens at different heights. Rays near the optical axis and rays farther away from it converge at different distances (See Figure 12). Because different zones of the lens have different magnifications, this unequal bending smears a point source into a teardrop or “comet” shape, hence the name coma (See Figure 13).

Figure 12: Coma Formation

[Image Source]

Figure 13: Comatic Aberration

[Image Source]

The Abbe sine condition can be followed to reduce coma. This condition dictates the relationship between the entrance and exit angles of light rays passing through an optical system. When it is satisfied, every zone of the lens produces the same magnification, so off-axis points are imaged without the comet-shaped smear.
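In its standard textbook form (symbols here are the conventional ones, not from this article: n and n′ are the refractive indices, u and u′ the ray angles in object and image space, and m the lateral magnification), the Abbe sine condition reads:

```latex
n \sin u = m \, n' \sin u'
```

When this ratio holds for all ray heights through the aperture, the lens is free of coma to first order in the field.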

3. Astigmatism

Astigmatism is an off-axis phenomenon that occurs even in rotationally symmetric lenses. The bundle (cone) of rays coming from the object can be analyzed in two planes. The plane that contains the object point and the optical axis is called the meridional plane (Refer to Figure 14). The plane perpendicular to the meridional plane is called the sagittal plane. Even though the light rays in these two planes come from the same object point, they behave differently: the rays in the meridional plane focus at a point called the primary image, while the rays in the sagittal plane focus at the secondary image point.

As seen in the figure below, when the sensor is placed at the primary image, we get a line instead of a point image. Similarly, when the sensor is placed at the secondary image, we get a line perpendicular to the first one, again instead of a point image. So, for the best compromise, the sensor is placed neither at the primary image nor at the secondary image but at the circle of least confusion between them.

Figure 14: Astigmatism

[Image Source]

Figure 15: Image Formation in Primary Plane and Secondary Plane

Astigmatism can be overcome by using high-quality, precise lens manufacturing that reduces lens imperfections. Aspheric lenses (non-spherical) can also be used to counteract the uneven focusing caused by astigmatism. Corrective optics techniques are also deployed in complex optical systems to address astigmatism. Astigmatism can occur for an on-axis object as well when the lens is asymmetrical.

4. Distortion

Distortion is a lens aberration commonly seen in photography when wide-FOV lenses are used. It occurs because the lens’s magnification varies with the distance of an image point from the optical axis, so straight lines in the scene appear curved (barrel or pincushion distortion) in the image.

Distortion can be overcome by using aspheric lenses or using multiple lens systems for minimal distortion. Software correction can also be done during post-processing to reduce distortions in the image formed.

Figure 16: Distortion
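The software correction mentioned above can be sketched with the widely used radial distortion model x_d = x_u·(1 + k₁r² + k₂r⁴). This is a minimal illustration under assumed coefficients and a simple fixed-point inversion, not e-con Systems’ correction pipeline:

```python
# Minimal sketch of radial (barrel/pincushion) distortion and its software
# correction, in normalized image coordinates. Coefficients are illustrative.

def distort(x: float, y: float, k1: float, k2: float):
    """Apply the forward radial distortion model."""
    r2 = x * x + y * y
    s = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * s, y * s

def undistort(xd: float, yd: float, k1: float, k2: float, iters: int = 20):
    """Invert the model by fixed-point iteration: start at the distorted
    point and repeatedly divide out the current distortion factor."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        s = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / s, yd / s
    return x, y

# Barrel distortion (k1 < 0) pulls points toward the image center;
# the correction pushes them back out to their undistorted positions.
xd, yd = distort(0.5, 0.3, k1=-0.2, k2=0.01)
xu, yu = undistort(xd, yd, k1=-0.2, k2=0.01)
print(f"distorted: ({xd:.4f}, {yd:.4f}), recovered: ({xu:.4f}, {yu:.4f})")
```

Real correction pipelines (e.g., camera calibration toolkits) estimate the coefficients from images of a known pattern and then apply the same kind of inverse mapping to every pixel.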

e-con Systems’ Lens Customization Expertise Will Help You Choose the Right Camera for Your Embedded Vision Applications

e-con Systems is an industry pioneer with 20+ years of experience in designing, developing, and manufacturing OEM cameras.

At e-con Systems, we understand that lens requirements can vary widely depending on factors like the required field of view, working environment, and application scale. We recognize the importance of custom-designed lenses for every application and will guide you in choosing the right lens.

We also provide various other customization services, including camera enclosures, resolution, frame rate, and sensors of your choice, to ensure our cameras fit perfectly into your embedded vision applications.

Visit e-con Systems Camera Selector Page to explore our wide range of cameras.

You can also get our expert assistance and guidance. For queries and more information, email us at camerasolutions@e-consystems.com.
