
Image sensors are the heart of digital cameras, converting light into electrical signals. CCD and CMOS are the two main types, each with unique strengths. CCDs excel in image quality, while CMOS sensors offer faster processing and lower power consumption.

These sensors use photodiodes to capture light, organizing them into pixels for image formation. Key performance factors include quantum efficiency, dark current, signal-to-noise ratio, and dynamic range. Understanding these helps in choosing the right sensor for specific applications.

Image Sensor Types

Charge-Coupled Device (CCD) and Complementary Metal-Oxide-Semiconductor (CMOS) Sensors

  • Charge-Coupled Device (CCD) uses an array of light-sensitive capacitors to capture and store charge generated by incoming photons
  • CCD transfers the stored charge across the chip and reads it out at one corner of the array (see the readout sketch after this list)
  • Complementary Metal-Oxide-Semiconductor (CMOS) uses an array of photodiodes and transistors to capture and process light
  • CMOS incorporates amplifiers, noise-correction, and digitization circuits next to each pixel, enabling on-chip image processing
  • CCD sensors generally have lower noise and higher sensitivity compared to CMOS sensors
  • CMOS sensors consume less power, have lower manufacturing costs, and enable faster readout speeds than CCD sensors
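
Below is a minimal conceptual sketch (Python with NumPy, not from the source material) contrasting the two readout styles: the CCD model shifts rows of charge into a serial register and clocks every pixel out through one shared output node, while the CMOS model treats each pixel as directly addressable. The function names and array sizes are invented for illustration and ignore real device physics.

```python
import numpy as np

def ccd_readout(charge):
    """Conceptual CCD readout: rows of charge are shifted, one at a time,
    into a serial register and clocked out pixel by pixel through a single
    shared amplifier/ADC at one corner of the array."""
    charge = charge.copy()
    rows, cols = charge.shape
    samples = []
    for _ in range(rows):
        serial_register = charge[0, :].copy()   # top row enters the register
        charge = np.roll(charge, -1, axis=0)    # remaining rows shift up
        charge[-1, :] = 0
        for c in range(cols):                   # pixels leave one at a time
            samples.append(serial_register[c])
    return np.array(samples)

def cmos_readout(charge):
    """Conceptual CMOS readout: each pixel has its own amplifier and can be
    addressed directly, allowing parallel (and therefore faster) digitization."""
    return charge.flatten()

frame = np.random.poisson(lam=50, size=(3, 4))  # toy photo-charge (electrons)
print(np.array_equal(ccd_readout(frame), cmos_readout(frame)))  # True: same data,
                                                                # different delivery
```

The serial, single-node path is part of why CCD readout tends to be slower, while per-pixel circuitry lets CMOS sensors digitize many pixels in parallel.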

Photodiodes and Pixels

  • Photodiode is a semiconductor device that converts light into an electrical current
  • Photodiodes are the fundamental light-sensing elements in both CCD and CMOS image sensors
  • When a photon strikes a photodiode, it generates an electron-hole pair, which contributes to the electrical signal
  • Pixel (picture element) refers to the smallest addressable element in an image sensor
  • Each pixel typically contains a photodiode and associated circuitry for readout and processing
  • The number of pixels in an image sensor determines its spatial resolution, usually quoted in megapixels (see the calculation after this list)
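
To make the resolution bullet concrete, here is a tiny back-of-the-envelope calculation; the 6000 x 4000 sensor geometry is an assumed example, not a figure from the text.

```python
# Hypothetical sensor geometry (illustrative only).
width_px, height_px = 6000, 4000

total_pixels = width_px * height_px
megapixels = total_pixels / 1e6
print(f"{total_pixels} pixels = {megapixels:.0f} MP")   # 24000000 pixels = 24 MP

# Each pixel's photodiode converts absorbed photons into electron-hole pairs;
# the electrons accumulated during the exposure become that pixel's raw signal.
```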

Image Sensor Performance

Quantum Efficiency and Dark Current

  • Quantum efficiency is the ratio of the number of collected electrons to the number of incident photons on the sensor (a worked example follows this list)
  • Higher quantum efficiency indicates better light sensitivity and improved low-light performance
  • Quantum efficiency varies with wavelength, and image sensors are designed to optimize sensitivity for specific spectral ranges
  • Dark current refers to the small electrical current that flows through the image sensor even in the absence of light
  • Dark current arises from thermal generation of electron-hole pairs and is a source of noise in the image
  • Cooling the image sensor can reduce dark current and improve the signal-to-noise ratio
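
The sketch below puts illustrative numbers on the two ideas above: signal electrons scale with quantum efficiency times incident photons, while dark-current electrons accumulate with exposure time whether or not light is present. The QE, dark-current, and exposure values are assumptions chosen for the example.

```python
# Illustrative values only -- not from the source text.
incident_photons = 10_000      # photons striking one pixel during the exposure
quantum_efficiency = 0.60      # fraction of incident photons converted to electrons
dark_current = 5.0             # thermally generated electrons per second
exposure_time = 2.0            # seconds

signal_electrons = quantum_efficiency * incident_photons   # 6000 e- of real signal
dark_electrons = dark_current * exposure_time              # 10 e- of unwanted charge

# Cooling the sensor lowers dark_current, so long exposures pick up fewer
# spurious electrons and the signal-to-noise ratio improves.
print(signal_electrons, dark_electrons)
```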

Signal-to-Noise Ratio (SNR) and Dynamic Range

  • Signal-to-Noise Ratio (SNR) compares the level of the desired signal to the level of background noise in an image (a short calculation follows this list)
  • Higher SNR indicates better image quality, with clearer details and less visible noise
  • SNR can be improved by increasing the light exposure, using a lower ISO setting, or applying noise reduction techniques
  • Dynamic range is the ratio between the maximum and minimum measurable light intensities of an image sensor
  • Wide dynamic range allows an image sensor to capture details in both bright and dark areas of a scene without saturation or loss of information
  • High-end image sensors often employ techniques like multiple exposures or logarithmic response to extend the dynamic range
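
Both quantities are commonly expressed in decibels; the sketch below uses the standard ratio-in-dB form with assumed full-well and noise figures (the specific numbers are illustrative, not sensor data from the text).

```python
import math

# Assumed sensor characteristics for illustration.
signal_electrons = 6000.0    # mean signal collected in one pixel
noise_electrons = 80.0       # total noise (shot + dark + read), same units
full_well = 30_000.0         # maximum charge a pixel can hold before saturating
noise_floor = 3.0            # smallest distinguishable signal, in electrons

snr_db = 20 * math.log10(signal_electrons / noise_electrons)   # ~37.5 dB
dynamic_range_db = 20 * math.log10(full_well / noise_floor)    # ~80 dB
dynamic_range_stops = math.log2(full_well / noise_floor)       # ~13.3 stops

print(f"SNR: {snr_db:.1f} dB")
print(f"Dynamic range: {dynamic_range_db:.1f} dB ({dynamic_range_stops:.1f} stops)")
```

A wider gap between full-well capacity and the noise floor means more recoverable detail between the darkest and brightest parts of a scene.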

Shutter Mechanisms

Rolling Shutter and Global Shutter

  • Rolling shutter is a capture method where the image sensor scans the scene row by row, with a slight time delay between each row (a timing sketch follows this list)
  • Rolling shutter can cause image distortion for fast-moving objects or when the camera is in motion (skew, wobble, or partial exposure)
  • Global shutter captures the entire image frame at the same time, eliminating the distortion issues associated with rolling shutter
  • Global shutter is preferred for applications involving fast motion or precise timing, such as machine vision or scientific imaging
  • CMOS sensors can be designed with either rolling or global shutter, while CCD sensors typically use global shutter
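
A toy simulation of the skew artifact (Python/NumPy, with made-up line-time and object-speed values): because each row of a rolling-shutter frame is exposed slightly later, a vertical edge moving sideways lands in a different column on every row, whereas a global shutter records it in one place.

```python
import numpy as np

rows, cols = 8, 16
line_time = 0.001        # assumed delay between consecutive row exposures (s)
edge_speed = 500.0       # assumed horizontal speed of a vertical edge (pixels/s)
start_col = 2

rolling = np.zeros((rows, cols), dtype=int)
global_ = np.zeros((rows, cols), dtype=int)

for r in range(rows):
    # Rolling shutter: row r is exposed r * line_time later, so the moving
    # edge has drifted to a new column by then -> the edge appears slanted.
    drifted_col = int(start_col + edge_speed * r * line_time)
    rolling[r, min(drifted_col, cols - 1)] = 1
    # Global shutter: every row is exposed at the same instant.
    global_[r, start_col] = 1

print(rolling)   # slanted edge (skew)
print(global_)   # straight edge
```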

Color Capture

Color Filter Arrays and Bayer Pattern

  • Color filter array (CFA) is a mosaic of tiny color filters placed over the image sensor to capture color information
  • Each pixel in the image sensor is covered by a single color filter, typically red, green, or blue
  • The Bayer pattern is the most common CFA arrangement, consisting of a repeating 2x2 grid of red, green, and blue filters (see the mosaic sketch after this list)
  • In the Bayer pattern, there are twice as many green filters as red or blue, mimicking the human eye's higher sensitivity to green light
  • Demosaicing algorithms interpolate the missing color values at each pixel location based on the surrounding pixels to reconstruct a full-color image
  • Alternative CFA patterns (X-Trans, RGBW) and multi-layer sensors have been developed to improve color accuracy and low-light performance
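
As a concrete illustration of the Bayer layout, the sketch below builds an RGGB filter mask and applies it to a toy RGB image, producing the one-channel-per-pixel raw data that demosaicing must later interpolate back into full color. The 4x4 size and RGGB ordering are assumptions; real cameras also use GRBG, BGGR, and other variants.

```python
import numpy as np

def bayer_mask(rows, cols):
    """Per-pixel filter index for an RGGB Bayer pattern: 0=R, 1=G, 2=B."""
    mask = np.ones((rows, cols), dtype=int)   # default everything to green
    mask[0::2, 0::2] = 0                      # red at even rows, even columns
    mask[1::2, 1::2] = 2                      # blue at odd rows, odd columns
    return mask                               # the remaining sites stay green

mask = bayer_mask(4, 4)
print(mask)   # twice as many 1s (green) as 0s (red) or 2s (blue)

# Apply the mosaic to a toy RGB image: each pixel keeps only the channel its
# filter passes, which is exactly what the sensor records before demosaicing
# interpolates the two missing channels from neighboring pixels.
rgb = np.random.randint(0, 256, size=(4, 4, 3))
raw = np.take_along_axis(rgb, mask[..., None], axis=2)[..., 0]
print(raw)
```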