Android Camera HAL 3.0 Properties

colorCorrection
controls
Property Name Type Description Units Range Tags
android.colorCorrection.mode byte [public]
Details

When android.control.awbMode is not OFF, TRANSFORM_MATRIX should be ignored.

android.colorCorrection.transform rational x 3 x 3 [public]
3x3 rational matrix in row-major order

A color transform matrix to use to transform from sensor RGB color space to output linear sRGB color space

Output values are expected to be in the range (0,1)

Details

This matrix is either set by HAL when the request android.colorCorrection.mode is not TRANSFORM_MATRIX, or directly by the application in the request when the android.colorCorrection.mode is TRANSFORM_MATRIX.

In the latter case, the HAL may round the matrix to account for precision issues; the final rounded matrix should be reported back in this matrix result metadata.

android.colorCorrection.gains float x 4 [public]
A 1D array of floats for 4 color channel gains

Gains applying to Bayer color channels for white-balance

Details

The 4-channel white-balance gains are defined in the order of [R G_even G_odd B], where G_even is the gain for green pixels on even rows of the output, and G_odd is the gain for green pixels on the odd rows. If a HAL does not support a separate gain for even/odd green channels, it should use the G_even value and write G_odd equal to G_even in the output result metadata.

This array is either set by HAL when the request android.colorCorrection.mode is not TRANSFORM_MATRIX, or directly by the application in the request when the android.colorCorrection.mode is TRANSFORM_MATRIX.

The output should be the gains actually applied by the HAL to the current frame.
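The interaction between the gains and the transform can be illustrated with a minimal sketch. This is not HAL code; it assumes a simplified single-G pixel model (so the two green gains are averaged) and a hypothetical `apply_color_correction` helper:

```python
def apply_color_correction(rgb, gains, transform):
    """Illustrative sketch (not HAL code): apply the 4-channel
    white-balance gains (android.colorCorrection.gains) and the 3x3
    row-major color transform (android.colorCorrection.transform) to
    one sensor RGB pixel.

    rgb:       (R, G, B) sensor values in [0, 1]; a single G channel is
               used for simplicity, so G_even and G_odd are averaged.
    gains:     [R, G_even, G_odd, B] white-balance gains.
    transform: 3x3 nested list, row-major, sensor RGB -> linear sRGB.
    """
    r, g, b = rgb
    g_gain = (gains[1] + gains[2]) / 2.0   # collapse G_even/G_odd
    balanced = [r * gains[0], g * g_gain, b * gains[3]]
    # Row-major matrix multiply: out[i] = sum_j transform[i][j] * balanced[j]
    out = [sum(row[j] * balanced[j] for j in range(3)) for row in transform]
    # Output values are expected to fall in (0, 1); clamp for safety.
    return [min(max(v, 0.0), 1.0) for v in out]

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
# A HAL without a separate odd-row green gain reports G_odd == G_even.
print(apply_color_correction((0.2, 0.4, 0.3), [2.0, 1.0, 1.0, 1.5], identity))
```

With the identity transform, only the per-channel gains take effect, which matches the OFF-mode manual white-balance path described above.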

dynamic
Property Name Type Description Units Range Tags
android.colorCorrection.transform rational x 3 x 3 [public]
3x3 rational matrix in row-major order

A color transform matrix to use to transform from sensor RGB color space to output linear sRGB color space

Output values are expected to be in the range (0,1)

Details

This matrix is either set by HAL when the request android.colorCorrection.mode is not TRANSFORM_MATRIX, or directly by the application in the request when the android.colorCorrection.mode is TRANSFORM_MATRIX.

In the latter case, the HAL may round the matrix to account for precision issues; the final rounded matrix should be reported back in this matrix result metadata.

android.colorCorrection.gains float x 4 [public]
A 1D array of floats for 4 color channel gains

Gains applying to Bayer color channels for white-balance

Details

The 4-channel white-balance gains are defined in the order of [R G_even G_odd B], where G_even is the gain for green pixels on even rows of the output, and G_odd is the gain for green pixels on the odd rows. If a HAL does not support a separate gain for even/odd green channels, it should use the G_even value and write G_odd equal to G_even in the output result metadata.

This array is either set by HAL when the request android.colorCorrection.mode is not TRANSFORM_MATRIX, or directly by the application in the request when the android.colorCorrection.mode is TRANSFORM_MATRIX.

The output should be the gains actually applied by the HAL to the current frame.

control
controls
Property Name Type Description Units Range Tags
android.control.aeAntibandingMode byte [public]
  • OFF

    The camera device will not adjust exposure duration to avoid banding problems.

  • 50HZ

    The camera device will adjust exposure duration to avoid banding problems with 50Hz illumination sources.

  • 60HZ

    The camera device will adjust exposure duration to avoid banding problems with 60Hz illumination sources.

  • AUTO

    The camera device will automatically adapt its antibanding routine to the current illumination conditions. This is the default.

The desired setting for the camera device's auto-exposure algorithm's antibanding compensation.

android.control.aeAvailableAntibandingModes

Details

Some kinds of lighting fixtures, such as some fluorescent lights, flicker at the rate of the power supply frequency (60Hz or 50Hz, depending on country). While this is typically not noticeable to a person, it can be visible to a camera device. If a camera sets its exposure time to the wrong value, the flicker may become visible in the viewfinder as flicker or in a final captured image, as a set of variable-brightness bands across the image.

Therefore, the auto-exposure routines of camera devices include antibanding routines that ensure that the chosen exposure value will not cause such banding. The choice of exposure time depends on the rate of flicker, which the camera device can detect automatically, or the expected rate can be selected by the application using this control.

A given camera device may not support all of the possible options for the antibanding mode. The android.control.aeAvailableAntibandingModes key contains the available modes for a given camera device.

The default mode is AUTO, which must be supported by all camera devices.

If manual exposure control is enabled (by setting android.control.aeMode or android.control.mode to OFF), then this setting has no effect, and the application must ensure it selects exposure times that do not cause banding issues. The android.statistics.sceneFlicker key can assist the application in this.

HAL Implementation Details

For all capture request templates, this field must be set to AUTO. AUTO is the only mode that must be supported; OFF, 50HZ, and 60HZ are all optional.

If manual exposure control is enabled (by setting android.control.aeMode or android.control.mode to OFF), then the exposure values provided by the application must not be adjusted for antibanding.
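One common antibanding strategy, sketched below, is to quantize the exposure time to a whole number of flicker periods; the spec does not mandate this particular approach, and `antiband_exposure_ns` is a hypothetical helper. AC-driven lighting flickers at twice the mains frequency, so the flicker period is 1/(2 x mains frequency): 10 ms at 50 Hz, about 8.33 ms at 60 Hz.

```python
def antiband_exposure_ns(requested_ns, mains_hz):
    """Illustrative sketch of one antibanding strategy (not mandated by
    the HAL spec): snap the exposure time down to a whole number of
    flicker periods so every row integrates the same amount of light.
    """
    # Lights flicker at twice the mains frequency.
    flicker_period_ns = int(1e9 / (2 * mains_hz))  # 10 ms @ 50 Hz
    if requested_ns < flicker_period_ns:
        # Exposures shorter than one flicker period cannot be made
        # banding-free by quantization alone; return unchanged.
        return requested_ns
    return (requested_ns // flicker_period_ns) * flicker_period_ns

print(antiband_exposure_ns(33_000_000, 50))  # 33 ms -> 30 ms (3 periods)
```

This also shows why the HAL must not adjust application-provided exposure values when manual exposure control is enabled: the quantization visibly changes the requested exposure.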

android.control.aeExposureCompensation int32 [public]

Adjustment to AE target image brightness

count of positive/negative EV steps
Details

For example, if the EV step is 0.333, '6' will mean an exposure compensation of +2 EV; '-3' will mean an exposure compensation of -1 EV.
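The arithmetic is simply steps multiplied by the step size (android.control.aeCompensationStep). A minimal sketch, using an exact 1/3 EV step in place of the doc's rounded 0.333:

```python
from fractions import Fraction

def exposure_compensation_ev(steps, step_size):
    """Convert android.control.aeExposureCompensation (a signed count
    of EV steps) into an EV value using the device's step size."""
    return steps * step_size

step = Fraction(1, 3)  # a typical step size (the rational type matches
                       # android.control.aeCompensationStep being rational)
print(exposure_compensation_ev(6, step))    # +2 EV
print(exposure_compensation_ev(-3, step))   # -1 EV
```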

android.control.aeLock byte [public as boolean]
  • OFF

    Autoexposure lock is disabled; the AE algorithm is free to update its parameters.

  • ON

    Autoexposure lock is enabled; the AE algorithm must not update the exposure and sensitivity parameters while the lock is active

Whether AE is currently locked to its latest calculated values

Details

Note that even when AE is locked, the flash may be fired if the AE mode is ON_AUTO_FLASH / ON_ALWAYS_FLASH / ON_AUTO_FLASH_REDEYE.

android.control.aeMode byte [public]

The desired mode for the camera device's auto-exposure routine.

android.control.aeAvailableModes

Details

This control is only effective if android.control.mode is AUTO.

When set to any of the ON modes, the camera device's auto-exposure routine is enabled, overriding the application's selected exposure time, sensor sensitivity, and frame duration (android.sensor.exposureTime, android.sensor.sensitivity, and android.sensor.frameDuration). If one of the FLASH modes is selected, the camera device's flash unit controls are also overridden.

The FLASH modes are only available if the camera device has a flash unit (android.flash.info.available is true).

If flash TORCH mode is desired, this field must be set to ON or OFF, and android.flash.mode set to TORCH.

When set to any of the ON modes, the values chosen by the camera device auto-exposure routine for the overridden fields for a given capture will be available in its CaptureResult.

android.control.aeRegions int32 x 5 x area_count [public]

List of areas to use for metering

Details

Each area is a rectangle plus weight: xmin, ymin, xmax, ymax, weight. The rectangle is defined inclusive of the specified coordinates.

The coordinate system is based on the active pixel array, with (0,0) being the top-left pixel in the active pixel array, and (android.sensor.info.activeArraySize.width - 1, android.sensor.info.activeArraySize.height - 1) being the bottom-right pixel in the active pixel array. The weight should be nonnegative.

If all regions have 0 weight, then no specific metering area needs to be used by the HAL. If the metering region is outside the current android.scaler.cropRegion, the HAL should ignore the sections outside the region and output the used sections in the frame metadata.
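The clipping behavior described above can be sketched as a simple rectangle intersection; `clip_metering_region` is a hypothetical helper, not a HAL entry point:

```python
def clip_metering_region(region, crop):
    """Illustrative sketch of the clipping the HAL is expected to do
    when a metering region extends outside android.scaler.cropRegion.

    region: (xmin, ymin, xmax, ymax, weight), inclusive coordinates in
            the active pixel array.
    crop:   (xmin, ymin, xmax, ymax) crop rectangle.
    Returns the used section (to be reported in the frame metadata),
    or None if the region lies entirely outside the crop.
    """
    x0, y0, x1, y1, w = region
    cx0, cy0, cx1, cy1 = crop
    nx0, ny0 = max(x0, cx0), max(y0, cy0)
    nx1, ny1 = min(x1, cx1), min(y1, cy1)
    if nx0 > nx1 or ny0 > ny1:
        return None  # nothing of the region is usable
    return (nx0, ny0, nx1, ny1, w)

# A region hanging off the right edge of a 2000x1500 crop:
print(clip_metering_region((1500, 100, 2500, 600, 1), (0, 0, 1999, 1499)))
# -> (1500, 100, 1999, 600, 1)
```

The same clipping applies to android.control.afRegions and android.control.awbRegions, which use the identical rectangle-plus-weight layout.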

android.control.aeTargetFpsRange int32 x 2 [public]

Range over which fps can be adjusted to maintain exposure

android.control.aeAvailableTargetFpsRanges

Details

Only constrains AE algorithm, not manual control of android.sensor.exposureTime

android.control.aePrecaptureTrigger byte [public]
  • IDLE

    The trigger is idle.

  • START

    The precapture metering sequence will be started by the camera device. The exact effect of the precapture trigger depends on the current AE mode and state.

Whether the camera device will trigger a precapture metering sequence when it processes this request.

Details

This entry is normally set to IDLE, or is not included at all in the request settings. When included and set to START, the camera device will trigger the autoexposure precapture metering sequence.

The effect of AE precapture trigger depends on the current AE mode and state; see android.control.aeState for AE precapture state transition details.

android.control.afMode byte [public]
  • OFF

    The auto-focus routine does not control the lens; android.lens.focusDistance is controlled by the application

  • AUTO

    If lens is not fixed focus.

    Use android.lens.info.minimumFocusDistance to determine if lens is fixed-focus. In this mode, the lens does not move unless the autofocus trigger action is called. When that trigger is activated, AF must transition to ACTIVE_SCAN, then to the outcome of the scan (FOCUSED or NOT_FOCUSED).

    Triggering AF_CANCEL resets the lens position to default, and sets the AF state to INACTIVE.

  • MACRO

    In this mode, the lens does not move unless the autofocus trigger action is called.

    When that trigger is activated, AF must transition to ACTIVE_SCAN, then to the outcome of the scan (FOCUSED or NOT_FOCUSED). Triggering cancel AF resets the lens position to default, and sets the AF state to INACTIVE.

  • CONTINUOUS_VIDEO

    In this mode, the AF algorithm modifies the lens position continually to attempt to provide a constantly-in-focus image stream.

    The focusing behavior should be suitable for good quality video recording; typically this means slower focus movement and no overshoots. When the AF trigger is not involved, the AF algorithm should start in INACTIVE state, and then transition into PASSIVE_SCAN and PASSIVE_FOCUSED states as appropriate. When the AF trigger is activated, the algorithm should immediately transition into AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the lens position until a cancel AF trigger is received.

    Once cancel is received, the algorithm should transition back to INACTIVE and resume passive scan. Note that this behavior is not identical to CONTINUOUS_PICTURE, since an ongoing PASSIVE_SCAN must immediately be canceled.

  • CONTINUOUS_PICTURE

    In this mode, the AF algorithm modifies the lens position continually to attempt to provide a constantly-in-focus image stream.

    The focusing behavior should be suitable for still image capture; typically this means focusing as fast as possible. When the AF trigger is not involved, the AF algorithm should start in INACTIVE state, and then transition into PASSIVE_SCAN and PASSIVE_FOCUSED states as appropriate as it attempts to maintain focus. When the AF trigger is activated, the algorithm should finish its PASSIVE_SCAN if active, and then transition into AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the lens position until a cancel AF trigger is received.

    When the AF cancel trigger is activated, the algorithm should transition back to INACTIVE and then act as if it has just been started.

  • EDOF

    Extended depth of field (digital focus). AF trigger is ignored, AF state should always be INACTIVE.

Whether AF is currently enabled, and what mode it is set to

android.control.afAvailableModes

Details

Only effective if android.control.mode = AUTO.

If the lens is controlled by the camera device auto-focus algorithm, the camera device will report the current AF status in android.control.afState in result metadata.

android.control.afRegions int32 x 5 x area_count [public]

List of areas to use for focus estimation

Details

Each area is a rectangle plus weight: xmin, ymin, xmax, ymax, weight. The rectangle is defined inclusive of the specified coordinates.

The coordinate system is based on the active pixel array, with (0,0) being the top-left pixel in the active pixel array, and (android.sensor.info.activeArraySize.width - 1, android.sensor.info.activeArraySize.height - 1) being the bottom-right pixel in the active pixel array. The weight should be nonnegative.

If all regions have 0 weight, then no specific focus area needs to be used by the HAL. If the focusing region is outside the current android.scaler.cropRegion, the HAL should ignore the sections outside the region and output the used sections in the frame metadata.

android.control.afTrigger byte [public]
  • IDLE

    The trigger is idle.

  • START

    Autofocus will trigger now.

  • CANCEL

    Autofocus will return to its initial state, and cancel any currently active trigger.

Whether the camera device will trigger autofocus for this request.

Details

This entry is normally set to IDLE, or is not included at all in the request settings.

When included and set to START, the camera device will trigger the autofocus algorithm. If autofocus is disabled, this trigger has no effect.

When set to CANCEL, the camera device will cancel any active trigger, and return to its initial AF state.

See android.control.afState for what that means for each AF mode.

android.control.awbLock byte [public as boolean]
  • OFF

    Auto-whitebalance lock is disabled; the AWB algorithm is free to update its parameters if in AUTO mode.

  • ON

    Auto-whitebalance lock is enabled; the AWB algorithm must not update its color balance values while the lock is active.

Whether AWB is currently locked to its latest calculated values

Details

Note that AWB lock is only meaningful for AUTO mode; in other modes, AWB is already fixed to a specific setting

android.control.awbMode byte [public]
  • OFF

    The camera device's auto white balance routine is disabled; the application-selected color transform matrix (android.colorCorrection.transform) and gains (android.colorCorrection.gains) are used by the camera device for manual white balance control.

  • AUTO

    The camera device's auto white balance routine is active; the application's values for android.colorCorrection.transform and android.colorCorrection.gains are ignored.

  • INCANDESCENT

    The camera device's auto white balance routine is disabled; the camera device uses incandescent light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant A.

  • FLUORESCENT

    The camera device's auto white balance routine is disabled; the camera device uses fluorescent light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant F2.

  • WARM_FLUORESCENT

    The camera device's auto white balance routine is disabled; the camera device uses warm fluorescent light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant F4.

  • DAYLIGHT

    The camera device's auto white balance routine is disabled; the camera device uses daylight light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant D65.

  • CLOUDY_DAYLIGHT

    The camera device's auto white balance routine is disabled; the camera device uses cloudy daylight light as the assumed scene illumination for white balance.

  • TWILIGHT

    The camera device's auto white balance routine is disabled; the camera device uses twilight light as the assumed scene illumination for white balance.

  • SHADE

    The camera device's auto white balance routine is disabled; the camera device uses shade light as the assumed scene illumination for white balance.

Whether AWB is currently setting the color transform fields, and what its illumination target is

android.control.awbAvailableModes

Details

This control is only effective if android.control.mode is AUTO.

When set to the ON mode, the camera device's auto white balance routine is enabled, overriding the application's selected android.colorCorrection.transform, android.colorCorrection.gains and android.colorCorrection.mode.

When set to the OFF mode, the camera device's auto white balance routine is disabled. The application manually controls the white balance through android.colorCorrection.transform, android.colorCorrection.gains and android.colorCorrection.mode.

When set to any other modes, the camera device's auto white balance routine is disabled. The camera device uses each particular illumination target for white balance adjustment.

android.control.awbRegions int32 x 5 x area_count [public]

List of areas to use for illuminant estimation

Details

Only used in AUTO mode.

Each area is a rectangle plus weight: xmin, ymin, xmax, ymax, weight. The rectangle is defined inclusive of the specified coordinates.

The coordinate system is based on the active pixel array, with (0,0) being the top-left pixel in the active pixel array, and (android.sensor.info.activeArraySize.width - 1, android.sensor.info.activeArraySize.height - 1) being the bottom-right pixel in the active pixel array. The weight should be nonnegative.

If all regions have 0 weight, then no specific metering area needs to be used by the HAL. If the metering region is outside the current android.scaler.cropRegion, the HAL should ignore the sections outside the region and output the used sections in the frame metadata.

android.control.captureIntent byte [public]
  • CUSTOM

    This request doesn't fall into the other categories. Default to preview-like behavior.

  • PREVIEW

    This request is for a preview-like usecase. The precapture trigger may be used to start off a metering w/flash sequence

  • STILL_CAPTURE

    This request is for a still capture-type usecase.

  • VIDEO_RECORD

    This request is for a video recording usecase.

  • VIDEO_SNAPSHOT

    This request is for a video snapshot (still image while recording video) usecase

  • ZERO_SHUTTER_LAG

    This request is for a ZSL usecase; the application will stream full-resolution images and reprocess one or several later for a final capture

Information to the camera device 3A (auto-exposure, auto-focus, auto-white balance) routines about the purpose of this capture, to help the camera device to decide optimal 3A strategy.

All must be supported

Details

This control is only effective if android.control.mode != OFF and any 3A routine is active.

android.control.effectMode byte [public]
  • OFF
  • MONO optional
  • NEGATIVE optional
  • SOLARIZE optional
  • SEPIA optional
  • POSTERIZE optional
  • WHITEBOARD optional
  • BLACKBOARD optional
  • AQUA optional

Whether any special color effect is in use. Only used if android.control.mode != OFF

android.control.availableEffects

android.control.mode byte [public]
  • OFF

    Full application control of pipeline. All 3A routines are disabled, no other settings in android.control.* have any effect

  • AUTO

    Use settings for each individual 3A routine. Manual control of capture parameters is disabled. All controls in android.control.* besides sceneMode take effect

  • USE_SCENE_MODE

    Use specific scene mode. Enabling this disables control.aeMode, control.awbMode and control.afMode controls; the HAL must ignore those settings while USE_SCENE_MODE is active (except for FACE_PRIORITY scene mode). Other control entries are still active. This setting can only be used if availableSceneModes != UNSUPPORTED

Overall mode of 3A control routines

All must be supported

Details

High-level 3A control. When set to OFF, all 3A control by the camera device is disabled. The application must set the fields for capture parameters itself.

When set to AUTO, the individual algorithm controls in android.control.* are in effect, such as android.control.afMode.

When set to USE_SCENE_MODE, the individual controls in android.control.* are mostly disabled, and the camera device implements one of the scene mode settings (such as ACTION, SUNSET, or PARTY) as it wishes. The camera device scene mode 3A settings are provided by android.control.sceneModeOverrides.

android.control.sceneMode byte [public]
  • UNSUPPORTED 0
  • FACE_PRIORITY

    If face detection support exists, use face detection data to drive 3A routines. If face detection statistics are disabled, this mode should still operate correctly (but will not return face detection statistics to the framework).

    Unlike the other scene modes, aeMode, awbMode, and afMode remain active when FACE_PRIORITY is set. This is due to compatibility concerns with the old camera API.

  • ACTION optional
  • PORTRAIT optional
  • LANDSCAPE optional
  • NIGHT optional
  • NIGHT_PORTRAIT optional
  • THEATRE optional
  • BEACH optional
  • SNOW optional
  • SUNSET optional
  • STEADYPHOTO optional
  • FIREWORKS optional
  • SPORTS optional
  • PARTY optional
  • CANDLELIGHT optional
  • BARCODE optional

Which scene mode is active when android.control.mode = USE_SCENE_MODE

android.control.availableSceneModes

android.control.videoStabilizationMode byte [public as boolean]
  • OFF
  • ON

Whether video stabilization is active

Details

If enabled, video stabilization can modify the android.scaler.cropRegion to keep the video stream stabilized

static
Property Name Type Description Units Range Tags
android.control.aeAvailableAntibandingModes byte x n [public]
list of enums

The set of auto-exposure antibanding modes that are supported by this camera device.

Details

Not all of the auto-exposure anti-banding modes may be supported by a given camera device. This field lists the valid anti-banding modes that the application may request for this camera device; they must include AUTO.

android.control.aeAvailableModes byte x n [public]
list of enums

The set of auto-exposure modes that are supported by this camera device.

Details

Not all the auto-exposure modes may be supported by a given camera device, especially if no flash unit is available. This entry lists the valid modes for android.control.aeMode for this camera device.

All camera devices support ON, and all camera devices with flash units support ON_AUTO_FLASH and ON_ALWAYS_FLASH.

Full-capability camera devices always support OFF mode, which enables application control of camera exposure time, sensitivity, and frame duration.

android.control.aeAvailableTargetFpsRanges int32 x 2 x n [public]
list of pairs of frame rates

List of frame rate ranges supported by the AE algorithm/hardware

android.control.aeCompensationRange int32 x 2 [public]

Maximum and minimum exposure compensation setting, in counts of android.control.aeCompensationStepSize

At least (-2,2)/(exp compensation step size)

android.control.aeCompensationStep rational [public]

Smallest step by which exposure compensation can be changed

<= 1/2

android.control.afAvailableModes byte x n [public]
List of enums

List of AF modes that can be selected with android.control.afMode.

Details

Not all the auto-focus modes may be supported by a given camera device. This entry lists the valid modes for android.control.afMode for this camera device.

All camera devices will support OFF mode, and all camera devices with adjustable focuser units (android.lens.info.minimumFocusDistance > 0) will support AUTO mode.

android.control.availableEffects byte x n [public]
list of enums

what subset of the full color effect enum list is supported

OFF must be listed

android.control.availableSceneModes byte x n [public]
list of enums from android.control.sceneMode, plus UNSUPPORTED to indicate no scene modes are supported

what subset of the scene mode enum list is supported.

SCENE_MODE_FACE_PRIORITY must be supported if face detection is supported

android.control.availableVideoStabilizationModes byte x n [public]
List of enums.

List of video stabilization modes that can be supported

OFF must be included

android.control.awbAvailableModes byte x n [public]
List of enums

The set of auto-white-balance modes (android.control.awbMode) that are supported by this camera device.

Details

Not all the auto-white-balance modes may be supported by a given camera device. This entry lists the valid modes for android.control.awbMode for this camera device.

All camera devices will support ON mode.

Full-capability camera devices will always support OFF mode, which enables application control of white balance, by using android.colorCorrection.transform and android.colorCorrection.gains (android.colorCorrection.mode must be set to TRANSFORM_MATRIX).

android.control.maxRegions int32 [public]

For AE, AWB, and AF, how many individual regions can be listed for metering?

>= 1

android.control.sceneModeOverrides byte x 3 x length(availableSceneModes) [system]

List of AE, AWB, and AF modes to use for each available scene mode

For each listed scene mode, lists the aeMode, awbMode, and afMode that the HAL wants to use for that scene mode.

For each entry, the order is {aeMode, awbMode, afMode} in order of increasing index

Details

When a scene mode is enabled, the HAL is expected to override aeMode, awbMode, and afMode with its preferred settings for that scene mode.

To simplify communication with old camera API applications, the service wants this override list in the static metadata. The order of this list matches that of availableSceneModes, with 3 entries for each scene mode. The overrides listed for SCENE_MODE_FACE_PRIORITY are ignored, since for that mode, the application-set aeMode, awbMode, and afMode are used instead, like they are when android.control.mode is AUTO.

It is recommended that for FACE_PRIORITY, the overrides should be set to 0. As an example, if availableSceneModes is { FACE_PRIORITY, ACTION, NIGHT }, then the service expects this field to have 9 entries; for example { 0, 0, 0, ON_AUTO_FLASH, AUTO, CONTINUOUS_PICTURE, ON_AUTO_FLASH, INCANDESCENT, AUTO }.
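The flat layout of that array can be sketched as follows. The string mode names stand in for the numeric enum values, and `overrides_for` is a hypothetical helper showing how the service indexes the list:

```python
# Illustrative sketch of the android.control.sceneModeOverrides layout:
# 3 entries {aeMode, awbMode, afMode} per scene mode, in the same order
# as android.control.availableSceneModes. Strings stand in for enums.
available_scene_modes = ["FACE_PRIORITY", "ACTION", "NIGHT"]
scene_mode_overrides = [
    "0", "0", "0",                                  # FACE_PRIORITY: ignored
    "ON_AUTO_FLASH", "AUTO", "CONTINUOUS_PICTURE",  # ACTION
    "ON_AUTO_FLASH", "INCANDESCENT", "AUTO",        # NIGHT
]

def overrides_for(scene_mode):
    """Return the (aeMode, awbMode, afMode) triple for a scene mode."""
    i = available_scene_modes.index(scene_mode)
    return tuple(scene_mode_overrides[3 * i : 3 * i + 3])

print(overrides_for("NIGHT"))  # ('ON_AUTO_FLASH', 'INCANDESCENT', 'AUTO')
```

Note that the list length must always be exactly 3 x length(availableSceneModes), even though the FACE_PRIORITY triple is never used.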

dynamic
Property Name Type Description Units Range Tags
android.control.aePrecaptureId int32 [hidden]

The ID sent with the latest CAMERA2_TRIGGER_PRECAPTURE_METERING call

Details

Must be 0 if no CAMERA2_TRIGGER_PRECAPTURE_METERING trigger received yet by HAL. Always updated even if AE algorithm ignores the trigger

android.control.aeMode byte [public]

The desired mode for the camera device's auto-exposure routine.

android.control.aeAvailableModes

Details

This control is only effective if android.control.mode is AUTO.

When set to any of the ON modes, the camera device's auto-exposure routine is enabled, overriding the application's selected exposure time, sensor sensitivity, and frame duration (android.sensor.exposureTime, android.sensor.sensitivity, and android.sensor.frameDuration). If one of the FLASH modes is selected, the camera device's flash unit controls are also overridden.

The FLASH modes are only available if the camera device has a flash unit (android.flash.info.available is true).

If flash TORCH mode is desired, this field must be set to ON or OFF, and android.flash.mode set to TORCH.

When set to any of the ON modes, the values chosen by the camera device auto-exposure routine for the overridden fields for a given capture will be available in its CaptureResult.

android.control.aeRegions int32 x 5 x area_count [public]

List of areas to use for metering

Details

Each area is a rectangle plus weight: xmin, ymin, xmax, ymax, weight. The rectangle is defined inclusive of the specified coordinates.

The coordinate system is based on the active pixel array, with (0,0) being the top-left pixel in the active pixel array, and (android.sensor.info.activeArraySize.width - 1, android.sensor.info.activeArraySize.height - 1) being the bottom-right pixel in the active pixel array. The weight should be nonnegative.

If all regions have 0 weight, then no specific metering area needs to be used by the HAL. If the metering region is outside the current android.scaler.cropRegion, the HAL should ignore the sections outside the region and output the used sections in the frame metadata.

android.control.aeState byte [public]
  • INACTIVE

    AE is off or recently reset. When a camera device is opened, it starts in this state.

  • SEARCHING

    AE doesn't yet have a good set of control values for the current scene.

  • CONVERGED

    AE has a good set of control values for the current scene.

  • LOCKED

    AE has been locked.

  • FLASH_REQUIRED

    AE has a good set of control values, but flash needs to be fired for good quality still capture.

  • PRECAPTURE

    AE has been asked to do a precapture sequence (through the android.control.aePrecaptureTrigger START), and is currently executing it. Once PRECAPTURE completes, AE will transition to CONVERGED or FLASH_REQUIRED as appropriate.

Current state of AE algorithm

Details

Switching between or enabling AE modes (android.control.aeMode) always resets the AE state to INACTIVE. Similarly, switching between android.control.mode, or android.control.sceneMode if android.control.mode == USE_SCENE_MODE resets all the algorithm states to INACTIVE.

The camera device can do several state transitions between two results, if it is allowed by the state transition table. For example: INACTIVE may never actually be seen in a result.

The state in the result is the state for this image (in sync with this image): if AE state becomes CONVERGED, then the image data associated with this result should be good to use.

Below are state transition tables for different AE modes.

When android.control.aeMode is AE_MODE_OFF:

State | Transition Cause | New State | Notes
INACTIVE | | INACTIVE | Camera device auto exposure algorithm is disabled

When android.control.aeMode is AE_MODE_ON_*:

State | Transition Cause | New State | Notes
INACTIVE | Camera device initiates AE scan | SEARCHING | Values changing
INACTIVE | android.control.aeLock is ON | LOCKED | Values locked
SEARCHING | Camera device finishes AE scan | CONVERGED | Good values, not changing
SEARCHING | Camera device finishes AE scan | FLASH_REQUIRED | Converged but too dark w/o flash
SEARCHING | android.control.aeLock is ON | LOCKED | Values locked
CONVERGED | Camera device initiates AE scan | SEARCHING | Values changing
CONVERGED | android.control.aeLock is ON | LOCKED | Values locked
FLASH_REQUIRED | Camera device initiates AE scan | SEARCHING | Values changing
FLASH_REQUIRED | android.control.aeLock is ON | LOCKED | Values locked
LOCKED | android.control.aeLock is OFF | SEARCHING | Values not good after unlock
LOCKED | android.control.aeLock is OFF | CONVERGED | Values good after unlock
LOCKED | android.control.aeLock is OFF | FLASH_REQUIRED | Exposure good, but too dark
PRECAPTURE | Sequence done. android.control.aeLock is OFF | CONVERGED | Ready for high-quality capture
PRECAPTURE | Sequence done. android.control.aeLock is ON | LOCKED | Ready for high-quality capture
Any state | android.control.aePrecaptureTrigger is START | PRECAPTURE | Start AE precapture metering sequence
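The AE_MODE_ON_* transition table above can be encoded as a simple lookup, which also illustrates how several transitions may occur between two results. The cause labels here are informal stand-ins invented for this sketch, not spec identifiers:

```python
# Illustrative encoding of the AE_MODE_ON_* state transition table.
# Keys are (current_state, cause); values are the new state. The cause
# strings are informal labels for this sketch, not spec identifiers.
AE_TRANSITIONS = {
    ("INACTIVE", "scan_started"): "SEARCHING",
    ("INACTIVE", "lock_on"): "LOCKED",
    ("SEARCHING", "scan_done_good"): "CONVERGED",
    ("SEARCHING", "scan_done_needs_flash"): "FLASH_REQUIRED",
    ("SEARCHING", "lock_on"): "LOCKED",
    ("CONVERGED", "scan_started"): "SEARCHING",
    ("CONVERGED", "lock_on"): "LOCKED",
    ("FLASH_REQUIRED", "scan_started"): "SEARCHING",
    ("FLASH_REQUIRED", "lock_on"): "LOCKED",
    ("LOCKED", "unlock_not_converged"): "SEARCHING",
    ("LOCKED", "unlock_converged"): "CONVERGED",
    ("LOCKED", "unlock_needs_flash"): "FLASH_REQUIRED",
    ("PRECAPTURE", "sequence_done_lock_off"): "CONVERGED",
    ("PRECAPTURE", "sequence_done_lock_on"): "LOCKED",
}

def next_ae_state(state, cause):
    # The precapture trigger START moves AE to PRECAPTURE from any state.
    if cause == "precapture_trigger_start":
        return "PRECAPTURE"
    return AE_TRANSITIONS.get((state, cause), state)

# Several transitions can happen between two results, so a result may
# report CONVERGED without INACTIVE or SEARCHING ever being observed:
s = next_ae_state("INACTIVE", "scan_started")
print(next_ae_state(s, "scan_done_good"))  # CONVERGED
```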
android.control.afMode byte [public]
  • OFF

    The auto-focus routine does not control the lens; android.lens.focusDistance is controlled by the application

  • AUTO

    If lens is not fixed focus.

    Use android.lens.info.minimumFocusDistance to determine if lens is fixed-focus. In this mode, the lens does not move unless the autofocus trigger action is called. When that trigger is activated, AF must transition to ACTIVE_SCAN, then to the outcome of the scan (FOCUSED or NOT_FOCUSED).

    Triggering AF_CANCEL resets the lens position to default, and sets the AF state to INACTIVE.

  • MACRO

    In this mode, the lens does not move unless the autofocus trigger action is called.

    When that trigger is activated, AF must transition to ACTIVE_SCAN, then to the outcome of the scan (FOCUSED or NOT_FOCUSED). Triggering cancel AF resets the lens position to default, and sets the AF state to INACTIVE.

  • CONTINUOUS_VIDEO

    In this mode, the AF algorithm modifies the lens position continually to attempt to provide a constantly-in-focus image stream.

    The focusing behavior should be suitable for good quality video recording; typically this means slower focus movement and no overshoots. When the AF trigger is not involved, the AF algorithm should start in INACTIVE state, and then transition into PASSIVE_SCAN and PASSIVE_FOCUSED states as appropriate. When the AF trigger is activated, the algorithm should immediately transition into AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the lens position until a cancel AF trigger is received.

    Once cancel is received, the algorithm should transition back to INACTIVE and resume passive scan. Note that this behavior is not identical to CONTINUOUS_PICTURE, since an ongoing PASSIVE_SCAN must immediately be canceled.

  • CONTINUOUS_PICTURE

    In this mode, the AF algorithm modifies the lens position continually to attempt to provide a constantly-in-focus image stream.

    The focusing behavior should be suitable for still image capture; typically this means focusing as fast as possible. When the AF trigger is not involved, the AF algorithm should start in INACTIVE state, and then transition into PASSIVE_SCAN and PASSIVE_FOCUSED states as appropriate as it attempts to maintain focus. When the AF trigger is activated, the algorithm should finish its PASSIVE_SCAN if active, and then transition into AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the lens position until a cancel AF trigger is received.

    When the AF cancel trigger is activated, the algorithm should transition back to INACTIVE and then act as if it has just been started.

  • EDOF

    Extended depth of field (digital focus). AF trigger is ignored, AF state should always be INACTIVE.

Whether AF is currently enabled, and what mode it is set to

android.control.afAvailableModes

Details

Only effective if android.control.mode = AUTO.

If the lens is controlled by the camera device auto-focus algorithm, the camera device will report the current AF status in android.control.afState in result metadata.

android.control.afRegions int32 x 5 x area_count [public]

List of areas to use for focus estimation

Details

Each area is a rectangle plus weight: xmin, ymin, xmax, ymax, weight. The rectangle is defined inclusive of the specified coordinates.

The coordinate system is based on the active pixel array, with (0,0) being the top-left pixel in the active pixel array, and (android.sensor.info.activeArraySize.width - 1, android.sensor.info.activeArraySize.height - 1) being the bottom-right pixel in the active pixel array. The weight should be nonnegative.

If all regions have 0 weight, then no specific focus area needs to be used by the HAL. If the focusing region is outside the current android.scaler.cropRegion, the HAL should ignore the sections outside the region and output the used sections in the frame metadata.
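
The clamping behavior described above can be sketched as follows (illustrative only; the region and crop rectangle are plain tuples here, not real metadata types; coordinates are inclusive, per the definition above):

```python
def clamp_region(region, crop):
    """Intersect a (xmin, ymin, xmax, ymax, weight) metering region with the
    crop rectangle (xmin, ymin, xmax, ymax); return None if the region has
    zero weight or falls entirely outside the crop."""
    xmin, ymin, xmax, ymax, weight = region
    cx0, cy0, cx1, cy1 = crop
    x0, y0 = max(xmin, cx0), max(ymin, cy0)
    x1, y1 = min(xmax, cx1), min(ymax, cy1)
    if weight == 0 or x0 > x1 or y0 > y1:
        return None  # region unused for this frame
    return (x0, y0, x1, y1, weight)
```

The clamped rectangle (not the original request value) is what the HAL would report back in the frame metadata.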

android.control.afState byte [public]
  • INACTIVE

    AF off or has not yet tried to scan/been asked to scan. When a camera device is opened, it starts in this state.

  • PASSIVE_SCAN

    if CONTINUOUS_* modes are supported. AF is currently doing an AF scan initiated by a continuous autofocus mode

  • PASSIVE_FOCUSED

    if CONTINUOUS_* modes are supported. AF currently believes it is in focus, but may restart scanning at any time.

  • ACTIVE_SCAN

    if AUTO or MACRO modes are supported. AF is doing an AF scan because it was triggered by AF trigger

  • FOCUSED_LOCKED

    if any AF mode besides OFF is supported. AF believes it is focused correctly and is locked

  • NOT_FOCUSED_LOCKED

    if any AF mode besides OFF is supported. AF has failed to focus successfully and is locked

  • PASSIVE_UNFOCUSED

    if CONTINUOUS_* modes are supported. AF finished a passive scan without finding focus, and may restart scanning at any time.

Current state of AF algorithm

Details

Switching between or enabling AF modes (android.control.afMode) always resets the AF state to INACTIVE. Similarly, switching between android.control.mode, or android.control.sceneMode if android.control.mode == USE_SCENE_MODE resets all the algorithm states to INACTIVE.

The camera device can do several state transitions between two results, if it is allowed by the state transition table. For example: INACTIVE may never actually be seen in a result.

The state in the result is the state for this image (in sync with this image): if AF state becomes FOCUSED, then the image data associated with this result should be sharp.

Below are state transition tables for different AF modes.

When android.control.afMode is AF_MODE_OFF or AF_MODE_EDOF:

State Transition Cause New State Notes
INACTIVE INACTIVE Never changes

When android.control.afMode is AF_MODE_AUTO or AF_MODE_MACRO:

State Transition Cause New State Notes
INACTIVE AF_TRIGGER ACTIVE_SCAN Start AF sweep, Lens now moving
ACTIVE_SCAN AF sweep done FOCUSED_LOCKED Focused, Lens now locked
ACTIVE_SCAN AF sweep done NOT_FOCUSED_LOCKED Not focused, Lens now locked
ACTIVE_SCAN AF_CANCEL INACTIVE Cancel/reset AF, Lens now locked
FOCUSED_LOCKED AF_CANCEL INACTIVE Cancel/reset AF
FOCUSED_LOCKED AF_TRIGGER ACTIVE_SCAN Start new sweep, Lens now moving
NOT_FOCUSED_LOCKED AF_CANCEL INACTIVE Cancel/reset AF
NOT_FOCUSED_LOCKED AF_TRIGGER ACTIVE_SCAN Start new sweep, Lens now moving
Any state Mode change INACTIVE
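
The AF_MODE_AUTO / AF_MODE_MACRO table above can be sketched as a transition function (a simplified model, not HAL code; the "sweep_done_*" event names are invented for illustration):

```python
def af_auto_transition(state, event):
    """Simplified model of the AF_MODE_AUTO / AF_MODE_MACRO state table."""
    if event == "mode_change":  # any state: mode change resets AF
        return "INACTIVE"
    table = {
        ("INACTIVE", "AF_TRIGGER"): "ACTIVE_SCAN",
        ("ACTIVE_SCAN", "sweep_done_focused"): "FOCUSED_LOCKED",
        ("ACTIVE_SCAN", "sweep_done_not_focused"): "NOT_FOCUSED_LOCKED",
        ("ACTIVE_SCAN", "AF_CANCEL"): "INACTIVE",
        ("FOCUSED_LOCKED", "AF_CANCEL"): "INACTIVE",
        ("FOCUSED_LOCKED", "AF_TRIGGER"): "ACTIVE_SCAN",
        ("NOT_FOCUSED_LOCKED", "AF_CANCEL"): "INACTIVE",
        ("NOT_FOCUSED_LOCKED", "AF_TRIGGER"): "ACTIVE_SCAN",
    }
    return table.get((state, event), state)
```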

When android.control.afMode is AF_MODE_CONTINUOUS_VIDEO:

State Transition Cause New State Notes
INACTIVE Camera device initiates new scan PASSIVE_SCAN Start AF scan, Lens now moving
INACTIVE AF_TRIGGER NOT_FOCUSED_LOCKED AF state query, Lens now locked
PASSIVE_SCAN Camera device completes current scan PASSIVE_FOCUSED End AF scan, Lens now locked
PASSIVE_SCAN Camera device fails current scan PASSIVE_UNFOCUSED End AF scan, Lens now locked
PASSIVE_SCAN AF_TRIGGER FOCUSED_LOCKED Immediate trans. If focus is good, Lens now locked
PASSIVE_SCAN AF_TRIGGER NOT_FOCUSED_LOCKED Immediate trans. if focus is bad, Lens now locked
PASSIVE_SCAN AF_CANCEL INACTIVE Reset lens position, Lens now locked
PASSIVE_FOCUSED Camera device initiates new scan PASSIVE_SCAN Start AF scan, Lens now moving
PASSIVE_UNFOCUSED Camera device initiates new scan PASSIVE_SCAN Start AF scan, Lens now moving
PASSIVE_FOCUSED AF_TRIGGER FOCUSED_LOCKED Immediate trans. Lens now locked
PASSIVE_UNFOCUSED AF_TRIGGER NOT_FOCUSED_LOCKED Immediate trans. Lens now locked
FOCUSED_LOCKED AF_TRIGGER FOCUSED_LOCKED No effect
FOCUSED_LOCKED AF_CANCEL INACTIVE Restart AF scan
NOT_FOCUSED_LOCKED AF_TRIGGER NOT_FOCUSED_LOCKED No effect
NOT_FOCUSED_LOCKED AF_CANCEL INACTIVE Restart AF scan

When android.control.afMode is AF_MODE_CONTINUOUS_PICTURE:

State Transition Cause New State Notes
INACTIVE Camera device initiates new scan PASSIVE_SCAN Start AF scan, Lens now moving
INACTIVE AF_TRIGGER NOT_FOCUSED_LOCKED AF state query, Lens now locked
PASSIVE_SCAN Camera device completes current scan PASSIVE_FOCUSED End AF scan, Lens now locked
PASSIVE_SCAN Camera device fails current scan PASSIVE_UNFOCUSED End AF scan, Lens now locked
PASSIVE_SCAN AF_TRIGGER FOCUSED_LOCKED Eventual trans. once focus good, Lens now locked
PASSIVE_SCAN AF_TRIGGER NOT_FOCUSED_LOCKED Eventual trans. if cannot focus, Lens now locked
PASSIVE_SCAN AF_CANCEL INACTIVE Reset lens position, Lens now locked
PASSIVE_FOCUSED Camera device initiates new scan PASSIVE_SCAN Start AF scan, Lens now moving
PASSIVE_UNFOCUSED Camera device initiates new scan PASSIVE_SCAN Start AF scan, Lens now moving
PASSIVE_FOCUSED AF_TRIGGER FOCUSED_LOCKED Immediate trans. Lens now locked
PASSIVE_UNFOCUSED AF_TRIGGER NOT_FOCUSED_LOCKED Immediate trans. Lens now locked
FOCUSED_LOCKED AF_TRIGGER FOCUSED_LOCKED No effect
FOCUSED_LOCKED AF_CANCEL INACTIVE Restart AF scan
NOT_FOCUSED_LOCKED AF_TRIGGER NOT_FOCUSED_LOCKED No effect
NOT_FOCUSED_LOCKED AF_CANCEL INACTIVE Restart AF scan
android.control.afTriggerId int32 [hidden]

The ID sent with the latest CAMERA2_TRIGGER_AUTOFOCUS call

Details

Must be 0 if no CAMERA2_TRIGGER_AUTOFOCUS trigger has been received yet by the HAL. Always updated, even if the AF algorithm ignores the trigger.

android.control.awbMode byte [public]
  • OFF

    The camera device's auto white balance routine is disabled; the application-selected color transform matrix (android.colorCorrection.transform) and gains (android.colorCorrection.gains) are used by the camera device for manual white balance control.

  • AUTO

    The camera device's auto white balance routine is active; the application's values for android.colorCorrection.transform and android.colorCorrection.gains are ignored.

  • INCANDESCENT

    The camera device's auto white balance routine is disabled; the camera device uses incandescent light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant A.

  • FLUORESCENT

    The camera device's auto white balance routine is disabled; the camera device uses fluorescent light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant F2.

  • WARM_FLUORESCENT

    The camera device's auto white balance routine is disabled; the camera device uses warm fluorescent light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant F4.

  • DAYLIGHT

    The camera device's auto white balance routine is disabled; the camera device uses daylight light as the assumed scene illumination for white balance. While the exact white balance transforms are up to the camera device, they will approximately match the CIE standard illuminant D65.

  • CLOUDY_DAYLIGHT

    The camera device's auto white balance routine is disabled; the camera device uses cloudy daylight light as the assumed scene illumination for white balance.

  • TWILIGHT

    The camera device's auto white balance routine is disabled; the camera device uses twilight light as the assumed scene illumination for white balance.

  • SHADE

    The camera device's auto white balance routine is disabled; the camera device uses shade light as the assumed scene illumination for white balance.

Whether AWB is currently setting the color transform fields, and what its illumination target is

android.control.awbAvailableModes

Details

This control is only effective if android.control.mode is AUTO.

When set to the ON mode, the camera device's auto white balance routine is enabled, overriding the application's selected android.colorCorrection.transform, android.colorCorrection.gains and android.colorCorrection.mode.

When set to the OFF mode, the camera device's auto white balance routine is disabled. The application manually controls the white balance with android.colorCorrection.transform, android.colorCorrection.gains and android.colorCorrection.mode.

When set to any other mode, the camera device's auto white balance routine is disabled. The camera device uses each particular illumination target for white balance adjustment.

android.control.awbRegions int32 x 5 x area_count [public]

List of areas to use for illuminant estimation

Details

Only used in AUTO mode.

Each area is a rectangle plus weight: xmin, ymin, xmax, ymax, weight. The rectangle is defined inclusive of the specified coordinates.

The coordinate system is based on the active pixel array, with (0,0) being the top-left pixel in the active pixel array, and (android.sensor.info.activeArraySize.width - 1, android.sensor.info.activeArraySize.height - 1) being the bottom-right pixel in the active pixel array. The weight should be nonnegative.

If all regions have 0 weight, then no specific metering area needs to be used by the HAL. If the metering region is outside the current android.scaler.cropRegion, the HAL should ignore the sections outside the region and output the used sections in the frame metadata.

android.control.awbState byte [public]
  • INACTIVE

    AWB is not in auto mode. When a camera device is opened, it starts in this state.

  • SEARCHING

    AWB doesn't yet have a good set of control values for the current scene.

  • CONVERGED

    AWB has a good set of control values for the current scene.

  • LOCKED

    AWB has been locked.

Current state of AWB algorithm

Details

Switching between or enabling AWB modes (android.control.awbMode) always resets the AWB state to INACTIVE. Similarly, switching between android.control.mode, or android.control.sceneMode if android.control.mode == USE_SCENE_MODE resets all the algorithm states to INACTIVE.

The camera device can do several state transitions between two results, if it is allowed by the state transition table. So INACTIVE may never actually be seen in a result.

The state in the result is the state for this image (in sync with this image): if AWB state becomes CONVERGED, then the image data associated with this result should be good to use.

Below are state transition tables for different AWB modes.

When android.control.awbMode != AWB_MODE_AUTO:

State Transition Cause New State Notes
INACTIVE INACTIVE Camera device auto white balance algorithm is disabled

When android.control.awbMode is AWB_MODE_AUTO:

State Transition Cause New State Notes
INACTIVE Camera device initiates AWB scan SEARCHING Values changing
INACTIVE android.control.awbLock is ON LOCKED Values locked
SEARCHING Camera device finishes AWB scan CONVERGED Good values, not changing
SEARCHING android.control.awbLock is ON LOCKED Values locked
CONVERGED Camera device initiates AWB scan SEARCHING Values changing
CONVERGED android.control.awbLock is ON LOCKED Values locked
LOCKED android.control.awbLock is OFF SEARCHING Values not good after unlock
LOCKED android.control.awbLock is OFF CONVERGED Values good after unlock
android.control.mode byte [public]
  • OFF

    Full application control of pipeline. All 3A routines are disabled, no other settings in android.control.* have any effect

  • AUTO

    Use settings for each individual 3A routine. Manual control of capture parameters is disabled. All controls in android.control.* besides sceneMode take effect

  • USE_SCENE_MODE

    Use specific scene mode. Enabling this disables control.aeMode, control.awbMode and control.afMode controls; the HAL must ignore those settings while USE_SCENE_MODE is active (except for FACE_PRIORITY scene mode). Other control entries are still active. This setting can only be used if availableSceneModes != UNSUPPORTED

Overall mode of 3A control routines

all must be supported

Details

High-level 3A control. When set to OFF, all 3A control by the camera device is disabled. The application must set the fields for capture parameters itself.

When set to AUTO, the individual algorithm controls in android.control.* are in effect, such as android.control.afMode.

When set to USE_SCENE_MODE, the individual controls in android.control.* are mostly disabled, and the camera device implements one of the scene mode settings (such as ACTION, SUNSET, or PARTY) as it wishes. The camera device scene mode 3A settings are provided by android.control.sceneModeOverrides.
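
The interaction between android.control.mode and the per-routine 3A modes can be sketched as a resolution function (illustrative; string constants stand in for the real enum values, and "SCENE_DEFINED" is a hypothetical placeholder for whatever the scene mode overrides supply):

```python
def effective_3a(mode, scene_mode, ae_mode, af_mode, awb_mode):
    """Resolve which ae/af/awb modes the device actually runs,
    given android.control.mode (simplified model, not HAL code)."""
    if mode == "OFF":
        # All 3A routines are disabled; other android.control.* settings
        # have no effect.
        return ("OFF", "OFF", "OFF")
    if mode == "USE_SCENE_MODE" and scene_mode != "FACE_PRIORITY":
        # The scene mode supplies its own 3A settings
        # (android.control.sceneModeOverrides); the application's
        # ae/af/awb modes are ignored.
        return ("SCENE_DEFINED",) * 3
    # AUTO (or FACE_PRIORITY scene mode): individual controls take effect.
    return (ae_mode, af_mode, awb_mode)
```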

demosaic
controls
Property Name Type Description Units Range Tags
android.demosaic.mode byte [system]
  • FAST

    Minimal or no slowdown of frame rate compared to Bayer RAW output

  • HIGH_QUALITY

    High-quality may reduce output frame rate

Controls the quality of the demosaicing processing

edge
controls
Property Name Type Description Units Range Tags
android.edge.mode byte [public]
  • OFF

    No edge enhancement is applied

  • FAST

    Must not slow down frame rate relative to sensor output

  • HIGH_QUALITY

    Frame rate may be reduced by high quality

Operation mode for edge enhancement

Details

Edge/sharpness/detail enhancement. OFF means no enhancement will be applied by the HAL.

FAST/HIGH_QUALITY both mean camera device determined enhancement will be applied. HIGH_QUALITY mode indicates that the camera device will use the highest-quality enhancement algorithms, even if it slows down capture rate. FAST means the camera device will not slow down capture rate when applying edge enhancement.

android.edge.strength byte [system]

Control the amount of edge enhancement applied to the images

1-10; 10 is maximum sharpening
dynamic
Property Name Type Description Units Range Tags
android.edge.mode byte [public]
  • OFF

    No edge enhancement is applied

  • FAST

    Must not slow down frame rate relative to sensor output

  • HIGH_QUALITY

    Frame rate may be reduced by high quality

Operation mode for edge enhancement

Details

Edge/sharpness/detail enhancement. OFF means no enhancement will be applied by the HAL.

FAST/HIGH_QUALITY both mean camera device determined enhancement will be applied. HIGH_QUALITY mode indicates that the camera device will use the highest-quality enhancement algorithms, even if it slows down capture rate. FAST means the camera device will not slow down capture rate when applying edge enhancement.

flash
controls
Property Name Type Description Units Range Tags
android.flash.firingPower byte [system]

Power for flash firing/torch

10 is max power; 0 is no flash. Linear

0 - 10

Details

Power for snapshot may use a different scale than for torch mode. Only one entry for torch mode will be used

android.flash.firingTime int64 [system]

Firing time of flash relative to start of exposure

nanoseconds

0-(exposure time-flash duration)

Details

Clamped to (0, exposure time - flash duration).

android.flash.mode byte [public]

The desired mode for the camera device's flash control.

Details

This control is only effective when flash unit is available (android.flash.info.available != 0).

When this control is used, the android.control.aeMode must be set to ON or OFF. Otherwise, the camera device auto-exposure related flash control (ON_AUTO_FLASH, ON_ALWAYS_FLASH, or ON_AUTO_FLASH_REDEYE) will override this control.

When set to OFF, the camera device will not fire flash for this capture.

When set to SINGLE, the camera device will fire flash regardless of the camera device's auto-exposure routine's result. When used in still capture case, this control should be used along with AE precapture metering sequence (android.control.aePrecaptureTrigger), otherwise, the image may be incorrectly exposed.

When set to TORCH, the flash will be on continuously. This mode can be used for use cases such as preview, auto-focus assist, still capture, or video recording.

static
Property Name Type Description Units Range Tags
android.flash.info.available byte [public]

Whether this camera has a flash

boolean (0 = false, otherwise true)
Details

If no flash, none of the flash controls do anything. All other metadata should return 0

android.flash.info.chargeDuration int64 [system]

Time taken before flash can fire again

nanoseconds

0-1e9

Details

1 second too long/too short for recharge? Should this be power-dependent?

android.flash.colorTemperature byte [system]

The x,y whitepoint of the flash

pair of floats

0-1 for both

android.flash.maxEnergy byte [system]

Max energy output of the flash for a full power single flash

lumen-seconds

>= 0

dynamic
Property Name Type Description Units Range Tags
android.flash.firingPower byte [system]

Power for flash firing/torch

10 is max power; 0 is no flash. Linear

0 - 10

Details

Power for snapshot may use a different scale than for torch mode. Only one entry for torch mode will be used

android.flash.firingTime int64 [system]

Firing time of flash relative to start of exposure

nanoseconds

0-(exposure time-flash duration)

Details

Clamped to (0, exposure time - flash duration).

android.flash.mode byte [public]

The desired mode for the camera device's flash control.

Details

This control is only effective when flash unit is available (android.flash.info.available != 0).

When this control is used, the android.control.aeMode must be set to ON or OFF. Otherwise, the camera device auto-exposure related flash control (ON_AUTO_FLASH, ON_ALWAYS_FLASH, or ON_AUTO_FLASH_REDEYE) will override this control.

When set to OFF, the camera device will not fire flash for this capture.

When set to SINGLE, the camera device will fire flash regardless of the camera device's auto-exposure routine's result. When used in still capture case, this control should be used along with AE precapture metering sequence (android.control.aePrecaptureTrigger), otherwise, the image may be incorrectly exposed.

When set to TORCH, the flash will be on continuously. This mode can be used for use cases such as preview, auto-focus assist, still capture, or video recording.

android.flash.state byte [public]
  • UNAVAILABLE

    No flash on camera

  • CHARGING

    if android.flash.available is true Flash is charging and cannot be fired

  • READY

    if android.flash.available is true Flash is ready to fire

  • FIRED

    if android.flash.available is true Flash fired for this capture

Current state of the flash unit

geometric
controls
Property Name Type Description Units Range Tags
android.geometric.mode byte [system]
  • OFF

    No geometric correction is applied

  • FAST

    Must not slow down frame rate relative to raw bayer output

  • HIGH_QUALITY

    Frame rate may be reduced by high quality

Operating mode of geometric correction

android.geometric.strength byte [system]

Control the amount of shading correction applied to the images

unitless: 1-10; 10 is full shading compensation
hotPixel
controls
Property Name Type Description Units Range Tags
android.hotPixel.mode byte [system]
  • OFF

    No hot pixel correction can be applied

  • FAST

    Frame rate must not be reduced compared to raw Bayer output

  • HIGH_QUALITY

    Frame rate may be reduced by high quality

Set operational mode for hot pixel correction

static
Property Name Type Description Units Range Tags
android.hotPixel.info.map int32 x 2 x n [system]
list of coordinates based on android.sensor.pixelArraySize

Location of hot/defective pixels on sensor

dynamic
Property Name Type Description Units Range Tags
android.hotPixel.mode byte [system]
  • OFF

    No hot pixel correction can be applied

  • FAST

    Frame rate must not be reduced compared to raw Bayer output

  • HIGH_QUALITY

    Frame rate may be reduced by high quality

Set operational mode for hot pixel correction

jpeg
controls
Property Name Type Description Units Range Tags
android.jpeg.gpsCoordinates double x 3 [public]
latitude, longitude, altitude. First two in degrees, the third in meters

GPS coordinates to include in output JPEG EXIF

(-180, 180], [-90, 90], (-inf, inf)

android.jpeg.gpsProcessingMethod byte [public as string]

32 characters describing GPS algorithm to include in EXIF

UTF-8 null-terminated string
android.jpeg.gpsTimestamp int64 [public]

Time GPS fix was made to include in EXIF

UTC in seconds since January 1, 1970
android.jpeg.orientation int32 [public]

Orientation of JPEG image to write

Degrees in multiples of 90

0, 90, 180, 270

android.jpeg.quality byte [public]

Compression quality of the final JPEG image

1-100; larger is higher quality

Details

85-95 is typical usage range

android.jpeg.thumbnailQuality byte [public]

Compression quality of JPEG thumbnail

1-100; larger is higher quality

android.jpeg.thumbnailSize int32 x 2 [public as size]

Resolution of embedded JPEG thumbnail

Size must be one of the sizes from android.jpeg.availableThumbnailSizes

Details

When set to (0, 0), the JPEG EXIF will not contain a thumbnail, but the captured JPEG will still be a valid image.

When a JPEG image capture is issued, the selected thumbnail size should have the same aspect ratio as the JPEG image.
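
The aspect-ratio rule above can be sketched as a small selection helper (illustrative, not a framework API; integer cross-multiplication avoids floating-point aspect comparison):

```python
def pick_thumbnail(available, jpeg_size):
    """Pick the largest available thumbnail size whose aspect ratio matches
    the JPEG's. `available` excludes the (0, 0) no-thumbnail entry.
    Returns None if no size matches."""
    jw, jh = jpeg_size
    matches = [(w, h) for (w, h) in available if w * jh == h * jw]
    return max(matches, key=lambda s: s[0] * s[1]) if matches else None
```

Because android.jpeg.availableThumbnailSizes guarantees at least one size per JPEG aspect ratio, a real client should always find a match for sizes drawn from android.scaler.availableJpegSizes.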

static
Property Name Type Description Units Range Tags
android.jpeg.availableThumbnailSizes int32 x 2 x n [public as size]

Supported resolutions for the JPEG thumbnail

Will include at least one valid resolution, plus (0,0) for no thumbnail generation, and each size will be distinct.

Details

The following conditions will be satisfied for this size list:

  • The sizes will be sorted by increasing pixel area (width x height). If several resolutions have the same area, they will be sorted by increasing width.
  • The aspect ratio of the largest thumbnail size will be same as the aspect ratio of largest size in android.scaler.availableJpegSizes. The largest size is defined as the size that has the largest pixel area in a given size list.
  • Each size in android.scaler.availableJpegSizes will have at least one corresponding size that has the same aspect ratio in availableThumbnailSizes, and vice versa.
  • All non (0, 0) sizes will have non-zero widths and heights.
android.jpeg.maxSize int32 [system]

Maximum size in bytes for the compressed JPEG buffer

Must be large enough to fit any JPEG produced by the camera

Details

This is used for sizing the gralloc buffers for JPEG

dynamic
Property Name Type Description Units Range Tags
android.jpeg.gpsCoordinates double x 3 [public]
latitude, longitude, altitude. First two in degrees, the third in meters

GPS coordinates to include in output JPEG EXIF

(-180, 180], [-90, 90], (-inf, inf)

android.jpeg.gpsProcessingMethod byte [public as string]

32 characters describing GPS algorithm to include in EXIF

UTF-8 null-terminated string
android.jpeg.gpsTimestamp int64 [public]

Time GPS fix was made to include in EXIF

UTC in seconds since January 1, 1970
android.jpeg.orientation int32 [public]

Orientation of JPEG image to write

Degrees in multiples of 90

0, 90, 180, 270

android.jpeg.quality byte [public]

Compression quality of the final JPEG image

1-100; larger is higher quality

Details

85-95 is typical usage range

android.jpeg.size int32 [system]

The size of the compressed JPEG image, in bytes

>= 0

Details

If no JPEG output is produced for the request, this must be 0.

Otherwise, this describes the real size of the compressed JPEG image placed in the output stream. More specifically, if android.jpeg.maxSize = 1000000, and a specific capture has android.jpeg.size = 500000, then the output buffer from the JPEG stream will be 1000000 bytes, of which the first 500000 make up the real data.
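
A client consuming the JPEG stream would slice the valid bytes out of the fixed-size buffer using this result, sketched below (illustrative helper, not a framework API; the SOI-marker check is a sanity check, not a requirement stated here):

```python
def jpeg_payload(buf, jpeg_size):
    """Extract the real compressed image from a JPEG stream buffer.

    The buffer is android.jpeg.maxSize bytes; only the first
    android.jpeg.size bytes contain valid data.
    """
    if jpeg_size == 0:
        raise ValueError("no JPEG output was produced for this request")
    data = buf[:jpeg_size]
    if data[:2] != b"\xff\xd8":  # JPEG Start Of Image marker
        raise ValueError("buffer does not start with a JPEG SOI marker")
    return data
```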

android.jpeg.thumbnailQuality byte [public]

Compression quality of JPEG thumbnail

1-100; larger is higher quality

android.jpeg.thumbnailSize int32 x 2 [public as size]

Resolution of embedded JPEG thumbnail

Size must be one of the sizes from android.jpeg.availableThumbnailSizes

Details

When set to (0, 0), the JPEG EXIF will not contain a thumbnail, but the captured JPEG will still be a valid image.

When a JPEG image capture is issued, the selected thumbnail size should have the same aspect ratio as the JPEG image.

lens
controls
Property Name Type Description Units Range Tags
android.lens.aperture float [public]

The ratio of lens focal length to the effective aperture diameter.

f-number (f/NNN)

android.lens.info.availableApertures

Details

This will only be supported on the camera devices that have variable aperture lens. The aperture value can only be one of the values listed in android.lens.info.availableApertures.

When this is supported and android.control.aeMode is OFF, this can be set along with android.sensor.exposureTime, android.sensor.sensitivity, and android.sensor.frameDuration to achieve manual exposure control.

The requested aperture value may take several frames to reach the requested value; the camera device will report the current (intermediate) aperture size in capture result metadata while the aperture is changing.

When this is supported and android.control.aeMode is one of the ON modes, this will be overridden by the camera device auto-exposure algorithm, the overridden values are then provided back to the user in the corresponding result.

android.lens.filterDensity float [public]

State of lens neutral density filter(s).

Steps of Exposure Value (EV).

android.lens.info.availableFilterDensities

Details

This will not be supported on most camera devices. On devices where this is supported, this may only be set to one of the values included in android.lens.info.availableFilterDensities.

Lens filters are typically used to lower the amount of light the sensor is exposed to (measured in steps of EV). As used here, an EV step is the standard logarithmic representation, which are non-negative, and inversely proportional to the amount of light hitting the sensor. For example, setting this to 0 would result in no reduction of the incoming light, and setting this to 2 would mean that the filter is set to reduce incoming light by two stops (allowing 1/4 of the prior amount of light to the sensor).
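
The EV-step arithmetic above is a simple power of two; for example, the 2-EV case in the text passes 2^-2 = 1/4 of the light:

```python
def transmission(ev_steps):
    """Fraction of incoming light passed by an ND filter set to the given
    non-negative EV reduction (0 EV = no reduction)."""
    return 2.0 ** -ev_steps
```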

android.lens.focalLength float [public]

Lens optical zoom setting

focal length in mm

> 0

Details

Will not be supported on most devices.

android.lens.focusDistance float [public]

Distance to plane of sharpest focus, measured from frontmost surface of the lens

diopters (1/m)

>= 0

Details

0 = infinity focus. The value used should be clamped to (0, minimum focus distance).
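
Since the unit is diopters (1/m), 0 means infinity and android.lens.info.minimumFocusDistance is the largest valid value. A sketch of the clamp and the conversion back to meters (illustrative only):

```python
def clamp_focus_distance(requested_diopters, min_focus_distance_diopters):
    """Clamp a requested android.lens.focusDistance to the supported range:
    0 (infinity) up to the minimum focus distance expressed in diopters."""
    return min(max(requested_diopters, 0.0), min_focus_distance_diopters)

def diopters_to_meters(diopters):
    """Convert a focus distance in diopters to meters (0 -> infinity)."""
    return float("inf") if diopters == 0 else 1.0 / diopters
```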

android.lens.opticalStabilizationMode byte [public]
  • OFF
  • ON optional

Whether optical image stabilization is enabled.

android.lens.availableOpticalStabilization

Details

Will not be supported on most devices.

static
Property Name Type Description Units Range Tags
android.lens.info.availableApertures float x n [public]

List of supported aperture values.

one entry required, > 0

Details

If the camera device doesn't support variable apertures, the listed value will be the fixed aperture.

If the camera device supports variable apertures, the aperture value in this list will be sorted in ascending order.

android.lens.info.availableFilterDensities float x n [public]

List of supported neutral density filter values for android.lens.filterDensity.

At least one value is required. Values must be >= 0.

Details

If changing android.lens.filterDensity is not supported, availableFilterDensities must contain only 0. Otherwise, this list contains only the exact filter density values available on this camera device.

android.lens.info.availableFocalLengths float x n [public]
the list of available focal lengths

If fitted with optical zoom, what focal lengths are available. If not, the static focal length

> 0

Details

If optical zoom not supported, only one value should be reported

android.lens.info.availableOpticalStabilization byte x n [public]
list of enums

List of supported optical image stabilization modes

android.lens.info.geometricCorrectionMap float x 2 x 3 x n x m [system]
2D array of destination coordinate pairs for uniform grid points in source image, per color channel. Size in the range of 2x3x40x30

A low-resolution map for correction of geometric distortions and chromatic aberrations, per color channel

N, M >= 2

Details

[DNG wants a function instead]. What's easiest for implementers? With an array size (M, N), entry (i, j) provides the destination for pixel (i/(M-1) * width, j/(N-1) * height). Data is row-major, with each array entry being ( (X, Y)_r, (X, Y)_g, (X, Y)_b )
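
The grid indexing above maps each map entry back to a uniformly spaced source pixel; a sketch of that formula (illustrative only):

```python
def grid_source_point(i, j, m, n, width, height):
    """Source pixel whose per-channel destination coordinates are stored at
    entry (i, j) of an (M, N) geometric correction grid, per the formula
    above: (i/(M-1) * width, j/(N-1) * height)."""
    return (i / (m - 1) * width, j / (n - 1) * height)
```

Destinations for pixels between grid points would be found by interpolating the surrounding entries, which is why both dimensions must be >= 2.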

android.lens.info.geometricCorrectionMapSize int32 x 2 [system as size]
width and height of geometric correction map

Dimensions of geometric correction map

Both values >= 2

android.lens.info.hyperfocalDistance float [public]

Hyperfocal distance for this lens; set to 0 if fixed focus

diopters

>= 0

Details

The hyperfocal distance is used for the old API's 'fixed' setting

android.lens.info.minimumFocusDistance float [public]

Shortest distance from frontmost surface of the lens that can be focused correctly

diopters

>= 0

Details

If the lens is fixed-focus, this should be 0

android.lens.info.shadingMapSize int32 x 2 [public as size]
width and height of lens shading map provided by the HAL. (N x M)

Dimensions of lens shading map.

Both values >= 1

Details

The map should be on the order of 30-40 rows and columns, and must be smaller than 64x64.

android.lens.facing byte [public]
  • FRONT
  • BACK

Direction the camera faces relative to device screen

android.lens.opticalAxisAngle float x 2 [system]
degrees. First defines the angle of separation between the perpendicular to the screen and the camera optical axis. The second then defines the clockwise rotation of the optical axis from native device up.

Relative angle of camera optical axis to the perpendicular axis from the display

[0-90) for first angle, [0-360) for second

Details

Examples:

(0,0) means that the camera optical axis is perpendicular to the display surface;

(45,0) means that the camera points 45 degrees up when device is held upright;

(45,90) means the camera points 45 degrees to the right when the device is held upright.

Use FACING field to determine perpendicular outgoing direction

android.lens.position float x 3, location in mm, in the sensor coordinate system [system]

Coordinates of camera optical axis on device

dynamic
Property Name Type Description Units Range Tags
android.lens.aperture float [public]

The ratio of lens focal length to the effective aperture diameter.

f-number (f/NNN)

android.lens.info.availableApertures

Details

This will only be supported on the camera devices that have variable aperture lens. The aperture value can only be one of the values listed in android.lens.info.availableApertures.

When this is supported and android.control.aeMode is OFF, this can be set along with android.sensor.exposureTime, android.sensor.sensitivity, and android.sensor.frameDuration to achieve manual exposure control.

The requested aperture value may take several frames to reach the requested value; the camera device will report the current (intermediate) aperture size in capture result metadata while the aperture is changing.

When this is supported and android.control.aeMode is one of the ON modes, this will be overridden by the camera device's auto-exposure algorithm; the overridden values are then provided back to the user in the corresponding result.

android.lens.filterDensity float [public]

State of lens neutral density filter(s).

Steps of Exposure Value (EV).

android.lens.info.availableFilterDensities

Details

This will not be supported on most camera devices. On devices where this is supported, this may only be set to one of the values included in android.lens.info.availableFilterDensities.

Lens filters are typically used to lower the amount of light the sensor is exposed to (measured in steps of EV). As used here, an EV step is the standard logarithmic representation, which are non-negative, and inversely proportional to the amount of light hitting the sensor. For example, setting this to 0 would result in no reduction of the incoming light, and setting this to 2 would mean that the filter is set to reduce incoming light by two stops (allowing 1/4 of the prior amount of light to the sensor).
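The EV relationship described above can be sketched with a small helper (the name is hypothetical, not a HAL API): each EV step halves the light reaching the sensor, so the transmitted fraction is 2^-EV.

```cpp
#include <cassert>
#include <cmath>

// Illustrative helper, not a HAL API: fraction of incoming light that
// passes the ND filter for a given android.lens.filterDensity value.
// Each EV step halves the light, so density 2 passes 1/4 of it.
double transmittedFraction(double evSteps) {
    return std::pow(2.0, -evSteps);
}
```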

android.lens.focalLength float [public]

Lens optical zoom setting

focal length in mm

> 0

Details

Will not be supported on most devices.

android.lens.focusDistance float [public]

Distance to plane of sharpest focus, measured from frontmost surface of the lens

diopters (1/m)

>= 0

Details

Should be zero for fixed-focus cameras

android.lens.focusRange float x 2 [public]
Range of scene distances that are in focus

The range of scene distances that are in sharp focus (depth of field)

pair of focus distances in diopters: (near, far)

>=0

Details

If variable focus is not supported, a fixed depth of field range can still be reported
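Since focus distances here are expressed in diopters (1/m), converting to a metric distance is a reciprocal, with 0 diopters meaning optical infinity. A hypothetical conversion helper:

```cpp
#include <cassert>
#include <cmath>
#include <limits>

// Illustrative conversion, not a HAL API: a focus distance of d
// diopters corresponds to 1/d meters; 0 diopters is optical infinity.
double diopterToMeters(double diopters) {
    return diopters > 0.0 ? 1.0 / diopters
                          : std::numeric_limits<double>::infinity();
}
```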

android.lens.opticalStabilizationMode byte [public]
  • OFF
  • ON optional

Whether optical image stabilization is enabled.

android.lens.info.availableOpticalStabilization

Details

Will not be supported on most devices.

android.lens.state byte [public]
  • STATIONARY
  • MOVING

Current lens status

noiseReduction
controls
Property Name Type Description Units Range Tags
android.noiseReduction.mode byte [public]
  • OFF

    No noise reduction is applied

  • FAST

    Must not slow down frame rate relative to sensor output

  • HIGH_QUALITY

    May slow down frame rate to provide highest quality

Mode of operation for the noise reduction algorithm

android.noiseReduction.availableModes

Details

Noise filtering control. OFF means no noise reduction will be applied by the HAL.

FAST/HIGH_QUALITY both mean camera device determined noise filtering will be applied. HIGH_QUALITY mode indicates that the camera device will use the highest-quality noise filtering algorithms, even if it slows down capture rate. FAST means the camera device should not slow down capture rate when applying noise filtering.

android.noiseReduction.strength byte [system]

Control the amount of noise reduction applied to the images

1-10; 10 is max noise reduction

1 - 10

dynamic
Property Name Type Description Units Range Tags
android.noiseReduction.mode byte [public]
  • OFF

    No noise reduction is applied

  • FAST

    Must not slow down frame rate relative to sensor output

  • HIGH_QUALITY

    May slow down frame rate to provide highest quality

Mode of operation for the noise reduction algorithm

android.noiseReduction.availableModes

Details

Noise filtering control. OFF means no noise reduction will be applied by the HAL.

FAST/HIGH_QUALITY both mean camera device determined noise filtering will be applied. HIGH_QUALITY mode indicates that the camera device will use the highest-quality noise filtering algorithms, even if it slows down capture rate. FAST means the camera device should not slow down capture rate when applying noise filtering.

quirks
static
Property Name Type Description Units Range Tags
android.quirks.meteringCropRegion byte [system]

If set to 1, the camera service does not scale 'normalized' coordinates with respect to the crop region. This applies to metering input (a{e,f,wb}Region) and output (face rectangles).

Details

Normalized coordinates refer to those in the (-1000,1000) range mentioned in the android.hardware.Camera API.

HAL implementations should instead always use and emit sensor array-relative coordinates for all region data. Does not need to be listed in static metadata. Support will be removed in future versions of camera service.
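Assuming a simple linear mapping, converting a legacy normalized coordinate in the (-1000, 1000) range onto the sensor active array might look like this (the helper is a hypothetical sketch, not a framework API):

```cpp
#include <cassert>

// Illustrative sketch, not a HAL API: map a legacy
// android.hardware.Camera coordinate in (-1000, 1000) onto the sensor
// active array, assuming a linear mapping across the active width.
int normalizedToSensorX(int xNorm, int activeWidth) {
    return (xNorm + 1000) * activeWidth / 2000;
}
```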

android.quirks.triggerAfWithAuto byte [system]

If set to 1, then the camera service always switches to FOCUS_MODE_AUTO before issuing an AF trigger.

Details

HAL implementations should implement AF trigger modes for AUTO, MACRO, CONTINUOUS_FOCUS, and CONTINUOUS_PICTURE modes instead of using this flag. Does not need to be listed in static metadata. Support will be removed in future versions of camera service.

android.quirks.useZslFormat byte [system]

If set to 1, the camera service uses CAMERA2_PIXEL_FORMAT_ZSL instead of HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED for the zero shutter lag stream

Details

HAL implementations should use gralloc usage flags to determine that a stream will be used for zero-shutter-lag, instead of relying on an explicit format setting. Does not need to be listed in static metadata. Support will be removed in future versions of camera service.

android.quirks.usePartialResult byte [hidden]

If set to 1, the HAL will always split result metadata for a single capture into multiple buffers, returned using multiple process_capture_result calls.

Details

Does not need to be listed in static metadata. Support for partial results will be reworked in future versions of camera service. This quirk will stop working at that point; DO NOT USE without careful consideration of future support.

dynamic
Property Name Type Description Units Range Tags
android.quirks.partialResult byte [hidden as boolean]
  • FINAL

    The last or only metadata result buffer for this capture.

  • PARTIAL

    A partial buffer of result metadata for this capture. More result buffers for this capture will be sent by the HAL, the last of which will be marked FINAL.

Whether a result given to the framework is the final one for the capture, or only a partial that contains a subset of the full set of dynamic metadata values.

Optional. Default value is FINAL.

Details

The entries in the result metadata buffers for a single capture may not overlap, except for this entry. The FINAL buffers must retain FIFO ordering relative to the requests that generate them, so the FINAL buffer for frame 3 must always be sent to the framework after the FINAL buffer for frame 2, and before the FINAL buffer for frame 4. PARTIAL buffers may be returned in any order relative to other frames, but all PARTIAL buffers for a given capture must arrive before the FINAL buffer for that capture. This entry may only be used by the HAL if quirks.usePartialResult is set to 1.

request
controls
Property Name Type Description Units Range Tags
android.request.frameCount int32 [system]

A frame counter set by the framework. Must be maintained unchanged in output frame. This value monotonically increases with every new result (that is, each new result has a unique frameCount value).

incrementing integer

Any int

android.request.id int32 [hidden]

An application-specified ID for the current request. Must be maintained unchanged in output frame

arbitrary integer assigned by application

Any int

android.request.inputStreams int32 x n [system]

List which camera reprocess stream is used for the source of reprocessing data.

List of camera reprocess stream IDs

Typically, only one entry allowed, must be a valid reprocess stream ID.

If android.jpeg.needsThumbnail is set, then multiple reprocess streams may be included in a single request; they must be different scaled versions of the same image.

Details

Only meaningful when android.request.type == REPROCESS; ignored otherwise

android.request.metadataMode byte [system]
  • NONE

    No metadata should be produced on output, except for application-bound buffer data. If no application-bound streams exist, no frame should be placed in the output frame queue. If such streams exist, a frame should be placed on the output queue with null metadata but with the necessary output buffer information. Timestamp information should still be included with any output stream buffers

  • FULL

    All metadata should be produced. Statistics will only be produced if they are separately enabled

How much metadata to produce on output

android.request.outputStreams int32 x n [system]

Lists which camera output streams image data from this capture must be sent to

List of camera stream IDs

List must only include streams that have been created

Details

If no output streams are listed, then the image data should simply be discarded. The image data must still be captured for metadata and statistics production, and the lens and flash must operate as requested.

android.request.type byte [system]
  • CAPTURE

    Capture a new image from the imaging hardware, and process it according to the settings

  • REPROCESS

    Process previously captured data; the android.request.inputStream parameter determines the source reprocessing stream. TODO: Mark dynamic metadata needed for reprocessing with [RP]

The type of the request; either CAPTURE or REPROCESS. For HAL3, this tag is redundant.

static
Property Name Type Description Units Range Tags
android.request.maxNumOutputStreams int32 x 3 [public]

How many output streams can be allocated at the same time for each type of stream

The number of raw sensor streams; the number of processed, uncompressed streams; and the number of JPEG-compressed streams

>= 1 for raw and JPEG-compressed streams; >= 3 for processed, uncompressed streams

Details

Video snapshot with preview callbacks requires 3 processed streams (preview, record, app callbacks) and one JPEG stream (snapshot)

android.request.maxNumReprocessStreams int32 x 1 [system]

How many reprocessing streams of any type can be allocated at the same time

>= 1

dynamic
Property Name Type Description Units Range Tags
android.request.frameCount int32 [public]

A frame counter set by the framework. This value monotonically increases with every new result (that is, each new result has a unique frameCount value).

count of frames

> 0

Details

Reset on release()

android.request.id int32 [hidden]

An application-specified ID for the current request. Must be maintained unchanged in output frame

arbitrary integer assigned by application

Any int

android.request.metadataMode byte [system]
  • NONE

    No metadata should be produced on output, except for application-bound buffer data. If no application-bound streams exist, no frame should be placed in the output frame queue. If such streams exist, a frame should be placed on the output queue with null metadata but with the necessary output buffer information. Timestamp information should still be included with any output stream buffers

  • FULL

    All metadata should be produced. Statistics will only be produced if they are separately enabled

How much metadata to produce on output

android.request.outputStreams int32 x n [system]

Lists which camera output streams image data from this capture must be sent to

List of camera stream IDs

List must only include streams that have been created

Details

If no output streams are listed, then the image data should simply be discarded. The image data must still be captured for metadata and statistics production, and the lens and flash must operate as requested.

scaler
controls
Property Name Type Description Units Range Tags
android.scaler.cropRegion int32 x 4 [public as rectangle]

(x, y, width, height).

A rectangle with the top-left corner at (x, y) and size (width, height). The region of the sensor that is used for output. Each stream must use this rectangle to produce its output, cropping to a smaller region if necessary to maintain the stream's aspect ratio.

HAL2.x uses only (x, y, width)

(x,y) of top-left corner, width and height of region in pixels; (0,0) is top-left corner of android.sensor.activeArraySize
Details

Any additional per-stream cropping must be done to maximize the final pixel area of the stream.

For example, if the crop region is set to a 4:3 aspect ratio, then 4:3 streams should use the exact crop region. 16:9 streams should further crop vertically (letterbox).

Conversely, if the crop region is set to a 16:9, then 4:3 outputs should crop horizontally (pillarbox), and 16:9 streams should match exactly. These additional crops must be centered within the crop region.

The output streams must maintain square pixels at all times, no matter what the relative aspect ratios of the crop region and the stream are.

Negative values for the corner are allowed for raw output if the full pixel array is larger than the active pixel array. Width and height may be rounded to the nearest larger supportable width, especially for raw output, where only a few fixed scales may be possible.

The width and height of the crop region cannot be set to be smaller than floor( activeArraySize.width / android.scaler.maxDigitalZoom ) and floor( activeArraySize.height / android.scaler.maxDigitalZoom ), respectively.
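The sizing rules above can be sketched as follows; the struct and function names are hypothetical, not HAL types. The minimum crop dimensions follow from maxDigitalZoom, and the per-stream letterbox/pillarbox crop is a centered aspect-ratio fit inside the crop region:

```cpp
#include <cassert>
#include <cmath>

struct Rect { int x, y, width, height; };  // illustrative, not a HAL type

// Smallest legal crop dimensions given the maximum digital zoom ratio.
Rect minCropSize(int activeW, int activeH, float maxDigitalZoom) {
    return {0, 0,
            static_cast<int>(std::floor(activeW / maxDigitalZoom)),
            static_cast<int>(std::floor(activeH / maxDigitalZoom))};
}

// Largest sub-rectangle of `crop` with aspect ratio streamW:streamH,
// centered within the crop region (letterbox/pillarbox as needed).
Rect centeredAspectCrop(Rect crop, int streamW, int streamH) {
    int w = crop.width, h = crop.height;
    if (static_cast<long long>(w) * streamH >
        static_cast<long long>(h) * streamW) {
        w = h * streamW / streamH;   // crop horizontally (pillarbox)
    } else {
        h = w * streamH / streamW;   // crop vertically (letterbox)
    }
    return {crop.x + (crop.width - w) / 2,
            crop.y + (crop.height - h) / 2, w, h};
}
```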

static
Property Name Type Description Units Range Tags
android.scaler.availableFormats int32 x n [public as imageFormat]
  • RAW_SENSOR optional 0x20
  • YV12 optional 0x32315659

    YCrCb 4:2:0 Planar

  • YCrCb_420_SP optional 0x11

    NV21

  • IMPLEMENTATION_DEFINED 0x22

    System internal format, not application-accessible

  • YCbCr_420_888 0x23

    Flexible YUV420 Format

  • BLOB 0x21

    JPEG format

The list of image formats that are supported by this camera device.

Details

All camera devices will support JPEG and YUV_420_888 formats.

When set to YUV_420_888, the application can access the YUV420 data directly.

HAL Implementation Details

These format values are from HAL_PIXEL_FORMAT_* in system/core/include/system/graphics.h.

When IMPLEMENTATION_DEFINED is used, the platform gralloc module will select a format based on the usage flags provided by the camera HAL device and the other endpoint of the stream. It is usually used by preview and recording streams, where the application doesn't need to access the image data.

YCbCr_420_888 format must be supported by the HAL. When an image stream needs CPU/application direct access, this format will be used.

The BLOB format must be supported by the HAL. This is used for the JPEG stream.

android.scaler.availableJpegMinDurations int64 x n [public]

The minimum frame duration that is supported for each resolution in availableJpegSizes. Should correspond to the frame duration when only that JPEG stream is active and captured in a burst, with all processing set to FAST

Details

When multiple streams are configured, the minimum frame duration will be >= max(individual stream min durations)

android.scaler.availableJpegSizes int32 x n x 2 [public as size]

The JPEG resolutions that are supported by this camera device.

Details

The resolutions are listed as (width, height) pairs. All camera devices will support sensor maximum resolution (defined by android.sensor.info.activeArraySize).

HAL Implementation Details

The HAL must include sensor maximum resolution (defined by android.sensor.info.activeArraySize), and should include half/quarter of sensor maximum resolution.

android.scaler.availableMaxDigitalZoom float [public]

The maximum ratio between active area width and crop region width, or between active area height and crop region height, if the crop region height is larger than width

>=1

android.scaler.availableProcessedMinDurations int64 x n [public]

The minimum frame duration that is supported for each resolution in availableProcessedSizes. Should correspond to the frame duration when only that processed stream is active, with all processing set to FAST

Details

When multiple streams are configured, the minimum frame duration will be >= max(individual stream min durations)

android.scaler.availableProcessedSizes int32 x n x 2 [public as size]

The resolutions available for use with processed output streams, such as YV12, NV12, and platform opaque YUV/RGB streams to the GPU or video encoders.

Details

The resolutions are listed as (width, height) pairs.

For a given use case, the actual maximum supported resolution may be lower than what is listed here, depending on the destination Surface for the image data. For example, for recording video, the video encoder chosen may have a maximum size limit (e.g. 1080p) smaller than what the camera (e.g. maximum resolution is 3264x2448) can provide.

Please reference the documentation for the image data destination to check if it limits the maximum size for image data.

HAL Implementation Details

For FULL capability devices (android.info.supportedHardwareLevel == FULL), the HAL must include all JPEG sizes listed in android.scaler.availableJpegSizes and each below resolution if it is smaller than or equal to the sensor maximum resolution (if they are not listed in JPEG sizes already):

  • 240p (320 x 240)
  • 480p (640 x 480)
  • 720p (1280 x 720)
  • 1080p (1920 x 1080)

For LIMITED capability devices (android.info.supportedHardwareLevel == LIMITED), the HAL only has to list up to the maximum video size supported by the device.

android.scaler.availableRawMinDurations int64 x n [system]

The minimum frame duration that is supported for each raw resolution in availableRawSizes. Should correspond to the frame duration when only the raw stream is active.

Details

When multiple streams are configured, the minimum frame duration will be >= max(individual stream min durations)

android.scaler.availableRawSizes int32 x n x 2 [system as size]

The resolutions available for use with raw sensor output streams, listed as width, height

Must include the sensor maximum resolution.

dynamic
Property Name Type Description Units Range Tags
android.scaler.cropRegion int32 x 4 [public as rectangle]

(x, y, width, height).

A rectangle with the top-left corner at (x, y) and size (width, height). The region of the sensor that is used for output. Each stream must use this rectangle to produce its output, cropping to a smaller region if necessary to maintain the stream's aspect ratio.

HAL2.x uses only (x, y, width)

(x,y) of top-left corner, width and height of region in pixels; (0,0) is top-left corner of android.sensor.activeArraySize
Details

Any additional per-stream cropping must be done to maximize the final pixel area of the stream.

For example, if the crop region is set to a 4:3 aspect ratio, then 4:3 streams should use the exact crop region. 16:9 streams should further crop vertically (letterbox).

Conversely, if the crop region is set to a 16:9, then 4:3 outputs should crop horizontally (pillarbox), and 16:9 streams should match exactly. These additional crops must be centered within the crop region.

The output streams must maintain square pixels at all times, no matter what the relative aspect ratios of the crop region and the stream are.

Negative values for the corner are allowed for raw output if the full pixel array is larger than the active pixel array. Width and height may be rounded to the nearest larger supportable width, especially for raw output, where only a few fixed scales may be possible.

The width and height of the crop region cannot be set to be smaller than floor( activeArraySize.width / android.scaler.maxDigitalZoom ) and floor( activeArraySize.height / android.scaler.maxDigitalZoom ), respectively.

sensor
controls
Property Name Type Description Units Range Tags
android.sensor.exposureTime int64 [public]

Duration each pixel is exposed to light.

If the sensor can't expose this exact duration, it should shorten the duration exposed to the nearest possible value (rather than expose longer).

nanoseconds

android.sensor.info.exposureTimeRange

Details

1/10000 - 30 sec range. No bulb mode

android.sensor.frameDuration int64 [public]

Duration from start of frame exposure to start of next frame exposure

nanoseconds

see android.sensor.info.maxFrameDuration, android.scaler.info.availableMinFrameDurations

Details

Exposure time has priority, so duration is set to max(duration, exposure time + overhead)
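The priority rule above can be written out directly (function and parameter names are illustrative, not HAL API):

```cpp
#include <cassert>
#include <cstdint>
#include <algorithm>

// Illustrative sketch, not a HAL API: the frame duration actually used,
// given that exposure time takes priority over the requested duration.
int64_t effectiveFrameDuration(int64_t requestedNs, int64_t exposureNs,
                               int64_t overheadNs) {
    return std::max(requestedNs, exposureNs + overheadNs);
}
```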

android.sensor.sensitivity int32 [public]

Gain applied to image data. Must be implemented through analog gain only if set to values below 'maximum analog sensitivity'.

If the sensor can't apply this exact gain, it should lessen the gain to the nearest possible value (rather than gain more).

ISO arithmetic units

android.sensor.info.sensitivityRange

Details

ISO 12232:2006 REI method

static
Property Name Type Description Units Range Tags
android.sensor.info.activeArraySize int32 x 4 [public as rectangle]
Four ints defining the active pixel rectangle

Area of raw data which corresponds to only active pixels; smaller or equal to pixelArraySize.

xmin, ymin, width, height. Top left of full pixel array is (0,0)
android.sensor.info.sensitivityRange int32 x 2 [public]
Range of supported sensitivities

Range of valid sensitivities

Min <= 100, Max >= 1600

android.sensor.info.colorFilterArrangement byte [system]
  • RGGB
  • GRBG
  • GBRG
  • BGGR
  • RGB

    Sensor is not Bayer; output has 3 16-bit values for each pixel, instead of just 1 16-bit value per pixel.

Arrangement of color filters on sensor; represents the colors in the top-left 2x2 section of the sensor, in reading order

android.sensor.info.exposureTimeRange int64 x 2 [public]
nanoseconds

Range of valid exposure times

Min <= 100e3 (100 us), Max >= 30e9 (30 sec)

android.sensor.info.maxFrameDuration int64 [public]

Maximum possible frame duration (minimum frame rate)

nanoseconds

>= 30e9

Details

Minimum duration is a function of resolution, processing settings. See android.scaler.availableProcessedMinDurations android.scaler.availableJpegMinDurations android.scaler.availableRawMinDurations

android.sensor.info.physicalSize float x 2 [public]
width x height in millimeters

The physical dimensions of the full pixel array

Details

Needed for FOV calculation for old API

android.sensor.info.pixelArraySize int32 x 2 [system as size]

Dimensions of full pixel array, possibly including black calibration pixels

Details

Maximum output resolution for raw format must match this in android.scaler.info.availableSizesPerFormat

android.sensor.info.whiteLevel int32 [system]

Maximum raw value output by sensor

> 1024 (10-bit output)

Details

Defines sensor bit depth (10-14 bits is expected)

android.sensor.baseGainFactor rational [public]

Gain factor from electrons to raw units when ISO=100

android.sensor.blackLevelPattern int32 x 4 [system]
2x2 raw count block

A fixed black level offset for each of the Bayer mosaic channels

>= 0 each

Details

As per DNG BlackLevelRepeatDim / BlackLevel tags

android.sensor.calibrationTransform1 rational x 9 [system]
3x3 matrix in row-major-order

Per-device calibration on top of color space transform 1

android.sensor.calibrationTransform2 rational x 9 [system]
3x3 matrix in row-major-order

Per-device calibration on top of color space transform 2

android.sensor.colorTransform1 rational x 9 [system]
3x3 matrix in row-major-order

Linear mapping from XYZ (D50) color space to reference linear sensor color, for first reference illuminant

Details

Use as follows: XYZ = inv(transform) * clip( (raw - blackLevel(raw)) / (whiteLevel - max(blackLevel)) ), at least in the simple case.
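The normalization step inside that formula can be sketched on its own (the helper is illustrative, not a HAL API): a raw sample is offset by its black level, scaled by the usable range, and clipped to [0, 1] before the inverse color transform is applied.

```cpp
#include <cassert>
#include <algorithm>

// Illustrative sketch of the normalization step in the formula above,
// not a HAL API: scale a raw sample into [0, 1] before applying the
// inverse color transform.
double normalizeRaw(int raw, int blackLevel, int whiteLevel,
                    int maxBlackLevel) {
    double v = static_cast<double>(raw - blackLevel) /
               (whiteLevel - maxBlackLevel);
    return std::clamp(v, 0.0, 1.0);
}
```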

android.sensor.colorTransform2 rational x 9 [system]
3x3 matrix in row-major-order

Linear mapping from XYZ (D50) color space to reference linear sensor color, for second reference illuminant

android.sensor.forwardMatrix1 rational x 9 [system]
3x3 matrix in row-major-order

Used by DNG for better WB adaptation

android.sensor.forwardMatrix2 rational x 9 [system]
3x3 matrix in row-major-order

Used by DNG for better WB adaptation

android.sensor.maxAnalogSensitivity int32 [public]

Maximum sensitivity that is implemented purely through analog gain

Details

For android.sensor.sensitivity values less than or equal to this, all applied gain must be analog. For values above this, it can be a mix of analog and digital
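One way to read this rule (a sketch with hypothetical names, not a HAL API): a requested sensitivity splits into the analog ISO the sensor applies and a residual digital gain multiplier.

```cpp
#include <cassert>
#include <algorithm>
#include <utility>

// Illustrative sketch, not a HAL API: split a requested sensitivity
// into the analog ISO applied by the sensor and the residual digital
// gain factor applied afterward.
std::pair<int, double> splitSensitivity(int requestedIso, int maxAnalogIso) {
    int analog = std::min(requestedIso, maxAnalogIso);
    return {analog, static_cast<double>(requestedIso) / analog};
}
```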

android.sensor.noiseModelCoefficients float x 2 [system]
float constants A, B for the noise variance model

Estimation of sensor noise characteristics

var(raw pixel value) = electrons * (baseGainFactor * iso/100)^2 + A * (baseGainFactor * iso/100)^2 + B
Details

A represents sensor read noise before analog amplification; B represents noise from A/D conversion and other circuits after amplification. Both noise sources are assumed to be Gaussian, independent, and constant across the sensor.
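The variance model above transcribes directly into code (names illustrative), with g = baseGainFactor * iso / 100 as the total gain from electrons to raw units:

```cpp
#include <cassert>

// Direct transcription of the variance model above (illustrative names):
// var(raw) = electrons * g^2 + A * g^2 + B, where
// g = baseGainFactor * iso / 100.
double noiseVariance(double electrons, double iso, double baseGainFactor,
                     double A, double B) {
    double g = baseGainFactor * iso / 100.0;
    return electrons * g * g + A * g * g + B;
}
```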

android.sensor.orientation int32 [public]

Clockwise angle through which the output image needs to be rotated to be upright on the device screen in its native orientation. Also defines the direction of rolling shutter readout, which is from top to bottom in the sensor's coordinate system

degrees clockwise rotation, only multiples of 90

0,90,180,270

android.sensor.referenceIlluminant1 byte [system]
  • DAYLIGHT 1
  • FLUORESCENT 2
  • TUNGSTEN 3

    Incandescent light

  • FLASH 4
  • FINE_WEATHER 9
  • CLOUDY_WEATHER 10
  • SHADE 11
  • DAYLIGHT_FLUORESCENT 12

    D 5700 - 7100K

  • DAY_WHITE_FLUORESCENT 13

    N 4600 - 5400K

  • COOL_WHITE_FLUORESCENT 14

    W 3900 - 4500K

  • WHITE_FLUORESCENT 15

    WW 3200 - 3700K

  • STANDARD_A 17
  • STANDARD_B 18
  • STANDARD_C 19
  • D55 20
  • D65 21
  • D75 22
  • D50 23
  • ISO_STUDIO_TUNGSTEN 24

Light source used to define transform 1

Details

[EXIF LightSource tag] Must all these be supported? Need CCT for each!

android.sensor.referenceIlluminant2 byte [system]

Light source used to define transform 2

Same as illuminant 1
dynamic
Property Name Type Description Units Range Tags
android.sensor.exposureTime int64 [public]

Duration each pixel is exposed to light.

If the sensor can't expose this exact duration, it should shorten the duration exposed to the nearest possible value (rather than expose longer).

nanoseconds

android.sensor.info.exposureTimeRange

Details

1/10000 - 30 sec range. No bulb mode

android.sensor.frameDuration int64 [public]

Duration from start of frame exposure to start of next frame exposure

nanoseconds

see android.sensor.info.maxFrameDuration, android.scaler.info.availableMinFrameDurations

Details

Exposure time has priority, so duration is set to max(duration, exposure time + overhead)

android.sensor.sensitivity int32 [public]

Gain applied to image data. Must be implemented through analog gain only if set to values below 'maximum analog sensitivity'.

If the sensor can't apply this exact gain, it should lessen the gain to the nearest possible value (rather than gain more).

ISO arithmetic units

android.sensor.info.sensitivityRange

Details

ISO 12232:2006 REI method

android.sensor.timestamp int64 [public]

Time at start of exposure of first row

nanoseconds

> 0

Details

Monotonic, should be synced to other timestamps in system

android.sensor.temperature float [public]

The temperature of the sensor, sampled at the time exposure began for this frame.

The thermal diode being queried should be inside the sensor PCB, or somewhere close to it.

celsius

Optional. This value is missing if no temperature is available.

shading
controls
Property Name Type Description Units Range Tags
android.shading.mode byte [system]
  • OFF

    No shading correction is applied

  • FAST

    Must not slow down frame rate relative to raw bayer output

  • HIGH_QUALITY

    Frame rate may be reduced by high quality

Quality of lens shading correction applied to the image data

android.shading.strength byte [system]

Control the amount of shading correction applied to the images

unitless: 1-10; 10 is full shading compensation
dynamic
Property Name Type Description Units Range Tags
android.shading.mode byte [system]
  • OFF

    No shading correction is applied

  • FAST

    Must not slow down frame rate relative to raw bayer output

  • HIGH_QUALITY

    Frame rate may be reduced by high quality

Quality of lens shading correction applied to the image data

statistics
controls
Property Name Type Description Units Range Tags
android.statistics.faceDetectMode byte [public]
  • OFF
  • SIMPLE

    Optional Return rectangle and confidence only

  • FULL

    Optional Return all face metadata

State of the face detector unit

android.statistics.info.availableFaceDetectModes

Details

Whether face detection is enabled, and whether it should output just the basic fields or the full set of fields. Value must be one of the android.statistics.info.availableFaceDetectModes.

android.statistics.histogramMode byte [system as boolean]
  • OFF
  • ON

Operating mode for histogram generation

android.statistics.sharpnessMapMode byte [system as boolean]
  • OFF
  • ON

Operating mode for sharpness map generation

android.statistics.lensShadingMapMode byte [public]
  • OFF
  • ON

Whether the HAL needs to output the lens shading map in output result metadata

Details

When set to ON, android.statistics.lensShadingMap must be provided in the output result metadata.

static
Property Name Type Description Units Range Tags
android.statistics.info.availableFaceDetectModes byte x n [public]
List of enums from android.statistics.faceDetectMode

Which face detection modes are available, if any

List of enum: OFF SIMPLE FULL
Details

OFF means face detection is disabled; it must always be included in the list.

SIMPLE means the device supports the android.statistics.faceRectangles and android.statistics.faceScores outputs.

FULL means the device additionally supports the android.statistics.faceIds and android.statistics.faceLandmarks outputs.

android.statistics.info.histogramBucketCount int32 [system]

Number of histogram buckets supported

>= 64

android.statistics.info.maxFaceCount int32 [public]

Maximum number of simultaneously detectable faces

>= 4 if android.statistics.info.availableFaceDetectModes lists modes besides OFF, otherwise 0

android.statistics.info.maxHistogramCount int32 [system]

Maximum value possible for a histogram bucket

android.statistics.info.maxSharpnessMapValue int32 [system]

Maximum value possible for a sharpness map region.

android.statistics.info.sharpnessMapSize int32 x 2 [system as size]
width x height

Dimensions of the sharpness map

Must be at least 32 x 32

dynamic
Property Name Type Description Units Range Tags
android.statistics.faceDetectMode byte [public]
  • OFF
  • SIMPLE

    Optional. Return rectangle and confidence only

  • FULL

    Optional. Return all face metadata

State of the face detector unit

android.statistics.info.availableFaceDetectModes

Details

Whether face detection is enabled, and whether it should output just the basic fields or the full set of fields. Value must be one of the android.statistics.info.availableFaceDetectModes.

android.statistics.faceIds int32 x n [hidden]

List of unique IDs for detected faces

Details

Only available if faceDetectMode == FULL

android.statistics.faceLandmarks int32 x n x 6 [hidden]
(leftEyeX, leftEyeY, rightEyeX, rightEyeY, mouthX, mouthY)

List of landmarks for detected faces

Details

Only available if faceDetectMode == FULL

android.statistics.faceRectangles int32 x n x 4 [hidden as rectangle]
(xmin, ymin, xmax, ymax). (0,0) is top-left of active pixel area

List of the bounding rectangles for detected faces

Details

Only available if faceDetectMode != OFF

android.statistics.faceScores byte x n [hidden]

List of the face confidence scores for detected faces

1-100

Details

Only available if faceDetectMode != OFF. The value should be meaningful (for example, setting 100 at all times is illegal).
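The interleaved face arrays above can be regrouped into one record per face. The following Python sketch is illustrative only (the function name and dict layout are hypothetical, not part of the HAL interface); it shows the indexing for the n x 4 rectangle, n-element score, n-element id, and n x 6 landmark arrays:

```python
def unpack_faces(rects, scores, ids=None, landmarks=None):
    """Group flat face-metadata arrays into one dict per face.
    rects: n*4 ints (xmin, ymin, xmax, ymax); scores: n values (1-100);
    ids and landmarks (n*6 ints) are present only in FULL mode."""
    faces = []
    for i, score in enumerate(scores):
        face = {"rect": tuple(rects[4 * i:4 * i + 4]), "score": score}
        if ids is not None:
            face["id"] = ids[i]
        if landmarks is not None:
            # Landmark order per face: (leftEyeX, leftEyeY,
            # rightEyeX, rightEyeY, mouthX, mouthY)
            lm = landmarks[6 * i:6 * i + 6]
            face["left_eye"] = (lm[0], lm[1])
            face["right_eye"] = (lm[2], lm[3])
            face["mouth"] = (lm[4], lm[5])
        faces.append(face)
    return faces
```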

android.statistics.histogram int32 x n x 3 [system]
count of pixels for each color channel that fall into each histogram bucket, scaled to be between 0 and maxHistogramCount

A 3-channel histogram based on the raw sensor data

Details

The k'th bucket (0-based) covers the input range [ k * w/N, (k + 1) * w/N ), where w = android.sensor.info.whiteLevel and N is the number of buckets. If only a monochrome histogram is supported, all channels should have the same data
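The bucket assignment implied by that range can be sketched as follows (Python, for illustration only; the function name is hypothetical):

```python
def histogram_bucket(value, white_level, num_buckets):
    """Return the 0-based bucket index k such that the raw pixel value
    falls in [k * w/N, (k + 1) * w/N), i.e. k = floor(value * N / w).
    Values at or above the white level clamp into the top bucket."""
    k = value * num_buckets // white_level
    return min(k, num_buckets - 1)
```

For example, with a white level of 1024 and 64 buckets, each bucket spans 16 raw values: raw value 15 lands in bucket 0, raw value 16 in bucket 1.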

android.statistics.histogramMode byte [system as boolean]
  • OFF
  • ON

Operating mode for histogram generation

android.statistics.sharpnessMap int32 x n x m x 3 [system]
estimated sharpness for each region of the input image. Normalized to be between 0 and maxSharpnessMapValue. Higher values mean sharper (better focused)

A 3-channel sharpness map, based on the raw sensor data

Details

If only a monochrome sharpness map is supported, all channels should have the same data

android.statistics.sharpnessMapMode byte [system as boolean]
  • OFF
  • ON

Operating mode for sharpness map generation

android.statistics.lensShadingMap float x 4 x n x m [public]
2D array of float gain factors per channel to correct lens shading

The shading map is a low-resolution floating-point map that lists the coefficients used to correct for vignetting, for each Bayer color channel.

Each gain factor is >= 1

Details

The least shaded section of the image should have a gain factor of 1; all other sections should have gains above 1.

When android.colorCorrection.mode = TRANSFORM_MATRIX, the map must take into account the colorCorrection settings.

The shading map is for the entire active pixel array, and is not affected by the crop region specified in the request. Each shading map entry is the value of the shading compensation map over a specific pixel on the sensor. Specifically, with a (N x M) resolution shading map, and an active pixel array size (W x H), shading map entry (x,y) ∈ (0 ... N-1, 0 ... M-1) is the value of the shading map at pixel ( ((W-1)/(N-1)) * x, ((H-1)/(M-1)) * y) for the four color channels. The map is assumed to be bilinearly interpolated between the sample points.

The channel order is [R, Geven, Godd, B], where Geven is the green channel for the even rows of a Bayer pattern, and Godd is the green channel for the odd rows. The shading map is stored in a fully interleaved format, and its size is provided in the camera static metadata by android.lens.info.shadingMapSize.

The shading map should have on the order of 30-40 rows and columns, and must be smaller than 64x64.

As an example, given a very small map defined as:

android.lens.info.shadingMapSize = [ 4, 3 ]
android.statistics.lensShadingMap =
[ 1.3, 1.2, 1.15, 1.2,  1.2, 1.2, 1.15, 1.2,
    1.1, 1.2, 1.2, 1.2,  1.3, 1.2, 1.3, 1.3,
  1.2, 1.2, 1.25, 1.1,  1.1, 1.1, 1.1, 1.0,
    1.0, 1.0, 1.0, 1.0,  1.2, 1.3, 1.25, 1.2,
  1.3, 1.2, 1.2, 1.3,   1.2, 1.15, 1.1, 1.2,
    1.2, 1.1, 1.0, 1.2,  1.3, 1.15, 1.2, 1.3 ]

The low-resolution scaling map images for each channel are (displayed using nearest-neighbor interpolation):

[Figures: red, green (even rows), green (odd rows), and blue lens shading maps]

As a visualization only, inverting the full-color map to recover an image of a gray wall (using bicubic interpolation for visual quality) as captured by the sensor gives:

[Figure: image of a uniform white wall (inverse shading map)]
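The sample-point mapping and bilinear interpolation described above can be sketched as follows (Python, illustrative only; the function name and argument layout are hypothetical, and the map is assumed fully interleaved in [R, Geven, Godd, B] order):

```python
def shading_gain(shading_map, n, m, w, h, px, py, channel):
    """Bilinearly sample one channel of an interleaved N x M x 4
    shading map at active-array pixel (px, py).
    channel: 0 = R, 1 = G_even, 2 = G_odd, 3 = B."""
    # Map the pixel to fractional map coordinates: map sample (x, y)
    # sits at pixel ( ((W-1)/(N-1)) * x, ((H-1)/(M-1)) * y ).
    fx = px * (n - 1) / (w - 1)
    fy = py * (m - 1) / (h - 1)
    x0, y0 = int(fx), int(fy)
    x1, y1 = min(x0 + 1, n - 1), min(y0 + 1, m - 1)
    tx, ty = fx - x0, fy - y0

    def sample(x, y):
        # Fully interleaved layout: 4 channels per (x, y) sample.
        return shading_map[(y * n + x) * 4 + channel]

    top = sample(x0, y0) * (1 - tx) + sample(x1, y0) * tx
    bot = sample(x0, y1) * (1 - tx) + sample(x1, y1) * tx
    return top * (1 - ty) + bot * ty
```

With the example 4x3 map above and a hypothetical 2000x1500 active array, the R gain at pixel (0, 0) is the first entry of the map, 1.3.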

android.statistics.predictedColorGains float x 4 [hidden]
A 1D array of floats for 4 color channel gains

The best-fit color channel gains calculated by the HAL's statistics units for the current output frame

Deprecated. Do not use.

Details

This may be different than the gains used for this frame, since statistics processing on data from a new frame typically completes after the transform has already been applied to that frame.

The 4 channel gains are defined in Bayer domain, see android.colorCorrection.gains for details.

This value should always be calculated by the AWB block, regardless of the android.control.* current values.

android.statistics.predictedColorTransform rational x 3 x 3 [hidden]
3x3 rational matrix in row-major order

The best-fit color transform matrix estimate calculated by the HAL's statistics units for the current output frame

Deprecated. Do not use.

Details

The HAL must provide the estimate from its statistics unit on the white balance transforms to use for the next frame. These are the values the HAL believes are the best fit for the current output frame. This may be different than the transform used for this frame, since statistics processing on data from a new frame typically completes after the transform has already been applied to that frame.

These estimates must be provided for all frames, even if capture settings and color transforms are set by the application.

This value should always be calculated by the AWB block, regardless of the android.control.* current values.

android.statistics.sceneFlicker byte [public]
  • NONE
  • 50HZ
  • 60HZ

The HAL estimated scene illumination lighting frequency

Details

Report NONE if there doesn't appear to be flickering illumination

tonemap
controls
Property Name Type Description Units Range Tags
android.tonemap.curveBlue float x n x 2 [public]
1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints.

Table mapping blue input values to output values

same as android.tonemap.curveRed

same as android.tonemap.curveRed

Details

Tonemapping / contrast / gamma curve for the blue channel, to use when android.tonemap.mode is CONTRAST_CURVE.

See android.tonemap.curveRed for more details.

android.tonemap.curveGreen float x n x 2 [public]
1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints.

Table mapping green input values to output values

same as android.tonemap.curveRed

same as android.tonemap.curveRed

Details

Tonemapping / contrast / gamma curve for the green channel, to use when android.tonemap.mode is CONTRAST_CURVE.

See android.tonemap.curveRed for more details.

android.tonemap.curveRed float x n x 2 [public]
1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints.

Table mapping red input values to output values

0-1 on input and output coordinates.

Details

Tonemapping / contrast / gamma curve for the red channel, to use when android.tonemap.mode is CONTRAST_CURVE.

Since the input and output ranges may vary depending on the camera pipeline, the input and output pixel values are represented by normalized floating-point values between 0 and 1, with 0 == black and 1 == white.

The curve should be linearly interpolated between the defined points. The points will be listed in increasing order of P_IN. For example, if the array is: [0.0, 0.0, 0.3, 0.5, 1.0, 1.0], then the input->output mapping for a few sample points would be: 0 -> 0, 0.15 -> 0.25, 0.3 -> 0.5, 0.5 -> 0.64
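The interpolation rule can be sketched as follows (Python, for illustration only; the function name is hypothetical):

```python
def eval_tonemap_curve(curve, p_in):
    """Evaluate a tonemap curve given as a flat list of (P_IN, P_OUT)
    pairs sorted by increasing P_IN, using linear interpolation."""
    pts = list(zip(curve[0::2], curve[1::2]))
    if p_in <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if p_in <= x1:
            t = (p_in - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return pts[-1][1]

# The example curve from the text: (0.0, 0.0), (0.3, 0.5), (1.0, 1.0)
curve = [0.0, 0.0, 0.3, 0.5, 1.0, 1.0]
```

Evaluating this curve reproduces the sample points given above: 0 -> 0, 0.15 -> 0.25, 0.3 -> 0.5, and 0.5 -> 0.64 (0.5 is 2/7 of the way from 0.3 to 1.0, so the output is 0.5 + (2/7) * 0.5 ≈ 0.643).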

android.tonemap.mode byte [public]
  • CONTRAST_CURVE

    Use the tone mapping curve specified in android.tonemap.curve

  • FAST

    Must not slow down frame rate relative to raw Bayer output

  • HIGH_QUALITY

    Frame rate may be reduced by high-quality processing

static
Property Name Type Description Units Range Tags
android.tonemap.maxCurvePoints int32 [public]

Maximum number of supported points in the tonemap curve

>= 128

dynamic
Property Name Type Description Units Range Tags
android.tonemap.curveBlue float x n x 2 [public]
1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints.

Table mapping blue input values to output values

same as android.tonemap.curveRed

same as android.tonemap.curveRed

Details

Tonemapping / contrast / gamma curve for the blue channel, to use when android.tonemap.mode is CONTRAST_CURVE.

See android.tonemap.curveRed for more details.

android.tonemap.curveGreen float x n x 2 [public]
1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints.

Table mapping green input values to output values

same as android.tonemap.curveRed

same as android.tonemap.curveRed

Details

Tonemapping / contrast / gamma curve for the green channel, to use when android.tonemap.mode is CONTRAST_CURVE.

See android.tonemap.curveRed for more details.

android.tonemap.curveRed float x n x 2 [public]
1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints.

Table mapping red input values to output values

0-1 on input and output coordinates.

Details

Tonemapping / contrast / gamma curve for the red channel, to use when android.tonemap.mode is CONTRAST_CURVE.

Since the input and output ranges may vary depending on the camera pipeline, the input and output pixel values are represented by normalized floating-point values between 0 and 1, with 0 == black and 1 == white.

The curve should be linearly interpolated between the defined points. The points will be listed in increasing order of P_IN. For example, if the array is: [0.0, 0.0, 0.3, 0.5, 1.0, 1.0], then the input->output mapping for a few sample points would be: 0 -> 0, 0.15 -> 0.25, 0.3 -> 0.5, 0.5 -> 0.64

android.tonemap.mode byte [public]
  • CONTRAST_CURVE

    Use the tone mapping curve specified in android.tonemap.curve

  • FAST

    Must not slow down frame rate relative to raw Bayer output

  • HIGH_QUALITY

    Frame rate may be reduced by high-quality processing

led
controls
Property Name Type Description Units Range Tags
android.led.transmit byte [hidden as boolean]
  • OFF
  • ON

This LED is nominally used to indicate to the user that the camera is powered on and may be streaming images back to the Application Processor. In certain rare circumstances, the OS may disable this when video is processed locally and not transmitted to any untrusted applications.

In particular, the LED must always be on when the data could be transmitted off the device. The LED should always be on whenever data is stored locally on the device.

The LED may be off if a trusted application is using the data in a way that doesn't violate the above rules.

dynamic
Property Name Type Description Units Range Tags
android.led.transmit byte [hidden as boolean]
  • OFF
  • ON

This LED is nominally used to indicate to the user that the camera is powered on and may be streaming images back to the Application Processor. In certain rare circumstances, the OS may disable this when video is processed locally and not transmitted to any untrusted applications.

In particular, the LED must always be on when the data could be transmitted off the device. The LED should always be on whenever data is stored locally on the device.

The LED may be off if a trusted application is using the data in a way that doesn't violate the above rules.

static
Property Name Type Description Units Range Tags
android.led.availableLeds byte x n [hidden]

A list of camera LEDs that are available on this system.

info
static
Property Name Type Description Units Range Tags
android.info.supportedHardwareLevel byte [public]
  • LIMITED
  • FULL

The camera 3 HAL device can implement one of two possible operational modes: limited and full. Full support is expected from new higher-end devices. Limited mode has hardware requirements roughly in line with those for a camera HAL device v1 implementation, and is expected from older or inexpensive devices. Full is a strict superset of limited, and they share the same essential operational flow.

For full details refer to "S3. Operational Modes" in camera3.h

Optional. Default value is LIMITED.

blackLevel
controls
Property Name Type Description Units Range Tags
android.blackLevel.lock byte [public as boolean]
  • OFF
  • ON

Whether black-level compensation is locked to its current values, or is free to vary.

Details

When set to ON, the values used for black-level compensation will not change until the lock is set to OFF.

Since changes to certain capture parameters (such as exposure time) may require resetting of black level compensation, the camera device must report whether setting the black level lock was successful in the output result metadata.

For example, if a sequence of requests is as follows:

  • Request 1: Exposure = 10ms, Black level lock = OFF
  • Request 2: Exposure = 10ms, Black level lock = ON
  • Request 3: Exposure = 10ms, Black level lock = ON
  • Request 4: Exposure = 20ms, Black level lock = ON
  • Request 5: Exposure = 20ms, Black level lock = ON
  • Request 6: Exposure = 20ms, Black level lock = ON

And the exposure change in Request 4 requires the camera device to reset the black level offsets, then the output result metadata is expected to be:

  • Result 1: Exposure = 10ms, Black level lock = OFF
  • Result 2: Exposure = 10ms, Black level lock = ON
  • Result 3: Exposure = 10ms, Black level lock = ON
  • Result 4: Exposure = 20ms, Black level lock = OFF
  • Result 5: Exposure = 20ms, Black level lock = ON
  • Result 6: Exposure = 20ms, Black level lock = ON

This indicates to the application that on frame 4, black levels were reset due to exposure value changes, and pixel values may not be consistent across captures.

The camera device will maintain the lock to the extent possible, only overriding the lock to OFF when changes to other request parameters require a black level recalculation or reset.

HAL Implementation Details

If for some reason black level locking is no longer possible (for example, the analog gain has changed, which forces black level offsets to be recalculated), then the HAL must override this request (and it must report 'OFF' when this does happen) until the next capture for which locking is possible again.
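The lock-override behavior in the request/result sequence above can be sketched as follows (Python, illustrative only; all names are hypothetical, and the reset condition is simplified to "exposure changed since the previous frame"):

```python
def result_black_level_lock(requested_lock, needs_reset):
    """Black-level-lock value to report in the result metadata:
    a requested lock is overridden to OFF on any frame where other
    settings forced a black level recalculation."""
    return requested_lock and not needs_reset

# Replay the six-request example above as (exposure_ms, lock) pairs.
requests = [(10, False), (10, True), (10, True),
            (20, True), (20, True), (20, True)]
results, prev_exposure = [], None
for exposure, lock in requests:
    reset = prev_exposure is not None and exposure != prev_exposure
    results.append((exposure, result_black_level_lock(lock, reset)))
    prev_exposure = exposure
# results reports the lock as OFF for frame 4 only, matching the
# documented result sequence.
```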

dynamic
Property Name Type Description Units Range Tags
android.blackLevel.lock byte [public as boolean]
  • OFF
  • ON

Whether black-level compensation is locked to its current values, or is free to vary.

Details

Whether the black level offset was locked for this frame. Should be ON if android.blackLevel.lock was ON in the capture request, unless a change in other capture settings forced the camera device to perform a black level reset.

HAL Implementation Details

If for some reason black level locking is no longer possible (for example, the analog gain has changed, which forces black level offsets to be recalculated), then the HAL must override this request (and it must report 'OFF' when this does happen) until the next capture for which locking is possible again.
