metadata_properties.xml revision d4483d5c218eb1c7d4c3d602cf9261acffaeb65a
<?xml version="1.0" encoding="utf-8"?>
<!-- Copyright (C) 2012 The Android Open Source Project

     Licensed under the Apache License, Version 2.0 (the "License");
     you may not use this file except in compliance with the License.
     You may obtain a copy of the License at

         http://www.apache.org/licenses/LICENSE-2.0

     Unless required by applicable law or agreed to in writing, software
     distributed under the License is distributed on an "AS IS" BASIS,
     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     See the License for the specific language governing permissions and
     limitations under the License.
-->
<metadata xmlns="http://schemas.android.com/service/camera/metadata/"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata_properties.xsd">

  <tags>
    <tag id="AWB">
      Needed for auto white balance
    </tag>
    <tag id="BC">
      Needed for backwards compatibility with old Java API
    </tag>
    <tag id="V1">
      New features for first camera 2 release (API1)
    </tag>
    <tag id="ADV">
      <!-- TODO: fill the tag description -->
    </tag>
    <tag id="DNG">
      Needed for DNG file support
    </tag>
    <tag id="EXIF">
      <!-- TODO: fill the tag description -->
    </tag>
    <tag id="HAL2">
      Entry is only used by camera device HAL 2.x
    </tag>
    <tag id="FULL">
      Entry is required for full hardware level devices, and optional for other hardware levels
    </tag>
    <tag id="LIMITED">
      Entry assists with LIMITED device implementation. LIMITED devices
      must implement all entries with this tag. Optional for FULL devices.
    </tag>
  </tags>

  <types>
    <typedef name="rectangle">
      <language name="java">android.graphics.Rect</language>
    </typedef>
    <typedef name="size">
      <language name="java">android.hardware.camera2.Size</language>
    </typedef>
    <typedef name="string">
      <language name="java">String</language>
    </typedef>
    <typedef name="boolean">
      <language name="java">boolean</language>
    </typedef>
    <typedef name="imageFormat">
      <language name="java">int</language>
    </typedef>
  </types>

  <namespace name="android">
    <section name="colorCorrection">
      <controls>
        <entry name="mode" type="byte" visibility="public" enum="true">
          <enum>
            <value>TRANSFORM_MATRIX
              <notes>Use the android.colorCorrection.transform matrix
              and android.colorCorrection.gains to do color conversion.

              All advanced white balance adjustments (not specified
              by our white balance pipeline) must be disabled.

              If AWB is enabled with `android.control.awbMode != OFF`, then
              TRANSFORM_MATRIX is ignored. The camera device will override
              this value to either FAST or HIGH_QUALITY.
              </notes>
            </value>
            <value>FAST
              <notes>Must not slow down capture rate relative to sensor raw
              output.

              Advanced white balance adjustments above and beyond
              the specified white balance pipeline may be applied.

              If AWB is enabled with `android.control.awbMode != OFF`, then
              the camera device uses the last frame's AWB values
              (or defaults if AWB has never been run).
              </notes>
            </value>
            <value>HIGH_QUALITY
              <notes>Capture rate (relative to sensor raw output)
              may be reduced by high quality.

              Advanced white balance adjustments above and beyond
              the specified white balance pipeline may be applied.

              If AWB is enabled with `android.control.awbMode != OFF`, then
              the camera device uses the last frame's AWB values
              (or defaults if AWB has never been run).
              </notes>
            </value>
          </enum>

          <description>
          The mode control selects how the image data is converted from the
          sensor's native color into linear sRGB color.
          </description>
          <details>
          When auto-white balance is enabled with android.control.awbMode, this
          control is overridden by the AWB routine. When AWB is disabled, the
          application controls how the color mapping is performed.

          We define the expected processing pipeline below. For consistency
          across devices, this is always the case with TRANSFORM_MATRIX.

          When either FAST or HIGH_QUALITY is used, the camera device may
          do additional processing, but android.colorCorrection.gains and
          android.colorCorrection.transform will still be provided by the
          camera device (in the results) and be roughly correct.

          Switching to TRANSFORM_MATRIX and using the data provided from
          FAST or HIGH_QUALITY will yield a picture with the same white point
          as what was produced by the camera device in the earlier frame.

          The expected processing pipeline is as follows:

          ![White balance processing pipeline](android.colorCorrection.mode/processing_pipeline.png)

          The white balance is encoded by two values, a 4-channel white-balance
          gain vector (applied in the Bayer domain), and a 3x3 color transform
          matrix (applied after demosaic).

          The 4-channel white-balance gains are defined as:

              android.colorCorrection.gains = [ R G_even G_odd B ]

          where `G_even` is the gain for green pixels on even rows of the
          output, and `G_odd` is the gain for green pixels on the odd rows.
          These may be identical for a given camera device implementation; if
          the camera device does not support a separate gain for even/odd green
          channels, it will use the `G_even` value, and write `G_odd` equal to
          `G_even` in the output result metadata.
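          As an illustrative sketch only (not HAL code, and assuming a
          hypothetical RGGB mosaic layout where even rows carry R/G_even
          samples and odd rows carry G_odd/B samples), the per-pixel gain
          selection described above could look like:

```java
// Hypothetical sketch: applying the [R, G_even, G_odd, B] white-balance
// gains to a normalized Bayer sample. The RGGB layout assumption and the
// WbGains helper are illustrative, not part of this specification.
public final class WbGains {
    // gains = [R, G_even, G_odd, B], as in android.colorCorrection.gains.
    public static float apply(float sample, int row, int col, float[] gains) {
        boolean evenRow = (row % 2) == 0;
        boolean evenCol = (col % 2) == 0;
        float gain;
        if (evenRow) {
            gain = evenCol ? gains[0] : gains[1]; // R or G_even
        } else {
            gain = evenCol ? gains[2] : gains[3]; // G_odd or B
        }
        // Clip to the normalized [0, 1] range, mirroring the pipeline's
        // overflow handling.
        return Math.min(1.0f, sample * gain);
    }

    public static void main(String[] args) {
        float[] gains = {2.0f, 1.0f, 1.0f, 1.5f}; // example gains
        System.out.println(apply(0.25f, 0, 0, gains)); // R pixel
        System.out.println(apply(0.25f, 1, 1, gains)); // B pixel
    }
}
```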

          The matrices for color transforms are defined as a 9-entry vector:

              android.colorCorrection.transform = [ I0 I1 I2 I3 I4 I5 I6 I7 I8 ]

          which defines a transform from input sensor colors, `P_in = [ r g b ]`,
          to output linear sRGB, `P_out = [ r' g' b' ]`,

          with colors as follows:

              r' = I0r + I1g + I2b
              g' = I3r + I4g + I5b
              b' = I6r + I7g + I8b

          Both the input and output value ranges must match. Overflow/underflow
          values are clipped to fit within the range.
          </details>
        </entry>
        <entry name="transform" type="rational" visibility="public"
               type_notes="3x3 rational matrix in row-major order"
               container="array">
          <array>
            <size>3</size>
            <size>3</size>
          </array>
          <description>A color transform matrix to use to transform
          from sensor RGB color space to output linear sRGB color space
          </description>
          <details>This matrix is either set by the camera device when the request
          android.colorCorrection.mode is not TRANSFORM_MATRIX, or
          directly by the application in the request when the
          android.colorCorrection.mode is TRANSFORM_MATRIX.

          In the latter case, the camera device may round the matrix to account
          for precision issues; the final rounded matrix should be reported back
          in this matrix result metadata. The transform should keep the magnitude
          of the output color values within `[0, 1.0]` (assuming input color
          values are within the normalized range `[0, 1.0]`), or clipping may occur.
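          The row-major transform and clipping behavior described above can be
          sketched in plain Java; `ColorTransform.apply` is a hypothetical
          helper for illustration, not part of the camera API:

```java
// Hypothetical sketch of the 3x3 row-major transform: maps sensor RGB to
// linear sRGB as in r' = I0*r + I1*g + I2*b (and likewise for g', b'),
// clipping each output channel to the normalized [0, 1] range.
public final class ColorTransform {
    // t = [I0 .. I8] in row-major order, as in
    // android.colorCorrection.transform.
    public static float[] apply(float[] t, float r, float g, float b) {
        float[] out = {
            t[0] * r + t[1] * g + t[2] * b,
            t[3] * r + t[4] * g + t[5] * b,
            t[6] * r + t[7] * g + t[8] * b,
        };
        for (int i = 0; i < 3; i++) {
            // Clip overflow/underflow values to fit within the range.
            out[i] = Math.max(0.0f, Math.min(1.0f, out[i]));
        }
        return out;
    }
}
```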
          </details>
        </entry>
        <entry name="gains" type="float" visibility="public"
               type_notes="A 1D array of floats for 4 color channel gains"
               container="array">
          <array>
            <size>4</size>
          </array>
          <description>Gains applying to Bayer raw color channels for
          white-balance</description>
          <details>The 4-channel white-balance gains are defined in
          the order of `[R G_even G_odd B]`, where `G_even` is the gain
          for green pixels on even rows of the output, and `G_odd`
          is the gain for green pixels on the odd rows. If a HAL
          does not support a separate gain for even/odd green channels,
          it should use the `G_even` value, and write `G_odd` equal to
          `G_even` in the output result metadata.

          This array is either set by the HAL when the request
          android.colorCorrection.mode is not TRANSFORM_MATRIX, or
          directly by the application in the request when the
          android.colorCorrection.mode is TRANSFORM_MATRIX.

          The output should be the gains actually applied by the HAL to
          the current frame.</details>
        </entry>
      </controls>
      <dynamic>
        <clone entry="android.colorCorrection.transform" kind="controls">
        </clone>
        <clone entry="android.colorCorrection.gains" kind="controls">
        </clone>
      </dynamic>
    </section>
    <section name="control">
      <controls>
        <entry name="aeAntibandingMode" type="byte" visibility="public"
               enum="true">
          <enum>
            <value>OFF
              <notes>
                The camera device will not adjust exposure duration to
                avoid banding problems.
              </notes>
            </value>
            <value>50HZ
              <notes>
                The camera device will adjust exposure duration to
                avoid banding problems with 50Hz illumination sources.
              </notes>
            </value>
            <value>60HZ
              <notes>
                The camera device will adjust exposure duration to
                avoid banding problems with 60Hz illumination
                sources.
              </notes>
            </value>
            <value>AUTO
              <notes>
                The camera device will automatically adapt its
                antibanding routine to the current illumination
                conditions. This is the default.
              </notes>
            </value>
          </enum>
          <description>
          The desired setting for the camera device's auto-exposure
          algorithm's antibanding compensation.
          </description>
          <range>
          android.control.aeAvailableAntibandingModes
          </range>
          <details>
          Some kinds of lighting fixtures, such as some fluorescent
          lights, flicker at the rate of the power supply frequency
          (60Hz or 50Hz, depending on country). While this is
          typically not noticeable to a person, it can be visible to
          a camera device. If a camera sets its exposure time to the
          wrong value, the flicker may become visible in the
          viewfinder as flickering, or in a final captured image as a
          set of variable-brightness bands across the image.

          Therefore, the auto-exposure routines of camera devices
          include antibanding routines that ensure that the chosen
          exposure value will not cause such banding. The choice of
          exposure time depends on the rate of flicker, which the
          camera device can detect automatically, or the expected
          rate can be selected by the application using this
          control.

          A given camera device may not support all of the possible
          options for the antibanding mode. The
          android.control.aeAvailableAntibandingModes key contains
          the available modes for a given camera device.

          The default mode is AUTO, which must be supported by all
          camera devices.

          If manual exposure control is enabled (by setting
          android.control.aeMode or android.control.mode to OFF),
          then this setting has no effect, and the application must
          ensure it selects exposure times that do not cause banding
          issues. The android.statistics.sceneFlicker key can assist
          the application in this.
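          To illustrate the exposure-time constraint described above, here is a
          sketch under the assumption that banding is avoided by exposing for a
          whole number of flicker periods (lights on an F Hz supply flicker at
          2*F Hz); `quantizeExposureNs` is a hypothetical helper, not a HAL
          interface:

```java
// Illustrative sketch (not HAL code): one simple antibanding strategy is
// to quantize the exposure time to a whole number of flicker periods, so
// every frame integrates the same amount of light regardless of phase.
public final class Antibanding {
    /** Rounds exposureNs down to a multiple of the flicker period. */
    public static long quantizeExposureNs(long exposureNs, int mainsHz) {
        // Lights flicker at twice the mains frequency:
        // 10 ms period at 50 Hz, ~8.3 ms at 60 Hz.
        long periodNs = 1_000_000_000L / (2L * mainsHz);
        if (exposureNs < periodNs) {
            return exposureNs; // shorter than one period; leave unchanged
        }
        return (exposureNs / periodNs) * periodNs;
    }

    public static void main(String[] args) {
        // A 24 ms request under 50 Hz mains becomes 20 ms (two periods).
        System.out.println(quantizeExposureNs(24_000_000L, 50));
    }
}
```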
          </details>
          <hal_details>
          For all capture request templates, this field must be set
          to AUTO. AUTO is the only mode that must be supported;
          OFF, 50HZ, and 60HZ are all optional.

          If manual exposure control is enabled (by setting
          android.control.aeMode or android.control.mode to OFF),
          then the exposure values provided by the application must not be
          adjusted for antibanding.
          </hal_details>
          <tag id="BC" />
        </entry>
        <entry name="aeExposureCompensation" type="int32" visibility="public">
          <description>Adjustment to AE target image
          brightness</description>
          <units>count of positive/negative EV steps</units>
          <details>For example, if the EV step is 0.333, '6' will mean an
          exposure compensation of +2 EV; -3 will mean an exposure
          compensation of -1 EV.</details>
          <tag id="BC" />
        </entry>
        <entry name="aeLock" type="byte" visibility="public" enum="true"
               typedef="boolean">
          <enum>
            <value>OFF
              <notes>Autoexposure lock is disabled; the AE algorithm
              is free to update its parameters.</notes></value>
            <value>ON
              <notes>Autoexposure lock is enabled; the AE algorithm
              must not update the exposure and sensitivity parameters
              while the lock is active.</notes></value>
          </enum>
          <description>Whether AE is currently locked to its latest
          calculated values.</description>
          <details>Note that even when AE is locked, the flash may be
          fired if the android.control.aeMode is ON_AUTO_FLASH / ON_ALWAYS_FLASH /
          ON_AUTO_FLASH_REDEYE.

          If AE precapture is triggered (see android.control.aePrecaptureTrigger)
          when AE is already locked, the camera device will not change the exposure time
          (android.sensor.exposureTime) and sensitivity (android.sensor.sensitivity)
          parameters. The flash may be fired if the android.control.aeMode
          is ON_AUTO_FLASH/ON_AUTO_FLASH_REDEYE and the scene is too dark.
          If the android.control.aeMode is ON_ALWAYS_FLASH, the scene may become overexposed.

          See android.control.aeState for AE lock related state transition details.
          </details>
          <tag id="BC" />
        </entry>
        <entry name="aeMode" type="byte" visibility="public" enum="true">
          <enum>
            <value>OFF
              <notes>
                The camera device's autoexposure routine is disabled;
                the application-selected android.sensor.exposureTime,
                android.sensor.sensitivity and
                android.sensor.frameDuration are used by the camera
                device, along with android.flash.* fields, if there's
                a flash unit for this camera device.
              </notes>
            </value>
            <value>ON
              <notes>
                The camera device's autoexposure routine is active,
                with no flash control. The application's values for
                android.sensor.exposureTime,
                android.sensor.sensitivity, and
                android.sensor.frameDuration are ignored. The
                application has control over the various
                android.flash.* fields.
              </notes>
            </value>
            <value>ON_AUTO_FLASH
              <notes>
                Like ON, except that the camera device also controls
                the camera's flash unit, firing it in low-light
                conditions. The flash may be fired during a
                precapture sequence (triggered by
                android.control.aePrecaptureTrigger) and may be fired
                for captures for which the
                android.control.captureIntent field is set to
                STILL_CAPTURE.
              </notes>
            </value>
            <value>ON_ALWAYS_FLASH
              <notes>
                Like ON, except that the camera device also controls
                the camera's flash unit, always firing it for still
                captures. The flash may be fired during a precapture
                sequence (triggered by
                android.control.aePrecaptureTrigger) and will always
                be fired for captures for which the
                android.control.captureIntent field is set to
                STILL_CAPTURE.
              </notes>
            </value>
            <value>ON_AUTO_FLASH_REDEYE
              <notes>
                Like ON_AUTO_FLASH, but with automatic red eye
                reduction.
                If deemed necessary by the camera device,
                a red eye reduction flash will fire during the
                precapture sequence.
              </notes>
            </value>
          </enum>
          <description>The desired mode for the camera device's
          auto-exposure routine.</description>
          <range>android.control.aeAvailableModes</range>
          <details>
          This control is only effective if android.control.mode is
          AUTO.

          When set to any of the ON modes, the camera device's
          auto-exposure routine is enabled, overriding the
          application's selected exposure time, sensor sensitivity,
          and frame duration (android.sensor.exposureTime,
          android.sensor.sensitivity, and
          android.sensor.frameDuration). If one of the FLASH modes
          is selected, the camera device's flash unit controls are
          also overridden.

          The FLASH modes are only available if the camera device
          has a flash unit (android.flash.info.available is `true`).

          If flash TORCH mode is desired, this field must be set to
          ON or OFF, and android.flash.mode set to TORCH.

          When set to any of the ON modes, the values chosen by the
          camera device auto-exposure routine for the overridden
          fields for a given capture will be available in its
          CaptureResult.
          </details>
          <tag id="BC" />
        </entry>
        <entry name="aeRegions" type="int32" visibility="public"
               container="array">
          <array>
            <size>5</size>
            <size>area_count</size>
          </array>
          <description>List of areas to use for
          metering.</description>
          <range>`area_count <= android.control.maxRegions[0]`</range>
          <details>Each area is a rectangle plus weight: xmin, ymin,
          xmax, ymax, weight. The rectangle is defined to be inclusive of the
          specified coordinates.
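          The rectangle-plus-weight layout, together with the crop-region
          clipping the HAL performs on regions that extend outside
          android.scaler.cropRegion, can be sketched as follows (illustrative
          only; `clip` is a hypothetical helper, not a camera API):

```java
// Hypothetical sketch: clipping one metering region, stored as
// [xmin, ymin, xmax, ymax, weight] with inclusive coordinates, to the
// bounds of the current crop region.
public final class MeteringRegion {
    /** Returns the clipped region, or null if there is no overlap. */
    public static int[] clip(int[] region, int cropXmin, int cropYmin,
                             int cropXmax, int cropYmax) {
        int xmin = Math.max(region[0], cropXmin);
        int ymin = Math.max(region[1], cropYmin);
        int xmax = Math.min(region[2], cropXmax);
        int ymax = Math.min(region[3], cropYmax);
        if (xmin > xmax || ymin > ymax) {
            return null; // region lies entirely outside the crop region
        }
        // The weight is carried through unchanged.
        return new int[] {xmin, ymin, xmax, ymax, region[4]};
    }
}
```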

          The coordinate system is based on the active pixel array,
          with (0,0) being the top-left pixel in the active pixel array, and
          (android.sensor.info.activeArraySize.width - 1,
          android.sensor.info.activeArraySize.height - 1) being the
          bottom-right pixel in the active pixel array. The weight
          should be nonnegative.

          If all regions have 0 weight, then no specific metering area
          needs to be used by the HAL. If the metering region is
          outside the current android.scaler.cropRegion, the HAL
          should ignore the sections outside the region and output the
          used sections in the frame metadata.</details>
          <tag id="BC" />
        </entry>
        <entry name="aeTargetFpsRange" type="int32" visibility="public"
               container="array">
          <array>
            <size>2</size>
          </array>
          <description>Range over which fps can be adjusted to
          maintain exposure</description>
          <range>android.control.aeAvailableTargetFpsRanges</range>
          <details>Only constrains the AE algorithm, not manual control
          of android.sensor.exposureTime</details>
          <tag id="BC" />
        </entry>
        <entry name="aePrecaptureTrigger" type="byte" visibility="public"
               enum="true">
          <enum>
            <value>IDLE
              <notes>The trigger is idle.</notes>
            </value>
            <value>START
              <notes>The precapture metering sequence will be started
              by the camera device. The exact effect of the precapture
              trigger depends on the current AE mode and state.</notes>
            </value>
          </enum>
          <description>Whether the camera device will trigger a precapture
          metering sequence when it processes this request.</description>
          <details>This entry is normally set to IDLE, or is not
          included at all in the request settings. When included and
          set to START, the camera device will trigger the autoexposure
          precapture metering sequence.

          The effect of an AE precapture trigger depends on the current
          AE mode and state; see android.control.aeState for AE precapture
          state transition details.</details>
          <tag id="BC" />
        </entry>
        <entry name="afMode" type="byte" visibility="public" enum="true">
          <enum>
            <value>OFF
              <notes>The auto-focus routine does not control the lens;
              android.lens.focusDistance is controlled by the
              application.</notes></value>
            <value>AUTO
              <notes>
              If the lens is not fixed focus.

              Use android.lens.info.minimumFocusDistance to determine if the lens
              is fixed-focus. In this mode, the lens does not move unless
              the autofocus trigger action is called. When that trigger
              is activated, AF must transition to ACTIVE_SCAN, then to
              the outcome of the scan (FOCUSED or NOT_FOCUSED).

              Triggering AF_CANCEL resets the lens position to default,
              and sets the AF state to INACTIVE.</notes></value>
            <value>MACRO
              <notes>In this mode, the lens does not move unless the
              autofocus trigger action is called.

              When that trigger is activated, AF must transition to
              ACTIVE_SCAN, then to the outcome of the scan (FOCUSED or
              NOT_FOCUSED). Triggering cancel AF resets the lens
              position to default, and sets the AF state to
              INACTIVE.</notes></value>
            <value>CONTINUOUS_VIDEO
              <notes>In this mode, the AF algorithm modifies the lens
              position continually to attempt to provide a
              constantly-in-focus image stream.

              The focusing behavior should be suitable for good quality
              video recording; typically this means slower focus
              movement and no overshoots. When the AF trigger is not
              involved, the AF algorithm should start in INACTIVE state,
              and then transition into PASSIVE_SCAN and PASSIVE_FOCUSED
              states as appropriate.
              When the AF trigger is activated,
              the algorithm should immediately transition into
              AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the
              lens position until a cancel AF trigger is received.

              Once cancel is received, the algorithm should transition
              back to INACTIVE and resume passive scan. Note that this
              behavior is not identical to CONTINUOUS_PICTURE, since an
              ongoing PASSIVE_SCAN must immediately be
              canceled.</notes></value>
            <value>CONTINUOUS_PICTURE
              <notes>In this mode, the AF algorithm modifies the lens
              position continually to attempt to provide a
              constantly-in-focus image stream.

              The focusing behavior should be suitable for still image
              capture; typically this means focusing as fast as
              possible. When the AF trigger is not involved, the AF
              algorithm should start in INACTIVE state, and then
              transition into PASSIVE_SCAN and PASSIVE_FOCUSED states as
              appropriate as it attempts to maintain focus. When the AF
              trigger is activated, the algorithm should finish its
              PASSIVE_SCAN if active, and then transition into
              AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the
              lens position until a cancel AF trigger is received.

              When the AF cancel trigger is activated, the algorithm
              should transition back to INACTIVE and then act as if it
              has just been started.</notes></value>
            <value>EDOF
              <notes>Extended depth of field (digital focus). The AF
              trigger is ignored, and the AF state should always be
              INACTIVE.</notes></value>
          </enum>
          <description>Whether AF is currently enabled, and what
          mode it is set to.</description>
          <range>android.control.afAvailableModes</range>
          <details>Only effective if android.control.mode = AUTO.

          If the lens is controlled by the camera device auto-focus algorithm,
          the camera device will report the current AF status in android.control.afState
          in result metadata.</details>
          <tag id="BC" />
        </entry>
        <entry name="afRegions" type="int32" visibility="public"
               container="array">
          <array>
            <size>5</size>
            <size>area_count</size>
          </array>
          <description>List of areas to use for focus
          estimation.</description>
          <range>`area_count <= android.control.maxRegions[2]`</range>
          <details>Each area is a rectangle plus weight: xmin, ymin,
          xmax, ymax, weight. The rectangle is defined to be inclusive of the
          specified coordinates.

          The coordinate system is based on the active pixel array,
          with (0,0) being the top-left pixel in the active pixel array, and
          (android.sensor.info.activeArraySize.width - 1,
          android.sensor.info.activeArraySize.height - 1) being the
          bottom-right pixel in the active pixel array. The weight
          should be nonnegative.

          If all regions have 0 weight, then no specific focus area
          needs to be used by the HAL. If the focusing region is
          outside the current android.scaler.cropRegion, the HAL
          should ignore the sections outside the region and output the
          used sections in the frame metadata.</details>
          <tag id="BC" />
        </entry>
        <entry name="afTrigger" type="byte" visibility="public" enum="true">
          <enum>
            <value>IDLE
              <notes>The trigger is idle.</notes>
            </value>
            <value>START
              <notes>Autofocus will trigger now.</notes>
            </value>
            <value>CANCEL
              <notes>Autofocus will return to its initial
              state, and cancel any currently active trigger.</notes>
            </value>
          </enum>
          <description>
          Whether the camera device will trigger autofocus for this request.
          </description>
          <details>This entry is normally set to IDLE, or is not
          included at all in the request settings.

          When included and set to START, the camera device will trigger the
          autofocus algorithm. If autofocus is disabled, this trigger has no effect.

          When set to CANCEL, the camera device will cancel any active trigger,
          and return to its initial AF state.

          See android.control.afState for what that means for each AF mode.
          </details>
          <tag id="BC" />
        </entry>
        <entry name="awbLock" type="byte" visibility="public" enum="true"
               typedef="boolean">
          <enum>
            <value>OFF
              <notes>Auto-whitebalance lock is disabled; the AWB
              algorithm is free to update its parameters if in AUTO
              mode.</notes></value>
            <value>ON
              <notes>Auto-whitebalance lock is enabled; the AWB
              algorithm must not update its parameters while the lock
              is active.</notes></value>
          </enum>
          <description>Whether AWB is currently locked to its
          latest calculated values.</description>
          <details>Note that AWB lock is only meaningful for AUTO
          mode; in other modes, AWB is already fixed to a specific
          setting.</details>
          <tag id="BC" />
        </entry>
        <entry name="awbMode" type="byte" visibility="public" enum="true">
          <enum>
            <value>OFF
              <notes>
              The camera device's auto white balance routine is disabled;
              the application-selected color transform matrix
              (android.colorCorrection.transform) and gains
              (android.colorCorrection.gains) are used by the camera
              device for manual white balance control.
              </notes>
            </value>
            <value>AUTO
              <notes>
              The camera device's auto white balance routine is active;
              the application's values for android.colorCorrection.transform
              and android.colorCorrection.gains are ignored.
              </notes>
            </value>
            <value>INCANDESCENT
              <notes>
              The camera device's auto white balance routine is disabled;
              the camera device uses incandescent light as the assumed scene
              illumination for white balance.
              While the exact white balance
              transforms are up to the camera device, they will approximately
              match the CIE standard illuminant A.
              </notes>
            </value>
            <value>FLUORESCENT
              <notes>
              The camera device's auto white balance routine is disabled;
              the camera device uses fluorescent light as the assumed scene
              illumination for white balance. While the exact white balance
              transforms are up to the camera device, they will approximately
              match the CIE standard illuminant F2.
              </notes>
            </value>
            <value>WARM_FLUORESCENT
              <notes>
              The camera device's auto white balance routine is disabled;
              the camera device uses warm fluorescent light as the assumed scene
              illumination for white balance. While the exact white balance
              transforms are up to the camera device, they will approximately
              match the CIE standard illuminant F4.
              </notes>
            </value>
            <value>DAYLIGHT
              <notes>
              The camera device's auto white balance routine is disabled;
              the camera device uses daylight light as the assumed scene
              illumination for white balance. While the exact white balance
              transforms are up to the camera device, they will approximately
              match the CIE standard illuminant D65.
              </notes>
            </value>
            <value>CLOUDY_DAYLIGHT
              <notes>
              The camera device's auto white balance routine is disabled;
              the camera device uses cloudy daylight light as the assumed scene
              illumination for white balance.
              </notes>
            </value>
            <value>TWILIGHT
              <notes>
              The camera device's auto white balance routine is disabled;
              the camera device uses twilight light as the assumed scene
              illumination for white balance.
              </notes>
            </value>
            <value>SHADE
              <notes>
              The camera device's auto white balance routine is disabled;
              the camera device uses shade light as the assumed scene
              illumination for white balance.
              </notes>
            </value>
          </enum>
          <description>Whether AWB is currently setting the color
          transform fields, and what its illumination target
          is</description>
          <range>android.control.awbAvailableModes</range>
          <details>
          This control is only effective if android.control.mode is AUTO.

          When set to the AUTO mode, the camera device's auto white balance
          routine is enabled, overriding the application's selected
          android.colorCorrection.transform, android.colorCorrection.gains and
          android.colorCorrection.mode.

          When set to the OFF mode, the camera device's auto white balance
          routine is disabled. The application manually controls the white
          balance by android.colorCorrection.transform, android.colorCorrection.gains
          and android.colorCorrection.mode.

          When set to any other mode, the camera device's auto white balance
          routine is disabled. The camera device uses each particular illumination
          target for white balance adjustment.
          </details>
          <tag id="BC" />
          <tag id="AWB" />
        </entry>
        <entry name="awbRegions" type="int32" visibility="public"
               container="array">
          <array>
            <size>5</size>
            <size>area_count</size>
          </array>
          <description>List of areas to use for illuminant
          estimation.</description>
          <range>`area_count <= android.control.maxRegions[1]`</range>
          <details>Only used in AUTO mode.

          Each area is a rectangle plus weight: xmin, ymin,
          xmax, ymax, weight. The rectangle is defined to be inclusive of the
          specified coordinates.

          The coordinate system is based on the active pixel array,
          with (0,0) being the top-left pixel in the active pixel array, and
          (android.sensor.info.activeArraySize.width - 1,
          android.sensor.info.activeArraySize.height - 1) being the
          bottom-right pixel in the active pixel array. The weight
          should be nonnegative.

          If all regions have 0 weight, then no specific metering area
          needs to be used by the HAL. If the metering region is
          outside the current android.scaler.cropRegion, the HAL
          should ignore the sections outside the region and output the
          used sections in the frame metadata.
          </details>
          <tag id="BC" />
        </entry>
        <entry name="captureIntent" type="byte" visibility="public" enum="true">
          <enum>
            <value>CUSTOM
              <notes>This request doesn't fall into the other
              categories. Default to preview-like
              behavior.</notes></value>
            <value>PREVIEW
              <notes>This request is for a preview-like usecase. The
              precapture trigger may be used to start off a metering
              with flash sequence.</notes></value>
            <value>STILL_CAPTURE
              <notes>This request is for a still capture-type
              usecase.</notes></value>
            <value>VIDEO_RECORD
              <notes>This request is for a video recording
              usecase.</notes></value>
            <value>VIDEO_SNAPSHOT
              <notes>This request is for a video snapshot (still
              image while recording video) usecase.</notes></value>
            <value>ZERO_SHUTTER_LAG
              <notes>This request is for a ZSL usecase; the
              application will stream full-resolution images and
              reprocess one or several later for a final
              capture.</notes></value>
          </enum>
          <description>Information to the camera device 3A (auto-exposure,
          auto-focus, auto-white balance) routines about the purpose
          of this capture, to help the camera device to decide on the optimal 3A
          strategy.</description>
          <range>All must be supported</range>
          <details>This control is only effective if `android.control.mode != OFF`
          and any 3A routine is active.</details>
          <tag id="BC" />
        </entry>
        <entry name="effectMode" type="byte" visibility="public" enum="true">
          <enum>
            <value>OFF
              <notes>
              No color effect will be applied.
              </notes>
            </value>
            <value optional="true">MONO
              <notes>
              A "monocolor" effect where the image is mapped into
              a single color. This will typically be grayscale.
              </notes>
            </value>
            <value optional="true">NEGATIVE
              <notes>
              A "photo-negative" effect where the image's colors
              are inverted.
              </notes>
            </value>
            <value optional="true">SOLARIZE
              <notes>
              A "solarisation" effect (Sabattier effect) where the
              image is wholly or partially reversed in
              tone.
              </notes>
            </value>
            <value optional="true">SEPIA
              <notes>
              A "sepia" effect where the image is mapped into warm
              gray, red, and brown tones.
              </notes>
            </value>
            <value optional="true">POSTERIZE
              <notes>
              A "posterization" effect where the image uses
              discrete regions of tone rather than a continuous
              gradient of tones.
              </notes>
            </value>
            <value optional="true">WHITEBOARD
              <notes>
              A "whiteboard" effect where the image is typically displayed
              as regions of white, with black or grey details.
              </notes>
            </value>
            <value optional="true">BLACKBOARD
              <notes>
              A "blackboard" effect where the image is typically displayed
              as regions of black, with white or grey details.
              </notes>
            </value>
            <value optional="true">AQUA
              <notes>
              An "aqua" effect where a blue hue is added to the image.
              </notes>
            </value>
          </enum>
          <description>A special color effect to apply.</description>
          <range>android.control.availableEffects</range>
          <details>
          When this mode is set, a color effect will be applied
          to images produced by the camera device. The interpretation
          and implementation of these color effects is left to the
          implementor of the camera device, and should not be
          depended on to be consistent (or present) across all
          devices.

          A color effect will only be applied if
          android.control.mode != OFF.
</details>
<tag id="BC" />
</entry>
<entry name="mode" type="byte" visibility="public" enum="true">
  <enum>
    <value>OFF
      <notes>Full application control of pipeline. All 3A
      routines are disabled; no other settings in
      android.control.* have any effect.</notes></value>
    <value>AUTO
      <notes>Use settings for each individual 3A routine.
      Manual control of capture parameters is disabled. All
      controls in android.control.* besides sceneMode take
      effect.</notes></value>
    <value>USE_SCENE_MODE
      <notes>Use a specific scene mode. Enabling this disables
      the control.aeMode, control.awbMode and control.afMode
      controls; the HAL must ignore those settings while
      USE_SCENE_MODE is active (except for FACE_PRIORITY
      scene mode). Other control entries are still active.
      This setting can only be used if availableSceneModes !=
      UNSUPPORTED.</notes></value>
    <value>OFF_KEEP_STATE
      <notes>Same as OFF mode, except that this capture will not be
      used by the camera device's background auto-exposure, auto-white balance and
      auto-focus algorithms to update their statistics.</notes></value>
  </enum>
  <description>Overall mode of 3A control
  routines</description>
  <range>All must be supported</range>
  <details>High-level 3A control. When set to OFF, all 3A control
  by the camera device is disabled. The application must set the fields for
  capture parameters itself.

  When set to AUTO, the individual algorithm controls in
  android.control.* are in effect, such as android.control.afMode.

  When set to USE_SCENE_MODE, the individual controls in
  android.control.* are mostly disabled, and the camera device implements
  one of the scene mode settings (such as ACTION, SUNSET, or PARTY)
  as it wishes. The camera device scene mode 3A settings are provided by
  android.control.sceneModeOverrides.
When set to OFF_KEEP_STATE, behavior is similar to OFF mode; the only difference
is that this frame will not be used by the camera device's background 3A statistics
update, as if this frame were never captured. This mode can be used in a scenario
where the application doesn't want a manual-control 3A capture to affect
the subsequent automatic 3A capture results.
</details>
<tag id="BC" />
</entry>
<entry name="sceneMode" type="byte" visibility="public" enum="true">
  <enum>
    <value id="0">DISABLED
      <notes>
      Indicates that no scene modes are set for a given capture request.
      </notes>
    </value>
    <value>FACE_PRIORITY
      <notes>If face detection support exists, use face
      detection data for auto-focus, auto-white balance, and
      auto-exposure routines. If face detection statistics are
      disabled (i.e. android.statistics.faceDetectMode is set to OFF),
      this should still operate correctly (but will not return
      face detection statistics to the framework).

      Unlike the other scene modes, android.control.aeMode,
      android.control.awbMode, and android.control.afMode
      remain active when FACE_PRIORITY is set.
      </notes>
    </value>
    <value optional="true">ACTION
      <notes>
      Optimized for photos of quickly moving objects.
      Similar to SPORTS.
      </notes>
    </value>
    <value optional="true">PORTRAIT
      <notes>
      Optimized for still photos of people.
      </notes>
    </value>
    <value optional="true">LANDSCAPE
      <notes>
      Optimized for photos of distant macroscopic objects.
      </notes>
    </value>
    <value optional="true">NIGHT
      <notes>
      Optimized for low-light settings.
      </notes>
    </value>
    <value optional="true">NIGHT_PORTRAIT
      <notes>
      Optimized for still photos of people in low-light
      settings.
      </notes>
    </value>
    <value optional="true">THEATRE
      <notes>
      Optimized for dim, indoor settings where flash must
      remain off.
</notes>
    </value>
    <value optional="true">BEACH
      <notes>
      Optimized for bright, outdoor beach settings.
      </notes>
    </value>
    <value optional="true">SNOW
      <notes>
      Optimized for bright, outdoor settings containing snow.
      </notes>
    </value>
    <value optional="true">SUNSET
      <notes>
      Optimized for scenes of the setting sun.
      </notes>
    </value>
    <value optional="true">STEADYPHOTO
      <notes>
      Optimized to avoid blurry photos due to small amounts of
      device motion (for example: due to hand shake).
      </notes>
    </value>
    <value optional="true">FIREWORKS
      <notes>
      Optimized for nighttime photos of fireworks.
      </notes>
    </value>
    <value optional="true">SPORTS
      <notes>
      Optimized for photos of quickly moving people.
      Similar to ACTION.
      </notes>
    </value>
    <value optional="true">PARTY
      <notes>
      Optimized for dim, indoor settings with multiple moving
      people.
      </notes>
    </value>
    <value optional="true">CANDLELIGHT
      <notes>
      Optimized for dim settings where the main light source
      is a flame.
      </notes>
    </value>
    <value optional="true">BARCODE
      <notes>
      Optimized for accurately capturing a photo of a barcode,
      for use by camera applications that wish to read the
      barcode value.
      </notes>
    </value>
  </enum>
  <description>
  A camera mode optimized for conditions typical in a particular
  capture setting.
  </description>
  <range>android.control.availableSceneModes</range>
  <details>
  This is the mode that is active when
  `android.control.mode == USE_SCENE_MODE`. Aside from FACE_PRIORITY,
  these modes will disable android.control.aeMode,
  android.control.awbMode, and android.control.afMode while in use.

  The interpretation and implementation of these scene modes is left
  to the implementor of the camera device.
Their behavior will not be 1050 consistent across all devices, and any given device may only implement 1051 a subset of these modes. 1052 </details> 1053 <hal_details> 1054 HAL implementations that include scene modes are expected to provide 1055 the per-scene settings to use for android.control.aeMode, 1056 android.control.awbMode, and android.control.afMode in 1057 android.control.sceneModeOverrides. 1058 </hal_details> 1059 <tag id="BC" /> 1060 </entry> 1061 <entry name="videoStabilizationMode" type="byte" visibility="public" 1062 enum="true" typedef="boolean"> 1063 <enum> 1064 <value>OFF</value> 1065 <value>ON</value> 1066 </enum> 1067 <description>Whether video stabilization is 1068 active</description> 1069 <details>If enabled, video stabilization can modify the 1070 android.scaler.cropRegion to keep the video stream 1071 stabilized</details> 1072 <tag id="BC" /> 1073 </entry> 1074 </controls> 1075 <static> 1076 <entry name="aeAvailableAntibandingModes" type="byte" visibility="public" 1077 type_notes="list of enums" container="array"> 1078 <array> 1079 <size>n</size> 1080 </array> 1081 <description> 1082 The set of auto-exposure antibanding modes that are 1083 supported by this camera device. 1084 </description> 1085 <details> 1086 Not all of the auto-exposure anti-banding modes may be 1087 supported by a given camera device. This field lists the 1088 valid anti-banding modes that the application may request 1089 for this camera device; they must include AUTO. 1090 </details> 1091 </entry> 1092 <entry name="aeAvailableModes" type="byte" visibility="public" 1093 type_notes="list of enums" container="array"> 1094 <array> 1095 <size>n</size> 1096 </array> 1097 <description> 1098 The set of auto-exposure modes that are supported by this 1099 camera device. 1100 </description> 1101 <details> 1102 Not all the auto-exposure modes may be supported by a 1103 given camera device, especially if no flash unit is 1104 available. 
This entry lists the valid modes for
android.control.aeMode for this camera device.

All camera devices support ON, and all camera devices with
flash units support ON_AUTO_FLASH and
ON_ALWAYS_FLASH.

Full-capability camera devices always support OFF mode,
which enables application control of camera exposure time,
sensitivity, and frame duration.
</details>
<tag id="BC" />
</entry>
<entry name="aeAvailableTargetFpsRanges" type="int32" visibility="public"
    type_notes="list of pairs of frame rates"
    container="array">
  <array>
    <size>2</size>
    <size>n</size>
  </array>
  <description>List of frame rate ranges supported by the
  AE algorithm/hardware</description>
</entry>
<entry name="aeCompensationRange" type="int32" visibility="public"
    container="array">
  <array>
    <size>2</size>
  </array>
  <description>Maximum and minimum exposure compensation
  settings, in counts of
  android.control.aeCompensationStep</description>
  <range>At least (-2,2)/(exp compensation step
  size)</range>
  <tag id="BC" />
</entry>
<entry name="aeCompensationStep" type="rational" visibility="public">
  <description>Smallest step by which exposure compensation
  can be changed</description>
  <range>&lt;= 1/2</range>
  <tag id="BC" />
</entry>
<entry name="afAvailableModes" type="byte" visibility="public"
    type_notes="List of enums" container="array">
  <array>
    <size>n</size>
  </array>
  <description>List of AF modes that can be
  selected with android.control.afMode.</description>
  <details>
  Not all the auto-focus modes may be supported by a
  given camera device. This entry lists the valid modes for
  android.control.afMode for this camera device.
All camera devices will support OFF mode, and all camera devices with
adjustable focuser units (`android.lens.info.minimumFocusDistance > 0`)
will support AUTO mode.
</details>
<tag id="BC" />
</entry>
<entry name="availableEffects" type="byte" visibility="public"
    type_notes="List of enums (android.control.effectMode)." container="array">
  <array>
    <size>n</size>
  </array>
  <description>
  List containing the subset of color effects
  specified in android.control.effectMode that is supported by
  this device.
  </description>
  <range>
  Any subset of enums from those specified in
  android.control.effectMode. OFF must be included in any subset.
  </range>
  <details>
  This list contains the color effect modes that can be applied to
  images produced by the camera device. Only modes that have
  been fully implemented for the current device may be included here.
  Implementations are not expected to be consistent across all devices.
  If no color effect modes are available for a device, this list
  should contain only OFF.

  A color effect will only be applied if
  android.control.mode != OFF.
  </details>
  <tag id="BC" />
</entry>
<entry name="availableSceneModes" type="byte" visibility="public"
    type_notes="List of enums (android.control.sceneMode)."
    container="array">
  <array>
    <size>n</size>
  </array>
  <description>
  List containing a subset of the scene modes
  specified in android.control.sceneMode.
  </description>
  <range>
  Any subset of the enums specified in android.control.sceneMode,
  not including DISABLED, or solely DISABLED if no
  scene modes are available. FACE_PRIORITY must be included
  if face detection is supported (i.e. `android.statistics.info.maxFaceCount > 0`).
  </range>
  <details>
  This list contains scene modes that can be set for the camera device.
1208 Only scene modes that have been fully implemented for the 1209 camera device may be included here. Implementations are not expected 1210 to be consistent across all devices. If no scene modes are supported 1211 by the camera device, this will be set to `[DISABLED]`. 1212 </details> 1213 <tag id="BC" /> 1214 </entry> 1215 <entry name="availableVideoStabilizationModes" type="byte" 1216 visibility="public" type_notes="List of enums." container="array"> 1217 <array> 1218 <size>n</size> 1219 </array> 1220 <description>List of video stabilization modes that can 1221 be supported</description> 1222 <range>OFF must be included</range> 1223 <tag id="BC" /> 1224 </entry> 1225 <entry name="awbAvailableModes" type="byte" visibility="public" 1226 type_notes="List of enums" 1227 container="array"> 1228 <array> 1229 <size>n</size> 1230 </array> 1231 <description>The set of auto-white-balance modes (android.control.awbMode) 1232 that are supported by this camera device.</description> 1233 <details> 1234 Not all the auto-white-balance modes may be supported by a 1235 given camera device. This entry lists the valid modes for 1236 android.control.awbMode for this camera device. 1237 1238 All camera devices will support ON mode. 1239 1240 Full-capability camera devices will always support OFF mode, 1241 which enables application control of white balance, by using 1242 android.colorCorrection.transform and android.colorCorrection.gains 1243 (android.colorCorrection.mode must be set to TRANSFORM_MATRIX). 
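
As a rough illustration of what the application takes over in OFF mode, the
manual white-balance math is per-channel gains followed by a 3x3 color
transform. This plain-Java sketch is only illustrative: the real camera device
applies the gains to raw sensor data, with separate gains for the two green
Bayer channels, before the transform maps into the output color space.

```java
// Illustrative-only manual white-balance math: gains first, then a
// row-major 3x3 color transform, applied to a single RGB triple.
public class ManualWbSketch {
    // rgb and gains are ordered {R, G, B}; transform is row-major 3x3.
    public static float[] apply(float[] rgb, float[] gains, float[][] transform) {
        float[] balanced = new float[3];
        for (int c = 0; c != 3; c++) {
            balanced[c] = rgb[c] * gains[c];
        }
        float[] out = new float[3];
        for (int row = 0; row != 3; row++) {
            float sum = 0f;
            for (int col = 0; col != 3; col++) {
                sum += transform[row][col] * balanced[col];
            }
            out[row] = sum;
        }
        return out;
    }

    public static void main(String[] args) {
        float[][] identity = { {1f, 0f, 0f}, {0f, 1f, 0f}, {0f, 0f, 1f} };
        // With an identity transform, only the gains take effect.
        float[] result = apply(new float[] {0.5f, 0.5f, 0.5f},
                               new float[] {2.0f, 1.0f, 1.5f}, identity);
        System.out.println(result[0] + " " + result[1] + " " + result[2]);
    }
}
```
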
</details>
<tag id="BC" />
</entry>
<entry name="maxRegions" type="int32" visibility="public" container="array">
  <array>
    <size>3</size>
  </array>
  <description>
  List of the maximum number of regions that can be used for metering in
  auto-exposure (AE), auto-white balance (AWB), and auto-focus (AF);
  this corresponds to the maximum number of elements in
  android.control.aeRegions, android.control.awbRegions,
  and android.control.afRegions.
  </description>
  <range>
  Value must be &gt;= 0 for each element. For full-capability devices
  this value must be &gt;= 1 for AE and AF. The order of the elements is:
  `(AE, AWB, AF)`.</range>
  <tag id="BC" />
</entry>
<entry name="sceneModeOverrides" type="byte" visibility="system"
    container="array">
  <array>
    <size>3</size>
    <size>length(availableSceneModes)</size>
  </array>
  <description>
  Ordered list of auto-exposure, auto-white balance, and auto-focus
  settings to use with each available scene mode.
  </description>
  <range>
  For each available scene mode, the list must contain three
  entries containing the android.control.aeMode,
  android.control.awbMode, and android.control.afMode values used
  by the camera device. The entry order is `(aeMode, awbMode, afMode)`,
  where aeMode has the lowest index position.
  </range>
  <details>
  When a scene mode is enabled, the camera device is expected
  to override android.control.aeMode, android.control.awbMode,
  and android.control.afMode with its preferred settings for
  that scene mode.

  The order of this list matches that of availableSceneModes,
  with 3 entries for each mode.
The overrides listed
for FACE_PRIORITY are ignored, since for that
mode the application-set android.control.aeMode,
android.control.awbMode, and android.control.afMode values are
used instead, matching the behavior when android.control.mode
is set to AUTO. It is recommended that the FACE_PRIORITY
overrides be set to 0.

For example, if availableSceneModes contains
`(FACE_PRIORITY, ACTION, NIGHT)`, then the camera framework
expects sceneModeOverrides to have 9 entries formatted like:
`(0, 0, 0, ON_AUTO_FLASH, AUTO, CONTINUOUS_PICTURE,
ON_AUTO_FLASH, INCANDESCENT, AUTO)`.
</details>
<hal_details>
To maintain backward compatibility, this list will be made available
in the static metadata of the camera service. The camera service will
use these values to set android.control.aeMode,
android.control.awbMode, and android.control.afMode when using a scene
mode other than FACE_PRIORITY.
</hal_details>
<tag id="BC" />
</entry>
</static>
<dynamic>
<entry name="aePrecaptureId" type="int32" visibility="hidden">
  <description>The ID sent with the latest
  CAMERA2_TRIGGER_PRECAPTURE_METERING call</description>
  <range>**Deprecated**. Do not use.</range>
  <details>Must be 0 if no
  CAMERA2_TRIGGER_PRECAPTURE_METERING trigger has been received yet
  by the HAL. Always updated even if the AE algorithm ignores the
  trigger.</details>
</entry>
<clone entry="android.control.aeMode" kind="controls">
</clone>
<clone entry="android.control.aeRegions" kind="controls">
</clone>
<entry name="aeState" type="byte" visibility="public" enum="true">
  <enum>
    <value>INACTIVE
      <notes>AE is off or recently reset.
When a camera device is opened, it starts in 1330 this state.</notes></value> 1331 <value>SEARCHING 1332 <notes>AE doesn't yet have a good set of control values 1333 for the current scene.</notes></value> 1334 <value>CONVERGED 1335 <notes>AE has a good set of control values for the 1336 current scene.</notes></value> 1337 <value>LOCKED 1338 <notes>AE has been locked.</notes></value> 1339 <value>FLASH_REQUIRED 1340 <notes>AE has a good set of control values, but flash 1341 needs to be fired for good quality still 1342 capture.</notes></value> 1343 <value>PRECAPTURE 1344 <notes>AE has been asked to do a precapture sequence 1345 (through the android.control.aePrecaptureTrigger START), 1346 and is currently executing it. Once PRECAPTURE 1347 completes, AE will transition to CONVERGED or 1348 FLASH_REQUIRED as appropriate.</notes></value> 1349 </enum> 1350 <description>Current state of AE algorithm</description> 1351 <details>Switching between or enabling AE modes (android.control.aeMode) always 1352 resets the AE state to INACTIVE. Similarly, switching between android.control.mode, 1353 or android.control.sceneMode if `android.control.mode == USE_SCENE_MODE` resets all 1354 the algorithm states to INACTIVE. 1355 1356 The camera device can do several state transitions between two results, if it is 1357 allowed by the state transition table. For example: INACTIVE may never actually be 1358 seen in a result. 1359 1360 The state in the result is the state for this image (in sync with this image): if 1361 AE state becomes CONVERGED, then the image data associated with this result should 1362 be good to use. 1363 1364 Below are state transition tables for different AE modes. 
1365 1366 State | Transition Cause | New State | Notes 1367 :------------:|:----------------:|:---------:|:-----------------------: 1368 INACTIVE | | INACTIVE | Camera device auto exposure algorithm is disabled 1369 1370 When android.control.aeMode is AE_MODE_ON_*: 1371 1372 State | Transition Cause | New State | Notes 1373 :-------------:|:--------------------------------------------:|:--------------:|:-----------------: 1374 INACTIVE | Camera device initiates AE scan | SEARCHING | Values changing 1375 INACTIVE | android.control.aeLock is ON | LOCKED | Values locked 1376 SEARCHING | Camera device finishes AE scan | CONVERGED | Good values, not changing 1377 SEARCHING | Camera device finishes AE scan | FLASH_REQUIRED | Converged but too dark w/o flash 1378 SEARCHING | android.control.aeLock is ON | LOCKED | Values locked 1379 CONVERGED | Camera device initiates AE scan | SEARCHING | Values changing 1380 CONVERGED | android.control.aeLock is ON | LOCKED | Values locked 1381 FLASH_REQUIRED | Camera device initiates AE scan | SEARCHING | Values changing 1382 FLASH_REQUIRED | android.control.aeLock is ON | LOCKED | Values locked 1383 LOCKED | android.control.aeLock is OFF | SEARCHING | Values not good after unlock 1384 LOCKED | android.control.aeLock is OFF | CONVERGED | Values good after unlock 1385 LOCKED | android.control.aeLock is OFF | FLASH_REQUIRED | Exposure good, but too dark 1386 PRECAPTURE | Sequence done. android.control.aeLock is OFF | CONVERGED | Ready for high-quality capture 1387 PRECAPTURE | Sequence done. 
android.control.aeLock is ON | LOCKED | Ready for high-quality capture 1388 Any state | android.control.aePrecaptureTrigger is START | PRECAPTURE | Start AE precapture metering sequence 1389 </details> 1390 </entry> 1391 <clone entry="android.control.afMode" kind="controls"> 1392 </clone> 1393 <clone entry="android.control.afRegions" kind="controls"> 1394 </clone> 1395 <entry name="afState" type="byte" visibility="public" enum="true"> 1396 <enum> 1397 <value>INACTIVE 1398 <notes>AF off or has not yet tried to scan/been asked 1399 to scan. When a camera device is opened, it starts in 1400 this state.</notes></value> 1401 <value>PASSIVE_SCAN 1402 <notes>if CONTINUOUS_* modes are supported. AF is 1403 currently doing an AF scan initiated by a continuous 1404 autofocus mode</notes></value> 1405 <value>PASSIVE_FOCUSED 1406 <notes>if CONTINUOUS_* modes are supported. AF currently 1407 believes it is in focus, but may restart scanning at 1408 any time.</notes></value> 1409 <value>ACTIVE_SCAN 1410 <notes>if AUTO or MACRO modes are supported. AF is doing 1411 an AF scan because it was triggered by AF 1412 trigger</notes></value> 1413 <value>FOCUSED_LOCKED 1414 <notes>if any AF mode besides OFF is supported. AF 1415 believes it is focused correctly and is 1416 locked</notes></value> 1417 <value>NOT_FOCUSED_LOCKED 1418 <notes>if any AF mode besides OFF is supported. AF has 1419 failed to focus successfully and is 1420 locked</notes></value> 1421 <value>PASSIVE_UNFOCUSED 1422 <notes>if CONTINUOUS_* modes are supported. AF finished a 1423 passive scan without finding focus, and may restart 1424 scanning at any time.</notes></value> 1425 </enum> 1426 <description>Current state of AF algorithm</description> 1427 <details> 1428 Switching between or enabling AF modes (android.control.afMode) always 1429 resets the AF state to INACTIVE. 
Similarly, switching between android.control.mode, 1430 or android.control.sceneMode if `android.control.mode == USE_SCENE_MODE` resets all 1431 the algorithm states to INACTIVE. 1432 1433 The camera device can do several state transitions between two results, if it is 1434 allowed by the state transition table. For example: INACTIVE may never actually be 1435 seen in a result. 1436 1437 The state in the result is the state for this image (in sync with this image): if 1438 AF state becomes FOCUSED, then the image data associated with this result should 1439 be sharp. 1440 1441 Below are state transition tables for different AF modes. 1442 1443 When android.control.afMode is AF_MODE_OFF or AF_MODE_EDOF: 1444 1445 State | Transition Cause | New State | Notes 1446 :------------:|:----------------:|:---------:|:-----------: 1447 INACTIVE | | INACTIVE | Never changes 1448 1449 When android.control.afMode is AF_MODE_AUTO or AF_MODE_MACRO: 1450 1451 State | Transition Cause | New State | Notes 1452 :-----------------:|:----------------:|:------------------:|:--------------: 1453 INACTIVE | AF_TRIGGER | ACTIVE_SCAN | Start AF sweep, Lens now moving 1454 ACTIVE_SCAN | AF sweep done | FOCUSED_LOCKED | Focused, Lens now locked 1455 ACTIVE_SCAN | AF sweep done | NOT_FOCUSED_LOCKED | Not focused, Lens now locked 1456 ACTIVE_SCAN | AF_CANCEL | INACTIVE | Cancel/reset AF, Lens now locked 1457 FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Cancel/reset AF 1458 FOCUSED_LOCKED | AF_TRIGGER | ACTIVE_SCAN | Start new sweep, Lens now moving 1459 NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Cancel/reset AF 1460 NOT_FOCUSED_LOCKED | AF_TRIGGER | ACTIVE_SCAN | Start new sweep, Lens now moving 1461 Any state | Mode change | INACTIVE | 1462 1463 When android.control.afMode is AF_MODE_CONTINUOUS_VIDEO: 1464 1465 State | Transition Cause | New State | Notes 1466 :-----------------:|:-----------------------------------:|:------------------:|:--------------: 1467 INACTIVE | Camera device initiates new 
scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1468 INACTIVE | AF_TRIGGER | NOT_FOCUSED_LOCKED | AF state query, Lens now locked 1469 PASSIVE_SCAN | Camera device completes current scan| PASSIVE_FOCUSED | End AF scan, Lens now locked 1470 PASSIVE_SCAN | Camera device fails current scan | PASSIVE_UNFOCUSED | End AF scan, Lens now locked 1471 PASSIVE_SCAN | AF_TRIGGER | FOCUSED_LOCKED | Immediate trans. If focus is good, Lens now locked 1472 PASSIVE_SCAN | AF_TRIGGER | NOT_FOCUSED_LOCKED | Immediate trans. if focus is bad, Lens now locked 1473 PASSIVE_SCAN | AF_CANCEL | INACTIVE | Reset lens position, Lens now locked 1474 PASSIVE_FOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1475 PASSIVE_UNFOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1476 PASSIVE_FOCUSED | AF_TRIGGER | FOCUSED_LOCKED | Immediate trans. Lens now locked 1477 PASSIVE_UNFOCUSED | AF_TRIGGER | NOT_FOCUSED_LOCKED | Immediate trans. Lens now locked 1478 FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | No effect 1479 FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan 1480 NOT_FOCUSED_LOCKED | AF_TRIGGER | NOT_FOCUSED_LOCKED | No effect 1481 NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan 1482 1483 When android.control.afMode is AF_MODE_CONTINUOUS_PICTURE: 1484 1485 State | Transition Cause | New State | Notes 1486 :-----------------:|:------------------------------------:|:------------------:|:--------------: 1487 INACTIVE | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1488 INACTIVE | AF_TRIGGER | NOT_FOCUSED_LOCKED | AF state query, Lens now locked 1489 PASSIVE_SCAN | Camera device completes current scan | PASSIVE_FOCUSED | End AF scan, Lens now locked 1490 PASSIVE_SCAN | Camera device fails current scan | PASSIVE_UNFOCUSED | End AF scan, Lens now locked 1491 PASSIVE_SCAN | AF_TRIGGER | FOCUSED_LOCKED | Eventual trans. 
once focus good, Lens now locked 1492 PASSIVE_SCAN | AF_TRIGGER | NOT_FOCUSED_LOCKED | Eventual trans. if cannot focus, Lens now locked 1493 PASSIVE_SCAN | AF_CANCEL | INACTIVE | Reset lens position, Lens now locked 1494 PASSIVE_FOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1495 PASSIVE_UNFOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1496 PASSIVE_FOCUSED | AF_TRIGGER | FOCUSED_LOCKED | Immediate trans. Lens now locked 1497 PASSIVE_UNFOCUSED | AF_TRIGGER | NOT_FOCUSED_LOCKED | Immediate trans. Lens now locked 1498 FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | No effect 1499 FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan 1500 NOT_FOCUSED_LOCKED | AF_TRIGGER | NOT_FOCUSED_LOCKED | No effect 1501 NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan 1502 </details> 1503 </entry> 1504 <entry name="afTriggerId" type="int32" visibility="hidden"> 1505 <description>The ID sent with the latest 1506 CAMERA2_TRIGGER_AUTOFOCUS call</description> 1507 <range>**Deprecated**. Do not use.</range> 1508 <details>Must be 0 if no CAMERA2_TRIGGER_AUTOFOCUS trigger 1509 received yet by HAL. Always updated even if AF algorithm 1510 ignores the trigger</details> 1511 </entry> 1512 <clone entry="android.control.awbMode" kind="controls"> 1513 </clone> 1514 <clone entry="android.control.awbRegions" kind="controls"> 1515 </clone> 1516 <entry name="awbState" type="byte" visibility="public" enum="true"> 1517 <enum> 1518 <value>INACTIVE 1519 <notes>AWB is not in auto mode. When a camera device is opened, it 1520 starts in this state.</notes></value> 1521 <value>SEARCHING 1522 <notes>AWB doesn't yet have a good set of control 1523 values for the current scene.</notes></value> 1524 <value>CONVERGED 1525 <notes>AWB has a good set of control values for the 1526 current scene.</notes></value> 1527 <value>LOCKED 1528 <notes>AWB has been locked. 
1529 </notes></value> 1530 </enum> 1531 <description>Current state of AWB algorithm</description> 1532 <details>Switching between or enabling AWB modes (android.control.awbMode) always 1533 resets the AWB state to INACTIVE. Similarly, switching between android.control.mode, 1534 or android.control.sceneMode if `android.control.mode == USE_SCENE_MODE` resets all 1535 the algorithm states to INACTIVE. 1536 1537 The camera device can do several state transitions between two results, if it is 1538 allowed by the state transition table. So INACTIVE may never actually be seen in 1539 a result. 1540 1541 The state in the result is the state for this image (in sync with this image): if 1542 AWB state becomes CONVERGED, then the image data associated with this result should 1543 be good to use. 1544 1545 Below are state transition tables for different AWB modes. 1546 1547 When `android.control.awbMode != AWB_MODE_AUTO`: 1548 1549 State | Transition Cause | New State | Notes 1550 :------------:|:----------------:|:---------:|:-----------------------: 1551 INACTIVE | |INACTIVE |Camera device auto white balance algorithm is disabled 1552 1553 When android.control.awbMode is AWB_MODE_AUTO: 1554 1555 State | Transition Cause | New State | Notes 1556 :-------------:|:--------------------------------:|:-------------:|:-----------------: 1557 INACTIVE | Camera device initiates AWB scan | SEARCHING | Values changing 1558 INACTIVE | android.control.awbLock is ON | LOCKED | Values locked 1559 SEARCHING | Camera device finishes AWB scan | CONVERGED | Good values, not changing 1560 SEARCHING | android.control.awbLock is ON | LOCKED | Values locked 1561 CONVERGED | Camera device initiates AWB scan | SEARCHING | Values changing 1562 CONVERGED | android.control.awbLock is ON | LOCKED | Values locked 1563 LOCKED | android.control.awbLock is OFF | SEARCHING | Values not good after unlock 1564 LOCKED | android.control.awbLock is OFF | CONVERGED | Values good after unlock 1565 </details> 1566 
</entry> 1567 <clone entry="android.control.mode" kind="controls"> 1568 </clone> 1569 </dynamic> 1570 </section> 1571 <section name="demosaic"> 1572 <controls> 1573 <entry name="mode" type="byte" enum="true"> 1574 <enum> 1575 <value>FAST 1576 <notes>Minimal or no slowdown of frame rate compared to 1577 Bayer RAW output</notes></value> 1578 <value>HIGH_QUALITY 1579 <notes>High-quality may reduce output frame 1580 rate</notes></value> 1581 </enum> 1582 <description>Controls the quality of the demosaicing 1583 processing</description> 1584 <tag id="V1" /> 1585 </entry> 1586 </controls> 1587 </section> 1588 <section name="edge"> 1589 <controls> 1590 <entry name="mode" type="byte" visibility="public" enum="true"> 1591 <enum> 1592 <value>OFF 1593 <notes>No edge enhancement is applied</notes></value> 1594 <value>FAST 1595 <notes>Must not slow down frame rate relative to sensor 1596 output</notes></value> 1597 <value>HIGH_QUALITY 1598 <notes>Frame rate may be reduced by high 1599 quality</notes></value> 1600 </enum> 1601 <description>Operation mode for edge 1602 enhancement</description> 1603 <details>Edge/sharpness/detail enhancement. OFF means no 1604 enhancement will be applied by the HAL. 1605 1606 FAST/HIGH_QUALITY both mean camera device determined enhancement 1607 will be applied. HIGH_QUALITY mode indicates that the 1608 camera device will use the highest-quality enhancement algorithms, 1609 even if it slows down capture rate. 
FAST means the camera device will
not slow down capture rate when applying edge enhancement.</details>
</entry>
<entry name="strength" type="byte">
  <description>Control the amount of edge enhancement
  applied to the images</description>
  <units>1-10; 10 is maximum sharpening</units>
</entry>
</controls>
<dynamic>
<clone entry="android.edge.mode" kind="controls"></clone>
</dynamic>
</section>
<section name="flash">
<controls>
<entry name="firingPower" type="byte">
  <description>Power for flash firing/torch</description>
  <units>10 is max power; 0 is no flash. Linear</units>
  <range>0 - 10</range>
  <details>Power for snapshot may use a different scale than
  for torch mode. Only one entry for torch mode will be
  used.</details>
  <tag id="V1" />
</entry>
<entry name="firingTime" type="int64">
  <description>Firing time of flash relative to start of
  exposure</description>
  <units>nanoseconds</units>
  <range>0-(exposure time - flash duration)</range>
  <details>Clamped to (0, exposure time - flash
  duration).</details>
  <tag id="V1" />
</entry>
<entry name="mode" type="byte" visibility="public" enum="true">
  <enum>
    <value>OFF
      <notes>
      Do not fire the flash for this capture.
      </notes>
    </value>
    <value>SINGLE
      <notes>
      If the flash is available and charged, fire the flash
      for this capture based on android.flash.firingPower and
      android.flash.firingTime.
      </notes>
    </value>
    <value>TORCH
      <notes>
      Transition flash to continuously on.
      </notes>
    </value>
  </enum>
  <description>The desired mode for the camera device's flash control.</description>
  <details>
  This control is only effective when a flash unit is available
  (`android.flash.info.available == true`).
1666 1667 When this control is used, the android.control.aeMode must be set to ON or OFF. 1668 Otherwise, the camera device's auto-exposure-related flash control (ON_AUTO_FLASH, 1669 ON_ALWAYS_FLASH, or ON_AUTO_FLASH_REDEYE) will override this control. 1670 1671 When set to OFF, the camera device will not fire flash for this capture. 1672 1673 When set to SINGLE, the camera device will fire flash regardless of the camera 1674 device's auto-exposure routine's result. When used in the still capture case, this 1675 control should be used along with the AE precapture metering sequence 1676 (android.control.aePrecaptureTrigger); otherwise, the image may be incorrectly exposed. 1677 1678 When set to TORCH, the flash will be on continuously. This mode can be used 1679 for use cases such as preview, auto-focus assist, still capture, or video recording. 1680 1681 The flash status will be reported by android.flash.state in the capture result metadata. 1682 </details> 1683 <tag id="BC" /> 1684 </entry> 1685 </controls> 1686 <static> 1687 <namespace name="info"> 1688 <entry name="available" type="byte" visibility="public" enum="true" typedef="boolean"> 1689 <enum> 1690 <value>FALSE</value> 1691 <value>TRUE</value> 1692 </enum> 1693 <description>Whether this camera device has a 1694 flash.</description> 1695 <details>If no flash, none of the flash controls do 1696 anything. All other metadata should return 0.</details> 1697 <tag id="BC" /> 1698 </entry> 1699 <entry name="chargeDuration" type="int64"> 1700 <description>Time taken before flash can fire 1701 again</description> 1702 <units>nanoseconds</units> 1703 <range>0-1e9</range> 1704 <details>1 second too long/too short for recharge?
Should 1705 this be power-dependent?</details> 1706 <tag id="V1" /> 1707 </entry> 1708 </namespace> 1709 <entry name="colorTemperature" type="byte"> 1710 <description>The x,y whitepoint of the 1711 flash</description> 1712 <units>pair of floats</units> 1713 <range>0-1 for both</range> 1714 <tag id="ADV" /> 1715 </entry> 1716 <entry name="maxEnergy" type="byte"> 1717 <description>Max energy output of the flash for a full 1718 power single flash</description> 1719 <units>lumen-seconds</units> 1720 <range>&gt;= 0</range> 1721 <tag id="ADV" /> 1722 </entry> 1723 </static> 1724 <dynamic> 1725 <clone entry="android.flash.firingPower" kind="controls"> 1726 </clone> 1727 <clone entry="android.flash.firingTime" kind="controls"> 1728 </clone> 1729 <clone entry="android.flash.mode" kind="controls"></clone> 1730 <entry name="state" type="byte" visibility="public" enum="true"> 1731 <enum> 1732 <value>UNAVAILABLE 1733 <notes>No flash on camera</notes></value> 1734 <value>CHARGING 1735 <notes>If android.flash.info.available is true, the flash is 1736 charging and cannot be fired</notes></value> 1737 <value>READY 1738 <notes>If android.flash.info.available is true, the flash is 1739 ready to fire</notes></value> 1740 <value>FIRED 1741 <notes>If android.flash.info.available is true, the flash fired 1742 for this capture</notes></value> 1743 </enum> 1744 <description>Current state of the flash 1745 unit.</description> 1746 <details> 1747 When the camera device doesn't have a flash unit 1748 (i.e. `android.flash.info.available == false`), this state will always be UNAVAILABLE. 1749 Other states indicate the current flash status.
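<!-- The availability rule above can be sketched as a one-line mapping; the
class and method names are illustrative, not part of the API.

```java
// Illustrative: a device without a flash unit always reports UNAVAILABLE,
// regardless of any internal flash status the HAL might track.
class FlashStateSketch {
    static String reportedState(boolean flashAvailable, String internalState) {
        return flashAvailable ? internalState : "UNAVAILABLE";
    }
}
```
-->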
1750 </details> 1751 </entry> 1752 </dynamic> 1753 </section> 1754 <section name="geometric"> 1755 <controls> 1756 <entry name="mode" type="byte" enum="true"> 1757 <enum> 1758 <value>OFF 1759 <notes>No geometric correction is 1760 applied</notes></value> 1761 <value>FAST 1762 <notes>Must not slow down frame rate relative to raw 1763 bayer output</notes></value> 1764 <value>HIGH_QUALITY 1765 <notes>Frame rate may be reduced by high 1766 quality</notes></value> 1767 </enum> 1768 <description>Operating mode of geometric 1769 correction</description> 1770 </entry> 1771 <entry name="strength" type="byte"> 1772 <description>Control the amount of shading correction 1773 applied to the images</description> 1774 <units>unitless: 1-10; 10 is full shading 1775 compensation</units> 1776 <tag id="ADV" /> 1777 </entry> 1778 </controls> 1779 </section> 1780 <section name="hotPixel"> 1781 <controls> 1782 <entry name="mode" type="byte" enum="true"> 1783 <enum> 1784 <value>OFF 1785 <notes>No hot pixel correction can be 1786 applied</notes></value> 1787 <value>FAST 1788 <notes>Frame rate must not be reduced compared to raw 1789 Bayer output</notes></value> 1790 <value>HIGH_QUALITY 1791 <notes>Frame rate may be reduced by high 1792 quality</notes></value> 1793 </enum> 1794 <description>Set operational mode for hot pixel 1795 correction</description> 1796 <tag id="V1" /> 1797 </entry> 1798 </controls> 1799 <static> 1800 <namespace name="info"> 1801 <entry name="map" type="int32" 1802 type_notes="list of coordinates based on android.sensor.pixelArraySize" 1803 container="array"> 1804 <array> 1805 <size>2</size> 1806 <size>n</size> 1807 </array> 1808 <description>Location of hot/defective pixels on 1809 sensor</description> 1810 <tag id="ADV" /> 1811 </entry> 1812 </namespace> 1813 </static> 1814 <dynamic> 1815 <clone entry="android.hotPixel.mode" kind="controls"> 1816 <tag id="V1" /> 1817 </clone> 1818 </dynamic> 1819 </section> 1820 <section name="jpeg"> 1821 <controls> 1822 <entry 
name="gpsCoordinates" type="double" visibility="public" 1823 type_notes="latitude, longitude, altitude. First two in degrees, the third in meters" 1824 container="array"> 1825 <array> 1826 <size>3</size> 1827 </array> 1828 <description>GPS coordinates to include in output JPEG 1829 EXIF</description> 1830 <range>(-180 - 180], [-90,90], [-inf, inf]</range> 1831 <tag id="BC" /> 1832 </entry> 1833 <entry name="gpsProcessingMethod" type="byte" visibility="public" 1834 typedef="string"> 1835 <description>32 characters describing GPS algorithm to 1836 include in EXIF</description> 1837 <units>UTF-8 null-terminated string</units> 1838 <tag id="BC" /> 1839 </entry> 1840 <entry name="gpsTimestamp" type="int64" visibility="public"> 1841 <description>Time GPS fix was made to include in 1842 EXIF</description> 1843 <units>UTC in seconds since January 1, 1970</units> 1844 <tag id="BC" /> 1845 </entry> 1846 <entry name="orientation" type="int32" visibility="public"> 1847 <description>Orientation of JPEG image to 1848 write</description> 1849 <units>Degrees in multiples of 90</units> 1850 <range>0, 90, 180, 270</range> 1851 <tag id="BC" /> 1852 </entry> 1853 <entry name="quality" type="byte" visibility="public"> 1854 <description>Compression quality of the final JPEG 1855 image</description> 1856 <range>1-100; larger is higher quality</range> 1857 <details>85-95 is the typical usage range</details> 1858 <tag id="BC" /> 1859 </entry> 1860 <entry name="thumbnailQuality" type="byte" visibility="public"> 1861 <description>Compression quality of JPEG 1862 thumbnail</description> 1863 <range>1-100; larger is higher quality</range> 1864 <tag id="BC" /> 1865 </entry> 1866 <entry name="thumbnailSize" type="int32" visibility="public" 1867 container="array" typedef="size"> 1868 <array> 1869 <size>2</size> 1870 </array> 1871 <description>Resolution of embedded JPEG thumbnail</description> 1872 <range>Size must be one of the sizes from android.jpeg.availableThumbnailSizes</range> 1873
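<!-- A hedged sketch of how a client might pick a thumbnail size from
android.jpeg.availableThumbnailSizes whose aspect ratio matches the requested
JPEG size; the helper name and the int[][] list layout are illustrative, not
part of the API.

```java
// Illustrative thumbnail selection from an advertised {width, height} list.
class ThumbnailPicker {
    static int[] pick(int[][] availableSizes, int jpegWidth, int jpegHeight) {
        for (int[] size : availableSizes) {
            if (size[0] == 0 && size[1] == 0) continue;         // "no thumbnail" entry
            if (size[0] * jpegHeight == size[1] * jpegWidth) {  // same aspect ratio
                return size;
            }
        }
        return new int[] {0, 0};                                // fall back to no thumbnail
    }
}
```
-->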
<details>When set to (0, 0), the JPEG EXIF will not contain a thumbnail, 1874 but the captured JPEG will still be a valid image. 1875 1876 When a JPEG image capture is issued, the thumbnail size selected should have 1877 the same aspect ratio as the JPEG image.</details> 1878 <tag id="BC" /> 1879 </entry> 1880 </controls> 1881 <static> 1882 <entry name="availableThumbnailSizes" type="int32" visibility="public" 1883 container="array" typedef="size"> 1884 <array> 1885 <size>2</size> 1886 <size>n</size> 1887 </array> 1888 <description>Supported resolutions for the JPEG thumbnail</description> 1889 <range>Will include at least one valid resolution, plus 1890 (0,0) for no thumbnail generation, and each size will be distinct.</range> 1891 <details>The following conditions will be satisfied for this size list: 1892 1893 * The sizes will be sorted by increasing pixel area (width x height). 1894 If several resolutions have the same area, they will be sorted by increasing width. 1895 * The aspect ratio of the largest thumbnail size will be the same as the 1896 aspect ratio of the largest size in android.scaler.availableJpegSizes. 1897 The largest size is defined as the size that has the largest pixel area 1898 in a given size list. 1899 * Each size in android.scaler.availableJpegSizes will have at least 1900 one corresponding size that has the same aspect ratio in availableThumbnailSizes, 1901 and vice versa.
1902 * All non (0, 0) sizes will have non-zero widths and heights.</details> 1903 <tag id="BC" /> 1904 </entry> 1905 <entry name="maxSize" type="int32" visibility="system"> 1906 <description>Maximum size in bytes for the compressed 1907 JPEG buffer</description> 1908 <range>Must be large enough to fit any JPEG produced by 1909 the camera</range> 1910 <details>This is used for sizing the gralloc buffers for 1911 JPEG</details> 1912 </entry> 1913 </static> 1914 <dynamic> 1915 <clone entry="android.jpeg.gpsCoordinates" kind="controls"> 1916 </clone> 1917 <clone entry="android.jpeg.gpsProcessingMethod" 1918 kind="controls"></clone> 1919 <clone entry="android.jpeg.gpsTimestamp" kind="controls"> 1920 </clone> 1921 <clone entry="android.jpeg.orientation" kind="controls"> 1922 </clone> 1923 <clone entry="android.jpeg.quality" kind="controls"> 1924 </clone> 1925 <entry name="size" type="int32"> 1926 <description>The size of the compressed JPEG image, in 1927 bytes</description> 1928 <range>&gt;= 0</range> 1929 <details>If no JPEG output is produced for the request, 1930 this must be 0. 1931 1932 Otherwise, this describes the real size of the compressed 1933 JPEG image placed in the output stream. 
More specifically, 1934 if android.jpeg.maxSize = 1000000, and a specific capture 1935 has android.jpeg.size = 500000, then the output buffer from 1936 the JPEG stream will be 1000000 bytes, of which the first 1937 500000 make up the real data.</details> 1938 </entry> 1939 <clone entry="android.jpeg.thumbnailQuality" 1940 kind="controls"></clone> 1941 <clone entry="android.jpeg.thumbnailSize" kind="controls"> 1942 </clone> 1943 </dynamic> 1944 </section> 1945 <section name="lens"> 1946 <controls> 1947 <entry name="aperture" type="float" visibility="public"> 1948 <description>The ratio of lens focal length to the effective 1949 aperture diameter.</description> 1950 <units>f-number (f/NNN)</units> 1951 <range>android.lens.info.availableApertures</range> 1952 <details>This will only be supported on the camera devices that 1953 have variable aperture lens. The aperture value can only be 1954 one of the values listed in android.lens.info.availableApertures. 1955 1956 When this is supported and android.control.aeMode is OFF, 1957 this can be set along with android.sensor.exposureTime, 1958 android.sensor.sensitivity, and android.sensor.frameDuration 1959 to achieve manual exposure control. 1960 1961 The requested aperture value may take several frames to reach the 1962 requested value; the camera device will report the current (intermediate) 1963 aperture size in capture result metadata while the aperture is changing. 1964 While the aperture is still changing, android.lens.state will be set to MOVING. 1965 1966 When this is supported and android.control.aeMode is one of 1967 the ON modes, this will be overridden by the camera device 1968 auto-exposure algorithm, the overridden values are then provided 1969 back to the user in the corresponding result.</details> 1970 <tag id="V1" /> 1971 </entry> 1972 <entry name="filterDensity" type="float" visibility="public"> 1973 <description> 1974 State of lens neutral density filter(s). 
1975 </description> 1976 <units>Steps of Exposure Value (EV).</units> 1977 <range>android.lens.info.availableFilterDensities</range> 1978 <details> 1979 This will not be supported on most camera devices. On devices 1980 where this is supported, this may only be set to one of the 1981 values included in android.lens.info.availableFilterDensities. 1982 1983 Lens filters are typically used to lower the amount of light the 1984 sensor is exposed to (measured in steps of EV). As used here, an EV 1985 step is the standard logarithmic representation, which is 1986 non-negative and inversely proportional to the amount of light 1987 hitting the sensor. For example, setting this to 0 would result 1988 in no reduction of the incoming light, and setting this to 2 would 1989 mean that the filter is set to reduce incoming light by two stops 1990 (allowing 1/4 of the prior amount of light to the sensor). 1991 1992 It may take several frames before the lens filter density changes 1993 to the requested value. While the filter density is still changing, 1994 android.lens.state will be set to MOVING. 1995 </details> 1996 <tag id="V1" /> 1997 </entry> 1998 <entry name="focalLength" type="float" visibility="public"> 1999 <description> 2000 The current lens focal length; used for optical zoom. 2001 </description> 2002 <units>focal length in mm</units> 2003 <range>android.lens.info.availableFocalLengths</range> 2004 <details> 2005 This setting controls the physical focal length of the camera 2006 device's lens. Changing the focal length changes the field of 2007 view of the camera device, and is usually used for optical zoom. 2008 2009 Like android.lens.focusDistance and android.lens.aperture, this 2010 setting won't be applied instantaneously, and it may take several 2011 frames before the lens can change to the requested focal length. 2012 While the focal length is still changing, android.lens.state will 2013 be set to MOVING.
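<!-- Since this control drives optical zoom, the field-of-view change it
causes follows the usual thin-lens relation. A small sketch; the helper name
and any particular sensor width are illustrative assumptions, not values this
metadata reports.

```java
// Illustrative: horizontal field of view for a given focal length and
// sensor physical width, both in millimeters (thin-lens approximation).
class FieldOfView {
    static double horizontalFovDegrees(double focalLengthMm, double sensorWidthMm) {
        return Math.toDegrees(2.0 * Math.atan(sensorWidthMm / (2.0 * focalLengthMm)));
    }
}
```

Doubling the focal length roughly halves the field of view, which users
perceive as 2x optical zoom.
-->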
2014 2015 This is expected not to be supported on most devices. 2016 </details> 2017 <tag id="V1" /> 2018 </entry> 2019 <entry name="focusDistance" type="float" visibility="public"> 2020 <description>Distance to plane of sharpest focus, 2021 measured from frontmost surface of the lens</description> 2022 <units>See android.lens.info.focusDistanceCalibration for details.</units> 2023 <range>&gt;= 0</range> 2024 <details>0 means infinity focus. Used value will be clamped 2025 to [0, android.lens.info.minimumFocusDistance]. 2026 2027 Like android.lens.focalLength, this setting won't be applied 2028 instantaneously, and it may take several frames before the lens 2029 can move to the requested focus distance. While the lens is still moving, 2030 android.lens.state will be set to MOVING. 2031 </details> 2032 <tag id="BC" /> 2033 <tag id="V1" /> 2034 </entry> 2035 <entry name="opticalStabilizationMode" type="byte" visibility="public" 2036 enum="true"> 2037 <enum> 2038 <value>OFF 2039 <notes>Optical stabilization is unavailable.</notes> 2040 </value> 2041 <value optional="true">ON 2042 <notes>Optical stabilization is enabled.</notes> 2043 </value> 2044 </enum> 2045 <description> 2046 Sets whether the camera device uses optical image stabilization (OIS) 2047 when capturing images. 2048 </description> 2049 <range>android.lens.info.availableOpticalStabilization</range> 2050 <details> 2051 OIS is used to compensate for motion blur due to small movements of 2052 the camera during capture. Unlike digital image stabilization, OIS makes 2053 use of mechanical elements to stabilize the camera sensor, and thus 2054 allows for longer exposure times before camera shake becomes 2055 apparent. 2056 2057 This is not expected to be supported on most devices. 
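<!-- The longer-exposure benefit described above is often quoted in stops; a
sketch of that arithmetic. The number of stops a given OIS module provides is
an assumption here, not something this metadata reports.

```java
// Illustrative: each stop of stabilization doubles the longest handheld
// exposure time that stays free of visible motion blur.
class OisExposureGain {
    static long maxHandheldExposureNs(long baselineNs, int stabilizationStops) {
        return baselineNs << stabilizationStops;
    }
}
```
-->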
2058 </details> 2059 <tag id="V1" /> 2060 </entry> 2061 </controls> 2062 <static> 2063 <namespace name="info"> 2064 <entry name="availableApertures" type="float" visibility="public" 2065 container="array"> 2066 <array> 2067 <size>n</size> 2068 </array> 2069 <description>List of supported aperture 2070 values.</description> 2071 <range>one entry required, &gt; 0</range> 2072 <details>If the camera device doesn't support variable apertures, 2073 the listed value will be the fixed aperture. 2074 2075 If the camera device supports variable apertures, the aperture values 2076 in this list will be sorted in ascending order.</details> 2077 <tag id="V1" /> 2078 </entry> 2079 <entry name="availableFilterDensities" type="float" visibility="public" 2080 container="array"> 2081 <array> 2082 <size>n</size> 2083 </array> 2084 <description> 2085 List of supported neutral density filter values for 2086 android.lens.filterDensity. 2087 </description> 2088 <range> 2089 At least one value is required. Values must be &gt;= 0. 2090 </range> 2091 <details> 2092 If changing android.lens.filterDensity is not supported, 2093 availableFilterDensities must contain only 0. Otherwise, this 2094 list contains only the exact filter density values available on 2095 this camera device. 2096 </details> 2097 <tag id="V1" /> 2098 </entry> 2099 <entry name="availableFocalLengths" type="float" visibility="public" 2100 type_notes="The list of available focal lengths" 2101 container="array"> 2102 <array> 2103 <size>n</size> 2104 </array> 2105 <description> 2106 The available focal lengths for this device for use with 2107 android.lens.focalLength. 2108 </description> 2109 <range> 2110 Each value in this list must be &gt; 0. This list must 2111 contain at least one value. 2112 </range> 2113 <details> 2114 If optical zoom is not supported, this will only report 2115 a single value corresponding to the static focal length of the 2116 device.
Otherwise, this will report every focal length supported 2117 by the device. 2118 </details> 2119 <tag id="BC" /> 2120 <tag id="V1" /> 2121 </entry> 2122 <entry name="availableOpticalStabilization" type="byte" 2123 visibility="public" type_notes="list of enums" container="array"> 2124 <array> 2125 <size>n</size> 2126 </array> 2127 <description> 2128 List containing a subset of the optical image 2129 stabilization (OIS) modes specified in 2130 android.lens.opticalStabilizationMode. 2131 </description> 2132 <details> 2133 If OIS is not implemented for a given camera device, this should 2134 contain only OFF. 2135 </details> 2136 <tag id="V1" /> 2137 </entry> 2138 <entry name="geometricCorrectionMap" type="float" 2139 type_notes="2D array of destination coordinate pairs for uniform grid points in source image, per color channel. Size in the range of 2x3x40x30" 2140 container="array"> 2141 <array> 2142 <size>2</size> 2143 <size>3</size> 2144 <size>n</size> 2145 <size>m</size> 2146 </array> 2147 <description>A low-resolution map for correction of 2148 geometric distortions and chromatic aberrations, per 2149 color channel</description> 2150 <range>N, M &gt;= 2</range> 2151 <details>[DNG wants a function instead]. What's easiest 2152 for implementers? With an array size (M, N), entry (i, 2153 j) provides the destination for pixel (i/(M-1) * width, 2154 j/(N-1) * height). Data is row-major, with each array 2155 entry being ( (X, Y)_r, (X, Y)_g, (X, Y)_b )</details> 2156 <tag id="DNG" /> 2157 </entry> 2158 <entry name="geometricCorrectionMapSize" type="int32" 2159 type_notes="width and height of geometric correction map" 2160 container="array" typedef="size"> 2161 <array> 2162 <size>2</size> 2163 </array> 2164 <description>Dimensions of geometric correction 2165 map</description> 2166 <range>Both values &gt;= 2</range> 2167 <tag id="V1" /> 2168 </entry> 2169 <entry name="hyperfocalDistance" type="float" visibility="public" optional="true"> 2170 <description>Optional.
Hyperfocal distance for this lens.</description> 2171 <units>See android.lens.info.focusDistanceCalibration for details.</units> 2172 <range>&gt;= 0</range> 2173 <details>If the lens is fixed focus, the camera device will report 0. 2174 2175 If the lens is not fixed focus, the camera device will report this 2176 field when android.lens.info.focusDistanceCalibration is APPROXIMATE or CALIBRATED. 2177 </details> 2178 </entry> 2179 <entry name="minimumFocusDistance" type="float" visibility="public"> 2180 <description>Shortest distance from frontmost surface 2181 of the lens that can be focused correctly.</description> 2182 <units>See android.lens.info.focusDistanceCalibration for details.</units> 2183 <range>&gt;= 0</range> 2184 <details>If the lens is fixed-focus, this should be 2185 0.</details> 2186 <tag id="V1" /> 2187 </entry> 2188 <entry name="shadingMapSize" type="int32" visibility="public" 2189 type_notes="width and height of lens shading map provided by the HAL. (N x M)" 2190 container="array" typedef="size"> 2191 <array> 2192 <size>2</size> 2193 </array> 2194 <description>Dimensions of lens shading map.</description> 2195 <range>Both values &gt;= 1</range> 2196 <details> 2197 The map should be on the order of 30-40 rows and columns, and 2198 must be smaller than 64x64. 2199 </details> 2200 <tag id="V1" /> 2201 </entry> 2202 <entry name="focusDistanceCalibration" type="byte" visibility="public" enum="true"> 2203 <enum> 2204 <value>UNCALIBRATED 2205 <notes> 2206 The lens focus distance is not accurate, and the units used for 2207 android.lens.focusDistance do not correspond to any physical units. 2208 Setting the lens to the same focus distance on separate occasions may 2209 result in a different real focus distance, depending on factors such 2210 as the orientation of the device, the age of the focusing mechanism, 2211 and the device temperature. 
The focus distance value will still be 2212 in the range of `[0, android.lens.info.minimumFocusDistance]`, where 0 2213 represents the farthest focus. 2214 </notes> 2215 </value> 2216 <value>APPROXIMATE 2217 <notes> 2218 The lens focus distance is measured in diopters. However, setting the lens 2219 to the same focus distance on separate occasions may result in a 2220 different real focus distance, depending on factors such as the 2221 orientation of the device, the age of the focusing mechanism, and 2222 the device temperature. 2223 </notes> 2224 </value> 2225 <value>CALIBRATED 2226 <notes> 2227 The lens focus distance is measured in diopters. The lens mechanism is 2228 calibrated so that setting the same focus distance is repeatable on 2229 multiple occasions with good accuracy, and the focus distance corresponds 2230 to the real physical distance to the plane of best focus. 2231 </notes> 2232 </value> 2233 </enum> 2234 <description>The lens focus distance calibration quality.</description> 2235 <details> 2236 The lens focus distance calibration quality determines the reliability of 2237 focus related metadata entries, i.e. android.lens.focusDistance, 2238 android.lens.focusRange, android.lens.info.hyperfocalDistance, and 2239 android.lens.info.minimumFocusDistance. 2240 </details> 2241 <tag id="V1" /> 2242 </entry> 2243 </namespace> 2244 <entry name="facing" type="byte" visibility="public" enum="true"> 2245 <enum> 2246 <value>FRONT</value> 2247 <value>BACK</value> 2248 </enum> 2249 <description>Direction the camera faces relative to 2250 device screen</description> 2251 </entry> 2252 <entry name="opticalAxisAngle" type="float" 2253 type_notes="degrees. First defines the angle of separation between the perpendicular to the screen and the camera optical axis. The second then defines the clockwise rotation of the optical axis from native device up." 
2254 container="array"> 2255 <array> 2256 <size>2</size> 2257 </array> 2258 <description>Relative angle of camera optical axis to the 2259 perpendicular axis from the display</description> 2260 <range>[0-90) for first angle, [0-360) for second</range> 2261 <details>Examples: 2262 2263 (0,0) means that the camera optical axis 2264 is perpendicular to the display surface; 2265 2266 (45,0) means that the camera points 45 degrees up when 2267 device is held upright; 2268 2269 (45,90) means the camera points 45 degrees to the right when 2270 the device is held upright. 2271 2272 Use FACING field to determine perpendicular outgoing 2273 direction</details> 2274 <tag id="ADV" /> 2275 </entry> 2276 <entry name="position" type="float" container="array"> 2277 <array> 2278 <size>3, location in mm, in the sensor coordinate 2279 system</size> 2280 </array> 2281 <description>Coordinates of camera optical axis on 2282 device</description> 2283 <tag id="V1" /> 2284 </entry> 2285 </static> 2286 <dynamic> 2287 <clone entry="android.lens.aperture" kind="controls"> 2288 <tag id="V1" /> 2289 </clone> 2290 <clone entry="android.lens.filterDensity" kind="controls"> 2291 <tag id="V1" /> 2292 </clone> 2293 <clone entry="android.lens.focalLength" kind="controls"> 2294 <tag id="BC" /> 2295 </clone> 2296 <clone entry="android.lens.focusDistance" kind="controls"> 2297 <details>Should be zero for fixed-focus cameras</details> 2298 <tag id="BC" /> 2299 </clone> 2300 <entry name="focusRange" type="float" visibility="public" 2301 type_notes="Range of scene distances that are in focus" 2302 container="array"> 2303 <array> 2304 <size>2</size> 2305 </array> 2306 <description>The range of scene distances that are in 2307 sharp focus (depth of field)</description> 2308 <units>pair of focus distances in diopters: (near, 2309 far), see android.lens.info.focusDistanceCalibration for details.</units> 2310 <range>&gt;=0</range> 2311 <details>If variable focus not supported, can still report 2312 fixed depth 
of field range</details> 2313 <tag id="BC" /> 2314 </entry> 2315 <clone entry="android.lens.opticalStabilizationMode" 2316 kind="controls"> 2317 <tag id="V1" /> 2318 </clone> 2319 <entry name="state" type="byte" visibility="public" enum="true"> 2320 <enum> 2321 <value>STATIONARY 2322 <notes> 2323 The lens parameters (android.lens.focalLength, android.lens.focusDistance, 2324 android.lens.filterDensity, and android.lens.aperture) are not changing. 2325 </notes> 2326 </value> 2327 <value>MOVING 2328 <notes> 2329 Any of the lens parameters (android.lens.focalLength, android.lens.focusDistance, 2330 android.lens.filterDensity, or android.lens.aperture) is changing. 2331 </notes> 2332 </value> 2333 </enum> 2334 <description>Current lens status.</description> 2335 <details> 2336 For the lens parameters android.lens.focalLength, android.lens.focusDistance, 2337 android.lens.filterDensity, and android.lens.aperture, when changes are requested, 2338 they may take several frames to reach the requested values. This state indicates 2339 the current status of the lens parameters. 2340 2341 When the state is STATIONARY, the lens parameters are not changing. This could be 2342 either because the parameters are all fixed, or because the lens has had enough 2343 time to reach the most recently-requested values. 2344 If none of these lens parameters is changeable for a camera device, as listed below: 2345 2346 * Fixed focus (`android.lens.info.minimumFocusDistance == 0`), which means the 2347 android.lens.focusDistance parameter will always be 0. 2348 * Fixed focal length (android.lens.info.availableFocalLengths contains a single value), 2349 which means optical zoom is not supported. 2350 * No ND filter (android.lens.info.availableFilterDensities contains only 0). 2351 * Fixed aperture (android.lens.info.availableApertures contains a single value). 2352 2353 then this state will always be STATIONARY.
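<!-- The four fixed-parameter conditions above can be checked directly from
the static metadata; a sketch, with illustrative method and parameter names.

```java
// Illustrative: a camera whose lens parameters are all fixed can only ever
// report a STATIONARY lens state.
class LensStateSketch {
    static boolean alwaysStationary(float minimumFocusDistance,
                                    int focalLengthCount,
                                    float[] filterDensities,
                                    int apertureCount) {
        boolean fixedFocus    = minimumFocusDistance == 0f;
        boolean fixedZoom     = focalLengthCount == 1;
        boolean noNdFilter    = filterDensities.length == 1 && filterDensities[0] == 0f;
        boolean fixedAperture = apertureCount == 1;
        return fixedFocus && fixedZoom && noNdFilter && fixedAperture;
    }
}
```
-->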
2354 2355 When the state is MOVING, it indicates that at least one of the lens parameters 2356 is changing. 2357 </details> 2358 <tag id="V1" /> 2359 </entry> 2360 </dynamic> 2361 </section> 2362 <section name="noiseReduction"> 2363 <controls> 2364 <entry name="mode" type="byte" visibility="public" enum="true"> 2365 <enum> 2366 <value>OFF 2367 <notes>No noise reduction is applied</notes></value> 2368 <value>FAST 2369 <notes>Must not slow down frame rate relative to sensor 2370 output</notes></value> 2371 <value>HIGH_QUALITY 2372 <notes>May slow down frame rate to provide highest 2373 quality</notes></value> 2374 </enum> 2375 <description>Mode of operation for the noise reduction 2376 algorithm</description> 2377 <range>android.noiseReduction.availableModes</range> 2378 <details>Noise filtering control. OFF means no noise reduction 2379 will be applied by the HAL. 2380 2381 FAST/HIGH_QUALITY both mean camera device determined noise filtering 2382 will be applied. HIGH_QUALITY mode indicates that the camera device 2383 will use the highest-quality noise filtering algorithms, 2384 even if it slows down capture rate. FAST means the camera device should not 2385 slow down capture rate when applying noise filtering.</details> 2386 <tag id="V1" /> 2387 </entry> 2388 <entry name="strength" type="byte"> 2389 <description>Control the amount of noise reduction 2390 applied to the images</description> 2391 <units>1-10; 10 is max noise reduction</units> 2392 <range>1 - 10</range> 2393 </entry> 2394 </controls> 2395 <dynamic> 2396 <clone entry="android.noiseReduction.mode" kind="controls"> 2397 </clone> 2398 </dynamic> 2399 </section> 2400 <section name="quirks"> 2401 <static> 2402 <entry name="meteringCropRegion" type="byte" visibility="system" optional="true"> 2403 <description>If set to 1, the camera service does not 2404 scale 'normalized' coordinates with respect to the crop 2405 region. 
This applies to metering input (a{e,f,wb}Region) 2406 and output (face rectangles).</description> 2407 <range>**Deprecated**. Do not use.</range> 2408 <details>Normalized coordinates refer to those in the 2409 (-1000,1000) range mentioned in the 2410 android.hardware.Camera API. 2411 2412 HAL implementations should instead always use and emit 2413 sensor array-relative coordinates for all region data. Does 2414 not need to be listed in static metadata. Support will be 2415 removed in future versions of camera service.</details> 2416 </entry> 2417 <entry name="triggerAfWithAuto" type="byte" visibility="system" optional="true"> 2418 <description>If set to 1, then the camera service always 2419 switches to FOCUS_MODE_AUTO before issuing an AF 2420 trigger.</description> 2421 <range>**Deprecated**. Do not use.</range> 2422 <details>HAL implementations should implement AF trigger 2423 modes for AUTO, MACRO, CONTINUOUS_FOCUS, and 2424 CONTINUOUS_PICTURE modes instead of using this flag. Does 2425 not need to be listed in static metadata. Support will be 2426 removed in future versions of camera service.</details> 2427 </entry> 2428 <entry name="useZslFormat" type="byte" visibility="system" optional="true"> 2429 <description>If set to 1, the camera service uses 2430 CAMERA2_PIXEL_FORMAT_ZSL instead of 2431 HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED for the zero 2432 shutter lag stream</description> 2433 <range>**Deprecated**. Do not use.</range> 2434 <details>HAL implementations should use gralloc usage flags 2435 to determine that a stream will be used for 2436 zero-shutter-lag, instead of relying on an explicit 2437 format setting. Does not need to be listed in static 2438 metadata.
Support will be removed in future versions of 2439 camera service.</details> 2440 </entry> 2441 <entry name="usePartialResult" type="byte" visibility="hidden" optional="true"> 2442 <description> 2443 If set to 1, the HAL will always split result 2444 metadata for a single capture into multiple buffers, 2445 returned using multiple process_capture_result calls. 2446 </description> 2447 <range>**Deprecated**. Do not use.</range> 2448 <details> 2449 Does not need to be listed in static 2450 metadata. Support for partial results will be reworked in 2451 future versions of camera service. This quirk will stop 2452 working at that point; DO NOT USE without careful 2453 consideration of future support. 2454 </details> 2455 <hal_details> 2456 Refer to `camera3_capture_result::partial_result` 2457 for information on how to implement partial results. 2458 </hal_details> 2459 </entry> 2460 </static> 2461 <dynamic> 2462 <entry name="partialResult" type="byte" visibility="hidden" optional="true" enum="true" typedef="boolean"> 2463 <enum> 2464 <value>FINAL 2465 <notes>The last or only metadata result buffer 2466 for this capture.</notes> 2467 </value> 2468 <value>PARTIAL 2469 <notes>A partial buffer of result metadata for this 2470 capture. More result buffers for this capture will be sent 2471 by the HAL, the last of which will be marked 2472 FINAL.</notes> 2473 </value> 2474 </enum> 2475 <description> 2476 Whether a result given to the framework is the 2477 final one for the capture, or only a partial that contains a 2478 subset of the full set of dynamic metadata 2479 values.</description> 2480 <range>**Deprecated**. Do not use. Optional. Default value is FINAL.</range> 2481 <details> 2482 The entries in the result metadata buffers for a 2483 single capture may not overlap, except for this entry. 
The 2484 FINAL buffers must retain FIFO ordering relative to the 2485 requests that generate them, so the FINAL buffer for frame 3 must 2486 always be sent to the framework after the FINAL buffer for frame 2, and 2487 before the FINAL buffer for frame 4. PARTIAL buffers may be returned 2488 in any order relative to other frames, but all PARTIAL buffers for a given 2489 capture must arrive before the FINAL buffer for that capture. This entry may 2490 only be used by the HAL if quirks.usePartialResult is set to 1. 2491 </details> 2492 <hal_details> 2493 Refer to `camera3_capture_result::partial_result` 2494 for information on how to implement partial results. 2495 </hal_details> 2496 </entry> 2497 </dynamic> 2498 </section> 2499 <section name="request"> 2500 <controls> 2501 <entry name="frameCount" type="int32" visibility="system"> 2502 <description>A frame counter set by the framework. Must 2503 be maintained unchanged in output frame. This value monotonically 2504 increases with every new result (that is, each new result has a unique 2505 frameCount value). 2506 </description> 2507 <units>incrementing integer</units> 2508 <range>**Deprecated**. Do not use. Any int.</range> 2509 </entry> 2510 <entry name="id" type="int32" visibility="hidden"> 2511 <description>An application-specified ID for the current 2512 request. Must be maintained unchanged in output 2513 frame</description> 2514 <units>arbitrary integer assigned by application</units> 2515 <range>Any int</range> 2516 <tag id="V1" /> 2517 </entry> 2518 <entry name="inputStreams" type="int32" visibility="system" 2519 container="array"> 2520 <array> 2521 <size>n</size> 2522 </array> 2523 <description>List which camera reprocess stream is used 2524 for the source of reprocessing data.</description> 2525 <units>List of camera reprocess stream IDs</units> 2526 <range>**Deprecated**. Do not use. 2527 2528 Typically, only one entry allowed, must be a valid reprocess stream ID. 
2529 2530 If android.jpeg.needsThumbnail is set, then multiple 2531 reprocess streams may be included in a single request; they 2532 must be different scaled versions of the same image.</range> 2533 <details>Only meaningful when android.request.type == 2534 REPROCESS. Ignored otherwise</details> 2535 <tag id="HAL2" /> 2536 </entry> 2537 <entry name="metadataMode" type="byte" visibility="system" 2538 enum="true"> 2539 <enum> 2540 <value>NONE 2541 <notes>No metadata should be produced on output, except 2542 for application-bound buffer data. If no 2543 application-bound streams exist, no frame should be 2544 placed in the output frame queue. If such streams 2545 exist, a frame should be placed on the output queue 2546 with null metadata but with the necessary output buffer 2547 information. Timestamp information should still be 2548 included with any output stream buffers</notes></value> 2549 <value>FULL 2550 <notes>All metadata should be produced. Statistics will 2551 only be produced if they are separately 2552 enabled</notes></value> 2553 </enum> 2554 <description>How much metadata to produce on 2555 output</description> 2556 </entry> 2557 <entry name="outputStreams" type="int32" visibility="system" 2558 container="array"> 2559 <array> 2560 <size>n</size> 2561 </array> 2562 <description>Lists which camera output streams image data 2563 from this capture must be sent to</description> 2564 <units>List of camera stream IDs</units> 2565 <range>**Deprecated**. Do not use. List must only include streams that have been 2566 created</range> 2567 <details>If no output streams are listed, then the image 2568 data should simply be discarded. 
The image data must 2569 still be captured for metadata and statistics production, 2570 and the lens and flash must operate as requested.</details> 2571 <tag id="HAL2" /> 2572 </entry> 2573 <entry name="type" type="byte" visibility="system" enum="true"> 2574 <enum> 2575 <value>CAPTURE 2576 <notes>Capture a new image from the imaging hardware, 2577 and process it according to the 2578 settings</notes></value> 2579 <value>REPROCESS 2580 <notes>Process previously captured data; the 2581 android.request.inputStream parameter determines the 2582 source reprocessing stream. TODO: Mark dynamic metadata 2583 needed for reprocessing with [RP]</notes></value> 2584 </enum> 2585 <description>The type of the request; either CAPTURE or 2586 REPROCESS. For HAL3, this tag is redundant.</description> 2587 <tag id="HAL2" /> 2588 </entry> 2589 </controls> 2590 <static> 2591 <entry name="maxNumOutputStreams" type="int32" visibility="public" 2592 container="array"> 2593 <array> 2594 <size>3</size> 2595 </array> 2596 <description>The maximum numbers of different types of output streams 2597 that can be configured and used simultaneously by a camera device. 2598 </description> 2599 <range> 2600 &gt;= 1 for JPEG-compressed format streams. 2601 2602 &gt;= 0 for Raw format streams. 2603 2604 &gt;= 3 for processed, uncompressed format streams. 2605 </range> 2606 <details> 2607 This is a 3 element tuple that contains the max number of output simultaneous 2608 streams for raw sensor, processed (and uncompressed), and JPEG formats respectively. 2609 For example, if max raw sensor format output stream number is 1, max YUV streams 2610 number is 3, and max JPEG stream number is 2, then this tuple should be `(1, 3, 2)`. 2611 2612 This lists the upper bound of the number of output streams supported by 2613 the camera device. Using more streams simultaneously may require more hardware and 2614 CPU resources that will consume more power. 
        The image format for an output stream can
        be any supported format provided by android.scaler.availableFormats. The formats
        defined in android.scaler.availableFormats can be categorized into the 3 stream types
        as below:

        * JPEG-compressed format: BLOB.
        * Raw formats: RAW_SENSOR and RAW_OPAQUE.
        * Processed, uncompressed formats: YCbCr_420_888, YCrCb_420_SP, YV12.
        </details>
        <tag id="BC" />
      </entry>
      <entry name="maxNumReprocessStreams" type="int32" visibility="system"
          container="array">
        <array>
          <size>1</size>
        </array>
        <description>How many reprocessing streams of any type
        can be allocated at the same time.</description>
        <range>&gt;= 0</range>
        <details>
        **Deprecated**. Only used by HAL2.x.

        When set to 0, it means no reprocess stream is supported.
        </details>
        <tag id="HAL2" />
      </entry>
      <entry name="maxNumInputStreams" type="int32" visibility="public">
        <description>
        The maximum numbers of any type of input streams
        that can be configured and used simultaneously by a camera device.
        </description>
        <range>
        &gt;= 0 for LIMITED mode device (`android.info.supportedHardwareLevel == LIMITED`).
        &gt;= 1 for FULL mode device (`android.info.supportedHardwareLevel == FULL`).
        </range>
        <details>When set to 0, it means no input stream is supported.

        The image format for an input stream can be any supported format provided
        by android.scaler.availableInputFormats. When using an input stream, there must be
        at least one output stream configured to receive the reprocessed images.

        For example, for the Zero Shutter Lag (ZSL) still capture use case, the input
        stream image format will be RAW_OPAQUE, and the associated output stream image format
        should be JPEG.
        </details>
      </entry>
    </static>
    <dynamic>
      <entry name="frameCount" type="int32" visibility="public">
        <description>A frame counter set by the framework. This value monotonically
        increases with every new result (that is, each new result has a unique
        frameCount value).</description>
        <units>count of frames</units>
        <range>&gt; 0</range>
        <details>Reset on release()</details>
      </entry>
      <clone entry="android.request.id" kind="controls"></clone>
      <clone entry="android.request.metadataMode"
          kind="controls"></clone>
      <clone entry="android.request.outputStreams"
          kind="controls"></clone>
      <entry name="pipelineDepth" type="byte" visibility="public">
        <description>Specifies the number of pipeline stages the frame went
        through from when it was exposed to when the final completed result
        was available to the framework.</description>
        <range>&lt;= android.request.pipelineMaxDepth</range>
        <details>Depending on what settings are used in the request, and
        what streams are configured, the data may undergo less processing,
        and some pipeline stages may be skipped.

        See android.request.pipelineMaxDepth for more details.
        </details>
        <hal_details>
        This value must always represent the accurate count of how many
        pipeline stages were actually used.
        </hal_details>
      </entry>
    </dynamic>
    <static>
      <entry name="pipelineMaxDepth" type="byte" visibility="public">
        <description>Specifies the maximum number of pipeline stages a frame
        has to go through from when it's exposed to when it's available
        to the framework.</description>
        <details>A typical minimum value for this is 2 (one stage to expose,
        one stage to readout) from the sensor. The ISP then usually adds
        its own stages to do custom HW processing. Further stages may be
        added by SW processing.

        Depending on what settings are used (e.g. YUV, JPEG) and what
        processing is enabled (e.g. face detection), the actual pipeline
        depth (specified by android.request.pipelineDepth) may be less than
        the max pipeline depth.

        A pipeline depth of X stages is equivalent to a pipeline latency of
        X frame intervals.

        This value will be 8 or less.
        </details>
        <hal_details>
        This value should be 4 or less.
        </hal_details>
      </entry>
      <entry name="partialResultCount" type="int32" visibility="public">
        <description>Optional. Defaults to 1. Defines how many sub-components
        a result will be composed of.
        </description>
        <range>&gt;= 1</range>
        <details>In order to combat the pipeline latency, partial results
        may be delivered to the application layer from the camera device as
        soon as they are available.

        A value of 1 means that partial results are not supported.

        A typical use case for this might be: after requesting an AF lock the
        new AF state might be available 50% of the way through the pipeline.
        The camera device could then immediately dispatch this state via a
        partial result to the framework/application layer, and the rest of
        the metadata via later partial results.
        </details>
      </entry>
    </static>
  </section>
  <section name="scaler">
    <controls>
      <entry name="cropRegion" type="int32" visibility="public"
          container="array" typedef="rectangle">
        <array>
          <size>4</size>
        </array>
        <description>(x, y, width, height).

        A rectangle with the top-left corner of (x,y) and size
        (width, height). The region of the sensor that is used for
        output. Each stream must use this rectangle to produce its
        output, cropping to a smaller region if necessary to
        maintain the stream's aspect ratio.

        HAL2.x uses only (x, y, width).</description>
        <units>(x,y) of top-left corner, width and height of region
        in pixels; (0,0) is top-left corner of
        android.sensor.activeArraySize</units>
        <details>
        Any additional per-stream cropping must be done to
        maximize the final pixel area of the stream.

        For example, if the crop region is set to a 4:3 aspect
        ratio, then 4:3 streams should use the exact crop
        region. 16:9 streams should further crop vertically
        (letterbox).

        Conversely, if the crop region is set to a 16:9 aspect ratio, then 4:3
        outputs should crop horizontally (pillarbox), and 16:9
        streams should match exactly. These additional crops must
        be centered within the crop region.

        The output streams must maintain square pixels at all
        times, no matter what the relative aspect ratios of the
        crop region and the stream are. Negative values for the
        corner are allowed for raw output if the full pixel array is
        larger than the active pixel array. Width and height may be
        rounded to the nearest larger supportable width, especially
        for raw output, where only a few fixed scales may be
        possible. The width and height of the crop region cannot
        be set to be smaller than floor( activeArraySize.width /
        android.scaler.maxDigitalZoom ) and floor(
        activeArraySize.height / android.scaler.maxDigitalZoom ),
        respectively.
        </details>
        <tag id="BC" />
      </entry>
    </controls>
    <static>
      <entry name="availableFormats" type="int32"
          visibility="public" enum="true"
          container="array" typedef="imageFormat">
        <array>
          <size>n</size>
        </array>
        <enum>
          <value optional="true" id="0x20">RAW16
          <notes>
          RAW16 is a standard, cross-platform format for raw image
          buffers with 16-bit pixels.

          Buffers of this format are typically
          expected to have a Bayer Color Filter Array (CFA) layout, which
          is given in android.sensor.info.colorFilterArrangement. Sensors
          with CFAs that are not representable by a format in
          android.sensor.info.colorFilterArrangement should not use this
          format.

          Buffers of this format will also follow the constraints given for
          RAW_OPAQUE buffers, but with relaxed performance constraints.

          See android.scaler.availableInputFormats for the full set of
          performance guarantees.
          </notes>
          </value>
          <value optional="true" id="0x24">RAW_OPAQUE
          <notes>
          RAW_OPAQUE is a format for raw image buffers coming from an
          image sensor. The actual structure of buffers of this format is
          platform-specific, but must follow several constraints:

          1. No image post-processing operations may have been applied to
          buffers of this type. These buffers contain raw image data coming
          directly from the image sensor.
          1. If a buffer of this format is passed to the camera device for
          reprocessing, the resulting images will be identical to the images
          produced if the buffer had come directly from the sensor and was
          processed with the same settings.

          The intended use for this format is to allow access to the native
          raw format buffers coming directly from the camera sensor without
          any additional conversions or decrease in framerate.

          See android.scaler.availableInputFormats for the full set of
          performance guarantees.
          </notes>
          </value>
          <value optional="true" id="0x32315659">YV12
          <notes>YCrCb 4:2:0 Planar</notes>
          </value>
          <value optional="true" id="0x11">YCrCb_420_SP
          <notes>NV21</notes>
          </value>
          <value id="0x22">IMPLEMENTATION_DEFINED
          <notes>System internal format, not application-accessible</notes>
          </value>
          <value id="0x23">YCbCr_420_888
          <notes>Flexible YUV420 Format</notes>
          </value>
          <value id="0x21">BLOB
          <notes>JPEG format</notes>
          </value>
        </enum>
        <description>The list of image formats that are supported by this
        camera device.</description>
        <details>
        All camera devices will support JPEG and YUV_420_888 formats.

        When set to YUV_420_888, the application can access the YUV420 data directly.
        </details>
        <hal_details>
        These format values are from HAL_PIXEL_FORMAT_* in
        system/core/include/system/graphics.h.

        When IMPLEMENTATION_DEFINED is used, the platform
        gralloc module will select a format based on the usage flags provided
        by the camera HAL device and the other endpoint of the stream. It is
        usually used by preview and recording streams, where the application doesn't
        need to access the image data.

        The YCbCr_420_888 format must be supported by the HAL. When an image stream
        needs CPU/application direct access, this format will be used.

        The BLOB format must be supported by the HAL. This is used for the JPEG stream.

        A RAW_OPAQUE buffer should contain only pixel data. It is strongly
        recommended that any information used by the camera device when
        processing images is fully expressed by the result metadata
        for that image buffer.
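        As a sketch, the enum ids declared in this entry map to the
        HAL_PIXEL_FORMAT_* values as follows (constants taken directly from
        the enum above; the mandatory set is from the two "must be supported"
        statements):

        ```python
        # Format ids as declared in the availableFormats enum above; they
        # mirror HAL_PIXEL_FORMAT_* in system/core/include/system/graphics.h.
        HAL_PIXEL_FORMATS = {
            "YCrCb_420_SP": 0x11,            # NV21
            "RAW16": 0x20,
            "BLOB": 0x21,                    # JPEG stream
            "IMPLEMENTATION_DEFINED": 0x22,  # gralloc selects the real format
            "YCbCr_420_888": 0x23,           # flexible YUV, CPU-accessible
            "RAW_OPAQUE": 0x24,
            "YV12": 0x32315659,              # YCrCb 4:2:0 planar
        }

        # Formats every HAL must support, per the hal_details above.
        MANDATORY_FORMATS = ("BLOB", "YCbCr_420_888")
        ```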
        </hal_details>
        <tag id="BC" />
      </entry>
      <entry name="availableJpegMinDurations" type="int64" visibility="public"
          container="array">
        <array>
          <size>n</size>
        </array>
        <description>The minimum frame duration that is supported
        for each resolution in android.scaler.availableJpegSizes.
        </description>
        <units>ns</units>
        <details>
        This corresponds to the minimum steady-state frame duration when only
        that JPEG stream is active and captured in a burst, with all
        processing (typically in android.*.mode) set to FAST.

        When multiple streams are configured, the minimum
        frame duration will be &gt;= max(individual stream min
        durations).</details>
        <tag id="BC" />
      </entry>
      <entry name="availableJpegSizes" type="int32" visibility="public"
          container="array" typedef="size">
        <array>
          <size>n</size>
          <size>2</size>
        </array>
        <description>The JPEG resolutions that are supported by this camera device.</description>
        <details>
        The resolutions are listed as `(width, height)` pairs. All camera devices will support
        the sensor maximum resolution (defined by android.sensor.info.activeArraySize).
        </details>
        <hal_details>
        The HAL must include the sensor maximum resolution
        (defined by android.sensor.info.activeArraySize),
        and should include half/quarter of the sensor maximum resolution.
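        A sketch of the size list this implies, assuming a hypothetical
        sensor maximum resolution (helper name is illustrative only):

        ```python
        # Hypothetical sketch: build the JPEG size list suggested by the
        # hal_details above: the sensor maximum resolution is required,
        # half and quarter sizes are recommended.
        def recommended_jpeg_sizes(max_width, max_height):
            return [
                (max_width, max_height),            # required: sensor maximum
                (max_width // 2, max_height // 2),  # recommended: half
                (max_width // 4, max_height // 4),  # recommended: quarter
            ]
        ```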
        </hal_details>
        <tag id="BC" />
      </entry>
      <entry name="availableMaxDigitalZoom" type="float" visibility="public">
        <description>The maximum ratio between the active area width
        and crop region width, or between the active area height and
        crop region height, if the crop region height is larger
        than the width</description>
        <range>&gt;= 1</range>
        <tag id="BC" />
      </entry>
      <entry name="availableProcessedMinDurations" type="int64" visibility="public"
          container="array">
        <array>
          <size>n</size>
        </array>
        <description>For each available processed output size (defined in
        android.scaler.availableProcessedSizes), this property lists the
        minimum supportable frame duration for that size.
        </description>
        <units>ns</units>
        <details>
        This should correspond to the frame duration when only that processed
        stream is active, with all processing (typically in android.*.mode)
        set to FAST.

        When multiple streams are configured, the minimum frame duration will
        be &gt;= max(individual stream min durations).
        </details>
        <tag id="BC" />
      </entry>
      <entry name="availableProcessedSizes" type="int32" visibility="public"
          container="array" typedef="size">
        <array>
          <size>n</size>
          <size>2</size>
        </array>
        <description>The resolutions available for use with
        processed output streams, such as YV12, NV12, and
        platform opaque YUV/RGB streams to the GPU or video
        encoders.</description>
        <details>
        The resolutions are listed as `(width, height)` pairs.

        For a given use case, the actual maximum supported resolution
        may be lower than what is listed here, depending on the destination
        Surface for the image data. For example, for recording video,
        the video encoder chosen may have a maximum size limit (e.g. 1080p)
        smaller than what the camera (e.g. maximum resolution is 3264x2448)
        can provide.

        Please reference the documentation for the image data destination to
        check if it limits the maximum size for image data.
        </details>
        <hal_details>
        For FULL capability devices (`android.info.supportedHardwareLevel == FULL`),
        the HAL must include all JPEG sizes listed in android.scaler.availableJpegSizes
        and each resolution below if it is smaller than or equal to the sensor
        maximum resolution (if they are not listed in JPEG sizes already):

        * 240p (320 x 240)
        * 480p (640 x 480)
        * 720p (1280 x 720)
        * 1080p (1920 x 1080)

        For LIMITED capability devices (`android.info.supportedHardwareLevel == LIMITED`),
        the HAL only has to list up to the maximum video size supported by the device.
        </hal_details>
        <tag id="BC" />
      </entry>
      <entry name="availableRawMinDurations" type="int64"
          container="array">
        <array>
          <size>n</size>
        </array>
        <description>
        For each available raw output size (defined in
        android.scaler.availableRawSizes), this property lists the minimum
        supportable frame duration for that size.
        </description>
        <units>ns</units>
        <details>
        Should correspond to the frame duration when only the raw stream is
        active.

        When multiple streams are configured, the minimum
        frame duration will be &gt;= max(individual stream min
        durations).</details>
        <tag id="BC" />
      </entry>
      <entry name="availableRawSizes" type="int32"
          container="array" typedef="size">
        <array>
          <size>n</size>
          <size>2</size>
        </array>
        <description>The resolutions available for use with raw
        sensor output streams, listed as (width,
        height) pairs</description>
        <range>Must include: the sensor maximum resolution</range>
      </entry>
    </static>
    <dynamic>
      <clone entry="android.scaler.cropRegion" kind="controls">
      </clone>
    </dynamic>
  </section>
  <section name="sensor">
    <controls>
      <entry name="exposureTime" type="int64" visibility="public">
        <description>Duration each pixel is exposed to
        light.

        If the sensor can't expose this exact duration, it should shorten the
        duration exposed to the nearest possible value (rather than expose longer).
        </description>
        <units>nanoseconds</units>
        <range>android.sensor.info.exposureTimeRange</range>
        <details>1/10000 - 30 sec range. No bulb mode</details>
        <tag id="V1" />
      </entry>
      <entry name="frameDuration" type="int64" visibility="public">
        <description>Duration from start of frame exposure to
        start of next frame exposure.</description>
        <units>nanoseconds</units>
        <range>See android.sensor.info.maxFrameDuration,
        android.scaler.available*MinDurations. The duration
        is capped to `max(duration, exposureTime + overhead)`.</range>
        <details>
        The maximum frame rate that can be supported by a camera subsystem is
        a function of many factors:

        * Requested resolutions of output image streams
        * Availability of binning / skipping modes on the imager
        * The bandwidth of the imager interface
        * The bandwidth of the various ISP processing blocks

        Since these factors can vary greatly between different ISPs and
        sensors, the camera abstraction tries to represent the bandwidth
        restrictions with as simple a model as possible.

        The model presented has the following characteristics:

        * The image sensor is always configured to output the smallest
        resolution possible given the application's requested output stream
        sizes. The smallest resolution is defined as being at least as large
        as the largest requested output stream size; the camera pipeline must
        never digitally upsample sensor data when the crop region covers the
        whole sensor. In general, this means that if only small output stream
        resolutions are configured, the sensor can provide a higher frame
        rate.
        * Since any request may use any or all the currently configured
        output streams, the sensor and ISP must be configured to support
        scaling a single capture to all the streams at the same time. This
        means the camera pipeline must be ready to produce the largest
        requested output size without any delay. Therefore, the overall
        frame rate of a given configured stream set is governed only by the
        largest requested stream resolution.
        * Using more than one output stream in a request does not affect the
        frame duration.
        * JPEG streams act like processed YUV streams in requests for which
        they are not included; in requests in which they are directly
        referenced, they act as JPEG streams. This is because supporting a
        JPEG stream requires the underlying YUV data to always be ready for
        use by a JPEG encoder, but the encoder will only be used (and impact
        frame duration) on requests that actually reference a JPEG stream.
        * The JPEG processor can run concurrently to the rest of the camera
        pipeline, but cannot process more than 1 capture at a time.

        The necessary information for the application, given the model above,
        is provided via the android.scaler.available*MinDurations fields.
        These are used to determine the maximum frame rate / minimum frame
        duration that is possible for a given stream configuration.

        Specifically, the application can use the following rules to
        determine the minimum frame duration it can request from the HAL
        device:

        1. Given the application's currently configured set of output
        streams, `S`, divide them into three sets: streams in a JPEG format
        `SJ`, streams in a raw sensor format `SR`, and the rest ('processed')
        `SP`.
        1. For each subset of streams, find the largest resolution (by pixel
        count) in the subset. This gives (at most) three resolutions `RJ`,
        `RR`, and `RP`.
        1. If `RJ` is greater than `RP`, set `RP` equal to `RJ`. If there is
        no exact match for `RP == RJ` (in particular there isn't an available
        processed resolution at the same size as `RJ`), then set `RP` equal
        to the smallest processed resolution that is larger than `RJ`. If
        there are no processed resolutions larger than `RJ`, then set `RP` to
        the processed resolution closest to `RJ`.
        1. If `RP` is greater than `RR`, set `RR` equal to `RP`. If there is
        no exact match for `RR == RP` (in particular there isn't an available
        raw resolution at the same size as `RP`), then set `RR` equal
        to the smallest raw resolution that is larger than `RP`. If
        there are no raw resolutions larger than `RP`, then set `RR` to
        the raw resolution closest to `RP`.
        1. Look up the matching minimum frame durations in the property lists
        android.scaler.availableJpegMinDurations,
        android.scaler.availableRawMinDurations, and
        android.scaler.availableProcessedMinDurations. This gives three
        minimum frame durations `FJ`, `FR`, and `FP`.
        1. If a stream of requests does not use a JPEG stream, then the minimum
        supported frame duration for each request is `max(FR, FP)`.
        1. If a stream of requests all use the JPEG stream, then the minimum
        supported frame duration for each request is `max(FR, FP, FJ)`.
        1. If a mix of JPEG-using and non-JPEG-using requests is submitted by
        the application, then the HAL will have to delay JPEG-using requests
        whenever the JPEG encoder is still busy processing an older capture.
        This will happen whenever a JPEG-using request starts capture less
        than `FJ` _ns_ after a previous JPEG-using request. The minimum
        supported frame duration will vary between the values calculated in
        \#6 and \#7.
        </details>
        <tag id="V1" />
        <tag id="BC" />
      </entry>
      <entry name="sensitivity" type="int32" visibility="public">
        <description>Gain applied to image data. Must be
        implemented through analog gain only if set to values
        below 'maximum analog sensitivity'.

        If the sensor can't apply this exact gain, it should lessen the
        gain to the nearest possible value (rather than gain more).
        </description>
        <units>ISO arithmetic units</units>
        <range>android.sensor.info.sensitivityRange</range>
        <details>ISO 12232:2006 REI method</details>
        <tag id="V1" />
      </entry>
    </controls>
    <static>
      <namespace name="info">
        <entry name="activeArraySize" type="int32" visibility="public"
            type_notes="Four ints defining the active pixel rectangle"
            container="array"
            typedef="rectangle">
          <array>
            <size>4</size>
          </array>
          <description>Area of raw data which corresponds to only
          active pixels.</description>
          <range>This array contains `(xmin, ymin, width, height)`. The `(xmin, ymin)` must be
          &gt;= `(0,0)`. The `(width, height)` must be &lt;=
          `android.sensor.info.pixelArraySize`.
          </range>
          <details>It is smaller than or equal to the
          sensor full pixel array, which could include the black calibration pixels.</details>
          <tag id="DNG" />
        </entry>
        <entry name="sensitivityRange" type="int32" visibility="public"
            type_notes="Range of supported sensitivities"
            container="array">
          <array>
            <size>2</size>
          </array>
          <description>Range of valid sensitivities</description>
          <range>Min &lt;= 100, Max &gt;= 1600</range>
          <tag id="BC" />
          <tag id="V1" />
        </entry>
        <entry name="colorFilterArrangement" type="byte" enum="true">
          <enum>
            <value>RGGB</value>
            <value>GRBG</value>
            <value>GBRG</value>
            <value>BGGR</value>
            <value>RGB
            <notes>Sensor is not Bayer; output has 3 16-bit
            values for each pixel, instead of just 1 16-bit value
            per pixel.</notes></value>
          </enum>
          <description>Arrangement of color filters on sensor;
          represents the colors in the top-left 2x2 section of
          the sensor, in reading order</description>
          <tag id="DNG" />
        </entry>
        <entry name="exposureTimeRange" type="int64" visibility="public"
            type_notes="nanoseconds" container="array">
          <array>
            <size>2</size>
          </array>
          <description>Range of valid exposure
          times used by android.sensor.exposureTime.</description>
          <range>Min &lt;= 100e3 (100 us), Max &gt;= 1e9 (1
          sec)</range>
          <hal_details>The maximum of the range must be at least
          1 second. It should be at least 30 seconds.</hal_details>
          <tag id="V1" />
        </entry>
        <entry name="maxFrameDuration" type="int64" visibility="public">
          <description>Maximum possible frame duration (minimum frame
          rate).</description>
          <units>nanoseconds</units>
          <range>&gt;= 30e9</range>
          <details>The largest possible android.sensor.frameDuration
          that will be accepted by the camera device. Attempting to use
          frame durations beyond the maximum will result in the frame duration
          being clipped to the maximum. See that control
          for a full definition of frame durations.

          Refer to
          android.scaler.availableProcessedMinDurations,
          android.scaler.availableJpegMinDurations, and
          android.scaler.availableRawMinDurations for the minimum
          frame duration values.
          </details>
          <hal_details>
          This value must be at least 1 second. It should be at least 30
          seconds (30e9 ns).

          android.sensor.maxFrameDuration must be greater than or equal to the
          android.sensor.exposureTimeRange max value (since exposure time
          overrides frame duration).
          </hal_details>
          <tag id="BC" />
          <tag id="V1" />
        </entry>
        <entry name="physicalSize" type="float" visibility="public"
            type_notes="width x height in millimeters"
            container="array">
          <array>
            <size>2</size>
          </array>
          <description>The physical dimensions of the full pixel
          array</description>
          <details>Needed for FOV calculation for old API</details>
          <tag id="V1" />
          <tag id="BC" />
        </entry>
        <entry name="pixelArraySize" type="int32"
            container="array" typedef="size">
          <array>
            <size>2</size>
          </array>
          <description>Dimensions of full pixel array, possibly
          including black calibration pixels.</description>
          <details>The maximum output resolution for raw format in
          android.scaler.availableRawSizes will be equal to this size.
          </details>
          <tag id="DNG" />
          <tag id="BC" />
        </entry>
        <entry name="whiteLevel" type="int32">
          <description>Maximum raw value output by
          sensor</description>
          <range>&gt; 1024 (10-bit output)</range>
          <details>Defines sensor bit depth (10-14 bits is
          expected)</details>
          <tag id="DNG" />
        </entry>
      </namespace>
      <entry name="baseGainFactor" type="rational" visibility="public"
          optional="true">
        <description>Gain factor from electrons to raw units when
        ISO=100</description>
        <tag id="V1" />
        <tag id="FULL" />
      </entry>
      <entry name="blackLevelPattern" type="int32" visibility="public"
          optional="true" type_notes="2x2 raw count block" container="array">
        <array>
          <size>4</size>
        </array>
        <description>
        A fixed black level offset for each of the color filter arrangement
        (CFA) mosaic channels.
        </description>
        <range>&gt;= 0 for each.</range>
        <details>
        This tag specifies the zero light value for each of the CFA mosaic
        channels in the camera sensor.

        The values are given in row-column scan order, with the first value
        corresponding to the element of the CFA in row=0, column=0.
        </details>
        <tag id="DNG" />
      </entry>
      <entry name="calibrationTransform1" type="rational"
          type_notes="3x3 matrix in row-major-order"
          container="array">
        <array>
          <size>9</size>
        </array>
        <description>Per-device calibration on top of color space
        transform 1</description>
        <tag id="DNG" />
      </entry>
      <entry name="calibrationTransform2" type="rational"
          type_notes="3x3 matrix in row-major-order"
          container="array">
        <array>
          <size>9</size>
        </array>
        <description>Per-device calibration on top of color space
        transform 2</description>
        <tag id="DNG" />
      </entry>
      <entry name="colorTransform1" type="rational"
          type_notes="3x3 matrix in row-major-order"
          container="array">
        <array>
          <size>9</size>
        </array>
        <description>Linear mapping from XYZ (D50) color space to
        reference linear sensor color, for the first reference
        illuminant</description>
        <details>Use as follows: XYZ = inv(transform) * clip( (raw -
        black level(raw) ) / ( white level - max black level) ).
        This holds at least in the simple case.</details>
        <tag id="DNG" />
      </entry>
      <entry name="colorTransform2" type="rational"
          type_notes="3x3 matrix in row-major-order"
          container="array">
        <array>
          <size>9</size>
        </array>
        <description>Linear mapping from XYZ (D50) color space to
        reference linear sensor color, for the second reference
        illuminant</description>
        <tag id="DNG" />
      </entry>
      <entry name="forwardMatrix1" type="rational"
          type_notes="3x3 matrix in row-major-order"
          container="array">
        <array>
          <size>9</size>
        </array>
        <description>Used by DNG for better WB
        adaptation</description>
        <tag id="DNG" />
      </entry>
      <entry name="forwardMatrix2" type="rational"
          type_notes="3x3 matrix in row-major-order"
          container="array">
        <array>
          <size>9</size>
        </array>
        <description>Used by DNG for better WB
        adaptation</description>
        <tag id="DNG" />
      </entry>
      <entry name="maxAnalogSensitivity" type="int32" visibility="public"
          optional="true">
        <description>Maximum sensitivity that is implemented
        purely through analog gain.</description>
        <details>For android.sensor.sensitivity values less than or
        equal to this, all applied gain must be analog. For
        values above this, the gain applied can be a mix of analog and
        digital.</details>
        <tag id="V1" />
        <tag id="FULL" />
      </entry>
      <entry name="orientation" type="int32" visibility="public">
        <description>Clockwise angle through which the output
        image needs to be rotated to be upright on the device
        screen in its native orientation.
Also defines the 3370 direction of rolling shutter readout, which is from top 3371 to bottom in the sensor's coordinate system</description> 3372 <units>degrees clockwise rotation, only multiples of 3373 90</units> 3374 <range>0,90,180,270</range> 3375 <tag id="BC" /> 3376 </entry> 3377 <entry name="profileHueSatMapDimensions" type="int32" 3378 visibility="public" optional="true" 3379 type_notes="Number of samples for hue, saturation, and value" 3380 container="array"> 3381 <array> 3382 <size>3</size> 3383 </array> 3384 <description> 3385 The number of input samples for each dimension of 3386 android.sensor.profileHueSatMap. 3387 </description> 3388 <range> 3389 Hue &gt;= 1, 3390 Saturation &gt;= 2, 3391 Value &gt;= 1 3392 </range> 3393 <details> 3394 The number of input samples for the hue, saturation, and value 3395 dimension of android.sensor.profileHueSatMap. The order of the 3396 dimensions given is hue, saturation, value; where hue is the 0th 3397 element. 3398 </details> 3399 <tag id="DNG" /> 3400 </entry> 3401 <entry name="referenceIlluminant1" type="byte" enum="true"> 3402 <enum> 3403 <value id="1">DAYLIGHT</value> 3404 <value id="2">FLUORESCENT</value> 3405 <value id="3">TUNGSTEN 3406 <notes>Incandescent light</notes></value> 3407 <value id="4">FLASH</value> 3408 <value id="9">FINE_WEATHER</value> 3409 <value id="10">CLOUDY_WEATHER</value> 3410 <value id="11">SHADE</value> 3411 <value id="12">DAYLIGHT_FLUORESCENT 3412 <notes>D 5700 - 7100K</notes></value> 3413 <value id="13">DAY_WHITE_FLUORESCENT 3414 <notes>N 4600 - 5400K</notes></value> 3415 <value id="14">COOL_WHITE_FLUORESCENT 3416 <notes>W 3900 - 4500K</notes></value> 3417 <value id="15">WHITE_FLUORESCENT 3418 <notes>WW 3200 - 3700K</notes></value> 3419 <value id="17">STANDARD_A</value> 3420 <value id="18">STANDARD_B</value> 3421 <value id="19">STANDARD_C</value> 3422 <value id="20">D55</value> 3423 <value id="21">D65</value> 3424 <value id="22">D75</value> 3425 <value id="23">D50</value> 3426 <value 
id="24">ISO_STUDIO_TUNGSTEN</value> 3427 </enum> 3428 <description>Light source used to define transform 3429 1</description> 3430 <details>Corresponds to the EXIF LightSource tag. It is not yet 3431 settled whether all of these illuminants must be supported; a correlated color temperature (CCT) is needed for each.</details> 3432 <tag id="DNG" /> 3433 <tag id="EXIF" /> 3434 </entry> 3435 <entry name="referenceIlluminant2" type="byte"> 3436 <description>Light source used to define transform 3437 2</description> 3438 <units>Same as illuminant 1</units> 3439 </entry> 3440 </static> 3441 <dynamic> 3442 <clone entry="android.sensor.exposureTime" kind="controls"> 3443 </clone> 3444 <clone entry="android.sensor.frameDuration" 3445 kind="controls"></clone> 3446 <clone entry="android.sensor.sensitivity" kind="controls"> 3447 </clone> 3448 <entry name="timestamp" type="int64" visibility="public"> 3449 <description>Time at start of exposure of first 3450 row</description> 3451 <units>nanoseconds</units> 3452 <range>&gt; 0</range> 3453 <details>Monotonic; should be synchronized with other timestamps in 3454 the system</details> 3455 <tag id="BC" /> 3456 </entry> 3457 <entry name="temperature" type="float" visibility="public" 3458 optional="true"> 3459 <description>The temperature of the sensor, sampled at the time 3460 exposure began for this frame. 3461 3462 The thermal diode being queried should be inside the sensor PCB, or 3463 somewhere close to it. 3464 </description> 3465 3466 <units>celsius</units> 3467 <range>Optional. This value is missing if no temperature is available.</range> 3468 <tag id="FULL" /> 3469 </entry> 3470 <entry name="neutralColorPoint" type="rational" visibility="public" 3471 optional="true" container="array"> 3472 <array> 3473 <size>3</size> 3474 </array> 3475 <description> 3476 The estimated white balance at the time of capture. 3477 </description> 3478 <details> 3479 The estimated white balance encoded as the RGB values of the 3480 perfectly neutral color point in the linear native sensor color space.
3481 The order of the values is R, G, B; where R is in the lowest index. 3482 </details> 3483 <tag id="DNG" /> 3484 </entry> 3485 <entry name="profileHueSatMap" type="float" 3486 visibility="public" optional="true" 3487 type_notes="Mapping for hue, saturation, and value" 3488 container="array"> 3489 <array> 3490 <size>hue_samples</size> 3491 <size>saturation_samples</size> 3492 <size>value_samples</size> 3493 <size>3</size> 3494 </array> 3495 <description> 3496 A mapping containing a hue shift, saturation scale, and value scale 3497 for each pixel. 3498 </description> 3499 <units> 3500 Hue shift is given in degrees; saturation and value scale factors are 3501 unitless. 3502 </units> 3503 <details> 3504 hue_samples, saturation_samples, and value_samples are given in 3505 android.sensor.profileHueSatMapDimensions. 3506 3507 Each entry of this map contains three floats corresponding to the 3508 hue shift, saturation scale, and value scale, respectively; where the 3509 hue shift has the lowest index. The map entries are stored in the tag 3510 in nested loop order, with the value divisions in the outer loop, the 3511 hue divisions in the middle loop, and the saturation divisions in the 3512 inner loop. All zero input saturation entries are required to have a 3513 value scale factor of 1.0. 3514 </details> 3515 <tag id="DNG" /> 3516 </entry> 3517 <entry name="profileToneCurve" type="float" 3518 visibility="public" optional="true" 3519 type_notes="Samples defining a spline for a tone-mapping curve" 3520 container="array"> 3521 <array> 3522 <size>samples</size> 3523 <size>2</size> 3524 </array> 3525 <description> 3526 A list of x,y samples defining a tone-mapping curve for gamma adjustment. 3527 </description> 3528 <range> 3529 Each sample has an input range of `[0, 1]` and an output range of 3530 `[0, 1]`. The first sample is required to be `(0, 0)`, and the last 3531 sample is required to be `(1, 1)`. 
3532 </range> 3533 <details> 3534 This tag contains a default tone curve that can be applied while 3535 processing the image as a starting point for user adjustments. 3536 The curve is specified as a list of value pairs in linear gamma. 3537 The curve is interpolated using a cubic spline. 3538 </details> 3539 <tag id="DNG" /> 3540 </entry> 3541 </dynamic> 3542 <controls> 3543 <entry name="testPatternData" type="int32" visibility="public" optional="true" container="array"> 3544 <array> 3545 <size>4</size> 3546 </array> 3547 <description> 3548 A pixel `[R, G_even, G_odd, B]` that supplies the test pattern 3549 when android.sensor.testPatternMode is SOLID_COLOR. 3550 </description> 3551 <range>Optional. 3552 Must be supported if android.sensor.availableTestPatternModes contains 3553 SOLID_COLOR.</range> 3554 <details> 3555 Each color channel is treated as an unsigned 32-bit integer. 3556 The camera device then uses the X most significant bits, 3557 where X is the bit depth of its Bayer raw sensor 3558 output. 3559 3560 For example, a sensor with RAW10 Bayer output would use the 3561 10 most significant bits from each color channel. 3562 </details> 3563 <hal_details> 3564 </hal_details> 3565 </entry> 3566 <entry name="testPatternMode" type="int32" visibility="public" optional="true" 3567 enum="true"> 3568 <enum> 3569 <value>OFF 3570 <notes>Default. No test pattern mode is used, and the camera 3571 device returns captures from the image sensor.</notes> 3572 </value> 3573 <value>SOLID_COLOR 3574 <notes> 3575 Each pixel in `[R, G_even, G_odd, B]` is replaced by its 3576 respective color channel provided in 3577 android.sensor.testPatternData. 3578 3579 For example: 3580 3581 android.sensor.testPatternData = [0, 0xFFFFFFFF, 0xFFFFFFFF, 0] 3582 3583 All green pixels are 100% green. All red/blue pixels are black. 3584 3585 android.sensor.testPatternData = [0xFFFFFFFF, 0, 0xFFFFFFFF, 0] 3586 3587 All red pixels are 100% red. Only the odd green pixels 3588 are 100% green.
All blue pixels are 100% black. 3589 </notes> 3590 </value> 3591 <value>COLOR_BARS 3592 <notes> 3593 All pixel data is replaced with an 8-bar color pattern. 3594 3595 The vertical bars (left-to-right) are as follows: 3596 3597 * 100% white 3598 * yellow 3599 * cyan 3600 * green 3601 * magenta 3602 * red 3603 * blue 3604 * black 3605 3606 In general the image would look like the following: 3607 3608 W Y C G M R B K 3609 W Y C G M R B K 3610 W Y C G M R B K 3611 W Y C G M R B K 3612 W Y C G M R B K 3613 . . . . . . . . 3614 . . . . . . . . 3615 . . . . . . . . 3616 3617 (B = Blue, K = Black) 3618 3619 Each bar should take up 1/8 of the sensor pixel array width. 3620 When this is not possible, the bar size should be rounded 3621 down to the nearest integer and the pattern can repeat 3622 on the right side. 3623 3624 Each bar's height must always take up the full sensor 3625 pixel array height. 3626 3627 Each pixel in this test pattern must be set to either 3628 0% intensity or 100% intensity. 3629 </notes> 3630 </value> 3631 <value>COLOR_BARS_FADE_TO_GRAY 3632 <notes> 3633 The test pattern is similar to COLOR_BARS, except that 3634 each bar should start at its specified color at the top, 3635 and fade to gray at the bottom. 3636 3637 Furthermore, each bar is subdivided into a left and 3638 right half. The left half should have a smooth gradient, 3639 and the right half should have a quantized gradient. 3640 3641 In particular, the right half should consist of blocks of the 3642 same color, each 1/16th of the active sensor pixel array width. 3643 3644 The least significant bits in the quantized gradient should 3645 be copied from the most significant bits of the smooth gradient. 3646 3647 The height of each bar should always be a multiple of 128. 3648 When this is not the case, the pattern should repeat at the bottom 3649 of the image.
3650 </notes> 3651 </value> 3652 <value>PN9 3653 <notes> 3654 All pixel data is replaced by a pseudo-random sequence 3655 generated from a PN9 512-bit sequence (typically implemented 3656 in hardware with a linear feedback shift register). 3657 3658 The generator should be reset at the beginning of each frame, 3659 and thus each subsequent raw frame with this test pattern should 3660 be exactly the same as the last. 3661 </notes> 3662 </value> 3663 <value id="256">CUSTOM1 3664 <notes>The first custom test pattern. All custom patterns that are 3665 available only on this camera device are assigned values at least this 3666 large. 3667 3668 All of the custom test patterns will be static 3669 (that is, the raw image must not vary from frame to frame). 3670 </notes> 3671 </value> 3672 </enum> 3673 <description>When enabled, the sensor sends a test pattern instead of 3674 doing a real exposure from the camera. 3675 </description> 3676 <range>Optional. Defaults to OFF. Value must be one of 3677 android.sensor.availableTestPatternModes</range> 3678 <details> 3679 When a test pattern is enabled, all manual sensor controls specified 3680 by android.sensor.* should be ignored. All other controls should 3681 work as normal. 3682 3683 For example, if manual flash is enabled, flash firing should still 3684 occur (and the test pattern should remain unmodified, since the flash 3685 would not actually affect it). 3686 </details> 3687 <hal_details> 3688 All test patterns are specified in the Bayer domain. 3689 3690 The HAL may choose to substitute test patterns from the sensor 3691 with test patterns from on-device memory. In that case, it should be 3692 indistinguishable to the ISP whether the data came from the 3693 sensor interconnect bus (such as CSI2) or memory.
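<!-- Illustrative sketch of the most-significant-bits rule for SOLID_COLOR (Python; the helper name is hypothetical):

```python
# Sketch: the raw sample value a sensor would emit for a SOLID_COLOR
# test pattern channel, per the testPatternData rule: treat the channel
# as an unsigned 32-bit integer and keep as many of its most significant
# bits as the raw output has.

def solid_color_sample(channel_value, raw_bits):
    channel_value &= 0xFFFFFFFF          # unsigned 32-bit
    return channel_value >> (32 - raw_bits)

# RAW10 keeps the 10 most significant bits:
print(solid_color_sample(0xFFFFFFFF, 10))  # 1023 (full scale)
print(solid_color_sample(0x00000000, 10))  # 0
```
-->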
3694 </hal_details> 3695 </entry> 3696 </controls> 3697 <dynamic> 3698 <clone entry="android.sensor.testPatternMode" kind="controls"> 3699 </clone> 3700 </dynamic> 3701 <static> 3702 <entry name="availableTestPatternModes" type="byte" visibility="public" 3703 optional="true"> 3704 <description>Optional. Defaults to [OFF]. Lists the supported test 3705 pattern modes for android.sensor.testPatternMode. 3706 </description> 3707 <range>Must include OFF. All custom modes must be &gt;= CUSTOM1</range> 3708 </entry> 3709 3710 </static> 3711 </section> 3712 <section name="shading"> 3713 <controls> 3714 <entry name="mode" type="byte" visibility="hidden" enum="true"> 3715 <enum> 3716 <value>OFF 3717 <notes>No lens shading correction is applied</notes></value> 3718 <value>FAST 3719 <notes>Must not slow down frame rate relative to sensor raw output</notes></value> 3720 <value>HIGH_QUALITY 3721 <notes>Frame rate may be reduced by high quality</notes></value> 3722 </enum> 3723 <description>Quality of lens shading correction applied 3724 to the image data.</description> 3725 <details> 3726 When set to OFF mode, no lens shading correction will be applied by the 3727 camera device, and identity lens shading map data will be provided 3728 if `android.statistics.lensShadingMapMode == ON`. For example, for a lens 3729 shading map with size specified as `android.lens.info.shadingMapSize = [ 4, 3 ]`, 3730 the output android.statistics.lensShadingMap for this case will be an identity map 3731 shown below: 3732 3733 [ 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 3734 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 3735 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 3736 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 3737 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 3738 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0 ] 3739 3740 When set to other modes, lens shading correction will be applied by the 3741 camera device.
Applications can request lens shading map data by setting 3742 android.statistics.lensShadingMapMode to ON, and then the camera device will provide 3743 lens shading map data in android.statistics.lensShadingMap, with size specified 3744 by android.lens.info.shadingMapSize. 3745 </details> 3746 </entry> 3747 <entry name="strength" type="byte"> 3748 <description>Control the amount of shading correction 3749 applied to the images</description> 3750 <units>unitless: 1-10; 10 is full shading 3751 compensation</units> 3752 <tag id="ADV" /> 3753 </entry> 3754 </controls> 3755 <dynamic> 3756 <clone entry="android.shading.mode" kind="controls"> 3757 </clone> 3758 </dynamic> 3759 </section> 3760 <section name="statistics"> 3761 <controls> 3762 <entry name="faceDetectMode" type="byte" visibility="public" enum="true"> 3763 <enum> 3764 <value>OFF</value> 3765 <value>SIMPLE 3766 <notes>Optional. Return rectangle and confidence 3767 only</notes></value> 3768 <value>FULL 3769 <notes>Optional. Return all face 3770 metadata</notes></value> 3771 </enum> 3772 <description>State of the face detector 3773 unit</description> 3774 <range> 3775 android.statistics.info.availableFaceDetectModes</range> 3776 <details>Whether face detection is enabled, and whether it 3777 should output just the basic fields or the full set of 3778 fields.
Value must be one of the 3779 android.statistics.info.availableFaceDetectModes.</details> 3780 <tag id="BC" /> 3781 </entry> 3782 <entry name="histogramMode" type="byte" enum="true" typedef="boolean"> 3783 <enum> 3784 <value>OFF</value> 3785 <value>ON</value> 3786 </enum> 3787 <description>Operating mode for histogram 3788 generation</description> 3789 <tag id="V1" /> 3790 </entry> 3791 <entry name="sharpnessMapMode" type="byte" enum="true" typedef="boolean"> 3792 <enum> 3793 <value>OFF</value> 3794 <value>ON</value> 3795 </enum> 3796 <description>Operating mode for sharpness map 3797 generation</description> 3798 <tag id="V1" /> 3799 </entry> 3800 </controls> 3801 <static> 3802 <namespace name="info"> 3803 <entry name="availableFaceDetectModes" type="byte" 3804 visibility="public" 3805 type_notes="List of enums from android.statistics.faceDetectMode" 3806 container="array"> 3807 <array> 3808 <size>n</size> 3809 </array> 3810 <description>Which face detection modes are available, 3811 if any</description> 3812 <units>List of enum: 3813 OFF 3814 SIMPLE 3815 FULL</units> 3816 <details>OFF means face detection is disabled; it must 3817 be included in the list. 3818 3819 SIMPLE means the device supports the 3820 android.statistics.faceRectangles and 3821 android.statistics.faceScores outputs. 3822 3823 FULL means the device additionally supports the 3824 android.statistics.faceIds and 3825 android.statistics.faceLandmarks outputs.
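<!-- Illustrative sketch of how an application might pick the most capable supported mode from this list (Python; the enum constants and helper name are illustrative stand-ins):

```python
# Sketch: choose the most capable face detect mode that the device
# advertises in availableFaceDetectModes. OFF must always be present.
OFF, SIMPLE, FULL = 0, 1, 2

def pick_face_detect_mode(available_modes):
    """Prefer FULL, then SIMPLE, falling back to OFF."""
    for mode in (FULL, SIMPLE, OFF):
        if mode in available_modes:
            return mode
    raise ValueError("OFF must always be in availableFaceDetectModes")

print(pick_face_detect_mode([OFF, SIMPLE]))  # SIMPLE is the best available
```
-->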
3826 </details> 3827 </entry> 3828 <entry name="histogramBucketCount" type="int32"> 3829 <description>Number of histogram buckets 3830 supported</description> 3831 <range>&gt;= 64</range> 3832 </entry> 3833 <entry name="maxFaceCount" type="int32" visibility="public" > 3834 <description>Maximum number of simultaneously detectable 3835 faces</description> 3836 <range>&gt;= 4 if availableFaceDetectModes lists 3837 modes besides OFF, otherwise 0</range> 3838 </entry> 3839 <entry name="maxHistogramCount" type="int32"> 3840 <description>Maximum value possible for a histogram 3841 bucket</description> 3842 </entry> 3843 <entry name="maxSharpnessMapValue" type="int32"> 3844 <description>Maximum value possible for a sharpness map 3845 region.</description> 3846 </entry> 3847 <entry name="sharpnessMapSize" type="int32" 3848 type_notes="width x height" container="array" typedef="size"> 3849 <array> 3850 <size>2</size> 3851 </array> 3852 <description>Dimensions of the sharpness 3853 map</description> 3854 <range>Must be at least 32 x 32</range> 3855 </entry> 3856 </namespace> 3857 </static> 3858 <dynamic> 3859 <clone entry="android.statistics.faceDetectMode" 3860 kind="controls"></clone> 3861 <entry name="faceIds" type="int32" visibility="hidden" container="array"> 3862 <array> 3863 <size>n</size> 3864 </array> 3865 <description>List of unique IDs for detected 3866 faces</description> 3867 <details>Only available if faceDetectMode == FULL</details> 3868 <tag id="BC" /> 3869 </entry> 3870 <entry name="faceLandmarks" type="int32" visibility="hidden" 3871 type_notes="(leftEyeX, leftEyeY, rightEyeX, rightEyeY, mouthX, mouthY)" 3872 container="array"> 3873 <array> 3874 <size>n</size> 3875 <size>6</size> 3876 </array> 3877 <description>List of landmarks for detected 3878 faces</description> 3879 <details>Only available if faceDetectMode == FULL</details> 3880 <tag id="BC" /> 3881 </entry> 3882 <entry name="faceRectangles" type="int32" visibility="hidden" 3883 type_notes="(xmin,
ymin, xmax, ymax). (0,0) is top-left of active pixel area" 3884 container="array" typedef="rectangle"> 3885 <array> 3886 <size>n</size> 3887 <size>4</size> 3888 </array> 3889 <description>List of the bounding rectangles for detected 3890 faces</description> 3891 <details>Only available if faceDetectMode != OFF</details> 3892 <tag id="BC" /> 3893 </entry> 3894 <entry name="faceScores" type="byte" visibility="hidden" container="array"> 3895 <array> 3896 <size>n</size> 3897 </array> 3898 <description>List of the face confidence scores for 3899 detected faces</description> 3900 <range>1-100</range> 3901 <details>Only available if faceDetectMode != OFF. The value should be 3902 meaningful (for example, setting 100 at all times is illegal).</details> 3903 <tag id="BC" /> 3904 </entry> 3905 <entry name="histogram" type="int32" 3906 type_notes="count of pixels for each color channel that fall into each histogram bucket, scaled to be between 0 and maxHistogramCount" 3907 container="array"> 3908 <array> 3909 <size>n</size> 3910 <size>3</size> 3911 </array> 3912 <description>A 3-channel histogram based on the raw 3913 sensor data</description> 3914 <details>The k'th bucket (0-based) covers the input range 3915 (with w = android.sensor.info.whiteLevel) of [ k * w/N, 3916 (k + 1) * w / N ). If only a monochrome sharpness map is 3917 supported, all channels should have the same data</details> 3918 <tag id="V1" /> 3919 </entry> 3920 <clone entry="android.statistics.histogramMode" 3921 kind="controls"></clone> 3922 <entry name="sharpnessMap" type="int32" 3923 type_notes="estimated sharpness for each region of the input image. Normalized to be between 0 and maxSharpnessMapValue. 
Higher values mean sharper (better focused)" 3924 container="array"> 3925 <array> 3926 <size>n</size> 3927 <size>m</size> 3928 <size>3</size> 3929 </array> 3930 <description>A 3-channel sharpness map, based on the raw 3931 sensor data</description> 3932 <details>If only a monochrome sharpness map is supported, 3933 all channels should have the same data</details> 3934 <tag id="V1" /> 3935 </entry> 3936 <clone entry="android.statistics.sharpnessMapMode" 3937 kind="controls"></clone> 3938 <entry name="lensShadingMap" type="float" visibility="public" 3939 type_notes="2D array of float gain factors per channel to correct lens shading" 3940 container="array"> 3941 <array> 3942 <size>4</size> 3943 <size>n</size> 3944 <size>m</size> 3945 </array> 3946 <description>The shading map is a low-resolution floating-point map 3947 that lists the coefficients used to correct for vignetting, for each 3948 Bayer color channel.</description> 3949 <range>Each gain factor is &gt;= 1</range> 3950 <details>The least shaded section of the image should have a gain factor 3951 of 1; all other sections should have gains above 1. 3952 3953 When android.colorCorrection.mode = TRANSFORM_MATRIX, the map 3954 must take into account the colorCorrection settings. 3955 3956 The shading map is for the entire active pixel array, and is not 3957 affected by the crop region specified in the request. Each shading map 3958 entry is the value of the shading compensation map over a specific 3959 pixel on the sensor. Specifically, with a (N x M) resolution shading 3960 map, and an active pixel array size (W x H), shading map entry 3961 (x,y) ϵ (0 ... N-1, 0 ... M-1) is the value of the shading map at 3962 pixel ( ((W-1)/(N-1)) * x, ((H-1)/(M-1)) * y) for the four color channels. 3963 The map is assumed to be bilinearly interpolated between the sample points. 
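<!-- The sample-point mapping above can be sketched as follows (Python; a simplified single-channel model with illustrative names and an illustrative map):

```python
# Sketch: bilinearly interpolate a single channel of an (N x M) shading
# map at sensor pixel (px, py) of a (W x H) active array, following the
# mapping above: map entry (x, y) sits at pixel
# ( ((W-1)/(N-1)) * x, ((H-1)/(M-1)) * y ).

def shading_gain(shading_map, N, M, W, H, px, py):
    # Convert the pixel position to fractional map coordinates.
    mx = px * (N - 1) / (W - 1)
    my = py * (M - 1) / (H - 1)
    x0, y0 = int(mx), int(my)
    x1, y1 = min(x0 + 1, N - 1), min(y0 + 1, M - 1)
    fx, fy = mx - x0, my - y0
    # shading_map is indexed [row][column], i.e. [y][x].
    top = shading_map[y0][x0] * (1 - fx) + shading_map[y0][x1] * fx
    bot = shading_map[y1][x0] * (1 - fx) + shading_map[y1][x1] * fx
    return top * (1 - fy) + bot * fy

# At an exact sample point, the gain equals the stored map value.
m = [[1.3, 1.2, 1.15, 1.2],
     [1.1, 1.2, 1.2, 1.2],
     [1.2, 1.2, 1.25, 1.1]]
print(shading_gain(m, 4, 3, 400, 300, 399, 299))  # 1.1 (the corner entry)
```
-->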
3964 3965 The channel order is [R, Geven, Godd, B], where Geven is the green 3966 channel for the even rows of a Bayer pattern, and Godd is the odd rows. 3967 The shading map is stored in a fully interleaved format, and its size 3968 is provided in the camera static metadata by android.lens.info.shadingMapSize. 3969 3970 The shading map should have on the order of 30-40 rows and columns, 3971 and must be smaller than 64x64. 3972 3973 As an example, given a very small map defined as: 3974 3975 android.lens.info.shadingMapSize = [ 4, 3 ] 3976 android.statistics.lensShadingMap = 3977 [ 1.3, 1.2, 1.15, 1.2, 1.2, 1.2, 1.15, 1.2, 3978 1.1, 1.2, 1.2, 1.2, 1.3, 1.2, 1.3, 1.3, 3979 1.2, 1.2, 1.25, 1.1, 1.1, 1.1, 1.1, 1.0, 3980 1.0, 1.0, 1.0, 1.0, 1.2, 1.3, 1.25, 1.2, 3981 1.3, 1.2, 1.2, 1.3, 1.2, 1.15, 1.1, 1.2, 3982 1.2, 1.1, 1.0, 1.2, 1.3, 1.15, 1.2, 1.3 ] 3983 3984 The low-resolution scaling map images for each channel are 3985 (displayed using nearest-neighbor interpolation): 3986 3987 ![Red lens shading map](android.statistics.lensShadingMap/red_shading.png) 3988 ![Green (even rows) lens shading map](android.statistics.lensShadingMap/green_e_shading.png) 3989 ![Green (odd rows) lens shading map](android.statistics.lensShadingMap/green_o_shading.png) 3990 ![Blue lens shading map](android.statistics.lensShadingMap/blue_shading.png) 3991 3992 As a visualization only, inverting the full-color map to recover an 3993 image of a gray wall (using bicubic interpolation for visual quality) as captured by the sensor gives: 3994 3995 ![Image of a uniform white wall (inverse shading map)](android.statistics.lensShadingMap/inv_shading.png) 3996 </details> 3997 </entry> 3998 <entry name="predictedColorGains" type="float" 3999 visibility="hidden" 4000 optional="true" 4001 type_notes="A 1D array of floats for 4 color channel gains" 4002 container="array"> 4003 <array> 4004 <size>4</size> 4005 </array> 4006 <description>The best-fit color channel gains calculated 4007 by the HAL's 
statistics units for the current output frame 4008 </description> 4009 <range>**Deprecated**. Do not use.</range> 4010 <details> 4011 This may be different than the gains used for this frame, 4012 since statistics processing on data from a new frame 4013 typically completes after the transform has already been 4014 applied to that frame. 4015 4016 The 4 channel gains are defined in Bayer domain, 4017 see android.colorCorrection.gains for details. 4018 4019 This value should always be calculated by the AWB block, 4020 regardless of the android.control.* current values. 4021 </details> 4022 </entry> 4023 <entry name="predictedColorTransform" type="rational" 4024 visibility="hidden" 4025 optional="true" 4026 type_notes="3x3 rational matrix in row-major order" 4027 container="array"> 4028 <array> 4029 <size>3</size> 4030 <size>3</size> 4031 </array> 4032 <description>The best-fit color transform matrix estimate 4033 calculated by the HAL's statistics units for the current 4034 output frame</description> 4035 <range>**Deprecated**. Do not use.</range> 4036 <details>The HAL must provide the estimate from its 4037 statistics unit on the white balance transforms to use 4038 for the next frame. These are the values the HAL believes 4039 are the best fit for the current output frame. This may 4040 be different than the transform used for this frame, since 4041 statistics processing on data from a new frame typically 4042 completes after the transform has already been applied to 4043 that frame. 4044 4045 These estimates must be provided for all frames, even if 4046 capture settings and color transforms are set by the application. 4047 4048 This value should always be calculated by the AWB block, 4049 regardless of the android.control.* current values. 
4050 </details> 4051 </entry> 4052 <entry name="sceneFlicker" type="byte" visibility="public" enum="true"> 4053 <enum> 4054 <value>NONE</value> 4055 <value>50HZ</value> 4056 <value>60HZ</value> 4057 </enum> 4058 <description>The scene illumination flicker frequency, as estimated by 4059 the camera device.</description> 4060 <details> 4061 Many light sources, such as most fluorescent lights, flicker at a rate 4062 that depends on the local utility power standards. This flicker must be 4063 accounted for by auto-exposure routines to avoid artifacts in captured images. 4064 The camera device uses this entry to tell the application what the scene 4065 illuminant frequency is. 4066 4067 When manual exposure control is enabled 4068 (`android.control.aeMode == OFF` or `android.control.mode == OFF`), 4069 android.control.aeAntibandingMode does not perform antibanding, and the 4070 application can use this metadata field to select exposure times that do 4071 not cause banding issues. See android.control.aeAntibandingMode 4072 for more details. 4073 4074 Report NONE if there does not appear to be flickering illumination. 4075 </details> 4076 </entry> 4077 </dynamic> 4078 <controls> 4079 <entry name="lensShadingMapMode" type="byte" visibility="public" enum="true"> 4080 <enum> 4081 <value>OFF</value> 4082 <value>ON</value> 4083 </enum> 4084 <description>Whether the HAL needs to output the lens 4085 shading map in output result metadata</description> 4086 <details>When set to ON, 4087 android.statistics.lensShadingMap must be provided in 4088 the output result metadata.</details> 4089 </entry> 4090 </controls> 4091 </section> 4092 <section name="tonemap"> 4093 <controls> 4094 <entry name="curveBlue" type="float" visibility="public" 4095 type_notes="1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints."
4096 container="array"> 4097 <array> 4098 <size>n</size> 4099 <size>2</size> 4100 </array> 4101 <description>Tonemapping / contrast / gamma curve for the blue 4102 channel, to use when android.tonemap.mode is 4103 CONTRAST_CURVE.</description> 4104 <units>same as android.tonemap.curveRed</units> 4105 <range>same as android.tonemap.curveRed</range> 4106 <details>See android.tonemap.curveRed for more details.</details> 4107 </entry> 4108 <entry name="curveGreen" type="float" visibility="public" 4109 type_notes="1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints." 4110 container="array"> 4111 <array> 4112 <size>n</size> 4113 <size>2</size> 4114 </array> 4115 <description>Tonemapping / contrast / gamma curve for the green 4116 channel, to use when android.tonemap.mode is 4117 CONTRAST_CURVE.</description> 4118 <units>same as android.tonemap.curveRed</units> 4119 <range>same as android.tonemap.curveRed</range> 4120 <details>See android.tonemap.curveRed for more details.</details> 4121 </entry> 4122 <entry name="curveRed" type="float" visibility="public" 4123 type_notes="1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints." 4124 container="array"> 4125 <array> 4126 <size>n</size> 4127 <size>2</size> 4128 </array> 4129 <description>Tonemapping / contrast / gamma curve for the red 4130 channel, to use when android.tonemap.mode is 4131 CONTRAST_CURVE.</description> 4132 <range>0-1 on both input and output coordinates, normalized 4133 as a floating-point value such that 0 == black and 1 == white. 
4134 </range> 4135 <details> 4136 Each channel's curve is defined by an array of control points: 4137 4138 android.tonemap.curveRed = 4139 [ P0in, P0out, P1in, P1out, P2in, P2out, P3in, P3out, ..., PNin, PNout ] 4140 2 &lt;= N &lt;= android.tonemap.maxCurvePoints 4141 4142 These are sorted in order of increasing `Pin`; it is always 4143 guaranteed that input values 0.0 and 1.0 are included in the list to 4144 define a complete mapping. For input values between control points, 4145 the camera device must linearly interpolate between the control 4146 points. 4147 4148 Each curve can have an independent number of points, and the number 4149 of points can be less than max (that is, the request doesn't have to 4150 always provide a curve with number of points equivalent to 4151 android.tonemap.maxCurvePoints). 4152 4153 A few examples, and their corresponding graphical mappings; these 4154 only specify the red channel and the precision is limited to 4 4155 digits, for conciseness. 4156 4157 Linear mapping: 4158 4159 android.tonemap.curveRed = [ 0, 0, 1.0, 1.0 ] 4160 4161 ![Linear mapping curve](android.tonemap.curveRed/linear_tonemap.png) 4162 4163 Invert mapping: 4164 4165 android.tonemap.curveRed = [ 0, 1.0, 1.0, 0 ] 4166 4167 ![Inverting mapping curve](android.tonemap.curveRed/inverse_tonemap.png) 4168 4169 Gamma 1/2.2 mapping, with 16 control points: 4170 4171 android.tonemap.curveRed = [ 4172 0.0000, 0.0000, 0.0667, 0.2920, 0.1333, 0.4002, 0.2000, 0.4812, 4173 0.2667, 0.5484, 0.3333, 0.6069, 0.4000, 0.6594, 0.4667, 0.7072, 4174 0.5333, 0.7515, 0.6000, 0.7928, 0.6667, 0.8317, 0.7333, 0.8685, 4175 0.8000, 0.9035, 0.8667, 0.9370, 0.9333, 0.9691, 1.0000, 1.0000 ] 4176 4177 ![Gamma = 1/2.2 tonemapping curve](android.tonemap.curveRed/gamma_tonemap.png) 4178 4179 Standard sRGB gamma mapping, per IEC 61966-2-1:1999, with 16 control points: 4180 4181 android.tonemap.curveRed = [ 4182 0.0000, 0.0000, 0.0667, 0.2864, 0.1333, 0.4007, 0.2000, 0.4845, 4183 0.2667, 0.5532, 
0.3333, 0.6125, 0.4000, 0.6652, 0.4667, 0.7130, 4184 0.5333, 0.7569, 0.6000, 0.7977, 0.6667, 0.8360, 0.7333, 0.8721, 4185 0.8000, 0.9063, 0.8667, 0.9389, 0.9333, 0.9701, 1.0000, 1.0000 ] 4186 4187 ![sRGB tonemapping curve](android.tonemap.curveRed/srgb_tonemap.png) 4188 </details> 4189 <hal_details> 4190 For good quality of mapping, at least 128 control points are 4191 preferred. 4192 4193 A typical use case of this would be a gamma-1/2.2 curve, with as many 4194 control points used as are available. 4195 </hal_details> 4196 <tag id="DNG" /> 4197 </entry> 4198 <entry name="mode" type="byte" visibility="public" enum="true"> 4199 <enum> 4200 <value>CONTRAST_CURVE 4201 <notes>Use the tone mapping curve specified in 4202 android.tonemap.curve. 4203 4204 All color enhancement and tonemapping must be disabled, except 4205 for applying the tonemapping curve specified by 4206 android.tonemap.curveRed, android.tonemap.curveBlue, or 4207 android.tonemap.curveGreen. 4208 4209 Must not slow down frame rate relative to raw 4210 sensor output. 4211 </notes> 4212 </value> 4213 <value>FAST 4214 <notes> 4215 Advanced gamma mapping and color enhancement may be applied. 4216 4217 Should not slow down frame rate relative to raw sensor output. 4218 </notes> 4219 </value> 4220 <value>HIGH_QUALITY 4221 <notes> 4222 Advanced gamma mapping and color enhancement may be applied. 4223 4224 May slow down frame rate relative to raw sensor output. 4225 </notes> 4226 </value> 4227 </enum> 4228 <description>High-level global contrast/gamma/tonemapping control. 4229 </description> 4230 <details> 4231 When switching to an application-defined contrast curve by setting 4232 android.tonemap.mode to CONTRAST_CURVE, the curve is defined 4233 per-channel with a set of `(in, out)` points that specify the 4234 mapping from input high-bit-depth pixel value to the output 4235 low-bit-depth value. 
Since the actual pixel ranges of both input 4236 and output may change depending on the camera pipeline, the values 4237 are specified by normalized floating-point numbers. 4238 4239 More-complex color mapping operations such as 3D color look-up 4240 tables, selective chroma enhancement, or other non-linear color 4241 transforms will be disabled when android.tonemap.mode is 4242 CONTRAST_CURVE. 4243 4244 When using either FAST or HIGH_QUALITY, the camera device will 4245 emit its own tonemap curve in android.tonemap.curveRed, 4246 android.tonemap.curveGreen, and android.tonemap.curveBlue. 4247 These values are always available, and as close as possible to the 4248 nonlinear/nonglobal transforms actually used. 4249 4250 If a request is sent with CONTRAST_CURVE with the camera device's 4251 provided curve in FAST or HIGH_QUALITY, the image's tonemap will be 4252 roughly the same.</details> 4253 </entry> 4254 </controls> 4255 <static> 4256 <entry name="maxCurvePoints" type="int32" visibility="public" > 4257 <description>Maximum number of supported points in the 4258 tonemap curve that can be used for android.tonemap.curveRed, 4259 android.tonemap.curveGreen, or android.tonemap.curveBlue. 4260 </description> 4261 <range>&gt;= 64</range> 4262 <details> 4263 If the actual number of points provided by the application (in 4264 android.tonemap.curve*) is less than max, the camera device will 4265 resample the curve to its internal representation, using linear 4266 interpolation. 4267 4268 The output curves in the result metadata may have a different number 4269 of points than the input curves, and will represent the actual 4270 hardware curves used as closely as possible when linearly interpolated. 4271 </details> 4272 <hal_details> 4273 This value must be at least 64, and should be at least 128. 
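As a rough illustration of the resampling behavior described above (linear interpolation between the application's control points), consider the following sketch. This is not HAL or framework code; the class and method names are invented for the example, and the flattened `[P0in, P0out, P1in, P1out, ...]` layout matches the android.tonemap.curve* array format.

```java
// Illustrative sketch only: linearly interpolate an application-supplied
// tonemap curve, and resample it to a device's internal point count.
public class TonemapCurve {
    // curve uses the flattened [P0in, P0out, P1in, P1out, ...] layout,
    // sorted by increasing input, with 0.0 and 1.0 guaranteed present.
    public static float interpolate(float[] curve, float in) {
        for (int i = 0; i + 3 < curve.length; i += 2) {
            float x0 = curve[i],     y0 = curve[i + 1];
            float x1 = curve[i + 2], y1 = curve[i + 3];
            if (in >= x0 && in <= x1) {
                if (x1 == x0) return y0; // degenerate segment
                return y0 + (in - x0) * (y1 - y0) / (x1 - x0);
            }
        }
        // Only reachable for input outside [0, 1], since the endpoints
        // are guaranteed to be in the list.
        throw new IllegalArgumentException("input outside [0, 1]");
    }

    // Resample a (possibly short) curve to numPoints evenly spaced
    // control points, as a device might for its internal representation.
    public static float[] resample(float[] curve, int numPoints) {
        float[] out = new float[numPoints * 2];
        for (int i = 0; i < numPoints; i++) {
            float in = (float) i / (numPoints - 1);
            out[2 * i] = in;
            out[2 * i + 1] = interpolate(curve, in);
        }
        return out;
    }
}
```

For example, resampling the two-point linear mapping `[0, 0, 1.0, 1.0]` to 128 points reproduces the identity curve at every sample, which is why a short application curve loses nothing under this scheme.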
4274 </hal_details> 4275 </entry> 4276 </static> 4277 <dynamic> 4278 <clone entry="android.tonemap.curveBlue" kind="controls"> 4279 </clone> 4280 <clone entry="android.tonemap.curveGreen" kind="controls"> 4281 </clone> 4282 <clone entry="android.tonemap.curveRed" kind="controls"> 4283 </clone> 4284 <clone entry="android.tonemap.mode" kind="controls"> 4285 </clone> 4286 </dynamic> 4287 </section> 4288 <section name="led"> 4289 <controls> 4290 <entry name="transmit" type="byte" visibility="hidden" enum="true" 4291 typedef="boolean"> 4292 <enum> 4293 <value>OFF</value> 4294 <value>ON</value> 4295 </enum> 4296 <description>This LED is nominally used to indicate to the user 4297 that the camera is powered on and may be streaming images back to the 4298 Application Processor. In certain rare circumstances, the OS may 4299 disable this when video is processed locally and not transmitted to 4300 any untrusted applications. 4301 4302 In particular, the LED *must* always be on when the data could be 4303 transmitted off the device. The LED *should* always be on whenever 4304 data is stored locally on the device. 4305 4306 The LED *may* be off if a trusted application is using the data in a 4307 way that doesn't violate the above rules. 4308 </description> 4309 </entry> 4310 </controls> 4311 <dynamic> 4312 <clone entry="android.led.transmit" kind="controls"></clone> 4313 </dynamic> 4314 <static> 4315 <entry name="availableLeds" type="byte" visibility="hidden" enum="true" 4316 container="array"> 4317 <array> 4318 <size>n</size> 4319 </array> 4320 <enum> 4321 <value>TRANSMIT 4322 <notes>android.led.transmit control is used</notes> 4323 </value> 4324 </enum> 4325 <description>A list of camera LEDs that are available on this system. 
4326 </description> 4327 </entry> 4328 </static> 4329 </section> 4330 <section name="info"> 4331 <static> 4332 <entry name="supportedHardwareLevel" type="byte" visibility="public" 4333 enum="true" > 4334 <enum> 4335 <value>LIMITED</value> 4336 <value>FULL</value> 4337 </enum> 4338 <description> 4339 The camera 3 HAL device can implement one of two possible 4340 operational modes: limited and full. Full support is 4341 expected from new higher-end devices. Limited mode has 4342 hardware requirements roughly in line with those for a 4343 camera HAL device v1 implementation, and is expected from 4344 older or inexpensive devices. Full is a strict superset of 4345 limited, and they share the same essential operational flow. 4346 4347 For full details refer to "S3. Operational Modes" in camera3.h 4348 </description> 4349 <range>Optional. Default value is LIMITED.</range> 4350 </entry> 4351 </static> 4352 </section> 4353 <section name="blackLevel"> 4354 <controls> 4355 <entry name="lock" type="byte" visibility="public" enum="true" 4356 typedef="boolean"> 4357 <enum> 4358 <value>OFF</value> 4359 <value>ON</value> 4360 </enum> 4361 <description> Whether black-level compensation is locked 4362 to its current values, or is free to vary.</description> 4363 <details>When set to ON, the values used for black-level 4364 compensation will not change until the lock is set to 4365 OFF. 4366 4367 Since changes to certain capture parameters (such as 4368 exposure time) may require resetting of black level 4369 compensation, the camera device must report whether setting 4370 the black level lock was successful in the output result 4371 metadata. 
4372 4373 For example, if a sequence of requests is as follows: 4374 4375 * Request 1: Exposure = 10ms, Black level lock = OFF 4376 * Request 2: Exposure = 10ms, Black level lock = ON 4377 * Request 3: Exposure = 10ms, Black level lock = ON 4378 * Request 4: Exposure = 20ms, Black level lock = ON 4379 * Request 5: Exposure = 20ms, Black level lock = ON 4380 * Request 6: Exposure = 20ms, Black level lock = ON 4381 4382 And the exposure change in Request 4 requires the camera 4383 device to reset the black level offsets, then the output 4384 result metadata is expected to be: 4385 4386 * Result 1: Exposure = 10ms, Black level lock = OFF 4387 * Result 2: Exposure = 10ms, Black level lock = ON 4388 * Result 3: Exposure = 10ms, Black level lock = ON 4389 * Result 4: Exposure = 20ms, Black level lock = OFF 4390 * Result 5: Exposure = 20ms, Black level lock = ON 4391 * Result 6: Exposure = 20ms, Black level lock = ON 4392 4393 This indicates to the application that on frame 4, black 4394 levels were reset due to exposure value changes, and pixel 4395 values may not be consistent across captures. 4396 4397 The camera device will maintain the lock to the extent 4398 possible, only overriding the lock to OFF when changes to 4399 other request parameters require a black level recalculation 4400 or reset. 4401 </details> 4402 <hal_details> 4403 If for some reason black level locking is no longer possible 4404 (for example, the analog gain has changed, which forces 4405 black level offsets to be recalculated), then the HAL must 4406 override this request (and it must report 'OFF' when this 4407 does happen) until the next capture for which locking is 4408 possible again.</hal_details> 4409 <tag id="HAL2" /> 4410 </entry> 4411 </controls> 4412 <dynamic> 4413 <clone entry="android.blackLevel.lock" 4414 kind="controls"> 4415 <details> 4416 Whether the black level offset was locked for this frame. 
Should be 4417 ON if android.blackLevel.lock was ON in the capture request, unless 4418 a change in other capture settings forced the camera device to 4419 perform a black level reset. 4420 </details> 4421 </clone> 4422 </dynamic> 4423 </section> 4424 <section name="sync"> 4425 <dynamic> 4426 <entry name="frameNumber" type="int64" visibility="hidden" enum="true"> 4427 <enum> 4428 <value id="-1">CONVERGING 4429 <notes> 4430 The current result is not yet fully synchronized to any request. 4431 Synchronization is in progress, and reading metadata from this 4432 result may include a mix of data that have taken effect since the 4433 last synchronization time. 4434 4435 In some future result, within android.sync.maxLatency frames, 4436 this value will update to the actual frame number 4437 that the result is guaranteed to be synchronized to (as long as the 4438 request settings remain constant). 4439 </notes> 4440 </value> 4441 <value id="-2">UNKNOWN 4442 <notes> 4443 The current result's synchronization status is unknown. The 4444 result may have already converged, or it may be in progress. 4445 Reading from this result may include some mix of settings from 4446 past requests. 4447 4448 After a settings change, the new settings will eventually all 4449 take effect for the output buffers and results. However, this 4450 value will not change when that happens. Altering settings 4451 rapidly may provide outcomes using mixes of settings from recent 4452 requests. 4453 4454 This value is intended primarily for backwards compatibility with 4455 the older camera implementations (for android.hardware.Camera). 4456 </notes> 4457 </value> 4458 </enum> 4459 <description>The frame number corresponding to the last request 4460 with which the output result (metadata + buffers) has been fully 4461 synchronized.</description> 4462 <range>Either a non-negative value corresponding to a 4463 `frame_number`, or one of the two enums (CONVERGING / UNKNOWN). 
4464 </range> 4465 <details> 4466 When a request is submitted to the camera device, there is usually a 4467 delay of several frames before the controls get applied. A camera 4468 device may either choose to account for this delay by implementing a 4469 pipeline and carefully submitting well-timed atomic control updates, or 4470 it may start streaming control changes that span over several frame 4471 boundaries. 4472 4473 In the latter case, whenever a request's settings change relative to 4474 the previously submitted request, the full set of changes may take 4475 multiple frame durations to fully take effect. Some settings may 4476 take effect sooner (in fewer frame durations) than others. 4477 4478 While a set of control changes is being propagated, this value 4479 will be CONVERGING. 4480 4481 Once it is fully known that a set of control changes has 4482 finished propagating, and the resulting updated control settings 4483 have been read back by the camera device, this value will be set 4484 to a non-negative frame number (corresponding to the request to 4485 which the results have synchronized). 4486 4487 Older camera device implementations may not have a way to detect 4488 when all camera controls have been applied, and will always set this 4489 value to UNKNOWN. 4490 4491 FULL capability devices will always have this value set to the 4492 frame number of the request corresponding to this result. 4493 4494 _Further details_: 4495 4496 * Whenever a request differs from the last request, any future 4497 results not yet returned may have this value set to CONVERGING (this 4498 could include any in-progress captures not yet returned by the camera 4499 device; for more details see pipeline considerations below). 4500 * Submitting a series of multiple requests that differ from the 4501 previous request (e.g. r1, r2, r3 s.t. 
r1 != r2 != r3) 4502 moves the new synchronization frame to the last non-repeating 4503 request (using the smallest frame number from the contiguous list of 4504 repeating requests). 4505 * Submitting the same request repeatedly will not change this value 4506 to CONVERGING, if it was already a non-negative value. 4507 * When this value changes to non-negative, that means that all of the 4508 metadata controls from the request have been applied, all of the 4509 metadata controls from the camera device have been read to the 4510 updated values (into the result), and all of the graphics buffers 4511 corresponding to this result are also synchronized to the request. 4512 4513 _Pipeline considerations_: 4514 4515 Submitting a request with updated controls relative to the previously 4516 submitted requests may also invalidate the synchronization state 4517 of all the results corresponding to currently in-flight requests. 4518 4519 In other words, results for this current request and up to 4520 android.request.pipelineMaxDepth prior requests may have their 4521 android.sync.frameNumber change to CONVERGING. 4522 </details> 4523 <hal_details> 4524 Using UNKNOWN here is illegal unless android.sync.maxLatency 4525 is also UNKNOWN. 4526 4527 FULL capability devices should simply set this value to the 4528 `frame_number` of the request this result corresponds to. 4529 </hal_details> 4530 <tag id="LIMITED" /> 4531 </entry> 4532 </dynamic> 4533 <static> 4534 <entry name="maxLatency" type="int32" visibility="public" enum="true"> 4535 <enum> 4536 <value id="0">PER_FRAME_CONTROL 4537 <notes> 4538 Every frame has the requests immediately applied. 4539 (and furthermore for all results, 4540 `android.sync.frameNumber == android.request.frameCount`) 4541 4542 Changing controls over multiple requests one after another will 4543 produce results that have those controls applied atomically 4544 each frame. 4545 4546 All FULL capability devices will have this as their maxLatency. 
4547 </notes> 4548 </value> 4549 <value id="-1">UNKNOWN 4550 <notes> 4551 Each new frame has some subset (potentially the entire set) 4552 of the past requests applied to the camera settings. 4553 4554 By submitting a series of identical requests, the camera device 4555 will eventually have the camera settings applied, but it is 4556 unknown when that exact point will be. 4557 </notes> 4558 </value> 4559 </enum> 4560 <description> 4561 The maximum number of frames that can occur after a request 4562 (different from the previous) has been submitted, and before the 4563 result's state becomes synchronized (by setting 4564 android.sync.frameNumber to a non-negative value). 4565 </description> 4566 <units>number of processed requests</units> 4567 <range>&gt;= -1</range> 4568 <details> 4569 This defines the maximum distance (in number of metadata results) 4570 between android.sync.frameNumber and the equivalent 4571 android.request.frameCount. 4572 4573 In other words, this acts as an upper boundary for how many frames 4574 must occur before the camera device knows for a fact that the newly 4575 submitted camera settings have been applied in outgoing frames. 4576 4577 For example, if the distance was 2, 4578 4579 initial request = X (repeating) 4580 request1 = X 4581 request2 = Y 4582 request3 = Y 4583 request4 = Y 4584 4585 where requestN has frameNumber N, and the first of the repeating 4586 initial requests has frameNumber F (and F < 1). 4587 4588 initial result = X' + { android.sync.frameNumber == F } 4589 result1 = X' + { android.sync.frameNumber == F } 4590 result2 = X' + { android.sync.frameNumber == CONVERGING } 4591 result3 = X' + { android.sync.frameNumber == CONVERGING } 4592 result4 = X' + { android.sync.frameNumber == 2 } 4593 4594 where resultN has frameNumber N. 4595 4596 Since `result4` has a `frameNumber == 4` and 4597 `android.sync.frameNumber == 2`, the distance is clearly 4598 `4 - 2 = 2`. 
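The bookkeeping in the worked example above can be sketched in a few lines (illustrative only; the class and method names are not part of the Android API, though the sentinel values match the CONVERGING and UNKNOWN enum ids):

```java
// Illustrative sketch of android.sync.frameNumber bookkeeping: the
// value is either a negative sentinel or the frame number of the
// request the result has fully synchronized to, and the distance to
// the result's own frame number is bounded by android.sync.maxLatency.
public class SyncDistance {
    public static final long CONVERGING = -1; // matches enum id -1
    public static final long UNKNOWN = -2;    // matches enum id -2

    public static boolean isSynchronized(long syncFrameNumber) {
        return syncFrameNumber >= 0;
    }

    // Distance between a result's own frame number and the request it
    // synchronized to; only meaningful once synchronized.
    public static long distance(long resultFrameNumber, long syncFrameNumber) {
        if (!isSynchronized(syncFrameNumber)) {
            throw new IllegalStateException("result not yet synchronized");
        }
        return resultFrameNumber - syncFrameNumber;
    }
}
```

Applying this to `result4` above (frameNumber 4, sync frameNumber 2) gives the distance 2 cited in the text; results 2 and 3 would throw, since they are still CONVERGING.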
4599 </details> 4600 <hal_details> 4601 Use `frame_count` from camera3_request_t instead of 4602 android.request.frameCount. 4603 4604 LIMITED devices are strongly encouraged to use a non-negative 4605 value. If UNKNOWN is used here then app developers do not have a way 4606 to know when sensor settings have been applied. 4607 </hal_details> 4608 <tag id="LIMITED" /> 4609 </entry> 4610 </static> 4611 </section> 4612 </namespace> 4613</metadata> 4614