metadata_properties.xml revision 6928d139018763a6e701145ec20701e760767708
1<?xml version="1.0" encoding="utf-8"?> 2<!-- Copyright (C) 2012 The Android Open Source Project 3 4 Licensed under the Apache License, Version 2.0 (the "License"); 5 you may not use this file except in compliance with the License. 6 You may obtain a copy of the License at 7 8 http://www.apache.org/licenses/LICENSE-2.0 9 10 Unless required by applicable law or agreed to in writing, software 11 distributed under the License is distributed on an "AS IS" BASIS, 12 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 See the License for the specific language governing permissions and 14 limitations under the License. 15--> 16<metadata xmlns="http://schemas.android.com/service/camera/metadata/" 17xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 18xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata_properties.xsd"> 19 20 <tags> 21 <tag id="AWB"> 22 Needed for auto white balance 23 </tag> 24 <tag id="BC"> 25 Needed for backwards compatibility with old Java API 26 </tag> 27 <tag id="V1"> 28 New features for first camera 2 release (API1) 29 </tag> 30 <tag id="ADV"> 31 <!-- TODO: fill the tag description --> 32 </tag> 33 <tag id="DNG"> 34 Needed for DNG file support 35 </tag> 36 <tag id="EXIF"> 37 <!-- TODO: fill the tag description --> 38 </tag> 39 <tag id="HAL2"> 40 Entry is only used by camera device HAL 2.x 41 </tag> 42 <tag id="FULL"> 43 Entry is required for full hardware level devices, and optional for other hardware levels 44 </tag> 45 <tag id="LIMITED"> 46 Entry assists with LIMITED device implementation. LIMITED devices 47 must implement all entries with this tag. Optional for FULL devices. 
48 </tag> 49 </tags> 50 51 <types> 52 <typedef name="rectangle"> 53 <language name="java">android.graphics.Rect</language> 54 </typedef> 55 <typedef name="size"> 56 <language name="java">android.hardware.camera2.Size</language> 57 </typedef> 58 <typedef name="string"> 59 <language name="java">String</language> 60 </typedef> 61 <typedef name="boolean"> 62 <language name="java">boolean</language> 63 </typedef> 64 <typedef name="imageFormat"> 65 <language name="java">int</language> 66 </typedef> 67 </types> 68 69 <namespace name="android"> 70 <section name="colorCorrection"> 71 <controls> 72 <entry name="mode" type="byte" visibility="public" enum="true"> 73 <enum> 74 <value>TRANSFORM_MATRIX 75 <notes>Use the android.colorCorrection.transform matrix 76 and android.colorCorrection.gains to do color conversion. 77 78 All advanced white balance adjustments (not specified 79 by our white balance pipeline) must be disabled. 80 81 If AWB is enabled with `android.control.awbMode != OFF`, then 82 TRANSFORM_MATRIX is ignored. The camera device will override 83 this value to either FAST or HIGH_QUALITY. 84 </notes> 85 </value> 86 <value>FAST 87 <notes>Must not slow down capture rate relative to sensor raw 88 output. 89 90 Advanced white balance adjustments above and beyond 91 the specified white balance pipeline may be applied. 92 93 If AWB is enabled with `android.control.awbMode != OFF`, then 94 the camera device uses the last frame's AWB values 95 (or defaults if AWB has never been run). 96 </notes> 97 </value> 98 <value>HIGH_QUALITY 99 <notes>Capture rate (relative to sensor raw output) 100 may be reduced by high quality. 101 102 Advanced white balance adjustments above and beyond 103 the specified white balance pipeline may be applied. 104 105 If AWB is enabled with `android.control.awbMode != OFF`, then 106 the camera device uses the last frame's AWB values 107 (or defaults if AWB has never been run). 
108 </notes> 109 </value> 110 </enum> 111 112 <description> 113 The mode control selects how the image data is converted from the 114 sensor's native color into linear sRGB color. 115 </description> 116 <details> 117 When auto-white balance is enabled with android.control.awbMode, this 118 control is overridden by the AWB routine. When AWB is disabled, the 119 application controls how the color mapping is performed. 120 121 We define the expected processing pipeline below. For consistency 122 across devices, this is always the case with TRANSFORM_MATRIX. 123 124 When either FAST or HIGH_QUALITY is used, the camera device may 125 do additional processing but android.colorCorrection.gains and 126 android.colorCorrection.transform will still be provided by the 127 camera device (in the results) and be roughly correct. 128 129 Switching to TRANSFORM_MATRIX and using the data provided from 130 FAST or HIGH_QUALITY will yield a picture with the same white point 131 as what was produced by the camera device in the earlier frame. 132 133 The expected processing pipeline is as follows: 134 135 ![White balance processing pipeline](android.colorCorrection.mode/processing_pipeline.png) 136 137 The white balance is encoded by two values, a 4-channel white-balance 138 gain vector (applied in the Bayer domain), and a 3x3 color transform 139 matrix (applied after demosaic). 140 141 The 4-channel white-balance gains are defined as: 142 143 android.colorCorrection.gains = [ R G_even G_odd B ] 144 145 where `G_even` is the gain for green pixels on even rows of the 146 output, and `G_odd` is the gain for green pixels on the odd rows. 147 These may be identical for a given camera device implementation; if 148 the camera device does not support a separate gain for even/odd green 149 channels, it will use the `G_even` value, and write `G_odd` equal to 150 `G_even` in the output result metadata. 
151 152 The matrices for color transforms are defined as a 9-entry vector: 153 154 android.colorCorrection.transform = [ I0 I1 I2 I3 I4 I5 I6 I7 I8 ] 155 156 which define a transform from input sensor colors, `P_in = [ r g b ]`, 157 to output linear sRGB, `P_out = [ r' g' b' ]`, 158 159 with colors as follows: 160 161 r' = I0r + I1g + I2b 162 g' = I3r + I4g + I5b 163 b' = I6r + I7g + I8b 164 165 Both the input and output value ranges must match. Overflow/underflow 166 values are clipped to fit within the range. 167 </details> 168 </entry> 169 <entry name="transform" type="rational" visibility="public" 170 type_notes="3x3 rational matrix in row-major order" 171 container="array"> 172 <array> 173 <size>3</size> 174 <size>3</size> 175 </array> 176 <description>A color transform matrix used to transform 177 from sensor RGB color space to output linear sRGB color space 178 </description> 179 <details>This matrix is either set by the camera device when the request 180 android.colorCorrection.mode is not TRANSFORM_MATRIX, or 181 directly by the application in the request when the 182 android.colorCorrection.mode is TRANSFORM_MATRIX. 183 184 In the latter case, the camera device may round the matrix to account 185 for precision issues; the final rounded matrix should be reported back 186 in this matrix result metadata. The transform should keep the magnitude 187 of the output color values within `[0, 1.0]` (assuming input color 188 values are within the normalized range `[0, 1.0]`), or clipping may occur. 
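
As an illustration only (not part of the metadata definition), the row-major 3x3 transform described above could be applied to one normalized sensor RGB pixel as sketched below. Floats are used for simplicity even though the actual entry is a 3x3 rational matrix, and the class and method names are hypothetical:

```java
// Hypothetical sketch: applying a row-major 3x3 color transform
// [ I0 I1 I2 I3 I4 I5 I6 I7 I8 ] to one normalized sensor RGB pixel,
// clipping the result to [0, 1.0] as described above.
public class ColorTransformSketch {
    static float clip(float v) {
        return Math.max(0.0f, Math.min(1.0f, v));
    }

    // rgb = { r, g, b }, t = { I0, ..., I8 } in row-major order
    static float[] apply(float[] rgb, float[] t) {
        return new float[] {
            clip(t[0] * rgb[0] + t[1] * rgb[1] + t[2] * rgb[2]),  // r'
            clip(t[3] * rgb[0] + t[4] * rgb[1] + t[5] * rgb[2]),  // g'
            clip(t[6] * rgb[0] + t[7] * rgb[1] + t[8] * rgb[2]),  // b'
        };
    }

    public static void main(String[] args) {
        // An identity transform leaves the pixel unchanged.
        float[] identity = { 1, 0, 0, 0, 1, 0, 0, 0, 1 };
        float[] out = apply(new float[] { 0.25f, 0.5f, 0.75f }, identity);
        System.out.println(out[0] + " " + out[1] + " " + out[2]);
    }
}
```
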
189 </details> 190 </entry> 191 <entry name="gains" type="float" visibility="public" 192 type_notes="A 1D array of floats for 4 color channel gains" 193 container="array"> 194 <array> 195 <size>4</size> 196 </array> 197 <description>Gains applied to Bayer raw color channels for 198 white-balance</description> 199 <details>The 4-channel white-balance gains are defined in 200 the order of `[R G_even G_odd B]`, where `G_even` is the gain 201 for green pixels on even rows of the output, and `G_odd` 202 is the gain for green pixels on the odd rows. If a HAL 203 does not support a separate gain for even/odd green channels, 204 it should use the `G_even` value, and write `G_odd` equal to 205 `G_even` in the output result metadata. 206 207 This array is either set by HAL when the request 208 android.colorCorrection.mode is not TRANSFORM_MATRIX, or 209 directly by the application in the request when the 210 android.colorCorrection.mode is TRANSFORM_MATRIX. 211 212 The output should be the gains actually applied by the HAL to 213 the current frame.</details> 214 </entry> 215 </controls> 216 <dynamic> 217 <clone entry="android.colorCorrection.transform" kind="controls"> 218 </clone> 219 <clone entry="android.colorCorrection.gains" kind="controls"> 220 </clone> 221 </dynamic> 222 </section> 223 <section name="control"> 224 <controls> 225 <entry name="aeAntibandingMode" type="byte" visibility="public" 226 enum="true" > 227 <enum> 228 <value>OFF 229 <notes> 230 The camera device will not adjust exposure duration to 231 avoid banding problems. 232 </notes> 233 </value> 234 <value>50HZ 235 <notes> 236 The camera device will adjust exposure duration to 237 avoid banding problems with 50Hz illumination sources. 238 </notes> 239 </value> 240 <value>60HZ 241 <notes> 242 The camera device will adjust exposure duration to 243 avoid banding problems with 60Hz illumination 244 sources. 
245 </notes> 246 </value> 247 <value>AUTO 248 <notes> 249 The camera device will automatically adapt its 250 antibanding routine to the current illumination 251 conditions. This is the default. 252 </notes> 253 </value> 254 </enum> 255 <description> 256 The desired setting for the camera device's auto-exposure 257 algorithm's antibanding compensation. 258 </description> 259 <range> 260 android.control.aeAvailableAntibandingModes 261 </range> 262 <details> 263 Some kinds of lighting fixtures, such as some fluorescent 264 lights, flicker at the rate of the power supply frequency 265 (60Hz or 50Hz, depending on country). While this is 266 typically not noticeable to a person, it can be visible to 267 a camera device. If a camera sets its exposure time to the 268 wrong value, the flicker may become visible in the 269 viewfinder as flicker or in a final captured image, as a 270 set of variable-brightness bands across the image. 271 272 Therefore, the auto-exposure routines of camera devices 273 include antibanding routines that ensure that the chosen 274 exposure value will not cause such banding. The choice of 275 exposure time depends on the rate of flicker, which the 276 camera device can detect automatically, or the expected 277 rate can be selected by the application using this 278 control. 279 280 A given camera device may not support all of the possible 281 options for the antibanding mode. The 282 android.control.aeAvailableAntibandingModes key contains 283 the available modes for a given camera device. 284 285 The default mode is AUTO, which must be supported by all 286 camera devices. 287 288 If manual exposure control is enabled (by setting 289 android.control.aeMode or android.control.mode to OFF), 290 then this setting has no effect, and the application must 291 ensure it selects exposure times that do not cause banding 292 issues. The android.statistics.sceneFlicker key can assist 293 the application in this. 
294 </details> 295 <hal_details> 296 For all capture request templates, this field must be set 297 to AUTO. AUTO is the only mode that must be supported; 298 OFF, 50HZ, 60HZ are all optional. 299 300 If manual exposure control is enabled (by setting 301 android.control.aeMode or android.control.mode to OFF), 302 then the exposure values provided by the application must not be 303 adjusted for antibanding. 304 </hal_details> 305 <tag id="BC" /> 306 </entry> 307 <entry name="aeExposureCompensation" type="int32" visibility="public"> 308 <description>Adjustment to AE target image 309 brightness</description> 310 <units>count of positive/negative EV steps</units> 311 <details>For example, if EV step is 0.333, '6' will mean an 312 exposure compensation of +2 EV; -3 will mean an exposure 313 compensation of -1 EV.</details> 314 <tag id="BC" /> 315 </entry> 316 <entry name="aeLock" type="byte" visibility="public" enum="true" 317 typedef="boolean"> 318 <enum> 319 <value>OFF 320 <notes>Autoexposure lock is disabled; the AE algorithm 321 is free to update its parameters.</notes></value> 322 <value>ON 323 <notes>Autoexposure lock is enabled; the AE algorithm 324 must not update the exposure and sensitivity parameters 325 while the lock is active.</notes></value> 326 </enum> 327 <description>Whether AE is currently locked to its latest 328 calculated values.</description> 329 <details>Note that even when AE is locked, the flash may be 330 fired if the android.control.aeMode is ON_AUTO_FLASH / ON_ALWAYS_FLASH / 331 ON_AUTO_FLASH_REDEYE. 332 333 If AE precapture is triggered (see android.control.aePrecaptureTrigger) 334 when AE is already locked, the camera device will not change the exposure time 335 (android.sensor.exposureTime) and sensitivity (android.sensor.sensitivity) 336 parameters. The flash may be fired if the android.control.aeMode 337 is ON_AUTO_FLASH/ON_AUTO_FLASH_REDEYE and the scene is too dark. 
If the 338 android.control.aeMode is ON_ALWAYS_FLASH, the scene may become overexposed. 339 340 See android.control.aeState for AE lock related state transition details. 341 </details> 342 <tag id="BC" /> 343 </entry> 344 <entry name="aeMode" type="byte" visibility="public" enum="true"> 345 <enum> 346 <value>OFF 347 <notes> 348 The camera device's autoexposure routine is disabled; 349 the application-selected android.sensor.exposureTime, 350 android.sensor.sensitivity and 351 android.sensor.frameDuration are used by the camera 352 device, along with android.flash.* fields, if there's 353 a flash unit for this camera device. 354 </notes> 355 </value> 356 <value>ON 357 <notes> 358 The camera device's autoexposure routine is active, 359 with no flash control. The application's values for 360 android.sensor.exposureTime, 361 android.sensor.sensitivity, and 362 android.sensor.frameDuration are ignored. The 363 application has control over the various 364 android.flash.* fields. 365 </notes> 366 </value> 367 <value>ON_AUTO_FLASH 368 <notes> 369 Like ON, except that the camera device also controls 370 the camera's flash unit, firing it in low-light 371 conditions. The flash may be fired during a 372 precapture sequence (triggered by 373 android.control.aePrecaptureTrigger) and may be fired 374 for captures for which the 375 android.control.captureIntent field is set to 376 STILL_CAPTURE 377 </notes> 378 </value> 379 <value>ON_ALWAYS_FLASH 380 <notes> 381 Like ON, except that the camera device also controls 382 the camera's flash unit, always firing it for still 383 captures. The flash may be fired during a precapture 384 sequence (triggered by 385 android.control.aePrecaptureTrigger) and will always 386 be fired for captures for which the 387 android.control.captureIntent field is set to 388 STILL_CAPTURE 389 </notes> 390 </value> 391 <value>ON_AUTO_FLASH_REDEYE 392 <notes> 393 Like ON_AUTO_FLASH, but with automatic red eye 394 reduction. 
If deemed necessary by the camera device, 395 a red eye reduction flash will fire during the 396 precapture sequence. 397 </notes> 398 </value> 399 </enum> 400 <description>The desired mode for the camera device's 401 auto-exposure routine.</description> 402 <range>android.control.aeAvailableModes</range> 403 <details> 404 This control is only effective if android.control.mode is 405 AUTO. 406 407 When set to any of the ON modes, the camera device's 408 auto-exposure routine is enabled, overriding the 409 application's selected exposure time, sensor sensitivity, 410 and frame duration (android.sensor.exposureTime, 411 android.sensor.sensitivity, and 412 android.sensor.frameDuration). If one of the FLASH modes 413 is selected, the camera device's flash unit controls are 414 also overridden. 415 416 The FLASH modes are only available if the camera device 417 has a flash unit (android.flash.info.available is `true`). 418 419 If flash TORCH mode is desired, this field must be set to 420 ON or OFF, and android.flash.mode set to TORCH. 421 422 When set to any of the ON modes, the values chosen by the 423 camera device auto-exposure routine for the overridden 424 fields for a given capture will be available in its 425 CaptureResult. 426 </details> 427 <tag id="BC" /> 428 </entry> 429 <entry name="aeRegions" type="int32" visibility="public" 430 container="array"> 431 <array> 432 <size>5</size> 433 <size>area_count</size> 434 </array> 435 <description>List of areas to use for 436 metering.</description> 437 <range>`area_count <= android.control.maxRegions[0]`</range> 438 <details>Each area is a rectangle plus weight: xmin, ymin, 439 xmax, ymax, weight. The rectangle is defined to be inclusive of the 440 specified coordinates. 
441 442 The coordinate system is based on the active pixel array, 443 with (0,0) being the top-left pixel in the active pixel array, and 444 (android.sensor.info.activeArraySize.width - 1, 445 android.sensor.info.activeArraySize.height - 1) being the 446 bottom-right pixel in the active pixel array. The weight 447 should be nonnegative. 448 449 If all regions have 0 weight, then no specific metering area 450 needs to be used by the HAL. If the metering region is 451 outside the current android.scaler.cropRegion, the HAL 452 should ignore the sections outside the region and output the 453 used sections in the frame metadata.</details> 454 <tag id="BC" /> 455 </entry> 456 <entry name="aeTargetFpsRange" type="int32" visibility="public" 457 container="array"> 458 <array> 459 <size>2</size> 460 </array> 461 <description>Range over which fps can be adjusted to 462 maintain exposure</description> 463 <range>android.control.aeAvailableTargetFpsRanges</range> 464 <details>Only constrains AE algorithm, not manual control 465 of android.sensor.exposureTime</details> 466 <tag id="BC" /> 467 </entry> 468 <entry name="aePrecaptureTrigger" type="byte" visibility="public" 469 enum="true"> 470 <enum> 471 <value>IDLE 472 <notes>The trigger is idle.</notes> 473 </value> 474 <value>START 475 <notes>The precapture metering sequence will be started 476 by the camera device. The exact effect of the precapture 477 trigger depends on the current AE mode and state.</notes> 478 </value> 479 </enum> 480 <description>Whether the camera device will trigger a precapture 481 metering sequence when it processes this request.</description> 482 <details>This entry is normally set to IDLE, or is not 483 included at all in the request settings. When included and 484 set to START, the camera device will trigger the autoexposure 485 precapture metering sequence. 
486 487 The effect of AE precapture trigger depends on the current 488 AE mode and state; see android.control.aeState for AE precapture 489 state transition details.</details> 490 <tag id="BC" /> 491 </entry> 492 <entry name="afMode" type="byte" visibility="public" enum="true"> 493 <enum> 494 <value>OFF 495 <notes>The auto-focus routine does not control the lens; 496 android.lens.focusDistance is controlled by the 497 application</notes></value> 498 <value>AUTO 499 <notes> 500 If lens is not fixed focus. 501 502 Use android.lens.info.minimumFocusDistance to determine if lens 503 is fixed-focus. In this mode, the lens does not move unless 504 the autofocus trigger action is called. When that trigger 505 is activated, AF must transition to ACTIVE_SCAN, then to 506 the outcome of the scan (FOCUSED or NOT_FOCUSED). 507 508 Triggering AF_CANCEL resets the lens position to default, 509 and sets the AF state to INACTIVE.</notes></value> 510 <value>MACRO 511 <notes>In this mode, the lens does not move unless the 512 autofocus trigger action is called. 513 514 When that trigger is activated, AF must transition to 515 ACTIVE_SCAN, then to the outcome of the scan (FOCUSED or 516 NOT_FOCUSED). Triggering cancel AF resets the lens 517 position to default, and sets the AF state to 518 INACTIVE.</notes></value> 519 <value>CONTINUOUS_VIDEO 520 <notes>In this mode, the AF algorithm modifies the lens 521 position continually to attempt to provide a 522 constantly-in-focus image stream. 523 524 The focusing behavior should be suitable for good quality 525 video recording; typically this means slower focus 526 movement and no overshoots. When the AF trigger is not 527 involved, the AF algorithm should start in INACTIVE state, 528 and then transition into PASSIVE_SCAN and PASSIVE_FOCUSED 529 states as appropriate. 
When the AF trigger is activated, 530 the algorithm should immediately transition into 531 AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the 532 lens position until a cancel AF trigger is received. 533 534 Once cancel is received, the algorithm should transition 535 back to INACTIVE and resume passive scan. Note that this 536 behavior is not identical to CONTINUOUS_PICTURE, since an 537 ongoing PASSIVE_SCAN must immediately be 538 canceled.</notes></value> 539 <value>CONTINUOUS_PICTURE 540 <notes>In this mode, the AF algorithm modifies the lens 541 position continually to attempt to provide a 542 constantly-in-focus image stream. 543 544 The focusing behavior should be suitable for still image 545 capture; typically this means focusing as fast as 546 possible. When the AF trigger is not involved, the AF 547 algorithm should start in INACTIVE state, and then 548 transition into PASSIVE_SCAN and PASSIVE_FOCUSED states as 549 appropriate as it attempts to maintain focus. When the AF 550 trigger is activated, the algorithm should finish its 551 PASSIVE_SCAN if active, and then transition into 552 AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the 553 lens position until a cancel AF trigger is received. 554 555 When the AF cancel trigger is activated, the algorithm 556 should transition back to INACTIVE and then act as if it 557 has just been started.</notes></value> 558 <value>EDOF 559 <notes>Extended depth of field (digital focus). AF 560 trigger is ignored, AF state should always be 561 INACTIVE.</notes></value> 562 </enum> 563 <description>Whether AF is currently enabled, and what 564 mode it is set to</description> 565 <range>android.control.afAvailableModes</range> 566 <details>Only effective if android.control.mode = AUTO. 
567 568 If the lens is controlled by the camera device auto-focus algorithm, 569 the camera device will report the current AF status in android.control.afState 570 in result metadata.</details> 571 <tag id="BC" /> 572 </entry> 573 <entry name="afRegions" type="int32" visibility="public" 574 container="array"> 575 <array> 576 <size>5</size> 577 <size>area_count</size> 578 </array> 579 <description>List of areas to use for focus 580 estimation.</description> 581 <range>`area_count <= android.control.maxRegions[2]`</range> 582 <details>Each area is a rectangle plus weight: xmin, ymin, 583 xmax, ymax, weight. The rectangle is defined to be inclusive of the 584 specified coordinates. 585 586 The coordinate system is based on the active pixel array, 587 with (0,0) being the top-left pixel in the active pixel array, and 588 (android.sensor.info.activeArraySize.width - 1, 589 android.sensor.info.activeArraySize.height - 1) being the 590 bottom-right pixel in the active pixel array. The weight 591 should be nonnegative. 592 593 If all regions have 0 weight, then no specific focus area 594 needs to be used by the HAL. If the focusing region is 595 outside the current android.scaler.cropRegion, the HAL 596 should ignore the sections outside the region and output the 597 used sections in the frame metadata.</details> 598 <tag id="BC" /> 599 </entry> 600 <entry name="afTrigger" type="byte" visibility="public" enum="true"> 601 <enum> 602 <value>IDLE 603 <notes>The trigger is idle.</notes> 604 </value> 605 <value>START 606 <notes>Autofocus will trigger now.</notes> 607 </value> 608 <value>CANCEL 609 <notes>Autofocus will return to its initial 610 state, and cancel any currently active trigger.</notes> 611 </value> 612 </enum> 613 <description> 614 Whether the camera device will trigger autofocus for this request. 615 </description> 616 <details>This entry is normally set to IDLE, or is not 617 included at all in the request settings. 
618 619 When included and set to START, the camera device will trigger the 620 autofocus algorithm. If autofocus is disabled, this trigger has no effect. 621 622 When set to CANCEL, the camera device will cancel any active trigger, 623 and return to its initial AF state. 624 625 See android.control.afState for what that means for each AF mode. 626 </details> 627 <tag id="BC" /> 628 </entry> 629 <entry name="awbLock" type="byte" visibility="public" enum="true" 630 typedef="boolean"> 631 <enum> 632 <value>OFF 633 <notes>Auto-whitebalance lock is disabled; the AWB 634 algorithm is free to update its parameters if in AUTO 635 mode.</notes></value> 636 <value>ON 637 <notes>Auto-whitebalance lock is enabled; the AWB 638 algorithm must not update its parameters while the lock 639 is active.</notes></value> 640 </enum> 641 <description>Whether AWB is currently locked to its 642 latest calculated values.</description> 643 <details>Note that AWB lock is only meaningful for AUTO 644 mode; in other modes, AWB is already fixed to a specific 645 setting.</details> 646 <tag id="BC" /> 647 </entry> 648 <entry name="awbMode" type="byte" visibility="public" enum="true"> 649 <enum> 650 <value>OFF 651 <notes> 652 The camera device's auto white balance routine is disabled; 653 the application-selected color transform matrix 654 (android.colorCorrection.transform) and gains 655 (android.colorCorrection.gains) are used by the camera 656 device for manual white balance control. 657 </notes> 658 </value> 659 <value>AUTO 660 <notes> 661 The camera device's auto white balance routine is active; 662 the application's values for android.colorCorrection.transform 663 and android.colorCorrection.gains are ignored. 664 </notes> 665 </value> 666 <value>INCANDESCENT 667 <notes> 668 The camera device's auto white balance routine is disabled; 669 the camera device uses incandescent light as the assumed scene 670 illumination for white balance. 
While the exact white balance 671 transforms are up to the camera device, they will approximately 672 match the CIE standard illuminant A. 673 </notes> 674 </value> 675 <value>FLUORESCENT 676 <notes> 677 The camera device's auto white balance routine is disabled; 678 the camera device uses fluorescent light as the assumed scene 679 illumination for white balance. While the exact white balance 680 transforms are up to the camera device, they will approximately 681 match the CIE standard illuminant F2. 682 </notes> 683 </value> 684 <value>WARM_FLUORESCENT 685 <notes> 686 The camera device's auto white balance routine is disabled; 687 the camera device uses warm fluorescent light as the assumed scene 688 illumination for white balance. While the exact white balance 689 transforms are up to the camera device, they will approximately 690 match the CIE standard illuminant F4. 691 </notes> 692 </value> 693 <value>DAYLIGHT 694 <notes> 695 The camera device's auto white balance routine is disabled; 696 the camera device uses daylight light as the assumed scene 697 illumination for white balance. While the exact white balance 698 transforms are up to the camera device, they will approximately 699 match the CIE standard illuminant D65. 700 </notes> 701 </value> 702 <value>CLOUDY_DAYLIGHT 703 <notes> 704 The camera device's auto white balance routine is disabled; 705 the camera device uses cloudy daylight light as the assumed scene 706 illumination for white balance. 707 </notes> 708 </value> 709 <value>TWILIGHT 710 <notes> 711 The camera device's auto white balance routine is disabled; 712 the camera device uses twilight light as the assumed scene 713 illumination for white balance. 714 </notes> 715 </value> 716 <value>SHADE 717 <notes> 718 The camera device's auto white balance routine is disabled; 719 the camera device uses shade light as the assumed scene 720 illumination for white balance. 
721 </notes> 722 </value> 723 </enum> 724 <description>Whether AWB is currently setting the color 725 transform fields, and what its illumination target 726 is</description> 727 <range>android.control.awbAvailableModes</range> 728 <details> 729 This control is only effective if android.control.mode is AUTO. 730 731 When set to the AUTO mode, the camera device's auto white balance 732 routine is enabled, overriding the application's selected 733 android.colorCorrection.transform, android.colorCorrection.gains and 734 android.colorCorrection.mode. 735 736 When set to the OFF mode, the camera device's auto white balance 737 routine is disabled. The application manually controls the white 738 balance using android.colorCorrection.transform, android.colorCorrection.gains 739 and android.colorCorrection.mode. 740 741 When set to any of the other modes, the camera device's auto white balance 742 routine is disabled. The camera device uses each particular illumination 743 target for white balance adjustment. 744 </details> 745 <tag id="BC" /> 746 <tag id="AWB" /> 747 </entry> 748 <entry name="awbRegions" type="int32" visibility="public" 749 container="array"> 750 <array> 751 <size>5</size> 752 <size>area_count</size> 753 </array> 754 <description>List of areas to use for illuminant 755 estimation.</description> 756 <range>`area_count <= android.control.maxRegions[1]`</range> 757 <details>Only used in AUTO mode. 758 759 Each area is a rectangle plus weight: xmin, ymin, 760 xmax, ymax, weight. The rectangle is defined to be inclusive of the 761 specified coordinates. 762 763 The coordinate system is based on the active pixel array, 764 with (0,0) being the top-left pixel in the active pixel array, and 765 (android.sensor.info.activeArraySize.width - 1, 766 android.sensor.info.activeArraySize.height - 1) being the 767 bottom-right pixel in the active pixel array. The weight 768 should be nonnegative. 
769 770 If all regions have 0 weight, then no specific metering area 771 needs to be used by the HAL. If the metering region is 772 outside the current android.scaler.cropRegion, the HAL 773 should ignore the sections outside the region and output the 774 used sections in the frame metadata. 775 </details> 776 <tag id="BC" /> 777 </entry> 778 <entry name="captureIntent" type="byte" visibility="public" enum="true"> 779 <enum> 780 <value>CUSTOM 781 <notes>This request doesn't fall into the other 782 categories. Default to preview-like 783 behavior.</notes></value> 784 <value>PREVIEW 785 <notes>This request is for a preview-like usecase. The 786 precapture trigger may be used to start off a metering 787 w/flash sequence</notes></value> 788 <value>STILL_CAPTURE 789 <notes>This request is for a still capture-type 790 usecase.</notes></value> 791 <value>VIDEO_RECORD 792 <notes>This request is for a video recording 793 usecase.</notes></value> 794 <value>VIDEO_SNAPSHOT 795 <notes>This request is for a video snapshot (still 796 image while recording video) usecase</notes></value> 797 <value>ZERO_SHUTTER_LAG 798 <notes>This request is for a ZSL usecase; the 799 application will stream full-resolution images and 800 reprocess one or several later for a final 801 capture</notes></value> 802 </enum> 803 <description>Information to the camera device 3A (auto-exposure, 804 auto-focus, auto-white balance) routines about the purpose 805 of this capture, to help the camera device to decide optimal 3A 806 strategy.</description> 807 <range>All must be supported</range> 808 <details>This control is only effective if `android.control.mode != OFF` 809 and any 3A routine is active.</details> 810 <tag id="BC" /> 811 </entry> 812 <entry name="effectMode" type="byte" visibility="public" enum="true"> 813 <enum> 814 <value>OFF 815 <notes> 816 No color effect will be applied. 
817 </notes> 818 </value> 819 <value optional="true">MONO 820 <notes> 821 A "monocolor" effect where the image is mapped into 822 a single color. This will typically be grayscale. 823 </notes> 824 </value> 825 <value optional="true">NEGATIVE 826 <notes> 827 A "photo-negative" effect where the image's colors 828 are inverted. 829 </notes> 830 </value> 831 <value optional="true">SOLARIZE 832 <notes> 833 A "solarisation" effect (Sabattier effect) where the 834 image is wholly or partially reversed in 835 tone. 836 </notes> 837 </value> 838 <value optional="true">SEPIA 839 <notes> 840 A "sepia" effect where the image is mapped into warm 841 gray, red, and brown tones. 842 </notes> 843 </value> 844 <value optional="true">POSTERIZE 845 <notes> 846 A "posterization" effect where the image uses 847 discrete regions of tone rather than a continuous 848 gradient of tones. 849 </notes> 850 </value> 851 <value optional="true">WHITEBOARD 852 <notes> 853 A "whiteboard" effect where the image is typically displayed 854 as regions of white, with black or grey details. 855 </notes> 856 </value> 857 <value optional="true">BLACKBOARD 858 <notes> 859 A "blackboard" effect where the image is typically displayed 860 as regions of black, with white or grey details. 861 </notes> 862 </value> 863 <value optional="true">AQUA 864 <notes> 865 An "aqua" effect where a blue hue is added to the image. 866 </notes> 867 </value> 868 </enum> 869 <description>A special color effect to apply.</description> 870 <range>android.control.availableEffects</range> 871 <details> 872 When this mode is set, a color effect will be applied 873 to images produced by the camera device. The interpretation 874 and implementation of these color effects is left to the 875 implementor of the camera device, and should not be 876 depended on to be consistent (or present) across all 877 devices. 878 879 A color effect will only be applied if 880 android.control.mode != OFF. 
881 </details> 882 <tag id="BC" /> 883 </entry> 884 <entry name="mode" type="byte" visibility="public" enum="true"> 885 <enum> 886 <value>OFF 887 <notes>Full application control of pipeline. All 3A 888 routines are disabled, no other settings in 889 android.control.* have any effect</notes></value> 890 <value>AUTO 891 <notes>Use settings for each individual 3A routine. 892 Manual control of capture parameters is disabled. All 893 controls in android.control.* besides sceneMode take 894 effect</notes></value> 895 <value>USE_SCENE_MODE 896 <notes>Use specific scene mode. Enabling this disables 897 control.aeMode, control.awbMode and control.afMode 898 controls; the HAL must ignore those settings while 899 USE_SCENE_MODE is active (except for FACE_PRIORITY 900 scene mode). Other control entries are still active. 901 This setting can only be used if availableSceneModes != 902 UNSUPPORTED</notes></value> 903 <value>OFF_KEEP_STATE 904 <notes>Same as OFF mode, except that this capture will not be 905 used by camera device background auto-exposure, auto-white balance and 906 auto-focus algorithms to update their statistics.</notes></value> 907 </enum> 908 <description>Overall mode of 3A control 909 routines</description> 910 <range>all must be supported</range> 911 <details>High-level 3A control. When set to OFF, all 3A control 912 by the camera device is disabled. The application must set the fields for 913 capture parameters itself. 914 915 When set to AUTO, the individual algorithm controls in 916 android.control.* are in effect, such as android.control.afMode. 917 918 When set to USE_SCENE_MODE, the individual controls in 919 android.control.* are mostly disabled, and the camera device implements 920 one of the scene mode settings (such as ACTION, SUNSET, or PARTY) 921 as it wishes. The camera device scene mode 3A settings are provided by 922 android.control.sceneModeOverrides. 
923 924 When set to OFF_KEEP_STATE, behavior is similar to OFF mode; the only difference 925 is that this frame will not be used to update the camera device's background 3A 926 statistics, as if the frame had never been captured. This mode is useful when 927 the application doesn't want a manually controlled 3A capture to affect 928 the subsequent automatic 3A capture results. 929 </details> 930 <tag id="BC" /> 931 </entry> 932 <entry name="sceneMode" type="byte" visibility="public" enum="true"> 933 <enum> 934 <value id="0">DISABLED 935 <notes> 936 Indicates that no scene modes are set for a given capture request. 937 </notes> 938 </value> 939 <value>FACE_PRIORITY 940 <notes>If face detection support exists, use face 941 detection data for auto-focus, auto-white balance, and 942 auto-exposure routines. If face detection statistics are 943 disabled (i.e. android.statistics.faceDetectMode is set to OFF), 944 this should still operate correctly (but will not return 945 face detection statistics to the framework). 946 947 Unlike the other scene modes, android.control.aeMode, 948 android.control.awbMode, and android.control.afMode 949 remain active when FACE_PRIORITY is set. 950 </notes> 951 </value> 952 <value optional="true">ACTION 953 <notes> 954 Optimized for photos of quickly moving objects. 955 Similar to SPORTS. 956 </notes> 957 </value> 958 <value optional="true">PORTRAIT 959 <notes> 960 Optimized for still photos of people. 961 </notes> 962 </value> 963 <value optional="true">LANDSCAPE 964 <notes> 965 Optimized for photos of distant macroscopic objects. 966 </notes> 967 </value> 968 <value optional="true">NIGHT 969 <notes> 970 Optimized for low-light settings. 971 </notes> 972 </value> 973 <value optional="true">NIGHT_PORTRAIT 974 <notes> 975 Optimized for still photos of people in low-light 976 settings. 977 </notes> 978 </value> 979 <value optional="true">THEATRE 980 <notes> 981 Optimized for dim, indoor settings where flash must 982 remain off. 
983 </notes> 984 </value> 985 <value optional="true">BEACH 986 <notes> 987 Optimized for bright, outdoor beach settings. 988 </notes> 989 </value> 990 <value optional="true">SNOW 991 <notes> 992 Optimized for bright, outdoor settings containing snow. 993 </notes> 994 </value> 995 <value optional="true">SUNSET 996 <notes> 997 Optimized for scenes of the setting sun. 998 </notes> 999 </value> 1000 <value optional="true">STEADYPHOTO 1001 <notes> 1002 Optimized to avoid blurry photos due to small amounts of 1003 device motion (for example: due to hand shake). 1004 </notes> 1005 </value> 1006 <value optional="true">FIREWORKS 1007 <notes> 1008 Optimized for nighttime photos of fireworks. 1009 </notes> 1010 </value> 1011 <value optional="true">SPORTS 1012 <notes> 1013 Optimized for photos of quickly moving people. 1014 Similar to ACTION. 1015 </notes> 1016 </value> 1017 <value optional="true">PARTY 1018 <notes> 1019 Optimized for dim, indoor settings with multiple moving 1020 people. 1021 </notes> 1022 </value> 1023 <value optional="true">CANDLELIGHT 1024 <notes> 1025 Optimized for dim settings where the main light source 1026 is a flame. 1027 </notes> 1028 </value> 1029 <value optional="true">BARCODE 1030 <notes> 1031 Optimized for accurately capturing a photo of a barcode 1032 for use by camera applications that wish to read the 1033 barcode value. 1034 </notes> 1035 </value> 1036 </enum> 1037 <description> 1038 A camera mode optimized for conditions typical in a particular 1039 capture setting. 1040 </description> 1041 <range>android.control.availableSceneModes</range> 1042 <details> 1043 This is the mode that is active when 1044 `android.control.mode == USE_SCENE_MODE`. Aside from FACE_PRIORITY, 1045 these modes will disable android.control.aeMode, 1046 android.control.awbMode, and android.control.afMode while in use. 1047 1048 The interpretation and implementation of these scene modes is left 1049 to the implementor of the camera device. 
Their behavior will not be 1050 consistent across all devices, and any given device may only implement 1051 a subset of these modes. 1052 </details> 1053 <hal_details> 1054 HAL implementations that include scene modes are expected to provide 1055 the per-scene settings to use for android.control.aeMode, 1056 android.control.awbMode, and android.control.afMode in 1057 android.control.sceneModeOverrides. 1058 </hal_details> 1059 <tag id="BC" /> 1060 </entry> 1061 <entry name="videoStabilizationMode" type="byte" visibility="public" 1062 enum="true" typedef="boolean"> 1063 <enum> 1064 <value>OFF</value> 1065 <value>ON</value> 1066 </enum> 1067 <description>Whether video stabilization is 1068 active</description> 1069 <details>If enabled, video stabilization can modify the 1070 android.scaler.cropRegion to keep the video stream 1071 stabilized</details> 1072 <tag id="BC" /> 1073 </entry> 1074 </controls> 1075 <static> 1076 <entry name="aeAvailableAntibandingModes" type="byte" visibility="public" 1077 type_notes="list of enums" container="array"> 1078 <array> 1079 <size>n</size> 1080 </array> 1081 <description> 1082 The set of auto-exposure antibanding modes that are 1083 supported by this camera device. 1084 </description> 1085 <details> 1086 Not all of the auto-exposure anti-banding modes may be 1087 supported by a given camera device. This field lists the 1088 valid anti-banding modes that the application may request 1089 for this camera device; they must include AUTO. 1090 </details> 1091 </entry> 1092 <entry name="aeAvailableModes" type="byte" visibility="public" 1093 type_notes="list of enums" container="array"> 1094 <array> 1095 <size>n</size> 1096 </array> 1097 <description> 1098 The set of auto-exposure modes that are supported by this 1099 camera device. 1100 </description> 1101 <details> 1102 Not all the auto-exposure modes may be supported by a 1103 given camera device, especially if no flash unit is 1104 available. 
This entry lists the valid modes for 1105 android.control.aeMode for this camera device. 1106 1107 All camera devices support ON, and all camera devices with 1108 flash units support ON_AUTO_FLASH and 1109 ON_ALWAYS_FLASH. 1110 1111 Full-capability camera devices always support OFF mode, 1112 which enables application control of camera exposure time, 1113 sensitivity, and frame duration. 1114 </details> 1115 <tag id="BC" /> 1116 </entry> 1117 <entry name="aeAvailableTargetFpsRanges" type="int32" visibility="public" 1118 type_notes="list of pairs of frame rates" 1119 container="array"> 1120 <array> 1121 <size>2</size> 1122 <size>n</size> 1123 </array> 1124 <description>List of frame rate ranges supported by the 1125 AE algorithm/hardware</description> 1126 </entry> 1127 <entry name="aeCompensationRange" type="int32" visibility="public" 1128 container="array"> 1129 <array> 1130 <size>2</size> 1131 </array> 1132 <description>Maximum and minimum exposure compensation 1133 setting, in counts of 1134 android.control.aeCompensationStep</description> 1135 <range>At least (-2,2)/(exp compensation step 1136 size)</range> 1137 <tag id="BC" /> 1138 </entry> 1139 <entry name="aeCompensationStep" type="rational" visibility="public"> 1140 <description>Smallest step by which exposure compensation 1141 can be changed</description> 1142 <range>&lt;= 1/2</range> 1143 <tag id="BC" /> 1144 </entry> 1145 <entry name="afAvailableModes" type="byte" visibility="public" 1146 type_notes="List of enums" container="array"> 1147 <array> 1148 <size>n</size> 1149 </array> 1150 <description>List of AF modes that can be 1151 selected with android.control.afMode.</description> 1152 <details> 1153 Not all the auto-focus modes may be supported by a 1154 given camera device. This entry lists the valid modes for 1155 android.control.afMode for this camera device. 
1156 1157 All camera devices will support OFF mode, and all camera devices with 1158 adjustable focuser units (`android.lens.info.minimumFocusDistance > 0`) 1159 will support AUTO mode. 1160 </details> 1161 <tag id="BC" /> 1162 </entry> 1163 <entry name="availableEffects" type="byte" visibility="public" 1164 type_notes="List of enums (android.control.effectMode)." container="array"> 1165 <array> 1166 <size>n</size> 1167 </array> 1168 <description> 1169 List containing the subset of color effects 1170 specified in android.control.effectMode that is supported by 1171 this device. 1172 </description> 1173 <range> 1174 Any subset of enums from those specified in 1175 android.control.effectMode. OFF must be included in any subset. 1176 </range> 1177 <details> 1178 This list contains the color effect modes that can be applied to 1179 images produced by the camera device. Only modes that have 1180 been fully implemented for the current device may be included here. 1181 Implementations are not expected to be consistent across all devices. 1182 If no color effect modes are available for a device, this should 1183 simply be set to OFF. 1184 1185 A color effect will only be applied if 1186 android.control.mode != OFF. 1187 </details> 1188 <tag id="BC" /> 1189 </entry> 1190 <entry name="availableSceneModes" type="byte" visibility="public" 1191 type_notes="List of enums (android.control.sceneMode)." 1192 container="array"> 1193 <array> 1194 <size>n</size> 1195 </array> 1196 <description> 1197 List containing a subset of scene modes 1198 specified in android.control.sceneMode. 1199 </description> 1200 <range> 1201 Any subset of the enums specified in android.control.sceneMode 1202 not including DISABLED, or solely DISABLED if no 1203 scene modes are available. FACE_PRIORITY must be included 1204 if face detection is supported (i.e. `android.statistics.info.maxFaceCount > 0`). 1205 </range> 1206 <details> 1207 This list contains scene modes that can be set for the camera device. 
1208 Only scene modes that have been fully implemented for the 1209 camera device may be included here. Implementations are not expected 1210 to be consistent across all devices. If no scene modes are supported 1211 by the camera device, this will be set to `[DISABLED]`. 1212 </details> 1213 <tag id="BC" /> 1214 </entry> 1215 <entry name="availableVideoStabilizationModes" type="byte" 1216 visibility="public" type_notes="List of enums." container="array"> 1217 <array> 1218 <size>n</size> 1219 </array> 1220 <description>List of video stabilization modes that can 1221 be supported</description> 1222 <range>OFF must be included</range> 1223 <tag id="BC" /> 1224 </entry> 1225 <entry name="awbAvailableModes" type="byte" visibility="public" 1226 type_notes="List of enums" 1227 container="array"> 1228 <array> 1229 <size>n</size> 1230 </array> 1231 <description>The set of auto-white-balance modes (android.control.awbMode) 1232 that are supported by this camera device.</description> 1233 <details> 1234 Not all the auto-white-balance modes may be supported by a 1235 given camera device. This entry lists the valid modes for 1236 android.control.awbMode for this camera device. 1237 1238 All camera devices will support ON mode. 1239 1240 Full-capability camera devices will always support OFF mode, 1241 which enables application control of white balance, by using 1242 android.colorCorrection.transform and android.colorCorrection.gains 1243 (android.colorCorrection.mode must be set to TRANSFORM_MATRIX). 
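As a sketch of what the manual OFF-mode path described above asks the device to do: android.colorCorrection.gains are per-channel multipliers applied to the Bayer color channels, in the order `[R, G_even, G_odd, B]`. The helper below is illustrative application-side arithmetic on 8-bit values, not the device implementation (devices apply gains to raw sensor data in their own pipelines):

```java
// Illustrative sketch: apply white balance gains in the Bayer channel
// order [R, G_even, G_odd, B], clamping results to the 8-bit range.
public class WbGainsSketch {
    public static int[] applyGains(int[] rggb, float[] gains) {
        int[] out = new int[rggb.length];
        for (int i = 0; i < rggb.length; i++) {
            out[i] = Math.min(255, Math.round(rggb[i] * gains[i]));
        }
        return out;
    }
}
```

For example, gains of `[2.0, 1.0, 1.0, 1.5]` warm up a scene by boosting red and blue relative to green.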
1244 </details> 1245 <tag id="BC" /> 1246 </entry> 1247 <entry name="maxRegions" type="int32" visibility="public" container="array"> 1248 <array> 1249 <size>3</size> 1250 </array> 1251 <description> 1252 List of the maximum number of regions that can be used for metering in 1253 auto-exposure (AE), auto-white balance (AWB), and auto-focus (AF); 1254 this corresponds to the maximum number of elements in 1255 android.control.aeRegions, android.control.awbRegions, 1256 and android.control.afRegions. 1257 </description> 1258 <range> 1259 Value must be &gt;= 0 for each element. For full-capability devices 1260 this value must be &gt;= 1 for AE and AF. The order of the elements is: 1261 `(AE, AWB, AF)`.</range> 1262 <tag id="BC" /> 1263 </entry> 1264 <entry name="sceneModeOverrides" type="byte" visibility="system" 1265 container="array"> 1266 <array> 1267 <size>3</size> 1268 <size>length(availableSceneModes)</size> 1269 </array> 1270 <description> 1271 Ordered list of auto-exposure, auto-white balance, and auto-focus 1272 settings to use with each available scene mode. 1273 </description> 1274 <range> 1275 For each available scene mode, the list must contain three 1276 entries containing the android.control.aeMode, 1277 android.control.awbMode, and android.control.afMode values used 1278 by the camera device. The entry order is `(aeMode, awbMode, afMode)` 1279 where aeMode has the lowest index position. 1280 </range> 1281 <details> 1282 When a scene mode is enabled, the camera device is expected 1283 to override android.control.aeMode, android.control.awbMode, 1284 and android.control.afMode with its preferred settings for 1285 that scene mode. 1286 1287 The order of this list matches that of availableSceneModes, 1288 with 3 entries for each mode. 
The overrides listed 1289 for FACE_PRIORITY are ignored, since for that 1290 mode the application-set android.control.aeMode, 1291 android.control.awbMode, and android.control.afMode values are 1292 used instead, matching the behavior when android.control.mode 1293 is set to AUTO. It is recommended that the FACE_PRIORITY 1294 overrides be set to 0. 1295 1296 For example, if availableSceneModes contains 1297 `(FACE_PRIORITY, ACTION, NIGHT)`, then the camera framework 1298 expects sceneModeOverrides to have 9 entries formatted like: 1299 `(0, 0, 0, ON_AUTO_FLASH, AUTO, CONTINUOUS_PICTURE, 1300 ON_AUTO_FLASH, INCANDESCENT, AUTO)`. 1301 </details> 1302 <hal_details> 1303 To maintain backward compatibility, this list will be made available 1304 in the static metadata of the camera service. The camera service will 1305 use these values to set android.control.aeMode, 1306 android.control.awbMode, and android.control.afMode when using a scene 1307 mode other than FACE_PRIORITY. 1308 </hal_details> 1309 <tag id="BC" /> 1310 </entry> 1311 </static> 1312 <dynamic> 1313 <entry name="aePrecaptureId" type="int32" visibility="hidden"> 1314 <description>The ID sent with the latest 1315 CAMERA2_TRIGGER_PRECAPTURE_METERING call</description> 1316 <range>**Deprecated**. Do not use.</range> 1317 <details>Must be 0 if no 1318 CAMERA2_TRIGGER_PRECAPTURE_METERING trigger has been received yet 1319 by the HAL. Always updated even if the AE algorithm ignores the 1320 trigger.</details> 1321 </entry> 1322 <clone entry="android.control.aeMode" kind="controls"> 1323 </clone> 1324 <clone entry="android.control.aeRegions" kind="controls"> 1325 </clone> 1326 <entry name="aeState" type="byte" visibility="public" enum="true"> 1327 <enum> 1328 <value>INACTIVE 1329 <notes>AE is off or recently reset. 
When a camera device is opened, it starts in 1330 this state.</notes></value> 1331 <value>SEARCHING 1332 <notes>AE doesn't yet have a good set of control values 1333 for the current scene.</notes></value> 1334 <value>CONVERGED 1335 <notes>AE has a good set of control values for the 1336 current scene.</notes></value> 1337 <value>LOCKED 1338 <notes>AE has been locked.</notes></value> 1339 <value>FLASH_REQUIRED 1340 <notes>AE has a good set of control values, but flash 1341 needs to be fired for good quality still 1342 capture.</notes></value> 1343 <value>PRECAPTURE 1344 <notes>AE has been asked to do a precapture sequence 1345 (through the android.control.aePrecaptureTrigger START), 1346 and is currently executing it. Once PRECAPTURE 1347 completes, AE will transition to CONVERGED or 1348 FLASH_REQUIRED as appropriate.</notes></value> 1349 </enum> 1350 <description>Current state of AE algorithm</description> 1351 <details>Switching between or enabling AE modes (android.control.aeMode) always 1352 resets the AE state to INACTIVE. Similarly, switching between android.control.mode, 1353 or android.control.sceneMode if `android.control.mode == USE_SCENE_MODE` resets all 1354 the algorithm states to INACTIVE. 1355 1356 The camera device can do several state transitions between two results, if it is 1357 allowed by the state transition table. For example: INACTIVE may never actually be 1358 seen in a result. 1359 1360 The state in the result is the state for this image (in sync with this image): if 1361 AE state becomes CONVERGED, then the image data associated with this result should 1362 be good to use. 1363 1364 Below are state transition tables for different AE modes. 
1365 1366 State | Transition Cause | New State | Notes 1367 :------------:|:----------------:|:---------:|:-----------------------: 1368 INACTIVE | | INACTIVE | Camera device auto exposure algorithm is disabled 1369 1370 When android.control.aeMode is AE_MODE_ON_*: 1371 1372 State | Transition Cause | New State | Notes 1373 :-------------:|:--------------------------------------------:|:--------------:|:-----------------: 1374 INACTIVE | Camera device initiates AE scan | SEARCHING | Values changing 1375 INACTIVE | android.control.aeLock is ON | LOCKED | Values locked 1376 SEARCHING | Camera device finishes AE scan | CONVERGED | Good values, not changing 1377 SEARCHING | Camera device finishes AE scan | FLASH_REQUIRED | Converged but too dark w/o flash 1378 SEARCHING | android.control.aeLock is ON | LOCKED | Values locked 1379 CONVERGED | Camera device initiates AE scan | SEARCHING | Values changing 1380 CONVERGED | android.control.aeLock is ON | LOCKED | Values locked 1381 FLASH_REQUIRED | Camera device initiates AE scan | SEARCHING | Values changing 1382 FLASH_REQUIRED | android.control.aeLock is ON | LOCKED | Values locked 1383 LOCKED | android.control.aeLock is OFF | SEARCHING | Values not good after unlock 1384 LOCKED | android.control.aeLock is OFF | CONVERGED | Values good after unlock 1385 LOCKED | android.control.aeLock is OFF | FLASH_REQUIRED | Exposure good, but too dark 1386 PRECAPTURE | Sequence done. android.control.aeLock is OFF | CONVERGED | Ready for high-quality capture 1387 PRECAPTURE | Sequence done. 
android.control.aeLock is ON | LOCKED | Ready for high-quality capture 1388 Any state | android.control.aePrecaptureTrigger is START | PRECAPTURE | Start AE precapture metering sequence 1389 </details> 1390 </entry> 1391 <clone entry="android.control.afMode" kind="controls"> 1392 </clone> 1393 <clone entry="android.control.afRegions" kind="controls"> 1394 </clone> 1395 <entry name="afState" type="byte" visibility="public" enum="true"> 1396 <enum> 1397 <value>INACTIVE 1398 <notes>AF off or has not yet tried to scan/been asked 1399 to scan. When a camera device is opened, it starts in 1400 this state.</notes></value> 1401 <value>PASSIVE_SCAN 1402 <notes>if CONTINUOUS_* modes are supported. AF is 1403 currently doing an AF scan initiated by a continuous 1404 autofocus mode</notes></value> 1405 <value>PASSIVE_FOCUSED 1406 <notes>if CONTINUOUS_* modes are supported. AF currently 1407 believes it is in focus, but may restart scanning at 1408 any time.</notes></value> 1409 <value>ACTIVE_SCAN 1410 <notes>if AUTO or MACRO modes are supported. AF is doing 1411 an AF scan because it was triggered by AF 1412 trigger</notes></value> 1413 <value>FOCUSED_LOCKED 1414 <notes>if any AF mode besides OFF is supported. AF 1415 believes it is focused correctly and is 1416 locked</notes></value> 1417 <value>NOT_FOCUSED_LOCKED 1418 <notes>if any AF mode besides OFF is supported. AF has 1419 failed to focus successfully and is 1420 locked</notes></value> 1421 <value>PASSIVE_UNFOCUSED 1422 <notes>if CONTINUOUS_* modes are supported. AF finished a 1423 passive scan without finding focus, and may restart 1424 scanning at any time.</notes></value> 1425 </enum> 1426 <description>Current state of AF algorithm</description> 1427 <details> 1428 Switching between or enabling AF modes (android.control.afMode) always 1429 resets the AF state to INACTIVE. 
Similarly, switching between android.control.mode, 1430 or android.control.sceneMode if `android.control.mode == USE_SCENE_MODE` resets all 1431 the algorithm states to INACTIVE. 1432 1433 The camera device can do several state transitions between two results, if it is 1434 allowed by the state transition table. For example: INACTIVE may never actually be 1435 seen in a result. 1436 1437 The state in the result is the state for this image (in sync with this image): if 1438 AF state becomes FOCUSED, then the image data associated with this result should 1439 be sharp. 1440 1441 Below are state transition tables for different AF modes. 1442 1443 When android.control.afMode is AF_MODE_OFF or AF_MODE_EDOF: 1444 1445 State | Transition Cause | New State | Notes 1446 :------------:|:----------------:|:---------:|:-----------: 1447 INACTIVE | | INACTIVE | Never changes 1448 1449 When android.control.afMode is AF_MODE_AUTO or AF_MODE_MACRO: 1450 1451 State | Transition Cause | New State | Notes 1452 :-----------------:|:----------------:|:------------------:|:--------------: 1453 INACTIVE | AF_TRIGGER | ACTIVE_SCAN | Start AF sweep, Lens now moving 1454 ACTIVE_SCAN | AF sweep done | FOCUSED_LOCKED | Focused, Lens now locked 1455 ACTIVE_SCAN | AF sweep done | NOT_FOCUSED_LOCKED | Not focused, Lens now locked 1456 ACTIVE_SCAN | AF_CANCEL | INACTIVE | Cancel/reset AF, Lens now locked 1457 FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Cancel/reset AF 1458 FOCUSED_LOCKED | AF_TRIGGER | ACTIVE_SCAN | Start new sweep, Lens now moving 1459 NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Cancel/reset AF 1460 NOT_FOCUSED_LOCKED | AF_TRIGGER | ACTIVE_SCAN | Start new sweep, Lens now moving 1461 Any state | Mode change | INACTIVE | 1462 1463 When android.control.afMode is AF_MODE_CONTINUOUS_VIDEO: 1464 1465 State | Transition Cause | New State | Notes 1466 :-----------------:|:-----------------------------------:|:------------------:|:--------------: 1467 INACTIVE | Camera device initiates new 
scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1468 INACTIVE | AF_TRIGGER | NOT_FOCUSED_LOCKED | AF state query, Lens now locked 1469 PASSIVE_SCAN | Camera device completes current scan| PASSIVE_FOCUSED | End AF scan, Lens now locked 1470 PASSIVE_SCAN | Camera device fails current scan | PASSIVE_UNFOCUSED | End AF scan, Lens now locked 1471 PASSIVE_SCAN | AF_TRIGGER | FOCUSED_LOCKED | Immediate trans. If focus is good, Lens now locked 1472 PASSIVE_SCAN | AF_TRIGGER | NOT_FOCUSED_LOCKED | Immediate trans. if focus is bad, Lens now locked 1473 PASSIVE_SCAN | AF_CANCEL | INACTIVE | Reset lens position, Lens now locked 1474 PASSIVE_FOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1475 PASSIVE_UNFOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1476 PASSIVE_FOCUSED | AF_TRIGGER | FOCUSED_LOCKED | Immediate trans. Lens now locked 1477 PASSIVE_UNFOCUSED | AF_TRIGGER | NOT_FOCUSED_LOCKED | Immediate trans. Lens now locked 1478 FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | No effect 1479 FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan 1480 NOT_FOCUSED_LOCKED | AF_TRIGGER | NOT_FOCUSED_LOCKED | No effect 1481 NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan 1482 1483 When android.control.afMode is AF_MODE_CONTINUOUS_PICTURE: 1484 1485 State | Transition Cause | New State | Notes 1486 :-----------------:|:------------------------------------:|:------------------:|:--------------: 1487 INACTIVE | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1488 INACTIVE | AF_TRIGGER | NOT_FOCUSED_LOCKED | AF state query, Lens now locked 1489 PASSIVE_SCAN | Camera device completes current scan | PASSIVE_FOCUSED | End AF scan, Lens now locked 1490 PASSIVE_SCAN | Camera device fails current scan | PASSIVE_UNFOCUSED | End AF scan, Lens now locked 1491 PASSIVE_SCAN | AF_TRIGGER | FOCUSED_LOCKED | Eventual trans. 
once focus good, Lens now locked 1492 PASSIVE_SCAN | AF_TRIGGER | NOT_FOCUSED_LOCKED | Eventual trans. if cannot focus, Lens now locked 1493 PASSIVE_SCAN | AF_CANCEL | INACTIVE | Reset lens position, Lens now locked 1494 PASSIVE_FOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1495 PASSIVE_UNFOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1496 PASSIVE_FOCUSED | AF_TRIGGER | FOCUSED_LOCKED | Immediate trans. Lens now locked 1497 PASSIVE_UNFOCUSED | AF_TRIGGER | NOT_FOCUSED_LOCKED | Immediate trans. Lens now locked 1498 FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | No effect 1499 FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan 1500 NOT_FOCUSED_LOCKED | AF_TRIGGER | NOT_FOCUSED_LOCKED | No effect 1501 NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan 1502 </details> 1503 </entry> 1504 <entry name="afTriggerId" type="int32" visibility="hidden"> 1505 <description>The ID sent with the latest 1506 CAMERA2_TRIGGER_AUTOFOCUS call</description> 1507 <range>**Deprecated**. Do not use.</range> 1508 <details>Must be 0 if no CAMERA2_TRIGGER_AUTOFOCUS trigger 1509 received yet by HAL. Always updated even if AF algorithm 1510 ignores the trigger</details> 1511 </entry> 1512 <clone entry="android.control.awbMode" kind="controls"> 1513 </clone> 1514 <clone entry="android.control.awbRegions" kind="controls"> 1515 </clone> 1516 <entry name="awbState" type="byte" visibility="public" enum="true"> 1517 <enum> 1518 <value>INACTIVE 1519 <notes>AWB is not in auto mode. When a camera device is opened, it 1520 starts in this state.</notes></value> 1521 <value>SEARCHING 1522 <notes>AWB doesn't yet have a good set of control 1523 values for the current scene.</notes></value> 1524 <value>CONVERGED 1525 <notes>AWB has a good set of control values for the 1526 current scene.</notes></value> 1527 <value>LOCKED 1528 <notes>AWB has been locked. 
1529 </notes></value> 1530 </enum> 1531 <description>Current state of AWB algorithm</description> 1532 <details>Switching between or enabling AWB modes (android.control.awbMode) always 1533 resets the AWB state to INACTIVE. Similarly, switching between android.control.mode, 1534 or android.control.sceneMode if `android.control.mode == USE_SCENE_MODE` resets all 1535 the algorithm states to INACTIVE. 1536 1537 The camera device can do several state transitions between two results, if it is 1538 allowed by the state transition table. So INACTIVE may never actually be seen in 1539 a result. 1540 1541 The state in the result is the state for this image (in sync with this image): if 1542 AWB state becomes CONVERGED, then the image data associated with this result should 1543 be good to use. 1544 1545 Below are state transition tables for different AWB modes. 1546 1547 When `android.control.awbMode != AWB_MODE_AUTO`: 1548 1549 State | Transition Cause | New State | Notes 1550 :------------:|:----------------:|:---------:|:-----------------------: 1551 INACTIVE | |INACTIVE |Camera device auto white balance algorithm is disabled 1552 1553 When android.control.awbMode is AWB_MODE_AUTO: 1554 1555 State | Transition Cause | New State | Notes 1556 :-------------:|:--------------------------------:|:-------------:|:-----------------: 1557 INACTIVE | Camera device initiates AWB scan | SEARCHING | Values changing 1558 INACTIVE | android.control.awbLock is ON | LOCKED | Values locked 1559 SEARCHING | Camera device finishes AWB scan | CONVERGED | Good values, not changing 1560 SEARCHING | android.control.awbLock is ON | LOCKED | Values locked 1561 CONVERGED | Camera device initiates AWB scan | SEARCHING | Values changing 1562 CONVERGED | android.control.awbLock is ON | LOCKED | Values locked 1563 LOCKED | android.control.awbLock is OFF | SEARCHING | Values not good after unlock 1564 LOCKED | android.control.awbLock is OFF | CONVERGED | Values good after unlock 1565 </details> 1566 
</entry> 1567 <clone entry="android.control.mode" kind="controls"> 1568 </clone> 1569 </dynamic> 1570 </section> 1571 <section name="demosaic"> 1572 <controls> 1573 <entry name="mode" type="byte" enum="true"> 1574 <enum> 1575 <value>FAST 1576 <notes>Minimal or no slowdown of frame rate compared to 1577 Bayer RAW output</notes></value> 1578 <value>HIGH_QUALITY 1579 <notes>High-quality may reduce output frame 1580 rate</notes></value> 1581 </enum> 1582 <description>Controls the quality of the demosaicing 1583 processing</description> 1584 <tag id="V1" /> 1585 </entry> 1586 </controls> 1587 </section> 1588 <section name="edge"> 1589 <controls> 1590 <entry name="mode" type="byte" visibility="public" enum="true"> 1591 <enum> 1592 <value>OFF 1593 <notes>No edge enhancement is applied</notes></value> 1594 <value>FAST 1595 <notes>Must not slow down frame rate relative to sensor 1596 output</notes></value> 1597 <value>HIGH_QUALITY 1598 <notes>Frame rate may be reduced by high 1599 quality</notes></value> 1600 </enum> 1601 <description>Operation mode for edge 1602 enhancement</description> 1603 <details>Edge/sharpness/detail enhancement. OFF means no 1604 enhancement will be applied by the HAL. 1605 1606 FAST/HIGH_QUALITY both mean camera device determined enhancement 1607 will be applied. HIGH_QUALITY mode indicates that the 1608 camera device will use the highest-quality enhancement algorithms, 1609 even if it slows down capture rate. 
FAST means the camera device will 1610 not slow down capture rate when applying edge enhancement.</details> 1611 </entry> 1612 <entry name="strength" type="byte"> 1613 <description>Control the amount of edge enhancement 1614 applied to the images</description> 1615 <units>1-10; 10 is maximum sharpening</units> 1616 </entry> 1617 </controls> 1618 <dynamic> 1619 <clone entry="android.edge.mode" kind="controls"></clone> 1620 </dynamic> 1621 </section> 1622 <section name="flash"> 1623 <controls> 1624 <entry name="firingPower" type="byte"> 1625 <description>Power for flash firing/torch</description> 1626 <units>10 is max power; 0 is no flash. Linear</units> 1627 <range>0 - 10</range> 1628 <details>Power for snapshot may use a different scale than 1629 for torch mode. Only one entry for torch mode will be 1630 used.</details> 1631 <tag id="V1" /> 1632 </entry> 1633 <entry name="firingTime" type="int64"> 1634 <description>Firing time of flash relative to start of 1635 exposure</description> 1636 <units>nanoseconds</units> 1637 <range>0-(exposure time-flash duration)</range> 1638 <details>Clamped to (0, exposure time - flash 1639 duration).</details> 1640 <tag id="V1" /> 1641 </entry> 1642 <entry name="mode" type="byte" visibility="public" enum="true"> 1643 <enum> 1644 <value>OFF 1645 <notes> 1646 Do not fire the flash for this capture. 1647 </notes> 1648 </value> 1649 <value>SINGLE 1650 <notes> 1651 If the flash is available and charged, fire flash 1652 for this capture based on android.flash.firingPower and 1653 android.flash.firingTime. 1654 </notes> 1655 </value> 1656 <value>TORCH 1657 <notes> 1658 Transition flash to continuously on. 1659 </notes> 1660 </value> 1661 </enum> 1662 <description>The desired mode for the camera device's flash control.</description> 1663 <details> 1664 This control is only effective when a flash unit is available 1665 (`android.flash.info.available == true`). 
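This entry's interaction with auto-exposure (spelled out below: the AE flash modes override this control) can be sketched as a small decision function. This is illustrative pseudologic with string placeholders for the enum names, not framework API:

```java
// Illustrative sketch of the flash.mode precedence rule: the AE flash
// modes (ON_AUTO_FLASH and friends) override the application's
// android.flash.mode setting, and a missing flash unit disables it.
public class FlashPrecedenceSketch {
    public static boolean flashModeHonored(String aeMode, boolean flashAvailable) {
        if (!flashAvailable) {
            return false; // no flash unit: flash controls do nothing
        }
        // Only AE modes ON and OFF leave flash control to android.flash.mode.
        return "ON".equals(aeMode) || "OFF".equals(aeMode);
    }
}
```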
          When this control is used, android.control.aeMode must be set to ON or OFF.
          Otherwise, the camera device's auto-exposure-related flash control (ON_AUTO_FLASH,
          ON_ALWAYS_FLASH, or ON_AUTO_FLASH_REDEYE) will override this control.

          When set to OFF, the camera device will not fire the flash for this capture.

          When set to SINGLE, the camera device will fire the flash regardless of the camera
          device's auto-exposure routine's result. When used for still capture, this
          control should be used along with the AE precapture metering sequence
          (android.control.aePrecaptureTrigger); otherwise, the image may be incorrectly exposed.

          When set to TORCH, the flash will be on continuously. This mode can be used
          for use cases such as preview, auto-focus assist, still capture, or video recording.

          The flash status will be reported by android.flash.state in the capture result metadata.
          </details>
          <tag id="BC" />
        </entry>
      </controls>
      <static>
        <namespace name="info">
          <entry name="available" type="byte" visibility="public" enum="true" typedef="boolean">
            <enum>
              <value>FALSE</value>
              <value>TRUE</value>
            </enum>
            <description>Whether this camera device has a
            flash.</description>
            <details>If no flash, none of the flash controls do
            anything. All other metadata should return 0.</details>
            <tag id="BC" />
          </entry>
          <entry name="chargeDuration" type="int64">
            <description>Time taken before flash can fire
            again</description>
            <units>nanoseconds</units>
            <range>0-1e9</range>
            <details>1 second too long/too short for recharge?
            Should this be power-dependent?</details>
            <tag id="V1" />
          </entry>
        </namespace>
        <entry name="colorTemperature" type="byte">
          <description>The x,y whitepoint of the
          flash</description>
          <units>pair of floats</units>
          <range>0-1 for both</range>
          <tag id="ADV" />
        </entry>
        <entry name="maxEnergy" type="byte">
          <description>Max energy output of the flash for a full
          power single flash</description>
          <units>lumen-seconds</units>
          <range>&gt;= 0</range>
          <tag id="ADV" />
        </entry>
      </static>
      <dynamic>
        <clone entry="android.flash.firingPower" kind="controls">
        </clone>
        <clone entry="android.flash.firingTime" kind="controls">
        </clone>
        <clone entry="android.flash.mode" kind="controls"></clone>
        <entry name="state" type="byte" visibility="public" enum="true">
          <enum>
            <value>UNAVAILABLE
            <notes>No flash on camera</notes></value>
            <value>CHARGING
            <notes>If android.flash.info.available is true, the flash is
            charging and cannot be fired</notes></value>
            <value>READY
            <notes>If android.flash.info.available is true, the flash is
            ready to fire</notes></value>
            <value>FIRED
            <notes>If android.flash.info.available is true, the flash fired
            for this capture</notes></value>
          </enum>
          <description>Current state of the flash
          unit.</description>
          <details>
          When the camera device doesn't have a flash unit
          (i.e. `android.flash.info.available == false`), this state will always be UNAVAILABLE.
          Other states indicate the current flash status.
          </details>
        </entry>
      </dynamic>
    </section>
    <section name="hotPixel">
      <controls>
        <entry name="mode" type="byte" visibility="public" enum="true">
          <enum>
            <value>OFF
              <notes>
              The frame rate must not be reduced relative to sensor raw output
              for this option.

              No hot pixel correction is applied.
              </notes>
            </value>
            <value>FAST
              <notes>
              The frame rate must not be reduced relative to sensor raw output
              for this option.

              Hot pixel correction is applied.
              </notes>
            </value>
            <value>HIGH_QUALITY
              <notes>
              The frame rate may be reduced relative to sensor raw output
              for this option.

              A high-quality hot pixel correction is applied.
              </notes>
            </value>
          </enum>
          <description>
          Set operational mode for hot pixel correction.

          Hot pixel correction interpolates out, or otherwise removes, pixels
          that do not accurately encode the incoming light (i.e. pixels that
          are stuck at an arbitrary value).
          </description>
          <tag id="V1" />
        </entry>
      </controls>
      <dynamic>
        <entry name="map" type="int32" visibility="public"
               type_notes="list of coordinates based on android.sensor.pixelArraySize"
               container="array">
          <array>
            <size>2</size>
            <size>n</size>
          </array>
          <description>
          List of `(x, y)` coordinates of hot/defective pixels on the
          sensor, where `(x, y)` lies between `(0, 0)`, which is the top-left
          of the pixel array, and the (width, height) of the pixel array given in
          android.sensor.info.pixelArraySize. This may include hot pixels
          that lie outside of the active array bounds given by
          android.sensor.activeArraySize.
          </description>
          <range>
          n &lt;= number of pixels on the sensor.
          The `(x, y)` coordinates must be bounded by
          android.sensor.info.pixelArraySize.
          </range>
          <hal_details>
          A hot pixel map contains the coordinates of pixels on the camera
          sensor that do not report valid values (usually due to defects in
          the camera sensor). This includes pixels that are stuck at certain
          values, or have a response that does not accurately encode the
          incoming light from the scene.
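          A minimal bounds check for such a map might look like this sketch
          (illustrative helper; the flat `[x0, y0, x1, y1, ...]` layout and
          exclusive upper bound are assumptions, not a statement of the HAL's
          in-memory layout):

```java
// Sketch only: validates hot pixel coordinates against the pixel array
// bounds described in this entry. Layout and bound conventions here are
// illustrative assumptions.
public class HotPixelMapCheck {
    public static boolean isValid(int[] coords, int width, int height) {
        if (coords == null || coords.length % 2 != 0) {
            return false; // must be a whole number of (x, y) pairs
        }
        for (int i = 0; i < coords.length; i += 2) {
            int x = coords[i];
            int y = coords[i + 1];
            if (x < 0 || y < 0 || x >= width || y >= height) {
                return false; // coordinate outside the pixel array
            }
        }
        return true;
    }
}
```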
          To avoid performance issues, there should be significantly fewer hot
          pixels than actual pixels on the camera sensor.
          </hal_details>
          <tag id="ADV" />
        </entry>
        <clone entry="android.hotPixel.mode" kind="controls">
          <tag id="V1" />
        </clone>
      </dynamic>
    </section>
    <section name="jpeg">
      <controls>
        <entry name="gpsCoordinates" type="double" visibility="public"
               type_notes="latitude, longitude, altitude. First two in degrees, the third in meters"
               container="array">
          <array>
            <size>3</size>
          </array>
          <description>GPS coordinates to include in output JPEG
          EXIF</description>
          <range>(-180, 180], [-90, 90], [-inf, inf]</range>
          <tag id="BC" />
        </entry>
        <entry name="gpsProcessingMethod" type="byte" visibility="public"
               typedef="string">
          <description>32 characters describing the GPS algorithm to
          include in EXIF</description>
          <units>UTF-8 null-terminated string</units>
          <tag id="BC" />
        </entry>
        <entry name="gpsTimestamp" type="int64" visibility="public">
          <description>Time the GPS fix was made, to include in
          EXIF</description>
          <units>UTC in seconds since January 1, 1970</units>
          <tag id="BC" />
        </entry>
        <entry name="orientation" type="int32" visibility="public">
          <description>Orientation of the JPEG image to
          write</description>
          <units>Degrees in multiples of 90</units>
          <range>0, 90, 180, 270</range>
          <tag id="BC" />
        </entry>
        <entry name="quality" type="byte" visibility="public">
          <description>Compression quality of the final JPEG
          image</description>
          <range>1-100; larger is higher quality</range>
          <details>85-95 is the typical usage range</details>
          <tag id="BC" />
        </entry>
        <entry name="thumbnailQuality" type="byte" visibility="public">
          <description>Compression quality of the JPEG
          thumbnail</description>
          <range>1-100; larger is higher
          quality</range>
          <tag id="BC" />
        </entry>
        <entry name="thumbnailSize" type="int32" visibility="public"
               container="array" typedef="size">
          <array>
            <size>2</size>
          </array>
          <description>Resolution of the embedded JPEG thumbnail</description>
          <range>Size must be one of the sizes from android.jpeg.availableThumbnailSizes</range>
          <details>When set to (0, 0), the EXIF data of the captured JPEG will not
          contain a thumbnail, but the captured JPEG will still be a valid image.

          When a JPEG image capture is issued, the thumbnail size selected should have
          the same aspect ratio as the JPEG image.</details>
          <tag id="BC" />
        </entry>
      </controls>
      <static>
        <entry name="availableThumbnailSizes" type="int32" visibility="public"
               container="array" typedef="size">
          <array>
            <size>2</size>
            <size>n</size>
          </array>
          <description>Supported resolutions for the JPEG thumbnail</description>
          <range>Will include at least one valid resolution, plus
          (0,0) for no thumbnail generation, and each size will be distinct.</range>
          <details>The following conditions will be satisfied for this size list:

          * The sizes will be sorted by increasing pixel area (width x height).
          If several resolutions have the same area, they will be sorted by increasing width.
          * The aspect ratio of the largest thumbnail size will be the same as the
          aspect ratio of the largest JPEG output size in android.scaler.availableStreamConfigurations.
          The largest size is defined as the size that has the largest pixel area
          in a given size list.
          * Each output JPEG size in android.scaler.availableStreamConfigurations will have at least
          one corresponding size that has the same aspect ratio in availableThumbnailSizes,
          and vice versa.
          * All non-(0, 0) sizes will have non-zero widths and heights.</details>
          <tag id="BC" />
        </entry>
        <entry name="maxSize" type="int32" visibility="system">
          <description>Maximum size in bytes for the compressed
          JPEG buffer</description>
          <range>Must be large enough to fit any JPEG produced by
          the camera</range>
          <details>This is used for sizing the gralloc buffers for
          JPEG</details>
        </entry>
      </static>
      <dynamic>
        <clone entry="android.jpeg.gpsCoordinates" kind="controls">
        </clone>
        <clone entry="android.jpeg.gpsProcessingMethod"
               kind="controls"></clone>
        <clone entry="android.jpeg.gpsTimestamp" kind="controls">
        </clone>
        <clone entry="android.jpeg.orientation" kind="controls">
        </clone>
        <clone entry="android.jpeg.quality" kind="controls">
        </clone>
        <entry name="size" type="int32">
          <description>The size of the compressed JPEG image, in
          bytes</description>
          <range>&gt;= 0</range>
          <details>If no JPEG output is produced for the request,
          this must be 0.

          Otherwise, this describes the real size of the compressed
          JPEG image placed in the output stream.
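          As a client-side sketch, the padded stream buffer can be trimmed
          down to the valid compressed bytes (the helper name is
          illustrative, not part of the camera API):

```java
import java.util.Arrays;

// Sketch only: trims the fixed-size JPEG stream buffer down to the valid
// compressed bytes using the reported android.jpeg.size value.
public class JpegBufferUtil {
    public static byte[] trim(byte[] streamBuffer, int jpegSize) {
        if (jpegSize < 0 || jpegSize > streamBuffer.length) {
            throw new IllegalArgumentException("bad android.jpeg.size: " + jpegSize);
        }
        // Keep only the first jpegSize bytes; the rest is padding.
        return Arrays.copyOf(streamBuffer, jpegSize);
    }
}
```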
          More specifically,
          if android.jpeg.maxSize = 1000000, and a specific capture
          has android.jpeg.size = 500000, then the output buffer from
          the JPEG stream will be 1000000 bytes, of which the first
          500000 make up the real data.</details>
        </entry>
        <clone entry="android.jpeg.thumbnailQuality"
               kind="controls"></clone>
        <clone entry="android.jpeg.thumbnailSize" kind="controls">
        </clone>
      </dynamic>
    </section>
    <section name="lens">
      <controls>
        <entry name="aperture" type="float" visibility="public">
          <description>The ratio of lens focal length to the effective
          aperture diameter.</description>
          <units>f-number (f/NNN)</units>
          <range>android.lens.info.availableApertures</range>
          <details>This will only be supported on camera devices that
          have a variable-aperture lens. The aperture value can only be
          one of the values listed in android.lens.info.availableApertures.

          When this is supported and android.control.aeMode is OFF,
          this can be set along with android.sensor.exposureTime,
          android.sensor.sensitivity, and android.sensor.frameDuration
          to achieve manual exposure control.

          The aperture may take several frames to reach the
          requested value; the camera device will report the current (intermediate)
          aperture size in capture result metadata while the aperture is changing.
          While the aperture is still changing, android.lens.state will be set to MOVING.

          When this is supported and android.control.aeMode is one of
          the ON modes, this will be overridden by the camera device's
          auto-exposure algorithm; the overridden values are then provided
          back to the user in the corresponding result.</details>
          <tag id="V1" />
        </entry>
        <entry name="filterDensity" type="float" visibility="public">
          <description>
          State of lens neutral density filter(s).
          </description>
          <units>Steps of Exposure Value (EV).</units>
          <range>android.lens.info.availableFilterDensities</range>
          <details>
          This will not be supported on most camera devices. On devices
          where this is supported, this may only be set to one of the
          values included in android.lens.info.availableFilterDensities.

          Lens filters are typically used to lower the amount of light the
          sensor is exposed to (measured in steps of EV). As used here, an EV
          step is the standard logarithmic representation, which is
          non-negative and inversely proportional to the amount of light
          hitting the sensor. For example, setting this to 0 would result
          in no reduction of the incoming light, and setting this to 2 would
          mean that the filter is set to reduce incoming light by two stops
          (allowing 1/4 of the prior amount of light to the sensor).

          It may take several frames before the lens filter density changes
          to the requested value. While the filter density is still changing,
          android.lens.state will be set to MOVING.
          </details>
          <tag id="V1" />
        </entry>
        <entry name="focalLength" type="float" visibility="public">
          <description>
          The current lens focal length; used for optical zoom.
          </description>
          <units>focal length in mm</units>
          <range>android.lens.info.availableFocalLengths</range>
          <details>
          This setting controls the physical focal length of the camera
          device's lens. Changing the focal length changes the field of
          view of the camera device, and is usually used for optical zoom.

          Like android.lens.focusDistance and android.lens.aperture, this
          setting won't be applied instantaneously, and it may take several
          frames before the lens can change to the requested focal length.
          While the focal length is still changing, android.lens.state will
          be set to MOVING.
          This is not expected to be supported on most devices.
          </details>
          <tag id="V1" />
        </entry>
        <entry name="focusDistance" type="float" visibility="public">
          <description>Distance to the plane of sharpest focus,
          measured from the frontmost surface of the lens</description>
          <units>See android.lens.info.focusDistanceCalibration for details.</units>
          <range>&gt;= 0</range>
          <details>0 means infinity focus. The value used will be clamped
          to [0, android.lens.info.minimumFocusDistance].

          Like android.lens.focalLength, this setting won't be applied
          instantaneously, and it may take several frames before the lens
          can move to the requested focus distance. While the lens is still moving,
          android.lens.state will be set to MOVING.
          </details>
          <tag id="BC" />
          <tag id="V1" />
        </entry>
        <entry name="opticalStabilizationMode" type="byte" visibility="public"
               enum="true">
          <enum>
            <value>OFF
              <notes>Optical stabilization is unavailable.</notes>
            </value>
            <value optional="true">ON
              <notes>Optical stabilization is enabled.</notes>
            </value>
          </enum>
          <description>
          Sets whether the camera device uses optical image stabilization (OIS)
          when capturing images.
          </description>
          <range>android.lens.info.availableOpticalStabilization</range>
          <details>
          OIS is used to compensate for motion blur due to small movements of
          the camera during capture. Unlike digital image stabilization, OIS makes
          use of mechanical elements to stabilize the camera sensor, and thus
          allows for longer exposure times before camera shake becomes
          apparent.

          This is not expected to be supported on most devices.
          </details>
          <tag id="V1" />
        </entry>
      </controls>
      <static>
        <namespace name="info">
          <entry name="availableApertures" type="float" visibility="public"
                 container="array">
            <array>
              <size>n</size>
            </array>
            <description>List of supported aperture
            values.</description>
            <range>one entry required, &gt; 0</range>
            <details>If the camera device doesn't support variable apertures,
            the listed value will be the fixed aperture.

            If the camera device supports variable apertures, the aperture values
            in this list will be sorted in ascending order.</details>
            <tag id="V1" />
          </entry>
          <entry name="availableFilterDensities" type="float" visibility="public"
                 container="array">
            <array>
              <size>n</size>
            </array>
            <description>
            List of supported neutral density filter values for
            android.lens.filterDensity.
            </description>
            <range>
            At least one value is required. Values must be &gt;= 0.
            </range>
            <details>
            If changing android.lens.filterDensity is not supported,
            availableFilterDensities must contain only 0. Otherwise, this
            list contains only the exact filter density values available on
            this camera device.
            </details>
            <tag id="V1" />
          </entry>
          <entry name="availableFocalLengths" type="float" visibility="public"
                 type_notes="The list of available focal lengths"
                 container="array">
            <array>
              <size>n</size>
            </array>
            <description>
            The available focal lengths for this device for use with
            android.lens.focalLength.
            </description>
            <range>
            Each value in this list must be &gt; 0. This list must
            contain at least one value.
            </range>
            <details>
            If optical zoom is not supported, this will only report
            a single value corresponding to the static focal length of the
            device.
            Otherwise, this will report every focal length supported
            by the device.
            </details>
            <tag id="BC" />
            <tag id="V1" />
          </entry>
          <entry name="availableOpticalStabilization" type="byte"
                 visibility="public" type_notes="list of enums" container="array">
            <array>
              <size>n</size>
            </array>
            <description>
            List containing a subset of the optical image
            stabilization (OIS) modes specified in
            android.lens.opticalStabilizationMode.
            </description>
            <details>
            If OIS is not implemented for a given camera device, this should
            contain only OFF.
            </details>
            <tag id="V1" />
          </entry>
          <entry name="hyperfocalDistance" type="float" visibility="public" optional="true">
            <description>Optional. Hyperfocal distance for this lens.</description>
            <units>See android.lens.info.focusDistanceCalibration for details.</units>
            <range>&gt;= 0</range>
            <details>If the lens is fixed focus, the camera device will report 0.

            If the lens is not fixed focus, the camera device will report this
            field when android.lens.info.focusDistanceCalibration is APPROXIMATE or CALIBRATED.
            </details>
          </entry>
          <entry name="minimumFocusDistance" type="float" visibility="public">
            <description>Shortest distance from the frontmost surface
            of the lens that can be focused correctly.</description>
            <units>See android.lens.info.focusDistanceCalibration for details.</units>
            <range>&gt;= 0</range>
            <details>If the lens is fixed-focus, this should be
            0.</details>
            <tag id="V1" />
          </entry>
          <entry name="shadingMapSize" type="int32" visibility="public"
                 type_notes="width and height of lens shading map provided by the HAL.
                 (N x M)"
                 container="array" typedef="size">
            <array>
              <size>2</size>
            </array>
            <description>Dimensions of lens shading map.</description>
            <range>Both values &gt;= 1</range>
            <details>
            The map should be on the order of 30-40 rows and columns, and
            must be smaller than 64x64.
            </details>
            <tag id="V1" />
          </entry>
          <entry name="focusDistanceCalibration" type="byte" visibility="public" enum="true">
            <enum>
              <value>UNCALIBRATED
                <notes>
                The lens focus distance is not accurate, and the units used for
                android.lens.focusDistance do not correspond to any physical units.
                Setting the lens to the same focus distance on separate occasions may
                result in a different real focus distance, depending on factors such
                as the orientation of the device, the age of the focusing mechanism,
                and the device temperature. The focus distance value will still be
                in the range of `[0, android.lens.info.minimumFocusDistance]`, where 0
                represents the farthest focus.
                </notes>
              </value>
              <value>APPROXIMATE
                <notes>
                The lens focus distance is measured in diopters. However, setting the lens
                to the same focus distance on separate occasions may result in a
                different real focus distance, depending on factors such as the
                orientation of the device, the age of the focusing mechanism, and
                the device temperature.
                </notes>
              </value>
              <value>CALIBRATED
                <notes>
                The lens focus distance is measured in diopters. The lens mechanism is
                calibrated so that setting the same focus distance is repeatable on
                multiple occasions with good accuracy, and the focus distance corresponds
                to the real physical distance to the plane of best focus.
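                Since diopters are reciprocal meters, a calibrated focus
                distance converts to a physical distance as in this sketch
                (the helper name is illustrative, not part of the camera API):

```java
// Sketch only: converts a focus distance expressed in diopters to a
// physical distance in meters. By definition, diopters = 1 / meters,
// and 0 diopters means infinity focus.
public class FocusDistanceUtil {
    public static double diopterToMeters(double diopters) {
        if (diopters < 0) {
            throw new IllegalArgumentException("focus distance must be >= 0");
        }
        return diopters == 0 ? Double.POSITIVE_INFINITY : 1.0 / diopters;
    }
}
```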
                </notes>
              </value>
            </enum>
            <description>The lens focus distance calibration quality.</description>
            <details>
            The lens focus distance calibration quality determines the reliability of
            focus-related metadata entries, i.e. android.lens.focusDistance,
            android.lens.focusRange, android.lens.info.hyperfocalDistance, and
            android.lens.info.minimumFocusDistance.
            </details>
            <tag id="V1" />
          </entry>
        </namespace>
        <entry name="facing" type="byte" visibility="public" enum="true">
          <enum>
            <value>FRONT</value>
            <value>BACK</value>
          </enum>
          <description>Direction the camera faces relative to the
          device screen</description>
        </entry>
        <entry name="opticalAxisAngle" type="float"
               type_notes="degrees. First defines the angle of separation between the perpendicular to the screen and the camera optical axis. The second then defines the clockwise rotation of the optical axis from native device up."
               container="array">
          <array>
            <size>2</size>
          </array>
          <description>Relative angle of the camera optical axis to the
          perpendicular axis from the display</description>
          <range>[0-90) for first angle, [0-360) for second</range>
          <details>Examples:

          (0,0) means that the camera optical axis
          is perpendicular to the display surface;

          (45,0) means that the camera points 45 degrees up when
          the device is held upright;

          (45,90) means the camera points 45 degrees to the right when
          the device is held upright.
          Use the FACING field to determine the perpendicular outgoing
          direction.</details>
          <tag id="ADV" />
        </entry>
        <entry name="position" type="float" container="array">
          <array>
            <size>3, location in mm, in the sensor coordinate
            system</size>
          </array>
          <description>Coordinates of the camera optical axis on the
          device</description>
          <tag id="V1" />
        </entry>
      </static>
      <dynamic>
        <clone entry="android.lens.aperture" kind="controls">
          <tag id="V1" />
        </clone>
        <clone entry="android.lens.filterDensity" kind="controls">
          <tag id="V1" />
        </clone>
        <clone entry="android.lens.focalLength" kind="controls">
          <tag id="BC" />
        </clone>
        <clone entry="android.lens.focusDistance" kind="controls">
          <details>Should be zero for fixed-focus cameras</details>
          <tag id="BC" />
        </clone>
        <entry name="focusRange" type="float" visibility="public"
               type_notes="Range of scene distances that are in focus"
               container="array">
          <array>
            <size>2</size>
          </array>
          <description>The range of scene distances that are in
          sharp focus (depth of field)</description>
          <units>pair of focus distances in diopters: (near,
          far), see android.lens.info.focusDistanceCalibration for details.</units>
          <range>&gt;= 0</range>
          <details>If variable focus is not supported, a fixed
          depth-of-field range can still be reported</details>
          <tag id="BC" />
        </entry>
        <clone entry="android.lens.opticalStabilizationMode"
               kind="controls">
          <tag id="V1" />
        </clone>
        <entry name="state" type="byte" visibility="public" enum="true">
          <enum>
            <value>STATIONARY
              <notes>
              The lens parameters (android.lens.focalLength, android.lens.focusDistance,
              android.lens.filterDensity, and android.lens.aperture) are not changing.
              </notes>
            </value>
            <value>MOVING
              <notes>
              Any of the lens parameters (android.lens.focalLength, android.lens.focusDistance,
              android.lens.filterDensity, or android.lens.aperture) is changing.
              </notes>
            </value>
          </enum>
          <description>Current lens status.</description>
          <details>
          For the lens parameters android.lens.focalLength, android.lens.focusDistance,
          android.lens.filterDensity, and android.lens.aperture, when changes are requested,
          they may take several frames to reach the requested values. This state indicates
          the current status of the lens parameters.

          When the state is STATIONARY, the lens parameters are not changing. This could be
          either because the parameters are all fixed, or because the lens has had enough
          time to reach the most recently-requested values.
          If none of these lens parameters is changeable for a camera device, that is:

          * Fixed focus (`android.lens.info.minimumFocusDistance == 0`), which means the
          android.lens.focusDistance parameter will always be 0.
          * Fixed focal length (android.lens.info.availableFocalLengths contains a single value),
          which means optical zoom is not supported.
          * No ND filter (android.lens.info.availableFilterDensities contains only 0).
          * Fixed aperture (android.lens.info.availableApertures contains a single value).

          then this state will always be STATIONARY.

          When the state is MOVING, it indicates that at least one of the lens parameters
          is changing.
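          The fixed-lens conditions above can be sketched as a single
          predicate (parameter names are illustrative stand-ins for the
          corresponding static metadata entries):

```java
// Sketch only: derives, from the static metadata conditions listed in
// this entry, whether the lens state can ever report MOVING.
public class LensStateCheck {
    public static boolean alwaysStationary(float minimumFocusDistance,
                                           float[] availableFocalLengths,
                                           float[] availableFilterDensities,
                                           float[] availableApertures) {
        boolean fixedFocus = minimumFocusDistance == 0f;       // fixed-focus lens
        boolean fixedFocalLength = availableFocalLengths.length == 1; // no optical zoom
        boolean noNdFilter = availableFilterDensities.length == 1
                && availableFilterDensities[0] == 0f;          // no ND filter
        boolean fixedAperture = availableApertures.length == 1;
        return fixedFocus && fixedFocalLength && noNdFilter && fixedAperture;
    }
}
```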
          </details>
          <tag id="V1" />
        </entry>
      </dynamic>
    </section>
    <section name="noiseReduction">
      <controls>
        <entry name="mode" type="byte" visibility="public" enum="true">
          <enum>
            <value>OFF
            <notes>No noise reduction is applied</notes></value>
            <value>FAST
            <notes>Must not slow down frame rate relative to sensor
            output</notes></value>
            <value>HIGH_QUALITY
            <notes>May slow down frame rate to provide highest
            quality</notes></value>
          </enum>
          <description>Mode of operation for the noise reduction
          algorithm</description>
          <range>android.noiseReduction.availableModes</range>
          <details>Noise filtering control. OFF means no noise reduction
          will be applied by the HAL.

          FAST/HIGH_QUALITY both mean camera device determined noise filtering
          will be applied. HIGH_QUALITY mode indicates that the camera device
          will use the highest-quality noise filtering algorithms,
          even if it slows down capture rate. FAST means the camera device should not
          slow down capture rate when applying noise filtering.</details>
          <tag id="V1" />
        </entry>
        <entry name="strength" type="byte">
          <description>Control the amount of noise reduction
          applied to the images</description>
          <units>1-10; 10 is max noise reduction</units>
          <range>1 - 10</range>
        </entry>
      </controls>
      <dynamic>
        <clone entry="android.noiseReduction.mode" kind="controls">
        </clone>
      </dynamic>
    </section>
    <section name="quirks">
      <static>
        <entry name="meteringCropRegion" type="byte" visibility="system" optional="true">
          <description>If set to 1, the camera service does not
          scale 'normalized' coordinates with respect to the crop
          region. This applies to metering input (a{e,f,wb}Region)
          and output (face rectangles).</description>
          <range>**Deprecated**.
          Do not use.</range>
          <details>Normalized coordinates refer to those in the
          (-1000,1000) range mentioned in the
          android.hardware.Camera API.

          HAL implementations should instead always use and emit
          sensor array-relative coordinates for all region data. Does
          not need to be listed in static metadata. Support will be
          removed in future versions of camera service.</details>
        </entry>
        <entry name="triggerAfWithAuto" type="byte" visibility="system" optional="true">
          <description>If set to 1, then the camera service always
          switches to FOCUS_MODE_AUTO before issuing an AF
          trigger.</description>
          <range>**Deprecated**. Do not use.</range>
          <details>HAL implementations should implement AF trigger
          modes for AUTO, MACRO, CONTINUOUS_FOCUS, and
          CONTINUOUS_PICTURE modes instead of using this flag. Does
          not need to be listed in static metadata. Support will be
          removed in future versions of camera service.</details>
        </entry>
        <entry name="useZslFormat" type="byte" visibility="system" optional="true">
          <description>If set to 1, the camera service uses
          CAMERA2_PIXEL_FORMAT_ZSL instead of
          HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED for the zero
          shutter lag stream</description>
          <range>**Deprecated**. Do not use.</range>
          <details>HAL implementations should use gralloc usage flags
          to determine that a stream will be used for
          zero-shutter-lag, instead of relying on an explicit
          format setting. Does not need to be listed in static
          metadata. Support will be removed in future versions of
          camera service.</details>
        </entry>
        <entry name="usePartialResult" type="byte" visibility="hidden" optional="true">
          <description>
          If set to 1, the HAL will always split result
          metadata for a single capture into multiple buffers,
          returned using multiple process_capture_result calls.
          </description>
          <range>**Deprecated**.
          Do not use.</range>
          <details>
          Does not need to be listed in static
          metadata. Support for partial results will be reworked in
          future versions of camera service. This quirk will stop
          working at that point; DO NOT USE without careful
          consideration of future support.
          </details>
          <hal_details>
          Refer to `camera3_capture_result::partial_result`
          for information on how to implement partial results.
          </hal_details>
        </entry>
      </static>
      <dynamic>
        <entry name="partialResult" type="byte" visibility="hidden" optional="true" enum="true" typedef="boolean">
          <enum>
            <value>FINAL
              <notes>The last or only metadata result buffer
              for this capture.</notes>
            </value>
            <value>PARTIAL
              <notes>A partial buffer of result metadata for this
              capture. More result buffers for this capture will be sent
              by the HAL, the last of which will be marked
              FINAL.</notes>
            </value>
          </enum>
          <description>
          Whether a result given to the framework is the
          final one for the capture, or only a partial that contains a
          subset of the full set of dynamic metadata
          values.</description>
          <range>**Deprecated**. Do not use. Optional. Default value is FINAL.</range>
          <details>
          The entries in the result metadata buffers for a
          single capture may not overlap, except for this entry. The
          FINAL buffers must retain FIFO ordering relative to the
          requests that generate them, so the FINAL buffer for frame 3 must
          always be sent to the framework after the FINAL buffer for frame 2, and
          before the FINAL buffer for frame 4. PARTIAL buffers may be returned
          in any order relative to other frames, but all PARTIAL buffers for a given
          capture must arrive before the FINAL buffer for that capture. This entry may
          only be used by the HAL if quirks.usePartialResult is set to 1.
2471 </details> 2472 <hal_details> 2473 Refer to `camera3_capture_result::partial_result` 2474 for information on how to implement partial results. 2475 </hal_details> 2476 </entry> 2477 </dynamic> 2478 </section> 2479 <section name="request"> 2480 <controls> 2481 <entry name="frameCount" type="int32" visibility="system"> 2482 <description>A frame counter set by the framework. Must 2483 be maintained unchanged in the output frame. This value monotonically 2484 increases with every new result (that is, each new result has a unique 2485 frameCount value). 2486 </description> 2487 <units>incrementing integer</units> 2488 <range>**Deprecated**. Do not use. Any int.</range> 2489 </entry> 2490 <entry name="id" type="int32" visibility="hidden"> 2491 <description>An application-specified ID for the current 2492 request. Must be maintained unchanged in the output 2493 frame.</description> 2494 <units>arbitrary integer assigned by application</units> 2495 <range>Any int</range> 2496 <tag id="V1" /> 2497 </entry> 2498 <entry name="inputStreams" type="int32" visibility="system" 2499 container="array"> 2500 <array> 2501 <size>n</size> 2502 </array> 2503 <description>Lists which camera reprocess stream is used 2504 as the source of reprocessing data.</description> 2505 <units>List of camera reprocess stream IDs</units> 2506 <range>**Deprecated**. Do not use. 2507 2508 Typically only one entry is allowed; it must be a valid reprocess stream ID. 2509 2510 If android.jpeg.needsThumbnail is set, then multiple 2511 reprocess streams may be included in a single request; they 2512 must be different scaled versions of the same image.</range> 2513 <details>Only meaningful when android.request.type == 2514 REPROCESS. Ignored otherwise.</details> 2515 <tag id="HAL2" /> 2516 </entry> 2517 <entry name="metadataMode" type="byte" visibility="system" 2518 enum="true"> 2519 <enum> 2520 <value>NONE 2521 <notes>No metadata should be produced on output, except 2522 for application-bound buffer data.
If no 2523 application-bound streams exist, no frame should be 2524 placed in the output frame queue. If such streams 2525 exist, a frame should be placed on the output queue 2526 with null metadata but with the necessary output buffer 2527 information. Timestamp information should still be 2528 included with any output stream buffers.</notes></value> 2529 <value>FULL 2530 <notes>All metadata should be produced. Statistics will 2531 only be produced if they are separately 2532 enabled.</notes></value> 2533 </enum> 2534 <description>How much metadata to produce on 2535 output.</description> 2536 </entry> 2537 <entry name="outputStreams" type="int32" visibility="system" 2538 container="array"> 2539 <array> 2540 <size>n</size> 2541 </array> 2542 <description>Lists the camera output streams to which image data 2543 from this capture must be sent.</description> 2544 <units>List of camera stream IDs</units> 2545 <range>**Deprecated**. Do not use. The list must only include streams that have been 2546 created.</range> 2547 <details>If no output streams are listed, then the image 2548 data should simply be discarded. The image data must 2549 still be captured for metadata and statistics production, 2550 and the lens and flash must operate as requested.</details> 2551 <tag id="HAL2" /> 2552 </entry> 2553 <entry name="type" type="byte" visibility="system" enum="true"> 2554 <enum> 2555 <value>CAPTURE 2556 <notes>Capture a new image from the imaging hardware, 2557 and process it according to the 2558 settings.</notes></value> 2559 <value>REPROCESS 2560 <notes>Process previously captured data; the 2561 android.request.inputStreams parameter determines the 2562 source reprocessing stream. TODO: Mark dynamic metadata 2563 needed for reprocessing with [RP]</notes></value> 2564 </enum> 2565 <description>The type of the request; either CAPTURE or 2566 REPROCESS.
For HAL3, this tag is redundant.</description> 2567 <tag id="HAL2" /> 2568 </entry> 2569 </controls> 2570 <static> 2571 <entry name="maxNumOutputStreams" type="int32" visibility="public" 2572 container="array"> 2573 <array> 2574 <size>3</size> 2575 </array> 2576 <description>The maximum number of different types of output streams 2577 that can be configured and used simultaneously by a camera device. 2578 </description> 2579 <range> 2580 &gt;= 1 for JPEG-compressed format streams. 2581 2582 &gt;= 0 for Raw format streams. 2583 2584 &gt;= 3 for processed, uncompressed format streams. 2585 </range> 2586 <details> 2587 This is a 3-element tuple that contains the maximum number of simultaneous 2588 output streams for raw sensor, processed (and uncompressed), and JPEG formats respectively. 2589 For example, if the maximum number of raw sensor output streams is 1, of YUV streams 2590 is 3, and of JPEG streams is 2, then this tuple should be `(1, 3, 2)`. 2591 2592 This lists the upper bound of the number of output streams supported by 2593 the camera device. Using more streams simultaneously may require more hardware and 2594 CPU resources and consume more power. The image format for an output stream can 2595 be any supported format provided by android.scaler.availableFormats. The formats 2596 defined in android.scaler.availableFormats can be categorized into the 3 stream types 2597 as follows: 2598 2599 * JPEG-compressed format: BLOB. 2600 * Raw formats: RAW_SENSOR and RAW_OPAQUE. 2601 * Processed, uncompressed formats: YCbCr_420_888, YCrCb_420_SP, YV12. 2602 </details> 2603 <tag id="BC" /> 2604 </entry> 2605 <entry name="maxNumReprocessStreams" type="int32" visibility="system" 2606 container="array"> 2607 <array> 2608 <size>1</size> 2609 </array> 2610 <description>How many reprocessing streams of any type 2611 can be allocated at the same time.</description> 2612 <range>&gt;= 0</range> 2613 <details> 2614 **Deprecated**. Only used by HAL2.x.
2615 2616 When set to 0, it means no reprocess stream is supported. 2617 </details> 2618 <tag id="HAL2" /> 2619 </entry> 2620 <entry name="maxNumInputStreams" type="int32" visibility="public"> 2621 <description> 2622 The maximum number of input streams of any type 2623 that can be configured and used simultaneously by a camera device. 2624 </description> 2625 <range> 2626 &gt;= 0 for LIMITED mode device (`android.info.supportedHardwareLevel == LIMITED`). 2627 &gt;= 1 for FULL mode device (`android.info.supportedHardwareLevel == FULL`). 2628 </range> 2629 <details>When set to 0, it means no input stream is supported. 2630 2631 The image format for an input stream can be any supported format provided 2632 by android.scaler.availableInputFormats. When using an input stream, there must be 2633 at least one output stream configured to receive the reprocessed images. 2634 2635 For example, for the Zero Shutter Lag (ZSL) still capture use case, the input 2636 stream image format will be RAW_OPAQUE, and the associated output stream image format 2637 should be JPEG. 2638 </details> 2639 </entry> 2640 </static> 2641 <dynamic> 2642 <entry name="frameCount" type="int32" visibility="public"> 2643 <description>A frame counter set by the framework.
This value monotonically 2644 increases with every new result (that is, each new result has a unique 2645 frameCount value).</description> 2646 <units>count of frames</units> 2647 <range>&gt; 0</range> 2648 <details>Reset on release()</details> 2649 </entry> 2650 <clone entry="android.request.id" kind="controls"></clone> 2651 <clone entry="android.request.metadataMode" 2652 kind="controls"></clone> 2653 <clone entry="android.request.outputStreams" 2654 kind="controls"></clone> 2655 <entry name="pipelineDepth" type="byte" visibility="public"> 2656 <description>Specifies the number of pipeline stages the frame went 2657 through from when it was exposed to when the final completed result 2658 was available to the framework.</description> 2659 <range>&lt;= android.request.pipelineMaxDepth</range> 2660 <details>Depending on what settings are used in the request, and 2661 what streams are configured, the data may undergo less processing, 2662 and some pipeline stages may be skipped. 2663 2664 See android.request.pipelineMaxDepth for more details. 2665 </details> 2666 <hal_details> 2667 This value must always represent the accurate count of how many 2668 pipeline stages were actually used. 2669 </hal_details> 2670 </entry> 2671 </dynamic> 2672 <static> 2673 <entry name="pipelineMaxDepth" type="byte" visibility="public"> 2674 <description>Specifies the maximum number of pipeline stages a frame 2675 has to go through from when it's exposed to when it's available 2676 to the framework.</description> 2677 <details>A typical minimum value for this is 2 (one stage to expose, 2678 one stage to read out) from the sensor. The ISP then usually adds 2679 its own stages to do custom HW processing. Further stages may be 2680 added by SW processing. 2681 2682 Depending on what settings are used (e.g. YUV, JPEG) and what 2683 processing is enabled (e.g. face detection), the actual pipeline 2684 depth (specified by android.request.pipelineDepth) may be less than 2685 the max pipeline depth.
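Since a depth of X stages corresponds to X frame intervals of latency, the implied latency is a simple product; a minimal sketch (the 30 fps frame duration in the usage note is just an illustrative number, and a constant frame interval is assumed):

```c
#include <assert.h>
#include <stdint.h>

/* A pipeline depth of X stages corresponds to X frame intervals of
 * latency; this helper just multiplies the two (assuming a constant
 * frame duration, which is a simplification). */
static int64_t pipeline_latency_ns(uint8_t depth, int64_t frame_duration_ns) {
    return (int64_t)depth * frame_duration_ns;
}
```

At 30 fps (about 33.3 ms per frame), the maximum depth of 8 implies roughly 267 ms between exposure and the final result.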
2686 2687 A pipeline depth of X stages is equivalent to a pipeline latency of 2688 X frame intervals. 2689 2690 This value will be 8 or less. 2691 </details> 2692 <hal_details> 2693 This value should be 4 or less. 2694 </hal_details> 2695 </entry> 2696 <entry name="partialResultCount" type="int32" visibility="public"> 2697 <description>Optional. Defaults to 1. Defines how many sub-components 2698 a result will be composed of. 2699 </description> 2700 <range>&gt;= 1</range> 2701 <details>To combat pipeline latency, partial results 2702 may be delivered to the application layer from the camera device as 2703 soon as they are available. 2704 2705 A value of 1 means that partial results are not supported. 2706 2707 A typical use case for this might be: after requesting an AF lock, the 2708 new AF state might be available 50% of the way through the pipeline. 2709 The camera device could then immediately dispatch this state via a 2710 partial result to the framework/application layer, and the rest of 2711 the metadata via later partial results. 2712 </details> 2713 </entry> 2714 <entry name="availableCapabilities" type="byte" visibility="public" 2715 enum="true"> 2716 <enum> 2717 <value>BACKWARD_COMPATIBLE 2718 <notes>The minimal set of capabilities that every camera 2719 device (regardless of android.info.supportedHardwareLevel) 2720 will support. 2721 2722 The full set of features supported by this capability makes 2723 the camera2 API backwards compatible with the camera1 2724 (android.hardware.Camera) API. 2725 2726 TODO: @hide this. Doesn't really mean anything except 2727 to act as a catch-all for all the 'base' functionality. 2728 </notes> 2729 </value> 2730 <value>OPTIONAL 2731 <notes>This is a catch-all capability to include all other 2732 tags or functionality not encapsulated by one of the other 2733 capabilities. 2734 2735 A typical example is all tags marked 'optional'. 2736 2737 TODO: @hide.
We may not need this if we @hide all the optional 2738 tags not belonging to a capability. 2739 </notes> 2740 </value> 2741 <value>MANUAL_SENSOR 2742 <notes> 2743 The camera device can be manually controlled (3A algorithms such 2744 as auto-exposure and auto-focus can be 2745 bypassed); this includes but is not limited to: 2746 2747 * Manual exposure control 2748 * android.sensor.exposureTime 2749 * android.sensor.info.exposureTimeRange 2750 * Manual sensitivity control 2751 * android.sensor.sensitivity 2752 * android.sensor.info.sensitivityRange 2753 * android.sensor.baseGainFactor 2754 * Manual lens control 2755 * android.lens.* 2756 * Manual flash control 2757 * android.flash.* 2758 * Manual black level locking 2759 * android.blackLevel.lock 2760 2761 If any of the above 3A algorithms are enabled, then the camera 2762 device will accurately report the values applied by 3A in the 2763 result. 2764 </notes> 2765 </value> 2766 <value optional="true">GCAM 2767 <notes> 2768 TODO: This should be @hide 2769 2770 * Manual tonemap control 2771 * android.tonemap.curveBlue 2772 * android.tonemap.curveGreen 2773 * android.tonemap.curveRed 2774 * android.tonemap.mode 2775 * android.tonemap.maxCurvePoints 2776 * Manual white balance control 2777 * android.colorCorrection.transform 2778 * android.colorCorrection.gains 2779 * Lens shading map information 2780 * android.statistics.lensShadingMap 2781 * android.lens.info.shadingMapSize 2782 2783 If auto white balance is enabled, then the camera device 2784 will accurately report the values applied by AWB in the result. 2785 2786 The camera device will also support everything in MANUAL_SENSOR 2787 except manual lens control and manual flash control. 2788 </notes> 2789 </value> 2790 <value>ZSL 2791 <notes> 2792 The camera device supports the Zero Shutter Lag use case. 2793 2794 * At least one input stream can be used.
2795 * RAW_OPAQUE is supported as an output/input format. 2796 * Using RAW_OPAQUE does not cause a frame rate drop 2797 relative to the sensor's maximum capture rate (at that 2798 resolution). 2799 * RAW_OPAQUE will be reprocessable into both YUV_420_888 2800 and JPEG formats. 2801 * The maximum available resolution for RAW_OPAQUE streams 2802 (both input/output) will match the maximum available 2803 resolution of JPEG streams. 2804 </notes> 2805 </value> 2806 <value optional="true">DNG 2807 <notes> 2808 The camera device supports outputting RAW buffers that can be 2809 saved offline into a DNG format. It can reprocess DNG 2810 files (produced from the same camera device) back into YUV. 2811 2812 * At least one input stream can be used. 2813 * RAW16 is supported as output/input format. 2814 * RAW16 is reprocessable into both YUV_420_888 and JPEG 2815 formats. 2816 * The maximum available resolution for RAW16 streams (both 2817 input/output) will match the value in 2818 android.sensor.info.pixelArraySize. 2819 * All DNG-related optional metadata entries are provided 2820 by the camera device. 2821 </notes> 2822 </value> 2823 </enum> 2824 <description>List of capabilities that the camera device 2825 advertises as fully supporting.</description> 2826 <details> 2827 A capability is a contract that the camera device makes in order 2828 to be able to satisfy one or more use cases. 2829 2830 Listing a capability guarantees that the whole set of features 2831 required to support a common use case will all be available. 2832 2833 Using a subset of the functionality provided by an unsupported 2834 capability may be possible on a specific camera device implementation; 2835 to do this, query each of android.request.availableRequestKeys, 2836 android.request.availableResultKeys, 2837 android.request.availableCharacteristicsKeys. 2838 2839 XX: Maybe these should go into android.info.supportedHardwareLevel 2840 as a table instead?
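Such a per-key query amounts to a membership test over the key lists, which are plain int32 arrays; a minimal sketch (the tag values in the usage note are made-up placeholders, not real tag ids):

```c
#include <assert.h>
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Returns true iff `tag` appears in an available-keys list such as
 * android.request.availableRequestKeys (exposed as an int32 array). */
static bool key_available(const int32_t *keys, size_t count, int32_t tag) {
    for (size_t i = 0; i < count; i++)
        if (keys[i] == tag)
            return true;
    return false;
}
```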
2841 2842 The following capabilities are guaranteed to be available on 2843 android.info.supportedHardwareLevel `==` FULL devices: 2844 2845 * MANUAL_SENSOR 2846 * ZSL 2847 2848 Other capabilities may be available on either FULL or LIMITED 2849 devices, but the application should query this field to be sure. 2850 </details> 2851 <hal_details> 2852 Additional constraint details per-capability will be available 2853 in the Compatibility Test Suite. 2854 2855 BACKWARD_COMPATIBLE capability requirements are not explicitly listed. 2856 Instead refer to "BC" tags and the camera CTS tests in the 2857 android.hardware.cts package. 2858 2859 Listed controls that can be either request or result (e.g. 2860 android.sensor.exposureTime) must be available both in the 2861 request and the result in order to be considered 2862 capability-compliant. 2863 2864 For example, if the HAL claims to support MANUAL_SENSOR, 2865 then exposure time must be configurable via the request _and_ 2866 the actual exposure applied must be available via 2867 the result. 2868 </hal_details> 2869 </entry> 2870 <entry name="availableRequestKeys" type="int32" visibility="hidden" 2871 container="array"> 2872 <array> 2873 <size>n</size> 2874 </array> 2875 <description>A list of all keys that the camera device has available 2876 to use with CaptureRequest.</description> 2877 2878 <details>Attempting to set a key into a CaptureRequest that is not 2879 listed here will result in an invalid request and will be rejected 2880 by the camera device. 2881 2882 This field can be used to query the feature set of a camera device 2883 at a more granular level than capabilities. This is especially 2884 important for optional keys that are not listed under any capability 2885 in android.request.availableCapabilities. 2886 2887 TODO: This should be used by #getAvailableCaptureRequestKeys. 2888 </details> 2889 <hal_details> 2890 Vendor tags must not be listed here.
Use the vendor tag metadata 2891 extensions C API instead (refer to camera3.h for more details). 2892 2893 Setting/getting vendor tags will be checked against the metadata 2894 vendor extensions API and not against this field. 2895 2896 The HAL must not consume any request tags that are not listed either 2897 here or in the vendor tag list. 2898 2899 The public camera2 API will always make the vendor tags visible 2900 via CameraCharacteristics#getAvailableCaptureRequestKeys. 2901 </hal_details> 2902 </entry> 2903 <entry name="availableResultKeys" type="int32" visibility="hidden" 2904 container="array"> 2905 <array> 2906 <size>n</size> 2907 </array> 2908 <description>A list of all keys that the camera device has available 2909 to use with CaptureResult.</description> 2910 2911 <details>Attempting to get a key from a CaptureResult that is not 2912 listed here will always return a `null` value. Getting a key from 2913 a CaptureResult that is listed here must never return a `null` 2914 value. 2915 2916 The following keys may return `null` unless they are enabled: 2917 2918 * android.statistics.lensShadingMap (non-null iff android.statistics.lensShadingMapMode == ON) 2919 2920 (Those sometimes-null keys should nevertheless be listed here 2921 if they are available.) 2922 2923 This field can be used to query the feature set of a camera device 2924 at a more granular level than capabilities. This is especially 2925 important for optional keys that are not listed under any capability 2926 in android.request.availableCapabilities. 2927 2928 TODO: This should be used by #getAvailableCaptureResultKeys. 2929 </details> 2930 <hal_details> 2931 Tags listed here must always have an entry in the result metadata, 2932 even if that size is 0 elements. Only array-type tags (e.g. lists, 2933 matrices, strings) are allowed to have 0 elements. 2934 2935 Vendor tags must not be listed here. Use the vendor tag metadata 2936 extensions C API instead (refer to camera3.h for more details).
2937 2938 Setting/getting vendor tags will be checked against the metadata 2939 vendor extensions API and not against this field. 2940 2941 The HAL must not produce any result tags that are not listed either 2942 here or in the vendor tag list. 2943 2944 The public camera2 API will always make the vendor tags visible 2945 via CameraCharacteristics#getAvailableCaptureResultKeys. 2946 </hal_details> 2947 </entry> 2948 <entry name="availableCharacteristicsKeys" type="int32" visibility="hidden" 2949 container="array"> 2950 <array> 2951 <size>n</size> 2952 </array> 2953 <description>A list of all keys that the camera device has available 2954 to use with CameraCharacteristics.</description> 2955 <details>This entry follows the same rules as 2956 android.request.availableResultKeys (except that it applies for 2957 CameraCharacteristics instead of CaptureResult). See above for more 2958 details. 2959 2960 TODO: This should be used by CameraCharacteristics#getKeys. 2961 </details> 2962 <hal_details> 2963 Tags listed here must always have an entry in the static info metadata, 2964 even if that size is 0 elements. Only array-type tags (e.g. lists, 2965 matrices, strings) are allowed to have 0 elements. 2966 2967 Vendor tags must not be listed here. Use the vendor tag metadata 2968 extensions C API instead (refer to camera3.h for more details). 2969 2970 Setting/getting vendor tags will be checked against the metadata 2971 vendor extensions API and not against this field. 2972 2973 The HAL must not have any tags in its static info that are not listed 2974 either here or in the vendor tag list. 2975 2976 The public camera2 API will always make the vendor tags visible 2977 via CameraCharacteristics#getKeys.
2978 </hal_details> 2979 </entry> 2980 </static> 2981 </section> 2982 <section name="scaler"> 2983 <controls> 2984 <entry name="cropRegion" type="int32" visibility="public" 2985 container="array" typedef="rectangle"> 2986 <array> 2987 <size>4</size> 2988 </array> 2989 <description>(x, y, width, height). 2990 2991 A rectangle with the top-left corner at (x, y) and size 2992 (width, height). The region of the sensor that is used for 2993 output. Each stream must use this rectangle to produce its 2994 output, cropping to a smaller region if necessary to 2995 maintain the stream's aspect ratio. 2996 2997 HAL2.x uses only (x, y, width).</description> 2998 <units>(x,y) of top-left corner, width and height of region 2999 in pixels; (0,0) is top-left corner of 3000 android.sensor.activeArraySize</units> 3001 <details> 3002 Any additional per-stream cropping must be done to 3003 maximize the final pixel area of the stream. 3004 3005 For example, if the crop region is set to a 4:3 aspect 3006 ratio, then 4:3 streams should use the exact crop 3007 region. 16:9 streams should further crop vertically 3008 (letterbox). 3009 3010 Conversely, if the crop region is set to a 16:9 aspect ratio, then 4:3 3011 outputs should crop horizontally (pillarbox), and 16:9 3012 streams should match exactly. These additional crops must 3013 be centered within the crop region. 3014 3015 The output streams must maintain square pixels at all 3016 times, no matter what the relative aspect ratios of the 3017 crop region and the stream are. Negative values for the 3018 corner are allowed for raw output if the full pixel array is 3019 larger than the active pixel array. Width and height may be 3020 rounded up to the nearest supportable width, especially 3021 for raw output, where only a few fixed scales may be 3022 possible.
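The letterbox/pillarbox rules above reduce to integer arithmetic. This is an illustrative sketch, not HAL code; the `Rect` type mirrors the (x, y, width, height) layout of this entry:

```c
#include <assert.h>
#include <stdint.h>

typedef struct { int32_t x, y, width, height; } Rect;

/* Per-stream crop: shrink the request's crop region to the stream's
 * aspect ratio (pillarbox if the region is wider than the stream,
 * letterbox if taller), centered, maximizing pixel area. */
static Rect stream_crop(Rect region, int32_t out_w, int32_t out_h) {
    Rect r = region;
    /* Compare aspect ratios by cross-multiplying to stay in integers. */
    if ((int64_t)region.width * out_h > (int64_t)region.height * out_w) {
        /* region wider than the stream: crop horizontally (pillarbox) */
        r.width = (int32_t)((int64_t)region.height * out_w / out_h);
        r.x = region.x + (region.width - r.width) / 2;
    } else {
        /* region taller than the stream: crop vertically (letterbox) */
        r.height = (int32_t)((int64_t)region.width * out_h / out_w);
        r.y = region.y + (region.height - r.height) / 2;
    }
    return r;
}
```

For a 4000x3000 (4:3) crop region, a 16:9 stream keeps the full width but crops to 4000x2250 centered at y = 375; for a 1920x1080 (16:9) region, a 4:3 stream crops to 1440x1080 centered at x = 240.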
The width and height of the crop region cannot 3023 be set to be smaller than floor( activeArraySize.width / 3024 android.scaler.availableMaxDigitalZoom ) and floor( 3025 activeArraySize.height / android.scaler.availableMaxDigitalZoom ), 3026 respectively. 3027 </details> 3028 <tag id="BC" /> 3029 </entry> 3030 </controls> 3031 <static> 3032 <entry name="availableFormats" type="int32" 3033 visibility="public" enum="true" 3034 container="array" typedef="imageFormat"> 3035 <array> 3036 <size>n</size> 3037 </array> 3038 <enum> 3039 <value optional="true" id="0x20">RAW16 3040 <notes> 3041 RAW16 is a standard, cross-platform format for raw image 3042 buffers with 16-bit pixels. Buffers of this format are typically 3043 expected to have a Bayer Color Filter Array (CFA) layout, which 3044 is given in android.sensor.info.colorFilterArrangement. Sensors 3045 with CFAs that are not representable by a value of 3046 android.sensor.info.colorFilterArrangement should not use this 3047 format. 3048 3049 Buffers of this format will also follow the constraints given for 3050 RAW_OPAQUE buffers, but with relaxed performance constraints. 3051 3052 See android.scaler.availableInputFormats for the full set of 3053 performance guarantees. 3054 </notes> 3055 </value> 3056 <value optional="true" id="0x24">RAW_OPAQUE 3057 <notes> 3058 RAW_OPAQUE is a format for raw image buffers coming from an 3059 image sensor. The actual structure of buffers of this format is 3060 platform-specific, but must follow several constraints: 3061 3062 1. No image post-processing operations may have been applied to 3063 buffers of this type. These buffers contain raw image data coming 3064 directly from the image sensor. 3065 1. If a buffer of this format is passed to the camera device for 3066 reprocessing, the resulting images will be identical to the images 3067 produced if the buffer had come directly from the sensor and was 3068 processed with the same settings.
3069 3070 The intended use for this format is to allow access to the native 3071 raw format buffers coming directly from the camera sensor without 3072 any additional conversions or decrease in framerate. 3073 3074 See android.scaler.availableInputFormats for the full set of 3075 performance guarantees. 3076 </notes> 3077 </value> 3078 <value optional="true" id="0x32315659">YV12 3079 <notes>YCrCb 4:2:0 Planar</notes> 3080 </value> 3081 <value optional="true" id="0x11">YCrCb_420_SP 3082 <notes>NV21</notes> 3083 </value> 3084 <value id="0x22">IMPLEMENTATION_DEFINED 3085 <notes>System internal format, not application-accessible</notes> 3086 </value> 3087 <value id="0x23">YCbCr_420_888 3088 <notes>Flexible YUV420 Format</notes> 3089 </value> 3090 <value id="0x21">BLOB 3091 <notes>JPEG format</notes> 3092 </value> 3093 </enum> 3094 <description>The list of image formats that are supported by this 3095 camera device for output streams.</description> 3096 <details> 3097 All camera devices will support JPEG and YUV_420_888 formats. 3098 3099 When set to YUV_420_888, the application can access the YUV420 data directly. 3100 </details> 3101 <hal_details> 3102 These format values are from HAL_PIXEL_FORMAT_* in 3103 system/core/include/system/graphics.h. 3104 3105 When IMPLEMENTATION_DEFINED is used, the platform 3106 gralloc module will select a format based on the usage flags provided 3107 by the camera HAL device and the other endpoint of the stream. It is 3108 usually used by preview and recording streams, where the application doesn't 3109 need to access the image data. 3110 3111 YCbCr_420_888 format must be supported by the HAL. When an image stream 3112 needs CPU/application direct access, this format will be used. 3113 3114 The BLOB format must be supported by the HAL. This is used for the JPEG stream. 3115 3116 A RAW_OPAQUE buffer should contain only pixel data.
It is strongly 3117 recommended that any information used by the camera device when 3118 processing images is fully expressed by the result metadata 3119 for that image buffer. 3120 </hal_details> 3121 <tag id="BC" /> 3122 </entry> 3123 <entry name="availableJpegMinDurations" type="int64" visibility="public" 3124 container="array"> 3125 <array> 3126 <size>n</size> 3127 </array> 3128 <description>The minimum frame duration that is supported 3129 for each resolution in android.scaler.availableJpegSizes. 3130 </description> 3131 <units>ns</units> 3132 <range>**Deprecated**. Do not use. TODO: Remove property.</range> 3133 <details> 3134 This corresponds to the minimum steady-state frame duration when only 3135 that JPEG stream is active and captured in a burst, with all 3136 processing (typically in android.*.mode) set to FAST. 3137 3138 When multiple streams are configured, the minimum 3139 frame duration will be &gt;= max(individual stream min 3140 durations)</details> 3141 <tag id="BC" /> 3142 </entry> 3143 <entry name="availableJpegSizes" type="int32" visibility="public" 3144 container="array" typedef="size"> 3145 <array> 3146 <size>n</size> 3147 <size>2</size> 3148 </array> 3149 <description>The JPEG resolutions that are supported by this camera device.</description> 3150 <range>**Deprecated**. Do not use. TODO: Remove property.</range> 3151 <details> 3152 The resolutions are listed as `(width, height)` pairs. All camera devices will support 3153 sensor maximum resolution (defined by android.sensor.info.activeArraySize). 3154 </details> 3155 <hal_details> 3156 The HAL must include sensor maximum resolution 3157 (defined by android.sensor.info.activeArraySize), 3158 and should include half/quarter of sensor maximum resolution. 
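A sketch of deriving those extra sizes, interpreting "half/quarter" as halving each dimension (that interpretation, and the truncating division, are assumptions; real HALs may align sizes to hardware-friendly multiples):

```c
#include <assert.h>
#include <stdint.h>

typedef struct { int32_t width, height; } Size;

/* Divide each dimension of the sensor maximum by `divisor`
 * (2 for half resolution, 4 for quarter resolution). */
static Size scaled_size(Size max, int32_t divisor) {
    Size s = { max.width / divisor, max.height / divisor };
    return s;
}
```

For a 3264x2448 sensor maximum, this yields 1632x1224 and 816x612.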
3159 </hal_details> 3160 <tag id="BC" /> 3161 </entry> 3162 <entry name="availableMaxDigitalZoom" type="float" visibility="public"> 3163 <description>The maximum ratio between the active area width 3164 and the crop region width, or between the active area height and 3165 the crop region height, if the crop region's height is larger 3166 than its width.</description> 3167 <range>&gt;=1</range> 3168 <tag id="BC" /> 3169 </entry> 3170 <entry name="availableProcessedMinDurations" type="int64" visibility="public" 3171 container="array"> 3172 <array> 3173 <size>n</size> 3174 </array> 3175 <description>For each available processed output size (defined in 3176 android.scaler.availableProcessedSizes), this property lists the 3177 minimum supportable frame duration for that size. 3178 </description> 3179 <units>ns</units> 3180 <range>**Deprecated**. Do not use. TODO: Remove property.</range> 3181 <details> 3182 This should correspond to the frame duration when only that processed 3183 stream is active, with all processing (typically in android.*.mode) 3184 set to FAST. 3185 3186 When multiple streams are configured, the minimum frame duration will 3187 be &gt;= max(individual stream min durations). 3188 </details> 3189 <tag id="BC" /> 3190 </entry> 3191 <entry name="availableProcessedSizes" type="int32" visibility="public" 3192 container="array" typedef="size"> 3193 <array> 3194 <size>n</size> 3195 <size>2</size> 3196 </array> 3197 <description>The resolutions available for use with 3198 processed output streams, such as YV12, NV12, and 3199 platform opaque YUV/RGB streams to the GPU or video 3200 encoders.</description> 3201 <range>**Deprecated**. Do not use. TODO: Remove property.</range> 3202 <details> 3203 The resolutions are listed as `(width, height)` pairs. 3204 3205 For a given use case, the actual maximum supported resolution 3206 may be lower than what is listed here, depending on the destination 3207 Surface for the image data.
For example, for recording video, 3208 the video encoder chosen may have a maximum size limit (e.g. 1080p) 3209 smaller than what the camera (e.g. maximum resolution is 3264x2448) 3210 can provide. 3211 3212 Please reference the documentation for the image data destination to 3213 check if it limits the maximum size for image data. 3214 </details> 3215 <hal_details> 3216 For FULL capability devices (`android.info.supportedHardwareLevel == FULL`), 3217 the HAL must include all JPEG sizes listed in android.scaler.availableJpegSizes 3218 and each of the resolutions below if it is smaller than or equal to the sensor 3219 maximum resolution (if they are not listed in JPEG sizes already): 3220 3221 * 240p (320 x 240) 3222 * 480p (640 x 480) 3223 * 720p (1280 x 720) 3224 * 1080p (1920 x 1080) 3225 3226 For LIMITED capability devices (`android.info.supportedHardwareLevel == LIMITED`), 3227 the HAL only has to list up to the maximum video size supported by the device. 3228 </hal_details> 3229 <tag id="BC" /> 3230 </entry> 3231 <entry name="availableRawMinDurations" type="int64" 3232 container="array"> 3233 <array> 3234 <size>n</size> 3235 </array> 3236 <description> 3237 For each available raw output size (defined in 3238 android.scaler.availableRawSizes), this property lists the minimum 3239 supportable frame duration for that size. 3240 </description> 3241 <units>ns</units> 3242 <range>**Deprecated**. Do not use. TODO: Remove property.</range> 3243 <details> 3244 Should correspond to the frame duration when only the raw stream is 3245 active.
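When several streams run at once, the achievable frame duration is bounded by the slowest stream; a sketch of that lower bound (durations in ns; the values in the usage note are illustrative):

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Lower bound on the steady-state frame duration with multiple
 * configured streams: the max over the individual stream minimums. */
static int64_t multi_stream_min_duration(const int64_t *mins, size_t n) {
    int64_t worst = 0;
    for (size_t i = 0; i < n; i++)
        if (mins[i] > worst)
            worst = mins[i];
    return worst;
}
```

For example, streams with minimum durations of 33.3 ms, 50 ms, and 41.7 ms together cannot run faster than one frame per 50 ms.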
3246 3247 When multiple streams are configured, the minimum 3248 frame duration will be &gt;= max(individual stream min 3249 durations).</details> 3250 <tag id="BC" /> 3251 </entry> 3252 <entry name="availableRawSizes" type="int32" 3253 container="array" typedef="size"> 3254 <array> 3255 <size>n</size> 3256 <size>2</size> 3257 </array> 3258 <description>The resolutions available for use with raw 3259 sensor output streams, listed as `(width, 3260 height)` pairs.</description> 3261 <range>**Deprecated**. Do not use. TODO: Remove property. 3262 Must include: - sensor maximum resolution.</range> 3263 </entry> 3264 </static> 3265 <dynamic> 3266 <clone entry="android.scaler.cropRegion" kind="controls"> 3267 </clone> 3268 </dynamic> 3269 <static> 3270 <entry name="availableInputOutputFormatsMap" type="int32" 3271 visibility="public" 3272 container="array" typedef="imageFormat"> 3273 <array> 3274 <size>n</size> 3275 </array> 3276 <description>The mapping of image formats that are supported by this 3277 camera device for input streams, to their corresponding output formats. 3278 </description> 3279 <range>See android.scaler.availableFormats for enum definitions.</range> 3280 <details> 3281 All camera devices with android.request.maxNumInputStreams of at 3282 least 1 will have at least one 3283 available input format. 3284 3285 The camera device will support the following map of formats, 3286 if its dependent capability is supported: 3287 3288 Input Format | Output Format | Capability 3289 :---------------|:-----------------|:---------- 3290 RAW_OPAQUE | JPEG | ZSL 3291 RAW_OPAQUE | YUV_420_888 | ZSL 3292 RAW_OPAQUE | RAW16 | DNG 3293 RAW16 | YUV_420_888 | DNG 3294 RAW16 | JPEG | DNG 3295 3296 For ZSL-capable camera devices, using the RAW_OPAQUE format 3297 as either input or output will never hurt maximum frame rate (i.e. 3298 android.scaler.availableStallDurations will not have RAW_OPAQUE).
3299 3300 Attempting to configure an input stream with output streams not 3301 listed as available in this map is not valid. 3302 3303 TODO: Add java type mapping for this property. 3304 </details> 3305 <hal_details> 3306 This value is encoded as a variable-size array-of-arrays. 3307 The inner array always contains `[format, length, ...]` where 3308 `...` has `length` elements. An inner array is followed by another 3309 inner array if the total metadata entry size hasn't yet been exceeded. 3310 3311 A code sample to read/write this encoding (with a device that 3312 supports reprocessing RAW_OPAQUE to RAW16, YUV_420_888, and JPEG, 3313 and reprocessing RAW16 to YUV_420_888 and JPEG): 3314 3315 // reading 3316 int32_t* contents = &amp;entry.i32[0]; 3317 for (size_t i = 0; i &lt; entry.count; ) { 3318 int32_t format = contents[i++]; 3319 int32_t length = contents[i++]; 3320 int32_t output_formats[length]; 3321 memcpy(&amp;output_formats[0], &amp;contents[i], 3322 length * sizeof(int32_t)); 3323 i += length; 3324 } 3325 3326 // writing (static example, DNG+ZSL) 3327 int32_t contents[] = { 3328 RAW_OPAQUE, 3, RAW16, YUV_420_888, BLOB, 3329 RAW16, 2, YUV_420_888, BLOB, 3330 }; 3331 update_camera_metadata_entry(metadata, index, &amp;contents[0], 3332 sizeof(contents)/sizeof(contents[0]), &amp;updated_entry); 3333 3334 If the HAL claims to support any of the capabilities listed in the 3335 above details, then it must also support all the input-output 3336 combinations listed for that capability. It can optionally support 3337 additional formats if it so chooses. 3338 3339 Refer to android.scaler.availableFormats for the enum values 3340 which correspond to HAL_PIXEL_FORMAT_* in 3341 system/core/include/system/graphics.h. 
3342 </hal_details> 3343 </entry> 3344 <entry name="availableStreamConfigurations" type="int32" visibility="public" 3345 enum="true" container="array"> 3346 <array> 3347 <size>n</size> 3348 <size>4</size> 3349 </array> 3350 <enum> 3351 <value>OUTPUT</value> 3352 <value>INPUT</value> 3353 </enum> 3354 <description>The available stream configurations that this 3355 camera device supports 3356 (i.e. format, width, height, output/input stream). 3357 </description> 3358 <details> 3359 The configurations are listed as `(format, width, height, input?)` 3360 tuples. 3361 3362 All camera devices will support sensor maximum resolution (defined by 3363 android.sensor.info.activeArraySize) for the JPEG format. 3364 3365 For a given use case, the actual maximum supported resolution 3366 may be lower than what is listed here, depending on the destination 3367 Surface for the image data. For example, for recording video, 3368 the video encoder chosen may have a maximum size limit (e.g. 1080p) 3369 smaller than what the camera (e.g. maximum resolution is 3264x2448) 3370 can provide. 3371 3372 Please reference the documentation for the image data destination to 3373 check if it limits the maximum size for image data. 3374 3375 Not all output formats may be supported in a configuration with 3376 an input stream of a particular format. For more details, see 3377 android.scaler.availableInputOutputFormatsMap. 
3378 3379 The following table describes the minimum required output stream 3380 configurations based on the hardware level 3381 (android.info.supportedHardwareLevel): 3382 3383 Format | Size | Hardware Level | Notes 3384 :-------------:|:--------------------------------------------:|:--------------:|:--------------: 3385 JPEG | android.sensor.info.activeArraySize | Any | 3386 JPEG | 1920x1080 (1080p) | Any | if 1080p &lt;= activeArraySize 3387 JPEG | 1280x720 (720p) | Any | if 720p &lt;= activeArraySize 3388 JPEG | 640x480 (480p) | Any | if 480p &lt;= activeArraySize 3389 JPEG | 320x240 (240p) | Any | if 240p &lt;= activeArraySize 3390 YUV_420_888 | all output sizes available for JPEG | FULL | 3391 YUV_420_888 | all output sizes available for JPEG, up to the maximum video size | LIMITED | 3392 IMPLEMENTATION_DEFINED | same as YUV_420_888 | Any | 3393 3394 Refer to android.request.availableCapabilities for additional 3395 mandatory stream configurations on a per-capability basis. 3396 </details> 3397 <hal_details> 3398 It is recommended (but not mandatory) to also include half/quarter 3399 of sensor maximum resolution for JPEG formats (regardless of hardware 3400 level). 3401 3402 (The following is a rewording of the above required table): 3403 3404 The HAL must include sensor maximum resolution (defined by 3405 android.sensor.info.activeArraySize). 3406 3407 For FULL capability devices (`android.info.supportedHardwareLevel == FULL`), 3408 the HAL must include all YUV_420_888 sizes that have JPEG sizes listed 3409 here as output streams. 
3410 3411 It must also include each below resolution if it is smaller than or 3412 equal to the sensor maximum resolution (for both YUV_420_888 and JPEG 3413 formats), as output streams: 3414 3415 * 240p (320 x 240) 3416 * 480p (640 x 480) 3417 * 720p (1280 x 720) 3418 * 1080p (1920 x 1080) 3419 3420 For LIMITED capability devices 3421 (`android.info.supportedHardwareLevel == LIMITED`), 3422 the HAL only has to list up to the maximum video size 3423 supported by the device. 3424 3425 Regardless of hardware level, every output resolution available for 3426 YUV_420_888 must also be available for IMPLEMENTATION_DEFINED. 3427 3428 This supersedes the following fields, which are now deprecated: 3429 3430 * availableFormats 3431 * available[Processed,Raw,Jpeg]Sizes 3432 </hal_details> 3433 </entry> 3434 <entry name="availableMinFrameDurations" type="int64" visibility="public" 3435 container="array"> 3436 <array> 3437 <size>4</size> 3438 <size>n</size> 3439 </array> 3440 <description>This lists the minimum frame duration for each 3441 format/size combination. 3442 </description> 3443 <units>(format, width, height, ns) x n</units> 3444 <details> 3445 This should correspond to the frame duration when only that 3446 stream is active, with all processing (typically in android.*.mode) 3447 set to either OFF or FAST. 3448 3449 When multiple streams are used in a request, the minimum frame 3450 duration will be max(individual stream min durations). 3451 3452 The minimum frame duration of a stream (of a particular format, size) 3453 is the same regardless of whether the stream is input or output. 3454 3455 See android.sensor.frameDuration and 3456 android.scaler.availableStallDurations for more details about 3457 calculating the max frame rate. 
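As a rough illustration (not HAL code; the function and variable names are invented for this sketch), looking up the minimum frame duration for one stream in the packed `(format, width, height, ns)` array could look like:

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch only: find the minimum frame duration (ns) for a
 * (format, width, height) combination in the packed 4-tuple array of n
 * entries. Returns -1 if the combination is not listed. */
static int64_t lookup_min_frame_duration(const int64_t *entries, size_t n,
                                         int64_t format, int64_t width,
                                         int64_t height) {
    for (size_t i = 0; i != n; i++) {
        const int64_t *e = entries + i * 4;  /* one 4-tuple per entry */
        if (e[0] != format) continue;
        if (e[1] != width) continue;
        if (e[2] != height) continue;
        return e[3];
    }
    return -1;
}
```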
3458 </details> 3459 <tag id="BC" /> 3460 </entry> 3461 <entry name="availableStallDurations" type="int64" visibility="public" 3462 container="array"> 3463 <array> 3464 <size>4</size> 3465 <size>n</size> 3466 </array> 3467 <description>This lists the maximum stall duration for each 3468 format/size combination. 3469 </description> 3470 <units>(format, width, height, ns) x n</units> 3471 <details> 3472 A stall duration is how much extra time would get added 3473 to the normal minimum frame duration for a repeating request 3474 that has streams with non-zero stall. 3475 3476 For example, consider JPEG captures which have the following 3477 characteristics: 3478 3479 * JPEG streams act like processed YUV streams in requests for which 3480 they are not included; in requests in which they are directly 3481 referenced, they act as JPEG streams. This is because supporting a 3482 JPEG stream requires the underlying YUV data to always be ready for 3483 use by a JPEG encoder, but the encoder will only be used (and impact 3484 frame duration) on requests that actually reference a JPEG stream. 3485 * The JPEG processor can run concurrently to the rest of the camera 3486 pipeline, but cannot process more than 1 capture at a time. 3487 3488 In other words, using a repeating YUV request would result 3489 in a steady frame rate (let's say it's 30 FPS). If a single 3490 JPEG request is submitted periodically, the frame rate will stay 3491 at 30 FPS (as long as we wait for the previous JPEG to return each 3492 time). If we try to submit a repeating YUV + JPEG request, then 3493 the frame rate will drop from 30 FPS. 3494 3495 In general, submitting a new request with a non-0 stall time 3496 stream will _not_ cause a frame rate drop unless there are still 3497 outstanding buffers for that stream from previous requests. 
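Continuing the JPEG example, the steady-state behavior of a repeating request that always includes a stalling stream can be sketched as follows (illustrative only, not a HAL interface):

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch only: a repeating request that includes streams with
 * non-zero stall durations runs at its normal minimum frame duration plus
 * the largest stall duration among its streams (all in nanoseconds). */
static int64_t repeating_frame_duration(int64_t min_frame_duration,
                                        const int64_t *stall_durations,
                                        size_t count) {
    int64_t max_stall = 0;
    for (size_t i = 0; i != count; i++) {
        if (stall_durations[i] > max_stall) {
            max_stall = stall_durations[i];
        }
    }
    return min_frame_duration + max_stall;
}
```

For instance, a repeating YUV (stall 0) + JPEG (stall 33333333 ns) request with a 33333333 ns minimum frame duration would settle at roughly half the YUV-only frame rate.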
3498 3499 Submitting a repeating request with a set of streams (call this `S`) 3500 is the same as setting the minimum frame duration to 3501 the normal minimum frame duration corresponding to `S`, plus 3502 the maximum stall duration for `S`. 3503 3504 If interleaving requests with and without a stall duration, 3505 a request will stall by the maximum of the remaining times 3506 for each can-stall stream with outstanding buffers. 3507 3508 This means that a stalling request will not have an exposure start 3509 until the stall has completed. 3510 3511 This should correspond to the stall duration when only that stream is 3512 active, with all processing (typically in android.*.mode) set to FAST 3513 or OFF. Setting any of the processing modes to HIGH_QUALITY 3514 effectively results in an indeterminate stall duration for all 3515 streams in a request (the regular stall calculation rules are 3516 ignored). 3517 3518 The following formats may always have a stall duration: 3519 3520 * JPEG 3521 * RAW16 3522 3523 The following formats will never have a stall duration: 3524 3525 * YUV_420_888 3526 * IMPLEMENTATION_DEFINED 3527 3528 All other formats may or may not have an allowed stall duration on 3529 a per-capability basis; refer to android.request.availableCapabilities 3530 for more details. 3531 3532 See android.sensor.frameDuration for more information about 3533 calculating the max frame rate (absent stalls). 3534 </details> 3535 <hal_details> 3536 It is recommended that, where possible, non-JPEG formats 3537 (such as RAW16) have no stall duration. 3538 </hal_details> 3539 <tag id="BC" /> 3540 </entry> 3541 </static> 3542 </section> 3543 <section name="sensor"> 3544 <controls> 3545 <entry name="exposureTime" type="int64" visibility="public"> 3546 <description>Duration each pixel is exposed to 3547 light. 
3548 3549 If the sensor can't expose this exact duration, it should shorten the 3550 duration exposed to the nearest possible value (rather than expose longer). 3551 </description> 3552 <units>nanoseconds</units> 3553 <range>android.sensor.info.exposureTimeRange</range> 3554 <details>1/10000 - 30 sec range. No bulb mode</details> 3555 <tag id="V1" /> 3556 </entry> 3557 <entry name="frameDuration" type="int64" visibility="public"> 3558 <description>Duration from start of frame exposure to 3559 start of next frame exposure.</description> 3560 <units>nanoseconds</units> 3561 <range>See android.sensor.info.maxFrameDuration, 3562 android.scaler.availableMinFrameDurations. The duration 3563 is capped to `max(duration, exposureTime + overhead)`.</range> 3564 <details> 3565 The maximum frame rate that can be supported by a camera subsystem is 3566 a function of many factors: 3567 3568 * Requested resolutions of output image streams 3569 * Availability of binning / skipping modes on the imager 3570 * The bandwidth of the imager interface 3571 * The bandwidth of the various ISP processing blocks 3572 3573 Since these factors can vary greatly between different ISPs and 3574 sensors, the camera abstraction tries to represent the bandwidth 3575 restrictions with as simple a model as possible. 3576 3577 The model presented has the following characteristics: 3578 3579 * The image sensor is always configured to output the smallest 3580 resolution possible given the application's requested output stream 3581 sizes. The smallest resolution is defined as being at least as large 3582 as the largest requested output stream size; the camera pipeline must 3583 never digitally upsample sensor data when the crop region covers the 3584 whole sensor. In general, this means that if only small output stream 3585 resolutions are configured, the sensor can provide a higher frame 3586 rate. 
3587 * Since any request may use any or all the currently configured 3588 output streams, the sensor and ISP must be configured to support 3589 scaling a single capture to all the streams at the same time. This 3590 means the camera pipeline must be ready to produce the largest 3591 requested output size without any delay. Therefore, the overall 3592 frame rate of a given configured stream set is governed only by the 3593 largest requested stream resolution. 3594 * Using more than one output stream in a request does not affect the 3595 frame duration. 3596 * Certain format-streams may need to do additional background processing 3597 before data is consumed/produced by that stream. These processors 3598 can run concurrently to the rest of the camera pipeline, but 3599 cannot process more than 1 capture at a time. 3600 3601 The necessary information for the application, given the model above, 3602 is provided via the android.scaler.availableMinFrameDurations field. 3603 These are used to determine the maximum frame rate / minimum frame 3604 duration that is possible for a given stream configuration. 3605 3606 Specifically, the application can use the following rules to 3607 determine the minimum frame duration it can request from the camera 3608 device: 3609 3610 1. Let the set of currently configured input/output streams 3611 be called `S`. 3612 1. Find the minimum frame durations for each stream in `S`, by 3613 looking it up in android.scaler.availableMinFrameDurations (with 3614 its respective size/format). Let this set of frame durations be called 3615 `F`. 3616 1. For any given request `R`, the minimum frame duration allowed 3617 for `R` is the maximum out of all values in `F`. Let the streams 3618 used in `R` be called `S_r`. 
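The third rule above can be sketched in code (illustrative only; `F` is the set of per-stream minimum frame durations, as described above):

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch only: the minimum frame duration allowed for a
 * request R is the maximum over the per-stream minimum frame durations
 * F (in nanoseconds) of the streams used in R. */
static int64_t request_min_frame_duration(const int64_t *F, size_t count) {
    int64_t result = 0;
    for (size_t i = 0; i != count; i++) {
        if (F[i] > result) {
            result = F[i];
        }
    }
    return result;
}
```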
3619 3620 If none of the streams in `S_r` have a stall time (listed in 3621 android.scaler.availableStallDurations), then the frame duration in 3622 `F` determines the steady state frame rate that the application will 3623 get if it uses `R` as a repeating request. Let this special kind 3624 of request be called `Rsimple`. 3625 3626 A repeating request `Rsimple` can be _occasionally_ interleaved 3627 by a single capture of a new request `Rstall` (which has at least 3628 one in-use stream with a non-0 stall time) and if `Rstall` has the 3629 same minimum frame duration this will not cause a frame rate loss 3630 if all buffers from the previous `Rstall` have already been 3631 delivered. 3632 3633 For more details about stalling, see 3634 android.scaler.availableStallDurations. 3635 </details> 3636 <tag id="V1" /> 3637 <tag id="BC" /> 3638 </entry> 3639 <entry name="sensitivity" type="int32" visibility="public"> 3640 <description>Gain applied to image data. Must be 3641 implemented through analog gain only if set to values 3642 below 'maximum analog sensitivity'. 3643 3644 If the sensor can't apply this exact gain, it should lessen the 3645 gain to the nearest possible value (rather than gain more). 3646 </description> 3647 <units>ISO arithmetic units</units> 3648 <range>android.sensor.info.sensitivityRange</range> 3649 <details>ISO 12232:2006 REI method</details> 3650 <tag id="V1" /> 3651 </entry> 3652 </controls> 3653 <static> 3654 <namespace name="info"> 3655 <entry name="activeArraySize" type="int32" visibility="public" 3656 type_notes="Four ints defining the active pixel rectangle" 3657 container="array" 3658 typedef="rectangle"> 3659 <array> 3660 <size>4</size> 3661 </array> 3662 <description>Area of raw data which corresponds to only 3663 active pixels.</description> 3664 <range>This array contains `(xmin, ymin, width, height)`. The `(xmin, ymin)` must be 3665 &gt;= `(0,0)`. The `(width, height)` must be &lt;= 3666 `android.sensor.info.pixelArraySize`. 
3667 </range> 3668 <details>It is smaller than or equal to the 3669 sensor full pixel array, which could include the black calibration pixels.</details> 3670 <tag id="DNG" /> 3671 </entry> 3672 <entry name="sensitivityRange" type="int32" visibility="public" 3673 type_notes="Range of supported sensitivities" 3674 container="array"> 3675 <array> 3676 <size>2</size> 3677 </array> 3678 <description>Range of valid sensitivities</description> 3679 <range>Min &lt;= 100, Max &gt;= 1600</range> 3680 <tag id="BC" /> 3681 <tag id="V1" /> 3682 </entry> 3683 <entry name="colorFilterArrangement" type="byte" enum="true"> 3684 <enum> 3685 <value>RGGB</value> 3686 <value>GRBG</value> 3687 <value>GBRG</value> 3688 <value>BGGR</value> 3689 <value>RGB 3690 <notes>Sensor is not Bayer; output has 3 16-bit 3691 values for each pixel, instead of just 1 16-bit value 3692 per pixel.</notes></value> 3693 </enum> 3694 <description>Arrangement of color filters on sensor; 3695 represents the colors in the top-left 2x2 section of 3696 the sensor, in reading order</description> 3697 <tag id="DNG" /> 3698 </entry> 3699 <entry name="exposureTimeRange" type="int64" visibility="public" 3700 type_notes="nanoseconds" container="array"> 3701 <array> 3702 <size>2</size> 3703 </array> 3704 <description>Range of valid exposure 3705 times used by android.sensor.exposureTime.</description> 3706 <range>Min &lt;= 100e3 (100 us), Max &gt;= 1e9 (1 3707 sec)</range> 3708 <hal_details>The maximum of the range must be at least 3709 1 second. It should be at least 30 seconds.</hal_details> 3710 <tag id="V1" /> 3711 </entry> 3712 <entry name="maxFrameDuration" type="int64" visibility="public"> 3713 <description>Maximum possible frame duration (minimum frame 3714 rate).</description> 3715 <units>nanoseconds</units> 3716 <range>&gt;= 30e9</range> 3717 <details>The largest possible android.sensor.frameDuration 3718 that will be accepted by the camera device. 
Attempting to use 3719 frame durations beyond the maximum will result in the frame duration 3720 being clipped to the maximum. See that control 3721 for a full definition of frame durations. 3722 3723 Refer to 3724 android.scaler.availableProcessedMinDurations, 3725 android.scaler.availableJpegMinDurations, and 3726 android.scaler.availableRawMinDurations for the minimum 3727 frame duration values. 3728 </details> 3729 <hal_details> 3730 This value must be at least 1 second. It should be at least 30 3731 seconds (30e9 ns). 3732 3733 android.sensor.maxFrameDuration must be greater than or equal to the 3734 android.sensor.exposureTimeRange max value (since exposure time 3735 overrides frame duration). 3736 3737 Available minimum frame durations for JPEG must be no greater 3738 than that of the YUV_420_888/IMPLEMENTATION_DEFINED 3739 minimum frame durations (for that respective size). 3740 3741 Since JPEG processing is considered offline and can take longer than 3742 a single uncompressed capture, refer to 3743 android.scaler.availableStallDurations 3744 for details about encoding this scenario. 
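The two numeric constraints above can be sketched as a validity check (illustrative only; this is not part of the HAL interface):

```c
#include <stdbool.h>
#include <stdint.h>

/* Illustrative sketch only: maxFrameDuration must be at least 1 second
 * (1e9 ns) and must be no smaller than the maximum supported exposure
 * time, since exposure time overrides frame duration. */
static bool max_frame_duration_valid(int64_t max_frame_duration_ns,
                                     int64_t max_exposure_time_ns) {
    if (max_frame_duration_ns >= 1000000000LL) {
        return max_frame_duration_ns >= max_exposure_time_ns;
    }
    return false;
}
```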
3745 </hal_details> 3746 <tag id="BC" /> 3747 <tag id="V1" /> 3748 </entry> 3749 <entry name="physicalSize" type="float" visibility="public" 3750 type_notes="width x height in millimeters" 3751 container="array"> 3752 <array> 3753 <size>2</size> 3754 </array> 3755 <description>The physical dimensions of the full pixel 3756 array</description> 3757 <details>Needed for FOV calculation for old API</details> 3758 <tag id="V1" /> 3759 <tag id="BC" /> 3760 </entry> 3761 <entry name="pixelArraySize" type="int32" visibility="public" 3762 container="array" typedef="size"> 3763 <array> 3764 <size>2</size> 3765 </array> 3766 <description>Dimensions of full pixel array, possibly 3767 including black calibration pixels.</description> 3768 <details>Maximum output resolution for raw format must 3769 match this in 3770 android.scaler.info.availableSizesPerFormat.</details> 3771 <tag id="DNG" /> 3772 <tag id="BC" /> 3773 </entry> 3774 <entry name="whiteLevel" type="int32"> 3775 <description>Maximum raw value output by 3776 sensor</description> 3777 <range>&gt; 1024 (10-bit output)</range> 3778 <details>Defines sensor bit depth (10-14 bits is 3779 expected)</details> 3780 <tag id="DNG" /> 3781 </entry> 3782 </namespace> 3783 <entry name="baseGainFactor" type="rational" visibility="public" 3784 optional="true"> 3785 <description>Gain factor from electrons to raw units when 3786 ISO=100</description> 3787 <tag id="V1" /> 3788 <tag id="FULL" /> 3789 </entry> 3790 <entry name="blackLevelPattern" type="int32" visibility="public" 3791 optional="true" type_notes="2x2 raw count block" container="array"> 3792 <array> 3793 <size>4</size> 3794 </array> 3795 <description> 3796 A fixed black level offset for each of the color filter arrangement 3797 (CFA) mosaic channels. 3798 </description> 3799 <range>&gt;= 0 for each.</range> 3800 <details> 3801 This tag specifies the zero light value for each of the CFA mosaic 3802 channels in the camera sensor. 
3803 3804 The values are given in row-column scan order, with the first value 3805 corresponding to the element of the CFA in row=0, column=0. 3806 </details> 3807 <tag id="DNG" /> 3808 </entry> 3809 <entry name="maxAnalogSensitivity" type="int32" visibility="public" 3810 optional="true"> 3811 <description>Maximum sensitivity that is implemented 3812 purely through analog gain.</description> 3813 <details>For android.sensor.sensitivity values less than or 3814 equal to this, all applied gain must be analog. For 3815 values above this, the gain applied can be a mix of analog and 3816 digital.</details> 3817 <tag id="V1" /> 3818 <tag id="FULL" /> 3819 </entry> 3820 <entry name="orientation" type="int32" visibility="public"> 3821 <description>Clockwise angle through which the output 3822 image needs to be rotated to be upright on the device 3823 screen in its native orientation. Also defines the 3824 direction of rolling shutter readout, which is from top 3825 to bottom in the sensor's coordinate system</description> 3826 <units>degrees clockwise rotation, only multiples of 3827 90</units> 3828 <range>0,90,180,270</range> 3829 <tag id="BC" /> 3830 </entry> 3831 <entry name="profileHueSatMapDimensions" type="int32" 3832 visibility="public" optional="true" 3833 type_notes="Number of samples for hue, saturation, and value" 3834 container="array"> 3835 <array> 3836 <size>3</size> 3837 </array> 3838 <description> 3839 The number of input samples for each dimension of 3840 android.sensor.profileHueSatMap. 3841 </description> 3842 <range> 3843 Hue &gt;= 1, 3844 Saturation &gt;= 2, 3845 Value &gt;= 1 3846 </range> 3847 <details> 3848 The number of input samples for the hue, saturation, and value 3849 dimension of android.sensor.profileHueSatMap. The order of the 3850 dimensions given is hue, saturation, value; where hue is the 0th 3851 element. 
3852 </details> 3853 <tag id="DNG" /> 3854 </entry> 3855 </static> 3856 <dynamic> 3857 <clone entry="android.sensor.exposureTime" kind="controls"> 3858 </clone> 3859 <clone entry="android.sensor.frameDuration" 3860 kind="controls"></clone> 3861 <clone entry="android.sensor.sensitivity" kind="controls"> 3862 </clone> 3863 <entry name="timestamp" type="int64" visibility="public"> 3864 <description>Time at start of exposure of first 3865 row</description> 3866 <units>nanoseconds</units> 3867 <range>&gt; 0</range> 3868 <details>Monotonic, should be synced to other timestamps in 3869 system</details> 3870 <tag id="BC" /> 3871 </entry> 3872 <entry name="temperature" type="float" visibility="public" 3873 optional="true"> 3874 <description>The temperature of the sensor, sampled at the time 3875 exposure began for this frame. 3876 3877 The thermal diode being queried should be inside the sensor PCB, or 3878 somewhere close to it. 3879 </description> 3880 3881 <units>celsius</units> 3882 <range>Optional. 
This value is missing if no temperature is available.</range> 3883 <tag id="FULL" /> 3884 </entry> 3885 <entry name="referenceIlluminant" type="byte" enum="true"> 3886 <enum> 3887 <value id="1">DAYLIGHT</value> 3888 <value id="2">FLUORESCENT</value> 3889 <value id="3">TUNGSTEN 3890 <notes>Incandescent light</notes></value> 3891 <value id="4">FLASH</value> 3892 <value id="9">FINE_WEATHER</value> 3893 <value id="10">CLOUDY_WEATHER</value> 3894 <value id="11">SHADE</value> 3895 <value id="12">DAYLIGHT_FLUORESCENT 3896 <notes>D 5700 - 7100K</notes></value> 3897 <value id="13">DAY_WHITE_FLUORESCENT 3898 <notes>N 4600 - 5400K</notes></value> 3899 <value id="14">COOL_WHITE_FLUORESCENT 3900 <notes>W 3900 - 4500K</notes></value> 3901 <value id="15">WHITE_FLUORESCENT 3902 <notes>WW 3200 - 3700K</notes></value> 3903 <value id="17">STANDARD_A</value> 3904 <value id="18">STANDARD_B</value> 3905 <value id="19">STANDARD_C</value> 3906 <value id="20">D55</value> 3907 <value id="21">D65</value> 3908 <value id="22">D75</value> 3909 <value id="23">D50</value> 3910 <value id="24">ISO_STUDIO_TUNGSTEN</value> 3911 </enum> 3912 <description> 3913 A reference illumination source roughly matching the current scene 3914 illumination, which is used to describe the sensor color space 3915 transformations. 3916 </description> 3917 <details> 3918 The values in this tag correspond to the values defined for the 3919 EXIF LightSource tag. These illuminants are standard light sources 3920 that are often used for calibrating camera devices. 3921 </details> 3922 <tag id="DNG" /> 3923 <tag id="EXIF" /> 3924 </entry> 3925 <entry name="calibrationTransform" type="rational" 3926 visibility="public" optional="true" 3927 type_notes="3x3 matrix in row-major-order" container="array"> 3928 <array> 3929 <size>3</size> 3930 <size>3</size> 3931 </array> 3932 <description> 3933 A per-device calibration transform matrix to be applied after the 3934 color space transform when rendering the raw image buffer. 
3935 </description> 3936 <details> 3937 This matrix is expressed as a 3x3 matrix in row-major-order, and 3938 contains a per-device calibration transform that maps colors 3939 from reference camera color space (i.e. the "golden module" 3940 colorspace) into this camera device's linear native sensor color 3941 space for the current scene illumination and white balance choice. 3942 </details> 3943 <tag id="DNG" /> 3944 </entry> 3945 <entry name="colorTransform" type="rational" 3946 visibility="public" optional="true" 3947 type_notes="3x3 matrix in row-major-order" container="array"> 3948 <array> 3949 <size>3</size> 3950 <size>3</size> 3951 </array> 3952 <description> 3953 A matrix that transforms color values from CIE XYZ color space to 3954 reference camera color space when rendering the raw image buffer. 3955 </description> 3956 <details> 3957 This matrix is expressed as a 3x3 matrix in row-major-order, and 3958 contains a color transform matrix that maps colors from the CIE 3959 XYZ color space to the reference camera raw color space (i.e. the 3960 "golden module" colorspace) for the current scene illumination and 3961 white balance choice. 3962 </details> 3963 <tag id="DNG" /> 3964 </entry> 3965 <entry name="forwardMatrix" type="rational" 3966 visibility="public" optional="true" 3967 type_notes="3x3 matrix in row-major-order" container="array"> 3968 <array> 3969 <size>3</size> 3970 <size>3</size> 3971 </array> 3972 <description> 3973 A matrix that transforms white balanced camera colors to the CIE XYZ 3974 colorspace with a D50 whitepoint. 3975 </description> 3976 <details> 3977 This matrix is expressed as a 3x3 matrix in row-major-order, and contains 3978 a color transform matrix that maps a unit vector in the linear native 3979 sensor color space to the D50 whitepoint in CIE XYZ color space. 
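Applying such a row-major 3x3 transform to a color vector can be sketched as follows (illustrative only; the rational matrix entries are assumed to have been converted to doubles first):

```c
/* Illustrative sketch only: apply a row-major 3x3 color transform (such
 * as the forward matrix) to a white-balanced camera RGB vector, producing
 * a CIE XYZ vector. */
static void apply_color_matrix(const double m[9], const double in[3],
                               double out[3]) {
    for (int row = 0; row != 3; row++) {
        out[row] = m[row * 3 + 0] * in[0]
                 + m[row * 3 + 1] * in[1]
                 + m[row * 3 + 2] * in[2];
    }
}
```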
3980 </details> 3981 <tag id="DNG" /> 3982 </entry> 3983 <entry name="neutralColorPoint" type="rational" visibility="public" 3984 optional="true" container="array"> 3985 <array> 3986 <size>3</size> 3987 </array> 3988 <description> 3989 The estimated white balance at the time of capture. 3990 </description> 3991 <details> 3992 The estimated white balance encoded as the RGB values of the 3993 perfectly neutral color point in the linear native sensor color space. 3994 The order of the values is R, G, B; where R is in the lowest index. 3995 </details> 3996 <tag id="DNG" /> 3997 </entry> 3998 <entry name="profileHueSatMap" type="float" 3999 visibility="public" optional="true" 4000 type_notes="Mapping for hue, saturation, and value" 4001 container="array"> 4002 <array> 4003 <size>hue_samples</size> 4004 <size>saturation_samples</size> 4005 <size>value_samples</size> 4006 <size>3</size> 4007 </array> 4008 <description> 4009 A mapping containing a hue shift, saturation scale, and value scale 4010 for each pixel. 4011 </description> 4012 <units> 4013 Hue shift is given in degrees; saturation and value scale factors are 4014 unitless. 4015 </units> 4016 <details> 4017 hue_samples, saturation_samples, and value_samples are given in 4018 android.sensor.profileHueSatMapDimensions. 4019 4020 Each entry of this map contains three floats corresponding to the 4021 hue shift, saturation scale, and value scale, respectively; where the 4022 hue shift has the lowest index. The map entries are stored in the tag 4023 in nested loop order, with the value divisions in the outer loop, the 4024 hue divisions in the middle loop, and the saturation divisions in the 4025 inner loop. All zero input saturation entries are required to have a 4026 value scale factor of 1.0. 
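The storage order above translates into the following index computation (illustrative only; the helper name is invented):

```c
#include <stddef.h>

/* Illustrative sketch only: offset of the first float (the hue shift) of
 * the map entry at (hue index h, saturation index s, value index v),
 * following the nested loop order above: value outermost, hue in the
 * middle, saturation innermost, three floats per entry. */
static size_t hue_sat_map_offset(size_t h, size_t s, size_t v,
                                 size_t hue_samples, size_t sat_samples) {
    return ((v * hue_samples + h) * sat_samples + s) * 3;
}
```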
4027 </details> 4028 <tag id="DNG" /> 4029 </entry> 4030 <entry name="profileToneCurve" type="float" 4031 visibility="public" optional="true" 4032 type_notes="Samples defining a spline for a tone-mapping curve" 4033 container="array"> 4034 <array> 4035 <size>samples</size> 4036 <size>2</size> 4037 </array> 4038 <description> 4039 A list of x,y samples defining a tone-mapping curve for gamma adjustment. 4040 </description> 4041 <range> 4042 Each sample has an input range of `[0, 1]` and an output range of 4043 `[0, 1]`. The first sample is required to be `(0, 0)`, and the last 4044 sample is required to be `(1, 1)`. 4045 </range> 4046 <details> 4047 This tag contains a default tone curve that can be applied while 4048 processing the image as a starting point for user adjustments. 4049 The curve is specified as a list of value pairs in linear gamma. 4050 The curve is interpolated using a cubic spline. 4051 </details> 4052 <tag id="DNG" /> 4053 </entry> 4054 </dynamic> 4055 <controls> 4056 <entry name="testPatternData" type="int32" visibility="public" optional="true" container="array"> 4057 <array> 4058 <size>4</size> 4059 </array> 4060 <description> 4061 A pixel `[R, G_even, G_odd, B]` that supplies the test pattern 4062 when android.sensor.testPatternMode is SOLID_COLOR. 4063 </description> 4064 <range>Optional. 4065 Must be supported if android.sensor.availableTestPatternModes contains 4066 SOLID_COLOR.</range> 4067 <details> 4068 Each color channel is treated as an unsigned 32-bit integer. 4069 The camera device then uses the most significant X bits 4070 that correspond to how many bits are in its Bayer raw sensor 4071 output. 4072 4073 For example, a sensor with RAW10 Bayer output would use the 4074 10 most significant bits from each color channel. 
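The bit selection described above amounts to a right shift (illustrative sketch, not HAL code):

```c
#include <stdint.h>

/* Illustrative sketch only: the sensor keeps the most significant
 * raw_bit_depth bits of each 32-bit test pattern channel value; e.g. a
 * RAW10 Bayer sensor keeps the top 10 bits. */
static uint32_t test_pattern_channel(uint32_t channel_value,
                                     unsigned raw_bit_depth) {
    return channel_value >> (32u - raw_bit_depth);
}
```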
4075 </details> 4076 <hal_details> 4077 </hal_details> 4078 </entry> 4079 <entry name="testPatternMode" type="int32" visibility="public" optional="true" 4080 enum="true"> 4081 <enum> 4082 <value>OFF 4083 <notes>Default. No test pattern mode is used, and the camera 4084 device returns captures from the image sensor.</notes> 4085 </value> 4086 <value>SOLID_COLOR 4087 <notes> 4088 Each pixel in `[R, G_even, G_odd, B]` is replaced by its 4089 respective color channel provided in 4090 android.sensor.testPatternData. 4091 4092 For example: 4093 4094 android.sensor.testPatternData = [0, 0xFFFFFFFF, 0xFFFFFFFF, 0] 4095 4096 All green pixels are 100% green. All red/blue pixels are black. 4097 4098 android.sensor.testPatternData = [0xFFFFFFFF, 0, 0xFFFFFFFF, 0] 4099 4100 All red pixels are 100% red. Only the odd green pixels 4101 are 100% green. All blue pixels are 100% black. 4102 </notes> 4103 </value> 4104 <value>COLOR_BARS 4105 <notes> 4106 All pixel data is replaced with an 8-bar color pattern. 4107 4108 The vertical bars (left-to-right) are as follows: 4109 4110 * 100% white 4111 * yellow 4112 * cyan 4113 * green 4114 * magenta 4115 * red 4116 * blue 4117 * black 4118 4119 In general the image would look like the following: 4120 4121 W Y C G M R B K 4122 W Y C G M R B K 4123 W Y C G M R B K 4124 W Y C G M R B K 4125 W Y C G M R B K 4126 . . . . . . . . 4127 . . . . . . . . 4128 . . . . . . . . 4129 4130 (B = Blue, K = Black) 4131 4132 Each bar should take up 1/8 of the sensor pixel array width. 4133 When this is not possible, the bar size should be rounded 4134 down to the nearest integer and the pattern can repeat 4135 on the right side. 4136 4137 Each bar's height must always take up the full sensor 4138 pixel array height. 4139 4140 Each pixel in this test pattern must be set to either 4141 0% intensity or 100% intensity. 
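The bar layout above can be sketched as follows (illustrative only; not part of the HAL interface):

```c
#include <stdint.h>

/* Illustrative sketch only: which of the 8 color bars column x falls
 * into. Each bar is floor(width / 8) pixels wide; if the width is not
 * divisible by 8, the 8-bar pattern repeats on the right side. */
static uint32_t color_bar_index(uint32_t x, uint32_t sensor_width) {
    uint32_t bar_width = sensor_width / 8;
    return (x / bar_width) % 8;
}
```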
            </notes>
            </value>
            <value>COLOR_BARS_FADE_TO_GRAY
            <notes>
            The test pattern is similar to COLOR_BARS, except that
            each bar should start at its specified color at the top,
            and fade to gray at the bottom.

            Furthermore, each bar is subdivided into a left and
            right half. The left half should have a smooth gradient,
            and the right half should have a quantized gradient.

            In particular, the right half should consist of blocks of the
            same color, each 1/16th of the active sensor pixel array width wide.

            The least significant bits in the quantized gradient should
            be copied from the most significant bits of the smooth gradient.

            The height of each bar should always be a multiple of 128.
            When this is not the case, the pattern should repeat at the bottom
            of the image.
            </notes>
            </value>
            <value>PN9
            <notes>
            All pixel data is replaced by a pseudo-random sequence
            generated from a PN9 512-bit sequence (typically implemented
            in hardware with a linear feedback shift register).

            The generator should be reset at the beginning of each frame,
            and thus each subsequent raw frame with this test pattern should
            be exactly the same as the last.
            </notes>
            </value>
            <value id="256">CUSTOM1
            <notes>The first custom test pattern. All custom patterns that are
            available only on this camera device are at least this numeric
            value.

            All of the custom test patterns will be static
            (that is, the raw image must not vary from frame to frame).
            </notes>
            </value>
          </enum>
          <description>When enabled, the sensor sends a test pattern instead of
          doing a real exposure from the camera.
          </description>
          <range>Optional. Defaults to OFF.
          Value must be one of
          android.sensor.availableTestPatternModes.</range>
          <details>
          When a test pattern is enabled, all manual sensor controls specified
          by android.sensor.* should be ignored. All other controls should
          work as normal.

          For example, if manual flash is enabled, flash firing should still
          occur (and the test pattern should remain unmodified, since the flash
          would not actually affect it).
          </details>
          <hal_details>
          All test patterns are specified in the Bayer domain.

          The HAL may choose to substitute test patterns from the sensor
          with test patterns from on-device memory. In that case, it should be
          indistinguishable to the ISP whether the data came from the
          sensor interconnect bus (such as CSI-2) or from memory.
          </hal_details>
        </entry>
      </controls>
      <dynamic>
        <clone entry="android.sensor.testPatternMode" kind="controls">
        </clone>
      </dynamic>
      <static>
        <entry name="availableTestPatternModes" type="byte" visibility="public"
               optional="true">
          <description>Optional. Defaults to [OFF]. Lists the supported test
          pattern modes for android.sensor.testPatternMode.
          </description>
          <range>Must include OFF.
          All custom modes must be &gt;= CUSTOM1.</range>
        </entry>

      </static>
    </section>
    <section name="shading">
      <controls>
        <entry name="mode" type="byte" visibility="hidden" enum="true">
          <enum>
            <value>OFF
            <notes>No lens shading correction is applied</notes></value>
            <value>FAST
            <notes>Must not slow down frame rate relative to sensor raw output</notes></value>
            <value>HIGH_QUALITY
            <notes>Frame rate may be reduced by high quality</notes></value>
          </enum>
          <description>Quality of lens shading correction applied
          to the image data.</description>
          <details>
          When set to OFF mode, no lens shading correction will be applied by the
          camera device, and an identity lens shading map will be provided
          if `android.statistics.lensShadingMapMode == ON`. For example, for a lens
          shading map with size specified as `android.lens.info.shadingMapSize = [ 4, 3 ]`,
          the output android.statistics.lensShadingMap for this case will be an identity
          map shown below:

            [ 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0,
              1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0,
              1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0,
              1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0,
              1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0,
              1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0 ]

          When set to other modes, lens shading correction will be applied by the
          camera device. Applications can request lens shading map data by setting
          android.statistics.lensShadingMapMode to ON, and then the camera device will provide
          lens shading map data in android.statistics.lensShadingMap, with size specified
          by android.lens.info.shadingMapSize.
          </details>
        </entry>
        <entry name="strength" type="byte">
          <description>Control the amount of shading correction
          applied to the images</description>
          <units>unitless: 1-10; 10 is full shading
          compensation</units>
          <tag id="ADV" />
        </entry>
      </controls>
      <dynamic>
        <clone entry="android.shading.mode" kind="controls">
        </clone>
      </dynamic>
    </section>
    <section name="statistics">
      <controls>
        <entry name="faceDetectMode" type="byte" visibility="public" enum="true">
          <enum>
            <value>OFF</value>
            <value>SIMPLE
            <notes>Optional. Return rectangle and confidence
            only</notes></value>
            <value>FULL
            <notes>Optional. Return all face
            metadata</notes></value>
          </enum>
          <description>State of the face detector
          unit</description>
          <range>
          android.statistics.info.availableFaceDetectModes</range>
          <details>Whether face detection is enabled, and whether it
          should output just the basic fields or the full set of
          fields.
          Value must be one of the
          android.statistics.info.availableFaceDetectModes.</details>
          <tag id="BC" />
        </entry>
        <entry name="histogramMode" type="byte" enum="true" typedef="boolean">
          <enum>
            <value>OFF</value>
            <value>ON</value>
          </enum>
          <description>Operating mode for histogram
          generation</description>
          <tag id="V1" />
        </entry>
        <entry name="sharpnessMapMode" type="byte" enum="true" typedef="boolean">
          <enum>
            <value>OFF</value>
            <value>ON</value>
          </enum>
          <description>Operating mode for sharpness map
          generation</description>
          <tag id="V1" />
        </entry>
      </controls>
      <static>
        <namespace name="info">
          <entry name="availableFaceDetectModes" type="byte"
                 visibility="public"
                 type_notes="List of enums from android.statistics.faceDetectMode"
                 container="array">
            <array>
              <size>n</size>
            </array>
            <description>Which face detection modes are available,
            if any</description>
            <units>List of enum:
            OFF
            SIMPLE
            FULL</units>
            <details>OFF means face detection is disabled; it must
            always be included in the list.

            SIMPLE means the device supports the
            android.statistics.faceRectangles and
            android.statistics.faceScores outputs.

            FULL means the device additionally supports the
            android.statistics.faceIds and
            android.statistics.faceLandmarks outputs.
            </details>
          </entry>
          <entry name="histogramBucketCount" type="int32">
            <description>Number of histogram buckets
            supported</description>
            <range>&gt;= 64</range>
          </entry>
          <entry name="maxFaceCount" type="int32" visibility="public" >
            <description>Maximum number of simultaneously detectable
            faces</description>
            <range>&gt;= 4 if availableFaceDetectModes lists
            modes besides OFF, otherwise 0</range>
          </entry>
          <entry name="maxHistogramCount" type="int32">
            <description>Maximum value possible for a histogram
            bucket</description>
          </entry>
          <entry name="maxSharpnessMapValue" type="int32">
            <description>Maximum value possible for a sharpness map
            region.</description>
          </entry>
          <entry name="sharpnessMapSize" type="int32"
                 type_notes="width x height" container="array" typedef="size">
            <array>
              <size>2</size>
            </array>
            <description>Dimensions of the sharpness
            map</description>
            <range>Must be at least 32 x 32</range>
          </entry>
        </namespace>
      </static>
      <dynamic>
        <clone entry="android.statistics.faceDetectMode"
               kind="controls"></clone>
        <entry name="faceIds" type="int32" visibility="hidden" container="array">
          <array>
            <size>n</size>
          </array>
          <description>List of unique IDs for detected
          faces</description>
          <details>Only available if faceDetectMode == FULL</details>
          <tag id="BC" />
        </entry>
        <entry name="faceLandmarks" type="int32" visibility="hidden"
               type_notes="(leftEyeX, leftEyeY, rightEyeX, rightEyeY, mouthX, mouthY)"
               container="array">
          <array>
            <size>n</size>
            <size>6</size>
          </array>
          <description>List of landmarks for detected
          faces</description>
          <details>Only available if faceDetectMode == FULL</details>
          <tag id="BC" />
        </entry>
        <entry name="faceRectangles" type="int32" visibility="hidden"
               type_notes="(xmin,
               ymin, xmax, ymax). (0,0) is top-left of active pixel area"
               container="array" typedef="rectangle">
          <array>
            <size>n</size>
            <size>4</size>
          </array>
          <description>List of the bounding rectangles for detected
          faces</description>
          <details>Only available if faceDetectMode != OFF</details>
          <tag id="BC" />
        </entry>
        <entry name="faceScores" type="byte" visibility="hidden" container="array">
          <array>
            <size>n</size>
          </array>
          <description>List of the face confidence scores for
          detected faces</description>
          <range>1-100</range>
          <details>Only available if faceDetectMode != OFF. The value should be
          meaningful (for example, setting 100 at all times is illegal).</details>
          <tag id="BC" />
        </entry>
        <entry name="histogram" type="int32"
               type_notes="count of pixels for each color channel that fall into each histogram bucket, scaled to be between 0 and maxHistogramCount"
               container="array">
          <array>
            <size>n</size>
            <size>3</size>
          </array>
          <description>A 3-channel histogram based on the raw
          sensor data</description>
          <details>The k'th bucket (0-based) covers the input range
          (with w = android.sensor.info.whiteLevel) of [ k * w / N,
          (k + 1) * w / N ). If only a monochrome histogram is
          supported, all channels should have the same data.</details>
          <tag id="V1" />
        </entry>
        <clone entry="android.statistics.histogramMode"
               kind="controls"></clone>
        <entry name="sharpnessMap" type="int32"
               type_notes="estimated sharpness for each region of the input image. Normalized to be between 0 and maxSharpnessMapValue.
               Higher values mean sharper (better focused)"
               container="array">
          <array>
            <size>n</size>
            <size>m</size>
            <size>3</size>
          </array>
          <description>A 3-channel sharpness map, based on the raw
          sensor data</description>
          <details>If only a monochrome sharpness map is supported,
          all channels should have the same data</details>
          <tag id="V1" />
        </entry>
        <clone entry="android.statistics.sharpnessMapMode"
               kind="controls"></clone>
        <entry name="lensShadingMap" type="float" visibility="public"
               type_notes="2D array of float gain factors per channel to correct lens shading"
               container="array">
          <array>
            <size>4</size>
            <size>n</size>
            <size>m</size>
          </array>
          <description>The shading map is a low-resolution floating-point map
          that lists the coefficients used to correct for vignetting, for each
          Bayer color channel.</description>
          <range>Each gain factor is &gt;= 1</range>
          <details>The least shaded section of the image should have a gain factor
          of 1; all other sections should have gains above 1.

          When android.colorCorrection.mode = TRANSFORM_MATRIX, the map
          must take into account the colorCorrection settings.

          The shading map is for the entire active pixel array, and is not
          affected by the crop region specified in the request. Each shading map
          entry is the value of the shading compensation map over a specific
          pixel on the sensor. Specifically, with a (N x M) resolution shading
          map, and an active pixel array size (W x H), shading map entry
          (x,y) ϵ (0 ... N-1, 0 ... M-1) is the value of the shading map at
          pixel ( ((W-1)/(N-1)) * x, ((H-1)/(M-1)) * y ) for the four color channels.
          The map is assumed to be bilinearly interpolated between the sample points.
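          The sampling rule above can be sketched as follows (a hypothetical
          helper for illustration only, not part of any camera API; assumes a
          single channel of an N x M map, a W x H active array with W, H > 1,
          and the pixel-to-map coordinate relation given above):

```python
def shading_gain(channel_map, N, M, W, H, px, py):
    """Bilinearly interpolate one channel of an N x M shading map
    (rows indexed [y][x]) at sensor pixel (px, py) of a W x H array."""
    # Invert the pixel mapping: map entry (x, y) sits at sensor pixel
    # ( ((W-1)/(N-1)) * x, ((H-1)/(M-1)) * y ).
    x = px * (N - 1) / (W - 1)
    y = py * (M - 1) / (H - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, N - 1), min(y0 + 1, M - 1)
    fx, fy = x - x0, y - y0
    top = channel_map[y0][x0] * (1 - fx) + channel_map[y0][x1] * fx
    bot = channel_map[y1][x0] * (1 - fx) + channel_map[y1][x1] * fx
    return top * (1 - fy) + bot * fy
```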
          The channel order is [R, Geven, Godd, B], where Geven is the green
          channel for the even rows of a Bayer pattern, and Godd is for the odd rows.
          The shading map is stored in a fully interleaved format, and its size
          is provided in the camera static metadata by android.lens.info.shadingMapSize.

          The shading map should have on the order of 30-40 rows and columns,
          and must be smaller than 64x64.

          As an example, given a very small map defined as:

            android.lens.info.shadingMapSize = [ 4, 3 ]
            android.statistics.lensShadingMap =
            [ 1.3, 1.2, 1.15, 1.2, 1.2, 1.2, 1.15, 1.2,
              1.1, 1.2, 1.2, 1.2, 1.3, 1.2, 1.3, 1.3,
              1.2, 1.2, 1.25, 1.1, 1.1, 1.1, 1.1, 1.0,
              1.0, 1.0, 1.0, 1.0, 1.2, 1.3, 1.25, 1.2,
              1.3, 1.2, 1.2, 1.3, 1.2, 1.15, 1.1, 1.2,
              1.2, 1.1, 1.0, 1.2, 1.3, 1.15, 1.2, 1.3 ]

          The low-resolution scaling map images for each channel are
          (displayed using nearest-neighbor interpolation):

          ![Red lens shading map](android.statistics.lensShadingMap/red_shading.png)
          ![Green (even rows) lens shading map](android.statistics.lensShadingMap/green_e_shading.png)
          ![Green (odd rows) lens shading map](android.statistics.lensShadingMap/green_o_shading.png)
          ![Blue lens shading map](android.statistics.lensShadingMap/blue_shading.png)

          As a visualization only, inverting the full-color map to recover an
          image of a gray wall (using bicubic interpolation for visual quality)
          as captured by the sensor gives:

          ![Image of a uniform white wall (inverse shading map)](android.statistics.lensShadingMap/inv_shading.png)
          </details>
        </entry>
        <entry name="predictedColorGains" type="float"
               visibility="hidden"
               optional="true"
               type_notes="A 1D array of floats for 4 color channel gains"
               container="array">
          <array>
            <size>4</size>
          </array>
          <description>The best-fit color channel gains calculated
          by the HAL's
          statistics units for the current output frame
          </description>
          <range>**Deprecated**. Do not use.</range>
          <details>
          This may be different from the gains used for this frame,
          since statistics processing on data from a new frame
          typically completes after the transform has already been
          applied to that frame.

          The 4 channel gains are defined in the Bayer domain;
          see android.colorCorrection.gains for details.

          This value should always be calculated by the AWB block,
          regardless of the current android.control.* values.
          </details>
        </entry>
        <entry name="predictedColorTransform" type="rational"
               visibility="hidden"
               optional="true"
               type_notes="3x3 rational matrix in row-major order"
               container="array">
          <array>
            <size>3</size>
            <size>3</size>
          </array>
          <description>The best-fit color transform matrix estimate
          calculated by the HAL's statistics units for the current
          output frame</description>
          <range>**Deprecated**. Do not use.</range>
          <details>The HAL must provide the estimate from its
          statistics unit on the white balance transforms to use
          for the next frame. These are the values the HAL believes
          are the best fit for the current output frame. This may
          be different from the transform used for this frame, since
          statistics processing on data from a new frame typically
          completes after the transform has already been applied to
          that frame.

          These estimates must be provided for all frames, even if
          capture settings and color transforms are set by the application.

          This value should always be calculated by the AWB block,
          regardless of the current android.control.* values.
          </details>
        </entry>
        <entry name="sceneFlicker" type="byte" visibility="public" enum="true">
          <enum>
            <value>NONE</value>
            <value>50HZ</value>
            <value>60HZ</value>
          </enum>
          <description>The camera device's estimate of the scene illumination
          lighting frequency.</description>
          <details>
          Many light sources, such as most fluorescent lights, flicker at a rate
          that depends on the local utility power standards. This flicker must be
          accounted for by auto-exposure routines to avoid artifacts in captured images.
          The camera device uses this entry to tell the application what the scene
          illuminant frequency is.

          When manual exposure control is enabled
          (`android.control.aeMode == OFF` or `android.control.mode == OFF`),
          android.control.aeAntibandingMode does not perform antibanding, and the
          application can ensure it selects exposure times that do not cause banding
          issues by consulting this metadata field. See android.control.aeAntibandingMode
          for more details.

          Report NONE if there doesn't appear to be flickering illumination.
          </details>
        </entry>
      </dynamic>
      <controls>
        <entry name="lensShadingMapMode" type="byte" visibility="public" enum="true">
          <enum>
            <value>OFF</value>
            <value>ON</value>
          </enum>
          <description>Whether the HAL needs to output the lens
          shading map in output result metadata</description>
          <details>When set to ON,
          android.statistics.lensShadingMap must be provided in
          the output result metadata.</details>
        </entry>
      </controls>
    </section>
    <section name="tonemap">
      <controls>
        <entry name="curveBlue" type="float" visibility="public"
               type_notes="1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints."
               container="array">
          <array>
            <size>n</size>
            <size>2</size>
          </array>
          <description>Tonemapping / contrast / gamma curve for the blue
          channel, to use when android.tonemap.mode is
          CONTRAST_CURVE.</description>
          <units>same as android.tonemap.curveRed</units>
          <range>same as android.tonemap.curveRed</range>
          <details>See android.tonemap.curveRed for more details.</details>
        </entry>
        <entry name="curveGreen" type="float" visibility="public"
               type_notes="1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints."
               container="array">
          <array>
            <size>n</size>
            <size>2</size>
          </array>
          <description>Tonemapping / contrast / gamma curve for the green
          channel, to use when android.tonemap.mode is
          CONTRAST_CURVE.</description>
          <units>same as android.tonemap.curveRed</units>
          <range>same as android.tonemap.curveRed</range>
          <details>See android.tonemap.curveRed for more details.</details>
        </entry>
        <entry name="curveRed" type="float" visibility="public"
               type_notes="1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints."
               container="array">
          <array>
            <size>n</size>
            <size>2</size>
          </array>
          <description>Tonemapping / contrast / gamma curve for the red
          channel, to use when android.tonemap.mode is
          CONTRAST_CURVE.</description>
          <range>0-1 on both input and output coordinates, normalized
          as a floating-point value such that 0 == black and 1 == white.
          </range>
          <details>
          Each channel's curve is defined by an array of control points:

            android.tonemap.curveRed =
              [ P0in, P0out, P1in, P1out, P2in, P2out, P3in, P3out, ..., PNin, PNout ]
            2 &lt;= N &lt;= android.tonemap.maxCurvePoints

          These are sorted in order of increasing `Pin`; it is always
          guaranteed that input values 0.0 and 1.0 are included in the list to
          define a complete mapping. For input values between control points,
          the camera device must linearly interpolate between the control
          points.

          Each curve can have an independent number of points, and the number
          of points can be less than max (that is, the request doesn't have to
          always provide a curve with number of points equivalent to
          android.tonemap.maxCurvePoints).

          A few examples, and their corresponding graphical mappings; these
          only specify the red channel and the precision is limited to 4
          digits, for conciseness.

          Linear mapping:

            android.tonemap.curveRed = [ 0, 0, 1.0, 1.0 ]

          ![Linear mapping curve](android.tonemap.curveRed/linear_tonemap.png)

          Invert mapping:

            android.tonemap.curveRed = [ 0, 1.0, 1.0, 0 ]

          ![Inverting mapping curve](android.tonemap.curveRed/inverse_tonemap.png)

          Gamma 1/2.2 mapping, with 16 control points:

            android.tonemap.curveRed = [
              0.0000, 0.0000, 0.0667, 0.2920, 0.1333, 0.4002, 0.2000, 0.4812,
              0.2667, 0.5484, 0.3333, 0.6069, 0.4000, 0.6594, 0.4667, 0.7072,
              0.5333, 0.7515, 0.6000, 0.7928, 0.6667, 0.8317, 0.7333, 0.8685,
              0.8000, 0.9035, 0.8667, 0.9370, 0.9333, 0.9691, 1.0000, 1.0000 ]

          ![Gamma = 1/2.2 tonemapping curve](android.tonemap.curveRed/gamma_tonemap.png)

          Standard sRGB gamma mapping, per IEC 61966-2-1:1999, with 16 control points:

            android.tonemap.curveRed = [
              0.0000, 0.0000, 0.0667, 0.2864, 0.1333, 0.4007, 0.2000, 0.4845,
              0.2667, 0.5532,
              0.3333, 0.6125, 0.4000, 0.6652, 0.4667, 0.7130,
              0.5333, 0.7569, 0.6000, 0.7977, 0.6667, 0.8360, 0.7333, 0.8721,
              0.8000, 0.9063, 0.8667, 0.9389, 0.9333, 0.9701, 1.0000, 1.0000 ]

          ![sRGB tonemapping curve](android.tonemap.curveRed/srgb_tonemap.png)
          </details>
          <hal_details>
          For good quality of mapping, at least 128 control points are
          preferred.

          A typical use case of this would be a gamma-1/2.2 curve, with as many
          control points used as are available.
          </hal_details>
          <tag id="DNG" />
        </entry>
        <entry name="mode" type="byte" visibility="public" enum="true">
          <enum>
            <value>CONTRAST_CURVE
            <notes>Use the tone mapping curve specified in
            android.tonemap.curve.

            All color enhancement and tonemapping must be disabled, except
            for applying the tonemapping curves specified by
            android.tonemap.curveRed, android.tonemap.curveGreen, and
            android.tonemap.curveBlue.

            Must not slow down frame rate relative to raw
            sensor output.
            </notes>
            </value>
            <value>FAST
            <notes>
            Advanced gamma mapping and color enhancement may be applied.

            Should not slow down frame rate relative to raw sensor output.
            </notes>
            </value>
            <value>HIGH_QUALITY
            <notes>
            Advanced gamma mapping and color enhancement may be applied.

            May slow down frame rate relative to raw sensor output.
            </notes>
            </value>
          </enum>
          <description>High-level global contrast/gamma/tonemapping control.
          </description>
          <details>
          When switching to an application-defined contrast curve by setting
          android.tonemap.mode to CONTRAST_CURVE, the curve is defined
          per-channel with a set of `(in, out)` points that specify the
          mapping from input high-bit-depth pixel value to the output
          low-bit-depth value.
          Since the actual pixel ranges of both input
          and output may change depending on the camera pipeline, the values
          are specified by normalized floating-point numbers.

          More-complex color mapping operations such as 3D color look-up
          tables, selective chroma enhancement, or other non-linear color
          transforms will be disabled when android.tonemap.mode is
          CONTRAST_CURVE.

          When using either FAST or HIGH_QUALITY, the camera device will
          emit its own tonemap curve in android.tonemap.curveRed,
          android.tonemap.curveGreen, and android.tonemap.curveBlue.
          These values are always available, and as close as possible to the
          actually used nonlinear/nonglobal transforms.

          If a request is sent with CONTRAST_CURVE with the camera device's
          provided curve in FAST or HIGH_QUALITY, the image's tonemap will be
          roughly the same.</details>
        </entry>
      </controls>
      <static>
        <entry name="maxCurvePoints" type="int32" visibility="public" >
          <description>Maximum number of supported points in the
          tonemap curve that can be used for android.tonemap.curveRed,
          android.tonemap.curveGreen, or android.tonemap.curveBlue.
          </description>
          <range>&gt;= 64</range>
          <details>
          If the actual number of points provided by the application (in
          android.tonemap.curve*) is less than max, the camera device will
          resample the curve to its internal representation, using linear
          interpolation.

          The output curves in the result metadata may have a different number
          of points than the input curves, and will represent the actual
          hardware curves used as closely as possible when linearly interpolated.
          </details>
          <hal_details>
          This value must be at least 64; at least 128 is recommended.
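          The linear interpolation used for resampling can be sketched as
          follows (a hypothetical helper for illustration only, not part of
          the HAL interface; assumes the flat `[P0in, P0out, ...]` layout,
          sorted by input, described for android.tonemap.curveRed):

```python
def resample_curve(curve, x):
    """Linearly interpolate a tonemap curve, given as a flat
    [P0in, P0out, P1in, P1out, ...] list sorted by input value."""
    pts = list(zip(curve[0::2], curve[1::2]))
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            if x1 == x0:
                return y0
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("input outside curve domain")

# Linear mapping from the examples above: output equals input.
assert resample_curve([0, 0, 1.0, 1.0], 0.25) == 0.25
```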
          </hal_details>
        </entry>
      </static>
      <dynamic>
        <clone entry="android.tonemap.curveBlue" kind="controls">
        </clone>
        <clone entry="android.tonemap.curveGreen" kind="controls">
        </clone>
        <clone entry="android.tonemap.curveRed" kind="controls">
        </clone>
        <clone entry="android.tonemap.mode" kind="controls">
        </clone>
      </dynamic>
    </section>
    <section name="led">
      <controls>
        <entry name="transmit" type="byte" visibility="hidden" enum="true"
               typedef="boolean">
          <enum>
            <value>OFF</value>
            <value>ON</value>
          </enum>
          <description>This LED is nominally used to indicate to the user
          that the camera is powered on and may be streaming images back to the
          Application Processor. In certain rare circumstances, the OS may
          disable this when video is processed locally and not transmitted to
          any untrusted applications.

          In particular, the LED *must* always be on when the data could be
          transmitted off the device. The LED *should* always be on whenever
          data is stored locally on the device.

          The LED *may* be off if a trusted application is using the data in a
          way that doesn't violate the above rules.
          </description>
        </entry>
      </controls>
      <dynamic>
        <clone entry="android.led.transmit" kind="controls"></clone>
      </dynamic>
      <static>
        <entry name="availableLeds" type="byte" visibility="hidden" enum="true"
               container="array">
          <array>
            <size>n</size>
          </array>
          <enum>
            <value>TRANSMIT
            <notes>android.led.transmit control is used</notes>
            </value>
          </enum>
          <description>A list of camera LEDs that are available on this system.
          </description>
        </entry>
      </static>
    </section>
    <section name="info">
      <static>
        <entry name="supportedHardwareLevel" type="byte" visibility="public"
               enum="true" >
          <enum>
            <value>LIMITED</value>
            <value>FULL</value>
          </enum>
          <description>
          Generally classifies the overall set of the camera device functionality.
          </description>
          <range>Optional. Default value is LIMITED.</range>
          <details>
          Camera devices come in two flavors: LIMITED and FULL.

          A FULL device has the most support possible and enables the
          widest range of use cases, such as:

          * 30 FPS at maximum resolution (== sensor resolution)
          * Per-frame control
          * Manual sensor control
          * Zero Shutter Lag (ZSL)

          A LIMITED device may have some or none of the above characteristics.
          To find out more, refer to android.request.availableCapabilities.
          </details>
          <hal_details>
          The camera 3 HAL device can implement one of two possible
          operational modes: limited and full. Full support is
          expected from new higher-end devices. Limited mode has
          hardware requirements roughly in line with those for a
          camera HAL device v1 implementation, and is expected from
          older or inexpensive devices. Full is a strict superset of
          limited, and they share the same essential operational flow.

          For full details refer to "S3.
          Operational Modes" in camera3.h.
          </hal_details>
        </entry>
      </static>
    </section>
    <section name="blackLevel">
      <controls>
        <entry name="lock" type="byte" visibility="public" enum="true"
               typedef="boolean">
          <enum>
            <value>OFF</value>
            <value>ON</value>
          </enum>
          <description>Whether black-level compensation is locked
          to its current values, or is free to vary.</description>
          <details>When set to ON, the values used for black-level
          compensation will not change until the lock is set to
          OFF.

          Since changes to certain capture parameters (such as
          exposure time) may require resetting of black level
          compensation, the camera device must report whether setting
          the black level lock was successful in the output result
          metadata.

          For example, if a sequence of requests is as follows:

          * Request 1: Exposure = 10ms, Black level lock = OFF
          * Request 2: Exposure = 10ms, Black level lock = ON
          * Request 3: Exposure = 10ms, Black level lock = ON
          * Request 4: Exposure = 20ms, Black level lock = ON
          * Request 5: Exposure = 20ms, Black level lock = ON
          * Request 6: Exposure = 20ms, Black level lock = ON

          And the exposure change in Request 4 requires the camera
          device to reset the black level offsets, then the output
          result metadata is expected to be:

          * Result 1: Exposure = 10ms, Black level lock = OFF
          * Result 2: Exposure = 10ms, Black level lock = ON
          * Result 3: Exposure = 10ms, Black level lock = ON
          * Result 4: Exposure = 20ms, Black level lock = OFF
          * Result 5: Exposure = 20ms, Black level lock = ON
          * Result 6: Exposure = 20ms, Black level lock = ON

          This indicates to the application that on frame 4, black
          levels were reset due to exposure value changes, and pixel
          values may not be consistent across captures.
          The camera device will maintain the lock to the extent
          possible, only overriding the lock to OFF when changes to
          other request parameters require a black level recalculation
          or reset.
          </details>
          <hal_details>
          If for some reason black level locking is no longer possible
          (for example, the analog gain has changed, which forces
          black level offsets to be recalculated), then the HAL must
          override this request (and it must report 'OFF' when this
          does happen) until the next capture for which locking is
          possible again.</hal_details>
          <tag id="HAL2" />
        </entry>
      </controls>
      <dynamic>
        <clone entry="android.blackLevel.lock"
               kind="controls">
          <details>
          Whether the black level offset was locked for this frame. Should be
          ON if android.blackLevel.lock was ON in the capture request, unless
          a change in other capture settings forced the camera device to
          perform a black level reset.
          </details>
        </clone>
      </dynamic>
    </section>
    <section name="sync">
      <dynamic>
        <entry name="frameNumber" type="int64" visibility="hidden" enum="true">
          <enum>
            <value id="-1">CONVERGING
            <notes>
            The current result is not yet fully synchronized to any request.
            Synchronization is in progress, and reading metadata from this
            result may include a mix of data that have taken effect since the
            last synchronization time.

            In some future result, within android.sync.maxLatency frames,
            this value will update to the actual frame number
            the result is guaranteed to be synchronized to (as long as the
            request settings remain constant).
            </notes>
            </value>
            <value id="-2">UNKNOWN
            <notes>
            The current result's synchronization status is unknown. The
            result may have already converged, or it may be in progress.
              Reading from this result may include some mix of settings from
              past requests.

              After a settings change, the new settings will eventually all
              take effect for the output buffers and results. However, this
              value will not change when that happens. Rapidly altering
              settings may therefore produce results that mix settings from
              several recent requests.

              This value is intended primarily for backwards compatibility with
              older camera implementations (android.hardware.Camera).
              </notes>
            </value>
          </enum>
          <description>The frame number corresponding to the last request
          with which the output result (metadata + buffers) has been fully
          synchronized.</description>
          <range>Either a non-negative value corresponding to a
          `frame_number`, or one of the two enums (CONVERGING / UNKNOWN).
          </range>
          <details>
          When a request is submitted to the camera device, there is usually a
          delay of several frames before the controls get applied. A camera
          device may either choose to account for this delay by implementing a
          pipeline and carefully submitting well-timed atomic control updates,
          or it may start streaming control changes that span several frame
          boundaries.

          In the latter case, whenever a request's settings change relative to
          the previously submitted request, the full set of changes may take
          multiple frame durations to fully take effect. Some settings may
          take effect sooner (in fewer frame durations) than others.

          While a set of control changes is being propagated, this value
          will be CONVERGING.

          Once it is fully known that a set of control changes has
          finished propagating, and the resulting updated control settings
          have been read back by the camera device, this value will be set
          to the non-negative frame number of the request to which the
          results have synchronized.
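          Since this entry packs the two sentinel values and the non-negative
          frame numbers into a single int64, a consumer of the result metadata
          has to branch on them. A minimal plain-Java sketch (not the camera2
          API; the constants mirror the enum ids -1 and -2 defined above, and
          the message strings are illustrative):

          ```java
          public class SyncFrameNumber {
              // Sentinel values, matching the enum ids defined for this entry.
              static final long CONVERGING = -1;
              static final long UNKNOWN = -2;

              // Classifies a result's android.sync.frameNumber value.
              static String describe(long frameNumber) {
                  if (frameNumber == CONVERGING) {
                      return "converging: settings still propagating";
                  } else if (frameNumber == UNKNOWN) {
                      return "unknown: device cannot report sync status";
                  } else if (frameNumber >= 0) {
                      return "synchronized to request " + frameNumber;
                  }
                  throw new IllegalArgumentException("invalid frame number: " + frameNumber);
              }

              public static void main(String[] args) {
                  // Prints "converging: settings still propagating"
                  System.out.println(describe(-1));
                  // Prints "synchronized to request 42"
                  System.out.println(describe(42));
              }
          }
          ```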
          Older camera device implementations may not have a way to detect
          when all camera controls have been applied, and will always set this
          value to UNKNOWN.

          FULL capability devices will always have this value set to the
          frame number of the request corresponding to this result.

          _Further details_:

          * Whenever a request differs from the last request, any future
          results not yet returned may have this value set to CONVERGING (this
          could include any in-progress captures not yet returned by the camera
          device; for more details see the pipeline considerations below).
          * Submitting a series of multiple requests that differ from the
          previous request (e.g. r1, r2, r3 s.t. r1 != r2 != r3)
          moves the new synchronization frame to the last non-repeating
          request (using the smallest frame number from the contiguous list of
          repeating requests).
          * Submitting the same request repeatedly will not change this value
          to CONVERGING if it was already a non-negative value.
          * When this value changes to non-negative, all of the metadata
          controls from the request have been applied, all of the metadata
          controls from the camera device have been read back to the updated
          values (into the result), and all of the graphics buffers
          corresponding to this result are also synchronized to the request.

          _Pipeline considerations_:

          Submitting a request with updated controls relative to the previously
          submitted requests may also invalidate the synchronization state
          of all the results corresponding to currently in-flight requests.

          In other words, results for this current request and up to
          android.request.pipelineMaxDepth prior requests may have their
          android.sync.frameNumber change to CONVERGING.
          </details>
          <hal_details>
          Using UNKNOWN here is illegal unless android.sync.maxLatency
          is also UNKNOWN.
          FULL capability devices should simply set this value to the
          `frame_number` of the request this result corresponds to.
          </hal_details>
          <tag id="LIMITED" />
        </entry>
      </dynamic>
      <static>
        <entry name="maxLatency" type="int32" visibility="public" enum="true">
          <enum>
            <value id="0">PER_FRAME_CONTROL
              <notes>
              Every frame has the requests immediately applied
              (and furthermore, for all results,
              `android.sync.frameNumber == android.request.frameCount`).

              Changing controls over multiple requests one after another will
              produce results that have those controls applied atomically
              each frame.

              All FULL capability devices will have this as their maxLatency.
              </notes>
            </value>
            <value id="-1">UNKNOWN
              <notes>
              Each new frame has some subset (potentially the entire set)
              of the past requests applied to the camera settings.

              By submitting a series of identical requests, the camera device
              will eventually have the camera settings applied, but it is
              unknown exactly when that will happen.
              </notes>
            </value>
          </enum>
          <description>
          The maximum number of frames that can occur after a request
          (different from the previous one) has been submitted, and before the
          result's state becomes synchronized (by setting
          android.sync.frameNumber to a non-negative value).
          </description>
          <units>number of processed requests</units>
          <range>&gt;= -1</range>
          <details>
          This defines the maximum distance (in number of metadata results)
          between android.sync.frameNumber and the equivalent
          android.request.frameCount.

          In other words, this acts as an upper bound on how many frames
          must occur before the camera device knows for a fact that the newly
          submitted camera settings have been applied to outgoing frames.
          For example, if the distance was 2:

              initial request = X (repeating)
              request1 = X
              request2 = Y
              request3 = Y
              request4 = Y

          where requestN has frameNumber N, and the first of the repeating
          initial requests has frameNumber F (and F &lt; 1), then:

              initial result = X' + { android.sync.frameNumber == F }
              result1 = X' + { android.sync.frameNumber == F }
              result2 = X' + { android.sync.frameNumber == CONVERGING }
              result3 = X' + { android.sync.frameNumber == CONVERGING }
              result4 = X' + { android.sync.frameNumber == 2 }

          where resultN has frameNumber N.

          Since `result4` has `frameNumber == 4` and
          `android.sync.frameNumber == 2`, the distance is
          `4 - 2 = 2`.
          </details>
          <hal_details>
          Use `frame_count` from camera3_request_t instead of
          android.request.frameCount.

          LIMITED devices are strongly encouraged to use a non-negative
          value. If UNKNOWN is used here, then app developers have no way
          to know when sensor settings have been applied.
          </hal_details>
          <tag id="LIMITED" />
        </entry>
      </static>
    </section>
  </namespace>
</metadata>