metadata_properties.xml revision b432916043290beb246054a77f8978b3136f4315
<?xml version="1.0" encoding="utf-8"?>
<!-- Copyright (C) 2012 The Android Open Source Project

     Licensed under the Apache License, Version 2.0 (the "License");
     you may not use this file except in compliance with the License.
     You may obtain a copy of the License at

         http://www.apache.org/licenses/LICENSE-2.0

     Unless required by applicable law or agreed to in writing, software
     distributed under the License is distributed on an "AS IS" BASIS,
     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     See the License for the specific language governing permissions and
     limitations under the License.
-->
<metadata xmlns="http://schemas.android.com/service/camera/metadata/"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://schemas.android.com/service/camera/metadata/ metadata_properties.xsd">

  <tags>
    <tag id="BC">
        Needed for backwards compatibility with old Java API
    </tag>
    <tag id="V1">
        New features for first camera 2 release (API1)
    </tag>
    <tag id="DNG">
        Needed for DNG file support
    </tag>
    <tag id="HAL2">
        Entry is only used by camera device HAL 2.x
    </tag>
    <tag id="FULL">
        Entry is required for full hardware level devices, and optional for other hardware levels
    </tag>
    <tag id="FUTURE">
        Entry is under-specified and is not required for now. This is for book-keeping purposes;
        do not implement or use it, as it may be revised in the future.
    </tag>
  </tags>

  <types>
    <typedef name="pairFloatFloat">
      <language name="java">android.util.Pair&lt;Float,Float&gt;</language>
    </typedef>
    <typedef name="rectangle">
      <language name="java">android.graphics.Rect</language>
    </typedef>
    <typedef name="size">
      <language name="java">android.util.Size</language>
    </typedef>
    <typedef name="string">
      <language name="java">String</language>
    </typedef>
    <typedef name="boolean">
      <language name="java">boolean</language>
    </typedef>
    <typedef name="imageFormat">
      <language name="java">int</language>
    </typedef>
    <typedef name="streamConfigurationMap">
      <language name="java">android.hardware.camera2.params.StreamConfigurationMap</language>
    </typedef>
    <typedef name="streamConfiguration">
      <language name="java">android.hardware.camera2.params.StreamConfiguration</language>
    </typedef>
    <typedef name="streamConfigurationDuration">
      <language name="java">android.hardware.camera2.params.StreamConfigurationDuration</language>
    </typedef>
    <typedef name="face">
      <language name="java">android.hardware.camera2.params.Face</language>
    </typedef>
    <typedef name="meteringRectangle">
      <language name="java">android.hardware.camera2.params.MeteringRectangle</language>
    </typedef>
    <typedef name="rangeFloat">
      <language name="java">android.util.Range&lt;Float&gt;</language>
    </typedef>
    <typedef name="rangeInt">
      <language name="java">android.util.Range&lt;Integer&gt;</language>
    </typedef>
    <typedef name="rangeLong">
      <language name="java">android.util.Range&lt;Long&gt;</language>
    </typedef>
    <typedef name="colorSpaceTransform">
      <language name="java">android.hardware.camera2.params.ColorSpaceTransform</language>
    </typedef>
    <typedef name="rggbChannelVector">
      <language name="java">android.hardware.camera2.params.RggbChannelVector</language>
    </typedef>
    <typedef name="enumList">
      <language name="java">int</language>
    </typedef>
    <typedef name="sizeF">
      <language name="java">android.util.SizeF</language>
    </typedef>
    <typedef name="point">
      <language name="java">android.graphics.Point</language>
    </typedef>
    <typedef name="tonemapCurve">
      <language name="java">android.hardware.camera2.params.TonemapCurve</language>
    </typedef>
    <typedef name="lensShadingMap">
      <language name="java">android.hardware.camera2.params.LensShadingMap</language>
    </typedef>
    <typedef name="location">
      <language name="java">android.location.Location</language>
    </typedef>
  </types>

  <namespace name="android">
    <section name="colorCorrection">
      <controls>
        <entry name="mode" type="byte" visibility="public" enum="true">
          <enum>
            <value>TRANSFORM_MATRIX
              <notes>Use the android.colorCorrection.transform matrix
              and android.colorCorrection.gains to do color conversion.

              All advanced white balance adjustments (not specified
              by our white balance pipeline) must be disabled.

              If AWB is enabled with `android.control.awbMode != OFF`, then
              TRANSFORM_MATRIX is ignored. The camera device will override
              this value to either FAST or HIGH_QUALITY.
              </notes>
            </value>
            <value>FAST
              <notes>Color correction processing must not slow down
              capture rate relative to sensor raw output.

              Advanced white balance adjustments above and beyond
              the specified white balance pipeline may be applied.

              If AWB is enabled with `android.control.awbMode != OFF`, then
              the camera device uses the last frame's AWB values
              (or defaults if AWB has never been run).
              </notes>
            </value>
            <value>HIGH_QUALITY
              <notes>Color correction processing operates at improved
              quality but reduced capture rate (relative to sensor raw
              output).

              Advanced white balance adjustments above and beyond
              the specified white balance pipeline may be applied.

              If AWB is enabled with `android.control.awbMode != OFF`, then
              the camera device uses the last frame's AWB values
              (or defaults if AWB has never been run).
              </notes>
            </value>
          </enum>

          <description>
          The mode control selects how the image data is converted from the
          sensor's native color into linear sRGB color.
          </description>
          <details>
          When auto-white balance (AWB) is enabled with android.control.awbMode, this
          control is overridden by the AWB routine. When AWB is disabled, the
          application controls how the color mapping is performed.

          We define the expected processing pipeline below. For consistency
          across devices, this is always the case with TRANSFORM_MATRIX.

          When either FAST or HIGH_QUALITY is used, the camera device may
          do additional processing, but android.colorCorrection.gains and
          android.colorCorrection.transform will still be provided by the
          camera device (in the results) and be roughly correct.

          Switching to TRANSFORM_MATRIX and using the data provided from
          FAST or HIGH_QUALITY will yield a picture with the same white point
          as what was produced by the camera device in the earlier frame.

          The expected processing pipeline is as follows:

          ![White balance processing pipeline](android.colorCorrection.mode/processing_pipeline.png)

          The white balance is encoded by two values, a 4-channel white-balance
          gain vector (applied in the Bayer domain), and a 3x3 color transform
          matrix (applied after demosaic).

          The 4-channel white-balance gains are defined as:

              android.colorCorrection.gains = [ R G_even G_odd B ]

          where `G_even` is the gain for green pixels on even rows of the
          output, and `G_odd` is the gain for green pixels on the odd rows.
          These may be identical for a given camera device implementation; if
          the camera device does not support a separate gain for even/odd green
          channels, it will use the `G_even` value, and write `G_odd` equal to
          `G_even` in the output result metadata.

          The matrices for color transforms are defined as a 9-entry vector:

              android.colorCorrection.transform = [ I0 I1 I2 I3 I4 I5 I6 I7 I8 ]

          which define a transform from input sensor colors, `P_in = [ r g b ]`,
          to output linear sRGB, `P_out = [ r' g' b' ]`,

          with colors as follows:

              r' = I0r + I1g + I2b
              g' = I3r + I4g + I5b
              b' = I6r + I7g + I8b

          Both the input and output value ranges must match. Overflow/underflow
          values are clipped to fit within the range.
          </details>
        </entry>
        <entry name="transform" type="rational" visibility="public"
               type_notes="3x3 rational matrix in row-major order"
               container="array" typedef="colorSpaceTransform">
          <array>
            <size>3</size>
            <size>3</size>
          </array>
          <description>A color transform matrix to use to transform
          from sensor RGB color space to output linear sRGB color space.
          </description>
          <details>This matrix is either set by the camera device when the request
          android.colorCorrection.mode is not TRANSFORM_MATRIX, or
          directly by the application in the request when the
          android.colorCorrection.mode is TRANSFORM_MATRIX.

          In the latter case, the camera device may round the matrix to account
          for precision issues; the final rounded matrix should be reported back
          in this matrix result metadata. The transform should keep the magnitude
          of the output color values within `[0, 1.0]` (assuming input color
          values are within the normalized range `[0, 1.0]`), or clipping may occur.
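          The gain and matrix arithmetic described for this pipeline can be
          sketched in plain Java. This is an illustrative sketch only, not part
          of the metadata definition or any camera API; the class and method
          names are hypothetical.

          ```java
          // Illustrative sketch only: applies the 4-channel white-balance gains and a
          // row-major 3x3 color transform to one pixel, clipping output to [0, 1.0].
          // Class and method names are hypothetical, not part of any camera API.
          public class ColorPipelineSketch {
              // gains = [R, G_even, G_odd, B]; the green gain used depends on row parity.
              static double[] applyGains(double[] gains, double r, double g, double b,
                                         boolean evenRow) {
                  double gGain = evenRow ? gains[1] : gains[2];
                  return new double[] {r * gains[0], g * gGain, b * gains[3]};
              }

              // m = [I0..I8] in row-major order; returns clip(m * p) per channel.
              static double[] applyTransform(double[] m, double[] p) {
                  double[] out = new double[3];
                  for (int row = 0; row != 3; row++) {
                      double v = m[3 * row] * p[0] + m[3 * row + 1] * p[1]
                               + m[3 * row + 2] * p[2];
                      out[row] = Math.max(0.0, Math.min(1.0, v)); // clip overflow/underflow
                  }
                  return out;
              }

              public static void main(String[] args) {
                  double[] gains = {2.0, 1.0, 1.0, 1.5};       // example [R G_even G_odd B]
                  double[] identity = {1, 0, 0, 0, 1, 0, 0, 0, 1};
                  double[] p = applyGains(gains, 0.25, 0.5, 0.4, true);
                  double[] out = applyTransform(identity, p);
                  System.out.println(out[0] + " " + out[1] + " " + out[2]);
              }
          }
          ```

          Note the clipping step mirrors the overflow/underflow behavior
          required above.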
          </details>
        </entry>
        <entry name="gains" type="float" visibility="public"
               type_notes="A 1D array of floats for 4 color channel gains"
               container="array" typedef="rggbChannelVector">
          <array>
            <size>4</size>
          </array>
          <description>Gains applying to Bayer raw color channels for
          white-balance.</description>
          <details>
          These per-channel gains are either set by the camera device
          when the request android.colorCorrection.mode is not
          TRANSFORM_MATRIX, or directly by the application in the
          request when the android.colorCorrection.mode is
          TRANSFORM_MATRIX.

          The gains in the result metadata are the gains actually
          applied by the camera device to the current frame.
          </details>
          <hal_details>
          The 4-channel white-balance gains are defined in
          the order of `[R G_even G_odd B]`, where `G_even` is the gain
          for green pixels on even rows of the output, and `G_odd`
          is the gain for green pixels on the odd rows.

          If a HAL does not support a separate gain for even/odd green
          channels, it must use the `G_even` value, and write
          `G_odd` equal to `G_even` in the output result metadata.
          </hal_details>
        </entry>
      </controls>
      <dynamic>
        <clone entry="android.colorCorrection.mode" kind="controls">
        </clone>
        <clone entry="android.colorCorrection.transform" kind="controls">
        </clone>
        <clone entry="android.colorCorrection.gains" kind="controls">
        </clone>
      </dynamic>
    </section>
    <section name="control">
      <controls>
        <entry name="aeAntibandingMode" type="byte" visibility="public"
               enum="true">
          <enum>
            <value>OFF
              <notes>
              The camera device will not adjust exposure duration to
              avoid banding problems.
              </notes>
            </value>
            <value>50HZ
              <notes>
              The camera device will adjust exposure duration to
              avoid banding problems with 50Hz illumination sources.
              </notes>
            </value>
            <value>60HZ
              <notes>
              The camera device will adjust exposure duration to
              avoid banding problems with 60Hz illumination
              sources.
              </notes>
            </value>
            <value>AUTO
              <notes>
              The camera device will automatically adapt its
              antibanding routine to the current illumination
              conditions. This is the default.
              </notes>
            </value>
          </enum>
          <description>
          The desired setting for the camera device's auto-exposure
          algorithm's antibanding compensation.
          </description>
          <range>
          android.control.aeAvailableAntibandingModes
          </range>
          <details>
          Some kinds of lighting fixtures, such as some fluorescent
          lights, flicker at the rate of the power supply frequency
          (60Hz or 50Hz, depending on country). While this is
          typically not noticeable to a person, it can be visible to
          a camera device. If a camera sets its exposure time to the
          wrong value, the flicker may become visible in the
          viewfinder as flicker, or in a final captured image as a
          set of variable-brightness bands across the image.

          Therefore, the auto-exposure routines of camera devices
          include antibanding routines that ensure that the chosen
          exposure value will not cause such banding. The choice of
          exposure time depends on the rate of flicker, which the
          camera device can detect automatically, or the expected
          rate can be selected by the application using this
          control.

          A given camera device may not support all of the possible
          options for the antibanding mode. The
          android.control.aeAvailableAntibandingModes key contains
          the available modes for a given camera device.

          The default mode is AUTO, which must be supported by all
          camera devices.

          If manual exposure control is enabled (by setting
          android.control.aeMode or android.control.mode to OFF),
          then this setting has no effect, and the application must
          ensure it selects exposure times that do not cause banding
          issues. The android.statistics.sceneFlicker key can assist
          the application in this.
          </details>
          <hal_details>
          For all capture request templates, this field must be set
          to AUTO. AUTO is the only mode that must be supported;
          OFF, 50HZ, and 60HZ are all optional.

          If manual exposure control is enabled (by setting
          android.control.aeMode or android.control.mode to OFF),
          then the exposure values provided by the application must not be
          adjusted for antibanding.
          </hal_details>
          <tag id="BC" />
        </entry>
        <entry name="aeExposureCompensation" type="int32" visibility="public">
          <description>Adjustment to auto-exposure (AE) target image
          brightness.</description>
          <units>count of positive/negative EV steps</units>
          <range>android.control.aeCompensationRange</range>
          <details>
          The adjustment is measured as a count of steps, with the
          step size defined by android.control.aeCompensationStep and the
          allowed range by android.control.aeCompensationRange.

          For example, if the exposure value (EV) step is 0.333, '6'
          will mean an exposure compensation of +2 EV; -3 will mean an
          exposure compensation of -1 EV. One EV represents a doubling
          of image brightness. Note that this control will only be
          effective if android.control.aeMode `!=` OFF. This control
          will take effect even when android.control.aeLock `== true`.

          If the exposure compensation value is changed, the camera device
          may take several frames to reach the newly requested exposure target.
          During that time, the android.control.aeState field will be in the
          SEARCHING state. Once the new exposure target is reached,
          android.control.aeState will change from SEARCHING to either
          CONVERGED, LOCKED (if AE lock is enabled), or FLASH_REQUIRED (if
          the scene is too dark for still capture).
          </details>
          <tag id="BC" />
        </entry>
        <entry name="aeLock" type="byte" visibility="public" enum="true"
               typedef="boolean">
          <enum>
            <value>OFF
              <notes>Auto-exposure lock is disabled; the AE algorithm
              is free to update its parameters.</notes></value>
            <value>ON
              <notes>Auto-exposure lock is enabled; the AE algorithm
              must not update the exposure and sensitivity parameters
              while the lock is active.

              android.control.aeExposureCompensation setting changes
              will still take effect while auto-exposure is locked.
              </notes></value>
          </enum>
          <description>Whether auto-exposure (AE) is currently locked to its latest
          calculated values.</description>
          <details>Note that even when AE is locked, the flash may be
          fired if the android.control.aeMode is ON_AUTO_FLASH / ON_ALWAYS_FLASH /
          ON_AUTO_FLASH_REDEYE.

          When android.control.aeExposureCompensation is changed, even if the AE lock
          is ON, the camera device will still adjust its exposure value.

          If AE precapture is triggered (see android.control.aePrecaptureTrigger)
          when AE is already locked, the camera device will not change the exposure time
          (android.sensor.exposureTime) and sensitivity (android.sensor.sensitivity)
          parameters. The flash may be fired if the android.control.aeMode
          is ON_AUTO_FLASH/ON_AUTO_FLASH_REDEYE and the scene is too dark. If the
          android.control.aeMode is ON_ALWAYS_FLASH, the scene may become overexposed.

          See android.control.aeState for AE lock related state transition details.
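          The EV-step arithmetic behind android.control.aeExposureCompensation
          (which, as noted above, still applies under AE lock) can be checked
          with a small sketch. This is illustrative only; the 1/3 step size is
          just an example value and the names are hypothetical.

          ```java
          // Illustrative sketch only: exposure compensation is steps times the step
          // size, and each EV represents a doubling of image brightness.
          // Names are hypothetical, not part of any camera API.
          public class EvSketch {
              static double evFromSteps(int steps, double stepSize) {
                  return steps * stepSize;
              }

              static double brightnessFactor(double ev) {
                  return Math.pow(2.0, ev); // one EV doubles brightness
              }

              public static void main(String[] args) {
                  double step = 1.0 / 3.0; // example value of aeCompensationStep
                  System.out.println(evFromSteps(6, step));  // +2 EV
                  System.out.println(brightnessFactor(2.0)); // 4x brighter
              }
          }
          ```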
          </details>
          <tag id="BC" />
        </entry>
        <entry name="aeMode" type="byte" visibility="public" enum="true">
          <enum>
            <value>OFF
              <notes>
              The camera device's autoexposure routine is disabled.

              The application-selected android.sensor.exposureTime,
              android.sensor.sensitivity and
              android.sensor.frameDuration are used by the camera
              device, along with android.flash.* fields, if there's
              a flash unit for this camera device.
              </notes>
            </value>
            <value>ON
              <notes>
              The camera device's autoexposure routine is active,
              with no flash control.

              The application's values for
              android.sensor.exposureTime,
              android.sensor.sensitivity, and
              android.sensor.frameDuration are ignored. The
              application has control over the various
              android.flash.* fields.
              </notes>
            </value>
            <value>ON_AUTO_FLASH
              <notes>
              Like ON, except that the camera device also controls
              the camera's flash unit, firing it in low-light
              conditions.

              The flash may be fired during a precapture sequence
              (triggered by android.control.aePrecaptureTrigger) and
              may be fired for captures for which the
              android.control.captureIntent field is set to
              STILL_CAPTURE.
              </notes>
            </value>
            <value>ON_ALWAYS_FLASH
              <notes>
              Like ON, except that the camera device also controls
              the camera's flash unit, always firing it for still
              captures.

              The flash may be fired during a precapture sequence
              (triggered by android.control.aePrecaptureTrigger) and
              will always be fired for captures for which the
              android.control.captureIntent field is set to
              STILL_CAPTURE.
              </notes>
            </value>
            <value>ON_AUTO_FLASH_REDEYE
              <notes>
              Like ON_AUTO_FLASH, but with automatic red eye
              reduction.

              If deemed necessary by the camera device, a red eye
              reduction flash will fire during the precapture
              sequence.
              </notes>
            </value>
          </enum>
          <description>The desired mode for the camera device's
          auto-exposure routine.</description>
          <range>android.control.aeAvailableModes</range>
          <details>
          This control is only effective if android.control.mode is
          AUTO.

          When set to any of the ON modes, the camera device's
          auto-exposure routine is enabled, overriding the
          application's selected exposure time, sensor sensitivity,
          and frame duration (android.sensor.exposureTime,
          android.sensor.sensitivity, and
          android.sensor.frameDuration). If one of the FLASH modes
          is selected, the camera device's flash unit controls are
          also overridden.

          The FLASH modes are only available if the camera device
          has a flash unit (android.flash.info.available is `true`).

          If flash TORCH mode is desired, this field must be set to
          ON or OFF, and android.flash.mode set to TORCH.

          When set to any of the ON modes, the values chosen by the
          camera device auto-exposure routine for the overridden
          fields for a given capture will be available in its
          CaptureResult.
          </details>
          <tag id="BC" />
        </entry>
        <entry name="aeRegions" type="int32" visibility="public"
               container="array" typedef="meteringRectangle">
          <array>
            <size>5</size>
            <size>area_count</size>
          </array>
          <description>List of areas to use for
          metering.</description>
          <range>`area_count &lt;= android.control.maxRegions[0]`</range>
          <details>
          The coordinate system is based on the active pixel array,
          with (0,0) being the top-left pixel in the active pixel array, and
          (android.sensor.info.activeArraySize.width - 1,
          android.sensor.info.activeArraySize.height - 1) being the
          bottom-right pixel in the active pixel array.

          The weight must range from 0 to 1000, and represents a weight
          for every pixel in the area. This means that a large metering area
          with the same weight as a smaller area will have more effect in
          the metering result. Metering areas can partially overlap, and the
          camera device will add the weights in the overlap region.

          If all regions have 0 weight, then no specific metering area
          needs to be used by the camera device. If the metering region is
          outside the used android.scaler.cropRegion returned in capture result metadata,
          the camera device will ignore the sections outside the region and output the
          used sections in the result metadata.
          </details>
          <hal_details>
          The HAL level representation of MeteringRectangle[] is a
          int[5 * area_count].
          Every five elements represent a metering region of
          (xmin, ymin, xmax, ymax, weight).
          The rectangle is defined to be inclusive on xmin and ymin, but
          exclusive on xmax and ymax.
          </hal_details>
          <tag id="BC" />
        </entry>
        <entry name="aeTargetFpsRange" type="int32" visibility="public"
               container="array" typedef="rangeInt">
          <array>
            <size>2</size>
          </array>
          <description>Range over which fps can be adjusted to
          maintain exposure.</description>
          <range>android.control.aeAvailableTargetFpsRanges</range>
          <details>Only constrains the auto-exposure (AE) algorithm, not
          manual control of android.sensor.exposureTime.</details>
          <tag id="BC" />
        </entry>
        <entry name="aePrecaptureTrigger" type="byte" visibility="public"
               enum="true">
          <enum>
            <value>IDLE
              <notes>The trigger is idle.</notes>
            </value>
            <value>START
              <notes>The precapture metering sequence will be started
              by the camera device.

              The exact effect of the precapture trigger depends on
              the current AE mode and state.</notes>
            </value>
          </enum>
          <description>Whether the camera device will trigger a precapture
          metering sequence when it processes this request.</description>
          <details>This entry is normally set to IDLE, or is not
          included at all in the request settings. When included and
          set to START, the camera device will trigger the autoexposure
          precapture metering sequence.

          The precapture sequence should be triggered before starting a
          high-quality still capture for final metering decisions to
          be made, and for firing pre-capture flash pulses to estimate
          scene brightness and required final capture flash power, when
          the flash is enabled.

          Normally, this entry should be set to START for only a
          single request, and the application should wait until the
          sequence completes before starting a new one.

          The exact effect of the auto-exposure (AE) precapture trigger
          depends on the current AE mode and state; see
          android.control.aeState for AE precapture state transition
          details.</details>
          <tag id="BC" />
        </entry>
        <entry name="afMode" type="byte" visibility="public" enum="true">
          <enum>
            <value>OFF
              <notes>The auto-focus routine does not control the lens;
              android.lens.focusDistance is controlled by the
              application.</notes></value>
            <value>AUTO
              <notes>Basic automatic focus mode.

              In this mode, the lens does not move unless
              the autofocus trigger action is called. When that trigger
              is activated, AF will transition to ACTIVE_SCAN, then to
              the outcome of the scan (FOCUSED or NOT_FOCUSED).

              Always supported if the lens is not fixed focus.

              Use android.lens.info.minimumFocusDistance to determine if the
              lens is fixed-focus.

              Triggering AF_CANCEL resets the lens position to default,
              and sets the AF state to INACTIVE.</notes></value>
            <value>MACRO
              <notes>Close-up focusing mode.

              In this mode, the lens does not move unless the
              autofocus trigger action is called. When that trigger is
              activated, AF will transition to ACTIVE_SCAN, then to
              the outcome of the scan (FOCUSED or NOT_FOCUSED). This
              mode is optimized for focusing on objects very close to
              the camera.

              Triggering cancel AF resets the lens
              position to default, and sets the AF state to
              INACTIVE.</notes></value>
            <value>CONTINUOUS_VIDEO
              <notes>In this mode, the AF algorithm modifies the lens
              position continually to attempt to provide a
              constantly-in-focus image stream.

              The focusing behavior should be suitable for good quality
              video recording; typically this means slower focus
              movement and no overshoots. When the AF trigger is not
              involved, the AF algorithm should start in INACTIVE state,
              and then transition into PASSIVE_SCAN and PASSIVE_FOCUSED
              states as appropriate. When the AF trigger is activated,
              the algorithm should immediately transition into
              AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the
              lens position until a cancel AF trigger is received.

              Once cancel is received, the algorithm should transition
              back to INACTIVE and resume passive scan. Note that this
              behavior is not identical to CONTINUOUS_PICTURE, since an
              ongoing PASSIVE_SCAN must immediately be
              canceled.</notes></value>
            <value>CONTINUOUS_PICTURE
              <notes>In this mode, the AF algorithm modifies the lens
              position continually to attempt to provide a
              constantly-in-focus image stream.

              The focusing behavior should be suitable for still image
              capture; typically this means focusing as fast as
              possible. When the AF trigger is not involved, the AF
              algorithm should start in INACTIVE state, and then
              transition into PASSIVE_SCAN and PASSIVE_FOCUSED states as
              appropriate as it attempts to maintain focus. When the AF
              trigger is activated, the algorithm should finish its
              PASSIVE_SCAN if active, and then transition into
              AF_FOCUSED or AF_NOT_FOCUSED as appropriate, and lock the
              lens position until a cancel AF trigger is received.

              When the AF cancel trigger is activated, the algorithm
              should transition back to INACTIVE and then act as if it
              has just been started.</notes></value>
            <value>EDOF
              <notes>Extended depth of field (digital focus) mode.

              The camera device will produce images with an extended
              depth of field automatically; no special focusing
              operations need to be done before taking a picture.

              AF triggers are ignored, and the AF state will always be
              INACTIVE.</notes></value>
          </enum>
          <description>Whether auto-focus (AF) is currently enabled, and what
          mode it is set to.</description>
          <range>android.control.afAvailableModes</range>
          <details>Only effective if android.control.mode = AUTO and the lens is not fixed focus
          (i.e. `android.lens.info.minimumFocusDistance > 0`).

          If the lens is controlled by the camera device auto-focus algorithm,
          the camera device will report the current AF status in android.control.afState
          in result metadata.</details>
          <hal_details>
          When afMode is AUTO or MACRO, the lens must not move until an AF trigger is sent in a
          request (android.control.afTrigger `==` START). After an AF trigger, the afState will end
          up with either FOCUSED_LOCKED or NOT_FOCUSED_LOCKED state (see
          android.control.afState for detailed state transitions), which indicates that the lens is
          locked and will not move. If camera movement (e.g. tilting camera) causes the lens to move
          after the lens is locked, the HAL must compensate for this movement appropriately such
          that the same focal plane remains in focus.

          When afMode is one of the continuous auto focus modes, the HAL is free to start an AF
          scan whenever it's not locked. When the lens is locked after an AF trigger
          (see android.control.afState for detailed state transitions), the HAL should maintain the
          same lock behavior as above.

          When afMode is OFF, the application controls focus manually. The accuracy of the
          focus distance control depends on the android.lens.info.focusDistanceCalibration.
          However, for manual focus distance control, the lens must not move regardless of
          camera movement.

          To put this in concrete terms, if the camera has lens elements which may move based on
          camera orientation or motion (e.g. due to gravity), then the HAL must drive the lens to
          remain in a fixed position invariant to the camera's orientation or motion, for example,
          by using accelerometer measurements in the lens control logic. This is a typical issue
          that will arise on camera modules with open-loop VCMs.
          </hal_details>
          <tag id="BC" />
        </entry>
        <entry name="afRegions" type="int32" visibility="public"
               container="array" typedef="meteringRectangle">
          <array>
            <size>5</size>
            <size>area_count</size>
          </array>
          <description>List of areas to use for focus
          estimation.</description>
          <range>`area_count &lt;= android.control.maxRegions[2]`</range>
          <details>
          The coordinate system is based on the active pixel array,
          with (0,0) being the top-left pixel in the active pixel array, and
          (android.sensor.info.activeArraySize.width - 1,
          android.sensor.info.activeArraySize.height - 1) being the
          bottom-right pixel in the active pixel array.

          The weight must range from 0 to 1000, and represents a weight
          for every pixel in the area. This means that a large metering area
          with the same weight as a smaller area will have more effect in
          the metering result. Metering areas can partially overlap, and the
          camera device will add the weights in the overlap region.

          If all regions have 0 weight, then no specific metering area
          needs to be used by the camera device. If the metering region is
          outside the used android.scaler.cropRegion returned in capture result metadata,
          the camera device will ignore the sections outside the region and output the
          used sections in the result metadata.
          </details>
          <hal_details>
          The HAL level representation of MeteringRectangle[] is a
          int[5 * area_count].
          Every five elements represent a metering region of
          (xmin, ymin, xmax, ymax, weight).
          The rectangle is defined to be inclusive on xmin and ymin, but
          exclusive on xmax and ymax.
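          Decoding that flat layout can be sketched as follows. This is an
          illustrative sketch only; the names are hypothetical and this is not
          part of the HAL interface.

          ```java
          // Illustrative sketch only: decode the HAL-level int[5 * area_count]
          // layout into (xmin, ymin, xmax, ymax, weight) tuples.
          // Names are hypothetical, not part of the HAL interface.
          public class RegionSketch {
              static int[][] unpack(int[] raw) {
                  int count = raw.length / 5;
                  int[][] regions = new int[count][5];
                  for (int i = 0; i != count; i++) {
                      for (int j = 0; j != 5; j++) {
                          regions[i][j] = raw[5 * i + j];
                      }
                  }
                  return regions;
              }

              public static void main(String[] args) {
                  // One region: xmin=0, ymin=0, xmax=100 (exclusive),
                  // ymax=100 (exclusive), weight=1000.
                  int[][] r = unpack(new int[] {0, 0, 100, 100, 1000});
                  System.out.println(r[0][4]); // the region's weight
              }
          }
          ```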
753 </hal_details> 754 <tag id="BC" /> 755 </entry> 756 <entry name="afTrigger" type="byte" visibility="public" enum="true"> 757 <enum> 758 <value>IDLE 759 <notes>The trigger is idle.</notes> 760 </value> 761 <value>START 762 <notes>Autofocus will trigger now.</notes> 763 </value> 764 <value>CANCEL 765 <notes>Autofocus will return to its initial 766 state, and cancel any currently active trigger.</notes> 767 </value> 768 </enum> 769 <description> 770 Whether the camera device will trigger autofocus for this request. 771 </description> 772 <details>This entry is normally set to IDLE, or is not 773 included at all in the request settings. 774 775 When included and set to START, the camera device will trigger the 776 autofocus algorithm. If autofocus is disabled, this trigger has no effect. 777 778 When set to CANCEL, the camera device will cancel any active trigger, 779 and return to its initial AF state. 780 781 Generally, applications should set this entry to START or CANCEL for only a 782 single capture, and then return it to IDLE (or not set at all). Specifying 783 START for multiple captures in a row means restarting the AF operation over 784 and over again. 785 786 See android.control.afState for what the trigger means for each AF mode. 
787 </details> 788 <tag id="BC" /> 789 </entry> 790 <entry name="awbLock" type="byte" visibility="public" enum="true" 791 typedef="boolean"> 792 <enum> 793 <value>OFF 794 <notes>Auto-white balance lock is disabled; the AWB 795 algorithm is free to update its parameters if in AUTO 796 mode.</notes></value> 797 <value>ON 798 <notes>Auto-white balance lock is enabled; the AWB 799 algorithm will not update its parameters while the lock 800 is active.</notes></value> 801 </enum> 802 <description>Whether auto-white balance (AWB) is currently locked to its 803 latest calculated values.</description> 804 <details>Note that AWB lock is only meaningful when 805 android.control.awbMode is in the AUTO mode; in other modes, 806 AWB is already fixed to a specific setting.</details> 807 <tag id="BC" /> 808 </entry> 809 <entry name="awbMode" type="byte" visibility="public" enum="true"> 810 <enum> 811 <value>OFF 812 <notes> 813 The camera device's auto-white balance routine is disabled. 814 815 The application-selected color transform matrix 816 (android.colorCorrection.transform) and gains 817 (android.colorCorrection.gains) are used by the camera 818 device for manual white balance control. 819 </notes> 820 </value> 821 <value>AUTO 822 <notes> 823 The camera device's auto-white balance routine is active. 824 825 The application's values for android.colorCorrection.transform 826 and android.colorCorrection.gains are ignored. 827 For devices that support the MANUAL_POST_PROCESSING capability, the 828 values used by the camera device for the transform and gains 829 will be available in the capture result for this request. 830 </notes> 831 </value> 832 <value>INCANDESCENT 833 <notes> 834 The camera device's auto-white balance routine is disabled; 835 the camera device uses incandescent light as the assumed scene 836 illumination for white balance. 

          While the exact white balance transforms are up to the
          camera device, they will approximately match the CIE
          standard illuminant A.

          The application's values for android.colorCorrection.transform
          and android.colorCorrection.gains are ignored.
          For devices that support the MANUAL_POST_PROCESSING capability, the
          values used by the camera device for the transform and gains
          will be available in the capture result for this request.
          </notes>
          </value>
          <value>FLUORESCENT
          <notes>
          The camera device's auto-white balance routine is disabled;
          the camera device uses fluorescent light as the assumed scene
          illumination for white balance.

          While the exact white balance transforms are up to the
          camera device, they will approximately match the CIE
          standard illuminant F2.

          The application's values for android.colorCorrection.transform
          and android.colorCorrection.gains are ignored.
          For devices that support the MANUAL_POST_PROCESSING capability, the
          values used by the camera device for the transform and gains
          will be available in the capture result for this request.
          </notes>
          </value>
          <value>WARM_FLUORESCENT
          <notes>
          The camera device's auto-white balance routine is disabled;
          the camera device uses warm fluorescent light as the assumed scene
          illumination for white balance.

          While the exact white balance transforms are up to the
          camera device, they will approximately match the CIE
          standard illuminant F4.

          The application's values for android.colorCorrection.transform
          and android.colorCorrection.gains are ignored.
          For devices that support the MANUAL_POST_PROCESSING capability, the
          values used by the camera device for the transform and gains
          will be available in the capture result for this request.
          </notes>
          </value>
          <value>DAYLIGHT
          <notes>
          The camera device's auto-white balance routine is disabled;
          the camera device uses daylight as the assumed scene
          illumination for white balance.

          While the exact white balance transforms are up to the
          camera device, they will approximately match the CIE
          standard illuminant D65.

          The application's values for android.colorCorrection.transform
          and android.colorCorrection.gains are ignored.
          For devices that support the MANUAL_POST_PROCESSING capability, the
          values used by the camera device for the transform and gains
          will be available in the capture result for this request.
          </notes>
          </value>
          <value>CLOUDY_DAYLIGHT
          <notes>
          The camera device's auto-white balance routine is disabled;
          the camera device uses cloudy daylight as the assumed scene
          illumination for white balance.

          The application's values for android.colorCorrection.transform
          and android.colorCorrection.gains are ignored.
          For devices that support the MANUAL_POST_PROCESSING capability, the
          values used by the camera device for the transform and gains
          will be available in the capture result for this request.
          </notes>
          </value>
          <value>TWILIGHT
          <notes>
          The camera device's auto-white balance routine is disabled;
          the camera device uses twilight as the assumed scene
          illumination for white balance.

          The application's values for android.colorCorrection.transform
          and android.colorCorrection.gains are ignored.
          For devices that support the MANUAL_POST_PROCESSING capability, the
          values used by the camera device for the transform and gains
          will be available in the capture result for this request.
          </notes>
          </value>
          <value>SHADE
          <notes>
          The camera device's auto-white balance routine is disabled;
          the camera device uses shade light as the assumed scene
          illumination for white balance.

          The application's values for android.colorCorrection.transform
          and android.colorCorrection.gains are ignored.
          For devices that support the MANUAL_POST_PROCESSING capability, the
          values used by the camera device for the transform and gains
          will be available in the capture result for this request.
          </notes>
          </value>
        </enum>
        <description>Whether auto-white balance (AWB) is currently setting the color
        transform fields, and what its illumination target
        is.</description>
        <range>android.control.awbAvailableModes</range>
        <details>
        This control is only effective if android.control.mode is AUTO.

        When set to the AUTO mode, the camera device's auto-white balance
        routine is enabled, overriding the application's selected
        android.colorCorrection.transform, android.colorCorrection.gains and
        android.colorCorrection.mode.

        When set to the OFF mode, the camera device's auto-white balance
        routine is disabled. The application manually controls the white
        balance through android.colorCorrection.transform, android.colorCorrection.gains
        and android.colorCorrection.mode.

        When set to any other mode, the camera device's auto-white
        balance routine is disabled. The camera device uses that
        mode's particular illumination target for white balance
        adjustment. The application's values for
        android.colorCorrection.transform,
        android.colorCorrection.gains and
        android.colorCorrection.mode are ignored.
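As an illustration of the preset-mode notes above, the following plain-Java lookup (a sketch, not part of any camera API; the keys are the enum names from this entry) maps each preset mode with a documented CIE target to its standard illuminant:

```java
import java.util.Map;

public class AwbIlluminants {
    // Approximate CIE standard illuminants for the preset AWB modes, as
    // stated in the notes above. CLOUDY_DAYLIGHT, TWILIGHT, and SHADE
    // have no CIE illuminant specified by this documentation, so they
    // are deliberately absent from the map.
    static final Map<String, String> CIE_ILLUMINANT = Map.of(
            "INCANDESCENT", "A",
            "FLUORESCENT", "F2",
            "WARM_FLUORESCENT", "F4",
            "DAYLIGHT", "D65");

    public static void main(String[] args) {
        System.out.println(CIE_ILLUMINANT.get("INCANDESCENT")); // prints "A"
    }
}
```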
        </details>
        <tag id="BC" />
      </entry>
      <entry name="awbRegions" type="int32" visibility="public"
             container="array" typedef="meteringRectangle">
        <array>
          <size>5</size>
          <size>area_count</size>
        </array>
        <description>List of areas to use for illuminant
        estimation.</description>
        <range>`area_count &lt;= android.control.maxRegions[1]`</range>
        <details>
        The coordinate system is based on the active pixel array,
        with (0,0) being the top-left pixel in the active pixel array, and
        (android.sensor.info.activeArraySize.width - 1,
        android.sensor.info.activeArraySize.height - 1) being the
        bottom-right pixel in the active pixel array.

        The weight must range from 0 to 1000, and represents a weight
        for every pixel in the area. This means that a large metering area
        with the same weight as a smaller area will have more effect on
        the metering result. Metering areas can partially overlap, and the
        camera device will add the weights in the overlap region.

        If all regions have 0 weight, then no specific metering area
        needs to be used by the camera device. If the metering region is
        outside the used android.scaler.cropRegion returned in the capture result metadata,
        the camera device will ignore the sections outside the region and output the
        used sections in the result metadata.
        </details>
        <hal_details>
        The HAL-level representation of MeteringRectangle[] is an
        int[5 * area_count].
        Every five elements represent a metering region of
        (xmin, ymin, xmax, ymax, weight).
        The rectangle is defined to be inclusive on xmin and ymin, but
        exclusive on xmax and ymax.
        </hal_details>
        <tag id="BC" />
      </entry>
      <entry name="captureIntent" type="byte" visibility="public" enum="true">
        <enum>
          <value>CUSTOM
          <notes>The goal of this request doesn't fall into the other
          categories.
          The camera device will default to preview-like
          behavior.</notes></value>
          <value>PREVIEW
          <notes>This request is for a preview-like use case.

          The precapture trigger may be used to start off a metering
          with-flash sequence.
          </notes></value>
          <value>STILL_CAPTURE
          <notes>This request is for a still capture-type
          use case.

          If the flash unit is under automatic control, it may fire as needed.
          </notes></value>
          <value>VIDEO_RECORD
          <notes>This request is for a video recording
          use case.</notes></value>
          <value>VIDEO_SNAPSHOT
          <notes>This request is for a video snapshot (still
          image while recording video) use case.

          The camera device should take the highest-quality image
          possible (given the other settings) without disrupting the
          frame rate of video recording.</notes></value>
          <value>ZERO_SHUTTER_LAG
          <notes>This request is for a ZSL use case; the
          application will stream full-resolution images and
          reprocess one or several later for a final
          capture.
          </notes></value>
          <value>MANUAL
          <notes>This request is for a manual capture use case where
          the application wants to directly control the capture parameters.

          For example, the application may wish to manually control
          android.sensor.exposureTime, android.sensor.sensitivity, etc.
          </notes></value>
        </enum>
        <description>Information to the camera device 3A (auto-exposure,
        auto-focus, auto-white balance) routines about the purpose
        of this capture, to help the camera device to decide the optimal 3A
        strategy.</description>
        <range>All must be supported except for ZERO_SHUTTER_LAG and MANUAL.</range>
        <details>This control (except for MANUAL) is only effective if
        `android.control.mode != OFF` and any 3A routine is active.

        ZERO_SHUTTER_LAG will be supported if android.request.availableCapabilities
        contains ZSL.
        MANUAL will be supported if android.request.availableCapabilities
        contains MANUAL_SENSOR.</details>
        <tag id="BC" />
      </entry>
      <entry name="effectMode" type="byte" visibility="public" enum="true">
        <enum>
          <value>OFF
          <notes>
          No color effect will be applied.
          </notes>
          </value>
          <value optional="true">MONO
          <notes>
          A "monocolor" effect where the image is mapped into
          a single color.

          This will typically be grayscale.
          </notes>
          </value>
          <value optional="true">NEGATIVE
          <notes>
          A "photo-negative" effect where the image's colors
          are inverted.
          </notes>
          </value>
          <value optional="true">SOLARIZE
          <notes>
          A "solarisation" effect (Sabattier effect) where the
          image is wholly or partially reversed in
          tone.
          </notes>
          </value>
          <value optional="true">SEPIA
          <notes>
          A "sepia" effect where the image is mapped into warm
          gray, red, and brown tones.
          </notes>
          </value>
          <value optional="true">POSTERIZE
          <notes>
          A "posterization" effect where the image uses
          discrete regions of tone rather than a continuous
          gradient of tones.
          </notes>
          </value>
          <value optional="true">WHITEBOARD
          <notes>
          A "whiteboard" effect where the image is typically displayed
          as regions of white, with black or grey details.
          </notes>
          </value>
          <value optional="true">BLACKBOARD
          <notes>
          A "blackboard" effect where the image is typically displayed
          as regions of black, with white or grey details.
          </notes>
          </value>
          <value optional="true">AQUA
          <notes>
          An "aqua" effect where a blue hue is added to the image.
          </notes>
          </value>
        </enum>
        <description>A special color effect to apply.</description>
        <range>android.control.availableEffects</range>
        <details>
        When this mode is set, a color effect will be applied
        to images produced by the camera device. The interpretation
        and implementation of these color effects is left to the
        implementor of the camera device, and should not be
        depended on to be consistent (or present) across all
        devices.

        A color effect will only be applied if
        android.control.mode != OFF.
        </details>
        <tag id="BC" />
      </entry>
      <entry name="mode" type="byte" visibility="public" enum="true">
        <enum>
          <value>OFF
          <notes>Full application control of pipeline.

          All control by the device's metering and focusing (3A)
          routines is disabled, and no other settings in
          android.control.* have any effect, except that
          android.control.captureIntent may be used by the camera
          device to select post-processing values for processing
          blocks that do not allow for manual control, or are not
          exposed by the camera API.

          However, the camera device's 3A routines may continue to
          collect statistics and update their internal state so that
          when control is switched to AUTO mode, good control values
          can be immediately applied.
          </notes></value>
          <value>AUTO
          <notes>Use settings for each individual 3A routine.

          Manual control of capture parameters is disabled. All
          controls in android.control.* besides sceneMode take
          effect.</notes></value>
          <value>USE_SCENE_MODE
          <notes>Use a specific scene mode.

          Enabling this disables control.aeMode, control.awbMode and
          control.afMode controls; the camera device will ignore
          those settings while USE_SCENE_MODE is active (except for
          FACE_PRIORITY scene mode). Other control entries are still
          active.
          This setting can only be used if scene mode is
          supported (i.e. android.control.availableSceneModes
          contains some modes other than DISABLED).</notes></value>
          <value>OFF_KEEP_STATE
          <notes>Same as OFF mode, except that this capture will not be
          used by the camera device's background auto-exposure, auto-white balance and
          auto-focus algorithms (3A) to update their statistics.

          Specifically, the 3A routines are locked to the last
          values set from a request with AUTO, OFF, or
          USE_SCENE_MODE, and any statistics or state updates
          collected from manual captures with OFF_KEEP_STATE will be
          discarded by the camera device.
          </notes></value>
        </enum>
        <description>Overall mode of 3A control
        routines.</description>
        <range>All must be supported.</range>
        <details>High-level 3A control. When set to OFF, all 3A control
        by the camera device is disabled. The application must set the fields for
        capture parameters itself.

        When set to AUTO, the individual algorithm controls in
        android.control.* are in effect, such as android.control.afMode.

        When set to USE_SCENE_MODE, the individual controls in
        android.control.* are mostly disabled, and the camera device implements
        one of the scene mode settings (such as ACTION, SUNSET, or PARTY)
        as it wishes. The camera device's scene mode 3A settings are provided by
        android.control.sceneModeOverrides.

        OFF_KEEP_STATE is similar to OFF mode; the only difference
        is that this frame will not be used for the camera device's background 3A
        statistics updates, as if the frame had never been captured. This mode can be
        used in scenarios where the application doesn't want a 3A manual control
        capture to affect the subsequent auto 3A capture results.
        </details>
        <tag id="BC" />
      </entry>
      <entry name="sceneMode" type="byte" visibility="public" enum="true">
        <enum>
          <value id="0">DISABLED
          <notes>
          Indicates that no scene modes are set for a given capture request.
          </notes>
          </value>
          <value>FACE_PRIORITY
          <notes>If face detection support exists, use face
          detection data for auto-focus, auto-white balance, and
          auto-exposure routines.

          If face detection statistics are disabled
          (i.e. android.statistics.faceDetectMode is set to OFF),
          this should still operate correctly (but will not return
          face detection statistics to the framework).

          Unlike the other scene modes, android.control.aeMode,
          android.control.awbMode, and android.control.afMode
          remain active when FACE_PRIORITY is set.
          </notes>
          </value>
          <value optional="true">ACTION
          <notes>
          Optimized for photos of quickly moving objects.

          Similar to SPORTS.
          </notes>
          </value>
          <value optional="true">PORTRAIT
          <notes>
          Optimized for still photos of people.
          </notes>
          </value>
          <value optional="true">LANDSCAPE
          <notes>
          Optimized for photos of distant macroscopic objects.
          </notes>
          </value>
          <value optional="true">NIGHT
          <notes>
          Optimized for low-light settings.
          </notes>
          </value>
          <value optional="true">NIGHT_PORTRAIT
          <notes>
          Optimized for still photos of people in low-light
          settings.
          </notes>
          </value>
          <value optional="true">THEATRE
          <notes>
          Optimized for dim, indoor settings where flash must
          remain off.
          </notes>
          </value>
          <value optional="true">BEACH
          <notes>
          Optimized for bright, outdoor beach settings.
          </notes>
          </value>
          <value optional="true">SNOW
          <notes>
          Optimized for bright, outdoor settings containing snow.
          </notes>
          </value>
          <value optional="true">SUNSET
          <notes>
          Optimized for scenes of the setting sun.
          </notes>
          </value>
          <value optional="true">STEADYPHOTO
          <notes>
          Optimized to avoid blurry photos due to small amounts of
          device motion (for example: due to hand shake).
          </notes>
          </value>
          <value optional="true">FIREWORKS
          <notes>
          Optimized for nighttime photos of fireworks.
          </notes>
          </value>
          <value optional="true">SPORTS
          <notes>
          Optimized for photos of quickly moving people.

          Similar to ACTION.
          </notes>
          </value>
          <value optional="true">PARTY
          <notes>
          Optimized for dim, indoor settings with multiple moving
          people.
          </notes>
          </value>
          <value optional="true">CANDLELIGHT
          <notes>
          Optimized for dim settings where the main light source
          is a flame.
          </notes>
          </value>
          <value optional="true">BARCODE
          <notes>
          Optimized for accurately capturing a photo of a barcode,
          for use by camera applications that wish to read the
          barcode value.
          </notes>
          </value>
        </enum>
        <description>
        A camera mode optimized for conditions typical in a particular
        capture setting.
        </description>
        <range>android.control.availableSceneModes</range>
        <details>
        This is the mode that is active when
        `android.control.mode == USE_SCENE_MODE`. Aside from FACE_PRIORITY,
        these modes will disable android.control.aeMode,
        android.control.awbMode, and android.control.afMode while in use.
        The scene modes available for a given camera device are listed in
        android.control.availableSceneModes.

        The interpretation and implementation of these scene modes is left
        to the implementor of the camera device. Their behavior will not be
        consistent across all devices, and any given device may only implement
        a subset of these modes.
        </details>
        <hal_details>
        HAL implementations that include scene modes are expected to provide
        the per-scene settings to use for android.control.aeMode,
        android.control.awbMode, and android.control.afMode in
        android.control.sceneModeOverrides.
        </hal_details>
        <tag id="BC" />
      </entry>
      <entry name="videoStabilizationMode" type="byte" visibility="public"
             enum="true">
        <enum>
          <value>OFF
          <notes>
          Video stabilization is disabled.
          </notes></value>
          <value>ON
          <notes>
          Video stabilization is enabled.
          </notes></value>
        </enum>
        <description>Whether video stabilization is
        active.</description>
        <details>
        Video stabilization automatically translates and scales images from the camera
        in order to stabilize motion between consecutive frames.

        If enabled, video stabilization can modify the
        android.scaler.cropRegion to keep the video stream
        stabilized.</details>
        <tag id="BC" />
      </entry>
    </controls>
    <static>
      <entry name="aeAvailableAntibandingModes" type="byte" visibility="public"
             type_notes="list of enums" container="array" typedef="enumList">
        <array>
          <size>n</size>
        </array>
        <description>
        The set of auto-exposure antibanding modes that are
        supported by this camera device.
        </description>
        <details>
        Not all of the auto-exposure anti-banding modes may be
        supported by a given camera device. This field lists the
        valid anti-banding modes that the application may request
        for this camera device; they must include AUTO.
        </details>
        <tag id="BC" />
      </entry>
      <entry name="aeAvailableModes" type="byte" visibility="public"
             type_notes="list of enums" container="array" typedef="enumList">
        <array>
          <size>n</size>
        </array>
        <description>
        The set of auto-exposure modes that are supported by this
        camera device.
        </description>
        <details>
        Not all the auto-exposure modes may be supported by a
        given camera device, especially if no flash unit is
        available. This entry lists the valid modes for
        android.control.aeMode for this camera device.

        All camera devices support ON, and all camera devices with
        flash units support ON_AUTO_FLASH and
        ON_ALWAYS_FLASH.

        FULL-mode camera devices always support OFF mode,
        which enables application control of camera exposure time,
        sensitivity, and frame duration.
        </details>
        <tag id="BC" />
      </entry>
      <entry name="aeAvailableTargetFpsRanges" type="int32" visibility="public"
             type_notes="list of pairs of frame rates"
             container="array" typedef="rangeInt">
        <array>
          <size>2</size>
          <size>n</size>
        </array>
        <description>List of frame rate ranges supported by the
        auto-exposure (AE) algorithm/hardware.</description>
        <tag id="BC" />
      </entry>
      <entry name="aeCompensationRange" type="int32" visibility="public"
             container="array" typedef="rangeInt">
        <array>
          <size>2</size>
        </array>
        <description>Maximum and minimum exposure compensation
        setting, in counts of
        android.control.aeCompensationStep.</description>
        <range>At least (-2,2)/(exp compensation step
        size)</range>
        <tag id="BC" />
      </entry>
      <entry name="aeCompensationStep" type="rational" visibility="public">
        <description>Smallest step by which exposure compensation
        can be changed.</description>
        <range>&lt;= 1/2</range>
        <tag id="BC" />
      </entry>
      <entry name="afAvailableModes" type="byte" visibility="public"
             type_notes="List of enums" container="array" typedef="enumList">
        <array>
          <size>n</size>
        </array>
        <description>List of auto-focus (AF) modes that can be
        selected with android.control.afMode.</description>
        <details>
        Not all the auto-focus modes may be
        supported by a
        given camera device. This entry lists the valid modes for
        android.control.afMode for this camera device.

        All camera devices will support OFF mode, and all camera devices with
        adjustable focuser units (`android.lens.info.minimumFocusDistance > 0`)
        will support AUTO mode.
        </details>
        <tag id="BC" />
      </entry>
      <entry name="availableEffects" type="byte" visibility="public"
             type_notes="List of enums (android.control.effectMode)." container="array"
             typedef="enumList">
        <array>
          <size>n</size>
        </array>
        <description>
        List containing the subset of color effects
        specified in android.control.effectMode that is supported by
        this device.
        </description>
        <range>
        Any subset of enums from those specified in
        android.control.effectMode. OFF must be included in any subset.
        </range>
        <details>
        This list contains the color effect modes that can be applied to
        images produced by the camera device. Only modes that have
        been fully implemented for the current device may be included here.
        Implementations are not expected to be consistent across all devices.
        If no color effect modes are available for a device, this should
        simply be set to OFF.

        A color effect will only be applied if
        android.control.mode != OFF.
        </details>
        <tag id="BC" />
      </entry>
      <entry name="availableSceneModes" type="byte" visibility="public"
             type_notes="List of enums (android.control.sceneMode)."
             container="array" typedef="enumList">
        <array>
          <size>n</size>
        </array>
        <description>
        List containing a subset of scene modes
        specified in android.control.sceneMode.
        </description>
        <range>
        Any subset of the enums specified in android.control.sceneMode
        not including DISABLED, or solely DISABLED if no
        scene modes are available.
        FACE_PRIORITY must be included
        if face detection is supported (i.e. `android.statistics.info.maxFaceCount > 0`).
        </range>
        <details>
        This list contains scene modes that can be set for the camera device.
        Only scene modes that have been fully implemented for the
        camera device may be included here. Implementations are not expected
        to be consistent across all devices. If no scene modes are supported
        by the camera device, this will be set to `[DISABLED]`.
        </details>
        <tag id="BC" />
      </entry>
      <entry name="availableVideoStabilizationModes" type="byte"
             visibility="public" type_notes="List of enums." container="array"
             typedef="enumList">
        <array>
          <size>n</size>
        </array>
        <description>List of video stabilization modes that can
        be supported.</description>
        <range>OFF must be included.</range>
        <tag id="BC" />
      </entry>
      <entry name="awbAvailableModes" type="byte" visibility="public"
             type_notes="List of enums"
             container="array" typedef="enumList">
        <array>
          <size>n</size>
        </array>
        <description>The set of auto-white-balance modes (android.control.awbMode)
        that are supported by this camera device.</description>
        <details>
        Not all the auto-white-balance modes may be supported by a
        given camera device. This entry lists the valid modes for
        android.control.awbMode for this camera device.

        All camera devices will support AUTO mode.

        FULL-mode camera devices will always support OFF mode,
        which enables application control of white balance, by using
        android.colorCorrection.transform and android.colorCorrection.gains
        (android.colorCorrection.mode must be set to TRANSFORM_MATRIX).
        </details>
        <tag id="BC" />
      </entry>
      <entry name="maxRegions" type="int32" visibility="hidden" container="array">
        <array>
          <size>3</size>
        </array>
        <description>
        List of the maximum number of regions that can be used for metering in
        auto-exposure (AE), auto-white balance (AWB), and auto-focus (AF);
        this corresponds to the maximum number of elements in
        android.control.aeRegions, android.control.awbRegions,
        and android.control.afRegions.
        </description>
        <range>
        Value must be &gt;= 0 for each element. For full-capability devices
        this value must be &gt;= 1 for AE and AF. The order of the elements is:
        `(AE, AWB, AF)`.</range>
        <tag id="BC" />
      </entry>
      <entry name="maxRegionsAe" type="int32" visibility="public" synthetic="true">
        <description>
        The maximum number of regions that can be used for metering in
        auto-exposure (AE);
        this corresponds to the maximum number of elements in
        android.control.aeRegions.
        </description>
        <range>
        Value will be &gt;= 0. For FULL-capability devices, this
        value will be &gt;= 1.
        </range>
        <hal_details>This entry is private to the framework. Fill in
        maxRegions to have this entry be automatically populated.
        </hal_details>
      </entry>
      <entry name="maxRegionsAwb" type="int32" visibility="public" synthetic="true">
        <description>
        The maximum number of regions that can be used for metering in
        auto-white balance (AWB);
        this corresponds to the maximum number of elements in
        android.control.awbRegions.
        </description>
        <range>
        Value will be &gt;= 0.
        </range>
        <hal_details>This entry is private to the framework. Fill in
        maxRegions to have this entry be automatically populated.
        </hal_details>
      </entry>
      <entry name="maxRegionsAf" type="int32" visibility="public" synthetic="true">
        <description>
        The maximum number of regions that can be used for metering in
        auto-focus (AF);
        this corresponds to the maximum number of elements in
        android.control.afRegions.
        </description>
        <range>
        Value will be &gt;= 0. For FULL-capability devices, this
        value will be &gt;= 1.
        </range>
        <hal_details>This entry is private to the framework. Fill in
        maxRegions to have this entry be automatically populated.
        </hal_details>
      </entry>
      <entry name="sceneModeOverrides" type="byte" visibility="system"
             container="array">
        <array>
          <size>3</size>
          <size>length(availableSceneModes)</size>
        </array>
        <description>
        Ordered list of auto-exposure, auto-white balance, and auto-focus
        settings to use with each available scene mode.
        </description>
        <range>
        For each available scene mode, the list must contain three
        entries containing the android.control.aeMode,
        android.control.awbMode, and android.control.afMode values used
        by the camera device. The entry order is `(aeMode, awbMode, afMode)`,
        where aeMode has the lowest index position.
        </range>
        <details>
        When a scene mode is enabled, the camera device is expected
        to override android.control.aeMode, android.control.awbMode,
        and android.control.afMode with its preferred settings for
        that scene mode.

        The order of this list matches that of availableSceneModes,
        with 3 entries for each mode. The overrides listed
        for FACE_PRIORITY are ignored, since for that
        mode the application-set android.control.aeMode,
        android.control.awbMode, and android.control.afMode values are
        used instead, matching the behavior when android.control.mode
        is set to AUTO.
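The 3-entries-per-mode flat layout just described can be sketched in plain Java (illustrative only, not HAL or framework code; scene modes and override values are stand-in ints rather than the real android.control enum values):

```java
import java.util.List;

public class SceneModeOverrides {
    // Looks up the (aeMode, awbMode, afMode) override triple for one
    // scene mode, given the flat 3-per-mode layout: mode i's overrides
    // occupy indices 3*i .. 3*i+2, in (aeMode, awbMode, afMode) order.
    static int[] overridesFor(List<Integer> availableSceneModes,
                              int[] flatOverrides, int sceneMode) {
        int i = availableSceneModes.indexOf(sceneMode);
        if (i < 0 || flatOverrides.length < 3 * (i + 1)) {
            throw new IllegalArgumentException("scene mode not available");
        }
        return new int[] {
            flatOverrides[3 * i],      // aeMode override
            flatOverrides[3 * i + 1],  // awbMode override
            flatOverrides[3 * i + 2]   // afMode override
        };
    }

    public static void main(String[] args) {
        // Hypothetical available modes 1, 2, 5 with made-up override values;
        // the first triple (for a FACE_PRIORITY-like mode) is all zeros,
        // since its overrides are ignored.
        List<Integer> modes = List.of(1, 2, 5);
        int[] flat = {0, 0, 0, 2, 1, 6, 2, 2, 1};
        int[] night = overridesFor(modes, flat, 5);
        System.out.println(night[0] + "," + night[1] + "," + night[2]); // prints "2,2,1"
    }
}
```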
        It is recommended that the FACE_PRIORITY
        overrides be set to 0.

        For example, if availableSceneModes contains
        `(FACE_PRIORITY, ACTION, NIGHT)`, then the camera framework
        expects sceneModeOverrides to have 9 entries formatted like:
        `(0, 0, 0, ON_AUTO_FLASH, AUTO, CONTINUOUS_PICTURE,
        ON_AUTO_FLASH, INCANDESCENT, AUTO)`.
        </details>
        <hal_details>
        To maintain backward compatibility, this list will be made available
        in the static metadata of the camera service. The camera service will
        use these values to set android.control.aeMode,
        android.control.awbMode, and android.control.afMode when using a scene
        mode other than FACE_PRIORITY.
        </hal_details>
        <tag id="BC" />
      </entry>
    </static>
    <dynamic>
      <entry name="aePrecaptureId" type="int32" visibility="system" deprecated="true">
        <description>The ID sent with the latest
        CAMERA2_TRIGGER_PRECAPTURE_METERING call.</description>
        <details>Must be 0 if no
        CAMERA2_TRIGGER_PRECAPTURE_METERING trigger has been received yet
        by the HAL. Always updated, even if the AE algorithm ignores the
        trigger.</details>
      </entry>
      <clone entry="android.control.aeAntibandingMode" kind="controls">
      </clone>
      <clone entry="android.control.aeExposureCompensation" kind="controls">
      </clone>
      <clone entry="android.control.aeLock" kind="controls">
      </clone>
      <clone entry="android.control.aeMode" kind="controls">
      </clone>
      <clone entry="android.control.aeRegions" kind="controls">
      </clone>
      <clone entry="android.control.aeTargetFpsRange" kind="controls">
      </clone>
      <clone entry="android.control.aePrecaptureTrigger" kind="controls">
      </clone>
      <entry name="aeState" type="byte" visibility="public" enum="true">
        <enum>
          <value>INACTIVE
          <notes>AE is off or recently reset.

          When a camera device is opened, it starts in
          this state.
This is a transient state, the camera device may skip reporting 1678 this state in capture result.</notes></value> 1679 <value>SEARCHING 1680 <notes>AE doesn't yet have a good set of control values 1681 for the current scene. 1682 1683 This is a transient state, the camera device may skip 1684 reporting this state in capture result.</notes></value> 1685 <value>CONVERGED 1686 <notes>AE has a good set of control values for the 1687 current scene.</notes></value> 1688 <value>LOCKED 1689 <notes>AE has been locked.</notes></value> 1690 <value>FLASH_REQUIRED 1691 <notes>AE has a good set of control values, but flash 1692 needs to be fired for good quality still 1693 capture.</notes></value> 1694 <value>PRECAPTURE 1695 <notes>AE has been asked to do a precapture sequence 1696 and is currently executing it. 1697 1698 Precapture can be triggered through setting 1699 android.control.aePrecaptureTrigger to START. 1700 1701 Once PRECAPTURE completes, AE will transition to CONVERGED 1702 or FLASH_REQUIRED as appropriate. This is a transient 1703 state, the camera device may skip reporting this state in 1704 capture result.</notes></value> 1705 </enum> 1706 <description>Current state of the auto-exposure (AE) algorithm.</description> 1707 <details>Switching between or enabling AE modes (android.control.aeMode) always 1708 resets the AE state to INACTIVE. Similarly, switching between android.control.mode, 1709 or android.control.sceneMode if `android.control.mode == USE_SCENE_MODE` resets all 1710 the algorithm states to INACTIVE. 1711 1712 The camera device can do several state transitions between two results, if it is 1713 allowed by the state transition table. For example: INACTIVE may never actually be 1714 seen in a result. 1715 1716 The state in the result is the state for this image (in sync with this image): if 1717 AE state becomes CONVERGED, then the image data associated with this result should 1718 be good to use. 
1719 1720 Below are state transition tables for different AE modes. 1721 1722 State | Transition Cause | New State | Notes 1723 :------------:|:----------------:|:---------:|:-----------------------: 1724 INACTIVE | | INACTIVE | Camera device auto exposure algorithm is disabled 1725 1726 When android.control.aeMode is AE_MODE_ON_*: 1727 1728 State | Transition Cause | New State | Notes 1729 :-------------:|:--------------------------------------------:|:--------------:|:-----------------: 1730 INACTIVE | Camera device initiates AE scan | SEARCHING | Values changing 1731 INACTIVE | android.control.aeLock is ON | LOCKED | Values locked 1732 SEARCHING | Camera device finishes AE scan | CONVERGED | Good values, not changing 1733 SEARCHING | Camera device finishes AE scan | FLASH_REQUIRED | Converged but too dark w/o flash 1734 SEARCHING | android.control.aeLock is ON | LOCKED | Values locked 1735 CONVERGED | Camera device initiates AE scan | SEARCHING | Values changing 1736 CONVERGED | android.control.aeLock is ON | LOCKED | Values locked 1737 FLASH_REQUIRED | Camera device initiates AE scan | SEARCHING | Values changing 1738 FLASH_REQUIRED | android.control.aeLock is ON | LOCKED | Values locked 1739 LOCKED | android.control.aeLock is OFF | SEARCHING | Values not good after unlock 1740 LOCKED | android.control.aeLock is OFF | CONVERGED | Values good after unlock 1741 LOCKED | android.control.aeLock is OFF | FLASH_REQUIRED | Exposure good, but too dark 1742 PRECAPTURE | Sequence done. android.control.aeLock is OFF | CONVERGED | Ready for high-quality capture 1743 PRECAPTURE | Sequence done. android.control.aeLock is ON | LOCKED | Ready for high-quality capture 1744 Any state | android.control.aePrecaptureTrigger is START | PRECAPTURE | Start AE precapture metering sequence 1745 1746 For the above table, the camera device may skip reporting any state changes that happen 1747 without application intervention (i.e. mode switch, trigger, locking). 
Any state that 1748 can be skipped in that manner is called a transient state. 1749 1750 For example, for above AE modes (AE_MODE_ON_*), in addition to the state transitions 1751 listed in above table, it is also legal for the camera device to skip one or more 1752 transient states between two results. See below table for examples: 1753 1754 State | Transition Cause | New State | Notes 1755 :-------------:|:-----------------------------------------------------------:|:--------------:|:-----------------: 1756 INACTIVE | Camera device finished AE scan | CONVERGED | Values are already good, transient states are skipped by camera device. 1757 Any state | android.control.aePrecaptureTrigger is START, sequence done | FLASH_REQUIRED | Converged but too dark w/o flash after a precapture sequence, transient states are skipped by camera device. 1758 Any state | android.control.aePrecaptureTrigger is START, sequence done | CONVERGED | Converged after a precapture sequence, transient states are skipped by camera device. 1759 CONVERGED | Camera device finished AE scan | FLASH_REQUIRED | Converged but too dark w/o flash after a new scan, transient states are skipped by camera device. 1760 FLASH_REQUIRED | Camera device finished AE scan | CONVERGED | Converged after a new scan, transient states are skipped by camera device. 1761 </details> 1762 </entry> 1763 <clone entry="android.control.afMode" kind="controls"> 1764 </clone> 1765 <clone entry="android.control.afRegions" kind="controls"> 1766 </clone> 1767 <clone entry="android.control.afTrigger" kind="controls"> 1768 </clone> 1769 <entry name="afState" type="byte" visibility="public" enum="true"> 1770 <enum> 1771 <value>INACTIVE 1772 <notes>AF is off or has not yet tried to scan/been asked 1773 to scan. 1774 1775 When a camera device is opened, it starts in this 1776 state. 
This is a transient state, the camera device may 1777 skip reporting this state in capture 1778 result.</notes></value> 1779 <value>PASSIVE_SCAN 1780 <notes>AF is currently performing an AF scan initiated the 1781 camera device in a continuous autofocus mode. 1782 1783 Only used by CONTINUOUS_* AF modes. This is a transient 1784 state, the camera device may skip reporting this state in 1785 capture result.</notes></value> 1786 <value>PASSIVE_FOCUSED 1787 <notes>AF currently believes it is in focus, but may 1788 restart scanning at any time. 1789 1790 Only used by CONTINUOUS_* AF modes. This is a transient 1791 state, the camera device may skip reporting this state in 1792 capture result.</notes></value> 1793 <value>ACTIVE_SCAN 1794 <notes>AF is performing an AF scan because it was 1795 triggered by AF trigger. 1796 1797 Only used by AUTO or MACRO AF modes. This is a transient 1798 state, the camera device may skip reporting this state in 1799 capture result.</notes></value> 1800 <value>FOCUSED_LOCKED 1801 <notes>AF believes it is focused correctly and has locked 1802 focus. 1803 1804 This state is reached only after an explicit START AF trigger has been 1805 sent (android.control.afTrigger), when good focus has been obtained. 1806 1807 The lens will remain stationary until the AF mode (android.control.afMode) is changed or 1808 a new AF trigger is sent to the camera device (android.control.afTrigger). 1809 </notes></value> 1810 <value>NOT_FOCUSED_LOCKED 1811 <notes>AF has failed to focus successfully and has locked 1812 focus. 1813 1814 This state is reached only after an explicit START AF trigger has been 1815 sent (android.control.afTrigger), when good focus cannot be obtained. 1816 1817 The lens will remain stationary until the AF mode (android.control.afMode) is changed or 1818 a new AF trigger is sent to the camera device (android.control.afTrigger). 
1819 </notes></value> 1820 <value>PASSIVE_UNFOCUSED 1821 <notes>AF finished a passive scan without finding focus, 1822 and may restart scanning at any time. 1823 1824 Only used by CONTINUOUS_* AF modes. This is a transient state, the camera 1825 device may skip reporting this state in capture result.</notes></value> 1826 </enum> 1827 <description>Current state of auto-focus (AF) algorithm.</description> 1828 <details> 1829 Switching between or enabling AF modes (android.control.afMode) always 1830 resets the AF state to INACTIVE. Similarly, switching between android.control.mode, 1831 or android.control.sceneMode if `android.control.mode == USE_SCENE_MODE` resets all 1832 the algorithm states to INACTIVE. 1833 1834 The camera device can do several state transitions between two results, if it is 1835 allowed by the state transition table. For example: INACTIVE may never actually be 1836 seen in a result. 1837 1838 The state in the result is the state for this image (in sync with this image): if 1839 AF state becomes FOCUSED, then the image data associated with this result should 1840 be sharp. 1841 1842 Below are state transition tables for different AF modes. 
1843 1844 When android.control.afMode is AF_MODE_OFF or AF_MODE_EDOF: 1845 1846 State | Transition Cause | New State | Notes 1847 :------------:|:----------------:|:---------:|:-----------: 1848 INACTIVE | | INACTIVE | Never changes 1849 1850 When android.control.afMode is AF_MODE_AUTO or AF_MODE_MACRO: 1851 1852 State | Transition Cause | New State | Notes 1853 :-----------------:|:----------------:|:------------------:|:--------------: 1854 INACTIVE | AF_TRIGGER | ACTIVE_SCAN | Start AF sweep, Lens now moving 1855 ACTIVE_SCAN | AF sweep done | FOCUSED_LOCKED | Focused, Lens now locked 1856 ACTIVE_SCAN | AF sweep done | NOT_FOCUSED_LOCKED | Not focused, Lens now locked 1857 ACTIVE_SCAN | AF_CANCEL | INACTIVE | Cancel/reset AF, Lens now locked 1858 FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Cancel/reset AF 1859 FOCUSED_LOCKED | AF_TRIGGER | ACTIVE_SCAN | Start new sweep, Lens now moving 1860 NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Cancel/reset AF 1861 NOT_FOCUSED_LOCKED | AF_TRIGGER | ACTIVE_SCAN | Start new sweep, Lens now moving 1862 Any state | Mode change | INACTIVE | 1863 1864 For the above table, the camera device may skip reporting any state changes that happen 1865 without application intervention (i.e. mode switch, trigger, locking). Any state that 1866 can be skipped in that manner is called a transient state. 1867 1868 For example, for these AF modes (AF_MODE_AUTO and AF_MODE_MACRO), in addition to the 1869 state transitions listed in above table, it is also legal for the camera device to skip 1870 one or more transient states between two results. See below table for examples: 1871 1872 State | Transition Cause | New State | Notes 1873 :-----------------:|:----------------:|:------------------:|:--------------: 1874 INACTIVE | AF_TRIGGER | FOCUSED_LOCKED | Focus is already good or good after a scan, lens is now locked. 1875 INACTIVE | AF_TRIGGER | NOT_FOCUSED_LOCKED | Focus failed after a scan, lens is now locked. 
1876 FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | Focus is already good or good after a scan, lens is now locked. 1877 NOT_FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | Focus is good after a scan, lens is now locked. 1878 1879 1880 When android.control.afMode is AF_MODE_CONTINUOUS_VIDEO: 1881 1882 State | Transition Cause | New State | Notes 1883 :-----------------:|:-----------------------------------:|:------------------:|:--------------: 1884 INACTIVE | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1885 INACTIVE | AF_TRIGGER | NOT_FOCUSED_LOCKED | AF state query, Lens now locked 1886 PASSIVE_SCAN | Camera device completes current scan| PASSIVE_FOCUSED | End AF scan, Lens now locked 1887 PASSIVE_SCAN | Camera device fails current scan | PASSIVE_UNFOCUSED | End AF scan, Lens now locked 1888 PASSIVE_SCAN | AF_TRIGGER | FOCUSED_LOCKED | Immediate transition, if focus is good. Lens now locked 1889 PASSIVE_SCAN | AF_TRIGGER | NOT_FOCUSED_LOCKED | Immediate transition, if focus is bad.
Lens now locked 1890 PASSIVE_SCAN | AF_CANCEL | INACTIVE | Reset lens position, Lens now locked 1891 PASSIVE_FOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1892 PASSIVE_UNFOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1893 PASSIVE_FOCUSED | AF_TRIGGER | FOCUSED_LOCKED | Immediate transition, lens now locked 1894 PASSIVE_UNFOCUSED | AF_TRIGGER | NOT_FOCUSED_LOCKED | Immediate transition, lens now locked 1895 FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | No effect 1896 FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan 1897 NOT_FOCUSED_LOCKED | AF_TRIGGER | NOT_FOCUSED_LOCKED | No effect 1898 NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan 1899 1900 When android.control.afMode is AF_MODE_CONTINUOUS_PICTURE: 1901 1902 State | Transition Cause | New State | Notes 1903 :-----------------:|:------------------------------------:|:------------------:|:--------------: 1904 INACTIVE | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1905 INACTIVE | AF_TRIGGER | NOT_FOCUSED_LOCKED | AF state query, Lens now locked 1906 PASSIVE_SCAN | Camera device completes current scan | PASSIVE_FOCUSED | End AF scan, Lens now locked 1907 PASSIVE_SCAN | Camera device fails current scan | PASSIVE_UNFOCUSED | End AF scan, Lens now locked 1908 PASSIVE_SCAN | AF_TRIGGER | FOCUSED_LOCKED | Eventual transition once the focus is good. Lens now locked 1909 PASSIVE_SCAN | AF_TRIGGER | NOT_FOCUSED_LOCKED | Eventual transition if cannot find focus. Lens now locked 1910 PASSIVE_SCAN | AF_CANCEL | INACTIVE | Reset lens position, Lens now locked 1911 PASSIVE_FOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1912 PASSIVE_UNFOCUSED | Camera device initiates new scan | PASSIVE_SCAN | Start AF scan, Lens now moving 1913 PASSIVE_FOCUSED | AF_TRIGGER | FOCUSED_LOCKED | Immediate trans. 
Lens now locked 1914 PASSIVE_UNFOCUSED | AF_TRIGGER | NOT_FOCUSED_LOCKED | Immediate trans. Lens now locked 1915 FOCUSED_LOCKED | AF_TRIGGER | FOCUSED_LOCKED | No effect 1916 FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan 1917 NOT_FOCUSED_LOCKED | AF_TRIGGER | NOT_FOCUSED_LOCKED | No effect 1918 NOT_FOCUSED_LOCKED | AF_CANCEL | INACTIVE | Restart AF scan 1919 1920 When switching between AF_MODE_CONTINUOUS_* (CAF modes) and AF_MODE_AUTO/AF_MODE_MACRO 1921 (AUTO modes), the initial INACTIVE or PASSIVE_SCAN states may be skipped by the 1922 camera device. When a trigger is included in a mode switch request, the trigger 1923 will be evaluated in the context of the new mode in the request. 1924 See below table for examples: 1925 1926 State | Transition Cause | New State | Notes 1927 :-----------:|:--------------------------------------:|:----------------------------------------:|:--------------: 1928 any state | CAF-->AUTO mode switch | INACTIVE | Mode switch without trigger, initial state must be INACTIVE 1929 any state | CAF-->AUTO mode switch with AF_TRIGGER | trigger-reachable states from INACTIVE | Mode switch with trigger, INACTIVE is skipped 1930 any state | AUTO-->CAF mode switch | passively reachable states from INACTIVE | Mode switch without trigger, passive transient state is skipped 1931 </details> 1932 </entry> 1933 <entry name="afTriggerId" type="int32" visibility="system" deprecated="true"> 1934 <description>The ID sent with the latest 1935 CAMERA2_TRIGGER_AUTOFOCUS call</description> 1936 <details>Must be 0 if no CAMERA2_TRIGGER_AUTOFOCUS trigger 1937 received yet by HAL.
Always updated even if AF algorithm 1938 ignores the trigger</details> 1939 </entry> 1940 <clone entry="android.control.awbLock" kind="controls"> 1941 </clone> 1942 <clone entry="android.control.awbMode" kind="controls"> 1943 </clone> 1944 <clone entry="android.control.awbRegions" kind="controls"> 1945 </clone> 1946 <clone entry="android.control.captureIntent" kind="controls"> 1947 </clone> 1948 <entry name="awbState" type="byte" visibility="public" enum="true"> 1949 <enum> 1950 <value>INACTIVE 1951 <notes>AWB is not in auto mode, or has not yet started metering. 1952 1953 When a camera device is opened, it starts in this 1954 state. This is a transient state, the camera device may 1955 skip reporting this state in capture 1956 result.</notes></value> 1957 <value>SEARCHING 1958 <notes>AWB doesn't yet have a good set of control 1959 values for the current scene. 1960 1961 This is a transient state, the camera device 1962 may skip reporting this state in capture result.</notes></value> 1963 <value>CONVERGED 1964 <notes>AWB has a good set of control values for the 1965 current scene.</notes></value> 1966 <value>LOCKED 1967 <notes>AWB has been locked. 1968 </notes></value> 1969 </enum> 1970 <description>Current state of auto-white balance (AWB) algorithm.</description> 1971 <details>Switching between or enabling AWB modes (android.control.awbMode) always 1972 resets the AWB state to INACTIVE. Similarly, switching between android.control.mode, 1973 or android.control.sceneMode if `android.control.mode == USE_SCENE_MODE` resets all 1974 the algorithm states to INACTIVE. 1975 1976 The camera device can do several state transitions between two results, if it is 1977 allowed by the state transition table. So INACTIVE may never actually be seen in 1978 a result. 1979 1980 The state in the result is the state for this image (in sync with this image): if 1981 AWB state becomes CONVERGED, then the image data associated with this result should 1982 be good to use. 
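The AWB state behavior described above can be illustrated with a small transition-table sketch. This is a hypothetical model, not camera2 API code; the transition-cause names (`scan_started`, `scan_finished`, `awb_lock_on`, `awb_lock_off`) are invented shorthand for the transition causes listed in this entry's AWB_MODE_AUTO table:

```python
# Illustrative model of the AWB_MODE_AUTO state machine (names are
# invented shorthand, not camera2 API symbols).
TRANSITIONS = {
    ("INACTIVE", "scan_started"): "SEARCHING",
    ("INACTIVE", "awb_lock_on"): "LOCKED",
    ("SEARCHING", "scan_finished"): "CONVERGED",
    ("SEARCHING", "awb_lock_on"): "LOCKED",
    ("CONVERGED", "scan_started"): "SEARCHING",
    ("CONVERGED", "awb_lock_on"): "LOCKED",
    ("LOCKED", "awb_lock_off"): "SEARCHING",
}

def next_state(state, cause):
    # A (state, cause) pair with no listed transition leaves the state unchanged.
    return TRANSITIONS.get((state, cause), state)

# Transient states may be skipped between two results, so an application
# can legally observe INACTIVE followed directly by CONVERGED.
s = next_state("INACTIVE", "scan_started")
s = next_state(s, "scan_finished")
print(s)  # CONVERGED
```

Because the camera device may fold several of these transitions into the interval between two capture results, application code should key off the state it actually receives rather than expecting every intermediate state to appear.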
1983 1984 Below are state transition tables for different AWB modes. 1985 1986 When `android.control.awbMode != AWB_MODE_AUTO`: 1987 1988 State | Transition Cause | New State | Notes 1989 :------------:|:----------------:|:---------:|:-----------------------: 1990 INACTIVE | |INACTIVE |Camera device auto white balance algorithm is disabled 1991 1992 When android.control.awbMode is AWB_MODE_AUTO: 1993 1994 State | Transition Cause | New State | Notes 1995 :-------------:|:--------------------------------:|:-------------:|:-----------------: 1996 INACTIVE | Camera device initiates AWB scan | SEARCHING | Values changing 1997 INACTIVE | android.control.awbLock is ON | LOCKED | Values locked 1998 SEARCHING | Camera device finishes AWB scan | CONVERGED | Good values, not changing 1999 SEARCHING | android.control.awbLock is ON | LOCKED | Values locked 2000 CONVERGED | Camera device initiates AWB scan | SEARCHING | Values changing 2001 CONVERGED | android.control.awbLock is ON | LOCKED | Values locked 2002 LOCKED | android.control.awbLock is OFF | SEARCHING | Values not good after unlock 2003 2004 For the above table, the camera device may skip reporting any state changes that happen 2005 without application intervention (i.e. mode switch, trigger, locking). Any state that 2006 can be skipped in that manner is called a transient state. 2007 2008 For example, for this AWB mode (AWB_MODE_AUTO), in addition to the state transitions 2009 listed in above table, it is also legal for the camera device to skip one or more 2010 transient states between two results. See below table for examples: 2011 2012 State | Transition Cause | New State | Notes 2013 :-------------:|:--------------------------------:|:-------------:|:-----------------: 2014 INACTIVE | Camera device finished AWB scan | CONVERGED | Values are already good, transient states are skipped by camera device. 
2015 LOCKED | android.control.awbLock is OFF | CONVERGED | Values good after unlock, transient states are skipped by camera device. 2016 </details> 2017 </entry> 2018 <clone entry="android.control.effectMode" kind="controls"> 2019 </clone> 2020 <clone entry="android.control.mode" kind="controls"> 2021 </clone> 2022 <clone entry="android.control.sceneMode" kind="controls"> 2023 </clone> 2024 <clone entry="android.control.videoStabilizationMode" kind="controls"> 2025 </clone> 2026 </dynamic> 2027 </section> 2028 <section name="demosaic"> 2029 <controls> 2030 <entry name="mode" type="byte" enum="true"> 2031 <enum> 2032 <value>FAST 2033 <notes>Minimal or no slowdown of frame rate compared to 2034 Bayer RAW output.</notes></value> 2035 <value>HIGH_QUALITY 2036 <notes>Improved processing quality but the frame rate is slowed down 2037 relative to raw output.</notes></value> 2038 </enum> 2039 <description>Controls the quality of the demosaicing 2040 processing.</description> 2041 <tag id="FUTURE" /> 2042 </entry> 2043 </controls> 2044 </section> 2045 <section name="edge"> 2046 <controls> 2047 <entry name="mode" type="byte" visibility="public" enum="true"> 2048 <enum> 2049 <value>OFF 2050 <notes>No edge enhancement is applied.</notes></value> 2051 <value>FAST 2052 <notes>Apply edge enhancement at a quality level that does not slow down frame rate relative to sensor 2053 output</notes></value> 2054 <value>HIGH_QUALITY 2055 <notes>Apply high-quality edge enhancement, at a cost of reducing output frame rate. 2056 </notes></value> 2057 </enum> 2058 <description>Operation mode for edge 2059 enhancement.</description> 2060 <details>Edge/sharpness/detail enhancement. OFF means no 2061 enhancement will be applied by the camera device. 2062 2063 This must be set to one of the modes listed in android.edge.availableEdgeModes. 2064 2065 FAST/HIGH_QUALITY both mean camera device determined enhancement 2066 will be applied. 
HIGH_QUALITY mode indicates that the 2067 camera device will use the highest-quality enhancement algorithms, 2068 even if it slows down capture rate. FAST means the camera device will 2069 not slow down capture rate when applying edge enhancement.</details> 2070 <tag id="V1" /> 2071 </entry> 2072 <entry name="strength" type="byte"> 2073 <description>Control the amount of edge enhancement 2074 applied to the images</description> 2075 <units>1-10; 10 is maximum sharpening</units> 2076 <tag id="FUTURE" /> 2077 </entry> 2078 </controls> 2079 <static> 2080 <entry name="availableEdgeModes" type="byte" visibility="public" 2081 type_notes="list of enums" container="array" typedef="enumList"> 2082 <array> 2083 <size>n</size> 2084 </array> 2085 <description> 2086 The set of edge enhancement modes supported by this camera device. 2087 </description> 2088 <details> 2089 This tag lists the valid modes for android.edge.mode. 2090 2091 Full-capability camera devices must always support OFF and FAST. 2092 </details> 2093 <tag id="V1" /> 2094 </entry> 2095 </static> 2096 <dynamic> 2097 <clone entry="android.edge.mode" kind="controls"> 2098 <tag id="V1" /> 2099 </clone> 2100 </dynamic> 2101 </section> 2102 <section name="flash"> 2103 <controls> 2104 <entry name="firingPower" type="byte"> 2105 <description>Power for flash firing/torch</description> 2106 <units>10 is max power; 0 is no flash. Linear</units> 2107 <range>0 - 10</range> 2108 <details>Power for snapshot may use a different scale than 2109 for torch mode. 
Only one entry for torch mode will be 2110 used</details> 2111 <tag id="FUTURE" /> 2112 </entry> 2113 <entry name="firingTime" type="int64"> 2114 <description>Firing time of flash relative to start of 2115 exposure</description> 2116 <units>nanoseconds</units> 2117 <range>0-(exposure time-flash duration)</range> 2118 <details>Clamped to (0, exposure time - flash 2119 duration).</details> 2120 <tag id="FUTURE" /> 2121 </entry> 2122 <entry name="mode" type="byte" visibility="public" enum="true"> 2123 <enum> 2124 <value>OFF 2125 <notes> 2126 Do not fire the flash for this capture. 2127 </notes> 2128 </value> 2129 <value>SINGLE 2130 <notes> 2131 If the flash is available and charged, fire flash 2132 for this capture. 2133 </notes> 2134 </value> 2135 <value>TORCH 2136 <notes> 2137 Transition flash to continuously on. 2138 </notes> 2139 </value> 2140 </enum> 2141 <description>The desired mode for the camera device's flash control.</description> 2142 <details> 2143 This control is only effective when a flash unit is available 2144 (`android.flash.info.available == true`). 2145 2146 When this control is used, the android.control.aeMode must be set to ON or OFF. 2147 Otherwise, the camera device auto-exposure related flash control (ON_AUTO_FLASH, 2148 ON_ALWAYS_FLASH, or ON_AUTO_FLASH_REDEYE) will override this control. 2149 2150 When set to OFF, the camera device will not fire flash for this capture. 2151 2152 When set to SINGLE, the camera device will fire flash regardless of the camera 2153 device's auto-exposure routine's result. When used in the still capture case, this 2154 control should be used along with the auto-exposure (AE) precapture metering sequence 2155 (android.control.aePrecaptureTrigger), otherwise, the image may be incorrectly exposed. 2156 2157 When set to TORCH, the flash will be on continuously. This mode can be used 2158 for use cases such as preview, auto-focus assist, still capture, or video recording.
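The precedence rule above can be sketched as a small decision function. This is an illustration only; `effective_flash_control` and its return values are hypothetical names, not camera2 API symbols:

```python
# Hypothetical sketch of the flash-control precedence described above:
# android.flash.mode is honored only when android.control.aeMode is ON or
# OFF; the AE modes with built-in flash control take precedence, and with
# no flash unit the flash controls do nothing.
def effective_flash_control(ae_mode, flash_mode, flash_available=True):
    if not flash_available:
        return "NONE"            # no flash unit: flash controls do nothing
    if ae_mode in ("ON", "OFF"):
        return flash_mode        # OFF, SINGLE, or TORCH is honored as set
    return "AE_CONTROLLED"       # AE routine decides when to fire

print(effective_flash_control("ON", "TORCH"))             # TORCH
print(effective_flash_control("ON_AUTO_FLASH", "TORCH"))  # AE_CONTROLLED
```

In other words, an application that wants direct flash control (e.g. TORCH for a flashlight feature) must first take AE out of its auto-flash modes.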
2159 2160 The flash status will be reported by android.flash.state in the capture result metadata. 2161 </details> 2162 <tag id="BC" /> 2163 </entry> 2164 </controls> 2165 <static> 2166 <namespace name="info"> 2167 <entry name="available" type="byte" visibility="public" enum="true" typedef="boolean"> 2168 <enum> 2169 <value>FALSE</value> 2170 <value>TRUE</value> 2171 </enum> 2172 <description>Whether this camera device has a 2173 flash.</description> 2174 <details>If no flash, none of the flash controls do 2175 anything. All other metadata should return 0.</details> 2176 <tag id="BC" /> 2177 </entry> 2178 <entry name="chargeDuration" type="int64"> 2179 <description>Time taken before flash can fire 2180 again</description> 2181 <units>nanoseconds</units> 2182 <range>0-1e9</range> 2183 <details>1 second too long/too short for recharge? Should 2184 this be power-dependent?</details> 2185 <tag id="FUTURE" /> 2186 </entry> 2187 </namespace> 2188 <entry name="colorTemperature" type="byte"> 2189 <description>The x,y whitepoint of the 2190 flash</description> 2191 <units>pair of floats</units> 2192 <range>0-1 for both</range> 2193 <tag id="FUTURE" /> 2194 </entry> 2195 <entry name="maxEnergy" type="byte"> 2196 <description>Max energy output of the flash for a full 2197 power single flash</description> 2198 <units>lumen-seconds</units> 2199 <range>&gt;= 0</range> 2200 <tag id="FUTURE" /> 2201 </entry> 2202 </static> 2203 <dynamic> 2204 <clone entry="android.flash.firingPower" kind="controls"> 2205 </clone> 2206 <clone entry="android.flash.firingTime" kind="controls"> 2207 </clone> 2208 <clone entry="android.flash.mode" kind="controls"></clone> 2209 <entry name="state" type="byte" visibility="public" enum="true"> 2210 <enum> 2211 <value>UNAVAILABLE 2212 <notes>No flash on camera.</notes></value> 2213 <value>CHARGING 2214 <notes>Flash is charging and cannot be fired.</notes></value> 2215 <value>READY 2216 <notes>Flash is ready to fire.</notes></value> 2217 <value>FIRED 2218 
<notes>Flash fired for this capture.</notes></value> 2219 <value>PARTIAL 2220 <notes>Flash partially illuminated this frame. 2221 2222 This is usually due to the next or previous frame having 2223 the flash fire, and the flash spilling into this capture 2224 due to hardware limitations.</notes></value> 2225 </enum> 2226 <description>Current state of the flash 2227 unit.</description> 2228 <details> 2229 When the camera device doesn't have a flash unit 2230 (i.e. `android.flash.info.available == false`), this state will always be UNAVAILABLE. 2231 Other states indicate the current flash status. 2232 </details> 2233 </entry> 2234 </dynamic> 2235 </section> 2236 <section name="hotPixel"> 2237 <controls> 2238 <entry name="mode" type="byte" visibility="public" enum="true"> 2239 <enum> 2240 <value>OFF 2241 <notes> 2242 No hot pixel correction is applied. 2243 2244 The frame rate must not be reduced relative to sensor raw output 2245 for this option. 2246 2247 The hotpixel map may be returned in android.statistics.hotPixelMap. 2248 </notes> 2249 </value> 2250 <value>FAST 2251 <notes> 2252 Hot pixel correction is applied, without reducing frame 2253 rate relative to sensor raw output. 2254 2255 The hotpixel map may be returned in android.statistics.hotPixelMap. 2256 </notes> 2257 </value> 2258 <value>HIGH_QUALITY 2259 <notes> 2260 High-quality hot pixel correction is applied, at a cost 2261 of reducing frame rate relative to sensor raw output. 2262 2263 The hotpixel map may be returned in android.statistics.hotPixelMap. 2264 </notes> 2265 </value> 2266 </enum> 2267 <description> 2268 Set operational mode for hot pixel correction. 2269 </description> 2270 <details> 2271 Valid modes for this camera device are listed in 2272 android.hotPixel.availableHotPixelModes. 2273 2274 Hotpixel correction interpolates out, or otherwise removes, pixels 2275 that do not accurately encode the incoming light (i.e. pixels that 2276 are stuck at an arbitrary value).
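The "interpolates out" behavior can be pictured with a toy example. This is not the HAL's algorithm (actual correction is device-specific and operates on raw sensor data); it only shows the idea of replacing a stuck pixel with an estimate derived from its neighbors:

```python
# Toy illustration (assumption: not the actual HAL algorithm) of
# interpolating out known hot pixels by replacing each one with the
# mean of its valid 4-connected neighbors.
def correct_hot_pixels(image, hot_pixels):
    """image: list of rows of pixel values; hot_pixels: set of (row, col)."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]  # leave the input untouched
    for (r, c) in hot_pixels:
        neighbors = [
            image[nr][nc]
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
            if nr in range(h) and nc in range(w) and (nr, nc) not in hot_pixels
        ]
        if neighbors:
            out[r][c] = sum(neighbors) / len(neighbors)
    return out

# A pixel stuck at 255 in a flat field of 10 gets pulled back to 10.
flat = [[10, 10, 10], [10, 255, 10], [10, 10, 10]]
print(correct_hot_pixels(flat, {(1, 1)})[1][1])  # 10.0
```

Real implementations work on the Bayer mosaic and use same-color-channel neighbors, but the principle is the same: a stuck pixel's value is discarded and reconstructed from the pixels around it.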
2277 </details> 2278 <tag id="V1" /> 2279 <tag id="DNG" /> 2280 </entry> 2281 </controls> 2282 <static> 2283 <entry name="availableHotPixelModes" type="byte" visibility="public" 2284 type_notes="list of enums" container="array" typedef="enumList"> 2285 <array> 2286 <size>n</size> 2287 </array> 2288 <description> 2289 The set of hot pixel correction modes that are supported by this 2290 camera device. 2291 </description> 2292 <details> 2293 This tag lists valid modes for android.hotPixel.mode. 2294 2295 FULL mode camera devices will always support FAST. 2296 </details> 2297 <hal_details> 2298 To avoid performance issues, there will be significantly fewer hot 2299 pixels than actual pixels on the camera sensor. 2300 </hal_details> 2301 <tag id="V1" /> 2302 <tag id="DNG" /> 2303 </entry> 2304 </static> 2305 <dynamic> 2306 <clone entry="android.hotPixel.mode" kind="controls"> 2307 <tag id="V1" /> 2308 <tag id="DNG" /> 2309 </clone> 2310 </dynamic> 2311 </section> 2312 <section name="jpeg"> 2313 <controls> 2314 <entry name="gpsLocation" type="byte" visibility="public" synthetic="true" 2315 typedef="location"> 2316 <description> 2317 A location object to use when generating image GPS metadata. 2318 </description> 2319 </entry> 2320 <entry name="gpsCoordinates" type="double" visibility="hidden" 2321 type_notes="latitude, longitude, altitude. 
First two in degrees, the third in meters" 2322 container="array"> 2323 <array> 2324 <size>3</size> 2325 </array> 2326 <description>GPS coordinates to include in output JPEG 2327 EXIF</description> 2328 <range>(-180, 180], [-90, 90], [-inf, inf]</range> 2329 <tag id="BC" /> 2330 </entry> 2331 <entry name="gpsProcessingMethod" type="byte" visibility="hidden" 2332 typedef="string"> 2333 <description>32 characters describing GPS algorithm to 2334 include in EXIF</description> 2335 <units>UTF-8 null-terminated string</units> 2336 <tag id="BC" /> 2337 </entry> 2338 <entry name="gpsTimestamp" type="int64" visibility="hidden"> 2339 <description>Time GPS fix was made to include in 2340 EXIF</description> 2341 <units>UTC in seconds since January 1, 1970</units> 2342 <tag id="BC" /> 2343 </entry> 2344 <entry name="orientation" type="int32" visibility="public"> 2345 <description>Orientation of JPEG image to 2346 write</description> 2347 <units>Degrees in multiples of 90</units> 2348 <range>0, 90, 180, 270</range> 2349 <tag id="BC" /> 2350 </entry> 2351 <entry name="quality" type="byte" visibility="public"> 2352 <description>Compression quality of the final JPEG 2353 image.</description> 2354 <range>1-100; larger is higher quality</range> 2355 <details>85-95 is the typical usage range.</details> 2356 <tag id="BC" /> 2357 </entry> 2358 <entry name="thumbnailQuality" type="byte" visibility="public"> 2359 <description>Compression quality of JPEG 2360 thumbnail.</description> 2361 <range>1-100; larger is higher quality</range> 2362 <tag id="BC" /> 2363 </entry> 2364 <entry name="thumbnailSize" type="int32" visibility="public" 2365 container="array" typedef="size"> 2366 <array> 2367 <size>2</size> 2368 </array> 2369 <description>Resolution of embedded JPEG thumbnail.</description> 2370 <range>Size must be one of the sizes from android.jpeg.availableThumbnailSizes</range> 2371 <details>When set to (0, 0), the JPEG EXIF will not contain a thumbnail, 2372 but the captured JPEG will
still be a valid image. 2373 2374 When a JPEG image capture is issued, the selected thumbnail size should have 2375 the same aspect ratio as the JPEG image. 2376 2377 If the thumbnail image aspect ratio differs from the JPEG primary image aspect 2378 ratio, the camera device creates the thumbnail by cropping it from the primary image. 2379 For example, if the primary image has a 4:3 aspect ratio and the thumbnail image has a 2380 16:9 aspect ratio, the primary image will be cropped vertically (letterboxed) to 2381 generate the thumbnail image. The thumbnail image will always have a smaller Field 2382 Of View (FOV) than the primary image when aspect ratios differ. 2383 </details> 2384 <hal_details> 2385 The HAL must not squeeze or stretch the downscaled primary image to generate the thumbnail. 2386 The cropping must be done on the primary JPEG image rather than the sensor active array. 2387 The stream cropping rule specified by "S5. Cropping" in camera3.h doesn't apply to the 2388 thumbnail image cropping. 2389 </hal_details> 2390 <tag id="BC" /> 2391 </entry> 2392 </controls> 2393 <static> 2394 <entry name="availableThumbnailSizes" type="int32" visibility="public" 2395 container="array" typedef="size"> 2396 <array> 2397 <size>2</size> 2398 <size>n</size> 2399 </array> 2400 <description>Supported resolutions for the JPEG thumbnail.</description> 2401 <range>Will include at least one valid resolution, plus 2402 (0,0) for no thumbnail generation, and each size will be distinct.</range> 2403 <details>The following conditions will be satisfied for this size list: 2404 2405 * The sizes will be sorted by increasing pixel area (width x height). 2406 If several resolutions have the same area, they will be sorted by increasing width. 2407 * The aspect ratio of the largest thumbnail size will be the same as the 2408 aspect ratio of the largest JPEG output size in android.scaler.availableStreamConfigurations.
2409 The largest size is defined as the size that has the largest pixel area 2410 in a given size list. 2411 * Each output JPEG size in android.scaler.availableStreamConfigurations will have at least 2412 one corresponding size that has the same aspect ratio in availableThumbnailSizes, 2413 and vice versa. 2414 * All non (0, 0) sizes will have non-zero widths and heights.</details> 2415 <tag id="BC" /> 2416 </entry> 2417 <entry name="maxSize" type="int32" visibility="system"> 2418 <description>Maximum size in bytes for the compressed 2419 JPEG buffer</description> 2420 <range>Must be large enough to fit any JPEG produced by 2421 the camera</range> 2422 <details>This is used for sizing the gralloc buffers for 2423 JPEG</details> 2424 </entry> 2425 </static> 2426 <dynamic> 2427 <clone entry="android.jpeg.gpsLocation" kind="controls"> 2428 </clone> 2429 <clone entry="android.jpeg.gpsCoordinates" kind="controls"> 2430 </clone> 2431 <clone entry="android.jpeg.gpsProcessingMethod" 2432 kind="controls"></clone> 2433 <clone entry="android.jpeg.gpsTimestamp" kind="controls"> 2434 </clone> 2435 <clone entry="android.jpeg.orientation" kind="controls"> 2436 </clone> 2437 <clone entry="android.jpeg.quality" kind="controls"> 2438 </clone> 2439 <entry name="size" type="int32"> 2440 <description>The size of the compressed JPEG image, in 2441 bytes</description> 2442 <range>&gt;= 0</range> 2443 <details>If no JPEG output is produced for the request, 2444 this must be 0. 2445 2446 Otherwise, this describes the real size of the compressed 2447 JPEG image placed in the output stream. 
More specifically, 2448 if android.jpeg.maxSize = 1000000, and a specific capture 2449 has android.jpeg.size = 500000, then the output buffer from 2450 the JPEG stream will be 1000000 bytes, of which the first 2451 500000 make up the real data.</details> 2452 <tag id="FUTURE" /> 2453 </entry> 2454 <clone entry="android.jpeg.thumbnailQuality" 2455 kind="controls"></clone> 2456 <clone entry="android.jpeg.thumbnailSize" kind="controls"> 2457 </clone> 2458 </dynamic> 2459 </section> 2460 <section name="lens"> 2461 <controls> 2462 <entry name="aperture" type="float" visibility="public"> 2463 <description>The ratio of lens focal length to the effective 2464 aperture diameter.</description> 2465 <units>f-number (f/NNN)</units> 2466 <range>android.lens.info.availableApertures</range> 2467 <details>This will only be supported on the camera devices that 2468 have variable aperture lens. The aperture value can only be 2469 one of the values listed in android.lens.info.availableApertures. 2470 2471 When this is supported and android.control.aeMode is OFF, 2472 this can be set along with android.sensor.exposureTime, 2473 android.sensor.sensitivity, and android.sensor.frameDuration 2474 to achieve manual exposure control. 2475 2476 The requested aperture value may take several frames to reach the 2477 requested value; the camera device will report the current (intermediate) 2478 aperture size in capture result metadata while the aperture is changing. 2479 While the aperture is still changing, android.lens.state will be set to MOVING. 2480 2481 When this is supported and android.control.aeMode is one of 2482 the ON modes, this will be overridden by the camera device 2483 auto-exposure algorithm, the overridden values are then provided 2484 back to the user in the corresponding result.</details> 2485 <tag id="V1" /> 2486 </entry> 2487 <entry name="filterDensity" type="float" visibility="public"> 2488 <description> 2489 State of lens neutral density filter(s). 
2490 </description> 2491 <units>Steps of Exposure Value (EV).</units> 2492 <range>android.lens.info.availableFilterDensities</range> 2493 <details> 2494 This will not be supported on most camera devices. On devices 2495 where this is supported, this may only be set to one of the 2496 values included in android.lens.info.availableFilterDensities. 2497 2498 Lens filters are typically used to lower the amount of light the 2499 sensor is exposed to (measured in steps of EV). As used here, an EV 2500 step is the standard logarithmic representation, which is 2501 non-negative and inversely proportional to the amount of light 2502 hitting the sensor. For example, setting this to 0 would result 2503 in no reduction of the incoming light, and setting this to 2 would 2504 mean that the filter is set to reduce incoming light by two stops 2505 (allowing 1/4 of the prior amount of light to reach the sensor). 2506 2507 It may take several frames before the lens filter density changes 2508 to the requested value. While the filter density is still changing, 2509 android.lens.state will be set to MOVING. 2510 </details> 2511 <tag id="V1" /> 2512 </entry> 2513 <entry name="focalLength" type="float" visibility="public"> 2514 <description> 2515 The current lens focal length; used for optical zoom. 2516 </description> 2517 <units>focal length in mm</units> 2518 <range>android.lens.info.availableFocalLengths</range> 2519 <details> 2520 This setting controls the physical focal length of the camera 2521 device's lens. Changing the focal length changes the field of 2522 view of the camera device, and is usually used for optical zoom. 2523 2524 Like android.lens.focusDistance and android.lens.aperture, this 2525 setting won't be applied instantaneously, and it may take several 2526 frames before the lens can change to the requested focal length. 2527 While the focal length is still changing, android.lens.state will 2528 be set to MOVING.
2529 2530 This is expected not to be supported on most devices. 2531 </details> 2532 <tag id="V1" /> 2533 </entry> 2534 <entry name="focusDistance" type="float" visibility="public"> 2535 <description>Distance to plane of sharpest focus, 2536 measured from frontmost surface of the lens.</description> 2537 <units>See android.lens.info.focusDistanceCalibration for details.</units> 2538 <range>&gt;= 0</range> 2539 <details>0 means infinity focus. Used value will be clamped 2540 to [0, android.lens.info.minimumFocusDistance]. 2541 2542 Like android.lens.focalLength, this setting won't be applied 2543 instantaneously, and it may take several frames before the lens 2544 can move to the requested focus distance. While the lens is still moving, 2545 android.lens.state will be set to MOVING. 2546 </details> 2547 <tag id="BC" /> 2548 <tag id="V1" /> 2549 </entry> 2550 <entry name="opticalStabilizationMode" type="byte" visibility="public" 2551 enum="true"> 2552 <enum> 2553 <value>OFF 2554 <notes>Optical stabilization is unavailable.</notes> 2555 </value> 2556 <value optional="true">ON 2557 <notes>Optical stabilization is enabled.</notes> 2558 </value> 2559 </enum> 2560 <description> 2561 Sets whether the camera device uses optical image stabilization (OIS) 2562 when capturing images. 2563 </description> 2564 <range>android.lens.info.availableOpticalStabilization</range> 2565 <details> 2566 OIS is used to compensate for motion blur due to small 2567 movements of the camera during capture. Unlike digital image 2568 stabilization (android.control.videoStabilizationMode), OIS 2569 makes use of mechanical elements to stabilize the camera 2570 sensor, and thus allows for longer exposure times before 2571 camera shake becomes apparent. 2572 2573 Not all devices will support OIS; see 2574 android.lens.info.availableOpticalStabilization for 2575 available controls. 
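<!-- A hypothetical Java sketch (not part of the metadata schema) of how a client might
     request OIS only when it appears in android.lens.info.availableOpticalStabilization.
     The class name and numeric constants (OFF = 0, ON = 1, assumed from the enum
     declaration order above) are illustrative assumptions.

```java
// Illustrative sketch only; constant values assumed from enum declaration order.
public class OisSelector {
    public static final int OFF = 0;
    public static final int ON = 1;

    // Request OIS only when the device lists it as available;
    // otherwise fall back to OFF.
    public static int selectOisMode(int[] availableModes) {
        for (int mode : availableModes) {
            if (mode == ON) {
                return ON;
            }
        }
        return OFF;
    }
}
```
-->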
2576 </details> 2577 <tag id="V1" /> 2578 </entry> 2579 </controls> 2580 <static> 2581 <namespace name="info"> 2582 <entry name="availableApertures" type="float" visibility="public" 2583 container="array"> 2584 <array> 2585 <size>n</size> 2586 </array> 2587 <description>List of supported aperture 2588 values.</description> 2589 <range>one entry required, &gt; 0</range> 2590 <details>If the camera device doesn't support variable apertures, 2591 the listed value will be the fixed aperture. 2592 2593 If the camera device supports variable apertures, the aperture values 2594 in this list will be sorted in ascending order.</details> 2595 <tag id="V1" /> 2596 </entry> 2597 <entry name="availableFilterDensities" type="float" visibility="public" 2598 container="array"> 2599 <array> 2600 <size>n</size> 2601 </array> 2602 <description> 2603 List of supported neutral density filter values for 2604 android.lens.filterDensity. 2605 </description> 2606 <range> 2607 At least one value is required. Values must be &gt;= 0. 2608 </range> 2609 <details> 2610 If changing android.lens.filterDensity is not supported, 2611 availableFilterDensities must contain only 0. Otherwise, this 2612 list contains only the exact filter density values available on 2613 this camera device. 2614 </details> 2615 <tag id="V1" /> 2616 </entry> 2617 <entry name="availableFocalLengths" type="float" visibility="public" 2618 type_notes="The list of available focal lengths" 2619 container="array"> 2620 <array> 2621 <size>n</size> 2622 </array> 2623 <description> 2624 The available focal lengths for this device for use with 2625 android.lens.focalLength. 2626 </description> 2627 <range> 2628 Each value in this list must be &gt; 0. This list must 2629 contain at least one value. 2630 </range> 2631 <details> 2632 If optical zoom is not supported, this will only report 2633 a single value corresponding to the static focal length of the 2634 device.
Otherwise, this will report every focal length supported 2635 by the device. 2636 </details> 2637 <tag id="BC" /> 2638 <tag id="V1" /> 2639 </entry> 2640 <entry name="availableOpticalStabilization" type="byte" 2641 visibility="public" type_notes="list of enums" container="array" 2642 typedef="enumList"> 2643 <array> 2644 <size>n</size> 2645 </array> 2646 <description> 2647 List containing a subset of the optical image 2648 stabilization (OIS) modes specified in 2649 android.lens.opticalStabilizationMode. 2650 </description> 2651 <details> 2652 If OIS is not implemented for a given camera device, this will 2653 contain only OFF. 2654 </details> 2655 <tag id="V1" /> 2656 </entry> 2657 <entry name="hyperfocalDistance" type="float" visibility="public" optional="true"> 2658 <description>Optional. Hyperfocal distance for this lens.</description> 2659 <units>See android.lens.info.focusDistanceCalibration for details.</units> 2660 <range>If lens is fixed focus, &gt;= 0. If lens has focuser unit, the range is 2661 `(0, android.lens.info.minimumFocusDistance]`</range> 2662 <details> 2663 If the lens is not fixed focus, the camera device will report this 2664 field when android.lens.info.focusDistanceCalibration is APPROXIMATE or CALIBRATED. 2665 </details> 2666 </entry> 2667 <entry name="minimumFocusDistance" type="float" visibility="public"> 2668 <description>Shortest distance from frontmost surface 2669 of the lens that can be focused correctly.</description> 2670 <units>See android.lens.info.focusDistanceCalibration for details.</units> 2671 <range>&gt;= 0</range> 2672 <details>If the lens is fixed-focus, this should be 2673 0.</details> 2674 <tag id="V1" /> 2675 </entry> 2676 <entry name="shadingMapSize" type="int32" visibility="hidden" 2677 type_notes="width and height (N, M) of lens shading map provided by the camera device." 
2678 container="array" typedef="size"> 2679 <array> 2680 <size>2</size> 2681 </array> 2682 <description>Dimensions of lens shading map.</description> 2683 <range>Both values &gt;= 1</range> 2684 <details> 2685 The map should be on the order of 30-40 rows and columns, and 2686 must be smaller than 64x64. 2687 </details> 2688 <tag id="V1" /> 2689 </entry> 2690 <entry name="focusDistanceCalibration" type="byte" visibility="public" enum="true"> 2691 <enum> 2692 <value>UNCALIBRATED 2693 <notes> 2694 The lens focus distance is not accurate, and the units used for 2695 android.lens.focusDistance do not correspond to any physical units. 2696 2697 Setting the lens to the same focus distance on separate occasions may 2698 result in a different real focus distance, depending on factors such 2699 as the orientation of the device, the age of the focusing mechanism, 2700 and the device temperature. The focus distance value will still be 2701 in the range of `[0, android.lens.info.minimumFocusDistance]`, where 0 2702 represents the farthest focus. 2703 </notes> 2704 </value> 2705 <value>APPROXIMATE 2706 <notes> 2707 The lens focus distance is measured in diopters. 2708 2709 However, setting the lens to the same focus distance 2710 on separate occasions may result in a different real 2711 focus distance, depending on factors such as the 2712 orientation of the device, the age of the focusing 2713 mechanism, and the device temperature. 2714 </notes> 2715 </value> 2716 <value>CALIBRATED 2717 <notes> 2718 The lens focus distance is measured in diopters, and 2719 is calibrated. 2720 2721 The lens mechanism is calibrated so that setting the 2722 same focus distance is repeatable on multiple 2723 occasions with good accuracy, and the focus distance 2724 corresponds to the real physical distance to the plane 2725 of best focus. 
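<!-- A hypothetical Java sketch (not part of the metadata schema) of the diopter
     conversion implied above: for APPROXIMATE and CALIBRATED devices the focus
     distance is in diopters, the reciprocal of the distance in meters, with 0
     meaning focus at infinity. The class and method names are illustrative.

```java
// Illustrative helper; diopters = 1 / meters, 0 diopters = infinity focus.
public class FocusDistanceConverter {
    public static double diopterToMeters(double diopters) {
        if (diopters <= 0.0) {
            return Double.POSITIVE_INFINITY; // 0 means infinity focus
        }
        return 1.0 / diopters;
    }

    public static double metersToDiopters(double meters) {
        if (Double.isInfinite(meters)) {
            return 0.0;
        }
        return 1.0 / meters;
    }
}
```
-->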
2726 </notes> 2727 </value> 2728 </enum> 2729 <description>The lens focus distance calibration quality.</description> 2730 <details> 2731 The lens focus distance calibration quality determines the reliability of 2732 focus-related metadata entries, i.e. android.lens.focusDistance, 2733 android.lens.focusRange, android.lens.info.hyperfocalDistance, and 2734 android.lens.info.minimumFocusDistance. 2735 </details> 2736 <tag id="V1" /> 2737 </entry> 2738 </namespace> 2739 <entry name="facing" type="byte" visibility="public" enum="true"> 2740 <enum> 2741 <value>FRONT 2742 <notes> 2743 The camera device faces the same direction as the device's screen. 2744 </notes></value> 2745 <value>BACK 2746 <notes> 2747 The camera device faces the opposite direction from the device's screen. 2748 </notes></value> 2749 </enum> 2750 <description>Direction the camera faces relative to the 2751 device screen.</description> 2752 </entry> 2753 <entry name="opticalAxisAngle" type="float" 2754 type_notes="degrees. The first value defines the angle of separation between the perpendicular to the screen and the camera optical axis. The second value defines the clockwise rotation of the optical axis from native device up." 2755 container="array"> 2756 <array> 2757 <size>2</size> 2758 </array> 2759 <description>Relative angle of the camera optical axis to the 2760 perpendicular axis from the display</description> 2761 <range>[0-90) for the first angle, [0-360) for the second</range> 2762 <details>Examples: 2763 2764 (0,0) means that the camera optical axis 2765 is perpendicular to the display surface; 2766 2767 (45,0) means that the camera points 45 degrees up when 2768 the device is held upright; 2769 2770 (45,90) means the camera points 45 degrees to the right when 2771 the device is held upright.
2772 2773 Use FACING field to determine perpendicular outgoing 2774 direction</details> 2775 <tag id="FUTURE" /> 2776 </entry> 2777 <entry name="position" type="float" container="array"> 2778 <array> 2779 <size>3, location in mm, in the sensor coordinate 2780 system</size> 2781 </array> 2782 <description>Coordinates of camera optical axis on 2783 device</description> 2784 <tag id="FUTURE" /> 2785 </entry> 2786 </static> 2787 <dynamic> 2788 <clone entry="android.lens.aperture" kind="controls"> 2789 <tag id="V1" /> 2790 </clone> 2791 <clone entry="android.lens.filterDensity" kind="controls"> 2792 <tag id="V1" /> 2793 </clone> 2794 <clone entry="android.lens.focalLength" kind="controls"> 2795 <tag id="BC" /> 2796 </clone> 2797 <clone entry="android.lens.focusDistance" kind="controls"> 2798 <details>Should be zero for fixed-focus cameras</details> 2799 <tag id="BC" /> 2800 </clone> 2801 <entry name="focusRange" type="float" visibility="public" 2802 type_notes="Range of scene distances that are in focus" 2803 container="array" typedef="pairFloatFloat"> 2804 <array> 2805 <size>2</size> 2806 </array> 2807 <description>The range of scene distances that are in 2808 sharp focus (depth of field).</description> 2809 <units>pair of focus distances in diopters: (near, 2810 far), see android.lens.info.focusDistanceCalibration for details.</units> 2811 <range>&gt;=0</range> 2812 <details>If variable focus not supported, can still report 2813 fixed depth of field range</details> 2814 <tag id="BC" /> 2815 </entry> 2816 <clone entry="android.lens.opticalStabilizationMode" 2817 kind="controls"> 2818 <tag id="V1" /> 2819 </clone> 2820 <entry name="state" type="byte" visibility="public" enum="true"> 2821 <enum> 2822 <value>STATIONARY 2823 <notes> 2824 The lens parameters (android.lens.focalLength, android.lens.focusDistance, 2825 android.lens.filterDensity and android.lens.aperture) are not changing. 
2826 </notes> 2827 </value> 2828 <value>MOVING 2829 <notes> 2830 One or more of the lens parameters 2831 (android.lens.focalLength, android.lens.focusDistance, 2832 android.lens.filterDensity or android.lens.aperture) is 2833 currently changing. 2834 </notes> 2835 </value> 2836 </enum> 2837 <description>Current lens status.</description> 2838 <details> 2839 For the lens parameters android.lens.focalLength, android.lens.focusDistance, 2840 android.lens.filterDensity and android.lens.aperture, when changes are requested, 2841 they may take several frames to reach the requested values. This state indicates 2842 the current status of the lens parameters. 2843 2844 When the state is STATIONARY, the lens parameters are not changing. This could be 2845 either because the parameters are all fixed, or because the lens has had enough 2846 time to reach the most recently-requested values. 2847 If none of these lens parameters is changeable for a camera device, that is: 2848 2849 * Fixed focus (`android.lens.info.minimumFocusDistance == 0`), which means the 2850 android.lens.focusDistance parameter will always be 0. 2851 * Fixed focal length (android.lens.info.availableFocalLengths contains a single value), 2852 which means optical zoom is not supported. 2853 * No ND filter (android.lens.info.availableFilterDensities contains only 0). 2854 * Fixed aperture (android.lens.info.availableApertures contains a single value). 2855 2856 then this state will always be STATIONARY. 2857 2858 When the state is MOVING, it indicates that at least one of the lens parameters 2859 is changing.
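<!-- A hypothetical Java sketch (not part of the metadata schema) of the
     STATIONARY/MOVING distinction described above: the lens is MOVING while
     any parameter has not yet reached its target. Class name, epsilon
     comparison, and numeric values (STATIONARY = 0, MOVING = 1, assumed
     from the enum declaration order) are illustrative assumptions.

```java
// Illustrative model only; constant values assumed from enum declaration order.
public class LensStateModel {
    public static final int STATIONARY = 0;
    public static final int MOVING = 1;

    // STATIONARY when every parameter is within epsilon of its target;
    // MOVING while any parameter is still changing toward its target.
    public static int lensState(float[] current, float[] target, float epsilon) {
        for (int i = 0; i < current.length; i++) {
            if (Math.abs(current[i] - target[i]) > epsilon) {
                return MOVING;
            }
        }
        return STATIONARY;
    }
}
```
-->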
2860 </details> 2861 <tag id="V1" /> 2862 </entry> 2863 </dynamic> 2864 </section> 2865 <section name="noiseReduction"> 2866 <controls> 2867 <entry name="mode" type="byte" visibility="public" enum="true"> 2868 <enum> 2869 <value>OFF 2870 <notes>No noise reduction is applied.</notes></value> 2871 <value>FAST 2872 <notes>Noise reduction is applied without reducing frame rate relative to sensor 2873 output.</notes></value> 2874 <value>HIGH_QUALITY 2875 <notes>High-quality noise reduction is applied, at the cost of reducing frame rate 2876 relative to sensor output.</notes></value> 2877 </enum> 2878 <description>Mode of operation for the noise reduction algorithm.</description> 2879 <details>Noise filtering control. OFF means no noise reduction 2880 will be applied by the camera device. 2881 2882 This must be set to a valid mode from 2883 android.noiseReduction.availableNoiseReductionModes. 2884 2885 FAST/HIGH_QUALITY both mean camera device determined noise filtering 2886 will be applied. HIGH_QUALITY mode indicates that the camera device 2887 will use the highest-quality noise filtering algorithms, 2888 even if it slows down capture rate. FAST means the camera device will not 2889 slow down capture rate when applying noise filtering.</details> 2890 <tag id="V1" /> 2891 </entry> 2892 <entry name="strength" type="byte"> 2893 <description>Control the amount of noise reduction 2894 applied to the images</description> 2895 <units>1-10; 10 is max noise reduction</units> 2896 <range>1 - 10</range> 2897 <tag id="FUTURE" /> 2898 </entry> 2899 </controls> 2900 <static> 2901 <entry name="availableNoiseReductionModes" type="byte" visibility="public" 2902 type_notes="list of enums" container="array" typedef="enumList"> 2903 <array> 2904 <size>n</size> 2905 </array> 2906 <description> 2907 The set of noise reduction modes supported by this camera device. 2908 </description> 2909 <details> 2910 This tag lists the valid modes for android.noiseReduction.mode. 
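<!-- A hypothetical Java sketch (not part of the metadata schema) of selecting the
     highest-quality noise reduction mode a device advertises. Class name and numeric
     values (OFF = 0, FAST = 1, HIGH_QUALITY = 2, assumed from the enum declaration
     order above) are illustrative assumptions.

```java
// Illustrative sketch only; constant values assumed from enum declaration order.
public class NoiseReductionSelector {
    public static final int OFF = 0;
    public static final int FAST = 1;
    public static final int HIGH_QUALITY = 2;

    // Prefer the best quality mode the device lists as available.
    public static int selectMode(int[] availableModes) {
        int best = OFF;
        for (int mode : availableModes) {
            if (mode > best) {
                best = mode;
            }
        }
        return best;
    }
}
```
-->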
2911 2912 Full-capability camera devices must always support OFF and FAST. 2913 </details> 2914 <tag id="V1" /> 2915 </entry> 2916 </static> 2917 <dynamic> 2918 <clone entry="android.noiseReduction.mode" kind="controls"> 2919 <tag id="V1" /> 2920 </clone> 2921 </dynamic> 2922 </section> 2923 <section name="quirks"> 2924 <static> 2925 <entry name="meteringCropRegion" type="byte" visibility="system" deprecated="true" optional="true"> 2926 <description>If set to 1, the camera service does not 2927 scale 'normalized' coordinates with respect to the crop 2928 region. This applies to metering input (a{e,f,wb}Region) 2929 and output (face rectangles).</description> 2930 <details>Normalized coordinates refer to those in the 2931 (-1000,1000) range mentioned in the 2932 android.hardware.Camera API. 2933 2934 HAL implementations should instead always use and emit 2935 sensor array-relative coordinates for all region data. Does 2936 not need to be listed in static metadata. Support will be 2937 removed in future versions of camera service.</details> 2938 </entry> 2939 <entry name="triggerAfWithAuto" type="byte" visibility="system" deprecated="true" optional="true"> 2940 <description>If set to 1, the camera service always 2941 switches to FOCUS_MODE_AUTO before issuing an AF 2942 trigger.</description> 2943 <details>HAL implementations should implement AF trigger 2944 modes for AUTO, MACRO, CONTINUOUS_FOCUS, and 2945 CONTINUOUS_PICTURE modes instead of using this flag. Does 2946 not need to be listed in static metadata.
Support will be 2947 removed in future versions of camera service</details> 2948 </entry> 2949 <entry name="useZslFormat" type="byte" visibility="system" deprecated="true" optional="true"> 2950 <description>If set to 1, the camera service uses 2951 CAMERA2_PIXEL_FORMAT_ZSL instead of 2952 HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED for the zero 2953 shutter lag stream</description> 2954 <details>HAL implementations should use gralloc usage flags 2955 to determine that a stream will be used for 2956 zero-shutter-lag, instead of relying on an explicit 2957 format setting. Does not need to be listed in static 2958 metadata. Support will be removed in future versions of 2959 camera service.</details> 2960 </entry> 2961 <entry name="usePartialResult" type="byte" visibility="hidden" deprecated="true" optional="true"> 2962 <description> 2963 If set to 1, the HAL will always split result 2964 metadata for a single capture into multiple buffers, 2965 returned using multiple process_capture_result calls. 2966 </description> 2967 <details> 2968 Does not need to be listed in static 2969 metadata. Support for partial results will be reworked in 2970 future versions of camera service. This quirk will stop 2971 working at that point; DO NOT USE without careful 2972 consideration of future support. 2973 </details> 2974 <hal_details> 2975 Refer to `camera3_capture_result::partial_result` 2976 for information on how to implement partial results. 2977 </hal_details> 2978 </entry> 2979 </static> 2980 <dynamic> 2981 <entry name="partialResult" type="byte" visibility="hidden" deprecated="true" optional="true" enum="true" typedef="boolean"> 2982 <enum> 2983 <value>FINAL 2984 <notes>The last or only metadata result buffer 2985 for this capture.</notes> 2986 </value> 2987 <value>PARTIAL 2988 <notes>A partial buffer of result metadata for this 2989 capture. 
More result buffers for this capture will be sent 2990 by the camera device, the last of which will be marked 2991 FINAL.</notes> 2992 </value> 2993 </enum> 2994 <description> 2995 Whether a result given to the framework is the 2996 final one for the capture, or only a partial that contains a 2997 subset of the full set of dynamic metadata 2998 values.</description> 2999 <range>Optional. Default value is FINAL.</range> 3000 <details> 3001 The entries in the result metadata buffers for a 3002 single capture may not overlap, except for this entry. The 3003 FINAL buffers must retain FIFO ordering relative to the 3004 requests that generate them, so the FINAL buffer for frame 3 must 3005 always be sent to the framework after the FINAL buffer for frame 2, and 3006 before the FINAL buffer for frame 4. PARTIAL buffers may be returned 3007 in any order relative to other frames, but all PARTIAL buffers for a given 3008 capture must arrive before the FINAL buffer for that capture. This entry may 3009 only be used by the camera device if quirks.usePartialResult is set to 1. 3010 </details> 3011 <hal_details> 3012 Refer to `camera3_capture_result::partial_result` 3013 for information on how to implement partial results. 3014 </hal_details> 3015 </entry> 3016 </dynamic> 3017 </section> 3018 <section name="request"> 3019 <controls> 3020 <entry name="frameCount" type="int32" visibility="system" deprecated="true"> 3021 <description>A frame counter set by the framework. Must 3022 be maintained unchanged in output frame. This value monotonically 3023 increases with every new result (that is, each new result has a unique 3024 frameCount value). 3025 </description> 3026 <units>incrementing integer</units> 3027 <range>Any int.</range> 3028 </entry> 3029 <entry name="id" type="int32" visibility="hidden"> 3030 <description>An application-specified ID for the current 3031 request. 
Must be maintained unchanged in output 3032 frame</description> 3033 <units>arbitrary integer assigned by application</units> 3034 <range>Any int</range> 3035 <tag id="V1" /> 3036 </entry> 3037 <entry name="inputStreams" type="int32" visibility="system" deprecated="true" 3038 container="array"> 3039 <array> 3040 <size>n</size> 3041 </array> 3042 <description>List which camera reprocess stream is used 3043 for the source of reprocessing data.</description> 3044 <units>List of camera reprocess stream IDs</units> 3045 <range> 3046 Typically, only one entry allowed, must be a valid reprocess stream ID. 3047 </range> 3048 <details>Only meaningful when android.request.type == 3049 REPROCESS. Ignored otherwise</details> 3050 <tag id="HAL2" /> 3051 </entry> 3052 <entry name="metadataMode" type="byte" visibility="system" 3053 enum="true"> 3054 <enum> 3055 <value>NONE 3056 <notes>No metadata should be produced on output, except 3057 for application-bound buffer data. If no 3058 application-bound streams exist, no frame should be 3059 placed in the output frame queue. If such streams 3060 exist, a frame should be placed on the output queue 3061 with null metadata but with the necessary output buffer 3062 information. Timestamp information should still be 3063 included with any output stream buffers</notes></value> 3064 <value>FULL 3065 <notes>All metadata should be produced. 
Statistics will 3066 only be produced if they are separately 3067 enabled</notes></value> 3068 </enum> 3069 <description>How much metadata to produce on 3070 output</description> 3071 <tag id="FUTURE" /> 3072 </entry> 3073 <entry name="outputStreams" type="int32" visibility="system" deprecated="true" 3074 container="array"> 3075 <array> 3076 <size>n</size> 3077 </array> 3078 <description>Lists the camera output streams to which image data 3079 from this capture must be sent</description> 3080 <units>List of camera stream IDs</units> 3081 <range>List must only include streams that have been 3082 created</range> 3083 <details>If no output streams are listed, then the image 3084 data should simply be discarded. The image data must 3085 still be captured for metadata and statistics production, 3086 and the lens and flash must operate as requested.</details> 3087 <tag id="HAL2" /> 3088 </entry> 3089 <entry name="type" type="byte" visibility="system" deprecated="true" enum="true"> 3090 <enum> 3091 <value>CAPTURE 3092 <notes>Capture a new image from the imaging hardware, 3093 and process it according to the 3094 settings</notes></value> 3095 <value>REPROCESS 3096 <notes>Process previously captured data; the 3097 android.request.inputStreams parameter determines the 3098 source reprocessing stream. TODO: Mark dynamic metadata 3099 needed for reprocessing with [RP]</notes></value> 3100 </enum> 3101 <description>The type of the request; either CAPTURE or 3102 REPROCESS. For HAL3, this tag is redundant. 3103 </description> 3104 <tag id="HAL2" /> 3105 </entry> 3106 </controls> 3107 <static> 3108 <entry name="maxNumOutputStreams" type="int32" visibility="hidden" 3109 container="array"> 3110 <array> 3111 <size>3</size> 3112 </array> 3113 <description>The maximum number of different types of output streams 3114 that can be configured and used simultaneously by a camera device. 3115 </description> 3116 <range> 3117 For processed (and stalling) format streams, &gt;= 1.
3118 3119 For Raw format (either stalling or non-stalling) streams, &gt;= 0. 3120 3121 For processed (but not stalling) format streams, &gt;= 3 3122 for FULL mode devices (`android.info.supportedHardwareLevel == FULL`); 3123 &gt;= 2 for LIMITED mode devices (`android.info.supportedHardwareLevel == LIMITED`). 3124 </range> 3125 <details> 3126 This is a 3 element tuple that contains the max number of simultaneous output 3127 streams for raw sensor, processed (but not stalling), and processed (and stalling) 3128 formats respectively. For example, assuming that JPEG is typically a processed and 3129 stalling stream, if the max number of raw sensor format output streams is 1, the max 3130 number of YUV streams is 3, and the max number of JPEG streams is 2, then this tuple should be `(1, 3, 2)`. 3131 3132 This lists the upper bound of the number of output streams supported by 3133 the camera device. Using more streams simultaneously may require more hardware and 3134 CPU resources and will consume more power. The image format for an output stream can 3135 be any supported format provided by android.scaler.availableStreamConfigurations. 3136 The formats defined in android.scaler.availableStreamConfigurations can be categorized 3137 into the 3 stream types as follows: 3138 3139 * Processed (and stalling): any non-RAW format with a stall duration &gt; 0. 3140 Typically the JPEG format (ImageFormat#JPEG). 3141 * Raw formats: ImageFormat#RAW_SENSOR and ImageFormat#RAW_OPAQUE. 3142 * Processed (but not-stalling): any non-RAW format without a stall duration. 3143 Typically ImageFormat#YUV_420_888, ImageFormat#NV21, ImageFormat#YV12. 3144 </details> 3145 <tag id="BC" /> 3146 </entry> 3147 <entry name="maxNumOutputRaw" type="int32" visibility="public" synthetic="true"> 3148 <description>The maximum number of output streams 3149 that can be configured and used simultaneously by a camera device 3150 for any `RAW` formats.
        </description>
        <range>
        &gt;= 0
        </range>
        <details>
        This value contains the maximum number of simultaneous output
        streams from the raw sensor.

        This lists the upper bound of the number of output streams supported by
        the camera device. Using more streams simultaneously may require more hardware and
        CPU resources, which will consume more power. The image format for this kind of output stream can
        be any supported `RAW` format provided by android.scaler.streamConfigurationMap.

        In particular, a `RAW` format is typically one of:

        * ImageFormat#RAW_SENSOR
        * Opaque `RAW`
        </details>
      </entry>
      <entry name="maxNumOutputProc" type="int32" visibility="public" synthetic="true">
        <description>The maximum number of output streams
        that can be configured and used simultaneously by a camera device
        for any processed (but not stalling) formats.
        </description>
        <range>
        &gt;= 3
        for FULL mode devices (`android.info.supportedHardwareLevel == FULL`);
        &gt;= 2 for LIMITED mode devices (`android.info.supportedHardwareLevel == LIMITED`).
        </range>
        <details>
        This value contains the maximum number of simultaneous output
        streams for any processed (but not stalling) formats.

        This lists the upper bound of the number of output streams supported by
        the camera device. Using more streams simultaneously may require more hardware and
        CPU resources, which will consume more power. The image format for this kind of output stream can
        be any supported non-`RAW` format provided by android.scaler.streamConfigurationMap.

        Processed (but not stalling) is defined as any non-RAW format without a stall duration.
        Typically:

        * ImageFormat#YUV_420_888
        * ImageFormat#NV21
        * ImageFormat#YV12
        * Implementation-defined formats, i.e.
        StreamConfiguration#isOutputSupportedFor(Class)

        For full guarantees, query StreamConfigurationMap#getOutputStallDuration with
        a processed format -- it will return 0 for a non-stalling stream.
        </details>
      </entry>
      <entry name="maxNumOutputProcStalling" type="int32" visibility="public" synthetic="true">
        <description>The maximum number of output streams
        that can be configured and used simultaneously by a camera device
        for any processed (and stalling) formats.
        </description>
        <range>
        &gt;= 1
        </range>
        <details>
        This value contains the maximum number of simultaneous output
        streams for any processed (and stalling) formats.

        This lists the upper bound of the number of output streams supported by
        the camera device. Using more streams simultaneously may require more hardware and
        CPU resources, which will consume more power. The image format for this kind of output stream can
        be any supported non-`RAW` format provided by android.scaler.streamConfigurationMap.

        A processed and stalling format is defined as any non-RAW format with a stall duration &gt; 0.
        Typically only the `JPEG` format (ImageFormat#JPEG) meets this definition.

        For full guarantees, query StreamConfigurationMap#getOutputStallDuration with
        a processed format -- it will return a non-0 value for a stalling stream.
        </details>
      </entry>
      <entry name="maxNumReprocessStreams" type="int32" visibility="system"
             deprecated="true" container="array">
        <array>
          <size>1</size>
        </array>
        <description>How many reprocessing streams of any type
        can be allocated at the same time.</description>
        <range>&gt;= 0</range>
        <details>
        Only used by HAL2.x.

        When set to 0, it means no reprocess stream is supported.
        </details>
        <tag id="HAL2" />
      </entry>
      <entry name="maxNumInputStreams" type="int32" visibility="hidden">
        <description>
        The maximum number of input streams of any type
        that can be configured and used simultaneously by a camera device.
        </description>
        <range>
        &gt;= 0 for LIMITED mode devices (`android.info.supportedHardwareLevel == LIMITED`).
        &gt;= 1 for FULL mode devices (`android.info.supportedHardwareLevel == FULL`).
        </range>
        <details>When set to 0, it means no input stream is supported.

        The image format for an input stream can be any supported
        format provided by
        android.scaler.availableInputOutputFormatsMap. When using an
        input stream, there must be at least one output stream
        configured to receive the reprocessed images.

        For example, for the Zero Shutter Lag (ZSL) still capture use case, the input
        stream image format will be RAW_OPAQUE, and the associated output stream image
        format should be JPEG.
        </details>
      </entry>
    </static>
    <dynamic>
      <entry name="frameCount" type="int32" visibility="public">
        <description>A frame counter set by the framework.
This value monotonically
        increases with every new result (that is, each new result has a unique
        frameCount value).</description>
        <units>count of frames</units>
        <range>&gt; 0</range>
        <details>Reset on release()</details>
      </entry>
      <clone entry="android.request.id" kind="controls"></clone>
      <clone entry="android.request.metadataMode"
             kind="controls"></clone>
      <clone entry="android.request.outputStreams"
             kind="controls"></clone>
      <entry name="pipelineDepth" type="byte" visibility="public">
        <description>Specifies the number of pipeline stages the frame went
        through from when it was exposed to when the final completed result
        was available to the framework.</description>
        <range>&lt;= android.request.pipelineMaxDepth</range>
        <details>Depending on what settings are used in the request, and
        what streams are configured, the data may undergo less processing,
        and some pipeline stages may be skipped.

        See android.request.pipelineMaxDepth for more details.
        </details>
        <hal_details>
        This value must always represent the accurate count of how many
        pipeline stages were actually used.
        </hal_details>
      </entry>
    </dynamic>
    <static>
      <entry name="pipelineMaxDepth" type="byte" visibility="public">
        <description>Specifies the maximum number of pipeline stages a frame
        has to go through from when it's exposed to when it's available
        to the framework.</description>
        <details>A typical minimum value for this is 2 (one stage to expose,
        one stage to read out) from the sensor. The ISP then usually adds
        its own stages to do custom HW processing. Further stages may be
        added by SW processing.

        Depending on what settings are used (e.g. YUV, JPEG) and what
        processing is enabled (e.g. face detection), the actual pipeline
        depth (specified by android.request.pipelineDepth) may be less than
        the max pipeline depth.
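        Since a pipeline depth of X stages is equivalent to a pipeline latency of
        X frame intervals, the worst-case latency from exposure to a completed
        result can be estimated from the reported depth. A minimal sketch (the
        helper name and the 30fps frame interval below are illustrative, not part
        of this API):

```c
#include <stdint.h>

/*
 * Worst-case result latency implied by a pipeline depth: each stage
 * contributes roughly one frame interval, so the estimate is simply
 * depth * frame_duration.
 */
int64_t pipeline_latency_ns(uint8_t pipeline_max_depth,
                            int64_t frame_duration_ns) {
    return (int64_t)pipeline_max_depth * frame_duration_ns;
}
```

        For example, a depth of 4 stages at a 33.3ms (30fps) frame interval implies
        roughly 133ms from exposure to the completed result.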
3308 3309 A pipeline depth of X stages is equivalent to a pipeline latency of 3310 X frame intervals. 3311 3312 This value will be 8 or less. 3313 </details> 3314 <hal_details> 3315 This value should be 4 or less. 3316 </hal_details> 3317 </entry> 3318 <entry name="partialResultCount" type="int32" visibility="public"> 3319 <description>Defines how many sub-components 3320 a result will be composed of. 3321 </description> 3322 <range>&gt;= 1</range> 3323 <details>In order to combat the pipeline latency, partial results 3324 may be delivered to the application layer from the camera device as 3325 soon as they are available. 3326 3327 Optional; defaults to 1. A value of 1 means that partial 3328 results are not supported, and only the final TotalCaptureResult will 3329 be produced by the camera device. 3330 3331 A typical use case for this might be: after requesting an 3332 auto-focus (AF) lock the new AF state might be available 50% 3333 of the way through the pipeline. The camera device could 3334 then immediately dispatch this state via a partial result to 3335 the application, and the rest of the metadata via later 3336 partial results. 3337 </details> 3338 </entry> 3339 <entry name="availableCapabilities" type="byte" visibility="public" 3340 enum="true" container="array"> 3341 <array> 3342 <size>n</size> 3343 </array> 3344 <enum> 3345 <value hidden="true">BACKWARD_COMPATIBLE 3346 <notes>The minimal set of capabilities that every camera 3347 device (regardless of android.info.supportedHardwareLevel) 3348 will support. 3349 3350 The full set of features supported by this capability makes 3351 the camera2 api backwards compatible with the camera1 3352 (android.hardware.Camera) API. 3353 </notes> 3354 </value> 3355 <value hidden="true">OPTIONAL 3356 <notes>This is a catch-all capability to include all other 3357 tags or functionality not encapsulated by one of the other 3358 capabilities. 3359 3360 A typical example is all tags marked 'optional'. 
3361 </notes> 3362 </value> 3363 <value optional="true">MANUAL_SENSOR 3364 <notes> 3365 The camera device can be manually controlled (3A algorithms such 3366 as auto-exposure, and auto-focus can be bypassed). 3367 The camera device supports basic manual control of the sensor image 3368 acquisition related stages. This means the following controls are 3369 guaranteed to be supported: 3370 3371 * Manual frame duration control 3372 * android.sensor.frameDuration 3373 * android.sensor.info.maxFrameDuration 3374 * android.scaler.streamConfigurationMap 3375 * Manual exposure control 3376 * android.sensor.exposureTime 3377 * android.sensor.info.exposureTimeRange 3378 * Manual sensitivity control 3379 * android.sensor.sensitivity 3380 * android.sensor.info.sensitivityRange 3381 * Manual lens control (if the lens is adjustable) 3382 * android.lens.* 3383 * Manual flash control (if a flash unit is present) 3384 * android.flash.* 3385 * Manual black level locking 3386 * android.blackLevel.lock 3387 3388 If any of the above 3A algorithms are enabled, then the camera 3389 device will accurately report the values applied by 3A in the 3390 result. 3391 3392 A given camera device may also support additional manual sensor controls, 3393 but this capability only covers the above list of controls. 3394 </notes> 3395 </value> 3396 <value optional="true">MANUAL_POST_PROCESSING 3397 <notes> 3398 The camera device post-processing stages can be manually controlled. 3399 The camera device supports basic manual control of the image post-processing 3400 stages. 
This means the following controls are guaranteed to be supported: 3401 3402 * Manual tonemap control 3403 * android.tonemap.curve 3404 * android.tonemap.mode 3405 * android.tonemap.maxCurvePoints 3406 * Manual white balance control 3407 * android.colorCorrection.transform 3408 * android.colorCorrection.gains 3409 * Lens shading map information 3410 * android.statistics.lensShadingMap 3411 * android.lens.info.shadingMapSize 3412 3413 If auto white balance is enabled, then the camera device 3414 will accurately report the values applied by AWB in the result. 3415 3416 A given camera device may also support additional post-processing 3417 controls, but this capability only covers the above list of controls. 3418 </notes> 3419 </value> 3420 <value optional="true" hidden="true">ZSL 3421 <notes> 3422 The camera device supports the Zero Shutter Lag use case. 3423 3424 * At least one input stream can be used. 3425 * RAW_OPAQUE is supported as an output/input format 3426 * Using RAW_OPAQUE does not cause a frame rate drop 3427 relative to the sensor's maximum capture rate (at that 3428 resolution). 3429 * RAW_OPAQUE will be reprocessable into both YUV_420_888 3430 and JPEG formats. 3431 * The maximum available resolution for RAW_OPAQUE streams 3432 (both input/output) will match the maximum available 3433 resolution of JPEG streams. 3434 </notes> 3435 </value> 3436 <value optional="true">DNG 3437 <notes> 3438 The camera device supports outputting RAW buffers that can be 3439 saved offline into a DNG format. It can reprocess DNG 3440 files (produced from the same camera device) back into YUV. 3441 3442 * At least one input stream can be used. 3443 * RAW16 is supported as output/input format. 3444 * RAW16 is reprocessable into both YUV_420_888 and JPEG 3445 formats. 3446 * The maximum available resolution for RAW16 streams (both 3447 input/output) will match either the value in 3448 android.sensor.info.pixelArraySize or 3449 android.sensor.info.activeArraySize. 
3450 * All DNG-related optional metadata entries are provided 3451 by the camera device. 3452 </notes> 3453 </value> 3454 </enum> 3455 <description>List of capabilities that the camera device 3456 advertises as fully supporting.</description> 3457 <details> 3458 A capability is a contract that the camera device makes in order 3459 to be able to satisfy one or more use cases. 3460 3461 Listing a capability guarantees that the whole set of features 3462 required to support a common use will all be available. 3463 3464 Using a subset of the functionality provided by an unsupported 3465 capability may be possible on a specific camera device implementation; 3466 to do this query each of android.request.availableRequestKeys, 3467 android.request.availableResultKeys, 3468 android.request.availableCharacteristicsKeys. 3469 3470 The following capabilities are guaranteed to be available on 3471 android.info.supportedHardwareLevel `==` FULL devices: 3472 3473 * MANUAL_SENSOR 3474 * MANUAL_POST_PROCESSING 3475 3476 Other capabilities may be available on either FULL or LIMITED 3477 devices, but the application should query this field to be sure. 3478 </details> 3479 <hal_details> 3480 Additional constraint details per-capability will be available 3481 in the Compatibility Test Suite. 3482 3483 BACKWARD_COMPATIBLE capability requirements are not explicitly listed. 3484 Instead refer to "BC" tags and the camera CTS tests in the 3485 android.hardware.cts package. 3486 3487 Listed controls that can be either request or result (e.g. 3488 android.sensor.exposureTime) must be available both in the 3489 request and the result in order to be considered to be 3490 capability-compliant. 3491 3492 For example, if the HAL claims to support MANUAL control, 3493 then exposure time must be configurable via the request _and_ 3494 the actual exposure applied must be available via 3495 the result. 
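        The request-and-result requirement can be checked mechanically: a control
        is capability-compliant only if its tag appears in both key lists. A
        minimal sketch in plain C (the function names are illustrative; a real
        HAL-side check would read the android.request.availableRequestKeys and
        android.request.availableResultKeys entries from the static metadata):

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Returns true if tag appears in the given key list. */
bool key_list_contains(const int32_t *keys, size_t count, int32_t tag) {
    for (size_t i = 0; i < count; i++) {
        if (keys[i] == tag) return true;
    }
    return false;
}

/*
 * A listed control that can be either request or result is
 * capability-compliant only when it is available both in the
 * request key list and in the result key list.
 */
bool is_capability_compliant(int32_t tag,
                             const int32_t *request_keys, size_t request_count,
                             const int32_t *result_keys, size_t result_count) {
    return key_list_contains(request_keys, request_count, tag) &&
           key_list_contains(result_keys, result_count, tag);
}
```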
3496 </hal_details> 3497 </entry> 3498 <entry name="availableRequestKeys" type="int32" visibility="hidden" 3499 container="array"> 3500 <array> 3501 <size>n</size> 3502 </array> 3503 <description>A list of all keys that the camera device has available 3504 to use with CaptureRequest.</description> 3505 3506 <details>Attempting to set a key into a CaptureRequest that is not 3507 listed here will result in an invalid request and will be rejected 3508 by the camera device. 3509 3510 This field can be used to query the feature set of a camera device 3511 at a more granular level than capabilities. This is especially 3512 important for optional keys that are not listed under any capability 3513 in android.request.availableCapabilities. 3514 </details> 3515 <hal_details> 3516 Vendor tags must not be listed here. Use the vendor tag metadata 3517 extensions C api instead (refer to camera3.h for more details). 3518 3519 Setting/getting vendor tags will be checked against the metadata 3520 vendor extensions API and not against this field. 3521 3522 The HAL must not consume any request tags that are not listed either 3523 here or in the vendor tag list. 3524 3525 The public camera2 API will always make the vendor tags visible 3526 via CameraCharacteristics#getAvailableCaptureRequestKeys. 3527 </hal_details> 3528 </entry> 3529 <entry name="availableResultKeys" type="int32" visibility="hidden" 3530 container="array"> 3531 <array> 3532 <size>n</size> 3533 </array> 3534 <description>A list of all keys that the camera device has available 3535 to use with CaptureResult.</description> 3536 3537 <details>Attempting to get a key from a CaptureResult that is not 3538 listed here will always return a `null` value. Getting a key from 3539 a CaptureResult that is listed here must never return a `null` 3540 value. 
3541 3542 The following keys may return `null` unless they are enabled: 3543 3544 * android.statistics.lensShadingMap (non-null iff android.statistics.lensShadingMapMode == ON) 3545 3546 (Those sometimes-null keys should nevertheless be listed here 3547 if they are available.) 3548 3549 This field can be used to query the feature set of a camera device 3550 at a more granular level than capabilities. This is especially 3551 important for optional keys that are not listed under any capability 3552 in android.request.availableCapabilities. 3553 </details> 3554 <hal_details> 3555 Tags listed here must always have an entry in the result metadata, 3556 even if that size is 0 elements. Only array-type tags (e.g. lists, 3557 matrices, strings) are allowed to have 0 elements. 3558 3559 Vendor tags must not be listed here. Use the vendor tag metadata 3560 extensions C api instead (refer to camera3.h for more details). 3561 3562 Setting/getting vendor tags will be checked against the metadata 3563 vendor extensions API and not against this field. 3564 3565 The HAL must not produce any result tags that are not listed either 3566 here or in the vendor tag list. 3567 3568 The public camera2 API will always make the vendor tags visible 3569 via CameraCharacteristics#getAvailableCaptureResultKeys. 3570 </hal_details> 3571 </entry> 3572 <entry name="availableCharacteristicsKeys" type="int32" visibility="hidden" 3573 container="array"> 3574 <array> 3575 <size>n</size> 3576 </array> 3577 <description>A list of all keys that the camera device has available 3578 to use with CameraCharacteristics.</description> 3579 <details>This entry follows the same rules as 3580 android.request.availableResultKeys (except that it applies for 3581 CameraCharacteristics instead of CaptureResult). See above for more 3582 details. 3583 </details> 3584 <hal_details> 3585 Tags listed here must always have an entry in the static info metadata, 3586 even if that size is 0 elements. 
Only array-type tags (e.g. lists, 3587 matrices, strings) are allowed to have 0 elements. 3588 3589 Vendor tags must not be listed here. Use the vendor tag metadata 3590 extensions C api instead (refer to camera3.h for more details). 3591 3592 Setting/getting vendor tags will be checked against the metadata 3593 vendor extensions API and not against this field. 3594 3595 The HAL must not have any tags in its static info that are not listed 3596 either here or in the vendor tag list. 3597 3598 The public camera2 API will always make the vendor tags visible 3599 via CameraCharacteristics#getKeys. 3600 </hal_details> 3601 </entry> 3602 </static> 3603 </section> 3604 <section name="scaler"> 3605 <controls> 3606 <entry name="cropRegion" type="int32" visibility="public" 3607 container="array" typedef="rectangle"> 3608 <array> 3609 <size>4</size> 3610 </array> 3611 <description>The region of the sensor to read out for this capture.</description> 3612 <units>(x,y) of top-left corner, width and height of region 3613 in pixels; (0,0) is top-left corner of 3614 android.sensor.info.activeArraySize</units> 3615 <details> 3616 The crop region coordinate system is based off 3617 android.sensor.info.activeArraySize, with `(0, 0)` being the 3618 top-left corner of the sensor active array. 3619 3620 Output streams use this rectangle to produce their output, 3621 cropping to a smaller region if necessary to maintain the 3622 stream's aspect ratio, then scaling the sensor input to 3623 match the output's configured resolution. 3624 3625 The crop region is applied after the RAW to other color 3626 space (e.g. YUV) conversion. Since raw streams 3627 (e.g. RAW16) don't have the conversion stage, they are not 3628 croppable. The crop region will be ignored by raw streams. 3629 3630 For non-raw streams, any additional per-stream cropping will 3631 be done to maximize the final pixel area of the stream. 
3632 3633 For example, if the crop region is set to a 4:3 aspect 3634 ratio, then 4:3 streams will use the exact crop 3635 region. 16:9 streams will further crop vertically 3636 (letterbox). 3637 3638 Conversely, if the crop region is set to a 16:9, then 4:3 3639 outputs will crop horizontally (pillarbox), and 16:9 3640 streams will match exactly. These additional crops will 3641 be centered within the crop region. 3642 3643 The width and height of the crop region cannot 3644 be set to be smaller than 3645 `floor( activeArraySize.width / android.scaler.availableMaxDigitalZoom )` and 3646 `floor( activeArraySize.height / android.scaler.availableMaxDigitalZoom )`, respectively. 3647 3648 The camera device may adjust the crop region to account 3649 for rounding and other hardware requirements; the final 3650 crop region used will be included in the output capture 3651 result. 3652 </details> 3653 <hal_details> 3654 The output streams must maintain square pixels at all 3655 times, no matter what the relative aspect ratios of the 3656 crop region and the stream are. Negative values for 3657 corner are allowed for raw output if full pixel array is 3658 larger than active pixel array. Width and height may be 3659 rounded to nearest larger supportable width, especially 3660 for raw output, where only a few fixed scales may be 3661 possible. 3662 3663 HAL2.x uses only (x, y, width) 3664 </hal_details> 3665 <tag id="BC" /> 3666 </entry> 3667 </controls> 3668 <static> 3669 <entry name="availableFormats" type="int32" 3670 visibility="hidden" deprecated="true" enum="true" 3671 container="array" typedef="imageFormat"> 3672 <array> 3673 <size>n</size> 3674 </array> 3675 <enum> 3676 <value optional="true" id="0x20">RAW16 3677 <notes> 3678 RAW16 is a standard, cross-platform format for raw image 3679 buffers with 16-bit pixels. 
3680 3681 Buffers of this format are typically expected to have a 3682 Bayer Color Filter Array (CFA) layout, which is given in 3683 android.sensor.info.colorFilterArrangement. Sensors with 3684 CFAs that are not representable by a format in 3685 android.sensor.info.colorFilterArrangement should not 3686 use this format. 3687 3688 Buffers of this format will also follow the constraints given for 3689 RAW_OPAQUE buffers, but with relaxed performance constraints. 3690 3691 See android.scaler.availableInputOutputFormatsMap for 3692 the full set of performance guarantees. 3693 </notes> 3694 </value> 3695 <value optional="true" id="0x24">RAW_OPAQUE 3696 <notes> 3697 RAW_OPAQUE is a format for raw image buffers coming from an 3698 image sensor. 3699 3700 The actual structure of buffers of this format is 3701 platform-specific, but must follow several constraints: 3702 3703 1. No image post-processing operations may have been applied to 3704 buffers of this type. These buffers contain raw image data coming 3705 directly from the image sensor. 3706 1. If a buffer of this format is passed to the camera device for 3707 reprocessing, the resulting images will be identical to the images 3708 produced if the buffer had come directly from the sensor and was 3709 processed with the same settings. 3710 3711 The intended use for this format is to allow access to the native 3712 raw format buffers coming directly from the camera sensor without 3713 any additional conversions or decrease in framerate. 3714 3715 See android.scaler.availableInputOutputFormatsMap for the full set of 3716 performance guarantees. 
        </notes>
        </value>
        <value optional="true" id="0x32315659">YV12
          <notes>YCrCb 4:2:0 Planar</notes>
        </value>
        <value optional="true" id="0x11">YCrCb_420_SP
          <notes>NV21</notes>
        </value>
        <value id="0x22">IMPLEMENTATION_DEFINED
          <notes>System internal format, not application-accessible</notes>
        </value>
        <value id="0x23">YCbCr_420_888
          <notes>Flexible YUV420 Format</notes>
        </value>
        <value id="0x21">BLOB
          <notes>JPEG format</notes>
        </value>
        </enum>
        <description>The list of image formats that are supported by this
        camera device for output streams.</description>
        <details>
        All camera devices will support JPEG and YUV_420_888 formats.

        When set to YUV_420_888, applications can access the YUV420 data directly.
        </details>
        <hal_details>
        These format values are from HAL_PIXEL_FORMAT_* in
        system/core/include/system/graphics.h.

        When IMPLEMENTATION_DEFINED is used, the platform
        gralloc module will select a format based on the usage flags provided
        by the camera HAL device and the other endpoint of the stream. It is
        usually used by preview and recording streams, where the application doesn't
        need to access the image data.

        The YCbCr_420_888 format must be supported by the HAL. When an image stream
        needs CPU/application direct access, this format will be used.

        The BLOB format must be supported by the HAL. This is used for the JPEG stream.

        A RAW_OPAQUE buffer should contain only pixel data. It is strongly
        recommended that any information used by the camera device when
        processing images is fully expressed by the result metadata
        for that image buffer.
3761 </hal_details> 3762 <tag id="BC" /> 3763 </entry> 3764 <entry name="availableJpegMinDurations" type="int64" visibility="hidden" deprecated="true" 3765 container="array"> 3766 <array> 3767 <size>n</size> 3768 </array> 3769 <description>The minimum frame duration that is supported 3770 for each resolution in android.scaler.availableJpegSizes. 3771 </description> 3772 <units>ns</units> 3773 <range>TODO: Remove property.</range> 3774 <details> 3775 This corresponds to the minimum steady-state frame duration when only 3776 that JPEG stream is active and captured in a burst, with all 3777 processing (typically in android.*.mode) set to FAST. 3778 3779 When multiple streams are configured, the minimum 3780 frame duration will be &gt;= max(individual stream min 3781 durations)</details> 3782 <tag id="BC" /> 3783 </entry> 3784 <entry name="availableJpegSizes" type="int32" visibility="hidden" 3785 deprecated="true" container="array" typedef="size"> 3786 <array> 3787 <size>n</size> 3788 <size>2</size> 3789 </array> 3790 <description>The JPEG resolutions that are supported by this camera device.</description> 3791 <range>TODO: Remove property.</range> 3792 <details> 3793 The resolutions are listed as `(width, height)` pairs. All camera devices will support 3794 sensor maximum resolution (defined by android.sensor.info.activeArraySize). 3795 </details> 3796 <hal_details> 3797 The HAL must include sensor maximum resolution 3798 (defined by android.sensor.info.activeArraySize), 3799 and should include half/quarter of sensor maximum resolution. 3800 </hal_details> 3801 <tag id="BC" /> 3802 </entry> 3803 <entry name="availableMaxDigitalZoom" type="float" visibility="public"> 3804 <description>The maximum ratio between both active area width 3805 and crop region width, and active area height and 3806 crop region height. 3807 3808 This represents the maximum amount of zooming possible by 3809 the camera device, or equivalently, the minimum cropping 3810 window size. 
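        The smallest legal crop window follows directly from this ratio and the
        active array size. A sketch (the helper below is illustrative, not part of
        the API; truncation toward zero matches `floor` for these positive values):

```c
#include <stdint.h>

/*
 * Minimum supported crop dimension:
 * floor(activeArray dimension / availableMaxDigitalZoom).
 * Smaller requested crop regions are rounded up to this size
 * by the camera device.
 */
int32_t min_crop_dimension(int32_t active_dimension, float max_digital_zoom) {
    return (int32_t)((float)active_dimension / max_digital_zoom);
}
```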
3811 3812 Crop regions that have a width or height that is smaller 3813 than this ratio allows will be rounded up to the minimum 3814 allowed size by the camera device. 3815 </description> 3816 <range>&gt;=1</range> 3817 <tag id="BC" /> 3818 </entry> 3819 <entry name="availableProcessedMinDurations" type="int64" visibility="hidden" deprecated="true" 3820 container="array"> 3821 <array> 3822 <size>n</size> 3823 </array> 3824 <description>For each available processed output size (defined in 3825 android.scaler.availableProcessedSizes), this property lists the 3826 minimum supportable frame duration for that size. 3827 </description> 3828 <units>ns</units> 3829 <range>TODO: Remove property.</range> 3830 <details> 3831 This should correspond to the frame duration when only that processed 3832 stream is active, with all processing (typically in android.*.mode) 3833 set to FAST. 3834 3835 When multiple streams are configured, the minimum frame duration will 3836 be &gt;= max(individual stream min durations). 3837 </details> 3838 <tag id="BC" /> 3839 </entry> 3840 <entry name="availableProcessedSizes" type="int32" visibility="hidden" 3841 deprecated="true" container="array" typedef="size"> 3842 <array> 3843 <size>n</size> 3844 <size>2</size> 3845 </array> 3846 <description>The resolutions available for use with 3847 processed output streams, such as YV12, NV12, and 3848 platform opaque YUV/RGB streams to the GPU or video 3849 encoders.</description> 3850 <range>TODO: Remove property.</range> 3851 <details> 3852 The resolutions are listed as `(width, height)` pairs. 3853 3854 For a given use case, the actual maximum supported resolution 3855 may be lower than what is listed here, depending on the destination 3856 Surface for the image data. For example, for recording video, 3857 the video encoder chosen may have a maximum size limit (e.g. 1080p) 3858 smaller than what the camera (e.g. maximum resolution is 3264x2448) 3859 can provide. 
        Please reference the documentation for the image data destination to
        check if it limits the maximum size for image data.
        </details>
        <hal_details>
        For FULL capability devices (`android.info.supportedHardwareLevel == FULL`),
        the HAL must include all JPEG sizes listed in android.scaler.availableJpegSizes
        and each resolution below if it is smaller than or equal to the sensor
        maximum resolution (if they are not listed in JPEG sizes already):

        * 240p (320 x 240)
        * 480p (640 x 480)
        * 720p (1280 x 720)
        * 1080p (1920 x 1080)

        For LIMITED capability devices (`android.info.supportedHardwareLevel == LIMITED`),
        the HAL only has to list up to the maximum video size supported by the device.
        </hal_details>
        <tag id="BC" />
      </entry>
      <entry name="availableRawMinDurations" type="int64" deprecated="true"
             container="array">
        <array>
          <size>n</size>
        </array>
        <description>
        For each available raw output size (defined in
        android.scaler.availableRawSizes), this property lists the minimum
        supportable frame duration for that size.
        </description>
        <units>ns</units>
        <range>TODO: Remove property.</range>
        <details>
        Should correspond to the frame duration when only the raw stream is
        active.

        When multiple streams are configured, the minimum
        frame duration will be &gt;= max(individual stream min
        durations).</details>
        <tag id="BC" />
      </entry>
      <entry name="availableRawSizes" type="int32" deprecated="true"
             container="array" typedef="size">
        <array>
          <size>n</size>
          <size>2</size>
        </array>
        <description>The resolutions available for use with raw
        sensor output streams, listed as `(width, height)`
        pairs.</description>
        <range>TODO: Remove property.
3911 Must include: - sensor maximum resolution.</range> 3912 </entry> 3913 </static> 3914 <dynamic> 3915 <clone entry="android.scaler.cropRegion" kind="controls"> 3916 </clone> 3917 </dynamic> 3918 <static> 3919 <entry name="availableInputOutputFormatsMap" type="int32" 3920 visibility="hidden" 3921 container="array" typedef="imageFormat"> 3922 <array> 3923 <size>n</size> 3924 </array> 3925 <description>The mapping of image formats that are supported by this 3926 camera device for input streams, to their corresponding output formats. 3927 </description> 3928 <details> 3929 All camera devices with at least 1 3930 android.request.maxNumInputStreams will have at least one 3931 available input format. 3932 3933 The camera device will support the following map of formats, 3934 if its dependent capability is supported: 3935 3936 Input Format | Output Format | Capability 3937 :---------------|:-----------------|:---------- 3938 RAW_OPAQUE | JPEG | ZSL 3939 RAW_OPAQUE | YUV_420_888 | ZSL 3940 RAW_OPAQUE | RAW16 | DNG 3941 RAW16 | YUV_420_888 | DNG 3942 RAW16 | JPEG | DNG 3943 3944 For ZSL-capable camera devices, using the RAW_OPAQUE format 3945 as either input or output will never hurt maximum frame rate (i.e. 3946 StreamConfigurationMap#getOutputStallDuration(int,Size) 3947 for a `format =` RAW_OPAQUE is always 0). 3948 3949 Attempting to configure an input stream with output streams not 3950 listed as available in this map is not valid. 3951 3952 TODO: typedef to ReprocessFormatMap 3953 </details> 3954 <hal_details> 3955 For the formats, see `system/core/include/system/graphics.h` for a definition 3956 of the image format enumerations. 3957 3958 This value is encoded as a variable-size array-of-arrays. 3959 The inner array always contains `[format, length, ...]` where 3960 `...` has `length` elements. An inner array is followed by another 3961 inner array if the total metadata entry size hasn't yet been exceeded. 
        A code sample to read/write this encoding (with a device that
        supports reprocessing RAW_OPAQUE to RAW16, YUV_420_888, and JPEG,
        and reprocessing RAW16 to YUV_420_888 and JPEG):

            // reading
            int32_t* contents = &entry.i32[0];
            for (size_t i = 0; i < entry.count; ) {
                int32_t format = contents[i++];
                int32_t length = contents[i++];
                int32_t output_formats[length];
                memcpy(&output_formats[0], &contents[i],
                       length * sizeof(int32_t));
                i += length;
            }

            // writing (static example, DNG+ZSL)
            int32_t contents[] = {
              RAW_OPAQUE, 3, RAW16, YUV_420_888, BLOB,
              RAW16, 2, YUV_420_888, BLOB,
            };
            update_camera_metadata_entry(metadata, index, &contents[0],
                  sizeof(contents)/sizeof(contents[0]), &updated_entry);

        If the HAL claims to support any of the capabilities listed in the
        above details, then it must also support all the input-output
        combinations listed for that capability. It can optionally support
        additional formats if it so chooses.

        Refer to android.scaler.availableFormats for the enum values
        which correspond to HAL_PIXEL_FORMAT_* in
        system/core/include/system/graphics.h.
      </hal_details>
    </entry>
    <entry name="availableStreamConfigurations" type="int32" visibility="hidden"
           enum="true" container="array"
           typedef="streamConfiguration">
      <array>
        <size>n</size>
        <size>4</size>
      </array>
      <enum>
        <value>OUTPUT</value>
        <value>INPUT</value>
      </enum>
      <description>The available stream configurations that this
      camera device supports
      (i.e. format, width, height, output/input stream).
      </description>
      <details>
        The configurations are listed as `(format, width, height, input?)`
        tuples.
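        As a non-normative sketch, a client could scan these tuples to find,
        for example, the largest available output size for a given format.
        The helper name and the sample format codes below are illustrative
        only, not part of this specification:

        ```c
        #include <stdint.h>
        #include <stddef.h>
        #include <assert.h>

        /* Direction values mirroring the OUTPUT/INPUT enum above. */
        enum { CONFIG_OUTPUT = 0, CONFIG_INPUT = 1 };

        /* Illustrative data only: (format, width, height, isInput) tuples;
         * 0x21 and 0x23 stand in for real HAL_PIXEL_FORMAT_* codes. */
        static const int32_t kSampleConfigs[] = {
            0x21, 3264, 2448, CONFIG_OUTPUT,
            0x21, 1920, 1080, CONFIG_OUTPUT,
            0x23,  640,  480, CONFIG_OUTPUT,
        };

        /* Return the largest output area listed for `format`, or 0 if none. */
        int64_t max_output_area(const int32_t *configs, size_t n, int32_t format) {
            int64_t best = 0;
            for (size_t i = 0; i < n; i++) {
                const int32_t *tuple = &configs[i * 4];
                if (tuple[0] == format && tuple[3] == CONFIG_OUTPUT) {
                    int64_t area = (int64_t)tuple[1] * tuple[2];
                    if (area > best) best = area;
                }
            }
            return best;
        }
        ```

        The 64-bit accumulation avoids overflow for large sensor resolutions,
        since width x height can exceed the 32-bit tuple element type.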
        For a given use case, the actual maximum supported resolution
        may be lower than what is listed here, depending on the destination
        Surface for the image data. For example, for recording video,
        the video encoder chosen may have a maximum size limit (e.g. 1080p)
        smaller than what the camera (e.g. maximum resolution is 3264x2448)
        can provide.

        Please reference the documentation for the image data destination to
        check if it limits the maximum size for image data.

        Not all output formats may be supported in a configuration with
        an input stream of a particular format. For more details, see
        android.scaler.availableInputOutputFormatsMap.

        The following table describes the minimum required output stream
        configurations based on the hardware level
        (android.info.supportedHardwareLevel):

        Format         | Size                                         | Hardware Level | Notes
        :-------------:|:--------------------------------------------:|:--------------:|:--------------:
        JPEG           | android.sensor.info.activeArraySize          | Any            |
        JPEG           | 1920x1080 (1080p)                            | Any            | if 1080p <= activeArraySize
        JPEG           | 1280x720 (720p)                              | Any            | if 720p <= activeArraySize
        JPEG           | 640x480 (480p)                               | Any            | if 480p <= activeArraySize
        JPEG           | 320x240 (240p)                               | Any            | if 240p <= activeArraySize
        YUV_420_888    | all output sizes available for JPEG          | FULL           |
        YUV_420_888    | all output sizes available for JPEG, up to the maximum video size | LIMITED        |
        IMPLEMENTATION_DEFINED | same as YUV_420_888                  | Any            |

        Refer to android.request.availableCapabilities for additional
        mandatory stream configurations on a per-capability basis.
      </details>
      <hal_details>
        It is recommended (but not mandatory) to also include half/quarter
        of sensor maximum resolution for JPEG formats (regardless of hardware
        level).
        (The following is a rewording of the above required table):

        For the JPEG format, the sizes may be restricted by the conditions below:

        * The HAL may choose the aspect ratio of each JPEG size to be one of the
        well-known ones (e.g. 4:3, 16:9, 3:2, etc.). If the sensor maximum resolution
        (defined by android.sensor.info.activeArraySize) has an aspect ratio other than these,
        it does not have to be included in the supported JPEG sizes.
        * Some hardware JPEG encoders may have pixel boundary alignment requirements, such as
        the dimensions being a multiple of 16.

        Therefore, the maximum JPEG size may be smaller than the sensor maximum resolution.
        However, the largest JPEG size must be as close as possible to the sensor maximum
        resolution given the above constraints. It is required that after aspect ratio
        adjustments, any additional size reduction due to other issues is less than 3% in
        area. For example, if the sensor maximum resolution is 3280x2464, the maximum JPEG
        size has aspect ratio 4:3, and the JPEG encoder alignment requirement is 16, then
        the maximum JPEG size will be 3264x2448.

        For FULL capability devices (`android.info.supportedHardwareLevel == FULL`),
        the HAL must include all YUV_420_888 sizes that have JPEG sizes listed
        here as output streams.

        It must also include each resolution below if it is smaller than or
        equal to the sensor maximum resolution (for both YUV_420_888 and JPEG
        formats), as output streams:

        * 240p (320 x 240)
        * 480p (640 x 480)
        * 720p (1280 x 720)
        * 1080p (1920 x 1080)

        For LIMITED capability devices
        (`android.info.supportedHardwareLevel == LIMITED`),
        the HAL only has to list up to the maximum video size
        supported by the device.

        Regardless of hardware level, every output resolution available for
        YUV_420_888 must also be available for IMPLEMENTATION_DEFINED.
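        The aspect ratio and alignment constraints above can be sketched as
        follows; the helper name is hypothetical and this is not a normative
        algorithm, just one way to satisfy the stated requirements:

        ```c
        #include <stdint.h>
        #include <assert.h>

        /* Hypothetical sketch: largest size with aspect ratio aw:ah that fits
         * inside the sensor maximum resolution with both dimensions a
         * multiple of `align`. */
        void max_aligned_jpeg_size(int32_t sensor_w, int32_t sensor_h,
                                   int32_t aw, int32_t ah, int32_t align,
                                   int32_t *out_w, int32_t *out_h) {
            int32_t k = sensor_w / aw;               /* scale factor along width  */
            if (sensor_h / ah < k) k = sensor_h / ah; /* ...capped by the height   */
            /* Walk down until both aw*k and ah*k satisfy the alignment. */
            while (k > 0 && ((aw * k) % align != 0 || (ah * k) % align != 0)) k--;
            *out_w = aw * k;
            *out_h = ah * k;
        }
        ```

        For the 3280x2464 example with a 4:3 ratio and 16-pixel alignment this
        yields 3264x2448, an area reduction of roughly 1.1%, within the 3%
        bound above.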
        This supersedes the following fields, which are now deprecated:

        * availableFormats
        * available[Processed,Raw,Jpeg]Sizes
      </hal_details>
    </entry>
    <entry name="availableMinFrameDurations" type="int64" visibility="hidden"
           container="array"
           typedef="streamConfigurationDuration" >
      <array>
        <size>4</size>
        <size>n</size>
      </array>
      <description>This lists the minimum frame duration for each
      format/size combination.
      </description>
      <units>(format, width, height, ns) x n</units>
      <details>
        This should correspond to the frame duration when only that
        stream is active, with all processing (typically in android.*.mode)
        set to either OFF or FAST.

        When multiple streams are used in a request, the minimum frame
        duration will be max(individual stream min durations).

        The minimum frame duration of a stream (of a particular format, size)
        is the same regardless of whether the stream is input or output.

        See android.sensor.frameDuration and
        android.scaler.availableStallDurations for more details about
        calculating the max frame rate.

        (Keep in sync with
        StreamConfigurationMap#getOutputMinFrameDuration)
      </details>
      <tag id="V1" />
    </entry>
    <entry name="availableStallDurations" type="int64" visibility="hidden"
           container="array" typedef="streamConfigurationDuration">
      <array>
        <size>4</size>
        <size>n</size>
      </array>
      <description>This lists the maximum stall duration for each
      format/size combination.
      </description>
      <units>(format, width, height, ns) x n</units>
      <details>
        A stall duration is how much extra time would get added
        to the normal minimum frame duration for a repeating request
        that has streams with non-zero stall.
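        As a minimal numeric sketch of that addition (the function name and
        all durations are hypothetical, not part of this specification):

        ```c
        #include <stdint.h>
        #include <stddef.h>
        #include <assert.h>

        /* Hypothetical sketch: steady-state frame duration (ns) of a
         * repeating request, i.e. the normal minimum frame duration (the max
         * over its streams) plus the maximum stall duration among them. */
        int64_t repeating_frame_duration(const int64_t *min_durations,
                                         const int64_t *stall_durations,
                                         size_t n) {
            int64_t min_dur = 0, max_stall = 0;
            for (size_t i = 0; i < n; i++) {
                if (min_durations[i] > min_dur) min_dur = min_durations[i];
                if (stall_durations[i] > max_stall) max_stall = stall_durations[i];
            }
            return min_dur + max_stall;
        }
        ```

        For instance, a 30 FPS stream set (about 33.3ms minimum frame
        duration) combined with a stream that stalls 200ms would settle at
        roughly 233ms per frame if the stalling stream is referenced in
        every request.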
        For example, consider JPEG captures which have the following
        characteristics:

        * JPEG streams act like processed YUV streams in requests for which
        they are not included; in requests in which they are directly
        referenced, they act as JPEG streams. This is because supporting a
        JPEG stream requires the underlying YUV data to always be ready for
        use by a JPEG encoder, but the encoder will only be used (and impact
        frame duration) on requests that actually reference a JPEG stream.
        * The JPEG processor can run concurrently with the rest of the camera
        pipeline, but cannot process more than 1 capture at a time.

        In other words, using a repeating YUV request would result
        in a steady frame rate (let's say it's 30 FPS). If a single
        JPEG request is submitted periodically, the frame rate will stay
        at 30 FPS (as long as we wait for the previous JPEG to return each
        time). If we try to submit a repeating YUV + JPEG request, then
        the frame rate will drop from 30 FPS.

        In general, submitting a new request with a non-0 stall time
        stream will _not_ cause a frame rate drop unless there are still
        outstanding buffers for that stream from previous requests.

        Submitting a repeating request with a set of streams (call this `S`)
        is equivalent to setting the minimum frame duration to the normal
        minimum frame duration for `S` plus the maximum stall duration
        for `S`.

        When interleaving requests with and without a stall duration,
        a request will stall by the maximum of the remaining times
        for each stream that can stall and still has outstanding buffers.

        This means that a stalling request will not have an exposure start
        until the stall has completed.

        This should correspond to the stall duration when only that stream is
        active, with all processing (typically in android.*.mode) set to FAST
        or OFF.
        Setting any of the processing modes to HIGH_QUALITY
        effectively results in an indeterminate stall duration for all
        streams in a request (the regular stall calculation rules are
        ignored).

        The following formats may always have a stall duration:

        * JPEG
        * RAW16

        The following formats will never have a stall duration:

        * YUV_420_888
        * IMPLEMENTATION_DEFINED

        All other formats may or may not have an allowed stall duration on
        a per-capability basis; refer to android.request.availableCapabilities
        for more details.

        See android.sensor.frameDuration for more information about
        calculating the max frame rate (absent stalls).

        (Keep up to date with
        StreamConfigurationMap#getOutputStallDuration(int, Size) )
      </details>
      <hal_details>
        Where possible, it is recommended that all non-JPEG formats
        (such as RAW16) not have a stall duration.
      </hal_details>
      <tag id="V1" />
    </entry>
    <entry name="streamConfigurationMap" type="int32" visibility="public" synthetic="true" typedef="streamConfigurationMap">
      <description>The available stream configurations that this
      camera device supports; also includes the minimum frame durations
      and the stall durations for each format/size combination.
      </description>
      <details>
        All camera devices will support sensor maximum resolution (defined by
        android.sensor.info.activeArraySize) for the JPEG format.

        For a given use case, the actual maximum supported resolution
        may be lower than what is listed here, depending on the destination
        Surface for the image data. For example, for recording video,
        the video encoder chosen may have a maximum size limit (e.g. 1080p)
        smaller than what the camera (e.g. maximum resolution is 3264x2448)
        can provide.
        Please reference the documentation for the image data destination to
        check if it limits the maximum size for image data.

        The following table describes the minimum required output stream
        configurations based on the hardware level
        (android.info.supportedHardwareLevel):

        Format         | Size                                         | Hardware Level | Notes
        :-------------:|:--------------------------------------------:|:--------------:|:--------------:
        JPEG           | android.sensor.info.activeArraySize          | Any            |
        JPEG           | 1920x1080 (1080p)                            | Any            | if 1080p <= activeArraySize
        JPEG           | 1280x720 (720p)                              | Any            | if 720p <= activeArraySize
        JPEG           | 640x480 (480p)                               | Any            | if 480p <= activeArraySize
        JPEG           | 320x240 (240p)                               | Any            | if 240p <= activeArraySize
        YUV_420_888    | all output sizes available for JPEG          | FULL           |
        YUV_420_888    | all output sizes available for JPEG, up to the maximum video size | LIMITED        |
        IMPLEMENTATION_DEFINED | same as YUV_420_888                  | Any            |

        Refer to android.request.availableCapabilities for additional
        mandatory stream configurations on a per-capability basis.
      </details>
      <hal_details>
        Do not set this property directly
        (it is synthetic and will not be available at the HAL layer);
        set android.scaler.availableStreamConfigurations instead.

        Not all output formats may be supported in a configuration with
        an input stream of a particular format. For more details, see
        android.scaler.availableInputOutputFormatsMap.

        It is recommended (but not mandatory) to also include half/quarter
        of sensor maximum resolution for JPEG formats (regardless of hardware
        level).

        (The following is a rewording of the above required table):

        The HAL must include sensor maximum resolution (defined by
        android.sensor.info.activeArraySize).
        For FULL capability devices (`android.info.supportedHardwareLevel == FULL`),
        the HAL must include all YUV_420_888 sizes that have JPEG sizes listed
        here as output streams.

        It must also include each resolution below if it is smaller than or
        equal to the sensor maximum resolution (for both YUV_420_888 and JPEG
        formats), as output streams:

        * 240p (320 x 240)
        * 480p (640 x 480)
        * 720p (1280 x 720)
        * 1080p (1920 x 1080)

        For LIMITED capability devices
        (`android.info.supportedHardwareLevel == LIMITED`),
        the HAL only has to list up to the maximum video size
        supported by the device.

        Regardless of hardware level, every output resolution available for
        YUV_420_888 must also be available for IMPLEMENTATION_DEFINED.

        This supersedes the following fields, which are now deprecated:

        * availableFormats
        * available[Processed,Raw,Jpeg]Sizes
      </hal_details>
    </entry>
    <entry name="croppingType" type="byte" visibility="public" enum="true">
      <enum>
        <value>CENTER_ONLY
          <notes>
            The camera device only supports centered crop regions.
          </notes>
        </value>
        <value>FREEFORM
          <notes>
            The camera device supports arbitrarily chosen crop regions.
          </notes>
        </value>
      </enum>
      <description>The crop type that this camera device supports.</description>
      <details>
        When passing a non-centered crop region (android.scaler.cropRegion) to a camera
        device that only supports CENTER_ONLY cropping, the camera device will move the
        crop region to the center of the sensor active array (android.sensor.info.activeArraySize)
        and keep the crop region width and height unchanged. The camera device will return the
        final used crop region in the metadata result android.scaler.cropRegion.
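        The CENTER_ONLY recentering described above can be sketched as
        follows (helper name hypothetical; the device's actual arithmetic
        may differ in rounding):

        ```c
        #include <stdint.h>
        #include <assert.h>

        /* Hypothetical sketch of CENTER_ONLY behavior: the requested crop
         * width and height are kept, but the region is moved to the center
         * of the active pixel array. */
        void center_crop(int32_t active_w, int32_t active_h,
                         int32_t crop_w, int32_t crop_h,
                         int32_t *out_x, int32_t *out_y) {
            *out_x = (active_w - crop_w) / 2;
            *out_y = (active_h - crop_h) / 2;
        }
        ```

        For example, on a 4000x3000 active array, a requested 2000x1500
        region anchored anywhere would be moved to top-left (1000, 750).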
4314 4315 Camera devices that support FREEFORM cropping will support any crop region that 4316 is inside of the active array. The camera device will apply the same crop region and 4317 return the final used crop region in capture result metadata android.scaler.cropRegion. 4318 4319 FULL capability devices (android.info.supportedHardwareLevel `==` FULL) will support 4320 FREEFORM cropping. 4321 </details> 4322 </entry> 4323 </static> 4324 </section> 4325 <section name="sensor"> 4326 <controls> 4327 <entry name="exposureTime" type="int64" visibility="public"> 4328 <description>Duration each pixel is exposed to 4329 light.</description> 4330 <units>nanoseconds</units> 4331 <range>android.sensor.info.exposureTimeRange</range> 4332 <details>If the sensor can't expose this exact duration, it should shorten the 4333 duration exposed to the nearest possible value (rather than expose longer). 4334 </details> 4335 <tag id="V1" /> 4336 </entry> 4337 <entry name="frameDuration" type="int64" visibility="public"> 4338 <description>Duration from start of frame exposure to 4339 start of next frame exposure.</description> 4340 <units>nanoseconds</units> 4341 <range>See android.sensor.info.maxFrameDuration, 4342 android.scaler.streamConfigurationMap. The duration 4343 is capped to `max(duration, exposureTime + overhead)`.</range> 4344 <details> 4345 The maximum frame rate that can be supported by a camera subsystem is 4346 a function of many factors: 4347 4348 * Requested resolutions of output image streams 4349 * Availability of binning / skipping modes on the imager 4350 * The bandwidth of the imager interface 4351 * The bandwidth of the various ISP processing blocks 4352 4353 Since these factors can vary greatly between different ISPs and 4354 sensors, the camera abstraction tries to represent the bandwidth 4355 restrictions with as simple a model as possible. 
        The model presented has the following characteristics:

        * The image sensor is always configured to output the smallest
        resolution possible given the application's requested output stream
        sizes. The smallest resolution is defined as being at least as large
        as the largest requested output stream size; the camera pipeline must
        never digitally upsample sensor data when the crop region covers the
        whole sensor. In general, this means that if only small output stream
        resolutions are configured, the sensor can provide a higher frame
        rate.
        * Since any request may use any or all the currently configured
        output streams, the sensor and ISP must be configured to support
        scaling a single capture to all the streams at the same time. This
        means the camera pipeline must be ready to produce the largest
        requested output size without any delay. Therefore, the overall
        frame rate of a given configured stream set is governed only by the
        largest requested stream resolution.
        * Using more than one output stream in a request does not affect the
        frame duration.
        * Certain format-streams may need to do additional background processing
        before data is consumed/produced by that stream. These processors
        can run concurrently with the rest of the camera pipeline, but
        cannot process more than 1 capture at a time.

        The necessary information for the application, given the model above,
        is provided via the android.scaler.streamConfigurationMap field
        using StreamConfigurationMap#getOutputMinFrameDuration(int, Size).
        These are used to determine the maximum frame rate / minimum frame
        duration that is possible for a given stream configuration.

        Specifically, the application can use the following rules to
        determine the minimum frame duration it can request from the camera
        device:

        1. Let the set of currently configured input/output streams
        be called `S`.
        1. Find the minimum frame durations for each stream in `S`, by
        looking it up in android.scaler.streamConfigurationMap using
        StreamConfigurationMap#getOutputMinFrameDuration(int, Size) (with
        its respective size/format). Let this set of frame durations be called
        `F`.
        1. For any given request `R`, the minimum frame duration allowed
        for `R` is the maximum out of all values in `F`. Let the streams
        used in `R` be called `S_r`.

        If none of the streams in `S_r` have a stall time (listed in
        StreamConfigurationMap#getOutputStallDuration(int,Size) using its
        respective size/format), then the frame duration in
        `F` determines the steady state frame rate that the application will
        get if it uses `R` as a repeating request. Let this special kind
        of request be called `Rsimple`.

        A repeating request `Rsimple` can be _occasionally_ interleaved
        by a single capture of a new request `Rstall` (which has at least
        one in-use stream with a non-0 stall time), and if `Rstall` has the
        same minimum frame duration this will not cause a frame rate loss
        if all buffers from the previous `Rstall` have already been
        delivered.

        For more details about stalling, see
        StreamConfigurationMap#getOutputStallDuration(int,Size).
      </details>
      <hal_details>
        For more details about stalling, see
        android.scaler.availableStallDurations.
      </hal_details>
      <tag id="V1" />
    </entry>
    <entry name="sensitivity" type="int32" visibility="public">
      <description>The amount of gain applied to sensor data
      before processing.</description>
      <units>ISO arithmetic units</units>
      <range>android.sensor.info.sensitivityRange</range>
      <details>
        The sensitivity is the standard ISO sensitivity value,
        as defined in ISO 12232:2006.
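        A sketch of how a requested sensitivity relates to the supported
        range and to android.sensor.maxAnalogSensitivity (helper name and
        numeric values hypothetical):

        ```c
        #include <stdint.h>
        #include <assert.h>

        /* Hypothetical sketch: clamp a requested sensitivity into the
         * supported range; `analog_only` reports whether the result is below
         * android.sensor.maxAnalogSensitivity, in which case the device is
         * guaranteed to apply the gain with analog amplification only. */
        int32_t clamp_sensitivity(int32_t requested, int32_t range_min,
                                  int32_t range_max, int32_t max_analog,
                                  int *analog_only) {
            int32_t s = requested;
            if (s < range_min) s = range_min;
            if (s > range_max) s = range_max;
            *analog_only = (s < max_analog) ? 1 : 0;
            return s;
        }
        ```

        For example, with a supported range of [100, 3200] and a
        maxAnalogSensitivity of 800, a request for ISO 400 stays analog-only,
        while a request for ISO 6400 is clamped to 3200.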
        The sensitivity must be within android.sensor.info.sensitivityRange, and
        if it is less than android.sensor.maxAnalogSensitivity, the camera device
        is guaranteed to use only analog amplification for applying the gain.

        If the camera device cannot apply the exact sensitivity
        requested, it will reduce the gain to the nearest supported
        value. The final sensitivity used will be available in the
        output capture result.
      </details>
      <hal_details>ISO 12232:2006 REI method is acceptable.</hal_details>
      <tag id="V1" />
    </entry>
  </controls>
  <static>
    <namespace name="info">
      <entry name="activeArraySize" type="int32" visibility="public"
             type_notes="Four ints defining the active pixel rectangle"
             container="array"
             typedef="rectangle">
        <array>
          <size>4</size>
        </array>
        <description>Area of raw data which corresponds to only
        active pixels.</description>
        <range>This array contains `(xmin, ymin, width, height)`. The `(xmin, ymin)` must be
        &gt;= `(0,0)`. The `(width, height)` must be &lt;=
        `android.sensor.info.pixelArraySize`.
        </range>
        <details>It is smaller than or equal to the full sensor pixel
        array, which could include the black calibration pixels.</details>
        <tag id="DNG" />
      </entry>
      <entry name="sensitivityRange" type="int32" visibility="public"
             type_notes="Range of supported sensitivities"
             container="array" typedef="rangeInt">
        <array>
          <size>2</size>
        </array>
        <description>Range of valid sensitivities.</description>
        <range>Min <= 100, Max &gt;= 1600</range>
        <details>
          The minimum and maximum valid values for the
          android.sensor.sensitivity control.

          The values are the standard ISO sensitivity values,
          as defined in ISO 12232:2006.
        </details>

        <tag id="BC" />
        <tag id="V1" />
      </entry>
      <entry name="colorFilterArrangement" type="byte" visibility="public" enum="true">
        <enum>
          <value>RGGB</value>
          <value>GRBG</value>
          <value>GBRG</value>
          <value>BGGR</value>
          <value>RGB
          <notes>Sensor is not Bayer; output has 3 16-bit
          values for each pixel, instead of just 1 16-bit value
          per pixel.</notes></value>
        </enum>
        <description>The arrangement of color filters on sensor;
        represents the colors in the top-left 2x2 section of
        the sensor, in reading order.</description>
        <tag id="DNG" />
      </entry>
      <entry name="exposureTimeRange" type="int64" visibility="public"
             type_notes="nanoseconds" container="array" typedef="rangeLong">
        <array>
          <size>2</size>
        </array>
        <description>Range of valid exposure
        times used by android.sensor.exposureTime.</description>
        <range>Min <= 100e3 (100 us). For FULL capability devices
        (android.info.supportedHardwareLevel == FULL), Max SHOULD be
        &gt;= 1e9 (1 sec), MUST be &gt;= 100e6 (100ms)</range>
        <hal_details>For FULL capability devices (android.info.supportedHardwareLevel == FULL),
        the maximum of the range SHOULD be at least
        1 second (1e9) and MUST be at least 100ms (100e6).</hal_details>
        <tag id="V1" />
      </entry>
      <entry name="maxFrameDuration" type="int64" visibility="public">
        <description>Maximum possible frame duration (minimum frame
        rate).</description>
        <units>nanoseconds</units>
        <range>For FULL capability devices
        (android.info.supportedHardwareLevel == FULL), Max SHOULD be
        &gt;= 1e9 (1 sec), MUST be &gt;= 100e6 (100ms)
        </range>
        <details>The largest possible android.sensor.frameDuration
        that will be accepted by the camera device. Attempting to use
        frame durations beyond the maximum will result in the frame duration
        being clipped to the maximum.
        See that control
        for a full definition of frame durations.

        Refer to
        StreamConfigurationMap#getOutputMinFrameDuration(int,Size)
        for the minimum frame duration values.
        </details>
        <hal_details>
        For FULL capability devices (android.info.supportedHardwareLevel == FULL),
        the maximum of the range SHOULD be at least
        1 second (1e9) and MUST be at least 100ms (100e6).

        android.sensor.info.maxFrameDuration must be greater than or
        equal to the android.sensor.info.exposureTimeRange max
        value (since exposure time overrides frame duration).

        Available minimum frame durations for JPEG must be no greater
        than that of the YUV_420_888/IMPLEMENTATION_DEFINED
        minimum frame durations (for that respective size).

        Since JPEG processing is considered offline and can take longer than
        a single uncompressed capture, refer to
        android.scaler.availableStallDurations
        for details about encoding this scenario.
        </hal_details>
        <tag id="V1" />
      </entry>
      <entry name="physicalSize" type="float" visibility="public"
             type_notes="width x height in millimeters"
             container="array" typedef="sizeF">
        <array>
          <size>2</size>
        </array>
        <description>The physical dimensions of the full pixel
        array.</description>
        <details>This is the physical size of the sensor pixel
        array defined by android.sensor.info.pixelArraySize.
        </details>
        <hal_details>Needed for FOV calculation for old API</hal_details>
        <tag id="V1" />
        <tag id="BC" />
      </entry>
      <entry name="pixelArraySize" type="int32" visibility="public"
             container="array" typedef="size">
        <array>
          <size>2</size>
        </array>
        <description>Dimensions of full pixel array, possibly
        including black calibration pixels.</description>
        <details>The pixel count of the full pixel array,
        which covers the android.sensor.info.physicalSize area.
        If a camera device supports raw sensor formats, either this
        or android.sensor.info.activeArraySize is the maximum output
        raw size listed in android.scaler.streamConfigurationMap.
        If a size corresponding to pixelArraySize is listed, the resulting
        raw sensor image will include black pixels.
        </details>
        <tag id="DNG" />
        <tag id="BC" />
      </entry>
      <entry name="whiteLevel" type="int32" visibility="public">
        <description>
        Maximum raw value output by sensor.
        </description>
        <range>&gt; 255 (8-bit output)</range>
        <details>
        This specifies the fully-saturated encoding level for the raw
        sample values from the sensor. Saturation is typically caused by
        the sensor becoming highly non-linear or by clipping. The minimum
        for each channel is specified by the offset in the
        android.sensor.blackLevelPattern tag.

        The white level is typically determined either by sensor bit depth
        (8-14 bits is expected), or by the point where the sensor response
        becomes too non-linear to be useful. The default value for this is
        the maximum representable value for a 16-bit raw sample (2^16 - 1).
        </details>
        <hal_details>
        The full bit depth of the sensor must be available in the raw data,
        so the value for linear sensors should not be significantly lower
        than the maximum raw value supported, i.e. 2^(sensor bits per pixel).
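
        A sketch of the white level expected from a linear sensor of a given
        raw bit depth (helper name hypothetical; the metadata default
        corresponds to a full 16-bit sample):

        ```c
        #include <stdint.h>
        #include <assert.h>

        /* Hypothetical sketch: fully-saturated encoding level for a linear
         * sensor producing `bits`-bit raw samples, i.e. 2^bits - 1. */
        int32_t white_level_for_bits(int bits) {
            return (int32_t)((1u << bits) - 1u);
        }
        ```

        For example, a 10-bit sensor would be expected to report a white
        level near 1023, and the 16-bit default is 65535.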
        </hal_details>
        <tag id="DNG" />
      </entry>
    </namespace>
    <entry name="referenceIlluminant1" type="byte" visibility="public" enum="true">
      <enum>
        <value id="1">DAYLIGHT</value>
        <value id="2">FLUORESCENT</value>
        <value id="3">TUNGSTEN
          <notes>Incandescent light</notes>
        </value>
        <value id="4">FLASH</value>
        <value id="9">FINE_WEATHER</value>
        <value id="10">CLOUDY_WEATHER</value>
        <value id="11">SHADE</value>
        <value id="12">DAYLIGHT_FLUORESCENT
          <notes>D 5700 - 7100K</notes>
        </value>
        <value id="13">DAY_WHITE_FLUORESCENT
          <notes>N 4600 - 5400K</notes>
        </value>
        <value id="14">COOL_WHITE_FLUORESCENT
          <notes>W 3900 - 4500K</notes>
        </value>
        <value id="15">WHITE_FLUORESCENT
          <notes>WW 3200 - 3700K</notes>
        </value>
        <value id="17">STANDARD_A</value>
        <value id="18">STANDARD_B</value>
        <value id="19">STANDARD_C</value>
        <value id="20">D55</value>
        <value id="21">D65</value>
        <value id="22">D75</value>
        <value id="23">D50</value>
        <value id="24">ISO_STUDIO_TUNGSTEN</value>
      </enum>
      <description>
        The standard reference illuminant used as the scene light source when
        calculating the android.sensor.colorTransform1,
        android.sensor.calibrationTransform1, and
        android.sensor.forwardMatrix1 matrices.
      </description>
      <details>
        The values in this tag correspond to the values defined for the
        EXIF LightSource tag. These illuminants are standard light sources
        that are often used when calibrating camera devices.

        If this tag is present, then android.sensor.colorTransform1,
        android.sensor.calibrationTransform1, and
        android.sensor.forwardMatrix1 will also be present.

        Some devices may choose to provide a second set of calibration
        information for improved quality, including
        android.sensor.referenceIlluminant2 and its corresponding matrices.
      </details>
      <hal_details>
        The first reference illuminant (android.sensor.referenceIlluminant1)
        and corresponding matrices must be present to support DNG output.

        When producing raw images with a color profile that has only been
        calibrated against a single light source, it is valid to omit
        android.sensor.referenceIlluminant2 along with the
        android.sensor.colorTransform2, android.sensor.calibrationTransform2,
        and android.sensor.forwardMatrix2 matrices.

        If only android.sensor.referenceIlluminant1 is included, it should be
        chosen so that it is representative of typical scene lighting. In
        general, D50 or DAYLIGHT will be chosen for this case.

        If both android.sensor.referenceIlluminant1 and
        android.sensor.referenceIlluminant2 are included, they should be
        chosen to represent the typical range of scene lighting conditions.
        In general, a low color temperature illuminant such as Standard-A will
        be chosen for the first reference illuminant and a higher color
        temperature illuminant such as D65 will be chosen for the second
        reference illuminant.
      </hal_details>
      <tag id="DNG" />
    </entry>
    <entry name="referenceIlluminant2" type="byte" visibility="public">
      <description>
        The standard reference illuminant used as the scene light source when
        calculating the android.sensor.colorTransform2,
        android.sensor.calibrationTransform2, and
        android.sensor.forwardMatrix2 matrices.
      </description>
      <details>
        See android.sensor.referenceIlluminant1 for more details.
        Valid values for this are the same as those given for the first
        reference illuminant.

        If this tag is present, then android.sensor.colorTransform2,
        android.sensor.calibrationTransform2, and
        android.sensor.forwardMatrix2 will also be present.
4703 </details> 4704 <tag id="DNG" /> 4705 </entry> 4706 <entry name="calibrationTransform1" type="rational" 4707 visibility="public" optional="true" 4708 type_notes="3x3 matrix in row-major-order" container="array" 4709 typedef="colorSpaceTransform" > 4710 <array> 4711 <size>3</size> 4712 <size>3</size> 4713 </array> 4714 <description> 4715 A per-device calibration transform matrix that maps from the 4716 reference sensor colorspace to the actual device sensor colorspace. 4717 </description> 4718 <details> 4719 This matrix is used to correct for per-device variations in the 4720 sensor colorspace, and is used for processing raw buffer data. 4721 4722 The matrix is expressed as a 3x3 matrix in row-major-order, and 4723 contains a per-device calibration transform that maps colors 4724 from reference sensor color space (i.e. the "golden module" 4725 colorspace) into this camera device's native sensor color 4726 space under the first reference illuminant 4727 (android.sensor.referenceIlluminant1). 4728 </details> 4729 <tag id="DNG" /> 4730 </entry> 4731 <entry name="calibrationTransform2" type="rational" 4732 visibility="public" optional="true" 4733 type_notes="3x3 matrix in row-major-order" container="array" 4734 typedef="colorSpaceTransform"> 4735 <array> 4736 <size>3</size> 4737 <size>3</size> 4738 </array> 4739 <description> 4740 A per-device calibration transform matrix that maps from the 4741 reference sensor colorspace to the actual device sensor colorspace 4742 (this is the colorspace of the raw buffer data). 4743 </description> 4744 <details> 4745 This matrix is used to correct for per-device variations in the 4746 sensor colorspace, and is used for processing raw buffer data. 4747 4748 The matrix is expressed as a 3x3 matrix in row-major-order, and 4749 contains a per-device calibration transform that maps colors 4750 from reference sensor color space (i.e. 
the "golden module" 4751 colorspace) into this camera device's native sensor color 4752 space under the second reference illuminant 4753 (android.sensor.referenceIlluminant2). 4754 4755 This matrix will only be present if the second reference 4756 illuminant is present. 4757 </details> 4758 <tag id="DNG" /> 4759 </entry> 4760 <entry name="colorTransform1" type="rational" 4761 visibility="public" optional="true" 4762 type_notes="3x3 matrix in row-major-order" container="array" 4763 typedef="colorSpaceTransform" > 4764 <array> 4765 <size>3</size> 4766 <size>3</size> 4767 </array> 4768 <description> 4769 A matrix that transforms color values from CIE XYZ color space to 4770 reference sensor color space. 4771 </description> 4772 <details> 4773 This matrix is used to convert from the standard CIE XYZ color 4774 space to the reference sensor colorspace, and is used when processing 4775 raw buffer data. 4776 4777 The matrix is expressed as a 3x3 matrix in row-major-order, and 4778 contains a color transform matrix that maps colors from the CIE 4779 XYZ color space to the reference sensor color space (i.e. the 4780 "golden module" colorspace) under the first reference illuminant 4781 (android.sensor.referenceIlluminant1). 4782 4783 The white points chosen in both the reference sensor color space 4784 and the CIE XYZ colorspace when calculating this transform will 4785 match the standard white point for the first reference illuminant 4786 (i.e. no chromatic adaptation will be applied by this transform). 4787 </details> 4788 <tag id="DNG" /> 4789 </entry> 4790 <entry name="colorTransform2" type="rational" 4791 visibility="public" optional="true" 4792 type_notes="3x3 matrix in row-major-order" container="array" 4793 typedef="colorSpaceTransform" > 4794 <array> 4795 <size>3</size> 4796 <size>3</size> 4797 </array> 4798 <description> 4799 A matrix that transforms color values from CIE XYZ color space to 4800 reference sensor color space. 
4801 </description> 4802 <details> 4803 This matrix is used to convert from the standard CIE XYZ color 4804 space to the reference sensor colorspace, and is used when processing 4805 raw buffer data. 4806 4807 The matrix is expressed as a 3x3 matrix in row-major-order, and 4808 contains a color transform matrix that maps colors from the CIE 4809 XYZ color space to the reference sensor color space (i.e. the 4810 "golden module" colorspace) under the second reference illuminant 4811 (android.sensor.referenceIlluminant2). 4812 4813 The white points chosen in both the reference sensor color space 4814 and the CIE XYZ colorspace when calculating this transform will 4815 match the standard white point for the second reference illuminant 4816 (i.e. no chromatic adaptation will be applied by this transform). 4817 4818 This matrix will only be present if the second reference 4819 illuminant is present. 4820 </details> 4821 <tag id="DNG" /> 4822 </entry> 4823 <entry name="forwardMatrix1" type="rational" 4824 visibility="public" optional="true" 4825 type_notes="3x3 matrix in row-major-order" container="array" 4826 typedef="colorSpaceTransform" > 4827 <array> 4828 <size>3</size> 4829 <size>3</size> 4830 </array> 4831 <description> 4832 A matrix that transforms white balanced camera colors from the reference 4833 sensor colorspace to the CIE XYZ colorspace with a D50 whitepoint. 4834 </description> 4835 <details> 4836 This matrix is used to convert to the standard CIE XYZ colorspace, and 4837 is used when processing raw buffer data. 4838 4839 This matrix is expressed as a 3x3 matrix in row-major-order, and contains 4840 a color transform matrix that maps white balanced colors from the 4841 reference sensor color space to the CIE XYZ color space with a D50 white 4842 point. 
4843 4844 Under the first reference illuminant (android.sensor.referenceIlluminant1) 4845 this matrix is chosen so that the standard white point for this reference 4846 illuminant in the reference sensor colorspace is mapped to D50 in the 4847 CIE XYZ colorspace. 4848 </details> 4849 <tag id="DNG" /> 4850 </entry> 4851 <entry name="forwardMatrix2" type="rational" 4852 visibility="public" optional="true" 4853 type_notes="3x3 matrix in row-major-order" container="array" 4854 typedef="colorSpaceTransform" > 4855 <array> 4856 <size>3</size> 4857 <size>3</size> 4858 </array> 4859 <description> 4860 A matrix that transforms white balanced camera colors from the reference 4861 sensor colorspace to the CIE XYZ colorspace with a D50 whitepoint. 4862 </description> 4863 <details> 4864 This matrix is used to convert to the standard CIE XYZ colorspace, and 4865 is used when processing raw buffer data. 4866 4867 This matrix is expressed as a 3x3 matrix in row-major-order, and contains 4868 a color transform matrix that maps white balanced colors from the 4869 reference sensor color space to the CIE XYZ color space with a D50 white 4870 point. 4871 4872 Under the second reference illuminant (android.sensor.referenceIlluminant2) 4873 this matrix is chosen so that the standard white point for this reference 4874 illuminant in the reference sensor colorspace is mapped to D50 in the 4875 CIE XYZ colorspace. 4876 4877 This matrix will only be present if the second reference 4878 illuminant is present. 
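<!-- The forward matrix mapping described above (white balanced camera colors
     to CIE XYZ with a D50 white point) can be sketched as a plain row-major
     3x3 matrix multiply. A minimal Python sketch; the matrix values below are
     hypothetical, not real calibration data:

```python
# Sketch: apply a 3x3 forward matrix, stored in row-major order, to a
# white balanced camera RGB triple to obtain CIE XYZ values.

def apply_forward_matrix(matrix, rgb):
    """matrix: 9 entries in row-major order; rgb: [R, G, B]."""
    return [sum(matrix[3 * row + col] * rgb[col] for col in range(3))
            for row in range(3)]

# A hypothetical forward matrix; a real device reports this in
# android.sensor.forwardMatrix1 as 9 rationals.
m = [0.64, 0.19, 0.12,
     0.27, 0.67, 0.06,
     0.00, 0.07, 0.75]

# A fully white balanced pixel (1, 1, 1) should map to (approximately)
# the D50 white point, since the matrix maps the reference white to D50.
xyz = apply_forward_matrix(m, [1.0, 1.0, 1.0])
```
-->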
4879 </details> 4880 <tag id="DNG" /> 4881 </entry> 4882 <entry name="baseGainFactor" type="rational" 4883 optional="true"> 4884 <description>Gain factor from electrons to raw units when 4885 ISO=100</description> 4886 <tag id="FUTURE" /> 4887 </entry> 4888 <entry name="blackLevelPattern" type="int32" visibility="public" 4889 optional="true" type_notes="2x2 raw count block" container="array"> 4890 <array> 4891 <size>4</size> 4892 </array> 4893 <description> 4894 A fixed black level offset for each of the color filter arrangement 4895 (CFA) mosaic channels. 4896 </description> 4897 <range>&gt;= 0 for each.</range> 4898 <details> 4899 This tag specifies the zero light value for each of the CFA mosaic 4900 channels in the camera sensor. The maximal value output by the 4901 sensor is represented by the value in android.sensor.info.whiteLevel. 4902 4903 The values are given in row-column scan order, with the first value 4904 corresponding to the element of the CFA in row=0, column=0. 4905 </details> 4906 <tag id="DNG" /> 4907 </entry> 4908 <entry name="maxAnalogSensitivity" type="int32" visibility="public" 4909 optional="true"> 4910 <description>Maximum sensitivity that is implemented 4911 purely through analog gain.</description> 4912 <details>For android.sensor.sensitivity values less than or 4913 equal to this, all applied gain must be analog. For 4914 values above this, the gain applied can be a mix of analog and 4915 digital.</details> 4916 <tag id="V1" /> 4917 <tag id="FULL" /> 4918 </entry> 4919 <entry name="orientation" type="int32" visibility="public"> 4920 <description>Clockwise angle through which the output 4921 image needs to be rotated to be upright on the device 4922 screen in its native orientation. 
Also defines the 4923 direction of rolling shutter readout, which is from top 4924 to bottom in the sensor's coordinate system.</description> 4925 <units>degrees clockwise rotation, only multiples of 4926 90</units> 4927 <range>0,90,180,270</range> 4928 <tag id="BC" /> 4929 </entry> 4930 <entry name="profileHueSatMapDimensions" type="int32" 4931 visibility="system" optional="true" 4932 type_notes="Number of samples for hue, saturation, and value" 4933 container="array"> 4934 <array> 4935 <size>3</size> 4936 </array> 4937 <description> 4938 The number of input samples for each dimension of 4939 android.sensor.profileHueSatMap. 4940 </description> 4941 <range> 4942 Hue &gt;= 1, 4943 Saturation &gt;= 2, 4944 Value &gt;= 1 4945 </range> 4946 <details> 4947 The number of input samples for the hue, saturation, and value 4948 dimensions of android.sensor.profileHueSatMap. The order of the 4949 dimensions given is hue, saturation, value; where hue is the 0th 4950 element. 4951 </details> 4952 <tag id="DNG" /> 4953 </entry> 4954 </static> 4955 <dynamic> 4956 <clone entry="android.sensor.exposureTime" kind="controls"> 4957 </clone> 4958 <clone entry="android.sensor.frameDuration" 4959 kind="controls"></clone> 4960 <clone entry="android.sensor.sensitivity" kind="controls"> 4961 </clone> 4962 <entry name="timestamp" type="int64" visibility="public"> 4963 <description>Time at start of exposure of first 4964 row of the image sensor, in nanoseconds.</description> 4965 <units>nanoseconds</units> 4966 <range>&gt; 0</range> 4967 <details>The timestamps are also included in all image 4968 buffers produced for the same capture, and will be identical 4969 on all the outputs. The timestamps measure time since an 4970 unspecified starting point, and are monotonically 4971 increasing.
4972 4973 They can be compared with the timestamps for other captures 4974 from the same camera device, but are not guaranteed to be 4975 comparable to any other time source.</details> 4976 <hal_details> 4977 All timestamps should be in reference to the kernel's 4978 CLOCK_BOOTTIME monotonic clock, which properly accounts for 4979 time spent asleep. This allows for synchronization with 4980 sensors that continue to operate while the system is 4981 otherwise asleep. 4982 4983 If not CLOCK_BOOTTIME, timestamps must be in reference to 4984 CLOCK_MONOTONIC. 4985 </hal_details> 4986 <tag id="BC" /> 4987 </entry> 4988 <entry name="temperature" type="float" 4989 optional="true"> 4990 <description>The temperature of the sensor, sampled at the time 4991 exposure began for this frame. 4992 4993 The thermal diode being queried should be inside the sensor PCB, or 4994 somewhere close to it. 4995 </description> 4996 4997 <units>celsius</units> 4998 <range>Optional. This value is missing if no temperature is available.</range> 4999 <tag id="FUTURE" /> 5000 </entry> 5001 <entry name="neutralColorPoint" type="rational" visibility="public" 5002 optional="true" container="array"> 5003 <array> 5004 <size>3</size> 5005 </array> 5006 <description> 5007 The estimated camera neutral color in the native sensor colorspace at 5008 the time of capture. 5009 </description> 5010 <details> 5011 This value gives the neutral color point encoded as an RGB value in the 5012 native sensor color space. The neutral color point indicates the 5013 currently estimated white point of the scene illumination. It can be 5014 used to interpolate between the provided color transforms when 5015 processing raw sensor data. 5016 5017 The order of the values is R, G, B; where R is in the lowest index. 
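<!-- One common way the neutral color point is used (mirroring how the DNG
     AsShotNeutral tag is typically consumed) is to derive per-channel white
     balance gains as the reciprocal of the neutral point, normalized so the
     green gain is 1.0. This is an illustrative sketch, not a mandated
     algorithm:

```python
# Sketch: derive white balance gains from the neutral color point.
from fractions import Fraction

def wb_gains_from_neutral(neutral):
    """neutral: [R, G, B] rationals from android.sensor.neutralColorPoint."""
    g = neutral[1]
    return [float(g / c) for c in neutral]

gains = wb_gains_from_neutral(
    [Fraction(1, 2), Fraction(1, 1), Fraction(2, 3)])
# gains[1] is exactly 1.0; red and blue are scaled up to balance.
```
-->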
5018 </details> 5019 <tag id="DNG" /> 5020 </entry> 5021 <entry name="profileHueSatMap" type="float" 5022 visibility="system" optional="true" 5023 type_notes="Mapping for hue, saturation, and value" 5024 container="array"> 5025 <array> 5026 <size>hue_samples</size> 5027 <size>saturation_samples</size> 5028 <size>value_samples</size> 5029 <size>3</size> 5030 </array> 5031 <description> 5032 A mapping containing a hue shift, saturation scale, and value scale 5033 for each pixel. 5034 </description> 5035 <units> 5036 Hue shift is given in degrees; saturation and value scale factors are 5037 unitless. 5038 </units> 5039 <details> 5040 hue_samples, saturation_samples, and value_samples are given in 5041 android.sensor.profileHueSatMapDimensions. 5042 5043 Each entry of this map contains three floats corresponding to the 5044 hue shift, saturation scale, and value scale, respectively; where the 5045 hue shift has the lowest index. The map entries are stored in the tag 5046 in nested loop order, with the value divisions in the outer loop, the 5047 hue divisions in the middle loop, and the saturation divisions in the 5048 inner loop. All entries with an input saturation of zero are required to have a 5049 value scale factor of 1.0. 5050 </details> 5051 <tag id="DNG" /> 5052 </entry> 5053 <entry name="profileToneCurve" type="float" 5054 visibility="system" optional="true" 5055 type_notes="Samples defining a spline for a tone-mapping curve" 5056 container="array"> 5057 <array> 5058 <size>samples</size> 5059 <size>2</size> 5060 </array> 5061 <description> 5062 A list of x,y samples defining a tone-mapping curve for gamma adjustment. 5063 </description> 5064 <range> 5065 Each sample has an input range of `[0, 1]` and an output range of 5066 `[0, 1]`. The first sample is required to be `(0, 0)`, and the last 5067 sample is required to be `(1, 1)`.
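<!-- Evaluating the sample list at an arbitrary input can be sketched as
     follows. The tag itself is specified to be interpolated with a cubic
     spline; linear interpolation between adjacent samples is used here
     only to keep the sketch short:

```python
# Sketch: evaluate the tone curve at input x by interpolating between
# the stored (in, out) sample pairs.

def eval_tone_curve(samples, x):
    """samples: list of (in, out) pairs, first (0, 0), last (1, 1)."""
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0) if x1 > x0 else 0.0
            return y0 + t * (y1 - y0)
    raise ValueError("input outside [0, 1]")

curve = [(0.0, 0.0), (0.25, 0.5), (1.0, 1.0)]
```
-->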
5068 </range> 5069 <details> 5070 This tag contains a default tone curve that can be applied while 5071 processing the image as a starting point for user adjustments. 5072 The curve is specified as a list of value pairs in linear gamma. 5073 The curve is interpolated using a cubic spline. 5074 </details> 5075 <tag id="DNG" /> 5076 </entry> 5077 <entry name="greenSplit" type="float" visibility="public" optional="true"> 5078 <description> 5079 The worst-case divergence between Bayer green channels. 5080 </description> 5081 <range> 5082 &gt;= 0 5083 </range> 5084 <details> 5085 This value is an estimate of the worst case split between the 5086 Bayer green channels in the red and blue rows in the sensor color 5087 filter array. 5088 5089 The green split is calculated as follows: 5090 5091 1. A 5x5 pixel (or larger) window W within the active sensor array is 5092 chosen. The term 'pixel' here is taken to mean a group of 4 Bayer 5093 mosaic channels (R, Gr, Gb, B). The location and size of the window 5094 chosen is implementation defined, and should be chosen to provide a 5095 green split estimate that is representative of the entire image 5096 for this camera sensor and can be calculated quickly. 5097 1. The arithmetic mean of the green channels from the red 5098 rows (mean_Gr) within W is computed. 5099 1. The arithmetic mean of the green channels from the blue 5100 rows (mean_Gb) within W is computed. 5101 1. The maximum ratio R of the two means is computed as follows: 5102 `R = max((mean_Gr + 1)/(mean_Gb + 1), (mean_Gb + 1)/(mean_Gr + 1))` 5103 5104 The ratio R is the green split divergence reported for this property, 5105 which represents how much the green channels differ in the mosaic 5106 pattern. This value is typically used to determine the treatment of 5107 the green mosaic channels when demosaicing. 5108 5109 The green split value can be roughly interpreted as follows: 5110 5111 * R &lt; 1.03 is a negligible split (&lt;3% divergence).
5112 * 1.03 &lt;= R &lt;= 1.20 will require some software 5113 correction to avoid demosaic errors (3-20% divergence). 5114 * R &gt; 1.20 will require strong software correction to produce 5115 a usable image (&gt;20% divergence). 5116 </details> 5117 <hal_details> 5118 The green split given may be a static value based on prior 5119 characterization of the camera sensor using the green split 5120 calculation method given here over a large, representative, sample 5121 set of images. Other methods of calculation that produce equivalent 5122 results, and can be interpreted in the same manner, may be used. 5123 </hal_details> 5124 <tag id="DNG" /> 5125 </entry> 5126 </dynamic> 5127 <controls> 5128 <entry name="testPatternData" type="int32" visibility="public" optional="true" container="array"> 5129 <array> 5130 <size>4</size> 5131 </array> 5132 <description> 5133 A pixel `[R, G_even, G_odd, B]` that supplies the test pattern 5134 when android.sensor.testPatternMode is SOLID_COLOR. 5135 </description> 5136 <range>Optional. 5137 Must be supported if android.sensor.availableTestPatternModes contains 5138 SOLID_COLOR.</range> 5139 <details> 5140 Each color channel is treated as an unsigned 32-bit integer. 5141 The camera device then uses the most significant X bits 5142 that correspond to how many bits are in its Bayer raw sensor 5143 output. 5144 5145 For example, a sensor with RAW10 Bayer output would use the 5146 10 most significant bits from each color channel. 5147 </details> 5148 <hal_details> 5149 </hal_details> 5150 </entry> 5151 <entry name="testPatternMode" type="int32" visibility="public" optional="true" 5152 enum="true"> 5153 <enum> 5154 <value>OFF 5155 <notes>No test pattern mode is used, and the camera 5156 device returns captures from the image sensor.
5157 5158 This is the default if the key is not set.</notes> 5159 </value> 5160 <value>SOLID_COLOR 5161 <notes> 5162 Each pixel in `[R, G_even, G_odd, B]` is replaced by its 5163 respective color channel provided in 5164 android.sensor.testPatternData. 5165 5166 For example: 5167 5168 android.sensor.testPatternData = [0, 0xFFFFFFFF, 0xFFFFFFFF, 0] 5169 5170 All green pixels are 100% green. All red/blue pixels are black. 5171 5172 android.sensor.testPatternData = [0xFFFFFFFF, 0, 0xFFFFFFFF, 0] 5173 5174 All red pixels are 100% red. Only the odd green pixels 5175 are 100% green. All blue pixels are 100% black. 5176 </notes> 5177 </value> 5178 <value>COLOR_BARS 5179 <notes> 5180 All pixel data is replaced with an 8-bar color pattern. 5181 5182 The vertical bars (left-to-right) are as follows: 5183 5184 * 100% white 5185 * yellow 5186 * cyan 5187 * green 5188 * magenta 5189 * red 5190 * blue 5191 * black 5192 5193 In general the image would look like the following: 5194 5195 W Y C G M R B K 5196 W Y C G M R B K 5197 W Y C G M R B K 5198 W Y C G M R B K 5199 W Y C G M R B K 5200 . . . . . . . . 5201 . . . . . . . . 5202 . . . . . . . . 5203 5204 (B = Blue, K = Black) 5205 5206 Each bar should take up 1/8 of the sensor pixel array width. 5207 When this is not possible, the bar size should be rounded 5208 down to the nearest integer and the pattern can repeat 5209 on the right side. 5210 5211 Each bar's height must always take up the full sensor 5212 pixel array height. 5213 5214 Each pixel in this test pattern must be set to either 5215 0% intensity or 100% intensity. 5216 </notes> 5217 </value> 5218 <value>COLOR_BARS_FADE_TO_GRAY 5219 <notes> 5220 The test pattern is similar to COLOR_BARS, except that 5221 each bar should start at its specified color at the top, 5222 and fade to gray at the bottom. 5223 5224 Furthermore, each bar is subdivided into a left and 5225 right half. The left half should have a smooth gradient, 5226 and the right half should have a quantized gradient.
5227 5228 In particular, the right half should consist of blocks of the 5229 same color for 1/16th of the active sensor pixel array width. 5230 5231 The least significant bits in the quantized gradient should 5232 be copied from the most significant bits of the smooth gradient. 5233 5234 The height of each bar should always be a multiple of 128. 5235 When this is not the case, the pattern should repeat at the bottom 5236 of the image. 5237 </notes> 5238 </value> 5239 <value>PN9 5240 <notes> 5241 All pixel data is replaced by a pseudo-random sequence 5242 generated from a PN9 512-bit sequence (typically implemented 5243 in hardware with a linear feedback shift register). 5244 5245 The generator should be reset at the beginning of each frame, 5246 and thus each subsequent raw frame with this test pattern should 5247 be exactly the same as the last. 5248 </notes> 5249 </value> 5250 <value id="256">CUSTOM1 5251 <notes>The first custom test pattern. All custom patterns that are 5252 available only on this camera device are at least this numeric 5253 value. 5254 5255 All of the custom test patterns will be static 5256 (that is the raw image must not vary from frame to frame). 5257 </notes> 5258 </value> 5259 </enum> 5260 <description>When enabled, the sensor sends a test pattern instead of 5261 doing a real exposure from the camera. 5262 </description> 5263 <range>Optional. Defaults to OFF. Value must be one of 5264 android.sensor.availableTestPatternModes</range> 5265 <details> 5266 When a test pattern is enabled, all manual sensor controls specified 5267 by android.sensor.* will be ignored. All other controls should 5268 work as normal. 5269 5270 For example, if manual flash is enabled, flash firing should still 5271 occur (and the test pattern should remain unmodified, since the flash 5272 would not actually affect it). 5273 </details> 5274 <hal_details> 5275 All test patterns are specified in the Bayer domain.
5276 5277 The HAL may choose to substitute test patterns from the sensor 5278 with test patterns from on-device memory. In that case, it should be 5279 indistinguishable to the ISP whether the data came from the 5280 sensor interconnect bus (such as CSI2) or memory. 5281 </hal_details> 5282 </entry> 5283 </controls> 5284 <dynamic> 5285 <clone entry="android.sensor.testPatternData" kind="controls"> 5286 </clone> 5287 <clone entry="android.sensor.testPatternMode" kind="controls"> 5288 </clone> 5289 </dynamic> 5290 <static> 5291 <entry name="availableTestPatternModes" type="int32" visibility="public" optional="true" 5292 type_notes="list of enums" container="array"> 5293 <array> 5294 <size>n</size> 5295 </array> 5296 <description>Lists the supported sensor test pattern modes for android.sensor.testPatternMode. 5297 </description> 5298 <range>Always includes OFF if defined. All custom modes must be &gt;= CUSTOM1</range> 5299 <details> 5300 Optional. Defaults to [OFF]. 5301 </details> 5302 </entry> 5303 </static> 5304 </section> 5305 <section name="shading"> 5306 <controls> 5307 <entry name="mode" type="byte" visibility="public" enum="true"> 5308 <enum> 5309 <value>OFF 5310 <notes>No lens shading correction is applied.</notes></value> 5311 <value>FAST 5312 <notes>Apply lens shading corrections, without slowing 5313 frame rate relative to sensor raw output</notes></value> 5314 <value>HIGH_QUALITY 5315 <notes>Apply high-quality lens shading correction, at the 5316 cost of reduced frame rate.</notes></value> 5317 </enum> 5318 <description>Quality of lens shading correction applied 5319 to the image data.</description> 5320 <details> 5321 When set to OFF mode, no lens shading correction will be applied by the 5322 camera device, and identity lens shading map data will be provided 5323 if `android.statistics.lensShadingMapMode == ON`.
For example, for a lens 5324 shading map with size specified as `android.lens.info.shadingMapSize = [ 4, 3 ]`, 5325 the output android.statistics.lensShadingMap for this case will be an identity map 5326 shown below: 5327 5328 [ 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 5329 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 5330 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 5331 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 5332 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 5333 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0 ] 5334 5335 When set to other modes, lens shading correction will be applied by the 5336 camera device. Applications can request lens shading map data by setting 5337 android.statistics.lensShadingMapMode to ON, and then the camera device will provide 5338 lens shading map data in android.statistics.lensShadingMap, with size specified 5339 by android.lens.info.shadingMapSize; the returned shading map data will be the one 5340 applied by the camera device for this capture request. 5341 5342 The shading map data may depend on the auto-exposure (AE) and AWB statistics, therefore the reliability 5343 of the map data may be affected by the AE and AWB algorithms. When AE and AWB are in 5344 AUTO modes (android.control.aeMode `!=` OFF and android.control.awbMode `!=` OFF), 5345 to get best results, it is recommended that applications wait for the AE and AWB to 5346 converge before using the returned shading map data.
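<!-- The identity map layout described above (4 columns x 3 rows x 4 Bayer
     channels, every gain factor 1.0) can be sketched as:

```python
# Sketch: build the identity shading map for a given map size.
# Each of the columns x rows sample points carries 4 gain factors,
# one per Bayer channel.

def identity_shading_map(columns, rows):
    return [1.0] * (columns * rows * 4)

shading_map = identity_shading_map(4, 3)  # 48 entries, all 1.0
```
-->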
5347 </details> 5348 </entry> 5349 <entry name="strength" type="byte"> 5350 <description>Control the amount of shading correction 5351 applied to the images</description> 5352 <units>unitless: 1-10; 10 is full shading 5353 compensation</units> 5354 <tag id="FUTURE" /> 5355 </entry> 5356 </controls> 5357 <dynamic> 5358 <clone entry="android.shading.mode" kind="controls"> 5359 </clone> 5360 </dynamic> 5361 </section> 5362 <section name="statistics"> 5363 <controls> 5364 <entry name="faceDetectMode" type="byte" visibility="public" enum="true"> 5365 <enum> 5366 <value>OFF 5367 <notes>Do not include face detection statistics in capture 5368 results.</notes></value> 5369 <value optional="true">SIMPLE 5370 <notes>Return face rectangle and confidence values only. 5371 5372 In this mode, only android.statistics.faceRectangles and 5373 android.statistics.faceScores outputs are valid. 5374 </notes></value> 5375 <value optional="true">FULL 5376 <notes>Return all face 5377 metadata. 5378 5379 In this mode, 5380 android.statistics.faceRectangles, 5381 android.statistics.faceScores, 5382 android.statistics.faceIds, and 5383 android.statistics.faceLandmarks outputs are valid. 5384 </notes></value> 5385 </enum> 5386 <description>Control for the face detector 5387 unit.</description> 5388 <range> 5389 android.statistics.info.availableFaceDetectModes</range> 5390 <details>Whether face detection is enabled, and whether it 5391 should output just the basic fields or the full set of 5392 fields. 
Value must be one of the 5393 android.statistics.info.availableFaceDetectModes.</details> 5394 <tag id="BC" /> 5395 </entry> 5396 <entry name="histogramMode" type="byte" enum="true" typedef="boolean"> 5397 <enum> 5398 <value>OFF</value> 5399 <value>ON</value> 5400 </enum> 5401 <description>Operating mode for histogram 5402 generation</description> 5403 <tag id="FUTURE" /> 5404 </entry> 5405 <entry name="sharpnessMapMode" type="byte" enum="true" typedef="boolean"> 5406 <enum> 5407 <value>OFF</value> 5408 <value>ON</value> 5409 </enum> 5410 <description>Operating mode for sharpness map 5411 generation</description> 5412 <tag id="FUTURE" /> 5413 </entry> 5414 <entry name="hotPixelMapMode" type="byte" visibility="public" enum="true" 5415 typedef="boolean"> 5416 <enum> 5417 <value>OFF 5418 <notes>Hot pixel map production is disabled. 5419 </notes></value> 5420 <value>ON 5421 <notes>Hot pixel map production is enabled. 5422 </notes></value> 5423 </enum> 5424 <description> 5425 Operating mode for hotpixel map generation. 5426 </description> 5427 <details> 5428 If set to ON, a hotpixel map is returned in android.statistics.hotPixelMap. 5429 If set to OFF, no hotpixel map will be returned. 5430 5431 This must be set to a valid mode from android.statistics.info.availableHotPixelMapModes. 5432 </details> 5433 <tag id="V1" /> 5434 <tag id="DNG" /> 5435 </entry> 5436 </controls> 5437 <static> 5438 <namespace name="info"> 5439 <entry name="availableFaceDetectModes" type="byte" 5440 visibility="public" 5441 type_notes="List of enums from android.statistics.faceDetectMode" 5442 container="array" 5443 typedef="enumList"> 5444 <array> 5445 <size>n</size> 5446 </array> 5447 <description>The face detection modes that are available 5448 for this camera device. 5449 </description> 5450 <units>List of enum: 5451 OFF 5452 SIMPLE 5453 FULL</units> 5454 <details>OFF is always supported. 
5455 5456 SIMPLE means the device supports the 5457 android.statistics.faceRectangles and 5458 android.statistics.faceScores outputs. 5459 5460 FULL means the device additionally supports the 5461 android.statistics.faceIds and 5462 android.statistics.faceLandmarks outputs. 5463 </details> 5464 </entry> 5465 <entry name="histogramBucketCount" type="int32"> 5466 <description>Number of histogram buckets 5467 supported</description> 5468 <range>&gt;= 64</range> 5469 <tag id="FUTURE" /> 5470 </entry> 5471 <entry name="maxFaceCount" type="int32" visibility="public" > 5472 <description>The maximum number of simultaneously detectable 5473 faces.</description> 5474 <range>&gt;= 4 if android.statistics.info.availableFaceDetectModes lists 5475 modes besides OFF, otherwise 0</range> 5476 <tag id="BC" /> 5477 </entry> 5478 <entry name="maxHistogramCount" type="int32"> 5479 <description>Maximum value possible for a histogram 5480 bucket</description> 5481 <tag id="FUTURE" /> 5482 </entry> 5483 <entry name="maxSharpnessMapValue" type="int32"> 5484 <description>Maximum value possible for a sharpness map 5485 region.</description> 5486 <tag id="FUTURE" /> 5487 </entry> 5488 <entry name="sharpnessMapSize" type="int32" 5489 type_notes="width x height" container="array" typedef="size"> 5490 <array> 5491 <size>2</size> 5492 </array> 5493 <description>Dimensions of the sharpness 5494 map</description> 5495 <range>Must be at least 32 x 32</range> 5496 <tag id="FUTURE" /> 5497 </entry> 5498 <entry name="availableHotPixelMapModes" type="byte" visibility="public" 5499 type_notes="list of enums" container="array" typedef="boolean"> 5500 <array> 5501 <size>n</size> 5502 </array> 5503 <description> 5504 The set of hot pixel map output modes supported by this camera device. 5505 </description> 5506 <details> 5507 This tag lists valid output modes for android.statistics.hotPixelMapMode. 5508 5509 If no hotpixel map is available for this camera device, this will contain 5510 only OFF. 
If the hotpixel map is available, this will include both 5511 the ON and OFF options. 5512 </details> 5513 <tag id="V1" /> 5514 <tag id="DNG" /> 5515 </entry> 5516 </namespace> 5517 </static> 5518 <dynamic> 5519 <clone entry="android.statistics.faceDetectMode" 5520 kind="controls"></clone> 5521 <entry name="faceIds" type="int32" visibility="hidden" container="array"> 5522 <array> 5523 <size>n</size> 5524 </array> 5525 <description>List of unique IDs for detected faces.</description> 5526 <details> 5527 Each detected face is given a unique ID that is valid for as long as the face is visible 5528 to the camera device. A face that leaves the field of view and later returns may be 5529 assigned a new ID. 5530 5531 Only available if android.statistics.faceDetectMode == FULL</details> 5532 <tag id="BC" /> 5533 </entry> 5534 <entry name="faceLandmarks" type="int32" visibility="hidden" 5535 type_notes="(leftEyeX, leftEyeY, rightEyeX, rightEyeY, mouthX, mouthY)" 5536 container="array"> 5537 <array> 5538 <size>n</size> 5539 <size>6</size> 5540 </array> 5541 <description>List of landmarks for detected 5542 faces.</description> 5543 <details> 5544 The coordinate system is that of android.sensor.info.activeArraySize, with 5545 `(0, 0)` being the top-left pixel of the active array. 5546 5547 Only available if android.statistics.faceDetectMode == FULL</details> 5548 <tag id="BC" /> 5549 </entry> 5550 <entry name="faceRectangles" type="int32" visibility="hidden" 5551 type_notes="(xmin, ymin, xmax, ymax). (0,0) is top-left of active pixel area" 5552 container="array" typedef="rectangle"> 5553 <array> 5554 <size>n</size> 5555 <size>4</size> 5556 </array> 5557 <description>List of the bounding rectangles for detected 5558 faces.</description> 5559 <details> 5560 The coordinate system is that of android.sensor.info.activeArraySize, with 5561 `(0, 0)` being the top-left pixel of the active array. 
5562 5563 Only available if android.statistics.faceDetectMode != OFF</details> 5564 <tag id="BC" /> 5565 </entry> 5566 <entry name="faceScores" type="byte" visibility="hidden" container="array"> 5567 <array> 5568 <size>n</size> 5569 </array> 5570 <description>List of the face confidence scores for 5571 detected faces</description> 5572 <range>1-100</range> 5573 <details>Only available if android.statistics.faceDetectMode != OFF. 5574 </details> 5575 <hal_details> 5576 The value should be meaningful (for example, setting 100 at 5577 all times is illegal).</hal_details> 5578 <tag id="BC" /> 5579 </entry> 5580 <entry name="faces" type="int32" visibility="public" synthetic="true" container="array" typedef="face"> 5581 <array> 5582 <size>n</size> 5583 </array> 5584 <description>List of the faces detected through camera face detection 5585 in this result.</description> 5586 <details> 5587 Only available if android.statistics.faceDetectMode `!=` OFF. 5588 </details> 5589 </entry> 5590 <entry name="histogram" type="int32" 5591 type_notes="count of pixels for each color channel that fall into each histogram bucket, scaled to be between 0 and maxHistogramCount" 5592 container="array"> 5593 <array> 5594 <size>n</size> 5595 <size>3</size> 5596 </array> 5597 <description>A 3-channel histogram based on the raw 5598 sensor data</description> 5599 <details>The k'th bucket (0-based) covers the input range 5600 (with w = android.sensor.info.whiteLevel and N = the number of histogram buckets) of [ k * w/N, 5601 (k + 1) * w / N ). If only a monochrome histogram is 5602 supported, all channels should have the same data.</details> 5603 <tag id="FUTURE" /> 5604 </entry> 5605 <clone entry="android.statistics.histogramMode" 5606 kind="controls"></clone> 5607 <entry name="sharpnessMap" type="int32" 5608 type_notes="estimated sharpness for each region of the input image. Normalized to be between 0 and maxSharpnessMapValue.
Higher values mean sharper (better focused)" 5609 container="array"> 5610 <array> 5611 <size>n</size> 5612 <size>m</size> 5613 <size>3</size> 5614 </array> 5615 <description>A 3-channel sharpness map, based on the raw 5616 sensor data</description> 5617 <details>If only a monochrome sharpness map is supported, 5618 all channels should have the same data</details> 5619 <tag id="FUTURE" /> 5620 </entry> 5621 <clone entry="android.statistics.sharpnessMapMode" 5622 kind="controls"></clone> 5623 <entry name="lensShadingCorrectionMap" type="byte" visibility="public" typedef="lensShadingMap"> 5624 <description>The shading map is a low-resolution floating-point map 5625 that lists the coefficients used to correct for vignetting, for each 5626 Bayer color channel.</description> 5627 <range>Each gain factor is &gt;= 1</range> 5628 <details>The least shaded section of the image should have a gain factor 5629 of 1; all other sections should have gains above 1. 5630 5631 When android.colorCorrection.mode = TRANSFORM_MATRIX, the map 5632 must take into account the colorCorrection settings. 5633 5634 The shading map is for the entire active pixel array, and is not 5635 affected by the crop region specified in the request. Each shading map 5636 entry is the value of the shading compensation map over a specific 5637 pixel on the sensor. Specifically, with a (N x M) resolution shading 5638 map, and an active pixel array size (W x H), shading map entry 5639 (x,y) ϵ (0 ... N-1, 0 ... M-1) is the value of the shading map at 5640 pixel ( ((W-1)/(N-1)) * x, ((H-1)/(M-1)) * y) for the four color channels. 5641 The map is assumed to be bilinearly interpolated between the sample points. 5642 5643 The channel order is [R, Geven, Godd, B], where Geven is the green 5644 channel for the even rows of a Bayer pattern, and Godd is the odd rows. 5645 The shading map is stored in a fully interleaved format. 
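As an illustration only (not part of the metadata definition), the entry-to-pixel mapping and bilinear interpolation described above can be sketched as follows, with the map stored in the fully interleaved [R, Geven, Godd, B] order:

```python
# Illustrative sketch: sampling an interleaved N x M shading map at an
# arbitrary sensor pixel of a W x H active array, per the mapping above.
def shading_gain(shading_map, n, m, w, h, px, py, channel):
    """Gain for `channel` (0=R, 1=Geven, 2=Godd, 3=B) at sensor pixel (px, py)."""
    # Entry (x, y) sits at pixel (((W-1)/(N-1)) * x, ((H-1)/(M-1)) * y),
    # so invert that mapping to find fractional map coordinates.
    fx = px * (n - 1) / (w - 1)
    fy = py * (m - 1) / (h - 1)
    x0, y0 = int(fx), int(fy)
    x1, y1 = min(x0 + 1, n - 1), min(y0 + 1, m - 1)
    dx, dy = fx - x0, fy - y0

    def sample(x, y):
        # Fully interleaved layout: 4 gains per sample point.
        return shading_map[(y * n + x) * 4 + channel]

    # Bilinear interpolation between the four surrounding sample points.
    top = sample(x0, y0) * (1 - dx) + sample(x1, y0) * dx
    bottom = sample(x0, y1) * (1 - dx) + sample(x1, y1) * dx
    return top * (1 - dy) + bottom * dy
```

For the 4x3 example map below, the gain for the red channel at pixel `(0, 0)` of a hypothetical 400x300 active array would be the first entry, 1.3.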
5646 5647 The shading map should have on the order of 30-40 rows and columns, 5648 and must be smaller than 64x64. 5649 5650 As an example, given a very small map defined as: 5651 5652 width,height = [ 4, 3 ] 5653 values = 5654 [ 1.3, 1.2, 1.15, 1.2, 1.2, 1.2, 1.15, 1.2, 5655 1.1, 1.2, 1.2, 1.2, 1.3, 1.2, 1.3, 1.3, 5656 1.2, 1.2, 1.25, 1.1, 1.1, 1.1, 1.1, 1.0, 5657 1.0, 1.0, 1.0, 1.0, 1.2, 1.3, 1.25, 1.2, 5658 1.3, 1.2, 1.2, 1.3, 1.2, 1.15, 1.1, 1.2, 5659 1.2, 1.1, 1.0, 1.2, 1.3, 1.15, 1.2, 1.3 ] 5660 5661 The low-resolution scaling map images for each channel are 5662 (displayed using nearest-neighbor interpolation): 5663 5664 ![Red lens shading map](android.statistics.lensShadingMap/red_shading.png) 5665 ![Green (even rows) lens shading map](android.statistics.lensShadingMap/green_e_shading.png) 5666 ![Green (odd rows) lens shading map](android.statistics.lensShadingMap/green_o_shading.png) 5667 ![Blue lens shading map](android.statistics.lensShadingMap/blue_shading.png) 5668 5669 As a visualization only, inverting the full-color map to recover an 5670 image of a gray wall (using bicubic interpolation for visual quality) as captured by the sensor gives: 5671 5672 ![Image of a uniform white wall (inverse shading map)](android.statistics.lensShadingMap/inv_shading.png) 5673 </details> 5674 </entry> 5675 <entry name="lensShadingMap" type="float" visibility="hidden" 5676 type_notes="2D array of float gain factors per channel to correct lens shading" 5677 container="array"> 5678 <array> 5679 <size>4</size> 5680 <size>n</size> 5681 <size>m</size> 5682 </array> 5683 <description>The shading map is a low-resolution floating-point map 5684 that lists the coefficients used to correct for vignetting, for each 5685 Bayer color channel.</description> 5686 <range>Each gain factor is &gt;= 1</range> 5687 <details>The least shaded section of the image should have a gain factor 5688 of 1; all other sections should have gains above 1. 
5689 5690 When android.colorCorrection.mode = TRANSFORM_MATRIX, the map 5691 must take into account the colorCorrection settings. 5692 5693 The shading map is for the entire active pixel array, and is not 5694 affected by the crop region specified in the request. Each shading map 5695 entry is the value of the shading compensation map over a specific 5696 pixel on the sensor. Specifically, with a (N x M) resolution shading 5697 map, and an active pixel array size (W x H), shading map entry 5698 (x,y) ϵ (0 ... N-1, 0 ... M-1) is the value of the shading map at 5699 pixel ( ((W-1)/(N-1)) * x, ((H-1)/(M-1)) * y) for the four color channels. 5700 The map is assumed to be bilinearly interpolated between the sample points. 5701 5702 The channel order is [R, Geven, Godd, B], where Geven is the green 5703 channel for the even rows of a Bayer pattern, and Godd is the odd rows. 5704 The shading map is stored in a fully interleaved format, and its size 5705 is provided in the camera static metadata by android.lens.info.shadingMapSize. 5706 5707 The shading map should have on the order of 30-40 rows and columns, 5708 and must be smaller than 64x64. 
5709 5710 As an example, given a very small map defined as: 5711 5712 android.lens.info.shadingMapSize = [ 4, 3 ] 5713 android.statistics.lensShadingMap = 5714 [ 1.3, 1.2, 1.15, 1.2, 1.2, 1.2, 1.15, 1.2, 5715 1.1, 1.2, 1.2, 1.2, 1.3, 1.2, 1.3, 1.3, 5716 1.2, 1.2, 1.25, 1.1, 1.1, 1.1, 1.1, 1.0, 5717 1.0, 1.0, 1.0, 1.0, 1.2, 1.3, 1.25, 1.2, 5718 1.3, 1.2, 1.2, 1.3, 1.2, 1.15, 1.1, 1.2, 5719 1.2, 1.1, 1.0, 1.2, 1.3, 1.15, 1.2, 1.3 ] 5720 5721 The low-resolution scaling map images for each channel are 5722 (displayed using nearest-neighbor interpolation): 5723 5724 ![Red lens shading map](android.statistics.lensShadingMap/red_shading.png) 5725 ![Green (even rows) lens shading map](android.statistics.lensShadingMap/green_e_shading.png) 5726 ![Green (odd rows) lens shading map](android.statistics.lensShadingMap/green_o_shading.png) 5727 ![Blue lens shading map](android.statistics.lensShadingMap/blue_shading.png) 5728 5729 As a visualization only, inverting the full-color map to recover an 5730 image of a gray wall (using bicubic interpolation for visual quality) as captured by the sensor gives: 5731 5732 ![Image of a uniform white wall (inverse shading map)](android.statistics.lensShadingMap/inv_shading.png) 5733 </details> 5734 <hal_details> 5735 The lens shading map calculation may depend on exposure and white balance statistics. 5736 When AE and AWB are in AUTO modes 5737 (android.control.aeMode `!=` OFF and android.control.awbMode `!=` OFF), the HAL 5738 may have all the information it needs to generate the most accurate lens shading map. When 5739 AE or AWB is in manual mode 5740 (android.control.aeMode `==` OFF or android.control.awbMode `==` OFF), the shading map 5741 may be adversely impacted by manual exposure or white balance parameters. To avoid 5742 generating unreliable shading map data, the HAL may choose to lock the shading map with 5743 the latest known good map generated when the AE and AWB are in AUTO modes.
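The locking heuristic suggested above can be sketched as follows; the class and method names here are hypothetical illustrations, not a real HAL interface:

```python
# Illustrative sketch: fall back to the last known-good shading map
# whenever AE or AWB is under manual control, as described above.
AE_MODE_OFF = 0
AWB_MODE_OFF = 0

class ShadingMapSource:
    def __init__(self):
        self.last_good_map = None

    def map_for_frame(self, ae_mode, awb_mode, computed_map):
        auto = ae_mode != AE_MODE_OFF and awb_mode != AWB_MODE_OFF
        if auto:
            # Statistics are reliable: use and remember the fresh map.
            self.last_good_map = computed_map
            return computed_map
        if self.last_good_map is None:
            # No known-good map yet; nothing better than the fresh one.
            return computed_map
        # Manual AE/AWB: lock to the latest map generated in AUTO modes.
        return self.last_good_map
```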
5744 </hal_details> 5745 </entry> 5746 <entry name="predictedColorGains" type="float" 5747 visibility="hidden" 5748 deprecated="true" 5749 optional="true" 5750 type_notes="A 1D array of floats for 4 color channel gains" 5751 container="array"> 5752 <array> 5753 <size>4</size> 5754 </array> 5755 <description>The best-fit color channel gains calculated 5756 by the camera device's statistics units for the current output frame. 5757 </description> 5758 <details> 5759 This may be different than the gains used for this frame, 5760 since statistics processing on data from a new frame 5761 typically completes after the transform has already been 5762 applied to that frame. 5763 5764 The 4 channel gains are defined in Bayer domain, 5765 see android.colorCorrection.gains for details. 5766 5767 This value should always be calculated by the auto-white balance (AWB) block, 5768 regardless of the android.control.* current values. 5769 </details> 5770 </entry> 5771 <entry name="predictedColorTransform" type="rational" 5772 visibility="hidden" 5773 deprecated="true" 5774 optional="true" 5775 type_notes="3x3 rational matrix in row-major order" 5776 container="array"> 5777 <array> 5778 <size>3</size> 5779 <size>3</size> 5780 </array> 5781 <description>The best-fit color transform matrix estimate 5782 calculated by the camera device's statistics units for the current 5783 output frame.</description> 5784 <details>The camera device will provide the estimate from its 5785 statistics unit on the white balance transforms to use 5786 for the next frame. These are the values the camera device believes 5787 are the best fit for the current output frame. This may 5788 be different than the transform used for this frame, since 5789 statistics processing on data from a new frame typically 5790 completes after the transform has already been applied to 5791 that frame. 
5792 5793 These estimates must be provided for all frames, even if 5794 capture settings and color transforms are set by the application. 5795 5796 This value should always be calculated by the auto-white balance (AWB) block, 5797 regardless of the android.control.* current values. 5798 </details> 5799 </entry> 5800 <entry name="sceneFlicker" type="byte" visibility="public" enum="true"> 5801 <enum> 5802 <value>NONE 5803 <notes>The camera device does not detect any flickering illumination 5804 in the current scene.</notes></value> 5805 <value>50HZ 5806 <notes>The camera device detects illumination flickering at 50Hz 5807 in the current scene.</notes></value> 5808 <value>60HZ 5809 <notes>The camera device detects illumination flickering at 60Hz 5810 in the current scene.</notes></value> 5811 </enum> 5812 <description>The camera device's estimate of the scene illumination 5813 frequency.</description> 5814 <details> 5815 Many light sources, such as most fluorescent lights, flicker at a rate 5816 that depends on the local utility power standards. This flicker must be 5817 accounted for by auto-exposure routines to avoid artifacts in captured images. 5818 The camera device uses this entry to tell the application what the scene 5819 illuminant frequency is. 5820 5821 When manual exposure control is enabled 5822 (`android.control.aeMode == OFF` or `android.control.mode == 5823 OFF`), android.control.aeAntibandingMode does not perform 5824 antibanding, and the application can ensure it selects 5825 exposure times that do not cause banding issues by looking 5826 into this metadata field. See 5827 android.control.aeAntibandingMode for more details. 5828 5829 Reports NONE if there doesn't appear to be flickering illumination.
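As an illustration only, one common rule of thumb an application could apply here (this heuristic is an assumption, not part of the metadata definition): for mains frequency f, the light intensity repeats every 1/(2*f) seconds (10 ms at 50Hz, about 8.33 ms at 60Hz), so exposure times that are whole multiples of that period average out the flicker.

```python
# Illustrative sketch: quantize a desired exposure time (nanoseconds) to
# a banding-free value, given the reported scene flicker frequency.
def banding_free_exposure_ns(desired_ns, scene_flicker_hz):
    if scene_flicker_hz == 0:  # NONE: no flicker detected, use as-is
        return desired_ns
    # Intensity period is half the mains period: 1 / (2 * f) seconds.
    period_ns = round(1e9 / (2 * scene_flicker_hz))
    multiples = max(1, round(desired_ns / period_ns))
    return multiples * period_ns
```

For example, with 50Hz flicker a requested 5 ms exposure would be rounded up to the 10 ms intensity period.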
5830 </details> 5831 </entry> 5832 <clone entry="android.statistics.hotPixelMapMode" kind="controls"> 5833 </clone> 5834 <entry name="hotPixelMap" type="int32" visibility="public" 5835 type_notes="list of coordinates based on android.sensor.pixelArraySize" 5836 container="array" typedef="point"> 5837 <array> 5838 <size>2</size> 5839 <size>n</size> 5840 </array> 5841 <description> 5842 List of `(x, y)` coordinates of hot/defective pixels on the sensor. 5843 </description> 5844 <range> 5845 n <= number of pixels on the sensor. 5846 The `(x, y)` coordinates must be bounded by 5847 android.sensor.info.pixelArraySize. 5848 </range> 5849 <details> 5850 A coordinate `(x, y)` must lie between `(0, 0)`, and 5851 `(width - 1, height - 1)` (inclusive), which are the top-left and 5852 bottom-right of the pixel array, respectively. The width and 5853 height dimensions are given in android.sensor.info.pixelArraySize. 5854 This may include hot pixels that lie outside of the active array 5855 bounds given by android.sensor.info.activeArraySize. 5856 </details> 5857 <hal_details> 5858 A hotpixel map contains the coordinates of pixels on the camera 5859 sensor that do not report valid values (usually due to defects in 5860 the camera sensor). This includes pixels that are stuck at certain 5861 values, or have a response that does not accurately encode the 5862 incoming light from the scene. 5863 5864 To avoid performance issues, there should be significantly fewer hot 5865 pixels than actual pixels on the camera sensor.
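The coordinate bounds described for this entry can be sketched as a simple validity filter (illustration only; the pixel array size used below is a hypothetical example):

```python
# Illustrative sketch: keep only hot pixel coordinates that lie within
# [0, width-1] x [0, height-1] of the pixel array, as required above.
def valid_hot_pixels(coords, width, height):
    """coords: list of (x, y) tuples; returns the in-bounds subset."""
    return [(x, y) for (x, y) in coords
            if 0 <= x <= width - 1 and 0 <= y <= height - 1]
```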
5866 </hal_details> 5867 <tag id="V1" /> 5868 <tag id="DNG" /> 5869 </entry> 5870 </dynamic> 5871 <controls> 5872 <entry name="lensShadingMapMode" type="byte" visibility="public" enum="true"> 5873 <enum> 5874 <value>OFF 5875 <notes>Do not include a lens shading map in the capture result.</notes></value> 5876 <value>ON 5877 <notes>Include a lens shading map in the capture result.</notes></value> 5878 </enum> 5879 <description>Whether the camera device will output the lens 5880 shading map in output result metadata.</description> 5881 <details>When set to ON, 5882 android.statistics.lensShadingMap will be provided in 5883 the output result metadata.</details> 5884 </entry> 5885 </controls> 5886 <dynamic> 5887 <clone entry="android.statistics.lensShadingMapMode" kind="controls"> 5888 </clone> 5889 </dynamic> 5890 </section> 5891 <section name="tonemap"> 5892 <controls> 5893 <entry name="curveBlue" type="float" visibility="hidden" 5894 type_notes="1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints." 5895 container="array"> 5896 <array> 5897 <size>n</size> 5898 <size>2</size> 5899 </array> 5900 <description>Tonemapping / contrast / gamma curve for the blue 5901 channel, to use when android.tonemap.mode is 5902 CONTRAST_CURVE.</description> 5903 <units>same as android.tonemap.curveRed</units> 5904 <range>same as android.tonemap.curveRed</range> 5905 <details>See android.tonemap.curveRed for more details.</details> 5906 </entry> 5907 <entry name="curveGreen" type="float" visibility="hidden" 5908 type_notes="1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints." 
5909 container="array"> 5910 <array> 5911 <size>n</size> 5912 <size>2</size> 5913 </array> 5914 <description>Tonemapping / contrast / gamma curve for the green 5915 channel, to use when android.tonemap.mode is 5916 CONTRAST_CURVE.</description> 5917 <units>same as android.tonemap.curveRed</units> 5918 <range>same as android.tonemap.curveRed</range> 5919 <details>See android.tonemap.curveRed for more details.</details> 5920 </entry> 5921 <entry name="curveRed" type="float" visibility="hidden" 5922 type_notes="1D array of float pairs (P_IN, P_OUT). The maximum number of pairs is specified by android.tonemap.maxCurvePoints." 5923 container="array"> 5924 <array> 5925 <size>n</size> 5926 <size>2</size> 5927 </array> 5928 <description>Tonemapping / contrast / gamma curve for the red 5929 channel, to use when android.tonemap.mode is 5930 CONTRAST_CURVE.</description> 5931 <range>0-1 on both input and output coordinates, normalized 5932 as a floating-point value such that 0 == black and 1 == white. 5933 </range> 5934 <details> 5935 Each channel's curve is defined by an array of control points: 5936 5937 android.tonemap.curveRed = 5938 [ P0in, P0out, P1in, P1out, P2in, P2out, P3in, P3out, ..., PNin, PNout ] 5939 2 <= N <= android.tonemap.maxCurvePoints 5940 5941 These are sorted in order of increasing `Pin`; it is always 5942 guaranteed that input values 0.0 and 1.0 are included in the list to 5943 define a complete mapping. For input values between control points, 5944 the camera device must linearly interpolate between the control 5945 points. 5946 5947 Each curve can have an independent number of points, and the number 5948 of points can be less than max (that is, the request doesn't have to 5949 always provide a curve with number of points equivalent to 5950 android.tonemap.maxCurvePoints). 5951 5952 A few examples, and their corresponding graphical mappings; these 5953 only specify the red channel and the precision is limited to 4 5954 digits, for conciseness. 
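The flat `[P0in, P0out, P1in, P1out, ...]` layout and the required linear interpolation between control points can be sketched as follows (illustration only, not part of the metadata definition):

```python
# Illustrative sketch: evaluate a tonemap curve stored as a flat array of
# (Pin, Pout) pairs, linearly interpolating between control points.
def tonemap(curve, x):
    points = list(zip(curve[0::2], curve[1::2]))  # [(Pin, Pout), ...]
    # Points are sorted by increasing Pin, and inputs 0.0 and 1.0 are
    # guaranteed to be covered, so a scan finds the enclosing segment.
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= x <= x1:
            if x1 == x0:
                return y0
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("input outside [0, 1]")
```

For the two-point linear curve below, any input maps to itself; for the invert curve, input 0.25 maps to 0.75.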
5955 5956 Linear mapping: 5957 5958 android.tonemap.curveRed = [ 0, 0, 1.0, 1.0 ] 5959 5960 ![Linear mapping curve](android.tonemap.curveRed/linear_tonemap.png) 5961 5962 Invert mapping: 5963 5964 android.tonemap.curveRed = [ 0, 1.0, 1.0, 0 ] 5965 5966 ![Inverting mapping curve](android.tonemap.curveRed/inverse_tonemap.png) 5967 5968 Gamma 1/2.2 mapping, with 16 control points: 5969 5970 android.tonemap.curveRed = [ 5971 0.0000, 0.0000, 0.0667, 0.2920, 0.1333, 0.4002, 0.2000, 0.4812, 5972 0.2667, 0.5484, 0.3333, 0.6069, 0.4000, 0.6594, 0.4667, 0.7072, 5973 0.5333, 0.7515, 0.6000, 0.7928, 0.6667, 0.8317, 0.7333, 0.8685, 5974 0.8000, 0.9035, 0.8667, 0.9370, 0.9333, 0.9691, 1.0000, 1.0000 ] 5975 5976 ![Gamma = 1/2.2 tonemapping curve](android.tonemap.curveRed/gamma_tonemap.png) 5977 5978 Standard sRGB gamma mapping, per IEC 61966-2-1:1999, with 16 control points: 5979 5980 android.tonemap.curveRed = [ 5981 0.0000, 0.0000, 0.0667, 0.2864, 0.1333, 0.4007, 0.2000, 0.4845, 5982 0.2667, 0.5532, 0.3333, 0.6125, 0.4000, 0.6652, 0.4667, 0.7130, 5983 0.5333, 0.7569, 0.6000, 0.7977, 0.6667, 0.8360, 0.7333, 0.8721, 5984 0.8000, 0.9063, 0.8667, 0.9389, 0.9333, 0.9701, 1.0000, 1.0000 ] 5985 5986 ![sRGB tonemapping curve](android.tonemap.curveRed/srgb_tonemap.png) 5987 </details> 5988 <hal_details> 5989 For a good quality mapping, at least 128 control points are 5990 preferred. 5991 5992 A typical use case of this would be a gamma-1/2.2 curve, with as many 5993 control points used as are available. 5994 </hal_details> 5995 </entry> 5996 <entry name="curve" type="float" visibility="public" synthetic="true" typedef="tonemapCurve"> 5997 <description>Tonemapping / contrast / gamma curve to use when android.tonemap.mode 5998 is CONTRAST_CURVE.</description> 5999 <details> 6000 The tonemapCurve consists of three curves for each of red, green, and blue 6001 channels respectively. The following discussion uses the red channel as an 6002 example.
The same logic applies to the green and blue channels. 6003 Each channel's curve is defined by an array of control points: 6004 6005 curveRed = 6006 [ P0(in, out), P1(in, out), P2(in, out), P3(in, out), ..., PN(in, out) ] 6007 2 <= N <= android.tonemap.maxCurvePoints 6008 6009 These are sorted in order of increasing `Pin`; it is always 6010 guaranteed that input values 0.0 and 1.0 are included in the list to 6011 define a complete mapping. For input values between control points, 6012 the camera device must linearly interpolate between the control 6013 points. 6014 6015 Each curve can have an independent number of points, and the number 6016 of points can be less than max (that is, the request doesn't have to 6017 always provide a curve with number of points equivalent to 6018 android.tonemap.maxCurvePoints). 6019 6020 A few examples, and their corresponding graphical mappings; these 6021 only specify the red channel and the precision is limited to 4 6022 digits, for conciseness. 6023 6024 Linear mapping: 6025 6026 curveRed = [ (0, 0), (1.0, 1.0) ] 6027 6028 ![Linear mapping curve](android.tonemap.curveRed/linear_tonemap.png) 6029 6030 Invert mapping: 6031 6032 curveRed = [ (0, 1.0), (1.0, 0) ] 6033 6034 ![Inverting mapping curve](android.tonemap.curveRed/inverse_tonemap.png) 6035 6036 Gamma 1/2.2 mapping, with 16 control points: 6037 6038 curveRed = [ 6039 (0.0000, 0.0000), (0.0667, 0.2920), (0.1333, 0.4002), (0.2000, 0.4812), 6040 (0.2667, 0.5484), (0.3333, 0.6069), (0.4000, 0.6594), (0.4667, 0.7072), 6041 (0.5333, 0.7515), (0.6000, 0.7928), (0.6667, 0.8317), (0.7333, 0.8685), 6042 (0.8000, 0.9035), (0.8667, 0.9370), (0.9333, 0.9691), (1.0000, 1.0000) ] 6043 6044 ![Gamma = 1/2.2 tonemapping curve](android.tonemap.curveRed/gamma_tonemap.png) 6045 6046 Standard sRGB gamma mapping, per IEC 61966-2-1:1999, with 16 control points: 6047 6048 curveRed = [ 6049 (0.0000, 0.0000), (0.0667, 0.2864), (0.1333, 0.4007), (0.2000, 0.4845), 6050 (0.2667, 0.5532), (0.3333, 0.6125),
(0.4000, 0.6652), (0.4667, 0.7130), 6051 (0.5333, 0.7569), (0.6000, 0.7977), (0.6667, 0.8360), (0.7333, 0.8721), 6052 (0.8000, 0.9063), (0.8667, 0.9389), (0.9333, 0.9701), (1.0000, 1.0000) ] 6053 6054 ![sRGB tonemapping curve](android.tonemap.curveRed/srgb_tonemap.png) 6055 </details> 6056 <hal_details> 6057 This entry is created by the framework from the curveRed, curveGreen and 6058 curveBlue entries. 6059 </hal_details> 6060 </entry> 6061 <entry name="mode" type="byte" visibility="public" enum="true"> 6062 <enum> 6063 <value>CONTRAST_CURVE 6064 <notes>Use the tone mapping curve specified in 6065 the android.tonemap.curve* entries. 6066 6067 All color enhancement and tonemapping must be disabled, except 6068 for applying the tonemapping curve specified by 6069 android.tonemap.curve. 6070 6071 Must not slow down frame rate relative to raw 6072 sensor output. 6073 </notes> 6074 </value> 6075 <value>FAST 6076 <notes> 6077 Advanced gamma mapping and color enhancement may be applied, without 6078 reducing frame rate compared to raw sensor output. 6079 </notes> 6080 </value> 6081 <value>HIGH_QUALITY 6082 <notes> 6083 High-quality gamma mapping and color enhancement will be applied, at 6084 the cost of reduced frame rate compared to raw sensor output. 6085 </notes> 6086 </value> 6087 </enum> 6088 <description>High-level global contrast/gamma/tonemapping control. 6089 </description> 6090 <details> 6091 When switching to an application-defined contrast curve by setting 6092 android.tonemap.mode to CONTRAST_CURVE, the curve is defined 6093 per-channel with a set of `(in, out)` points that specify the 6094 mapping from input high-bit-depth pixel value to the output 6095 low-bit-depth value. Since the actual pixel ranges of both input 6096 and output may change depending on the camera pipeline, the values 6097 are specified by normalized floating-point numbers. 
6098 6099 More-complex color mapping operations such as 3D color look-up 6100 tables, selective chroma enhancement, or other non-linear color 6101 transforms will be disabled when android.tonemap.mode is 6102 CONTRAST_CURVE. 6103 6104 This must be set to a valid mode in 6105 android.tonemap.availableToneMapModes. 6106 6107 When using either FAST or HIGH_QUALITY, the camera device will 6108 emit its own tonemap curve in android.tonemap.curve. 6109 These values are always available, and match the nonlinear/nonglobal 6110 transforms actually used as closely as possible. 6111 6112 If a CONTRAST_CURVE request reuses the curve that the camera device 6113 provided for FAST or HIGH_QUALITY, the resulting image's tonemap will be 6114 roughly the same.</details> 6115 </entry> 6116 </controls> 6117 <static> 6118 <entry name="maxCurvePoints" type="int32" visibility="public" > 6119 <description>Maximum number of supported points in the 6120 tonemap curve that can be used for android.tonemap.curve. 6121 </description> 6122 <range>&gt;= 64</range> 6123 <details> 6124 If the actual number of points provided by the application (in 6125 android.tonemap.curve*) is less than max, the camera device will 6126 resample the curve to its internal representation, using linear 6127 interpolation. 6128 6129 The output curves in the result metadata may have a different number 6130 of points than the input curves, and will represent the actual 6131 hardware curves used as closely as possible when linearly interpolated. 6132 </details> 6133 <hal_details> 6134 This value must be at least 64, and should ideally be at least 128. 6135 </hal_details> 6136 </entry> 6137 <entry name="availableToneMapModes" type="byte" visibility="public" 6138 type_notes="list of enums" container="array" typedef="enumList"> 6139 <array> 6140 <size>n</size> 6141 </array> 6142 <description> 6143 The set of tonemapping modes supported by this camera device.
6144 </description> 6145 <details> 6146 This tag lists the valid modes for android.tonemap.mode. 6147 6148 Full-capability camera devices must always support CONTRAST_CURVE and 6149 FAST. 6150 </details> 6151 </entry> 6152 </static> 6153 <dynamic> 6154 <clone entry="android.tonemap.curveBlue" kind="controls"> 6155 </clone> 6156 <clone entry="android.tonemap.curveGreen" kind="controls"> 6157 </clone> 6158 <clone entry="android.tonemap.curveRed" kind="controls"> 6159 </clone> 6160 <clone entry="android.tonemap.curve" kind="controls"> 6161 </clone> 6162 <clone entry="android.tonemap.mode" kind="controls"> 6163 </clone> 6164 </dynamic> 6165 </section> 6166 <section name="led"> 6167 <controls> 6168 <entry name="transmit" type="byte" visibility="hidden" enum="true" 6169 typedef="boolean"> 6170 <enum> 6171 <value>OFF</value> 6172 <value>ON</value> 6173 </enum> 6174 <description>This LED is nominally used to indicate to the user 6175 that the camera is powered on and may be streaming images back to the 6176 Application Processor. In certain rare circumstances, the OS may 6177 disable this when video is processed locally and not transmitted to 6178 any untrusted applications. 6179 6180 In particular, the LED *must* always be on when the data could be 6181 transmitted off the device. The LED *should* always be on whenever 6182 data is stored locally on the device. 6183 6184 The LED *may* be off if a trusted application is using the data that 6185 doesn't violate the above rules. 6186 </description> 6187 </entry> 6188 </controls> 6189 <dynamic> 6190 <clone entry="android.led.transmit" kind="controls"></clone> 6191 </dynamic> 6192 <static> 6193 <entry name="availableLeds" type="byte" visibility="hidden" enum="true" 6194 container="array" > 6195 <array> 6196 <size>n</size> 6197 </array> 6198 <enum> 6199 <value>TRANSMIT 6200 <notes>android.led.transmit control is used.</notes> 6201 </value> 6202 </enum> 6203 <description>A list of camera LEDs that are available on this system. 
6204 </description> 6205 </entry> 6206 </static> 6207 </section> 6208 <section name="info"> 6209 <static> 6210 <entry name="supportedHardwareLevel" type="byte" visibility="public" 6211 enum="true" > 6212 <enum> 6213 <value>LIMITED 6214 <notes>This camera device has only limited capabilities. 6215 </notes></value> 6216 <value>FULL 6217 <notes>This camera device is capable of supporting advanced imaging 6218 applications.</notes></value> 6219 </enum> 6220 <description> 6221 Generally classifies the overall set of camera device functionality. 6222 </description> 6223 <range>Optional. Default value is LIMITED.</range> 6224 <details> 6225 Camera devices will come in two flavors: LIMITED and FULL. 6226 6227 A FULL device has the most support possible and will enable the 6228 widest range of use cases such as: 6229 6230 * 30fps at maximum resolution (== sensor resolution) is preferred; more than 20fps is required. 6231 * Per frame control (android.sync.maxLatency `==` PER_FRAME_CONTROL) 6232 * Manual sensor control (android.request.availableCapabilities contains MANUAL_SENSOR) 6233 * Manual post-processing control (android.request.availableCapabilities contains MANUAL_POST_PROCESSING) 6234 6235 A LIMITED device may have some or none of the above characteristics. 6236 To find out more, refer to android.request.availableCapabilities. 6237 </details> 6238 <hal_details> 6239 The camera 3 HAL device can implement one of two possible 6240 operational modes: limited and full. Full support is 6241 expected from new higher-end devices. Limited mode has 6242 hardware requirements roughly in line with those for a 6243 camera HAL device v1 implementation, and is expected from 6244 older or inexpensive devices. Full is a strict superset of 6245 limited, and they share the same essential operational flow. 6246 6247 For full details refer to "S3.
Operational Modes" in camera3.h 6248 </hal_details> 6249 </entry> 6250 </static> 6251 </section> 6252 <section name="blackLevel"> 6253 <controls> 6254 <entry name="lock" type="byte" visibility="public" enum="true" 6255 typedef="boolean"> 6256 <enum> 6257 <value>OFF</value> 6258 <value>ON</value> 6259 </enum> 6260 <description> Whether black-level compensation is locked 6261 to its current values, or is free to vary.</description> 6262 <details>When set to ON, the values used for black-level 6263 compensation will not change until the lock is set to 6264 OFF. 6265 6266 Since changes to certain capture parameters (such as 6267 exposure time) may require resetting of black level 6268 compensation, the camera device must report whether setting 6269 the black level lock was successful in the output result 6270 metadata. 6271 6272 For example, if a sequence of requests is as follows: 6273 6274 * Request 1: Exposure = 10ms, Black level lock = OFF 6275 * Request 2: Exposure = 10ms, Black level lock = ON 6276 * Request 3: Exposure = 10ms, Black level lock = ON 6277 * Request 4: Exposure = 20ms, Black level lock = ON 6278 * Request 5: Exposure = 20ms, Black level lock = ON 6279 * Request 6: Exposure = 20ms, Black level lock = ON 6280 6281 And the exposure change in Request 4 requires the camera 6282 device to reset the black level offsets, then the output 6283 result metadata is expected to be: 6284 6285 * Result 1: Exposure = 10ms, Black level lock = OFF 6286 * Result 2: Exposure = 10ms, Black level lock = ON 6287 * Result 3: Exposure = 10ms, Black level lock = ON 6288 * Result 4: Exposure = 20ms, Black level lock = OFF 6289 * Result 5: Exposure = 20ms, Black level lock = ON 6290 * Result 6: Exposure = 20ms, Black level lock = ON 6291 6292 This indicates to the application that on frame 4, black 6293 levels were reset due to exposure value changes, and pixel 6294 values may not be consistent across captures. 
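The override behavior in this example sequence can be sketched as follows (illustration only; the per-frame tuple format here is a hypothetical simplification):

```python
# Illustrative sketch: the reported lock state flips to OFF exactly on
# the frame where a settings change forced a black level reset.
def reported_lock_states(requests):
    """requests: list of (exposure, lock_requested) per frame."""
    results = []
    prev_exposure = None
    for exposure, lock in requests:
        reset = prev_exposure is not None and exposure != prev_exposure
        # A reset overrides the requested lock to OFF for this result.
        results.append(lock and not reset)
        prev_exposure = exposure
    return results
```

Feeding in the six-request sequence above reproduces the six results listed: the lock reads OFF in result 4, where the exposure change forced a reset.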
6295 6296 The camera device will maintain the lock to the extent 6297 possible, only overriding the lock to OFF when changes to 6298 other request parameters require a black level recalculation 6299 or reset. 6300 </details> 6301 <hal_details> 6302 If for some reason black level locking is no longer possible 6303 (for example, the analog gain has changed, which forces 6304 black level offsets to be recalculated), then the HAL must 6305 override this request (and it must report 'OFF' when this 6306 does happen) until the next capture for which locking is 6307 possible again.</hal_details> 6308 <tag id="HAL2" /> 6309 </entry> 6310 </controls> 6311 <dynamic> 6312 <clone entry="android.blackLevel.lock" 6313 kind="controls"> 6314 <details> 6315 Whether the black level offset was locked for this frame. Should be 6316 ON if android.blackLevel.lock was ON in the capture request, unless 6317 a change in other capture settings forced the camera device to 6318 perform a black level reset. 6319 </details> 6320 </clone> 6321 </dynamic> 6322 </section> 6323 <section name="sync"> 6324 <dynamic> 6325 <entry name="frameNumber" type="int64" visibility="hidden" enum="true"> 6326 <enum> 6327 <value id="-1">CONVERGING 6328 <notes> 6329 The current result is not yet fully synchronized to any request. 6330 6331 Synchronization is in progress, and reading metadata from this 6332 result may include a mix of data that have taken effect since the 6333 last synchronization time. 6334 6335 In some future result, within android.sync.maxLatency frames, 6336 this value will update to the actual frame number 6337 the result is guaranteed to be synchronized to (as long as the 6338 request settings remain constant). 6339 </notes> 6340 </value> 6341 <value id="-2">UNKNOWN 6342 <notes> 6343 The current result's synchronization status is unknown. 6344 6345 The result may have already converged, or it may be in 6346 progress.
              Reading from this result may include some mix
              of settings from past requests.

              After a settings change, the new settings will eventually all
              take effect for the output buffers and results. However, this
              value will not change when that happens. Altering settings
              rapidly may therefore produce results that use a mix of
              settings from recent requests.

              This value is intended primarily for backwards compatibility
              with the older camera implementations (for
              android.hardware.Camera).
              </notes>
            </value>
          </enum>
          <description>The frame number corresponding to the last request
          with which the output result (metadata + buffers) has been fully
          synchronized.</description>
          <range>Either a non-negative value corresponding to a
          `frame_number`, or one of the two enums (CONVERGING / UNKNOWN).
          </range>
          <details>
          When a request is submitted to the camera device, there is usually
          a delay of several frames before the controls get applied. A camera
          device may either choose to account for this delay by implementing
          a pipeline and carefully submitting well-timed atomic control
          updates, or it may start streaming control changes that span
          several frame boundaries.

          In the latter case, whenever a request's settings change relative
          to the previously submitted request, the full set of changes may
          take multiple frame durations to fully take effect. Some settings
          may take effect sooner (in fewer frame durations) than others.

          While a set of control changes is being propagated, this value
          will be CONVERGING.

          Once it is fully known that a set of control changes has
          finished propagating, and the resulting updated control settings
          have been read back by the camera device, this value will be set
          to a non-negative frame number (corresponding to the request to
          which the results have synchronized).
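The three kinds of values this entry can take (the two negative enum values and a real frame number) can be distinguished as sketched below. This is an illustrative plain-Java helper with invented names, not part of the public API (this entry itself is hidden); the constants mirror the enum ids declared above:

```java
// Illustrative only: CONVERGING == -1 and UNKNOWN == -2 mirror the enum
// ids above; any other negative value is invalid.
public class SyncFrameNumber {
    public static final long CONVERGING = -1L;
    public static final long UNKNOWN = -2L;

    /** Classifies a raw android.sync.frameNumber value. */
    public static String describe(long value) {
        if (value == CONVERGING) return "converging";
        if (value == UNKNOWN) return "unknown";
        if (value >= 0) return "synchronized to frame " + value;
        throw new IllegalArgumentException(
                "invalid sync frame number: " + value);
    }

    public static void main(String[] args) {
        System.out.println(describe(CONVERGING)); // → converging
        System.out.println(describe(42));         // → synchronized to frame 42
    }
}
```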

          Older camera device implementations may not have a way to detect
          when all camera controls have been applied, and will always set
          this value to UNKNOWN.

          FULL capability devices will always have this value set to the
          frame number of the request corresponding to this result.

          _Further details_:

          * Whenever a request differs from the last request, any future
          results not yet returned may have this value set to CONVERGING
          (this could include any in-progress captures not yet returned by
          the camera device; for more details see pipeline considerations
          below).
          * Submitting a series of multiple requests that differ from the
          previous request (e.g. r1, r2, r3 s.t. r1 != r2 != r3)
          moves the new synchronization frame to the last non-repeating
          request (using the smallest frame number from the contiguous list
          of repeating requests).
          * Submitting the same request repeatedly will not change this value
          to CONVERGING, if it was already a non-negative value.
          * When this value changes to non-negative, that means that all of
          the metadata controls from the request have been applied, all of
          the metadata controls from the camera device have been read to the
          updated values (into the result), and all of the graphics buffers
          corresponding to this result are also synchronized to the request.

          _Pipeline considerations_:

          Submitting a request with updated controls relative to the
          previously submitted requests may also invalidate the
          synchronization state of all the results corresponding to currently
          in-flight requests.

          In other words, results for this current request and up to
          android.request.pipelineMaxDepth prior requests may have their
          android.sync.frameNumber change to CONVERGING.
          </details>
          <hal_details>
          Using UNKNOWN here is illegal unless android.sync.maxLatency
          is also UNKNOWN.

          FULL capability devices should simply set this value to the
          `frame_number` of the request this result corresponds to.
          </hal_details>
          <tag id="V1" />
        </entry>
      </dynamic>
      <static>
        <entry name="maxLatency" type="int32" visibility="public" enum="true">
          <enum>
            <value id="0">PER_FRAME_CONTROL
              <notes>
              Every frame has the requests immediately applied.

              Furthermore, for all results,
              `android.sync.frameNumber == android.request.frameCount`

              Changing controls over multiple requests one after another will
              produce results that have those controls applied atomically
              each frame.

              All FULL capability devices will have this as their maxLatency.
              </notes>
            </value>
            <value id="-1">UNKNOWN
              <notes>
              Each new frame has some subset (potentially the entire set)
              of the past requests applied to the camera settings.

              By submitting a series of identical requests, the camera device
              will eventually have the camera settings applied, but it is
              unknown when that exact point will be.
              </notes>
            </value>
          </enum>
          <description>
          The maximum number of frames that can occur after a request
          (different from the previous one) has been submitted, and before
          the result's state becomes synchronized (by setting
          android.sync.frameNumber to a non-negative value).
          </description>
          <units>number of processed requests</units>
          <range>&gt;= -1</range>
          <details>
          This defines the maximum distance (in number of metadata results)
          between android.sync.frameNumber and the equivalent
          android.request.frameCount.

          In other words, this acts as an upper bound on how many frames
          must occur before the camera device knows for a fact that the
          newly submitted camera settings have been applied in outgoing
          frames.
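This bound lends itself to a simple consistency check. The following is a hypothetical plain-Java sketch with invented names, not part of the API: for any synchronized result, the observed distance between the result's frame number and android.sync.frameNumber must not exceed maxLatency.

```java
// Hypothetical check an application or test harness might perform on
// the distance bounded by android.sync.maxLatency.
public class SyncDistance {
    /** Returns the observed distance, or -1 if the result has not converged. */
    public static long distance(long resultFrameNumber, long syncFrameNumber) {
        if (syncFrameNumber < 0) return -1; // CONVERGING or UNKNOWN
        return resultFrameNumber - syncFrameNumber;
    }

    public static boolean withinMaxLatency(long resultFrameNumber,
                                           long syncFrameNumber,
                                           int maxLatency) {
        long d = distance(resultFrameNumber, syncFrameNumber);
        return d >= 0 && d <= maxLatency;
    }

    public static void main(String[] args) {
        // Mirrors the worked example below: result4 has frameNumber 4 and
        // android.sync.frameNumber == 2, so the distance is 4 - 2 = 2.
        System.out.println(distance(4, 2)); // → 2
    }
}
```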

          For example, if the distance was 2,

              initial request = X (repeating)
              request1 = X
              request2 = Y
              request3 = Y
              request4 = Y

          where requestN has frameNumber N, and the first of the repeating
          initial requests has frameNumber F (and F < 1),

              initial result = X' + { android.sync.frameNumber == F }
              result1 = X' + { android.sync.frameNumber == F }
              result2 = X' + { android.sync.frameNumber == CONVERGING }
              result3 = X' + { android.sync.frameNumber == CONVERGING }
              result4 = X' + { android.sync.frameNumber == 2 }

          where resultN has frameNumber N.

          Since `result4` has a `frameNumber == 4` and
          `android.sync.frameNumber == 2`, the distance is clearly
          `4 - 2 = 2`.
          </details>
          <hal_details>
          Use `frame_count` from camera3_request_t instead of
          android.request.frameCount.

          LIMITED devices are strongly encouraged to use a non-negative
          value. If UNKNOWN is used here, then app developers do not have a
          way to know when sensor settings have been applied.
          </hal_details>
          <tag id="V1" />
        </entry>
      </static>
    </section>
  </namespace>
</metadata>