android-4.3.jd revision ca3b4e9b7e5498650993dc99d70acacc1fa2d1fb
page.title=Android 4.3 APIs
excludeFromSuggestions=true
sdk.platform.version=4.3
sdk.platform.apiLevel=18
@jd:body

<div id="qv-wrapper">
<div id="qv">

<h2>In this document
  <a href="#" onclick="hideNestedItems('#toc43',this);return false;" class="header-toggle">
    <span class="more">show more</span>
    <span class="less" style="display:none">show less</span></a></h2>

<ol id="toc43" class="hide-nested">
  <li><a href="#ApiLevel">Update your target API level</a></li>
  <li><a href="#Behaviors">Important Behavior Changes</a>
    <ol>
      <li><a href="#BehaviorsIntents">If your app uses implicit intents...</a></li>
      <li><a href="#BehaviorsAccounts">If your app depends on accounts...</a></li>
    </ol>
  </li>
  <li><a href="#RestrictedProfiles">Restricted Profiles</a>
    <ol>
      <li><a href="#AccountsInProfile">Supporting accounts in a restricted profile</a></li>
    </ol>
  </li>
  <li><a href="#Wireless">Wireless and Connectivity</a>
    <ol>
      <li><a href="#BTLE">Bluetooth Low Energy (Smart Ready)</a></li>
      <li><a href="#WiFiScan">Wi-Fi scan-only mode</a></li>
      <li><a href="#WiFiConfig">Wi-Fi configuration</a></li>
      <li><a href="#QuickResponse">Quick response for incoming calls</a></li>
    </ol>
  </li>
  <li><a href="#Multimedia">Multimedia</a>
    <ol>
      <li><a href="#DASH">MPEG DASH support</a></li>
      <li><a href="#DRM">Media DRM</a></li>
      <li><a href="#EncodingSurface">Video encoding from a Surface</a></li>
      <li><a href="#MediaMuxing">Media muxing</a></li>
      <li><a href="#ProgressAndScrubbing">Playback progress and scrubbing for RemoteControlClient</a></li>
    </ol>
  </li>
  <li><a href="#Graphics">Graphics</a>
    <ol>
      <li><a href="#OpenGL">Support for OpenGL ES 3.0</a></li>
      <li><a href="#MipMap">Mipmapping for drawables</a></li>
    </ol>
  </li>
  <li><a href="#UI">User Interface</a>
    <ol>
      <li><a href="#ViewOverlay">View overlays</a></li>
      <li><a href="#OpticalBounds">Optical bounds layout</a></li>
      <li><a href="#AnimationRect">Animation for Rect values</a></li>
      <li><a href="#AttachFocus">Window attach and focus listener</a></li>
      <li><a href="#Overscan">TV overscan support</a></li>
      <li><a href="#Orientation">Screen orientation</a></li>
      <li><a href="#RotationAnimation">Rotation animations</a></li>
    </ol>
  </li>
  <li><a href="#UserInput">User Input</a>
    <ol>
      <li><a href="#Sensors">New sensor types</a></li>
    </ol>
  </li>
  <li><a href="#NotificationListener">Notification Listener</a></li>
  <li><a href="#Contacts">Contacts Provider</a>
    <ol>
      <li><a href="#Contactables">Query for "contactables"</a></li>
      <li><a href="#ContactsDelta">Query for contacts deltas</a></li>
    </ol>
  </li>
  <li><a href="#Localization">Localization</a>
    <ol>
      <li><a href="#BiDi">Improved support for bi-directional text</a></li>
    </ol>
  </li>
  <li><a href="#A11yService">Accessibility Services</a>
    <ol>
      <li><a href="#A11yKeyEvents">Handle key events</a></li>
      <li><a href="#A11yText">Select text and copy/paste</a></li>
      <li><a href="#A11yFeatures">Declare accessibility features</a></li>
    </ol>
  </li>
  <li><a href="#Testing">Testing and Debugging</a>
    <ol>
      <li><a href="#UiAutomation">Automated UI testing</a></li>
      <li><a href="#Systrace">Systrace events for apps</a></li>
    </ol>
  </li>
  <li><a href="#Security">Security</a>
    <ol>
      <li><a href="#KeyStore">Android key store for app-private keys</a></li>
      <li><a href="#HardwareKeyChain">Hardware credential storage</a></li>
    </ol>
  </li>
  <li><a href="#Manifest">Manifest Declarations</a>
    <ol>
      <li><a href="#ManifestFeatures">Declarable required features</a></li>
      <li><a href="#ManifestPermissions">User permissions</a></li>
    </ol>
  </li>
</ol>

<h2>See also</h2>
<ol>
<li><a href="{@docRoot}sdk/api_diff/18/changes.html">API
Differences Report &raquo;</a></li>
<li><a
href="{@docRoot}tools/extras/support-library.html">Support
Library</a></li>
</ol>

</div>
</div>


<p>API Level: {@sdkPlatformApiLevel}</p>

<p>Android {@sdkPlatformVersion} ({@link android.os.Build.VERSION_CODES#JELLY_BEAN_MR2})
is an update to the Jelly Bean release that offers new features for users and app
developers. This document provides an introduction to the most notable new APIs.</p>

<p>As an app developer, you should download the Android {@sdkPlatformVersion} system image
and SDK platform from the <a href="{@docRoot}tools/help/sdk-manager.html">SDK Manager</a> as
soon as possible. If you don't have a device running Android {@sdkPlatformVersion} on which to
test your app, use the Android {@sdkPlatformVersion} system image to test your app on the <a
href="{@docRoot}tools/devices/emulator.html">Android emulator</a>. Then build your apps
against the Android {@sdkPlatformVersion} platform to begin using the latest APIs.</p>


<h3 id="ApiLevel">Update your target API level</h3>

<p>To better optimize your app for devices running Android {@sdkPlatformVersion},
you should set your <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#target">{@code targetSdkVersion}</a> to
<code>"{@sdkPlatformApiLevel}"</code>, install your app on an Android {@sdkPlatformVersion} system
image, test it, then publish an update with this change.</p>

<p>You can use APIs in Android {@sdkPlatformVersion} while also supporting older versions by adding
conditions to your code that check for the system API level before executing
APIs not supported by your <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#min">{@code minSdkVersion}</a>.
To learn more about maintaining backward compatibility, read <a
href="{@docRoot}training/basics/supporting-devices/platforms.html">Supporting Different
Platform Versions</a>.</p>

<p>Various APIs are also available in the Android <a
href="{@docRoot}tools/extras/support-library.html">Support Library</a> that allow you to implement
new features on older versions of the platform.</p>

<p>For more information about how API levels work, read <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#ApiLevels">What is API
Level?</a></p>


<h2 id="Behaviors">Important Behavior Changes</h2>

<p>If you have previously published an app for Android, be aware that your app might
be affected by changes in Android {@sdkPlatformVersion}.</p>


<h3 id="BehaviorsIntents">If your app uses implicit intents...</h3>

<p>Your app might misbehave in a restricted profile environment.</p>

<p>Users in a <a href="#RestrictedProfiles">restricted profile</a> environment might not
have all the standard Android apps available. For example, a restricted profile might have the
web browser and camera app disabled. So your app should not make assumptions about which apps are
available, because if you call {@link android.app.Activity#startActivity startActivity()} without
verifying whether an app is available to handle the {@link android.content.Intent},
your app might crash in a restricted profile.</p>

<p>When using an implicit intent, you should always verify that an app is available to handle the
intent by calling {@link android.content.Intent#resolveActivity resolveActivity()} or {@link
android.content.pm.PackageManager#queryIntentActivities queryIntentActivities()}. For example:</p>

<pre>
Intent intent = new Intent(Intent.ACTION_SEND);
...
if (intent.resolveActivity(getPackageManager()) != null) {
    startActivity(intent);
} else {
    Toast.makeText(context, R.string.app_not_available, Toast.LENGTH_LONG).show();
}
</pre>


<h3 id="BehaviorsAccounts">If your app depends on accounts...</h3>

<p>Your app might misbehave in a restricted profile environment.</p>

<p>Users within a restricted profile environment do not have access to user accounts by default.
If your app depends on an {@link android.accounts.Account}, then your app might crash or behave
unexpectedly when used in a restricted profile.</p>

<p>If you'd like to prevent restricted profiles from using your app entirely because your
app depends on sensitive account information, specify the <a
href="{@docRoot}guide/topics/manifest/application-element.html#requiredAccountType">{@code
android:requiredAccountType}</a> attribute in your manifest's <a
href="{@docRoot}guide/topics/manifest/application-element.html">{@code &lt;application&gt;}</a>
element.</p>

<p>If you'd like to allow restricted profiles to continue using your app even though they can't
create their own accounts, you can either disable the app features that require an account
or allow restricted profiles to access the accounts created by the primary user. For more
information, see the section below about <a href="#AccountsInProfile">Supporting accounts in a
restricted profile</a>.</p>




<h2 id="RestrictedProfiles">Restricted Profiles</h2>

<p>On Android tablets, users can now create restricted profiles based on the primary user.
When users create a restricted profile, they can enable restrictions such as which apps are
available to the profile. A new set of APIs in Android 4.3 also allows you to build fine-grain
restriction settings for the apps you develop.
For example, by using the new APIs, you can
allow users to control what type of content is available within your app when running in a
restricted profile environment.</p>

<p>The UI for users to control the restrictions you've built is managed by the system's
Settings application. To make your app's restriction settings appear to the user,
you must declare the restrictions your app provides by creating a {@link
android.content.BroadcastReceiver} that receives the {@link
android.content.Intent#ACTION_GET_RESTRICTION_ENTRIES} intent. The system sends this intent to
query all apps for available restrictions, then builds the UI to allow the primary user to
manage restrictions for each restricted profile.</p>

<p>In the {@link android.content.BroadcastReceiver#onReceive onReceive()} method of
your {@link android.content.BroadcastReceiver}, you must create a {@link
android.content.RestrictionEntry} for each restriction your app provides. Each {@link
android.content.RestrictionEntry} defines a restriction title, description, and one of the
following data types:</p>

<ul>
  <li>{@link android.content.RestrictionEntry#TYPE_BOOLEAN} for a restriction that is
  either true or false.
  <li>{@link android.content.RestrictionEntry#TYPE_CHOICE} for a restriction that has
  multiple choices that are mutually exclusive (radio button choices).
  <li>{@link android.content.RestrictionEntry#TYPE_MULTI_SELECT} for a restriction that
  has multiple choices that are <em>not</em> mutually exclusive (checkbox choices).
</ul>

<p>You then put all the {@link android.content.RestrictionEntry} objects into an {@link
java.util.ArrayList} and put it into the broadcast receiver's result as the value for the
{@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} extra.</p>

<p>The system creates the UI for your app's restrictions in the Settings app and saves each
restriction with the unique key you provided for each {@link android.content.RestrictionEntry}
object. When the user opens your app, you can query for any current restrictions by
calling {@link android.os.UserManager#getApplicationRestrictions getApplicationRestrictions()}.
This returns a {@link android.os.Bundle} containing the key-value pairs for each restriction
you defined with the {@link android.content.RestrictionEntry} objects.</p>

<p>If you want to provide more specific restrictions that can't be handled by boolean, single
choice, and multi-choice values, then you can create an activity where the user can specify the
restrictions and allow users to open that activity from the restriction settings. In your
broadcast receiver, include the {@link android.content.Intent#EXTRA_RESTRICTIONS_INTENT} extra
in the result {@link android.os.Bundle}. This extra must specify an {@link android.content.Intent}
indicating the {@link android.app.Activity} class to launch (use the
{@link android.os.Bundle#putParcelable putParcelable()} method to pass {@link
android.content.Intent#EXTRA_RESTRICTIONS_INTENT} with the intent).</p>
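<p>Before moving on to custom restriction activities, here is a minimal sketch of how the
pieces described above might fit together (the "downloads_allowed" key and the title are
illustrative names, not part of the platform API):</p>

<pre>
public class GetRestrictionsReceiver extends BroadcastReceiver {
    public void onReceive(Context context, Intent intent) {
        // One RestrictionEntry per restriction your app supports
        RestrictionEntry downloads = new RestrictionEntry("downloads_allowed", true);
        downloads.setTitle("Allow downloads");
        downloads.setType(RestrictionEntry.TYPE_BOOLEAN);

        ArrayList&lt;RestrictionEntry> entries = new ArrayList&lt;RestrictionEntry>();
        entries.add(downloads);

        // Return the entries as the result of the ordered broadcast
        Bundle extras = new Bundle();
        extras.putParcelableArrayList(Intent.EXTRA_RESTRICTIONS_LIST, entries);
        setResult(Activity.RESULT_OK, null, extras);
    }
}
</pre>

<p>At runtime, your app can then read back the values the primary user saved:</p>

<pre>
UserManager um = (UserManager) context.getSystemService(Context.USER_SERVICE);
Bundle restrictions = um.getApplicationRestrictions(context.getPackageName());
boolean downloadsAllowed = restrictions.getBoolean("downloads_allowed", true);
</pre>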
<p>When the primary user enters your activity to set custom restrictions, your
activity must then return a result containing the restriction values in an extra using either
the {@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} or {@link
android.content.Intent#EXTRA_RESTRICTIONS_BUNDLE} key, depending on whether you specify
{@link android.content.RestrictionEntry} objects or key-value pairs, respectively.</p>


<h3 id="AccountsInProfile">Supporting accounts in a restricted profile</h3>

<p>Any accounts added to the primary user are available to a restricted profile, but the
accounts are not accessible from the {@link android.accounts.AccountManager} APIs by default.
If you attempt to add an account with {@link android.accounts.AccountManager} while in a restricted
profile, you will get a failure result. Due to these restrictions, you have the following
three options:</p>

<li><strong>Allow access to the owner's accounts from a restricted profile.</strong>
<p>To get access to an account from a restricted profile, you must add the <a
href="{@docRoot}guide/topics/manifest/application-element.html#restrictedAccountType">{@code
android:restrictedAccountType}</a> attribute to the <a
href="{@docRoot}guide/topics/manifest/application-element.html">{@code &lt;application&gt;}</a>
tag:</p>
<pre>
&lt;application ...
    android:restrictedAccountType="com.example.account.type" >
</pre>

<p class="caution"><strong>Caution:</strong> Enabling this attribute provides your
app access to the primary user's accounts from restricted profiles. So you should allow this
only if the information displayed by your app does not reveal personally identifiable
information (PII) that's considered sensitive. The system settings will inform the primary
user that your app grants restricted profiles access to their accounts, so it should be clear
to the user that account access is important for your app's functionality.
If possible, you should also
provide adequate restriction controls for the primary user that define how much account access
is allowed in your app.</p>
</li>


<li><strong>Disable certain functionality when unable to modify accounts.</strong>
<p>If you want to use accounts, but don't actually require them for your app's primary
functionality, you can check for account availability and disable features when accounts are not
available. You should first check if there is an existing account available. If not, then query
whether it's possible to create a new account by calling {@link
android.os.UserManager#getUserRestrictions()} and checking the {@link
android.os.UserManager#DISALLOW_MODIFY_ACCOUNTS} restriction in the result. If it is {@code true},
then you should disable whatever functionality of your app requires access to accounts.
For example:</p>
<pre>
UserManager um = (UserManager) context.getSystemService(Context.USER_SERVICE);
Bundle restrictions = um.getUserRestrictions();
if (restrictions.getBoolean(UserManager.DISALLOW_MODIFY_ACCOUNTS, false)) {
   // cannot add accounts, disable some functionality
}
</pre>
<p class="note"><strong>Note:</strong> In this scenario, you should <em>not</em> declare
any new attributes in your manifest file.</p>
</li>

<li><strong>Disable your app when unable to access private accounts.</strong>
<p>If it's instead important that your app not be available to restricted profiles because
your app depends on sensitive personal information in an account (and because restricted profiles
currently cannot add new accounts), add
the <a href="{@docRoot}guide/topics/manifest/application-element.html#requiredAccountType">{@code
android:requiredAccountType}</a> attribute to the <a
href="{@docRoot}guide/topics/manifest/application-element.html">{@code &lt;application&gt;}</a>
tag:</p>
<pre>
&lt;application ...
    android:requiredAccountType="com.example.account.type" >
</pre>
<p>For example, the Gmail app uses this attribute to disable itself for restricted profiles,
because the owner's personal email should not be available to restricted profiles.</p>
</li>



<h2 id="Wireless">Wireless and Connectivity</h2>


<h3 id="BTLE">Bluetooth Low Energy (Smart Ready)</h3>

<p>Android now supports Bluetooth Low Energy (LE) with new APIs in {@link android.bluetooth}.
With the new APIs, you can build Android apps that communicate with Bluetooth Low Energy
peripherals such as heart rate monitors and pedometers.</p>

<p>Because Bluetooth LE is a hardware feature that is not available on all
Android-powered devices, you must declare in your manifest file a <a
href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code &lt;uses-feature&gt;}</a>
element for {@code "android.hardware.bluetooth_le"}:</p>
<pre>
&lt;uses-feature android:name="android.hardware.bluetooth_le" android:required="true" />
</pre>

<p>If you're already familiar with Android's Classic Bluetooth APIs, notice that using the
Bluetooth LE APIs has some differences. The most important difference is that there's now a {@link
android.bluetooth.BluetoothManager} class that you should use for some high-level operations,
such as acquiring a {@link android.bluetooth.BluetoothAdapter}, getting a list of connected
devices, and checking the state of a device.
For example, here's how you should now get the
{@link android.bluetooth.BluetoothAdapter}:</p>
<pre>
final BluetoothManager bluetoothManager =
        (BluetoothManager) getSystemService(Context.BLUETOOTH_SERVICE);
mBluetoothAdapter = bluetoothManager.getAdapter();
</pre>

<p>To discover Bluetooth LE peripherals, call {@link android.bluetooth.BluetoothAdapter#startLeScan
startLeScan()} on the {@link android.bluetooth.BluetoothAdapter}, passing it an implementation
of the {@link android.bluetooth.BluetoothAdapter.LeScanCallback} interface. When the Bluetooth
adapter detects a Bluetooth LE peripheral, your {@link
android.bluetooth.BluetoothAdapter.LeScanCallback} implementation receives a call to the
{@link android.bluetooth.BluetoothAdapter.LeScanCallback#onLeScan onLeScan()} method. This
method provides you with a {@link android.bluetooth.BluetoothDevice} object representing the
detected device, the RSSI value for the device, and a byte array containing the device's
advertisement record.</p>

<p>If you want to scan for only specific types of peripherals, you can instead call {@link
android.bluetooth.BluetoothAdapter#startLeScan startLeScan()} with an array of {@link
java.util.UUID} objects that specify the GATT services your app supports.</p>

<p class="note"><strong>Note:</strong> You can scan for Bluetooth LE devices <em>or</em>
scan for Classic Bluetooth devices using the previous APIs, but you cannot scan for both LE
and Classic Bluetooth devices at once.</p>

<p>To then connect to a Bluetooth LE peripheral, call {@link
android.bluetooth.BluetoothDevice#connectGatt connectGatt()} on the corresponding
{@link android.bluetooth.BluetoothDevice} object, passing it an implementation of
{@link android.bluetooth.BluetoothGattCallback}.</p>
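<p>Taken together, the scan-and-connect flow might be sketched as follows (the field names
and the enclosing {@code MyActivity} class are illustrative):</p>

<pre>
private BluetoothAdapter mBluetoothAdapter;
private BluetoothGatt mGatt;

private final BluetoothAdapter.LeScanCallback mLeScanCallback =
        new BluetoothAdapter.LeScanCallback() {
    public void onLeScan(BluetoothDevice device, int rssi, byte[] scanRecord) {
        // Found an LE peripheral: stop scanning, then connect to it
        mBluetoothAdapter.stopLeScan(this);
        mGatt = device.connectGatt(MyActivity.this, false, mGattCallback);
    }
};

void beginScan() {
    mBluetoothAdapter.startLeScan(mLeScanCallback);
}
</pre>

<p>Here, {@code mGattCallback} is assumed to be your {@link
android.bluetooth.BluetoothGattCallback} implementation.</p>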
<p>Your implementation of {@link
android.bluetooth.BluetoothGattCallback} receives callbacks regarding the connectivity
state with the device and other events. It's during the {@link
android.bluetooth.BluetoothGattCallback#onConnectionStateChange onConnectionStateChange()}
callback that you can begin communicating with the device, once the method passes {@link
android.bluetooth.BluetoothProfile#STATE_CONNECTED} as the new state.</p>

<p>Accessing Bluetooth features on a device also requires that your app request certain
Bluetooth user permissions. For more information, see the <a
href="{@docRoot}guide/topics/connectivity/bluetooth-le.html">Bluetooth Low Energy</a> API guide.</p>


<h3 id="WiFiScan">Wi-Fi scan-only mode</h3>

<p>When attempting to identify the user's location, Android may use Wi-Fi to help determine
the location by scanning nearby access points. However, users often keep Wi-Fi turned off to
conserve battery, resulting in location data that's less accurate.
Android now includes a
scan-only mode that allows the device's Wi-Fi radio to scan for access points to help obtain the
location without connecting to an access point, thus greatly reducing battery usage.</p>

<p>If you want to acquire the user's location but Wi-Fi is currently off, you can request that
the user enable Wi-Fi scan-only mode by calling {@link android.content.Context#startActivity
startActivity()} with the action {@link
android.net.wifi.WifiManager#ACTION_REQUEST_SCAN_ALWAYS_AVAILABLE}.</p>


<h3 id="WiFiConfig">Wi-Fi configuration</h3>

<p>New {@link android.net.wifi.WifiEnterpriseConfig} APIs allow enterprise-oriented services to
automate Wi-Fi configuration for managed devices.</p>


<h3 id="QuickResponse">Quick response for incoming calls</h3>

<p>Since Android 4.0, a feature called "Quick response" allows users to respond to incoming
calls with an immediate text message without needing to pick up the call or unlock the device.
Until now, these quick messages were always handled by the default Messaging app. Now any app
can declare its capability to handle these messages by creating a {@link android.app.Service}
with an intent filter for {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE}.</p>

<p>When the user responds to an incoming call with a quick response, the Phone app sends
the {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE} intent with a URI
describing the recipient (the caller) and the {@link android.content.Intent#EXTRA_TEXT} extra
with the message the user wants to send.</p>
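<p>A handler for these messages might be sketched as an {@link android.app.IntentService}
like the following (the class name is illustrative, and the service must also be declared
in your manifest with an intent filter for the action):</p>

<pre>
public class RespondViaMessageService extends IntentService {
    public RespondViaMessageService() {
        super("RespondViaMessageService");
    }

    protected void onHandleIntent(Intent intent) {
        // The recipient is in the intent's data URI (for example, "smsto:5551234");
        // the message text is in EXTRA_TEXT
        Uri recipient = intent.getData();
        String message = intent.getStringExtra(Intent.EXTRA_TEXT);
        if (recipient != null &amp;&amp; message != null) {
            SmsManager.getDefault().sendTextMessage(
                    recipient.getSchemeSpecificPart(), null, message, null, null);
        }
        // An IntentService stops itself once the work is done, as required here
    }
}
</pre>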
<p>When your service receives the intent, it should deliver
the message and immediately stop itself (your app should not show an activity).</p>

<p>In order to receive this intent, you must declare the {@link
android.Manifest.permission#SEND_RESPOND_VIA_MESSAGE} permission.</p>



<h2 id="Multimedia">Multimedia</h2>

<h3 id="DASH">MPEG DASH support</h3>

<p>Android now supports Dynamic Adaptive Streaming over HTTP (DASH) in accordance with the
ISO/IEC 23009-1 standard, using existing APIs in {@link android.media.MediaCodec} and {@link
android.media.MediaExtractor}. The framework underlying these APIs has been updated to support
parsing of fragmented MP4 files, but your app is still responsible for parsing the MPD metadata
and passing the individual streams to {@link android.media.MediaExtractor}.</p>

<p>If you want to use DASH with encrypted content, notice that the {@link
android.media.MediaExtractor#getSampleCryptoInfo getSampleCryptoInfo()} method returns the {@link
android.media.MediaCodec.CryptoInfo} metadata describing the structure of each encrypted media
sample. Also, the {@link android.media.MediaExtractor#getPsshInfo()} method has been added to
{@link android.media.MediaExtractor} so you can access the PSSH metadata for your DASH media.
This method returns a map of {@link java.util.UUID} objects to bytes, with the
{@link java.util.UUID} specifying the crypto scheme and the bytes being the data specific
to that scheme.</p>


<h3 id="DRM">Media DRM</h3>

<p>The new {@link android.media.MediaDrm} class provides a modular solution for digital rights
management (DRM) of your media content by separating DRM concerns from media playback. For
instance, this API separation allows you to play back Widevine-encrypted content without having
to use the Widevine media format.</p>
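<p>At a high level, the license exchange that {@link android.media.MediaDrm} supports might be
sketched as follows (the scheme UUID, init data, and server helper are illustrative
placeholders):</p>

<pre>
MediaDrm drm = new MediaDrm(schemeUuid); // UUID identifying the DRM scheme

byte[] sessionId = drm.openSession();

// Build an opaque key request from the content's DRM init data
MediaDrm.KeyRequest request = drm.getKeyRequest(sessionId, initData,
        mimeType, MediaDrm.KEY_TYPE_STREAMING, null);

// Your app is responsible for the network exchange with the license server
byte[] response = postToLicenseServer(request.getData()); // hypothetical helper

// Hand the server's response back to MediaDrm to install the keys
drm.provideKeyResponse(sessionId, response);
</pre>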
<p>This DRM solution also supports DASH Common Encryption, so you
can use a variety of DRM schemes with your streaming content.</p>

<p>You can use {@link android.media.MediaDrm} to obtain opaque key-request messages and process
key-response messages from the server for license acquisition and provisioning. Your app is
responsible for handling the network communication with the servers; the {@link
android.media.MediaDrm} class provides only the ability to generate and process the messages.</p>

<p>The {@link android.media.MediaDrm} APIs are intended to be used in conjunction with the
{@link android.media.MediaCodec} APIs that were introduced in Android 4.1 (API level 16),
including {@link android.media.MediaCodec} for encoding and decoding your content, {@link
android.media.MediaCrypto} for handling encrypted content, and {@link android.media.MediaExtractor}
for extracting and demuxing your content.</p>

<p>You must first construct {@link android.media.MediaExtractor} and
{@link android.media.MediaCodec} objects. You can then access the DRM-scheme-identifying
{@link java.util.UUID}, typically from metadata in the content, and use it to construct an
instance of a {@link android.media.MediaDrm} object with its constructor.</p>


<h3 id="EncodingSurface">Video encoding from a Surface</h3>

<p>Android 4.1 (API level 16) added the {@link android.media.MediaCodec} class for low-level
encoding and decoding of media content. When encoding video, Android 4.1 required that you provide
the media in a {@link java.nio.ByteBuffer} array, but Android 4.3 now allows you to use a {@link
android.view.Surface} as the input to an encoder.</p>
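<p>In brief, the encoder setup that the following paragraphs walk through step by step might
look like this sketch (the MIME type and format values are illustrative):</p>

<pre>
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

// Must be called after configure() and before start()
Surface inputSurface = encoder.createInputSurface();
encoder.start();

// ... render frames into inputSurface, for example through an EGL window
// surface, submitting each frame with eglSwapBuffers() ...

encoder.signalEndOfInputStream();
inputSurface.release();
</pre>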
<p>For instance, this allows you to encode input
from an existing video file or using frames generated from OpenGL ES.</p>

<p>To use a {@link android.view.Surface} as the input to your encoder, first call {@link
android.media.MediaCodec#configure configure()} for your {@link android.media.MediaCodec}.
Then call {@link android.media.MediaCodec#createInputSurface()} to receive the {@link
android.view.Surface} upon which you can stream your media.</p>

<p>For example, you can use the given {@link android.view.Surface} as the window for an OpenGL
context by passing it to {@link android.opengl.EGL14#eglCreateWindowSurface
eglCreateWindowSurface()}. Then while rendering the surface, call {@link
android.opengl.EGL14#eglSwapBuffers eglSwapBuffers()} to pass the frame to the {@link
android.media.MediaCodec}.</p>

<p>To begin encoding, call {@link android.media.MediaCodec#start()} on the {@link
android.media.MediaCodec}. When done, call {@link android.media.MediaCodec#signalEndOfInputStream}
to terminate encoding, and call {@link android.view.Surface#release()} on the
{@link android.view.Surface}.</p>


<h3 id="MediaMuxing">Media muxing</h3>

<p>The new {@link android.media.MediaMuxer} class enables multiplexing between one audio stream
and one video stream. These APIs serve as a counterpart to the {@link android.media.MediaExtractor}
class added in Android 4.1 (API level 16) for de-multiplexing (demuxing) media.</p>

<p>Supported output formats are defined in {@link android.media.MediaMuxer.OutputFormat}.
Currently,
MP4 is the only supported output format, and {@link android.media.MediaMuxer} currently supports
only one audio stream and/or one video stream at a time.</p>

<p>{@link android.media.MediaMuxer} is mostly designed to work with {@link android.media.MediaCodec}
so you can perform video processing through {@link android.media.MediaCodec}, then save the
output to an MP4 file through {@link android.media.MediaMuxer}. You can also use {@link
android.media.MediaMuxer} in combination with {@link android.media.MediaExtractor} to perform
media editing without the need to encode or decode.</p>


<h3 id="ProgressAndScrubbing">Playback progress and scrubbing for RemoteControlClient</h3>

<p>In Android 4.0 (API level 14), the {@link android.media.RemoteControlClient} class was added to
enable media playback controls from remote control clients, such as the controls available on the
lock screen. Android 4.3 now provides the ability for such controllers to display the playback
position and controls for scrubbing the playback.
If you've enabled remote control for your
media app with the {@link android.media.RemoteControlClient} APIs, then you can allow playback
scrubbing by implementing two new interfaces.</p>

<p>First, you must enable the {@link
android.media.RemoteControlClient#FLAG_KEY_MEDIA_POSITION_UPDATE} flag by passing it to
{@link android.media.RemoteControlClient#setTransportControlFlags setTransportControlFlags()}.</p>

<p>Then implement the following two new interfaces:</p>
<dl>
  <dt>{@link android.media.RemoteControlClient.OnGetPlaybackPositionListener}</dt>
  <dd>This includes the callback {@link
  android.media.RemoteControlClient.OnGetPlaybackPositionListener#onGetPlaybackPosition
  onGetPlaybackPosition()}, which requests the current position
  of your media when the remote control needs to update the progress in its UI.</dd>

  <dt>{@link android.media.RemoteControlClient.OnPlaybackPositionUpdateListener}</dt>
  <dd>This includes the callback {@link
  android.media.RemoteControlClient.OnPlaybackPositionUpdateListener#onPlaybackPositionUpdate
  onPlaybackPositionUpdate()}, which
  tells your app the new time code for your media when the user scrubs the playback with the
  remote control UI.
    <p>Once you update your playback with the new position, call {@link
    android.media.RemoteControlClient#setPlaybackState setPlaybackState()} to indicate the
    new playback state, position, and speed.</p>
  </dd>
</dl>

<p>With these interfaces defined, you can set them for your {@link
android.media.RemoteControlClient} by calling {@link
android.media.RemoteControlClient#setOnGetPlaybackPositionListener
setOnGetPlaybackPositionListener()} and
{@link android.media.RemoteControlClient#setPlaybackPositionUpdateListener
setPlaybackPositionUpdateListener()}, respectively.</p>



<h2 id="Graphics">Graphics</h2>

<h3 id="OpenGL">Support for OpenGL ES 3.0</h3>

<p>Android 4.3 adds Java interfaces and native support for OpenGL ES 3.0. Key new functionality
provided in OpenGL ES 3.0 includes:</p>
<ul>
  <li>Acceleration of advanced visual effects</li>
  <li>High quality ETC2/EAC texture compression as a standard feature</li>
  <li>A new version of the GLSL ES shading language with integer and 32-bit floating point support</li>
  <li>Advanced texture rendering</li>
  <li>Broader standardization of texture size and render-buffer formats</li>
</ul>

<p>The Java interface for OpenGL ES 3.0 on Android is provided with {@link android.opengl.GLES30}.
When using OpenGL ES 3.0, be sure that you declare it in your manifest file with the
<a href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code &lt;uses-feature&gt;}</a>
tag and the {@code android:glEsVersion} attribute. For example:</p>
<pre>
&lt;manifest>
    &lt;uses-feature android:glEsVersion="0x00030000" />
    ...
&lt;/manifest>
</pre>

<p>Also remember to specify the OpenGL ES context by calling {@link
android.opengl.GLSurfaceView#setEGLContextClientVersion setEGLContextClientVersion()}, passing
{@code 3} as the version.</p>


<h3 id="MipMap">Mipmapping for drawables</h3>

<p>Using a mipmap as the source for your bitmap or drawable is a simple way to provide a
quality image at various image scales, which can be particularly useful if you expect your
image to be scaled during an animation.</p>

<p>Android 4.2 (API level 17) added support for mipmaps in the {@link android.graphics.Bitmap}
class: Android swaps the mip images in your {@link android.graphics.Bitmap} when you've
supplied a mipmap source and have enabled mipmaps with {@link android.graphics.Bitmap#setHasMipMap
setHasMipMap()}. Now in Android 4.3, you can enable mipmaps for a {@link
android.graphics.drawable.BitmapDrawable} object as well, by providing a mipmap asset and
setting the {@code android:mipMap} attribute in a bitmap resource file or by calling {@link
android.graphics.drawable.BitmapDrawable#setMipMap setMipMap()}.</p>



<h2 id="UI">User Interface</h2>

<h3 id="ViewOverlay">View overlays</h3>

<p>The new {@link android.view.ViewOverlay} class provides a transparent layer on top of
a {@link android.view.View} on which you can add visual content and which does not affect
the layout hierarchy. You can get a {@link android.view.ViewOverlay} for any {@link
android.view.View} by calling {@link android.view.View#getOverlay}. The overlay
always has the same size and position as its host view (the view from which it was created),
allowing you to add content that appears in front of the host view, but which cannot extend
the bounds of that host view.
</p>

<p>Using a {@link android.view.ViewOverlay} is particularly useful when you want to create
animations such as sliding a view outside of its container or moving items around the screen
without affecting the view hierarchy. However, because the usable area of an overlay is
restricted to the same area as its host view, if you want to animate a view moving outside
its position in the layout, you must use an overlay from a parent view that has the desired
layout bounds.</p>

<p>When you create an overlay for a widget view such as a {@link android.widget.Button}, you
can add {@link android.graphics.drawable.Drawable} objects to the overlay by calling
{@link android.view.ViewOverlay#add(Drawable)}. If you call {@link
android.view.ViewGroup#getOverlay} for a layout view, such as {@link android.widget.RelativeLayout},
the object returned is a {@link android.view.ViewGroupOverlay}. The
{@link android.view.ViewGroupOverlay} class is a subclass
of {@link android.view.ViewOverlay} that also allows you to add {@link android.view.View}
objects by calling {@link android.view.ViewGroupOverlay#add(View)}.
</p>

<p class="note"><strong>Note:</strong> All drawables and views that you add to an overlay
are visual only.
They cannot receive focus or input events.</p>

<p>For example, the following code animates a view sliding to the right by placing the view
in the parent view's overlay, then performing a translation animation on that view:</p>
<pre>
View view = findViewById(R.id.view_to_remove);
ViewGroup container = (ViewGroup) view.getParent();
container.getOverlay().add(view);
ObjectAnimator anim = ObjectAnimator.ofFloat(view, "translationX", container.getRight());
anim.start();
</pre>


<h3 id="OpticalBounds">Optical bounds layout</h3>

<p>For views that contain nine-patch background images, you can now specify that they should
be aligned with neighboring views based on the "optical" bounds of the background image rather
than the "clip" bounds of the view.</p>

<p>For example, figures 1 and 2 each show the same layout, but the version in figure 1 is
using clip bounds (the default behavior), while figure 2 is using optical bounds. Because the
nine-patch images used for the button and the photo frame include padding around the edges,
they don’t appear to align with each other or the text when using clip bounds.</p>

<p class="note"><strong>Note:</strong> The screenshots in figures 1 and 2 have the "Show
layout bounds" developer setting enabled.
For each view, red lines indicate the optical
bounds, blue lines indicate the clip bounds, and pink indicates margins.</p>

<script type="text/javascript">
function toggleOpticalImages(mouseover) {

  $("img.optical-img").each(function() {
    $img = $(this);
    var index = $img.attr('src').lastIndexOf("/") + 1;
    var path = $img.attr('src').substr(0,index);
    var name = $img.attr('src').substr(index);
    var splitname;
    var highres = false;
    if (name.indexOf("@2x") != -1) {
      splitname = name.split("@2x.");
      highres = true;
    } else {
      splitname = name.split(".");
    }

    var newname;
    if (mouseover) {
      if (highres) {
        newname = splitname[0] + "-normal@2x.png";
      } else {
        newname = splitname[0] + "-normal.png";
      }
    } else {
      if (highres) {
        newname = splitname[0].split("-normal")[0] + "@2x.png";
      } else {
        newname = splitname[0].split("-normal")[0] + ".png";
      }
    }

    $img.attr('src', path + newname);

  });
}
</script>

<p class="table-caption"><em>Mouse over to hide the layout bounds.</em></p>
<div style="float:left;width:296px">
<img src="{@docRoot}images/tools/clipbounds@2x.png" width="296" alt="" class="optical-img"
  onmouseover="toggleOpticalImages(true)" onmouseout="toggleOpticalImages(false)" />
<p class="img-caption"><strong>Figure 1.</strong> Layout using clip bounds (default).</p>
</div>
<div style="float:left;width:296px;margin-left:60px">
<img src="{@docRoot}images/tools/opticalbounds@2x.png" width="296" alt="" class="optical-img"
  onmouseover="toggleOpticalImages(true)" onmouseout="toggleOpticalImages(false)" />
<p class="img-caption"><strong>Figure 2.</strong> Layout using optical bounds.</p>
</div>


<p style="clear:left">To align the views based on their optical bounds, set the {@code
android:layoutMode} attribute to {@code "opticalBounds"} in one of the parent layouts.
For example:</p>

<pre>
&lt;LinearLayout android:layoutMode="opticalBounds" ... &gt;
</pre>


<div class="figure" style="width:155px">
<img src="{@docRoot}images/tools/ninepatch_opticalbounds@2x.png" width="121" alt="" />
<p class="img-caption"><strong>Figure 3.</strong> Zoomed view of the Holo button nine-patch with
optical bounds.
</p>
</div>

<p>For this to work, the nine-patch images applied to the background of your views must specify
the optical bounds using red lines along the bottom and right side of the nine-patch file (as
shown in figure 3). The red lines indicate the region that should be subtracted from
the clip bounds, leaving the optical bounds of the image.</p>

<p>When you enable optical bounds for a {@link android.view.ViewGroup} in your layout, all
descendant views inherit the optical bounds layout mode unless you override it for a group by
setting {@code android:layoutMode} to {@code "clipBounds"}. All layout elements also honor the
optical bounds of their child views, adapting their own bounds based on the optical bounds of
the views within them. However, layout elements (subclasses of {@link android.view.ViewGroup})
currently do not support optical bounds for nine-patch images applied to their own background.</p>

<p>If you create a custom view by subclassing {@link android.view.View}, {@link
android.view.ViewGroup}, or any subclasses thereof, your view inherits these optical
bounds behaviors.</p>

<p class="note"><strong>Note:</strong> All widgets supported by the Holo theme have been updated
with optical bounds, including {@link android.widget.Button}, {@link android.widget.Spinner},
{@link android.widget.EditText}, and others.
So you can immediately benefit by setting the
{@code android:layoutMode} attribute to {@code "opticalBounds"} if your app applies a Holo theme
({@link android.R.style#Theme_Holo Theme.Holo}, {@link android.R.style#Theme_Holo_Light
Theme.Holo.Light}, etc.).
</p>

<p>To specify optical bounds for your own nine-patch images with the <a
href="{@docRoot}tools/help/draw9patch.html">Draw 9-patch</a> tool, hold CTRL when clicking on
the border pixels.</p>




<h3 id="AnimationRect">Animation for Rect values</h3>

<p>You can now animate between two {@link android.graphics.Rect} values with the new {@link
android.animation.RectEvaluator}. This new class is an implementation of {@link
android.animation.TypeEvaluator} that you can pass to {@link
android.animation.ValueAnimator#setEvaluator ValueAnimator.setEvaluator()}.
</p>

<h3 id="AttachFocus">Window attach and focus listener</h3>

<p>Previously, to be notified when your view was attached to or detached from its window, or
when its window focus changed, you needed to subclass the {@link android.view.View} class and
override {@link android.view.View#onAttachedToWindow onAttachedToWindow()} and {@link
android.view.View#onDetachedFromWindow onDetachedFromWindow()}, or {@link
android.view.View#onWindowFocusChanged onWindowFocusChanged()}, respectively.
</p>

<p>Now, to receive attach and detach events, you can instead implement {@link
android.view.ViewTreeObserver.OnWindowAttachListener} and set it on a view with
{@link android.view.ViewTreeObserver#addOnWindowAttachListener addOnWindowAttachListener()}.
And to receive focus events, you can implement {@link
android.view.ViewTreeObserver.OnWindowFocusChangeListener} and set it on a view with
{@link android.view.ViewTreeObserver#addOnWindowFocusChangeListener
addOnWindowFocusChangeListener()}.
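</p>

<p>For example, you might register both listeners like this (a minimal sketch; the view ID
is hypothetical):</p>
<pre>
View view = findViewById(R.id.my_view); // hypothetical view ID
view.getViewTreeObserver().addOnWindowAttachListener(
        new ViewTreeObserver.OnWindowAttachListener() {
    public void onWindowAttached() {
        // The view's window is now attached; safe to begin window-dependent work
    }
    public void onWindowDetached() {
        // The window is gone; release any window-dependent resources
    }
});
view.getViewTreeObserver().addOnWindowFocusChangeListener(
        new ViewTreeObserver.OnWindowFocusChangeListener() {
    public void onWindowFocusChanged(boolean hasFocus) {
        // React to the window gaining or losing focus
    }
});
</pre>

<p>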
</p>


<h3 id="Overscan">TV overscan support</h3>

<p>To be sure your app fills the entire screen on every television, you can now enable overscan
for your app layout. Overscan mode is determined by the {@link
android.view.WindowManager.LayoutParams#FLAG_LAYOUT_IN_OVERSCAN} flag, which you can enable
with platform themes such as
{@link android.R.style#Theme_DeviceDefault_NoActionBar_Overscan} or by enabling the
{@link android.R.attr#windowOverscan} style in a custom theme.</p>


<h3 id="Orientation">Screen orientation</h3>

<p>The <a
href="{@docRoot}guide/topics/manifest/activity-element.html">{@code &lt;activity&gt;}</a>
tag's <a
href="{@docRoot}guide/topics/manifest/activity-element.html#screen">{@code screenOrientation}</a>
attribute now supports additional values to honor the user's preference for auto-rotation:</p>
<dl>
<dt>{@code "userLandscape"}</dt>
<dd>Behaves the same as {@code "sensorLandscape"}, except that if the user disables auto-rotate,
it locks in the normal landscape orientation and does not flip.
</dd>

<dt>{@code "userPortrait"}</dt>
<dd>Behaves the same as {@code "sensorPortrait"}, except that if the user disables auto-rotate,
it locks in the normal portrait orientation and does not flip.
</dd>

<dt>{@code "fullUser"}</dt>
<dd>Behaves the same as {@code "fullSensor"} and allows rotation in all four directions, except
that if the user disables auto-rotate, it locks in the user's preferred orientation.
</dd></dl>

<p>Additionally, you can now declare {@code "locked"} to lock your app's orientation to
the screen's current orientation.</p>


<h3 id="RotationAnimation">Rotation animations</h3>

<p>The new {@link android.view.WindowManager.LayoutParams#rotationAnimation} field in
{@link android.view.WindowManager} allows you to select one of three animations to
use when the system switches screen orientations.
The three animations are:</p>
<ul>
  <li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_CROSSFADE}</li>
  <li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_JUMPCUT}</li>
  <li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_ROTATE}</li>
</ul>

<p class="note"><strong>Note:</strong> These animations are available only if you've set your
activity to use "fullscreen" mode, which you can enable with themes such as {@link
android.R.style#Theme_Holo_NoActionBar_Fullscreen Theme.Holo.NoActionBar.Fullscreen}.</p>

<p>For example, here's how you can enable the "crossfade" animation:</p>
<pre>
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    WindowManager.LayoutParams params = getWindow().getAttributes();
    params.rotationAnimation = WindowManager.LayoutParams.ROTATION_ANIMATION_CROSSFADE;
    getWindow().setAttributes(params);
    ...
}
</pre>


<h2 id="UserInput">User Input</h2>

<h3 id="Sensors">New sensor types</h3>
<p>The new {@link android.hardware.Sensor#TYPE_GAME_ROTATION_VECTOR} sensor allows you to detect
the device's rotations without worrying about magnetic interference. Unlike the {@link
android.hardware.Sensor#TYPE_ROTATION_VECTOR} sensor, the {@link
android.hardware.Sensor#TYPE_GAME_ROTATION_VECTOR} is not based on magnetic north.</p>

<p>The new {@link android.hardware.Sensor#TYPE_GYROSCOPE_UNCALIBRATED} and {@link
android.hardware.Sensor#TYPE_MAGNETIC_FIELD_UNCALIBRATED} sensors provide raw sensor data without
consideration for bias estimations. That is, the existing {@link
android.hardware.Sensor#TYPE_GYROSCOPE} and {@link android.hardware.Sensor#TYPE_MAGNETIC_FIELD}
sensors provide sensor data that takes into account estimated bias from gyro-drift and hard iron
in the device, respectively.
The new "uncalibrated" versions of these sensors instead provide
the raw sensor data and offer the estimated bias values separately. These sensors allow you to
provide your own custom calibration for the sensor data by enhancing the estimated bias with
external data.</p>



<h2 id="NotificationListener">Notification Listener</h2>

<p>Android 4.3 adds a new service class, {@link
android.service.notification.NotificationListenerService}, that allows your app to receive
information about new notifications as they are posted by the system.</p>

<p>If your app currently uses the accessibility service APIs to access system notifications,
you should update your app to use these APIs instead.</p>




<h2 id="Contacts">Contacts Provider</h2>

<h3 id="Contactables">Query for "contactables"</h3>

<p>The new Contacts Provider query, {@link
android.provider.ContactsContract.CommonDataKinds.Contactables#CONTENT_URI
Contactables.CONTENT_URI}, provides an efficient way to get one {@link
android.database.Cursor} that contains all email addresses and phone numbers belonging to all
contacts matching the specified query.</p>


<h3 id="ContactsDelta">Query for contacts deltas</h3>

<p>New APIs have been added to Contacts Provider that allow you to efficiently query recent
changes to the contacts data.
Previously, your app could be notified when something in the
contacts data changed, but you would not know exactly what changed and would need to retrieve
all contacts, then iterate through them to discover the change.</p>

<p>To track changes to inserts and updates, you can now include the {@link
android.provider.ContactsContract.ContactsColumns#CONTACT_LAST_UPDATED_TIMESTAMP} parameter with
your selection to query only the contacts that have changed since the last time you queried the
provider.</p>

<p>To track which contacts have been deleted, the new table {@link
android.provider.ContactsContract.DeletedContacts} provides a log of contacts that have been
deleted (but each contact deleted is held in this table for a limited time). Similar to {@link
android.provider.ContactsContract.ContactsColumns#CONTACT_LAST_UPDATED_TIMESTAMP}, you can use
the new selection parameter, {@link
android.provider.ContactsContract.DeletedContacts#CONTACT_DELETED_TIMESTAMP}, to check which
contacts have been deleted since the last time you queried the provider. The table also contains
the constant {@link
android.provider.ContactsContract.DeletedContacts#DAYS_KEPT_MILLISECONDS} containing the number
of days (in milliseconds) that the log will be kept.</p>

<p>Additionally, the Contacts Provider now broadcasts the {@link
android.provider.ContactsContract.Intents#CONTACTS_DATABASE_CREATED} action when the user
clears the contacts storage through the system settings menu, effectively recreating the
Contacts Provider database.
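</p>

<p>For example, a pair of delta queries might look like the following sketch (the {@code
lastSyncTime} variable is a hypothetical timestamp your app saved after its previous query):</p>
<pre>
String[] selectionArgs = new String[] { String.valueOf(lastSyncTime) };

// Contacts inserted or updated since the last query
Cursor updated = getContentResolver().query(
        ContactsContract.Contacts.CONTENT_URI, null,
        ContactsContract.Contacts.CONTACT_LAST_UPDATED_TIMESTAMP + " &gt; ?",
        selectionArgs, null);

// Contacts deleted since the last query
Cursor deleted = getContentResolver().query(
        ContactsContract.DeletedContacts.CONTENT_URI, null,
        ContactsContract.DeletedContacts.CONTACT_DELETED_TIMESTAMP + " &gt; ?",
        selectionArgs, null);
</pre>

<p>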
This broadcast is intended to signal apps that they need to drop all the contact
information they’ve stored and reload it with a new query.</p>

<p>For sample code using these APIs to check for changes to the contacts, look in the ApiDemos
sample available in the <a href="{@docRoot}tools/samples/index.html">SDK Samples</a> download.</p>


<h2 id="Localization">Localization</h2>

<h3 id="BiDi">Improved support for bi-directional text</h3>

<p>Previous versions of Android support right-to-left (RTL) languages and layout,
but sometimes don't properly handle mixed-direction text. So Android 4.3 adds the {@link
android.text.BidiFormatter} APIs that help you properly format text with opposite-direction
content without garbling any parts of it.</p>

<p>For example, when you want to create a sentence with a string variable, such as "Did you mean
15 Bay Street, Laurel, CA?", you normally pass a localized string resource and the variable to
{@link java.lang.String#format String.format()}:</p>
<pre>
Resources res = getResources();
String suggestion = String.format(res.getString(R.string.did_you_mean), address);
</pre>

<p>However, if the locale is Hebrew, then the formatted string comes out like this:</p>

<p dir="rtl">האם התכוונת ל 15 Bay Street, Laurel, CA?</p>

<p>That's wrong because the "15" should be left of "Bay Street." The solution is to use {@link
android.text.BidiFormatter} and its {@link android.text.BidiFormatter#unicodeWrap(String)
unicodeWrap()} method.
For example, the code above becomes:</p>
<pre>
Resources res = getResources();
BidiFormatter bidiFormatter = BidiFormatter.getInstance();
String suggestion = String.format(res.getString(R.string.did_you_mean),
        bidiFormatter.unicodeWrap(address));
</pre>

<p>
By default, {@link android.text.BidiFormatter#unicodeWrap(String) unicodeWrap()} uses the
first-strong directionality estimation heuristic, which can get things wrong if the first
signal for text direction does not represent the appropriate direction for the content as a whole.
If necessary, you can specify a different heuristic by passing one of the {@link
android.text.TextDirectionHeuristic} constants from {@link android.text.TextDirectionHeuristics}
to {@link android.text.BidiFormatter#unicodeWrap(String,TextDirectionHeuristic) unicodeWrap()}.</p>

<p class="note"><strong>Note:</strong> These new APIs are also available for previous versions
of Android through the Android <a href="{@docRoot}tools/extras/support-library.html">Support
Library</a>, with the {@link android.support.v4.text.BidiFormatter} class and related APIs.</p>



<h2 id="A11yService">Accessibility Services</h2>

<h3 id="A11yKeyEvents">Handle key events</h3>

<p>An {@link android.accessibilityservice.AccessibilityService} can now receive a callback for
key input events with the {@link android.accessibilityservice.AccessibilityService#onKeyEvent
onKeyEvent()} callback method.
This allows your accessibility service to handle input for
key-based input devices such as a keyboard and translate those events to special actions that
previously may have been possible only with touch input or the device's directional pad.</p>


<h3 id="A11yText">Select text and copy/paste</h3>

<p>The {@link android.view.accessibility.AccessibilityNodeInfo} class now provides APIs that allow
an {@link android.accessibilityservice.AccessibilityService} to select, cut, copy, and paste
text in a node.</p>

<p>To specify the selection of text to cut or copy, your accessibility service can use the new
action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_SET_SELECTION}, passing
with it the selection start and end position with {@link
android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_START_INT} and {@link
android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_END_INT}.
Alternatively, you can select text by manipulating the cursor position using the existing
action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_NEXT_AT_MOVEMENT_GRANULARITY}
(previously only for moving the cursor position), and adding the argument {@link
android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_EXTEND_SELECTION_BOOLEAN}.</p>

<p>You can then cut or copy with {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_CUT}
or {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_COPY}, then later paste with
{@link android.view.accessibility.AccessibilityNodeInfo#ACTION_PASTE}.</p>


<p class="note"><strong>Note:</strong> These new APIs are also available for previous versions
of Android through the Android <a href="{@docRoot}tools/extras/support-library.html">Support
Library</a>, with the {@link android.support.v4.view.accessibility.AccessibilityNodeInfoCompat}
class.</p>



<h3
id="A11yFeatures">Declare accessibility features</h3>

<p>Beginning with Android 4.3, an accessibility service must declare accessibility capabilities
in its metadata file in order to use certain accessibility features. If the capability is not
requested in the metadata file, then the feature will be a no-op. To declare your service's
accessibility capabilities, you must use XML attributes that correspond to the various
"capability" constants in the {@link android.accessibilityservice.AccessibilityServiceInfo}
class.</p>

<p>For example, if a service does not request the {@link
android.R.styleable#AccessibilityService_canRequestFilterKeyEvents flagRequestFilterKeyEvents}
capability, then it will not receive key events.</p>


<h2 id="Testing">Testing and Debugging</h2>

<h3 id="UiAutomation">Automated UI testing</h3>

<p>The new {@link android.app.UiAutomation} class provides APIs that allow you to simulate user
actions for test automation. By using the platform's {@link
android.accessibilityservice.AccessibilityService} APIs, the {@link android.app.UiAutomation}
APIs allow you to inspect the screen content and inject arbitrary keyboard and touch events.</p>

<p>To get an instance of {@link android.app.UiAutomation}, call {@link
android.app.Instrumentation#getUiAutomation Instrumentation.getUiAutomation()}.
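</p>

<p>For example, a test method might acquire the instance and inject a Home key press (a minimal
sketch):</p>
<pre>
public void testPressHome() {
    UiAutomation uiAutomation = getInstrumentation().getUiAutomation();
    // Inject a key press and release of the Home button
    uiAutomation.injectInputEvent(
            new KeyEvent(KeyEvent.ACTION_DOWN, KeyEvent.KEYCODE_HOME), true);
    uiAutomation.injectInputEvent(
            new KeyEvent(KeyEvent.ACTION_UP, KeyEvent.KEYCODE_HOME), true);
}
</pre>

<p>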
In order
for this to work, you must supply the {@code -w} option with the {@code instrument} command
when running your {@link android.test.InstrumentationTestCase} from <a
href="{@docRoot}tools/help/adb.html#am">{@code adb shell}</a>.</p>

<p>With the {@link android.app.UiAutomation} instance, you can execute arbitrary events to test
your app by calling {@link android.app.UiAutomation#executeAndWaitForEvent
executeAndWaitForEvent()}, passing it a {@link java.lang.Runnable} to perform, a timeout
period for the operation, and an implementation of the {@link
android.app.UiAutomation.AccessibilityEventFilter} interface. It's within your {@link
android.app.UiAutomation.AccessibilityEventFilter} implementation that you receive a call
that allows you to filter the events that you're interested in and determine the success or
failure of a given test case.</p>

<p>To observe all the events during a test, create an implementation of {@link
android.app.UiAutomation.OnAccessibilityEventListener} and pass it to {@link
android.app.UiAutomation#setOnAccessibilityEventListener setOnAccessibilityEventListener()}.
Your listener interface then receives a call to {@link
android.app.UiAutomation.OnAccessibilityEventListener#onAccessibilityEvent onAccessibilityEvent()}
each time an event occurs, receiving an {@link android.view.accessibility.AccessibilityEvent}
object that describes the event.</p>

<p>There are a variety of other operations that the {@link android.app.UiAutomation} APIs expose
at a very low level to encourage the development of UI test tools such as <a
href="{@docRoot}tools/help/uiautomator/index.html">uiautomator</a>.
For instance,
{@link android.app.UiAutomation} can also:</p>
<ul>
  <li>Inject input events</li>
  <li>Change the orientation of the screen</li>
  <li>Take screenshots</li>
</ul>

<p>And most importantly for UI test tools, the {@link android.app.UiAutomation} APIs work
across application boundaries, unlike those in {@link android.app.Instrumentation}.</p>


<h3 id="Systrace">Systrace events for apps</h3>

<p>Android 4.3 adds the {@link android.os.Trace} class with two static methods,
{@link android.os.Trace#beginSection beginSection()} and {@link android.os.Trace#endSection()},
which allow you to define blocks of code to include with the systrace report. By creating
sections of traceable code in your app, the systrace logs provide you a much more detailed
analysis of where slowdown occurs within your app.</p>

<p>For information about using the Systrace tool, read <a
href="{@docRoot}tools/debugging/systrace.html">Analyzing Display and Performance with
Systrace</a>.</p>


<h2 id="Security">Security</h2>

<h3 id="KeyStore">Android key store for app-private keys</h3>

<p>Android now offers a custom Java Security Provider in the {@link java.security.KeyStore}
facility, called Android Key Store, which allows you to generate and save private keys that
may be seen and used by only your app. To load the Android Key Store, pass
{@code "AndroidKeyStore"} to {@link java.security.KeyStore#getInstance(String)
KeyStore.getInstance()}.</p>

<p>To manage your app's private credentials in the Android Key Store, generate a new key with
{@link java.security.KeyPairGenerator} and {@link android.security.KeyPairGeneratorSpec}. First
get an instance of {@link java.security.KeyPairGenerator} by calling {@link
java.security.KeyPairGenerator#getInstance getInstance()}.
Then call
{@link java.security.KeyPairGenerator#initialize initialize()}, passing it an instance of
{@link android.security.KeyPairGeneratorSpec}, which you can get using
{@link android.security.KeyPairGeneratorSpec.Builder KeyPairGeneratorSpec.Builder}.
Finally, get your {@link java.security.KeyPair} by calling {@link
java.security.KeyPairGenerator#generateKeyPair generateKeyPair()}.</p>


<h3 id="HardwareKeyChain">Hardware credential storage</h3>

<p>Android also now supports hardware-backed storage for your {@link android.security.KeyChain}
credentials, providing more security by making the keys unavailable for extraction. That is, once
keys are in a hardware-backed key store (Secure Element, TPM, or TrustZone), they can be used for
cryptographic operations but the private key material cannot be exported. Even the OS kernel
cannot access this key material. While not all Android-powered devices support storage on
hardware, you can check at runtime if hardware-backed storage is available by calling
{@link android.security.KeyChain#isBoundKeyAlgorithm KeyChain.isBoundKeyAlgorithm()}.</p>



<h2 id="Manifest">Manifest Declarations</h2>

<h3 id="ManifestFeatures">Declarable required features</h3>

<p>The following values are now supported in the <a
href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code &lt;uses-feature&gt;}</a>
element so you can ensure that your app is installed only on devices that provide the features
your app needs.</p>

<dl>
<dt>{@link android.content.pm.PackageManager#FEATURE_APP_WIDGETS}</dt>
<dd>Declares that your app provides an app widget and should be installed only on devices that
include a Home screen or similar location where users can embed app widgets.
Example:
<pre>
&lt;uses-feature android:name="android.software.app_widgets" android:required="true" /&gt;
</pre>
</dd>

<dt>{@link android.content.pm.PackageManager#FEATURE_HOME_SCREEN}</dt>
<dd>Declares that your app behaves as a Home screen replacement and should be installed only on
devices that support third-party Home screen apps.
Example:
<pre>
&lt;uses-feature android:name="android.software.home_screen" android:required="true" /&gt;
</pre>
</dd>

<dt>{@link android.content.pm.PackageManager#FEATURE_INPUT_METHODS}</dt>
<dd>Declares that your app provides a custom input method (a keyboard built with {@link
android.inputmethodservice.InputMethodService}) and should be installed only on devices that
support third-party input methods.
Example:
<pre>
&lt;uses-feature android:name="android.software.input_methods" android:required="true" /&gt;
</pre>
</dd>

<dt>{@link android.content.pm.PackageManager#FEATURE_BLUETOOTH_LE}</dt>
<dd>Declares that your app uses Bluetooth Low Energy APIs and should be installed only on devices
that are capable of communicating with other devices via Bluetooth Low Energy.
Example:
<pre>
&lt;uses-feature android:name="android.hardware.bluetooth_le" android:required="true" /&gt;
</pre>
</dd>
</dl>


<h3 id="ManifestPermissions">User permissions</h3>
<p>The following values are now supported in the <a
href="{@docRoot}guide/topics/manifest/uses-permission-element.html">{@code
&lt;uses-permission&gt;}</a> element to declare the
permissions your app requires in order to access certain APIs.</p>

<dl>
<dt>{@link android.Manifest.permission#BIND_NOTIFICATION_LISTENER_SERVICE}
</dt>
<dd>Required to use the new {@link android.service.notification.NotificationListenerService}
APIs.
</dd>

<dt>{@link android.Manifest.permission#SEND_RESPOND_VIA_MESSAGE}</dt>
<dd>Required to receive the {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE}
intent.</dd>
</dl>




<p class="note">For a detailed view of all API changes in Android 4.3, see the
<a href="{@docRoot}sdk/api_diff/18/changes.html">API Differences Report</a>.</p>