android-4.3.jd revision 1c62a0fc32dea7cba9a922a748fc1b1acd5ddb47
page.title=Android 4.3 APIs
excludeFromSuggestions=true
sdk.platform.version=4.3
sdk.platform.apiLevel=18
@jd:body


<div id="qv-wrapper">
<div id="qv">

<h2>In this document
    <a href="#" onclick="hideNestedItems('#toc43',this);return false;" class="header-toggle">
        <span class="more">show more</span>
        <span class="less" style="display:none">show less</span></a></h2>

<ol id="toc43" class="hide-nested">
  <li><a href="#ApiLevel">Update your target API level</a></li>
  <li><a href="#Behaviors">Important Behavior Changes</a>
    <ol>
      <li><a href="#BehaviorsIntents">If your app uses implicit intents...</a></li>
      <li><a href="#BehaviorsAccounts">If your app depends on accounts...</a></li>
    </ol>
  </li>
  <li><a href="#RestrictedProfiles">Restricted Profiles</a>
    <ol>
      <li><a href="#AccountsInProfile">Supporting accounts in a restricted profile</a></li>
    </ol>
  </li>
  <li><a href="#Wireless">Wireless and Connectivity</a>
    <ol>
      <li><a href="#BTLE">Bluetooth Low Energy (Smart Ready)</a></li>
      <li><a href="#WiFiScan">Wi-Fi scan-only mode</a></li>
      <li><a href="#WiFiConfig">Wi-Fi configuration</a></li>
      <li><a href="#QuickResponse">Quick response for incoming calls</a></li>
    </ol>
  </li>
  <li><a href="#Multimedia">Multimedia</a>
    <ol>
      <li><a href="#DASH">MPEG DASH support</a></li>
      <li><a href="#DRM">Media DRM</a></li>
      <li><a href="#EncodingSurface">Video encoding from a Surface</a></li>
      <li><a href="#MediaMuxing">Media muxing</a></li>
      <li><a href="#ProgressAndScrubbing">Playback progress and scrubbing for RemoteControlClient</a></li>
    </ol>
  </li>
  <li><a href="#Graphics">Graphics</a>
    <ol>
      <li><a href="#OpenGL">Support for OpenGL ES 3.0</a></li>
      <li><a href="#MipMap">Mipmapping for drawables</a></li>
    </ol>
  </li>
  <li><a href="#UI">User Interface</a>
    <ol>
      <li><a href="#ViewOverlay">View overlays</a></li>
      <li><a href="#OpticalBounds">Optical bounds layout</a></li>
      <li><a href="#AnimationRect">Animation for Rect values</a></li>
      <li><a href="#AttachFocus">Window attach and focus listener</a></li>
      <li><a href="#Overscan">TV overscan support</a></li>
      <li><a href="#Orientation">Screen orientation</a></li>
      <li><a href="#RotationAnimation">Rotation animations</a></li>
    </ol>
  </li>
  <li><a href="#UserInput">User Input</a>
    <ol>
      <li><a href="#SignificantMotion">Detect significant motion</a></li>
      <li><a href="#Sensors">New sensor types</a></li>
    </ol>
  </li>
  <li><a href="#NotificationListener">Notification Listener</a></li>
  <li><a href="#Contacts">Contacts Provider</a>
    <ol>
      <li><a href="#Contactables">Query for "contactables"</a></li>
      <li><a href="#ContactsDelta">Query for contacts deltas</a></li>
    </ol>
  </li>
  <li><a href="#Localization">Localization</a>
    <ol>
      <li><a href="#BiDi">Improved support for bi-directional text</a></li>
    </ol>
  </li>
  <li><a href="#A11yService">Accessibility Services</a>
    <ol>
      <li><a href="#A11yKeyEvents">Handle key events</a></li>
      <li><a href="#A11yText">Select text and copy/paste</a></li>
      <li><a href="#A11yFeatures">Declare accessibility features</a></li>
    </ol>
  </li>
  <li><a href="#Testing">Testing and Debugging</a>
    <ol>
      <li><a href="#UiAutomation">Automated UI testing</a></li>
      <li><a href="#Systrace">Systrace events for apps</a></li>
    </ol>
  </li>
  <li><a href="#Security">Security</a>
    <ol>
      <li><a href="#KeyStore">Android key store for app-private keys</a></li>
      <li><a href="#HardwareKeyChain">Hardware credential storage</a></li>
    </ol>
  </li>
  <li><a href="#Manifest">Manifest Declarations</a>
    <ol>
      <li><a href="#ManifestFeatures">Declarable required features</a></li>
      <li><a href="#ManifestPermissions">User permissions</a></li>
    </ol>
  </li>
</ol>

<h2>See also</h2>
<ol>
<li><a href="{@docRoot}sdk/api_diff/18/changes.html">API
Differences Report &raquo;</a></li>
<li><a
href="{@docRoot}tools/extras/support-library.html">Support Library</a></li>
</ol>

</div>
</div>


<p>API Level: {@sdkPlatformApiLevel}</p>

<p>Android {@sdkPlatformVersion} ({@link android.os.Build.VERSION_CODES#JELLY_BEAN_MR2})
is an update to the Jelly Bean release that offers new features for users and app
developers. This document provides an introduction to the most notable
new APIs.</p>

<p>As an app developer, you should download the Android {@sdkPlatformVersion} system image
and SDK platform from the <a href="{@docRoot}tools/help/sdk-manager.html">SDK Manager</a> as
soon as possible. If you don't have a device running Android {@sdkPlatformVersion} on which to
test your app, use the Android {@sdkPlatformVersion} system
image to test your app on the <a href="{@docRoot}tools/devices/emulator.html">Android emulator</a>.
Then build your apps against the Android {@sdkPlatformVersion} platform to begin using the
latest APIs.</p>


<h3 id="ApiLevel">Update your target API level</h3>

<p>To better optimize your app for devices running Android {@sdkPlatformVersion},
you should set your <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#target">{@code targetSdkVersion}</a> to
<code>"{@sdkPlatformApiLevel}"</code>, install it on an Android {@sdkPlatformVersion} system image,
test it, then publish an update with this change.</p>

<p>You can use APIs in Android {@sdkPlatformVersion} while also supporting older versions by adding
conditions to your code that check for the system API level before executing
APIs not supported by your <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#min">{@code minSdkVersion}</a>.
To learn more about maintaining backward compatibility, read <a
href="{@docRoot}training/basics/supporting-devices/platforms.html">Supporting Different
Platform Versions</a>.</p>

<p>Various APIs are also available in the Android <a
href="{@docRoot}tools/extras/support-library.html">Support Library</a> that allow you to implement
new features on older versions of the platform.</p>

<p>For more information about how API levels work, read <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#ApiLevels">What is API
Level?</a></p>




<h2 id="Behaviors">Important Behavior Changes</h2>

<p>If you have previously published an app for Android, be aware that your app might
be affected by changes in Android {@sdkPlatformVersion}.</p>


<h3 id="BehaviorsIntents">If your app uses implicit intents...</h3>

<p>Your app might misbehave in a restricted profile environment.</p>

<p>Users in a <a href="#RestrictedProfiles">restricted profile</a> environment might not
have all the standard Android apps available. For example, a restricted profile might have the
web browser and camera app disabled. So your app should not make assumptions about which apps are
available, because if you call {@link android.app.Activity#startActivity startActivity()} without
verifying whether an app is available to handle the {@link android.content.Intent},
your app might crash in a restricted profile.</p>

<p>When using an implicit intent, you should always verify that an app is available to handle
the intent by calling {@link android.content.Intent#resolveActivity resolveActivity()} or {@link
android.content.pm.PackageManager#queryIntentActivities queryIntentActivities()}. For example:</p>

<pre>
Intent intent = new Intent(Intent.ACTION_SEND);
...
if (intent.resolveActivity(getPackageManager()) != null) {
    startActivity(intent);
} else {
    Toast.makeText(context, R.string.app_not_available, Toast.LENGTH_LONG).show();
}
</pre>


<h3 id="BehaviorsAccounts">If your app depends on accounts...</h3>

<p>Your app might misbehave in a restricted profile environment.</p>

<p>Users within a restricted profile environment do not have access to user accounts by default.
If your app depends on an {@link android.accounts.Account}, then your app might crash or behave
unexpectedly when used in a restricted profile.</p>

<p>If you'd like to prevent restricted profiles from using your app entirely because your
app depends on account information that's sensitive, specify the <a
href="{@docRoot}guide/topics/manifest/application-element.html#requiredAccountType">{@code
android:requiredAccountType}</a> attribute in your manifest's <a
href="{@docRoot}guide/topics/manifest/application-element.html">{@code &lt;application>}</a>
element.</p>

<p>If you'd like to allow restricted profiles to continue using your app even though they can't
create their own accounts, then you can either disable your app features that require an account
or allow restricted profiles to access the accounts created by the primary user. For more
information, see the section
below about <a href="#AccountsInProfile">Supporting accounts in a restricted profile</a>.</p>




<h2 id="RestrictedProfiles">Restricted Profiles</h2>

<p>On Android tablets, users can now create restricted profiles based on the primary user.
When users create a restricted profile, they can enable restrictions such as which apps are
available to the profile. A new set of APIs in Android 4.3 also allows you to build fine-grained
restriction settings for the apps you develop. For example, by using the new APIs, you can
allow users to control what type of content is available within your app when running in a
restricted profile environment.</p>

<p>The UI for users to control the restrictions you've built is managed by the system's
Settings application. To make your app's restriction settings appear to the user,
you must declare the restrictions your app provides by creating a {@link
android.content.BroadcastReceiver} that receives the {@link
android.content.Intent#ACTION_GET_RESTRICTION_ENTRIES} intent. The system invokes this intent
to query all apps for available restrictions, then builds the UI to allow the primary user to
manage restrictions for each restricted profile.</p>

<p>In the {@link android.content.BroadcastReceiver#onReceive onReceive()} method of
your {@link android.content.BroadcastReceiver}, you must create a {@link
android.content.RestrictionEntry} for each restriction your app provides. Each {@link
android.content.RestrictionEntry} defines a restriction title, description, and one of the
following data types:</p>

<ul>
  <li>{@link android.content.RestrictionEntry#TYPE_BOOLEAN} for a restriction that is
  either true or false.</li>
  <li>{@link android.content.RestrictionEntry#TYPE_CHOICE} for a restriction that has
  multiple choices that are mutually exclusive (radio button choices).</li>
  <li>{@link android.content.RestrictionEntry#TYPE_MULTI_SELECT} for a restriction that
  has multiple choices that are <em>not</em> mutually exclusive (checkbox choices).</li>
</ul>

<p>You then put all the {@link android.content.RestrictionEntry} objects into an {@link
java.util.ArrayList} and put it into the broadcast receiver's result as the value for the
{@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} extra.</p>
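
<p>For example, a receiver that declares a single boolean restriction might look like the
following sketch. The restriction key {@code "downloads_enabled"} and the class name are
illustrative only; use your own keys and localized strings in a real app:</p>
<pre>
public class GetRestrictionsReceiver extends BroadcastReceiver {
    public void onReceive(Context context, Intent intent) {
        ArrayList&lt;RestrictionEntry> restrictions = new ArrayList&lt;RestrictionEntry>();

        // An app-defined boolean restriction, enabled by default
        RestrictionEntry entry = new RestrictionEntry("downloads_enabled", true);
        entry.setTitle("Allow downloads");
        restrictions.add(entry);

        // Return the entries as the broadcast result
        Bundle extras = getResultExtras(true);
        extras.putParcelableArrayList(Intent.EXTRA_RESTRICTIONS_LIST, restrictions);
        setResult(Activity.RESULT_OK, null, extras);
    }
}
</pre>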

<p>The system creates the UI for your app's restrictions in the Settings app and saves each
restriction with the unique key you provided for each {@link android.content.RestrictionEntry}
object. When the user opens your app, you can query for any current restrictions by
calling {@link android.os.UserManager#getApplicationRestrictions getApplicationRestrictions()}.
This returns a {@link android.os.Bundle} containing the key-value pairs for each restriction
you defined with the {@link android.content.RestrictionEntry} objects.</p>
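
<p>For example, assuming your receiver declared a boolean restriction with the app-defined
key {@code "downloads_enabled"}, you could read it back like this:</p>
<pre>
UserManager um = (UserManager) getSystemService(Context.USER_SERVICE);
Bundle restrictions = um.getApplicationRestrictions(getPackageName());
// Use the same key and default value you gave the RestrictionEntry
boolean downloadsEnabled = restrictions.getBoolean("downloads_enabled", true);
</pre>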

<p>If you want to provide more specific restrictions that can't be handled by boolean, single
choice, and multi-choice values, then you can create an activity where the user can specify the
restrictions and allow users to open that activity from the restriction settings. In your
broadcast receiver, include the {@link android.content.Intent#EXTRA_RESTRICTIONS_INTENT} extra
in the result {@link android.os.Bundle}. This extra must specify an {@link android.content.Intent}
indicating the {@link android.app.Activity} class to launch (use the
{@link android.os.Bundle#putParcelable putParcelable()} method to pass {@link
android.content.Intent#EXTRA_RESTRICTIONS_INTENT} with the intent).
When the primary user enters your activity to set custom restrictions, your
activity must then return a result containing the restriction values in an extra using either
the {@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} or {@link
android.content.Intent#EXTRA_RESTRICTIONS_BUNDLE} key, depending on whether you specify
{@link android.content.RestrictionEntry} objects or key-value pairs, respectively.</p>


<h3 id="AccountsInProfile">Supporting accounts in a restricted profile</h3>

<p>Any accounts added to the primary user are available to a restricted profile, but the
accounts are not accessible from the {@link android.accounts.AccountManager} APIs by default.
If you attempt to add an account with {@link android.accounts.AccountManager} while in a restricted
profile, you will get a failure result. Due to these restrictions, you have the following
three options:</p>

<ul>
<li><strong>Allow access to the owner's accounts from a restricted profile.</strong>
<p>To get access to an account from a restricted profile, you must add the <a
href="{@docRoot}guide/topics/manifest/application-element.html#restrictedAccountType">{@code
android:restrictedAccountType}</a> attribute to the <a
href="{@docRoot}guide/topics/manifest/application-element.html">{@code &lt;application>}</a> tag:</p>
<pre>
&lt;application ...
    android:restrictedAccountType="com.example.account.type" >
</pre>

<p class="caution"><strong>Caution:</strong> Enabling this attribute provides your
app access to the primary user's accounts from restricted profiles. So you should allow this
only if the information displayed by your app does not reveal personally identifiable
information (PII) that's considered sensitive.</p>
</li>

<li><strong>Disable certain functionality when unable to modify accounts.</strong>
<p>If you want to use accounts, but don't actually require them for your app's primary
functionality, you can check for account availability and disable features when not available.
You should first check if there is an existing account available. If not, then query whether
it's possible to create a new account by calling {@link
android.os.UserManager#getUserRestrictions()} and checking the {@link
android.os.UserManager#DISALLOW_MODIFY_ACCOUNTS} key in the result. If it is {@code true},
then you should disable whatever functionality of your app requires access to accounts.
For example:</p>
<pre>
UserManager um = (UserManager) context.getSystemService(Context.USER_SERVICE);
Bundle restrictions = um.getUserRestrictions();
if (restrictions.getBoolean(UserManager.DISALLOW_MODIFY_ACCOUNTS, false)) {
   // cannot add accounts, disable some functionality
}
</pre>
<p class="note"><strong>Note:</strong> In this scenario, you should <em>not</em> declare
any new attributes in your manifest file.</p>
</li>

<li><strong>Disable your app when unable to access private accounts.</strong>
<p>If it's instead important that your app not be available to restricted profiles because
your app depends on sensitive personal information in an account (and because restricted profiles
currently cannot add new accounts), add
the <a href="{@docRoot}guide/topics/manifest/application-element.html#requiredAccountType">{@code
android:requiredAccountType}</a> attribute to the <a
href="{@docRoot}guide/topics/manifest/application-element.html">{@code &lt;application>}</a> tag:</p>
<pre>
&lt;application ...
    android:requiredAccountType="com.example.account.type" >
</pre>
<p>For example, the Gmail app uses this attribute to disable itself for restricted profiles,
because the owner's personal email should not be available to restricted profiles.</p>
</li>
</ul>


<h2 id="Wireless">Wireless and Connectivity</h2>


<h3 id="BTLE">Bluetooth Low Energy (Smart Ready)</h3>

<p>Android now supports Bluetooth Low Energy (LE) with new APIs in {@link android.bluetooth}.
With the new APIs, you can build Android apps that communicate with Bluetooth Low Energy
peripherals such as heart rate monitors and pedometers.</p>

<p>Because Bluetooth LE is a hardware feature that is not available on all
Android-powered devices, you must declare in your manifest file a <a
href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code &lt;uses-feature>}</a>
element for {@code "android.hardware.bluetooth_le"}:</p>
<pre>
&lt;uses-feature android:name="android.hardware.bluetooth_le" android:required="true" />
</pre>
<p>If you're already familiar with Android's Classic Bluetooth APIs, notice that using the
Bluetooth LE APIs involves some differences. The most important difference is that there's
now a {@link android.bluetooth.BluetoothManager} class that you should use for some high-level
operations such as acquiring a {@link android.bluetooth.BluetoothAdapter}, getting a list of
connected devices, and checking the state of a device. For example, here's how you should now
get the {@link android.bluetooth.BluetoothAdapter}:</p>
<pre>
final BluetoothManager bluetoothManager =
        (BluetoothManager) getSystemService(Context.BLUETOOTH_SERVICE);
mBluetoothAdapter = bluetoothManager.getAdapter();
</pre>

<p>To discover Bluetooth LE peripherals, call {@link android.bluetooth.BluetoothAdapter#startLeScan
startLeScan()} on the {@link android.bluetooth.BluetoothAdapter}, passing it an implementation
of the {@link android.bluetooth.BluetoothAdapter.LeScanCallback} interface. When the Bluetooth
adapter detects a Bluetooth LE peripheral, your {@link
android.bluetooth.BluetoothAdapter.LeScanCallback} implementation receives a call to the
{@link android.bluetooth.BluetoothAdapter.LeScanCallback#onLeScan onLeScan()} method. This
method provides you with a {@link android.bluetooth.BluetoothDevice} object representing the
detected device, the RSSI value for the device, and a byte array containing the device's
advertisement record.</p>
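
<p>For example, a minimal scan might look like this:</p>
<pre>
// Implement the scan callback to receive each advertising LE peripheral
private BluetoothAdapter.LeScanCallback mLeScanCallback =
        new BluetoothAdapter.LeScanCallback() {
    public void onLeScan(BluetoothDevice device, int rssi, byte[] scanRecord) {
        // Inspect the device, its signal strength, and its advertisement record here
    }
};
...
mBluetoothAdapter.startLeScan(mLeScanCallback);
</pre>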
374
375<p>If you want to scan for only specific types of peripherals, you can instead call {@link 
376android.bluetooth.BluetoothAdapter#startLeScan startLeScan()} and include an array of {@link 
377java.util.UUID} objects that specify the GATT services your app supports.</p>
378
379<p class="note"><strong>Note:</strong> You can only scan for Bluetooth LE devices <em>or</em> 
380scan for Classic Bluetooth devices using previous APIs. You cannot scan for both LE and Classic 
381Bluetooth devices at once.</p>
382
383<p>To then connect to a Bluetooth LE peripheral, call {@link 
384android.bluetooth.BluetoothDevice#connectGatt connectGatt()} on the corresponding 
385{@link android.bluetooth.BluetoothDevice} object, passing it an implementation of 
386{@link android.bluetooth.BluetoothGattCallback}. Your implementation of {@link 
387android.bluetooth.BluetoothGattCallback} receives callbacks regarding the connectivity 
388state with the device and other events. It's during the {@link 
389android.bluetooth.BluetoothGattCallback#onConnectionStateChange onConnectionStateChange()} 
390callback that you can begin communicating with the device if the method passes {@link 
391android.bluetooth.BluetoothProfile#STATE_CONNECTED} as the new state.</p>
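
<p>For example, a connection sketch might look like this (here, discovering the device's GATT
services is shown as one way to begin communicating):</p>
<pre>
BluetoothGatt gatt = device.connectGatt(this, false,
        new BluetoothGattCallback() {
    public void onConnectionStateChange(BluetoothGatt gatt, int status,
            int newState) {
        if (newState == BluetoothProfile.STATE_CONNECTED) {
            // Connected; begin communicating with the peripheral
            gatt.discoverServices();
        }
    }
});
</pre>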

<p>Accessing Bluetooth features on a device also requires that your app request certain
Bluetooth user permissions. For more information, see the <a
href="{@docRoot}guide/topics/connectivity/bluetooth-le.html">Bluetooth Low Energy</a> API guide.</p>


<h3 id="WiFiScan">Wi-Fi scan-only mode</h3>

<p>When attempting to identify the user's location, Android may use Wi-Fi to help determine
the location by scanning nearby access points. However, users often keep Wi-Fi turned off to
conserve battery, resulting in location data that's less accurate. Android now includes a
scan-only mode that allows the device Wi-Fi to scan access points to help obtain the location
without connecting to an access point, thus greatly reducing battery usage.</p>

<p>If you want to acquire the user's location but Wi-Fi is currently off, you can request that
the user enable Wi-Fi scan-only mode by calling {@link android.content.Context#startActivity
startActivity()} with the action {@link
android.net.wifi.WifiManager#ACTION_REQUEST_SCAN_ALWAYS_AVAILABLE}.</p>
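
<p>For example:</p>
<pre>
startActivity(new Intent(WifiManager.ACTION_REQUEST_SCAN_ALWAYS_AVAILABLE));
</pre>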


<h3 id="WiFiConfig">Wi-Fi configuration</h3>

<p>New {@link android.net.wifi.WifiEnterpriseConfig} APIs allow enterprise-oriented services to
automate Wi-Fi configuration for managed devices.</p>


<h3 id="QuickResponse">Quick response for incoming calls</h3>

<p>Since Android 4.0, a feature called "Quick response" allows users to respond to incoming
calls with an immediate text message without needing to pick up the call or unlock the device.
Until now, these quick messages were always handled by the default Messaging app. Now any app
can declare its capability to handle these messages by creating a {@link android.app.Service}
with an intent filter for {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE}.</p>

<p>When the user responds to an incoming call with a quick response, the Phone app sends
the {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE} intent with a URI
describing the recipient (the caller) and the {@link android.content.Intent#EXTRA_TEXT} extra
with the message the user wants to send. When your service receives the intent, it should deliver
the message and immediately stop itself (your app should not show an activity).</p>

<p>In order to receive this intent, you must declare the {@link
android.Manifest.permission#SEND_RESPOND_VIA_MESSAGE} permission.</p>
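
<p>For example, such a service (the class name here is illustrative) might extend {@link
android.app.IntentService}, which stops itself automatically once the work is done:</p>
<pre>
public class RespondViaMessageService extends IntentService {
    public RespondViaMessageService() {
        super("RespondViaMessageService");
    }

    protected void onHandleIntent(Intent intent) {
        Uri recipient = intent.getData();  // URI describing the caller
        String message = intent.getStringExtra(Intent.EXTRA_TEXT);
        // Deliver the message here; the IntentService then stops itself
    }
}
</pre>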



<h2 id="Multimedia">Multimedia</h2>

<h3 id="DASH">MPEG DASH support</h3>

<p>Android now supports Dynamic Adaptive Streaming over HTTP (DASH) in accordance with the
ISO/IEC 23009-1 standard, using existing APIs in {@link android.media.MediaCodec} and {@link
android.media.MediaExtractor}. The framework underlying these APIs has been updated to support
parsing of fragmented MP4 files, but your app is still responsible for parsing the MPD metadata
and passing the individual streams to {@link android.media.MediaExtractor}.</p>

<p>If you want to use DASH with encrypted content, notice that the {@link
android.media.MediaExtractor#getSampleCryptoInfo getSampleCryptoInfo()} method returns the {@link
android.media.MediaCodec.CryptoInfo} metadata describing the structure of each encrypted media
sample. Also, the {@link android.media.MediaExtractor#getPsshInfo()} method has been added to
{@link android.media.MediaExtractor} so you can access the PSSH metadata for your DASH media.
This method returns a map of {@link java.util.UUID} objects to bytes, with the
{@link java.util.UUID} specifying the crypto scheme, and the bytes being the data specific
to that scheme.</p>


<h3 id="DRM">Media DRM</h3>

<p>The new {@link android.media.MediaDrm} class provides a modular solution for digital rights
management (DRM) with your media content by separating DRM concerns from media playback. For
instance, this API separation allows you to play back Widevine-encrypted content without having
to use the Widevine media format. This DRM solution also supports DASH Common Encryption so you
can use a variety of DRM schemes with your streaming content.</p>

<p>You can use {@link android.media.MediaDrm} to obtain opaque key-request messages and process
key-response messages from the server for license acquisition and provisioning. Your app is
responsible for handling the network communication with the servers; the {@link
android.media.MediaDrm} class provides only the ability to generate and process the messages.</p>

<p>The {@link android.media.MediaDrm} APIs are intended to be used in conjunction with the
{@link android.media.MediaCodec} APIs that were introduced in Android 4.1 (API level 16),
including {@link android.media.MediaCodec} for encoding and decoding your content, {@link
android.media.MediaCrypto} for handling encrypted content, and {@link android.media.MediaExtractor}
for extracting and demuxing your content.</p>

<p>You must first construct {@link android.media.MediaExtractor} and
{@link android.media.MediaCodec} objects. You can then access the DRM-scheme-identifying
{@link java.util.UUID}, typically from metadata in the content, and use it to construct an
instance of a {@link android.media.MediaDrm} object with its constructor.</p>


<h3 id="EncodingSurface">Video encoding from a Surface</h3>

<p>Android 4.1 (API level 16) added the {@link android.media.MediaCodec} class for low-level
encoding and decoding of media content. When encoding video, Android 4.1 required that you provide
the media in a {@link java.nio.ByteBuffer} array, but Android 4.3 now allows you to use a {@link
android.view.Surface} as the input to an encoder. For instance, this allows you to encode input
from an existing video file or from frames generated in OpenGL ES.</p>

<p>To use a {@link android.view.Surface} as the input to your encoder, first call {@link
android.media.MediaCodec#configure configure()} for your {@link android.media.MediaCodec}.
Then call {@link android.media.MediaCodec#createInputSurface()} to receive the {@link
android.view.Surface} upon which you can stream your media.</p>

<p>For example, you can use the given {@link android.view.Surface} as the window for an OpenGL
context by passing it to {@link android.opengl.EGL14#eglCreateWindowSurface
eglCreateWindowSurface()}. Then while rendering the surface, call {@link
android.opengl.EGL14#eglSwapBuffers eglSwapBuffers()} to pass the frame to the {@link
android.media.MediaCodec}.</p>

<p>To begin encoding, call {@link android.media.MediaCodec#start()} on the {@link
android.media.MediaCodec}. When done, call {@link
android.media.MediaCodec#signalEndOfInputStream signalEndOfInputStream()} to terminate
encoding, and call {@link android.view.Surface#release()} on the
{@link android.view.Surface}.</p>
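
<p>Putting these steps together, a sketch of the encoder setup might look like the following
(the MIME type, resolution, bitrate, and frame rate values are placeholders, and draining the
encoder's output is omitted):</p>
<pre>
MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

// createInputSurface() must be called after configure() and before start()
Surface inputSurface = encoder.createInputSurface();
encoder.start();

// ... render frames into inputSurface (for example, via EGL) and drain the output ...

encoder.signalEndOfInputStream();
inputSurface.release();
</pre>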


<h3 id="MediaMuxing">Media muxing</h3>

<p>The new {@link android.media.MediaMuxer} class enables multiplexing of one audio stream
and one video stream. These APIs serve as a counterpart to the {@link android.media.MediaExtractor}
class added in Android 4.1 (API level 16) for de-multiplexing (demuxing) media.</p>

<p>Supported output formats are defined in {@link android.media.MediaMuxer.OutputFormat}. Currently,
MP4 is the only supported output format and {@link android.media.MediaMuxer} currently supports
only one audio stream and/or one video stream at a time.</p>

<p>{@link android.media.MediaMuxer} is mostly designed to work with {@link android.media.MediaCodec}
so you can perform video processing through {@link android.media.MediaCodec} then save the
output to an MP4 file through {@link android.media.MediaMuxer}. You can also use {@link
android.media.MediaMuxer} in combination with {@link android.media.MediaExtractor} to perform
media editing without the need to encode or decode.</p>
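
<p>The basic flow might look like the following sketch, where {@code outputPath}, {@code
videoFormat}, {@code encodedBuffer}, and {@code bufferInfo} are placeholders for the output
file path, the track's {@link android.media.MediaFormat}, and the encoded sample data your
{@link android.media.MediaCodec} or {@link android.media.MediaExtractor} produces:</p>
<pre>
MediaMuxer muxer = new MediaMuxer(outputPath,
        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int videoTrackIndex = muxer.addTrack(videoFormat);
muxer.start();

// For each encoded sample:
muxer.writeSampleData(videoTrackIndex, encodedBuffer, bufferInfo);

muxer.stop();
muxer.release();
</pre>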


<h3 id="ProgressAndScrubbing">Playback progress and scrubbing for RemoteControlClient</h3>

<p>In Android 4.0 (API level 14), the {@link android.media.RemoteControlClient} was added to
enable media playback controls from remote control clients such as the controls available on the
lock screen. Android 4.3 now provides the ability for such controllers to display the playback
position and controls for scrubbing the playback. If you've enabled remote control for your
media app with the {@link android.media.RemoteControlClient} APIs, then you can allow playback
scrubbing by implementing two new interfaces.</p>

<p>First, you must enable the {@link
android.media.RemoteControlClient#FLAG_KEY_MEDIA_POSITION_UPDATE} flag by passing it to
{@link android.media.RemoteControlClient#setTransportControlFlags setTransportControlFlags()}.</p>

<p>Then implement the following two new interfaces:</p>
<dl>
  <dt>{@link android.media.RemoteControlClient.OnGetPlaybackPositionListener}</dt>
  <dd>This includes the callback {@link
  android.media.RemoteControlClient.OnGetPlaybackPositionListener#onGetPlaybackPosition
  onGetPlaybackPosition()}, which requests the current position
  of your media when the remote control needs to update the progress in its UI.</dd>

  <dt>{@link android.media.RemoteControlClient.OnPlaybackPositionUpdateListener}</dt>
  <dd>This includes the callback {@link
  android.media.RemoteControlClient.OnPlaybackPositionUpdateListener#onPlaybackPositionUpdate
  onPlaybackPositionUpdate()}, which
  tells your app the new time code for your media when the user scrubs the playback with the
  remote control UI.
    <p>Once you update your playback with the new position, call {@link
    android.media.RemoteControlClient#setPlaybackState setPlaybackState()} to indicate the
    new playback state, position, and speed.</p>
  </dd>
</dl>

<p>With these interfaces defined, you can set them for your {@link
android.media.RemoteControlClient} by calling {@link
android.media.RemoteControlClient#setOnGetPlaybackPositionListener
setOnGetPlaybackPositionListener()} and
{@link android.media.RemoteControlClient#setPlaybackPositionUpdateListener
setPlaybackPositionUpdateListener()}, respectively.</p>
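
<p>For example, assuming a hypothetical {@code mMediaPlayer} field backing your playback, the
two listeners might be wired up like this:</p>
<pre>
mRemoteControlClient.setOnGetPlaybackPositionListener(
        new RemoteControlClient.OnGetPlaybackPositionListener() {
    public long onGetPlaybackPosition() {
        return mMediaPlayer.getCurrentPosition();  // your player's current position
    }
});

mRemoteControlClient.setPlaybackPositionUpdateListener(
        new RemoteControlClient.OnPlaybackPositionUpdateListener() {
    public void onPlaybackPositionUpdate(long newPositionMs) {
        mMediaPlayer.seekTo((int) newPositionMs);
        // Report the new state, position, and speed back to the remote control
        mRemoteControlClient.setPlaybackState(
                RemoteControlClient.PLAYSTATE_PLAYING, newPositionMs, 1.0f);
    }
});
</pre>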
556
557
558
559<h2 id="Graphics">Graphics</h2>
560
561<h3 id="OpenGL">Support for OpenGL ES 3.0</h3>
562
563<p>Android 4.3 adds Java interfaces and native support for OpenGL ES 3.0. Key new functionality 
564provided in OpenGL ES 3.0 includes:</p>
565<ul>
566  <li>Acceleration of advanced visual effects</li>
567  <li>High quality ETC2/EAC texture compression as a standard feature</li>
568  <li>A new version of the GLSL ES shading language with integer and 32-bit floating point support</li>
569  <li>Advanced texture rendering</li>
570  <li>Broader standardization of texture size and render-buffer formats</li>
571</ul>
572
573<p>The Java interface for OpenGL ES 3.0 on Android is provided with {@link android.opengl.GLES30}. 
574When using OpenGL ES 3.0, be sure that you declare it in your manifest file with the 
575<a href="{@docRoot}guide/topics/manifest/uses-feature-element.html">&lt;uses-feature></a>
576tag and the {@code android:glEsVersion} attribute. For example:</p>
577<pre>
578&lt;manifest>
579    &lt;uses-feature android:glEsVersion="0x00030000" />
580    ...
581&lt;/manifest>
582</pre>
583
584<p>And remember to specify the OpenGL ES context by calling {@link android.opengl.GLSurfaceView#setEGLContextClientVersion setEGLContextClientVersion()}, passing {@code 3} as the version.</p>
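<p>For example, inside an activity's {@code onCreate()}, the setup might look like the
following sketch ({@code MyGLRenderer} is a hypothetical {@link
android.opengl.GLSurfaceView.Renderer} implementation):</p>
<pre>
GLSurfaceView surfaceView = new GLSurfaceView(this);
// Request an OpenGL ES 3.0 context; must be called before setRenderer()
surfaceView.setEGLContextClientVersion(3);
surfaceView.setRenderer(new MyGLRenderer());
setContentView(surfaceView);
</pre>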
585
586
587<h3 id="MipMap">Mipmapping for drawables</h3>
588
<p>Using a mipmap as the source for your bitmap or drawable is a simple way to provide a 
quality image at various scales, which can be particularly useful if you expect your 
image to be scaled during an animation.</p>
592
<p>Android 4.2 (API level 17) added support for mipmaps in the {@link android.graphics.Bitmap} 
class&mdash;Android swaps the mip images in your {@link android.graphics.Bitmap} when you've 
supplied a mipmap source and have enabled it with {@link android.graphics.Bitmap#setHasMipMap 
setHasMipMap()}. Now in Android 4.3, you can enable mipmaps for a {@link 
android.graphics.drawable.BitmapDrawable} object as well, by providing a mipmap asset and 
setting the {@code android:mipMap} attribute in a bitmap resource file or by calling {@link 
android.graphics.drawable.BitmapDrawable#setMipMap setMipMap()}.
</p>
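<p>For example, assuming a drawable resource that includes a mipmap source (the resource and
view names here are illustrative), you might enable mipmaps in code like this:</p>
<pre>
BitmapDrawable drawable =
        (BitmapDrawable) getResources().getDrawable(R.drawable.photo);
// Ask the framework to use the drawable's mip levels when scaling it down
drawable.setMipMap(true);
imageView.setImageDrawable(drawable);
</pre>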
601
602
603
604<h2 id="UI">User Interface</h2>
605
606<h3 id="ViewOverlay">View overlays</h3>
607
<p>The new {@link android.view.ViewOverlay} class provides a transparent layer on top of 
a {@link android.view.View} on which you can add visual content that does not affect 
the layout hierarchy. You can get a {@link android.view.ViewOverlay} for any {@link 
android.view.View} by calling {@link android.view.View#getOverlay}. The overlay 
always has the same size and position as its host view (the view from which it was created), 
allowing you to add content that appears in front of the host view but cannot extend 
beyond the bounds of that host view.
</p>
616
617<p>Using a {@link android.view.ViewOverlay} is particularly useful when you want to create 
618animations such as sliding a view outside of its container or moving items around the screen 
619without affecting the view hierarchy. However, because the usable area of an overlay is 
620restricted to the same area as its host view, if you want to animate a view moving outside 
621its position in the layout, you must use an overlay from a parent view that has the desired 
622layout bounds.</p>
623
624<p>When you create an overlay for a widget view such as a {@link android.widget.Button}, you 
625can add {@link android.graphics.drawable.Drawable} objects to the overlay by calling 
626{@link android.view.ViewOverlay#add(Drawable)}. If you call {@link 
627android.view.ViewGroup#getOverlay} for a layout view, such as {@link android.widget.RelativeLayout},
628the object returned is a {@link android.view.ViewGroupOverlay}. The
629{@link android.view.ViewGroupOverlay} class is a subclass 
630of {@link android.view.ViewOverlay} that  also allows you to add {@link android.view.View} 
631objects by calling {@link android.view.ViewGroupOverlay#add(View)}.
632</p>
633
634<p class="note"><strong>Note:</strong> All drawables and views that you add to an overlay 
635are visual only. They cannot receive focus or input events.</p>
636
637<p>For example, the following code animates a view sliding to the right by placing the view 
638in the parent view's overlay, then performing a translation animation on that view:</p>
<pre>
View view = findViewById(R.id.view_to_remove);
ViewGroup container = (ViewGroup) view.getParent();
// Adding the view to the parent's overlay removes it from the layout hierarchy
container.getOverlay().add(view);
// Slide the view to the right edge of its former container
ObjectAnimator anim = ObjectAnimator.ofFloat(view, "translationX", container.getRight());
anim.start();
</pre>
646
647
648<h3 id="OpticalBounds">Optical bounds layout</h3>
649
650<p>For views that contain nine-patch background images, you can now specify that they should 
651be aligned with neighboring views based on the "optical" bounds of the background image rather 
652than the "clip" bounds of the view.</p>
653
654<p>For example, figures 1 and 2 each show the same layout, but the version in figure 1 is 
655using clip bounds (the default behavior), while figure 2 is using optical bounds. Because the 
656nine-patch images used for the button and the photo frame include padding around the edges, 
657they don’t appear to align with each other or the text when using clip bounds.</p>
658
<p class="note"><strong>Note:</strong> The screenshots in figures 1 and 2 have the "Show 
layout bounds" developer setting enabled. For each view, red lines indicate the optical 
bounds, blue lines indicate the clip bounds, and pink indicates margins.</p>
662
663<script type="text/javascript">
664function toggleOpticalImages(mouseover) {
665
666  $("img.optical-img").each(function() {
    var $img = $(this);
668    var index = $img.attr('src').lastIndexOf("/") + 1;
669    var path = $img.attr('src').substr(0,index);
670    var name = $img.attr('src').substr(index);
671    var splitname;
672    var highres = false;
673    if (name.indexOf("@2x") != -1) {
674      splitname = name.split("@2x.");
675      highres = true;
676    } else {
677      splitname = name.split(".");
678    }
679
680    var newname;
681    if (mouseover) {
682      if (highres) {
683        newname = splitname[0] + "-normal@2x.png";
684      } else {
685        newname = splitname[0] + "-normal.png";
686      }
687    } else {
688      if (highres) {
689        newname = splitname[0].split("-normal")[0] + "@2x.png";
690      } else {
691        newname = splitname[0].split("-normal")[0] + ".png";
692      }
693    }
694
695    $img.attr('src', path + newname);
696
697  });
698}
699</script>
700
701<p class="table-caption"><em>Mouse over to hide the layout bounds.</em></p>
702<div style="float:left;width:296px">
703<img src="{@docRoot}images/tools/clipbounds@2x.png" width="296" alt="" class="optical-img"
704    onmouseover="toggleOpticalImages(true)" onmouseout="toggleOpticalImages(false)" />
705<p class="img-caption"><strong>Figure 1.</strong> Layout using clip bounds (default).</p>
706</div>
707<div style="float:left;width:296px;margin-left:60px">
708<img src="{@docRoot}images/tools/opticalbounds@2x.png" width="296" alt="" class="optical-img"
709    onmouseover="toggleOpticalImages(true)" onmouseout="toggleOpticalImages(false)" />
710<p class="img-caption"><strong>Figure 2.</strong> Layout using optical bounds.</p>
711</div>
712
713
714<p style="clear:left">To align the views based on their optical bounds, set the {@code android:layoutMode} attribute to {@code "opticalBounds"} in one of the parent layouts. For example:</p>
715
716<pre>
717&lt;LinearLayout android:layoutMode="opticalBounds" ... >
718</pre>
719
720
721<div class="figure" style="width:155px">
722<img src="{@docRoot}images/tools/ninepatch_opticalbounds@2x.png" width="121" alt="" />
723<p class="img-caption"><strong>Figure 3.</strong> Zoomed view of the Holo button nine-patch with
724optical bounds.
725</p>
726</div>
727
728<p>For this to work, the nine-patch images applied to the background of your views must specify 
729the optical bounds using red lines along the bottom and right-side of the nine-patch file (as 
730shown in figure 3). The red lines indicate the region that should be subtracted from 
731the clip bounds, leaving the optical bounds of the image.</p>
732
733<p>When you enable optical bounds for a {@link android.view.ViewGroup} in your layout, all 
734descendant views inherit the optical bounds layout mode unless you override it for a group by 
735setting {@code android:layoutMode} to {@code "clipBounds"}. All layout elements also honor the 
736optical bounds of their child views, adapting their own bounds based on the optical bounds of 
737the views within them. However, layout elements (subclasses of {@link android.view.ViewGroup}) 
738currently do not support optical bounds for nine-patch images applied to their own background.</p>
739
740<p>If you create a custom view by subclassing {@link android.view.View}, {@link android.view.ViewGroup}, or any subclasses thereof, your view will inherit these optical bound behaviors.</p>
741
742<p class="note"><strong>Note:</strong> All widgets supported by the Holo theme have been updated
743with optical bounds, including {@link android.widget.Button},  {@link android.widget.Spinner}, 
744{@link android.widget.EditText}, and others. So you can immediately benefit by setting the
745{@code android:layoutMode} attribute to {@code "opticalBounds"} if your app applies a Holo theme 
746({@link android.R.style#Theme_Holo Theme.Holo}, {@link android.R.style#Theme_Holo_Light 
747Theme.Holo.Light}, etc.).
748</p>
749
750<p>To specify optical bounds for your own nine-patch images with the <a 
751href="{@docRoot}tools/help/draw9patch.html">Draw 9-patch</a> tool, hold CTRL when clicking on 
752the border pixels.</p>
753
754
755
756
757<h3 id="AnimationRect">Animation for Rect values</h3>
758
759<p>You can now animate between two {@link android.graphics.Rect} values with the new {@link 
760android.animation.RectEvaluator}. This new class is an implementation of {@link 
761android.animation.TypeEvaluator} that you can pass to {@link 
762android.animation.ValueAnimator#setEvaluator ValueAnimator.setEvaluator()}.
763</p>
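<p>For example, the following sketch animates a view's clip bounds between two rectangles
(the coordinate values and the {@code view} reference are illustrative):</p>
<pre>
final Rect startBounds = new Rect(0, 0, 100, 100);
final Rect endBounds = new Rect(0, 0, 400, 300);
ValueAnimator anim = ValueAnimator.ofObject(new RectEvaluator(), startBounds, endBounds);
anim.addUpdateListener(new ValueAnimator.AnimatorUpdateListener() {
    public void onAnimationUpdate(ValueAnimator animation) {
        // Apply the interpolated Rect on each animation frame
        view.setClipBounds((Rect) animation.getAnimatedValue());
    }
});
anim.start();
</pre>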
764
765<h3 id="AttachFocus">Window attach and focus listener</h3>
766
<p>Previously, if you wanted to listen for when your view attached to or detached from the 
window, or when its window focus changed, you needed to subclass the {@link android.view.View} 
class to implement {@link android.view.View#onAttachedToWindow onAttachedToWindow()} and {@link 
android.view.View#onDetachedFromWindow onDetachedFromWindow()}, or {@link 
android.view.View#onWindowFocusChanged onWindowFocusChanged()}, respectively.
</p>
773
774<p>Now, to receive attach and detach events you can instead implement {@link 
775android.view.ViewTreeObserver.OnWindowAttachListener} and set it on a view with 
776{@link android.view.ViewTreeObserver#addOnWindowAttachListener addOnWindowAttachListener()}. 
777And to receive focus events, you can implement {@link 
778android.view.ViewTreeObserver.OnWindowFocusChangeListener} and set it on a view with 
779{@link android.view.ViewTreeObserver#addOnWindowFocusChangeListener 
780addOnWindowFocusChangeListener()}.
781</p>
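<p>For example, both listeners can be registered through the view's {@link
android.view.ViewTreeObserver} like this (the view ID is illustrative):</p>
<pre>
View view = findViewById(R.id.my_view);
ViewTreeObserver observer = view.getViewTreeObserver();
observer.addOnWindowAttachListener(new ViewTreeObserver.OnWindowAttachListener() {
    public void onWindowAttached() {
        // The view's window is now available
    }
    public void onWindowDetached() {
        // The view has been detached from its window
    }
});
observer.addOnWindowFocusChangeListener(new ViewTreeObserver.OnWindowFocusChangeListener() {
    public void onWindowFocusChanged(boolean hasFocus) {
        // React to the window gaining or losing focus
    }
});
</pre>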
782
783
784<h3 id="Overscan">TV overscan support</h3>
785
<p>To be sure your app fills the entire screen on every television, you can now enable overscan 
for your app layout. Overscan mode is determined by the {@link android.view.WindowManager.LayoutParams#FLAG_LAYOUT_IN_OVERSCAN} flag, which you can enable with platform themes such as 
{@link android.R.style#Theme_DeviceDefault_NoActionBar_Overscan} or by enabling the 
{@link android.R.attr#windowOverscan} style in a custom theme.</p>
790
791
792<h3 id="Orientation">Screen orientation</h3>
793
794<p>The <a 
795href="{@docRoot}guide/topics/manifest/activity-element.html">{@code &lt;activity>}</a>
796tag's <a 
797href="{@docRoot}guide/topics/manifest/activity-element.html#screen">{@code screenOrientation}</a>
798attribute now supports additional values to honor the user's preference for auto-rotation:</p>
799<dl>
800<dt>{@code "userLandscape"}</dt>
801<dd>Behaves the same as {@code "sensorLandscape"}, except if the user disables auto-rotate 
802then it locks in the normal landscape orientation and will not flip.
803</dd>
804
805<dt>{@code "userPortrait"}</dt>
806<dd>Behaves the same as {@code "sensorPortrait"}, except if the user disables auto-rotate then 
807it locks in the normal portrait orientation and will not flip.
808</dd>
809
810<dt>{@code "fullUser"}</dt>
811<dd>Behaves the same as {@code "fullSensor"} and allows rotation in all four directions, except 
812if the user disables auto-rotate then it locks in the user's preferred orientation.
813</dd></dl>
814
815<p>Additionally, you can now also declare {@code "locked"} to lock your app's orientation into
816the screen's current orientation.</p>
817
818
819<h3 id="RotationAnimation">Rotation animations</h3>
820
821<p>The new {@link android.view.WindowManager.LayoutParams#rotationAnimation} field in 
822{@link android.view.WindowManager} allows you to select between one of three animations you 
823want to use when the system switches screen orientations. The three animations are:</p>
824<ul>
825  <li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_CROSSFADE}</li>
826  <li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_JUMPCUT}</li>
827  <li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_ROTATE}</li>
828</ul>
829
830<p class="note"><strong>Note:</strong> These animations are available only if you've set your activity to use "fullscreen" mode, which you can enable with themes such as {@link android.R.style#Theme_Holo_NoActionBar_Fullscreen Theme.Holo.NoActionBar.Fullscreen}.</p>
831
832<p>For example, here's how you can enable the "crossfade" animation:</p>
833<pre>
834protected void onCreate(Bundle savedInstanceState) {
835    super.onCreate(savedInstanceState);
836
837    WindowManager.LayoutParams params = getWindow().getAttributes();
838    params.rotationAnimation = WindowManager.LayoutParams.ROTATION_ANIMATION_CROSSFADE;
839    getWindow().setAttributes(params);
840    ...
841}
842</pre>
843
844
845<h2 id="UserInput">User Input</h2>
846
847<h3 id="SignificantMotion">Detect significant motion</h3>
848
849<p>The {@link android.hardware.SensorManager} APIs now allow you to request a callback when the 
850device sensors detect "significant motion." For instance, this event may be triggered by new 
851motion such as when the user starts to walk.</p>
852
<p>To register a listener for significant motion, extend the {@link android.hardware.TriggerEventListener} class and implement the {@link android.hardware.TriggerEventListener#onTrigger onTrigger()} callback method. Then register your listener with the {@link android.hardware.SensorManager} by calling {@link android.hardware.SensorManager#requestTriggerSensor requestTriggerSensor()}, passing it your {@link android.hardware.TriggerEventListener} and a {@link android.hardware.Sensor#TYPE_SIGNIFICANT_MOTION} sensor.</p>
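<p>For example, the registration might look like this sketch (note that trigger sensors are
one-shot: once the event fires, the listener is deregistered and must be requested again):</p>
<pre>
SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor motionSensor = sensorManager.getDefaultSensor(Sensor.TYPE_SIGNIFICANT_MOTION);

TriggerEventListener listener = new TriggerEventListener() {
    public void onTrigger(TriggerEvent event) {
        // Fires once per request; call requestTriggerSensor()
        // again to receive another event
    }
};
if (motionSensor != null) {
    sensorManager.requestTriggerSensor(listener, motionSensor);
}
</pre>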
854
855<h3 id="Sensors">New sensor types</h3>
856<p>The new {@link android.hardware.Sensor#TYPE_GAME_ROTATION_VECTOR} sensor allows you to detect the device's rotations without worrying about magnetic interferences. Unlike the {@link android.hardware.Sensor#TYPE_ROTATION_VECTOR} sensor, the {@link android.hardware.Sensor#TYPE_GAME_ROTATION_VECTOR} is not based on magnetic north.</p>
857
<p>The new {@link android.hardware.Sensor#TYPE_GYROSCOPE_UNCALIBRATED} and {@link 
android.hardware.Sensor#TYPE_MAGNETIC_FIELD_UNCALIBRATED} sensors provide raw sensor data without 
consideration for bias estimations. That is, the existing {@link 
android.hardware.Sensor#TYPE_GYROSCOPE} and {@link android.hardware.Sensor#TYPE_MAGNETIC_FIELD} 
sensors provide sensor data that takes into account estimated bias from gyro-drift and hard iron 
in the device, respectively. The new "uncalibrated" versions of these sensors instead provide 
the raw sensor data and offer the estimated bias values separately. These sensors allow you to 
provide your own custom calibration for the sensor data by enhancing the estimated bias with 
external data.</p>
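<p>For example, you might read the uncalibrated gyroscope and apply your own bias correction,
as in this sketch (the first three event values are the raw rotation rates and the last three
are the estimated drift around each axis):</p>
<pre>
SensorManager sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE_UNCALIBRATED);
sensorManager.registerListener(new SensorEventListener() {
    public void onSensorChanged(SensorEvent event) {
        // values[0..2]: angular speed (rad/s) without drift compensation
        // values[3..5]: estimated drift around each axis
        float correctedX = event.values[0] - event.values[3];
        float correctedY = event.values[1] - event.values[4];
        float correctedZ = event.values[2] - event.values[5];
    }
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}, gyro, SensorManager.SENSOR_DELAY_NORMAL);
</pre>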
867
868
869
870<h2 id="NotificationListener">Notification Listener</h2>
871
872<p>Android 4.3 adds a new service class, {@link android.service.notification.NotificationListenerService}, that allows your app to receive information about new notifications as they are posted by the system. </p>
873
874<p>If your app currently uses the accessibility service APIs to access system notifications, you should update your app to use these APIs instead.</p>
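<p>A minimal listener might look like the following sketch (the class name is illustrative);
the service must also be declared in your manifest with the {@link
android.Manifest.permission#BIND_NOTIFICATION_LISTENER_SERVICE} permission:</p>
<pre>
public class MyNotificationListener extends NotificationListenerService {
    public void onNotificationPosted(StatusBarNotification sbn) {
        Log.d("MyNotificationListener", "Posted by " + sbn.getPackageName());
    }
    public void onNotificationRemoved(StatusBarNotification sbn) {
        Log.d("MyNotificationListener", "Removed by " + sbn.getPackageName());
    }
}
</pre>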
875
876
877
878
879<h2 id="Contacts">Contacts Provider</h2>
880
881<h3 id="Contactables">Query for "contactables"</h3>
882
883<p>The new Contacts Provider query, {@link android.provider.ContactsContract.CommonDataKinds.Contactables#CONTENT_URI Contactables.CONTENT_URI}, provides an efficient way to get one {@link android.database.Cursor} that contains all email addresses and phone numbers belonging to all contacts matching the specified query.</p>
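<p>As one way to use this, the following sketch retrieves every email address and phone number
for contacts matching a search term via the related {@link
android.provider.ContactsContract.CommonDataKinds.Contactables#CONTENT_FILTER_URI
Contactables.CONTENT_FILTER_URI} (the search term is illustrative):</p>
<pre>
Uri filterUri = Uri.withAppendedPath(
        ContactsContract.CommonDataKinds.Contactables.CONTENT_FILTER_URI,
        Uri.encode("alice"));
Cursor cursor = getContentResolver().query(filterUri, null, null, null, null);
if (cursor != null) {
    try {
        int mimeIndex = cursor.getColumnIndex(ContactsContract.Data.MIMETYPE);
        int dataIndex = cursor.getColumnIndex(ContactsContract.Data.DATA1);
        while (cursor.moveToNext()) {
            // Each row is either an email address or a phone number
            Log.d("Contactables", cursor.getString(mimeIndex)
                    + ": " + cursor.getString(dataIndex));
        }
    } finally {
        cursor.close();
    }
}
</pre>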
884
885
886<h3 id="ContactsDelta">Query for contacts deltas</h3>
887
888<p>New APIs have been added to Contacts Provider that allow you to efficiently query recent changes to the contacts data. Previously, your app could be notified when something in the contacts data changed, but you would not know exactly what changed and would need to retrieve all contacts then iterate through them to discover the change.</p>
889
890<p>To track changes to inserts and updates, you can now include the {@link android.provider.ContactsContract.ContactsColumns#CONTACT_LAST_UPDATED_TIMESTAMP} parameter with your selection to query only the contacts that have changed since the last time you queried the provider.</p>
891
<p>To track which contacts have been deleted, the new table {@link android.provider.ContactsContract.DeletedContacts} provides a log of contacts that have been deleted (though each deleted contact is held in this table only for a limited time). Similar to {@link android.provider.ContactsContract.ContactsColumns#CONTACT_LAST_UPDATED_TIMESTAMP}, you can use the new selection parameter, {@link android.provider.ContactsContract.DeletedContacts#CONTACT_DELETED_TIMESTAMP}, to check which contacts have been deleted since the last time you queried the provider. The table also contains the constant {@link android.provider.ContactsContract.DeletedContacts#DAYS_KEPT_MILLISECONDS}, which holds the number of milliseconds for which the log is kept.</p>
893
894<p>Additionally, the Contacts Provider now broadcasts the {@link 
895android.provider.ContactsContract.Intents#CONTACTS_DATABASE_CREATED} action when the user 
896clears the contacts storage through the system settings menu, effectively recreating the 
897Contacts Provider database. It’s intended to signal apps that they need to drop all the contact 
898information they’ve stored and reload it with a new query.</p>
899
900<p>For sample code using these APIs to check for changes to the contacts, look in the ApiDemos 
901sample available in the <a href="{@docRoot}tools/samples/index.html">SDK Samples</a> download.</p>
902
903
904<h2 id="Localization">Localization</h2>
905
906<h3 id="BiDi">Improved support for bi-directional text</h3>
907
<p>Previous versions of Android support right-to-left (RTL) languages and layout, 
but sometimes don't properly handle mixed-direction text. Android 4.3 therefore adds the {@link 
android.text.BidiFormatter} APIs, which help you properly format text with opposite-direction 
content without garbling any parts of it.</p>
912
913<p>For example, when you want to create a sentence with a string variable, such as "Did you mean 
91415 Bay Street, Laurel, CA?", you normally pass a localized string resource and the variable to 
915{@link java.lang.String#format String.format()}:</p>
916<pre>
917Resources res = getResources();
918String suggestion = String.format(res.getString(R.string.did_you_mean), address);
919</pre>
920
921<p>However, if the locale is Hebrew, then the formatted string comes out like this:</p>
922
923<p dir="rtl">האם התכוונת ל 15 Bay Street, Laurel, CA?</p>
924
925<p>That's wrong because the "15" should be left of "Bay Street." The solution is to use {@link 
926android.text.BidiFormatter} and its {@link android.text.BidiFormatter#unicodeWrap(String) 
927unicodeWrap()} method. For example, the code above becomes:</p>
928<pre>
929Resources res = getResources();
930BidiFormatter bidiFormatter = BidiFormatter.getInstance();
931String suggestion = String.format(res.getString(R.string.did_you_mean),
932        bidiFormatter.unicodeWrap(address));
933</pre>
934
935<p>
936By default, {@link android.text.BidiFormatter#unicodeWrap(String) unicodeWrap()} uses the 
937first-strong directionality estimation heuristic, which can get things wrong if the first 
938signal for text direction does not represent the appropriate direction for the content as a whole. 
939If necessary, you can specify a different heuristic by passing one of the {@link 
940android.text.TextDirectionHeuristic} constants from {@link android.text.TextDirectionHeuristics} 
941to {@link android.text.BidiFormatter#unicodeWrap(String,TextDirectionHeuristic) unicodeWrap()}.</p>
942
943<p class="note"><strong>Note:</strong> These new APIs are also available for previous versions
944of Android through the Android <a href="{@docRoot}tools/extras/support-library.html">Support
945Library</a>, with the {@link android.support.v4.text.BidiFormatter} class and related APIs.</p>
946
947
948
949<h2 id="A11yService">Accessibility Services</h2>
950
951<h3 id="A11yKeyEvents">Handle key events</h3>
952
953<p>An {@link android.accessibilityservice.AccessibilityService} can now receive a callback for 
954key input events with the {@link android.accessibilityservice.AccessibilityService#onKeyEvent 
955onKeyEvent()} callback method. This allows your accessibility service to handle input for 
956key-based input devices such as a keyboard and translate those events to special actions that 
957previously may have been possible only with touch input or the device's directional pad.</p>
958
959
960<h3 id="A11yText">Select text and copy/paste</h3>
961
962<p>The {@link android.view.accessibility.AccessibilityNodeInfo} now provides APIs that allow 
963an {@link android.accessibilityservice.AccessibilityService} to select, cut, copy, and paste 
964text in a node.</p>
965
966<p>To specify the selection of text to cut or copy, your accessibility service can use the new 
967action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_SET_SELECTION}, passing 
968with it the selection start and end position with {@link 
969android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_START_INT} and {@link 
970android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_END_INT}. 
971Alternatively you can select text by manipulating the cursor position using the existing 
972action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_NEXT_AT_MOVEMENT_GRANULARITY} 
973(previously only for moving the cursor position), and adding the argument {@link 
974android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_EXTEND_SELECTION_BOOLEAN}.</p>
975
976<p>You can then cut or copy with {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_CUT}, 
977{@link android.view.accessibility.AccessibilityNodeInfo#ACTION_COPY}, then later paste with 
978{@link android.view.accessibility.AccessibilityNodeInfo#ACTION_PASTE}.</p>
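<p>For example, given an {@link android.view.accessibility.AccessibilityNodeInfo} named
{@code node}, the following sketch selects the first five characters of the node's text and
copies them (the indices are illustrative):</p>
<pre>
Bundle arguments = new Bundle();
arguments.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_SELECTION_START_INT, 0);
arguments.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_SELECTION_END_INT, 5);
node.performAction(AccessibilityNodeInfo.ACTION_SET_SELECTION, arguments);
node.performAction(AccessibilityNodeInfo.ACTION_COPY);
</pre>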
979
980
981<p class="note"><strong>Note:</strong> These new APIs are also available for previous versions
982of Android through the Android <a href="{@docRoot}tools/extras/support-library.html">Support
983Library</a>, with the {@link android.support.v4.view.accessibility.AccessibilityNodeInfoCompat}
984class.</p>
985
986
987
988<h3 id="A11yFeatures">Declare accessibility features</h3>
989
990<p>Beginning with Android 4.3, an accessibility service must declare accessibility capabilities 
991in its metadata file in order to use certain accessibility features. If the capability is not 
992requested in the metadata file, then the feature will be a no-op. To declare your service's 
993accessibility capabilities, you must use XML attributes that correspond to the various 
994"capability" constants in the {@link android.accessibilityservice.AccessibilityServiceInfo} 
995class.</p>
996
997<p>For example, if a service does not request the {@link android.R.styleable#AccessibilityService_canRequestFilterKeyEvents flagRequestFilterKeyEvents} capability, 
998then it will not receive key events.</p>
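<p>For example, a service that wants to receive key events might declare the corresponding
capability in its metadata file like this (all other attributes omitted for brevity):</p>
<pre>
&lt;accessibility-service xmlns:android="http://schemas.android.com/apk/res/android"
    android:canRequestFilterKeyEvents="true" /&gt;
</pre>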
999
1000
1001<h2 id="Testing">Testing and Debugging</h2>
1002
1003<h3 id="UiAutomation">Automated UI testing</h3>
1004
1005<p>The new {@link android.app.UiAutomation} class provides APIs that allow you to simulate user 
1006actions for test automation. By using the platform's {@link 
1007android.accessibilityservice.AccessibilityService} APIs, the {@link android.app.UiAutomation} 
1008APIs allow you to inspect the screen content and inject arbitrary keyboard and touch events.</p>
1009
1010<p>To get an instance of {@link android.app.UiAutomation}, call {@link 
1011android.app.Instrumentation#getUiAutomation Instrumentation.getUiAutomation()}. In order 
1012for this to work, you must supply the {@code -w} option with the {@code instrument} command 
1013when running your {@link android.test.InstrumentationTestCase} from <a 
1014href="{@docRoot}tools/help/adb.html#am">{@code adb shell}</a>.</p>
1015
1016<p>With the {@link android.app.UiAutomation} instance, you can execute arbitrary events to test 
1017your app by calling {@link android.app.UiAutomation#executeAndWaitForEvent 
1018executeAndWaitForEvent()}, passing it a {@link java.lang.Runnable} to perform, a timeout 
1019period for the operation, and an implementation of the {@link 
1020android.app.UiAutomation.AccessibilityEventFilter} interface. It's within your {@link 
1021android.app.UiAutomation.AccessibilityEventFilter} implementation that you'll receive a call 
1022that allows you to filter the events that you're interested in and determine the success or 
1023failure of a given test case.</p>
1024
1025<p>To observe all the events during a test, create an implementation of {@link 
1026android.app.UiAutomation.OnAccessibilityEventListener} and pass it to {@link 
1027android.app.UiAutomation#setOnAccessibilityEventListener setOnAccessibilityEventListener()}.  
1028Your listener interface then receives a call to {@link 
1029android.app.UiAutomation.OnAccessibilityEventListener#onAccessibilityEvent onAccessibilityEvent()} 
1030each time an event occurs, receiving an {@link android.view.accessibility.AccessibilityEvent} object 
1031that describes the event.</p>
1032
1033<p>There is a variety of other operations that the {@link android.app.UiAutomation} APIs expose 
1034at a very low level to encourage the development of UI test tools such as <a href="{@docRoot}tools/help/uiautomator/index.html">uiautomator</a>. For instance, 
1035{@link android.app.UiAutomation} can also:</p>
1036<ul>
1037  <li>Inject input events
1038  <li>Change the orientation of the screen
1039  <li>Take screenshots
1040</ul>
1041
1042<p>And most importantly for UI test tools, the {@link android.app.UiAutomation} APIs work 
1043across application boundaries, unlike those in {@link android.app.Instrumentation}.</p>
1044
1045
1046<h3 id="Systrace">Systrace events for apps</h3>
1047
1048<p>Android 4.3 adds the {@link android.os.Trace} class with two static methods, 
1049{@link android.os.Trace#beginSection beginSection()} and {@link android.os.Trace#endSection()}, 
1050which allow you to define blocks of code to include with the systrace report. By creating 
1051sections of traceable code in your app, the systrace logs provide you a much more detailed 
1052analysis of where slowdown occurs within your app.</p>
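<p>For example, you might bracket an expensive operation like this (the method and section
names are illustrative):</p>
<pre>
public void processFrame() {
    Trace.beginSection("processFrame"); // this name appears in the systrace report
    try {
        // ... the work you want attributed to this section ...
    } finally {
        // Always end the section, even if an exception is thrown
        Trace.endSection();
    }
}
</pre>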
1053
1054<p>For information about using the Systrace tool, read <a href="{@docRoot}tools/debugging/systrace.html">Analyzing Display and Performance with Systrace</a>.</p>
1055
1056
1057<h2 id="Security">Security</h2>
1058
1059<h3 id="KeyStore">Android key store for app-private keys</h3>
1060
1061<p>Android now offers a custom Java Security Provider in the {@link java.security.KeyStore} 
1062facility, called Android Key Store, which allows you to generate and save private keys that 
1063may be seen and used by only your app. To load the Android Key Store, pass 
1064{@code "AndroidKeyStore"} to {@link java.security.KeyStore#getInstance(String) 
1065KeyStore.getInstance()}.</p>
1066
1067<p>To manage your app's private credentials in the Android Key Store, generate a new key with 
1068{@link java.security.KeyPairGenerator} with {@link android.security.KeyPairGeneratorSpec}. First 
1069get an instance of {@link java.security.KeyPairGenerator} by calling {@link 
1070java.security.KeyPairGenerator#getInstance getInstance()}. Then call 
1071{@link java.security.KeyPairGenerator#initialize initialize()}, passing it an instance of 
1072{@link android.security.KeyPairGeneratorSpec}, which you can get using 
1073{@link android.security.KeyPairGeneratorSpec.Builder KeyPairGeneratorSpec.Builder}. 
1074Finally, get your {@link java.security.KeyPair} by calling {@link 
1075java.security.KeyPairGenerator#generateKeyPair generateKeyPair()}.</p>
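<p>Putting those steps together, the following sketch generates an RSA key pair whose private
key is held only in the Android Key Store (the alias, subject, and validity window are
illustrative):</p>
<pre>
Calendar start = Calendar.getInstance();
Calendar end = Calendar.getInstance();
end.add(Calendar.YEAR, 1);

KeyPairGeneratorSpec spec = new KeyPairGeneratorSpec.Builder(context)
        .setAlias("myKeyAlias")
        .setSubject(new X500Principal("CN=myKeyAlias"))
        .setSerialNumber(BigInteger.ONE)
        .setStartDate(start.getTime())
        .setEndDate(end.getTime())
        .build();

KeyPairGenerator generator = KeyPairGenerator.getInstance("RSA", "AndroidKeyStore");
generator.initialize(spec);
KeyPair keyPair = generator.generateKeyPair();
</pre>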
1076
1077
1078<h3 id="HardwareKeyChain">Hardware credential storage</h3>
1079
1080<p>Android also now supports hardware-backed storage for your {@link android.security.KeyChain} 
1081credentials, providing more security by making the keys unavailable for extraction. That is, once 
1082keys are in a hardware-backed key store (Secure Element, TPM, or TrustZone), they can be used for 
1083cryptographic operations but the private key material cannot be exported. Even the OS kernel 
1084cannot access this key material. While not all Android-powered devices support storage on 
1085hardware, you can check at runtime if hardware-backed storage is available by calling 
{@link android.security.KeyChain#isBoundKeyAlgorithm KeyChain.isBoundKeyAlgorithm()}.</p>
1087
1088
1089
1090<h2 id="Manifest">Manifest Declarations</h2>
1091
1092<h3 id="ManifestFeatures">Declarable required features</h3>
1093
1094<p>The following values are now supported in the <a 
1095href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code &lt;uses-feature>}</a>
1096element so you can ensure that your app is installed only on devices that provide the features 
1097your app needs.</p>
1098
1099<dl>
1100<dt>{@link android.content.pm.PackageManager#FEATURE_APP_WIDGETS}</dt>
1101<dd>Declares that your app provides an app widget and should be installed only on devices that
1102include a Home screen or similar location where users can embed app widgets.
1103Example:
1104<pre>
1105&lt;uses-feature android:name="android.software.app_widgets" android:required="true" />
1106</pre>
1107</dd>
1108
1109<dt>{@link android.content.pm.PackageManager#FEATURE_HOME_SCREEN}</dt>
1110<dd>Declares that your app behaves as a Home screen replacement and should be installed only on
1111devices that support third-party Home screen apps.
1112Example:
1113<pre>
1114&lt;uses-feature android:name="android.software.home_screen" android:required="true" />
1115</pre>
1116</dd>
1117
1118<dt>{@link android.content.pm.PackageManager#FEATURE_INPUT_METHODS}</dt>
1119<dd>Declares that your app provides a custom input method (a keyboard built with {@link
1120android.inputmethodservice.InputMethodService}) and should be installed only on devices that
1121support third-party input methods.
1122Example:
1123<pre>
1124&lt;uses-feature android:name="android.software.input_methods" android:required="true" />
1125</pre>
1126</dd>
1127
1128<dt>{@link android.content.pm.PackageManager#FEATURE_BLUETOOTH_LE}</dt>
1129<dd>Declares that your app uses Bluetooth Low Energy APIs and should be installed only on devices
1130that are capable of communicating with other devices via Bluetooth Low Energy.
1131Example:
1132<pre>
1133&lt;uses-feature android:name="android.software.bluetooth_le" android:required="true" />
1134</pre>
1135</dd>
1136</dl>
1137
1138
1139<h3 id="ManifestPermissions">User permissions</h3>
<p>The following values are now supported in the <a 
href="{@docRoot}guide/topics/manifest/uses-permission-element.html">{@code &lt;uses-permission>}</a>
element to declare the permissions your app requires in order to access certain APIs.</p>
1144
1145<dl>
1146<dt>{@link android.Manifest.permission#BIND_NOTIFICATION_LISTENER_SERVICE}
1147</dt>
1148<dd>Required to use the new {@link android.service.notification.NotificationListenerService} APIs.
1149</dd>
1150
1151<dt>{@link android.Manifest.permission#SEND_RESPOND_VIA_MESSAGE}</dt>
1152<dd>Required to receive the {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE}
1153intent.</dd>
1154</dl>
1155
1156
1157
1158
1159<p class="note">For a detailed view of all API changes in Android 4.3, see the
1160<a href="{@docRoot}sdk/api_diff/18/changes.html">API Differences Report</a>.</p>
1161
1162
1163
1164