android-4.0.jd revision 8cc7db7d402569332f32a6e092a9d42feb64c995
page.title=Android 4.0 APIs
excludeFromSuggestions=true
sdk.platform.version=4.0
sdk.platform.apiLevel=14
@jd:body

<div id="qv-wrapper">
<div id="qv">

<h2>In this document</h2>
<ol>
  <li><a href="#api">API Overview</a></li>
  <li><a href="#Honeycomb">Previous APIs</a></li>
  <li><a href="#api-level">API Level</a></li>
</ol>

<h2>Reference</h2>
<ol>
<li><a href="{@docRoot}sdk/api_diff/14/changes.html">API Differences Report »</a></li>
</ol>

</div>
</div>


<p><em>API Level:</em> <strong>{@sdkPlatformApiLevel}</strong></p>

<p>Android 4.0 ({@link android.os.Build.VERSION_CODES#ICE_CREAM_SANDWICH}) is a major platform release that adds a variety of new features for users and app developers. Besides all the new features and APIs discussed below, Android 4.0 is an important platform release because it brings the extensive set of APIs and Holographic themes from Android 3.x to smaller screens. As an app developer, you now have a single platform and unified API framework that enables you to develop and publish your application with a single APK that provides an optimized user experience for handsets, tablets, and more, when running the same version of Android—Android 4.0 (API level 14) or greater.</p>

<p>For developers, the Android {@sdkPlatformVersion} platform is available as a downloadable component for the Android SDK. The downloadable platform includes an Android library and system image, as well as a set of emulator skins and more.
To get started developing or testing against Android {@sdkPlatformVersion}, use the Android SDK Manager to download the platform into your SDK.</p>

<h2 id="api">API Overview</h2>

<p>The sections below provide a technical overview of new APIs in Android 4.0.</p>

<div class="toggle-content closed">

  <p><a href="#" onclick="return toggleContent(this)">
    <img src="{@docRoot}assets/images/triangle-closed.png" class="toggle-content-img" alt="" />
    <strong>Table of Contents</strong>
  </a></p>

  <div class="toggle-content-toggleme" style="padding: 5px 18px">
  <ol>
    <li><a href="#Contacts">Social APIs in Contacts Provider</a></li>
    <li><a href="#Calendar">Calendar Provider</a></li>
    <li><a href="#Voicemail">Voicemail Provider</a></li>
    <li><a href="#Multimedia">Multimedia</a></li>
    <li><a href="#Camera">Camera</a></li>
    <li><a href="#AndroidBeam">Android Beam (NDEF Push with NFC)</a></li>
    <li><a href="#WiFiDirect">Wi-Fi Direct</a></li>
    <li><a href="#Bluetooth">Bluetooth Health Devices</a></li>
    <li><a href="#A11y">Accessibility</a></li>
    <li><a href="#SpellChecker">Spell Checker Services</a></li>
    <li><a href="#TTS">Text-to-speech Engines</a></li>
    <li><a href="#NetworkUsage">Network Usage</a></li>
    <li><a href="#RenderScript">RenderScript</a></li>
    <li><a href="#Enterprise">Enterprise</a></li>
    <li><a href="#Sensors">Device Sensors</a></li>
    <li><a href="#ActionBar">Action Bar</a></li>
    <li><a href="#UI">User Interface and Views</a></li>
    <li><a href="#Input">Input Framework</a></li>
    <li><a href="#Properties">Properties</a></li>
    <li><a href="#HwAccel">Hardware Acceleration</a></li>
    <li><a href="#Jni">JNI Changes</a></li>
    <li><a href="#WebKit">WebKit</a></li>
    <li><a href="#Permissions">Permissions</a></li>
    <li><a href="#DeviceFeatures">Device Features</a></li>
  </ol>
  </div>
</div>


<h3 id="Contacts">Social APIs in Contacts Provider</h3>

<p>The contact APIs
defined by the {@link android.provider.ContactsContract} provider have been extended to support new social-oriented features, such as a personal profile for the device owner and the ability for users to invite individual contacts to social networks that are installed on the device.</p>


<h4>User Profile</h4>

<p>Android now includes a personal profile that represents the device owner, as defined by the {@link android.provider.ContactsContract.Profile} table. Social apps that maintain a user identity can contribute to the user's profile data by creating a new {@link android.provider.ContactsContract.RawContacts} entry within the {@link android.provider.ContactsContract.Profile}. That is, raw contacts that represent the device user do not belong in the traditional raw contacts table defined by the {@link android.provider.ContactsContract.RawContacts} URI; instead, you must add a profile raw contact to the table at {@link android.provider.ContactsContract.Profile#CONTENT_RAW_CONTACTS_URI}. Raw contacts in this table are then aggregated into the single user-visible profile labeled "Me".</p>

<p>Adding a new raw contact for the profile requires the {@link android.Manifest.permission#WRITE_PROFILE} permission. Likewise, to read from the profile table, you must request the {@link android.Manifest.permission#READ_PROFILE} permission. However, most apps should not need to read the user profile, even when contributing data to it. Reading the user profile requires a sensitive permission, and you should expect users to be skeptical of apps that request it.</p>


<h4>Invite Intent</h4>

<p>The {@link android.provider.ContactsContract.Intents#INVITE_CONTACT} intent action allows an app to invoke an action indicating that the user wants to add a contact to a social network. The app receiving the intent uses it to invite the specified contact to that social network.
Most apps will be on the receiving end of this operation. For example, the built-in People app invokes the invite intent when the user selects "Add connection" for a specific social app that's listed in a person's contact details.</p>

<p>To make your app visible in the "Add connection" list, your app must provide a sync adapter to sync contact information from your social network. You must then indicate to the system that your app responds to the {@link android.provider.ContactsContract.Intents#INVITE_CONTACT} intent by adding the {@code inviteContactActivity} attribute to your app's sync configuration file, with the fully-qualified name of the activity that the system should start when sending the invite intent. The activity that starts can then retrieve the URI for the contact in question from the intent's data and perform the necessary work to invite that contact to the network or add the person to the user's connections.</p>

<p>See the <a href="{@docRoot}resources/samples/SampleSyncAdapter/index.html">Sample Sync Adapter</a> app for an example (specifically, see the <a href="{@docRoot}resources/samples/SampleSyncAdapter/res/xml-v14/contacts.html">contacts.xml</a> file).</p>


<h4>Large photos</h4>

<p>Android now supports high-resolution photos for contacts. When you push a photo into a contact record, the system processes it into both a 96x96 thumbnail (as it did previously) and a 256x256 "display photo" that's stored in a new file-based photo store (the exact dimensions that the system chooses may vary in the future).
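Pushing a photo goes through the regular contacts data table; the following is a hedged sketch, assuming you already have a raw contact ID and the compressed image bytes:

```java
import android.content.ContentResolver;
import android.content.ContentValues;
import android.provider.ContactsContract.CommonDataKinds.Photo;
import android.provider.ContactsContract.Data;

/** Sketch: write a photo into a contact's data row; the system
 *  derives the thumbnail and display photo from it automatically. */
void setContactPhoto(ContentResolver resolver, long rawContactId, byte[] photoBytes) {
    ContentValues values = new ContentValues();
    values.put(Data.RAW_CONTACT_ID, rawContactId);
    values.put(Data.MIMETYPE, Photo.CONTENT_ITEM_TYPE);
    values.put(Photo.PHOTO, photoBytes); // compressed image bytes (e.g., JPEG)
    resolver.insert(Data.CONTENT_URI, values);
}
```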
You can add a large photo to a contact by putting a large photo in the usual {@link android.provider.ContactsContract.CommonDataKinds.Photo#PHOTO} column of a data row; the system then processes it into the appropriate thumbnail and display photo records.</p>


<h4>Contact Usage Feedback</h4>

<p>The new {@link android.provider.ContactsContract.DataUsageFeedback} APIs allow you to help track how often the user uses particular methods of contacting people, such as how often the user uses each phone number or e-mail address. This information helps improve the ranking for each contact method associated with each person and provide better suggestions for contacting each person.</p>


<h3 id="Calendar">Calendar Provider</h3>

<p>The new calendar APIs allow you to read, add, modify, and delete calendars, events, attendees, reminders, and alerts, which are stored in the Calendar Provider.</p>

<p>A variety of apps and widgets can use these APIs to read and modify calendar events. However, some of the most compelling use cases are sync adapters that synchronize the user's calendar from other calendar services with the Calendar Provider, in order to offer a unified location for all the user's events. Google Calendar events, for example, are synchronized with the Calendar Provider by the Google Calendar Sync Adapter, allowing these events to be viewed with Android's built-in Calendar app.</p>

<p>The data model for calendars and event-related information in the Calendar Provider is defined by {@link android.provider.CalendarContract}. All the user's calendar data is stored in a number of tables defined by various subclasses of {@link android.provider.CalendarContract}:</p>

<ul>
<li>The {@link android.provider.CalendarContract.Calendars} table holds the calendar-specific information.
Each row in this table contains the details for a single calendar, such as the name, color, sync information, and so on.</li>

<li>The {@link android.provider.CalendarContract.Events} table holds event-specific information. Each row in this table contains the information for a single event, such as the event title, location, start time, end time, and so on. An event can occur once or recur multiple times. Attendees, reminders, and extended properties are stored in separate tables and use the event's {@code _ID} to link them with the event.</li>

<li>The {@link android.provider.CalendarContract.Instances} table holds the start and end time for occurrences of an event. Each row in this table represents a single occurrence. For one-time events, there is a one-to-one mapping of instances to events. For recurring events, multiple rows are automatically generated to correspond to the multiple occurrences of that event.</li>

<li>The {@link android.provider.CalendarContract.Attendees} table holds the event attendee or guest information. Each row represents a single guest of an event. It specifies the type of guest the person is and the person's response to the event.</li>

<li>The {@link android.provider.CalendarContract.Reminders} table holds the alert/notification data. Each row represents a single alert for an event. An event can have multiple reminders. The maximum number of reminders per event is specified in {@code MAX_REMINDERS}, which is set by the sync adapter that owns the given calendar. Reminders are specified in minutes before the event and indicate an alarm method, such as an alert, email, or SMS, to remind the user.</li>

<li>The {@link android.provider.CalendarContract.ExtendedProperties} table holds opaque data fields used by the sync adapter.
The provider takes no action with items in this table except to delete them when their related events are deleted.</li>
</ul>

<p>To access a user's calendar data with the Calendar Provider, your application must request the {@link android.Manifest.permission#READ_CALENDAR} permission (for read access) and {@link android.Manifest.permission#WRITE_CALENDAR} (for write access).</p>


<h4>Event intent</h4>

<p>If all you want to do is add an event to the user's calendar, you can use an {@link android.content.Intent#ACTION_INSERT} intent with the data defined by {@link android.provider.CalendarContract.Events#CONTENT_URI Events.CONTENT_URI} to start an activity in the Calendar app that creates new events. Using the intent does not require any permission, and you can specify event details with the following extras:</p>

<ul>
  <li>{@link android.provider.CalendarContract.EventsColumns#TITLE Events.TITLE}: Name of the event</li>
  <li>{@link android.provider.CalendarContract#EXTRA_EVENT_BEGIN_TIME CalendarContract.EXTRA_EVENT_BEGIN_TIME}: Event begin time in milliseconds from the epoch</li>
  <li>{@link android.provider.CalendarContract#EXTRA_EVENT_END_TIME CalendarContract.EXTRA_EVENT_END_TIME}: Event end time in milliseconds from the epoch</li>
  <li>{@link android.provider.CalendarContract.EventsColumns#EVENT_LOCATION Events.EVENT_LOCATION}: Location of the event</li>
  <li>{@link android.provider.CalendarContract.EventsColumns#DESCRIPTION Events.DESCRIPTION}: Event description</li>
  <li>{@link android.content.Intent#EXTRA_EMAIL Intent.EXTRA_EMAIL}: Email addresses of those to invite</li>
  <li>{@link android.provider.CalendarContract.EventsColumns#RRULE Events.RRULE}: The recurrence rule for the event</li>
  <li>{@link android.provider.CalendarContract.EventsColumns#ACCESS_LEVEL Events.ACCESS_LEVEL}: Whether the event is private or public</li>
  <li>{@link
android.provider.CalendarContract.EventsColumns#AVAILABILITY Events.AVAILABILITY}: Whether the time period of this event allows other events to be scheduled at the same time</li>
</ul>


<h3 id="Voicemail">Voicemail Provider</h3>

<p>The new Voicemail Provider allows applications to add voicemails to the device, in order to present all the user's voicemails in a single visual presentation. For instance, a user might have multiple voicemail sources, such as one from the phone's service provider and others from VoIP or other alternative voice services. These apps can use the Voicemail Provider APIs to add their voicemails to the device. The built-in Phone application then presents all voicemails to the user in a unified presentation. Although the system's Phone application is the only application that can read all the voicemails, each application that provides voicemails can read those that it has added to the system (but cannot read voicemails from other services).</p>

<p>Because the APIs currently do not allow third-party apps to read all the voicemails from the system, the only third-party apps that should use the voicemail APIs are those that have voicemail to deliver to the user.</p>

<p>The {@link android.provider.VoicemailContract} class defines the content provider for the Voicemail Provider. The subclasses {@link android.provider.VoicemailContract.Voicemails} and {@link android.provider.VoicemailContract.Status} provide tables in which apps can insert voicemail data for storage on the device.
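For instance, a voicemail source might insert a record roughly like this (a sketch; the column values are example data, and the {@code ADD_VOICEMAIL} permission is required):

```java
import android.content.ContentResolver;
import android.content.ContentValues;
import android.net.Uri;
import android.provider.VoicemailContract.Voicemails;

/** Sketch: insert one voicemail record attributed to this app's
 *  voicemail source package. */
Uri addVoicemail(ContentResolver resolver, String packageName) {
    ContentValues values = new ContentValues();
    values.put(Voicemails.NUMBER, "+15551234567");     // caller's number (example)
    values.put(Voicemails.DATE, System.currentTimeMillis());
    values.put(Voicemails.DURATION, 30);               // seconds (example)
    values.put(Voicemails.SOURCE_PACKAGE, packageName);
    // Use the source-specific URI so the record is attributed to
    // (and later readable by) this voicemail source.
    return resolver.insert(Voicemails.buildSourceUri(packageName), values);
}
```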
For an example of a voicemail provider app, see the <a href="{@docRoot}resources/samples/VoicemailProviderDemo/index.html">Voicemail Provider Demo</a>.</p>


<h3 id="Multimedia">Multimedia</h3>

<p>Android 4.0 adds several new APIs for applications that interact with media such as photos, videos, and music.</p>


<h4>Media Effects</h4>

<p>A new media effects framework allows you to apply a variety of visual effects to images and videos. For example, image effects allow you to easily fix red-eye, convert an image to grayscale, adjust brightness, adjust saturation, rotate an image, apply a fisheye effect, and much more. The system performs all effects processing on the GPU to obtain maximum performance.</p>

<p>For maximum performance, effects are applied directly to OpenGL textures, so your application must have a valid OpenGL context before it can use the effects APIs. The textures to which you apply effects may be from bitmaps, videos, or even the camera. However, textures must meet certain restrictions:</p>
<ol>
<li>They must be bound to a {@link android.opengl.GLES20#GL_TEXTURE_2D} texture image</li>
<li>They must contain at least one mipmap level</li>
</ol>

<p>An {@link android.media.effect.Effect} object defines a single media effect that you can apply to an image frame.
The basic workflow to create an {@link android.media.effect.Effect} is:</p>

<ol>
<li>Call {@link android.media.effect.EffectContext#createWithCurrentGlContext EffectContext.createWithCurrentGlContext()} from your OpenGL ES 2.0 context.</li>
<li>Use the returned {@link android.media.effect.EffectContext} to call {@link android.media.effect.EffectContext#getFactory EffectContext.getFactory()}, which returns an instance of {@link android.media.effect.EffectFactory}.</li>
<li>Call {@link android.media.effect.EffectFactory#createEffect createEffect()}, passing it an effect name from {@link android.media.effect.EffectFactory}, such as {@link android.media.effect.EffectFactory#EFFECT_FISHEYE} or {@link android.media.effect.EffectFactory#EFFECT_VIGNETTE}.</li>
</ol>

<p>You can adjust an effect's parameters by calling {@link android.media.effect.Effect#setParameter setParameter()} and passing a parameter name and parameter value. Each type of effect accepts different parameters, which are documented with the effect name. For example, {@link android.media.effect.EffectFactory#EFFECT_FISHEYE} has one parameter for the {@code scale} of the distortion.</p>

<p>To apply an effect on a texture, call {@link android.media.effect.Effect#apply apply()} on the {@link android.media.effect.Effect} and pass in the input texture, its width and height, and the output texture. The input texture must be bound to a {@link android.opengl.GLES20#GL_TEXTURE_2D} texture image (usually done by calling the {@link android.opengl.GLES20#glTexImage2D glTexImage2D()} function). You may provide multiple mipmap levels.
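Taken together, the workflow above might look like this sketch; it assumes a thread with a current OpenGL ES 2.0 context and existing input/output texture names:

```java
import android.media.effect.Effect;
import android.media.effect.EffectContext;
import android.media.effect.EffectFactory;

/** Sketch: create and apply a fisheye effect. Must run on a thread
 *  with a current OpenGL ES 2.0 context; inputTex and outputTex are
 *  GL texture names assumed to exist in your renderer. */
void applyFisheye(int inputTex, int outputTex, int width, int height) {
    EffectContext effectContext = EffectContext.createWithCurrentGlContext();
    EffectFactory factory = effectContext.getFactory();
    Effect fisheye = factory.createEffect(EffectFactory.EFFECT_FISHEYE);
    fisheye.setParameter("scale", 0.5f); // distortion strength
    fisheye.apply(inputTex, width, height, outputTex);
}
```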
If the output texture has not been bound to a texture image, it will be automatically bound by the effect as a {@link android.opengl.GLES20#GL_TEXTURE_2D} with one mipmap level (0), which will have the same size as the input.</p>

<p>All effects listed in {@link android.media.effect.EffectFactory} are guaranteed to be supported. However, some additional effects available from external libraries are not supported by all devices, so you must first check whether the desired effect from the external library is supported by calling {@link android.media.effect.EffectFactory#isEffectSupported isEffectSupported()}.</p>


<h4>Remote control client</h4>

<p>The new {@link android.media.RemoteControlClient} allows media players to enable playback controls from remote control clients such as the device lock screen. Media players can also expose information about the media currently playing for display on the remote control, such as track information and album art.</p>

<p>To enable remote control clients for your media player, instantiate a {@link android.media.RemoteControlClient} with its constructor, passing it a {@link android.app.PendingIntent} that broadcasts {@link android.content.Intent#ACTION_MEDIA_BUTTON}.
The intent must also declare the explicit {@link android.content.BroadcastReceiver} component in your app that handles the {@link android.content.Intent#ACTION_MEDIA_BUTTON} event.</p>

<p>To declare which media control inputs your player can handle, you must call {@link android.media.RemoteControlClient#setTransportControlFlags setTransportControlFlags()} on your {@link android.media.RemoteControlClient}, passing a set of {@code FLAG_KEY_MEDIA_*} flags, such as {@link android.media.RemoteControlClient#FLAG_KEY_MEDIA_PREVIOUS} and {@link android.media.RemoteControlClient#FLAG_KEY_MEDIA_NEXT}.</p>

<p>You must then register your {@link android.media.RemoteControlClient} by passing it to {@link android.media.AudioManager#registerRemoteControlClient AudioManager.registerRemoteControlClient()}. Once registered, the broadcast receiver you declared when you instantiated the {@link android.media.RemoteControlClient} will receive {@link android.content.Intent#ACTION_MEDIA_BUTTON} events when a button is pressed from a remote control. The intent you receive includes the {@link android.view.KeyEvent} for the media key pressed, which you can retrieve from the intent with {@link android.content.Intent#getParcelableExtra getParcelableExtra(Intent.EXTRA_KEY_EVENT)}.</p>

<p>To display information on the remote control about the media playing, call {@link android.media.RemoteControlClient#editMetadata editMetadata()} and add metadata to the returned {@link android.media.RemoteControlClient.MetadataEditor}. You can supply a bitmap for media artwork, numerical information such as elapsed time, and text information such as the track title.
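A condensed sketch of the whole setup ({@code MediaButtonReceiver} is a hypothetical {@code BroadcastReceiver} that your manifest would declare for {@code ACTION_MEDIA_BUTTON}):

```java
import android.app.PendingIntent;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.media.AudioManager;
import android.media.MediaMetadataRetriever;
import android.media.RemoteControlClient;

/** Sketch: wire up a RemoteControlClient for the lock screen. */
void enableRemoteControl(Context context, AudioManager audioManager) {
    ComponentName receiver = new ComponentName(context, MediaButtonReceiver.class);
    audioManager.registerMediaButtonEventReceiver(receiver);

    Intent intent = new Intent(Intent.ACTION_MEDIA_BUTTON);
    intent.setComponent(receiver); // explicit component, as required
    PendingIntent pi = PendingIntent.getBroadcast(context, 0, intent, 0);

    RemoteControlClient rcc = new RemoteControlClient(pi);
    rcc.setTransportControlFlags(
            RemoteControlClient.FLAG_KEY_MEDIA_PLAY_PAUSE
            | RemoteControlClient.FLAG_KEY_MEDIA_NEXT);
    audioManager.registerRemoteControlClient(rcc);

    // Push metadata for the lock-screen display.
    rcc.editMetadata(true)
       .putString(MediaMetadataRetriever.METADATA_KEY_TITLE, "Track title")
       .apply();
}
```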
For information on available keys, see the {@code METADATA_KEY_*} flags in {@link android.media.MediaMetadataRetriever}.</p>

<p>For a sample implementation, see the <a href="{@docRoot}resources/samples/RandomMusicPlayer/index.html">Random Music Player</a>, which provides compatibility logic such that it enables the remote control client on Android 4.0 devices while continuing to support devices back to Android 2.1.</p>


<h4>Media player</h4>

<ul>
<li>Streaming online media from {@link android.media.MediaPlayer} now requires the {@link android.Manifest.permission#INTERNET} permission. If you use {@link android.media.MediaPlayer} to play content from the Internet, be sure to add the {@link android.Manifest.permission#INTERNET} permission to your manifest or else your media playback will not work beginning with Android 4.0.</li>

<li>{@link android.media.MediaPlayer#setSurface(Surface) setSurface()} allows you to define a {@link android.view.Surface} to behave as the video sink.</li>

<li>{@link android.media.MediaPlayer#setDataSource(Context,Uri,Map) setDataSource()} allows you to send additional HTTP headers with your request, which can be useful for HTTP(S) live streaming.</li>

<li>HTTP(S) live streaming now respects HTTP cookies across requests.</li>
</ul>


<h4>Media types</h4>

<p>Android 4.0 adds support for:</p>
<ul>
<li>HTTP/HTTPS live streaming protocol version 3</li>
<li>ADTS raw AAC audio encoding</li>
<li>WEBP images</li>
<li>Matroska video</li>
</ul>
<p>For more info, see <a href="{@docRoot}guide/appendix/media-formats.html">Supported Media Formats</a>.</p>


<h3 id="Camera">Camera</h3>

<p>The {@link android.hardware.Camera} class now includes APIs for detecting faces and controlling focus and metering areas.</p>


<h4>Face detection</h4>

<p>Camera apps can now enhance their abilities with
Android's face detection APIs, which not only detect the face of a subject, but also specific facial features, such as the eyes and mouth.</p>

<p>To detect faces in your camera application, you must register a {@link android.hardware.Camera.FaceDetectionListener} by calling {@link android.hardware.Camera#setFaceDetectionListener setFaceDetectionListener()}. You can then start your camera surface and start detecting faces by calling {@link android.hardware.Camera#startFaceDetection}.</p>

<p>When the system detects one or more faces in the camera scene, it calls the {@link android.hardware.Camera.FaceDetectionListener#onFaceDetection onFaceDetection()} callback in your implementation of {@link android.hardware.Camera.FaceDetectionListener}, passing an array of {@link android.hardware.Camera.Face} objects.</p>

<p>An instance of the {@link android.hardware.Camera.Face} class provides various information about the detected face, including:</p>
<ul>
<li>A {@link android.graphics.Rect} that specifies the bounds of the face, relative to the camera's current field of view</li>
<li>An integer between 1 and 100 that indicates how confident the system is that the object is a human face</li>
<li>A unique ID so you can track multiple faces</li>
<li>Several {@link android.graphics.Point} objects that indicate where the eyes and mouth are located</li>
</ul>

<p class="note"><strong>Note:</strong> Face detection may not be supported on some devices, so you should check by calling {@link android.hardware.Camera.Parameters#getMaxNumDetectedFaces()} and ensure the return value is greater than zero.
Also, some devices may not support identification of eyes and mouth, in which case those fields in the {@link android.hardware.Camera.Face} object will be null.</p>


<h4>Focus and metering areas</h4>

<p>Camera apps can now control the areas that the camera uses for focus and for metering white balance and auto-exposure. Both features use the new {@link android.hardware.Camera.Area} class to specify the region of the camera's current view that should be focused or metered. An instance of the {@link android.hardware.Camera.Area} class defines the bounds of the area with a {@link android.graphics.Rect} and the area's weight—representing the level of importance of that area, relative to other areas in consideration—with an integer.</p>

<p>Before setting either a focus area or metering area, you should first call {@link android.hardware.Camera.Parameters#getMaxNumFocusAreas} or {@link android.hardware.Camera.Parameters#getMaxNumMeteringAreas}, respectively. If these return zero, the device does not support the corresponding feature.</p>

<p>To specify the focus or metering areas to use, call {@link android.hardware.Camera.Parameters#setFocusAreas setFocusAreas()} or {@link android.hardware.Camera.Parameters#setMeteringAreas setMeteringAreas()}. Each takes a {@link java.util.List} of {@link android.hardware.Camera.Area} objects that indicate the areas to consider for focus or metering. For example, you might implement a feature that allows the user to set the focus area by touching an area of the preview, which you then translate to an {@link android.hardware.Camera.Area} object and request that the camera focus on that area of the scene. The focus or exposure in that area will continually update as the scene in the area changes.</p>


<h4>Continuous auto focus for photos</h4>

<p>You can now enable continuous auto focusing (CAF) when taking photos.
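A rough sketch of the flow, assuming an already-open {@code Camera} with a running preview:

```java
import android.hardware.Camera;

/** Sketch: continuous autofocus for photo capture. */
void captureWithContinuousFocus(final Camera camera) {
    Camera.Parameters params = camera.getParameters();
    params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
    camera.setParameters(params);

    // When ready to capture, confirm that focus has settled.
    camera.autoFocus(new Camera.AutoFocusCallback() {
        @Override
        public void onAutoFocus(boolean success, Camera camera) {
            // take the picture here, then resume CAF:
            camera.cancelAutoFocus();
        }
    });
}
```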
To enable CAF in your camera app, pass {@link android.hardware.Camera.Parameters#FOCUS_MODE_CONTINUOUS_PICTURE} to {@link android.hardware.Camera.Parameters#setFocusMode setFocusMode()}. When ready to capture a photo, call {@link android.hardware.Camera#autoFocus autoFocus()}. Your {@link android.hardware.Camera.AutoFocusCallback} immediately receives a callback to indicate whether focus was achieved. To resume CAF after receiving the callback, you must call {@link android.hardware.Camera#cancelAutoFocus()}.</p>

<p class="note"><strong>Note:</strong> Continuous auto focus is also supported when capturing video, using {@link android.hardware.Camera.Parameters#FOCUS_MODE_CONTINUOUS_VIDEO}, which was added in API level 9.</p>


<h4>Other camera features</h4>

<ul>
<li>While recording video, you can now call {@link android.hardware.Camera#takePicture takePicture()} to save a photo without interrupting the video session. Before doing so, you should call {@link android.hardware.Camera.Parameters#isVideoSnapshotSupported} to be sure the hardware supports it.</li>

<li>You can now lock auto-exposure and white balance with {@link android.hardware.Camera.Parameters#setAutoExposureLock setAutoExposureLock()} and {@link android.hardware.Camera.Parameters#setAutoWhiteBalanceLock setAutoWhiteBalanceLock()} to prevent these properties from changing.</li>

<li>You can now call {@link android.hardware.Camera#setDisplayOrientation setDisplayOrientation()} while the camera preview is running. Previously, you could call this only before beginning the preview, but you can now change the orientation at any time.</li>
</ul>


<h4>Camera broadcast intents</h4>

<ul>
<li>{@link android.hardware.Camera#ACTION_NEW_PICTURE Camera.ACTION_NEW_PICTURE}: This indicates that the user has captured a new photo.
The built-in Camera app sends this broadcast after a photo is captured, and third-party camera apps should also broadcast this intent after capturing a photo.</li>
<li>{@link android.hardware.Camera#ACTION_NEW_VIDEO Camera.ACTION_NEW_VIDEO}: This indicates that the user has captured a new video. The built-in Camera app sends this broadcast after a video is recorded, and third-party camera apps should also broadcast this intent after capturing a video.</li>
</ul>


<h3 id="AndroidBeam">Android Beam (NDEF Push with NFC)</h3>

<p>Android Beam is a new NFC feature that allows you to send NDEF messages from one device to another (a process also known as "NDEF Push"). The data transfer is initiated when two Android-powered devices that support Android Beam are in close proximity (about 4 cm), usually with their backs touching. The NDEF message can contain any data that you wish to share between devices. For example, the People app shares contacts, YouTube shares videos, and Browser shares URLs using Android Beam.</p>

<p>To transmit data between devices using Android Beam, you need to create an {@link android.nfc.NdefMessage} that contains the information you want to share while your activity is in the foreground. You must then pass the {@link android.nfc.NdefMessage} to the system in one of two ways:</p>

<ul>
<li>Define a single {@link android.nfc.NdefMessage} to push while in the activity:
<p>Call {@link android.nfc.NfcAdapter#setNdefPushMessage setNdefPushMessage()} at any time to set the message you want to send. For instance, you might call this method and pass it your {@link android.nfc.NdefMessage} during your activity's {@link android.app.Activity#onCreate onCreate()} method.
Then, whenever Android Beam is activated with another device while the activity is in the foreground, the system sends the {@link android.nfc.NdefMessage} to the other device.</p></li>

<li>Define the {@link android.nfc.NdefMessage} to push at the time that Android Beam is initiated:
<p>Implement {@link android.nfc.NfcAdapter.CreateNdefMessageCallback}, in which your implementation of the {@link android.nfc.NfcAdapter.CreateNdefMessageCallback#createNdefMessage createNdefMessage()} method returns the {@link android.nfc.NdefMessage} you want to send. Then pass the {@link android.nfc.NfcAdapter.CreateNdefMessageCallback} implementation to {@link android.nfc.NfcAdapter#setNdefPushMessageCallback setNdefPushMessageCallback()}.</p>
<p>In this case, when Android Beam is activated with another device while your activity is in the foreground, the system calls {@link android.nfc.NfcAdapter.CreateNdefMessageCallback#createNdefMessage createNdefMessage()} to retrieve the {@link android.nfc.NdefMessage} you want to send. This allows you to define the {@link android.nfc.NdefMessage} to deliver only once Android Beam is initiated, in case the contents of the message might vary throughout the life of the activity.</p></li>
</ul>

<p>If you want to run some specific code once the system has successfully delivered your NDEF message to the other device, you can implement {@link android.nfc.NfcAdapter.OnNdefPushCompleteCallback} and set it with {@link android.nfc.NfcAdapter#setOnNdefPushCompleteCallback setOnNdefPushCompleteCallback()}. The system will then call {@link android.nfc.NfcAdapter.OnNdefPushCompleteCallback#onNdefPushComplete onNdefPushComplete()} when the message is delivered.</p>

<p>On the receiving device, the system dispatches NDEF Push messages in a similar way to regular NFC tags.
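The callback-based approach above might be sketched in an activity like this (the URI is an example value):

```java
import android.app.Activity;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.NfcAdapter;
import android.nfc.NfcEvent;
import android.os.Bundle;

/** Sketch: push a URI via Android Beam, generated at beam time. */
public class BeamActivity extends Activity
        implements NfcAdapter.CreateNdefMessageCallback {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        NfcAdapter adapter = NfcAdapter.getDefaultAdapter(this);
        if (adapter != null) { // device may not have NFC hardware
            adapter.setNdefPushMessageCallback(this, this);
        }
    }

    @Override
    public NdefMessage createNdefMessage(NfcEvent event) {
        return new NdefMessage(new NdefRecord[] {
            NdefRecord.createUri("http://example.com/item/42"),
            // Ensure our own app handles the message on the other device:
            NdefRecord.createApplicationRecord(getPackageName())
        });
    }
}
```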
The system invokes an intent with the {@link android.nfc.NfcAdapter#ACTION_NDEF_DISCOVERED}
action to start an activity, with either a URL or a MIME type set according to the first {@link
android.nfc.NdefRecord} in the {@link android.nfc.NdefMessage}. For the activity that you want to
respond, declare intent filters for the URLs or MIME types your app cares about. For more
information about tag dispatch, see the <a
href="{@docRoot}guide/topics/connectivity/nfc/index.html#dispatch">NFC</a> developer guide.</p>

<p>If you want your {@link android.nfc.NdefMessage} to carry a URI, you can now use the convenience
method {@link android.nfc.NdefRecord#createUri createUri()} to construct a new {@link
android.nfc.NdefRecord} based on either a string or a {@link android.net.Uri} object. If the URI
uses a special format that you want your application to also receive during an Android Beam event,
you should create an intent filter for your activity using the same URI scheme in order to receive
the incoming NDEF message.</p>

<p>You should also pass an "Android application record" with your {@link android.nfc.NdefMessage} in
order to guarantee that your application handles the incoming NDEF message, even if other
applications filter for the same intent action. You can create an Android application record by
calling {@link android.nfc.NdefRecord#createApplicationRecord createApplicationRecord()}, passing it
your application’s package name. When the other device receives the NDEF message with the
application record and multiple applications contain activities that handle the specified intent,
the system always delivers the message to the activity in your application (based on the matching
application record).
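</p>

<p>For example, a hypothetical activity might combine these pieces in its {@link
android.app.Activity#onCreate onCreate()} method. This is a minimal sketch; the URI and package
name are illustrative:</p>

<pre>
NfcAdapter nfcAdapter = NfcAdapter.getDefaultAdapter(this);
if (nfcAdapter != null) {  // null when the device has no NFC hardware
    NdefMessage message = new NdefMessage(new NdefRecord[] {
            NdefRecord.createUri("http://www.example.com/item/42"),
            NdefRecord.createApplicationRecord("com.example.android.beam") });
    // Push this message whenever Android Beam is activated while
    // this activity is in the foreground.
    nfcAdapter.setNdefPushMessage(message, this);
}
</pre>

<p>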
If the target device does not currently have your application installed, the
system uses the Android application record to launch Google Play and take the user to the
application in order to install it.</p>

<p>If your application doesn’t use NFC APIs to perform NDEF Push messaging, then Android provides a
default behavior: When your application is in the foreground on one device and Android Beam is
invoked with another Android-powered device, then the other device receives an NDEF message with an
Android application record that identifies your application. If the receiving device has the
application installed, the system launches it; if it’s not installed, Google Play opens and takes
the user to your application in order to install it.</p>

<p>You can read more about Android Beam and other NFC features in the <a
href="{@docRoot}guide/topics/connectivity/nfc/nfc.html">NFC Basics</a> developer guide. For some
example code using Android Beam, see the <a
href="{@docRoot}resources/samples/AndroidBeamDemo/src/com/example/android/beam/Beam.html">Android
Beam Demo</a>.</p>





<h3 id="WiFiDirect">Wi-Fi Direct</h3>

<p>Android now supports Wi-Fi Direct for peer-to-peer (P2P) connections between Android-powered
devices and other device types without a hotspot or Internet connection. The Android framework
provides a set of Wi-Fi P2P APIs that allow you to discover and connect to other devices when each
device supports Wi-Fi Direct, then communicate over a speedy connection across distances much longer
than a Bluetooth connection.</p>

<p>A new package, {@link android.net.wifi.p2p}, contains all the APIs for performing peer-to-peer
connections with Wi-Fi. The primary class you need to work with is {@link
android.net.wifi.p2p.WifiP2pManager}, which you can acquire by calling {@link
android.app.Activity#getSystemService getSystemService(WIFI_P2P_SERVICE)}.
The {@link
android.net.wifi.p2p.WifiP2pManager} includes APIs that allow you to:</p>
<ul>
<li>Initialize your application for P2P connections by calling {@link
android.net.wifi.p2p.WifiP2pManager#initialize initialize()}</li>

<li>Discover nearby devices by calling {@link android.net.wifi.p2p.WifiP2pManager#discoverPeers
discoverPeers()}</li>

<li>Start a P2P connection by calling {@link android.net.wifi.p2p.WifiP2pManager#connect
connect()}</li>
<li>And more</li>
</ul>

<p>Several other interfaces and classes are necessary as well, such as:</p>
<ul>
<li>The {@link android.net.wifi.p2p.WifiP2pManager.ActionListener} interface allows you to receive
callbacks when an operation such as discovering peers or connecting to them succeeds or fails.</li>

<li>The {@link android.net.wifi.p2p.WifiP2pManager.PeerListListener} interface allows you to receive
information about discovered peers. The callback provides a {@link
android.net.wifi.p2p.WifiP2pDeviceList}, from which you can retrieve a {@link
android.net.wifi.p2p.WifiP2pDevice} object for each device within range and get information such as
the device name, address, device type, the WPS configurations the device supports, and more.</li>

<li>The {@link android.net.wifi.p2p.WifiP2pManager.GroupInfoListener} interface allows you to
receive information about a P2P group. The callback provides a {@link
android.net.wifi.p2p.WifiP2pGroup} object, which provides group information such as the owner, the
network name, and passphrase.</li>

<li>The {@link android.net.wifi.p2p.WifiP2pManager.ConnectionInfoListener} interface allows you to
receive information about the current connection.
The callback provides a {@link
android.net.wifi.p2p.WifiP2pInfo} object, which has information such as whether a group has been
formed and who is the group owner.</li>
</ul>

<p>In order to use the Wi-Fi P2P APIs, your app must request the following user permissions:</p>
<ul>
<li>{@link android.Manifest.permission#ACCESS_WIFI_STATE}</li>
<li>{@link android.Manifest.permission#CHANGE_WIFI_STATE}</li>
<li>{@link android.Manifest.permission#INTERNET} (although your app doesn’t technically connect
to the Internet, communicating with Wi-Fi Direct peers over standard Java sockets requires the
Internet permission).</li>
</ul>

<p>The Android system also broadcasts several different actions during certain Wi-Fi P2P events:</p>
<ul>
<li>{@link android.net.wifi.p2p.WifiP2pManager#WIFI_P2P_CONNECTION_CHANGED_ACTION}: The P2P
connection state has changed. This carries {@link
android.net.wifi.p2p.WifiP2pManager#EXTRA_WIFI_P2P_INFO} with a {@link
android.net.wifi.p2p.WifiP2pInfo} object and {@link
android.net.wifi.p2p.WifiP2pManager#EXTRA_NETWORK_INFO} with a {@link android.net.NetworkInfo}
object.</li>

<li>{@link android.net.wifi.p2p.WifiP2pManager#WIFI_P2P_STATE_CHANGED_ACTION}: The P2P state has
changed between enabled and disabled. It carries {@link
android.net.wifi.p2p.WifiP2pManager#EXTRA_WIFI_STATE} with either {@link
android.net.wifi.p2p.WifiP2pManager#WIFI_P2P_STATE_DISABLED} or {@link
android.net.wifi.p2p.WifiP2pManager#WIFI_P2P_STATE_ENABLED}.</li>

<li>{@link android.net.wifi.p2p.WifiP2pManager#WIFI_P2P_PEERS_CHANGED_ACTION}: The list of peer
devices has changed.</li>

<li>{@link android.net.wifi.p2p.WifiP2pManager#WIFI_P2P_THIS_DEVICE_CHANGED_ACTION}: The details for
this device have changed.</li>
</ul>

<p>See the {@link android.net.wifi.p2p.WifiP2pManager} documentation for more information.
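</p>

<p>Inside an activity, the basic sequence might look like the following sketch (callbacks
abbreviated):</p>

<pre>
WifiP2pManager manager =
        (WifiP2pManager) getSystemService(Context.WIFI_P2P_SERVICE);
// Register the application with the Wi-Fi P2P framework.
WifiP2pManager.Channel channel =
        manager.initialize(this, getMainLooper(), null);
// Begin scanning for nearby Wi-Fi Direct devices; results are
// delivered via the WIFI_P2P_PEERS_CHANGED_ACTION broadcast.
manager.discoverPeers(channel, new WifiP2pManager.ActionListener() {
    public void onSuccess() { /* discovery started */ }
    public void onFailure(int reason) { /* ERROR, P2P_UNSUPPORTED, or BUSY */ }
});
</pre>

<p>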
Also
look at the <a href="{@docRoot}resources/samples/WiFiDirectDemo/index.html">Wi-Fi Direct Demo</a>
sample application.</p>





<h3 id="Bluetooth">Bluetooth Health Devices</h3>

<p>Android now supports Bluetooth Health Profile devices, so you can create applications that use
Bluetooth to communicate with health devices that support Bluetooth, such as heart-rate monitors,
blood meters, thermometers, and scales.</p>

<p>Similar to regular headset and A2DP profile devices, you must call {@link
android.bluetooth.BluetoothAdapter#getProfileProxy getProfileProxy()} with a {@link
android.bluetooth.BluetoothProfile.ServiceListener} and the {@link
android.bluetooth.BluetoothProfile#HEALTH} profile type to establish a connection with the profile
proxy object.</p>

<p>Once you’ve acquired the Health Profile proxy (the {@link android.bluetooth.BluetoothHealth}
object), connecting to and communicating with paired health devices involves the following new
Bluetooth classes:</p>
<ul>
<li>{@link android.bluetooth.BluetoothHealthCallback}: You must extend this class and implement the
callback methods to receive updates about changes in the application’s registration state and
Bluetooth channel state.</li>
<li>{@link android.bluetooth.BluetoothHealthAppConfiguration}: During callbacks to your {@link
android.bluetooth.BluetoothHealthCallback}, you’ll receive an instance of this object, which
provides configuration information about the available Bluetooth health device, which you must use
to perform various operations such as initiating and terminating connections with the {@link
android.bluetooth.BluetoothHealth} APIs.</li>
</ul>

<p>For more information about using the Bluetooth Health Profile, see the documentation for {@link
android.bluetooth.BluetoothHealth}.</p>





<h3 id="A11y">Accessibility</h3>

<p>Android 4.0 improves accessibility for
sight-impaired users with a new explore-by-touch mode
and extended APIs that allow you to provide more information about view content or
develop advanced accessibility services.</p>


<h4>Explore-by-touch mode</h4>

<p>Users with vision loss can now explore the screen by touching and dragging a finger across the
screen to hear voice descriptions of the content. Because the explore-by-touch mode works like a
virtual cursor, it allows screen readers to identify the descriptive text the same way that screen
readers can when the user navigates with a d-pad or trackball—by reading information provided
by {@link android.R.attr#contentDescription android:contentDescription} and {@link
android.view.View#setContentDescription setContentDescription()} upon a simulated "hover" event. So,
consider this a reminder that you should provide descriptive text for the views in your
application, especially for {@link android.widget.ImageButton}, {@link android.widget.EditText},
{@link android.widget.ImageView}, and other widgets that might not naturally contain descriptive
text.</p>


<h4>Accessibility for views</h4>

<p>To enhance the information available to accessibility services such as screen readers, you can
implement new callback methods for accessibility events in your custom {@link
android.view.View} components.</p>

<p>It's important to first note that the behavior of the {@link
android.view.View#sendAccessibilityEvent sendAccessibilityEvent()} method has changed in Android
4.0. As with previous versions of Android, when the user enables accessibility services on the
device and an input event such as a click or hover occurs, the respective view is notified with a
call to {@link android.view.View#sendAccessibilityEvent sendAccessibilityEvent()}.
Previously, the
implementation of {@link android.view.View#sendAccessibilityEvent sendAccessibilityEvent()} would
initialize an {@link android.view.accessibility.AccessibilityEvent} and send it to {@link
android.view.accessibility.AccessibilityManager}. The new behavior involves some additional callback
methods that allow the view and its parents to add more contextual information to the event:</p>
<ol>
  <li>When invoked, the {@link
android.view.View#sendAccessibilityEvent sendAccessibilityEvent()} and {@link
android.view.View#sendAccessibilityEventUnchecked sendAccessibilityEventUnchecked()} methods defer
to {@link android.view.View#onInitializeAccessibilityEvent onInitializeAccessibilityEvent()}.
  <p>Custom implementations of {@link android.view.View} might want to implement {@link
android.view.View#onInitializeAccessibilityEvent onInitializeAccessibilityEvent()} to
attach additional accessibility information to the {@link
android.view.accessibility.AccessibilityEvent}, but should also call the super implementation to
provide default information such as the standard content description, item index, and more.
However, you should not add additional text content in this callback—that happens
next.</p></li>
  <li>Once initialized, if the event is one of several types that should be populated with text
information, the view then receives a call to {@link
android.view.View#dispatchPopulateAccessibilityEvent dispatchPopulateAccessibilityEvent()}, which
defers to the {@link android.view.View#onPopulateAccessibilityEvent onPopulateAccessibilityEvent()}
callback.
  <p>Custom implementations of {@link android.view.View} should usually implement {@link
android.view.View#onPopulateAccessibilityEvent onPopulateAccessibilityEvent()} to add additional
text content to the {@link android.view.accessibility.AccessibilityEvent} if the {@link
android.R.attr#contentDescription android:contentDescription} text is missing or
insufficient. To add more text description to the
{@link android.view.accessibility.AccessibilityEvent}, call {@link
android.view.accessibility.AccessibilityEvent#getText()}.{@link java.util.List#add add()}.</p>
</li>
  <li>At this point, the {@link android.view.View} passes the event up the view hierarchy by calling
{@link android.view.ViewGroup#requestSendAccessibilityEvent requestSendAccessibilityEvent()} on the
parent view. Each parent view then has the chance to augment the accessibility information by
adding an {@link android.view.accessibility.AccessibilityRecord}, until it
ultimately reaches the root view, which sends the event to the {@link
android.view.accessibility.AccessibilityManager} with {@link
android.view.accessibility.AccessibilityManager#sendAccessibilityEvent
sendAccessibilityEvent()}.</li>
</ol>

<p>In addition to the new methods above, which are useful when extending the {@link
android.view.View} class, you can also intercept these event callbacks on any {@link
android.view.View} by extending {@link
android.view.View.AccessibilityDelegate AccessibilityDelegate} and setting it on the view with
{@link android.view.View#setAccessibilityDelegate setAccessibilityDelegate()}.
When you do, each accessibility method in the view defers the call to the corresponding method in
the delegate. For example, when the view receives a call to {@link
android.view.View#onPopulateAccessibilityEvent onPopulateAccessibilityEvent()}, it passes it to the
same method in the {@link android.view.View.AccessibilityDelegate}.
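</p>

<p>For instance, a minimal sketch that attaches a delegate to supply extra descriptive text for
screen readers (the view ID is hypothetical):</p>

<pre>
View view = findViewById(R.id.my_custom_view);
view.setAccessibilityDelegate(new View.AccessibilityDelegate() {
    public void onPopulateAccessibilityEvent(View host, AccessibilityEvent event) {
        super.onPopulateAccessibilityEvent(host, event);
        // Append extra text content to the event.
        event.getText().add("Describes what this view shows");
    }
});
</pre>

<p>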
Any methods not handled by
the delegate are given right back to the view for default behavior. This allows you to override only
the methods necessary for any given view without extending the {@link android.view.View} class.</p>


<p>If you want to maintain compatibility with Android versions prior to 4.0, while also supporting
the new accessibility APIs, you can do so with the latest version of the <em>v4 support
library</em> (in <a href="{@docRoot}tools/extras/support-library.html">Compatibility Package, r4</a>)
using a set of utility classes that provide the new accessibility APIs in a backward-compatible
design.</p>




<h4>Accessibility services</h4>

<p>If you're developing an accessibility service, the information about various accessibility events
has been significantly expanded to enable more advanced accessibility feedback for users. In
particular, events are generated based on view composition, providing better context information and
allowing accessibility services to traverse view hierarchies to get additional view information and
deal with special cases.</p>

<p>If you're developing an accessibility service (such as a screen reader), you can access
additional content information and traverse view hierarchies with the following procedure:</p>
<ol>
<li>Upon receiving an {@link android.view.accessibility.AccessibilityEvent} from an application,
call {@link android.view.accessibility.AccessibilityEvent#getRecord(int)
AccessibilityEvent.getRecord()} to retrieve a specific {@link
android.view.accessibility.AccessibilityRecord} (there may be several records attached to the
event).</li>

<li>From either {@link android.view.accessibility.AccessibilityEvent} or an individual {@link
android.view.accessibility.AccessibilityRecord}, you can call {@link
android.view.accessibility.AccessibilityRecord#getSource() getSource()} to retrieve a {@link
android.view.accessibility.AccessibilityNodeInfo} object.
  <p>An {@link android.view.accessibility.AccessibilityNodeInfo} represents a single node
of the window content in a format that allows you to query accessibility information about that
node. The {@link android.view.accessibility.AccessibilityNodeInfo} object returned from {@link
android.view.accessibility.AccessibilityEvent} describes the event source, whereas the source from
an {@link android.view.accessibility.AccessibilityRecord} describes the predecessor of the event
source.</p></li>

<li>With the {@link android.view.accessibility.AccessibilityNodeInfo}, you can query information
about it, call {@link
android.view.accessibility.AccessibilityNodeInfo#getParent getParent()} or {@link
android.view.accessibility.AccessibilityNodeInfo#getChild getChild()} to traverse the view
hierarchy, and even add child views to the node.</li>
</ol>

<p>In order for your application to publish itself to the system as an accessibility service, it
must declare an XML configuration file that corresponds to {@link
android.accessibilityservice.AccessibilityServiceInfo}.
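</p>

<p>For instance, a service's configuration file might look like the following sketch (the
attribute values are illustrative; adjust them to the events and feedback your service
handles):</p>

<pre>
&lt;!-- res/xml/accessibility_service_config.xml --&gt;
&lt;accessibility-service xmlns:android="http://schemas.android.com/apk/res/android"
    android:accessibilityEventTypes="typeAllMask"
    android:accessibilityFeedbackType="feedbackSpoken"
    android:canRetrieveWindowContent="true" /&gt;
</pre>

<p>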
For more information about creating an
accessibility service, see {@link
android.accessibilityservice.AccessibilityService}, and see {@link
android.accessibilityservice.AccessibilityService#SERVICE_META_DATA
SERVICE_META_DATA} for details about the XML configuration.</p>


<h4>Other accessibility APIs</h4>

<p>If you're interested in the device's accessibility state, the {@link
android.view.accessibility.AccessibilityManager} has some new APIs such as:</p>
<ul>
  <li>{@link android.view.accessibility.AccessibilityManager.AccessibilityStateChangeListener}
is an interface that allows you to receive a callback whenever accessibility is enabled or
disabled.</li>
  <li>{@link android.view.accessibility.AccessibilityManager#getEnabledAccessibilityServiceList
  getEnabledAccessibilityServiceList()} provides information about which accessibility services
  are currently enabled.</li>
  <li>{@link android.view.accessibility.AccessibilityManager#isTouchExplorationEnabled()} tells
  you whether the explore-by-touch mode is enabled.</li>
</ul>






<h3 id="SpellChecker">Spell Checker Services</h3>

<p>A new spell checker framework allows apps to create spell checkers in a manner similar to the
input method framework (for IMEs). To create a new spell checker, you must implement a service that
extends
{@link android.service.textservice.SpellCheckerService} and extend the {@link
android.service.textservice.SpellCheckerService.Session} class to provide spelling suggestions based
on text provided by the interface's callback methods. In the {@link
android.service.textservice.SpellCheckerService.Session} callback methods, you must return the
spelling suggestions as {@link android.view.textservice.SuggestionsInfo} objects.
</p>

<p>Applications with a spell checker service must declare the {@link
android.Manifest.permission#BIND_TEXT_SERVICE} permission as required by the service.
The service must also declare an intent filter with {@code &lt;action
android:name="android.service.textservice.SpellCheckerService" /&gt;} as the intent’s action and
should include a {@code &lt;meta-data&gt;} element that declares configuration information for the
spell checker.</p>

<p>See the sample <a href="{@docRoot}resources/samples/SpellChecker/SampleSpellCheckerService/index.html">
Spell Checker Service</a> app and
sample <a href="{@docRoot}resources/samples/SpellChecker/HelloSpellChecker/index.html">
Spell Checker Client</a> app for example code.</p>




<h3 id="TTS">Text-to-speech Engines</h3>

<p>Android’s text-to-speech (TTS) APIs have been significantly extended to allow applications to
more easily implement custom TTS engines, while applications that want to use a TTS engine have a
couple of new APIs for selecting an engine.</p>


<h4>Using text-to-speech engines</h4>

<p>In previous versions of Android, you could use the {@link android.speech.tts.TextToSpeech} class
to perform text-to-speech (TTS) operations using the TTS engine provided by the system or set a
custom engine using {@link android.speech.tts.TextToSpeech#setEngineByPackageName
setEngineByPackageName()}. In Android 4.0, the {@link
android.speech.tts.TextToSpeech#setEngineByPackageName setEngineByPackageName()} method has been
deprecated and you can now specify the engine to use with a new {@link
android.speech.tts.TextToSpeech} constructor that accepts the package name of a TTS engine.</p>

<p>You can also query the available TTS engines with {@link
android.speech.tts.TextToSpeech#getEngines()}.
This method returns a list of {@link
android.speech.tts.TextToSpeech.EngineInfo} objects, which include metadata such as the engine’s
icon, label, and package name.</p>


<h4>Building text-to-speech engines</h4>

<p>Previously, custom engines required that the engine be built using an undocumented native header
file. In Android 4.0, there is a complete set of framework APIs for building TTS engines.</p>

<p>The basic setup requires an implementation of {@link android.speech.tts.TextToSpeechService} that
responds to the {@link android.speech.tts.TextToSpeech.Engine#INTENT_ACTION_TTS_SERVICE} intent. The
primary work for a TTS engine happens during the {@link
android.speech.tts.TextToSpeechService#onSynthesizeText onSynthesizeText()} callback in a service
that extends {@link android.speech.tts.TextToSpeechService}. The system delivers two objects to
this method:</p>
<ul>
<li>{@link android.speech.tts.SynthesisRequest}: This contains various data including the text to
synthesize, the locale, the speech rate, and the voice pitch.</li>
<li>{@link android.speech.tts.SynthesisCallback}: This is the interface by which your TTS engine
delivers the resulting speech data as streaming audio. First the engine must call {@link
android.speech.tts.SynthesisCallback#start start()} to indicate that the engine is ready to deliver
the audio, then call {@link android.speech.tts.SynthesisCallback#audioAvailable audioAvailable()},
passing it the audio data in a byte buffer. Once your engine has passed all audio through the
buffer, call {@link android.speech.tts.SynthesisCallback#done()}.</li>
</ul>

<p>Now that the framework supports a true API for creating TTS engines, support for the native code
implementation has been removed.
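</p>

<p>A skeletal engine might implement the callback like this (a sketch only: the
{@code synthesize()} helper is hypothetical, and the other required service callbacks are
omitted):</p>

<pre>
protected void onSynthesizeText(SynthesisRequest request,
        SynthesisCallback callback) {
    // Illustrative: deliver 16 kHz, 16-bit mono PCM.
    callback.start(16000, AudioFormat.ENCODING_PCM_16BIT, 1);
    byte[] audio = synthesize(request.getText(), request.getSpeechRate());
    callback.audioAvailable(audio, 0, audio.length);
    callback.done();
}
</pre>

<p>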
Look for a blog post about a compatibility layer
that you can use to convert your old TTS engines to the new framework.</p>

<p>For an example TTS engine using the new APIs, see the <a
href="{@docRoot}resources/samples/TtsEngine/index.html">Text To Speech Engine</a> sample app.</p>






<h3 id="NetworkUsage">Network Usage</h3>

<p>Android 4.0 gives users precise visibility into how much network data their applications are
using. The Settings app provides controls that allow users to set limits for network data usage and
even disable the use of background data for individual apps. To avoid users disabling your
app’s access to data from the background, you should develop strategies to use the data
connection efficiently and adjust your usage depending on the type of connection available.</p>

<p>If your application performs a lot of network transactions, you should provide user settings that
allow users to control your app’s data habits, such as how often your app syncs data, whether to
perform uploads/downloads only when on Wi-Fi, whether to use data while roaming, and so on. With
these controls available to them, users are much less likely to disable your app’s access to data
when they approach their limits, because they can instead precisely control how much data your app
uses. If you provide a preference activity with these settings, you should include in its manifest
declaration an intent filter for the {@link android.content.Intent#ACTION_MANAGE_NETWORK_USAGE}
action.
For example:</p>

<pre>
&lt;activity android:name="DataPreferences" android:label="@string/title_preferences"&gt;
    &lt;intent-filter&gt;
       &lt;action android:name="android.intent.action.MANAGE_NETWORK_USAGE" /&gt;
       &lt;category android:name="android.intent.category.DEFAULT" /&gt;
    &lt;/intent-filter&gt;
&lt;/activity&gt;
</pre>

<p>This intent filter indicates to the system that this is the activity that controls your
application’s data usage. Thus, when the user inspects how much data your app is using from the
Settings app, a "View application settings" button is available that launches your
preference activity so the user can refine how much data your app uses.</p>

<p>Also beware that {@link android.net.ConnectivityManager#getBackgroundDataSetting()} is now
deprecated and always returns true—use {@link
android.net.ConnectivityManager#getActiveNetworkInfo()} instead. Before you attempt any network
transactions, you should always call {@link android.net.ConnectivityManager#getActiveNetworkInfo()}
to get the {@link android.net.NetworkInfo} that represents the current network and query {@link
android.net.NetworkInfo#isConnected()} to check whether the device has a
connection. You can then check other connection properties, such as whether the device is
roaming or connected to Wi-Fi.</p>









<h3 id="Enterprise">Enterprise</h3>

<p>Android 4.0 expands the capabilities for enterprise applications with the following features.</p>

<h4>VPN services</h4>

<p>The new {@link android.net.VpnService} allows applications to build their own VPN (Virtual
Private Network), running as a {@link android.app.Service}.
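</p>

<p>A minimal sketch of such a service is shown below (the session name, addresses, and routes are
illustrative; error handling is omitted):</p>

<pre>
public class MyVpnService extends VpnService {
    private ParcelFileDescriptor mInterface;

    private void createInterface() {
        mInterface = new Builder()
                .setSession("MyVpnSession")
                .addAddress("10.0.0.2", 24)   // the interface's own address
                .addDnsServer("10.0.0.1")
                .addRoute("0.0.0.0", 0)       // route all traffic through the VPN
                .establish();
        // Read outgoing packets from (and write incoming packets to)
        // streams opened on mInterface.getFileDescriptor().
    }
}
</pre>

<p>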
A VPN service creates an interface for a
virtual network with its own address and routing rules and performs all reading and writing with a
file descriptor.</p>

<p>To create a VPN service, use {@link android.net.VpnService.Builder}, which allows you to specify
the network address, DNS server, network route, and more. When complete, you can establish the
interface by calling {@link android.net.VpnService.Builder#establish()}, which returns a {@link
android.os.ParcelFileDescriptor}.</p>

<p>Because a VPN service can intercept packets, there are security implications. As such, if you
implement {@link android.net.VpnService}, then your service must require the {@link
android.Manifest.permission#BIND_VPN_SERVICE} permission to ensure that only the system can bind to
it (only the system is granted this permission—apps cannot request it). To then use your VPN
service, users must manually enable it in the system settings.</p>


<h4>Device policies</h4>

<p>Applications that manage device restrictions can now disable the camera using {@link
android.app.admin.DevicePolicyManager#setCameraDisabled setCameraDisabled()} and the {@link
android.app.admin.DeviceAdminInfo#USES_POLICY_DISABLE_CAMERA} property (applied with a {@code
&lt;disable-camera /&gt;} element in the policy configuration file).</p>


<h4>Certificate management</h4>

<p>The new {@link android.security.KeyChain} class provides APIs that allow you to import and access
certificates in the system key store. Certificates streamline the installation of both client
certificates (to validate the identity of the user) and certificate authority certificates (to
verify server identity). Applications such as web browsers or email clients can access the installed
certificates to authenticate users to servers.
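</p>

<p>For example, an app might prompt the user to choose an installed client certificate like this
(a sketch; the key type, host, and port are illustrative):</p>

<pre>
KeyChain.choosePrivateKeyAlias(this, new KeyChainAliasCallback() {
    public void alias(String alias) {
        // Invoked with the chosen alias, or null if the user
        // cancelled; use it to retrieve the key, e.g. with
        // KeyChain.getPrivateKey(context, alias), off the main thread.
    }
}, new String[] { "RSA" }, null, "server.example.com", 443, null);
</pre>

<p>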
See the {@link android.security.KeyChain}
documentation for more information.</p>







<h3 id="Sensors">Device Sensors</h3>

<p>Two new sensor types have been added in Android 4.0:</p>

<ul>
  <li>{@link android.hardware.Sensor#TYPE_AMBIENT_TEMPERATURE}: A temperature sensor that provides
the ambient (room) temperature in degrees Celsius.</li>
  <li>{@link android.hardware.Sensor#TYPE_RELATIVE_HUMIDITY}: A humidity sensor that provides the
relative ambient (room) humidity as a percentage.</li>
</ul>

<p>If a device has both {@link android.hardware.Sensor#TYPE_AMBIENT_TEMPERATURE} and {@link
android.hardware.Sensor#TYPE_RELATIVE_HUMIDITY} sensors, you can use them to calculate the dew point
and the absolute humidity.</p>

<p>The previous temperature sensor, {@link android.hardware.Sensor#TYPE_TEMPERATURE}, has been
deprecated. You should use the {@link android.hardware.Sensor#TYPE_AMBIENT_TEMPERATURE} sensor
instead.</p>

<p>Additionally, Android’s three synthetic sensors have been greatly improved so they now have lower
latency and smoother output. These sensors include the gravity sensor ({@link
android.hardware.Sensor#TYPE_GRAVITY}), rotation vector sensor ({@link
android.hardware.Sensor#TYPE_ROTATION_VECTOR}), and linear acceleration sensor ({@link
android.hardware.Sensor#TYPE_LINEAR_ACCELERATION}). The improved sensors rely on the gyroscope
sensor to improve their output, so the sensors appear only on devices that have a gyroscope.</p>





<h3 id="ActionBar">Action Bar</h3>

<p>The {@link android.app.ActionBar} has been updated to support several new behaviors. Most
importantly, the system gracefully manages the action bar’s size and configuration when running on
smaller screens in order to provide an optimal user experience on all screen sizes.
For example,
when the screen is narrow (such as when a handset is in portrait orientation), the action bar’s
navigation tabs appear in a "stacked bar," which appears directly below the main action bar. You can
also opt in to a "split action bar," which places all action items in a separate bar at the bottom
of the screen when the screen is narrow.</p>


<h4>Split action bar</h4>

<p>If your action bar includes several action items, not all of them will fit into the action bar on
a narrow screen, so the system will place more of them into the overflow menu. However, Android 4.0
allows you to enable a "split action bar" so that more action items can appear on the screen in a
separate bar at the bottom of the screen. To enable the split action bar, add {@link
android.R.attr#uiOptions android:uiOptions} with {@code "splitActionBarWhenNarrow"} to either your
<a href="{@docRoot}guide/topics/manifest/application-element.html">{@code &lt;application&gt;}</a>
tag or
individual <a href="{@docRoot}guide/topics/manifest/activity-element.html">{@code
&lt;activity&gt;}</a> tags
in your manifest file. When enabled, the system adds an additional bar at the bottom of the
screen for all action items when the screen is narrow (no action items appear in the primary
action bar).</p>

<p>If you want to use the navigation tabs provided by the {@link android.app.ActionBar.Tab} APIs,
but don’t need the main action bar on top (you want only the tabs to appear at the top), then enable
the split action bar as described above and also call {@link
android.app.ActionBar#setDisplayShowHomeEnabled setDisplayShowHomeEnabled(false)} to disable the
application icon in the action bar.
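</p>

<p>For example, a tabs-only configuration might be set up like this in {@link
android.app.Activity#onCreate onCreate()} (assuming {@code "splitActionBarWhenNarrow"} is enabled
in the manifest, and the tab listener is defined elsewhere):</p>

<pre>
ActionBar bar = getActionBar();
bar.setNavigationMode(ActionBar.NAVIGATION_MODE_TABS);
// Remove the application icon from the main action bar.
bar.setDisplayShowHomeEnabled(false);
bar.addTab(bar.newTab().setText("Tab 1").setTabListener(tabListener));
</pre>

<p>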
With nothing left in the main action bar, it disappears&mdash;all that’s left are the navigation tabs at the top and the action items at the bottom of the screen.</p>


<h4>Action bar styles</h4>

<p>If you want to apply custom styling to the action bar, you can use the new style properties {@link android.R.attr#backgroundStacked} and {@link android.R.attr#backgroundSplit} to apply a background drawable or color to the stacked bar and split bar, respectively. You can also set these styles at runtime with {@link android.app.ActionBar#setStackedBackgroundDrawable setStackedBackgroundDrawable()} and {@link android.app.ActionBar#setSplitBackgroundDrawable setSplitBackgroundDrawable()}.</p>


<h4>Action provider</h4>

<p>The new {@link android.view.ActionProvider} class allows you to create a specialized handler for action items. An action provider can define an action view, a default action behavior, and a submenu for each action item with which it is associated. When you want to create an action item that has dynamic behaviors (such as a variable action view, default action, or submenu), extending {@link android.view.ActionProvider} is a good way to build a reusable component, rather than handling the various action item transformations in your fragment or activity.</p>

<p>For example, the {@link android.widget.ShareActionProvider} is an extension of {@link android.view.ActionProvider} that facilitates a "share" action from the action bar. Instead of using a traditional action item that invokes the {@link android.content.Intent#ACTION_SEND} intent, you can use this action provider to present an action view with a drop-down list of applications that handle the {@link android.content.Intent#ACTION_SEND} intent.
When the user selects an application to use for the action, {@link android.widget.ShareActionProvider} remembers that selection and provides it in the action view for faster access to sharing with that app.</p>

<p>To declare an action provider for an action item, include the {@code android:actionProviderClass} attribute in the <a href="{@docRoot}guide/topics/resources/menu-resource.html#item-element">{@code <item>}</a> element for your activity’s options menu, with the class name of the action provider as the value. For example:</p>

<pre>
<item android:id="@+id/menu_share"
      android:title="Share"
      android:showAsAction="ifRoom"
      android:actionProviderClass="android.widget.ShareActionProvider" />
</pre>

<p>In your activity’s {@link android.app.Activity#onCreateOptionsMenu onCreateOptionsMenu()} callback method, retrieve an instance of the action provider from the menu item and set the intent:</p>

<pre>
public boolean onCreateOptionsMenu(Menu menu) {
    getMenuInflater().inflate(R.menu.options, menu);
    ShareActionProvider shareActionProvider =
            (ShareActionProvider) menu.findItem(R.id.menu_share).getActionProvider();
    // Set the share intent of the share action provider.
    shareActionProvider.setShareIntent(createShareIntent());
    ...
    return super.onCreateOptionsMenu(menu);
}
</pre>

<p>For an example using the {@link android.widget.ShareActionProvider}, see <a href="{@docRoot}resources/samples/ApiDemos/src/com/example/android/apis/app/ActionBarShareActionProviderActivity.html">ActionBarShareActionProviderActivity</a> in ApiDemos.</p>


<h4>Collapsible action views</h4>

<p>Action items that provide an action view can now toggle between their action view state and traditional action item state.
Previously only the {@link android.widget.SearchView} supported collapsing when used as an action view, but now you can add an action view for any action item and switch between the expanded state (action view is visible) and collapsed state (action item is visible).</p>

<p>To make an action item that contains an action view collapsible, include the {@code "collapseActionView"} flag in the {@code android:showAsAction} attribute for the <a href="{@docRoot}guide/topics/resources/menu-resource.html#item-element">{@code <item>}</a> element in the menu’s XML file.</p>

<p>To receive callbacks when an action view switches between expanded and collapsed, register an instance of {@link android.view.MenuItem.OnActionExpandListener} with the respective {@link android.view.MenuItem} by calling {@link android.view.MenuItem#setOnActionExpandListener setOnActionExpandListener()}. Typically, you should do so during the {@link android.app.Activity#onCreateOptionsMenu onCreateOptionsMenu()} callback.</p>

<p>To control a collapsible action view, you can call {@link android.view.MenuItem#collapseActionView()} and {@link android.view.MenuItem#expandActionView()} on the respective {@link android.view.MenuItem}.</p>

<p>When creating a custom action view, you can also implement the new {@link android.view.CollapsibleActionView} interface to receive callbacks when the view is expanded and collapsed.</p>


<h4>Other APIs for action bar</h4>
<ul>
<li>{@link android.app.ActionBar#setHomeButtonEnabled setHomeButtonEnabled()} allows you to specify whether the icon/logo behaves as a button to navigate home or "up" (pass "true" to make it behave as a button).</li>

<li>{@link android.app.ActionBar#setIcon setIcon()} and {@link android.app.ActionBar#setLogo setLogo()} allow you to define the action bar icon or logo at runtime.</li>

<li>{@link
android.app.Fragment#setMenuVisibility Fragment.setMenuVisibility()} allows you to enable or disable the visibility of the options menu items declared by the fragment. This is useful if the fragment has been added to the activity but is not visible, so its menu items should be hidden.</li>

<li>{@link android.app.FragmentManager#invalidateOptionsMenu FragmentManager.invalidateOptionsMenu()} allows you to invalidate the activity options menu during various states of the fragment lifecycle in which the equivalent method from {@link android.app.Activity} might not be available.</li>
</ul>



<h3 id="UI">User Interface and Views</h3>

<p>Android 4.0 introduces a variety of new views and other UI components.</p>


<h4>GridLayout</h4>

<p>{@link android.widget.GridLayout} is a new view group that places child views in a rectangular grid. Unlike {@link android.widget.TableLayout}, {@link android.widget.GridLayout} relies on a flat hierarchy and does not use intermediate views, such as table rows, to provide structure. Instead, children specify which row(s) and column(s) they should occupy (cells can span multiple rows and/or columns), and by default are laid out sequentially across the grid’s rows and columns. The {@link android.widget.GridLayout} orientation determines whether sequential children are laid out horizontally or vertically by default.
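</p>

<p>As a sketch of how children declare their cells, the following layout places a title that spans both columns above a label and a field; the attribute values are illustrative:</p>

```xml
<GridLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:columnCount="2">

    <!-- Spans both columns of the grid. -->
    <TextView
        android:layout_columnSpan="2"
        android:text="Title" />

    <!-- These children are placed sequentially in the next row. -->
    <TextView android:text="Label" />
    <EditText android:layout_gravity="fill_horizontal" />
</GridLayout>
```

<p>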
Space between children may be specified either by using instances of the new {@link android.widget.Space} view or by setting the relevant margin parameters on children.</p>

<p>See <a href="{@docRoot}resources/samples/ApiDemos/src/com/example/android/apis/view/index.html">ApiDemos</a> for samples using {@link android.widget.GridLayout}.</p>


<h4>TextureView</h4>

<p>{@link android.view.TextureView} is a new view that allows you to display a content stream, such as a video or an OpenGL scene. Although similar to {@link android.view.SurfaceView}, {@link android.view.TextureView} is unique in that it behaves like a regular view, rather than creating a separate window, so you can treat it like any other {@link android.view.View} object. For example, you can apply transforms, animate it using {@link android.view.ViewPropertyAnimator}, or adjust its opacity with {@link android.view.View#setAlpha setAlpha()}.</p>

<p>Be aware that {@link android.view.TextureView} works only within a hardware-accelerated window.</p>

<p>For more information, see the {@link android.view.TextureView} documentation.</p>


<h4>Switch widget</h4>

<p>The new {@link android.widget.Switch} widget is a two-state toggle that users can drag to one side or the other (or simply tap) to toggle an option between two states.</p>

<p>You can use the {@code android:textOn} and {@code android:textOff} attributes to specify the text that appears on the switch in the on and off positions.
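</p>

<p>A minimal layout sketch using these attributes (the ID and label text are illustrative):</p>

```xml
<Switch
    android:id="@+id/wifi_switch"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:textOn="On"
    android:textOff="Off" />
```

<p>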
The {@code android:text} attribute also allows you to place a label alongside the switch.</p>

<p>For a sample using switches, see the <a href="{@docRoot}resources/samples/ApiDemos/res/layout/switches.html">switches.xml</a> layout file and the corresponding <a href="{@docRoot}resources/samples/ApiDemos/src/com/example/android/apis/view/Switches.html">Switches</a> activity.</p>


<h4>Popup menus</h4>

<p>Android 3.0 introduced {@link android.widget.PopupMenu} to create short contextual menus that pop up at an anchor point you specify (usually at the point of the item selected). Android 4.0 extends {@link android.widget.PopupMenu} with a couple of useful features:</p>
<ul>
<li>You can now easily inflate the contents of a popup menu from an XML <a href="{@docRoot}guide/topics/resources/menu-resource.html">menu resource</a> with {@link android.widget.PopupMenu#inflate inflate()}, passing it the menu resource ID.</li>
<li>You can also now create a {@link android.widget.PopupMenu.OnDismissListener} that receives a callback when the menu is dismissed.</li>
</ul>


<h4>Preferences</h4>

<p>A new {@link android.preference.TwoStatePreference} abstract class serves as the basis for preferences that provide a two-state selection option. The new {@link android.preference.SwitchPreference} is an extension of {@link android.preference.TwoStatePreference} that provides a {@link android.widget.Switch} widget in the preference view, allowing users to toggle a setting on or off without opening an additional preference screen or dialog.
For example, the Settings application uses a {@link android.preference.SwitchPreference} for the Wi-Fi and Bluetooth settings.</p>


<h4>System themes</h4>

<p>The default theme for all applications that target Android 4.0 (by setting either <a href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#target">{@code targetSdkVersion}</a> or <a href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#min">{@code minSdkVersion}</a> to {@code "14"} or higher) is now the "device default" theme: {@link android.R.style#Theme_DeviceDefault Theme.DeviceDefault}. This may be the dark Holo theme or a different dark theme defined by the specific device.</p>

<p>The {@link android.R.style#Theme_Holo Theme.Holo} family of themes is guaranteed not to change from one device to another when running the same version of Android. If you explicitly apply any of the {@link android.R.style#Theme_Holo Theme.Holo} themes to your activities, you can rest assured that these themes will not change character on different devices within the same platform version.</p>

<p>If you want your app to blend in with the overall device theme (such as when different OEMs provide different default themes for the system), you should explicitly apply themes from the {@link android.R.style#Theme_DeviceDefault Theme.DeviceDefault} family.</p>


<h4>Options menu button</h4>

<p>Beginning with Android 4.0, you'll notice that handsets no longer require a Menu hardware button. However, there's no need to worry about this if your existing application provides an <a href="{@docRoot}guide/topics/ui/menus.html#options-menu">options menu</a> and expects there to be a Menu button.
To ensure that existing apps continue to work as they expect, the system provides an on-screen Menu button for apps that were designed for older versions of Android.</p>

<p>For the best user experience, new and updated apps should instead use the {@link android.app.ActionBar} to provide access to menu items and set <a href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#target">{@code targetSdkVersion}</a> to {@code "14"} to take advantage of the latest framework default behaviors.</p>



<h4 id="SystemUI">Controls for system UI visibility</h4>

<p>Since the early days of Android, the system has managed a UI component known as the <em>status bar</em>, which resides at the top of handset devices to deliver information such as the carrier signal, time, notifications, and so on. Android 3.0 added the <em>system bar</em> for tablet devices, which resides at the bottom of the screen to provide system navigation controls (Home, Back, and so forth) and also an interface for elements traditionally provided by the status bar. In Android 4.0, the system provides a new type of system UI called the <em>navigation bar</em>. You might consider the navigation bar a re-tuned version of the system bar designed for handsets&mdash;it provides navigation controls for devices that don’t have hardware counterparts for navigating the system, but it leaves out the system bar's notification UI and setting controls. As such, a device that provides the navigation bar also has the status bar at the top.</p>

<p>To this day, you can hide the status bar on handsets using the {@link android.view.WindowManager.LayoutParams#FLAG_FULLSCREEN} flag.
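</p>

<p>For reference, that legacy approach is a single window flag set from an activity; a minimal sketch:</p>

```java
// Legacy (pre-4.0) way to hide the status bar on a handset:
getWindow().addFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN);
```

<p>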
In Android 4.0, the APIs that control the system bar’s visibility have been updated to better reflect the behavior of both the system bar and navigation bar:</p>
<ul>
<li>The {@link android.view.View#SYSTEM_UI_FLAG_LOW_PROFILE} flag replaces the {@code STATUS_BAR_HIDDEN} flag. When set, this flag enables "low profile" mode for the system bar or navigation bar: navigation buttons dim, and other elements in the system bar are also hidden. Enabling this is useful for creating more immersive games without distraction from the system navigation buttons.</li>

<li>The {@link android.view.View#SYSTEM_UI_FLAG_VISIBLE} flag replaces the {@code STATUS_BAR_VISIBLE} flag to request that the system bar or navigation bar be visible.</li>

<li>{@link android.view.View#SYSTEM_UI_FLAG_HIDE_NAVIGATION} is a new flag that requests that the navigation bar hide completely. Be aware that this works only for the <em>navigation bar</em> used by some handsets (it does <strong>not</strong> hide the system bar on tablets). The navigation bar returns to view as soon as the system receives user input. As such, this mode is useful primarily for video playback or other cases in which the whole screen is needed but user input is not required.</li>
</ul>

<p>You can set each of these flags for the system bar and navigation bar by calling {@link android.view.View#setSystemUiVisibility setSystemUiVisibility()} on any view in your activity. The window manager combines (ORs together) all flags from all views in your window and applies them to the system UI as long as your window has input focus. When your window loses input focus (the user navigates away from your app, or a dialog appears), your flags cease to have effect.
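</p>

<p>A minimal sketch of applying one of these flags, using only the APIs named above (the decor view is used here for convenience; any view in the window works):</p>

```java
// Request "low profile" mode, which dims the system/navigation bar.
// Takes effect only while this window has input focus.
View decorView = getWindow().getDecorView();
decorView.setSystemUiVisibility(View.SYSTEM_UI_FLAG_LOW_PROFILE);
```

<p>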
Similarly, if you remove those views from the view hierarchy, their flags no longer apply.</p>

<p>To synchronize other events in your activity with visibility changes to the system UI (for example, to hide the action bar or other UI controls when the system UI hides), you should register a {@link android.view.View.OnSystemUiVisibilityChangeListener} to be notified when the visibility of the system bar or navigation bar changes.</p>

<p>See the <a href="{@docRoot}resources/samples/ApiDemos/src/com/example/android/apis/view/OverscanActivity.html">OverscanActivity</a> class for a demonstration of different system UI options.</p>



<h3 id="Input">Input Framework</h3>

<p>Android 4.0 adds support for cursor hover events and new stylus and mouse button events.</p>

<h4>Hover events</h4>

<p>The {@link android.view.View} class now supports "hover" events to enable richer interactions through the use of pointer devices (such as a mouse or other devices that drive an on-screen cursor).</p>

<p>To receive hover events on a view, implement the {@link android.view.View.OnHoverListener} interface and register it with {@link android.view.View#setOnHoverListener setOnHoverListener()}. When a hover event occurs on the view, your listener receives a call to {@link android.view.View.OnHoverListener#onHover onHover()}, providing the {@link android.view.View} that received the event and a {@link android.view.MotionEvent} that describes the type of hover event that occurred.
The hover event can be one of the following:</p>
<ul>
<li>{@link android.view.MotionEvent#ACTION_HOVER_ENTER}</li>
<li>{@link android.view.MotionEvent#ACTION_HOVER_EXIT}</li>
<li>{@link android.view.MotionEvent#ACTION_HOVER_MOVE}</li>
</ul>

<p>Your {@link android.view.View.OnHoverListener} should return true from {@link android.view.View.OnHoverListener#onHover onHover()} if it handles the hover event. If your listener returns false, the hover event is dispatched to the parent view as usual.</p>

<p>If your application uses buttons or other widgets that change their appearance based on the current state, you can now use the {@code android:state_hovered} attribute in a <a href="{@docRoot}guide/topics/resources/drawable-resource.html#StateList">state list drawable</a> to provide a different background drawable when a cursor hovers over the view.</p>

<p>For a demonstration of the new hover events, see the <a href="{@docRoot}resources/samples/ApiDemos/src/com/example/android/apis/view/Hover.html">Hover</a> class in ApiDemos.</p>


<h4>Stylus and mouse button events</h4>

<p>Android now provides APIs for receiving input from a stylus input device such as a digitizer tablet peripheral or a stylus-enabled touch screen.</p>

<p>Stylus input operates in a similar manner to touch or mouse input. When the stylus is in contact with the digitizer, applications receive touch events just as they would when a finger touches the display.
When the stylus hovers above the digitizer, applications receive hover events just as they would when a mouse pointer moves across the display with no buttons pressed.</p>

<p>Your application can distinguish between finger, mouse, stylus, and eraser input by querying the "tool type" associated with each pointer in a {@link android.view.MotionEvent} using {@link android.view.MotionEvent#getToolType getToolType()}. The currently defined tool types are {@link android.view.MotionEvent#TOOL_TYPE_UNKNOWN}, {@link android.view.MotionEvent#TOOL_TYPE_FINGER}, {@link android.view.MotionEvent#TOOL_TYPE_MOUSE}, {@link android.view.MotionEvent#TOOL_TYPE_STYLUS}, and {@link android.view.MotionEvent#TOOL_TYPE_ERASER}. By querying the tool type, your application can choose to handle stylus input differently from finger or mouse input.</p>

<p>Your application can also query which mouse or stylus buttons are pressed by querying the "button state" of a {@link android.view.MotionEvent} using {@link android.view.MotionEvent#getButtonState getButtonState()}. The currently defined button states are {@link android.view.MotionEvent#BUTTON_PRIMARY}, {@link android.view.MotionEvent#BUTTON_SECONDARY}, {@link android.view.MotionEvent#BUTTON_TERTIARY}, {@link android.view.MotionEvent#BUTTON_BACK}, and {@link android.view.MotionEvent#BUTTON_FORWARD}. For convenience, the back and forward mouse buttons are automatically mapped to the {@link android.view.KeyEvent#KEYCODE_BACK} and {@link android.view.KeyEvent#KEYCODE_FORWARD} keys, so your application can handle these keys to support mouse-button-based back and forward navigation.</p>

<p>In addition to precisely measuring the position and pressure of a contact, some stylus input devices also report the distance between the stylus tip and the digitizer, the stylus tilt angle, and the stylus orientation angle.
Your application can query this information using {@link android.view.MotionEvent#getAxisValue getAxisValue()} with the axis codes {@link android.view.MotionEvent#AXIS_DISTANCE}, {@link android.view.MotionEvent#AXIS_TILT}, and {@link android.view.MotionEvent#AXIS_ORIENTATION}.</p>

<p>For a demonstration of tool types, button states, and the new axis codes, see the <a href="{@docRoot}resources/samples/ApiDemos/src/com/example/android/apis/graphics/TouchPaint.html">TouchPaint</a> class in ApiDemos.</p>



<h3 id="Properties">Properties</h3>

<p>The new {@link android.util.Property} class provides a fast, efficient, and easy way to specify a property on any object, allowing callers to generically get and set values on target objects. It also makes it possible to pass around field/method references so that code can get and set values of the property without knowing the details of the underlying fields or methods.</p>

<p>For example, if you want to set the value of field {@code bar} on object {@code foo}, you would previously do this:</p>
<pre>
foo.bar = value;
</pre>

<p>If you want to call the setter for an underlying private field {@code bar}, you would previously do this:</p>
<pre>
foo.setBar(value);
</pre>

<p>However, if you want to pass around the {@code foo} instance and have some other code set the {@code bar} value, there was really no way to do it prior to Android 4.0.</p>

<p>Using the {@link android.util.Property} class, you can declare a {@link android.util.Property} object {@code BAR} on class {@code Foo} so that you can set the field on instance {@code foo} of class {@code Foo} like this:</p>
<pre>
BAR.set(foo, value);
</pre>

<p>The {@link android.view.View} class now leverages the {@link android.util.Property} class to allow you to set various fields, such as transform
properties that were added in Android 3.0 ({@link android.view.View#ROTATION}, {@link android.view.View#ROTATION_X}, {@link android.view.View#TRANSLATION_X}, and so on).</p>

<p>The {@link android.animation.ObjectAnimator} class also uses the {@link android.util.Property} class, so you can create an {@link android.animation.ObjectAnimator} with a {@link android.util.Property}, which is faster, more efficient, and more type-safe than the string-based approach.</p>



<h3 id="HwAccel">Hardware Acceleration</h3>

<p>Beginning with Android 4.0, hardware acceleration for all windows is enabled by default if your application has set either <a href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#target">{@code targetSdkVersion}</a> or <a href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#min">{@code minSdkVersion}</a> to {@code "14"} or higher. Hardware acceleration generally results in smoother animations, smoother scrolling, and overall better performance and response to user interaction.</p>

<p>If necessary, you can manually disable hardware acceleration with the <a href="{@docRoot}guide/topics/manifest/activity-element.html#hwaccel">{@code hardwareAccelerated}</a> attribute on individual <a href="{@docRoot}guide/topics/manifest/activity-element.html">{@code <activity>}</a> elements or on the <a href="{@docRoot}guide/topics/manifest/application-element.html">{@code <application>}</a> element.
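</p>

<p>For illustration, a manifest sketch that disables acceleration app-wide while re-enabling it for one activity (the activity name is hypothetical):</p>

```xml
<!-- Disable hardware acceleration by default, but re-enable it
     for one activity. GameActivity is a hypothetical name. -->
<application android:hardwareAccelerated="false">
    <activity android:name=".GameActivity"
              android:hardwareAccelerated="true" />
</application>
```

<p>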
You can alternatively disable hardware acceleration for individual views by calling {@link android.view.View#setLayerType setLayerType(LAYER_TYPE_SOFTWARE)}.</p>

<p>For more information about hardware acceleration, including a list of unsupported drawing operations, see the <a href="{@docRoot}guide/topics/graphics/hardware-accel.html">Hardware Acceleration</a> document.</p>



<h3 id="Jni">JNI Changes</h3>

<p>In previous versions of Android, JNI local references weren’t indirect handles; Android used direct pointers. This wasn't a problem as long as the garbage collector didn't move objects, but it made it possible to write buggy code that merely appeared to work. In Android 4.0, the system now uses indirect references in order to detect these bugs.</p>

<p>The ins and outs of JNI local references are described in "Local and Global References" in <a href="{@docRoot}guide/practices/jni.html">JNI Tips</a>. In Android 4.0, <a href="http://android-developers.blogspot.com/2011/07/debugging-android-jni-with-checkjni.html">CheckJNI</a> has been enhanced to detect these errors. Watch the <a href="http://android-developers.blogspot.com/">Android Developers Blog</a> for an upcoming post about common errors with JNI references and how you can fix them.</p>

<p>This change in the JNI implementation affects only apps that target Android 4.0 by setting either the <a href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#target">{@code targetSdkVersion}</a> or <a href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#min">{@code minSdkVersion}</a> to {@code "14"} or higher.
If you’ve set these attributes to any lower value, JNI local references behave the same as in previous versions.</p>



<h3 id="WebKit">WebKit</h3>
<ul>
<li>WebKit updated to version 534.30</li>
<li>Support for Indic fonts (Devanagari, Bengali, and Tamil, including the complex character support needed for combining glyphs) in {@link android.webkit.WebView} and the built-in Browser</li>
<li>Support for Ethiopic, Georgian, and Armenian fonts in {@link android.webkit.WebView} and the built-in Browser</li>
<li>Support for <a href="http://google-opensource.blogspot.com/2009/05/introducing-webdriver.html">WebDriver</a> makes it easier for you to test apps that use {@link android.webkit.WebView}</li>
</ul>


<h4>Android Browser</h4>

<p>The Browser application adds the following features to support web applications:</p>
<ul>
<li>Updated V8 JavaScript compiler for faster performance</li>
<li>Other notable enhancements carried over from <a href="{@docRoot}about/versions/android-3.0.html">Android 3.0</a> are now available for handsets:
<ul>
<li>Support for fixed-position elements on all pages</li>
<li><a href="http://dev.w3.org/2009/dap/camera/">HTML media capture</a></li>
<li><a href="http://dev.w3.org/geo/api/spec-source-orientation.html">Device orientation events</a></li>
<li><a href="http://www.w3.org/TR/css3-3d-transforms/">CSS 3D transformations</a></li>
</ul>
</li>
</ul>



<h3 id="Permissions">Permissions</h3>

<p>The following are new permissions:</p>
<ul>
<li>{@link android.Manifest.permission#ADD_VOICEMAIL}: Allows a voicemail service to add voicemail messages to the device.</li>
<li>{@link android.Manifest.permission#BIND_TEXT_SERVICE}: A service that implements {@link android.service.textservice.SpellCheckerService} must require this permission for itself.</li>
<li>{@link android.Manifest.permission#BIND_VPN_SERVICE}: A service that implements {@link android.net.VpnService} must require this permission for itself.</li>
<li>{@link android.Manifest.permission#READ_PROFILE}: Provides read access to the {@link android.provider.ContactsContract.Profile} provider.</li>
<li>{@link android.Manifest.permission#WRITE_PROFILE}: Provides write access to the {@link android.provider.ContactsContract.Profile} provider.</li>
</ul>



<h3 id="DeviceFeatures">Device Features</h3>

<p>The following are new device features:</p>
<ul>
<li>{@link android.content.pm.PackageManager#FEATURE_WIFI_DIRECT}: Declares that the application uses Wi-Fi for peer-to-peer communications.</li>
</ul>


<div class="special" style="margin-top:3em">
<p>For a detailed view of all API changes in Android {@sdkPlatformVersion} (API Level {@sdkPlatformApiLevel}), see the <a href="{@docRoot}sdk/api_diff/{@sdkPlatformApiLevel}/changes.html">API Differences Report</a>.</p>
</div>


<h2 id="Honeycomb">Previous APIs</h2>

<p>In addition to everything above, Android 4.0 naturally supports all APIs from previous releases. Because the Android 3.x platform is available only for large-screen devices, if you've been developing primarily for handsets, then you might not be aware of all the APIs added to Android in these recent releases.</p>

<p>Here's a look at some of the most notable APIs you might have missed that are now available on handsets as well:</p>

<dl>
  <dt><a href="android-3.0.html">Android 3.0</a></dt>
  <dd>
    <ul>
      <li>{@link android.app.Fragment}: A framework component that allows you to separate distinct elements of an activity into self-contained modules that define their own UI and lifecycle.
See the <a href="{@docRoot}guide/components/fragments.html">Fragments</a> developer guide.</li>
      <li>{@link android.app.ActionBar}: A replacement for the traditional title bar at the top of the activity window. It includes the application logo in the left corner and provides a new interface for menu items. See the <a href="{@docRoot}guide/topics/ui/actionbar.html">Action Bar</a> developer guide.</li>
      <li>{@link android.content.Loader}: A framework component that facilitates asynchronous loading of data in combination with UI components to dynamically load data without blocking the main thread. See the <a href="{@docRoot}guide/components/loaders.html">Loaders</a> developer guide.</li>
      <li>System clipboard: Applications can copy and paste data (beyond mere text) to and from the system-wide clipboard. Clipped data can be plain text, a URI, or an intent. See the <a href="{@docRoot}guide/topics/text/copy-paste.html">Copy and Paste</a> developer guide.</li>
      <li>Drag and drop: A set of APIs built into the view framework that facilitates drag-and-drop operations. See the <a href="{@docRoot}guide/topics/ui/drag-drop.html">Drag and Drop</a> developer guide.</li>
      <li>An all-new, flexible animation framework allows you to animate arbitrary properties of any object (View, Drawable, Fragment, Object, or anything else) and define animation aspects such as duration, interpolation, repetition, and more. The new framework makes animations in Android simpler than ever. See the <a href="{@docRoot}guide/topics/graphics/prop-animation.html">Property Animation</a> developer guide.</li>
      <li>RenderScript graphics and compute engine: RenderScript offers a high-performance 3D graphics rendering and compute API at the native level, which you write in C (C99 standard), providing the type of performance you expect from a native environment while remaining portable across various CPUs and GPUs.
See the
<a href="{@docRoot}guide/topics/renderscript/index.html">RenderScript</a> developer
guide.</li>
  <li>Hardware accelerated 2D graphics: You can now enable the OpenGL renderer for your
application by setting {@code android:hardwareAccelerated="true"} in your manifest's <a
href="{@docRoot}guide/topics/manifest/application-element.html"><code>&lt;application&gt;</code></a>
element or in individual <a
href="{@docRoot}guide/topics/manifest/activity-element.html"><code>&lt;activity&gt;</code></a>
elements. This results
in smoother animations, smoother scrolling, and overall better performance and response to user
interaction.
  <p class="note"><strong>Note:</strong> If you set your application's <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#min">{@code minSdkVersion}</a> or <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#target">{@code targetSdkVersion}</a> to
{@code "14"} or higher, hardware acceleration is enabled by default.</p></li>
  <li>And much, much more. See the <a href="android-3.0.html">Android 3.0 Platform</a>
notes for more information.</li>
  </ul>
  </dd>

  <dt><a href="android-3.1.html">Android 3.1</a></dt>
  <dd>
  <ul>
    <li>USB APIs: Powerful new APIs for integrating connected peripherals with
Android applications. The APIs are based on a USB stack and services that are
built into the platform, including support for both USB host and device interactions. See the <a
href="{@docRoot}guide/topics/connectivity/usb/index.html">USB Host and Accessory</a> developer guide.</li>
    <li>MTP/PTP APIs: Applications can interact directly with connected cameras and other PTP
devices to receive notifications when devices are attached and removed, manage files and storage on
those devices, and transfer files and metadata to and from them.
The MTP API implements the PTP
(Picture Transfer Protocol) subset of the MTP (Media Transfer Protocol) specification. See the
{@link android.mtp} documentation.</li>
    <li>RTP APIs: Android exposes an API to its built-in RTP (Real-time Transport Protocol) stack,
which applications can use to manage on-demand or interactive data streaming. In particular, apps
that provide VOIP, push-to-talk, conferencing, and audio streaming can use the API to initiate
sessions and transmit or receive data streams over any available network. See the {@link
android.net.rtp} documentation.</li>
    <li>Support for joysticks and other generic motion inputs.</li>
    <li>See the <a href="android-3.1.html">Android 3.1 Platform</a>
notes for many more new APIs.</li>
  </ul>
  </dd>

  <dt><a href="android-3.2.html">Android 3.2</a></dt>
  <dd>
  <ul>
    <li>New screens support APIs that give you more control over how your applications are
displayed across different screen sizes. The API extends the existing screen support model with the
ability to precisely target specific screen size ranges by dimensions, measured in
density-independent pixel units (such as 600dp or 720dp wide), rather than by their generalized
screen sizes (such as large or xlarge). For example, this is important in order to help you
distinguish between a 5" device and a 7" device, which would both traditionally be bucketed as
"large" screens. See the blog post <a
href="http://android-developers.blogspot.com/2011/07/new-tools-for-managing-screen-sizes.html">
New Tools for Managing Screen Sizes</a>.</li>
    <li>New constants for <a
href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code <uses-feature>}</a> to
declare landscape or portrait screen orientation requirements.</li>
    <li>The device "screen size" configuration now changes during a screen orientation
change.
If your app targets API level 13 or higher, you must handle the {@code "screenSize"}
configuration change if you also want to handle the {@code "orientation"} configuration change. See
<a href="{@docRoot}guide/topics/manifest/activity-element.html#config">{@code
android:configChanges}</a> for more information.</li>
    <li>See the <a href="android-3.2.html">Android 3.2 Platform</a>
notes for other new APIs.</li>
  </ul>
  </dd>

</dl>




<h3 id="api-level">API Level</h3>

<p>The Android {@sdkPlatformVersion} API is assigned an integer
identifier—<strong>{@sdkPlatformApiLevel}</strong>—that is stored in the system itself.
This identifier, called the "API level", allows the system to correctly determine whether an
application is compatible with the system, prior to installing the application.</p>

<p>To use APIs introduced in Android {@sdkPlatformVersion} in your application, you need to compile the
application against an Android platform that supports API level {@sdkPlatformApiLevel} or
higher. Depending on your needs, you might also need to add an
<code>android:minSdkVersion="{@sdkPlatformApiLevel}"</code> attribute to the
<a href="{@docRoot}guide/topics/manifest/uses-sdk-element.html">{@code <uses-sdk>}</a>
element.</p>

<p>For more information, read <a href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#ApiLevels">What is API
Level?</a></p>
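<p>For example, an application that requires the Android 4.0 APIs might declare the following
in its manifest. This is a minimal sketch: the package name <code>com.example.app</code> is a
placeholder, and your manifest will contain other elements as well.</p>

```xml
<!-- Minimal sketch of a manifest requiring Android 4.0 (API level 14).
     The package name is a placeholder. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.app">
    <!-- Prevents installation on devices running API level 13 or lower -->
    <uses-sdk android:minSdkVersion="14" />
</manifest>
```

<p>Declaring {@code android:minSdkVersion="14"} lets you call Android 4.0 APIs without runtime
version checks, because Google Play and the system will not install the app on older devices.</p>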