History log of /frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
Revision Date Author Comments
3d1c5a7236c4709550ca7c0cfa293fc5c974c56b 10-Oct-2013 Alan Viverette <alanv@google.com> Ensure accessibility node cache is synced with service state

BUG: 11152210
Change-Id: Ibffd2909b6b06568de9344e536a200d8a7abac9d
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
38c992841b5f6bc80359dbf60d31aa7b994160fc 03-Sep-2013 Svetoslav Ganov <svetoslavganov@google.com> Crashes in TouchExplorer on two finger swipe.

1. The logic for finding the active pointer was incorrect. The code was
iterating over all pointer ids and taking the one with the minimum down
time, i.e. the pointer that went down first. The problem was that the
down time of pointers that are not down (set to zero) was also considered,
so sometimes the pointer chosen as the first one down was not down at all.
Now we iterate only over the pointers that are currently down (a sketch
follows below).

2. The batched events accumulated while waiting to see whether the user is
exploring or gesturing were added even if we were already in touch exploration
state, at which point we do not have to batch. As a result we ended up with
leftovers from a previous gesture when handling the delayed events, which
caused a crash.
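
A minimal sketch of the corrected active-pointer search from item 1, assuming a
tracker that exposes isPointerDown(id) and getPointerDownTime(id) (illustrative
names, not necessarily the actual TouchExplorer helpers):

    // Only pointers that are currently down are considered, so the zero down
    // time of an absent pointer can no longer win the "earliest down" search.
    static int findActivePointerId(PointerTracker tracker, int pointerCount) {
        int activePointerId = -1; // no active pointer
        long earliestDownTime = Long.MAX_VALUE;
        for (int pointerId = 0; pointerId < pointerCount; pointerId++) {
            if (!tracker.isPointerDown(pointerId)) {
                continue; // skip pointers that are not down (stale/zero down time)
            }
            final long downTime = tracker.getPointerDownTime(pointerId);
            if (downTime < earliestDownTime) {
                earliestDownTime = downTime;
                activePointerId = pointerId;
            }
        }
        return activePointerId;
    }

    interface PointerTracker {
        boolean isPointerDown(int pointerId);
        long getPointerDownTime(int pointerId);
    }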

bug:10312546

Change-Id: I4728541ac12e4da4577d22e4314101dd169a52fb
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
84044b3ce737487b6e5bb1f6618d151c8659c2a1 09-Aug-2013 Svetoslav <svetoslavganov@google.com> Some hygiene for the touch explorer.

1. Removed the inactive pointer filtering, which was not reporting pointers
to the apps if they did not travel a minimal distance. That filtering
prohibited development of apps with innovative interaction models, such as
using the screen as a virtual Braille keyboard.

2. We need the first pointer to travel some distance, or a minimal amount of
time to pass, before deciding whether the user is exploring or performing a
gesture. During this period we were dropping events, which was preventing
innovative interfaces such as gesture-based typing since we were chopping off
a significant portion of the data.

Change-Id: I5c1aa98d14c83f356a9c59c93f4dc1f970c0faca
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
c4842c11932ea4f60fe7ae09b0a59660207e1587 31-Oct-2012 Svetoslav Ganov <svetoslavganov@google.com> Accessibility support for the lockscreen - phone.

Change-Id: Idc99f1322a1d635dd07e1f5efa1665a4676267c2
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
6ae8a24fc045bc7970f2843fa9baf06aff15e22d 10-Oct-2012 Svetoslav Ganov <svetoslavganov@google.com> The active window for accessibility purposes can be miscomputed.

1. The active window is the one that the user touches or the one
that has input focus. We recognize that the user is touching a window
by the received accessibility hover events, and that the user is not
touching the screen by a call from the touch explorer. It is
possible that the user touches a window that does not have
input focus; as soon as he lifts his finger the active window
should become the one that has input focus, but we then receive
the hover accessibility events from the touched window, which
incorrectly changes the active window back to the touched one.
Note that at this point the user is not touching the screen.

bug:7298484

Change-Id: Ife035a798a6e68133f9220eeeabdfcd35a431b56
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
f772cba59760d1ad9eb5cb9205b2e2e9126e488d 06-Oct-2012 Svetoslav Ganov <svetoslavganov@google.com> Accessibility active window not updated on time.

1. The active window is the one the user is touching or the one
that has input focus. It has to be made current immediately
after the user has stopped touching the screen because if the
user types with the IME he should get feedback for the
letter typed in the text view, which is in the input-focused
window. Note that we always deliver hover accessibility events
(they are a result of the user touching the screen), so changing
the active window before all hover accessibility events from
the touched window are delivered is fine.

bug:7296890

Change-Id: I1ae87c8419e2f19bd8eb68de084c7117c66894bc
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
d367b70c4ad5d8e6cdbcc0d6d429428413cd39b3 04-Oct-2012 Svetoslav Ganov <svetoslavganov@google.com> Merge "Accessibility HOVER_ENTER / EXIT without enclosing EXPLORATION_GESTURE_START / END" into jb-mr1-dev
f068fed6c4c3fc2003aec19b6e7e892358179b02 04-Oct-2012 Svetoslav Ganov <svetoslavganov@google.com> Accessibility HOVER_ENTER / EXIT without enclosing EXPLORATION_GESTURE_START / END

1. The initial implementation was not sending the gesture start and end
events until the user had moved more than a given slop and no faster
than a given velocity. However, if the user does not move, or just taps
on the screen, an exploration also occurs. The system was not sending the
exploration start and end events for the latter case.

2. The delayed command for long press was not canceled when the pointer
moved more than the slop distance.

bug:7282811

Change-Id: I7d98470cd4d9ea9b2519326e5e550ff68b040747
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
ec33d56300aa417efb4a055786d73d1bf23a6a85 04-Oct-2012 Svetoslav Ganov <svetoslavganov@google.com> Exception in the touch explorer when dragging.

1. During a drag in touch exploration we have two pointers moving in the same
direction but inject only one of them. If the dragging pointer goes up we
send an up to the view system and wait for all pointers to go up before
transitioning to the touch exploring state. At this point the dragging
pointer id is cleared, and if a new pointer goes down we try to send an up
(rather than do nothing) for the dragging pointer, which we already did;
due to the invalid pointer id we then get an exception when splitting the
motion event.

bug:7282053

Change-Id: I690bf8bdf6e2e5851ee46a322c4a1bb7d484b53a
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
aeb8d0ed0d98d398a66a092c418f4f2bca8719e0 02-Oct-2012 Svetoslav Ganov <svetoslavganov@google.com> Up motion event not injected by the touch explorer at the end of a drag.

1. The up event was not injected when the last pointer went up, i.e.
at the end of the drag. This patch sends an up event when the dragging
pointer goes up, in both cases: when the dragging pointer goes up
first and when it goes up second.

bug:7272830

Change-Id: I708a2b93ee2d0a4c46dbeea002841666e919602d
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
45af84a483165f06c04d74baba67f90da29c6ad2 02-Oct-2012 Svetoslav Ganov <svetoslavganov@google.com> Touch explorer and magnifier do not work well together.

1. If touch exploration and screen magnification are enabled and the screen
is currently magnified, gesture detection does not work well. The reason
is that, when the screen is magnified, we transform the events before
passing them to the touch explorer to compensate for the magnification so
the user can poke what he thinks he is poking. However, when doing gesture
detection and velocity computation this compensation shrinks the gestured
shape and decreases velocity, leading to poor gesture recognition and
incorrect velocity.

This change adds an onRawMotionEvent method to the event transformation chain
which processes the raw touch events. In this method the touch explorer
passes events to the gesture recognizer and the velocity tracker (a sketch
follows below).

2. The velocity tracker was not cleared on transitions out of the touch
exploring state, which is the only state that uses velocity.
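
A minimal sketch of the raw-event routing, assuming onRawMotionEvent is the
hook named in this change while the surrounding class and helper names are
illustrative; velocity is computed on the untransformed (pre-magnification)
event stream while the compensated events continue to drive touch exploration:

    import android.view.MotionEvent;
    import android.view.VelocityTracker;

    class RawEventVelocityTracker {
        private final VelocityTracker mVelocityTracker = VelocityTracker.obtain();

        // Fed with the raw, untransformed event stream.
        public void onRawMotionEvent(MotionEvent rawEvent) {
            mVelocityTracker.addMovement(rawEvent);
        }

        // Velocity of the given pointer in pixels per second, in raw screen space.
        public float getXVelocity(int pointerId) {
            mVelocityTracker.computeCurrentVelocity(1000);
            return mVelocityTracker.getXVelocity(pointerId);
        }

        // Item 2: clear the tracker when transitioning out of touch exploring state.
        public void clear() {
            mVelocityTracker.clear();
        }
    }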

bug:7266617

Change-Id: I7887fe5f3c3bb6cfa203b7866a145c7341098a02
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
46824214bbe75d7e7e50cb15e3293c703d597a5f 29-Sep-2012 Svetoslav Ganov <svetoslavganov@google.com> Sending interaction end event at the end of a drag.

1. In explore-by-touch, when the user slides two fingers in the same
direction we consider it a drag gesture. We merge the pointers into
one and deliver a touch event (a sketch of the merging follows below).
When one of the pointers went up we were transitioning into the touch
exploring state. This means that we were transitioning to another state
in the middle of a gesture, which creates complications and leads to the
interaction end event not being sent.

This change transitions out of the dragging state only when all pointers
go up - simple, and all events are properly sent. Consequently, after
starting a drag the user has to lift all pointers to touch explore. Since
users usually either drag or touch explore, this seems the simplest and
*least risky* fix.
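
One plausible way to realize the "merge the pointers into one" idea from item 1
is to synthesize a single-pointer event at the midpoint of the two dragging
fingers; the log does not say which coordinates the explorer actually uses, so
the class and method below are purely illustrative:

    import android.view.MotionEvent;

    final class DragEventSynthesizer {
        // The caller picks ACTION_DOWN / ACTION_MOVE / ACTION_UP for the merged stream.
        static MotionEvent mergeDraggingPointers(MotionEvent event, int action,
                int firstPointerIndex, int secondPointerIndex) {
            final float mergedX = (event.getX(firstPointerIndex) + event.getX(secondPointerIndex)) / 2f;
            final float mergedY = (event.getY(firstPointerIndex) + event.getY(secondPointerIndex)) / 2f;
            return MotionEvent.obtain(event.getDownTime(), event.getEventTime(),
                    action, mergedX, mergedY, event.getMetaState());
        }
    }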

bug:7253731

Change-Id: Ie8588fbe9b26cb81312bd7fd377c94732e41e3f8
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
fe304b893968887323b93764caafa66ee8ad44de 28-Sep-2012 Svetoslav Ganov <svetoslavganov@google.com> Some accessibility events not sent from touch explorer if apps misbehave.

1. The touch explorer is relying on the hover exit accessibility event to be sent
from the app's view tree before sending the exploration end and last touch
accessibility events. However, if the app is buggy and does not send the hover
exit event, then the interaction ending events are never sent. Now there is a
timeout in which we wait for the hover exit accessibility event before sending
the gesture end and last touch accessibility events. Hence, we are making a
best effort to have a consistent event stream.

2. Sneaking in the new nine patch for the border around the magnified region
since the current one is engineering art.

bug:7233616

Change-Id: Ie64f23659c25ab914565d50537b9a82bdc6a44a0
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
aed4b6f812674bc60a04470013ca449e5c114fa5 28-Sep-2012 Svetoslav Ganov <svetoslavganov@google.com> Inconsistent events on transition from gesture detection to touch exploration.

1. The problem is that we have a gesture detection timeout after which we transition
to the touch exploration state. This handles the case where the user is moving with
too high a velocity while trying to touch explore. The delayed command that transitions
from the gesture detection state to the touch exploration state was not firing the
events for the end of gesture detection and the beginning of touch exploration before
doing its main work of transitioning to the touch exploring state.

bug:7233819

Change-Id: I5c4855231aa3826dadbee324e74a3c9e52c96cd9
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
76c0dd48279531cb31e2a284a270c535664cbf81 25-Sep-2012 Svetoslav Ganov <svetoslavganov@google.com> The active window for accessibility incorrectly tracked.

1. The active window for accessibility purposes is either the
window the user is touching or the window that has input focus. We
were using the touch exploration gesture end event to figure out
when the user stops touching the screen so we can set the active
window to the input-focused one. However, we do not send such a
gesture end event if the user does not touch explore; if the user only
taps, we do not consider this touch exploring. We now have dedicated
accessibility events for the first and last touch, and this change uses
them as a guide for when to update the active window.

bug:6523219

Change-Id: I6262c0c5f408b02dbaa127664e4b426935d7f81f
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
03e7b8881599da69207a93a2bcbbe5050efb6633 25-Sep-2012 Svetoslav Ganov <svetoslavganov@google.com> More than one finger at a time can trigger a system crash.

1. The crash was happening if: two active pointers are performing a drag;
there are some inactive pointers down; the main dragging pointer (we
merge the dragging pointers into one) goes up; now an inactive pointer
goes up and the explorer tries to inject an up for the dragging pointer,
which is no longer in the event, resulting in a crash. Basically two
problems: 1) inactive pointers were not ignored; 2) when the last active
pointer goes up we should not only send the up event but also transition
the explorer into the touch exploring state.

bug:6874128

Change-Id: I341fc360ebc074fe3919d5ba3b98ee5cb08dd71e
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
58d37b55bd228032355360ea3303e46a804e0516 18-Sep-2012 Svetoslav Ganov <svetoslavganov@google.com> Multi-user support for the accessibility layer.

1. This change converts the accessibility manager service to
maintain a state per user. When the user changes, the services
for the user that is going away are disconnected, the local
accessibility managers in the processes of this user are
disabled, the state is swapped with the new user's one, and
the new user's state is refreshed.

This change updates all calls into the system to use their
user-specific versions when applicable. For example, registering
content observers and package monitors, making calls into other system
services, etc.

There are some components that are shared across users such
as UI created by the system process and the SystemUI package.
Such components are managed as a global state shared across
all users and are updated accordingly on a user switch. Since
the SystemUI is running in a normal app process this change
adds hidden APIs on the local window manager to allow the
SystemUI to notify the accessibility layer that it will run
across users.

Calls to AccessibilityManager's isEnabled(), isTouchExplorationEnabled()
and sendAccessibilityEvent() return false or are a no-op for a
background user since it should not send accessibility events
and should not perform touch exploration.

Update the internal accessibility tests due to changes in the
AccessibilityManager.

This change also fixes several issues that were encountered
such as calling out of the accessibility manager service with a
lock held.

Removed some incorrect debugging code from the TouchExplorer
that was leading to a system crash.

bug:6967373

Change-Id: I2cf32ffdee1d827a8197ae4ce717dc0ff798b259
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
8b681cb8813454aac8a626bf3d7adaa8beca4d75 15-Sep-2012 Svetoslav Ganov <svetoslavganov@google.com> Some formatting missed in the previous patch

Change-Id: I299090ca67b1d90cf75a46dc85b13970d32511e5
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
77276b60851a158ad3e142cb3b091d57ae5ceffb 14-Sep-2012 Svetoslav Ganov <svetoslavganov@google.com> Adding accessibility events for touch and gesture detection states.

1. Currently the system fires accessibility events to announce the
start and end of a touch exploration gesture. However, such a
gesture starts after we have decided that the user is not
performing a gesture, which is determined by measuring the speed of
movement over a threshold distance. This allows an accessibility
service to provide some feedback to the user so he knows that
he is touch exploring.

This change adds event types for the first and last touches
of the user. Note that the first touch does not coincide with
the start of a touch exploration gesture since some time
or distance has to pass before we know whether the user is exploring
or gesturing. However, it is very useful for an accessibility
service to know when the user starts to interact with the
touch screen so it can turn speech off, to name one
compelling use case.

This change also provides event types for the start and end
of gesture detection. If the user has moved over the threshold
distance with a speed greater than X, then the system detects gestures.
It is useful for an accessibility service to know the beginning
and end of gesture detection so it can provide a given feedback
type for such a gesture; say, it may produce haptic feedback
or a sound that differs from the one for touch exploration.

The main benefit of announcing these new events is that an
accessibility service can provide feedback for each touch
state allowing the user to always know what he is doing.

bug:7166935

Change-Id: I26270d774cc059cb921d6a4254bc0aab0530c1dd
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
1cf70bbf96930662cab0e699d70b62865766ff52 06-Aug-2012 Svetoslav Ganov <svetoslavganov@google.com> Screen magnification - feature - framework.

This change is the initial check in of the screen magnification
feature. This feature enables magnification of the screen via
global gestures (assuming it has been enabled from settings)
to allow a low vision user to efficiently use an Android device.

Interaction model:

1. Triple tap toggles permanent screen magnification, magnifying the
area around the location of the triple tap. One can think of the
location of the triple tap as the center of the magnified viewport.
For example, a triple tap when not magnified would magnify the screen
and leave it in a magnified state. Triple tapping when magnified would
clear magnification and leave the screen in a not magnified state.

2. Triple tap and hold would magnify the screen if not magnified and enable
viewport dragging mode until the finger goes up. One can think of this
mode as a way to move the magnified viewport since the area around the
moving finger will be magnified to fit the screen. For example, if the
screen was not magnified and the user triple taps and holds, the screen
would magnify and the viewport will follow the user's finger. When the
finger goes up the screen will zoom back out. If the same user interaction
is performed when the screen is magnified, the viewport movement will
be the same but when the finger goes up the screen will stay magnified.
In other words, the initial magnified state is sticky.

3. Pinching with any number of additional fingers when viewport dragging
is enabled, i.e. the user triple tapped and holds, would adjust the
magnification scale which will become the current default magnification
scale. The next time the user magnifies the same magnification scale
would be used.

4. When in a permanent magnified state the user can use two or more fingers
to pan the viewport. Note that in this mode the content is panned as
opposed to the viewport dragging mode in which the viewport is moved.

5. When in a permanent magnified state the user can use three or more
fingers to change the magnification scale which will become the current
default magnification scale. The next time the user magnifies the same
magnification scale would be used.

6. The magnification scale will be persisted in settings and in the cloud.

Note: Since two fingers are used to pan the content in a permanently magnified
state, no other two-finger gestures in touch exploration or applications
will work unless the user zooms out to the normal state, where all gestures
work as expected. This is an intentional tradeoff to allow efficient
panning, since in a permanently magnified state this would be the dominant
action to be performed.

Design:

1. The window manager exposes APIs for setting the accessibility transformation,
which is a scale and offsets for the X and Y axes (a sketch of this
transformation follows this list). The window manager queries
the window policy for which windows will not be magnified. For example,
the IME windows and the navigation bar are not magnified, including windows
that are attached to them.

2. The accessibility features such as screen magnification and touch
exploration are now implemented as a sequence of transformations on the
event stream. The accessibility manager service may request each
of these features or both. The behavior of a feature does not change
based on the fact that another one is enabled.

3. The screen magnifier keeps a viewport of the content that is magnified
which is surrounded by a glow in a magnified state. Interactions outside
of the viewport are delegated directly to the application without
interpretation. For example, a triple tap on the letter 'a' of the IME
would type three letters instead of toggling magnified state. The viewport
is updated on screen rotation and on window transitions. For example,
when the IME pops up the viewport shrinks.

4. The glow around the viewport is implemented as a special type of window
that does not take input focus, cannot be touched, and is laid out in
screen coordinates with width and height matching those of the screen.
When the magnified region changes, the root view of the window draws the
highlight but the size of the window does not change - unless a rotation
happens. All changes in the viewport size, and showing or hiding it, are
animated.

5. The viewport is encapsulated in a class that knows how to show,
hide, and resize the viewport - potentially animating that.
This class uses the new animation framework for animations.

6. The magnification is handled by a magnification controller that
keeps track of the current transformation applied to the screen
content and the desired one. If these two are not the same, it is the
responsibility of the magnification controller to reconcile them by
potentially animating the transition from one to the other.

7. A display content observer watches for window transitions, screen
rotations, and requests to make a rectangle on the screen visible. This
class is responsible for handling interesting state changes such
as changing the viewport bounds on IME pop up or screen rotation,
panning the content to make a requested rectangle visible on the
screen, etc.

8. To implement viewport updates the window manager was updated with APIs
to watch for window transitions and for requests to make a rectangle
visible on the screen. These APIs are protected by a signature-level
permission. Also, a parcelable and poolable window info class has been
added, with APIs for getting the window info given the window token. This
enables getting some useful information about a window. These APIs are
also signature protected.
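
A minimal sketch of the transformation from Design item 1, assuming the usual
form screen = content * scale + offset; the class name, field names, and the
exact formula are assumptions made for illustration, not the window manager's
actual API:

    final class MagnificationTransform {
        final float scale;
        final float offsetX;
        final float offsetY;

        MagnificationTransform(float scale, float offsetX, float offsetY) {
            this.scale = scale;
            this.offsetX = offsetX;
            this.offsetY = offsetY;
        }

        // Where an unmagnified (content) point appears on the magnified screen.
        float contentToScreenX(float contentX) { return contentX * scale + offsetX; }
        float contentToScreenY(float contentY) { return contentY * scale + offsetY; }

        // Inverse mapping used to compensate touch input while magnified, so the
        // user pokes the content that appears under the finger.
        float screenToContentX(float screenX) { return (screenX - offsetX) / scale; }
        float screenToContentY(float screenY) { return (screenY - offsetY) / scale; }
    }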

bug:6795382

Change-Id: Iec93da8bf6376beebbd4f5167ab7723dc7d9bd00
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
c9c9a48e7bafae63cb35a9aa69255e80aba83988 16-Jul-2012 Svetoslav Ganov <svetoslavganov@google.com> Removing a workaround for incorrect window position on window move.

1. The window manager was not notifying a window when the latter
has been moved. This was causing incorrect coordinates of the
nodes reported to accessibility services. To workaround that
we have carried the correct window location when making a
call from the accessibility layer into a window. Now the
window manager notifies the window when it is moved and the
workaround is no longer needed. This change takes it out.

2. The left and right in the attach info were not updated properly
after a report that the window has moved.

3. The accessibility manager service was directly calling methods
on the window manager service without going through the interface
of the latter. This leads to unnecessary coupling and in the
long run increases system complexity and reduces maintainability.

bug:6623031

Change-Id: Iacb734b1bf337a47fad02c827ece45bb2f53a79d
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
ea6fbc0981564f7bbf4c6fbb63af0175415121ce 20-Jun-2012 Casey Burkhardt <caseyburkhardt@google.com> Fixing gesture recognition configuration in TouchExplorer.

This fix adjusts the sensitivity of the gesture recognizer by
eliminating gesture rotation in the recognition process.

Bug:6697119
Change-Id: Ic767f513c05210b27e583338c4f0adcaa1c4c625
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
5d043ce8cc2f588fdfb336cc843fb3b07b196f83 14-Jun-2012 Svetoslav Ganov <svetoslavganov@google.com> Active window not updated properly.

1. Accessibility allows querying only of the active window.
The active window is the one that has input focus or the
one the user is touching. Hence, if the user is touching
a window that does not have input focus this window is
the active one and as soon as the user stops touching
it the active window becomes the one that has input
focus. Currently the active window is not updated properly
when the user lifts his finger. This leads to a scenario
of traversal actions sent to the wrong window and the user
being stuck.

The reason is that the last touch explored event that is
used to determine where to click is cleared when accessibility
focus moves but this event is also used to determine when to
send the hover exit and touch exploration gesture end events.
The problem is that the last hover event is cleared before
it is used for sending the right exit events, thus the event
stream is inconsistent and the accessibility manager service
relies on this stream to update the active window. Now we
keep separate copies of the last touch event - one
for clicking and one for determining which events to
inject - to ensure a consistent stream.

bug:6666041

Change-Id: Ie9961e562a42ef8a9463afacfff2246adcb66303
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
95068e5d1bea47091e97955f271c789264994550 14-Jun-2012 Svetoslav Ganov <svetoslavganov@google.com> If a gesture cannot be detected the device should transition to touch exploration state.

1. We decide whether the user is performing a gesture or an exploration based
on the gesture velocity. If we are detecting a gesture we do the recognition at
the gesture end, which is when the finger goes up. This is better than having a
mode-toggle gesture for switching between exploration and gesture detection.
However, it is possible that the user really wanted to perform an exploration
but was moving too fast, and unless he lifts his finger the device stays in
gesture detection mode. This is frustrating since the user has no feedback and
assumes exploration does not work.

We want to perform gesture detection only for a maximal time frame, and if the
user has not lifted his finger by then we transition into the touch exploration
state.
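
A minimal sketch of the "maximal time frame" idea, assuming a Handler-based
delayed transition; the timeout value and all names are illustrative rather
than the actual TouchExplorer constants:

    import android.os.Handler;
    import android.os.Looper;

    class GestureDetectionTimeout {
        private static final long MAX_GESTURE_DETECTION_TIME_MILLIS = 2000; // assumed value

        private final Handler mHandler = new Handler(Looper.getMainLooper());
        private final Runnable mTransitionToTouchExploring;

        GestureDetectionTimeout(Runnable transitionToTouchExploring) {
            mTransitionToTouchExploring = transitionToTouchExploring;
        }

        // Called when movement is fast enough to enter gesture detection.
        void onGestureDetectionStarted() {
            mHandler.postDelayed(mTransitionToTouchExploring, MAX_GESTURE_DETECTION_TIME_MILLIS);
        }

        // Called when the finger goes up before the timeout fires.
        void onGestureDetectionEnded() {
            mHandler.removeCallbacks(mTransitionToTouchExploring);
        }
    }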

bug:6663173

Change-Id: I954ff937cca902e31b51325d1e1dfce84d239624
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
385d9f24b5ce2acb86c0dc192ce702718ab01c39 08-Jun-2012 Svetoslav Ganov <svetoslavganov@google.com> Cannot click on the last touch explored auto-completion item.

1. When typing into an auto-completion edit field, a list of completions pops up, and if
the user touch explores the list and tries to double tap to select the touched
completion, the latter is not selected.

The auto completion is a popup that does not take input focus and is overlaid on
top of the window that has input focus. The touch explorer was clicking on the
location of the accessibility focus if the last touch explored location is within
the bounds of the active window. In this case this was the window with the edit
text into which the user is typing. The check performed by the touch explorer
was missing the case where the last touch explored location was within the bounds
of the active window but was actually delivered to another, overlaid window.
Now we poke the accessibility focus location if the last explored
location is within the active window and was delivered to it.

bug:6629535

Change-Id: Ie66d5bb81ab021f2bb0414339b7de26d96826191
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
86783474fdec98a22bc22e224462767eab13e273 07-Jun-2012 Svetoslav Ganov <svetoslavganov@google.com> Cannot interact with dialogs when the IME is up and with popups that have not been touch explored.

1. If the last touch explored location is within the active window we
used to click on the exact location if it is within the accessibility
focus, otherwise on the accessibility focus center. If the last touch
explored location is not within the active window we used to just
click there. This breaks in the case where one has touch explored
at a given place in the current window and now a dialog opens *not*
covering the touch explored location. If one uses swipes to move
accessibility focus, i.e. to traverse the dialog without touching
it, one cannot activate anything because the touch explorer uses
the last touch explored location, which is outside of the active
window, e.g. the dialog.

The solution is to clear the last touch explored location when a
window opens or accessibility focus moves. If the last touch
explored location is null, we click on the accessibility
focus location.

bug:6620911

2. There is a bug in the window manager in that it does not notify a
window that its location has changed (bug:6623031). This breaks
accessibility interaction with dialogs that have input because
when the IME is up the dialog is moved but not notified. As a result
the accessibility layer gets an incorrect location for the
accessibility focus and the window bounds.

The solution is that when the accessibility manager service calls
into the remote process to obtain some accessibility node infos,
it passes the window left and top, which it gets from the
window manager. These values are used to update the attach info
window left and top so all accessibility node infos emitted
from that window have correct bounds in screen coordinates.

bug:6620796

Change-Id: I18914f2095c55cfc826acf5277bd94b776bda0c8
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
e47957a0bbe2164467ff6e7a566b0c9e4689cdc9 05-Jun-2012 Svetoslav Ganov <svetoslavganov@google.com> Nodes with contentDescription should always be important for accessibility.

1. Now, after setting the content description on a view, we mark it as
important for accessibility if the current important-for-accessibility
mode of that view is auto.

2. Minor tweak to a touch explorer coefficient to make performing double
tapping easier.

bug:6615353

Change-Id: I3b477f533a3ebde85d425caf32ace5e851240f88
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
ebac1b79c4a355d8cd73b49df059deb00d7aa256 03-Jun-2012 Svetoslav Ganov <svetoslavganov@google.com> Fixing a crash in the TouchExplorer.

1. If the runnable for performing a long press is not
removed when all pointers are up and it is executed,
the explorer gets into delegating mode with no pointer
down and the next down crashes the explorer. Added
code to remove the long press runnable in a few places
where it was missing, and also added a safety check in the
runnable to avoid executing it when there are no active pointers.

bug:6557183

Change-Id: I9dab3de88fd08d8e2b38af18249ac551837c0736
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
6acca2442572a28b7d9428e5e2fc2aa4271e29f9 01-Jun-2012 Svetoslav Ganov <svetoslavganov@google.com> Merge "Cannot double tap and hold outside of the input focused window." into jb-dev
238099c0dbbdc66b8443552126680ad1c7cab17d 01-Jun-2012 Svetoslav Ganov <svetoslavganov@google.com> Cannot double tap and hold outside of the input focused window.

1. The long press routine was using the coordinates of the
accessibility focused item in the input focused window.
As a result, double tap and hold did not work in a window
that does not take input focus, such as the system bar.
Now the routine uses the last touch explored location
if it cannot find accessibility focus in the last touched
window.

bug:6584438

Change-Id: Ifd43adb20a066f389a9d4bd5716dd7ad834dd574
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
9a4c5cd19106c3021eeead27fbc2aa05ad7d0d18 30-May-2012 Svetoslav Ganov <svetoslavganov@google.com> Ask to enable touch exploration only the first time it enables the feature.

1. Now we are asking the user to grant permission to the service to enable
touch exploration only the first time this service is enabled. If the
service was uninstalled and then later installed we ask the user again.
This avoids the scenario in which rebooting the device or upgrading an
accessibility service leaves the device in a state with which the user
cannot interact.

bug:6582088

Change-Id: I51d24e4892b3b48c9fb11dfb09ec1118502ba526
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
e15ccb93add99ebb9cd7aec03a04faa37f45b39d 17-May-2012 Svetoslav Ganov <svetoslavganov@google.com> Changing the interaction model of the touch explorer.

1. Now the user has to double tap to activate the last
item. If the last touched window is not active because
it does not take input focus, the click is on the last
touch explored location. Otherwise the click is on the
accessibility focus location.

bug:5932640

Change-Id: Ibb7b97262a7c5f2f94abef429e02790fdc91a8dd
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
fefd20e927b7252d63acb7bb1852c5188e3c1b2e 20-Apr-2012 Svetoslav Ganov <svetoslavganov@google.com> Adding an opt-in mechanism for gesture detection in AccessibilityService.

1. An accessibility service has to explicitly opt in to be notified
for gestures by the system. There is only one accessibility service
that handles gestures and in case it does not handle a gesture
the system performs default handling. This default handling ensures
that we have gesture navigation even if no accessibility service
would like to participate/customize the interaction model.

bug:5932640

Change-Id: Id8194293bd94097b455e9388b68134a45dc3b8fa
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
4213804541a8b05cd0587b138a2fd9a3b7fd9350 20-Mar-2012 Svetoslav Ganov <svetoslavganov@google.com> Accessibility focus - framework

Usefulness: Keep track of the current user location on the screen when
traversing it, enabling structural and directional
navigation over all elements on the screen. This enables
blind users who know the application layout to efficiently
locate desired elements, as opposed to trying to touch explore the
region where the element should be - very tedious.

Rationale: There are two ways to implement accessibility focus. One is
to let accessibility services keep track of it, since they
have access to the screen content, and another is to let the view
hierarchy keep track of it. While the first approach would
require almost no work on our part it poses several challenges
which make it a sub-optimal choice. Having the accessibility focus
in the accessibility service would require that service to scrape
the window content every time it changes to sync the view tree
state and the accessibility focus location. Pretty much the service
will have to keep an off screen model of the screen content. This
could be quite challenging to get right and would incur performance
cost for the multiple IPCs to repeatedly fetch the screen content.
Further, keeping virtual accessibility focus (i.e. in the service)
would require sync of the input and accessibility focus. This could
be challenging to implement right as well. Also, given an unlimited
number of accessibility services, we cannot guarantee that they will
have a proper implementation, if any, to allow users to perform structural
navigation of the screen content. Assuming two accessibility
services implement structural navigation via accessibility focus,
there is no guarantee that they will behave similarly by default,
i.e. provide some standard way to navigate the screen content.
Also feedback from experienced accessibility researchers, specifically
T.V Raman, provides evidence that having virtual accessibility focus
creates many issues and it is very hard to get right.
Therefore, keeping accessibility focus in the system will avoid
keeping an off-screen model in accessibility services, it will always
be in sync with the state of the view hierarchy and the input focus.
Also this will allow having a default behavior for traversing the
screen via this accessibility focus that is consistent in all
accessibility services. We provide accessibility services with APIs to
override this behavior but all of them will perform screen traversal
in a consistent way by default.

Behavior: If accessibility is enabled the accessibility focus is the leading one
and the input follows it. Putting accessibility focus on a view moves
the input focus there. Clearing the accessibility focus of a view clears
the input focus of that view. If accessibility focus is on a view that
cannot take input focus, then no other view should have input focus.
In accessibility mode we initially give accessibility focus to the topmost
view and no view has input focus. This ensures consistent behavior across
all apps. Note that accessibility focus can move hierarchically in the
view tree and having it at the root is better than putting it where the
input focus would be - at the first input focusable which could be at
an arbitrary depth in the view tree. By default not all views are reported
for accessibility, only the important ones. A view may be explicitly labeled
as important or not for accessibility, or the system determines which ones
are such - the default. Important views for accessibility are all views that are
not dumb layout managers used only to arrange their children. Since the same
content arrangement can be obtained via different combinations of layout
managers, such managers cannot be used to reliably determine the application
structure. For example, a user should see a list as a list view with several
list items and each list item as a text view and a button as opposed to seeing
all the layout managers used to arrange the list item's content.
By default only views important for accessibility are regarded for accessibility
purposes. Views not regarded for accessibility neither fire accessibility events
nor are reported as being on the screen. An accessibility service may request the
system to regard all views. If the target SDK of an accessibility service is
less than JellyBean, then all views are regarded for accessibility.
Note that an accessibility service that requires all views to be regarded for
accessibility may put accessibility focus on any view. Hence, it may implement
any navigational paradigm if desired, especially considering the fact that
the system detects some standard gestures and delegates their processing
to an accessibility service. The default implementation of an accessibility
service performs the default navigation.

bug:5932640
bug:5605641

Change-Id: Ieac461d480579d706a847b9325720cb254736ebe
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
4532e6158474a263d9d26c2b42240bcf7ce9b172 05-Apr-2012 Jeff Brown <jeffbrown@google.com> Refactor input system into its own service.

Extracted the input system from the window manager service into
a new input manager service. This will make it easier to
offer new input-related features to applications.

Cleaned up the input manager service JNI layer somewhat to get rid
of all of the unnecessary checks for whether the input manager
had been initialized. Simplified the callback layer as well.

Change-Id: I3175d01307aed1420780d3c093d2694b41edf66e
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
d8581c7a61a9db042b531ce4baca3c036316e066 18-Oct-2011 Svetoslav Ganov <svetoslavganov@google.com> TouchExplorer crashes if there is an inactive pointer while dragging.

The TouchExplorer was not taking into account the case of inactive
pointers while dragging. If one puts a finger down and then performs
a dragging gesture, the explorer tries to inject an UP event for the end
of the gesture when each of the two dragging pointers goes up, instead
of only for the first one that went up.

bug:5476098

Change-Id: I20d2dd7bde7e016b0678a35d14cd068d9ff37023
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
2e1c66bd53d30d2148afaa4b393b60cd59976d65 12-Oct-2011 Svetoslav Ganov <svetoslavganov@google.com> Dragging in touch explore mode should not become exploring.

In touch exploration, two fingers moving in the same direction drag, and if one of them
goes up the other starts to touch explore. This, however, causes inadvertent touch
exploring to happen on almost every scroll, causing confusion. Now two fingers
drag and both must go up before exploring is allowed. This way the inadvertent
exploring is gone and the user experience is much better.

bug:5440411

Change-Id: Id8aaece92e5dea1fc740400d2adc9dd63a1674e4
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
bd206d129fdd1777b9f9646a834d7fc342a8941e 16-Sep-2011 Svetoslav Ganov <svetoslavganov@google.com> Touch explorer does not perform tap with the right pointer.

The touch explorer was using the id of the last pointer that
went up while injecting the up and down events to tap through the last
touch explore event, incorrectly assuming that the last pointer that
went up did the touch exploring. This was leading to a system crash.

bug:5319315

Change-Id: Iffe8ef753795ad685abe6f493cc09adac8bfea94
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
406970b06c8472cbd44ecc278d643a12589c6b38 08-Sep-2011 Svetoslav Ganov <svetoslavganov@google.com> Touch explorer does not cancel long press correctly causing system crash.

1. The touch explorer was not canceling the long press runnable when a finger
goes down. This was causing a system crash in the scenario of one pointer
down and not moving, followed by another pointer down. Since the long press
runnable posted when the first pointer went down was not removed, it was
sending events with a wrong pointer id, leading to a crash.

bug:5271592

Change-Id: I40dd7dd21d465ecedd9413f00b3cedc6066fa22d
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
12a024ca681d877fe16b7e087356f7aff175a218 04-Sep-2011 Svetoslav Ganov <svetoslavganov@google.com> Tuning the TouchExplorer

1. Tuned the max angle between two moving fingers in touch
exploration mode for a gesture to be considered a drag.
The previous value was too aggressive and it was fairly
easy for the user to get out of the dragging state if she
increases the distance between her fingers.

bug:5223787

2. Before clicking, the explorer was sending hover enter and
exit events, resulting in firing the corresponding accessibility
events, which leads to an announcement of the content under
the tap that triggered the click. However, the click is
actually performed on the last touch explored location
(if within the distance slop, of course) instead of the actual
tapping pointer location. Before fixing that, the user was
confused since he was hearing an announcement of one content
but was actually clicking on something else.

bug:5225721

Change-Id: I79fec704878f98c95f181bf8a9647e0bb1bd10ef
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
f804420d6e37748b75478406e989c69303756980 27-Aug-2011 Svetoslav Ganov <svetoslavganov@google.com> Clean up and bug fixes in the TouchExplorer.

1. The downTime of the first down event was zero but it should be the event time.

2. Hover exit events were not injected while transitioning to delegating
state and when tapping.

3. Differentiation between the dragging and delegating states, previously
based on the direction and distance of the two moving pointers, is now based
only on the direction (a sketch of the direction check follows this list).
Hence, two pointers moving in the same direction are dragging; otherwise the
event stream is delegated unmodified. The reason for that is that blind people
cannot easily determine and control the distance between their fingers,
resulting in different behavior for gestures which the user thinks are the
same, which creates confusion. Also, in some cases delegation and
dragging yield the same result, for example in a list view, further
adding to the confusion. This was also causing the status bar to
be opened and closed unreliably, creating frustration.

4. Refactored the code such that now there is only one method that
injects motion events and all requests go through it. Some bugs
had been introduced by inconsistent implementations in the different
injection methods.

5. Fixed a couple of event stream inconsistencies reported by the
event consistency verifier.
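
A sketch of one way to test whether two moving pointers go in the same general
direction (item 3): compare the angle between their movement vectors via the
dot product. The 60-degree threshold is an assumption; the log only says the
decision is now based on direction alone:

    final class DragDirectionCheck {
        private static final double MIN_SAME_DIRECTION_COS = Math.cos(Math.toRadians(60));

        static boolean isDraggingGesture(float firstDeltaX, float firstDeltaY,
                float secondDeltaX, float secondDeltaY) {
            final double firstMagnitude = Math.hypot(firstDeltaX, firstDeltaY);
            final double secondMagnitude = Math.hypot(secondDeltaX, secondDeltaY);
            if (firstMagnitude == 0 || secondMagnitude == 0) {
                // A pointer that has not moved gives no direction; treat as dragging.
                return true;
            }
            final double cosAngle = (firstDeltaX * secondDeltaX + firstDeltaY * secondDeltaY)
                    / (firstMagnitude * secondMagnitude);
            return cosAngle >= MIN_SAME_DIRECTION_COS;
        }
    }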

bug:5224183
bug:5223787
bug:5214829

Change-Id: I16c9be3562ad093017af5b974a41ab525b73453f
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
3e4e4af45216aee4d4b009fe842c0324610918eb 05-Aug-2011 Svetoslav Ganov <svetoslavganov@google.com> Turning off the accessibility feature reboots the device

1. The touch explorer uses delayed injection of events,
which can happen after its hosting accessibility
input filter has been unregistered; thus the explorer
was trying to inject events when this is not allowed.
Now, upon unregistration of the accessibility input filter,
it resets the state of the touch explorer it hosts.

bug:5105956

Change-Id: I720682abf93382aedf4f431eaac90fd2c781e442
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
47e02711d78ecac9112aa7f66e5664cdc46fb3d1 01-Aug-2011 Svetoslav Ganov <svetoslavganov@google.com> ACTION_HOVER_EXIT sometimes not delivered during touch exploration.

1. The code for detecting the end of a touch exploration gesture
was not injecting the hover exit event upon detection of the
gesture end.

bug:5091758

Change-Id: I468164617d6677cd2a2a2815e1756c826d49f3a9
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
f5a07905a3e025f95472a3f8d9935263e49ad6d3 25-Jul-2011 Svetoslav Ganov <svetoslavganov@google.com> TouchExplorer long press not working and activation tap not respecting distance slop.

1. The first problem is manifested on Prime. Apparently the Prime screen driver
is very aggressive in filtering move events that originate from almost the same
location. Hence, the framework doesn't see a constant stream of events. However,
the TouchExplorer implementation was assuming a constant event stream to detect
long press. Refactored the code such that no assumptions about the event stream
are made.

2. Touch exploring an item and then tapping far away from that item was activating
it, hence not respecting the distance slop. This was due to an incorrect check of
the latter.

bug:5070917

Change-Id: I3627a2feeb3712133f58f8f8f1ab7a2ec50cdc9a
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
ea80b2d02f836214b175ac24a7b4315053a86f06 16-Jul-2011 Svetoslav Ganov <svetoslavganov@google.com> Exception in TouchExplorer due to invalid pointer id.

Change-Id: Iec5d3b3b0d3ae5676e16384ed2b12352fe4a7f3c
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
63c04eeb00dd71380d5ebba701014793d8f9a2ea 14-Jul-2011 Svetoslav Ganov <svetoslavganov@google.com> Touch exploration gesture events change the window id.

1. Touch exploration start and end events are generated
by the system to provide additional information for
accessibility services. Since such events do not come
from any particular window, they should not change the
id of the window that currently allows exploring its
content.

2. Touch exploration start and end events were leaking the
touch explorer class, which is private.

bug:5026258

Change-Id: Icaf3e2bd9566716f2afb876cf8e0d50813b0c76e
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
51cccf0845b36539d42503495f0689d487712b3a 27-Jun-2011 Svetoslav Ganov <svetoslavganov@google.com> ArrayIndexOutOfBounds exception in TouchExplorer.

1. The explorer was injecting up/down touch events to
click with the id of the last pointer that went up,
but the prototype, i.e. the last touch explore event, may
not contain this pointer. Since we click on the last
touch explored location, using the action pointer
index of that event is the right approach.

bug:4551506

Change-Id: I73428b09dc014417096a52e667f58768a2871dc8
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
00f7b3f76515d1c6fbe5cf9fee9d3760787c03cd 08-Jun-2011 Svetoslav Ganov <svetoslavganov@google.com> Crash in the TouchExplorer

1. Not clearing the last touch explore event in all cases
when transitioning to another mode.

2. Incorrectly assuming that the action index of up/down
events is 0.

bug:4551506

Change-Id: I43f8e800b54a340968489dc924a539795a9195cb
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
91feae3c5994bd4768cea3507c62c65746adcfa6 20-May-2011 Svetoslav Ganov <svetoslavganov@google.com> TouchExplorer - refactoring and a couple of bug fixes

1. Refactored the code to avoid code duplication.

2. Fixed a bug in removing unused pointers from the event.

3. Fixed a bug that was crashing the explorer.

4. Sending hover exit immediately at the end of the touch exploration
gesture rather than with a delay.

Change-Id: Ie288cb8090d6fb5e5c715afa6ea5660b17c019e0
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
fe9f8ab03a63b1037f07dd85799fbea80ec6adaa 07-May-2011 Jeff Brown <jeffbrown@google.com> Add initial API for stylus and mouse buttons.

Added the concept of pointer properties in a MotionEvent.
This is currently used to track the pointer tool type to enable
applications to distinguish finger touches from a stylus.

Button states are also reported to applications as part of touch events.

There are no new actions for detecting changes in button states.
The application should instead query the button state from the
MotionEvent and take appropriate action as needed.

A good time to check the button state is on ACTION_DOWN.
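
A minimal sketch of that pattern, querying the button state when the pointer
goes down and branching on the secondary (right) mouse button; the class and
method names are illustrative:

    import android.view.MotionEvent;

    final class ButtonStateExample {
        // True if this down event was produced by the secondary mouse button,
        // e.g. to decide whether to show a context menu.
        static boolean isSecondaryButtonPress(MotionEvent event) {
            return event.getActionMasked() == MotionEvent.ACTION_DOWN
                    && (event.getButtonState() & MotionEvent.BUTTON_SECONDARY) != 0;
        }
    }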

As a side-effect, applications that do not support multiple buttons
will treat primary, secondary and tertiary buttons identically
for all touch events.

The back button on the mouse is mapped to KEYCODE_BACK
and the forward button is mapped to KEYCODE_FORWARD.

Added basic plumbing for the secondary mouse button to invoke
the context menu, particularly in lists.

Added clamp and split methods on MotionEvent to take care of
common filtering operations so we don't have them scattered
in multiple places across the framework.

Bug: 4260011
Change-Id: Ie992b4d4e00c8f2e76b961da0a902145b27f6d83
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
736c2756bf3c14ae9fef7255c119057f7a2be1ed 23-Apr-2011 Svetoslav Ganov <svetoslavganov@google.com> Touch exploration feature, event bubbling, refactor

1. Added an Input Filter that interprets the touch screen motion
events to perform accessibility exploration. One finger explores.
Tapping within a given time and distance slop of the last explored
location performs a click or a long press, respectively (a sketch of the
slop check follows this list). Two fingers close together and moving in
the same direction drag. Multiple fingers, or two fingers in
different directions, or two fingers too far apart are delegated to
the view hierarchy. Non-moving fingers that "accidentally grabbed the
device by the screen" are ignored.

2. Added accessibility events for hover enter, hover exit, touch
exploration gesture start, and end. Accessibility hover events
are fired by the hover pipeline. An accessibility event is
dispatched up the view tree and the topmost view fires it.
Thus predecessors can augment the fired event. An accessibility
event has several records and a predecessor can optionally
modify, delete, and add records to the event.

3. Added onPopulateAccessibilityEvent and refactored the existing
accessibility code to use it.

4. Added API for querying the currently enabled accessibility services
by feedback type.
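
A minimal sketch of the time-and-distance slop check from item 1; the slop
values below are illustrative stand-ins (the framework takes such values from
ViewConfiguration), and the method name is not the actual TouchExplorer API:

    import android.view.MotionEvent;

    final class TapOnLastExploredLocation {
        private static final long TIME_SLOP_MILLIS = 500;      // assumed value
        private static final float DISTANCE_SLOP_PIXELS = 64f; // assumed value

        // A tap activates the last explored location only if it is close enough
        // to the last touch explore (hover) event in both time and distance.
        static boolean isTapOnLastExploredLocation(MotionEvent tapEvent,
                MotionEvent lastExploreEvent) {
            final long deltaTime = tapEvent.getEventTime() - lastExploreEvent.getEventTime();
            final float deltaX = tapEvent.getX() - lastExploreEvent.getX();
            final float deltaY = tapEvent.getY() - lastExploreEvent.getY();
            return deltaTime <= TIME_SLOP_MILLIS
                    && Math.hypot(deltaX, deltaY) <= DISTANCE_SLOP_PIXELS;
        }
    }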

Change-Id: Iea2258c07ffae9491071825d966dc453b07e5134
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
3fb3d7c4e756bd32d5abde0abca9ab52d559bc84 23-Apr-2011 Adam Powell <adamp@google.com> Revert "Touch exploration feature, event bubling, refactor"

This reverts commit ac84d3ba81f08036308b17e1ab919e43987a3df5.

There seems to be a problem with this API change. Reverting for now to
fix the build.

Change-Id: Ifa7426b080651b59afbcec2d3ede09a3ec49644c
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java
ac84d3ba81f08036308b17e1ab919e43987a3df5 05-Apr-2011 Svetoslav Ganov <svetoslavganov@google.com> Touch exploration feature, event bubbling, refactor

1. Added an Input Filter that interprets the touch screen motion
events to perform accessibility exploration. One finger explores.
Tapping within a given time and distance slop of the last explored
location performs a click or a long press, respectively. Two fingers
close together and moving in the same direction drag. Multiple fingers,
or two fingers in different directions, or two fingers too far apart are
delegated to the view hierarchy. Non-moving fingers that "accidentally
grabbed the device by the screen" are ignored.

2. Added accessibility events for hover enter, hover exit, touch
exploration gesture start, and end. Accessibility hover events
are fired by the hover pipeline. An accessibility event is
dispatched up the view tree and the topmost view fires it.
Thus predecessors can augment the fired event. An accessibility
event has several records and a predecessor can optionally
modify, delete, and add records to the event.

3. Added onPopulateAccessibilityEvent and refactored the existing
accessibility code to use it.

4. Added API for querying the currently enabled accessibility services
by feedback type.

Change-Id: Iec03c6c3fe298de3f14cb6efdbb9b198cd531a0c
/frameworks/base/services/java/com/android/server/accessibility/TouchExplorer.java