By publishing this Recommendation, W3C expects that the functionality specified in this Touch Interface Recommendation will not be affected by changes to HTML5 or Web IDL as those specifications proceed to Recommendation. The WG has completed and approved this specification's Test Suite and created an Implementation Report that shows that two or more independent implementations pass each test. This version of the specification includes fixes and improvements to Level 1, and incorporates the features previously published as Touch Events Extensions.
The Touch Events specification defines a set of low-level events that represent one or more points of contact with a touch-sensitive surface, and changes of those points with respect to the surface and any DOM elements displayed upon it (e.g. for touch screens) or associated with it (e.g. for drawing tablets without displays). It also addresses pen-tablet devices, such as drawing tablets, with consideration toward stylus capabilities.
User agents that run on terminals which provide touch input typically use interpreted mouse events to allow users to access interactive web applications. However, these interpreted events, being normalized data based on the physical touch input, tend to limit the user experience that can be delivered. Additionally, mouse events cannot represent concurrent input regardless of device capability, due to both system-level limitations and legacy compatibility constraints.
Meanwhile, native applications are capable of handling both cases with the provided system APIs.
The Touch Events specification provides a solution to this problem by specifying interfaces to allow web applications to directly handle touch events, and multiple touch points for capable devices.
This specification defines conformance criteria that apply to a single product: the user agent that implements the interfaces that it contains.
WindowProxy is defined in [[!HTML5]].
The IDL blocks in this specification are conforming IDL fragments as defined by the WebIDL specification [[!WEBIDL]].
A conforming Web Events user agent must also be a conforming ECMAScript implementation of the IDL fragments in this specification, with the following exception:
Note: Both ways of reflecting IDL attributes allow simply getting and setting the property on the platform object to work. For example, given a Touch object aTouch, evaluating aTouch.target would return the EventTarget for the Touch object. If the user agent implements IDL attributes as accessor properties, then the property access invokes the getter, which returns the EventTarget. If the user agent implements IDL attributes as data properties on the platform object with the same behavior as would be found with the accessor properties, then the object would appear to have an own property named target whose value is an EventTarget object, and the property access would return this value.
Touch Interface
This interface describes an individual touch point for a touch event. Touch objects are immutable; after one is created, its attributes must not change.
target: the EventTarget on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of that element.
radiusX: 0 if no value is known. The value must not be negative.
radiusY: 0 if no value is known. The value must not be negative.
rotationAngle: 0 if no value is known. The value must be greater than or equal to 0 and less than 90.
If the ellipse described by radiusX and radiusY is circular, then rotationAngle has no effect. The user agent may use 0 as the value in this case, or it may use any other value in the allowed range. (For example, the user agent may use the rotationAngle value from the previous touch event, to avoid sudden changes.)
force: a relative value of 0 to 1, where 0 is no pressure, and 1 is the highest level of pressure the touch device is capable of sensing; 0 if no value is known. In environments where force is known, the absolute pressure represented by the force attribute, and the sensitivity in levels of pressure, may vary.
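As a sketch of how these attributes might be read from script (the element id, the classifyForce helper, and its thresholds are illustrative, not part of this specification):

```javascript
// Hypothetical helper: coarsely classify the force attribute, which ranges
// from 0 (no pressure, or unknown) to 1 (the maximum the device can sense).
function classifyForce(force) {
  if (force === 0) return 'unknown-or-none';
  return force < 0.5 ? 'light' : 'firm';
}

// Reading Touch attributes inside a touchstart listener (browser only).
if (typeof document !== 'undefined') {
  document.getElementById('touchable').addEventListener('touchstart', function (ev) {
    var t = ev.changedTouches.item(0); // the touch point that just became active
    console.log('id=' + t.identifier +
                ' page=(' + t.pageX + ',' + t.pageY + ')' +
                ' radii=(' + t.radiusX + ',' + t.radiusY + ')' +
                ' angle=' + t.rotationAngle +
                ' force=' + classifyForce(t.force));
  }, false);
}
```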
TouchList Interface
This interface defines a list of individual points of contact for a touch event. TouchList objects are immutable; after one is created, its contents must not change.
A TouchList object's supported property indices ([[!WEBIDL]]) are the numbers in the range 0 to one less than the length of the list.
length: the number of Touch objects in the list.
Touch? item(in unsigned long index): returns the Touch at the specified index in the list, or null if the index is not less than the length of the list.
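A minimal sketch of iterating a TouchList via length and item(); collectIdentifiers is a hypothetical helper, not part of the interface:

```javascript
// Collect the identifier of each Touch in a TouchList. item(i) returns
// null only when i is not less than length, so inside this loop the
// result is always non-null.
function collectIdentifiers(touchList) {
  var ids = [];
  for (var i = 0; i < touchList.length; i++) {
    ids.push(touchList.item(i).identifier);
  }
  return ids;
}
```

The same helper works against any object exposing length and item(), which is convenient for testing without a real touch device.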
TouchEvent Interface
This interface defines the touchstart, touchend, touchmove, and touchcancel event types. TouchEvent objects are immutable; after one is created and initialized, its attributes must not change.
TouchList touches: a list of Touch objects for every point of contact currently touching the surface.
TouchList targetTouches: a list of Touch objects for every point of contact that is touching the surface and started on the element that is the target of the current event.
TouchList changedTouches: a list of Touch objects for every point of contact which contributed to the event.
For the touchstart event this must be a list of the touch points that just became active with the current event. For the touchmove event this must be a list of the touch points that have moved since the last event. For the touchend and touchcancel events this must be a list of the touch points that have just been removed from the surface, with the last known coordinates of the touch points before they were removed.
altKey: true if the alt (Alternate) key modifier is activated; otherwise false.
metaKey: true if the meta (Meta) key modifier is activated; otherwise false. On some platforms this attribute may map to a differently-named key modifier.
ctrlKey: true if the ctrl (Control) key modifier is activated; otherwise false.
shiftKey: true if the shift (Shift) key modifier is activated; otherwise false.
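The four modifier attributes can be combined when interpreting a touch. A sketch, where activeModifiers is a hypothetical helper:

```javascript
// Return the names of the key modifiers active on a TouchEvent,
// or 'none' when no modifier is held.
function activeModifiers(ev) {
  var names = [];
  if (ev.altKey) names.push('alt');
  if (ev.ctrlKey) names.push('ctrl');
  if (ev.metaKey) names.push('meta');
  if (ev.shiftKey) names.push('shift');
  return names.length ? names.join('+') : 'none';
}
```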
User agents should ensure that all Touch objects available from a given TouchEvent are associated with the same document that the TouchEvent was dispatched to. To implement this, user agents should maintain a notion of the current touch-active document. On first touch, this is set to the target document where the touch was created. When all active touch points are released, the touch-active document is cleared. All TouchEvents are dispatched to the current touch-active document, and each Touch object it contains refers only to DOM elements (and coordinates) in that document. If a touch starts entirely outside the currently touch-active document, then it is ignored entirely.
The examples below demonstrate the relations between the different TouchList members defined in a TouchEvent.
touches and targetTouches of a TouchEvent
This example demonstrates the utility of, and relations between, the touches and targetTouches members defined in the TouchEvent interface. The following code will generate different output based on the number of touch points on the touchable element and the document:
<div id='touchable'>This element is touchable.</div>
<script>
document.getElementById('touchable').addEventListener('touchstart', function(ev) {
  if (ev.touches.item(0) == ev.targetTouches.item(0)) {
    /**
     * If the first touch on the surface is also targeting the
     * "touchable" element, the code below should execute.
     * Since targetTouches is a subset of touches, which covers the
     * entire surface, touches.length >= targetTouches.length
     * is always true.
     */
    document.write('Hello Touch Events!');
  }
  if (ev.touches.length == ev.targetTouches.length) {
    /**
     * If all of the active touch points are on the "touchable"
     * element, the length properties should be the same.
     */
    document.write('All points are on target element');
  }
  if (ev.touches.length > 1) {
    /**
     * On a single touch input device, there can only be one point
     * of contact on the surface, so the following code can only
     * execute when the terminal supports multiple touches.
     */
    document.write('Hello Multiple Touch!');
  }
}, false);
</script>
changedTouches of a TouchEvent
This example demonstrates the utility of changedTouches and its relation to the other TouchList members of the TouchEvent interface. The code is an example which triggers whenever a touch point is removed from the defined touchable element:
<div id='touchable'>This element is touchable.</div>
<script>
document.getElementById('touchable').addEventListener('touchend', function(ev) {
  /**
   * Example output when three touch points are on the surface,
   * two of them being on the "touchable" element and one point
   * in the "touchable" element is lifted from the surface:
   *
   * Touch points removed: 1
   * Touch points left on element: 1
   * Touch points left on document: 2
   */
  document.write('Removed: ' + ev.changedTouches.length);
  document.write('Remaining on element: ' + ev.targetTouches.length);
  document.write('Remaining on document: ' + ev.touches.length);
}, false);
</script>
TouchEvent types
The following table provides a summary of the TouchEvent types defined in this specification. All events should accomplish the bubbling phase. Some events are not cancelable (see canceled event).
Event Type | Sync / Async | Bubbling phase | Trusted proximal event target types | DOM interface | Cancelable | Default Action
---|---|---|---|---|---|---
touchstart | Sync | Yes | Document, Element | TouchEvent | Yes | undefined
touchend | Sync | Yes | Document, Element | TouchEvent | Yes | Varies: user agents may dispatch mouse and click events
touchmove | Sync | Yes | Document, Element | TouchEvent | Yes | undefined
touchcancel | Sync | Yes | Document, Element | TouchEvent | No | none
touchstart event
A user agent must dispatch this event type to indicate when the user places a touch point on the touch surface.
The target of this event must be an Element. If the touch point is within a frame, the event should be dispatched to an element in the child browsing context of that frame.
If this event is canceled, it should prevent any default actions caused by any touch events associated with the same active touch point, including mouse events or scrolling.
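A sketch of canceling touchstart to suppress the default actions of the whole touch sequence (the element id is illustrative):

```javascript
// Canceling touchstart prevents default actions for every touch event
// associated with the same active touch point, including synthesized
// mouse events and scrolling.
function onTouchStart(ev) {
  ev.preventDefault();
}

// Wiring (browser only).
if (typeof document !== 'undefined') {
  document.getElementById('touchable').addEventListener('touchstart', onTouchStart, false);
}
```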
touchend event
A user agent must dispatch this event type to indicate when the user removes a touch point from the touch surface, also including cases where the touch point physically leaves the touch surface, such as being dragged off of the screen.
The target of this event must be the same Element on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.
The touch point or points that were removed must be included in the changedTouches attribute of the TouchEvent, and must not be included in the touches and targetTouches attributes.
If this event is canceled, any sequence of touch events that includes this event must not be interpreted as a click.
touchmove event
A user agent must dispatch this event type to indicate when the user moves a touch point along the touch surface.
The target of this event must be the same Element on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.
Note that the rate at which the user agent sends touchmove events is implementation-defined, and may depend on hardware capabilities and other implementation details.
A user agent should suppress the default action caused by any touchmove event until at least one touchmove event associated with the same active touch point is not canceled. Whether the default action is suppressed for touchmove events after at least one touchmove event associated with the same active touch point is not canceled is implementation-dependent.
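A sketch of a typical touchmove pattern: cancel the event so the page does not scroll while a finger drags across the element, and report movement deltas. The makeDragTracker helper and the element id are illustrative, not part of the specification:

```javascript
// Hypothetical helper: given the starting coordinates of a touch point,
// return a function that reports the movement delta for each touchmove.
function makeDragTracker(startX, startY) {
  return function (x, y) {
    return { dx: x - startX, dy: y - startY };
  };
}

// Wiring (browser only).
if (typeof document !== 'undefined') {
  var tracker = null;
  var el = document.getElementById('touchable');
  el.addEventListener('touchstart', function (ev) {
    var t = ev.changedTouches.item(0);
    tracker = makeDragTracker(t.pageX, t.pageY);
  }, false);
  el.addEventListener('touchmove', function (ev) {
    ev.preventDefault(); // suppress the default action (e.g. scrolling)
    var t = ev.changedTouches.item(0);
    console.log('moved', tracker(t.pageX, t.pageY));
  }, false);
}
```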
touchcancel event
A user agent must dispatch this event type to indicate when a touch point has been disrupted in an implementation-specific manner, such as a synchronous event or action originating from the UA canceling the touch, or the touch point leaving the document window into a non-document area which is capable of handling user interactions (e.g., the UA's native user interface, or plug-ins). A user agent may also dispatch this event type when the user places more touch points on the touch surface than the device or implementation is configured to store, in which case the earliest Touch object in the TouchList should be removed.
The target of this event must be the same Element on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.
The touch point or points that were removed must be included in the changedTouches attribute of the TouchEvent, and must not be included in the touches and targetTouches attributes.
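Since changedTouches carries the removed touch points on touchend and touchcancel, an application tracking per-touch state can drop entries by identifier. A sketch, where removeTouches is a hypothetical helper:

```javascript
// Delete tracked state for every identifier reported in changedTouches.
// 'tracked' maps touch identifiers to application-defined state.
function removeTouches(tracked, changedIds) {
  for (var i = 0; i < changedIds.length; i++) {
    delete tracked[changedIds[i]];
  }
  return tracked;
}
```

Running the same cleanup for both touchend and touchcancel keeps the application's state consistent whichever way a touch point ends.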
Document Interface
The Document interface [[!DOM-LEVEL-3-CORE]] contains methods by which the user can create Touch and TouchList objects.
Touch createTouch(): creates a Touch object with the specified attributes.
Parameters: view, target, identifier, pageX, pageY, screenX, screenY.
TouchList createTouchList(): creates a TouchList object consisting of zero or more Touch objects. Calling this method with no arguments creates a TouchList with no objects in it and length 0 (zero).
Parameter: Touch... touches.
Some user agents implement an initTouchEvent method as part of the TouchEvent interface. When this method is available, scripts can use it to initialize the properties of a TouchEvent object, including its TouchList properties (which can be initialized with values returned from createTouchList). The initTouchEvent method is not yet standardized, but it may appear in some form in a future specification.
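Because createTouch, createTouchList, and initTouchEvent are not guaranteed to be present, scripts should feature-detect them before use. A sketch, with a hypothetical helper name:

```javascript
// Return true only when the document exposes both legacy factory methods.
function supportsLegacyTouchConstruction(doc) {
  return !!(doc &&
            typeof doc.createTouch === 'function' &&
            typeof doc.createTouchList === 'function');
}
```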
Interaction with Mouse Events and click
The user agent may dispatch both touch events and mouse events
[[!DOM-LEVEL-2-EVENTS]] in response to the same user input. If the
user agent dispatches both touch events and mouse events in response to
a single user action, then the touchstart
event type must be
dispatched before any mouse event types for that action.
If touchstart, touchmove, or touchend is canceled, the user agent should not dispatch any mouse event that would be a consequential result of the prevented touch event.
If a Web application can process touch events, it can cancel the events, and no corresponding mouse events would need to be dispatched by the user agent. If the Web application is not specifically written for touch input devices, it will react to the subsequent mouse events instead.
User agents will typically dispatch mouse and click events when there is only a single active touch point. Multi-touch interactions – involving two or more active touch points – will usually only generate touch events.
If the user agent interprets a sequence of touch events as a click, then it should dispatch mousemove, mousedown, mouseup, and click events (in that order) at the location of the touchend event for the corresponding touch input. If the contents of the document have changed during processing of the touch events, then the user agent may dispatch the mouse events to a different target than the touch events.
The default actions and ordering of any further touch and mouse events are implementation-defined, except as specified elsewhere.
The activation of an element (e.g., in some implementations, a tap) would typically produce the following event sequence (though this may vary slightly, depending on specific user agent behavior):
1. touchstart
2. touchmove events, depending on movement of the finger
3. touchend
4. mousemove (for compatibility with legacy mouse-specific code)
5. mousedown
6. mouseup
7. click
If, however, either the touchstart, touchmove or touchend event has been canceled during this interaction, no mouse or click events will be fired, and the resulting sequence of events would simply be:
1. touchstart
2. touchmove events, depending on movement of the finger
3. touchend
A touch point becomes active when the user agent dispatches a touchstart event indicating its appearance. It ceases to be active after the user agent dispatches a touchend or touchcancel event indicating that the touch point is removed from the surface or no longer tracked.
An event is considered canceled if its default action was prevented by means of preventDefault(), returning false in an event handler, or other means as defined by [[!DOM-LEVEL-3-EVENTS]] and [[!HTML5]].
The working group maintains a list of open issues in this specification. These issues may be addressed in future revisions of the specification.
Many thanks to the WebKit engineers for developing the model used as a basis for this spec, Neil Roberts (SitePen) for his summary of WebKit touch events, Peter-Paul Koch (PPK) for his write-ups and suggestions, Robin Berjon for developing the ReSpec.js spec authoring tool, and the WebEvents WG for their many contributions.
Many others have made additional comments as the spec developed, which have led to steady improvements. Among them are Matthew Schinckel, Andrew Grieve, Cathy Chan, and Boris Zbarsky. If we inadvertently omitted your name, please let me know.
The group acknowledges the following contributors to this specification's test suite: Matt Brubeck, Olli Pettay, Art Barstow, Cathy Chan and Rick Byers.
This is a summary of the major changes made since the 10 October 2013 Recommendation was published. Full commit history is also available.