
Changes and additions in Pointer Events Level 3

By Patrick H. Lauke

See also the slides & demos.

Transcript

Hi there…

This is a brief overview of the Pointer Events Working Group and what we've been working on – changes and additions in Pointer Events Level 3.

First of all, to set the scene: what are Pointer Events?

Now, to better address devices with different input types, this specification defines a more abstract form of input called a pointer.

It's a high-level event model based on mouse events, but also covering pen, touch, and other (future) pointing devices.

The latest stable recommendation is Pointer Events Level 2, from the 4th of April 2019, and work has since been ongoing towards Pointer Events Level 3, which is currently in the editor's draft stage as of August '23.

Now, beyond clarifications for undocumented cases and scenarios that have come out of implementation experience and developer adoption, Pointer Events Level 3 includes a few new features: the pointerrawupdate event, to better react to fast movements; the getCoalescedEvents() and getPredictedEvents() methods; the altitudeAngle and azimuthAngle properties; and, as a little bonus, the redefinition of click, auxclick, and contextmenu as actual pointer events.

So, pointerrawupdate…

The problem: for performance reasons, user agents may delay the dispatch of pointermove events, as they already do for mousemove.

For very fast pointer movements, user agents will generally coalesce individual small movements into a single pointermove event.

While this is good for performance, it can be problematic for scenarios where authors want to accurately track high-frequency pointer movements – for instance, drawing applications.

And I have a little demo here of a basic drawing application using pointermove.

So I'm using the mouse and if I'm drawing – particularly curves – in this application, just tracking pointermove, you'll see that it's not particularly smooth.

There are many points that are very far apart for fast movements, and if I'm just drawing a line between each point that I'm getting from my pointermoves, the line that results at the end of it is not particularly smooth.

So, enter pointerrawupdate.

The new pointerrawupdate event aims to help these applications work more smoothly.

Now, compared to the pointermove event, user agents should dispatch pointerrawupdate as soon as possible, at a very high frequency.

And, as a comparison, here's the same basic drawing application, but this time, we're using pointerrawupdate rather than pointermove.

And, you'll see that there are a lot more points that are being tracked, and even for particularly curved aspects of the movement that I'm tracking, the line is a lot smoother.

It's still not perfect, but there's a lot higher frequency of points, and as a result, the line that is then being drawn is a lot smoother.

Now, pointerrawupdate may have a performance impact.

Authors should keep the code that they're executing in response to pointerrawupdate to a minimum.

Generally, you just want to store the coordinates of that particular pointerrawupdate if you're tracking it.
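As a rough sketch of that pattern – assuming a plain canvas and simple dot drawing, not the actual demo code – you could buffer the coordinates in the pointerrawupdate handler and do the real work once per animation frame:

```js
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');
const buffer = []; // positions recorded since the last frame

canvas.addEventListener('pointerrawupdate', (e) => {
  // Keep this handler as cheap as possible: just record the position.
  buffer.push({ x: e.clientX, y: e.clientY });
});

function render() {
  // Do the heavier work (drawing) at most once per frame.
  for (const { x, y } of buffer) {
    ctx.fillRect(x, y, 2, 2); // draw a small dot at each recorded position
  }
  buffer.length = 0;
  requestAnimationFrame(render);
}
requestAnimationFrame(render);
```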

And note that even though pointerrawupdate should fire as soon as possible, the user agent may still coalesce a few individual events and changes if they are extremely fast – faster than the update rate of pointerrawupdate.

Now, we talked about coalesced points.

There's a new getCoalescedEvents() method in Pointer Events Level 3.

So, the problem again: for very fast pointer movements, user agents will generally coalesce individual small movements into a single pointermove or even – as we've seen – a pointerrawupdate event.

Again, for certain applications such as drawing applications, authors may want access to all the separate events that were coalesced by the user agent.

Now, the new getCoalescedEvents() method gives authors access to all the raw position and property changes that were coalesced into a single pointermove or pointerrawupdate event.

And this really provides the best of both worlds: it allows for increased granularity without incurring additional performance penalties.

As an author, you'd listen to regular pointermove or pointerrawupdate events, and then process all the coalesced events for that particular event.

And with a little bit of pseudocode here, what you would do is: as normal, you'd add an event listener for pointermove, and then, when you receive pointermove events, you check if there are coalesced events available.

And if so, you'd literally loop over the list of coalesced events and do something clever with them – for instance, grabbing the clientX and clientY properties and using those coordinates to do the actual drawing in a drawing application.

And of course, you'd still fall back if there are no coalesced events – if the method isn't supported – and just do the normal processing of pointermove.
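Spelled out as actual code, that pseudocode might look something like this minimal sketch (again drawing simple dots on an assumed canvas, rather than the demo's real drawing code):

```js
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');

canvas.addEventListener('pointermove', (e) => {
  if (e.getCoalescedEvents) {
    // Process every raw position that was coalesced into this single event.
    for (const coalesced of e.getCoalescedEvents()) {
      ctx.fillRect(coalesced.clientX, coalesced.clientY, 2, 2);
    }
  } else {
    // Fallback: getCoalescedEvents() isn't supported, so process the event as-is.
    ctx.fillRect(e.clientX, e.clientY, 2, 2);
  }
});
```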

And, once again, here's the same demo: a basic drawing application.

This time, we are using pointermove, but we're also tracking the coalesced events via getCoalescedEvents().

So you'll see, as I'm drawing, we see the black spots here are the individual coordinates purely from the pointermove event, but the red circles here are the coordinates of the coalesced events that we're also processing.

And you'll see, particularly in areas such as this one where there's been very fast movement in a curve, that the actual pointermove points, as we saw before, are very distant here – there's a lot of, kind of, averaging out of this particular curve.

But then, if we also process the coalesced events that we received as part of these individual pointermoves, we can then use those to draw a much smoother line after the fact as well.

Now, getPredictedEvents().

The problem: even though we now have pointerrawupdate and getCoalescedEvents() to get high-frequency updates, particularly in situations such as drawing applications, there may still be a perceived latency – especially on stylus-enabled tablets, where, as you're drawing, there is still a very small gap between the physical stylus being moved and the application catching up – receiving the events and, say, drawing a line.

So that's where getPredictedEvents() can come in handy.

Now, some user agents have built-in algorithms which, after a series of confirmed pointer movements, can predict likely future movements.

Now, the new getPredictedEvents() method gives authors access to these predicted events, and this can be helpful in scenarios like drawing applications: you can "draw ahead" to predicted positions to reduce the perceived latency, and then later discard these speculative/predicted points when the real points are actually received.

And with a bit of pseudocode again, no big surprises.

The way you would process these is: you listen to a normal event, such as pointermove, and then within the handling of that event you check if getPredictedEvents() is actually supported. If so, you iterate over the list of predicted events and do something clever with them, such as drawing ahead speculatively – keeping a note of what you drew, and then, when a further actual pointermove comes in, removing the previously predicted lines and drawing the actual line.
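Here's one way that flow could look – a minimal sketch assuming a plain canvas and dot-based drawing, not the talk's actual demo:

```js
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');
let speculative = []; // the predicted points drawn after the last real event

canvas.addEventListener('pointermove', (e) => {
  // A real point has arrived: erase the previously predicted dots…
  for (const { x, y } of speculative) {
    ctx.clearRect(x - 1, y - 1, 4, 4);
  }
  speculative = [];

  // …commit the confirmed position…
  ctx.fillRect(e.clientX, e.clientY, 2, 2);

  // …and, if supported, draw ahead speculatively to the predicted positions.
  if (e.getPredictedEvents) {
    for (const predicted of e.getPredictedEvents()) {
      ctx.fillRect(predicted.clientX, predicted.clientY, 2, 2);
      speculative.push({ x: predicted.clientX, y: predicted.clientY });
    }
  }
});
```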

And, a little demo once again: it's a basic drawing application using pointermove, and in this case, I'm not doing anything fancy like trying to draw ahead and then removing those speculative points.

I'm literally just showing where the individual points are – the individual pointermove events – and I'm also using getCoalescedEvents() to make it smoother.

But then, also visualizing the predicted events just as circles.

So, if I open this up and if I'm drawing here, we'll see the actual explicit pointermove and coalesced events are drawn here as black dots, and then as I've been drawing, I've also been checking if there are predicted events, and drawing little circles there.

And, you'll see that just at the end of the movement, the predicted events kind of predicted that the line would continue if I was to continue on the same track as before.

Now, this isn't perfect: particularly if I'm drawing curves, there may be scenarios where the prediction algorithm thinks that I'm going to continue the line along the tangent, when actually I decided to move to the side in a curve.

So, they're not perfect, but, as I said, they can be used as a way of smoothing out the perceived latency as we're drawing things, as we're tracking high-frequency pointer movements.

And also, just to clarify what's in and out of scope: both getCoalescedEvents() and getPredictedEvents() in our specification only define the methods and the API to access coalesced and predicted events.

The Pointer Events specification itself does not define how events are coalesced or how they are predicted – this is left up to individual implementations, so it could be operating system or user agent dependent.

Next up: altitudeAngle and azimuthAngle.

The problem: the original Pointer Events specification defines tiltX and tiltY properties to convey the orientation of a stylus.

Now, these properties are admittedly not very intuitive for developers.

Here are the diagrams that we have in the specification, and I'm just going to read out the start of the definition for these properties as well.

So tiltX: "The plane angle (in degrees, in the range of [-90,90]) between the Y-Z plane and the plane containing both the transducer (e.g. the pen or stylus) axis and the Y axis".

And the equivalent for tiltY: "The plane angle between the X-Z plane and the plane containing both the transducer axis and the X axis".

So, we've had a lot of developer feedback letting us know that this was not very intuitive or easy to grasp.

So, enter altitudeAngle and azimuthAngle.

Pointer Events Level 3 "borrows" the altitudeAngle and azimuthAngle properties from Touch Events.

And these two properties were introduced when Apple expanded Touch Events, effectively, to support Pencil on iPad.

Here are the two diagrams of how to visualize altitudeAngle and azimuthAngle.

The definitions are still fairly dense, but they are more intuitive just from looking at the actual diagrams.

So altitudeAngle: "The altitude (in radians) of the transducer (e.g. pen or stylus), in the range [0,π/2] — where 0 is parallel to the surface (X-Y plane), and π/2 is perpendicular to the surface." And azimuthAngle: "The azimuth angle (in radians) of the transducer, in the range [0,2π] — where 0 represents a transducer whose cap is pointing in the direction of increasing X values (point to "3 o'clock" if looking straight down) on the X-Y plane, and the values progressively increase when going clockwise".

Still, admittedly, fairly complex, but slightly more intuitive than the definition of tiltX and tiltY with the different planes, and how to visualize that.

Now, altitudeAngle and azimuthAngle.

User agents must provide both the classic tiltX and tiltY, and the new altitudeAngle and azimuthAngle properties.
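So, on any pointer event coming from a pen, authors can read whichever set of properties they prefer – a minimal sketch, with penArea as a hypothetical element standing in for the demo's drawing surface:

```js
const penArea = document.querySelector('#penArea'); // hypothetical element
penArea.addEventListener('pointermove', (e) => {
  if (e.pointerType === 'pen') {
    console.log(`tiltX: ${e.tiltX}°, tiltY: ${e.tiltY}°`); // degrees
    console.log(`altitude: ${e.altitudeAngle} rad, azimuth: ${e.azimuthAngle} rad`); // radians
  }
});
```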

And the specification includes an algorithm for converting between the two sets as well.
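To give a flavour of it, the general case of going from altitude/azimuth to tilt boils down to two projections – a simplified sketch that deliberately ignores the edge cases (such as altitudeAngle being exactly 0) the spec's full algorithm handles:

```js
function spherical2tilt(altitudeAngle, azimuthAngle) {
  const radToDeg = 180 / Math.PI;
  const tanAlt = Math.tan(altitudeAngle);
  // Project the transducer axis onto the tilt planes.
  const tiltX = Math.round(Math.atan(Math.cos(azimuthAngle) / tanAlt) * radToDeg);
  const tiltY = Math.round(Math.atan(Math.sin(azimuthAngle) / tanAlt) * radToDeg);
  return { tiltX, tiltY };
}

// e.g. a pen held at 45° leaning towards "3 o'clock":
// spherical2tilt(Math.PI / 4, 0) → { tiltX: 45, tiltY: 0 }
```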

And as a little demo, I have a little pen tracker.

I'm actually going to use a Wacom tablet in this case to illustrate this (let's zoom in a little bit… so, hopefully, you'll be able to see it on the webcam). I am tracking the position of this particular stylus and, you know, visualizing it with WebGL.

But in the top left corner, I'm also showing some of the properties of the events that are coming in, and you'll see that as I'm moving and tilting the stylus, I'm receiving both the tiltX and tiltY properties, as well as azimuthAngle and altitudeAngle – and again, a reminder: tiltX and tiltY are in degrees, and the azimuth and altitude angles are in radians.

And now, as a little bonus addition: click, auxclick, and contextmenu.

Pointer Events Level 3 redefines these events from UI Events, so the specification now defines click, auxclick, and contextmenu as actual pointer events.

Now, this change is already in the latest UI Events working draft, and it opens up some possible new applications.

For instance, a very simple one: if I have a control or a button, I can determine what type of input caused one of these events to be fired by simply checking the pointerType property of, say, the click event as it comes in.
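In code, that check can be as simple as this sketch (with button standing in for whatever control you're wiring up):

```js
const button = document.querySelector('button');
button.addEventListener('click', (e) => {
  switch (e.pointerType) {
    case 'mouse':
    case 'pen':
    case 'touch':
      console.log(`Clicked via ${e.pointerType}`);
      break;
    default:
      // Empty pointerType: e.g. keyboard activation or a programmatic click.
      console.log('Activated without a pointer');
  }
});
```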

And once again, here's a little demo: it's a basic button that also shows the pointerType once it's been clicked.

So, opening this up.

No surprises: if I click this with the mouse, it will report that the pointerType was indeed 'mouse'.

Now, once again, I'm going to use my stylus here, and activate this.

If I can… and we'll see that, for that particular click event, it actually reported pointerType 'pen'.

Now, of course, not everything is a pointer.

So I'm going to use the keyboard now to just set focus to this button, and if I activate it… yes, pointerType will be empty. So, for accessibility reasons if nothing else, remember: don't always assume that the click will be fired by either a pen, or a mouse, or – if I had a touch screen available here for the demo – 'touch'.

But there will be situations where pointerType itself is actually empty, and that generally indicates that something like a keyboard activated the control, or that the click event was fired programmatically via JavaScript, for instance.

And that concludes this brief update on what we've been working on in the Pointer Events Level 3 specification.

So, we're currently finishing up the last few blockers for Level 3, and adding all the Web Platform Tests for the specification.

We're hoping to go to Candidate Recommendation shortly.

And, we also have the links here in the presentation to the current Editor's Draft, and obviously our work is on GitHub, so… feel free to check it out.

And with that… thank you very much!
