Last year I started a blog post series about how input works in KWin/Wayland. This post resumes the series by talking about touch input.
Several people wondered why this blog post took so long; after all, it has been more than a month since the last one. Of course there is a good reason for that: I was reworking parts of the input stack and wanted to discuss those changes in the next post of the series. Unfortunately a few changes are still missing, so I decided to do the touch input post first nevertheless.
Touch input is the new kid on the block when it comes to input events. It is a technology that came into being after X11 and is therefore not part of the X11 core protocol. This makes touch a weird beast on X11: for example, touch events are always emulated as pointer events as well, so applications which do not support touch can still be used. That emulation is actually a huge sacrifice for the API and means that touch feels, at least to me, like a second class citizen on X11.
On Wayland the situation is way better. Touch is part of the core input protocol and does not emulate pointer events. Applications need to support touch in order to get touch events. If an application does not support touch, the touch events won’t trigger any actions. This is a good thing as it means applications need to do something sensible with touch events.
Like the other events, touch events are reported to KWin by libinput. Touch events are quite straightforward: we get touch down events (when a finger touches the screen), touch up events (when a finger is lifted again) and touch motion events (when a finger moves on the screen). This is fully multi-touch aware, meaning we can track multiple touch points individually.
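To give a rough idea of what this looks like at the libinput level, here is a minimal, stand-alone sketch. It is not KWin's actual code (KWin wraps libinput in its own connection and event classes), and the 1920x1080 screen size is just an example:

```cpp
#include <libinput.h>

// Sketch: reacting to the three touch event types delivered by libinput.
void processTouchEvent(struct libinput_event *event)
{
    switch (libinput_event_get_type(event)) {
    case LIBINPUT_EVENT_TOUCH_DOWN: {
        struct libinput_event_touch *touch = libinput_event_get_touch_event(event);
        // The slot identifies the individual finger; this is what makes the
        // events fully multi-touch aware.
        const int32_t finger = libinput_event_touch_get_slot(touch);
        // Coordinates scaled to an example 1920x1080 screen.
        const double x = libinput_event_touch_get_x_transformed(touch, 1920);
        const double y = libinput_event_touch_get_y_transformed(touch, 1080);
        // -> finger touched the screen at (x, y)
        break;
    }
    case LIBINPUT_EVENT_TOUCH_MOTION: {
        struct libinput_event_touch *touch = libinput_event_get_touch_event(event);
        const int32_t finger = libinput_event_touch_get_slot(touch);
        const double x = libinput_event_touch_get_x_transformed(touch, 1920);
        const double y = libinput_event_touch_get_y_transformed(touch, 1080);
        // -> finger moved to (x, y)
        break;
    }
    case LIBINPUT_EVENT_TOUCH_UP: {
        struct libinput_event_touch *touch = libinput_event_get_touch_event(event);
        // -> this finger got lifted again (touch up carries no coordinates)
        const int32_t finger = libinput_event_touch_get_slot(touch);
        break;
    }
    default:
        break;
    }
}
```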
The events are sent through KWin's internal filter architecture like all other events. Currently KWin itself does not really intercept touch events yet. We do support touch events on window decorations and KWin's own internal windows, but in those cases we emulate mouse events. We don't have any UI elements which would benefit from multi-touch events, so emulating mouse events internally is sufficient for the time being. If we add multi-touch aware UI elements in the future, that would require changes.
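As a rough illustration of that emulation, a touch down on one of KWin's internal windows could be turned into a left mouse button press along these lines. This is a hedged sketch using plain Qt, not the actual KWin code path:

```cpp
#include <QCoreApplication>
#include <QMouseEvent>
#include <QPointF>
#include <QWindow>

// Sketch: translate a touch down on one of KWin's internal windows into an
// emulated left mouse button press. The matching release would be sent on
// touch up.
void emulateMousePress(QWindow *internalWindow, const QPointF &globalPos)
{
    const QPointF localPos = globalPos - internalWindow->position();
    QMouseEvent press(QEvent::MouseButtonPress, localPos,
                      Qt::LeftButton, Qt::LeftButton, Qt::NoModifier);
    QCoreApplication::sendEvent(internalWindow, &press);
}
```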
If KWin does not intercept the touch sequence, the events are passed on to the KWayland server component, which forwards them to the Wayland window currently receiving touch events. KWin determines that window by using the window under the first touch down of the sequence. While a sequence is in progress, the window cannot change.
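A hedged sketch of what that forwarding could look like on a touch down, using the KWayland Server API as I understand it. KWin's own window lookup is out of scope here, so the surface under the touch point is simply passed in:

```cpp
#include <KWayland/Server/seat_interface.h>
#include <KWayland/Server/surface_interface.h>

using namespace KWayland::Server;

// Sketch: forward a touch down through the seat, locking the target surface
// at the start of the sequence.
qint32 forwardTouchDown(SeatInterface *seat, SurfaceInterface *surfaceUnderTouch,
                        const QPointF &surfacePosition, const QPointF &globalPos,
                        quint32 time)
{
    seat->setTimestamp(time);
    if (!seat->isTouchSequence()) {
        // First finger of the sequence: this surface stays the target until
        // all fingers are lifted, even if later touch points land elsewhere.
        seat->setFocusedTouchSurface(surfaceUnderTouch, surfacePosition);
    }
    // Returns the id under which the client will see this touch point;
    // motion and up events use the same id.
    return seat->touchDown(globalPos);
}
```

Locking the surface at the first touch down keeps the whole sequence consistent for the client, even if a finger later crosses a window border.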
The touch events are then processed by the application, which can use them to provide sensible functionality. E.g. our Plasma calendar supports a pinch-zoom gesture to switch to an overview of all months. This was developed under X11 and just works on Wayland without any adjustments. Good job, Qt devs!
Touch gestures were an important discussion point at last week's Plasma sprint. We decided which global gestures we want to support in Plasma. We hope to be able to deliver this for Plasma 5.10 on Wayland, and we will also look into getting the same on X11 by reusing the architecture written for Wayland, though that might land in a later release.
Global touch gestures have an interesting and useful property: when a sequence starts, KWin does not know whether it will be a global gesture or a gesture which needs to be forwarded to the application. Thus all events must be sent to the application at first. Once KWin knows that it is dealing with a global gesture, it can send a cancel event to the application, informing it that the touch sequence got canceled. This prevents conflicts between global and application touch gestures. On X11 this is not as comfortable, so we will have to see how we can support it there.
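To make the idea more concrete, here is a hedged sketch of the "forward until recognized, then cancel" logic. The gesture recognizer and all helper names are made up for illustration; only the SeatInterface calls are from the KWayland Server API:

```cpp
#include <KWayland/Server/seat_interface.h>
#include <QPointF>

using namespace KWayland::Server;

// Stub recognizer, hypothetical: the real logic (pinch, swipe from the
// screen edge, ...) is the interesting part and not shown here.
static bool looksLikeGlobalGesture(qint32 /*id*/, const QPointF &/*pos*/)
{
    return false;
}

// Sketch: forward touch motion to the client until a global gesture is
// recognized, then cancel the client's view of the sequence.
void handleTouchMotion(SeatInterface *seat, qint32 id, const QPointF &globalPos,
                       quint32 time, bool &globalGestureRecognized)
{
    seat->setTimestamp(time);
    if (globalGestureRecognized) {
        // KWin owns the rest of the sequence; nothing is forwarded anymore.
        return;
    }
    if (looksLikeGlobalGesture(id, globalPos)) {
        globalGestureRecognized = true;
        // Tell the client to discard the touch points it has seen so far
        // (this becomes a wl_touch.cancel on the wire).
        seat->cancelTouchSequence();
        return;
    }
    // Undecided or an application gesture: keep forwarding.
    seat->touchMove(id, globalPos);
}
```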
> KWin determines that window by using the window under the first touch down of the sequence. While a sequence is in progress, the window cannot change.
Does this mean that, e.g., drag’n’drop using touch events is not possible between applications?
No, drag and drop has its own protocol. But thinking about it, I think we don’t have code for this.
As someone who sometimes uses a graphics tablet (a small Wacom one, really nice when I need to retouch fine details in a photo), I was wondering how that will work on Wayland? As far as I can tell libinput does have support for these kinds of devices (https://wayland.freedesktop.org/libinput/doc/latest/tablet-support.html).
That is a topic for another blog post.
Looking forward to that post then, keep up the good work!
Hm, a few things I always noticed on my tablet: there was a general lack of support for the hold-to-right-click gesture, and it was also very difficult to just click something; all events would be interpreted as drag events, since the screen always detects some movement even when the finger/stylus is not moving. So, for instance, there was almost no way to deselect in Konsole; it would always select some part of the text instead. That, coupled with the general lack of DPI scaling, made touch rather annoying in the past.
I wonder whether any of this is going to improve in the future, and how much of a say KWin has in that.
I know this is off-topic, but did the client-side/server-side window decoration negotiation protocol ever get into core Wayland?
Having used a touch-screen laptop for around three years now, I find that even just emulating a left mouse click is extremely useful. It would be great if such emulation were still available for non-touch-aware apps.
Of course, this all assumes UI elements are not too small on the screen. Forcing the DPI in the KDE font settings helps a lot, and there are a few other tweakables in the UI/theme settings (e.g. scroll bar width), but not everything can be adjusted. (There are global scaling options, e.g. everything ×2, but I don't like that option so much.)
And exactly which applications need it? Qt supports it on Wayland, GTK supports it on Wayland, and Xwayland supports it, including the legacy emulation to a mouse pointer.