How input works – touch screen edge swipe gestures

Continuing my series about how input works in KWin/Wayland I want to discuss a brand new feature we implemented for Plasma 5.10. This year we had a developer sprint in Stuttgart and discussed what kind of touchpad and touch screen gestures we want to support and how to implement them. The result of that discussion has now been merged into our master branch, and we are currently discussing which actions to use by default.

Gesture Recognizer

Touchpad and touch screen gestures are quite similar, so the approach we took is able to handle both of them. We introduced a GestureRecognizer which is able to recognize gestures (surprise) in a very abstract way. It doesn’t know what the input events look like or whether a touch screen, touchpad, mouse or any other input device generated the event.

To use the GestureRecognizer, a Gesture needs to be registered with it. The Gesture describes what should actually be recognized: e.g. how many fingers need to participate in the gesture (for our touch screen gestures it is one, for our touchpad gestures it is four), the direction of the gesture (leftwards, rightwards, downwards, upwards), the minimum distance, the trigger position, and so on.
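
To make this concrete, here is a small, self-contained sketch of the idea. The types below are simplified stand-ins, not KWin’s actual classes; the real implementation tracks considerably more state (trigger position, progress for visual feedback, cancellation):

    #include <QPointF>
    #include <functional>
    #include <vector>

    // Simplified stand-in for KWin's Gesture/SwipeGesture classes.
    struct SwipeGesture {
        enum class Direction { Up, Down, Left, Right };
        Direction direction = Direction::Down;
        int minimumFingerCount = 1;
        int maximumFingerCount = 1;
        double minimumDelta = 0.0;        // distance the fingers must travel
        std::function<void()> triggered;  // invoked when the gesture matches
    };

    // Simplified stand-in for KWin's GestureRecognizer: it only sees
    // abstract start/update/end notifications plus a finger count and a
    // movement delta, never actual device events.
    class GestureRecognizer {
    public:
        void registerGesture(SwipeGesture *gesture) { m_gestures.push_back(gesture); }

        void startSwipe(int fingerCount) { m_fingerCount = fingerCount; m_delta = QPointF(); }
        void updateSwipe(const QPointF &delta) { m_delta += delta; }
        void endSwipe() {
            // In this sketch, matching is checked once when the swipe ends.
            for (SwipeGesture *g : m_gestures) {
                if (m_fingerCount < g->minimumFingerCount || m_fingerCount > g->maximumFingerCount) {
                    continue;
                }
                if (matchesDirection(*g) && g->triggered) {
                    g->triggered();
                }
            }
        }

    private:
        bool matchesDirection(const SwipeGesture &g) const {
            switch (g.direction) {
            case SwipeGesture::Direction::Down:  return m_delta.y() >= g.minimumDelta;
            case SwipeGesture::Direction::Up:    return -m_delta.y() >= g.minimumDelta;
            case SwipeGesture::Direction::Right: return m_delta.x() >= g.minimumDelta;
            case SwipeGesture::Direction::Left:  return -m_delta.x() >= g.minimumDelta;
            }
            return false;
        }
        std::vector<SwipeGesture *> m_gestures;
        int m_fingerCount = 0;
        QPointF m_delta;
    };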

Now input events can be fed into the GestureRecognizer, and the GestureRecognizer decides whether a Gesture becomes active or inactive. As said, this is absolutely generic; it doesn’t care how the events were triggered.
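
Continuing the sketch from above, the flow looks like this (the 200 pixel threshold is an arbitrary value for illustration):

    int main() {
        GestureRecognizer recognizer;

        SwipeGesture presentWindows;
        presentWindows.direction = SwipeGesture::Direction::Down;
        presentWindows.minimumFingerCount = 4;
        presentWindows.maximumFingerCount = 4;
        presentWindows.minimumDelta = 200;  // arbitrary threshold for the sketch
        presentWindows.triggered = []() { /* activate the effect */ };
        recognizer.registerGesture(&presentWindows);

        recognizer.startSwipe(4);                 // four fingers touched down
        recognizer.updateSwipe(QPointF(0, 250));  // moved 250 px downwards
        recognizer.endSwipe();                    // matches, triggers presentWindows
        return 0;
    }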

This alone does not yet let us do anything with it; we don’t have any way to use the GestureRecognizer yet.
At this point our implementations for touchpad and touch screen diverge. Each builds on a different existing mechanism in KWin which is better suited than trying to have something shared for both.

Touchpad gestures

For touchpad gestures our global shortcuts handling is used. The GlobalShortcutsManager is a KWin-internal (!) mechanism to register internal actions which are triggered in a global way through input events. The GlobalShortcutsManager gained a GestureRecognizer, and various areas in KWin can now register a QAction as a global touchpad gesture.
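
In terms of the simplified sketch from above, the wiring could look like the following; KWin’s real registration API differs, so treat this purely as an illustration of the idea:

    #include <QAction>

    // Wrap a QAction in a four-finger downwards swipe and hand it to the
    // recognizer: triggering the gesture triggers the action.
    void registerTouchpadSwipeDown(GestureRecognizer &recognizer, QAction *action) {
        auto *gesture = new SwipeGesture;
        gesture->direction = SwipeGesture::Direction::Down;
        gesture->minimumFingerCount = 4;
        gesture->maximumFingerCount = 4;
        gesture->minimumDelta = 200;  // illustrative threshold
        gesture->triggered = [action]() { action->trigger(); };
        recognizer.registerGesture(gesture);
    }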

So far we still haven’t reached the elements discussed in the previous posts, such as the InputEventFilter. There is of course an InputEventFilter which feeds events into the GlobalShortcutsManager. This filter was extended to support touchpad gestures, and now we have the full stack together (a sketch of the filter follows the list):

  • libinput reports a touchpad gesture event
  • InputEventFilter passes the touchpad gesture event to the GlobalShortcutsManager
  • The GlobalShortcutsManager passes the information to the GestureRecognizer
  • The GestureRecognizer triggers the QAction
  • something happens
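
Here is a sketch of the second step in that chain. The hook names are modeled on KWin’s InputEventFilter, but the signatures and the GlobalShortcutsManager interface shown here are assumptions made to keep the example self-contained:

    #include <QSizeF>

    // Stand-ins for KWin's real classes; the actual interfaces live inside KWin.
    struct GlobalShortcutsManager {
        void processSwipeStart(int fingerCount) { /* recognizer.startSwipe(...) */ }
        void processSwipeUpdate(const QSizeF &delta) { /* recognizer.updateSwipe(...) */ }
        void processSwipeEnd() { /* recognizer.endSwipe() */ }
    };

    struct InputEventFilter {
        virtual ~InputEventFilter() = default;
        virtual bool swipeGestureBegin(int fingerCount, quint32 time) { return false; }
        virtual bool swipeGestureUpdate(const QSizeF &delta, quint32 time) { return false; }
        virtual bool swipeGestureEnd(quint32 time) { return false; }
    };

    // The filter does nothing but forward the gesture events.
    class GlobalShortcutFilter : public InputEventFilter {
    public:
        explicit GlobalShortcutFilter(GlobalShortcutsManager *manager) : m_shortcuts(manager) {}
        bool swipeGestureBegin(int fingerCount, quint32 time) override {
            m_shortcuts->processSwipeStart(fingerCount);
            return false;  // don't swallow the event; later filters still see it
        }
        bool swipeGestureUpdate(const QSizeF &delta, quint32 time) override {
            m_shortcuts->processSwipeUpdate(delta);
            return false;
        }
        bool swipeGestureEnd(quint32 time) override {
            m_shortcuts->processSwipeEnd();
            return false;
        }
    private:
        GlobalShortcutsManager *m_shortcuts;
    };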

By default the following gestures are supported:

  • 4 finger swipe down: Present Windows
  • 4 finger swipe up: Desktop Grid
  • 4 finger swipe left/right: desktop next/previous

Screen edge support for touch

For touch screen gestures we used a different area in KWin which already provides a fairly similar functionality: Screen edge activation for mouse events.

It’s similar in that it activates an action when an input event happens at a screen edge. The main difference is the direction of the input event: for the mouse it is towards the edge, for touch it is from the edge.

The ScreenEdges gained a GestureRecognizer (you see there are two different, independently acting GestureRecognizers) and every Edge defines a Gesture. The Gesture is only passed to the GestureRecognizer if the Edge is reserved for a touch action. Each Edge can have a different action configured, and of course you can configure a different (or the same) action for touch and mouse on the same Edge. When the Gesture for an Edge gets started, we provide the same visual feedback as for edges activated through mouse events.
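
Expressed with the simplified types from the first sketch, an Edge could derive its Gesture from its position like this (the Border enum is a stand-in for KWin’s ElectricBorder values):

    enum class Border { Top, Bottom, Left, Right };

    // A one-finger swipe that starts at the edge and moves away from it.
    SwipeGesture gestureForEdge(Border border) {
        SwipeGesture g;
        g.minimumFingerCount = 1;
        g.maximumFingerCount = 1;
        switch (border) {
        case Border::Top:    g.direction = SwipeGesture::Direction::Down;  break;
        case Border::Bottom: g.direction = SwipeGesture::Direction::Up;    break;
        case Border::Left:   g.direction = SwipeGesture::Direction::Right; break;
        case Border::Right:  g.direction = SwipeGesture::Direction::Left;  break;
        }
        return g;
    }
    // The gesture would only be registered with the ScreenEdges'
    // GestureRecognizer if the edge is reserved for a touch action.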

For ScreenEdges there is also a dedicated InputEventFilter which now gained support for touch events and feeds the touch events into the GestureRecognizer.

But there is more to it. This feature got backported to X11. Our X11-standalone platform plugin gained support for XInput 2.2 and touch events. KWin now listens for touch events on X11 and passes these events into the GestureRecognizer of the ScreenEdges. Thus the feature which we implemented on Wayland for Wayland is also available on X11 and directly usable for all users, be it on X11 or on Wayland.
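
For the curious, selecting XInput 2.2 touch events with plain Xlib looks roughly like this. KWin’s actual X11 implementation is based on xcb, so this is only an approximation of the idea (compile with -lX11 -lXi):

    #include <X11/Xlib.h>
    #include <X11/extensions/XInput2.h>

    int main() {
        Display *dpy = XOpenDisplay(nullptr);
        if (!dpy) return 1;

        int opcode, event, error;
        if (!XQueryExtension(dpy, "XInputExtension", &opcode, &event, &error))
            return 1;
        int major = 2, minor = 2;  // touch events require XInput 2.2
        if (XIQueryVersion(dpy, &major, &minor) != Success)
            return 1;

        // Select touch begin/update/end for all master devices on the root window.
        unsigned char bits[XIMaskLen(XI_LASTEVENT)] = {};
        XISetMask(bits, XI_TouchBegin);
        XISetMask(bits, XI_TouchUpdate);
        XISetMask(bits, XI_TouchEnd);
        XIEventMask mask;
        mask.deviceid = XIAllMasterDevices;
        mask.mask_len = sizeof(bits);
        mask.mask = bits;
        XISelectEvents(dpy, DefaultRootWindow(dpy), &mask, 1);
        XFlush(dpy);

        // The received events would then be translated into start/update/end
        // calls on the ScreenEdges' GestureRecognizer.
        XCloseDisplay(dpy);
        return 0;
    }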

Touchpad gestures are unfortunately only available on Wayland.