How input works – touch screen edge swipe gestures

Continuing my series about how input works in KWin/Wayland, I want to discuss a brand new feature we implemented for Plasma 5.10. This year we had a developer sprint in Stuttgart and discussed which touchpad and touch screen gestures we want to support and how to implement them. The result of this discussion has now been merged into our master branch, and we are currently discussing which actions to use by default.

Gesture Recognizer

Touchpad and touch screen gestures are kind of similar, so the approach we took is able to handle both of them. We introduced a GestureRecognizer which is able to recognize gestures (surprise) in a very abstract way. It doesn’t know what the input events look like or whether a touch screen, touchpad, mouse or any other input device generated the event.

To use the GestureRecognizer, a Gesture needs to be registered. The Gesture describes the gesture to be recognized: how many fingers need to participate (for our touch screen gestures it is one, for our touchpad gestures it is four), the direction of the gesture (leftwards, rightwards, downwards, upwards), the minimum distance, the trigger position, and so on.

Now input events can be fed into the GestureRecognizer, and it decides whether a Gesture becomes active or inactive. As said, this is completely generic; the recognizer doesn’t care how the events were generated.
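To make the abstraction more concrete, here is a minimal sketch of what such a recognizer could look like. All class names, signatures and members here are illustrative assumptions for this post, not KWin’s verbatim code:

```cpp
// Minimal sketch of the idea, not KWin's exact API: names, signatures
// and members are illustrative assumptions.
#include <QObject>
#include <QSizeF>

class SwipeGesture : public QObject
{
    Q_OBJECT
public:
    enum class Direction { Up, Down, Left, Right };
    void setDirection(Direction direction) { m_direction = direction; }
    void setFingerCount(int count) { m_fingerCount = count; }

Q_SIGNALS:
    void started();    // the motion begins to match this gesture
    void triggered();  // the gesture completed (e.g. minimum distance reached)
    void cancelled();  // the motion stopped matching

private:
    Direction m_direction = Direction::Down;
    int m_fingerCount = 1;
};

class GestureRecognizer : public QObject
{
    Q_OBJECT
public:
    void registerGesture(SwipeGesture *gesture);

    // Device-agnostic entry points: the recognizer only ever sees
    // finger counts and movement deltas, never the device that
    // produced them.
    void startSwipeGesture(int fingerCount);
    void updateSwipeGesture(const QSizeF &delta);
    void endSwipeGesture();
    void cancelSwipeGesture();
};
```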

This alone doesn’t yet let us do anything with it; so far there is no way to actually use the GestureRecognizer.
At this point our implementations for touchpad and touch screen diverge. Each hooks into a different existing mechanism, which is better suited than trying to build something shared for both.

Touchpad gestures

For touchpad gestures our global shortcuts handling is used. The GlobalShortcutsManager is a KWin-internal (!) mechanism to register internal actions which can be triggered globally through input events. The GlobalShortcutsManager gained a GestureRecognizer, and various areas in KWin can now register a QAction as a global touchpad gesture.
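As a rough illustration, registering such an action might look like this. The method name registerTouchpadSwipe and the SwipeDirection enum are assumptions for this sketch, not the verbatim KWin API:

```cpp
// Hypothetical registration of a global touchpad gesture inside KWin;
// registerTouchpadSwipe() and SwipeDirection are assumed names.
QAction *presentWindows = new QAction(this); // 'this' = some long-lived QObject
QObject::connect(presentWindows, &QAction::triggered, [] {
    // toggle the Present Windows effect
});
m_globalShortcutsManager->registerTouchpadSwipe(SwipeDirection::Down,
                                                presentWindows);
```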

So far we still haven’t reached the elements we discussed in the previous posts, like the InputEventFilter. There is of course an InputEventFilter which feeds events into the GlobalShortcutsManager. This filter got extended to support touchpad gestures, and now we have the full stack together (a sketch of the filter step follows the list):

  • libinput reports a touchpad gesture event
  • InputEventFilter passes the touchpad gesture event to the GlobalShortcutsManager
  • The GlobalShortcutsManager passes the information to the GestureRecognizer
  • The GestureRecognizer triggers the QAction
  • something happens
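In code, the filter step could look roughly like this. This is a simplified sketch building on the SwipeGesture/GestureRecognizer sketch above; the real KWin filter methods and return-value semantics may differ:

```cpp
// Simplified sketch, not the actual KWin code: the filter forwards
// libinput's swipe events to the GlobalShortcutsManager's
// GestureRecognizer (as sketched above).
class GlobalShortcutFilter : public InputEventFilter
{
public:
    bool swipeGestureBegin(int fingerCount, quint32 time) override
    {
        Q_UNUSED(time)
        m_shortcuts->gestureRecognizer()->startSwipeGesture(fingerCount);
        return true; // handled by the shortcut system
    }
    bool swipeGestureUpdate(const QSizeF &delta, quint32 time) override
    {
        Q_UNUSED(time)
        m_shortcuts->gestureRecognizer()->updateSwipeGesture(delta);
        return true;
    }
    bool swipeGestureEnd(quint32 time) override
    {
        Q_UNUSED(time)
        // If a registered Gesture matched, its QAction fires here.
        m_shortcuts->gestureRecognizer()->endSwipeGesture();
        return true;
    }

private:
    GlobalShortcutsManager *m_shortcuts = nullptr;
};
```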

By default the following gestures are supported:

  • 4 finger swipe down: Present Windows
  • 4 finger swipe up: Desktop Grid
  • 4 finger swipe left/right: desktop next/previous

Screen edge support for touch

For touch screen gestures we used a different area in KWin which already provides a fairly similar functionality: Screen edge activation for mouse events.

It’s similar in that it activates an action when an input event happens at a screen edge. The main difference is the direction of the input event: for the mouse it’s towards the edge, for touch it is from the edge.

The ScreenEdges gained a GestureRecognizer (as you can see, there are two different, independently acting GestureRecognizers), and every Edge defines a Gesture. The Gesture is only passed to the GestureRecognizer if the Edge is reserved for a touch action. Each Edge can have a different action configured, and of course you can configure different (or the same) actions for touch and mouse on the same Edge. When the Gesture for an Edge gets started, we provide the same visual feedback as for edges activated through mouse events.

For ScreenEdges there is also a dedicated InputEventFilter, which gained support for touch events and feeds them into the GestureRecognizer.
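As a hedged sketch, reusing the illustrative names from above, wiring up an Edge could look roughly like this (Edge is assumed to be a QObject; reserveTouchAction and startApproaching are assumed names):

```cpp
// Illustrative sketch: an Edge registers its one-finger swipe with the
// ScreenEdges' GestureRecognizer only once a touch action is reserved.
void Edge::reserveTouchAction(QAction *action)
{
    m_touchAction = action;
    // A left screen edge expects a one-finger swipe to the right,
    // i.e. starting at the edge and moving into the screen.
    m_gesture->setFingerCount(1);
    m_gesture->setDirection(SwipeGesture::Direction::Right);
    connect(m_gesture, &SwipeGesture::started,
            this, &Edge::startApproaching); // same visual feedback as mouse
    connect(m_gesture, &SwipeGesture::triggered,
            this, [this] { m_touchAction->trigger(); });
    m_screenEdges->gestureRecognizer()->registerGesture(m_gesture);
}
```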

But there is more to it. This feature got backported to X11. Our X11-standalone platform plugin gained support for XInput 2.2 and touch events. KWin now listens for touch events on X11 and passes these events into the GestureRecognizer of the ScreenEdges. Thus the feature which we implemented on Wayland for Wayland is also available on X11 and directly usable for all users – be it on X11 or on Wayland.
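On the X11 side, subscribing to touch events uses standard XInput 2.2 calls; roughly like this (the KWin-side plumbing around it is omitted, and the function name is just for illustration):

```cpp
#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

// Select XInput 2.2 touch events on the root window so that
// XI_TouchBegin/Update/End can be fed into the ScreenEdges'
// GestureRecognizer.
void selectTouchEvents(Display *display, Window rootWindow)
{
    unsigned char maskBits[XIMaskLen(XI_LASTEVENT)] = {};
    XISetMask(maskBits, XI_TouchBegin);
    XISetMask(maskBits, XI_TouchUpdate);
    XISetMask(maskBits, XI_TouchEnd);

    XIEventMask mask;
    mask.deviceid = XIAllMasterDevices;
    mask.mask_len = sizeof(maskBits);
    mask.mask = maskBits;
    XISelectEvents(display, rootWindow, &mask, 1);
}
```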

Touchpad gestures are unfortunately only available on Wayland.

12 Replies to “How input works – touch screen edge swipe gestures”

  1. Niiice! Care to make a video of it in action?

    How ready for release is this? Do you expect it to arrive in kde neon user edition soon?

  2. Awesome!
    After discussing this in the Telegram group, I put up a KDE wiki page about gestures a while ago, here:
    https://community.kde.org/KDE_Visual_Design_Group/Gestures
    Was this considered/used?

    Also, can the touchpad gestures be customized (in directions, action and number of fingers) or disabled, in case one wants to just use, for example, libinput-gestures?

    Thanks as always for the great work and for taking the time to report back here.

    1. Was this considered/used?

      This is the first time I hear of it. So no, if you don’t share such things with the devs, the devs won’t consider it. I asked for input from the VDG about two years ago – nothing happened, so we came up with our own solution at the Affenfels sprint. Now asking whether we considered this is rather disappointing.

      Also, can the touchpad gestures be customized (in directions, action and number of fingers) or disabled, in case one wants to just use, for example, libinput-gestures?

      No to all questions.

      1. This is quite sad. This was the first time I tried to contribute to KDE directly (not just with funding), so I’m pretty disappointed to see that my time was completely wasted. I thought that communicating this with the VDG, and exchanging opinions on this for several days, was actually the same as “sharing with the devs”. So, from my point of view, I actually DID share what I could contribute. A normal user would think that the VDG and the (other) devs communicate a lot. It’s a problem between the VDG and yourself if you didn’t communicate enough about this. Sorry if I come across abruptly, I do not want to sound negative, but I believe you can see why I am sad about this.

        I believe not making this de-activatable is a mistake. Many users use libinput-gestures already, and might have different gestures set up than what you coded. For example, I switch windows, not desktops, with 4-finger horizontal swipes. Thus, 5.10 will break my workflow and that of many others. I bet you do not care too much though, as you always reply bitterly to this kind of comment, which describes the use case of a minority of the users :(
        (they can be disabled in Gnome)

        1. This is quite sad.

          I’m very sorry and also quite sad – you might have noticed in my reply. This is just f***g unbelievable. Members of the VDG knew that I was working on it, and nobody notified me or any other developer who would have been able to raise this to me. I was waiting for such input for quite a long time and was very disappointed not to get it. And to find out only now that there is such a document is also quite disappointing to me.

          Thus, 5.10 will break my workflow and that of many others.

          No it won’t. This is Wayland only and unless you are on Wayland it won’t break your workflow. We will quite certainly not add any support for libinput-gestures because of security reasons. To use libinput-gestures the users need to reconfigure their system in an – IMHO – unacceptable way. Thus we will not support it.

          We might in the future make the actions configurable, but not with the aim to support libinput-gestures.

          1. Thanks again for the reply. Yes, you are of course right about not breaking the workflow because of Wayland and X11, I forgot. And I guess you are right about the security concerns of libinput-gestures.
            In KDE’s spirit, I do believe you’ll end up making the gestures configurable and adding more and more gestures, as other DEs and OSs have. Even though it was just very amateurish work (basically just a list of possibilities), I hope that that wiki will be of some use to you in the future. However, you’ll need a KCM for the gestures, I hope someone will pop up and create that. Thinking about that, if there is a (KDE) coding project I might undertake, that’s one I’d like to do. We’ll see if I’ll have time to learn C!

            Good work as always, keep it up.

  3. Please forgive me for asking in such a strange place, but the kde and qt bug reports on the wayland issue have had no comments for around a month, and I’ve been unable to find more info on the issue:

    What’s the state of/plan for kde on wayland with qt 5.8+? I understand qt broke their versioning stability guarantee, and I’m not pointing fingers at you guys. I’m in an interesting spot on opensuse tumbleweed of wanting to test bleeding edge kde, but the unstable kde repositories switched to qt 5.8 a bit ago.

    You guys also previously had the best wayland experience for me out of all I’ve tried (gnome, enlightenment, sway).

    1. What’s the state of/plan for kde on wayland with qt 5.8+?

      Qt 5.8 is dead – there won’t be any bugfixes for Qt 5.8. The bug fix required on the Qt side is only available in Qt 5.9. Given that, I cannot recommend Qt 5.8 to anyone, especially as it’s not the only issue.

  4. Ah, very nice. I was wondering whether that was already available, what with the screen edge support already being there, so it’s nice to know.

    And wonderful timing, too: I’m getting a new convertible soon, and I can’t wait to see how Plasma (Mobile) performs on it. I’ll try and compare several desktops and see what the pros and cons of each are. Having gesture support is really nice for on-screen keyboard invocation.

  5. In touchegg, but not in libinput-gestures, one can also use three- or four-finger “dragging” instead of “swiping”. Dragging can be used, for instance, to move the window (as in Alt-F3 Move) continuously, or for scrolling. I found this nice and used the window move with three fingers. It was only triggered when the three fingers started the motion left or right. When three fingers went up or down, something else was activated (in my config it was maximize and minimize window).

Comments are closed.