How do touch gestures work in ThinLinc?

Both the native[1] and browser-based ThinLinc clients support a number of touch gestures on devices with a touch screen. This can be useful for certain workflows, or for using ThinLinc from devices that lack a keyboard or mouse.

ThinLinc detects touch gestures on the local device and translates them into equivalent mouse and keyboard input on the remote desktop. How that input is then interpreted is application-dependent; since most GNU/Linux desktop environments and applications are designed for a keyboard and mouse, your mileage may vary.

For example, ThinLinc translates a two-finger drag (“pan”) gesture to a mouse scroll-wheel rotation on the remote desktop. Some applications will interpret this as a scroll (e.g. move the document up and down), whereas others will interpret it as a zoom (e.g. make the document bigger/smaller). When setting up a desktop environment for use with touch-screen devices, it can be useful to test that applications behave in a way that users expect.
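One way to see how the pan gesture arrives on the remote side is to watch for scroll-wheel events inside the session. This is a diagnostic sketch, assuming an X11 session with the `xev` utility installed (packaged as `x11-utils` on Debian-based systems); in X11, scroll-wheel rotation is delivered as presses of buttons 4 (up) and 5 (down):

```shell
# Run inside the ThinLinc session, then perform a two-finger pan
# over the small xev window. A pan gesture translated to scroll-wheel
# rotation appears as ButtonPress/ButtonRelease events for button 4
# (scroll up) or button 5 (scroll down).
xev | grep --line-buffered -A2 'ButtonPress'
```

If the events appear in `xev` but an application still misbehaves, the issue lies in how that application interprets scroll input rather than in the gesture translation itself.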

Since fingers are less accurate than a mouse cursor, consider making things like icons and scroll bars larger for touch-screen users. Some desktop environments and applications provide accessibility settings for the visually impaired, which may be useful here. It might also be possible to adjust DPI settings for the same purpose, depending on the size and resolution of the users’ displays.
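As one illustration, on GNOME-based desktops this kind of adjustment can be made with `gsettings`. This is a sketch, not a ThinLinc-specific mechanism; the exact keys and suitable values depend on the desktop environment and display:

```shell
# GNOME example: scale text (and many widgets sized from it) up by 25%.
gsettings set org.gnome.desktop.interface text-scaling-factor 1.25

# Use a larger mouse cursor, which is easier to follow on a touch screen.
gsettings set org.gnome.desktop.interface cursor-size 48
```

On other environments, a similar effect may be achievable by raising the X resource `Xft.dpi` (for example via `xrdb -merge`), though applications vary in how well they honor it.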

For more information on using touch gestures in ThinLinc, see the “Client Touch Gestures” chapter of the ThinLinc Administrator's Guide.

  1. Excluding macOS. ↩︎