Input Handling

Translating mobile touch gestures to a remote desktop environment requires a specialized pipeline. AVNC splits input interpretation across three main classes: TouchHandler, KeyHandler, and Dispatcher.

Input Flow

     +----------------+     +--------------------+     +--------------+
     |  Touch events  |     |     Key events     |     | Virtual keys |
     +----------------+     +--------------------+     +--------------+
             |                        |                        |
             v                        v                        |
     +----------------+     +--------------------+             |
     | [TouchHandler] |     |    [KeyHandler]    |<------------+
     +----------------+     +--------------------+
             |                        |
             |                        v
             |              +--------------------+
             +------------->|    [Dispatcher]    |
                            +--------------------+
                                      |
                                      |
                 +--------------------+---------------------+
                 |                    |                     |
                 v                    v                     v
         +---------------+    +----------------+    +---------------+
         |  [Messenger]  |    | [VncViewModel] |    | [VncActivity] |
         +---------------+    +----------------+    +---------------+

1. Touch Handler (TouchHandler.kt)

The TouchHandler interprets raw Android MotionEvents.

  • Gesture Detection: Leverages a custom-tuned GestureDetector to recognize single taps, double taps, long presses, two/three-finger taps, and multi-finger swipes.
  • Swipe vs. Scale Detection: Android's stock detectors often confuse a two-finger swipe with a pinch-to-zoom. AVNC compares the movement vectors of the individual fingers: if they point in roughly the same direction (low angle variance), it triggers a pan/scroll; if they diverge or converge, it triggers a zoom.
  • Hardware Passthrough: Physical mice and stylus events are identified by their InputDevice source and routed directly as absolute pointer clicks/moves.
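The swipe-vs-scale heuristic can be sketched in plain Kotlin. This is an illustrative reconstruction, not AVNC's actual code: the names, the angle comparison, and the 45° threshold are assumptions.

```kotlin
import kotlin.math.PI
import kotlin.math.abs
import kotlin.math.atan2

// Per-finger movement since the last event (illustrative type).
data class Motion(val dx: Double, val dy: Double)

// Compare the direction of each finger's movement vector.
// Roughly parallel motion => two-finger pan/scroll;
// diverging or converging motion => pinch zoom.
fun classifyTwoFingerGesture(f1: Motion, f2: Motion): String {
    val raw = abs(atan2(f1.dy, f1.dx) - atan2(f2.dy, f2.dx))
    val diff = if (raw > PI) 2 * PI - raw else raw  // wrap to [0, PI]
    return if (diff < PI / 4) "pan" else "zoom"     // 45° is an assumed cutoff
}
```

Fingers sliding up together classify as a pan; fingers moving apart classify as a zoom.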

2. Key Handler (KeyHandler.kt)

The RFB (VNC) protocol transmits keyboard input as X11 KeySyms rather than Android keycodes or hardware scan codes, so KeyHandler must translate every key event.

  • Unicode Composition: Mobile keyboards use character maps that send diacritics (dead keys like ^ or ~) separately. KeyHandler buffers these combining accents and composes the final Unicode character before looking up the X11 KeySym.
  • Fake Shifts: Software keyboards (like Gboard) frequently report uppercase characters without reporting a Shift key ACTION_DOWN. The handler synthesizes fake Shift press/release events to ensure the remote server respects the capitalization.
  • XT Keycodes: AVNC supports the QEMU Extended Key Event protocol. If the server supports it, AVNC passes raw Linux kernel scancodes to avoid X11 layout translation issues.
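The dead-key composition and KeySym lookup can be sketched as follows. The 0x01000000 offset for codepoints beyond Latin-1 comes from the X11 keysym convention; the function names and the tiny composition table are hypothetical, not AVNC's actual implementation.

```kotlin
// Map a composed Unicode character to an X11 KeySym.
// Printable Latin-1 keysyms equal the codepoint; higher codepoints
// use the Unicode keysym range (0x01000000 | codepoint).
fun toKeySym(ch: Char): Int {
    val cp = ch.code
    return if (cp in 0x20..0xFF) cp else 0x01000000 or cp
}

// Dead-key composition: a buffered combining accent is merged with the
// next base character before the KeySym lookup, e.g. '^' + 'e' -> 'ê'.
// Real composition tables are much larger; this is a toy subset.
fun composeDeadKey(accent: Char, base: Char): Char? {
    val table = mapOf(
        ('^' to 'e') to 'ê',
        ('~' to 'n') to 'ñ',
        ('´' to 'e') to 'é',
    )
    return table[accent to base]
}
```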

3. Dispatcher (Dispatcher.kt)

The Dispatcher translates recognized gestures into concrete RFB actions based on user preferences.

  • Direct Mode (Touchscreen): Actions (clicks) happen exactly at the absolute framebuffer coordinates mapped beneath the finger.
  • Relative Mode (Touchpad): The device acts like a laptop trackpad. Taps trigger clicks at the current remote cursor location. Movement drags the cursor relative to its last position.
  • Pointer Acceleration: In Touchpad mode, the PointerAcceleration class implements velocity-based acceleration curves (similar to libinput on Linux), allowing precise movements at slow speeds and quick traversal of large screens at high speeds.
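A minimal sketch of such a velocity-based curve, loosely in the spirit of libinput's adaptive profile: deltas pass through unscaled below a velocity threshold and get a power-law gain above it. The threshold, exponent, and function name are all illustrative assumptions, not values from AVNC.

```kotlin
import kotlin.math.pow

// Scale a pointer delta by a gain derived from the finger's velocity.
// Below the threshold the movement is 1:1 (precision); above it the
// gain grows sublinearly (fast traversal without losing control).
fun accelerate(delta: Float, velocityPxPerSec: Float): Float {
    val threshold = 500f  // assumed cutoff in px/s
    val gain = if (velocityPxPerSec <= threshold) 1f
               else (velocityPxPerSec / threshold).pow(0.6f)
    return delta * gain
}
```

A slow 10 px movement stays 10 px, while the same delta at high velocity covers roughly twice the distance on the remote screen.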