Table of Contents
- Hackish Workaround (Zoom)
While trying to help someone find the perfect remote desktop / remote control / VNC / etc. app, I realized that many Android and iOS apps are missing a key feature when it comes to controlling a desktop mouse with your phone… click-and-drag!
To be clear, a click-and-drag operation is when you left-click your mouse and then, while still holding down the left button, move your cursor to a new position and release the button there.
At first, this might not seem like a huge deal; there are many things you can do on a computer without needing to click-and-drag. However, when it comes to collaborative and creative tools, this becomes a nightmare as pretty much everything requires click-and-drag support! Annotating, drawing, moving virtual objects, rearranging slides, etc…
The more I thought about this, and the more “big league” apps (> million downloads) I came across without click-and-drag, the more I started to doubt my own assumptions about mobile development. Is there a good reason why this feature is so often omitted from remote control apps? Is there an OS-level API blocker? Permissions issues?
I decided to do some digging…
My guess as to why this is so often omitted is that mapping a touchscreen to a mouse input is complicated by the fact that there are multiple ways to do it, and none of them is perfect. There is no perfect mapping because a mouse doesn’t act like a touchscreen. With a mouse, you can move from point A to point B without holding down any buttons. But to move from point A to point B on a touchscreen, without jumping in between, you have to hold your finger to the screen while moving. How should this be interpreted?
- If we map “user holds finger to screen and moves from A to B” to the mouse simply moving from A to B, then we cannot, at the same time, interpret it as a click and drag operation.
- This might be called “touch mouse” mode
- This is why many apps require a secondary indicator action from the user (long press, multi-finger tap, etc.) to start a click-and-drag operation. But an equal or greater number of apps don’t seem to bother implementing this at all, and simply omit click-and-drag altogether (looking at you, Zoom)
- If we map “user holds finger to screen and moves from A to B” to the mouse clicking and dragging, without requiring a long press or something like that first, then we lose the ability for the user to move the mouse along a continuous path without dragging at the same time.
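The long-press approach from the first bullet can be sketched as a small state machine. This is a hypothetical illustration (not taken from any particular app, and the `LONG_PRESS_MS` threshold is an assumption): a finger that moves right away just moves the cursor (“touch mouse” mode), while a finger held still past the threshold begins a click-and-drag.

```javascript
// Hypothetical sketch of a long-press-to-drag mapping state machine.
// Each handler returns the mouse action(s) the remote side should perform.
const LONG_PRESS_MS = 500; // assumed threshold; real apps tune this

class TouchToMouseMapper {
  constructor() {
    this.downAt = null;    // timestamp of the current touch start
    this.moved = false;    // did the finger move before the threshold?
    this.dragging = false; // are we in a click-and-drag?
  }

  touchStart(x, y, t) {
    this.downAt = t;
    this.moved = false;
    this.dragging = false;
    return [{ action: "moveTo", x, y }];
  }

  touchMove(x, y, t) {
    if (!this.dragging && !this.moved && t - this.downAt >= LONG_PRESS_MS) {
      // Finger was held still long enough: interpret motion as a drag.
      this.dragging = true;
      return [{ action: "buttonDown" }, { action: "moveTo", x, y }];
    }
    this.moved = true; // moved early: plain cursor movement, no drag
    return [{ action: "moveTo", x, y }];
  }

  touchEnd(t) {
    if (this.dragging) {
      this.dragging = false;
      return [{ action: "buttonUp" }]; // finish the drag
    }
    if (!this.moved && t - this.downAt < LONG_PRESS_MS) {
      return [{ action: "click" }];    // quick tap = left click
    }
    return [];                         // cursor movement only
  }
}
```

The key trade-off from the bullets above is visible in `touchMove`: without the long-press gate, every finger movement would have to be either always a move or always a drag.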
Apps Missing Click-and-Drag Support
Here are some examples of apps not supporting click-and-drag on mobile for remote mouse control:
> A viewer on a mobile device has slightly more limited control than a viewer on a desktop or laptop computer. They can press to simulate a left mouse-button click anywhere on the screen share, and they can perform keyboard entry by first bringing up the keyboard using a button provided in their viewing interface. Dragging and dropping, or dragging scrollbars, is not supported, nor are keyboard shortcuts such as Ctrl+B since there is no way of generating them on most mobile devices.
Many don’t allow for remote mouse control from a mobile touchscreen, period!
Different Mapping Options
Here are some examples of developers using different mapping options:
GitHub thread on the topic: https://github.com/novnc/noVNC/issues/1267
RealVNC’s unique mapping model
- Double-tap to start click-and-drag
- Even middle-click is supported (three finger tap)
Android-VNC-Viewer allows for the user to switch between mapping modes!
- This is quite rare; I wish every remote control app offered this!
- This is also a good way to show how complex mapping is
Google Chrome Remote Desktop: Android Police writeup
> Click and drag is also a bit confusing at first. Most apps use double-tap and drag, but not Chrome Remote Desktop. With this app you long-press until a ripple effect appears around the cursor. At that point you can begin dragging where you want.
TeamViewer: On Android, uses double-tap to start click and drag
Splashtop: Tablet and Mobile Phone Mappings
An added “wrinkle” in developing a touchscreen-to-mouse mapping is that if your app is a webpage / wrapper around a web app, then you are going to be receiving both mouse and touch events, and you need logic to differentiate between them and map each appropriately.
For example, a single tap on a touchscreen can generate all of the following: touchstart and touchend, followed (after a short delay) by browser-synthesized mousemove, mousedown, mouseup, and click events for compatibility.
Related article (although dated): html5rocks.com – Touch And Mouse
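One common way to handle this duplication is to track when the last touch event arrived and ignore mouse events that follow too closely, since those are likely the browser’s synthesized compatibility events. A minimal sketch, assuming a hypothetical `makeEventFilter` helper and an assumed suppression window (real browser delays vary):

```javascript
// Sketch: suppress mouse events that are likely browser-synthesized
// duplicates of touch events we already handled.
const SYNTH_WINDOW_MS = 700; // assumed window; actual browser delay varies

function makeEventFilter() {
  let lastTouchAt = -Infinity;
  // Returns true if the event should be processed, false if it looks like
  // a synthesized duplicate of a recent touch.
  return function shouldProcess(type, timestamp) {
    if (type.startsWith("touch")) {
      lastTouchAt = timestamp;
      return true;
    }
    // Mouse event: drop it if it arrives too soon after a touch.
    return timestamp - lastTouchAt > SYNTH_WINDOW_MS;
  };
}
```

In a real page you would call `shouldProcess(event.type, event.timeStamp)` at the top of each listener; the modern alternative is to use Pointer Events, which unify the two streams.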
Although this is a common issue, there are plenty of exceptions that make it clear this is not an OS limitation of Android or iOS; it has more to do with how each individual app developer decided to implement the handling of touch events.
For example, Android-VNC-Viewer, one of the earlier VNC clients for Android, has a great breakdown in their wiki on different mapping modes that they make available to the user, which includes click-and-drag support.
So, the TLDR? Most likely this feature is omitted simply because of the added complexity. Eye-roll please.
Hackish Workaround (Zoom)
For Zoom, I built a quick workaround kludge that intercepts the right click the Zoom Android client sends on a “long press” and turns it into the start of a click-and-drag operation. You can find it here: github.com/joshuatz/right-click-and-drag
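The core idea can be sketched as an event interceptor. This is only a rough illustration of the concept; the actual implementation lives in the linked repo and may differ, and the event names and `send` callback here are assumptions:

```javascript
// Rough sketch of the workaround idea: swallow the right click that a
// long press produces, start a synthetic left-button drag instead, and
// end the drag on the next left click. Event shapes are hypothetical.
function makeRightClickDragger(send) {
  // `send` is an assumed callback that forwards events to the remote side.
  let dragging = false;
  return function intercept(event) {
    if (event.type === "rightclick" && !dragging) {
      dragging = true;
      send({ type: "leftdown", x: event.x, y: event.y }); // begin drag
      return;
    }
    if (dragging && event.type === "move") {
      send(event);                                        // drag continues
      return;
    }
    if (dragging && event.type === "leftclick") {
      dragging = false;
      send({ type: "leftup", x: event.x, y: event.y });   // drop
      return;
    }
    send(event); // everything else passes through unchanged
  };
}
```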