Click and Drag Mobile Touchscreen Mouse Mapping

Date Posted: Mar. 30, 2020
Last Updated: Mar. 30, 2020

While trying to help someone find the perfect remote desktop / remote control / VNC / etc. app, I realized that many Android and iOS apps are missing a key feature when it comes to controlling a desktop mouse with your phone… click-and-drag!

To be clear, a click-and-drag operation is when you press the left mouse button, move the cursor to a new position while still holding the button down, and then release it.
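In terms of the raw events a host computer sees, a click-and-drag is just a button press, a series of moves, and a release. A minimal sketch of that sequence (the function name and event shapes here are my own, not from any particular remote-control protocol):

```javascript
// Build the raw mouse-event sequence for a click-and-drag from point a to b.
// Event names and shapes are illustrative, not tied to a specific protocol.
function dragEvents(a, b, steps = 2) {
  const events = [{ type: "mousedown", button: "left", x: a.x, y: a.y }];
  for (let i = 1; i <= steps; i++) {
    // Interpolate intermediate positions so the host sees a continuous drag,
    // not a teleport from a to b.
    const t = i / steps;
    events.push({
      type: "mousemove",
      x: a.x + (b.x - a.x) * t,
      y: a.y + (b.y - a.y) * t,
    });
  }
  events.push({ type: "mouseup", button: "left", x: b.x, y: b.y });
  return events;
}
```

The hard part, as discussed below, is not generating this sequence but deciding *when* a touchscreen gesture should produce it.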

At first, this might not seem like a huge deal; there are many things you can do on a computer without needing to click-and-drag. However, when it comes to collaborative and creative tools, this becomes a nightmare as pretty much everything requires click-and-drag support! Annotating, drawing, moving virtual objects, rearranging slides, etc…

The more I thought about this, and the more “big league” apps (more than a million downloads) I came across without click-and-drag, the more I started to doubt my own assumptions about mobile development. Is there a good reason why this feature is so often omitted from remote control apps? Is there an OS-level API blocker? Permissions issues?

I decided to do some digging…


Initial Thoughts

My guess as to why this is so often omitted is that mapping a touchscreen to mouse input is complicated by the fact that there are multiple ways to do it, and no perfect mapping system exists. There is no perfect mapping because a mouse doesn’t act like a touchscreen. With a mouse, you can move from point A to point B without holding down any buttons. But to move from point A to point B on a touchscreen, without jumping in between, you have to hold your finger to the screen while moving. How should this be interpreted?

  • If we map “user holds finger to screen and moves from A to B” to the mouse simply moving from A to B, then we cannot, at the same time, interpret it as a click and drag operation.
    • This might be called “touch mouse” mode
    • This is why many apps require a secondary indicator action from the user (long press, multi-finger tap, etc.) as the start of a click-and-drag operation. But an equal or greater number of apps seem not to bother implementing this at all, and just omit the click-and-drag ability altogether (looking at you, Zoom)
  • If we map “user holds finger to screen and moves from A to B” to the mouse clicking and dragging, without requiring that the user long press first or something like that, then we lose the ability for the user to move the mouse along a continuous path without dragging at the same time.
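The long-press approach from the first bullet can be sketched as a small state machine: plain finger movement moves the cursor, and only a long press (finger held still past some threshold) begins a drag. This is a sketch of the general idea, not any specific app's implementation; the threshold and event names are my own assumptions:

```javascript
// Sketch of a "touch mouse" mapper: finger movement moves the cursor;
// a long press (finger held still for >= holdMs) begins a click-and-drag.
// The threshold and output event shapes are illustrative assumptions.
function createTouchMapper(holdMs = 500) {
  let downAt = null;      // timestamp of touchstart
  let movedEarly = false; // finger moved before the long-press threshold
  let dragging = false;
  return {
    // Each handler returns the mouse action(s) to forward to the host.
    touchstart(t) {
      downAt = t;
      movedEarly = false;
      dragging = false;
      return [];
    },
    touchmove(t, x, y) {
      if (downAt !== null && !dragging && !movedEarly) {
        if (t - downAt >= holdMs) {
          // Finger stayed put long enough: this movement starts a drag.
          dragging = true;
          return [
            { type: "mousedown", button: "left" },
            { type: "mousemove", x, y },
          ];
        }
        movedEarly = true; // moved too soon: treat as plain cursor movement
      }
      return [{ type: "mousemove", x, y }];
    },
    touchend() {
      const out = dragging ? [{ type: "mouseup", button: "left" }] : [];
      downAt = null;
      dragging = false;
      movedEarly = false;
      return out;
    },
  };
}
```

The tradeoff is visible right in the code: the drag branch only ever fires after the user performs the extra indicator gesture, which is exactly the discoverability cost that apps taking the second option avoid.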

Real-World Examples:

Apps Missing Click-and-Drag Support

Here are some examples of apps not supporting click-and-drag on mobile for remote mouse control:

  • A viewer on a mobile device has slightly more limited control than a viewer on a desktop or laptop computer. They can press to simulate a left mouse-button click anywhere on the screen share, and they can perform keyboard entry by first bringing up the keyboard using a button provided in their viewing interface. Dragging and dropping, or dragging scrollbars, is not supported, nor are keyboard shortcuts such as Ctrl+B since there is no way of generating them on most mobile devices.

  • Zoom: there is no control to start click-and-drag

  • Many don’t allow for remote mouse control from a mobile touchscreen, period!

Different Mapping Options

Here are some examples of developers using some different mapping options:

Web Complexity

An added “wrinkle” in developing a touchscreen-to-mouse mapping is that if your app is a webpage, or a wrapper around a web app, then you are going to receive both mouse and touch events, and you need logic to differentiate between them and map each appropriately.

For example, you might get all of the below:

  • pointerover -> pointerenter -> pointerdown -> touchstart -> pointerup -> pointerout -> pointerleave -> touchend
  • mouseover -> mousemove -> mousedown -> mouseup -> click
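A common way to cope is to suppress the “synthetic” mouse events that browsers fire shortly after a touch, so each physical tap is handled exactly once. A sketch of that suppression logic (the time window is my own assumption; real browser behavior varies):

```javascript
// Sketch: ignore the synthetic mouse events a browser fires shortly after
// a touch, so each physical tap is handled exactly once.
// The 700ms window is an assumption; real browsers vary.
function createEventFilter(windowMs = 700) {
  let lastTouchAt = -Infinity;
  return {
    // Returns true if the event should be handled, false if it's a
    // likely duplicate of a touch that was already handled.
    shouldHandle(type, timestamp) {
      if (type.startsWith("touch")) {
        lastTouchAt = timestamp;
        return true;
      }
      if (type.startsWith("mouse") || type === "click") {
        // Drop mouse events that arrive right after a touch.
        return timestamp - lastTouchAt > windowMs;
      }
      return true; // pointer events etc. pass through
    },
  };
}
```

On browsers with Pointer Events support, listening for `pointer*` events and checking `pointerType` sidesteps much of this, but the filter above illustrates why the dual-event stream adds real implementation cost.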

Related article (although dated): Touch And Mouse


Conclusion

Although this is a common issue, there are enough exceptions to make it clear that this is not an OS limitation of Android or iOS; it has more to do with how each individual app developer decided to implement the handling of touch events.

For example, Android-VNC-Viewer, one of the earlier VNC clients for Android, has a great breakdown in their wiki on different mapping modes that they make available to the user, which includes click-and-drag support.

So, the TLDR? Most likely this feature is omitted simply because of the added complexity. Eye-roll please.

Hackish Workaround (Zoom)

For Zoom, I built a quick workaround kludge that intercepts when the Zoom Android client sends a right click (which happens on a “long press”) and turns it into the start of a click-and-drag operation. You can find it here:
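
The gist of the kludge can be sketched as a button remap (this is an illustrative sketch with my own names, not a copy of the actual code): when the host receives a right-button press or release, treat it as the left button, so Zoom's long-press gesture effectively starts and ends a click-and-drag.

```javascript
// Sketch of the remap: treat an incoming right-button press/release as the
// left button, so a mobile long press (which Zoom delivers as a right click)
// starts and ends a click-and-drag. Names are illustrative, not Zoom's API.
function remapButton(event) {
  if (
    (event.type === "mousedown" || event.type === "mouseup") &&
    event.button === "right"
  ) {
    return { ...event, button: "left" };
  }
  return event;
}
```

The obvious cost of this trick is that you give up the real right-click while it is active, which is why it is a kludge rather than a proper mapping mode.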

2 thoughts on “Click and Drag Mobile Touchscreen Mouse Mapping”

  1. Parsecplayer1 says:

Hello, I have much the same problem, but with the Parsec Android client connecting to a Parsec Windows host. Moving a finger on the Android screen makes the Windows mouse move/hover, but I would prefer a left click-and-drag instead. Can you make an update to your code to handle this?

    1. joshuatz says:

      Sorry for taking a while to respond to this. If you, or anyone else, is wondering about this, I think you could just swap the buttons on lines 150 & 182 and it would work (maybe – it’s been a while since I wrote this).
