Mouse event generation from touch events
OK, I've seen this question sitting here for a while, and nobody has come up with the answer I'm about to give.
Touch events, unlike mouse events, can be associated with many points of contact on the user interface. To accommodate this, touch events provide a list of touch points. Since a mouse cannot be in two places at once, the two interaction methods really should be handled separately for the best user experience. OP, since you are not asking about detecting whether a device has touch or a mouse, I'll leave that for someone else to ask.
Handling both
Mouse and touch events can coexist. Adding listeners for mouse or touch events on a device that lacks one or the other is not a problem; the missing input interface simply never generates any events. This makes it simple to implement a transparent solution for your page.
It comes down to which interface you prefer to write for: you emulate that interface when the hardware for it is not available. In this case I will emulate a mouse from any touch events that are generated.
Creating events programmatically
The code uses the MouseEvent constructor to create and dispatch events. It is easy to use, and the resulting events are indistinguishable from real mouse events. For a detailed description of MouseEvent, see MDN MouseEvent.
At its most basic, create a mouse click event and dispatch it to the document:
var event = new MouseEvent("click", {
    'view': window,
    'bubbles': true,
    'cancelable': true
});
document.dispatchEvent(event);
You can also dispatch the event to individual elements:
document.getElementById("someButton").dispatchEvent(event);
Listening for the event is exactly the same as listening for a real mouse:
document.getElementById("someButton").addEventListener(function(event){
The second argument to the MouseEvent constructor is where you can add extra information about the event: for example, clientX and clientY for the mouse position, or which and buttons for which button(s) are pressed.
If you have ever looked at MouseEvent, you will see that it has a lot of properties. Exactly what you send in the mouse event will depend on what your event listener uses.
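For example, here is a sketch of a move event that carries a position and reports the left button held down (the coordinates and element id are placeholders):

var moveEvent = new MouseEvent("mousemove", {
    'view': window,
    'bubbles': true,
    'cancelable': true,
    'clientX': 120,   // placeholder position
    'clientY': 64,
    'button': 0,      // left button caused the event
    'buttons': 1      // left button is currently held down
});
document.getElementById("someButton").dispatchEvent(moveEvent);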
Touch events
Touch events are a lot like mouse events. There are touchstart, touchmove, and touchend. They differ in that they provide multiple positions, one entry for each contact point. I'm not sure what the maximum is, but for this answer we are only interested in one. See MDN TouchEvent for details.
What we need to do is take touch events that involve only one contact point and generate the corresponding mouse events at the same position. If a touch event reports more than one contact point, we cannot know which one has focus, so we will simply ignore those events.
function touchEventHandler(event){
    if (event.touches.length > 1){ return; } // more than one contact point, ignore the event
    // ... create the matching mouse event (see below)
}
So, now that we know the touch has a single contact point, we can go about creating mouse events based on the information in the touch event.
At its most basic:
var touch = event.changedTouches[0];     // get the position information
var mouseEventType;
if (event.type === "touchmove"){
    mouseEventType = "mousemove";        // the name of the mouse event
} else if (event.type === "touchstart"){ // this touch will emulate
    mouseEventType = "mousedown";
} else if (event.type === "touchend"){
    mouseEventType = "mouseup";
}
var mouseEvent = new MouseEvent(         // create the event
    mouseEventType,                      // type of event
    {
        'view': event.target.ownerDocument.defaultView,
        'bubbles': true,
        'cancelable': true,
        'screenX': touch.screenX,        // get the touch coords
        'screenY': touch.screenY,        // and add them to the
        'clientX': touch.clientX,        // mouse event
        'clientY': touch.clientY,
    }
);
// send it to the same target as the touch event's contact point
touch.target.dispatchEvent(mouseEvent);
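To wire this up, register the same handler for all three touch events (a sketch; someElement is a placeholder id):

var someElement = document.getElementById("someElement"); // placeholder id
someElement.addEventListener("touchstart", touchEventHandler);
someElement.addEventListener("touchmove", touchEventHandler);
someElement.addEventListener("touchend", touchEventHandler);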
Now your mouse listeners will receive mousedown, mousemove, and mouseup events when the user touches the device at a single point.
Missing click
All good so far, but there is one mouse event still missing that we do need: "onclick". I am not sure there is an equivalent touch event, but from what I have seen, there is enough information in the events we already have to decide whether a set of touch events can be read as a single click.
This will depend on how far apart the initial and final touch positions are: more than a few pixels and it is a drag. It will also depend on how long it took. Although this is not quite the same as a mouse, where the button can be held down, the action confirmed on release, or cancelled by moving away, I find that people simply tap a button; that is not how people use a touch interface.
So I record the time when the touchstart event occurs, via event.timeStamp, along with where it started. Then, in the touchend handler, I find how far the touch has moved and how much time has passed. If both are under the limits I have set, I generate a mouse click event along with the mouseup event.
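A sketch of that check, assuming the start position and time were saved in the touchstart handler (the variable names and threshold values are illustrative):

var startX, startY, startTime; // set in the touchstart handler

function endsAsClick(event){
    var touch = event.changedTouches[0];
    var dx = touch.clientX - startX;
    var dy = touch.clientY - startY;
    var dist = Math.sqrt(dx * dx + dy * dy);
    var elapsed = event.timeStamp - startTime;
    return dist <= 10 && elapsed <= 250; // illustrative limits, in pixels and milliseconds
}

In the touchend branch, dispatch a "click" MouseEvent after the mouseup whenever endsAsClick(event) returns true.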
So that is the basic way to convert touch events into mouse events.
Some code
Below is a tiny API called mouseTouch that does what I just explained. It covers the most basic mouse interactions required for a simple graphical application.
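A minimal sketch of such a helper, built from the pieces above (the option names and threshold defaults are assumptions, not a canonical implementation):

function mouseTouch(element, options){
    options = options || {};
    var clickTimeMs = options.clickTimeMs || 250; // assumed default: max tap duration
    var clickMovePx = options.clickMovePx || 10;  // assumed default: max tap movement
    var startX, startY, startTime;

    function toMouseEvent(type, touch){
        return new MouseEvent(type, {
            'view': element.ownerDocument.defaultView,
            'bubbles': true,
            'cancelable': true,
            'screenX': touch.screenX,
            'screenY': touch.screenY,
            'clientX': touch.clientX,
            'clientY': touch.clientY
        });
    }

    function handler(event){
        if (event.touches.length > 1){ return; } // ignore multi-touch
        var touch = event.changedTouches[0];
        if (event.type === "touchstart"){
            startX = touch.clientX;              // remember where and when
            startY = touch.clientY;              // the touch began
            startTime = event.timeStamp;
            touch.target.dispatchEvent(toMouseEvent("mousedown", touch));
        } else if (event.type === "touchmove"){
            touch.target.dispatchEvent(toMouseEvent("mousemove", touch));
        } else if (event.type === "touchend"){
            touch.target.dispatchEvent(toMouseEvent("mouseup", touch));
            var dx = touch.clientX - startX;
            var dy = touch.clientY - startY;
            var moved = Math.sqrt(dx * dx + dy * dy);
            if (moved <= clickMovePx && event.timeStamp - startTime <= clickTimeMs){
                touch.target.dispatchEvent(toMouseEvent("click", touch)); // short, still touch: a click
            }
        }
        event.preventDefault(); // stop the browser generating its own mouse events
    }

    // {passive: false} so preventDefault is allowed in the handler
    element.addEventListener("touchstart", handler, {passive: false});
    element.addEventListener("touchmove", handler, {passive: false});
    element.addEventListener("touchend", handler, {passive: false});
}

// usage: mouseTouch(document.getElementById("canvas")); // placeholder id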
I hope that helps you with your code.