Creates a new TouchProcessor that will dispatch events to the given stage.
The distance (in points) describing how close two touches must be to each other to be recognized as a multitap gesture.
The time period (in seconds) in which two touches must occur to be recognized as a multitap gesture.
Returns the number of fingers / touch points that are currently on the stage.
The base object that will be used for hit testing. By default, this reference points to the stage; however, you can limit touch processing to certain parts of your game by assigning a different object.
Indicates if multitouch simulation should be activated. When the user presses Ctrl/Cmd (and optionally Shift), a second touch cursor appears that mimics the first. That's an easy way to develop and test multitouch when only a mouse is available.
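For illustration, a minimal configuration sketch combining the "root" and "simulateMultitouch" properties described above. The import paths, the "Starling.current" accessor, and the container name are assumptions based on the TypeScript port's layout and may differ in your setup.

import Starling from "starling/core/Starling";
import Sprite from "starling/display/Sprite";

let starling = Starling.current;

// Hypothetical container: only objects inside it will be hit-tested.
let gameContainer = new Sprite();
starling.stage.addChild(gameContainer);
starling.touchProcessor.root = gameContainer;

// Pressing Ctrl/Cmd (optionally with Shift) now shows a second touch
// cursor that mirrors the mouse, which is handy for testing multitouch.
starling.touchProcessor.simulateMultitouch = true;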
The stage object to which the touch objects are dispatched by default.
Analyzes the current touch queue and processes the list of current touches, emptying the queue while doing so. This method is called by Starling once per frame.
Force-ends all current touches: changes the phase of all touches to 'ENDED' and, if any touches are present, immediately dispatches a new TouchEvent. Called automatically when the app receives a 'DEACTIVATE' event.
Removes all event handlers on the stage and releases any acquired resources.
Enqueues a new touch or mouse event with the given properties.
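A hedged sketch of feeding synthetic events into the processor; the parameter list (touch ID, phase, global coordinates) is assumed to mirror the original Starling API and may differ in this port.

import Starling from "starling/core/Starling";
import TouchPhase from "starling/events/TouchPhase";

let processor = Starling.current.touchProcessor;

// Simulate a tap: finger 1 goes down at (100, 200) and lifts again.
processor.enqueue(1, TouchPhase.BEGAN, 100, 200);
processor.enqueue(1, TouchPhase.ENDED, 100, 200);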
Enqueues an artificial touch that represents the mouse leaving the stage.
On OS X, we get mouse events from outside the stage; on Windows, we do not. This method enqueues an artificial hover point just outside the stage, so that objects listening for HOVER events over them are notified on every platform.
The TouchProcessor is used to convert mouse and touch events of the conventional Flash stage to Starling's TouchEvents.
The Starling instance listens to mouse and touch events on the native stage. The attributes of those events are enqueued (right as they are happening) in the TouchProcessor.
Once per frame, the "advanceTime" method is called. It analyzes the touch queue and figures out which touches are active at that moment; the properties of all touch objects are updated accordingly.
Once the list of touches has been finalized, the "processTouches" method is called (that might happen several times in one "advanceTime" execution; no information is discarded). It's responsible for dispatching the actual touch events to the Starling display tree.
Subclassing TouchProcessor
You can extend the TouchProcessor if you need to have more control over touch and mouse input. For example, you could filter the touches by overriding the "processTouches" method, throwing away any touches you're not interested in and passing the rest to the super implementation.
To use your custom TouchProcessor, assign it to the "Starling.touchProcessor" property.
Note that you should not dispatch TouchEvents yourself, since they are much more complex to handle than conventional events (e.g. an object must receive a TouchEvent only once, even if it's manipulated with several fingers). Always use the base implementation of "processTouches" to have them dispatched. That said, you can always dispatch your own custom events, of course.
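As a sketch of the subclassing approach, the following filters out touches in the top 100 points of the stage and forwards the rest to the base implementation. The exact "processTouches" signature (assumed here to take the touch list plus modifier-key flags, as in the original Starling API) and the import paths may differ in your port.

import Starling from "starling/core/Starling";
import Touch from "starling/events/Touch";
import TouchProcessor from "starling/events/TouchProcessor";

class FilteringTouchProcessor extends TouchProcessor
{
    // Discard touches near the top of the stage, then let the base class
    // handle the actual event dispatching (which it does correctly even
    // for multi-finger interactions).
    processTouches(touches: Touch[], shiftDown: boolean, ctrlDown: boolean): void
    {
        let filtered = touches.filter(touch => touch.globalY > 100);
        super.processTouches(filtered, shiftDown, ctrlDown);
    }
}

// Activate the custom processor; the constructor argument mirrors the
// default processor's (the stage that receives the events).
let starling = Starling.current;
starling.touchProcessor = new FilteringTouchProcessor(starling.stage);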