touchHandler module
Handles touchscreen interaction. Provides input gestures for touchscreens, touch modes, and other support facilities. To use touch features, NVDA must be installed on a touchscreen computer.
- class touchHandler.POINTER_INFO
Bases: Structure
- _fields_ = [('pointerType', <class 'ctypes.c_ulong'>), ('pointerId', <class 'ctypes.c_ulong'>), ('frameId', <class 'ctypes.c_ulong'>), ('pointerFlags', <class 'ctypes.c_ulong'>), ('sourceDevice', <class 'ctypes.c_void_p'>), ('hwndTarget', <class 'ctypes.c_void_p'>), ('ptPixelLocation', <class 'ctypes.wintypes.POINT'>), ('ptHimetricLocation', <class 'ctypes.wintypes.POINT'>), ('ptPixelLocationRaw', <class 'ctypes.wintypes.POINT'>), ('ptHimetricLocationRaw', <class 'ctypes.wintypes.POINT'>), ('dwTime', <class 'ctypes.c_ulong'>), ('historyCount', <class 'ctypes.c_ulong'>), ('inputData', <class 'ctypes.c_long'>), ('dwKeyStates', <class 'ctypes.c_ulong'>), ('PerformanceCount', <class 'ctypes.c_ulonglong'>)]
- PerformanceCount
Structure/Union member
- dwKeyStates
Structure/Union member
- dwTime
Structure/Union member
- frameId
Structure/Union member
- historyCount
Structure/Union member
- hwndTarget
Structure/Union member
- inputData
Structure/Union member
- pointerFlags
Structure/Union member
- pointerId
Structure/Union member
- pointerType
Structure/Union member
- ptHimetricLocation
Structure/Union member
- ptHimetricLocationRaw
Structure/Union member
- ptPixelLocation
Structure/Union member
- ptPixelLocationRaw
Structure/Union member
- sourceDevice
Structure/Union member
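The _fields_ listing above is easier to read as ctypes source. A minimal sketch of how such a structure is declared and used (only a subset of the fields, with POINT redefined locally so the sketch runs off Windows instead of importing ctypes.wintypes):

```python
import ctypes


class POINT(ctypes.Structure):
    """Local stand-in for ctypes.wintypes.POINT."""
    _fields_ = [("x", ctypes.c_long), ("y", ctypes.c_long)]


class POINTER_INFO(ctypes.Structure):
    """Simplified mirror of touchHandler.POINTER_INFO (subset of fields)."""
    _fields_ = [
        ("pointerType", ctypes.c_ulong),
        ("pointerId", ctypes.c_ulong),
        ("frameId", ctypes.c_ulong),
        ("pointerFlags", ctypes.c_ulong),
        ("ptPixelLocation", POINT),
        ("dwTime", ctypes.c_ulong),
    ]


# Fields can be set by keyword and read back as attributes.
info = POINTER_INFO(pointerId=1, ptPixelLocation=POINT(x=120, y=240))
```

The full structure mirrors the Win32 POINTER_INFO layout, which is why every scalar field is a fixed-width ctypes type rather than a plain Python int.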
- class touchHandler.POINTER_TOUCH_INFO
Bases: Structure
- _fields_ = [('pointerInfo', <class 'touchHandler.POINTER_INFO'>), ('touchFlags', <class 'ctypes.c_ulong'>), ('touchMask', <class 'ctypes.c_ulong'>), ('rcContact', <class 'ctypes.wintypes.RECT'>), ('rcContactRaw', <class 'ctypes.wintypes.RECT'>), ('orientation', <class 'ctypes.c_ulong'>), ('pressure', <class 'ctypes.c_ulong'>)]
- orientation
Structure/Union member
- pointerInfo
Structure/Union member
- pressure
Structure/Union member
- rcContact
Structure/Union member
- rcContactRaw
Structure/Union member
- touchFlags
Structure/Union member
- touchMask
Structure/Union member
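A quick sketch of populating this kind of structure. The class below is a simplified local mirror of the fields listed above (pointerInfo omitted for brevity, RECT redefined locally), not the touchHandler definition itself; Microsoft's pointer APIs document pressure as normalized to a 0–1024 range:

```python
import ctypes


class RECT(ctypes.Structure):
    """Local stand-in for ctypes.wintypes.RECT."""
    _fields_ = [
        ("left", ctypes.c_long), ("top", ctypes.c_long),
        ("right", ctypes.c_long), ("bottom", ctypes.c_long),
    ]


class TOUCH_INFO_SKETCH(ctypes.Structure):
    """Simplified mirror of touchHandler.POINTER_TOUCH_INFO
    (the nested pointerInfo field is omitted for brevity)."""
    _fields_ = [
        ("touchFlags", ctypes.c_ulong),
        ("touchMask", ctypes.c_ulong),
        ("rcContact", RECT),
        ("orientation", ctypes.c_ulong),
        ("pressure", ctypes.c_ulong),
    ]


# Describe a contact area 4 pixels square centred on (100, 200),
# at roughly half pressure.
touch = TOUCH_INFO_SKETCH(
    rcContact=RECT(left=98, top=198, right=102, bottom=202),
    pressure=512,
)
```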
- class touchHandler.TouchInputGesture(*args, **kwargs)
Bases: InputGesture
Represents a gesture performed on a touch screen. Possible actions are:
- Tap: a finger touches the screen only for a very short amount of time.
- Flick{Left|Right|Up|Down}: a finger swipes the screen in a particular direction.
- Tap and hold: a finger taps the screen, then touches the screen again, this time remaining held.
- Hover down: a finger touches the screen long enough for the gesture not to be a tap, and it is not already part of a tap and hold.
- Hover: a finger is still touching the screen, and may be moving around. Only the most recent finger to be hovering causes these gestures.
- Hover up: a finger that was classed as a hover releases contact with the screen.
All actions except for hover down, hover and hover up can be made up of multiple fingers, allowing gestures such as a 3-finger tap, a 2-finger tap and hold, or a 4-finger flick right. Taps may be pluralized (i.e. a tap very quickly followed by another tap with the same number of fingers is represented as a double tap rather than two separate taps); double, triple and quadruple plural taps are currently detected. Tap and holds can also be pluralized (e.g. a double tap and hold means there were two taps before the hold). Actions also communicate whether other fingers are held while performing the action; e.g. a hold+tap is when one finger touches the screen long enough to become a hover, and another finger performs a tap while the first remains on the screen. Holds themselves can also be made up of multiple fingers. Gestures can therefore be as complicated as a 5-finger hold + 5-finger quadruple tap and hold. To find the generalized point on the screen at which the gesture was performed, use this gesture's x and y properties. If low-level information about the fingers and sub-gestures making up this gesture is required, access the gesture's tracker and preheldTracker properties.
See touchHandler.MultitouchTracker for definitions of the available properties.
- counterNames = ['single', 'double', 'tripple', 'quodruple']
- pluralActionLabels = {'double': 'double {action}', 'quodruple': 'quadruple {action}', 'single': 'single {action}', 'tripple': 'tripple {action}'}
- _get_speechEffectWhenExecuted()
- _get_reportInInputHelp()
- _get_identifiers()
The identifier(s) which will be used in input gesture maps to represent this gesture. These identifiers will be normalized and looked up in order until a match is found. A single identifier should take the form: C{source:id} where C{source} is a few characters representing the source of this gesture and C{id} is the specific gesture. An example identifier is: C{kb(desktop):NVDA+1}
This property should not perform normalization itself. However, please note the following regarding normalization. If C{id} contains multiple chunks separated by a + sign, they are considered to be ordered arbitrarily and may be reordered when normalized. Normalization also ensures that the entire identifier is lower case. For example, NVDA+control+f1 and control+nvda+f1 will match when normalized. See L{normalizeGestureIdentifier} for more details.
Subclasses must implement this method. @return: One or more identifiers which uniquely identify this gesture. @rtype: list or tuple of str
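The normalization rules described above (lower-casing, plus arbitrary ordering of the +-separated chunks in the id) can be sketched as follows. This is an illustrative re-implementation for the example identifiers given, not inputCore.normalizeGestureIdentifier itself:

```python
def normalize_identifier(identifier: str) -> str:
    """Illustrative sketch of gesture identifier normalization:
    lower-case the whole identifier and sort the '+'-separated
    chunks of the id, so equivalent identifiers compare equal."""
    source, _, gestureId = identifier.partition(":")
    chunks = sorted(gestureId.lower().split("+"))
    return f"{source.lower()}:{'+'.join(chunks)}"


# NVDA+control+f1 and control+nvda+f1 match once normalized:
assert (
    normalize_identifier("kb(desktop):NVDA+control+f1")
    == normalize_identifier("kb(desktop):control+nvda+f1")
)
```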
- RE_IDENTIFIER = re.compile('^ts(?:\\((.+?)\\))?:(.*)$')
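This pattern splits a touch identifier into its optional parenthesised touch mode and the gesture id proper. A short demonstration using the same regular expression (the identifier strings are hypothetical examples for illustration):

```python
import re

# Same pattern as TouchInputGesture.RE_IDENTIFIER above.
RE_IDENTIFIER = re.compile(r"^ts(?:\((.+?)\))?:(.*)$")

# With a touch mode: group 1 is the mode, group 2 the gesture id.
m = RE_IDENTIFIER.match("ts(text):flickDown")
mode, gestureId = m.groups()

# Without a mode, group 1 is None.
m2 = RE_IDENTIFIER.match("ts:tap")
```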
- classmethod getDisplayTextForIdentifier(identifier)
Get the text to be presented to the user describing a given gesture identifier. This should only be called with normalized gesture identifiers returned by the L{normalizedIdentifiers} property in the same subclass. For example, C{KeyboardInputGesture.getDisplayTextForIdentifier} should only be called for “kb:*” identifiers returned by C{KeyboardInputGesture.normalizedIdentifiers}. Most callers will want L{inputCore.getDisplayTextForIdentifier} instead. The display text consists of two strings: the gesture’s source (e.g. “laptop keyboard”) and the specific gesture (e.g. “alt+tab”). @param identifier: The normalized gesture identifier in question. @type identifier: str @return: A tuple of (source, specificGesture). @rtype: tuple of (str, str) @raise Exception: If no display text can be determined.
- _get__immediate()
- _abc_impl = <_abc._abc_data object>
- _immediate
- identifiers: List[str] | Tuple[str, ...]
- reportInInputHelp
Indicates that this gesture should be reported in input help mode. This would only be false for flooding gestures like touch screen hovers. @type: bool
- speechEffectWhenExecuted
The effect on speech when this gesture is executed; one of the SPEECHEFFECT_* constants or C{None}.
- class touchHandler.TouchHandler
Bases: Thread
This constructor should always be called with keyword arguments. Arguments are:
group should be None; reserved for future extension when a ThreadGroup class is implemented.
target is the callable object to be invoked by the run() method. Defaults to None, meaning nothing is called.
name is the thread name. By default, a unique name is constructed of the form “Thread-N” where N is a small decimal number.
args is a list or tuple of arguments for the target invocation. Defaults to ().
kwargs is a dictionary of keyword arguments for the target invocation. Defaults to {}.
If a subclass overrides the constructor, it must make sure to invoke the base class constructor (Thread.__init__()) before doing anything else to the thread.
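TouchHandler follows the standard Thread subclass pattern: override run() and chain to Thread.__init__() before anything else. A generic sketch of that pattern (not NVDA's actual implementation), with a terminate() method in the spirit of the one documented below:

```python
import threading
import queue


class PumpThread(threading.Thread):
    """Generic worker thread sketch: chains to Thread.__init__(),
    overrides run(), and offers a terminate() method."""

    def __init__(self):
        # Invoke the base class constructor before touching the thread.
        super().__init__(name="PumpThread")
        self._queue = queue.Queue()
        self.processed = []

    def run(self):
        # Pump items until a None sentinel arrives.
        while (item := self._queue.get()) is not None:
            self.processed.append(item)

    def post(self, item):
        self._queue.put(item)

    def terminate(self):
        self._queue.put(None)  # sentinel asks run() to exit
        self.join()


t = PumpThread()
t.start()
t.post("gesture")
t.terminate()
```

Because terminate() joins the thread after posting the sentinel, all previously posted items are guaranteed to have been processed when it returns.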
- terminate()
- run()
Method representing the thread’s activity.
You may override this method in a subclass. The standard run() method invokes the callable object passed to the object’s constructor as the target argument, if any, with sequential and keyword arguments taken from the args and kwargs arguments, respectively.
- inputTouchWndProc(hwnd, msg, wParam, lParam)
- setMode(mode)
- pump()
- notifyInteraction(obj)
Notify the system that UI interaction is occurring via touch. This should be called when performing an action on an object. @param obj: The NVDAObject with which the user is interacting. @type obj: L{NVDAObjects.NVDAObject}
- touchHandler.touchSupported(debugLog: bool = False) bool
Returns whether the system and current NVDA session support touchscreen interaction. @param debugLog: Whether to log additional details about touch support to the NVDA log.
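One plausible low-level ingredient of such a check on Windows is GetSystemMetrics(SM_DIGITIZER). The sketch below is an assumption for illustration only, not NVDA's actual implementation (which also considers properties of the NVDA session itself); it simply returns False off Windows:

```python
import ctypes
import sys

SM_DIGITIZER = 94        # GetSystemMetrics index for digitizer capabilities
NID_READY = 0x80         # digitizer is ready for input
NID_MULTI_INPUT = 0x40   # digitizer supports multiple contacts


def sketch_touch_supported() -> bool:
    """Hypothetical touch-capability probe (assumption, not NVDA's code)."""
    if sys.platform != "win32":
        return False
    flags = ctypes.windll.user32.GetSystemMetrics(SM_DIGITIZER)
    return bool(flags & NID_READY) and bool(flags & NID_MULTI_INPUT)
```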
- touchHandler.setTouchSupport(enable: bool)
- touchHandler.handlePostConfigProfileSwitch()
- touchHandler.initialize()
- touchHandler.terminate()