Description
The zForceKeyboardMouse example showcases how touch input can be translated to mouse or keyboard input. The example enables SAMD microcontroller-based boards (e.g. the Neonode Prototyping Board) to send keystrokes or mouse movements to the host system over HID, using the official Arduino libraries <Mouse.h> and <Keyboard.h>.
The example divides the Touch Active Area into a mouse pad section and a keyboard section containing five buttons (A-E). The mouse section works like a relative mouse pad, where the cursor moves relative to its previous position. The keyboard buttons are programmed to print the letters A-E when a touch is performed in each corresponding section.
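Before any touch data is handled, the HID interfaces and the sensor module need to be initialized. The sketch below is a minimal setup example; the Zforce.h header, the zforce.Start() and zforce.Enable() calls, and the data-ready pin value are assumptions based on the zForce Arduino library and may differ on your board.
Code Block
#include <Zforce.h>
#include <Mouse.h>
#include <Keyboard.h>

const int dataReadyPin = 5; // Assumption: replace with the data-ready pin of your board

void setup()
{
  Mouse.begin();              // Start the HID mouse interface
  Keyboard.begin();           // Start the HID keyboard interface
  zforce.Start(dataReadyPin); // Initialize communication with the Touch Sensor Module
  zforce.Enable(true);        // Ask the sensor module to start sending touch notifications
}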
We can access the touch data from the sensor module through the zForce Arduino library:
Code Block
int16_t x = ((TouchMessage*)touch)->touchData[0].x;        // Touch input absolute x coordinate
int16_t y = ((TouchMessage*)touch)->touchData[0].y;        // Touch input absolute y coordinate
int8_t event = ((TouchMessage*)touch)->touchData[0].event; // Touch input event state, e.g. DOWN, MOVE, UP
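For context, the snippet above assumes that touch points to a touch notification retrieved from the library. The sketch below shows how such a message might be obtained in the main loop, assuming the library's GetMessage(), DestroyMessage() and MessageType names:
Code Block
void loop()
{
  Message* touch = zforce.GetMessage(); // Poll the sensor module for a new message

  if (touch != NULL)
  {
    if (touch->type == MessageType::TOUCHTYPE) // Only handle touch notifications
    {
      int16_t x = ((TouchMessage*)touch)->touchData[0].x;
      int16_t y = ((TouchMessage*)touch)->touchData[0].y;
      int8_t event = ((TouchMessage*)touch)->touchData[0].event;
      // ...mouse pad and keyboard handling goes here...
    }
    zforce.DestroyMessage(touch); // Free the message when done
  }
}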
Touch events are included in the touch notification message and describe the state of the reported touch. In the zForceKeyboardMouse example, the touch events are used to trigger button presses and left-clicks, and they are defined in the zForce Arduino library.
Code Block
enum TouchEvent
{
  DOWN = 0,    // First reported touch object
  MOVE = 1,    // The touch object has moved
  UP = 2,      // Last reported touch object
  INVALID = 3,
  GHOST = 4
};
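Since the event field is a plain integer in the message, casting it to the TouchEvent enum can make the handling code easier to read than the raw values used in the snippets below; a short sketch:
Code Block
TouchEvent event = (TouchEvent)((TouchMessage*)touch)->touchData[0].event;

switch (event)
{
  case DOWN: /* store the starting position */ break;
  case MOVE: /* move the cursor             */ break;
  case UP:   /* evaluate the left-click     */ break;
  default:   /* ignore INVALID and GHOST    */ break;
}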
Mouse Pad Touch Handling
Since the Touch Sensor Module is recognized as a touch screen digitizer, the touch input data needs to be extracted in order to translate it into mouse input. To calculate the touch object's relative movement, the previous touch position is subtracted from the current touch position in the absolute coordinate space.
Code Block
Mouse.move((x - previousTouch.x), (y - previousTouch.y)); // "x" and "y" are the coordinates of the current reported touch;
                                                          // "previousTouch.x" and "previousTouch.y" are the coordinates of
                                                          // the previously reported touch.
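Note that Mouse.move() takes signed char arguments, so each axis is limited to roughly -128 to 127 per call. If consecutive touches can be far apart in sensor coordinates, the delta may need to be clamped (or split across several calls); the sketch below uses a hypothetical clampDelta() helper that is not part of the example:
Code Block
// Clamp a coordinate delta to the signed char range accepted by Mouse.move()
int8_t clampDelta(int16_t delta)
{
  if (delta > 127)  return 127;
  if (delta < -127) return -127;
  return (int8_t)delta;
}

Mouse.move(clampDelta(x - previousTouch.x), clampDelta(y - previousTouch.y));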
A left-click action is performed once the touch object exits the Touch Active Area (TAA). In practice, this means that the end user needs to "press" and "release" their finger within the TAA, in a tapping motion, in order to perform a left-click. The click sensitivity can be adjusted through the global variable holdTime, which acts as a timer for how long a tap may take. To optimize the tactile response for in-air applications, please refer to the section Click-on-Touch for In-Air Applications.
The zForceKeyboardMouse example evaluates the left-click action based on the touch events:
Code Block
switch (event)
{
  case 0: // DOWN event
    previousTouch.x = x;
    previousTouch.y = y;
    globalMillis = millis();
    break;
  case 1: // MOVE event
    if ((millis() - globalMillis) >= holdTime)
    {
      Mouse.move((x - previousTouch.x), (y - previousTouch.y)); // Move to the relative position
    }
    previousTouch.x = x;
    previousTouch.y = y;
    break;
  case 2: // UP event
    if (millis() - globalMillis < holdTime) // Mouse "left click"; sensitivity can be
    {                                       // tuned by changing "holdTime"
      Mouse.click(MOUSE_LEFT);
    }
    break;
  default:
    break;
}
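The snippet relies on two globals declared at the top of the sketch. The declarations below are a sketch; the holdTime value is only an illustrative starting point and is not taken from the example:
Code Block
uint32_t globalMillis = 0;     // Timestamp of the latest DOWN event
const uint16_t holdTime = 150; // Maximum tap duration in milliseconds (illustrative value)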
Keyboard Touch Handling
The keyboard buttons send a key press depending on the position of the touch object. The keyboard area is separated from the mouse pad by the variable keyboardBoundary, and is thereafter divided further to distinguish each key.
If a reported touch with a DOWN event is positioned within the keyboardBoundary, a second look-up is performed in order to evaluate which key to print.
Code Block
if (event == 0) // DOWN event
{
  // Assign a key to the given interval
  if (y < 250)
    Keyboard.print('A'); // Print key "A"
  else if (y < 500)
    Keyboard.print('B'); // Print key "B"
  else if (y < 750)
    Keyboard.print('C'); // Print key "C"
  else if (y < 1000)
    Keyboard.print('D'); // Print key "D"
  else
    Keyboard.print('E'); // Print key "E"
}
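An equivalent, slightly more compact way to map the y intervals to keys is a small lookup table. The sketch below assumes the same 250-unit sections as above and is only an alternative formulation, not part of the original example:
Code Block
const char keys[] = {'A', 'B', 'C', 'D', 'E'};

if (event == 0) // DOWN event
{
  uint8_t index = y / 250; // Each key occupies a 250-unit section along the y axis
  if (index > 4)
    index = 4;             // Everything at or above 1000 maps to "E"
  Keyboard.print(keys[index]);
}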
Click-on-Touch for In-Air Applications
For in-air solutions, the Touch Sensor Module gives a better tactile response when the touch object triggers an action on the DOWN event instead of the UP event, meaning that a left-click or button press is triggered as soon as the end user enters the Touch Active Area.
Please refer to the click-on-touch adjustment of the mouse pad handling below:
Code Block
switch (event)
{
  case 0: // DOWN event
    previousTouch.x = x;
    previousTouch.y = y;
    globalMillis = millis();
    Mouse.click(MOUSE_LEFT); // Trigger the left-click directly on touch down
    break;
  case 1: // MOVE event
    if ((millis() - globalMillis) >= holdTime)
    {
      Mouse.move((x - previousTouch.x), (y - previousTouch.y));
    }
    previousTouch.x = x;
    previousTouch.y = y;
    break;
  case 2: // UP event, no action since the click is triggered on the DOWN event
    break;
  default:
    break;
}
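If both behaviors are needed in the same sketch, one option is to guard the two click locations with a configuration flag. The clickOnTouch boolean below is a hypothetical name and not part of the example:
Code Block
const bool clickOnTouch = true; // true: click on DOWN (in-air); false: click on UP (tap)

switch (event)
{
  case 0: // DOWN event
    previousTouch.x = x;
    previousTouch.y = y;
    globalMillis = millis();
    if (clickOnTouch)
      Mouse.click(MOUSE_LEFT); // Click-on-touch behavior
    break;
  case 1: // MOVE event
    if ((millis() - globalMillis) >= holdTime)
      Mouse.move((x - previousTouch.x), (y - previousTouch.y));
    previousTouch.x = x;
    previousTouch.y = y;
    break;
  case 2: // UP event
    if (!clickOnTouch && (millis() - globalMillis) < holdTime)
      Mouse.click(MOUSE_LEFT); // Tap-to-click behavior
    break;
  default:
    break;
}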
Read More
Implementation Examples
Neonode® Touch Sensor Module User's Guide