NL2033901B1 - Data input system and key data sending method - Google Patents
Data input system and key data sending method
- Publication number
- NL2033901B1
- Authority
- NL
- Netherlands
- Prior art keywords
- assignment
- key
- gesture
- hand
- data entry
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/0202—Constructional details or processes of manufacture of the input device
- G06F3/0219—Special purpose keyboards
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
- G06F3/0234—Character input methods using switches operable in different directions
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
Abstract
The invention relates to a data input system with a device to be worn on or by a human hand to provide a handheld keyboard, comprising: - a keystroke detection system arranged on the device to define at least three key positions for fingers of the hand and to detect corresponding keystrokes of individual fingertips of the hand, - a control unit configured to determine key data based on detected keystrokes and an assignment of the key data to the at least three key positions, - an output unit for sending the key data determined by the control unit to an external device, and - a measurement system allowing to determine a gesture of a hand palm of the hand, wherein the control unit is configured to change the assignment of the at least three key positions based on the determined gesture.
Description
Data input system and key data sending method
The invention relates to a data input system including a device to be worn on a human hand. The invention further relates to a method for sending key data to an external device, for instance using the data input system.
A classic keyboard and mouse are currently still the most used devices for providing data input to an external device such as a computer. A main disadvantage of the classic keyboard and mouse is that the use is bound to a table or other reference surface and this reference surface mainly determines the position of the human body while working or operating. Another disadvantage of the classic keyboard is that it is a relatively large device and cannot be easily transported, especially when extended with a numerical keyboard portion.
The limited mobility of the user and the frequent usage of the classic keyboard and mouse may cause injuries to neck, shoulder, and wrist, commonly referred to as repetitive strain injuries. An example thereof is carpal tunnel syndrome.
There are many initiatives to solve this by providing input devices that do not require a fixed reference surface such as a table, thereby providing a keyboard that mimics the operation and use of a classic keyboard without relying on a fixed reference surface.
One such prior art input device is disclosed in US4,517,424. The device engages the hand of a user and maintains a fixed position while permitting small movement of the fingers. Pushbuttons are provided around the fingers and adapted to be operated by pivotal movements of the finger in its plane, wherein each pushbutton is engaged by a different portion of the finger. Although the device provides an input device that can replace a classic keyboard, it does not offer a user experience like a classic keyboard due to the unnatural position of the fingers, namely a constant bent position, and the fact that finger portions other than the fingertips are used to engage with the pushbuttons.
Another prior art input device is disclosed in WO2003/038588. The device is worn around the palm of a hand and comprises input members at the palmar side of the hand to be manipulated by the fingers of the hand itself, and further input members at the back (dorsal side) of the hand wearing the device to be manipulated by fingers of the other hand. However, the number of buttons is limited, and the cross-engagement of hands is far from mimicking a classic keyboard.
A further prior art input device is disclosed in US5,796,354. The device is worn on the hand and arm and comprises input members to be manipulated by the fingertips. A further input may be provided for e.g. a video game by detecting a movement of the hand relative to the corresponding arm.
Yet another prior art input device is disclosed in US2016/246368. The device is worn around the wrist and includes a plurality of sensors. Control logic is used to analyze the signals and to determine which finger is moved or tapped. A keyboard can be mimicked by using additional signals such as the number of taps or complete hand gestures to determine which symbol to assign to the finger. This however means that the operation of the keyboard is fundamentally different from a classic keyboard.
Another prior art input device is disclosed in KR102377162. The device is a glove worn on the hand including a pressure sensor to detect contact between the fingertip and a surface, a flex sensor to determine bending of the finger, and an inertial sensor to determine movement of the index finger to allow the device to mimic a classic keyboard. A clear disadvantage of the glove is that it fully encloses the hands and fingers, limits movement of the fingers, and cannot easily be exchanged with another user having a different size. Further, in practice, there are a lot of mismatches between intended keystroke and registered keystroke, and not all keys on a classic keyboard can be assigned and thus used.
A further prior art input device is disclosed in EP3765945 from the same applicant. The device is worn on the hand and defines an array of key positions for the fingers of the hand and allows detection of a corresponding keystroke movement and position of individual fingertips of the hand. To change the assignment of keys to the array of key positions, the thumb presses a corresponding thumb button. This is far from operating the input device like a classic keyboard.
In view of the above it is an object of the invention to provide a data input system including a device to be worn on or by a human hand to provide a keyboard without relying on a fixed reference surface that still mimics a classic keyboard.
According to a first aspect of the invention, there is provided a data input system with a device to be worn on or by a human hand to provide a handheld keyboard, comprising: - a keystroke detection system arranged on the device to define at least three key positions for fingers of the hand and to detect corresponding keystrokes of individual fingertips of the hand, - a control unit configured to determine key data based on detected keystrokes and an assignment of the key data to the at least three key positions, - an output unit for sending the key data determined by the control unit to an external device, and - a measurement system allowing to determine a gesture of a hand palm of the hand, wherein the control unit is configured to change the assignment of the at least three key positions based on the determined gesture.
The invention according to the first aspect is based on the insight that, in order to mimic a classic keyboard with a handheld keyboard, the keystroke detection system needs to define at least three key positions for fingers of the hand which can be addressed by the fingers of the hand as would be the case on a classic keyboard, and that the assignment of the at least three key positions is based on hand gestures, in particular the movement, position and/or orientation of the hand palm as may be measured by the measurement system.
In an embodiment, the at least three key positions are arranged in one or more rows that in use extend perpendicular to the fingers of the hand. In case of at least two rows of key positions, key positions are provided at different distances from the hand palm. Key positions in different rows may be arranged in columns extending substantially perpendicular to the rows. In an embodiment, the key positions are arranged in an array, e.g. an array of two rows and three or four columns. In another embodiment, the keystroke detection system defines at least three rows of key positions for fingers of the hand. For instance, an array of three rows and four columns of key positions.
In an embodiment, a row of key positions includes more key positions than fingers, i.e. at least five key positions. An advantage thereof may be that additional key positions are available for one or more of the outer fingers, i.e. the index finger or the little finger, of which the fingertips can easily move sideways without the hand palm moving and without interfering with a neighboring finger.
In an embodiment, when the key positions are arranged in at least two rows, the rows do not need to have an equal number of key positions. In an exemplary embodiment, it is possible that the keystroke detection system defines an array with one or two rows each having a key position per finger, and an additional one or two rows with key positions for only some of the fingers. As the middle finger, ring finger, and/or index finger typically have a greater length, and thus a larger reach, than the little finger, it may be practical to provide more key positions for the middle finger, ring finger, and/or index finger than for the other finger(s).
In other words, the keystroke detection system may define rows of key positions, wherein at least one row is incomplete compared to at least one other row. However, it is also possible that a different number of key positions are available for respective fingers due to a different alignment in the rows of key positions.
In an embodiment, the keystroke detection system may define one or more key positions to detect a corresponding keystroke of the thumb of the hand. Alternatively, or additionally, at least one of the key positions to detect a corresponding keystroke of the fingertip of the index finger can also detect a corresponding keystroke of the thumb of the hand. Hence, a key position may be configured to be shared by the index finger and the thumb, but in a more general embodiment, one or more of the provided key positions may be configured to be shared by two fingers or a finger and the thumb.
In an embodiment, the keystroke detection system includes a detector per key position configured to detect a presence of a fingertip, such as a switch, force sensor, optical sensor, proximity sensor like a capacitive sensor, neural motor sensor, or electromyography (EMG) based sensor. The presence of a fingertip is then an indication of a keystroke and the subsequent absence of the fingertip then corresponds to a release of the key.
When using a switch as detector, it is preferred that the switch is configured to require contact with a fingertip and a translation, preferably with some counterpressure (e.g. from a spring), of the fingertip while making contact, and providing feedback when a threshold is crossed, e.g. by a change in counterpressure provided by the switch. This mimics the pressing of a key/pushbutton on a classic keyboard.
In an embodiment, a detector defines two or more key positions and is configured to measure a position of the corresponding fingertip to determine which of the key positions is addressed by the fingertip. Examples thereof are a touchscreen and a camera. In an embodiment, it is possible that two or more key positions partially overlap. In such a case the keystroke detection system and/or the control system may be configured to determine which key position is addressed more compared to the other key position(s).
For instance, the key positions may form zones that partially overlap and addressing a key position may be carried out by contact with a detector surface or by a proximity sensor, where it is possible to detect the addressing of the key position by the amount of contact or the rate of proximity. By determining a percentage of the contact or rate of proximity that falls within a particular zone it is possible to determine which key position was intended to be addressed by the (near) contact.
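Purely by way of illustration, the zone-overlap selection described above may be sketched as follows in Python; this code is not part of the patent text, and the zone coordinates, the contact interval and the function names are illustrative assumptions:

```python
# Minimal sketch of resolving a contact that overlaps two key-position zones.
# Zones and the contact patch are modelled as 1-D intervals for simplicity;
# the zone that receives the larger share of the contact is selected.

def overlap(a, b):
    """Length of the overlap between two intervals (start, end)."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def resolve_key(contact, zones):
    """Return the key whose zone covers the largest fraction of the contact."""
    length = contact[1] - contact[0]
    shares = {key: overlap(contact, zone) / length for key, zone in zones.items()}
    return max(shares, key=shares.get)

# Two partially overlapping zones; the contact lies mostly inside zone "D2".
zones = {"D1": (0.0, 10.0), "D2": (8.0, 18.0)}
print(resolve_key((9.0, 14.0), zones))  # -> "D2"
```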
In an embodiment, a pair of key positions overlap completely, wherein the detector is configured such that contact with the corresponding detector addresses the pair of key positions as a whole, and another parameter, for instance the applied force, is used to distinguish between the key positions of the pair of key positions.
In an embodiment, a detector associated with a key position is configured to measure two parameters, wherein one parameter is used to detect a keystroke, and the other parameter is used to change the assignment of the key position.
In an embodiment, a first detector is provided to detect addressing a plurality of key positions, and a second detector is provided to subsequently detect a keystroke at one of the plurality of key positions. Detection of the keystroke with the second detector may be carried out using one of the above-described methods or configurations.
In an embodiment, the device includes a base that has a substantially fixed position relative to the hand palm when the device is worn on or by the hand, and wherein the keystroke detection system is arranged at a side of the base facing towards the fingers.
In an embodiment, the measurement system includes one or more of the following sensors:
- one or more accelerometers to measure acceleration, preferably in three directions, which directions are preferably orthogonal directions,
- one or more gyroscopes to measure rotational velocity, preferably in three directions, which directions are preferably orthogonal directions, and
- one or more magnetometers to measure magnetic force, preferably in three directions, which directions are preferably orthogonal directions.
The measurement system may include an inertial measurement unit in which one or more accelerometers and one or more gyroscopes, and possibly one or more magnetometers cooperate to measure a specific acceleration, angular rate and/or orientation of the hand palm of the hand wearing the device, preferably in two or three directions, which directions are preferably orthogonal directions.
In an embodiment, the device comprises a base to be supported by or carried by the hand palm. The measurement system and/or the control unit and/or the output unit are preferably arranged in or on the base, preferably rigidly connected to the base. In an embodiment, a printed circuit board (PCB) is provided including the sensors of the measurement system, a processor of the control unit, and possibly a communication chip of the output unit.
In an embodiment, the measurement system is arranged remote from the device to determine a gesture of the hand palm, e.g. using a camera. Alternatively, or additionally, the measurement system is arranged on the device.
In an embodiment, the measurement system is configured to determine a gesture of the hand palm directly by measuring movement of or activity in the hand palm. This can be done by being in direct contact with, e.g. being connected or attached to, the hand palm.
This contact or connection may be indirect via other structural parts. Alternatively, or additionally, the measurement system is configured to determine a gesture of the hand palm indirectly, e.g. by measuring movement or activity in the wrist and/or fingers.
The measurement system allows to determine a gesture of the hand palm of the hand by measuring a movement, position and/or orientation of the hand palm. In some embodiments, the (presence and absence of a) gesture can be determined directly from an output of the measurement system. For instance, when the gesture is determined by an angular orientation of the hand, comparing a value representative for the angular orientation with a predetermined threshold allows to determine a gesture. In other embodiments, the (presence and absence of) gestures can be determined by looking at an output of the measurement system over time.
Although it is possible that the assignment of the at least three key positions can only be changed based on a single gesture, it is envisaged that different gestures can be determined resulting in different changes of assignment, where a different gesture can be a combination of other gestures, so that the measurement system and/or control unit is configured to allow to determine different gestures, and the control unit is configured to change the assignment differently for each different gesture.
In an embodiment, one or more gestures may be determined directly from an output of the measurement system, i.e. using (instant) sensor values associated with a single moment in time, or by looking at sensor values over time. It is also envisaged that one or more gestures are determined using instant sensor values and one or more other gestures are determined by looking at sensor values over time.
Looking at an output of the measurement system over time to determine a gesture may also include averaging a sensor value over time or determining whether a gesture is maintained for a predetermined amount of time before “confirming” the gesture. This may alternatively or additionally involve the use of filtering techniques such as the complementary filter or the Kalman filter. An advantage is that the determination of the key data to be sent will become less sensitive to shocks or vibrations or other unwanted sensor signals like noise and the data input system is thus better able to determine gestures of the hand palm corresponding to an intended change in assignment of the at least three key positions defined by the keystroke detection system.
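As an illustration of such filtering, a complementary filter may be sketched as follows in Python; the blending coefficient, sample period and sample values are illustrative assumptions and not values taken from the patent:

```python
# Minimal complementary-filter sketch: the integrated gyroscope rate tracks
# fast orientation changes of the hand palm, while the accelerometer-derived
# angle corrects slow drift; ALPHA weights the two sources.

ALPHA = 0.98   # illustrative blending coefficient
DT = 0.01      # sample period in seconds (assumed 100 Hz)

def complementary_filter(angle, gyro_rate, accel_angle):
    """Blend the integrated gyro rate with the accelerometer angle estimate."""
    return ALPHA * (angle + gyro_rate * DT) + (1.0 - ALPHA) * accel_angle

angle = 0.0
for gyro_rate, accel_angle in [(50.0, 0.4), (48.0, 0.9), (5.0, 1.3)]:
    angle = complementary_filter(angle, gyro_rate, accel_angle)
print(round(angle, 2))  # filtered palm-orientation estimate in degrees
```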
It is to be noted here explicitly that a change in assignment of the at least three key positions already takes place when only one of the key positions is assigned different key data. It is therefore not necessary that all assignments for every key position change.
Determination of the presence (and absence) of a gesture may be carried out by the measurement system and/or the control unit although it may be preferred that the control unit is configured to determine a gesture based on the output of the measurement system and to change the assignment of the at least three key positions accordingly.
In an embodiment, a gesture is determined based on a sensor value or a rate of change of this sensor value and comparing this sensor value or rate of change with a threshold value, which threshold value may be predetermined or dynamically changing and acts as a minimum threshold or detection limit. In this way, the system will not determine a gesture when the sensor value or rate of change is low.
In an example, a gesture is determined based on a measured rate of movement of the hand palm and comparing this rate of movement with a threshold value, which may be predetermined or dynamically changing and acts as a minimum threshold or detection limit. In this way, the system will not determine a gesture when the rate of movement of the hand palm is low similar to drift of a sensor signal.
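A minimal sketch of such a minimum threshold on the rate of movement could look as follows in Python; the threshold value and the function name are illustrative assumptions:

```python
# Minimal sketch: a gesture is only registered when the measured rate of
# movement of the hand palm exceeds a minimum threshold, so slow drift of
# the sensor signal does not trigger a change of assignment.

MIN_RATE = 0.15  # illustrative minimum threshold; not a value from the patent

def detect_gesture(rate_of_movement, threshold=MIN_RATE):
    """Return True only when the movement is fast enough to count as a gesture."""
    return abs(rate_of_movement) >= threshold

print(detect_gesture(0.02))  # False: drift-like movement, ignored
print(detect_gesture(0.40))  # True: deliberate palm movement
```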
A change of assignment can be any change, including but not limited to:
- a SHIFT or CAPS LOCK operation resulting in a change from small to capital letters or a change of character/symbol as is known from classic keyboards (or vice versa),
- adding accents to the letter or symbol, such as the acute, grave or circumflex accent, and other diacritics such as the diaeresis,
- changing the assignment from one key of the classic keyboard to another key on the classic keyboard corresponding to moving over the classic keyboard with the hand. This change of assignment will be referred to as a shift in assignment and thus corresponds to a movement of the hand or finger over a classic keyboard.
In an embodiment, the system, e.g. the control unit, is configured to determine a gesture corresponding to an intended single shift in assignment of at least some of the at least three key positions, meaning that the assignment of at least some of the at least three key positions changes to an adjacent key on the classic keyboard.
In an embodiment, the system, e.g. the control unit, is configured to distinguish between a gesture corresponding to an intended single shift in assignment of some or all of the at least three key positions, meaning that the assignment of one or more key positions changes to an adjacent key on the classic keyboard, and a gesture corresponding to an intended double shift in assignment of some or all of the at least three key positions, meaning that the assignment of one or more key positions changes to an adjacent key of an adjacent key on the classic keyboard. This distinction may for instance be made by determining a rate of movement of the hand palm or another sensor value and comparing this rate of movement or other sensor value with a boundary value, which may be predetermined or dynamically changing. Hence, the control unit may be configured to detect a gesture corresponding to a single shift when the rate of movement or other sensor value is below the boundary value, and possibly above the abovementioned minimum threshold, and to detect a gesture corresponding to a double shift when the rate of movement or other sensor value is above the boundary value.
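A minimal sketch of distinguishing a single shift from a double shift using a minimum threshold and a boundary value, as described above, could look as follows in Python; the numeric values are illustrative assumptions:

```python
# Minimal sketch of classifying the measured rate of movement of the hand
# palm into "no shift", "single shift" or "double shift" of the assignment,
# using a minimum threshold and a boundary value.

MIN_RATE = 0.15   # minimum threshold (illustrative)
BOUNDARY = 0.60   # boundary between single and double shift (illustrative)

def classify_shift(rate_of_movement):
    rate = abs(rate_of_movement)
    if rate < MIN_RATE:
        return "none"        # below the detection limit: ignore
    if rate < BOUNDARY:
        return "single"      # assignment moves to the adjacent key
    return "double"          # assignment moves two keys over

for rate in (0.05, 0.3, 0.9):
    print(rate, classify_shift(rate))
```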
In an embodiment, it is possible that the threshold value and/or the boundary value are adjustable allowing to optimize the working of the system for each user. It is also possible that the system is configured to “learn” the optimal threshold value and/or boundary value for a specific user.
In an embodiment, it is possible that the input system is configured to learn to recognize a gesture, or to carry out multiple calculations or evaluations to determine a gesture, e.g. based on consensus.
In a more general embodiment, a gesture may be predetermined, and characteristics thereof may be provided to the system allowing the measurement system and/or control unit to compare the actually performed gesture with these characteristics. In practice, depending on the characteristics, it may be unlikely that the actual gesture is identical to the predetermined gesture. In such cases, the system, e.g. the control unit, may be configured to determine a score associated with the predetermined gesture which is representative for the similarity of the actually performed gesture with respect to the predetermined gesture. It may then be determined that a predetermined gesture is performed when the score is above a minimum value.
When a plurality of predetermined gestures is defined, a respective score can be determined that is representative for the similarity of the actually performed gesture with respect to a respective predetermined gesture. It may then be determined that the predetermined gesture is performed which has the highest score. Such determination may also be subject to determining whether at least one score is also above a minimum value.
Such minimum value may be referred to as a decision threshold or decision limit.
Scores may also be provided by running a plurality of recognition algorithms. Further, there may be a factor of chance, leading to a non-deterministic evaluation of score(s).
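A minimal sketch of such score-based gesture recognition could look as follows in Python; the gesture templates, the distance-based score and the decision threshold are illustrative assumptions rather than characteristics defined by the patent:

```python
# Minimal sketch of scoring an actually performed gesture against a set of
# predetermined gesture characteristics; the gesture with the highest score
# is selected, provided the score exceeds a decision threshold.

import math

DECISION_THRESHOLD = 0.7   # illustrative decision limit

# Hypothetical characteristics: (displacement in x, displacement in y)
TEMPLATES = {
    "shift_right": (1.0, 0.0),
    "shift_up": (0.0, 1.0),
    "shift_up_right": (0.7, 0.7),
}

def score(performed, template):
    """Similarity score in (0, 1] based on Euclidean distance."""
    return 1.0 / (1.0 + math.dist(performed, template))

def recognise(performed):
    scores = {name: score(performed, t) for name, t in TEMPLATES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= DECISION_THRESHOLD else None

print(recognise((0.9, 0.1)))   # close to "shift_right"
print(recognise((3.0, -2.0)))  # no template scores high enough -> None
```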
Machine learning or any other learning method may be used to determine characteristics of a gesture, which characteristics may also be dynamic under certain circumstances, e.g. depend on other parameters, for instance resulting in a gesture with characteristics requiring a displacement of the hand palm that is substantially inversely proportional to the corresponding velocity. In a further example, a gesture may be learned using a first set of hand dimensions, but may need to be determined using a second set of hand dimensions.
In an embodiment, the control unit may be configured to return to a default assignment between key data and key positions defined by the keystroke detection system after no or little movement has been detected for a certain amount of time, which certain amount of time may have a predetermined value or a dynamically changing value, where little movement can be defined as being below a threshold value. No or little movement may also be defined as a gesture, which gesture then results in the application of the “default” assignment of the at least three key positions by the control unit.
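A minimal sketch of returning to the default assignment after a period of little or no movement could look as follows in Python; the timeout, the movement threshold and the class structure are illustrative assumptions:

```python
# Minimal sketch: the current assignment falls back to the default assignment
# once little or no palm movement has been detected for IDLE_TIMEOUT seconds.

import time

IDLE_TIMEOUT = 2.0          # seconds (illustrative)
MOVEMENT_THRESHOLD = 0.05   # below this the palm counts as "not moving"

class AssignmentState:
    def __init__(self, default_assignment):
        self.default = dict(default_assignment)
        self.current = dict(default_assignment)
        self.last_movement = time.monotonic()

    def apply(self, new_assignment):
        """Apply a changed assignment, e.g. after a detected shift gesture."""
        self.current = dict(new_assignment)

    def update(self, rate_of_movement):
        now = time.monotonic()
        if abs(rate_of_movement) > MOVEMENT_THRESHOLD:
            self.last_movement = now
        elif now - self.last_movement > IDLE_TIMEOUT:
            self.current = dict(self.default)   # fall back to the default assignment

# Repeatedly calling update() with near-zero movement restores the default
# assignment once IDLE_TIMEOUT has elapsed since the last real movement.
state = AssignmentState({"D1": "V", "D2": "F", "D3": "R"})
state.update(0.0)
```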
In an embodiment, a movement of the hand palm to be measured by the measurement system to determine a gesture for changing the assignment mainly includes a rotational movement about an axis perpendicular to the hand palm.
In an embodiment, a movement of the hand palm to be measured by the measurement system to determine a gesture for changing the assignment mainly includes a translational movement parallel to the hand palm.
In an embodiment, a movement of the hand palm to be measured by the measurement system to determine a gesture for changing the assignment mainly includes a rotational movement about an axis perpendicular to the hand palm and a simultaneous translational movement parallel to the hand palm.
In an embodiment, the control unit is configured to determine a change of assignment and apply the change of assignment and to return to a default or initial assignment after determining a keystroke. This may apply only to predetermined changes of assignment.
In an embodiment, the control unit is configured to determine a change of assignment and to apply the change of assignment to future keystrokes. This may apply only to predetermined changes of assignment. In other words, the change of assignment is maintained for multiple keystrokes. An example thereof is that a movement of the hand palm results in a shift of assignment corresponding to moving over a classic keyboard with the hand and subsequently addressing key positions in correspondence with the changed assignment. Another example is that a shift of assignment is added to an earlier shift of assignment when the same or another corresponding gesture is repeated within a certain time period.
Alternatively, or additionally, a keystroke and change of assignment may occur substantially simultaneously. This may depend on how the control unit is configured to detect a keystroke. In classic terms, a keystroke involves "hitting", i.e. making contact with, possibly followed by pressing, a key and subsequently releasing the key. The control unit may be configured to detect a keystroke, determine associated key data, and operate the output unit to send the key data based upon the "hitting" action of a fingertip without waiting for the releasing action. In other embodiments, the control unit is configured to wait for the releasing action to determine associated key data and operate the output unit to send the key data.
The waiting period between the "hitting" action and the releasing action allows for a change in assignment that occurs substantially simultaneously with the keystroke. In an embodiment, the change in assignment may be dependent on the key position and initial assignment and thus be based on the detected "hitting" action. In other words, the control unit may be configured to detect a start of a keystroke at a key position, to determine a gesture of a hand palm, to change the assignment of the key position based on the gesture and the detected start of the keystroke at the key position, and to subsequently determine the key data to be outputted upon a detected end of the keystroke. In an example, a fingertip may "hit" a key position associated with the letter "e". While maintaining contact with the key position, the hand palm may perform a gesture corresponding to the acute accent, resulting in a change of assignment of the key position to the letter "é", so that the control unit will instruct the output unit to output key data in association with this letter to the external device upon releasing of the key position by the fingertip. Such a change of assignment can also be used for other accents, such as the grave and circumflex accents, and other diacritics such as the diaeresis. Such changes of assignment may also be examples of an automatic return to the default or initial assignment after the keystroke.
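A minimal sketch of this press, gesture and release sequence could look as follows in Python; the gesture name and the class structure are illustrative assumptions:

```python
# Minimal sketch: the key data is only finalised on release, so a gesture
# performed while the key position is held can still change the assignment
# (e.g. "e" -> "é"), after which the key position returns to its default.

class KeyPosition:
    def __init__(self, base_char):
        self.default = base_char
        self.assignment = base_char
        self.pressed = False

    def press(self):
        self.pressed = True                 # start of the keystroke ("hitting")

    def apply_gesture(self, gesture):
        # illustrative: an acute-accent gesture performed while the key is held
        if self.pressed and gesture == "acute" and self.assignment == "e":
            self.assignment = "é"

    def release(self):
        self.pressed = False                # end of the keystroke
        key_data = self.assignment          # key data to send to the external device
        self.assignment = self.default      # automatic return to the initial assignment
        return key_data

key = KeyPosition("e")
key.press()
key.apply_gesture("acute")
print(key.release())  # -> é
```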
Another example of a gesture that can be used substantially simultaneously with a keystroke to change the assignment of the key positions is a downwards or upwards movement of the fingertip by moving or tilting the hand palm downwards or upwards. An upwards or downwards movement while making contact with a key position may correspond to a SHIFT action or a change from left to right hand or vice versa allowing to cover an entire classic keyboard using one hand. A downwards movement may correspond to a deep press and may replace a force sensor in the detector allowing to measure the force with which contact is made. It is noted here that downwards and upwards are defined relative to the hand itself, so that these directions change when the hand is held upside down. The downwards direction is substantially parallel to a normal to a palmar side of the hand and the upwards direction is substantially parallel to a normal to a dorsal side of the hand.
A further example of a gesture that can be used substantially simultaneously with a keystroke to change the assignment of the key positions is a sideways movement of the hand, for instance to cause a change from left to right hand or vice versa allowing to cover an entire classic keyboard using one hand.
It is also possible that key data is determined by the control unit upon "hitting" a key position, i.e. at a start of the keystroke, and possibly sent to the external device using the output unit, e.g. for preview purposes, but that a possible change in assignment and/or subsequent release of the key position will update the sent key data and make the input final.
In an embodiment, the input system has a motion mode in which the gestures can be used as input data to the external device like a virtual joystick or a virtual dial knob or virtual slider. The motion mode may be activated using one of the key positions, for instance the motion mode is active as long as one of the key positions is addressed.
Alternatively, or additionally, the motion mode is activated using a gesture, and possibly deactivated using the same or another gesture.
The first aspect of the invention also relates to a combination of a first data input system and a second data input system, wherein the first and second data input systems are each a data input system according to the first aspect of the invention where it is envisaged that components are shared by the first and second data input systems and/or wherein the first and second data input systems have a slave/master relationship allowing cooperation between the first and second data input systems and consistent communication with the external device. An advantage is that two hands can be used to provide data input. This can be a left and right hand of the same person or hands of different persons.
In an embodiment, the first and second data input systems are configured such that a gesture determined by one of the first and second data input systems causes a change of assignment of the key positions of the other one of the first and second data input systems.
According to a second aspect of the invention, there is provided a method for sending key data to an external device, comprising the following steps:
a. detecting one or more keystrokes of individual fingertips,
b. determining a gesture of a hand palm of a corresponding hand,
c. determining an assignment of key data to possible keystrokes based on a determined gesture,
d. determining the key data to be sent based on the detected one or more keystrokes and the determined assignment, and
e. sending the determined key data to the external device.
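A minimal sketch of steps a. to e. could look as follows in Python; the detector names, the example assignments (taken from the tables of Figs. 5 and 6 described further below) and the stand-in for the output unit are illustrative assumptions:

```python
# Minimal sketch of steps a. to e.: detect keystrokes, determine the palm
# gesture, derive the assignment from the gesture, map keystrokes to key data
# and send the result to the external device.

DEFAULT = {"D1": "V", "D2": "F", "D3": "R"}    # default assignment (cf. Fig. 5)
SHIFT_UP = {"D1": "F", "D2": "R", "D3": "4"}   # upward-shifted assignment (cf. Fig. 6)

def determine_assignment(gesture):                      # step c.
    return SHIFT_UP if gesture == "shift_up" else DEFAULT

def determine_key_data(keystrokes, assignment):         # step d.
    return [assignment[k] for k in keystrokes]

def send(key_data):                                     # step e.
    print("to external device:", key_data)              # stands in for the output unit

keystrokes = ["D2", "D3"]          # step a. (detected keystrokes, assumed)
gesture = "shift_up"               # step b. (determined gesture, assumed)
send(determine_key_data(keystrokes, determine_assignment(gesture)))
```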
In an embodiment, the method is carried out using a data input system according to the first aspect of the invention. It will be appreciated that embodiments and/or features described in relation to the first aspect of the invention may also apply to the method and be embodiments and/or features of the method according to the second aspect of the invention where appropriate. The same applies to embodiments and/or features described in relation to the second aspect of the invention, where for instance method steps may be applied to the system according to the first aspect of the invention in such a form that the system or a component thereof is configured to perform the method step.
In an embodiment, a keystroke has a start and an end, wherein the start corresponds to making contact with or getting in proximity of a key position and the end corresponds to releasing or leaving the key position, wherein the method is configured to carry out step d. after carrying out steps a. to c.
In an embodiment, step b. is carried out simultaneously with step a.
In an embodiment, steps a. to e. are carried out by a data input system according to the first aspect of the invention.
In an embodiment, step a. is carried out for one of a right or left hand using a first data input system, and step b. is carried out for the other one of the right or left hand using a second data input system. The first and second data input systems may each be a data input system according to the first aspect of the invention.
In an embodiment, a combined gesture of two hands is determined and step c. is carried out based on the determined combined gesture.
The invention will now be described in a non-limiting way by reference to the accompanying drawings in which like parts are indicated by like reference symbols, and in which:
Figs. 1-2 schematically depict a data input system according to an embodiment of the invention,
Fig. 3 schematically depicts an electric diagram of the data input system of Figs. 1 and 2,
Fig. 4 schematically depicts a plan view of the base and keystroke detection system of the input system of Figs. 1 and 2,
Figs. 5-8 schematically depict different assignments of the key positions of the keystroke detection system on a computer keyboard layout.
Figs. 1 and 2 schematically depict a data input system with a wearable device ID according to an embodiment of the invention when worn on a human hand HH. A human hand is known to comprise a thumb and four fingers, which in this description will be denoted index finger, middle finger, ring finger, and little finger, respectively, when starting at the thumb side of the hand. In Fig. 1, the thumb TH and the index finger IF are clearly visible. The other fingers are hidden behind the index finger IF.
The device ID comprises a base B with a proximal end PE and a distal end DE opposite the proximal end PE. The proximal end side of the base B is configured to engage with a hand palm HP of the hand HH and is therefore not visible in Figs. 1 and 2 but indicated using dashed lines.
The distal end side of the base B is provided with a keystroke detection system to define at least three key positions for fingers of the hand and to detect corresponding keystrokes of individual fingertips of the hand. In the embodiment of Figs. 1 and 2, the key positions are defined using respective detectors, including detectors D1-D3 configured to interact with a fingertip FT of the index finger IF of the hand HH. Interaction between a detector D1-D3 and the fingertip FT allows to determine a position of the fingertip FT as well as a respective keystroke of the fingertip FT.
Fig. 3 schematically depicts an electric diagram of the system of Figs. 1 and 2. Shown in Fig. 3 are the detectors D1-D3, but also schematically the other detectors associated with the other fingers, resulting in a detector Dn with n indicating the total number of detectors.
The detectors D1 to Dn are connected to an output unit OU configured to send key data to an external device (not shown but e.g. a computer) via a control unit CU.
Sending the key data to an external device is preferably done wirelessly, e.g. using Bluetooth, WiFi, infrared, ZigBee, or any other wireless data transfer method. However, it is not excluded that the data transfer between the system and the external device is carried out using a wired connection, e.g. when a fast and stable connection is required, for instance when using the system for gaming. The user input transferred from the detectors D1-Dn to the control unit CU is indicated by the detector signals DS1 to DSn, and the data transfer from the output unit OU to the external device is indicated by the output signal OS.
Referring again to Figs. 1 and 2, the device ID comprises a finger support FS to receive a portion of a finger IF of the hand HH corresponding to the proximal phalanges. The finger support FS is here embodied in the form of a ring but can be any supporting structure suitable to engage with the finger IF such that when said finger portion is received in the finger support FS, the device ID is carried by the hand via the finger support FS.
The finger support FS is attached to a connecting member, in this case a beam BE1, which in turn is hingedly connected to a connecting member, in this case a beam BE2 of the base B, so that beam BE1 is able to rotate relative to beam BE2 about a rotation axis RA. The rotation axis RA extends substantially out of the plane of the drawing and is positioned to be aligned with a metacarpophalangeal joint of the corresponding finger IF when said finger portion is received in the finger support FS. The metacarpophalangeal joint can be found between the corresponding proximal phalanges and the corresponding metacarpal bone. Due to this location of the rotation axis RA, the finger IF can easily be moved up and down relative to the base B while at the same time continuing to support the device ID. This is illustrated by comparing the position of the finger IF in Figs. 1 and 2. In Fig. 1, the finger is able to interact with detector D3 and in Fig. 2, the finger has moved upwards allowing the fingertip FT to interact with the detector D1. Although not shown, an intermediate position of the finger allows the finger to interact with detector D2.
Although the detectors D1-D3 have been depicted as being provided on a more or less flat base B, it is also possible to provide the detectors at different positions allowing to limit the required movement of the fingers to reach the detectors.
Fig. 2 also depicts another embodiment in which the detectors are arranged differently, using dashed lines to indicate the location of upper surfaces S1-S3 of the detectors D1-D3. It will be apparent to the skilled person that such an arrangement may require a different shape of the base B as seen in side view, e.g. a step-like shape or a concave shape. From the orientation of the dashed lines, it can also be seen that the upper surfaces S1-S3 of the detectors D1-D3 may be tilted relative to each other so that a normal to these upper surfaces S1-S3 may be substantially aligned with a frequently occurring direction of approach of the fingertip.
From Fig. 3 it is apparent that the device ID of Figs. 1 and 2 further includes a measurement system IMS. The measurement system IMS allows to determine a gesture of the hand, in this case by measuring movement of the hand palm HP of the hand, for changing the assignment of the at least three key positions defined by the keystroke detection system.
The control unit CU is configured to determine the key data to be sent by the output unit OU based on a sensor output of the keystroke detection system and an assignment of the key data to the at least three key positions. This means that when a fingertip of a finger interacts with one of the detectors D1-Dn, a corresponding sensor signal is received by the control unit CU, allowing it to determine a keystroke and to determine which key data to send to an external device. The control unit CU therefore includes a mapping between sensor input from the detectors D1-Dn and the key data to be outputted. This can be in the form of a table. The control unit CU thus assigns key data to the plurality of key positions. This is referred to as the assignment of the key positions. This assignment can change. One of the common assignment changes is the use of the SHIFT key as on a classic keyboard. This will result for letters in a change from small letters to capital letters, or vice versa, and for other characters in a change of character. Such a change in assignment also takes place on a classic keyboard so that different input can be provided using the same keys of the keyboard.
When using a device to be worn on or by a human hand, it is unlikely that as many key positions can be provided by the keystroke detection system as there are keys on a classic keyboard. The measurement system IMS therefore preferably allows to determine a gesture of the hand that is representative for an intended virtual movement of the hand over a classic keyboard and subsequently changing the assignment of the key positions defined by the keystroke detection system, which changing can also be referred to as a shifting in assignment with the shift corresponding to the movement of the hand over the virtual keyboard. Hence, this shift in assignment allows to virtually move over a virtual keyboard defined by the control unit CU and to address all keys of a keyboard for input to the external device while using the same set of key positions.
Fig. 4 schematically depicts a top view of the base B of a device ID, which may be the wearable data input device ID of Figs. 1 and 2. The device ID is intended for a left human hand HH as shown in Figs. 1 and 2 but could also be configured for a right human hand HH.
The keystroke detection system includes an array of twelve detectors D1-D12 with three rows of detectors. The first row closest to the hand palm includes detectors D1, D4, D7 and D10. The third row farthest away from the hand palm includes detectors D3, D6, D9 and D12. The second row in between the first row and the third row includes detectors D2, D5, D8 and D11. Each detector defines a key position. The fingertip of the index finger is configured to interact with the detectors D1-D3, the fingertip of the middle finger is configured to interact with the detectors D4-D6, the fingertip of the ring finger is configured to interact with the detectors D7-D9, and the fingertip of the little finger is configured to interact with the detectors D10-D12.
Although the rows include a key position per finger, it is possible that additional key positions are provided in a row, e.g. as shown in dashed lines to the right and left of the first, second, and third row, allowing the index finger and/or little finger to interact with additional detectors at the additional key positions.
Further, although three rows of key positions are provided, it is possible to provide incomplete additional rows as for instance shown in dashed lines below the detectors D4 and D7, allowing the middle finger and/or ring finger to interact with additional detectors in the additional incomplete rows.
To explain the assignment of key positions and possible changes, in this case shifts, in assignment, the array of detectors D1-D12 is depicted in Fig. 5 over a standard QWERTY keyboard layout to indicate the following assignment:
| Detector | Key data |
|---|---|
| D1 | V |
| D2 | F |
| D3 | R |
| D4 | C |
| D5 | D |
| D6 | E |
| D7 | X |
| D8 | S |
| D9 | W |
| D10 | Z |
| D11 | A |
| D12 | Q |
The above may be a default or initial assignment that is used as starting point for this keyboard. It will be apparent that even when two similar devices ID are used, one suitable for the left hand as in Figs. 1 and 2, and one suitable for the right hand, not all keys on a standard QWERTY keyboard are addressable using a single assignment of key data to key positions as defined by the detectors D1-D12.
When a user wants to input key data to an external device corresponding to the numbers 1-4 on the standard QWERTY keyboard, the assignment may be shifted upwards as indicated in Fig. 6 resulting in the following assignment:
| Detector | Key data |
|---|---|
| D1 | F |
| D2 | R |
| D3 | 4 |
| D4 | D |
| D5 | E |
| D6 | 3 |
| D7 | S |
| D8 | W |
| D9 | 2 |
| D10 | A |
| D11 | Q |
| D12 | 1 |
The shift upwards to change the assignment may be triggered by movement of the hand palm measured by the measurement system IMS. This movement may be a forward in-plane movement of the hand palm substantially corresponding to the Y-direction as indicated in Figs. 1, 2 and 4. Alternatively, this movement may be an upward out-of-plane movement of the hand palm substantially corresponding to the Z-direction as indicated in Figs. 1 and 2.
Another possible shift in assignment is depicted in Fig. 7, where the assignment has shifted to the right compared to the default or starting assignment of Fig. 5 resulting in the following assignment:
| Detector | Key data |
|---|---|
| D1 | B |
| D2 | G |
| D3 | T |
| D4 | V |
| D5 | F |
| D6 | R |
| D7 | C |
| D8 | D |
| D9 | E |
| D10 | X |
| D11 | S |
| D12 | W |
The sideways shift may be triggered by a rotation and/or translation of the hand palm in that direction.
In the above examples of a change in assignment, the entire array of key positions shifts over the keyboard layout. However, it is also envisaged that only a portion of the array moves over the keyboard layout, for instance only the row or column at the edge of the array corresponding to the shift direction may move while leaving all other assignments unchanged.
It is further noted that although only an upwards and right shift have been depicted, the assignment may shift in any direction, for instance the opposite directions to the left or downwards, but also in an oblique direction, which in this case is both a movement sideways and upwards or downwards.
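A minimal sketch of such shifts of the assignment over a virtual keyboard layout could look as follows in Python; the simplified, unstaggered QWERTY grid and the cell coordinates are illustrative assumptions, chosen so that the default, upward-shifted and right-shifted assignments reproduce the tables of Figs. 5, 6 and 7 above:

```python
# Minimal sketch of shifting the assignment over a virtual QWERTY layout:
# the detectors keep their physical positions, but the (row, column) offset
# of the detector array over the layout changes with the determined gesture.

LAYOUT = [                      # partial QWERTY layout, bottom row first
    list("ZXCVBNM"),
    list("ASDFGHJKL"),
    list("QWERTYUIOP"),
    list("1234567890"),
]

# Detector -> (row, column) within the layout for the default assignment of Fig. 5
DETECTOR_CELLS = {
    "D1": (0, 3), "D2": (1, 3), "D3": (2, 3),    # index finger column: V, F, R
    "D4": (0, 2), "D5": (1, 2), "D6": (2, 2),    # middle finger column: C, D, E
    "D7": (0, 1), "D8": (1, 1), "D9": (2, 1),    # ring finger column: X, S, W
    "D10": (0, 0), "D11": (1, 0), "D12": (2, 0), # little finger column: Z, A, Q
}

def assignment(row_shift=0, col_shift=0):
    """Return the detector-to-key mapping for a given shift of the array."""
    return {d: LAYOUT[r + row_shift][c + col_shift]
            for d, (r, c) in DETECTOR_CELLS.items()}

print(assignment())                      # default assignment of Fig. 5
print(assignment(row_shift=1))           # shift upwards as in Fig. 6
print(assignment(col_shift=1))           # shift to the right as in Fig. 7
```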
To avoid that a change in assignment of the key positions is triggered undesirably, it is preferred to use a threshold for movement of the hand palm. This threshold may be a minimum displacement of the hand palm so that a change of assignment only takes place after rotating and/or translating the hand palm with a value above the threshold.
Movements below the threshold are then ignored by the control unit.
Alternatively, the threshold may be a minimum rate of movement of the hand palm so that low velocity movements do not trigger a change in assignment.
It is also possible that a combination of thresholds is used, resulting in a trigger to change the assignment of the key positions only when a predetermined displacement is exceeded within a predetermined minimum period of time. In this way, high velocity movements over a short distance are ignored as well as low velocity movements over a large distance.
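A minimal sketch of one possible reading of this combined threshold, in which the displacement accumulated within a fixed time window must exceed a minimum value, could look as follows in Python; the window length, the displacement values and the class structure are illustrative assumptions:

```python
# Minimal sketch: the displacement of the hand palm accumulated within a
# fixed time window must exceed a minimum distance before a change of
# assignment is triggered; short fast flicks and slow drift are both ignored.

from collections import deque

WINDOW = 0.3               # seconds (illustrative)
MIN_DISPLACEMENT = 0.04    # minimum displacement within the window (illustrative)

class ShiftTrigger:
    def __init__(self):
        self.samples = deque()   # (timestamp, position) pairs

    def update(self, timestamp, position):
        self.samples.append((timestamp, position))
        # drop samples that fall outside the time window
        while self.samples and timestamp - self.samples[0][0] > WINDOW:
            self.samples.popleft()
        displacement = abs(self.samples[-1][1] - self.samples[0][1])
        return displacement >= MIN_DISPLACEMENT

trigger = ShiftTrigger()
print(trigger.update(0.00, 0.000))  # False
print(trigger.update(0.05, 0.005))  # False: movement too small, ignored
print(trigger.update(0.25, 0.060))  # True: sufficient displacement within the window
```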
In some situations, it may be desirable to more radically change the assignment of the key positions. An example thereof is indicated in Fig. 8 compared to the default/initial or starting assignment of Fig. 5. In Fig. 8, the assignment has changed to the following:
| Detector | Key data |
|---|---|
| D1 | N |
| D2 | H |
| D3 | Y |
| D4 | B |
| D5 | G |
| D6 | T |
| D7 | V |
| D8 | F |
| D9 | R |
| D10 | C |
| D11 | D |
| D12 | E |
Compared to the assignment of Fig. 5, the shift in assignment of Fig. 7 may be considered a single shift in assignment, while the shift in assignment of Fig. 8 may be considered a double shift in assignment.
The control unit CU may be configured to be able to distinguish between an intended single shift in assignment and an intended double shift in assignment. This distinction may for instance be made by comparing the rate of movement with a boundary value, wherein the control unit CU will detect a single shift when the rate of movement is below the boundary value, and wherein the control unit CU will detect a double shift when the rate of movement is above the boundary value. However, additionally, or alternatively, a comparison may be made between the displacement and a boundary value, wherein the control unit CU will detect a single shift when the displacement is below the boundary value, and wherein the control unit CU will detect a double shift when the displacement is above the boundary value.
In a more complex embodiment of the control unit CU, it is possible to distinguish between a single, double, or triple shift using a first boundary value for the distinction between the single and double shift and using a second boundary value for the distinction between the double and triple shift.
In an embodiment, the control unit CU is configured to base its output not only on sensor input from the measurement system corresponding to a certain moment in time, as is for instance the case when looking at a rate of movement, e.g. velocity and direction of movement, but also on sensor inputs corresponding to other moments in time, e.g. tracking displacement and/or orientation over time. This allows the control unit to detect more complex gestures and to distinguish between movements intended to shift the assignment of the key positions and other movements, e.g. pointing to a computer screen to direct the attention of another person to an object on the screen or reaching for a button to turn the lights on or off. This distinction can be made by looking at a certain history of the movement and/or position and concluding, for instance, that a movement of the hand palm is too large or too fast or a combination thereof.
Claims (17)
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| NL2033901A NL2033901B1 (en) | 2023-01-03 | 2023-01-03 | Data input system and key data sending method |
| PCT/NL2024/050004 WO2024147739A1 (en) | 2023-01-03 | 2024-01-03 | Data input system and key data sending method |
| EP24700037.5A EP4646639A1 (en) | 2023-01-03 | 2024-01-03 | Data input system and key data sending method |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| NL2033901A NL2033901B1 (en) | 2023-01-03 | 2023-01-03 | Data input system and key data sending method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| NL2033901B1 (en) | 2024-07-12 |
Family
ID=86272495
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| NL2033901A NL2033901B1 (en) | 2023-01-03 | 2023-01-03 | Data input system and key data sending method |
Country Status (3)
| Country | Link |
|---|---|
| EP (1) | EP4646639A1 (en) |
| NL (1) | NL2033901B1 (en) |
| WO (1) | WO2024147739A1 (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4517424A (en) | 1980-10-17 | 1985-05-14 | Inro France | Hand-secured pushbutton control device |
| US5796354A (en) | 1997-02-07 | 1998-08-18 | Reality Quest Corp. | Hand-attachable controller with direction sensing |
| WO2003038586A1 (en) | 2001-10-30 | 2003-05-08 | Digityper Ab | A portable data input device and use of such a device |
| US20160246368A1 (en) | 2013-12-27 | 2016-08-25 | Intel Corporation | Piezoelectric sensor assembly for wrist based wearable virtual keyboard |
| EP3765945A2 (en) | 2018-03-11 | 2021-01-20 | Van de Laar, Laurens | Wearable data input device and operating method |
| KR102377162B1 (en) | 2020-01-03 | 2022-03-22 | 동의대학교 산학협력단 | wearable keyboard |
- 2023-01-03 NL NL2033901A patent/NL2033901B1/en active
- 2024-01-03 EP EP24700037.5A patent/EP4646639A1/en active Pending
- 2024-01-03 WO PCT/NL2024/050004 patent/WO2024147739A1/en not_active Ceased
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4517424A (en) | 1980-10-17 | 1985-05-14 | Inro France | Hand-secured pushbutton control device |
| US5796354A (en) | 1997-02-07 | 1998-08-18 | Reality Quest Corp. | Hand-attachable controller with direction sensing |
| WO2003038586A1 (en) | 2001-10-30 | 2003-05-08 | Digityper Ab | A portable data input device and use of such a device |
| US20160246368A1 (en) | 2013-12-27 | 2016-08-25 | Intel Corporation | Piezoelectric sensor assembly for wrist based wearable virtual keyboard |
| EP3765945A2 (en) | 2018-03-11 | 2021-01-20 | Van de Laar, Laurens | Wearable data input device and operating method |
| US20210034153A1 (en) * | 2018-03-11 | 2021-02-04 | Laurens van de Laar | Wearable Data Input Device and Operating Method |
| KR102377162B1 (en) | 2020-01-03 | 2022-03-22 | 동의대학교 산학협력단 | wearable keyboard |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2024147739A1 (en) | 2024-07-11 |
| EP4646639A1 (en) | 2025-11-12 |
Similar Documents
| Publication | Title |
|---|---|
| US20030048260A1 | System and method for selecting actions based on the identification of user's fingers |
| JP5622572B2 | Pressure sensor array apparatus and method for tactile sensing |
| JP5166008B2 | A device for entering text |
| US8816964B2 | Sensor-augmented, gesture-enabled keyboard and associated apparatus and computer-readable storage medium |
| US20160364138A1 | Front touchscreen and back touchpad operated user interface employing semi-persistent button groups |
| US20040041787A1 | Method and apparatus for a hybrid pointing device used with a data processing system |
| WO2012070682A1 | Input device and control method of input device |
| WO2003088202A1 | Symbol encoding apparatus and method |
| US20240256040A1 | Wearable Data Input Device and Operating Method |
| KR100499391B1 | Virtual input device sensed finger motion and method thereof |
| CN101706677A | Input device based on body feeling and input method thereof |
| JP6740389B2 | Adaptive user interface for handheld electronic devices |
| EP0696014B1 | Pressure sensitive input device wearable around a human finger |
| CN103995610A | Method for user input from alternative touchpads of a handheld computerized device |
| NL2033901B1 | Data input system and key data sending method |
| KR101686585B1 | A hand motion tracking system for a operating of rotary knob in virtual reality flighting simulator |
| CN100374998C | A touch-type information input device and method |
| WO2022089351A1 | Wearable keyboard and mouse, and efficient operating method for mouse |
| Sekar et al. | Wearable virtual keyboard for visually impaired person |
| US20060195622A1 | Data input device, information equipment, information equipment control method, and computer program |
| KR101513969B1 | character input apparatus using finger movement |
| JP2025541589A | Application and system for dual control of a game, application and system for displaying virtual buttons, system for determining the value of at least one parameter of a user's finger, and system for determining and presenting the position of a user's finger on a display |
| WO2025022785A1 | Data glove and data glove set |
| KR101805111B1 | Input interface apparatus by gripping and the method thereof |
| JP2025018906A | Data Gloves and Data Glove Sets |