
WO2024246484A1 - User input mechanism and method - Google Patents

User input mechanism and method

Info

Publication number
WO2024246484A1
WO2024246484A1 (PCT/GB2024/051250; GB2024051250W)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
user input
user
input mechanism
movement path
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/GB2024/051250
Other languages
English (en)
Inventor
Stephen William Roberts
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Touchnetix Ltd
Original Assignee
Touchnetix Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Touchnetix Ltd filed Critical Touchnetix Ltd
Publication of WO2024246484A1
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G06F3/0446Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/039Accessories therefor, e.g. mouse pads
    • G06F3/0393Accessories for touch pads or touch screens, e.g. mechanical guides added to touch screens for drawing straight lines, hard keys overlaying touch screens or touch pads
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0338Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0362Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means

Definitions

  • the present invention relates to the field of user input mechanisms and in particular touch sensors, for example touch sensors for overlying a display screen to provide a touch-sensitive display (touch screen).
  • embodiments of the invention relate to user input mechanisms that make use of a touch sensitive apparatus and a guiding element.
  • Capacitive touch sensors are used in a wide variety of applications.
  • a capacitive touch sensor can be generalised as one that uses a physical sensor element comprising an arrangement of electrically conductive electrodes extending over a touch-sensitive area or surface (sensing area or surface) to define sensor nodes and a measurement circuitry connected to the electrodes and operable to measure changes in the electrical capacitance of each of the electrodes or the mutual-capacitance between combinations of electrodes.
  • the electrodes are typically provided on a substrate and arranged in a suitable pattern.
  • When a conductive object, such as a user’s finger, approaches or touches the touch-sensitive surface, a change in the electrical capacitance is observed and this change can be used to identify certain properties associated with the conductive object, e.g., the location of the conductive object relative to the touch-sensitive surface.
  • Capacitive touch sensors generally offer a greater degree of flexibility to designers and users of user interfaces. This may not only be in terms of the physical structure of the capacitive touch sensor, whereby capacitive touch sensors can be made in a variety of shapes and have curved planes, etc., to fit various shaped openings, but also in terms of how touch sensors can be used or adapted for various tasks. Additionally, these sensors typically do not have moving parts and thus can be considered more robust and have a relatively longer operational lifetime.
  • capacitive touch sensors are becoming more and more commonplace in everyday life, e.g., in electrical devices, such as smartphones, etc.
  • more conventional user input mechanisms such as dials or knobs may be preferred by designers of electrical devices or users of such electrical devices.
  • Such conventional user input mechanisms may be more familiar to certain users or groups of users, which may in part be due to offering a more mechanical or tactile interaction with the user input mechanism which may provide a level of reassurance or confirmation of the user’s input.
  • Capacitive touch sensors have been proposed which have the advantages described above, but which provide a more mechanical or tactile interaction with the user.
  • Such sensors, while providing a more tactile user interface, suffer from some disadvantages.
  • For example, the wipers often contact the touch-sensitive surface, which may cause damage to the touch-sensitive surface, thus reducing the lifetime of the sensor, while the wipers also extend spatially from the dial, thereby increasing the footprint of the dial.
  • a user input mechanism for providing a signal indicative of a user input to a system communicatively coupled to the user input mechanism.
  • the user input mechanism includes a touch-sensitive element comprising a plurality of electrodes that define a touch sensitive surface; a guiding element mounted to the touch-sensitive element, wherein the guiding element is configured such that a user’s digit or hand may be guided, relative to the touch-sensitive surface, along a predetermined movement path; and processing circuitry configured to process signals received from the touch-sensitive element.
  • the processing circuitry is configured to: identify and/or receive signals from the touch-sensitive element, wherein the signals each correspond to a capacitive coupling at locations on the touch-sensitive element along the predetermined movement path; and determine a degree of movement of the user’s digit or hand along the predetermined movement path by determining the difference between a first set of received signals from the touch-sensitive element obtained at a first time and a second set of received signals from the touch-sensitive element obtained at a second time.
  • a system comprising the user input mechanism of the first aspect, wherein the system is configured to receive the determined degree of movement of the user’s hand or digit along the predetermined movement path from the processing circuitry of the user input mechanism and to perform a function responsive to the received determined degree of movement.
  • a method for providing a signal indicative of a user input to a system communicatively coupled to the user input mechanism wherein the user input mechanism comprises a touch-sensitive element comprising a plurality of electrodes that define a touch sensitive surface; a guiding element mounted to the touch-sensitive element, wherein the guiding element is arranged such that a user’s hand or digit may be guided, relative to the touch-sensitive surface, along a predetermined movement path; and processing circuitry configured to process signals received from the touch-sensitive element.
  • the method includes: identifying and/or receiving signals from the touch-sensitive element, wherein the signals each correspond to a capacitive coupling at locations on the touch-sensitive element along the predetermined movement path; and determining a degree of movement of the user’s hand or digit along the predetermined movement path by determining the difference between a first set of received signals from the touch-sensitive element obtained at a first time and a second set of received signals from the touch-sensitive element obtained at a second time.
  • a method of calibrating a user input mechanism for providing a signal indicative of a user input to a system communicatively coupled to the user input mechanism, wherein the user input mechanism comprises a touch-sensitive element comprising a plurality of electrodes that define a touch sensitive surface; a guiding element mounted to the touch-sensitive element, wherein the guiding element is arranged such that a user’s hand or digit may be guided, relative to the touch-sensitive surface, along a predetermined movement path; and processing circuitry configured to process signals received from the touch-sensitive element.
  • the method includes: receiving a first set of signals from the touch-sensitive element indicative of the capacitive coupling at locations on the touch-sensitive element while a user engages with the guiding element with their hand or digits; guiding the user’s hand or digits along the predetermined movement path while the user engages the guiding element; receiving a second set of signals from the touch-sensitive element indicative of the capacitive coupling at locations on the touch-sensitive element while the user engages the guiding element; and from the first set and second set of signals, identifying corresponding locations on the touch-sensitive surface that correspond to the predetermined movement path of the guiding element.
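By way of illustration only, the calibration flow above could be realised along the following lines. This is a minimal sketch, not the claimed method: the `read_node_signals` callable, the reference values, and the touch threshold are all assumptions standing in for whatever interface the measurement circuitry 105 actually provides.

```python
import time

def calibrate_movement_path(read_node_signals, reference, touch_delta=5.0,
                            duration_s=5.0, period_s=0.05):
    """Record, in order, the sensor nodes swept by the user's digit while the
    guiding element guides it along the predetermined movement path.

    read_node_signals: callable returning {node_index: measured_capacitance};
    reference:         {node_index: untouched_capacitance};
    touch_delta:       drop below the reference (arbitrary counts) treated as a touch.
    """
    path_nodes = []                                   # nodes in order of first detection
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        signals = read_node_signals()
        for node, c in signals.items():
            if (reference[node] - c) > touch_delta and node not in path_nodes:
                path_nodes.append(node)               # this node lies on the movement path
        time.sleep(period_s)
    return path_nodes
```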
  • Figure 1 schematically illustrates a user input mechanism in accordance with certain embodiments of the invention
  • Figure 2 schematically illustrates a self-capacitance measurement mode, specifically with a view to explaining the principles of self capacitance measurement
  • Figure 3 schematically illustrates a mutual-capacitance measurement mode, specifically with a view to explaining the principles of mutual capacitance measurement
  • Figure 4 schematically illustrates, in plan view, the user input mechanism of Figure 1 in more detail in accordance with certain embodiments of the invention, and specifically shows a guiding element according to a first aspect
  • Figure 5 schematically illustrates, in perspective view, the user input mechanism of Figure 4;
  • Figure 6 illustrates a graph depicting sets of mutual capacitive couplings taken at different times corresponding to sensor nodes of the user input mechanism of Figures 1 and 4;
  • Figure 7 illustrates a graph depicting two sets of mutual capacitive couplings of Figure 6, with the two sets provided superimposed on one another;
  • Figure 8 schematically illustrates the user input mechanism of Figure 4, and specifically shows a modification to the processing of measurements compared to Figure 4 in which data measurement points are located along the predetermined measurement path;
  • Figure 9 is a flow chart illustrating a method for generating a user input signal based on obtaining sets of capacitive measurements at different times in accordance with embodiments of the present invention, the sets of capacitive measurements corresponding to sensor nodes of the user input mechanism of Figures 1 and 4;
  • Figure 10 schematically illustrates, in plan view, a further embodiment of a user input mechanism, and specifically shows a guiding element according to a second aspect
  • Figure 11 schematically illustrates, in plan view, a further embodiment of a user input mechanism, and specifically shows a guiding element according to a third aspect
  • Figure 12 is a flow chart illustrating a method for calibrating a user input mechanism in accordance with an aspect of the present invention.
  • the present disclosure relates broadly to a user input mechanism. More specifically, the user input mechanism comprises a touch-sensitive apparatus (which comprises an array of electrodes forming a touch-sensitive surface) and a guiding element mounted to the touch-sensitive surface.
  • the guiding element acts to guide a user’s interaction with the touch-sensitive surface, essentially providing a predetermined movement path which permits a user to move their hand / digit(s) anywhere along the predetermined movement path.
  • identifying a degree of movement of the user’s hand / digit(s) may be performed by comparing a pattern in a first set of measurements that contain indications of a capacitive coupling associated with locations on the touch-sensitive surface corresponding to the predetermined movement path with a pattern in a second set of measurements that similarly contain indications of a capacitive coupling associated with locations on the touch-sensitive surface corresponding to the predetermined movement path, where the first and second set of measurements are obtained at different times.
  • a degree of movement along the predetermined movement path of the user’s hand / digit(s) can be determined, and subsequently can be used to indicate an intended or desired user input (for example, to control an associated system communicatively coupled to the user input mechanism).
  • Such an approach provides a more tactile (and potentially more familiar) way of providing a user input while still retaining the advantages of capacitive touch-sensitive surfaces.
  • Figure 1 schematically shows a user input mechanism 1 in accordance with the principles of the present disclosure.
  • the user input mechanism 1 is represented in plan view (to the left in the figure) and also in cross-sectional view (to the right in the figure).
  • the user input mechanism 1 comprises a sensor element 100, measurement circuitry 105, processing circuitry 106, and cover 108.
  • the sensor element 100 and cover 108 may, more generally be referred to as a touch-sensitive element of the user input mechanism 1
  • the measurement circuitry 105 and processing circuitry 106 may, more generally, be referred to as the controller of the user input mechanism 1.
  • the user input mechanism 1 is adapted to sense touches, using in part the touch-sensitive element, and to identify a user input from the sensed touches. Accordingly, the user input mechanism 1 may alternatively be referred to herein as a touch-sensitive apparatus 1.
  • the touch-sensitive apparatus 1 is primarily configured for establishing the position of a touch 109 within a two-dimensional sensing area (otherwise referred to herein as a touch-sensitive surface) by providing Cartesian coordinates along an X-direction (horizontal in the figure) and a Y-direction (vertical in the figure).
  • the sensor element 100 is constructed from a substrate 103 that could be glass or plastic or some other insulating material and upon which is arranged an array of electrodes consisting of multiple laterally extending parallel electrodes, X-electrodes 101 (row electrodes), and multiple vertically extending parallel electrodes, Y-electrodes 102 (column electrodes), which in combination allow the position of a touch 109 to be determined.
  • the term “touch” as used herein is understood to encompass direct contact between a conductive object, such as a user’s finger, and the touch-sensitive surface (e.g., placing the user’s finger such that it contacts the touch-sensitive surface), but optionally to also encompass a conductive object, such as the user’s finger, which is sufficiently close to the touch-sensitive surface to be sensed by the touch-sensitive surface (e.g., placing the user’s finger such that it hovers above the touch-sensitive surface).
  • a touch is intended to encompass detectable interactions between a conductive object and the touch- sensitive surface, including direct contact with the touch-sensitive surface and proximity to the touch-sensitive surface.
  • the touch-sensitive apparatus 1 may only be configured to sense direct contact between the touch-sensitive surface and the conductive object. In some implementations, the touch-sensitive apparatus 1 may be capable of switching between direct contact sensing and proximity sensing. In some implementations, the touch-sensitive apparatus 1 is configured to distinguish between a touch in which the conductive object is a distance from the touch-sensitive surface and a touch in which the conductive object is in contact with the touch-sensitive surface, and further may be configured to perform different function(s) in response to detecting a touch in which the conductive object is a distance from the touch-sensitive surface and a touch in which the conductive object is in contact with the touch-sensitive surface.
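Where the apparatus distinguishes hovering from contacting touches, one plausible (illustrative) approach is to apply two thresholds to the drop in capacitive coupling at a node; the numerical values and function below are placeholders rather than anything specified in the disclosure.

```python
HOVER_DELTA = 3.0      # assumed small drop: object near, but not on, the surface
CONTACT_DELTA = 10.0   # assumed larger drop: object in contact with the surface

def classify_touch(reference, measured):
    """Classify one node measurement as 'none', 'hover' or 'contact'."""
    drop = reference - measured
    if drop >= CONTACT_DELTA:
        return "contact"
    if drop >= HOVER_DELTA:
        return "hover"
    return "none"
```

The apparatus could then trigger different functions for the "hover" and "contact" classes, as contemplated above.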
  • the X-electrodes 101 (row electrodes) are aligned parallel to the X-direction and the Y-electrodes 102 (column electrodes) are aligned parallel to the Y-direction.
  • the different X-electrodes allow the position of a touch to be determined at different positions along the Y-direction while the different Y-electrodes allow the position of a touch to be determined at different positions along the X-direction. That is to say in accordance with the terminology used herein, the electrodes are named (in terms of X- and Y-) after their direction of extent rather than the direction along which they resolve position.
  • the electrodes may also be referred to as row electrodes and column electrodes. It will however be appreciated these terms are simply used as a convenient way of distinguishing the groups of electrodes extending in the different directions. In particular, the terms are not intended to indicate any specific electrode orientation. In general, the term “row” will be used to refer to electrodes extending in a horizontal direction for the orientations represented in the figures, while the term “column” will be used to refer to electrodes extending in a vertical direction in the orientations represented in the figures.
  • the X-electrodes 101 and Y-electrodes 102 define a touch-sensitive surface (or area), which is a region of the substrate 103 which is sensitive to touch.
  • each electrode may have a more detailed structure than the simple "bar" structures represented in Figure 1, but the operating principles are broadly the same.
  • the sensor electrodes are made of an electrically conductive material such as copper or Indium Tin Oxide (ITO).
  • the touch-sensitive element may need to be transparent, for use with a display screen to form a touch sensor as described above, in which case ITO electrodes and a plastic substrate are common.
  • a touch pad such as often provided as an alternative to a mouse in laptop computers is usually opaque, and hence can use lower cost copper electrodes and an epoxy-glass-fibre substrate (e.g. FR4).
  • the electrodes 101 , 102 are electrically connected via circuit conductors 104 to measurement circuitry 105, which is in turn connected to processing circuitry 106 by means of a circuit conductor 107.
  • the measurement circuitry 105 and/or the processing circuitry 106 may each be provided by a (micro)controller, processor, ASIC or similar form of control chip. Although shown separately in Figure 1, in some implementations, the measurement circuitry 105 and the processing circuitry 106 may be provided by the same (micro)controller, processor, ASIC or similar form of control chip.
  • the measurement circuitry 105 and/or the processing circuitry 106 may be provided on a printed circuit board (PCB), which may further include the various circuit conductors 104, 107.
  • the measurement circuitry 105 and the processing circuitry 106 may be formed on the same PCB, or separate PCBs. Note also that the functionality provided by either of the measurement circuitry 105 and the processing circuitry 106 may be split across multiple circuit boards and/or across components which are not mounted to a PCB.
  • the processing circuitry 106 interrogates the measurement circuitry 105 to recover the presence and coordinates of any touch or touches present on, or proximate to, the sensor element 100.
  • the measurement circuitry 105 is configured to perform capacitance measurements associated with the electrodes 101 , 102 (described in more detail below).
  • the measurement circuitry 105 comprises drive circuitry 112 for generating electrical signals for performing the capacitance measurements.
  • the measurement circuitry 105 outputs the capacitance measurements to the processing circuitry 106, which is arranged to perform processing using the capacitance measurements.
  • the processing circuitry 106 may be configured to perform a number of functions, but at the very least is configured to determine when a touch 109 occurs, i.e., when a conductive object such as a human finger or a stylus comes into contact with (or is proximate to) the touch-sensitive surface of the sensor element 100, through appropriate analysis of relative changes in the electrodes’ measured capacitance (capacitive coupling).
  • the processing circuitry 106 may also be configured to, with appropriate analysis of relative changes in the electrodes’ measured capacitance / capacitive coupling, calculate a touch position on the cover’s surface as an X, Y coordinate 111.
  • a front cover (also referred to as a lens or panel) 108 is positioned in front of the substrate 103 and a single touch 109 on the surface of the cover 108 is schematically represented. Note that the touch itself does not generally make direct galvanic connection to the sensor 103 or to the electrodes 102. Rather, the touch influences the electric fields 110 that the measurement circuitry 105 generates using the electrodes 102 (described in more detail below).
  • a further aspect of capacitive touch sensors relates to the way the measurement circuitry 105 uses the electrodes of the sensor element to make its measurements.
  • Figure 2 broadly shows the user input mechanism / touch-sensitive apparatus 1 of Figure 1, but further includes components where necessary to explain the principle of performing “self-capacitance” measurements using the user input mechanism / touch-sensitive apparatus 1 of Figure 1.
  • the drive circuitry 112 of the measurement circuitry 105 is configured to generate and apply an electrical stimulus (drive signal) 113 to each electrode 101 , 102 which will cause an electric field 110 to form around it.
  • This field 110 couples through the space around the electrode back to the measurement circuitry 105 via numerous conductive return paths that are part of the nearby circuitry of the sensor element 100 and the housing of the touch-sensitive apparatus 1 or the apparatus in which it is mounted (shown schematically by reference numeral 114), or physical elements from the nearby surroundings 115 etc., so completing a capacitive circuit 116.
  • the overall sum of return paths is typically referred to as the “free space return path” in an attempt to simplify an otherwise hard-to-visualize electric field distribution.
  • the measurement circuitry 105 is only driving each electrode from a single explicit electrical terminal 117; the other terminal is the capacitive connection via this “free space return path”.
  • the capacitance measured by the measurement circuitry 105 is the “self-capacitance” of the sensor electrode (and connected tracks) that is being driven relative to free space (or Earth as it is sometimes called) i.e. the “self-capacitance” of the relevant sensor electrode. Touching or approaching the electrode with a conductive element, such as a human finger, causes some of the field to couple via the finger through the connected body 118, through free space and back to the measurement circuitry 105.
  • This extra return path 119 can be relatively strong for large objects (such as the human body), and so can give a stronger coupling of the electrode’s field back to the measurement circuitry 105; touching or approaching the electrode hence increases the self-capacitance of the electrode.
  • the measurement circuitry 105 is configured to sense this increase in capacitance. The increase is strongly proportional to the area 120 of the applied touch 109 and is normally weakly proportional to the touching body’s size (the latter typically offering quite a strong coupling and therefore not being the dominant term in the sum of series connected capacitances).
  • the electrodes 101 , 102 are arranged on an orthogonal grid, generally with a first set of electrodes on one side of a substantially insulating substrate 103 and the other set of electrodes on the opposite side of the substrate 103 and oriented at substantially 90° to the first set.
  • the electrodes may be oriented at a different angle (e.g., 30°) relative to one another.
  • these designs are more complex to manufacture and less suitable for transparent sensors.
  • one set of electrodes is used to sense touch position in a first axis that we shall call “X” and the second set to sense the touch position in the second orthogonal axis that we shall call “Y”.
  • the measurement circuitry 105 can either drive each electrode in turn (sequential) with appropriate switching of a single control channel (i.e., via a multiplexer) or it can drive them all in parallel with an appropriate number of separate control channels.
  • any neighbouring electrodes to a driven electrode are sometimes grounded by the measurement circuitry 105 to prevent them becoming touch sensitive when they are not being sensed (remembering that all nearby capacitive return paths will influence the measured value of the actively driven electrode).
  • the nature of the stimulus applied to all the electrodes is typically the same so that the instantaneous voltage on each electrode is approximately the same.
  • each electrode has minimal influence on its neighbours (the electrode-to-electrode capacitance is non-zero but its influence is only “felt” by the measurement circuitry 105 if there is a voltage difference between the electrodes).
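In code terms, the self-capacitance scheme described above reduces to comparing each driven electrode’s measured value against an untouched baseline and flagging increases. The sketch below is illustrative only; the baseline values and the threshold are assumptions.

```python
def touched_electrodes(measured, baseline, threshold=4.0):
    """Indices of electrodes whose self-capacitance has risen above the untouched
    baseline by more than `threshold` (arbitrary counts), indicating a touch."""
    return [i for i, (m, b) in enumerate(zip(measured, baseline)) if (m - b) > threshold]

# Example: electrodes 3 and 4 show a clear increase and are reported as touched.
print(touched_electrodes([10, 10, 11, 18, 17, 10], [10, 10, 10, 10, 10, 10]))  # -> [3, 4]
```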
  • a second technique is based on measuring what is frequently referred to as “mutual-capacitance”.
  • Figure 3 broadly shows the user input mechanism / touch-sensitive apparatus 1 of Figure 1, but further includes components where necessary to explain the principle of performing “mutual capacitance” measurements using the user input mechanism / touch-sensitive apparatus 1 of Figure 1.
  • the measurement circuitry 105 will sequentially (or in some implementations, in parallel) stimulate each of an array of transmitter (driven/drive) electrodes, shown as the X electrodes 101 in Figure 3, that are coupled by virtue of their proximity to an array of receiver electrodes, shown as the Y electrodes 102 in Figure 3.
  • the Y electrodes 102 may instead be the transmitting electrodes and the X electrodes 101 may instead be the receiving electrodes in other implementations.
  • the resulting electric field 110 is now directly coupled from the transmitter to each of the nearby receiver electrodes 102; the “free space” return path discussed above plays a negligible part in the overall coupling back to the measurement circuitry 105 when the sensor element 100 is not being touched.
  • the area local to and centred on the intersection of a transmitter and a receiver electrode is typically referred to as a “node” or “intersection point”.
  • When a conductive element, such as a human finger, touches or approaches the touch-sensitive surface, the electric field 110 is partly diverted to the touching object.
  • An extra return path to the measurement circuitry 105 is now established via the body 118 and “free-space” in a similar manner to that described above. However, because this extra return path acts to couple the diverted field directly to the measurement circuitry 105, the amount of field coupled to the nearby receiver electrode 102 decreases. This is measured by the measurement circuitry 105 as a decrease in the “mutual-capacitance” between that particular transmitter electrode and receiver electrodes in the vicinity of the touch 109.
  • the measurement circuitry 105 senses this change in capacitance of one or more nodes. For example, if a reduction in capacitive coupling to a given Y-electrode is observed while a given X-electrode is being driven, it may be determined there is a touch in the vicinity of where the given X-electrode and given Y-electrode cross, or intersect, within the sensing area of the sensor element 100.
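For mutual-capacitance sensing of this kind, a frame of node values is typically built by driving each transmitter (X) electrode in turn and reading every receiver (Y) electrode; touched nodes are then those whose coupling has dropped below the expected value by more than a threshold. The sketch below illustrates that scan and detection logic under assumed `drive` and `measure` primitives; it is not the patent’s implementation.

```python
def scan_frame(drive, measure, n_x, n_y):
    """Return frame[x][y]: mutual capacitance at the node where X-electrode x crosses
    Y-electrode y. `drive(x)` and `measure(y)` are hypothetical wrappers around the
    measurement circuitry's transmit and receive operations."""
    frame = []
    for x in range(n_x):
        drive(x)                                        # stimulate one transmitter electrode
        frame.append([measure(y) for y in range(n_y)])  # read every receiver electrode
    return frame

def touched_nodes(frame, reference, threshold):
    """Nodes whose coupling has dropped by more than `threshold` relative to the
    untouched reference, i.e., where a touch is inferred."""
    return [(x, y)
            for x, row in enumerate(frame)
            for y, c in enumerate(row)
            if (reference[x][y] - c) > threshold]
```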
  • the magnitude of a capacitance change is nominally proportional to the area 120 of the touch (although the change in capacitance does tend to saturate as the touch area increases beyond a certain size to completely cover the nodes directly under the touch) and weakly proportional to the size of the touching body (for reasons as described above).
  • the magnitude of the capacitance change also reduces as the distance between the touch sensor electrodes and the touching object increases.
  • the transmitter electrodes 101 and receiver electrodes 102 in the described implementation are arranged as an orthogonal grid, with the transmitter electrodes 101 on one side of a substantially insulating substrate 103 and the receiver electrodes 102 on the opposite side of the substrate 103.
  • This is as schematically shown in Figure 3.
  • the first set of transmitter electrodes 101 shown on one side of a substantially insulating substrate 103 and the second set of receiver electrodes 102 is arranged at nominally 90° to the transmitter electrodes on the other side of the substrate 103.
  • the electrodes may be oriented at a different angle (e.g., 30°) relative to one another.
  • other implementations may have structures where the grid is formed on a single side of the substrate and small insulating bridges, or external connections, are used to allow the transmitter and receiver electrodes to be connected in rows and columns without short circuiting.
  • Figures 4 and 5 schematically show the user input mechanism / touch-sensitive apparatus 1 of Figure 1.
  • Figure 4 shows the user input mechanism 1 in plan view
  • Figure 5 shows the user input mechanism 1 in perspective view.
  • Figure 4 depicts the sensor element 100 (comprising the substrate 103, and the X electrodes 101 and Y electrodes 102), and the cover 108, which together form the touch-sensitive element of the user input mechanism 1 as discussed above.
  • Figure 5 further omits the X and Y electrodes 101, 102 from the sensor element for clarity.
  • Other components of the user input mechanism 1 described in Figure 1 are omitted from Figures 4 and 5 for clarity.
  • Figures 4 and 5 show a guiding element 130 in accordance with aspects of the present disclosure.
  • the guiding element 130 is in the form of a cylindrical element that is mounted on the touch-sensitive element (and more particularly, to the cover 108 of the touch-sensitive element).
  • the guiding element 130 is shown in cross-section with the underlying X and Y electrodes 101 , 102 visible.
  • the guiding element 130 is a separate component from the touch-sensitive element. That is to say, the guiding element 130 is formed independently of the touch-sensitive element (and in particular the cover 108) and is subsequently mounted to the touch-sensitive element (and in particular the cover 108).
  • the guiding element 130 is mounted via a spindle 131 (see Figure 4) which allows the guiding element 130 to rotate about the spindle 131.
  • the spindle 131 is fixed to the touch-sensitive element (and in particular the cover 108), so accordingly, the guiding element 130 is able to rotate relative to the touch-sensitive element.
  • the way in which the guiding element 130 is mounted to the cover 108 is not particularly limited.
  • the spindle 131 may be attached to the cover 108 via a screw thread or the like, or the spindle 131 may be adhered to the surface of the cover 108.
  • the guiding element 130 is provided as an element that the user may interact with when using the user input mechanism 1.
  • the guiding element 130 mimics a rotating dial or knob and, as such, a user is able to grip the circumferential sides and/or press on the top surface of the guiding element 130 and perform a suitable action to cause the guiding element 130 to rotate (e.g., substantially in the direction as shown by the double-headed arrow in Figure 4).
  • the guiding element 130 therefore acts to guide a user’s hand or digit (finger), relative to the touch-sensitive surface of the touch-sensitive element, along a predetermined movement path 132.
  • a predetermined movement path 132 for the rotatable guiding element 130 is shown in Figure 4 by the thick black line around the outside of the guiding element 130.
  • When the user interacts with the guiding element 130, e.g., by gripping the circumferential sides of the guiding element 130, the user’s digits (fingers), when projected onto the touch-sensitive surface, approximately lie on points along the predetermined movement path 132.
  • a user’s hand or digit(s) are brought into contact with, or in the proximity of, the touch-sensitive surface of the user input mechanism 1 and as such can be detected by the sensor element 100 and measurement circuitry 105 via a change in capacitance.
  • the guiding element 130 may be formed from a non-conductive material, such as plastic.
  • the guiding element 130 consists of a non-conductive material. Accordingly, in such implementations, the user-input mechanism 1 is unable (or substantially unable) to detect the presence of the guiding element 130. That is to say, the output from the sensor element 100 (e.g., at the measurement circuitry 105) may be substantially identical regardless of whether the guiding element 130 is mounted to the touch-sensitive element or not.
  • the guiding element 130 may comprise conductive elements and/or be formed from a conductive material.
  • the guiding element 130 is constructed such that the conductive material, to the extent it influences the capacitive measurements obtained from the sensor element, affects the capacitive coupling along the predetermined movement path by substantially the same amount. That is to say, the guiding element 130 may be provided with a conductive element that either does not affect the capacitive coupling of the sensor element 100 or uniformly affects the capacitive coupling of the sensor element 100 at least along the predetermined movement path.
  • the spindle 131 may be formed from a metal, for rigidity and longevity purposes, while the remaining parts of the guiding element 130 may be formed from plastic. In some implementations however, the guiding element 130 does not comprise any conductive components.
  • the user input mechanism 1 may operate according to the self-capacitance or mutual-capacitance measurement techniques.
  • the user input mechanism 1 functions using the mutual capacitance measurement technique.
  • the measurement of the capacitance between the two electrodes, indicative of the mutual capacitance at a location where the two electrodes interact / overlap, can be correspondingly obtained.
  • the mutual capacitance at each of the sensor nodes can be obtained by performing measurements of different pairs of drive and receive electrodes.
  • When a touch is present, the measured mutual capacitance decreases from a steady-state or expected mutual capacitance.
  • Figure 6 is a graph depicting the mutual capacitance, C, shown on the Y-axis, for each of the sensor nodes 1 to 12 (shown on the X-axis). Three traces are shown in Figure 6, with each trace obtained at a different time (specifically, times t0, t1, and t2). For each trace, there is shown an indication of the mutual capacitive coupling for each of the sensor nodes 1 to 12 relative to a reference (the dashed line labelled with t0, t1, and t2). The precise value of the reference here is not significant for the present discussion, but the reference might be considered to represent an expected mutual capacitance for the sensor node obtained in advance.
  • the first trace, at time t0, is obtained in the absence of a user interacting with the guiding element 130. More specifically, the user’s hand or digit(s) are far enough away from the sensor element 100 that they do not capacitively couple to any of the sensor nodes 1 to 12.
  • Each of the mutual capacitances for each of the sensor nodes 1 to 12 is greater than the threshold, Ct, shown by the dashed line.
  • the threshold Ct is chosen to represent a value which corresponds to detection of a touch (i.e., when the mutual capacitance for a given sensor node drops below the threshold Ct, this signifies a touch is present at the sensor node).
  • the threshold, Ct may be set for the detection of direct contact with the sensor element 100 (by the user’s digit or hand) or for the detection of proximity of the user’s digit or hand to the sensor element 100.
  • In the absence of a touch, the measured mutual capacitance lies between the reference value and the threshold value.
  • the reference value may be representative of a mutual capacitance obtained in advance or of an expected mutual capacitance obtained in ideal circumstances.
  • the actual measurement of the mutual capacitance at a given sensor node may deviate from the reference by a slight amount (i.e., by less than the reference minus the threshold value, Ct) as is shown in Figure 6, e.g., due to noise or the measurement being taken in non-idealised conditions.
  • the second trace, at time t1, is obtained when the user first interacts with the guiding element 130.
  • the user grips the circumferential sides of the guiding element 130 between their index finger and thumb. More specifically, the user places their index finger at a location on the circumferential side of the guiding element 130 which overlaps approximately with sensor nodes 4 and 5 and the user places their thumb at a location on the circumferential side of the guiding element 130 which overlaps approximately with sensor nodes 10 and 11.
  • the index finger and thumb of the user may not make direct contact with the sensor element 100 (specifically the cover 108) but instead may hover above the cover 108.
  • the sensor element 100 is capable of detecting the user’s index finger and thumb, and this can be seen in the trace at t1. More specifically, it can be seen that the mutual capacitances for sensor nodes 4 and 5, and 10 and 11 are below the threshold Ct, thereby signifying the presence of a touch.
  • the third trace, at a time t2, is obtained at a later time after the user has interacted with the guiding element 130. More specifically, the user rotates the rotatable guiding element 130 (while gripping the guiding element 130 as discussed above) such that the user’s index finger is at a location on the circumferential side of the guiding element 130 that overlaps approximately with sensor nodes 1 and 2 and the user’s thumb is at a location on the circumferential side of the guiding element 130 that overlaps approximately with sensor nodes 7 and 8. (That is, the user may rotate the guiding element 130 approximately 90° anticlockwise).
  • the measurement circuitry 105 is configured to measure the mutual capacitances of sensor nodes corresponding to locations on the touch-sensitive element along the predetermined movement path 132.
  • the measurement circuitry 105 is configured to provide (e.g., to the processing circuitry 106) a first set of measurements (or signals) corresponding to the mutual capacitances of the sensor nodes corresponding to locations on the touch-sensitive element along the predetermined movement path 132.
  • the mutual capacitances of the first set of measurements are obtained at or substantially at the same time.
  • the user input mechanism 1 may be configured to measure the mutual capacitances of sensor nodes 1 to 12 at the same time (e.g., by driving each of the drive electrodes using a different drive signal), or alternatively, the user input mechanism 1 may be configured to measure the mutual capacitances of sensor nodes 1 to 12 at substantially the same time (e.g., by sequentially driving each of the drive electrodes within a short time of each other, e.g., a few microseconds). Additionally, the measurement circuitry 105 is configured to provide (e.g., to the processing circuitry 106) a second set of measurements (or signals) corresponding to the mutual capacitances of the sensor nodes corresponding to locations on the touch-sensitive element along the predetermined movement path 132.
  • the mutual capacitances of the second set of measurements are obtained at or substantially at the same time, in a similar manner to the first set above.
  • the first and second set of measurements or signals are obtained at substantially different times.
  • the time period between the first and second set of measurements or signals may be 2 seconds or less, or 1 second or less, or 0.5 seconds or less, or 100 milliseconds or less, or 10 milliseconds or less, or 1 millisecond or less.
  • the time period may be set to correspond to a length of time in which a measurable movement of the guiding element 130 (or movement of the user’s hand or digit while being guided by the guiding element 130) can be detected by the sensor element 100 and measurement circuitry 105.
  • the time period required to obtain the measurements of the sensor nodes at substantially the same time is less than, and in some implementations several orders of magnitude less than, the time period between the first and second set of measurements or signals (e.g., between the first time t1 and the second time t2).
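To make the two timescales concrete: each set of measurements is acquired essentially at once (on the order of microseconds), while consecutive sets are separated by a much longer interval, such as the tens of milliseconds mentioned above. A trivial sketch, with `read_path_nodes` as a hypothetical read-out of the nodes along the movement path:

```python
import time

def acquire_two_sets(read_path_nodes, interval_s=0.05):
    """Acquire the first and second sets of path-node measurements, separated by
    `interval_s` (50 ms here, within the ranges discussed above)."""
    first = read_path_nodes()     # first set, taken substantially simultaneously (time t1)
    time.sleep(interval_s)        # inter-set interval >> per-set acquisition time
    second = read_path_nodes()    # second set (time t2)
    return first, second
```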
  • the processing circuitry 106 is configured to determine a degree of movement of the user’s digit or hand along the predetermined movement path 132 by determining the difference between the first set of measurements or signals and the second set of measurements or signals. That is to say, based on a difference between the two sets of signals, the processing circuitry 106 is able to identify the degree of movement along the predetermined movement path 132. Subsequently, the processing circuitry 106 is able to convert the identified degree of movement along the predetermined movement path 132 to a user input. The user input may then be used, either by the processing circuitry 106 itself or any suitable circuitry communicatively coupled to the processing circuitry 106, to perform a function corresponding to the user input.
  • Figure 7 shows a graph depicting the second and third trace (corresponding to the first and second sets of measurements) from Figure 6 superimposed on one another. Similarly to Figure 6, Figure 7 shows the mutual capacitance of each of the sensor nodes 1 to 12. The trace corresponding to the first set of measurements obtained at t1 is shown by the solid line, while the trace corresponding to the second set of measurements obtained at t2 is shown by the dashed line.
  • the processing circuitry 106 is configured to determine a degree of movement of the user’s hand or digit based on the first and second set of measurements. In some implementations, this can be performed according to the principles described below.
  • the processing circuitry 106 identifies a first pattern based on the first set of measurements or signals from the touch-sensitive element obtained at the first time (e.g., at time t1). Determining a pattern may include simply identifying the trace as shown in Figures 6 and 7 (i.e., the line extending from the value at sensor node 1 to the value at sensor node 12). Alternatively, determining a pattern may include identifying a particular shape or feature within the first set of measurements. For example, with reference to Figures 6 and 7, the first trace includes two “troughs” or “dips” at locations broadly centred on sensor node 4 and sensor node 11.
  • Each of these “dips” may be determined to correspond to a particular shape or feature, which can subsequently be identified as the pattern. Equally, the relative locations of the two “dips” may themselves be indicative of a feature of the first set of measurements. For example, the spacing between the first recess and the second recess may be identified as a feature of the first set of measurements.
  • the trace, shapes and/or features of the trace as discussed above may also be compared to predetermined patterns obtained in advance.
  • the user may pinch the guiding element 130 between their index finger and thumb.
  • One would expect the trace for a set of measurements obtained while the user is gripping the guiding element 130 in such a way to be similar to the traces shown in Figures 6 and 7. That is, one would expect the trace to show a first “dip” corresponding to the detection of the index finger and a second “dip” corresponding to the detection of the thumb, where the “dips” are separated by a distance essentially corresponding to the finger and thumb being at opposite sides of the guiding element 130.
  • the shape of the “dips” may be consistent with a typical shape expected for an index finger and thumb accordingly.
  • the user may grip the guiding element 130 differently.
  • the user may place all of their fingers around the outside of the guiding element 130 (for example, such that all five digits contact the circumferential surface of the guiding element 130 at roughly equally spaced locations around the periphery of the guiding element 130), or the user may use a single index finger placed on the top surface of the guiding element 130 but positioned off-centre.
  • Such interactions may have a typical expected pattern.
  • the process of identifying a first pattern based on the first set of measurements or signals from the touch-sensitive element obtained at the first time may involve determining which of the predetermined patterns corresponds to the first set of measurements. That is, one or more predetermined patterns may be fitted to the first set of measurements, for example using a least squares method or the like, and the predetermined pattern with the closest fit is identified as the pattern by the processing circuitry 106.
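One plausible way to realise the predetermined-pattern fitting described above is an ordinary least-squares comparison: evaluate each stored template against the measured trace (at every rotational offset, since the path is a closed loop) and keep the best fit. The template library, node ordering and naming below are assumptions for illustration, not the patent’s implementation.

```python
def best_matching_pattern(trace, templates):
    """Return (residual, name, offset) for the stored pattern that best fits `trace`.

    `trace` is the list of per-node values along the predetermined movement path;
    `templates` maps a pattern name (e.g. "pinch", "five-finger grip") to a list of
    expected per-node values over the same nodes in the same order.
    """
    n = len(trace)
    best = None
    for name, template in templates.items():
        for offset in range(n):                              # try every rotation of the template
            rotated = template[offset:] + template[:offset]
            residual = sum((a - b) ** 2 for a, b in zip(trace, rotated))
            if best is None or residual < best[0]:
                best = (residual, name, offset)
    return best
```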
  • the processing circuitry 106 identifies a second pattern based on the second set of measurements or signals from the touch-sensitive element obtained at the second time (e.g., at time t2). In much the same way as described above, the processing circuitry 106 identifies a pattern based on the second set of measurements, e.g., either of the trace (or shapes/features thereof) or a predetermined pattern fitted to the second set of measurements.
  • the processing circuitry 106 is configured to determine a shift factor.
  • the shift factor is indicative of a magnitude of a shift required to shift the first pattern to the second pattern such that the two (identified) patterns substantially align.
  • the identified first pattern is the first trace (shown in solid line in Figure 7) and the identified second pattern is the second trace (shown in dashed line in Figure 7).
  • the two patterns / traces are broadly similar but one is offset from the other (e.g., offset in the X-axis direction of the graph of Figure 7). As can be seen in Figure 7, one must shift the first / second trace by an amount ΔS such that the patterns (or features of the patterns) align or overlap.
  • the amount ΔS may be referred to herein as a shift factor.
  • the shift factor is the required “shift” (or translation in the x-axis direction) that one must make to align (or substantially align) the first and second traces with one another.
  • the shift factor may be calculated in any suitable way by the processing circuitry 106.
  • the processing circuitry 106 is configured to shift the position of the first trace (or second trace) relative to the second trace (or first trace) by an amount (e.g., by a distance of one sensor node), and identify a parameter indicative of the correlation between the two traces.
  • a parameter indicative of the correlation between the two traces may include the summation of the differences squared between each of the values for the corresponding sensor nodes of the first trace and the (shifted) second trace.
  • This process may be performed iteratively for different amounts (e.g., a shift of one sensor node, two sensor nodes, three sensor nodes, etc.) until the summation of the differences is minimised.
  • the process may perform an iteration in which there is no shift between the first and second traces.
  • The shift factor ΔS corresponds to the amount by which the first and second traces are relatively shifted (e.g., zero, one sensor node, two sensor nodes, etc.) when the summation of the differences is minimised.
  • In the example of Figure 7, the shift factor ΔS is equal to a shift of three sensor nodes. That is to say, the summation of the differences is minimised when the second trace is shifted by three sensor nodes in the negative x-direction.
  • the above represents a form of least squares regression which can be used to determine a degree of correlation between the first and second traces.
  • any suitable technique for identifying the correlation between two traces may be employed by the processing circuitry 106 in order to identify the shift factor ΔS.
  • the above approach identifies the shift factor ΔS using an iterative approach which progressively shifts the first and second traces relative to one another by an amount.
  • the amount is given above as a distance of one sensor node; that is, a trace is shifted by an amount such that the capacitance value for a given sensor node is shifted to the position of the adjacent sensor node and essentially replaces the capacitance value at the “new” sensor node.
  • the amount need not be this.
  • the amount may correspond to a distance of half a sensor node, a quarter of a sensor node, etc. Equally, the amount need not be the same for each iteration.
  • a first iteration may relatively shift the first and second traces by e.g., a distance of three sensor nodes, while the second iteration may be a relative shift of a distance of one sensor node.
  • the distances to relatively shift the first and second traces between iterations may be set in advance or be dependent on the value of the summation of the squares of the differences obtained for the previous iteration (e.g., if the previous iteration provides a relatively large summation, then the next iteration may be performed using a shift of a distance of three sensor nodes, as opposed to a situation where the previous iteration provides a relatively small summation which may mean the next iteration is performed using a shift of a distance of only one sensor node).
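Gathering the steps above into one place: the shift factor ΔS can be found by rotating one trace relative to the other one node at a time, computing the sum of squared differences at each candidate shift, and keeping the shift that minimises it. The sketch below uses a fixed whole-node step and circular wrap-around (appropriate for the closed path of Figure 4); finer or adaptive step sizes, as discussed above, would refine it.

```python
def shift_factor(first_trace, second_trace):
    """Circular shift (in sensor nodes) that best aligns `second_trace` with
    `first_trace`, found by minimising the sum of squared differences."""
    n = len(first_trace)
    best_shift, best_err = 0, float("inf")
    for shift in range(n):                                   # candidate shifts 0..n-1 nodes
        rotated = second_trace[shift:] + second_trace[:shift]
        err = sum((a - b) ** 2 for a, b in zip(first_trace, rotated))
        if err < best_err:
            best_shift, best_err = shift, err
    return best_shift                                        # this is ΔS, in sensor nodes
```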
  • the correlation may be performed using any identified feature or shapes of the trace (i.e., using the abovementioned pattern).
  • In the example of Figure 7, there are two possible features that could be used as a basis for performing the correlation (one corresponding to the “dip” attributed to the index finger and one corresponding to the “dip” attributed to the thumb).
  • One or more of these features may be selected and used to perform the correlation. In some instances, however, particularly where two or more features are used, different shift factors may be calculated for each feature.
  • the processing circuitry 106 may be configured to identify the respective features (e.g., the “dips”) and determine an amount for each of the features which causes the features to substantially align / overlap. That is to say, each feature is shifted by an amount that most closely aligns the feature identified in the first trace with the corresponding feature identified in the second trace.
  • For example, the shift amount determined for the feature corresponding to the index finger (i.e., the “dip” at nodes 4 and 5) may differ from the shift amount determined for the feature corresponding to the thumb (i.e., the “dip” at nodes 10 and 11).
  • In such cases, the processing circuitry 106 may be configured to set the shift factor ΔS based on an average of the two shift amounts corresponding to each of the features.
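A feature-based variant, along the lines just described, would locate each “dip” in both traces, pair corresponding dips, and average their individual shifts. The simple dip detector below (contiguous runs of nodes below the touch threshold, summarised by their centre) and the naive pairing by position are illustrative assumptions only.

```python
def find_dips(trace, threshold):
    """Centre index of each contiguous run of nodes whose value is below `threshold`."""
    dips, run = [], []
    for i, value in enumerate(trace):
        if value < threshold:
            run.append(i)
        elif run:
            dips.append(sum(run) / len(run))
            run = []
    if run:
        dips.append(sum(run) / len(run))
    return dips

def feature_shift(first_trace, second_trace, threshold):
    """Average circular shift between corresponding dips in the two traces."""
    n = len(first_trace)
    d1 = sorted(find_dips(first_trace, threshold))
    d2 = sorted(find_dips(second_trace, threshold))
    shifts = [(b - a) % n for a, b in zip(d1, d2)]           # naive pairing of dips by order
    return sum(shifts) / len(shifts) if shifts else 0.0
```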
  • the processing circuitry 106 is configured to determine the degree of movement of the user’s hand or digit along the predetermined movement path 132 based on the calculated shift factor ΔS.
  • the processing circuitry 106 may be provided with a mathematical relationship (e.g., in the form of an equation or look-up table) which maps the shift factor ΔS to an amount of movement of the user’s hand or digit (which may correspond to movement of the guiding element 130).
  • the amount of movement may be represented as a distance along the predetermined movement path (for example, expressed in mm or cm), a percentage or fraction of the total distance of the predetermined movement path (for example, expressed as 25 % or ¼ of the total predetermined movement path), or in the case of rotational movement, an angular amount (e.g., expressed in degrees or radians).
  • the processing circuitry 106 can determine a degree of movement - e.g., 90° in the case of the example described with respect to Figures 6 and 7.
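As a purely illustrative sketch of the mapping from shift factor to degree of movement, the following assumes a circular predetermined movement path covered by twelve equally spaced sensor nodes (as in the example of Figures 6 and 7); an equation is used here, although a look-up table could equally be provided.

```python
NODES_ON_PATH = 12                      # assumed number of sensor nodes on the circular path
DEGREES_PER_NODE = 360 / NODES_ON_PATH  # 30 degrees of rotation per node of shift

def shift_to_rotation(shift_factor):
    """Map a shift factor (in sensor nodes) to an angular movement in degrees.

    The sign convention is assumed: positive values indicate clockwise
    rotation, negative values anticlockwise.
    """
    return shift_factor * DEGREES_PER_NODE

# For example, a shift factor of minus three corresponds to 90 degrees anticlockwise.
assert shift_to_rotation(-3) == -90.0
```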
  • an intended user input can be determined by the processing circuitry 106 from the determined degree of movement along the predetermined movement path 132.
  • the intended user input may be sent to a corresponding system that the user input mechanism 1 is communicatively coupled to.
  • the user input mechanism 1 may be coupled to a speaker system (e.g., in a vehicle or the like), with the intended user input being sent to a controller of the speaker system which uses the intended user input to adjust the output volume of the speaker system.
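By way of a hypothetical application example only, a controller of such a speaker system might map the determined rotation to a volume change along the following lines; the step sizes and volume range are assumptions.

```python
def adjust_volume(current_volume, rotation_degrees, degrees_per_step=30, step=5):
    """Map a determined rotation (in degrees) to a new speaker output volume.

    Every 30 degrees of rotation changes the volume by 5 units (assumed values),
    clamped to an assumed range of 0 to 100. Negative rotation lowers the volume.
    """
    steps = int(rotation_degrees / degrees_per_step)
    return max(0, min(100, current_volume + steps * step))
```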
  • the processing circuitry 106 of the user input mechanism 1 may be integrated with the corresponding system, such that the processing circuitry 106 is also provided with the functionality to control the corresponding system.
  • the present disclosure provides for a user input mechanism 1 in which a user input is detected through using a capacitive sensing technique (e.g., mutual capacitance) to sense touches (e.g., direct touches or hover touches) using a sensor element 100. Owing to the presence of the guiding element 130, the sensed touches can be mapped to a predetermined movement path 132 and subsequently can be translated or mapped to a certain movement along the predetermined movement path 132.
  • an intended user input can be determined from the certain movement along the predetermined movement path 132 and subsequently used in any desired way, e.g., to control a function or process of a system communicatively coupled to the user-input mechanism.
  • the guiding element 130 is provided to help guide users such that the interaction between the user and the sensor element 100 is restricted (or substantially restricted) to the predetermined movement path 132. Accordingly, a user’s interaction with the user input mechanism 1, and hence the detection of a user’s desired input, can be more accurately and reliably determined. Moreover, the guiding element 130 provides a physical feature which some users may find more engaging or easier to use than the sensor element 100 alone.
  • the guiding element 130 itself does not capacitively couple to the sensor element 100 (or at least not in a way that allows for the position of the guiding element 130 to be determined). Hence, it is only the user’s hand or digit(s) that are detected by the sensor element 100 and it is the capacitive coupling of the user’s hand or digit(s) that is subsequently used to determine the user input.
  • the guiding element 130 is a separate component, formed separately from the sensor element 100 and which is capable of being mounted to the sensor element 100. This enables a degree of flexibility in terms of where the guiding element 130 may be positioned in respect of the sensor element 100, meaning that the user input mechanism 1 may be implemented in a wide range of applications.
  • conventional touch sensors may be retrofit with suitable guiding elements 130 to enable functions as described above to be implemented on existing touch sensors.
  • a shift factor of three may indicate a user turning the dial anticlockwise by an amount of 90°. Strictly speaking, this may be considered a shift factor of minus 3 because the first pattern (trace) is shifted in the negative X-direction by three sensor nodes.
  • equally, a shift factor of nine may indicate a user rotating the guiding element 130 by 270° in the clockwise direction, which, on a circular path of twelve sensor nodes, would produce the same measured pattern.
  • this may be considered a shift factor of positive nine, because the first pattern (trace) is shifted in the positive X-direction by nine sensor nodes.
  • Techniques may be implemented to help differentiate between these two motions.
  • the time between the first and second set of measurements (that is, the time between the first time and the second time) may be set to be sufficiently small that only small amounts of rotation are possible. For example, it may be determined that, under normal use conditions, a user takes 0.5 ms to rotate the guiding element by 180°. Therefore, separating the first and second measurements by a time period of less than 0.5 ms may enable the processing circuitry 106 to essentially exclude shift factors greater than six at any one time (i.e., the processing circuitry 106 may only consider a shift factor of ±6 between the first and second measurements, with the ± providing an indication of direction). Accordingly, it should be appreciated that the direction of movement (rotation) may affect the input signal that is generated by the processing circuitry 106 for controlling communicatively coupled systems - for example, anticlockwise rotation may indicate an intended user input of a decrease of a parameter while clockwise rotation may indicate an intended user input of an increase of a parameter.
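A minimal sketch of this sampling-interval constraint is given below; the ±6 limit and the mapping of sign to rotation direction are assumptions carried over from the example above.

```python
MAX_SHIFT_PER_SAMPLE = 6  # assumed largest plausible shift between consecutive measurements

def interpret_shift(shift_factor):
    """Interpret a shift factor as a direction and magnitude of rotation.

    Shift factors whose magnitude exceeds the assumed per-sample limit are
    rejected as implausible for the chosen sampling interval.
    """
    if abs(shift_factor) > MAX_SHIFT_PER_SAMPLE:
        return None  # implausible for this sampling interval; ignore the sample
    if shift_factor > 0:
        return ("clockwise", shift_factor)       # e.g., increase a parameter
    if shift_factor < 0:
        return ("anticlockwise", -shift_factor)  # e.g., decrease a parameter
    return ("none", 0)
```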
  • the user input mechanism 1, and more specifically the sensor element 100, is configured to operate using a mutual capacitance sensing technique.
  • the sensor element 100 may be configured to operate using the self-capacitance sensing technique.
  • the sensor element 100 may be configured to operate in a broadly similar way as above, e.g., by sensing the capacitances at different intersection points (sensor nodes) of the sensor element 100.
  • the self-capacitance technique is different in that, firstly, it is the self-capacitance of the driven electrode that is measured which means it can be more involved to determine the capacitance at the intersection points, and secondly, the self-capacitance technique is unable to distinguish between multiple simultaneous touches.
  • the sensor element 100 may be constructed to operate specifically (although not necessarily exclusively) using self-capacitance measurements.
  • the arrangement of the electrode array may be such that electrode pads are located around the predetermined movement path 132, whereby the self-capacitance of the electrode pads is measured.
  • the processing circuitry 106 may be configured to perform the calculation of the shift factor regardless. In this sense, the processing circuitry 106 identifies changes in the respective traces as above, but it should be appreciated that whether the processing circuitry 106 identifies two touches or (incorrectly) four touches may be insignificant for the purposes of calculating the shift factor. (In other words, the processing circuitry 106 is still configured to shift the pattern by an amount to identify the shift factor).
  • while the predetermined movement path 132 is described above essentially as a path having a width of one sensor node (that is to say, for each position on the movement path 132, a single sensor node corresponds to that position on the movement path 132), this need not be the case and the predetermined movement path 132 may essentially include a number of sensor nodes per location on the predetermined movement path.
  • the predetermined movement path 132 having a greater width can essentially be thought of as the predetermined movement path 132 having a greater thickness - that is, the bold line 132 may be even thicker than what is shown in Figure 4.
  • sensor node 2 and the sensor node directly above it may both potentially lie within the predetermined movement path 132. Accordingly, both of the sensor nodes may be affected by movement of the user’s hand or digit(s) and thus both contribute to the capacitive coupling. In such instances, an average of the capacitive coupling from both of sensor node 2 and the vertically adjacent sensor node may be considered as representing the capacitive coupling at that location along the predetermined movement path.
  • the location of the user’s hand or digit(s) may vary. This can be taken into consideration by considering an average (or weighted sum) of the different sensor nodes.
  • an average or weighted sum of signals from multiple nodes may be used to increase the degree of correlation between the first pattern and second pattern, thereby increasing the reliability to which the processing circuitry 106 is able to determine whether a first pattern corresponds or correlates to the second pattern. In particular, this may be used to help account for spatial differences between the nodes of the electrode array and the predetermined movement path 132.
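As an illustrative sketch only, a representative trace value for each location on the movement path might be built from several nearby sensor nodes as follows; the node groupings and the use of a plain (unweighted) average are assumptions.

```python
def path_trace(node_values, node_groups):
    """Build a trace for the predetermined movement path from per-node measurements.

    node_values : dict mapping a sensor-node identifier to its measured capacitance.
    node_groups : list, ordered along the path, of lists of node identifiers that
                  contribute to each location on the path.
    Returns one averaged value per location along the predetermined movement path.
    """
    return [
        sum(node_values[node] for node in group) / len(group)
        for group in node_groups
    ]

# Hypothetical example: sensor node 2 and the node directly above it both lie
# within the path, so their measurements are averaged for that location.
values = {"node2": 95.0, "node2_above": 97.0}
print(path_trace(values, [["node2", "node2_above"]]))  # -> [96.0]
```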
  • Figure 8 schematically shows the user input mechanism 1 and sensor element 100 in a similar manner to Figure 4.
  • Figure 8 is understood from Figure 4 and like reference signs indicate like components.
  • the user input mechanism 1 of Figure 8 is the same as the user input mechanism of Figure 4.
  • certain aspects of the operation of the measurement circuitry 105 and subsequently the processing circuitry 106 may be modified to help realise a greater degree of correlation (or at least a possibility of a greater degree of correlation) between a first pattern and a second pattern obtained from the sensor element 100, and thus to more reliably determine a shift factor ΔS between two patterns.
  • node 9 is, in the radial direction, much closer to the predetermined movement path 132 than node 10, while the rotational distance (i.e., the circumferential distance along the predetermined movement path 132) from node 8 to node 9 is different to the distance from node 9 to node 10.
  • the orthogonal grid of electrodes 101 , 102 and the predetermined movement path 132 do not overlap or correspond exactly.
  • the data measurement points A to D are shown as being equally spaced along the predetermined movement path 132.
  • the data measurement points A to D are provided at locations corresponding to 180° (A), 210° (B), 240° (C) and 270° (D) assuming the 0° location is the most vertical position in respect of the orientation shown in Figure 8.
  • nodes 13 to 18 of the sensor element 100 are also shown and labelled.
  • each of the data measurement points A to D represent a location along the predetermined movement path 132 for which a representative value of the mutual capacitance (or capacitive coupling) is to be obtained and used in a similar correlation process as described with respect to Figures 6 and 7. That is, for example, instead of a correlation involving nodes 1 to 12 as shown in Figures 6 and 7, a correlation involving the data measurement points A to D (and additional data measurement points) is performed.
  • a weighted average of mutual capacitance measurements from corresponding nodes of the electrode array in proximity of a given data measurement point A to D is used to provide the value representative of the mutual capacitance at the data measurement point A to D.
  • the weighted average takes into consideration the relative spatial distance of the nodes to the data measurement point. For example, taking data measurement point A, one can see from Figure 8 that nodes 7, 8, 13 and 14 are the four nodes of the orthogonal grid that are spatially closest to data measurement point A. Nodes 7 and 8 are the closest two nodes to data measurement point A and are equally spaced from data measurement point A.
  • nodes 13 and 14 are furthest from data measurement point A but are also located equidistant from data measurement point A. Based on the relative distances between the respective nodes 7, 8, 13, 14 and the data measurement point A, a certain weighted sum or average of the mutual capacitance measurements from these nodes provides a suitable representation of the mutual capacitance that would be expected to be obtained at data measurement point A.
  • the weighted average may be expressed as the sum of 80% of the mutual capacitance measurement at node 7, 80% of the mutual capacitance measurement at node 8, 20% of the mutual capacitance measurement at node 13 and 20% of the mutual capacitance measurement at node 14, all divided by four.
  • nodes 8, 9, 14 and 15 are the four nodes of the orthogonal grid that are spatially closest to data measurement point B.
  • Node 9 is perhaps marginally closer to data measurement point B than node 8, while nodes 14 and 15 are much further away with node 15 marginally closer to data measurement point B.
  • the weighted average may be expressed as the sum of 90% of the mutual capacitance measurement at node 9, 80% of the mutual capacitance measurement at node 8, 15% of the mutual capacitance measurement at node 15 and 10% of the mutual capacitance measurement at node 14, all divided by four.
  • Each of the data measurement points may similarly have an associated weighted average determined in order to produce a representation of the mutual capacitance as measured at that data measurement point.
  • in this way, each of the data measurement points to be used in the correlation process is obtained under equal conditions.
  • any variation in the capacitive coupling at these locations is due to external factors and not the distance of the node to the predetermined movement path 132.
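The weighted averages described for data measurement points A and B might, purely for illustration, be computed along the following lines; the weights are the example figures given above and would in practice be fixed by the geometry of a particular sensor and guiding element.

```python
# Assumed weights per data measurement point, taken from the examples above;
# each entry maps a contributing sensor node to its weight.
WEIGHTS = {
    "A": {7: 0.80, 8: 0.80, 13: 0.20, 14: 0.20},
    "B": {9: 0.90, 8: 0.80, 15: 0.15, 14: 0.10},
}

def measurement_point_value(point, node_values):
    """Weighted average of nearby node measurements for one data measurement point.

    node_values maps a sensor-node number to its mutual capacitance measurement.
    The result is divided by the number of contributing nodes (four in these examples).
    """
    weights = WEIGHTS[point]
    return sum(w * node_values[node] for node, w in weights.items()) / len(weights)
```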
  • the technique in Figure 8 proceeds substantially as described above except that the data measurement points rather than the nodes per se are used to generate the first pattern and the second pattern. More specifically, a first set of received signals from the sensor element 100 are used to determine representations of the capacitive coupling associated with the plurality of data measurement points (e.g., A to D) at the first time and the second set of received signals are used to determine representations of the capacitive coupling associated with the plurality of data measurement points at the second time.
  • the processing circuitry 106 is similarly configured to determine the difference between the first set of received signals from the touch-sensitive element obtained at the first time and the second set of received signals obtained at the second time; that is, the processing circuitry 106 determines the difference between the representations of the capacitive coupling associated with the plurality of data measurement points (e.g., A to D) obtained at the first time (providing the first pattern) and the representations of the capacitive coupling associated with the plurality of data measurement points (e.g., A to D) obtained at the second time (providing the second pattern).
  • the first and second patterns are less prone to variation over the course of moving along the predetermined movement path 132.
  • this allows the processing circuitry 106 to perform a more reliable determination of whether the first pattern corresponds to the second pattern, even in the presence of external noise and/or slight variations in the user’s grip on the guiding element 130.
  • a data measurement point between data measurement points A and B may provide a weighted average of a mutual capacitance measurement based on the measurements from nodes 8 and 14 (divided by two in this case).
  • while a weighted sum may be used to generate the representative mutual capacitance measurement for the data measurement points, this may only be suitable where data measurement points have the same number of nodes contributing to the representative mutual capacitance measurement for the given data measurement point. However, in most implementations, it may be more practical to utilise a weighted average. Therefore, the number of nodes contributing to the weighted average may vary depending on the correspondence between the nodes of the sensor element 100 and the predetermined movement path 132.
  • the nodes of the sensor element 100 that are to be used to provide a representative mutual capacitance measurement for a given data measurement point are determined in advance and may be stored by the measurement circuitry 105 and/or the processing circuitry 106.
  • while four nodes contribute to the representation of the mutual capacitance measurement at data measurement point A, for example, any number of nodes may be used, and in particular those nodes (or a subset thereof) which are likely to experience a change in mutual capacitance in the presence of an object at or in the vicinity of the given data measurement point.
  • Figure 9 is a representative method for providing a signal indicative of a user input to a system communicatively coupled to the user input mechanism 1.
  • the method starts at either step S1a or S2. At steps S1a and S2, the measurement circuitry 105 is operated to measure the capacitances (mutual capacitance and/or self-capacitance) of the sensor element 100.
  • at step S1a, the measurement circuitry 105 is configured to perform measurements of the capacitances of the entire electrode array at a first time (e.g., all sensor nodes, not just the sensor nodes lying on the predetermined movement path 132).
  • at step S1b, either the measurement circuitry 105 (before sending the measurements of the entire electrode array to the processing circuitry 106) or the processing circuitry 106 itself (after receiving the measurements of the entire electrode array from the measurement circuitry 105) is configured to identify those capacitance measurements from the capacitance measurements of the entire electrode array that correspond to the sensor nodes lying on the predetermined movement path 132.
  • the processing circuitry 106 and/or measurement circuitry 105 may be provided with information in advance enabling the processing circuitry 106 and/or measurement circuitry 105 to identify the relevant sensor nodes and associated capacitances of those sensor nodes. For example, this information may be provided as part of a calibration process, or otherwise provided to the processing circuitry 106.
  • steps S1a and S1b may be implemented using a subset of the sensor nodes of the electrode array, where the subset includes the sensor nodes lying along the predetermined movement path in addition to further sensor nodes (but fewer than all sensor nodes of the electrode array).
  • at step S2, the measurement circuitry 105 is instead configured to perform measurements of the capacitances of the sensor nodes lying on the predetermined movement path 132 at the first time.
  • the measurement circuitry 105 may be provided with information in advance concerning the sensor nodes lying on the predetermined movement path, as discussed above, such that the measurement circuitry 105 can appropriately drive the relevant electrodes, etc.
  • Methods implementing step S2 may provide reduced processing time and/or improved responsiveness as compared to methods implementing steps S1a and S1b.
  • at step S3, the processing circuitry 106, having received measurements from the measurement circuitry 105, is configured to determine whether at least one touch (direct touch or hover touch) is detected amongst the set of measurements corresponding to the sensor nodes located along the predetermined movement path 132.
  • a threshold, Ct, may be provided to identify whether a capacitance measurement corresponds to a detected touch or not.
  • where a capacitance measurement crosses the threshold Ct (for example, falls below it in the case of mutual capacitance measurements), the processing circuitry 106 recognises this as indicative of a touch being detected at the corresponding location of the sensor node.
  • conversely, where no capacitance measurement crosses the threshold Ct, the processing circuitry 106 recognises this as indicative of no touches being detected.
  • if no touch is detected, the method proceeds back to step S1a or S2 respectively, whereby the corresponding measurements are performed at a later time.
  • the capacitive measurements may be obtained periodically, for example, every 0.5 seconds.
  • the method may proceed back to steps S1a and S2 respectively after waiting for a period of e.g., 0.5 seconds, and thereafter the measurement circuitry 105 performs further capacitive measurements of the electrode array as described above.
  • if at least one touch is detected, the method proceeds to step S4.
  • at step S4, the measurement circuitry 105 is configured to perform a further capacitance measurement at a later time (e.g., the second time).
  • the measurements at step S4 may be performed in a similar manner to the measurements as performed at steps S1a or S2 accordingly.
  • the later time at step S4 may be different to the periodic time discussed above at step S3.
  • the purpose of the later time at step S4 is to ensure that any movement of the user’s hand or digit(s) is appropriately captured by the capacitance measurements made at the later time.
  • the later time may be 2 seconds or less, or 1 second or less, or 0.5 seconds or less, or 100 milliseconds or less, or 10 milliseconds or less, or 1 millisecond or less from the first time.
  • after step S4, the method proceeds to step S5, where patterns are identified in the set of measurements obtained at steps S1a and S1b (or step S2) and the set of measurements obtained at step S4, using any of the techniques described above.
  • at step S6, the processing circuitry 106 is configured to determine the shift factor ΔS, using any of the techniques as discussed above.
  • at step S7, the processing circuitry 106 is configured to generate an input signal indicative of the user’s intended input.
  • the generated input signal may be used by the processing circuitry 106 itself, or it may be transmitted to other control circuitry, e.g., circuitry of a communicatively coupled system.
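For illustration only, the overall flow of steps S2 to S7 might be sketched as below, reusing the find_shift_factor and shift_to_rotation sketches given earlier; the measurement and output interfaces, the threshold comparison and the sampling period are all assumptions.

```python
import time

def run_input_loop(measure_path_nodes, emit_input, touch_threshold, sample_period=0.5):
    """Illustrative loop corresponding broadly to steps S2 to S7 of Figure 9.

    measure_path_nodes : callable returning capacitance values for the sensor
                         nodes lying on the predetermined movement path.
    emit_input         : callable accepting the generated user input value.
    touch_threshold    : the threshold Ct below which a measurement indicates a touch.
    """
    while True:
        first = measure_path_nodes()                      # step S2: first measurement
        if not any(c < touch_threshold for c in first):   # step S3: any touch detected?
            time.sleep(sample_period)                     # no touch: wait, then retry
            continue
        time.sleep(sample_period)
        second = measure_path_nodes()                     # step S4: later measurement
        shift = find_shift_factor(first, second)          # steps S5/S6: pattern and shift factor
        if shift != 0:                                    # step S7: generate an input signal
            emit_input(shift_to_rotation(shift))
```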
  • the above method may be used to determine a degree of movement of the user’s hand or digit(s) corresponding to movement of the guiding element 130.
  • the first time and the second time may correspond with the start and end of the movement of the user’s hand or digit(s).
  • the processing circuitry 106 may be configured to output a plurality of user input signals, whereby each user input signal corresponds to a shift factor between an earlier set of measurements and a later set of measurements (e.g., between a first time and a second time, or a second time and a third time, etc.).
  • the method may proceed from step S7 to step S4, whereby step S4 is repeated at a later (e.g., a third) time, and step S5 is performed between measurements obtained at the second and third times.
  • the shift factor between the set of measurements either side of movement stopping may essentially be equal to zero.
  • the processing circuitry 106, upon detection of a zero shift factor, may not generate an input signal. Equally, in some instances, movement may stop when the user removes their hand or digit(s) from the guiding element 130, and hence an indication of the movement stopping or having stopped may also be where none of the measurements obtained at the later time fall below the threshold Ct. In this case, the processing circuitry 106 may also not generate an input signal.
  • the processing circuitry 106 may generate and output a single user input signal at step S7. This may be achieved either by summing the different shift factors calculated at step S6 for sets of measurements obtained at different times before proceeding to step S7 (substantially as above except only a single input signal is generated and output), or it may be achieved by only comparing the first pattern obtained from the measurements from the first time period (i.e., steps S1a or S2) with the set of measurements obtained at a time period when movement has stopped. In order to determine when movement has stopped in this case, a set of measurements made at a time tx can be compared to measurements made at a time ty (where time ty is later than time tx).
  • if the measurements made at time ty are substantially the same as the measurements made at time tx, this can indicate that movement of the user’s hand or digit(s) stopped at around time tx. Accordingly, the measurements obtained at the first time (at steps S1a or S2) and the measurements obtained at tx form the two sets of measurements used within steps S5 and S6.
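Determining that movement has stopped by comparing the sets of measurements at times tx and ty might, as a sketch, look like the following; the per-node tolerance is an assumption.

```python
def movement_stopped(measurements_tx, measurements_ty, tolerance=1.0):
    """Return True if two measurement sets are substantially the same.

    measurements_tx and measurements_ty are capacitance values for the sensor
    nodes on the predetermined movement path, obtained at times tx and ty
    (ty later than tx). The tolerance is a hypothetical per-node limit on the
    difference that still counts as "no movement".
    """
    return all(
        abs(a - b) <= tolerance
        for a, b in zip(measurements_tx, measurements_ty)
    )
```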
  • the approach at steps S1a and S1b may be used initially, while the approach at step S2 may be used when performing step S4 of the method of Figure 9.
  • the user input mechanism 1 of Figures 4 and 5 provides a guiding element 130 which is provided at a fixed location relative to the touch-sensitive element, and which has a predetermined movement path that broadly corresponds to a path that follows an outer contour of the guiding element 130. More specifically, the guiding element 130 is mounted on the touch-sensitive surface so as to rotate about a fixed axis on the touch-sensitive element, with the predetermined movement path 132 being circular.
  • the present disclosure is not limited to such guiding elements.
  • Figure 10 shows, schematically, a further example of a guiding element 230.
  • Figure 10 will be understood from Figure 4, with like components attributed similar reference signs. A specific description of these features is omitted herein, and only the differences will be described in detail.
  • Guiding element 230 takes the form of a slider which is configured to follow a substantially linear path from a first position to a second position under movement from a user’s hand or digit(s). This movement is indicated by the double-headed arrow in Figure 10.
  • the guiding element 230 is again a separate component that is mounted, movably, to the touch-sensitive element (and more specifically the cover 108).
  • the guiding element 230 may be mounted to the cover 108 via a track or the like (not shown), which may include an inverse T-shaped channel that allows a corresponding T-shaped part of the guiding element 230 to fit into the T-shaped channel, so as to permit sliding motion of the guiding element 230 along the T-shaped channel.
  • the T-shaped channel may be a separate component that is fixably mounted to the cover 108, e.g., via an adhesive, although in other implementations, the T-shaped channel may be formed in the cover 108.
  • the visible or exposed part of the guiding element 230 is shown in Figure 10 as taking a substantially cuboidal shape, however it should be appreciated that the guiding element 230 may take any desired shape as appropriate.
  • Figure 10 also shows a linear predetermined movement path 232.
  • the linear predetermined movement path 232 may broadly align with the track that the guiding element 230 is able to move along.
  • a number of sensor nodes 1 to 12 are indicated which approximately overlap with, or correspond to, the predetermined movement path 232.
  • sensor nodes that overlap with, or correspond to, the predetermined movement path 232 are the sensor nodes which exhibit or may exhibit a strong coupling between the user’s hand or digit(s) interacting with the guiding element 230. These sensor nodes may be determined in advance. In Figure 10, it can be seen that there are pairs of sensor nodes that are distributed either side of the predetermined movement path 232.
  • a user may interact with the guiding element 230 by e.g., placing their index finger on one half of the longitudinal side surface of the guiding element 230 and their middle finger on the other half.
  • Such an approach means that during movement of the guiding element 230, the index finger predominantly couples to sensor nodes 1, 3, 5, 7, etc. while the middle finger predominantly couples to sensor nodes 2, 4, 6, 8, etc. (or vice versa).
  • the user may be able to interact with the guiding element 230 using only a single finger.
  • measurements may be obtained at different times (e.g., a first time and a second time), and subsequently patterns are identified in the measurements and used to determine a shift factor ΔS, which again can be used to generate a user input, as described above. That is to say, the techniques described above in the case of a rotating guiding element 130 may be correspondingly applied to a linearly sliding guiding element 230. Again, based on the measurements obtained, an intended user input can be calculated by finding the shift factor that shifts the first measurements to the second measurements (or vice versa).
  • the guiding element 230 is mounted to the touch-sensitive element such that the guiding element 230 is able to move relative to the touch-sensitive element along the predetermined movement path 232, where the guiding element 230 is configured to move in a linear direction relative to the touch-sensitive surface, and wherein the predetermined movement path 232 is linear.
  • Figures 4 and 10 show a guiding element 130, 230 which is moveable (rotationally or linearly).
  • the guiding element may substantially be permitted to move in any configuration relative to the touch-sensitive surface (provided it follows a predetermined movement path). That is, the relative motion may be linear, curved, rotational, etc. or any combination thereof.
  • the guiding element may not be permitted to move, but instead is fixedly mounted to the touch-sensitive surface such that movement of the guiding element is not permitted.
  • Figure 11 shows, schematically, a further example of a guiding element 330.
  • Figure 11 will be understood from Figures 4 and 10, with like components attributed similar reference signs. A specific description of these features is omitted herein, and only the differences will be described in detail.
  • Guiding element 330 in this implementation takes the form of a three-dimensional shape.
  • the three-dimensional shape of the guiding element 330 is not particularly limited, and any shape may be used in accordance with the present disclosure. While a more complex shape is shown in Figure 11, other shapes such as a cylinder (e.g., as shown in Figure 4) or a cuboid (e.g., as shown in Figure 10) may also be used.
  • the predetermined movement path 332 is instead defined by the outer contours of the guiding element 330. Accordingly, the predetermined movement path 332 may broadly align with or follow the outer contours of the guiding element 330.
  • the guiding element 330 is again a separate component that is mounted, in this case fixedly, to the touch-sensitive element (and more specifically the cover 108).
  • the way in which a user interacts with the guiding element 330 is different to the way in which the user interacts with the moveable guiding elements 130, 230.
  • the user places a digit or digits either side of the guiding element 330, and correspondingly slides their digit or digits along the outer surface of the guiding element 330.
  • This motion is broadly represented by the double-headed arrow on Figure 11.
  • the guiding element 330 may have an outer surface which facilitates the sliding of a user’s digit or digits (for example, the outer surface may be highly polished).
  • Figure 11 correspondingly shows the predetermined movement path 332.
  • a number of sensor nodes 1 to 12 are indicated which approximately overlap with, or correspond to, the predetermined movement path 332 (noting that a predetermined movement path 332 may also be present on the opposite side of the guiding element 330, although this is not shown in Figure 11).
  • sensor nodes that overlap with, or correspond to, the predetermined movement path 332 are the sensor nodes which exhibit or may exhibit a strong coupling between the user’s hand or digit(s) interacting with the guiding element 330. These sensor nodes may be determined in advance.
  • the indicated sensor nodes broadly follow the outer contours of the guiding element 330.
  • Similar traces of the capacitive couplings corresponding to measurements obtained from sensor nodes 1 to 12 may be obtained. Again, measurements may be obtained at different times (e.g., a first time and a second time), and subsequently patterns are identified in the measurements and used to determine a shift factor AS, which again can be used to generate a user input, as described above. That is to say, the techniques described above in the case of a rotating or sliding guiding element 130, 230 may be analogously applied to a fixed guiding element 330.
  • the guiding element 130, 230 and 330 may be provided in a fixed or moveable configuration with respect to the cover 108 of the sensor element 100.
  • the principles of the present disclosure apply regardless of how the guiding element 130, 230 and 330 is coupled to the touch-sensitive element / cover 108.
  • a first and second set of measurements corresponding to the capacitive coupling (self- or mutual-capacitive couplings) at each of a plurality of locations (e.g., the sensor nodes) corresponding to a predetermined movement path may be obtained, and based on these measurements a degree of movement of a user’s hand or digit(s) is obtained accordingly.
  • the locations corresponding to the predetermined movement path may be obtained in advance and provided to the measurement circuitry 105 and/or the processing circuitry 106.
  • Such information may be provided based on empirical modelling. However, in other implementations, the information may be obtained on the basis of a calibration process.
  • Figure 12 depicts an example method for calibrating a user input mechanism including a guiding element 130, 230 or 330.
  • Figure 12 starts at step S100 where a guiding element 130, 230, 330 is mounted to the cover 108 of the user input mechanism 1.
  • the guiding element 130, 230, 330 is attached in a suitable way as described above.
  • at step S102, the method proceeds to obtain a first set of measurements indicative of the capacitive couplings from the electrode array of the sensor element 100.
  • the first set of measurements is made in the absence of any user interaction with the sensor element 100 (that is, the user’s hand or digit(s) are far enough away from the sensor element 100 so as not to affect the capacitive coupling).
  • the first measurements are therefore indicative of the steady state capacitive couplings, and are subsequently used as a reference.
  • the user interacts with the guiding element 130, 230, 330 by placing their hand / digit(s) at a first position.
  • in the case of the rotatable guiding element 130, this may simply be gripping the cylindrical guiding element 130 at any suitable position.
  • in the case of the slidable guiding element 230, this may require ensuring the guiding element 230 is positioned at the start of its full range of travel before making the measurements.
  • This may also apply to fixed guiding elements, such as guiding element 330, which have a non-continuous predetermined movement path. In this case, while the guiding element 330 itself does not move, the user may place their digits at one end of the predetermined movement path.
  • the method proceeds to obtain a second set of measurements indicative of the capacitive couplings from the electrode array of the sensor element 100 at the first position. These measurements may be obtained by the measurement circuitry 105 and provided to the processing circuitry 106.
  • at step S106, the user subsequently performs measurements, via the measurement circuitry 105, at subsequent positions of the guiding element / user’s hand or digit(s).
  • the user may rotate the guiding element 130 by an amount, e.g., 20° to arrive at the second position (and subsequently a further 20° to arrive at the third position, etc.).
  • the user may push or slide the guiding element 230 along the track by an amount less than the full range of travel of the guiding element 230.
  • in the case of the fixed guiding element 330, the user slides their digit(s) to a second position, third position, etc. along the contours of the guiding element 330. It should be appreciated that the more complex the shape of the guiding element and/or of the predetermined movement path, the more positions should be measured in order to improve the accuracy of the calibration.
  • at step S108, the method proceeds to determine whether all positions (along the predetermined movement path) have been measured. This is a question that may be directed to the user after every set of measurements obtained by the measurement circuitry 105. In the case of a non-continuous predetermined movement path, measurements should be performed at least at the start and end of the predetermined movement path in order to provide a suitable calibration. For continuous predetermined movement paths, at least one complete loop (e.g., rotation) should be obtained in order to provide a suitable calibration. Assuming the above conditions are met for the respective guiding elements, the user may answer step S108 in the affirmative, i.e., a YES at step S108.
  • if not, i.e., a NO at step S108, then further measurements at different positions are performed until all positions have been measured. It should also be appreciated that the answer to the question at step S108 may be subjective, particularly when movement of the user’s hand / digit(s) or guiding element is continuous (i.e., not discrete). However, the purpose of this step in the method is to ensure that at least the capacitances at the start and end of the predetermined movement path are measured, along with at least some other measurements therebetween.
  • the processing circuitry 106 is configured to analyse the set of measurements performed at each of the positions at steps S104 and S106 with respect to the set of measurements performed at step S102. For each set of measurements obtained at steps S104 and S106, any identified changes in the capacitive couplings at certain sensor nodes (e.g., in the case of a mutual capacitance, a decrease in the mutual capacitance) indicate that those sensor nodes are affected by the user’s hand / digit(s) and therefore are candidate sensor nodes for the predetermined movement path.
  • the processing circuitry 106 may use the capacitive threshold Ct mentioned above as an indicator to determine whether the sensor node is a candidate sensor node or not.
  • a stricter threshold may be applied (i.e., a greater decrease from the reference), in order to exclude sensor nodes which may only be weakly affected by the user’s hands / digits.
  • in this way, the processing circuitry 106 is configured to determine locations (sensor nodes) on the touch-sensitive surface that correspond to the predetermined movement path of the guiding element.
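As an illustrative sketch of this analysis step of the calibration of Figure 12, candidate sensor nodes might be identified by comparing each position measurement against the reference set obtained at step S102; the numerical threshold is an assumption and, as noted above, a stricter value may be chosen to exclude weakly affected nodes.

```python
def find_path_nodes(reference, position_measurements, delta_threshold=10.0):
    """Identify candidate sensor nodes lying on the predetermined movement path.

    reference             : dict of node -> capacitance with no user present (step S102).
    position_measurements : list of dicts of node -> capacitance, one per hand/digit
                            position (steps S104 and S106).
    delta_threshold       : assumed minimum decrease from the reference for a node
                            to count as affected by the user's hand or digit(s).
    """
    path_nodes = set()
    for measurement in position_measurements:
        for node, value in measurement.items():
            # Mutual capacitance decreases when a hand or digit is present.
            if reference[node] - value >= delta_threshold:
                path_nodes.add(node)
    return path_nodes
```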
  • the present disclosure is not limited to this. Indeed, an advantage of providing a separate guiding element which can be mounted to the sensor element 100 (and more particularly the cover 108) is that it allows greater flexibility in the placement of the guiding element relative to the touch-sensitive surface. In some applications, for example, in the creation of touch-sensitive vehicle (car) dashboards, the touch-sensitive surface may span a relatively large area as compared to the area occupied by the projection of the guiding element on the touch- sensitive surface.
  • the processing circuitry 106 may be configured to receive capacitive measurements from the electrode array from locations other than those corresponding to the predetermined movement path(s), and subsequently identify the presence of one or more touches located at sensor nodes not corresponding to the predetermined movement path.
  • the processing circuitry 106 is configured to process the detection of the touch in a conventional manner, once it is determined or recognised that the touch does not correspond to a sensor node located on the predetermined movement path.
  • Such an approach permits greater flexibility in the placement of guiding elements according to certain applications, but it additionally facilitates the integration of more physical / tactile user inputs alongside the conventional capacitive touch inputs.
  • Figures 4, 10 and 11 show the electrode array extending underneath the guiding elements (see Figure 11 in particular).
  • the electrode array may be designed with the guiding element that is mounted, or to be mounted, to the touch-sensitive surface in mind.
  • for example, with reference to Figure 11, the four unlabelled sensor nodes located between sensor nodes 1 to 8 may be omitted, provided the electrode array can be driven accordingly.
  • the measurement circuitry 105 may be configured to not obtain measurements from certain sensor nodes in light of the areal extent of the guiding element (again, for example, with reference to Figure 11, the capacitance at the four unlabelled sensor nodes located between sensor nodes 1 to 8 may not be measured by the measurement circuitry 105).
  • intersection points are points at which the electrodes spatially intersect
  • the electrodes may not spatially intersect, but rather be in close proximity to one another.
  • the electric field generated by an electrode may be intersected by an adjacent electrode to define an intersection point. That is, the intersection point is to be understood as a point in space at which the electric field(s) of one electrode intersect a second electrode most strongly (in other words, the point at which the capacitive coupling is greatest).
  • the user input mechanism includes a touch-sensitive element comprising a plurality of electrodes that define a touch sensitive surface; a guiding element mounted to the touch- sensitive element, wherein the guiding element is configured such that a user’s digit or hand may be guided, relative to the touch-sensitive surface, along a predetermined movement path; and processing circuitry configured to process signals received from the touch-sensitive element.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention relates to a user input mechanism for providing a signal indicative of a user input to a system communicatively coupled to the user input mechanism. The user input mechanism comprises a touch-sensitive element comprising a plurality of electrodes that define a touch-sensitive surface; a guiding element mounted to the touch-sensitive element, the guiding element being configured such that a user's digit or hand may be guided, relative to the touch-sensitive surface, along a predetermined movement path; and processing circuitry configured to process signals received from the touch-sensitive element. The processing circuitry is configured to: identify and/or receive signals from the touch-sensitive element, the signals each corresponding to a capacitive coupling at locations on the touch-sensitive element along the predetermined movement path; and determine a degree of movement of the user's digit or hand along the predetermined movement path by determining the difference between a first set of signals received from the touch-sensitive element obtained at a first time and a second set of signals received from the touch-sensitive element obtained at a second time. A system, a first method and a second method are also disclosed.
PCT/GB2024/051250 2023-05-26 2024-05-14 Mécanisme et procédé d'entrée d'utilisateur Pending WO2024246484A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2307913.0 2023-05-26
GB2307913.0A GB2630382A (en) 2023-05-26 2023-05-26 User input mechanism and method

Publications (1)

Publication Number Publication Date
WO2024246484A1 true WO2024246484A1 (fr) 2024-12-05

Family

ID=87060975

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2024/051250 Pending WO2024246484A1 (fr) 2023-05-26 2024-05-14 Mécanisme et procédé d'entrée d'utilisateur

Country Status (2)

Country Link
GB (1) GB2630382A (fr)
WO (1) WO2024246484A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2463764A2 (fr) * 2010-12-12 2012-06-13 Thomas Klotz Dispositif et procédé de commande d'un écran tactile
US20200233521A1 (en) * 2017-10-11 2020-07-23 Mitsubishi Electric Corporation Operation input device
US20210286470A1 (en) * 2018-11-29 2021-09-16 Japan Display Inc. Sensor device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101393492A (zh) * 2007-09-20 2009-03-25 胡书彬 边际方向触摸键及其操作方法
CN103576864A (zh) * 2012-07-20 2014-02-12 贺征东 一种触摸屏虚拟键盘辅助装置和方法
CN107430447B (zh) * 2015-04-13 2020-11-20 三菱电机株式会社 操作工具、输入装置以及电子设备
TWI691868B (zh) * 2018-12-03 2020-04-21 宏碁股份有限公司 旋鈕裝置與相關互動顯示裝置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2463764A2 (fr) * 2010-12-12 2012-06-13 Thomas Klotz Dispositif et procédé de commande d'un écran tactile
US20200233521A1 (en) * 2017-10-11 2020-07-23 Mitsubishi Electric Corporation Operation input device
US20210286470A1 (en) * 2018-11-29 2021-09-16 Japan Display Inc. Sensor device

Also Published As

Publication number Publication date
GB202307913D0 (en) 2023-07-12
GB2630382A (en) 2024-11-27

Similar Documents

Publication Publication Date Title
KR102363531B1 (ko) 인덕티브 센싱과 정전용량형 센싱을 이용하는 터치 포스 센서 및 그 동작 방법
TWI433014B (zh) 顯示裝置
US9652093B2 (en) Touch sensors and touch sensing methods
US20110310064A1 (en) User Interfaces and Associated Apparatus and Methods
EP2149838B1 (fr) Ecran tactile capacitif d'un dispositif d'affichage pour détecter un doigt et un stylet
US9619044B2 (en) Capacitive and resistive-pressure touch-sensitive touchpad
US20120306802A1 (en) Differential capacitance touch sensor
US10078400B2 (en) Touch sensor panel and method correcting palm input
JP6369805B2 (ja) タッチセンサ装置及び電子機器並びにタッチジェスチャー検知プログラム
CN101203901B (zh) 带有旋转检测的基于手持式光标器的输入设备
US9405383B2 (en) Device and method for disambiguating region presses on a capacitive sensing device
US20130222337A1 (en) Terminal and method for detecting a touch position
JP2008146654A (ja) タッチパネル及びこれに用いる位置検出方法
US8624865B2 (en) Device for improving the accuracy of the touch point on a touch panel and a method thereof
CN103914178A (zh) 触摸传感芯片、接触感应装置及该装置的坐标校准方法
CN106293136B (zh) 低轮廓电容指点杆
WO2024246484A1 (fr) Mécanisme et procédé d'entrée d'utilisateur
WO2013023088A1 (fr) Geste à deux doigts sur un capteur linéaire ou un capteur à une seule couche
US20140168112A1 (en) Touch sensing method and touch sensing apparatus
EP2404229A1 (fr) Capacité de surface à gestes de surface
EP3980876B1 (fr) Dispositif et méthode sensible au toucher
US20130201148A1 (en) Two-finger gesture on a linear sensor or single layer sensor
HK1195957A (en) Two-finger gesture on a linear sensor or single layer sensor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24729343

Country of ref document: EP

Kind code of ref document: A1