
GB2639967A - A hair styling system and control thereof - Google Patents

A hair styling system and control thereof

Info

Publication number
GB2639967A
GB2639967A (application GB2404586.6A, GB202404586A)
Authority
GB
United Kingdom
Prior art keywords
hair
uwb
user
gesture
computing device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2404586.6A
Other versions
GB202404586D0 (en)
Inventor
Cutter Ry
George Milner Robert
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jemella Ltd
Original Assignee
Jemella Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jemella Ltd filed Critical Jemella Ltd
Priority to GB2404586.6A priority Critical patent/GB2639967A/en
Publication of GB202404586D0 publication Critical patent/GB202404586D0/en
Priority to PCT/GB2025/050676 priority patent/WO2025202662A1/en
Publication of GB2639967A publication Critical patent/GB2639967A/en
Pending legal-status Critical Current


Classifications

    • A - HUMAN NECESSITIES
    • A45 - HAND OR TRAVELLING ARTICLES
    • A45D - HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D1/00 - Curling-tongs, i.e. tongs for use when hot; Curling-irons, i.e. irons for use when hot; Accessories therefor
    • A45D1/28 - Curling-tongs, i.e. tongs for use when hot; Curling-irons, i.e. irons for use when hot; Accessories therefor with means for controlling or indicating the temperature
    • A - HUMAN NECESSITIES
    • A45 - HAND OR TRAVELLING ARTICLES
    • A45D - HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D1/00 - Curling-tongs, i.e. tongs for use when hot; Curling-irons, i.e. irons for use when hot; Accessories therefor
    • A - HUMAN NECESSITIES
    • A45 - HAND OR TRAVELLING ARTICLES
    • A45D - HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D1/00 - Curling-tongs, i.e. tongs for use when hot; Curling-irons, i.e. irons for use when hot; Accessories therefor
    • A45D1/02 - Curling-tongs, i.e. tongs for use when hot; Curling-irons, i.e. irons for use when hot; Accessories therefor with means for internal heating, e.g. by liquid fuel
    • A45D1/04 - Curling-tongs, i.e. tongs for use when hot; Curling-irons, i.e. irons for use when hot; Accessories therefor with means for internal heating, e.g. by liquid fuel, by electricity
    • A - HUMAN NECESSITIES
    • A45 - HAND OR TRAVELLING ARTICLES
    • A45D - HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D1/00 - Curling-tongs, i.e. tongs for use when hot; Curling-irons, i.e. irons for use when hot; Accessories therefor
    • A45D1/06 - Curling-tongs, i.e. tongs for use when hot; Curling-irons, i.e. irons for use when hot; Accessories therefor with two or more jaws
    • A - HUMAN NECESSITIES
    • A45 - HAND OR TRAVELLING ARTICLES
    • A45D - HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D2/00 - Hair-curling or hair-waving appliances; Appliances for hair dressing treatment not otherwise provided for
    • A - HUMAN NECESSITIES
    • A45 - HAND OR TRAVELLING ARTICLES
    • A45D - HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D2/00 - Hair-curling or hair-waving appliances; Appliances for hair dressing treatment not otherwise provided for
    • A45D2/001 - Hair straightening appliances
    • A - HUMAN NECESSITIES
    • A45 - HAND OR TRAVELLING ARTICLES
    • A45D - HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D20/00 - Hair drying devices; Accessories therefor

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

A hair styling system comprises a hair styler (1, fig. 1) and at least one ultra-wide band (UWB) sensor 220 configured to transmit radio pulses and to receive signals reflected by objects adjacent the sensor. In one embodiment a processor can identify a gesture made by a user 410 and, in response to the identified gesture, control at least one setting of the hair styling device or a computing device. The gesture is preferably compared with a plurality of gesture recognition models comprising UWB radio signal patterns corresponding to gestures. The gesture may control e.g. the temperature of the hair styling device or music tracks played by a computing device. In a second embodiment the signals from the UWB sensor may be used to identify movement of a hair styling device which has a radio frequency identification (RFID) tag thereon. Feedback can then be provided to the user based on the movement.

Description

A Hair Styling System and Control thereof

The present invention relates to an apparatus and method for styling the hair of a person (or conceivably an animal), for example after washing the hair or as part of a styling process, and a means of controlling that apparatus during the styling process.
Typically, handheld (portable) hair stylers (e.g., hair straightening and/or curling devices) are provided with one or more user interfaces that allow the user to set user defined parameters for the device. For example, the one or more user interfaces may allow a user to set a desired operating temperature of the hair styler, and/or request a cool/heat shot. The one or more user interfaces may also allow a user, by way of example only, to select one or more styling programs to be used to style their hair. For example, the hair styler may have pre-programmed styling programs stored within its memory that set specific parameters to facilitate achieving a particular hair style. Additionally, the one or more user interfaces may also allow a user to switch between different operating modes of the hair styler. For example, the user may switch between a training mode where the heaters of the hair styler are switched off allowing a user to practise particular styling techniques and moves without actually styling their hair, and a styling mode, where the heaters of the hair styler are operable and heat the hair.
However, such user input functionality to control the hair styler has drawbacks, especially, but not exclusively, when using the hair styler in the styling mode described above. For example, the need to use different buttons at different positions on the styler, and/or one or more touch displays to operate the styler and control its functions can be difficult while also styling one's hair. In particular, but not exclusively, the need to operate the styler using a touch sensitive display can result in the user having to interrupt styling procedures to look at the display and select an input. This can have negative effects on the quality of a style using the styler as the user has to regularly remove their hair from the device to select a new function and then begin running a tress of hair through the device again once the new function has been selected. There is therefore a need to improve functionality of hair stylers while minimising the amount of user input interaction required. In particular, there is a need to improve functionality of such hair stylers without requiring the user to interrupt styling procedures to change styler settings. There is also a need to improve functionality of such hair stylers without requiring a user to move from a seated position while styling. There is also a need to improve functionality of such hair stylers without requiring a user to move from their current position in general while styling. For example, where a user is using the hair styler in a standing position in front of a mirror, there is a need to improve functionality of such hair stylers so that such users do not have to move around a lot while in front of the mirror.
Furthermore, during the use of a hair styler, whether that be in a training mode or a styling mode as described above, the user may benefit from feedback from the styler (or another device in communication with the styler) to provide guidance and instruction to the user, thereby ensuring correct use of the styler. By ensuring such correct use of the styler, optimal styling may be achieved. This is particularly true where a user wishes to achieve more complicated styles that may traditionally require the assistance of a professional hair stylist. There is therefore a need to provide feedback to a user during their use of the hair styler device.
Summary of Invention
In an aspect of the invention there is provided a hair styling system comprising a hair styling device, at least one ultra-wide band, UWB, sensor configured to transmit UWB radio pulses and to receive UWB radio signals reflected by objects adjacent the at least one UWB sensor, and a computing device in communication with the at least one UWB sensor, the computing device comprising a processor configured to i) process signals obtained from the at least one UWB sensor to identify a gesture made by a user adjacent to the at least one UWB sensor, and ii) control, in response to the identified gesture, at least one setting of the hair styling device or the computing device.
In another aspect, to process signals obtained from the at least one UWB sensor to identify a gesture made by a user adjacent to the at least one UWB sensor may comprise comparing the signals obtained from the at least one UWB sensor with a plurality of gesture recognition models.
In another aspect, comparing the signals obtained from the UWB sensor with a plurality of gesture recognition models may comprise comparing a pattern of the signals obtained from the at least one UWB sensor with each one of the gesture recognition models of the plurality of gesture recognition models, wherein each gesture recognition model comprises one or more pre-stored UWB radio signal patterns corresponding to a gesture.
In another aspect, the one or more pre-stored UWB radio signal patterns corresponding to a gesture comprise at least one of: a distance-amplitude signal graph, and/or a distance-time signal graph.
In another aspect, to process signals obtained from the at least one UWB sensor to identify a gesture made by a user adjacent the at least one UWB sensor may comprise comparing the signals obtained from the at least one UWB sensor with a gesture recognition model, wherein each one of the gesture recognition models is a convolutional neural network, CNN. Additionally, or alternatively the signals may be compared with a plurality of gesture recognition models, wherein each one of the gesture recognition models is a convolutional neural network, CNN.
In another aspect each identified gesture is associated with one of a plurality of control commands and in response to identifying the gesture made by a user adjacent to the at least one UWB sensor the processor is configured to transmit a corresponding control command to the hair styler to control the at least one setting of the hair styling device.
In another aspect, controlling the at least one setting of the hair styling device, in response to an identified gesture may comprise increasing a temperature of the hair styler, or increasing a temperature of the hair styler by a pre-configured step size, or initiating a heat-shot function of the hair styler, or initiating a cold-shot function of the hair styler.
In another aspect, in response to identifying the gesture made by a user adjacent to the at least one UWB sensor the processor may be configured to control the at least one setting of the computing device in communication with the at least one UWB sensor.
In another aspect, in response to an identified gesture, the at least one setting of the computing device in communication with the at least one UWB sensor may comprise restarting a music track being played by the computing device or skipping a music track being played by the computing device.
In an aspect of the invention there is provided a hair styling system comprising a hair styling device, a plurality of ultra-wide band, UWB, sensors, each UWB sensor configured to transmit UWB radio pulses and to receive UWB radio signals reflected by objects adjacent the plurality of UWB sensors, and a computing device in communication with the plurality of UWB sensors. The computing device comprises a processor configured to i) process signals obtained from each UWB sensor of the plurality of UWB sensors to identify a dynamic gesture made by a user, wherein the dynamic gesture is a moving gesture, ii) process the signals obtained from each UWB sensor of the plurality of UWB sensors to determine a position of a user's head, and iii) control, in response to the identified dynamic gesture and the determined position of the user's head, at least one setting of the hair styling device or the computing device.
In another aspect, the processor is configured to process signals obtained from the plurality of UWB sensors to identify the dynamic gesture made by the user may comprise comparing the signals obtained from the plurality of UWB sensors with a plurality of gesture recognition models.
In another aspect, comparing the signals obtained from the plurality of UWB sensors with a plurality of gesture recognition models may comprise comparing a pattern of the signals obtained from the plurality of UWB sensors with each one of the gesture recognition models of the plurality of gesture recognition models, wherein each gesture recognition model comprises one or more pre-stored UWB radio signal patterns corresponding to a dynamic gesture. In addition, the one or more pre-stored UWB radio signal patterns corresponding to a dynamic gesture may comprise a distance-time signal graph.
In another aspect, the processor is configured to process signals obtained from the plurality of UWB sensors to identify a gesture made by a user may comprise comparing the signals obtained from the plurality of UWB sensors with a plurality of gesture recognition models, wherein each one of the gesture recognition models is a convolutional neural network (CNN).
In another aspect, the processor is configured to process the signals obtained from each UWB sensor of the plurality of UWB sensors to determine a position of a user's head may comprise identifying, in the signals obtained from each UWB sensor a static portion of the signals, wherein the static portion of the signals is a portion of the signals that remains unchanged over time. For example, during use of the hair styler a user may move their head (e.g., tilt/twist/turn their head). The processor thus may be configured to process the signals obtained from each UWB sensor of the plurality of UWB sensors to determine a position of a user's head as well as changes in the position of a user's head over time (e.g., a time frame).
In another aspect, to control, in response to the identified dynamic gesture and the determined position of the user's head, at least one setting of the hair styling device may comprise determining whether the dynamic gesture began on the left- or right-hand side of the user's head, and wherein each identified dynamic gesture and its starting position relative to the left- or right-hand side of the user's head is associated with one of a plurality of different control commands, and in response to identifying the gesture made by a user and its starting position relative to the left- or right-hand side of the user's head, the processor is configured to transmit a corresponding control command to the hair styler to control the at least one setting of the hair styling device.
In another aspect, to control, in response to the identified dynamic gesture and the determined position of the user's head, at least one setting of the computing device may comprise determining whether the dynamic gesture began on the left- or right-hand side of the user's head, and in response to identifying the gesture made by a user and its starting position relative to the left- or right-hand side of the user's head, the processor is configured to control the at least one setting of the computing device.
In an aspect of the invention there is provided a hair styling system comprising a hair styling device, at least one ultra-wide band, UWB, sensor configured to transmit UWB radio pulses and to receive UWB radio signals from at least one radio frequency identification, RFID, tag on the hair styling device, at least one user interface for input of a desired hair style, and a computing device in communication with the at least one UWB sensor. The computing device comprises a processor configured to i) process signals obtained from the at least one UWB sensor to identify a movement of the hair styling device adjacent to the at least one UWB sensor, and ii) provide feedback to a user in response to the identified movement.
In another aspect, the processor is configured to process signals obtained from the at least one UWB sensor to identify a movement of the hair styling device adjacent to the at least one UWB sensor may comprise comparing the signals obtained from the at least one UWB sensor with a plurality of gesture recognition models.
In another aspect, comparing the signals obtained from the UWB sensor with a plurality of gesture recognition models may comprise comparing a pattern of the signals obtained from the at least one UWB sensor with each one of the gesture recognition models of the plurality of gesture recognition models, wherein each gesture recognition model comprises one or more pre-stored UWB radio signal patterns corresponding to a gesture.
In another aspect, the one or more pre-stored UWB radio signal patterns corresponding to a movement of the hair styling device comprises a distance-time signal graph.
In another aspect, to process signals obtained from the at least one UWB sensor to identify a movement of the hair styling device adjacent to the at least one UWB sensor may comprise comparing the signals obtained from the at least one UWB sensor with a plurality of gesture recognition models, wherein each one of the gesture recognition models is a convolutional neural network, CNN.
In another aspect, to provide feedback to a user in response to the identified movement may comprise comparing the identified movement of the hair styler device with a set of movements of the hair styler device required to achieve the desired hair style input at the at least one user interface, determining, based on the comparing, whether the identified movement of the hair styler device corresponds to at least one movement of the hair styler device required to achieve the desired hair style, and in response to the determining, the processor is configured to transmit a feedback command to the hair styler to provide feedback to a user in response to the identified movement. The feedback may comprise at least one of visual feedback provided via a display of the computing device, visual feedback provided via a display of the hair styler device, audio feedback provided via the computing device, haptic feedback provided via the hair styler and/or audio feedback provided via the hair styler (where the hair styler is provided with one or more speakers).
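By way of illustration only, the following Python sketch shows one way the comparison between an identified styler movement and the stored sequence of movements for a selected style could be implemented; the style name, movement labels and feedback commands are hypothetical and are not taken from the description.

```python
# Hypothetical sketch of the feedback logic described above: identified
# movements of the styler are compared against the stored sequence of
# movements associated with the selected style, and a feedback command
# is produced for each step. Names and movement labels are illustrative.

REQUIRED_MOVES = {
    "beach_waves": ["clamp", "rotate_180", "pull_through", "release"],
}

def feedback_for(identified_move: str, style: str, step: int) -> dict:
    """Return a feedback command for the styler / computing device."""
    expected = REQUIRED_MOVES[style][step]
    if identified_move == expected:
        return {"type": "haptic", "pattern": "single_pulse", "message": "Correct"}
    return {
        "type": "audio",
        "message": f"Expected '{expected}' but detected '{identified_move}'",
    }

print(feedback_for("rotate_180", "beach_waves", 1))   # positive feedback
print(feedback_for("pull_through", "beach_waves", 1)) # corrective feedback
```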
In another aspect, the identified movement of the hair styling device may comprise at least one of a rotating movement of the hair styler device, a unidirectional straight movement of the hair styler device, and/or an up-down movement of the hair styler device while arms of the hair styler device are open.
Brief Description of the Drawings
Embodiments of the invention will now be described, by way of example only, and with reference to the drawings in which:
Figure 1a illustrates a perspective overview of a hair styling device;
Figure 1b illustrates the device of Figure 1a when in use to style hair;
Figure 2 illustrates a hair styler system used by an end user to style their hair;
Figure 3a illustrates internal components/modules of an ultra-wide band (UWB) sensor of the hair styler system;
Figure 3b illustrates the circuitry of an ultra-wide band (UWB) sensor of the hair styler system;
Figure 4a illustrates an example static gesture and processing thereof to adjust a function of the hair styler;
Figure 4b illustrates an example dynamic gesture and processing thereof to adjust a function of the hair styler;
Figure 5 illustrates an example process flow diagram depicting the steps involved in gesture-based control of a hair styler 1 and/or computing device of the hair styler system;
Figure 6 illustrates an exemplary gesture-function mapping table stored in the memory of a computing device of the hair styler system;
Figure 7 illustrates an example use of a signal graph, generated through reflected UWB pulses, to determine a user's head position and a gesture with respect to the user's head position;
Figure 8 illustrates an example process flow diagram depicting the steps involved in another gesture-based control of a hair styler and/or computing device of the hair styler system;
Figure 9 illustrates an example use of a signal graph, generated through reflected UWB pulses, to determine a styling movement performed by a user of a hair styler;
Figure 10 illustrates an example process flow diagram depicting the steps involved in recognising a location and movement of a hair styler and providing appropriate feedback;
Figure 11 illustrates a table of example movements of a hair styler that may be performed by a user and the feedback that may be triggered by the computing device; and
Figure 12 illustrates another example movement of the hair styler 1 and processing thereof to provide user feedback.
Overview

Figure 1a illustrates a handheld (portable) hair styler 1. The hair styler 1 includes a first movable arm 4a and a second movable arm 4b, which are coupled at proximal ends thereof to a shoulder (or hinge) 2. The first arm 4a bears a first heater 6a at its distal end, and the second arm 4b bears a second heater 6b at its distal end. The first and second heaters 6a, 6b oppose one another and are brought together as the first and second arms 4a, 4b are moved from an open configuration to a closed configuration. As shown in Figure 1b, during use, a tress of hair 40 is sandwiched between the two arms 4 so that the user's hair is in contact with, and therefore heated by, outer heating surfaces of the heaters 6a, 6b. Therefore, as the user pulls the hair styler 1 along the tress of hair 40, the tress of hair 40 is heated by conductive heating to a suitable temperature to facilitate styling.
One or more user interfaces 11 are provided to allow the user to set user defined parameters and for the device to output information to the user. For example, a desired operating temperature may be set via the user interface 11. The user interface 11 may have a dial, button, or touch display for allowing the user to input information to the hair styler 1 and the user interface 11 may have an indicator light, display, sound generator or haptic feedback generator for outputting information to the user. In this embodiment, the user interface 11 also comprises a control button or switch 14 to enable the user to turn the hair styler 1 on or off; and an indicator light 15 to show whether the power is on.
A printed circuit board assembly (not shown) may be provided at any suitable location within the housing of the hair styler 1 and carries the control circuitry for controlling the operation of the hair styler 1 and for controlling the interaction with the user via the user interface 11. In this example, electrical power is provided to the hair styler 1 by means of a power supply located at an end of the device, via a power supply cord 3. The power supply may be an AC mains power supply. However, in an alternative embodiment the power supply may comprise one or more DC batteries or cells (which may be rechargeable, e.g., from the mains or a DC supply via a charging lead), thereby enabling the hair styler 1 to be a cordless product.
In use, the hair styler 1 is turned on, energising the heaters 6 to cause them to heat up.
The user then opens the first and second arms 4a, 4b and, normally starting from the roots of the hair (i.e., near the scalp), a length or tress of hair 40 (which may be clumped) is introduced between the arms 4a, 4b, transversely across the heaters 6a, 6b. The user then closes the arms 4a, 4b so that the length of hair 40 is held between the first and second arms 4a, 4b and then the user pulls the hair through the closed arms (as illustrated in Figure 1b). The outer (hair contacting) surface of the heaters 6 is flat in this embodiment and so the hair styler 1 can be used to straighten the user's hair. The hair styler 1 shown in Figure 1 can also be used to curl the hair by turning the hair styler 1 through approximately 180 degrees or more after clamping the hair between the arms 4a, 4b and before moving the hair styler 1 along the tress of hair 40.
The control of the functions and/or settings of such hair stylers 1 however can be cumbersome, especially where a user must control the functions and/or settings via an array of buttons and/or user interfaces of the hair styler 1. For example, where a user wishes to change a temperature of the hair styler, or trigger a heat/cold shot, the user may have to remove their hair from the hair styler to look at the available buttons and/or options on the user interface to change its settings. This in turn results in use of the hair styler 1 being less efficient than would be optimal, as the user may have to start-stop styling to change settings as they go.
Beneficially, by providing an Ultra-Wideband (UWB) sensor and computing device in the hair styling system herein described, a user is able to control functions and/or settings of the hair styler via gestures and movements of their body, e.g., their hand. Such manipulation of the functions and settings of the hair styler beneficially means that a user can adjust the functions and/or settings of the hair styler while in use, i.e., they no longer have to remove their hair from the hair styler to look at buttons and/or user interfaces. Furthermore, the hair styler can be controlled from a seated position without the user having to move around.
Further beneficially, by providing the UWB sensor and computing device in the hair styling system herein described a combination of a gesture and a location of a gesture relative to a part of a user's body (e.g., the head) can be used to control functions and/or settings of the hair styler, thereby beneficially providing a greater array of possible gestures that may be used to control the settings/function of the hair styler. That in turn means a greater level of functionality can be achieved.
When using hair stylers at home, users are less likely to attempt complex styles due to the complexity of the movements and/or sequence of movements that may need to be performed with the hair styler to achieve a desired style. This in turn means that such hair stylers are not used to their maximum potential.
Beneficially, by providing a UWB sensor, a computing device, and radio-frequency identification (RFID) tags in the hair styling system herein described, the system is able to determine/track movements of the hair styler being performed by a user and provide appropriate feedback to the user to assist in their efforts to style their hair in accordance with a desired style. For example, the computing device beneficially stores sets and/or patterns of movements that are necessary to achieve specific styles. Upon selection of a desired style via an appropriate user interface, the hair styling system compares detected movements of the hair styler with those stored at the computing device and associated with the desired style to ascertain whether the user is performing movements correctly. Based on that determination, the hair styler system beneficially provides feedback (e.g., audio, visual and/or haptic feedback) to the user to guide them in their styling technique. That in turn beneficially improves overall user experience of the hair styler product. It is also hoped that this will beneficially encourage users to attempt more complex styles at home with the hair styler.
Each configuration of the hair styler system briefly summarised above will now be discussed in more detail with reference to Figures 2 to 12.
Gesture Recognition

Figure 2 illustrates a hair styler system 200 in use by an end user to style their hair.
As seen in Figure 2, the hair styler system 200 comprises a handheld (portable) hair styler 1, at least one ultra-wide band (UWB) sensor 220, and a processor 214. Optionally, the processor 214 may be stored within the handheld (portable) hair styler 1. Additionally, where the processor 214 is stored within the handheld (portable) hair styler 1 it may form part of a computing device 210. Alternatively, the processor 214 may be external to the handheld (portable) hair styler 1. Additionally, where the processor 214 is external to the handheld (portable) hair styler 1 it may form part of a computing device 210. The handheld hair styler 1 may be any appropriate type of hair styler device such as that previously described with reference to Figures 1a and 1b. UWB is a short-range, wireless communication protocol that operates through radio waves at very high frequencies (e.g., between 3.1 and 10.6 GHz) in a wide band (e.g., a band of 500 MHz or greater). Unlike Wi-Fi and Bluetooth protocols, UWB is able to capture highly accurate spatial and directional data. The UWB sensor 220 is described in more detail with reference to Figure 3.
The computing device 210 may be any appropriate electronic device that can provide communication and data transfer between itself and the hair styler 1 and the UWB sensor 220. It will be appreciated that the computing device 210 may be an integral part of either the hair styler 1 or the UWB sensor 220. For example, appropriate circuitry that provides a memory 212, a processor 214, and a communication module 216 may be provided in the hair styler 1, alternatively it may be provided as part of the UWB sensor 220.
Alternatively, the computing device 210 may be an external device, e.g., a separate device from the hair styler 1 and the UWB sensor 220. For example, the computing device 210 may be a simple computer processing device with appropriate circuitry that provides a memory 212, a processor 214, and a communication module 216, such as a personal computer, tablet, laptop, smart screen, or the like.
In another example, the computing device 210 may be a user equipment (UE), such as a UE used in telecommunication systems, e.g., a smart phone. Where the computing device 210 is an external device such as a UE, it may be configured to communicate with the hair styler 1 and the UWB sensor 220 over any appropriate wireless communication links 224; 226 that can support communication between such devices. For example, they may communicate with one another over Wi-Fi, Bluetooth, or over any other radio access technology that supports communications between devices within close proximity of one another.
Nevertheless, it will be appreciated that where the computing device 210 is an external device such as a UE, it may alternatively be configured to communicate with the hair styler 1 and the UWB sensor 220 over a wired connection.
It will be appreciated that depending on the form of the computing device 210, it may be configured with several other appropriate modules. By way of example only, where the computing device 210 is a UE (e.g., a smart phone), the computing device 210 may be equipped with an audio module 218 and one or more speakers for providing sounds and/or music to a user during use of the hair styler 1.
As shown in Figure 2, during use, the UWB sensor 220 may transmit radio pulses 230 toward the user using the hair styler 1. For example, (and as described below in more detail), the UWB sensor 220 may emit radio pulses 230 across a wide spectrum frequency range (e.g., a band range of 500 MHz or greater) toward the user approximately once every 10 to 100 ms. Having emitted those radio pulses 230, the UWB sensor 220 may subsequently receive (detect) reflections of those radio pulses which are converted into reflected radio pulse signals by the processor.
Where the UWB sensor 220 is provided with its own processor, the UWB sensor 220 may process those reflected radio pulses to translate them into data (e.g., reflected radio pulse signals), and then communicate that data over communication link 224 with the computing device 210. It will be appreciated that the shorter the duration of the pulse, the more precisely the processor of the UWB sensor can determine distance measurements and thus spatial and directional information.
Alternatively, where the UWB sensor 220 does not include its own processing capabilities, the UWB sensor 220 may communicate appropriate information over communication link 224 to the computing device 210 to allow the computing device 210 to receive information on the reflected radio pulses to translate it into spatial and directional data.
The time of arrival (ToA) of each of those reflected radio pulse signals is dependent on the distance to the object from which the radio pulse was reflected. Accordingly, the reflected radio pulse signals may be used to determine relative distances between the hair styler 1 and objects in the vicinity of the hair styler 1.
Furthermore, it will be appreciated that the power (e.g., amplitude) of such reflected radio pulse signals will depend on various factors including the surface area (size) of the object from which the radio pulses are reflected. Accordingly, the reflected radio pulse signals may also be used to determine relative sizes of objects in the vicinity of the hair styler 1. By way of example only, an object may include a hand of a user, and the surface area of the user's hand may vary depending on a gesture being performed by the user e.g., a closed fist gesture will result in reflected radio pulse signals with a lower amplitude than if the gesture was an open palm gesture.
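By way of illustration only, the following Python sketch shows how a time of arrival could be converted into a range estimate and how echo amplitude could be used as a coarse indication of hand state; the amplitude threshold and normalisation are assumptions, not values taken from the description.

```python
# Illustrative calculation only: converts a measured time of arrival into a
# target distance (round trip halved) and flags whether an echo amplitude is
# more consistent with an open palm or a closed fist. Thresholds are invented.

C = 299_792_458.0  # speed of light, m/s

def distance_from_toa(toa_seconds: float) -> float:
    """Radar-style range estimate: the pulse travels out and back."""
    return C * toa_seconds / 2.0

def coarse_hand_state(echo_amplitude: float, open_palm_threshold: float = 0.6) -> str:
    # Larger reflecting surface -> larger echo amplitude (normalised 0..1 here).
    return "open palm" if echo_amplitude >= open_palm_threshold else "closed fist"

print(distance_from_toa(4e-9))                          # ~0.6 m for a 4 ns round trip
print(coarse_hand_state(0.8), coarse_hand_state(0.3))   # open palm, closed fist
```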
Each reflected radio pulse signal may comprise a linear combination of reflected radio pulses from N different paths (directions) and an additive noise term, over a predetermined period of time, such that it comprises N delayed and distorted signals that may, for example, be represented algebraically as:

x[n, k] = Σ_{i=1}^{N} a_{n,i} · s(n, k − τ_i) + w[n, k]

wherein: s(n, k) is the estimate of the transmitted pulse shape received at the receiver in the UWB sensor 220, which is usually distorted due to several different factors, such as the reflection, refraction, and scattering coefficients of objects; w[n, k] is the additive noise term; a_{n,i} is a scaling factor of the i-th reflected signal; and τ_i is the delay of the i-th path.
Once received, each reflected radio pulse signal may also have corrections applied to it where appropriate to remove 'clutter' from the signals. Clutter is defined as aspects of the reflected pulse signals (also referred to simply as 'reflected signals') that are caused by background objects, that is to say objects that are far away from the UWB sensor 220, e.g., walls in the background behind the user. By applying such corrections, the remaining features of each reflected radio pulse signal may be clearer, enabling the processor to identify the presence of objects (e.g., gestures) in the foreground of the hair styler 1 more clearly.
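A minimal sketch of the signal model and a clutter-removal step is given below, assuming synthetic pulse shapes, scaling factors and delays; subtracting the per-range-bin mean across frames is used here as one common clutter-removal approach for illustration and is not mandated by the description.

```python
import numpy as np

# Sketch of the multipath model above and a simple clutter-removal step.
# a[i], tau[i], and the noise level are synthetic; real reflections would come
# from the UWB receiver.

def synthesize_frame(pulse, a, tau, n_samples, noise_std=0.01):
    """x[k] = sum_i a_i * s(k - tau_i) + noise  (one slow-time frame)."""
    x = np.zeros(n_samples)
    for a_i, tau_i in zip(a, tau):
        end = min(n_samples, tau_i + len(pulse))
        x[tau_i:end] += a_i * pulse[: end - tau_i]
    return x + np.random.normal(0.0, noise_std, n_samples)

def remove_clutter(frames):
    """Subtract the per-range-bin mean: static background (walls) cancels,
    moving foreground objects (a hand) remain."""
    frames = np.asarray(frames)
    return frames - frames.mean(axis=0, keepdims=True)

pulse = np.hanning(8)                                    # stand-in for the transmitted pulse shape
static = synthesize_frame(pulse, [1.0], [60], 128)       # wall echo, fixed across frames
frames = [static + synthesize_frame(pulse, [0.5], [20 + t], 128) for t in range(10)]
print(remove_clutter(frames).shape)                      # (10, 128), background suppressed
```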
Based on the reflected radio pulse signals, the computing device 210 may (further) process the signals to determine a gesture of a user. For example, the processor 214 of the computing device 210 may compare the signals with one or more gesture recognition models stored within the memory 212 of the computing device 210 to determine whether a user has performed a specific gesture.
Having determined that a particular gesture has been performed by the user, the computing device 210 may determine one or more functions of the hair styler 1 that are to be adjusted based on the determined gesture. By way of example only, the computing device 210 may determine that a temperature of the hair styler 1 is to be adjusted based on the determined gesture. In response to determining that one or more functions of the hair styler 1 are to be adjusted, the computing device 210 may send an appropriate message and/or command to the hair styler 1 (e.g., over wireless communication link 226) to adjust the one or more functions of the hair styler 1.
Beneficially the system illustrated in Figure 2 allows the control of a hair styler 1 using gestures (e.g., hand gestures), rather than having to press buttons on the hair styler 1 or use a touch screen to make inputs. Beneficially, a user is thus able to control the hair styler 1 while continuing to style their hair (i.e., the styling procedure does not have to be interrupted). Furthermore, it is easier for a user to control the hair styler 1, while in use, without having to move from a seated position.
Figure 3a illustrates possible internal components/modules of a UWB sensor 220.
As shown in Figure 3a, the UWB sensor 220 implemented in the system shown in Figure 2 has a transceiver 312 that transmits UWB pulses and receives reflected UWB pulses in response. The transmitted pulses will typically have a duration of a few nanoseconds and accordingly have a wide spectrum of the order of 500 MHz to 1.5 GHz. The pulses are typically transmitted with a pulse repetition period of between 1 ns and 1 ms. The transceiver 312 also receives reflected radio pulses, reflected off the user and other objects in the vicinity of the UWB transceiver 312. It will be appreciated that while the UWB sensor 220 in Figure 3a is shown to have a transceiver 312 that facilitates both transmission and reception of radio pulses, the UWB sensor 220 may alternatively be provided with separate transmitter and receiver circuitry capable of transmitting and receiving pulses, respectively.
As shown in Figure 3a, the UWB sensor 220 may optionally be provided with its own processor 314. For example, the UWB sensor 220 may be provided with a processor 314 that may be capable of processing reflected radio pulses to translate them into data (e.g., spatial and directional data), and then communicate that data over communication link 224 with the computing device 210. Alternatively, the UWB sensor 220 may not be provided with its own processor 314, in which case the UWB sensor 220 may communicate appropriate information over communication link 224 to the computing device 210 to allow the computing device 210 to receive information on the reflected radio pulses to translate it into spatial and directional data.
Furthermore, as shown in Figure 3a, the UWB sensor 220 is provided with a communication module 316 to allow communication with one or more external devices. For example, the UWB sensor 220 may be in communication with one or more computing devices (not shown in Fig. 3a) and/or a hair styler 1. By way of example only, the UWB sensor 220 may communicate with one or more computing devices 210 (not shown) and/or a hair styler 1 over Wi-Fi, Bluetooth, or over any other radio access technology that supports communications between devices within close proximity of one another. Alternatively, the UWB sensor 220 may communicate with one or more computing devices 210 (not shown) and/or a hair styler 1 over a wired connection between the hair styler 1 and the computing device 210.
Figure 3b illustrates the circuitry of the UWB sensor 220.
As shown in Figure 3b, the UWB sensor 220 includes an oscillator 220-3 and a pulse generator 220-4 to generate ultra-wide band pulses of electromagnetic radiation. The ultra-wide band pulses may optionally be passed through a modulator 220-5 to prepare the pulses for transmission by the transmission (Tx) antenna 220-7. As shown in Figure 3b, prior to transmission by Tx antenna 220-7, the pulses may also be passed through a power amplifier 220-6 to increase the power (i.e., the amplitude) of the pulses.
The UWB sensor 220 also includes a receiver (Rx) antenna 220-8 for receiving reflected UWB signals. Once received by the Rx antenna 220-8, the reflected UWB signals are passed through a low-pass filter 220-9 to allow the processing of reflected UWB signals below a cutoff frequency threshold while attenuating all signals above this cutoff frequency threshold. The reflected UWB signals are also passed through a low-noise amplifier 220-10 to amplify very low-power signals without significantly degrading their signal-to-noise ratio (SNR). The reflected UWB signals (and/or information pertaining to those reflected UWB signals) are then processed by the processor 314. For example, reflected UWB signals (and/or information pertaining to those reflected UWB signals) may be processed by the processor 314 to determine a proximity of a user of the hair styler 1 to the hair styler 1 as will be described in more detail later.
Optionally, prior to being processed by the processor 314, the reflected UWB signals may also be passed through a correlator receiver 220-11 to correlate the reflected UWB signals with the UWB pulses transmitted by the Tx antenna 220-7. For example, the reflected UWB signals may be passed through a mixer 220-12 where the received signal is mixed with a version of the transmitted UWB pulses with the resulting mixed signal being integrated by the integrator 220-13 prior to being processed by the processor 314.
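The correlator-receiver arrangement can be illustrated digitally with the short Python sketch below; the template, noise level and sample counts are invented for illustration, and a real front end would perform the mixing and integration in hardware as described above.

```python
import numpy as np

# Digital sketch of the correlator-receiver idea described above: the received
# samples are multiplied (mixed) with a delayed copy of the transmitted pulse
# template and integrated; the delay giving the largest output indicates the
# echo's time of arrival.

def correlate_receiver(received, template):
    n, m = len(received), len(template)
    outputs = []
    for delay in range(n - m + 1):
        mixed = received[delay:delay + m] * template   # mixer
        outputs.append(mixed.sum())                    # integrator
    return int(np.argmax(outputs))                     # best-matching delay (samples)

template = np.hanning(16)
received = np.zeros(256)
received[100:116] = 0.4 * template                     # echo buried at sample 100
received += np.random.normal(0.0, 0.02, 256)
print(correlate_receiver(received, template))          # approximately 100
```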
Figure 4a illustrates an example static gesture and processing thereof to adjust a function of the hair styler 1.
As shown in Figure 4a, there is a UWB sensor 220 in communication with a computing device 210 as described above with reference to Figure 2. It will be appreciated that the computing device 210 may be external to both the UWB sensor 220 and the hair styler 1. Alternatively, the computing device may be housed within the hair styler 1. It will further be appreciated that where the computing device is external to the hair styler 1, the computing device 210 may communicate with the hair styler 1 over an appropriate communication link (e.g., communication link 226). The UWB sensor 220 is in communication with the computing device 210 over communication link 224.
As described briefly above, a user of the hair styler 1 may control one or more functions of the hair styler 1 via a gesture (e.g., a hand gesture) 410. Figure 4a shows, by way of example only, one possible gesture that a user may use to control one or more functions of the hair styler 1. The gesture 410 shown (hereafter referred to as 'Devil Horns' gesture) involves the user extending their first and last fingers of a hand while closing all other fingers and thumb towards their palm.
During use of the hair styler 1, the end user sits (or alternatively stands) in front of the at least one UWB sensor 220, which in turn emits UWB radio pulses 230 toward the user. Those UWB radio pulses 230 propagate through air and space, with their energy dissipating as a function of distance away from the UWB sensor 220. Upon reaching an object such as the user, a hand of the user, a wall behind the user, or the like, those UWB radio pulses 230 will be reflected (not shown) off the respective objects, back toward the UWB sensor 220, which in turn detects the reflected pulses in the form of reflected pulse signals as described above. Based on the reflected pulse signals, a processor 314 of the UWB sensor 220, or a processor 214 of the computing device 210, processes the signals (or information associated with them) to determine highly accurate spatial and directional data in the form of a signal graph 420 (e.g., a distance-amplitude signal graph).
Processing the reflected pulse signals may include any appropriate form of processing known in the art that allows the generation of graphs such as distance-amplitude signal graphs and/or distance-time signal graphs and/or sequences of distance-amplitude signal graphs over time, and the like.
As previously described, the amplitude of the reflected pulse signals will vary depending on the size (surface area) of the object(s) from which the pulses are reflected. For example, the distance-amplitude signal graph 420 shown in Figure 4a is one example of the signal amplitudes detected when the reflected pulse signals are a consequence of UWB pulses being reflected off a user's hand making a 'Devil Horns' gesture.
Once the spatial and directional data in the form of a signal graph 420 is determined, the computing device 210, which may have determined the signal graph 420 itself, or alternatively received the signal graph 420 from the UWB sensor 220, uses the signal graph 420 to determine the gesture being performed by a user.
By way of example only, the processor 214 of the computing device 210 may determine a gesture being performed by a user of the hair styler 1 based on comparing one or more aspects of the signal / signal graph 420 with tables stored within the memory 212 of the computing device 210. For example, there may be stored within the memory 212 of the computing device 210 one or more mapping tables that map specific amplitude values, or sequence of amplitude values, to specific gestures. The processor 214 may be able to process the signal / signal graph 420 to identify an aspect of the signal / signal graph 420 (e.g., an amplitude value at a distance assumed to correspond to a distance of a user from the sensor), and then lookup that aspect in the one or more mapping tables to identify a gesture being performed by a user of the hair styler 1.
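A minimal sketch of this lookup-table approach is shown below; the amplitude ranges and gesture names are hypothetical and would in practice be calibrated values stored in the memory 212.

```python
from typing import Optional

# Hypothetical amplitude-to-gesture lookup of the kind described above.
# The amplitude ranges are invented; in practice they would be calibrated
# per sensor and held in the computing device's memory 212.

AMPLITUDE_TO_GESTURE = [
    ((0.20, 0.40), "closed fist"),
    ((0.40, 0.65), "devil horns"),
    ((0.65, 1.00), "open palm"),
]

def lookup_gesture(amplitude_at_user_distance: float) -> Optional[str]:
    for (low, high), gesture in AMPLITUDE_TO_GESTURE:
        if low <= amplitude_at_user_distance < high:
            return gesture
    return None  # no gesture recognised at this amplitude

print(lookup_gesture(0.5))   # -> "devil horns"
```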
By way of another example, the processor 214 of the computing device 210 may compare the signal graph 420 with one or more gesture recognition models stored within the memory 212 of the computing device 210 to determine a gesture being performed by a user of the hair styler 1. For example, the memory 212 of the computing device 210 may store a plurality of recognition models, each associated with their own respective gesture. Each recognition model of the plurality of recognition models may comprise a signal graph 420 (or data corresponding to such a signal graph 420), that corresponds to a reflected UWB signal that would be expected to be detected by a receiver in the event that a particular gesture is performed in front of the UWB sensor 220.
Once the signal graph 420 has been compared with the one or more gesture recognition models stored within the memory 212 of the computing device 210, the processor 214 may determine a gesture being performed by the user. For example, when performing this comparison, the processor 214 compares the signal graph 420 with the one or more gesture recognition models to determine a matching metric for each comparison that is made. A matching metric may, by way of example only, comprise a percentage similarity value indicating a percentage similarity between the signal graph 420 and a gesture recognition model i.e., a percentage similarity between the signal graph 420 and the signal graph of the gesture recognition model. Nevertheless, it will be appreciated that the matching metric may be any appropriate metric that indicates a level of similarity or match between the signal graph 420 and gesture recognition model.
Based on the matching metrics, the processor 214 determines which gesture recognition model matches closest to the signal graph 420. For example, the processor may select the gesture recognition model associated with the largest percentage similarity value. By determining which gesture recognition model matches closest to the signal graph 420, the processor 214 in turn determines which gesture is being performed by the user, as each gesture recognition model is associated with their own specific gesture (e.g., 'Devil Horns' gesture).
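By way of illustration only, the following sketch compares an observed signal graph against each stored gesture recognition model using a percentage-similarity metric and selects the best match; the use of normalised correlation as the metric, the model contents and the 70% threshold are assumptions.

```python
import numpy as np

# Sketch of the matching step described above: compare the observed signal
# graph with each stored gesture model using a simple similarity metric
# (normalised correlation, expressed as a percentage) and pick the best match.

def percent_similarity(observed, model):
    observed, model = np.asarray(observed, float), np.asarray(model, float)
    denom = np.linalg.norm(observed) * np.linalg.norm(model)
    return 100.0 * float(observed @ model) / denom if denom else 0.0

def best_gesture(observed, models, threshold=70.0):
    scores = {name: percent_similarity(observed, graph) for name, graph in models.items()}
    name, score = max(scores.items(), key=lambda kv: kv[1])
    return name if score >= threshold else None

models = {
    "devil_horns": [0.1, 0.8, 0.2, 0.7, 0.1],
    "open_palm":   [0.2, 0.9, 0.9, 0.9, 0.2],
}
print(best_gesture([0.1, 0.7, 0.25, 0.75, 0.1], models))   # -> "devil_horns"
```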
Based on determining the gesture being performed by the user, the processor 214 determines a function of the hair styler 1 to adjust, and sends, over the communication link 226 with the hair styler 1, an appropriate message or command to adjust the corresponding function of the hair styler 1. For example, having determined that the user is performing a 'Devil Horns' gesture, the processor 214 may send an appropriate message or command to the hair styler 1 to adjust a temperature of the hair styler 1 e.g., each time the 'Devil Horns' gesture is detected the temperature of the hair styler 1 may be incrementally increased in steps such as 120 °C, 140 °C, 160 °C, 180 °C, 210 °C.
It will be appreciated that while the above describes a scenario where the temperature of the hair styler 1 may be incrementally increased in steps such as 120 °C, 140 °C, 160 °C, 180 °C, 210 °C, another gesture may also be configured (such as an upside-down 'Devil Horns' gesture, by way of example) such that when it is detected the temperature of the hair styler incrementally decreases in steps of, for example, 20 °C.
Furthermore, it will be appreciated that a further gesture may also be configured that triggers a 'hard stop' such that the temperature cannot be raised any further than its current temperature.
While the gestures described above with reference to Figure 4a relate to static gestures, it will be appreciated that the same system can be used to detect dynamic gestures. Further by providing multiple UWB sensors 220 at different positions in front of the user (e.g., one in front of the user, one to the left of the user, and one to the right of the user), the system can differentiate between left hand gestures and right-hand gestures which can allow the user to control more functions/settings of the hair styler 1.
Figure 4b illustrates an example dynamic gesture and processing thereof to adjust a function of the hair styler 1.
As shown in Figure 4b, there are three UWB sensors 220-1, 220-2, 220-3 in communication with a computing device 210 as described above with reference to Figure 2. The UWB sensors 220-1, 220-2, 220-3 are in communication with the computing device 210 over communication links 224-1, 224-2, 224-3 respectively. It will be appreciated that the computing device 210 may be external to both the UWB sensors 220 and the hair styler 1. Alternatively, the computing device may be housed within the hair styler 1. It will further be appreciated that where the computing device 210 is external to the hair styler 1, the computing device 210 may communicate with the hair styler 1 over an appropriate communication link (e.g., communication link 226).
As described briefly above, a user of the hair styler 1 may control one or more functions of the hair styler 1 via a gesture (e.g., a hand gesture) 430. Figure 4b shows, by way of example only, one possible gesture that a user may use to control one or more functions of the hair styler 1. The gesture 430 shown (hereafter referred to as 'Swipe Right' gesture) involves the user moving their hand in a direction from the left to the right.
During use of the hair styler 1, the end user sits in front of UWB sensors 220-1, 220-2, 220-3 positioned in front of, and to the left and right of the user, which in turn emit UWB radio pulses 230-1, 230-2, 230-3 toward the user. Those UWB radio pulses 230-1, 230-2, 230-3 propagate through air and space, with their energy dissipating as a function of distance away from the UWB sensors 220-1, 220-2, 220-3. Upon reaching an object such as the user, a hand of the user, a wall behind the user, or the like, the UWB radio pulses 230-1, 230-2, 230-3 will be reflected (not shown) off the respective objects, back toward the UWB sensors 220-1, 220-2, 220-3, which in turn detect the reflected pulses in the form of reflected pulse signals as described above.
However, unlike in the static gesture scenario, the reflected pulse signals received at the different UWB sensors 220-1, 220-2, 220-3 will change over time as the user moves their hand from the left to the right as shown in the distance-time graphs shown in Figure 4b.
Based on the reflected pulse signals, the processor 214 of the computing device 210 processes the reflected signals to determine highly accurate spatial and directional data in the form of a signal graph 320-1, 320-2, 320-3 from each UWB sensor 220-1, 220-2, 220-3 (e.g., the three distance-time signal graphs as shown in Figure 4b). In more detail, each time a UWB pulse is transmitted from the three UWB sensors 220-1 to 220-3, a new amplitude/distance plot (such as is shown in Figure 4a) is generated. This information is processed to identify the distance of specific objects, and the distance of those objects over time is determined by processing the signals received from successive UWB pulses that are transmitted by the UWB sensors 220. The pulse repetition frequency has to be sufficiently high to capture the desired motion. A pulse repetition frequency of the order of 10 kHz is sufficient to capture most human movements. The distance information contained in the reflected UWB signals relating to more static objects, such as the background walls or the user's head, can be filtered out, so that the determined distance-time signal graphs 320-1, 320-2, 320-3 contain the distance information of moving objects such as the user's hand. By processing the distance-time graphs 320-1, 320-2, 320-3 from the top UWB sensor, the left UWB sensor and the right UWB sensor, the processor 214 can determine if the object that is moving is on the left-hand side or the right-hand side of the user's head and how that object is moving over time relative to each of the UWB sensors 220. This information can then be used to select a corresponding control function to be performed.
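An illustrative sketch of this multi-sensor reasoning is given below; the distance-time tracks are synthetic, and the simple left/right and swipe tests are one possible interpretation of the processing described above rather than the claimed implementation.

```python
import numpy as np

# Each sensor yields a distance-time track for the moving object (static
# returns already filtered out); comparing the initial distances seen by the
# left and right sensors indicates on which side of the user's head the
# gesture began, and the trends indicate the direction of motion.

def gesture_side(left_track, right_track):
    """Closer initial range to the left sensor -> gesture started on the left."""
    return "left" if left_track[0] < right_track[0] else "right"

def is_swipe_right(left_track, right_track):
    """Hand moving away from the left sensor and toward the right sensor."""
    return left_track[-1] > left_track[0] and right_track[-1] < right_track[0]

t = np.linspace(0.0, 1.0, 20)
left_track = 0.3 + 0.5 * t             # hand recedes from the left sensor (metres)
right_track = 0.8 - 0.5 * t            # ...and approaches the right sensor
print(gesture_side(left_track, right_track))    # -> "left"
print(is_swipe_right(left_track, right_track))  # -> True
```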
Once the spatial and directional data in the form of signal graphs 320-1, 320-2, 320-3 is determined, the computing device 210 uses those signal graphs to determine a dynamic gesture being performed by a user by analysing the signal graphs 320-1, 320-2, 320-3 themselves as described above.
Alternatively, the computing device 210 may use the signal graphs 320-1, 320-2, 320-3 to determine a dynamic gesture being performed by a user by analysing the signal graphs 320-1, 320-2, 320-3 and comparing them with pre-stored gesture models as before in the static gesture case.
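Where the gesture recognition models are convolutional neural networks, they might take a form similar to the following sketch, which treats the stacked distance-time graphs from the three sensors as a three-channel input; the architecture, layer sizes and the use of PyTorch are assumptions for illustration only.

```python
import torch
import torch.nn as nn

# One possible shape for a CNN gesture-recognition model, treating the three
# stacked distance-time graphs as a 3-channel "image" (channels = sensors,
# height = range bins, width = time steps). Sizes are illustrative.

class GestureCNN(nn.Module):
    def __init__(self, n_gestures: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_gestures)

    def forward(self, x):                      # x: (batch, 3, range_bins, time_steps)
        return self.classifier(self.features(x).flatten(1))

model = GestureCNN()
dummy = torch.randn(1, 3, 64, 50)              # one synthetic distance-time stack
print(model(dummy).softmax(dim=-1))            # per-gesture probabilities
```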
Based on determining the gesture being performed by the user, the processor 214 determines a function of the hair styler 1, or of the computing device 210 to adjust, and sends an appropriate message or command to adjust that function of the hair styler 1 or that function of the computing device 210. For example, having determined that the user is performing a 'Swipe Right' gesture, the processor 214 may send an appropriate message to adjust a music track being played currently by the computing device 210.
The process of gesture recognition and setting adjustment of the hair styler 1 is further described below with reference to the process flow diagram of Figure 5.
Figure 5 illustrates an example process flow diagram depicting the steps involved in gesture-based control of a hair styler 1 and/or computing device 210.
At step S510, a UWB sensor 220 transmits UWB pulses toward a user operating a hair styler 1. The UWB sensor transmits radio pulses 230 approximately once every 10 to 100 ms across a wide spectrum frequency range. Those radio pulses propagate through the air toward a user operating the hair styler 1 and upon hitting an object (e.g., a hand of the user), the radio pulses are reflected off the object, back to the UWB sensor 220.
At step S520, the reflected UWB pulses are received by the UWB sensor 220.
At step S530, the reflected UWB pulses are either processed by a processor of the UWB sensor 220 or a processor of the computing device 210 to determine the gesture performed by the user.
At step S540, having determined the gesture being performed by a user, the computing device 210 sends an appropriate message or command to the hair styler 1 to adjust a function and/or setting of the hair styler 1 based on the gesture being performed. For example, the memory 212 of the computing device 210 may store a mapping table that maps specific gestures to specific functions and/or settings of the hair styler 1 that are to be adjusted when that specific gesture is detected. Having determined which gesture is being performed, the computing device 210 may look up the gesture in the mapping table stored in its memory 212 to determine the function and/or setting of the hair styler 1 that is to be adjusted based on the gesture. Having determined the function and/or setting, an appropriate message or command may be sent by the computing device 210 to the hair styler 1 over a communication link i.e., each gesture is associated with one of a plurality of control commands and in response to detecting a specific gesture, the processor is configured to transmit the corresponding control command to the hair styler 1.
Alternatively, or additionally, at step S540 having determined the gesture being performed by the user, the computing device 210 may determine one or more functions and/or settings of the computing device 210 to adjust based on the gesture being performed. For example, the memory 212 of the computing device 210 may store a mapping table that maps specific gestures to specific functions and/or settings of the computing device 210 (additionally or alternatively to specific functions and/or settings of the hair styler 1) that are to be adjusted when that specific gesture is detected. Having determined which gesture is being performed, the computing device 210 may look up the gesture in the mapping table stored in its memory 212 to determine the function of the computing device 210 that is to be adjusted based on the gesture. The computing device 210 may then adjust one or more of its functions and/or settings as appropriate. By way of example only, where the computing device 210 has an audio module 218 (e.g., where the computing device 210 is a smart phone), the computing device 210 may be able to play music while a user is styling their hair using the hair styler 1. As such, the user may use gestures to change the music track being played.
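A minimal sketch of the gesture-to-command mapping described above is given below. The gesture names, command strings and transport function are illustrative assumptions only; the dictionary stands in for the mapping table stored in the memory 212.

```python
# Minimal sketch: look a recognised gesture up in a mapping table and either
# send a command to the hair styler or adjust a function of the computing device.
from typing import Callable, Dict

def send_to_styler(command: str) -> None:
    # Placeholder for transmitting the command over the communication link.
    print(f"-> hair styler: {command}")

GESTURE_COMMANDS: Dict[str, Callable[[], None]] = {
    "devil_horns": lambda: send_to_styler("STEP_TEMPERATURE"),
    "finger_gun":  lambda: send_to_styler("HEAT_SHOT"),
    "swipe_left":  lambda: print("-> computing device: restart current track"),
    "swipe_right": lambda: print("-> computing device: skip current track"),
}

def handle_gesture(gesture: str) -> None:
    action = GESTURE_COMMANDS.get(gesture)
    if action is None:
        return  # unrecognised gestures are simply ignored
    action()

handle_gesture("finger_gun")
```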
An example of a gesture-function mapping table that may be stored in the memory 212 of the computing device 210 will now be described with reference to Figure 6.
Figure 6 illustrates an exemplary gesture-function mapping table that may be stored in the memory 212 of the computing device 210. As shown, the table 600 includes four example gestures that may be performed by a user to control and/or adjust a function and/or setting of the hair styler 1. In one example, the user may make a 'Devil Horns' gesture with one of their hands wherein the user extends the first and last finger of their hand while closing the remaining fingers and thumb.
In response to detecting the 'Devil Horns' gesture using the process described with reference to Figure 5, the computing device 210 may determine, based on the table 600, that the user wishes to control / adjust a temperature of the hair styler 1. By way of example only, each time the user makes the 'Devil Horns' gesture, the computing device 210 may determine that the user wishes to increase the temperature of the hair styler 1 by a set amount. As shown in table 600, the possible temperature settings of the hair styler 1 may be pre-configured in a specific pattern such that each time the 'Devil Horns' gesture is detected the computing device sends an appropriate message or command to the hair styler to change the temperature of the hair styler 1 to the next temperature in the pre-configured pattern of temperatures. By way of example only, the pre-configured pattern of temperatures may be {120 °C, 140 °C, 160 °C, 180 °C, 210 °C}. If the hair styler 1 is at the maximum temperature of 210 °C, then in response to detecting the 'Devil Horns' gesture, the computing device 210 may send an appropriate message or command to the hair styler 1 to loop the temperature back around to the beginning of the pre-configured pattern of temperatures e.g., 120 °C. It will be appreciated that the pre-configured pattern of temperatures given is by way of example only, and that other patterns are possible.
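The cycling through the pre-configured pattern of temperatures described above can be sketched as follows. The pattern values are taken from the example in the text; the function name is illustrative only.

```python
# Minimal sketch: each detection of the 'Devil Horns' gesture advances to the
# next temperature in the pre-configured pattern, wrapping back to the start
# after the maximum temperature.
TEMPERATURE_PATTERN_C = [120, 140, 160, 180, 210]

def next_temperature(current_c: int) -> int:
    try:
        index = TEMPERATURE_PATTERN_C.index(current_c)
    except ValueError:
        return TEMPERATURE_PATTERN_C[0]  # unknown setting: restart the pattern
    return TEMPERATURE_PATTERN_C[(index + 1) % len(TEMPERATURE_PATTERN_C)]

assert next_temperature(180) == 210
assert next_temperature(210) == 120  # loops back around to the beginning
```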
In another example, the user may make a 'Finger Gun' gesture with the index finger extended outwards and the remaining fingers and thumb closed into a fist. In response to detecting the 'Finger Gun' gesture using the process described with reference to Figure 5, the computing device 210 may determine, based on the table 600, that a user wishes to control a heat shot feature of the hair styler 1. By way of example only, in response to determining that a user has performed the 'Finger Gun' gesture the computing device 210 sends an appropriate message or command to the hair styler 1 to initiate a heat shot function (e.g., a rapid increase in the temperature of the hair styler 1 for a pre-determined period of time) of the hair styler 1 to inject heat into a user's hair.
In another example, the user may make a 'Hand Swipe' gesture with their hand. The 'Hand Swipe' gesture may comprise a swipe of the hand to the left, or a swipe of the hand to the right. Each will be discussed in turn. The user may make a 'Hand Swipe' gesture with their hand to the left (i.e., a 'Left Hand Swipe' gesture). In response to detecting the 'Left Hand Swipe' gesture using the process described with reference to Figure 5, the computing device 210 may determine, based on the table 600, that a user wishes to replay a current music track being played by the computing device 210. The computing device 210 may then restart the current music track being played.
In another example, the user may make another 'Hand Swipe' gesture with their hand.
For example, the user may make a 'Hand Swipe' gesture with their hand to the right (i.e., a 'Right Hand Swipe' gesture). In response to detecting the 'Right Hand Swipe' gesture using the process described with reference to Figure 5, the computing device 210 may determine, based on the table 600, that a user wishes to skip a current music track being played by the computing device 210. The computing device 210 may then skip the current music track being played.
In another example (not shown in table 600), the user may make confirmatory gestures with their hand such as a 'Thumbs-Up' gesture to confirm whether they wish to proceed with a step of a styling process or procedure. For example, when the user is practising with the hair styler 1 and the hair styler 1 is operating in a training mode, the hair styler 1 may not heat up its heating plates, so that no heat is applied to the user's hair. In response to detecting the 'Thumbs-Up' gesture, the computing device 210 may determine that a user wishes to switch from a training mode to a styling mode of the hair styler 1, and the computing device 210 sends an appropriate message or command to the hair styler 1 to initiate the styling mode, which may include initiating a heat-up procedure of the heating plates of the hair styler 1 to allow the application of heat to the user's hair.
In another example (not shown in table 600), the user may make negative gestures with their hand such as a 'Thumbs-Down' gesture to indicate that they are not ready to proceed with a step of a styling process or procedure. For example, when the user is practising with the hair styler 1 and the hair styler 1 is operating in a training mode, the hair styler 1 may not heat up its heating plates, so that no heat is applied to the user's hair. In response to detecting the 'Thumbs-Down' gesture, the computing device 210 may determine that a user wishes to remain in the training mode of the hair styler 1, and the computing device 210 sends an appropriate message or command to the hair styler 1 to indicate that the hair styler 1 is to remain in the training mode.
It will be appreciated that the mapping table 600 in Figure 6 is given by way of example only and that the system may be configured with any of a multitude of gestures that may be used to control and/or adjust a function and/or setting of the hair styler 1 or the computing device 210. For example, the system may be configured such that specific gestures may be used to adjust and/or control (to name but a few): a cold shot function of the hair styler 1, and a styling mode of the hair styler 1.
Head Position Detection
Additionally, or alternatively to the gesture recognition procedures described above, the hair styler system 100 comprising the handheld (portable) hair styler 1, the ultra-wide band (UWB) sensor 220, and the computing device 210, as shown in Figure 2, may also be configured to enable detection of a user's head position.
Figure 7 illustrates an example use of a signal graph, generated through reflected UWB pulses, to determine a user's head position and a gesture performed with respect to the determined position of the user's head.
As shown in Figure 7, there are three UWB sensors 220-1, 220-2, 220-3 in communication with a computing device 210 as described above with reference to Figures 2 & 3. The UWB sensors 220-1, 220-2, 220-3 are in communication with the computing device 210 over communication links 224-1, 224-2, 224-3 respectively. It will be appreciated that the computing device 210 may be external to both the UWB sensor 220 and the hair styler 1. Alternatively, the computing device may be housed within the hair styler 1. It will further be appreciated that where the computing device 210 is external to the hair styler 1, the computing device 210 may communicate with the hair styler 1 over an appropriate communication link (e.g., communication link 226).
As described briefly above, a user of the hair styler 1 may control one or more functions of the hair styler 1 via a gesture (e.g., a hand gesture) 430. Additionally, or alternatively, a user of the hair styler 1 may control one or more functions of the hair styler 1 via a combination of a gesture (e.g., a hand gesture) 430 and a position of the gesture (e.g., a position of the gesture with respect to a user's head).
Figure 7 shows, by way of example only, one possible gesture that a user may use to control one or more functions of the hair styler 1. The gesture 430 shown (hereafter referred to as 'Swipe Right' gesture) involves the user moving their hand in a direction from the left to the right.
During use of the hair styler 1, the end user sits in front of UWB sensors 220-1, 220-2, 220-3 positioned in front of, and to the left and right of the user, which in turn emit UWB radio pulses 230-1, 230-2, 230-3 toward the user. Those UWB radio pulses 230-1, 230-2, 230-3 propagate through air and space, with their energy dissipating as a function of distance away from the UWB sensor 220-1, 220-2, 220-3. Upon reaching an object such as the user, a hand of the user, a wall behind the user, or the like, the UWB signals will be reflected (not shown) off the respective objects, back toward the UWB sensor 220-1, 220-2, 220-3, which in turn detects the reflected pulses in the form of reflected pulse signals as described above.
Based on the reflected pulse signals, the processor 214 of the computing device 210 processes the reflected signals to determine highly accurate spatial and directional data in the form of a signal graph 320-1, 320-2, 320-3 from each UWB sensor 220-1, 220-2, 220-3 (e.g., the three distance-time signal graphs as shown in Figure 7). In more detail, each time a UWB pulse is transmitted from the three UWB sensors 220-1 to 220-3 a new amplitude/distance plot (such as is shown in Figure 4a) is generated. This information is processed to identify the distance of specific objects, and the distance of those objects over time is determined by processing the signals received from successive UWB pulses that are transmitted by the UWB sensors 220-1, 220-2, 220-3.
The pulse repetition frequency has to be sufficient to capture the desired motion; a pulse repetition frequency of the order of 10 kHz is sufficient to capture most human movements. The distance information contained in the reflected UWB signals relating to more static objects, such as the background walls or the user's head, can be filtered out, so that the determined distance-time signal graphs 320-1, 320-2, 320-3 contain the distance information of moving objects such as the user's hand. By processing the distance-time graphs 320-1, 320-2, 320-3 from the top UWB sensor, the left UWB sensor and the right UWB sensor, the processor 214 can determine if the object that is moving is on the left-hand side or the right-hand side of the user's head and how that object is moving over time relative to each of the UWB sensors 220. This information can then be used to select a corresponding control function to be performed.
Once the spatial and directional data in the form of signal graphs 320-1, 320-2, 320-3 is determined, the computing device 210 uses the signal graphs 320-1, 320-2, 320-3 to determine a dynamic gesture being performed by the user in one of the manners described above with reference to Figure 4b. For example, based on the signal graphs 320-1, 320-2, 320-3 the computing device 210 may determine that a left-to-right (or right-to-left) swiping gesture is being performed by the user in front of the UWB sensors 220-1, 220-2, 220-3.
As will be appreciated, the reflected pulse signals received at the different UWB sensors 220-1, 220-2, 220-3 that are a consequence of reflections from the user's hand moving in space will change over time as the user moves their hand from the left to the right as shown in the distance-time graphs 320-1, 320-2, 320-3 shown in Figure 7. However, the user's head, which is static, results in a static signal 322 in the distance-time graphs 320-1, 320-2, 320-3 e.g., a static portion of the reflected pulse signals that does not change over time. Accordingly, the location of the user's head may also be determined based on the distance-time graphs 320-1, 320-2, 320-3, as well as whether the movement of the user's hand started/finished on the left/right hand side of the user's head.
By way of another example only, the processor 214 of the computing device 210 may compare the signal graphs 320-1, 320-2, 320-3 with one or more gesture recognition models stored within the memory 212 of the computing device 210 to determine a gesture being performed by a user of the hair styler 1. For example, the memory 212 of the computing device 210 may store a plurality of recognition models, each associated with its own respective gesture. Each recognition model of the plurality of recognition models may comprise sets of signal graphs 320-1, 320-2, 320-3 (or data corresponding to such signal graphs), that correspond to the reflected UWB signals that would be expected to be detected by a receiver in the event that a particular dynamic gesture is performed in front of the UWB sensors 220-1, 220-2, 220-3.
Once the signal graphs 320-1, 320-2, 320-3 have been compared with the one or more gesture recognition models stored within the memory 212 of the computing device 210, the processor 214 may determine a gesture being performed by a user. For example, when performing the comparison, the processor 214 may compare the signal graphs 320-1, 320-2, 320-3 with the one or more gesture recognition models and a matching metric for each comparison may be determined. A matching metric may, by way of example only, comprise a percentage similarity value indicating a percentage similarity between the signal graphs 320-1, 320-2, 320-3 and a gesture recognition model, i.e., a percentage similarity between the signal graphs 320-1, 320-2, 320-3 and the signal graph of the gesture recognition model. Nevertheless, it will be appreciated that the matching metric may be any appropriate metric that indicates a level of similarity or match between the signal graphs 320-1, 320-2, 320-3 and a gesture recognition model.
Based on the matching metrics, the processor 214 determines which gesture recognition model matches closest to the signal graphs 320-1, 320-2, 320-3. For example, the processor may select the gesture recognition model associated with the largest percentage similarity value. By determining which gesture recognition model matches closest to the signal graphs 320-1, 320-2, 320-3, the processor 214 in turn determines which gesture is being performed by a user, as each gesture recognition model is associated with its own specific gesture (e.g., the 'Swipe Right' gesture).
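By way of an illustrative sketch only, and assuming each gesture recognition model is stored as a reference distance-time matrix with a normalised-correlation-based percentage similarity used as the matching metric, the selection of the best-matching gesture recognition model might look as follows. The model names and shapes are assumptions for illustration.

```python
# Minimal sketch: compute a percentage-similarity matching metric between the
# measured signal graphs and each stored gesture recognition model, then pick
# the model with the largest similarity value.
import numpy as np

def percentage_similarity(measured: np.ndarray, model: np.ndarray) -> float:
    a = (measured - measured.mean()) / (measured.std() + 1e-9)
    b = (model - model.mean()) / (model.std() + 1e-9)
    corr = float((a * b).mean())       # roughly in [-1, 1]
    return max(0.0, corr) * 100.0      # expressed as a percentage

def best_matching_gesture(measured: np.ndarray, models: dict) -> tuple:
    scores = {name: percentage_similarity(measured, ref) for name, ref in models.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

rng = np.random.default_rng(1)
models = {"swipe_right": rng.random((200, 128)), "devil_horns": rng.random((200, 128))}
measured = models["swipe_right"] + 0.05 * rng.random((200, 128))
print(best_matching_gesture(measured, models))  # expected: ('swipe_right', ...)
```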
Furthermore, based on the static signal 322 and the known positions of the UWB sensors 220-1, 220-2, 220-3, the processor 214 is also able to determine a location of the user's head, and the position at which the gesture is performed (when static), or starts and finishes (when dynamic), with respect to the user's head.
Based on determining the gesture being performed by the user, and in addition the position relative to the user's head at which the gesture was performed, the processor 214 determines a function of the hair styler 1, or a function of the computing device 210, to adjust, and sends an appropriate message or command to adjust that function of the hair styler 1 or that function of the computing device 210. For example, having determined that the user is performing a 'Swipe Right' gesture, the processor 214 may send an appropriate message to adjust a music track being played currently by the computing device 210.
Beneficially, by being able to determine which portions of the reflected signal correspond to reflections off the left and right side of the user's head respectively, the computing device 210 may be able to adjust and/or control one or more functions and/or settings of the hair styler 1 using more advanced gestures. For example, the computing device 210 may determine a location of a gesture relative to a user's head e.g., the computing device 210 may determine that a user is making the 'Devil Horns' gesture to the left of the user's head which may correspond to a request to control and/or adjust a temperature of the hair styler 1 by increasing the temperature. In a similar vein, the computing device 210 may determine that a user is making the 'Devil Horns' gesture to the right of the user's head which may correspond to a request to control and/or adjust a temperature of the hair styler 1 by decreasing the temperature.
It will be appreciated that the combination of being able to determine a gesture and a relative location of where that gesture is performed with respect to the right and left sides of the user's head, allows a greater number of gestures to be discriminated thereby allowing a greater number of functions or settings of the hair styler 1 and/or of the computing device to be controlled.
Figure 8 illustrates an example process flow diagram depicting the steps involved in another gesture-based control of a hair styler 1 and/or computing device 210.
At step S810, the UWB sensors 220-1, 220-2, 220-3 transmit UWB pulses toward a user operating a hair styler 1. The UWB sensors transmit UWB radio pulses 230 (e.g., approximately 1 pulse per ms) across a wide spectrum frequency range. Those radio pulses propagate through the air toward a user operating the hair styler 1 and, upon hitting an object (e.g., a hand of the user), the radio pulses are reflected off the object, back to the UWB sensors 220-1, 220-2, 220-3.
At step S820, the reflected UWB pulses are received by the UWB sensors 220-1, 220-2, 220-3.
At step S830, the reflected UWB pulses are processed by a processor into spatial and directional data such as a signal graph (S832).
At step S838, having determined a gesture being performed by a user, the computing device 210 analyses the signal in the determined signal graph to identify a portion of the signal corresponding to reflected UWB pulses reflected from the user's head. For example, the computing device 210 may determine that the portion of the signal where the signal has its greatest static amplitude that is unchanged over time corresponds to UWB pulses reflected from the user's head. For example, as the user's head is not moving, a static portion (amplitude) of the signal would be expected within the distance-time graph over a period of time, while the initiation of a new gesture would result in a new amplitude signal in the signal graph. Accordingly, when a new amplitude signal is observed in the signal graph, that new amplitude may be attributed to a new gesture that is being performed relative to the static portion (amplitude) of the signal corresponding to the user's head.
Once the portion of the signal associated with UWB pulses reflecting off a user's head has been identified, the computing device 210 can assign sections of the portions of the signal to the left and right side of the user's head, respectively. For example, assuming that the user is facing the UWB sensors 220-1, 220-2, 220-3 located to the left, centre and right of the user respectively, and the relative positions of those UWB sensors are known by the computing device 210, then using the identified portion of the signal associated with UWB pulses reflecting off a user's head, the computing device 210 can divide the signal into a first (left) and a second (right) section to the left and the right of the identified static signal. The first section of the signal will correspond to UWB pulses reflected from the left side of the user's head, while the second section will correspond to UWB pulses reflected from the right side of the user's head. In that way, the computing device 210 may be able to determine whether observed new gestures are being performed on the left and right side of a user's head, respectively.
Alternatively, where the gesture is a dynamic gesture moving from left-to-right or right-to-left, the computing device 210 may determine whether the gesture began/finished at the left/right-hand side of the user's head as described above with reference to Figure 7.
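An illustrative sketch of the head localisation and left/right assignment described above is given below. It assumes the distance-time data is available as an array of amplitudes per pulse and range bin, and that symmetrically placed left and right sensors are used to decide which side of the head a new return lies on; the threshold values and helper names are assumptions only.

```python
# Minimal sketch: locate the user's head as the strongest range bin whose
# amplitude stays essentially constant over time, and decide which side of the
# head a new return lies on by comparing left- and right-sensor ranges.
import numpy as np

def static_head_bin(frames: np.ndarray, variance_threshold: float = 0.01) -> int:
    """frames: (num_pulses, num_range_bins). Returns the strongest static bin."""
    variance = frames.var(axis=0)
    mean_amp = frames.mean(axis=0)
    static_bins = np.where(variance < variance_threshold)[0]
    if static_bins.size == 0:
        raise ValueError("no static return found")
    return int(static_bins[np.argmax(mean_amp[static_bins])])

def side_of_head(hand_range_left_m: float, hand_range_right_m: float) -> str:
    """A hand on one side of the head is closer to that side's sensor
    (assuming symmetrically placed left and right sensors)."""
    return "left" if hand_range_left_m < hand_range_right_m else "right"

rng = np.random.default_rng(3)
frames = 0.05 * rng.random((200, 128))
frames[:, 40] += 0.8              # constant, strong return: the user's head
frames[:, 70] += rng.random(200)  # fluctuating return: a moving hand
print(static_head_bin(frames))                                   # -> 40
print(side_of_head(hand_range_left_m=0.35, hand_range_right_m=0.80))  # -> left
```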
At step S840, having determined the gesture being performed by a user, and a location of the gesture with respect to the left- and right-hand side of a user's head (e.g., whether the gesture is on the left or right-hand side of the head, or whether the gesture began/finished at the left or right-hand side of the head), the computing device 210 sends an appropriate message or command to the hair styler 1 to adjust a function and/or setting of the hair styler 1 based on the gesture being performed. For example, the memory 212 of the computing device 210 may store a mapping table that maps specific gestures performed in specific locations to specific functions and/or settings of the hair styler 1 that are to be adjusted when that specific gesture is detected. Having determined which gesture is being performed, the computing device 210 may look up the gesture in the mapping table stored in its memory 212 to determine the function and/or setting of the hair styler 1 that is to be adjusted based on the gesture. Having determined the function and/or setting, an appropriate message or command may be sent by the computing device 210 to the hair styler 1 over a communication link; i.e., each gesture is associated with one of a plurality of control commands and, in response to detecting a specific gesture, the processor is configured to transmit the corresponding control command to the hair styler 1.
Alternatively, or additionally, at step S840 having determined a gesture being performed by a user, and a location of the gesture with respect to the left- and right-hand side of a user's head, the computing device 210 may determine one or more functions and/or settings of the computing device 210 to adjust based on the gesture being performed. For example, the memory 212 of the computing device 210 may store a mapping table that maps specific gestures performed in specific locations to specific functions and/or settings of the computing device 210 (additionally or alternatively to specific functions and/or settings of the hair styler 1) that are to be adjusted when that specific gesture is detected. Having determined which gesture is being performed, the computing device 210 may look up the gesture in the mapping table stored in its memory 212 to determine the function and/or setting of the computing device 210 that is to be adjusted based on the gesture.
Having determined the function and/or setting, the computing device 210 may adjust one or more of its functions and/or settings as appropriate.
Styling Recognition
Additionally, or alternatively to the gesture recognition procedures and gesture location determination procedures described above, the hair styler system 100 comprising the handheld (portable) hair styler 1, the UWB sensor 220, and the computing device 210, as shown in Figure 2, may also be configured to enable styling recognition and feedback, i.e., a process for recognising a styling movement performed by a user, and providing appropriate feedback.
Figure 9 illustrates an example use of a signal graph, generated through reflected UWB pulses, to determine a styling movement performed by a user of a hair styler 1.
As shown in Figure 9, there is a UWB sensor 220 in communication with a computing device 210 as described above with reference to Figures 2 & 3. The UWB sensor 220 is in communication with the computing device 210 over communication link 224. Although not shown, it will be appreciated that, as shown in Figure 2, the hair styler 1 is in communication with the computing device 210 over an appropriate communication link.
As described briefly above, it may be beneficial for the computing device 210 to be able to determine particular movements and/or actions that a user is making with the hair styler 1 as they style their hair, and to provide appropriate feedback to the user. For example, it may be beneficial to provide feedback to a user to indicate that they are moving their hair through the hair styler 1 too quickly or not quickly enough to achieve the desired style. Additionally, or alternatively, it may be beneficial to provide feedback to a user to indicate that a curling action of the user is being performed correctly or incorrectly to achieve the desired style.
Figure 9 shows, by way of example only, one possible movement that a user may make with the hair styler 1. The movement shown (hereafter referred to as 'Curling' movement) involves the user placing a tress of their hair in the hair styler 1 and then rotating the hair styler 1 to curl their hair.
Radio-frequency identification (RFID) tags
In one example, one or more radio-frequency identification (RFID) tags may be implemented in and/or on the hair styler 1. Such RFID tags typically comprise an integrated circuit (IC), an antenna and a substrate. They use electromagnetic fields to automatically identify and track objects to which the tags are attached (e.g., the hair styler 1).
During use of the hair styler 1, the end user sits in front of at least one UWB sensor 220 (which in this scenario may be considered an RFID reader or interrogator), which in turn emits UWB radio pulses 230 toward the user and the hair styler 1. Upon reaching the hair styler 1, those UWB radio pulses 230 activate the one or more RFID tags on the hair styler 1. Those RFID tags may be passive tags that receive their power from the UWB pulses (i.e., the electromagnetic wave induces a current in the RFID tag's antenna), or alternatively the RFID tags may be active tags that have their own power source such as a battery. Alternatively, the RFID tags may be active tags that are connected to the power supply/circuitry of the hair styling device. Preferably, the RFID tags on the hair styler 1 will be passive RFID tags.
Once activated, the one or more RFID tags may send signals back to the UWB sensor 220. For example, the one or more RFID tags may modulate the UWB signals they receive with their own unique ID (which may be an ID specific to the hair styler 1) such that the signals sent back to the UWB sensor 220 by the RFID tags carry ID information (e.g., hair styler ID information).
Once received, those signals sent back to the UWB sensor 220 by the RFID tags may be processed by the UWB sensor 220 itself if it has its own appropriate processor for recovering such data. Alternatively, the UWB sensor 220 may forward those signals, or send information about the signals, in an appropriate message, to the computing device 210 for the computing device 210 to process the signals and recover the data they contain. For example, as shown in Figure 9, as the hair styler 1 is pulled in a downward trajectory (e.g., as it is pulled through a tress of hair), the EM signals detected by the UWB sensor 220 from the RFID tags will change over time as the trajectory of the hair styler 1 progresses. As shown in the time-amplitude graph 920 that may be generated by the processor 214 based on the EM signals, the amplitude of the EM signals may change over time as the distance of the hair styler from the UWB sensor 220 changes. Accordingly, the computing device 210 may be able to use such EM signals and the generated time-amplitude graph 920 to determine a movement of the hair styler 1 relative to the UWB sensor 220. In addition, based on the UWB signals received by the UWB sensor 220 from the RFID tags, the computing device 210 may also determine the identity/type of hair styler 1 being used, and thus may be able to determine, based on the type of hair styler 1 being used, whether the movement of the hair styler 1 relative to the UWB sensor 220 is appropriate. For example, appropriate movements for specific types of hair stylers may be stored in the memory 212 of the computing device 210.
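By way of an illustrative sketch only, one possible way of turning the time-amplitude graph 920 into a coarse movement classification is shown below. The trend/oscillation heuristic and its thresholds are assumptions for illustration and are not taken from the description above.

```python
# Minimal sketch: infer a coarse styler movement from the per-frame amplitude
# of the RFID returns. A steady trend suggests the styler moving towards/away
# from the sensor (e.g., pulled down through a tress); an oscillating amplitude
# suggests a rotational, curling-style movement.
import numpy as np

def classify_styler_movement(amplitudes: np.ndarray) -> str:
    x = np.arange(len(amplitudes))
    slope, intercept = np.polyfit(x, amplitudes, 1)
    detrended = amplitudes - (slope * x + intercept)
    if detrended.std() > 0.2:           # oscillating amplitude -> rotation
        return "curling (rotational) movement"
    if abs(slope) > 0.001:              # monotonic trend -> straight pull
        return "straight pull " + ("away from the sensor" if slope < 0 else "towards the sensor")
    return "no significant movement"

t = np.linspace(0, 2 * np.pi, 100)
print(classify_styler_movement(1.0 - 0.3 * np.linspace(0, 1, 100)))  # straight pull
print(classify_styler_movement(0.8 + 0.5 * np.sin(4 * t)))           # curling movement
```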
Having determined that the user is moving the hair styler 1 in a particular way, appropriate feedback may be provided to the user (e.g., visual, audio, and/or haptic feedback) to indicate whether or not the user is using the hair styler 1 correctly to achieve a desired style.
By way of example only, during initiation of the hair styler 1, the user may input into the hair styler a particular hair style that they wish to achieve. Such input may occur via one or more buttons on the hair styler 1, and/or via a touch display or other appropriate user interface of the hair styler 1, and/or via a touch display or other appropriate user interface of the computing device 210 (e.g., a phone or computing tablet). Once selected, the UWB sensor 220 and RFID tags may be used as described above to track the movement of the hair styler 1 to ensure that a user is using the hair styler 1 correctly to achieve the desired style. When it is determined that the user is not moving the hair styler 1 correctly to achieve the desired style (for example, the detected movement may be compared with a set of expected movements needed to achieve the desired style, which may be stored in a memory 212 of the computing device 210), then the computing device 210 may send an appropriate message to the hair styler 1 to provide feedback to the user e.g., haptic feedback such as a vibration of the hair styler 1 and/or audio feedback from the hair styler 1 where the hair styler 1 is provided with one or more speakers. Additionally, or alternatively, the computing device 210 itself may provide feedback to the user such as sounds (if the computing device 210 has speakers) or visual indications via a display of the computing device 210 (e.g., a screen of a smartphone).
It will be appreciated that the determination of the rotational movement of the hair styler 1 and the corresponding feedback provided as described above is given by way of example only and that other movements may also be determined, with different types of feedback being provided for each movement.
The process of styling recognition and providing appropriate feedback is further described below with reference to the process flow diagram of Figure 10.
Figure 10 illustrates an example process flow diagram depicting the steps involved in recognising a location and movement of a hair styler 1 and providing appropriate feedback based on a hair style desired by the user.
At step S1010, a user of the hair styler 1 inputs, via any appropriate user interface of the hair styler 1 (or the computing device 210), an input to indicate a desired hair style to be achieved using the hair styler 1. It will be appreciated that the possible hair styles that may be selected by a user may be pre-configured in the hair styler 1 and the movements needed to achieve that style may be pre-stored in a memory of the hair styler 1 and/or an app of the hair styler 1. Alternatively, where the hair styler 1 is configured to communicate with a computing device 210, the hair styler 1 may request, via an appropriate message, the computing device 210 to search the Internet and download a set of movements needed to achieve a particular hair style, which may in turn be stored in the memory of the hair styler 1 and/or the memory of the computing device 210.
At step S1020, one or more UWB sensors 220 transmit UWB pulses toward a user operating a hair styler 1. The UWB signals propagate through the air toward a user operating the hair styler 1, and upon reaching the hair styler 1 those UWB signals energise one or more RFID tags on the hair styler 1.
At step S1030, in response to the pulses from the UWB sensor 220 energising the one or more RFID tags on the hair styler 1, the one or more RFID tags transmit one or more electromagnetic EM signals (i.e., EM pulses) back to the UWB sensor 220. Those EM signals are generated and transmitted by the RFID tag in response to being in the vicinity of the UWB sensor (i.e., in response to receiving UWB pulses from the UWB sensor 220). For example, the one or more RFID tags may modulate the UWB signals they receive with their own unique ID (which may be an ID specific to the hair styler 1) such that the signals sent back to the UWB sensor 220 by the RFID tags carry ID information (e.g., hair styler ID information).
If the RFID tag is an active tag, then the EM signals are generated and transmitted using an energy source such as a battery. Preferably, however, the RFID tags are passive tags, and thus the EM signals are generated using radio energy from the transmitted UWB pulses received from the UWB sensor 220.
At step S1040, the UWB sensor 220 processes the EM signals received from the one or more RFID tags, or alternatively, the UWB sensor 220 may transmit appropriate information pertaining to the EM signals it receives to the computing device 210 to process the EM signals from the one or more RFID tags.
Processing the EM signals at step S1040 may include determining, based on the EM signals, a movement of the hair styler 1 (S1044). For example, the computing device 210 may process the EM signals (or any appropriate information about the EM signals received from the UWB sensor 220) to determine, based on e.g., changes in the properties of such EM signals over time, a movement of the hair styler 1 e.g., a rotation of the hair styler 1, a downward/upward movement of the hair styler 1, and the like. In addition, as already described above, processing the EM signals may also include processing the EM signals to recover data modulated in the signals such as an ID specific to the hair styler 1.
At step S1050, the computing device 210 may compare the determined movement of the hair styler 1 with a set of movements necessary to achieve the desired style input by the user at step S1010. For example, the determined movement of the hair styler 1 may be compared with the movements needed to achieve the desired style that are stored in a memory of the computing device 210 or downloaded from the Internet. By way of example only, where a user wishes to achieve a particular style that comprises hair curls, the computing device 210 may compare determined rotational movements of the hair styler 1 with rotational movements needed to achieve the desired style that are stored in a memory of the computing device 210 or downloaded from the Internet to determine if the user is rotating the hair styler 1 correctly e.g., rotating in the correct direction.
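A minimal sketch of the comparison at step S1050 is given below. The style names, movement labels and feedback strings are illustrative assumptions only and are not taken from the description above.

```python
# Minimal sketch: compare the detected movement against the pre-stored sequence
# of movements associated with the selected style and decide what feedback to trigger.
EXPECTED_MOVEMENTS = {
    "loose_curls": ["clamp_tress", "rotate_clockwise", "pull_down_slow"],
    "straight":    ["clamp_tress", "pull_down_slow"],
}

def feedback_for(desired_style: str, detected_movement: str, step_index: int) -> str:
    expected = EXPECTED_MOVEMENTS.get(desired_style, [])
    if step_index >= len(expected):
        return "style complete"
    if detected_movement == expected[step_index]:
        return "ok"  # no corrective feedback needed
    return f"expected '{expected[step_index]}', detected '{detected_movement}' - trigger haptic/audio warning"

print(feedback_for("loose_curls", "rotate_anticlockwise", step_index=1))
```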
At step S1060, having compared the determined rotational movements of the hair styler 1 with rotational movements needed to achieve the desired style that are stored in a memory of the computing device 210 or downloaded from the Internet to determine if the user is rotating the hair styler 1 correctly (e.g., rotating in the correct direction), the computing device 210 may provide appropriate feedback to the user (e.g., audio, haptic and/or visual feedback) via the computing device 210 or the hair styler 1. For example, if it is determined by the computing device 210 that the user is not rotating the hair styler 1 in the correct direction, the computing device 210 may send an appropriate message or command to the hair styler 1 to activate a haptic unit in the hair styler 1 such that the hair styler 1 vibrates to indicate to the user that they are making a mistake; i.e., each gesture/movement is associated with one of a plurality of feedback commands and, in response to detecting a specific gesture/movement, the processor is configured to transmit the corresponding feedback command to the hair styler 1 to trigger appropriate feedback.
In another example, if it is determined by the computing device 210 that the user is not rotating the hair styler 1 in the correct direction, the computing device 210 may send an appropriate message or command to the hair styler 1 to activate a user interface or display on the hair styler 1 and display a message on that display informing the user that they are making a mistake. Alternatively (or additionally), where the hair styler is provided with one or more speakers, the computing device 210 may send an appropriate message or command to the hair styler 1 to trigger the one or more speakers to play an error or warning sound to inform the user that they are making a mistake.
In another example, if it is determined by the computing device 210 that the user is not rotating the hair styler 1 in the correct direction, the computing device 210 may display a message on a display of the computing device 210 informing the user that they are making a mistake.
In another example, if it is determined by the computing device 210 that the user is not rotating the hair styler 1 in the correct direction, the computing device 210 may display an image or message or play a sound (if the computing device 210 is equipped with an audio module 218 and speakers) informing the user that they are making a mistake.
It will be appreciated that the examples given above are non-exhaustive and that other forms of appropriate feedback via the hair styler 1, computing device 210, or both the hair styler 1 and the computing device 210 may be provided to a user in response to the computing device 210 determining a particular movement of the hair styler 1 and comparing that movement with a set of movements necessary to achieve a desired hair style.
An example set of movements that may be performed by the user and the corresponding feedback that may be triggered by the computing device will now be described with reference to the table in Figure 11.
Figure 11 illustrates a table of example movements that may be performed by the user and the feedback that may be triggered by the computing device 210 in response to determining those movements and comparing them with the set of movements needed to achieve a desired hair style. As shown, the table 1100 includes four example movements that may be performed by a user, and corresponding feedback that may be triggered. In one example, the movement is a straight movement of the hair styler 1 e.g., a straight, unidirectional, downward movement that may be performed by a user when they are running a tress of hair through the hair styler 1 to straighten their hair.
Having determined that the user is moving the hair styler 1 in a straight movement, the computing device 210 compares that determined movement of the hair styler 1 with a set of movements necessary to achieve the desired hair style. For example, the computing device 210 may compare the determined movement with movements necessary to achieve the desired hair style input by the user at step S1010. As described above, the movements necessary to achieve the desired hair style may be stored in a memory of the computing device 210 or may be downloaded into the memory of the computing device 210 once the user inputs the desired hair style at step S1010.
Based on the comparison of the determined movement and the movements necessary to achieve the desired hair style, the computing device 210 may trigger appropriate feedback to the user. For example, the computing device 210 may indicate to the user whether they should/should not be performing the movement and/or indicate whether they are making the movement too slowly or too quickly. As described above, the computing device 210 may provide feedback in the form of visual or audio feedback.
Alternatively, or additionally, the computing device 210 may send an appropriate message or command to the hair styler 1 over a communication link to trigger the hair styler 1 to provide appropriate feedback e.g., visual, or haptic feedback.
In another example, the movement is a curling movement (e.g., a rotational movement of the hair styler 1) to curl a tress of hair in the hair styler 1. Having determined that the user is moving the hair styler 1 in a curling movement, the computing device 210 compares that determined movement of the hair styler 1 with a set of movements necessary to achieve the desired hair style. For example, the computing device 210 may compare the determined movement with movements necessary to achieve the desired hair style input by the user at step S1010. As described above, the movements necessary to achieve the desired hair style may be stored in a memory of the computing device 210 or may be downloaded into the memory of the computing device 210 once the user inputs the desired hair style at step S1010. Based on the comparison of the determined movement and the movements necessary to achieve the desired hair style, the computing device 210 may trigger appropriate feedback to the user. For example, the computing device 210 may indicate to the user whether they should/should not be performing the movement and/or indicate whether they are performing the curling movement in the correct direction or not. As described above, the computing device 210 may provide feedback in the form of visual or audio feedback. Alternatively, or additionally, the computing device 210 may send an appropriate message or command to the hair styler 1 over a communication link to trigger the hair styler 1 to provide appropriate feedback e.g., visual, or haptic feedback.
In another example, the movement may be a random, non-specific movement that is not useful for styling.
Having determined that the user is moving the hair styler 1 in a non-specific direction, the computing device 210 compares that determined movement of the hair styler 1 with a set of movements necessary to achieve the desired hair style. For example, the computing device 210 may compare the determined movement with movements necessary to achieve the desired hair style input by the user at step S1010. As described above, the movements necessary to achieve the desired hair style may be stored in a memory of the computing device 210 or may be downloaded into the memory of the computing device 210 once the user inputs the desired hair style at step S1010. As described above, that computing device 210 and its respective memory may be housed in/on the hair styler 1 itself, or alternatively may be a separate external computing device such as a phone, tablet, or the like.
Based on the comparison of the determined movement and the movements necessary to achieve the desired hair style, the computing device 210 may trigger appropriate feedback to the user. For example, where the movement does not correspond with any of the movements necessary to achieve the desired hair style, the computing device 210 may provide feedback indicating that the movement is not a recognised movement, and/or not a movement necessary to achieve the desired hair style (e.g., the computing device 210 may play a random sound such as a 'magic wand' sound as if the user just cast a spell). As described above, the computing device 210 may provide feedback in the form of visual or audio feedback. Alternatively, or additionally, the computing device 210 may send an appropriate message or command to the hair styler 1 over a communication link to trigger the hair styler 1 to provide appropriate feedback e.g., visual, or haptic feedback.
In another example, the hair styler 1 is operating in a training mode to allow the user of the hair styler 1 to practise specific movements that may be required to achieve a desired hair style. For example, the user may switch the hair styler 1 to a training mode by an appropriate input via at least one user interface of the hair styler 1. While in the training mode the user may be able to input a desired hair style to achieve; however, the hair styler 1 will not heat up the heating plates of the hair styler 1, so that the user can practise the movements needed to achieve the desired hair style without actively styling their hair. While in the training mode, a movement of the hair styler 1 may include movement of the hair styler 1 in an up-down movement while the arms 4a, 4b of the hair styler 1 are open. Having determined that the user is moving the hair styler 1 in an up-down movement with the arms 4a, 4b of the hair styler 1 open, the computing device 210 may determine that the user wishes to repeat the movement performed immediately before the movement of the hair styler 1 in the up-down movement. Beneficially, this allows the user to practise movements of the hair styler 1 over and over again until they have perfected the movement.
It will be appreciated that the table 1100 in Figure 11 is by way of example only and that the system may be configured to detect any of a multitude of movements and provide any of a multitude of corresponding feedback.
Modifications and Alternatives
Detailed embodiments have been described above. As those skilled in the art will appreciate, a number of modifications and alternatives can be made to the above embodiments whilst still benefiting from the inventions embodied therein. By way of illustration only, a number of these alternatives and modifications will now be described.
Deep-Learning & Neural Networks
In some of the embodiments described above, the processing of the signal graphs to determine gestures included comparing signal graphs with one or more gesture recognition models stored within a memory 212 of the computing device 210. For example, each gesture recognition model may comprise a signal graph of an expected reflected UWB pulse signal for a specific gesture, and the computing device may compare the generated signal graph with each of the signal graphs of the gesture recognition models to determine a gesture being performed by a user.
It will nevertheless be appreciated that the gesture recognition models may take any other appropriate form that allows a gesture to be recognised using the reflected signals detected by the UWB sensors. For example, each gesture recognition model may comprise a pre-trained convolutional neural network (CNN) or some other form of deep-learning neural network.
By way of example only, each gesture recognition model may comprise a CNN comprising at least one convolutional layer, at least one pooling layer, and at least one fully connected layer. Additionally, the CNN may have flattening layers, MaxPooling layers, SoftMax layers, and other non-linearity layers where appropriate such as Sigmoid layers, Tanh layers, and/or ReLU layers.
Each CNN, once trained using training datasets, may be used to classify reflected signals detected by the UWB sensors into specific gestures. Additionally, the CNN may be further trained and refined during use of the CNN using 'live' reflected signals detected by the UWB sensors to further refine its classification capabilities. Training of such a CNN would be relatively straightforward, involving supplying many different signal graphs obtained when the known and different gestures are performed. Alternatively, a CNN may be generated specific to each gesture. In this case the signal graph generated would be applied to each CNN and each CNN would output a value indicating how close the signal graph is to the signal graphs that were used to train the CNN. The gesture associated with the CNN giving the highest value would then be determined to be the gesture performed by the user and the appropriate control action taken.
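By way of an illustrative sketch only, a small CNN of the kind described above might be arranged as follows (shown here in PyTorch, with the three distance-time graphs stacked as a three-channel input). The layer sizes, input resolution and number of gestures are assumptions for illustration, not the network of the described embodiments.

```python
# Minimal sketch: a multi-class CNN over stacked distance-time graphs.
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    def __init__(self, num_gestures: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # 3 channels = 3 UWB sensors
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64),   # assumes 64x64 input graphs
            nn.ReLU(),
            nn.Linear(64, num_gestures),   # logits; softmax applied at inference
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = GestureCNN()
batch = torch.randn(8, 3, 64, 64)           # 8 example distance-time graph stacks
probs = torch.softmax(model(batch), dim=1)  # per-gesture probabilities
print(probs.shape)                          # torch.Size([8, 4])
```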
Pattern Recognition
It will be appreciated that while some of the examples described above make use of RFID tags and the UWB sensor 220 to determine the location and movement of the hair styler 1, other methods of detecting the location and movement of the hair styler 1 are possible. For example, rather than RFID tags and a UWB sensor 220, a camera may be provided, in communication with the computing device 210, that is able to track the location and movement of the hair styler 1.
Figure 12 illustrates an example movement of the hair styler 1 and processing thereof to provide user feedback.
As shown in Figure 12, there is a camera 1220 in communication with a computing device 210 as described above with reference to Figure 2. The camera 1220 is in communication with the computing device 210 over communication link 1224. Although not shown, it will be appreciated that, as shown in Figure 2, the hair styler 1 is in communication with the computing device 210 over an appropriate communication link.
Figure 12 shows, by way of example only, one possible movement of the hair styler 1 that a user may carry out. The movement shown involves the user rotating the hair styler to facilitate curling of a tress of hair when placed in the hair styler 1.
During use of the hair styler 1, the end user sits in front of the camera 1220, which captures image data of the user and their use of the hair styler. The computing device 210 in communication with the camera may continuously, or periodically, process/analyse the image data generated by the camera 1220 to determine patterns of movements of the hair styler 1. For example, the computing device 210 may store in its memory 212 sets of image data that corresponds to particular movements of the hair styler e.g., the memory 212 of the computing device 210 may store a sequence of images of the hair styler 1 in different positions as it is rotated over time. The processor of the computing device 210 may compare image data captured by the camera 1220 to image data stored in its memory to determine a movement of the hair styler 1. By way of example, the processor may compare the image data with each sequence of images (or image data) stored in the memory of the computing device 210 and assign a matching metric for each comparison, the matching metric indicating a degree or percentage similarity between the captured image data and specific image data stored in the memory 212 of the computing device 210.
Once the comparisons have been made, the computing device 210 determines a pattern of movements being performed by a user. For example, the computing device 210 may determine the pattern of movements being performed based on which sequence of images (or image data) stored in the memory of the computing device 210 are assigned the highest matching metric when compared with the image data captured by the camera 1220.
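An illustrative sketch of the image-sequence comparison described above is given below; the frame sizes, reference sequences and similarity metric are assumptions for illustration only.

```python
# Minimal sketch: compare a captured sequence of frames against stored
# reference sequences for known styler movements and select the pattern with
# the highest matching metric (here, a mean per-frame similarity).
import numpy as np

def sequence_similarity(captured: np.ndarray, reference: np.ndarray) -> float:
    """captured, reference: (num_frames, height, width) greyscale image stacks."""
    n = min(len(captured), len(reference))
    diffs = np.abs(captured[:n].astype(float) - reference[:n].astype(float))
    return float(1.0 - diffs.mean() / 255.0)  # 1.0 = identical, 0.0 = maximally different

def recognise_movement(captured: np.ndarray, references: dict) -> str:
    scores = {name: sequence_similarity(captured, ref) for name, ref in references.items()}
    return max(scores, key=scores.get)

rng = np.random.default_rng(2)
references = {"rotate_clockwise": rng.integers(0, 256, (30, 64, 64)),
              "pull_down": rng.integers(0, 256, (30, 64, 64))}
captured = references["rotate_clockwise"].copy()
print(recognise_movement(captured, references))  # -> rotate_clockwise
```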
Once the pattern of movements has been determined, the computing device 210 compares the determined pattern of movements of the hair styler 1 with a pattern of movements necessary to achieve the desired style input by the user (which may also be stored in the memory 212 of the computing device 210). For example, the determined pattern of movements of the hair styler 1 may be compared with the pattern of movements needed to achieve the desired style that are stored in a memory of the computing device 210 or downloaded from the Internet.
By way of example only, where a user wishes to achieve a particular style that comprises hair curls, the computing device 210 may compare the determined pattern of movements of the hair styler 1 with rotational movements needed to achieve the desired style that are stored in a memory of the computing device 210 or downloaded from the Internet to determine if the user is rotating the hair styler 1 correctly e.g., rotating in the correct direction.
Having compared the determined pattern of movements of the hair styler 1 with rotational movements needed to achieve the desired style that are stored in a memory of the computing device 210 or downloaded from the Internet to determine if the user is rotating the hair styler 1 correctly (e.g., rotating in the correct direction), the computing device 210 may provide appropriate feedback to the user (e.g., audio, haptic and/or visual feedback) via the computing device 210 or the hair styler 1. For example, if it is determined by the computing device 210 that the user is not rotating the hair styler 1 in the correct direction, the computing device 210 may send an appropriate message or command to the hair styler 1 to activate a haptic unit in the hair styler 1 such that the hair styler 1 vibrates to indicate to the user that they are making a mistake.
It will be appreciated that the example given above is non-exhaustive and that other forms of appropriate feedback via the hair styler 1, computing device 210, or both the hair styler 1 and the computing device 210 may be provided to a user in response to the computing device 210 determining a particular pattern of movements of the hair styler 1 and comparing that pattern of movements with one or more patterns of movements necessary to achieve a desired hair style.
Other Alternatives
For the purposes of simplicity, the above description focuses on hair care products and particularly a hair styling device. However, it will be appreciated that the above-described concepts may be applied widely to any hair styling product and/or other beauty products/devices in the beauty industry, including, for example: hair dryers, curling tongs/wands, hair straighteners, nail gel/varnish curers (such as UV lamp systems for the curing of nail varnish), skin epilators, hair colouring devices, crimpers, etc.
In the above embodiments, the beauty product device communicated with a smart processing device. This is not essential; the beauty product device may be provided with all the processing functionality of the processing device.
In the above examples, the feedback messages were provided to the user by way of the user interface on the beauty product device or on the processing device. This is not essential. The feedback messages may be provided by any suitable user interface of any nearby device. For example, they may be sent to an Amazon Echo speaker device for playout to the user as voice messages or displayed to the user on a television screen, a mirror, or the like.
The method of communication between the hair styler and the computing device could be via a cable or wireless means. Examples of applicable wireless communications include Bluetooth, LoRa, ZigBee, the 802.15 standard, NFC, or optical means, both visible and IR.
In the examples given above, various specific temperatures and power levels were discussed. As those skilled in the art will appreciate, all these specific values are clearly not essential to the invention and the particular values used in a given product will depend on the treatment to be given, the voltage sources used, etc.
In the above embodiments, a number of software modules were described. As those skilled in the art will appreciate, the software modules may be provided in compiled or un-compiled form and may be supplied to the hair styler or the corresponding processing device (mobile telephone and/or the like) as a signal over a computer network, or on a recording medium. Further, the functionality performed by part or all of this software may be performed using one or more dedicated hardware circuits. However, the use of software is preferred as it facilitates the updating of the beauty product device (and the processing device).
The above embodiments used various processors located in the UWB sensors or in the computing device or the hair styler. These processors may be programmable processors such as microprocessors or CPUs or they may be formed from dedicated hardware circuits such as ASIC devices and the like.
Various other modifications will be apparent to those skilled in the art and will not be described in further detail here.

Claims (25)

  1. A hair styling system comprising: a hair styling device; at least one ultra-wide band, UWB, sensor configured to transmit UWB radio pulses and to receive UWB radio signals reflected by objects adjacent the at least one UWB sensor; and a processor configured to: i) process signals obtained from the at least one UWB sensor to identify a gesture made by a user adjacent to the at least one UWB sensor; and ii) control, in response to the identified gesture, at least one setting of the hair styling device or the computing device.
  2. The hair styling system of claim 1, wherein to process signals obtained from the at least one UWB sensor to identify a gesture made by a user adjacent to the at least one UWB sensor comprises: comparing the signals obtained from the at least one UWB sensor with a plurality of gesture recognition models.
3. The hair styling system of claim 2, wherein comparing the signals obtained from the UWB sensor with a plurality of gesture recognition models comprises: comparing a pattern of the signals obtained from the at least one UWB sensor with each one of the gesture recognition models of the plurality of gesture recognition models, wherein each gesture recognition model comprises one or more pre-stored UWB radio signal patterns corresponding to a gesture.
4. The hair styling system of claim 3, wherein the one or more pre-stored UWB radio signal patterns corresponding to a gesture comprise at least one of: a distance-amplitude signal graph, and/or a distance-time signal graph.
5. The hair styling system of claim 1, wherein to process signals obtained from the at least one UWB sensor to identify a gesture made by a user adjacent the at least one UWB sensor comprises: comparing the signals obtained from the at least one UWB sensor with a plurality of gesture recognition models, wherein each one of the gesture recognition models is a convolutional neural network, CNN.
6. The hair styling system of any preceding claim, wherein each identified gesture is associated with one of a plurality of control commands; and in response to identifying the gesture made by a user adjacent to the at least one UWB sensor, the processor is configured to transmit a corresponding control command to the hair styler to control the at least one setting of the hair styling device.
7. The hair styling system of claim 6, wherein to control, in response to the identified gesture, the at least one setting of the hair styling device comprises: increasing a temperature of the hair styler; or increasing a temperature of the hair styler by a pre-configured step size; or initiating a heat-shot function of the hair styler; or initiating a cold-shot function of the hair styler.
8. The hair styling system of any preceding claim, wherein in response to identifying the gesture made by a user adjacent to the at least one UWB sensor, the processor is configured to control the at least one setting of the computing device in communication with the at least one UWB sensor.
9. The hair styling system of claim 8, wherein to control, in response to the identified gesture, the at least one setting of the computing device in communication with the at least one UWB sensor comprises: restarting a music track being played by the computing device; or skipping a music track being played by the computing device.
10. A hair styling system comprising: a hair styling device; a plurality of ultra-wide band, UWB, sensors, each UWB sensor configured to transmit UWB radio pulses and to receive UWB radio signals reflected by objects adjacent that UWB sensor; and a computing device in communication with the plurality of UWB sensors, the computing device comprising a processor configured to: i) process signals obtained from each UWB sensor of the plurality of UWB sensors to identify a dynamic gesture made by a user, wherein the dynamic gesture is a moving gesture; ii) process the signals obtained from each UWB sensor of the plurality of UWB sensors to determine a position of a user's head; and iii) control, in response to the identified dynamic gesture and the determined position of the user's head, at least one setting of the hair styling device or the computing device.
11. The hair styling system of claim 10, wherein to process signals obtained from the plurality of UWB sensors to identify the dynamic gesture made by the user comprises: comparing the signals obtained from the plurality of UWB sensors with a plurality of gesture recognition models.
12. The hair styling system of claim 10 or 11, wherein comparing the signals obtained from the plurality of UWB sensors with a plurality of gesture recognition models comprises: comparing a pattern of the signals obtained from the plurality of UWB sensors with each one of the gesture recognition models of the plurality of gesture recognition models, wherein each gesture recognition model comprises one or more pre-stored UWB radio signal patterns corresponding to a dynamic gesture.
13. The hair styling system of claim 12, wherein the one or more pre-stored UWB radio signal patterns corresponding to a dynamic gesture comprises a distance-time signal graph.
14. The hair styling system of claim 10, wherein to process signals obtained from the plurality of UWB sensors to identify a gesture made by a user comprises: comparing the signals obtained from the plurality of UWB sensors with a plurality of gesture recognition models, wherein each one of the gesture recognition models is a convolutional neural network, CNN.
15. The hair styling system of any preceding claim, wherein to process the signals obtained from each UWB sensor of the plurality of UWB sensors to determine a position of a user's head comprises: identifying, in the signals obtained from each UWB sensor, a static portion of the signals, wherein the static portion of the signals is a portion of the signals that remains unchanged over time.
16. The hair styling system of claim 15, wherein to control, in response to the identified dynamic gesture and the determined position of the user's head, at least one setting of the hair styling device comprises: determining whether the dynamic gesture began on the left- or right-hand side of the user's head; and wherein each identified dynamic gesture and its starting position relative to the left- or right-hand side of the user's head is associated with one of a plurality of control commands; and in response to identifying the gesture made by a user and its starting position relative to the left- or right-hand side of the user's head, the processor is configured to transmit a corresponding control command to the hair styler to control the at least one setting of the hair styling device.
17. The hair styling system of claim 15, wherein to control, in response to the identified dynamic gesture and the determined position of the user's head, at least one setting of the computing device comprises: determining whether the dynamic gesture began on the left- or right-hand side of the user's head; and in response to identifying the gesture made by a user and its starting position relative to the left- or right-hand side of the user's head, the processor is configured to control the at least one setting of the computing device.
18. A hair styling system comprising: a hair styling device; at least one ultra-wide band, UWB, sensor configured to transmit UWB radio pulses and to receive UWB radio signals from at least one radio frequency identification, RFID, tag on the hair styling device; at least one user interface for input of a desired hair style; and a computing device in communication with the at least one UWB sensor, the computing device comprising a processor configured to: i) process signals obtained from the at least one UWB sensor to identify a movement of the hair styling device adjacent to the at least one UWB sensor; and ii) provide feedback to a user in response to the identified movement.
19. The hair styling system of claim 18, wherein to process signals obtained from the at least one UWB sensor to identify a movement of the hair styling device adjacent to the at least one UWB sensor comprises: comparing the signals obtained from the at least one UWB sensor with a plurality of gesture recognition models.
20. The hair styling system of claim 19, wherein comparing the signals obtained from the UWB sensor with a plurality of gesture recognition models comprises: comparing a pattern of the signals obtained from the at least one UWB sensor with each one of the gesture recognition models of the plurality of gesture recognition models, wherein each gesture recognition model comprises one or more pre-stored UWB radio signal patterns corresponding to a movement of the hair styling device.
21. The hair styling system of claim 20, wherein the one or more pre-stored UWB radio signal patterns corresponding to a movement of the hair styling device comprises a distance-time signal graph.
22. The hair styling system of claim 18, wherein to process signals obtained from the at least one UWB sensor to identify a movement of the hair styler device adjacent to the at least one UWB sensor comprises: comparing the signals obtained from the at least one UWB sensor with a plurality of gesture recognition models, wherein each one of the gesture recognition models is a convolutional neural network, CNN.
23. The hair styling system of any preceding claim, wherein to provide feedback to a user in response to the identified movement comprises: comparing the identified movement of the hair styler device with a set of movements of the hair styler device required to achieve the desired hair style input at the at least one user interface; determining, based on the comparing, whether the identified movement of the hair styler device corresponds to at least one movement of the hair styler device required to achieve the desired hair style; and in response to the determining, the processor is configured to transmit a feedback command to the hair styler to provide feedback to a user in response to the identified movement.
24. The hair styling system of any one of claims 18 to 23, wherein the identified movement of the hair styler device comprises at least one of: a rotating movement of the hair styler device; a unidirectional straight movement of the hair styler device; an up-down movement of the hair styler device while arms of the hair styler device are open.
25. The hair styling system of any preceding claim, wherein the feedback comprises at least one of: visual feedback provided via a display of the computing device; visual feedback provided via a display of the hair styler device; audio feedback provided via the computing device; audio feedback provided by the hair styler device; or haptic feedback provided via the hair styler.
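By way of illustration only, and not forming part of the claims, the following Python sketch shows one possible way of realising the pattern comparison of claims 2 to 5 for a single UWB sensor. The gesture names, the stored patterns and the correlation threshold are assumptions introduced for this example; in practice the gesture recognition models could equally be trained CNN classifiers as in claim 5.

```python
import numpy as np

# Hypothetical pre-stored distance-amplitude patterns, one per gesture
# (stand-ins for the gesture recognition models of claims 2 to 4).
GESTURE_MODELS = {
    "swipe_towards": np.array([0.1, 0.4, 0.9, 0.4, 0.1]),
    "tap":           np.array([0.9, 0.2, 0.1, 0.2, 0.9]),
}

def identify_gesture(uwb_pattern, threshold=0.8):
    """Return the name of the best-matching gesture model, or None if no model
    matches the measured UWB pattern closely enough."""
    best_name, best_score = None, -1.0
    for name, model in GESTURE_MODELS.items():
        # Normalised correlation between the measured pattern and the stored model.
        score = float(np.dot(uwb_pattern, model) /
                      (np.linalg.norm(uwb_pattern) * np.linalg.norm(model) + 1e-9))
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# Example: identify_gesture(np.array([0.12, 0.38, 0.95, 0.41, 0.09])) returns "swipe_towards".
```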
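Again purely as an illustration, a minimal sketch of the gesture-to-command mapping contemplated by claims 6 to 9 is given below. The command identifiers and the transmit callback are hypothetical names chosen for the example and are not part of this disclosure.

```python
# Hypothetical mapping from an identified gesture to a target device and a
# control command, as one possible realisation of claims 6 to 9.
COMMAND_MAP = {
    "swipe_up":    ("styler",   "TEMPERATURE_UP_ONE_STEP"),
    "double_tap":  ("styler",   "HEAT_SHOT"),
    "swipe_left":  ("computer", "SKIP_TRACK"),
    "swipe_right": ("computer", "RESTART_TRACK"),
}

def dispatch(gesture, send):
    """Look up the command associated with the identified gesture and transmit
    it to the styler or the computing device via the supplied `send` callback."""
    target, command = COMMAND_MAP.get(gesture, (None, None))
    if target is not None:
        send(target, command)

# Example: dispatch("swipe_up", lambda target, command: print(target, command))
```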
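As a further illustration only, the sketch below shows one way in which the static portion of the UWB returns might be used to locate the user's head (claim 15) and to decide whether a dynamic gesture began on the left- or right-hand side of the head (claims 16 and 17). The frame layout, the variance threshold and the azimuth convention are assumptions made for the example.

```python
import numpy as np

def head_range_bin(frames, var_threshold=0.01):
    """frames: array of shape (n_frames, n_range_bins) of UWB return amplitudes.
    The head is taken to be the strongest reflector among range bins whose
    amplitude barely changes from frame to frame (the static portion)."""
    variance = frames.var(axis=0)
    mean_amplitude = frames.mean(axis=0)
    static_bins = variance < var_threshold
    if not static_bins.any():
        return None
    return int(np.argmax(np.where(static_bins, mean_amplitude, -np.inf)))

def gesture_start_side(gesture_start_azimuth_deg, head_azimuth_deg):
    """Decide whether the gesture began to the left or right of the head,
    using azimuth angles estimated from the plurality of UWB sensors."""
    return "left" if gesture_start_azimuth_deg < head_azimuth_deg else "right"
```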
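Finally, purely as an illustrative sketch of claim 23, the example below compares each identified movement of the hair styler with the sequence of movements required for the hair style entered at the user interface and returns a feedback message. The style names, movement labels and messages are hypothetical.

```python
# Hypothetical per-style movement sequences (the set of movements required to
# achieve the desired hair style of claim 23).
STYLE_ROUTINES = {
    "loose_curls": ["rotate", "rotate", "straight_pull"],
    "sleek":       ["straight_pull", "straight_pull"],
}

def movement_feedback(style, step_index, identified_movement):
    """Compare the identified movement with the movement required at this step
    of the routine and return a feedback message for the user."""
    required = STYLE_ROUTINES.get(style, [])
    if step_index >= len(required):
        return "Routine complete"
    if identified_movement == required[step_index]:
        return "Good - move on to the next section"
    return f"Expected a {required[step_index]} movement here - try again"

# Example: movement_feedback("loose_curls", 0, "rotate") returns "Good - move on to the next section".
```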
GB2404586.6A 2024-03-28 2024-03-28 A hair styling system and control thereof Pending GB2639967A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2404586.6A GB2639967A (en) 2024-03-28 2024-03-28 A hair styling system and control thereof
PCT/GB2025/050676 WO2025202662A1 (en) 2024-03-28 2025-03-28 A hair styling system and control thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2404586.6A GB2639967A (en) 2024-03-28 2024-03-28 A hair styling system and control thereof

Publications (2)

Publication Number Publication Date
GB202404586D0 GB202404586D0 (en) 2024-05-15
GB2639967A (en) 2025-10-08

Family

ID=91023451

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2404586.6A Pending GB2639967A (en) 2024-03-28 2024-03-28 A hair styling system and control thereof

Country Status (1)

Country Link
GB (1) GB2639967A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180317624A1 (en) * 2016-07-29 2018-11-08 Spur Concepts Inc. System and method for an enhanced hair dryer
CN112904998A (en) * 2019-12-04 2021-06-04 佛山市云米电器科技有限公司 Fan control method, fan and computer readable storage medium

Also Published As

Publication number Publication date
GB202404586D0 (en) 2024-05-15
