EP4371445A1 - A handheld device
A handheld device
- Publication number
- EP4371445A1 (application EP22208007.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- effectors
- image
- loading state
- handheld device
- subject
- Legal status (assumed, not a legal conclusion)
- Withdrawn
Classifications
-
- A—HUMAN NECESSITIES
- A46—BRUSHWARE
- A46B—BRUSHES
- A46B15/00—Other brushes; Brushes with additional arrangements
- A46B15/0002—Arrangements for enhancing monitoring or controlling the brushing process
- A46B15/0004—Arrangements for enhancing monitoring or controlling the brushing process with a controlling means
- A46B15/0012—Arrangements for enhancing monitoring or controlling the brushing process with a controlling means with a pressure controlling device
-
- A—HUMAN NECESSITIES
- A46—BRUSHWARE
- A46B—BRUSHES
- A46B15/00—Other brushes; Brushes with additional arrangements
- A46B15/0002—Arrangements for enhancing monitoring or controlling the brushing process
- A46B15/0004—Arrangements for enhancing monitoring or controlling the brushing process with a controlling means
- A46B15/001—Arrangements for enhancing monitoring or controlling the brushing process with a controlling means with means indicating the remaining useful life of brush
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B26—HAND CUTTING TOOLS; CUTTING; SEVERING
- B26B—HAND-HELD CUTTING TOOLS NOT OTHERWISE PROVIDED FOR
- B26B21/00—Razors of the open or knife type; Safety razors or other shaving implements of the planing type; Hair-trimming devices involving a razor-blade; Equipment therefor
- B26B21/40—Details or accessories
- B26B21/405—Electric features; Charging; Computing devices
- B26B21/4056—Sensors or controlling means
-
- A—HUMAN NECESSITIES
- A46—BRUSHWARE
- A46B—BRUSHES
- A46B2200/00—Brushes characterized by their functions, uses or applications
- A46B2200/10—For human or animal care
- A46B2200/1066—Toothbrush for cleaning the teeth or dentures
Definitions
- the present invention relates to the field of handheld devices (i.e., personal care devices, treatment devices, or therapeutic devices), and in particular to the field of handheld devices that capture images of a surface of a subject.
- Handheld devices such as tooth/hair brushing devices, shaving devices and breast pumps, are used on a regular basis.
- cameras and other imaging sensors have been integrated into handheld devices to capture images of the surface of the user proximate to end-effectors (e.g., bristles, filaments, shaver caps, etc.) of the devices.
- Image capture enables remote image-based diagnostics, enhanced location sensing, therapy planning, and/or treatment monitoring.
- the process of capturing these images is being seamlessly integrated into regular personal care devices and routines of the user. This seamless integration of imaging may avoid extra hassle for the user and enable immediate image-based feedback to the user, as there is no need for smartphone-based image capture or for additional personal care workflow steps, such as taking separate images after tooth brushing.
- end-effectors are known to deform under forces typically exerted by a user during use of the device.
- end-effectors are prone to encroach upon the image field of the camera, making image interpretation more difficult (and sometimes impossible). Consequently, it is often necessary to acquire many images before a useful image is obtained. This results in a high volume of data transfer, with much of the data being relatively useless to realize the above advantages of image capture.
- a handheld device comprising:
- deformation of said end-effectors under a given loading state may cause some of the end-effectors to encroach on a field of view of image capture of the surface of the user.
- a determined/estimated/predicted loading state of the end-effectors during capture of an image may be leveraged to determine a quality of the image.
- transfer of the image is selectively controlled based on the image quality. In this way, an overall data transfer rate may be reduced by omitting transmission of poor quality (i.e., highly obscured) images.
- images of a portion of a surface of the user captured by a handheld device having end-effectors for engaging with the (same) surface of the user are often obscured/encroached upon by the end-effectors. This is because, as the end-effectors engage with the surface of the user, they undergo a loading state (i.e., due to a force/pressure) that deflects/deforms/forces them into the field of view of the image capture device. Of course, the surface of the user may not even be viewable in images with high degrees of encroachment by the end-effectors.
- Said further processing may include remote image-based diagnostics, enhanced location sensing, therapy planning, and/or treatment monitoring.
- each image is selectively transmitted.
- images of poor quality may not be transmitted (or transmitted at a high compression ratio), while those that provide a clear view of the surface of the user are transmitted (or transmitted losslessly, or at a low compression ratio).
- the loading state may be directly measured/determined, or a likelihood/probability of the end-effectors being in a loading state may be estimated.
- the loading state of the end effectors provides an efficient (i.e., fast and having low computational complexity) means for determining/estimating/predicting the quality of the image. While directly assessing the image of the surface of the user using image processing techniques may provide a highly accurate assessment as to the quality of the image, such a solution is likely to be computationally intensive and slow, whilst such accuracy may be unnecessary. This may not be desirable for handheld devices where size, power and processing capabilities may be limited. Thus, leveraging a loading state as an indirect means of assessment of the quality of the image may facilitate an improved (i.e., more efficient, faster) method for selectively transmitting the image.
- the invention proposes selective transmission of images, rather than selective capture of images.
- Selectively capturing images responsive to a control signal typically requires a sophisticated image capture device, which has inherent cost, power and space considerations.
- image capture devices that simply capture the image repetitively (i.e., at a regular/random time interval) are widely available, small and inexpensive.
- the proposed invention instead provides that the image is captured, and subsequently assessed for quality (based on determined/detected/measured loading state). This may therefore contribute to a reduced overall cost of the handheld device.
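To make the capture-then-assess pipeline above concrete, here is a minimal Python sketch: images are captured repetitively, the loading state sampled at capture time yields a predicted quality value, and only frames with non-zero predicted quality are selected for transfer. The `Frame` container, the force thresholds, and the triangular quality model are all illustrative assumptions, not values from the patent; a real device would calibrate these during design and testing.

```python
from dataclasses import dataclass

# Hypothetical thresholds for illustration; real values would be
# calibrated per device during design and testing.
MIN_LOAD_N = 0.5   # below this, end-effectors are likely not in contact
MAX_LOAD_N = 2.5   # above this, end-effectors likely obscure the image

@dataclass
class Frame:
    image: bytes          # raw image data from the repetitive capture device
    load_newtons: float   # loading state sampled at capture time

def quality_value(load_newtons: float) -> float:
    """Map a loading state to a predicted image quality in [0, 1].

    A simple triangular model: quality peaks mid-way through the
    acceptable loading range and falls to zero at its bounds.
    """
    if load_newtons <= MIN_LOAD_N or load_newtons >= MAX_LOAD_N:
        return 0.0
    mid = (MIN_LOAD_N + MAX_LOAD_N) / 2
    half = (MAX_LOAD_N - MIN_LOAD_N) / 2
    return 1.0 - abs(load_newtons - mid) / half

def select_for_transfer(frames):
    """Keep only frames whose predicted quality is non-zero."""
    return [f for f in frames if quality_value(f.load_newtons) > 0.0]
```

Note that the image capture device runs unconditionally here; only the decision to transfer depends on the loading state, consistent with the idea of selective transmission rather than selective capture.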
- the quality value may indicate an extent to which the image of the surface of the subject is obscured by the plurality of end-effectors.
- a quality of the image relates to the extent to which the end-effectors are present in the image.
- this has a direct impact on the usefulness of the image, as highly obscured images may not contain sufficient information for further processing.
- the plurality of end effectors being present in the image act as noise, and therefore the extent to which the end-effectors obscure the image dictates the noisiness (i.e., a quality) of the image.
- the loading state may, in exemplary embodiments, describe a level of deformation of at least one of the plurality of end-effectors.
- Different loading states indicate a different extent/level/degree to which the plurality of end-effectors deform/deflect (and thus an extent to which the end-effectors may deform into an imaging field of view of the image capture device).
- a greater loading state (i.e., a higher force or pressure) applied to the end-effector results in a greater deformation/deflection of the end-effector.
- an extent of deformation is indicative of a likelihood that the end-effector is present in the field of view of the image capture device.
- the loading state may further describe at least one of a direction of deformation and a mode of deformation of at least one of the plurality of end-effectors.
- the end-effector may deform in any radial direction from a resting axis, and the end-effector may deform freely (i.e., when lightly in contact with the surface of the user), in a constrained manner (i.e., when both ends of the end-effector are pinned/fixed), in a trapped manner (i.e., the end-effector cannot move), or in a dynamic fashion originating from mechanical instability (bistable snap-through or buckling). All of these factors impact the prediction/determination of the quality of the captured image that is derived from the loading state. Thus, this information may provide a more accurate quality value.
- the sensor unit may comprise a sensor configured to detect a force exerted by at least one of the end-effectors on a portion of a surface of the subject.
- the loading state may be based on the detected force or pressure.
- One way in which to determine the loading state of the end-effector is to directly assess/detect/measure a force exerted by the end-effectors on the surface of the subject. As many devices already have such sensors integrated into them (i.e., some electric toothbrushes), this may provide an inexpensive and convenient means for assessing the loading state.
- the sensor unit may comprise at least one deformation sensing element coupled to one of the plurality of end-effectors or as one of the plurality of end-effectors, the deformation sensing element configured to detect a deformation value of the corresponding end-effector. Then, the loading state may be based on the detected deformation value.
- the loading state may be determined by directly measuring the deformation of the end-effectors. This may be achieved by devices/sensors that change properties and/or output a signal based on the degree of deformation they experience.
- the loading state can be determined directly from the deformation of the end-effectors.
- this may provide a highly accurate means for assessing the loading state.
- the at least one deformation sensing element may be one of a piezoelectric polymer fiber, a fiber Bragg grating, a Fabry-Perot sensor, or a pressure-sensitive material.
- the sensor unit may comprise an optical proximity sensor configured to detect a distance value between a portion of the handheld device adjacent to the end-effectors and the surface of the subject. As such, the loading state may be based on the detected distance value.
- Yet another method by which the loading state may be determined is based on a detected distance between the portion of the handheld device at the base of the end-effectors and the surface of the subject. Essentially, this means that a distance occupied by the end-effector (when in contact with the surface of the subject) may be determined. Of course, as this distance decreases below the length of the end-effectors, the loading state may be considered to be increasing.
- One cost-effective, low-power and convenient means for assessing the distance is the use of an optical proximity sensor, which may assess the distance at a number of locations.
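As a rough illustration of the distance-based approach, the sketch below converts an optical-proximity reading into a load estimate by treating the bristle field as a linear spring that compresses once the head-to-surface distance drops below the free bristle length. The bristle length and stiffness constants are hypothetical placeholders, not values from the patent.

```python
# Illustrative constants: a free bristle length and an assumed effective
# linear stiffness of the tuft field. Neither comes from the patent.
BRISTLE_LENGTH_MM = 10.0
STIFFNESS_N_PER_MM = 0.3

def loading_from_distance(distance_mm: float) -> float:
    """Estimate the load (N) from the measured head-to-surface distance.

    If the distance is shorter than the free bristle length, the
    bristles must be compressed/deflected by the difference.
    """
    compression_mm = max(0.0, BRISTLE_LENGTH_MM - distance_mm)
    return STIFFNESS_N_PER_MM * compression_mm
```

A multi-location proximity sensor could apply the same conversion per location and, for example, take the maximum estimate as the loading state.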
- the handheld device may further comprise a vibratory actuator adapted to vibrate at least one of the plurality of end-effectors.
- the sensor unit may comprise a sensor configured to detect a vibration value of at least one of the plurality of end-effectors, and the loading state may be based on the detected vibration value.
- many handheld devices comprise a component which vibrates in order to aid in the function of the device (i.e., for assisted brushing of teeth, massaging, or hair shaving).
- using a sensor (such as an accelerometer), an extent of the vibration may be assessed. Indeed, vibrations may impact the loading state on the end-effector, and may more generally impact the quality of the image captured by the image capture device (i.e., by blurring, or by moving the end-effectors within the field of view).
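A vibration value of this kind could, for example, be summarised as the root-mean-square of recent accelerometer samples, which the control unit could then fold into the loading-state assessment. RMS is an illustrative choice of metric, not one prescribed by the patent:

```python
import math

def vibration_rms(samples):
    """Root-mean-square of accelerometer samples (e.g., in m/s^2).

    A larger RMS suggests stronger vibration, which may blur the image
    or move the end-effectors within the field of view.
    """
    return math.sqrt(sum(s * s for s in samples) / len(samples))
```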
- control unit may be configured to transfer the image responsive to the quality value being based on a loading state that corresponds to a predetermined range of acceptable loading states.
- loading states in which the quality of the image is acceptable are known. This (potentially non-continuous) range may vary by device, and therefore may be determined during design, manufacture, and testing stages. Thus, when it is determined that the loading state that the end-effector was under during capture of the image is acceptable, then the image is transferred (and potentially stored and subject to further processing/analysis).
- the predetermined range of acceptable loading states may be bounded by a predetermined minimum loading state and a predetermined maximum loading state
- the predetermined minimum loading state may be indicative of at least one of the plurality of end-effectors being in contact with the portion of the surface of the subject
- the predetermined maximum loading state may be indicative of a minimum acceptable amount of unobscured surface of the subject present in the image.
- images are transferred when the end-effectors are in contact with the surface of the subject but are not excessively obscured by the end-effectors (i.e., when the image quality is still high).
- This may provide an effective use of the bandwidth and power of the handheld device for transferring images, in that images useful for further processing are transferred (while other images may be discarded, sent at a high compression ratio, or stored locally).
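The bounded acceptable range described above can be sketched as a simple three-way decision on the measured load: below the minimum the surface is likely not in frame, above the maximum the image is heavily obscured, and in between the image is sent at full quality. The thresholds and the particular transfer modes (skip, lossy, lossless) are assumptions for illustration only.

```python
# Illustrative bounds for the acceptable-load range; the three transfer
# modes below are assumed for the sketch, not prescribed by the patent.
MIN_LOAD = 0.5   # end-effectors just in contact with the surface
MAX_LOAD = 2.5   # beyond this, the image is too obscured to be useful

def transfer_mode(load: float) -> str:
    if load < MIN_LOAD:
        return "skip"      # not in contact: surface likely not in frame
    if load > MAX_LOAD:
        return "lossy"     # heavily obscured: send at high compression
    return "lossless"      # within the acceptable range: send full quality
```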
- each of the plurality of end-effectors may be one of an elastically deformable end-effector, or a rigid end-effector mounted on the handheld device by an elastically deformable base.
- There are many different types of end-effectors. Two types of particular focus are flexible/elastically deformable end-effectors, and those which are mounted on or comprise a flexible/elastically deformable part. These types of end-effectors are likely to encroach on a field of view of the image capture device under specific loading states due to their deformable nature. Thus, handheld devices with such end-effectors may particularly benefit from embodiments of the invention.
- the handheld device may be a personal care device, a treatment device, or a therapeutic device.
- a method for selectively controlling the transfer of an image of a surface of a subject acquired by a handheld device comprising a plurality of end-effectors for engagement with a portion of the surface of a subject comprising: capturing an image of the surface of the subject; determining a loading state of the plurality of end-effectors at a same time as when the image was captured; determining a quality value of the image based on the loading state; and selectively controlling the transfer of the image based on the quality value.
- a computer program comprising computer program code means adapted, when said computer program is run on a computer, to implement a method of a proposed embodiment.
- the invention proposes concepts for aiding and/or improving handheld devices that capture images of a surface of a subject as end-effectors of the device engage with the surface of the subject. Specifically, a loading state of the end-effectors is determined at the same time as image capture. From this loading state, a predicted quality value of the image may be determined, which informs selective transmission of the image. Overall, this may mean that images that are likely of poor quality are not transmitted, reducing the cost and bandwidth required for such unnecessary transmission.
- proposed concepts aim to improve the process of image acquisition of a portion of a surface of the subject by a handheld device having a plurality of end-effectors for engagement with a portion of the surface of a subject.
- deformation of said end-effectors under a given loading state may cause some of the end-effectors to encroach on a field of view of image capture of the surface of the user.
- a determined loading state of the end-effectors during capture of an image may be leveraged to determine a quality of the image.
- transfer of the image is selectively controlled based on the image quality. In this way, an overall data transfer rate may be reduced by omitting transmission of poor quality (i.e., highly obscured) images.
- end-effectors are known to deform under pressure, and are prone to encroach upon an image field of an image capture device.
- the resulting image will contain the end-effector.
- the end-effectors are not the desired target of the image capture. Instead, it is desirable that the image contains (at least part of) the surface of the subject. Thus, encroachment by the end-effectors is undesirable.
- the end-effector may freely deform or move (i.e. touch surface and freely move) as shown in 10.
- the end-effector may be restricted as shown in 20, meaning that the distal end of the end effector from the handheld device may not always freely move.
- the end-effector may be pinned in a crevice of a treatment surface, meaning that the distal end cannot move and the end-effector is therefore likely not effective.
- these different modes of deflection may also impact whether the end-effector encroaches on the field of view of the image capture device.
- End-effectors may be thought of as upstanding elongate members (e.g., bristles, quills, tufts), but are not restricted as such.
- End-effectors are any means by which a surface (e.g., teeth, tongue, skin, hair) of the subject is engaged by a handheld device, for example for scrubbing, brushing, shaving, massaging, etc. They are typically elastically deformable/flexible themselves, or are mounted on an elastically deformable/flexible member, such that they deform/deflect under a pressure applied during typical use of the handheld device.
- end-effector deformations such as uncontrolled splaying, pinning, or static loading/deformation obscure regions of interest to be imaged.
- One solution is to retrospectively exclude such images from (remote) data analysis, but this increases the total amount of data to be sent wirelessly to the cloud or to other in-device interfaces.
- Another solution would be to process the captured image with known image analysis techniques to determine an extent to which the image contains unobstructed/unobscured regions of interest, but this is necessarily computationally expensive and time consuming.
- embodiments of the invention provide systems and computer-implemented methods (executed by a processor) to selectively transmit image data for further processing based on a predicted image quality based on a determined loading state of the end-effectors during image capture. Images may be transmitted when the loading state of the end-effectors (and thus the predicted/determined image quality) is associated with a given range. This range is preferably greater than zero (i.e. the end-effectors not in contact with the surface of the subject), and not exceeding a maximum loading state (i.e. high deformation of the end-effectors and low image quality).
- embodiments propose only transmitting captured images associated with low/zero end-effector deformation and/or specific ranges of lower loading states, with forces and deformation either directly measured or evaluated indirectly.
- a properly designed image capture device can be positioned and configured such that a captured image of the surface of the subject is free of bristle encroachment when the end-effectors are in specific loading states (e.g., pressure, acceleration, force, etc.).
- the handheld device vibration may also be in some known specific state.
- the image remains sufficiently undistorted provided the specific loading state (and in particular, the contact pressure of the end-effectors) remains in a specified range of values. Such values are not necessarily contiguous but are often monotone.
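Since the acceptable contact-pressure values need not form one contiguous interval, membership can be checked against a list of sub-ranges. The numeric sub-ranges below are placeholders standing in for device-specific calibration data:

```python
# Placeholder sub-ranges (in kPa); a real device would determine these
# during design, manufacture, and testing stages.
ACCEPTABLE_RANGES = [(0.3, 1.2), (1.8, 2.4)]

def pressure_acceptable(pressure: float) -> bool:
    """True if the contact pressure falls within any acceptable sub-range."""
    return any(lo <= pressure <= hi for lo, hi in ACCEPTABLE_RANGES)
```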
- embodiments of the invention provide the following main elements:
- FIG. 3 presents a simplified block diagram of a handheld device 100 according to an embodiment of the invention. Specifically, there are depicted a plurality of end-effectors 110, an image capture device 120, a sensor unit 130, and a control unit 140. Optionally, there may further be provided a vibratory means 112, a communication unit 150, and an external/remote processor 160.
- Fig. 3 relates to a (powered) toothbrush
- the invention is equally applicable to other devices where end-effector deformation may obscure the image of an (on-board) image capture device/camera.
- Fig. 3 could equally depict a hairbrush, a skin cleaning brush, a brushing mouthpiece, a shaver, etc.
- the handheld device may be one of a personal care device, a hairbrush, a treatment device, or a therapeutic device. Equally, the position of the elements described is not prescriptive of their required relative locations.
- the end-effectors 110 are configured for engagement with a portion of a surface of a subject. In other words, the end-effectors 110 contact a part of the surface of the subject.
- the surface could be a tooth, a gum, a tongue, a skin surface, a hair surface, etc. of the subject.
- the end-effectors could refer to individual elongate members upstanding from a head of the handheld device, such as brushes, tufts, or collections of brushes/tufts.
- any deformable end-effector (or end-effector mounted on a deformable member) which comes into contact with a part of the surface of the subject may be one of the end-effectors 110.
- each of the plurality of end-effectors 110 is one of an elastically deformable end-effector, or a rigid end-effector mounted on the handheld device by an elastically deformable base.
- the subject could be the user of the handheld device 100 putting the end-effectors 110 in contact with the surface of the subject.
- the subject may be the person or animal with which the end-effectors 110 are in contact, while the handheld device 100 is operated by another person.
- the handheld device 100 could be a toothbrush operated by a veterinarian brushing the teeth of an animal, and so the subject is the animal.
- the image capture device 120 is configured to capture an image of the surface of the subject.
- the image capture device 120 is designed/positioned/adapted to have a field of view including the surface of the subject when the end-effectors 110 come into contact with part of the surface of the subject. In this way, the surface of the subject may be imaged for remote analysis (i.e. for diagnosis, advice, etc.).
- the sensor unit 130 is configured to determine a loading state of the plurality of end-effectors 110 at a same time as when the image was captured.
- the loading state may be directly measured/determined, or a likelihood/probability of the end-effectors being in a loading state may be estimated.
- the loading state may describe at least one of a pressure exerted on the end-effectors 110 by contact with the surface of the subject, a vibration of the handheld device 100, and any other force that the end-effectors 110 and/or image capture device 120 may be subject to.
- the loading state (indirectly) describes a level/extent/degree of deformation of at least one of the plurality of end-effectors 110. More specifically, the loading state may describe at least one of a direction of deformation and a mode of deformation. Essentially, the loading state is indicative of the forces that the end-effectors 110 are under during contact with the surface of the subject.
- the sensor unit 130 may be configured to continually/continuously determine/detect/measure the loading state of the end-effectors 110 during use of the handheld device 100.
- the loading state may be determined at precisely the same time as when the image is captured, or alternatively/additionally in the time before/after capture of the image.
- the control unit 140 is configured to determine a quality value of the image based on the loading state.
- the loading state of the plurality of end-effectors 110 determines an extent to which the end-effectors 110 deform, and therefore has a direct link to the end-effectors 110 encroaching on the field-of-view of the image capture device 120.
- the quality of the image may be known to be low due to encroachment by the end-effectors.
- the quality value essentially indicates an extent to which the image of the surface of the subject is obscured by the plurality of end-effectors 110.
- a high quality value indicates that the captured image is relatively end-effector-free (i.e., the end-effectors are absent from the image), and therefore contains a clear view of the surface of the subject.
- a low quality value indicates a presence of end-effectors 110 in the image to a point that the surface of the subject may not be visible, or visible to an extent that is not particularly useful for further processing.
- the control unit 140 is further configured to selectively control the transfer of the image based on the quality value. Essentially, this means that for some quality values (derived from the loading state) the images may be transferred in a first manner, and for other quality values the image may be transferred in a second manner.
- this may mean that the quality value determines whether an image is transferred or not.
- the parameters/characteristics of the transfer of the image may be different. For example, low quality images may be transferred at a high compression rate (strongly lossy), while high quality images may be transferred at a low compression rate (losslessly, or weakly lossy).
- the quality value may discretely (i.e. in a binary manner) or continuously (i.e. many different schemes for transfer) determine the transfer of the image.
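A continuous (rather than binary) scheme of this kind might map the quality value onto an encoder quality setting, skipping transfer entirely below some floor. The floor, the output range, and the linear mapping below are all illustrative assumptions:

```python
from typing import Optional

def compression_setting(quality: float) -> Optional[int]:
    """Map a quality value in [0, 1] to an encoder quality of 30..95.

    Returns None (skip transfer) below an assumed floor of 0.2; higher
    quality values map linearly to weaker compression.
    """
    if quality < 0.2:
        return None
    frac = (quality - 0.2) / 0.8
    return int(round(30 + frac * 65))
```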
- images useful for further processing/analysis may be transferred, whereas less useful images may not be transferred. Accordingly, an overall data transfer rate may be reduced. This may be thought of as a more effective transmission scheme, with higher data utility per bit transmitted.
- control unit 140 is configured to transfer the image responsive to the quality value being based on a loading state that corresponds to a predetermined range of acceptable loading states.
- the image is transferred responsive to the quality value being within a range of acceptable quality values (derived from the loading state).
- the range may be a contiguous range, or split into multiple sub-ranges.
- such quality values/loading states correspond to images that may contain enough of the area of interest to be useful for further processing.
- the predetermined range of acceptable loading states is (at least) bounded by a predetermined minimum loading state and a predetermined maximum loading state (or minimum and maximum quality value).
- the predetermined minimum loading state may be indicative of at least one of the plurality of end-effectors 110 being in contact with the portion of the surface of the subject. Indeed, if the end-effectors 110 are not in contact with the surface of the subject then the image captured by the image capture device 120 is highly unlikely to contain an image of the surface of the subject (the area of interest).
- the predetermined maximum loading state may be indicative of a minimum acceptable amount of unobscured surface of the subject present in the image.
- a loading state greater than zero but below an upper threshold may be appropriate. Images captured in such conditions reflect moments where the end-effectors 110 are in contact (as opposed to zero pressure, where the device is free) but in a way that interferes least with the acquired image. Above the upper threshold, the end-effector deformation will be such as to prohibitively interfere with the image and render further analysis pointless.
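The bounded acceptance range can be expressed as a one-line predicate. The numeric bounds below are hypothetical placeholders standing in for the predetermined minimum and maximum loading states, which would be fixed during design and testing:

```python
def should_transfer(loading_state: float,
                    min_loading: float = 0.05,
                    max_loading: float = 0.6) -> bool:
    """Decide whether an image captured under `loading_state` is transferred.

    Below `min_loading` the end-effectors are likely not in contact with the
    surface, so the image probably does not show the area of interest.
    Above `max_loading` the end-effector deformation likely obscures too
    much of the image for further analysis.
    """
    return min_loading < loading_state <= max_loading
```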
- the control unit 140 may then be configured to control a communication unit 150 to transmit (or not transmit, as the case may be) the image.
- the image may be transmitted by any appropriate means, such as Bluetooth, Wi-Fi, Zigbee, etc.
- the image may be transmitted to an external processor 160 (i.e., the cloud, a smartphone of the subject, etc.), for further processing or analysis. It should be noted that embodiments are not restricted to this arrangement. Indeed, the control unit 140 may control transmission of the images locally, for storage and/or future transfer by other means.
- the sensor unit 130 comprises a sensor configured to detect/measure a force exerted by at least one of the end-effectors 110 on a portion of a surface of the subject. The loading state is thus based on the detected force.
- this embodiment particularly benefits from implementation in handheld devices 100 that already have force/pressure sensors (such as an inertia measurement unit, IMU) integrated.
- powered toothbrushes often have a pressure sensor to detect when a force applied by the user is inappropriate for adequate brushing.
- the pressure/force sensor may be used to detect a loading state of the end-effectors 110.
- signals from one or more pressure sensors may be correlated.
- a signature of a single sensor or of a multiplicity of sensors may be considered to provide a measurement of the specific loading state (i.e. the force applied by a user applying the end-effectors 110 to a surface of the subject). In this way, a more accurate assessment of the loading state may be acquired.
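Combining signals from a multiplicity of sensors can be illustrated with a weighted average, as a minimal sketch. The weighting scheme is an assumption for illustration; an actual device might instead use a learned signature or calibration table:

```python
def estimate_loading_state(sensor_readings, weights=None):
    """Combine readings from one or more pressure/force sensors into a
    single loading-state estimate via a weighted average.

    `weights` is a hypothetical per-sensor trust factor; equal weighting
    is used when none is supplied.
    """
    if weights is None:
        weights = [1.0] * len(sensor_readings)
    total_weight = sum(weights)
    return sum(r * w for r, w in zip(sensor_readings, weights)) / total_weight
```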
- the sensor unit 130 comprises at least one deformation sensing element coupled to (i.e., integrated within or alongside) one of the plurality of end-effectors 110.
- the deformation sensing element is configured to detect a deformation value of the corresponding end-effector 110.
- the end-effectors 110 could be the deformation sensing elements themselves.
- the loading state is based on the detected deformation value.
- the deformation value indicates the pressure that the end-effectors 110 are under, and therefore the loading state.
- the deformation sensing elements could each be any one of a piezoelectric polymer fiber, a fiber Bragg grating, a Fabry-Perot sensor, or a pressure-sensitive material (e.g., carbon-black filled conductive silicone rubber). All of these elements have a property that changes with a deformation. Thus, when the deformation sensing element is mechanically coupled to the end-effector 110 (or is one of the end-effectors 110), then a change in the property may directly measure the deformation.
- measurements from (deformation/strain) sensing elements/filaments that directly assess end-effector deformation can be used to assess the suitability of captured images for transmission for further analysis.
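For a pressure-sensitive resistive element, the property change can be converted to a deformation value with a simple strain-gauge model. This is a sketch under the assumption of a linear response; the gauge factor and resistances are hypothetical calibration values:

```python
def deformation_from_resistance(resistance_ohm: float,
                                rest_resistance_ohm: float,
                                gauge_factor: float) -> float:
    """Estimate the relative deformation (strain) of an end-effector from
    the measured resistance of a coupled pressure-sensitive element,
    assuming a linear strain-gauge model: dR / R0 = gauge_factor * strain.
    """
    delta_ratio = (resistance_ohm - rest_resistance_ohm) / rest_resistance_ohm
    return delta_ratio / gauge_factor
```

Optical elements such as fiber Bragg gratings would use an analogous linear model on wavelength shift rather than resistance.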
- the sensor unit 130 comprises an optical proximity sensor configured to detect/measure a distance value between a portion of the handheld device 100 adjacent to the end-effectors 110 and the surface of the subject. In this case, the loading state is based on the detected distance value.
- the optical proximity sensor is most suitably an infrared LED sensor with a small number of pixels on a sensing element, but other optical proximity sensors may be used.
- the handheld device 100 may further comprise vibratory actuator 112 adapted to vibrate at least one of the plurality of end-effectors.
- vibratory actuators 112 are well known, and usually provided to improve the functionality of the handheld device 100.
- an electric toothbrush may have a platen that vibrates the plurality of end-effectors 110 for more effective cleaning.
- the loading state may also be indicated by vibration of the end-effectors 110.
- the sensor unit 130 comprises a sensor configured to detect/measure a vibration value of at least one of the plurality of end-effectors 110. From the vibration value, the loading state may be (at least partially) derived.
- the sensor in this case may be an accelerometer.
- the sensor may be part of an IMU already found in many handheld devices.
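A vibration value can be derived from accelerometer samples as, for example, a root-mean-square amplitude. This is one plausible metric, not the only one; the choice of metric is an assumption for illustration:

```python
import math

def vibration_value(accel_samples):
    """Compute the RMS amplitude of zero-mean accelerometer samples as a
    simple vibration metric from which a loading state may be (partially)
    derived."""
    mean = sum(accel_samples) / len(accel_samples)
    return math.sqrt(
        sum((a - mean) ** 2 for a in accel_samples) / len(accel_samples)
    )
```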
- Fig. 4 presents a flow diagram of a method 200 for selectively controlling the transfer of an image of a surface of a subject acquired by a handheld device as described above.
- the handheld (e.g., personal care, treatment and/or therapeutic) device comprises a plurality of end-effectors for engagement with a portion of the surface of a subject and a means for capturing the image of the surface of the subject.
- an image of the surface of the subject is captured. This may be captured by a known camera/imaging system.
- a loading state of the plurality of end-effectors is determined.
- This loading state is the loading state at (approximately) the same time as when the image was captured. In other words, an image is captured, and a force being exerted on the plurality of end-effectors at roughly the same time (i.e. slightly before or after) is determined directly or indirectly.
- the loading state may be determined in a plurality of different ways. Indeed, at optional steps 222-228, there are a number of different values detected/measured from which the loading state may be derived. Thus, in some embodiments, the loading state is determined based on at least one of a force, a deformation value, a distance value and/or a vibration value. In this way, a full appreciation of the deformation/mode of deformation of the end-effectors may be realized.
- a quality value of the image is determined based on the loading state.
- the quality value may reflect the extent to which the captured image is likely to be obscured by the end-effectors. Thus, the quality value indicates a likely worth/value of the image.
- the transfer of the image is selectively controlled.
- the image may be transferred losslessly, weakly lossy, strongly lossy or not at all. This is based on the quality value, with images of higher quality being prioritized for transmission.
- the method provides selective transmission of a captured image which reflects the loading state of the end-effectors at the time. In this way, images that are more likely to be unobscured will be prioritized for transmission, reducing an overall data transfer rate.
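The steps of method 200 can be tied together in a short sketch. All the callables and the acceptable range below are hypothetical stand-ins for the device's camera, sensor unit, quality model and communication unit, assumed only for illustration:

```python
def process_frame(capture_image, measure_loading, quality_from_loading,
                  transfer, acceptable_range=(0.05, 0.6)):
    """One iteration of the method: capture an image, determine the
    loading state, derive a quality value, and selectively transfer.

    Returns True if the image was transferred, False otherwise.
    """
    image = capture_image()                  # capture an image of the surface
    loading = measure_loading()              # loading state at (roughly) the same time
    quality = quality_from_loading(loading)  # quality value based on the loading state
    lo, hi = acceptable_range
    if lo < loading <= hi:                   # selective control of the transfer
        transfer(image, quality)
        return True
    return False
```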
- Fig. 5 illustrates an example of a computer 300 within which one or more parts of an embodiment may be employed.
- Various operations discussed above may utilize the capabilities of the computer 300.
- one or more parts of a system for controlling a handheld device may be incorporated in any element, module, application, and/or component discussed herein.
- system functional blocks can run on a single computer or may be distributed over several computers and locations (e.g. connected via internet), such as a cloud-based computing infrastructure.
- the computer 300 includes, but is not limited to, PCs, workstations, laptops, PDAs, palm devices, servers, storages, and the like.
- the computer 300 may include one or more processors 310, memory 320, and one or more I/O devices 330 that are communicatively coupled via a local interface (not shown).
- the local interface can be, for example but not limited to, one or more buses or other wired or wireless connections, as is known in the art.
- the local interface may have additional elements, such as controllers, buffers (caches), drivers, repeaters, and receivers, to enable communications. Further, the local interface may include address, control, and/or data connections to enable appropriate communications among the aforementioned components.
- the processor 310 is a hardware device for executing software that can be stored in the memory 320.
- the processor 310 can be virtually any custom made or commercially available processor, a central processing unit (CPU), a digital signal processor (DSP), or an auxiliary processor among several processors associated with the computer 300, and the processor 310 may be a semiconductor based microprocessor (in the form of a microchip) or a microprocessor.
- the memory 320 can include any one or combination of volatile memory elements (e.g., random access memory (RAM), such as dynamic random access memory (DRAM), static random access memory (SRAM), etc.) and non-volatile memory elements (e.g., ROM, erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), programmable read only memory (PROM), tape, compact disc read only memory (CD-ROM), disk, diskette, cartridge, cassette or the like, etc.).
- the memory 320 may incorporate electronic, magnetic, optical, and/or other types of storage media.
- the software in the memory 320 may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing logical functions.
- the software in the memory 320 includes a suitable operating system (O/S) 340, compiler 360, source code 350, and one or more applications 370 in accordance with exemplary embodiments.
- the application 370 comprises numerous functional components for implementing the features and operations of the exemplary embodiments.
- the application 370 of the computer 300 may represent various applications, computational units, logic, functional units, processes, operations, virtual entities, and/or modules in accordance with exemplary embodiments, but the application 370 is not meant to be a limitation.
- the operating system 340 controls the execution of other computer programs, and provides scheduling, input-output control, file and data management, memory management, and communication control and related services. It is contemplated by the inventors that the application 370 for implementing exemplary embodiments may be applicable on all commercially available operating systems.
- Application 370 may be a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed.
- when the application 370 is a source program, the program is usually translated via a compiler (such as the compiler 360), assembler, interpreter, or the like, which may or may not be included within the memory 320, so as to operate properly in connection with the O/S 340.
- the application 370 can be written in an object oriented programming language, which has classes of data and methods, or a procedural programming language, which has routines, subroutines, and/or functions, for example but not limited to, C, C++, C#, Pascal, BASIC, API calls, HTML, XHTML, XML, ASP scripts, JavaScript, FORTRAN, COBOL, Perl, Java, ADA, .NET, and the like.
- the I/O devices 330 may include input devices such as, for example but not limited to, a mouse, keyboard, scanner, microphone, camera, etc. Furthermore, the I/O devices 330 may also include output devices, for example but not limited to a printer, display, etc. Finally, the I/O devices 330 may further include devices that communicate both inputs and outputs, for instance but not limited to, a NIC or modulator/demodulator (for accessing remote devices, other files, devices, systems, or a network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router, etc. The I/O devices 330 also include components for communicating over various networks, such as the Internet or intranet.
- the software in the memory 320 may further include a basic input output system (BIOS) (omitted for simplicity).
- BIOS is a set of essential software routines that initialize and test hardware at startup, start the O/S 340, and support the transfer of data among the hardware devices.
- the BIOS is stored in some type of read-only-memory, such as ROM, PROM, EPROM, EEPROM or the like, so that the BIOS can be executed when the computer 300 is activated.
- When the computer 300 is in operation, the processor 310 is configured to execute software stored within the memory 320, to communicate data to and from the memory 320, and to generally control operations of the computer 300 pursuant to the software.
- the application 370 and the O/S 340 are read, in whole or in part, by the processor 310, perhaps buffered within the processor 310, and then executed.
- a computer readable medium may be an electronic, magnetic, optical, or other physical device or means that can contain or store a computer program for use by or in connection with a computer related system or method.
- the application 370 can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
- a "computer-readable medium" can be any means that can store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- the computer readable medium can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium.
- the proposed control method(s) of Fig. 4 may be implemented in hardware or software, or a mixture of both (for example, as firmware running on a hardware device).
- the functional steps illustrated in the process flowcharts may be performed by suitably programmed physical computing devices, such as one or more central processing units (CPUs) or graphics processing units (GPUs).
- Each process - and its individual component steps as illustrated in the flowcharts - may be performed by the same or different computing devices.
- a computer-readable storage medium stores a computer program comprising computer program code configured to cause one or more physical computing devices to carry out a control method as described above when the program is run on the one or more physical computing devices.
- Storage media may include volatile and non-volatile computer memory such as RAM, PROM, EPROM, and EEPROM, optical discs (like CD, DVD, BD), and magnetic storage media (like hard discs and tapes).
- Various storage media may be fixed within a computing device or may be transportable, such that the one or more programs stored thereon can be loaded into a processor.
- the blocks shown in the block diagrams of Fig 3 may be separate physical components, or logical subdivisions of single physical components, or may be all implemented in an integrated manner in one physical component.
- the functions of one block shown in the drawings may be divided between multiple components in an implementation, or the functions of multiple blocks shown in the drawings may be combined in single components in an implementation.
- Hardware components suitable for use in embodiments of the present invention include, but are not limited to, conventional microprocessors, application specific integrated circuits (ASICs), and field-programmable gate arrays (FPGAs).
- One or more blocks may be implemented as a combination of dedicated hardware to perform some functions and one or more programmed microprocessors and associated circuitry to perform other functions.
- a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures.
- two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
Abstract
Proposed are schemes, solutions, concepts, designs, methods and systems pertaining to aiding and/or improving image acquisition of a portion of a surface of the subject by a handheld device having a plurality of end-effectors for engagement with a portion of the surface of a subject. Specifically, deformation of said end-effectors under a load may cause some of the end-effectors to encroach on a field of view of image capture of the surface of the user. It has therefore been realized that a determined loading state of the end-effectors during capture of an image may be leveraged to determine a quality of the image. Thus, transfer of the image is selectively controlled based on the image quality. In this way, an overall data transfer rate may be reduced by omitting transmission of poor quality (i.e., highly obscured) images.
Description
- The present invention relates to the field of handheld devices (i.e., personal care devices, treatment devices, or therapeutic devices), and in particular to the field of handheld devices that capture images of a surface of a subject.
- Handheld devices, such as tooth/hair brushing devices, shaving devices and breast pumps, are used on a regular basis. Recently, cameras and other imaging sensors have been integrated into handheld devices to capture images of the surface of the user proximate to end-effectors (e.g., bristles, filaments, shaver caps, etc.) of the devices.
- Image capture enables remote image-based diagnostics, enhanced location sensing, therapy planning, and/or treatment monitoring. Increasingly, rather than using separate devices such as smartphones, dental endoscopes, or handheld intraoral scanners, the process of capturing these images is being seamlessly integrated into regular personal care devices and routines of the user. This seamless integration of imaging may avoid extra hassle to the user and enable immediate image-based feedback to user as there is no need for smart phone based image capture or for additional personal care routine workflow steps such as taking separate images after tooth brushing.
- However, end-effectors are known to deform under forces typically exerted by a user during use of the device. Thus, end-effectors are prone to encroach upon the image field of the camera, making image interpretation more difficult (and sometimes impossible). Consequently, it is often necessary to acquire many images before a useful image is obtained. This results in a high volume of data transfer, with much of the data being relatively useless to realize the above advantages of image capture.
- The invention is defined by the claims.
- According to examples in accordance with an aspect of the invention, there is provided a handheld device, comprising:
- a plurality of end-effectors for engagement with a portion of a surface of a subject;
- an image capture device configured to capture an image of the surface of the subject;
- a sensor unit configured to determine a loading state of the plurality of end-effectors at a same time as when the image was captured; and
- a control unit configured to determine a quality value of the image based on the loading state, and to selectively control the transfer of the image based on the quality value.
- Proposed are schemes, solutions, concepts, designs, methods and systems pertaining to aiding and/or improving image acquisition of a portion of a surface of the subject by a handheld device having a plurality of end-effectors for engagement with a portion of the surface of a subject. Specifically, deformation of said end-effectors under a given loading state may cause some of the end-effectors to encroach on a field of view of image capture of the surface of the user. It has therefore been realized that a determined/estimated/predicted loading state of the end-effectors during capture of an image may be leveraged to determine a quality of the image. Thus, transfer of the image is selectively controlled based on the image quality. In this way, an overall data transfer rate may be reduced by omitting transmission of poor quality (i.e., highly obscured) images.
- By way of explanation, images of a portion of a surface of the user captured by a handheld device having end-effectors for engaging with the (same) surface of the user are often obscured/encroached upon by the end-effectors. This is because, as the end-effectors engage with the surface of the user, they undergo a loading state (i.e., due to a force/a pressure), deflecting/deforming/forcing them within the field of view of the image capture device. Of course, the surface of the user may not even be viewable in images with high degrees of encroachment by the end-effectors. Also, as well as end-effectors encroaching on the view of the camera, high loading states may impact the image capture device itself, reducing an image quality.
- Such images may not be useful (or may be of little use) for further processing. Said further processing may include remote image-based diagnostics, enhanced location sensing, therapy planning, and/or treatment monitoring.
- Therefore, each image is selectively transmitted. In effect, this means that images of poor quality may not be transmitted (or may be transmitted at a high compression ratio), while those that provide a clear view of the surface of the user are transmitted (or transmitted losslessly, or at a low compression ratio). This means that a required bandwidth for transmitting captured images, storage at a receiver, and processing required at the receiver may be reduced, without excessively reducing an overall usefulness/quality of the data acquired by the image capture device.
- Moreover, the loading state may be directly measured/determined, or a likelihood/probability of the end-effectors being in a loading-state is estimated. The loading state of the end effectors provides an efficient (i.e., fast and having low computational complexity) means for determining/estimating/predicting the quality of the image. While directly assessing the image of the surface of the user using image processing techniques may provide a highly accurate assessment as to the quality of the image, such a solution is likely to be computationally intensive and slow, whilst such accuracy may be unnecessary. This may not be desirable for handheld devices where size, power and processing capabilities may be limited. Thus, leveraging a loading state as an indirect means of assessment of the quality of the image may facilitate an improved (i.e., more efficient, faster) method for selectively transmitting the image.
- Furthermore, the invention proposes selective transmission of images, rather than selective capture of images. Selectively capturing images responsive to a control signal typically requires a sophisticated image capture device, which has inherent cost, power and space considerations. In contrast, image capture devices that simply capture the image repetitively (i.e., at a regular/random time interval) are widely available, small and inexpensive. Thus, the proposed invention instead provides that the image is captured, and subsequently assessed for quality (based on determined/detected/measured loading state). This may therefore contribute to a reduced overall cost of the handheld device.
- In some embodiments, the quality value may indicate an extent to which the image of the surface of the subject is obscured by the plurality of end-effectors.
- A quality of the image relates to the extent to which the end-effectors are present in the image. Of course, this has a direct impact on the usefulness of the image, as highly obscured images may not contain the sufficient information for further processing. In other words, the plurality of end effectors being present in the image act as noise, and therefore the extent to which the end-effectors obscure the image dictates the noisiness (i.e., a quality) of the image.
- Therefore, defining the quality in this way may lead to a more effective selective transmission.
- The loading state may, in exemplary embodiments, describe a level of deformation of at least one of the plurality of end-effectors.
- Different loading states indicate a different extent/level/degree to which the plurality of end-effectors deform/deflect (and thus an extent to which the end-effectors may deform into an imaging field of view of the image capture device). A greater loading state (i.e. a higher force or pressure) applied to the end-effector results in a greater deformation/deflection of the end-effector. Of course, such an extent of deformation is indicative of a likelihood that the end-effector is present in the field of view of the image capture device.
- In particular, the loading state may further describe at least one of a direction of deformation and a mode of deformation of at least one of the plurality of end-effectors.
- Typically, there is more than one direction, and more than one type of deflection. Often it is the case that the end-effector may deform in any radial direction from a resting axis, while the end-effector may deform freely (i.e., when lightly in contact with the surface of the user), in a constrained manner (i.e., when both ends of the end-effector are pinned/fixed), in a trapped manner (i.e., the end-effector cannot move), or in a dynamic fashion originating from mechanical instability (bistable snap-through or buckling). All of these factors impact the prediction/determination of the quality of the captured image that is derived from the loading state. Thus, this information may provide a more accurate quality value.
- In further embodiments, the sensor unit may comprise a sensor configured to detect a force exerted by at least one of the end-effectors on a portion of a surface of the subject. In this case, the loading state may be based on the detected force or pressure.
- One way in which to determine the loading state of the end-effector is to directly assess/detect/measure a force exerted by the end-effectors on the surface of the subject. As many devices already have such sensors integrated into them (i.e., some electric toothbrushes), this may provide an inexpensive and convenient means for assessing the loading state.
- Alternatively, or additionally, in some embodiments the sensor unit may comprise at least one deformation sensing element coupled to one of the plurality of end-effectors or as one of the plurality of end-effectors, the deformation sensing element configured to detect a deformation value of the corresponding end-effector. Then, the loading state may be based on the detected deformation value.
- Another way in which the loading state may be determined is by directly measuring the deformation of the end-effectors. This may be achieved by devices/sensors that change properties and/or output a signal based on a degree of deformation to which they experience. Thus, coupling these sensing elements to the end-effectors (or adapting the sensing elements to be the end-effectors themselves), the loading state can be determined directly from the deformation of the end-effectors. Thus, this may provide a highly accurate means for assessing the loading state.
- Specifically, the at least one deformation sensing element may be one of a piezoelectric polymer fiber, a fiber Bragg grating, a Fabry-Perot sensor, or a pressure-sensitive material.
- Furthermore, the sensor unit may comprise an optical proximity sensor configured to detect a distance value between a portion of the handheld device adjacent to the end-effectors and the surface of the subject. As such, the loading state may be based on the detected distance value.
- Yet another method by which the loading state may be determined is based on a detected distance between the portion of the handheld device at the base of the end-effectors, and the surface of the subject. Essentially, this means that a distance occupied by the end-effector (when in contact with the surface of the subject) may be determined. Of course, as this distance decreases below the length of the end-effectors, the loading state may be considered to be increasing. One cost-effective, low-power and convenient means for assessing the distance is the use of an optical proximity sensor, which may assess the distance at a number of locations.
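The relationship between measured distance and loading state can be sketched as a relative-compression ratio. This linear proxy is an assumption for illustration; a real device might use a calibrated, non-linear mapping:

```python
def loading_from_distance(distance_mm: float,
                          effector_length_mm: float) -> float:
    """Approximate a loading state from the measured distance between the
    base of the end-effectors and the surface of the subject.

    When the distance equals (or exceeds) the free length of the
    end-effectors the loading is taken as zero; as the distance shrinks,
    the relative compression of the end-effectors is used as a proxy for
    the loading state.
    """
    if distance_mm >= effector_length_mm:
        return 0.0  # not in contact (or only just touching)
    return (effector_length_mm - distance_mm) / effector_length_mm
```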
- In some embodiments, the handheld device may further comprise a vibratory actuator adapted to vibrate at least one of the plurality of end-effectors. In this case, the sensor unit may comprise a sensor configured to detect a vibration value of at least one of the plurality of end-effectors, and the loading state may be based on the detected vibration value.
- In many cases, handheld devices comprise a component which vibrates in order to aid in the function of the device (i.e., for assisted brushing of teeth, massaging, or hair shaving). Thus, by utilizing a sensor such as an accelerometer, an extent of the vibration may be assessed. Indeed, vibrations may impact a loading state on the end-effector, and may more generally impact a quality of the image captured by the image capture device (i.e., by blurring, or moving the end-effectors within the field of view).
- In some embodiments, the control unit may be configured to transfer the image responsive to the quality value being based on a loading state that corresponds to a predetermined range of acceptable loading states.
- It may be the case that loading states in which the quality of the image is acceptable are known. This (potentially non-continuous) range may vary by device, and therefore may be determined during design, manufacture, and testing stages. Thus, when it is determined that the loading state that the end-effector was under during capture of the image is acceptable, then the image is transferred (and potentially stored and subject to further processing/analysis).
- More specifically, in some embodiments, the predetermined range of acceptable loading states may be bounded by a predetermined minimum loading state and a predetermined maximum loading state, the predetermined minimum loading state may be indicative of at least one of the plurality of end-effectors being in contact with the portion of the surface of the subject, and the predetermined maximum loading state may be indicative of a minimum acceptable amount of unobscured surface of the subject present in the image.
- Thus, images are transferred when the end-effectors are in contact with the surface of the subject but the image is not excessively obscured by the end-effectors (i.e., when the image quality is still high). This may provide an effective use of the bandwidth and power of the handheld device for transferring images, in that images useful for further processing are transferred (while other images may be discarded, sent at a high compression ratio, or stored locally).
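A minimal sketch of this transfer gate; the numeric bounds are hypothetical calibration values that, per the passage above, would be fixed per device during design, manufacture, and testing:

```python
# Hypothetical per-device calibration values (assumptions for illustration).
MIN_LOADING = 0.05  # at least light contact with the surface
MAX_LOADING = 0.60  # above this, the end-effectors obscure too much

def should_transfer(loading_state, lo=MIN_LOADING, hi=MAX_LOADING):
    """Transfer only when the loading state at capture time is acceptable."""
    return lo <= loading_state <= hi
```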
- In some embodiments, each of the plurality of end-effectors may be one of an elastically deformable end-effector, or a rigid end-effector mounted on the handheld device by an elastically deformable base.
- There are many different types of end-effectors. Two types which are of particular focus are flexible/elastically deformable end-effectors, and those which are mounted on or comprise a flexible/elastically deformable part. These types of end-effectors are likely to encroach on a field of view of the image capture device under specific loading states due to their deformable nature. Thus, handheld devices with such end-effectors may particularly benefit from embodiments of the invention.
- In some embodiments, the handheld device may be a personal care device, a treatment device, or a therapeutic device.
- According to another aspect of the invention, there is provided a method for selectively controlling the transfer of an image of a surface of a subject acquired by a handheld device comprising a plurality of end-effectors for engagement with a portion of the surface of a subject, the method comprising: capturing an image of the surface of the subject; determining a loading state of the plurality of end-effectors at a same time as when the image was captured; determining a quality value of the image based on the loading state; and selectively controlling the transfer of the image based on the quality value.
- According to yet another aspect of the invention, there is provided a computer program comprising computer program code means adapted, when said computer program is run on a computer, to implement a method of a proposed embodiment.
- These and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
- For a better understanding of the invention, and to show more clearly how it may be carried into effect, reference will now be made, by way of example only, to the accompanying drawings, in which:
-
Fig. 1 presents a demonstration of the different modes of deformation of end-effectors subject to different loading states; -
Fig. 2 is a graph of a proportion of captured images being obscured as a function of end-effector deformation; -
Fig. 3 presents a simplified block diagram of a handheld device according to an embodiment of the invention; -
Fig. 4 presents a flow diagram of a method for selectively controlling the transfer of an image of a surface of a subject acquired by a handheld device according to another embodiment; and -
Fig. 5 provides a simplified block diagram of a computer within which one or more parts of an embodiment may be employed. - The invention will be described with reference to the Figures.
- It should be understood that the Figures are merely schematic and are not drawn to scale. It should also be understood that the same reference numerals are used throughout the Figures to indicate the same or similar parts.
- It should also be understood that the detailed description and specific examples, while indicating exemplary embodiments of the apparatus, systems and methods, are intended for purposes of illustration only and are not intended to limit the scope of the invention. These and other features, aspects, and advantages of the apparatus, systems and methods of the present invention will become better understood from the following description, appended claims, and accompanying drawings. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
- The invention proposes concepts for aiding and/or improving handheld devices that capture images of a surface of a subject as end-effectors of the device engage with the surface of the subject. Specifically, a loading state of the end-effectors is determined at the same time as image capture. From this loading state, a predicted quality value of the image may be determined, which informs selective transmission of the image. Overall, this may mean that images that are likely of poor quality may not be transmitted, reducing the cost and bandwidth required to facilitate such unnecessary transmission.
- Put another way, proposed concepts aim to improve the process of image acquisition of a portion of a surface of the subject by a handheld device having a plurality of end-effectors for engagement with a portion of the surface of a subject. Specifically, deformation of said end-effectors under a loading may cause some of the end-effectors to encroach on the field of view during image capture of the surface of the subject. It has therefore been realized that a determined loading state of the end-effectors during capture of an image may be leveraged to determine a quality of the image. Thus, transfer of the image is selectively controlled based on the image quality. In this way, an overall data transfer rate may be reduced by omitting transmission of poor quality (i.e., highly obscured) images.
- By way of explanation, end-effectors are known to deform under pressure, and are prone to encroach upon an image field of an image capture device. Of course, when the end-effectors infringe upon the image field of the image capture device as an image is captured, the resulting image will contain the end-effector. Typically, the end-effectors are not the desired target of the image capture. Instead, it is desirable that the image contains (at least part of) the surface of the subject. Thus, encroachment by the end-effectors is undesirable.
- As seen in
Fig. 1, different forms/modes of deformation may be present. In the most ideal cases, the end-effector may freely deform or move (i.e. touch the surface and freely move) as shown in 10. In others, the end-effector may be restricted as shown in 20, meaning that the distal end of the end-effector from the handheld device may not always freely move. In further cases shown in 30, the end-effector may be pinned in a crevice of a treatment surface, meaning that the distal end cannot move and is therefore likely not effective. Of course, these different modes of deflection may also impact whether the end-effector encroaches on the field of view of the image capture device. - Such end-effectors may be thought of as upstanding elongate members (e.g., bristles, quills, tufts), but are not restricted as such. End-effectors are any means by which a surface (e.g., teeth, tongue, skin, hair) of the subject is engaged by a handheld device, for example for scrubbing, brushing, shaving, massaging, etc. They are typically elastically deformable/flexible themselves, or are mounted on an elastically deformable/flexible member, such that they deform/deflect under a pressure applied during typical use of the handheld device.
- Thus, end-effector deformation such as uncontrolled splaying, pinning, or static loading/deformation may obscure regions of interest to be imaged. One solution is to retrospectively exclude such images from (remote) data analysis, but this increases the total amount of data to be sent wirelessly to the cloud or other in-device interfaces. Another solution would be to process the captured image with known image analysis techniques to determine an extent to which the image contains unobstructed/unobscured regions of interest, but this is necessarily computationally expensive and time consuming.
- Hence, embodiments of the invention provide systems and computer-implemented methods (executed by a processor) to selectively transmit image data for further processing based on an image quality predicted from a determined loading state of the end-effectors during image capture. Images may be transmitted when the loading state of the end-effectors (and thus the predicted/determined image quality) is associated with a given range. This range is preferably greater than zero (i.e. excluding the case where the end-effectors are not in contact with the surface of the subject), and not exceeding a maximum loading state (i.e. high deformation of the end-effectors and low image quality).
- Put another way, embodiments propose only transmitting captured images associated with low/zero end-effector deformation and/or specific ranges of lower loading states, with forces and deformation either directly measured or evaluated indirectly.
- It has been observed that a properly designed image capture device can be positioned and configured such that a captured image of the surface of the subject is free of bristle encroachment when the end-effectors are in specific loading states (e.g. pressure force, acceleration, force, etc.). In some cases, the handheld device vibration may also be in some known specific state. It has also been realized that the image remains sufficiently undistorted provided the specific loading state (and in particular, the contact pressure of the end-effectors) remains in a specified range of values. Such values are not necessarily contiguous but are often monotone.
- Consequently, it is proposed to selectively transmit images associated with a determined loading state that lies within specific ranges of (generally lower) loading states. Indeed, this can be seen in
Fig. 2, which represents a cumulative percentage distribution of relevant parameters. In this case, only image data associated with the lower 20% of deformations (i.e., a lower 20% of loading states) is transmitted, represented by the shaded area. As a result, the overall data transfer rate is reduced by approximately a factor of 5. The skilled person would understand that other choices can be made to reduce or increase the data rate. - Thus, embodiments of the invention provide the following main elements:
- (i) A handheld device (i.e. a personal care, healthcare, therapeutic or treatment device). For example, the handheld device may be a toothbrush, a hairbrush, a breast pump, a massaging device, a shaver, etc. In any case, the handheld device has a plurality of end-effectors. For example, for a toothbrush the end-effectors are the bristles; for a shaver, the shaver cap; for a hairbrush, the tufts/bristles/upstanding members. The handheld device also has an (integrated) image capture device, which captures images of a part of the surface of the subject that the end-effectors engage/contact.
- (ii) A processor executing a method/algorithm and connected to a means to measure end-effector deformation under a loading state, either directly or indirectly (i.e. via a proxy from which the deformation may be derived). Put another way, the loading state may be directly measured/determined, or a likelihood/probability of the end-effectors being in a loading state is estimated by the processor.
- (iii) A designed range of values or a threshold of measured end-effector deformation (or threshold for predicted image quality, or determined loading state) for determining the selective transmission of the image from the device (i.e. to the cloud for further processing).
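Element (iii), together with the Fig. 2 example of transmitting only the lower 20% of deformations, can be sketched as a simple percentile calibration; the helper and the sample data are illustrative assumptions, not the patent's own implementation:

```python
def deformation_threshold(recorded_deformations, keep_fraction=0.20):
    """Return the deformation value below which `keep_fraction` of the
    recorded samples lie. Images captured at or below this value would be
    transmitted; the rest discarded or stored locally."""
    ordered = sorted(recorded_deformations)
    cut = max(1, int(len(ordered) * keep_fraction))
    return ordered[cut - 1]

# Keeping the lower 20% of loading states reduces the transmitted data by
# roughly a factor of 1 / 0.20 = 5, matching the Fig. 2 example.
threshold = deformation_threshold(list(range(100)), keep_fraction=0.20)
```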
- Moving on,
Fig. 3 presents a simplified block diagram of a handheld device 100 according to an embodiment of the invention. Specifically, there are depicted a plurality of end-effectors 110, an image capture device 120, a sensor unit 130, and a control unit 140. Optionally, there may further be provided a vibratory means 112, a communication unit 150, and an external/remote processor 160. - Firstly, it should be noted that whilst the embodiment depicted in
Fig. 3 relates to a (powered) toothbrush, the invention is equally applicable to other devices where end-effector deformation may obscure the image of an (on-board) image capture device/camera. For example, Fig. 3 could equally depict a hairbrush, a skin cleaning brush, a brushing mouthpiece, a shaver, etc. Thus, it is understood by the skilled person that the handheld device may be one of a personal care device, a hairbrush, a treatment device, or a therapeutic device. Equally, the position of the elements described is not prescriptive of their required relative locations. - The end-
effectors 110 are configured for engagement with a portion of a surface of a subject. In other words, the end-effectors 110 contact a part of the surface of the subject. - The surface could be a tooth, a gum, a tongue, a skin surface, a hair surface, etc. of the subject. The end-effectors could refer to individual elongate members upstanding from a head of the handheld device, such as brushes, tufts, or collections of brushes/tufts. Essentially, any deformable end-effector (or end-effector mounted on a deformable member) which comes into contact with a part of the surface of the subject may be one of the end-
effectors 110. In other words, each of the plurality of end-effectors 110 is one of an elastically deformable end-effector, or a rigid end-effector mounted on the handheld device by an elastically deformable base. - The subject could be the user of the
handheld device 100 putting the end-effectors 110 in contact with the surface of the subject. Alternatively, the subject may be the person or animal with which the end-effectors 110 are in contact, while the handheld device 100 is operated by another person. Put another way, the handheld device 100 could be a toothbrush operated by a veterinarian brushing the teeth of an animal, and so the subject is the animal. - The
image capture device 120 is configured to capture an image of the surface of the subject. Thus, the image capture device 120 is designed/positioned/adapted to have a field of view including the surface of the subject when the end-effectors 110 come into contact with part of the surface of the subject. In this way, the surface of the subject may be imaged for remote analysis (i.e. for diagnosis, advice, etc.). - The
sensor unit 130 is configured to determine a loading state of the plurality of end-effectors 110 at a same time as when the image was captured. In effect, the loading state may be directly measured/determined, or a likelihood/probability of the end-effectors being in a loading-state is estimated. - The loading state may describe at least one of a pressure exerted on the end-
effectors 110 by contact with the surface of the subject, a vibration of the handheld device 100, and any other force that the end-effectors 110 and/or image capture device 120 may be subject to. - Indeed, in some exemplary embodiments the loading state (indirectly) describes a level/extent/degree of deformation of at least one of the plurality of end-
effectors 110. More specifically, the loading state may describe at least one of a direction of deformation and a mode of deformation. Essentially, the loading state is indicative of the forces that the end-effectors 110 are under during contact with the surface of the subject. - Furthermore, the
sensor unit 130 may be configured to continually/continuously determine/detect/measure the loading state of the end-effectors 110 during use of thehandheld device 100. The loading state may be determined at precisely the same time as when the image is captured, or alternatively/additionally in the time before/after capture of the image. - Methods and means by which the loading state may be determined are described in more detail below.
- The
control unit 140 is configured to determine a quality value of the image based on the loading state. Put another way, the loading state of the plurality of end-effectors 110 determines an extent to which the end-effectors 110 deform, and therefore has a direct link to the end-effectors 110 encroaching on the field of view of the image capture device 120. Thus, in certain loading states, the quality of the image may be known to be low due to encroachment by the end-effectors. - This, of course, may vary between
handheld devices 100, and therefore the exact details of the link between a loading state and a quality value of the image may depend on the design of the end-effectors 110 and the positioning of the image capture device 120. Generally, a greater loading state will mean a higher deformation, and therefore a potentially lower quality of image. However, it will be appreciated that in some cases a greater loading state may be desirable, as end-effectors 110 may deflect/deform away from the field of view of the image capture device 120. The actual implementation details, and how a loading state relates to a quality of image, would be obvious to a person skilled in the art. - To clarify, the quality value essentially indicates an extent to which the image of the surface of the subject is obscured by the plurality of end-
effectors 110. Thus, a high quality value indicates that the captured image is relatively end-effector-free (i.e. the end-effectors are absent from the image), and therefore contains a clear view of the surface of the subject. Conversely, a low quality value indicates a presence of end-effectors 110 in the image to a point that the surface of the subject may not be visible, or visible to an extent that is not particularly useful for further processing. - The
control unit 140 is further configured to selectively control the transfer of the image based on the quality value. Essentially, this means that for some quality values (derived from the loading state) the images may be transferred in a first manner, and for other quality values the image may be transferred in a second manner. - In some cases, this may mean that the quality value determines whether an image is transferred or not. In other cases, the parameters/characteristics of the transfer of the image may be different. For example, low quality images may be transferred at a high compression rate (strongly lossy), while high quality images may be transferred at a low compression rate (losslessly, or weakly lossy). Of course, the quality value may discretely (i.e. in a binary manner) or continuously (i.e. many different schemes for transfer) determine the transfer of the image.
- In any case, images useful in further processing/analysis (i.e. for diagnosis, informing treatment, providing advice, etc.) may be transferred, whereas images less useful may not be transferred. Accordingly, an overall data transfer rate may be reduced. This may be thought of as a more effective transmission scheme, with a higher ratio of data utility per bit transmitted/sent.
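One way to picture such a discrete transfer policy is a tiered mapping from quality value to transfer scheme; the tier names and cut-offs below are invented for illustration, as the passage above leaves the exact scheme open:

```python
def transfer_mode(quality):
    """Map a quality value in [0, 1] to an illustrative transfer scheme."""
    if quality >= 0.8:
        return "lossless"        # high quality: full fidelity is worthwhile
    if quality >= 0.5:
        return "weakly-lossy"    # medium quality: light compression
    if quality >= 0.2:
        return "strongly-lossy"  # low quality: heavy compression
    return "discard"             # mostly obscured: not worth transmitting
```

A continuous variant could instead map the quality value directly onto a compression ratio.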
- In particular embodiments, the
control unit 140 is configured to transfer the image responsive to the quality value being based on a loading state that corresponds to a predetermined range of acceptable loading states. To paraphrase, the image is transferred responsive to the quality value lying within a range of acceptable quality values (derived from the loading state). Thus, a range (a contiguous range, or split) of quality values/loading states corresponds to images that may contain enough of the area of interest to be useful for further processing. - More specifically, the predetermined range of acceptable loading states is (at least) bounded by a predetermined minimum loading state and a predetermined maximum loading state (or minimum and maximum quality value).
- The predetermined minimum loading state may be indicative of at least one of the plurality of end-
effectors 110 being in contact with the portion of the surface of the subject. Indeed, if the end-effectors 110 are not in contact with the surface of the subject then the image captured by the image capture device 120 is highly unlikely to contain an image of the surface of the subject (the area of interest). The predetermined maximum loading state may be indicative of a minimum acceptable amount of unobscured surface of the subject present in the image. - Put another way, a loading state greater than zero but below an upper threshold may be appropriate. Images captured in such conditions reflect moments where the end-
effectors 110 are in contact (as opposed to zero pressure where the device is free) but in a way that interferes the least with the acquired image. Above the upper threshold, the end-effector deformation will be such as to prohibitively interfere with the image and make further analysis unnecessary. - The
control unit 140 may then be configured to control a communication unit 150 to transmit (or not transmit, as the case may be) the image. The image may be transmitted by any appropriate means, such as Bluetooth, Wi-Fi, Zigbee, etc. The image may be transmitted to an external processor 160 (i.e., the cloud, a smartphone of the subject, etc.), for further processing or analysis. It should be noted that embodiments are not restricted to this arrangement. Indeed, the control unit 140 may control transmission of the images locally, for storage and/or future transfer by other means. - Different means/methods for determining/measuring/detecting the loading state, both directly and indirectly, will now be discussed. Of course, these embodiments are not exhaustive, and any appropriate additional and alternative means for determining the loading state will be appreciated by the skilled person.
- In a first embodiment, the
sensor unit 130 comprises a sensor configured to detect/measure a force exerted by at least one of the end-effectors 110 on a portion of a surface of the subject. The loading state is thus based on the detected force. - Indeed, this embodiment particularly benefits from implementation in
handheld devices 100 that already have force/pressure sensors (such as an inertial measurement unit, IMU) integrated. For example, powered toothbrushes often have a pressure sensor to detect when a force applied by the user is inappropriate for adequate brushing. However, in this use the pressure/force sensor may be used to detect a loading state of the end-effectors 110. - Of course, more than one sensor for this purpose may be provided, such that an accurate assessment of the force can be obtained.
- By way of explanation, use may be made of a detected force (or pressure) as a proxy for the end-effector deformation (i.e. the loading state). To evaluate the intensity of the contact of the end-effector with the surface of the subject, signals from one or more pressure sensors may be correlated.
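A sketch of correlating several pressure signals into one loading proxy; the plain mean and spread used here are assumptions, and a real device might apply a calibrated per-sensor signature instead:

```python
def fused_loading(readings_pa):
    """Combine pressure readings (Pa) into (mean, spread).

    The mean acts as the loading proxy; a large spread hints at uneven
    contact across the head of the device (illustrative assumption)."""
    if not readings_pa:
        raise ValueError("at least one pressure reading is required")
    mean = sum(readings_pa) / len(readings_pa)
    spread = max(readings_pa) - min(readings_pa)
    return mean, spread
```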
- In a simple implementation, only a single sensor is used. In an improved implementation, a signature of a single sensor or of a multiplicity of sensors may be considered to provide a measurement of the specific loading state (i.e. the force applied by a user applying the end-
effectors 110 to a surface of the subject). In this way, a more accurate assessment of the loading state may be acquired. - In a second embodiment, the
sensor unit 130 comprises at least one deformation sensing element coupled to (i.e., integrated within or alongside) one of the plurality of end-effectors 110. The deformation sensing element is configured to detect a deformation value of the corresponding end-effector 110. Alternatively, or in addition, the end-effectors 110 could be the deformation elements themselves. - In this case, the loading state is based on the detected deformation value. In other words, the deformation value indicates the pressure that the end-
effectors 110 are under, and therefore the loading state. - For example, the deformation sensing elements could each be any one of a piezoelectric polymer fiber, a fiber Bragg grating, a Fabry-Perot sensor, or a pressure-sensitive material (e.g., carbon-black filled conductive silicone rubber). All of these elements have a property that changes with a deformation. Thus, when the deformation sensing element is mechanically coupled to the end-effector 110 (or is one of the end-effectors 110), then a change in the property may directly measure the deformation.
- In other words, measurements from (deformation/strain) sensing elements/filaments that directly assess end-effector deformation can be used to assess the suitability of captured images for transmission for further analysis.
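For one of the listed elements, a pressure-sensitive conductive rubber, the property-to-deformation conversion could look as follows; the linear model and both constants are assumptions for illustration only:

```python
R_UNLOADED_OHM = 1000.0  # assumed resistance with no deformation
SENSITIVITY = 0.002      # assumed fractional resistance change per unit deformation

def deformation_from_resistance(r_measured_ohm):
    """Map a measured resistance to a dimensionless deformation value."""
    fractional_change = (r_measured_ohm - R_UNLOADED_OHM) / R_UNLOADED_OHM
    return max(0.0, fractional_change / SENSITIVITY)
```

Equivalent conversions would exist for the other listed elements (e.g. wavelength shift for a fiber Bragg grating), each with its own calibration.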
- By use of many deformation sensing elements, it may be possible to differentiate an end-
effector field 110 in contact with an uneven surface (where some end-effectors are slightly deformed, and some are more deformed) from a free end-effector 110 use (where all end-effectors 110 will deform in a rather uniform manner as the end-effectors 110 travel over the surface). - In a third embodiment, the
sensor unit 130 comprises an optical proximity sensor configured to detect/measure a distance value between a portion of the handheld device 100 adjacent to the end-effectors 110 and the surface of the subject. In this case, the loading state is based on the detected distance value. - The distance between the surface of the subject, and the part of the
handheld device 100 on which the end-effectors 110 are mounted, indicates the space which the end-effectors 110 occupy. Therefore, the loading state of the end-effectors 110 (and the deformation thereof) may be directly derived from this distance. - Thus, in this embodiment, direct use is made of the measured end-
effector 110 deformation in order to assess the suitability of captured images for transmission for further analysis. The optical proximity sensor is most suitably an infrared LED sensor with a small number of pixels on a sensing element, but other optical proximity sensors may be appreciated. - Proposed are also concepts for considering differences in deformation of the end-
effectors 110. In other words, different end-effectors 110 in different locations will likely be differently deformed. The optical proximity sensor may do this by: - (i) A scanning approach, in which the distance from one sensor to multiple points of the surface is acquired in a given (scanning) time.
- (ii) A volumetric approach, in which multiple sensors acquire the distance in critical locations of the end-effectors.
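Either approach yields a set of per-location distances; a minimal sketch of telling uneven contact apart from uniform contact or free movement, with an assumed tolerance threshold:

```python
def classify_contact(zone_distances_mm, uniform_tol_mm=0.5):
    """A large spread across zones suggests contact with an uneven surface;
    near-uniform values suggest uniform contact or free movement.
    The tolerance is an illustrative assumption."""
    spread = max(zone_distances_mm) - min(zone_distances_mm)
    return "uneven-contact" if spread > uniform_tol_mm else "uniform-or-free"
```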
- In this manner, it is possible to differentiate an end-effector field (i.e. end-
effectors 110 in a location) in contact with an uneven surface (some variations of the end-effector 110 will be small and some will be more pronounced) from a free movement surface (where all end-effector 110 values will change in a rather uniform manner as the platen vibrates and the end-effectors 110 swing freely). - Finally, in some embodiments, the
handheld device 100 may further comprise a vibratory actuator 112 adapted to vibrate at least one of the plurality of end-effectors. Such vibratory actuators 112 are well known, and usually provided to improve the functionality of the handheld device 100. For example, an electric toothbrush may have a platen that vibrates the plurality of end-effectors 110 for more effective cleaning. - When a
vibratory actuator 112 is provided, the loading state may also be indicated by vibration of the end-effectors 110. Thus, in this case, the sensor unit 130 comprises a sensor configured to detect/measure a vibration value of at least one of the plurality of end-effectors 110. From the vibration value, the loading state may be (at least partially) derived. - The sensor in this case may be an accelerometer. For example, the sensor may be part of an IMU already found in many handheld devices.
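A minimal sketch of turning raw accelerometer samples into a single vibration value; the RMS of the mean-removed signal used here is one common amplitude measure, not one prescribed by the embodiment:

```python
import math

def vibration_rms(accel_samples):
    """RMS amplitude of a mean-removed accelerometer trace."""
    mean = sum(accel_samples) / len(accel_samples)
    return math.sqrt(
        sum((a - mean) ** 2 for a in accel_samples) / len(accel_samples)
    )
```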
-
Fig. 4 presents a flow diagram of a method 200 for selectively controlling the transfer of an image of a surface of a subject acquired by a handheld device as described above. As before, the handheld (e.g., personal care, treatment and/or therapeutic) device comprises a plurality of end-effectors for engagement with a portion of the surface of a subject and a means for capturing the image of the surface of the subject. - At
step 210, an image of the surface of the subject is captured. This may be captured by a known camera/imaging system. - At
step 220, a loading state of the plurality of end-effectors is determined. This reflects the loading state at (approximately) the same time as when the image was captured. In other words, an image is captured, and a force being exerted on the plurality of end-effectors at roughly the same time (i.e. slightly before or after) is determined directly or indirectly.
- At
step 230, a quality value of the image is determined based on the loading state. The quality value may reflect the extent to which the captured image is likely to be obscured by the end-effectors. Thus, the quality value indicates a likely worth/value of the image. - At
step 240, the transfer of the image is selectively controlled. For example, the image may be transferred losslessly, weakly lossy, strongly lossy or not at all. This is based on the quality value, with images of higher quality being prioritized for transmission. - Overall, the method provides selective transmission of a captured image which reflects the loading state of the end-effectors at the time. In this way, images that are more likely to be unobscured will be prioritized for transmission, reducing an overall data transfer rate.
-
Fig. 5 illustrates an example of acomputer 300 within which one or more parts of an embodiment may be employed. Various operations discussed above may utilize the capabilities of thecomputer 300. For example, one or more parts of a system for controlling a handheld device may be incorporated in any element, module, application, and/or component discussed herein. In this regard, it is to be understood that system functional blocks can run on a single computer or may be distributed over several computers and locations (e.g. connected via internet), such as a cloud-based computing infrastructure. - The
computer 300 includes, but is not limited to, PCs, workstations, laptops, PDAs, palm devices, servers, storages, and the like. Generally, in terms of hardware architecture, thecomputer 300 may include one ormore processors 310,memory 320, and one or more I/O devices 330 that are communicatively coupled via a local interface (not shown). The local interface can be, for example but not limited to, one or more buses or other wired or wireless connections, as is known in the art. The local interface may have additional elements, such as controllers, buffers (caches), drivers, repeaters, and receivers, to enable communications. Further, the local interface may include address, control, and/or data connections to enable appropriate communications among the aforementioned components. - The
processor 310 is a hardware device for executing software that can be stored in thememory 320. Theprocessor 310 can be virtually any custom made or commercially available processor, a central processing unit (CPU), a digital signal processor (DSP), or an auxiliary processor among several processors associated with thecomputer 300, and theprocessor 310 may be a semiconductor based microprocessor (in the form of a microchip) or a microprocessor. - The
memory 320 can include any one or combination of volatile memory elements (e.g., random access memory (RAM), such as dynamic random access memory (DRAM), static random access memory (SRAM), etc.) and non-volatile memory elements (e.g., ROM, erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), programmable read only memory (PROM), tape, compact disc read only memory (CD-ROM), disk, diskette, cartridge, cassette or the like, etc.). Moreover, the memory 320 may incorporate electronic, magnetic, optical, and/or other types of storage media. Note that the memory 320 can have a distributed architecture, where various components are situated remote from one another, but can be accessed by the processor 310. - The software in the
memory 320 may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing logical functions. The software in the memory 320 includes a suitable operating system (O/S) 340, compiler 360, source code 350, and one or more applications 370 in accordance with exemplary embodiments. As illustrated, the application 370 comprises numerous functional components for implementing the features and operations of the exemplary embodiments. The application 370 of the computer 300 may represent various applications, computational units, logic, functional units, processes, operations, virtual entities, and/or modules in accordance with exemplary embodiments, but the application 370 is not meant to be a limitation. - The
operating system 340 controls the execution of other computer programs, and provides scheduling, input-output control, file and data management, memory management, and communication control and related services. It is contemplated by the inventors that the application 370 for implementing exemplary embodiments may be applicable on all commercially available operating systems. -
Application 370 may be a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed. When the application is a source program, it is usually translated via a compiler (such as the compiler 360), assembler, interpreter, or the like, which may or may not be included within the memory 320, so as to operate properly in connection with the O/S 340. Furthermore, the application 370 can be written in an object-oriented programming language, which has classes of data and methods, or a procedural programming language, which has routines, subroutines, and/or functions, for example but not limited to, C, C++, C#, Pascal, BASIC, API calls, HTML, XHTML, XML, ASP scripts, JavaScript, FORTRAN, COBOL, Perl, Java, ADA, .NET, and the like. - The I/
O devices 330 may include input devices such as, for example but not limited to, a mouse, keyboard, scanner, microphone, camera, etc. Furthermore, the I/O devices 330 may also include output devices, for example but not limited to a printer, display, etc. Finally, the I/O devices 330 may further include devices that communicate both inputs and outputs, for instance but not limited to, a NIC or modulator/demodulator (for accessing remote devices, other files, devices, systems, or a network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router, etc. The I/O devices 330 also include components for communicating over various networks, such as the Internet or intranet. - If the
computer 300 is a PC, workstation, intelligent device or the like, the software in the memory 320 may further include a basic input output system (BIOS) (omitted for simplicity). The BIOS is a set of essential software routines that initialize and test hardware at startup, start the O/S 340, and support the transfer of data among the hardware devices. The BIOS is stored in some type of read-only memory, such as ROM, PROM, EPROM, EEPROM or the like, so that the BIOS can be executed when the computer 300 is activated. - When the
computer 300 is in operation, the processor 310 is configured to execute software stored within the memory 320, to communicate data to and from the memory 320, and to generally control operations of the computer 300 pursuant to the software. The application 370 and the O/S 340 are read, in whole or in part, by the processor 310, perhaps buffered within the processor 310, and then executed. - When the
application 370 is implemented in software, it should be noted that the application 370 can be stored on virtually any computer-readable medium for use by or in connection with any computer-related system or method. In the context of this document, a computer-readable medium may be an electronic, magnetic, optical, or other physical device or means that can contain or store a computer program for use by or in connection with a computer-related system or method. - The
application 370 can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a "computer-readable medium" can be any means that can store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer readable medium can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. - The proposed control method(s) of
Fig. 4, and the system(s) of Fig. 3, may be implemented in hardware or software, or a mixture of both (for example, as firmware running on a hardware device). To the extent that an embodiment is implemented partly or wholly in software, the functional steps illustrated in the process flowcharts may be performed by suitably programmed physical computing devices, such as one or more central processing units (CPUs) or graphics processing units (GPUs). Each process, and its individual component steps as illustrated in the flowcharts, may be performed by the same or different computing devices. According to embodiments, a computer-readable storage medium stores a computer program comprising computer program code configured to cause one or more physical computing devices to carry out a control method as described above when the program is run on the one or more physical computing devices. - Storage media may include volatile and non-volatile computer memory such as RAM, PROM, EPROM, and EEPROM, optical discs (like CD, DVD, BD), and magnetic storage media (like hard discs and tapes). Various storage media may be fixed within a computing device or may be transportable, such that the one or more programs stored thereon can be loaded into a processor.
- To the extent that an embodiment is implemented partly or wholly in hardware, the blocks shown in the block diagrams of
Fig. 3 may be separate physical components, or logical subdivisions of single physical components, or may all be implemented in an integrated manner in one physical component. The functions of one block shown in the drawings may be divided between multiple components in an implementation, or the functions of multiple blocks shown in the drawings may be combined in single components in an implementation. Hardware components suitable for use in embodiments of the present invention include, but are not limited to, conventional microprocessors, application specific integrated circuits (ASICs), and field-programmable gate arrays (FPGAs). One or more blocks may be implemented as a combination of dedicated hardware to perform some functions and one or more programmed microprocessors and associated circuitry to perform other functions. - Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. If a computer program is discussed above, it may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. If the term "adapted to" is used in the claims or description, it is noted that the term "adapted to" is intended to be equivalent to the term "configured to".
Any reference signs in the claims should not be construed as limiting the scope.
- The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
Claims (15)
- A handheld device (100), comprising:a plurality of end-effectors (110) for engagement with a portion of a surface of a subject;an image capture device (120) configured to capture an image of the surface of the subject;a sensor unit (130) configured to determine a loading state of the plurality of end-effectors at a same time as when the image was captured; anda control unit (140) configured to determine a quality value of the image based on the loading state, and to selectively control the transfer of the image based on the quality value.
- The handheld device of claim 1, wherein the quality value indicates an extent to which the image of the surface of the subject is obscured by the plurality of end-effectors (110).
- The handheld device of claim 1 or 2, wherein the loading state describes a level of deformation of at least one of the plurality of end-effectors (110).
- The handheld device of claim 3, wherein the loading state further describes at least one of a direction of deformation and a mode of deformation of at least one of the plurality of end-effectors (110).
- The handheld device of any of claims 1-4, wherein the sensor unit (130) comprises a sensor configured to detect a force or pressure exerted by at least one of the end-effectors (110) on a portion of a surface of the subject, and wherein the loading state is based on the detected force or pressure.
- The handheld device of any of claims 1-5, wherein the sensor unit (130) comprises at least one deformation sensing element coupled to one of the plurality of end-effectors (110) or as one of the plurality of end-effectors, the deformation sensing element configured to detect a deformation value of the corresponding end-effector, and wherein the loading state is based on the detected deformation value.
- The handheld device of claim 6, wherein the at least one deformation sensing element is one of a piezoelectric polymer fiber, a fiber Bragg grating, a Fabry-Perot sensor, or a pressure-sensitive material.
- The handheld device of any of claims 1-7, wherein the sensor unit (130) comprises an optical proximity sensor configured to detect a distance value between a portion of the handheld device adjacent to the end-effectors (110) and the surface of the subject, and wherein the loading state is based on the detected distance value.
- The handheld device of any of claims 1-8, wherein the handheld device further comprises vibratory actuator (112) adapted to vibrate at least one of the plurality of end-effectors (110), and wherein the sensor unit (130) comprises a sensor configured to detect a vibration value of at least one of the plurality of end-effectors, and wherein the loading state is based on the detected vibration value.
- The handheld device of any of claims 1-9, wherein the control unit (140) is configured to transfer the image responsive to the quality value being based on a loading state that corresponds to a predetermined range of acceptable loading states.
- The handheld device of claim 10, wherein the predetermined range of acceptable loading states is bounded by a predetermined minimum loading state and a predetermined maximum loading state, the predetermined minimum loading state indicative of at least one of the plurality of end-effectors (110) being in contact with the portion of the surface of the subject, and the predetermined maximum loading state indicative of a minimum acceptable amount of unobscured surface of the subject present in the image.
- The handheld device of any of claims 1-11, wherein each of the plurality of end-effectors (110) are one of an elastically deformable end-effector, or a rigid end-effector mounted on the handheld device by an elastically deformable base.
- The handheld device of any of claims 1-12, wherein the handheld device (100) is a personal care device, a treatment device, or a therapeutic device.
- A method (200) for selectively controlling the transfer of an image of a surface of a subject acquired by a handheld device (100) comprising a plurality of end-effectors (110) for engagement with a portion of the surface of a subject, the method comprising:capturing (210) an image of the surface of the subject;determining (220) a loading state of the plurality of end-effectors at a same time as when the image was captured;determining (230) a quality value of the image based on the loading state; andselectively (240) controlling the transfer of the image based on the quality value.
- A computer program comprising computer program code means adapted, when said computer program is run on a computer, to implement the method of claim 14.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP22208007.9A EP4371445A1 (en) | 2022-11-17 | 2022-11-17 | A handheld device |
| CN202380079564.6A CN120201947A (en) | 2022-11-17 | 2023-11-02 | Handheld Devices |
| PCT/EP2023/080537 WO2024104785A1 (en) | 2022-11-17 | 2023-11-02 | A handheld device |
| EP23801344.5A EP4618807A1 (en) | 2022-11-17 | 2023-11-02 | A handheld device |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP22208007.9A EP4371445A1 (en) | 2022-11-17 | 2022-11-17 | A handheld device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP4371445A1 true EP4371445A1 (en) | 2024-05-22 |
Family
ID=84358954
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP22208007.9A Withdrawn EP4371445A1 (en) | 2022-11-17 | 2022-11-17 | A handheld device |
| EP23801344.5A Pending EP4618807A1 (en) | 2022-11-17 | 2023-11-02 | A handheld device |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP23801344.5A Pending EP4618807A1 (en) | 2022-11-17 | 2023-11-02 | A handheld device |
Country Status (3)
| Country | Link |
|---|---|
| EP (2) | EP4371445A1 (en) |
| CN (1) | CN120201947A (en) |
| WO (1) | WO2024104785A1 (en) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3150082A1 (en) * | 2015-09-29 | 2017-04-05 | The Procter and Gamble Company | Method and device for quantifying a wear level of a bristle field of a brush |
| US20200260859A1 (en) * | 2015-12-15 | 2020-08-20 | Koninklijke Philips N.V. | System and method for determining and notifying a user when to replace a dental cleaning head |
-
2022
- 2022-11-17 EP EP22208007.9A patent/EP4371445A1/en not_active Withdrawn
-
2023
- 2023-11-02 EP EP23801344.5A patent/EP4618807A1/en active Pending
- 2023-11-02 CN CN202380079564.6A patent/CN120201947A/en active Pending
- 2023-11-02 WO PCT/EP2023/080537 patent/WO2024104785A1/en not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| WO2024104785A1 (en) | 2024-05-23 |
| EP4618807A1 (en) | 2025-09-24 |
| CN120201947A (en) | 2025-06-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6771659B2 (en) | Vibration control device | |
| US10070807B2 (en) | Detection and evaluation of user grip with a handheld tool | |
| JP2023017923A (en) | BIOMETRIC INFORMATION ACQUISITION METHOD, BIOMETRIC INFORMATION ACQUISITION MODEL LEARNING METHOD, DEVICE AND PROGRAM | |
| EP4371445A1 (en) | A handheld device | |
| US11497455B2 (en) | Personalized monitoring of injury rehabilitation through mobile device imaging | |
| Anisuzzaman et al. | Leveraging Comprehensive Echo Data to Power Artificial Intelligence Models for Handheld Cardiac Ultrasound | |
| KR102123598B1 (en) | Apparatus and system for skin diagnosis and method thereof | |
| CN112673396A (en) | Detecting object motion in medical imaging | |
| US20250082094A1 (en) | Personal care devices | |
| US20250275625A1 (en) | Personal care device | |
| EP4452126B1 (en) | Personal care devices; computer program and image processing methods therefore | |
| EP4498863B1 (en) | Personal care device | |
| EP4452127B1 (en) | Personal care devices, computer program and non-therapeutical method of processing captured images | |
| EP4389066A1 (en) | System for assessing the fit of an oral appliance being worn by a subject | |
| EP4452130B1 (en) | Personal care devices, computer program and non-therapeutical method of processing captured images | |
| EP4203452A1 (en) | Personal care devices | |
| EP4615331A1 (en) | Cardiotocographic scanning session duration | |
| WO2025082885A1 (en) | Replaceable component condition prediction | |
| WO2025082824A1 (en) | Personal care device characteristic prediction | |
| CN103285518A (en) | System and method for recording cone-beam tomograms in radiation therapy | |
| JP2004337332A (en) | Biological information monitoring system, biological information monitoring method, biological information measuring terminal, and biological information receiving terminal |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED |
|
| AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
| 18D | Application deemed to be withdrawn |
Effective date: 20241123 |