
WO2020195172A1 - Wearable device, information processing unit, and information processing method - Google Patents


Info

Publication number
WO2020195172A1
Authority
WO
WIPO (PCT)
Prior art keywords
posture
hand
sensor
grasping
wearable device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2020/003966
Other languages
French (fr)
Japanese (ja)
Inventor
横山 正幸 (Masayuki Yokoyama)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Publication of WO2020195172A1

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • This disclosure relates to a wearable device, an information processing unit, and an information processing method.
  • Patent Document 1 discloses a motion capture device or the like that predicts muscle motion by calculating muscle tonus prediction data based on frequency analysis data of myoelectric potential information.
  • However, the output of the electrodes may be affected by noise or the like depending on the contact condition of the electrodes. Therefore, with the prior art, it is desirable to improve the accuracy of estimating the muscular strength of the hand when using a myoelectric sensor that comes into contact with the skin surface.
  • The wearable device of one form according to the present disclosure includes a posture sensor that detects the posture of the user's hand, a myoelectric sensor that detects myoelectric potential information of the hand, and an estimation unit that, when the posture detected by the posture sensor is a grasping posture, estimates the grasping state of the user's hand based on the myoelectric potential information detected by the myoelectric sensor.
  • The information processing unit of one form according to the present disclosure includes an acquisition unit that acquires the detection results of a posture sensor that detects the posture of the user's hand and a myoelectric sensor that detects myoelectric potential information of the hand, and an estimation unit that, when the posture detected by the posture sensor is a grasping posture, estimates the grasping state of the user's hand based on the myoelectric potential information detected by the myoelectric sensor.
  • One form of information processing method includes a step in which a computer acquires the detection results of a posture sensor that detects the posture of a user's hand and a myoelectric sensor that detects myoelectric potential information of the hand, and a step of estimating, when the posture detected by the posture sensor is a grasping posture, the grasping state of the user's hand based on the myoelectric potential information detected by the myoelectric sensor.
  • FIG. 1 is a diagram for explaining an example of a wearable device according to the first embodiment.
  • the wearable device 1 is attached to the user's hand H.
  • In the present embodiment, the user's hand H is described as the portion of the human body from the wrist to the fingertips, but, for example, only the fingers or the back of the hand may be targeted.
  • In the present embodiment, the wearable device 1 is described as a glove-type interface object, but the wearable device 1 is not limited thereto.
  • the wearable device 1 may be a glove, a racing glove, a wearing belt, or the like worn on the user's hand H.
  • the wearable device 1 may be attached to both of the user's hands H.
  • The wearable device 1 covers, for example, the fingers, the back of the hand, and the like of the user's hand H.
  • the wearable device 1 can be used as an interface for detecting a user's grip strength in, for example, AR (Augmented Reality), VR (Virtual Reality), and the like.
  • Grip strength is also a very important muscle strength in some sports. For example, in baseball, it is important to firmly fix the bat during batting and use grip strength to transmit the force of the entire body to the bat. Therefore, by using the wearable device 1 as a glove interface object, it is possible to estimate the grip strength and the like in a plurality of types of sports.
  • the wearable device 1 aims to improve the estimation accuracy of the muscle strength of the hand H and the convenience of the user.
  • the wearable device 1 includes a mounting portion 2, a posture sensor 3, a myoelectric sensor 4, and an information processing unit 10.
  • The mounting portion 2 is a glove by which the wearable device 1 is worn on the user's hand H.
  • the mounting portion 2 is formed of, for example, cloth, synthetic fibers, leather, or the like.
  • the mounting portion 2 covers the user's hand H and deforms according to the operation of the hand H.
  • the mounting unit 2 is provided with a posture sensor 3, a myoelectric sensor 4, and an information processing unit 10. When the attachment portion 2 is attached to the user's hand H, for example, the posture sensor 3 and the myoelectric sensor 4 are brought into direct or indirect contact with the skin of the hand H.
  • When worn on the user's hand H, the mounting portion 2 can take a non-grasping posture P1, which indicates a posture in which the user does not close the hand H, and a grasping posture P2, which indicates a posture in which the user closes the hand H.
  • the non-grasp posture P1 includes, for example, a posture in which the hand H is open.
  • the non-grasping posture P1 includes a posture when the object to be detected is not grasped by the force of the hand H, and a part of the fingers of the hand H may be bent.
  • the grasping posture P2 includes, for example, a posture when the hand H holds an object to be detected, a posture when the force of the hand H is applied, and the like.
  • the posture of the hand H in the present embodiment includes, for example, the posture, shape, and grasping state of the hand H.
  • the posture sensor 3 detects the posture of the user's hand H.
  • the posture sensor 3 is configured to detect the bending of a part of the user's hand H, such as the user's finger.
  • The posture sensor 3 may be, for example, a bending sensor, a camera, an inertial measurement unit, or a combination of these.
  • In the present embodiment, the posture sensor 3 is a bending sensor.
  • the posture sensor 3 is arranged along the upper surface of each of the fingers of the hand H when the mounting portion 2 of the wearable device 1 is mounted on the hand H.
  • the upper surface of the finger is, for example, a surface that faces upward when the back of the hand H is turned upward.
  • In the present embodiment, a case will be described in which the posture sensor 3 is arranged on the back side of the hand H so as to straddle the proximal phalanx and the metacarpal bone of each finger of the hand H, but the present invention is not limited to this.
  • the posture sensor 3 may be arranged on the entire finger, the inner surface of the finger, or the like so that the bending of the finger of the hand H can be detected.
  • the posture sensor 3 is a sensor whose resistance value increases when it is bent.
  • the posture sensor 3 detects at least a part of the bending of the user's hand.
  • the posture sensor 3 includes, for example, a resistor that is sensitive to bending.
  • The resistance value of the posture sensor 3 changes based on the amount of bending of the resistor.
  • the posture sensor 3 detects the posture of the hand H as a resistance value.
  • the posture sensor 3 arranged on the index finger of the hand H detects the bending of the index finger, and as the bending increases, the resistance value also increases.
  • One end of each of the plurality of posture sensors 3 is connected to the power supply of the wearable device 1, and the other end is grounded.
  • the posture sensor 3 is electrically connected to the information processing unit 10 and outputs the detection result to the information processing unit 10.
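  • The resistance reading described above can be sketched as follows. Assuming, purely for illustration, that each bend sensor forms a voltage divider with a fixed resistor (the wiring, names, and values below are assumptions, not taken from this disclosure), its resistance can be recovered from an ADC reading:

```python
def bend_resistance(v_out: float, v_supply: float, r_fixed: float) -> float:
    """Recover a bend sensor's resistance from an ADC voltage reading.

    Assumed (hypothetical) wiring: the bend sensor sits between the supply
    rail and the ADC node, and a fixed resistor r_fixed ties that node to
    ground. Since the sensor's resistance rises as the finger bends, a
    deeper bend shows up as a lower voltage at the ADC node.
    """
    if not 0.0 < v_out < v_supply:
        raise ValueError("reading must lie strictly between 0 V and the supply")
    # Voltage divider: v_out = v_supply * r_fixed / (r_sensor + r_fixed)
    return r_fixed * (v_supply - v_out) / v_out
```

With a 3.3 V supply and a 10 kΩ fixed resistor, for instance, a mid-rail reading of 1.65 V corresponds to a 10 kΩ sensor, and lower readings to larger resistances (deeper bends).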
  • When the posture sensor 3 is a camera, it is provided so that it can image the mounting portion 2.
  • For example, the posture sensor 3 is provided outside the mounting portion 2 and estimates the posture of the user's hand H from a captured image of the mounting portion 2.
  • the posture sensor 3 detects the hand H and its posture from an image by using well-known techniques such as background subtraction method and pattern matching.
  • When the posture sensor 3 is an inertial measurement unit, it is provided on the mounting portion 2 so as to be arranged on a finger, a fingertip, or the like of the hand H.
  • the posture sensor 3 detects the three-dimensional angular velocity and acceleration according to the movement of the finger, the fingertip, and the like, and detects the posture of the hand H based on the detection result.
  • the myoelectric sensor 4 detects the myoelectric potential information of the user's hand H.
  • The myoelectric potential information is, for example, information necessary for estimating the grasping state of the hand H.
  • the myoelectric sensor 4 has, for example, a plurality of electrodes 41 that come into contact with the skin of the user's hand H.
  • the myoelectric sensor 4 detects the myoelectric potential information generated when the muscle of the hand H is operated by the electrode 41.
  • Among the plurality of electrodes 41, one is connected to the power supply of the wearable device 1 and another is grounded.
  • the myoelectric sensor 4 is electrically connected to the information processing unit 10, and outputs the myoelectric potential information detected by the plurality of electrodes 41 to the information processing unit 10.
  • The myoelectric potential information includes, for example, information indicating the potential of each of the plurality of electrodes 41, which is required for estimating the grip strength of the hand H, the shape of the fingers, and the like, and information indicating the part of the hand H at which each electrode 41 is arranged. In this embodiment, the case where the electrodes 41 are dry-type electrodes will be described, but wet-type electrodes may be used. The advantages of dry-type electrodes 41 are that they can be easily attached to the user's skin and that running costs can be suppressed compared with wet-type disposable electrodes.
  • the information processing unit 10 is, for example, a dedicated or general-purpose computer. In this embodiment, the information processing unit 10 is formed in a chip shape.
  • the information processing unit 10 is a computer for processing data from another device.
  • the information processing unit 10 is an electronic device that uses the posture sensor 3 and the myoelectric sensor 4 in combination and processes the information detected by the posture sensor 3 and the myoelectric sensor 4.
  • the information processing unit 10 has, for example, a function of determining the posture of the user's hand H.
  • the information processing unit 10 has, for example, a function of estimating the grasping state of the user's hand H.
  • the grasping state includes, for example, a state such as a posture of fingers and a grip strength.
  • The information processing unit 10 may obtain the detection results of the posture sensor 3 for the non-grasping posture P1 and the grasping posture P2 by calibration when the user starts using the device, and set the detection result of the non-grasping posture P1 as the initial value.
  • FIG. 2 is a diagram showing an output example of the myoelectric sensor 4 of the wearable device 1 according to the first embodiment.
  • the vertical axis represents the amplitude and the horizontal axis represents the time.
  • the graphs G11 and G12 shown in FIG. 2 show an output example of the myoelectric sensor 4 in a state where the user's hand H is in the non-grasping posture P1, that is, the hand H is open.
  • In the non-grasping posture P1, for example, a change in the posture of the hand H causes a recess in the back of the hand H, or the electrodes 41 come partially or completely out of contact with the skin.
  • In this case, the amplitude of the signal of the myoelectric sensor 4 becomes unstable due to the change in contact impedance.
  • As a result, the output of the myoelectric sensor 4 either shows no change from the baseline, as in graph G11, or becomes stuck at a boundary value of the A/D converter (ADC), as in graph G12.
  • The graphs G21 and G22 shown in FIG. 2 show an output example of the myoelectric sensor 4 in a state where the user's hand H is in the grasping posture P2, that is, the hand H is closed.
  • the grasping posture P2 has a first state P21 and a second state P22.
  • the first state P21 indicates a state in which the user lightly grasps the hand H without exerting any force.
  • The second state P22 indicates a state in which the user grips the hand H firmly with considerable force.
  • In the first state P21, the posture of the hand H is fixed, and the contact area between the electrodes 41 and the hand H is constant.
  • the myoelectric sensor 4 outputs myoelectric potential information indicating a myoelectric signal having a small amplitude.
  • In the second state P22, the myoelectric sensor 4 outputs myoelectric potential information indicating a myoelectric signal whose average amplitude increases in proportion to the grip strength of the hand H.
  • As shown in FIG. 2, the characteristics of the myoelectric signal of the myoelectric sensor 4 change depending on whether the user's hand H is in the non-grasping posture P1 or the grasping posture P2. That is, it can be seen that the myoelectric sensor 4 outputs a myoelectric signal effective for estimating the grasping state of the hand H when the hand H changes from the non-grasping posture P1 to the grasping posture P2. Further, the myoelectric signal in the first state P21 of the grasping posture P2 is the myoelectric potential information at the beginning of grasping, and this myoelectric potential information can be used as an initial value.
  • FIG. 3 is a diagram showing an output example of the posture sensor 3 of the wearable device 1 according to the first embodiment.
  • the vertical axis shows the resistance value of the posture sensor 3
  • the horizontal axis shows the amount of change from the non-grasping posture P1 of the hand H to the gripping posture P2.
  • The graph shows that the posture sensor 3 outputs the resistance value Rs when the hand H is open in the non-grasping posture P1, and that the resistance value increases to around Rm as the amount of deformation of the posture sensor 3 increases.
  • the graph shows that the resistance value Rm output by the posture sensor 3 does not change in the first state P21 and the second state P22 of the grasping posture P2.
  • In other words, the resistance value of the posture sensor 3 changes as the user's hand H moves from the non-grasping posture P1 to the grasping posture P2, but it can be seen that the sensor cannot capture changes in the grip strength of the hand H. Further, by setting a reference resistance value Rth for the posture sensor 3 based on the first state P21 of the grasping posture P2, the wearable device 1 can discriminate, from the detection result of the posture sensor 3, both the change of the hand H from the non-grasping posture P1 to the grasping posture P2 and the change from the grasping posture P2 to the non-grasping posture P1.
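  • The discrimination using the reference resistance value Rth can be sketched as follows (an illustrative Python sketch; the function name and the rule that readings at or above Rth count as the grasping posture are assumptions made here, not part of this disclosure):

```python
def classify_posture(resistance: float, r_th: float) -> str:
    """Discriminate the hand posture from one bend sensor's resistance.

    r_th plays the role of the reference resistance value Rth, assumed to
    be calibrated from the lightly-grasping first state P21.
    """
    # P2: grasping posture, P1: non-grasping posture
    return "P2" if resistance >= r_th else "P1"
```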
  • The wearable device 1 determines the posture of the hand H based on the detection result of the posture sensor 3. For example, assume that the user's hand H is in the non-grasping posture P1 with the hand H open. In this case, since the resistance value detected by the posture sensor 3 does not exceed the reference resistance value Rth, the wearable device 1 determines that the posture of the user's hand H is the non-grasping posture P1 and does not perform detection with the myoelectric sensor 4. After that, assume that the user's hand H changes from the non-grasping posture P1 to the grasping posture P2 by, for example, grasping a tool.
  • the wearable device 1 determines that the posture of the user's hand H is the grasping posture P2. Then, the wearable device 1 starts the detection of the myoelectric sensor 4, and estimates the grasping state of the hand H based on the detection result of the myoelectric sensor 4. For example, the wearable device 1 estimates the state of fingers, grip strength, and the like based on the frequency component of the myoelectric potential information for each electrode 41 of the myoelectric sensor 4.
  • The wearable device 1 estimates the grasping state, such as the movement of the hand H and the grip strength, by utilizing the characteristic that the myoelectric signals of the myoelectric potential information of the myoelectric sensor 4 show different frequencies depending on the movement.
  • the wearable device 1 can extract an integrated value average potential, a frequency spectrum, and the like as parameters of a myoelectric signal.
  • the integrated value average potential is a rectified and averaged myoelectric signal, and it is known that the greater the muscle strength, the more active the muscle activity and the larger the value. Therefore, the wearable device 1 detects the grip strength of the hand H by, for example, a detection method using the integrated value average potential, which is a well-known technique. Further, the wearable device 1 may estimate the fine operation of the hand H by using, for example, a gesture detection method using a frequency spectrum, which is a well-known technique.
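  • A minimal sketch of the integrated value average potential, the rectified and averaged myoelectric signal described above, might look like this (the function name and plain-list input are illustrative assumptions):

```python
def integrated_average_potential(emg_window):
    """Rectify and average one window of myoelectric samples.

    The greater the muscle strength, the more active the muscle and the
    larger this value, so it can serve as a grip-strength proxy.
    """
    rectified = [abs(v) for v in emg_window]  # full-wave rectification
    return sum(rectified) / len(rectified)    # mean of the rectified signal
```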
  • After that, when the user's hand H changes from the grasping posture P2 to the non-grasping posture P1, the wearable device 1 determines that the posture of the user's hand H is the non-grasping posture P1 and ends detection by the myoelectric sensor 4.
  • As described above, when the posture of the hand H detected by the posture sensor 3 is the grasping posture P2, the wearable device 1 estimates the grasping state of the hand H based on the myoelectric potential information detected by the myoelectric sensor 4. For example, when the user changes the hand H to the grasping posture P2, the user tends to maintain the grasping posture P2. As a result, the wearable device 1 can estimate the grasping state of the hand H using the myoelectric sensor 4 in a state where the grasping posture P2 of the hand H is unlikely to change. Consequently, since the wearable device 1 can suppress the influence of noise and the like on the myoelectric sensor 4, it can improve the accuracy of estimating the muscle strength of the hand H using the myoelectric sensor 4.
  • In the wearable device 1, the myoelectric sensor 4 detects the myoelectric potential information with dry-type electrodes 41.
  • With the dry electrodes 41 of the myoelectric sensor 4, the characteristics of the detected myoelectric signal change depending on the contact condition, so the value may vary from measurement to measurement even with the same muscle strength. Since the wearable device 1 estimates the grasping state of the hand H using the myoelectric sensor 4 in a state where the grasping posture P2 of the hand H does not change, the measurement results of the dry electrodes 41 can be stabilized. As a result, the myoelectric sensor 4 of the wearable device 1 can be easily attached to the hand H, improving convenience.
  • the wearable device 1 uses a bending sensor for detecting the movement of the finger of the hand H as the posture sensor 3. As a result, the wearable device 1 can determine the posture of the hand H based on the degree of bending of the fingers of the hand H. As a result, the wearable device 1 can improve the accuracy of discriminating the grasping posture P2 of the hand H, so that the estimation accuracy of the muscle strength of the hand H using the myoelectric sensor 4 can be further improved. Further, although the wearable device 1 can determine the posture from the image obtained by capturing the hand H, the user needs to position the hand H inside the imaging range, which may reduce convenience. On the other hand, the wearable device 1 does not have to limit the operating range of the hand H by using the bending sensor for the posture sensor 3, so that the convenience can be improved.
  • In the wearable device 1, when the mounting portion 2 is worn, the posture sensor 3 is positioned by the mounting portion 2 at a position on the hand H where it can detect the grasping posture P2. As a result, the posture sensor 3 can be arranged at the measurement position on the hand H simply by mounting the mounting portion 2 on the hand H. Consequently, the wearable device 1 can improve the accuracy with which the posture sensor 3 detects the posture of the hand H.
  • In the present embodiment, the mounting portion 2 is a glove, and the posture sensor 3 is positioned on the back of the hand H by the mounting portion 2.
  • the wearable device 1 can stabilize the contact between the posture sensor 3 and the hand H by detecting the posture on the back of the hand H by the posture sensor 3.
  • the wearable device 1 can further improve the accuracy of detecting the posture of the hand H by the posture sensor 3.
  • Further, when the posture of the hand H detected by the posture sensor 3 is the grasping posture P2, the information processing unit 10 estimates the grasping state of the hand H based on the myoelectric potential information detected by the myoelectric sensor 4. As a result, the information processing unit 10 can estimate the grasping state of the hand H using the myoelectric sensor 4 while the grasping posture P2 of the hand H does not change. Consequently, since the information processing unit 10 can suppress the influence of noise and the like on the myoelectric sensor 4, it can improve the accuracy of estimating the muscle strength of the hand H using the myoelectric sensor 4.
  • FIG. 4 is a diagram showing a configuration example of the wearable device 1 according to the first embodiment.
  • the wearable device 1 includes a posture sensor 3, a myoelectric sensor 4, and an information processing unit 10.
  • the posture sensor 3, the myoelectric sensor 4, and the information processing unit 10 are operated by electric power from the power source 1B of the wearable device 1.
  • the information processing unit 10 includes an acquisition unit 11, a storage unit 12, an estimation unit 13, and a communication unit 14.
  • Each processing unit of the information processing unit 10 is realized, for example, by a CPU (Central Processing Unit) or MCU (Micro Control Unit) executing a program stored in the information processing unit 10, using RAM (Random Access Memory) or the like as a work area.
  • each processing unit may be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array).
  • the acquisition unit 11 acquires the detection results output by the posture sensor 3 and the myoelectric sensor 4.
  • the acquisition unit 11 stores the acquired information in the storage unit 12.
  • the acquisition unit 11 can acquire the detection result of the posture sensor 3 and the detection result of the myoelectric sensor 4 at different timings.
  • the acquisition unit 11 acquires the information requested by the estimation unit 13 and outputs the acquired information to the estimation unit 13.
  • the acquisition unit 11 is electrically connected to the plurality of electrodes 41 of the myoelectric sensor 4, obtains the myoelectric signal from the electrodes 41, and obtains the active part, the magnitude of the amplitude, and the amount of change.
  • The acquisition unit 11 outputs the obtained information as myoelectric potential information to the estimation unit 13. In other words, the acquisition unit 11 outputs the myoelectric potential information indicating the myoelectric signal of the myoelectric sensor 4 to the estimation unit 13.
  • the storage unit 12 stores various data.
  • the storage unit 12 can store data indicating the detection results of the posture sensor 3 and the myoelectric sensor 4.
  • the storage unit 12 stores, for example, posture data 12A and the like.
  • the posture data 12A includes data indicating the posture of the hand H to be detected.
  • By setting the posture of the hand H to be detected in the posture data 12A, the posture of the hand H can be detected according to, for example, a game.
  • The posture data 12A may include data indicating postures of the hand H corresponding to activities of daily living, which are indispensable for conducting daily life.
  • the posture data 12A may include data actually measured when the set user's hand H is the grasping posture P2.
  • the posture data 12A may include data used for calibration.
  • the measurement results for each of the plurality of electrodes 41 are linked to the grip strength and the state of the hand H.
  • the posture data 12A may include data indicating a measurement result corresponding to a general hand H size.
  • the posture data 12A may include data such as a calculation program for obtaining grip strength based on the position of the electrode 41 and the amount of change in myoelectric potential information, a conversion table, and the like.
  • the estimation unit 13 estimates the grasping state of the user's hand H based on the myoelectric potential information detected by the myoelectric sensor 4.
  • the estimation unit 13 estimates the grasping state of the hand H when the grasping posture P2 continues for the determination time. For the determination time, for example, a time for avoiding an erroneous determination can be set.
  • The estimation unit 13 sets the myoelectric potential information of the myoelectric sensor 4 at the time when the start of the grasping posture P2 is detected as an initial value, and estimates the grasping state based on the initial value and the myoelectric potential information detected after the start of the grasping posture P2.
  • For example, the estimation unit 13 estimates the grasping state based on the amount of change between the initial-value myoelectric potential information and the detected myoelectric potential information.
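  • The baseline-relative estimation might be sketched as follows (illustrative only; this disclosure does not specify a formula, so a simple relative change from the initial value is assumed here):

```python
def relative_grip_change(initial_iap: float, current_iap: float) -> float:
    """Change in the integrated average potential relative to the initial
    value captured when the grasping posture P2 was first detected.

    Both arguments are assumed to be rectified-and-averaged EMG levels.
    """
    if initial_iap <= 0.0:
        raise ValueError("initial value must be positive")
    return (current_iap - initial_iap) / initial_iap
```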
  • When the estimation unit 13 detects the end of the grasping posture P2 while estimating the grasping state of the hand H, it ends the estimation of the grasping state.
  • the end of the grasping posture P2 includes, for example, a change from the grasping posture P2 to the non-grasping posture P1.
  • The estimation unit 13 estimates the grasping state of the user's hand H based on the myoelectric potential information detected by the myoelectric sensor 4 when the posture of the hand H detected by the posture sensor 3 is the grasping posture indicated by the posture data 12A stored in the storage unit 12.
  • the estimation unit 13 outputs the estimated estimation result to the communication unit 14.
  • the communication unit 14 communicates wirelessly.
  • the communication unit 14 supports a short-range wireless communication system.
  • the communication unit 14 has a function of wirelessly communicating information with an external communication device or the like.
  • the communication unit 14 transmits the information from the estimation unit 13 to an external electronic device or the like.
  • External electronic devices include, for example, game consoles, televisions, smartphones, smart speakers, computers and the like.
  • the communication unit 14 outputs information received from an external electronic device or the like to the estimation unit 13.
  • the functional configuration example of the wearable device 1 according to the present embodiment has been described above.
  • the above configuration described with reference to FIG. 4 is merely an example, and the functional configuration of the wearable device 1 according to the present embodiment is not limited to such an example.
  • the functional configuration of the wearable device 1 according to the present embodiment can be flexibly modified according to specifications and operations.
  • the wearable device 1 may be configured such that the information processing unit 10 is provided outside the mounting unit 2 and the communication unit for transmitting the detection result of the sensor to the external information processing unit 10 is provided in the mounting unit 2.
  • FIG. 5 is a flowchart showing an example of a processing procedure executed by the wearable device 1 according to the first embodiment.
  • the processing procedure shown in FIG. 5 is realized by the information processing unit 10 of the wearable device 1 executing a program.
  • the processing procedure shown in FIG. 5 is repeatedly executed by the information processing unit 10.
  • the information processing unit 10 acquires the detection result of the posture sensor 3 (step S101).
  • the information processing unit 10 functions as the acquisition unit 11 by executing step S101.
  • the processing proceeds to step S102.
  • the information processing unit 10 determines the posture of the user's hand H based on the detection result of the posture sensor 3 (step S102). For example, when the posture of the hand H based on the detection result of the posture sensor 3 and the grasping posture indicated by the posture data 12A match or are similar, the information processing unit 10 determines the posture of the hand H as the grasping posture P2. The determination result is stored in the storage unit 12. When the information processing unit 10 determines that the posture of the hand H is not the grasping posture P2, the information processing unit 10 stores the determination result in the storage unit 12. Then, when the determination of the posture of the hand H is completed, the information processing unit 10 proceeds to the process in step S103.
  • the information processing unit 10 determines whether or not the posture of the hand H is the grasping posture P2 based on the determination result in step S102 (step S103). When the information processing unit 10 determines that the posture of the hand H is the grasping posture P2 (Yes in step S103), the information processing unit 10 proceeds to the process in step S104.
  • the information processing unit 10 determines whether or not the detection by the myoelectric sensor 4 has started (step S104). When the information processing unit 10 determines that the detection by the myoelectric sensor 4 has started (Yes in step S104), the information processing unit 10 proceeds to step S107, which will be described later. If the information processing unit 10 determines that the detection by the myoelectric sensor 4 has not started (No in step S104), the information processing unit 10 proceeds to step S105.
  • the information processing unit 10 determines whether or not the grasping posture P2 has continued for the determination time (step S105). For example, when the elapsed time from the time when the grasping posture P2 is detected reaches the determination time, the information processing unit 10 determines that the grasping posture P2 has continued for the determination time. Then, when the information processing unit 10 determines that the grasping posture P2 has continued for the determination time (Yes in step S105), the information processing unit 10 proceeds to the process in step S106.
  • the information processing unit 10 starts detection by the myoelectric sensor 4 (step S106). For example, the information processing unit 10 supplies electric power from the power source 1B to the myoelectric sensor 4. As a result, the myoelectric sensor 4 starts detecting myoelectric potential information by the plurality of electrodes 41. When the information processing unit 10 starts the detection by the myoelectric sensor 4, the process proceeds to step S107.
  • the information processing unit 10 acquires the detection result of the myoelectric sensor 4 (step S107).
  • the information processing unit 10 functions as the acquisition unit 11 by executing step S107.
  • the information processing unit 10 estimates the grasping state of the hand H based on the myoelectric potential information detected by the myoelectric sensor 4 (step S108).
  • the information processing unit 10 uses the myoelectric potential information of the myoelectric sensor 4 at the time the start of the grasping posture P2 is detected as an initial value, and estimates the grasping state of the hand H based on the amount of change between the initial value and the subsequently detected myoelectric potential information, and on the posture data 12A.
  • when the information processing unit 10 has stored the estimation result in the storage unit 12, the information processing unit 10 proceeds to the process in step S109.
  • the information processing unit 10 executes the output processing of the estimation result (step S109). For example, the information processing unit 10 transmits the estimation result to an external electronic device or the like via the communication unit 14 by executing the output process. Then, when the output process is completed, the information processing unit 10 proceeds to step S111, which will be described later.
  • when the information processing unit 10 determines that the grasping posture P2 has not continued for the determination time (No in step S105), the information processing unit 10 proceeds to the process in step S110.
  • the information processing unit 10 executes an output process of an estimation result indicating that the grasping state of the hand H has not been estimated (step S110). For example, the information processing unit 10 transmits the estimation result to an external electronic device or the like via the communication unit 14 by executing the output process. Then, when the output process is completed, the information processing unit 10 proceeds to step S111, which will be described later.
  • the information processing unit 10 determines whether or not to end the processing (step S111). For example, the information processing unit 10 determines to end the processing when it detects, based on the detection result of the myoelectric sensor 4, that the attachment unit 2 has been removed from the hand H, or when it receives an end request via the communication unit 14. Then, when the information processing unit 10 determines that the processing is not to end (No in step S111), the information processing unit 10 returns the process to step S101 already described, and repeats the series of processes from step S101. When the information processing unit 10 determines that the processing is to end (Yes in step S111), the information processing unit 10 terminates the processing procedure shown in FIG.
  • the information processing unit 10 determines whether or not the detection by the myoelectric sensor 4 has started (step S112). When the information processing unit 10 determines that the detection by the myoelectric sensor 4 has not started (No in step S112), the information processing unit 10 proceeds to step S110 already described. When the information processing unit 10 determines that the detection by the myoelectric sensor 4 has started (Yes in step S112), the information processing unit 10 proceeds to step S113.
  • the information processing unit 10 ends the detection by the myoelectric sensor 4 (step S113). For example, the information processing unit 10 stops the power supply from the power supply 1B to the myoelectric sensor 4. As a result, the myoelectric sensor 4 ends the detection of myoelectric potential information by the plurality of electrodes 41. When the information processing unit 10 finishes the detection by the myoelectric sensor 4, the information processing unit 10 proceeds to step S111 already described.
  • the information processing unit 10 functions as the estimation unit 13 by executing the processes of steps S102 to S106 and steps S108 to S111.
  • step S110 may be deleted. That is, if the information processing unit 10 has not estimated the grasped state of the hand H, the processing procedure may be such that the estimation result is not output. Further, in the processing procedure, the determination process (step S105) of whether or not the grasping posture P2 has continued for the determination time may be deleted.
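The decision logic of steps S101 to S113 can be illustrated with a small replay function that runs pre-recorded samples through the same branches (posture determination, determination-time check, initial-value capture, change-based estimation). The sample format, the 0.5-second determination time, and the scalar myoelectric values below are hypothetical simplifications introduced for this sketch, not part of the described device.

```python
DETERMINATION_TIME = 0.5  # seconds the grasping posture P2 must continue (assumed value)

def process_samples(samples, determination_time=DETERMINATION_TIME):
    """Replay (time, is_grasping, emg) samples through the steps S101-S110 logic.

    The myoelectric value is only used once the grasping posture P2 has
    continued for the determination time; until then (and whenever the hand
    returns to the non-grasping posture P1) a 'not estimated' result is output.
    """
    results = []
    grasp_start = None  # time at which the grasping posture P2 was first seen
    baseline = None     # initial-value EMG captured when detection starts (step S106)
    for t, is_grasping, emg in samples:
        if not is_grasping:                       # posture is P1: steps S112-S113, S110
            grasp_start, baseline = None, None
            results.append(("not_estimated", None))
            continue
        if grasp_start is None:
            grasp_start = t
        if t - grasp_start < determination_time:  # step S105: posture not held long enough
            results.append(("not_estimated", None))
            continue
        if baseline is None:                      # step S106: start detection, keep initial value
            baseline = emg
        # step S108: estimate from the change relative to the initial value
        results.append(("estimated", emg - baseline))
    return results
```

Because the myoelectric value is consulted only inside the grasping branch, the sketch mirrors how the device leaves the myoelectric sensor 4 in standby during the non-grasping posture P1.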
  • in the wearable device 1, when the posture sensor 3 detects that the posture of the hand H has changed from the non-grasping posture P1 to the grasping posture P2, the myoelectric sensor 4 starts detecting the myoelectric potential information. As a result, the wearable device 1 does not need to drive the myoelectric sensor 4 while the posture of the hand H is the non-grasping posture P1. Consequently, the wearable device 1 can reduce power consumption by keeping the myoelectric sensor 4 in a standby state while the posture of the hand H is the non-grasping posture P1.
  • when the posture of the hand H is not the grasping posture P2, the estimation unit 13 does not estimate the grasping state of the hand H.
  • as a result, the wearable device 1 can avoid estimating the grasping state of the hand H in the non-grasping posture P1, in which the myoelectric potential information of the myoelectric sensor 4 may include noise or the like.
  • consequently, the wearable device 1 can improve the estimation accuracy of the grasping state of the hand H based on the detection result of the myoelectric sensor 4.
  • the wearable device 1 estimates the grasping state of the hand H by the estimation unit 13 when the grasping posture P2 of the hand H continues for the determination time. As a result, the wearable device 1 does not estimate the grasping state when the hand H only temporarily changes to the grasping posture P2. Consequently, the wearable device 1 can improve the accuracy of detecting that the user has changed the hand H to the grasping posture P2, and can suppress estimating the grasping state of the hand H even though the user's hand H is not in the grasping posture P2.
  • the wearable device 1 uses the myoelectric potential information of the myoelectric sensor 4 at the time the start of the grasping posture P2 of the hand H is detected as an initial value, and estimates the grasping state of the hand H based on the initial value and the myoelectric potential information detected after the start of the grasping posture P2. As a result, the wearable device 1 can estimate the grasping state based on the amount of change from a reference given by the myoelectric potential information at the start of the grasping posture P2 of the hand H. Consequently, the wearable device 1 can estimate the grasping state in consideration of the contact state between the myoelectric sensor 4 and the hand H, so that the estimation accuracy of the grasping state can be improved.
  • when the wearable device 1 detects the end of the grasping posture P2 of the hand H while estimating the grasping state of the hand H, the wearable device 1 ends the estimation of the grasping state. As a result, the wearable device 1 can avoid estimating the grasping state of the hand H using the myoelectric sensor 4 after the grasping posture P2 of the hand H has changed to another posture. Consequently, the wearable device 1 can suppress the influence of displacement between the myoelectric sensor 4 and the hand H, and can improve the accuracy of estimating the muscle strength of the hand H using the myoelectric sensor 4.
  • the wearable device 1 estimates the grasping state of the user's hand H based on the myoelectric potential information detected by the myoelectric sensor 4 when the posture of the hand H detected by the posture sensor 3 is the grasping posture indicated by the posture data 12A stored in the storage unit 12. As a result, the wearable device 1 can estimate the grasping state of the hand H when the hand H takes the grasping posture P2 indicated by the posture data 12A. Consequently, by setting various different grasping postures P2, the wearable device 1 can estimate the grasping state of the hand H according to the set grasping posture P2.
  • the wearable device 1 described above estimates the grasping state of the hand H based on the myoelectric potential information detected by the myoelectric sensor 4 when the posture of the hand H detected by the posture sensor 3 is the grasping posture, but the present disclosure is not limited to this.
  • for example, the wearable device 1 may additionally use the detection result of the posture sensor 3 in estimating the grasping state of the hand H.
  • FIG. 6 is a diagram for explaining an example of the wearable device 1 according to the modified example (1) of the first embodiment.
  • the wearable device 1 is an open finger glove, a thimble glove, or the like that exposes the fingers, fingertips, etc. of the user's hand H when worn on the user's hand H.
  • the wearable device 1 may or may not expose the palm of the hand H, for example.
  • the wearable device 1 includes a mounting portion 2A, a posture sensor 3, a myoelectric sensor 4, and an information processing unit 10.
  • the wearable device 1 has one posture sensor 3 straddling the metacarpal bones of five fingers of the hand H and three myoelectric sensors 4.
  • the number of sensors is not limited.
  • the mounting portion 2A is a glove that exposes the fingers, fingertips, etc. of the user's hand H on the wearable device 1.
  • the mounting portion 2A is formed of, for example, cloth, synthetic fibers, leather, or the like.
  • the mounting portion 2A covers the back of the user's hand H and deforms according to the movement of the hand H.
  • the mounting portion 2A is provided with a posture sensor 3, a myoelectric sensor 4, and an information processing unit 10. When the attachment portion 2A is attached to the user's hand H, for example, the posture sensor 3 and the myoelectric sensor 4 are brought into direct or indirect contact with the skin of the hand H.
  • the mounting portion 2A has a non-grasping posture P1 indicating a posture in which the user does not hold the hand H and a grasping posture P2 indicating a posture in which the user holds the hand H when mounted on the hand H.
  • when the mounting portion 2A is in the grasping posture P2, the user's hand H is in a state in which all or some of the fingers are bent inward, and the back of the hand H becomes rounded in the direction in which the fingers are lined up. Therefore, in the mounting portion 2A, one posture sensor 3 is arranged on the back of the hand H straddling the metacarpal bones of a plurality of fingers, and myoelectric sensors 4 are arranged on both sides thereof.
  • the posture sensor 3 is configured to detect the bending of the back of the user's hand H.
  • the posture sensor 3 is, for example, a bend sensor that detects the bending of the back of the hand H; as the bending increases, the resistance value of the posture sensor 3 also increases.
  • one end of the posture sensor 3 is connected to the power source 1B of the wearable device 1, and the other end is grounded.
  • the posture sensor 3 is electrically connected to the information processing unit 10 and outputs the detection result to the information processing unit 10.
  • in the wearable device 1, a case is described in which one posture sensor 3 having a length straddling the metacarpal bones of the five fingers is arranged on the back of the hand H, but the configuration is not limited to this.
  • each of the plurality of posture sensors 3 may be arranged side by side so as to straddle each of the metacarpal bones of five fingers.
  • the wearable device 1 may have a configuration in which the posture sensor 3 is arranged on a part of the back of the hand H that bends in the grasping posture P2.
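As described above, the posture sensor 3 behaves as a resistance that rises with the bending of the back of the hand, and the determination compares it against a reference resistance value. Assuming, purely for illustration, that the information processing unit reads the sensor through a voltage divider with a fixed resistor, the comparison could be sketched as follows; the supply voltage, fixed resistance, and threshold values are hypothetical.

```python
V_SUPPLY = 3.3      # assumed supply voltage from the power source 1B (volts)
R_FIXED = 10_000.0  # assumed fixed divider resistor to ground (ohms)
R_TH = 25_000.0     # assumed reference resistance separating P1 from P2 (ohms)

def divider_resistance(v_out):
    """Recover the bend-sensor resistance from the divider's measured output,
    with the sensor on the high side and the fixed resistor to ground."""
    return R_FIXED * (V_SUPPLY - v_out) / v_out

def is_grasping_posture(v_out):
    """Grasping posture P2 when the sensor resistance exceeds the reference."""
    return divider_resistance(v_out) > R_TH
```

With these assumed values, an open hand (low sensor resistance, high divider output) stays below the threshold, and a bent hand (high sensor resistance, low divider output) crosses it.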
  • the information processing unit 10 is a device that processes information detected by one posture sensor 3 and three myoelectric sensors 4.
  • the information processing unit 10 has, for example, a function of determining the posture of the user's hand H.
  • the information processing unit 10 has, for example, a function of estimating the grasping state of the user's hand H.
  • the wearable device 1 determines the posture of the hand H based on the detection result of the posture sensor 3. For example, it is assumed that the user's hand H is in the non-grasping posture P1 with the hand H open. In this case, since the resistance value detected by the posture sensor 3 does not exceed the reference resistance value Rth, the wearable device 1 determines that the posture of the user's hand H is the non-grasping posture P1 and does not start detection by the myoelectric sensor 4. After that, it is assumed that the user's hand H changes from the non-grasping posture P1 to the grasping posture P2 by, for example, holding a tool.
  • the wearable device 1 determines that the posture of the user's hand H is the grasping posture P2. Then, the wearable device 1 starts the detection of the myoelectric sensor 4, and estimates the grasping state of the hand H based on the detection result of the myoelectric sensor 4. For example, the wearable device 1 estimates the state of fingers, grip strength, and the like based on the frequency component of the myoelectric potential information for each electrode 41 of the myoelectric sensor 4.
  • the wearable device 1 estimates the movement, grip strength, and the like of the hand H by utilizing the characteristic that the myoelectric signals indicated by the myoelectric potential information of the myoelectric sensor 4 exhibit different frequency characteristics depending on the movement of the hand H.
  • the wearable device 1 can extract the integrated average potential, the frequency spectrum, and the like as parameters of the myoelectric signal.
  • the integrated average potential is the rectified and averaged myoelectric signal, and it is known that the greater the muscle strength, the more active the muscle activity and the larger this value becomes. Therefore, the wearable device 1 detects the force of the hand H by a detection method using the integrated average potential. Further, the wearable device 1 may estimate fine movements of the hand H by using a gesture detection method based on the frequency spectrum.
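The two parameters above can be computed directly from a sampled myoelectric signal: the integrated average potential as a rectified moving average, and a simple frequency-spectrum feature via an FFT. The sampling rate, window length, and synthetic test signal are assumptions for illustration, not values from the description.

```python
import numpy as np

def integrated_average_potential(emg, fs, window_s=0.2):
    """Rectify the myoelectric signal and average it over a sliding window;
    stronger muscle activity yields a larger value."""
    n = int(fs * window_s)
    rectified = np.abs(np.asarray(emg, dtype=float))
    return np.convolve(rectified, np.ones(n) / n, mode="valid")

def dominant_frequency(emg, fs):
    """Frequency of the largest peak in the one-sided amplitude spectrum,
    usable as a crude gesture-detection feature."""
    emg = np.asarray(emg, dtype=float)
    spectrum = np.abs(np.fft.rfft(emg - emg.mean()))
    freqs = np.fft.rfftfreq(len(emg), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]
```

A practical implementation would also band-pass filter the raw signal before extracting either feature, but the sketch keeps only the two parameters named in the text.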
  • after that, when the user's hand H returns from the grasping posture P2 to the non-grasping posture P1, the wearable device 1 determines that the posture of the user's hand H is no longer the grasping posture P2, and ends the detection by the myoelectric sensor 4.
  • the mounting portion 2A is an open finger glove, and the posture sensor 3 is positioned on the back of the hand H by the mounting portion 2A.
  • in the wearable device 1, when the mounting portion 2A is mounted on the hand H, the fingertips of the hand H are exposed, so that the user can use the fingertips freely.
  • the wearable device 1 can improve the accuracy of detecting the posture of the hand H by the posture sensor 3 and improve the operability of the user.
  • the wearable device 1 according to the first embodiment can correspond to various different postures of the hand H.
  • the wearable device 1 can understand and support the user's movements by combining the detection of the posture of the hand H with the estimation of the grip strength.
  • the wearable device 1 according to the modified example (2) of the first embodiment includes a mounting portion 2, a posture sensor 3, a myoelectric sensor 4, and an information processing unit 10.
  • the mounting portion 2 may be replaced with the mounting portion 2A.
  • the information processing unit 10 includes an acquisition unit 11, a storage unit 12, an estimation unit 13, and a communication unit 14.
  • FIG. 7 is a diagram showing an example of the grasping posture P2 of the wearable device 1 according to the modified example (2) of the first embodiment.
  • in the wearable device 1, the posture data 12A associates the user's scenes with grasping postures P2.
  • Scenes include, for example, holding luggage, opening lids, opening doors, eating meals, and the like.
  • Information such as the bending amount and shape of the finger corresponding to the posture of the hand H is associated with the grasping posture P2, for example.
  • the posture data 12A associates information on the posture of hanging the bag with the hand H with the scene of holding the luggage.
  • the posture data 12A associates information on the posture of holding the lid with a finger with the scene of opening the lid.
  • the posture data 12A associates information on the posture of grasping the doorknob with the scene of opening the door.
  • the posture data 12A associates information on the posture of grasping the tableware with the scene of eating a meal.
  • the estimation unit 13 of the information processing unit 10 specifies the posture corresponding to the scene to be estimated based on the posture data 12A.
  • the estimation unit 13 estimates the grip strength and the like of the hand H using the myoelectric sensor 4.
  • the estimation unit 13 can provide a function of outputting the estimated grip strength as an operation result in the user's target scene.
  • for example, it is assumed that the target scene is the scene of holding luggage.
  • the posture sensor 3 detects the posture of the hand H.
  • the wearable device 1 specifies the grasping posture P2 of the target scene based on the posture data 12A.
  • the wearable device 1 estimates the grip strength (grasping state) of the user's hand H based on the myoelectric potential information detected by the myoelectric sensor 4.
  • the wearable device 1 associates the estimated grip strength with the identification information indicating the target scene, stores it in the storage unit 12, and transmits it to an external electronic device.
  • the wearable device 1 can indicate whether or not the user's grip strength has weakened by showing the user's grip strength in the target scene.
  • the posture data 12A includes the grasping posture P2 according to the scene, and when the posture of the hand H detected by the posture sensor 3 is the grasping posture P2 of the scene, the grasping state of the hand H is estimated.
  • the wearable device 1 can be worn by the user in daily life to estimate the grasping state of the hand H according to scenes of daily life. For example, when the user cannot hold a heavy object for a long time, by setting "holding luggage" as a scene in the posture data 12A, the wearable device 1 estimates the grasping state for the grasping posture P2 of holding the luggage. As a result, the wearable device 1 can contribute to supporting the user's daily life by outputting the grasping state of the hand H estimated in the scene.
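A minimal encoding of the scene-to-posture association in the posture data 12A might store per-finger bend amounts (0.0 = straight, 1.0 = fully bent) for each scene; all scene names, bend values, and the matching tolerance below are hypothetical illustrations, not data from the description.

```python
# Hypothetical posture data 12A: scene -> expected bend of the five fingers
POSTURE_DATA_12A = {
    "hold_luggage": (0.9, 0.9, 0.9, 0.9, 0.6),  # hanging a bag on the hand
    "open_lid":     (0.3, 0.6, 0.6, 0.3, 0.3),  # holding a lid with the fingers
    "open_door":    (0.7, 0.7, 0.7, 0.7, 0.7),  # grasping a doorknob
    "eat_meal":     (0.2, 0.5, 0.5, 0.2, 0.2),  # grasping tableware
}

def matches_scene(detected_bends, scene, tolerance=0.15):
    """True when the detected posture is the grasping posture P2 of the scene,
    i.e. every finger's bend is within the tolerance of the stored posture."""
    target = POSTURE_DATA_12A[scene]
    return all(abs(d - t) <= tolerance for d, t in zip(detected_bends, target))
```

When a match is found, the estimation result would be stored and transmitted together with the identification information of the matched scene, as described above.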
  • modified example (2) of the first embodiment may be applied to the wearable device 1 of another embodiment or modified example.
  • FIG. 8 is a diagram for explaining an example of the system according to the second embodiment.
  • system 100 is, for example, a system for interactive gameplay of video games.
  • the system 100 includes a wearable device 1 according to the first embodiment, and a head-mounted display (HMD) 110.
  • the HMD 110 is worn, for example, in a manner similar to eyeglasses, goggles, or a helmet, and is configured to display a video game or other content to the user U.
  • the HMD 110 provides an immersive experience for the user U by providing a display mechanism that is close to the eyes of the user U.
  • the HMD 110 can provide a display area for each of the user U's eyes, which occupies most or the entire field of view of the user U.
  • the HMD 110 can be connected to the computer 120 by wire or wirelessly.
  • the computer 120 may be, for example, but is not limited to, a game console, a personal computer, a laptop, a tablet computer, a mobile device, a mobile phone, a thin client, a set-top box, or a media streaming device.
  • the computer 120 is configured to run a video game and output video and audio from the video game for rendering by the HMD 110.
  • the system 100 can use the wearable device 1 as a means for the user U to provide input to the video game.
  • the system 100 has the wearable device 1 attached to the left and right hands H of the user U.
  • the wearable device 1 outputs the estimated grasping state of the hand H, the grip strength, and the like to the cooperation device such as the HMD 110 and the computer 120.
  • the computer 120 can execute a process of reflecting the grasping state of the hand H, the grip strength, and the like in AR, VR, a video game, and the like.
  • the computer 120 changes the damage given to the opponent according to the grip strength of the user U input from the wearable device 1, and outputs a video and audio showing the result to the HMD 110.
  • the HMD 110 outputs video and audio from the computer 120 to the user U.
  • the camera 130 is connected to the computer 120.
  • the camera 130 can be configured to capture an image of the interactive environment in which the user U is located. These captured images can be analyzed to determine the position and movement of the user U, HMD110, and wearable device 1.
  • the HMD 110 may include one or more lights that can be tracked to determine the position and orientation of the HMD 110.
  • the camera 130 may include one or more microphones for capturing sound from an interactive environment.
  • the camera 130 may be configured to include a plurality of image capture devices (eg, stereoscopic pairs of cameras), an IR camera, a depth of field camera, and combinations thereof. In the present embodiment, the case where the camera 130 includes an IR camera will be described.
  • the computer 120 functions as a thin client that communicates with the cloud gaming provider 150 on the network 200.
  • the cloud gaming provider 150 maintains and runs the video game being played by user U.
  • the computer 120 transmits the inputs from the wearable device 1, the HMD 110, and the camera 130 to the cloud gaming provider.
  • the cloud gaming provider 150 processes inputs that affect the game state of the video game being run.
  • the output from the running video game, such as video data, audio data, and tactile feedback data, is transmitted to the computer 120.
  • FIG. 9 is a diagram for explaining an example of the wearable device 1 in the system 100 according to the second embodiment.
  • the wearable device 1 is attached to the hand H of the user U. Then, the user U holds the pen 160 with the hand H equipped with the wearable device 1.
  • the pen 160 is provided with a reflection marker at the pen tip.
  • the wearable device 1 estimates the grasping state of the hand H based on the myoelectric potential information detected by the myoelectric sensor 4. In this case, the wearable device 1 estimates the grasping state and the grip strength in which the object is grasped by the thumb and the index finger.
  • the computer 120 detects the position of the hand H and the pen tip of the pen 160 in space by analyzing the image of the user U captured by the camera 130. For example, the computer 120 irradiates infrared rays from the HMD 110, the camera 130, and the like, and detects the position of the pen tip of the pen 160 in space based on the reflected light. Then, the computer 120 obtains the grip strength with which the user U is holding the pen 160 based on the grasping state and the grip strength estimated by the wearable device 1, and determines the thickness of the line according to the grip strength. For example, the computer 120 thickens the drawn line as the force with which the user U holds the pen 160 increases.
  • the computer 120 causes the HMD 110 to display an image drawn based on the position of the pen tip of the pen 160 that has moved in space and the grip strength of the hand H.
  • the system 100 can display the image drawn by the user U in the space freehand on the HMD 110 based on the grasped state of the hand H estimated by the wearable device 1.
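The mapping from the estimated grip strength to line thickness described above could be a clamped linear function like the sketch below; the width range and maximum grip value are hypothetical parameters, not values from the description.

```python
def stroke_width(grip_strength, min_width=1.0, max_width=12.0, max_grip=100.0):
    """Map the estimated grip strength to a drawing line width: the harder the
    user U holds the pen 160, the thicker the drawn line (clamped to a range)."""
    g = max(0.0, min(grip_strength, max_grip))
    return min_width + (max_width - min_width) * g / max_grip
```

Clamping keeps the rendered stroke stable even if the grip-strength estimate momentarily spikes beyond the calibrated range.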
  • the wearable device 1, the HMD 110, and the camera 130 may themselves be networked devices that connect to the network 200 to communicate with the cloud gaming provider 150.
  • computer 120 may be a local network device, such as a router, that does not perform video game processing separately, but facilitates the passage of network traffic.
  • the connection to the network by the wearable device 1, the HMD 110, and the camera 130 may be wired or wireless.
  • the system 100 estimates the grasping state of the hand H based on the myoelectric potential information detected by the myoelectric sensor 4.
  • the system 100 outputs the estimation result estimated by the wearable device 1 to the computer 120, and outputs the processing result of the computer 120 to the HMD 110.
  • since the system 100 improves the accuracy of estimating the muscle strength of the hand H using the myoelectric sensor 4 of the wearable device 1, the realism of the output of the HMD 110 can be improved.
  • in the system 100, the hand H may be imaged by the camera 130 only when the posture of the hand H is the grasping posture, so that power consumption can be reduced.
  • each step related to the processing of the wearable device 1 in the present specification does not necessarily have to be processed in chronological order in the order described in the flowchart.
  • each step related to the processing of the information processing unit may be processed in an order different from the order described in the flowchart, or may be processed in parallel.
  • (1) A wearable device comprising: a posture sensor that detects the posture of a user's hand; a myoelectric sensor that detects myoelectric potential information of the hand; and an estimation unit that, when the posture detected by the posture sensor is a grasping posture, estimates a grasping state of the hand of the user based on the myoelectric potential information detected by the myoelectric sensor.
  • (2) The wearable device according to (1), wherein the myoelectric sensor has a dry electrode that comes into contact with the hand and detects the myoelectric potential information indicating an action potential generated in a muscle of the hand.
  • (3) The wearable device according to (1) or (2), wherein the posture sensor includes a bending sensor that detects the movement of the fingers of the hand.
  • (4) The wearable device according to any one of (1) to (3), wherein the myoelectric sensor starts detecting the myoelectric potential information when the posture detected by the posture sensor becomes the grasping posture.
  • (5) The wearable device according to any one of (1) to (4), wherein the estimation unit does not estimate the grasping state of the hand of the user when the posture detected by the posture sensor is not the grasping posture.
  • (6) The wearable device according to any one of (1) to (5), wherein the estimation unit estimates the grasping state when the grasping posture continues for a determination time.
  • (7) The wearable device according to any one of (1) to (6), wherein the estimation unit uses the myoelectric potential information of the myoelectric sensor when the start of the grasping posture is detected as an initial value, and estimates the grasping state based on the initial value and the myoelectric potential information detected after the start of the grasping posture.
  • (8) The wearable device according to any one of (1) to (7), wherein, when the estimation unit detects the end of the grasping posture while estimating the grasping state, the estimation unit ends the estimation of the grasping state.
  • (9) The wearable device according to any one of (1) to (8), further comprising a mounting portion to be mounted on the hand of the user, wherein the mounting portion positions the posture sensor at a position of the hand at which the grasping posture can be detected when the mounting portion is mounted on the hand.
  • (10) The wearable device according to (9), wherein the mounting portion is a glove that exposes at least the fingertips of the user, and the posture sensor is positioned on the back of the hand by the mounting portion.
  • (11) The wearable device according to any one of (1) to (10), further comprising a storage unit that stores posture data indicating the grasping posture of the hand of the user, wherein the estimation unit estimates the grasping state of the hand of the user based on the myoelectric potential information detected by the myoelectric sensor when the posture detected by the posture sensor is the grasping posture indicated by the posture data stored in the storage unit.
  • (12) The wearable device according to (11), wherein the posture data includes data indicating the grasping posture according to a scene in which the user operates.
  • (13) An information processing unit comprising: an acquisition unit that acquires detection results of a posture sensor that detects the posture of a user's hand and a myoelectric sensor that detects myoelectric potential information of the hand; and an estimation unit that, when the posture detected by the posture sensor is a grasping posture, estimates a grasping state of the hand of the user based on the myoelectric potential information detected by the myoelectric sensor.
  • (14) An information processing method in which a computer executes: a step of acquiring detection results of a posture sensor that detects the posture of a user's hand and a myoelectric sensor that detects myoelectric potential information of the hand; and a step of estimating, when the posture detected by the posture sensor is a grasping posture, a grasping state of the hand of the user based on the myoelectric potential information detected by the myoelectric sensor.
  • (15) A system comprising: a wearable device worn on a hand of a user; and a cooperation device that cooperates with the wearable device, wherein the wearable device includes a posture sensor that detects the posture of the hand of the user, a myoelectric sensor that detects myoelectric potential information of the hand, and an estimation unit that, when the posture detected by the posture sensor is a grasping posture, estimates a grasping state of the hand of the user based on the myoelectric potential information detected by the myoelectric sensor, and the cooperation device executes processing based on the grasping state estimated by the wearable device.
  • 1 Wearable device 2 Mounting unit 2A Mounting unit 3 Posture sensor 4 Myoelectric sensor 10 Information processing unit 11 Acquisition unit 12 Storage unit 12A Posture data 13 Estimation unit 14 Communication unit 100 System 110 Head-mounted display (HMD) 120 Computer 130 Camera H Hand P1 Non-grasping posture P2 Grasping posture U User

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A wearable device (1) is provided with: a posture sensor (3) for detecting the posture of a hand (H) of a user; a myoelectric sensor (4) for detecting myoelectric potential information of the hand (H); and an estimating unit which, when the posture of the hand (H) detected by the posture sensor (3) is a grasping posture (P2), estimates a grasping state of the hand (H) of the user on the basis of the myoelectric potential information detected by the myoelectric sensor (4).

Description

Wearable device, information processing unit, and information processing method

The present disclosure relates to a wearable device, an information processing unit, and an information processing method.

Patent Document 1 discloses a motion capture device and the like that predicts muscle motion by calculating data for predicting muscle tonus based on frequency analysis data of myoelectric potential information.

Japanese Unexamined Patent Publication No. 2001-054507

In the conventional technology described above, when the electrodes of a myoelectric sensor are used in contact with the surface of the user's skin, the output of the electrodes may be affected by noise or the like depending on how the electrodes make contact. It is therefore desirable to improve the accuracy of estimating the muscular strength of the hand with a myoelectric sensor that is brought into contact with the skin surface.

Accordingly, the present disclosure proposes a wearable device, an information processing unit, and an information processing method that can improve the accuracy of estimating the muscular strength of the hand using a myoelectric sensor.

To solve the above problems, a wearable device according to one embodiment of the present disclosure includes: a posture sensor that detects a posture of a user's hand; a myoelectric sensor that detects myoelectric potential information of the hand; and an estimation unit that, when the posture detected by the posture sensor is a grasping posture, estimates a grasping state of the user's hand based on the myoelectric potential information detected by the myoelectric sensor.

An information processing unit according to one embodiment of the present disclosure includes: an acquisition unit that acquires detection results of a posture sensor that detects a posture of a user's hand and a myoelectric sensor that detects myoelectric potential information of the hand; and an estimation unit that, when the posture detected by the posture sensor is a grasping posture, estimates a grasping state of the user's hand based on the myoelectric potential information detected by the myoelectric sensor.

An information processing method according to one embodiment of the present disclosure includes the steps of: acquiring, by a computer, detection results of a posture sensor that detects a posture of a user's hand and a myoelectric sensor that detects myoelectric potential information of the hand; and, when the posture detected by the posture sensor is a grasping posture, estimating a grasping state of the user's hand based on the myoelectric potential information detected by the myoelectric sensor.

FIG. 1 is a diagram illustrating an example of the wearable device according to the first embodiment. FIG. 2 is a diagram showing output examples of the myoelectric sensor of the wearable device according to the first embodiment. FIG. 3 is a diagram showing an output example of the posture sensor of the wearable device according to the first embodiment. FIG. 4 is a diagram showing a configuration example of the wearable device according to the first embodiment. FIG. 5 is a flowchart showing an example of a processing procedure executed by the wearable device according to the first embodiment. FIG. 6 is a diagram illustrating an example of the wearable device according to modification (1) of the first embodiment. FIG. 7 is a diagram showing an example of a grasping posture of the wearable device according to modification (2) of the first embodiment. FIG. 8 is a diagram illustrating an example of the system according to the second embodiment. FIG. 9 is a diagram illustrating an example of the wearable device in the system according to the second embodiment.

Embodiments of the present disclosure will be described in detail below with reference to the drawings. In each of the following embodiments, the same parts are designated by the same reference numerals, and duplicate description will be omitted.

(First Embodiment)
[Configuration of Wearable Device According to First Embodiment]
 FIG. 1 is a diagram illustrating an example of the wearable device according to the first embodiment. As shown in FIG. 1, the wearable device 1 is worn on the user's hand H. In the present embodiment, the user's hand H is described as the portion of the human body from the wrist to the fingertips, but it may be, for example, only the fingers or only the back of the hand. The wearable device 1 is described as a glove interface object, but is not limited thereto. For example, the wearable device 1 may be a glove, a racing glove, a wearing belt, or the like worn on the user's hand H. In the present embodiment, the wearable device 1 is described as being worn on one of the user's hands H, but it may be configured to be worn on each of both of the user's hands H.

In the example shown in FIG. 1, the wearable device 1 covers the entire hand H of the user, including the fingers and the back of the hand. The wearable device 1 can be used as an interface for detecting the user's grip strength in, for example, AR (Augmented Reality), VR (Virtual Reality), and the like. For example, the way sports equipment is gripped differs for each sport, such as baseball, golf, tennis, table tennis, boxing, and racing, and it is difficult to build a dedicated game-machine controller for each of them. Moreover, grip strength is a very important form of muscular strength in some sports. In baseball, for example, it is important during batting to hold the bat firmly and to use grip strength to transmit the force of the entire body to the bat. By being configured as a glove interface object, the wearable device 1 makes it possible to estimate grip strength and the like across multiple types of sports.

Further, with the conventional grip-strength measurement method in which a piezoelectric element is provided in a controller held by the user, it is difficult to realize freehand UI operation or UI operation while holding a tool, and the grip strength cannot be estimated unless the piezoelectric element is gripped at exactly the right position. For these reasons, the wearable device 1 aims to improve both the accuracy of estimating the muscular strength of the hand H and the convenience for the user.

The wearable device 1 includes a mounting portion 2, posture sensors 3, myoelectric sensors 4, and an information processing unit 10. In the first embodiment, the wearable device 1 is described as having five posture sensors 3, one for each finger of the hand H, and three myoelectric sensors 4, but the number of sensors is not limited to these.

The mounting portion 2 is a glove by which the wearable device 1 is worn on the user's hand H. The mounting portion 2 is formed of, for example, cloth, synthetic fiber, leather, or the like. The mounting portion 2 covers the user's hand H and deforms according to the movement of the hand H. The posture sensors 3, the myoelectric sensors 4, and the information processing unit 10 are provided on the mounting portion 2. When worn on the user's hand H, the mounting portion 2 brings, for example, the posture sensors 3 and the myoelectric sensors 4 into direct or indirect contact with the skin of the hand H. While worn on the hand H, the mounting portion 2 can take a non-grasping posture P1, in which the user is not closing the hand H, and a grasping posture P2, in which the user is closing the hand H. The non-grasping posture P1 includes, for example, a posture in which the hand H is open. The non-grasping posture P1 includes postures in which the hand H is not gripping an object to be detected, and some of the fingers of the hand H may be bent. The grasping posture P2 includes, for example, the posture in which the hand H grips an object to be detected, the posture in which the hand H exerts force, and the like. The posture of the hand H in the present embodiment includes, for example, the stance, shape, and grasping state of the hand H.

The posture sensor 3 detects the posture of the user's hand H. The posture sensor 3 is configured to detect the bending of a part of the user's hand H, such as a finger. As the posture sensor 3, for example, a bend sensor, a camera, an inertial measurement unit (IMU), or the like can be used. A bend sensor, a camera, and an inertial measurement unit may also be used in combination.

In the present embodiment, the posture sensor 3 is described as a bend sensor. For example, when the mounting portion 2 of the wearable device 1 is worn on the hand H, a posture sensor 3 is arranged along the upper surface of each finger of the hand H. The upper surface of a finger is, for example, the surface that faces upward when the back of the hand H is turned upward. In the present embodiment, each posture sensor 3 is described as being arranged on the back side of the hand H so as to straddle the proximal phalanx and the metacarpal bone of the corresponding finger, but the arrangement is not limited to this. For example, the posture sensor 3 may be arranged along the entire finger, on the inner surface of the finger, or the like, as long as the bending of the finger of the hand H can be detected.

The posture sensor 3 is a sensor whose resistance value increases when it is bent. The posture sensor 3 detects the bending of at least a part of the user's hand. In the present embodiment, the posture sensor 3 includes, for example, a resistor that is sensitive to bending. The resistance value of the posture sensor 3 changes according to the amount of bending of the resistor. In other words, the posture sensor 3 detects the posture of the hand H as a resistance value. For example, the posture sensor 3 arranged on the index finger of the hand H detects the bending of the index finger, and as the bending increases, the resistance value also increases. One end of each of the plurality of posture sensors 3 is connected to the power supply of the wearable device 1, and the other end is grounded. In the present embodiment, each posture sensor 3 is electrically connected to the information processing unit 10 and outputs its detection result to the information processing unit 10.

When the posture sensor 3 is a camera, for example, it is provided so as to be able to image the mounting portion 2. In other words, the posture sensor 3 is provided outside the mounting portion 2 and estimates the user's hand H from an image of the mounting portion 2. The posture sensor 3 detects the hand H and its posture from the image by using well-known techniques such as background subtraction and pattern matching.

When the posture sensor 3 is an inertial measurement unit, for example, it is provided on the mounting portion 2 so as to be arranged on a finger, fingertip, or the like of the hand H. The posture sensor 3 detects three-dimensional angular velocity and acceleration according to the movement of the finger, fingertip, or the like, and detects the posture of the hand H based on the detection results.

The myoelectric sensor 4 detects myoelectric potential information of the user's hand H. The myoelectric potential information is, for example, information necessary for estimating the grasping state of the hand H. The myoelectric sensor 4 has, for example, a plurality of electrodes 41 that come into contact with the skin of the user's hand H. The myoelectric sensor 4 uses the electrodes 41 to detect the myoelectric potential information generated when the muscles of the hand H are operated. One end of each of the plurality of electrodes 41 is connected to the power supply of the wearable device 1, and the other end is grounded. The myoelectric sensor 4 is electrically connected to the information processing unit 10 and outputs the myoelectric potential information detected by the plurality of electrodes 41 to the information processing unit 10. The myoelectric potential information includes, for example, information indicating the potential at each of the plurality of electrodes 41, which is necessary for estimating the grasping force of the hand H, the shape of the fingers, and the like, and information indicating the part of the hand H at which each electrode 41 is arranged. In the present embodiment, the electrodes 41 are described as dry-type electrodes, but wet-type electrodes may also be used. The advantages of the dry-type electrodes 41 are that they are easy to attach to the user's skin and that the running cost can be kept lower than when wet-type disposable electrodes are used.

The information processing unit 10 is, for example, a dedicated or general-purpose computer. In the present embodiment, the information processing unit 10 is formed as a chip. The information processing unit 10 is a computer for processing data from other devices. The information processing unit 10 is an electronic device that uses the posture sensors 3 and the myoelectric sensors 4 together and processes the information detected by them. The information processing unit 10 has, for example, a function of determining the posture of the user's hand H. The information processing unit 10 also has, for example, a function of estimating the grasping state of the user's hand H. The grasping state includes, for example, the posture of the fingers, the grip strength, and the like. The information processing unit 10 may obtain the detection results of the posture sensors 3 for the non-grasping posture P1 and the grasping posture P2 through calibration, for example when the user starts using the device, and use the detection result for the non-grasping posture P1 as an initial value.
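The calibration step just described can be sketched as follows. This is a minimal illustration under stated assumptions, not the disclosed implementation: the per-finger averaging, and the 0.8 weighting used to place the reference threshold Rth between the open-hand value Rs and the light-grip value Rm, are illustrative choices.

```python
def calibrate(open_resistances, light_grip_resistances):
    """Derive a per-finger baseline (Rs) and grasp reference (Rm) from short
    calibration recordings taken with the hand open (P1) and lightly closed
    (first state P21 of P2), then place a threshold Rth between them."""
    baselines = [sum(r) / len(r) for r in open_resistances]         # Rs per finger
    grasp_refs = [sum(r) / len(r) for r in light_grip_resistances]  # Rm per finger
    # The 0.8 weighting toward Rm is an illustrative choice, not a value
    # taken from the disclosure.
    thresholds = [rs + 0.8 * (rm - rs) for rs, rm in zip(baselines, grasp_refs)]
    return baselines, thresholds
```

A device would run this once per user at the start of a session and store the resulting thresholds for later posture discrimination.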

FIG. 2 is a diagram showing output examples of the myoelectric sensor 4 of the wearable device 1 according to the first embodiment. In each of the graphs shown in FIG. 2, the vertical axis represents amplitude and the horizontal axis represents time. Graphs G11 and G12 in FIG. 2 show output examples of the myoelectric sensor 4 when the user's hand H is in the non-grasping posture P1, that is, when the hand H is open. In the non-grasping posture P1, for example, a change in the posture of the hand H may create a recess in the back of the hand H, or an electrode 41 may partially or completely lose contact with the skin. In such cases, the amplitude of the signal from the myoelectric sensor 4 becomes unstable due to changes in contact impedance. For example, when an electrode 41 is completely separated from the hand H, the output of the myoelectric sensor 4 either remains flat at the baseline, as shown in graph G11, or becomes stuck at a boundary value of the A/D converter (ADC), as shown in graph G12.
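The two failure signatures above (flat baseline, ADC-boundary saturation) suggest a simple validity check on an EMG sample window. The sketch below is an assumption-laden illustration: the 10-bit ADC range and the flatness tolerance are hypothetical values, not parameters from the disclosure.

```python
def emg_window_valid(samples, adc_min=0, adc_max=1023, flat_tol=1):
    """Flag EMG windows that look like a detached electrode: either the
    signal stays flat at the baseline (graph G11) or it is pinned to an
    A/D converter boundary value (graph G12)."""
    lo, hi = min(samples), max(samples)
    if hi - lo <= flat_tol:             # flat at the baseline
        return False
    if lo <= adc_min or hi >= adc_max:  # stuck at an ADC boundary value
        return False
    return True
```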

Graphs G21 and G22 in FIG. 2 show output examples of the myoelectric sensor 4 when the user's hand H is in the grasping posture P2, that is, when the hand H is closed. The grasping posture P2 has a first state P21 and a second state P22. The first state P21 is a state in which the user lightly closes the hand H without exerting force. The second state P22 is a state in which the user exerts force and grips firmly with the hand H. In the first state P21, the posture of the hand H is fixed and the contact area between the electrodes 41 and the hand H is constant. In this case, the myoelectric sensor 4 outputs myoelectric potential information indicating a myoelectric signal with a small amplitude. In the second state P22, the myoelectric sensor 4 outputs myoelectric potential information indicating a myoelectric signal whose average amplitude increases in proportion to the grip strength of the hand H.

As shown in FIG. 2, the characteristics of the myoelectric signal from the myoelectric sensor 4 change between the non-grasping posture P1 and the grasping posture P2 of the user's hand H. That is, it can be seen that the myoelectric sensor 4 outputs a myoelectric signal that is effective for estimating the grasping state of the hand H when the hand H changes from the non-grasping posture P1 to the grasping posture P2. Moreover, the myoelectric signal in the first state P21 of the grasping posture P2 is the myoelectric potential information at the moment the hand H begins to close, and this myoelectric potential information can be used as an initial value.

FIG. 3 is a diagram showing an output example of the posture sensor 3 of the wearable device 1 according to the first embodiment. In the graph shown in FIG. 3, the vertical axis represents the resistance value of the posture sensor 3, and the horizontal axis represents the amount of change of the hand H from the non-grasping posture P1 to the grasping posture P2. The graph shows that the posture sensor 3 outputs a resistance value Rs when the hand H is open in the non-grasping posture P1. When the user gradually closes the hand H from the non-grasping posture P1 toward the first state P21 of the grasping posture P2, the resistance value increases to around Rm as the deformation of the posture sensor 3 increases. The graph also shows that the resistance value Rm output by the posture sensor 3 does not change between the first state P21 and the second state P22 of the grasping posture P2.

As shown in FIG. 3, the resistance value of the posture sensor 3 changes as the user's hand H moves from the non-grasping posture P1 to the grasping posture P2, but the posture sensor 3 cannot capture changes in the grasping force of the hand H. By setting a reference resistance value Rth for the posture sensor 3 based on the first state P21 of the grasping posture P2, the wearable device 1 can discriminate, from the detection results of the posture sensor 3, both the change of the hand H from the non-grasping posture P1 to the grasping posture P2 and the change from the grasping posture P2 to the non-grasping posture P1.
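The Rth-based discrimination above can be sketched as a small classifier over one bend-sensor reading. This is an illustrative sketch: the hysteresis band added around Rth (so that readings hovering near the threshold do not make the P1/P2 state chatter) is an assumption of this example, not part of the disclosure.

```python
def classify_posture(resistance, previous, r_th, hysteresis=0.05):
    """Classify one bend-sensor reading as non-grasping ("P1") or
    grasping ("P2") against the reference resistance Rth, with a small
    hysteresis band to stabilize the state near the threshold."""
    enter = r_th * (1 + hysteresis)  # must clearly exceed Rth to enter P2
    leave = r_th * (1 - hysteresis)  # must clearly fall below Rth to leave P2
    if previous == "P1":
        return "P2" if resistance > enter else "P1"
    return "P1" if resistance < leave else "P2"
```

With `hysteresis=0` this reduces to the plain comparison against Rth described in the text.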

Returning to FIG. 1, when worn on the user's hand H, the wearable device 1 determines the posture of the hand H based on the detection results of the posture sensors 3. Suppose, for example, that the user's hand H is in the non-grasping posture P1 with the hand H open. In this case, since the resistance value detected by the posture sensor 3 does not exceed the reference resistance value Rth, the wearable device 1 determines that the posture of the user's hand H is the non-grasping posture P1 and does not perform detection with the myoelectric sensor 4. Suppose that the user's hand H then changes from the non-grasping posture P1 to the grasping posture P2, for example by gripping a tool. In this case, since the resistance value detected by the posture sensor 3 exceeds the reference resistance value Rth, the wearable device 1 determines that the posture of the user's hand H is the grasping posture P2. The wearable device 1 then starts detection with the myoelectric sensor 4 and estimates the grasping state of the hand H based on the detection result of the myoelectric sensor 4. For example, the wearable device 1 estimates the state of the fingers, the grip strength, and the like based on the frequency components of the myoelectric potential information for each electrode 41 of the myoelectric sensor 4.

For example, the wearable device 1 estimates the grasping state of the hand H, such as its movement and grip strength, by exploiting the characteristic that the myoelectric signals in the myoelectric potential information of the myoelectric sensor 4 exhibit different frequencies for different movements. For example, the wearable device 1 can extract the integrated average potential, the frequency spectrum, and the like as parameters of the myoelectric signal. The integrated average potential is the rectified and averaged myoelectric signal, and it is known that the greater the muscle force, the more active the muscle activity and the larger this value becomes. Therefore, the wearable device 1 detects the grip strength of the hand H by, for example, the well-known detection method using the integrated average potential. The wearable device 1 may also estimate fine movements of the hand H by using, for example, a well-known gesture detection method based on the frequency spectrum.
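The integrated average potential (rectify, then average) can be sketched as below. The linear mapping from that value to a grip-force estimate, and the gain constant, are hypothetical additions for illustration; in practice the gain would come from per-user calibration.

```python
def integrated_average_potential(emg_samples, baseline=0.0):
    """Rectify the myoelectric signal around its baseline and average it;
    stronger muscle activity yields a larger value."""
    rectified = [abs(v - baseline) for v in emg_samples]
    return sum(rectified) / len(rectified)

def estimate_grip_force(emg_samples, gain=0.5, baseline=0.0):
    """Map the integrated average potential to a grip-force estimate with
    a hypothetical linear gain (units of force per unit potential)."""
    return gain * integrated_average_potential(emg_samples, baseline)
```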

Suppose that the user's hand H then transitions from the grasping posture P2 back to the non-grasping posture P1. In this case, since the resistance value detected by the posture sensor 3 no longer exceeds the reference resistance value Rth, the wearable device 1 determines that the posture of the user's hand H is the non-grasping posture P1 and ends the detection with the myoelectric sensor 4.

As described above, when the posture of the hand H detected by the posture sensor 3 is the grasping posture P2, the wearable device 1 according to the first embodiment estimates the grasping state of the hand H based on the myoelectric potential information detected by the myoelectric sensor 4. For example, once a user changes the hand H to the grasping posture P2, the user tends to maintain that grasping posture P2. The wearable device 1 can therefore estimate the grasping state of the hand H with the myoelectric sensor 4 in a state where the grasping posture P2 of the hand H is unlikely to change. As a result, the wearable device 1 can suppress the influence of noise and the like on the myoelectric sensor 4, and can thus improve the accuracy of estimating the muscular strength of the hand H using the myoelectric sensor 4.

The wearable device 1 also has dry-type electrodes 41 with which the myoelectric sensor 4 detects the myoelectric potential information. With dry electrodes 41, the characteristics of the myoelectric signal change depending on the contact condition, so the measured value may vary from measurement to measurement even for the same muscle force. Since the wearable device 1 can estimate the grasping state of the hand H with the myoelectric sensor 4 while the grasping posture P2 of the hand H remains unchanged, the measurement results of the dry electrodes 41 can be stabilized. As a result, the myoelectric sensor 4 can be easily attached to the hand H, and the convenience of the wearable device 1 is improved.

The wearable device 1 also uses bend sensors that detect the movement of the fingers of the hand H as the posture sensors 3. This allows the wearable device 1 to determine the posture of the hand H based on the degree of bending of the fingers. As a result, the wearable device 1 can improve the accuracy of discriminating the grasping posture P2 of the hand H, and can thus further improve the accuracy of estimating the muscular strength of the hand H using the myoelectric sensor 4. Although the wearable device 1 could also determine the posture from an image of the hand H, the user would then need to keep the hand H within the imaging range, which may reduce convenience. By using bend sensors as the posture sensors 3, the wearable device 1 does not need to restrict the operating range of the hand H, which improves convenience.

In the wearable device 1, when the mounting portion 2 is worn, the mounting portion 2 positions the posture sensors 3 at positions on the hand H where the grasping posture P2 can be detected. Thus, simply by wearing the mounting portion 2 on the hand H, the posture sensors 3 are placed at the measurement positions on the hand H. As a result, the wearable device 1 can improve the accuracy with which the posture sensors 3 detect the posture of the hand H.

In the wearable device 1, the mounting portion 2 is a glove, and the posture sensors 3 are positioned on the back of the hand H by the mounting portion 2. By having the posture sensors 3 detect the posture at the back of the hand H, the wearable device 1 can stabilize the contact between the posture sensors 3 and the hand H. As a result, the wearable device 1 can further improve the accuracy with which the posture sensors 3 detect the posture of the hand H.

 また、第1の実施形態に係る情報処理ユニット10は、姿勢センサ3によって検出した手Hの姿勢が把握姿勢である場合、筋電センサ4によって検出した筋電位情報に基づいて手Hの把握状態を推定する。これにより、情報処理ユニット10は、手Hの把握姿勢P2が変わらない状態で、筋電センサ4を用いた手Hの把握状態の推定を行うことができる。その結果、情報処理ユニット10は、筋電センサ4のノイズ等の影響を抑制できるので、筋電センサ4を用いた手Hの筋力の推定精度を向上させることができる。 Further, when the posture of the hand H detected by the posture sensor 3 is the grasping posture, the information processing unit 10 according to the first embodiment estimates the grasping state of the hand H based on the myoelectric potential information detected by the myoelectric sensor 4. As a result, the information processing unit 10 can estimate the grasping state of the hand H using the myoelectric sensor 4 while the grasping posture P2 of the hand H does not change. Consequently, the information processing unit 10 can suppress the influence of noise and the like of the myoelectric sensor 4, so that the accuracy of estimating the muscle strength of the hand H using the myoelectric sensor 4 can be improved.

[第1の実施形態に係るウェアラブル装置の構成例]
 図4は、第1の実施形態に係るウェアラブル装置1の構成例を示す図である。図4に示すように、ウェアラブル装置1は、姿勢センサ3と、筋電センサ4と、情報処理ユニット10と、を備える。姿勢センサ3と筋電センサ4と情報処理ユニット10とは、ウェアラブル装置1の電源1Bからの電力によって動作する。
[Configuration Example of Wearable Device According to First Embodiment]
FIG. 4 is a diagram showing a configuration example of the wearable device 1 according to the first embodiment. As shown in FIG. 4, the wearable device 1 includes a posture sensor 3, a myoelectric sensor 4, and an information processing unit 10. The posture sensor 3, the myoelectric sensor 4, and the information processing unit 10 are operated by electric power from the power source 1B of the wearable device 1.

 情報処理ユニット10は、取得部11と、記憶部12と、推定部13と、通信部14と、を備える。本実施形態では、取得部11及び推定部13の各処理部は、例えば、CPU(Central Processing Unit)やMCU(Micro Control Unit)等によって、情報処理ユニット10の内部に記憶されたプログラムがRAM(Random Access Memory)等を作業領域として実行されることにより実現される。また、各処理部は、例えば、ASIC(Application Specific Integrated Circuit)やFPGA(Field-Programmable Gate Array)等の集積回路により実現されてもよい。 The information processing unit 10 includes an acquisition unit 11, a storage unit 12, an estimation unit 13, and a communication unit 14. In the present embodiment, each processing unit of the acquisition unit 11 and the estimation unit 13 is realized by, for example, a CPU (Central Processing Unit), an MCU (Micro Control Unit), or the like executing a program stored in the information processing unit 10, using a RAM (Random Access Memory) or the like as a work area. In addition, each processing unit may be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array).

 取得部11は、姿勢センサ3及び筋電センサ4が出力した検出結果を取得する。取得部11は、取得した情報を記憶部12に記憶する。取得部11は、姿勢センサ3の検出結果と筋電センサ4の検出結果とを異なるタイミングで取得できる。取得部11は、推定部13に要求された情報を取得し、取得した情報を推定部13に出力する。本実施形態では、取得部11は、筋電センサ4の複数の電極41と電気的に接続されており、電極41からの筋電信号を求め、活動部位、振幅の大きさや変化量を求める。取得部11は、求めた情報を筋電位情報として推定部13に出力する。換言すると、取得部11は、筋電センサ4の筋電信号を示す筋電位情報を推定部13に出力する。 The acquisition unit 11 acquires the detection results output by the posture sensor 3 and the myoelectric sensor 4. The acquisition unit 11 stores the acquired information in the storage unit 12. The acquisition unit 11 can acquire the detection result of the posture sensor 3 and the detection result of the myoelectric sensor 4 at different timings. The acquisition unit 11 acquires the information requested by the estimation unit 13 and outputs the acquired information to the estimation unit 13. In the present embodiment, the acquisition unit 11 is electrically connected to the plurality of electrodes 41 of the myoelectric sensor 4, obtains the myoelectric signals from the electrodes 41, and obtains the active part, the magnitude of the amplitude, and the amount of change. The acquisition unit 11 outputs the obtained information as myoelectric potential information to the estimation unit 13. In other words, the acquisition unit 11 outputs the myoelectric potential information indicating the myoelectric signals of the myoelectric sensor 4 to the estimation unit 13.
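As an illustration only (not part of the disclosure), the amplitude and change-amount computation performed by the acquisition unit 11 can be sketched as follows; the RMS amplitude measure, the per-window processing, and all names are assumptions, since the patent does not specify the signal processing.

```python
from typing import Dict, List


def emg_features(samples: List[float], prev_amplitude: float) -> Dict[str, float]:
    """Compute an amplitude measure (RMS) and its change for one electrode window."""
    rms = (sum(s * s for s in samples) / len(samples)) ** 0.5
    return {"amplitude": rms, "change": rms - prev_amplitude}
```

In this sketch, one such feature pair would be produced per electrode 41 and forwarded to the estimation unit 13 as part of the myoelectric potential information.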

 記憶部12は、各種データを記憶する。例えば、記憶部12は、姿勢センサ3及び筋電センサ4の検出結果を示すデータを記憶できる。記憶部12は、例えば、姿勢データ12A等を記憶する。姿勢データ12Aは、検出対象となる手Hの姿勢を示すデータを含む。例えば、ウェアラブル装置1がゲーム機に用いられる場合、ゲームに応じて検出対象の手Hの姿勢が異なる可能性がある。このため、姿勢データ12Aは、検出対象となる手Hの姿勢を設定することで、ゲームに応じた手Hの姿勢を検出することができる。姿勢データ12Aは、日々の生活を営む上で必要不可欠な日常生活動作に応じた手Hの姿勢を示すデータを含んでもよい。姿勢データ12Aは、設定されたユーザの手Hが把握姿勢P2のときに実際に測定したデータを含んでもよい。換言すると、姿勢データ12Aは、キャリブレーションに用いるデータを含んでもよい。姿勢データ12Aは、複数の電極41ごとの測定結果が握力、手Hの状態に紐付けられている。姿勢データ12Aは、一般的な手Hの大きさに対応した測定結果を示すデータを含んでもよい。姿勢データ12Aは、例えば、電極41の位置と筋電位情報の変化量とに基づいて握力を求める算出プログラム、変換テーブル等のデータを含んでもよい。 The storage unit 12 stores various data. For example, the storage unit 12 can store data indicating the detection results of the posture sensor 3 and the myoelectric sensor 4. The storage unit 12 stores, for example, posture data 12A and the like. The posture data 12A includes data indicating the posture of the hand H to be detected. For example, when the wearable device 1 is used in a game machine, the posture of the hand H to be detected may differ depending on the game. Therefore, by setting the posture of the hand H to be detected, the posture data 12A makes it possible to detect the posture of the hand H according to the game. The posture data 12A may include data indicating the posture of the hand H according to activities of daily living, which are indispensable for conducting daily life. The posture data 12A may include data actually measured when the set user's hand H is in the grasping posture P2. In other words, the posture data 12A may include data used for calibration. In the posture data 12A, the measurement results for each of the plurality of electrodes 41 are linked to the grip strength and the state of the hand H. The posture data 12A may include data indicating measurement results corresponding to a general hand H size. The posture data 12A may include data such as a calculation program for obtaining grip strength based on the position of the electrode 41 and the amount of change in myoelectric potential information, a conversion table, and the like.
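Purely as a hypothetical illustration of the conversion-table idea mentioned above: the actual contents of the posture data 12A are not disclosed, so the bins, units, and values below are placeholders, not the patented table.

```python
# Hypothetical conversion table: change amount of myoelectric amplitude
# (arbitrary units) mapped to an estimated grip strength in newtons.
GRIP_TABLE = [
    (50.0, 10.0),
    (150.0, 50.0),
    (300.0, 120.0),
]


def grip_from_change(change: float) -> float:
    """Look up the grip strength for the largest bin the change amount reaches."""
    grip = 0.0
    for threshold, strength in GRIP_TABLE:
        if change >= threshold:
            grip = strength
    return grip
```

A real implementation would presumably also condition the lookup on which electrode 41 produced the change, as the text ties measurement results to electrode positions.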

 推定部13は、姿勢センサ3によって検出した手Hの姿勢が把握姿勢P2である場合に、筋電センサ4によって検出した筋電位情報に基づいて、ユーザの手Hの把握状態を推定する。推定部13は、把握姿勢P2が判定時間にわたって継続した場合に、手Hの把握状態を推定する。判定時間は、例えば、誤判定を回避するための時間を設定することができる。推定部13は、把握姿勢P2の開始を検出したときの筋電センサ4の筋電位情報を初期値とし、初期値と把握姿勢P2の開始後に検出した筋電位情報とに基づいて把握状態を推定する。例えば、推定部13は、初期値の筋電位情報と検出した筋電位情報との変化量に基づいて把握状態を推定する。推定部13は、手Hの把握状態を推定している場合に、把握姿勢P2の終了を検出すると、当該把握状態の推定を終了する。把握姿勢P2の終了は、例えば、把握姿勢P2から非把握姿勢P1への変化を含む。推定部13は、姿勢センサ3によって検出した手Hの姿勢が、記憶部12に記憶している姿勢データ12Aが示す把握姿勢である場合に、筋電センサ4によって検出した筋電位情報に基づいて、ユーザの手Hの把握状態を推定する。推定部13は、推定した推定結果を通信部14に出力する。 When the posture of the hand H detected by the posture sensor 3 is the grasping posture P2, the estimation unit 13 estimates the grasping state of the user's hand H based on the myoelectric potential information detected by the myoelectric sensor 4. The estimation unit 13 estimates the grasping state of the hand H when the grasping posture P2 continues for the determination time. As the determination time, for example, a time for avoiding an erroneous determination can be set. The estimation unit 13 sets the myoelectric potential information of the myoelectric sensor 4 at the time the start of the grasping posture P2 is detected as an initial value, and estimates the grasping state based on the initial value and the myoelectric potential information detected after the start of the grasping posture P2. For example, the estimation unit 13 estimates the grasping state based on the amount of change between the initial-value myoelectric potential information and the detected myoelectric potential information. When the estimation unit 13 detects the end of the grasping posture P2 while estimating the grasping state of the hand H, it ends the estimation of the grasping state. The end of the grasping posture P2 includes, for example, a change from the grasping posture P2 to the non-grasping posture P1. The estimation unit 13 estimates the grasping state of the user's hand H based on the myoelectric potential information detected by the myoelectric sensor 4 when the posture of the hand H detected by the posture sensor 3 is the grasping posture indicated by the posture data 12A stored in the storage unit 12. The estimation unit 13 outputs the estimation result to the communication unit 14.
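The baseline-relative behavior described above (take the myoelectric value at the start of grasping posture P2 as an initial value, then reduce later readings to their change amount, and stop estimating when P2 ends) can be sketched as follows; the class and method names are assumptions for illustration, not the disclosed implementation.

```python
class GraspEstimator:
    """Sketch of the estimation unit 13's baseline-relative estimation."""

    def __init__(self) -> None:
        self.baseline = None  # initial myoelectric value at the start of P2

    def start_grasp(self, initial_emg: float) -> None:
        # Called when the start of grasping posture P2 is detected.
        self.baseline = initial_emg

    def end_grasp(self) -> None:
        # Called when P2 ends (e.g. a change to non-grasping posture P1).
        self.baseline = None

    def change_amount(self, emg: float):
        # Outside the grasping posture, no estimation is performed.
        if self.baseline is None:
            return None
        return emg - self.baseline
```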

 通信部14は、無線により通信する。通信部14は、近距離無線通信方式をサポートする。通信部14は、外部の通信装置等と情報の無線通信を行う機能を有する。通信部14は、推定部13からの情報を外部の電子機器等に送信する。外部の電子機器は、例えば、ゲーム機、テレビジョン、スマートフォン、スマートスピーカ、コンピュータ等を含む。通信部14は、外部の電子機器等から受信した情報を推定部13に出力する。 The communication unit 14 communicates wirelessly. The communication unit 14 supports a short-range wireless communication system. The communication unit 14 has a function of wirelessly communicating information with an external communication device or the like. The communication unit 14 transmits the information from the estimation unit 13 to an external electronic device or the like. External electronic devices include, for example, game consoles, televisions, smartphones, smart speakers, computers and the like. The communication unit 14 outputs information received from an external electronic device or the like to the estimation unit 13.

 以上、本実施形態に係るウェアラブル装置1の機能構成例について説明した。なお、図4を用いて説明した上記の構成はあくまで一例であり、本実施形態に係るウェアラブル装置1の機能構成は係る例に限定されない。本実施形態に係るウェアラブル装置1の機能構成は、仕様や運用に応じて柔軟に変形可能である。例えば、ウェアラブル装置1は、情報処理ユニット10を装着部2の外部に設け、センサの検出結果を外部の情報処理ユニット10に送信する通信ユニットを装着部2に設ける構成としてもよい。 The functional configuration example of the wearable device 1 according to the present embodiment has been described above. The above configuration described with reference to FIG. 4 is merely an example, and the functional configuration of the wearable device 1 according to the present embodiment is not limited to such an example. The functional configuration of the wearable device 1 according to the present embodiment can be flexibly modified according to specifications and operations. For example, the wearable device 1 may be configured such that the information processing unit 10 is provided outside the mounting unit 2 and the communication unit for transmitting the detection result of the sensor to the external information processing unit 10 is provided in the mounting unit 2.

[第1の実施形態に係るウェアラブル装置1の処理手順]
 次に、第1の実施形態に係るウェアラブル装置1の情報処理ユニット10が実行する情報処理方法の一例について説明する。図5は、第1の実施形態に係るウェアラブル装置1が実行する処理手順の一例を示すフローチャートである。図5に示す処理手順は、ウェアラブル装置1の情報処理ユニット10がプログラムを実行することによって実現される。図5に示す処理手順は、情報処理ユニット10によって繰り返し実行される。
[Processing procedure of the wearable device 1 according to the first embodiment]
Next, an example of the information processing method executed by the information processing unit 10 of the wearable device 1 according to the first embodiment will be described. FIG. 5 is a flowchart showing an example of a processing procedure executed by the wearable device 1 according to the first embodiment. The processing procedure shown in FIG. 5 is realized by the information processing unit 10 of the wearable device 1 executing a program. The processing procedure shown in FIG. 5 is repeatedly executed by the information processing unit 10.

 図5に示すように、情報処理ユニット10は、姿勢センサ3の検出結果を取得する(ステップS101)。情報処理ユニット10は、ステップS101を実行することで、取得部11として機能する。情報処理ユニット10は、取得した検出結果を記憶部12に記憶すると、処理をステップS102に進める。 As shown in FIG. 5, the information processing unit 10 acquires the detection result of the posture sensor 3 (step S101). The information processing unit 10 functions as the acquisition unit 11 by executing step S101. When the information processing unit 10 stores the acquired detection result in the storage unit 12, the processing proceeds to step S102.

 情報処理ユニット10は、姿勢センサ3の検出結果に基づいて、ユーザの手Hの姿勢を判別する(ステップS102)。例えば、情報処理ユニット10は、姿勢センサ3の検出結果に基づく手Hの姿勢と姿勢データ12Aが示す把握姿勢とが一致あるいは類似している場合、手Hの姿勢を把握姿勢P2と判別し、判別結果を記憶部12に記憶する。情報処理ユニット10は、手Hの姿勢を把握姿勢P2ではないと判別した場合、その判別結果を記憶部12に記憶する。そして、情報処理ユニット10は、手Hの姿勢の判別が終了すると、処理をステップS103に進める。 The information processing unit 10 determines the posture of the user's hand H based on the detection result of the posture sensor 3 (step S102). For example, when the posture of the hand H based on the detection result of the posture sensor 3 and the grasping posture indicated by the posture data 12A match or are similar, the information processing unit 10 determines the posture of the hand H as the grasping posture P2. The determination result is stored in the storage unit 12. When the information processing unit 10 determines that the posture of the hand H is not the grasping posture P2, the information processing unit 10 stores the determination result in the storage unit 12. Then, when the determination of the posture of the hand H is completed, the information processing unit 10 proceeds to the process in step S103.

 情報処理ユニット10は、ステップS102の判別結果に基づいて、手Hの姿勢が把握姿勢P2であるか否かを判定する(ステップS103)。情報処理ユニット10は、手Hの姿勢が把握姿勢P2であると判定した場合(ステップS103でYes)、処理をステップS104に進める。情報処理ユニット10は、筋電センサ4による検出を開始したか否かを判定する(ステップS104)。情報処理ユニット10は、筋電センサ4による検出を開始したと判定した場合(ステップS104でYes)、処理を後述するステップS107に進める。また、情報処理ユニット10は、筋電センサ4による検出を開始していないと判定した場合(ステップS104でNo)、処理をステップS105に進める。 The information processing unit 10 determines whether or not the posture of the hand H is the grasping posture P2 based on the determination result in step S102 (step S103). When the information processing unit 10 determines that the posture of the hand H is the grasping posture P2 (Yes in step S103), the information processing unit 10 proceeds to the process in step S104. The information processing unit 10 determines whether or not the detection by the myoelectric sensor 4 has started (step S104). When the information processing unit 10 determines that the detection by the myoelectric sensor 4 has started (Yes in step S104), the information processing unit 10 proceeds to step S107, which will be described later. If the information processing unit 10 determines that the detection by the myoelectric sensor 4 has not started (No in step S104), the information processing unit 10 proceeds to step S105.

 情報処理ユニット10は、把握姿勢P2が判定時間継続したか否かを判定する(ステップS105)。例えば、情報処理ユニット10は、把握姿勢P2を検出した時点からの経過時間が判定時間に到達した場合に、把握姿勢P2が判定時間継続したと判定する。そして、情報処理ユニット10は、把握姿勢P2が判定時間継続したと判定した場合(ステップS105でYes)、処理をステップS106に進める。 The information processing unit 10 determines whether or not the grasping posture P2 has continued for the determination time (step S105). For example, when the elapsed time from the time when the grasping posture P2 is detected reaches the determination time, the information processing unit 10 determines that the grasping posture P2 has continued for the determination time. Then, when the information processing unit 10 determines that the grasping posture P2 has continued for the determination time (Yes in step S105), the information processing unit 10 proceeds to the process in step S106.

 情報処理ユニット10は、筋電センサ4による検出を開始する(ステップS106)。例えば、情報処理ユニット10は、電源1Bから筋電センサ4へ電力を供給させる。その結果、筋電センサ4は、複数の電極41によって筋電位情報の検出を開始する。情報処理ユニット10は、筋電センサ4による検出を開始すると、処理をステップS107に進める。 The information processing unit 10 starts detection by the myoelectric sensor 4 (step S106). For example, the information processing unit 10 supplies electric power from the power source 1B to the myoelectric sensor 4. As a result, the myoelectric sensor 4 starts detecting myoelectric potential information by the plurality of electrodes 41. When the information processing unit 10 starts the detection by the myoelectric sensor 4, the process proceeds to step S107.

 情報処理ユニット10は、筋電センサ4の検出結果を取得する(ステップS107)。情報処理ユニット10は、ステップS107を実行することで、取得部11として機能する。そして、情報処理ユニット10は、筋電センサ4によって検出した筋電位情報に基づいて、手Hの把握状態を推定する(ステップS108)。例えば、情報処理ユニット10は、把握姿勢P2の開始を検出したときの筋電センサ4の筋電位情報を初期値とし、初期値の筋電位情報と検出した筋電位情報との変化量及び姿勢データ12Aに基づいて、手Hの把握状態を推定する。情報処理ユニット10は、推定結果を記憶部12に記憶すると、処理をステップS109に進める。 The information processing unit 10 acquires the detection result of the myoelectric sensor 4 (step S107). The information processing unit 10 functions as the acquisition unit 11 by executing step S107. Then, the information processing unit 10 estimates the grasping state of the hand H based on the myoelectric potential information detected by the myoelectric sensor 4 (step S108). For example, the information processing unit 10 sets the myoelectric potential information of the myoelectric sensor 4 at the time the start of the grasping posture P2 is detected as an initial value, and estimates the grasping state of the hand H based on the amount of change between the initial-value myoelectric potential information and the detected myoelectric potential information and on the posture data 12A. When the information processing unit 10 stores the estimation result in the storage unit 12, the information processing unit 10 proceeds to the process in step S109.

 情報処理ユニット10は、推定結果の出力処理を実行する(ステップS109)。例えば、情報処理ユニット10は、出力処理を実行することにより、通信部14を介して推定結果を外部の電子機器等に送信する。そして、情報処理ユニット10は、出力処理が終了すると、処理を後述するステップS111に進める。 The information processing unit 10 executes the output processing of the estimation result (step S109). For example, the information processing unit 10 transmits the estimation result to an external electronic device or the like via the communication unit 14 by executing the output process. Then, when the output process is completed, the information processing unit 10 proceeds to step S111, which will be described later.

 また、情報処理ユニット10は、把握姿勢P2が判定時間継続していないと判定した場合(ステップS105でNo)、処理をステップS110に進める。情報処理ユニット10は、手Hの把握状態の推定を行っていないことを示す推定結果の出力処理を実行する(ステップS110)。例えば、情報処理ユニット10は、出力処理を実行することにより、通信部14を介して推定結果を外部の電子機器等に送信する。そして、情報処理ユニット10は、出力処理が終了すると、処理を後述するステップS111に進める。 Further, when the information processing unit 10 determines that the grasping posture P2 does not continue for the determination time (No in step S105), the information processing unit 10 proceeds to the process in step S110. The information processing unit 10 executes an output process of an estimation result indicating that the grasping state of the hand H has not been estimated (step S110). For example, the information processing unit 10 transmits the estimation result to an external electronic device or the like via the communication unit 14 by executing the output process. Then, when the output process is completed, the information processing unit 10 proceeds to step S111, which will be described later.

 情報処理ユニット10は、終了か否かを判定する(ステップS111)。例えば、情報処理ユニット10は、筋電センサ4の検出結果に基づいて、装着部2が手Hから外されたと推定した場合に、終了と判定する。例えば、情報処理ユニット10は、通信部14を介して終了要求を受信した場合に、終了と判定する。そして、情報処理ユニット10は、終了ではないと判定した場合(ステップS111でNo)、処理を既に説明したステップS101に戻し、ステップS101以降の一連の処理を繰り返す。また、情報処理ユニット10は、終了であると判定した場合(ステップS111でYes)、図5に示す処理手順を終了させる。 The information processing unit 10 determines whether or not to end the processing (step S111). For example, the information processing unit 10 determines that the processing is to end when it estimates, based on the detection result of the myoelectric sensor 4, that the mounting portion 2 has been removed from the hand H. For example, when the information processing unit 10 receives an end request via the communication unit 14, it determines that the processing is to end. Then, when the information processing unit 10 determines that the processing is not to end (No in step S111), the information processing unit 10 returns the process to step S101 already described, and repeats a series of processes after step S101. When the information processing unit 10 determines that the processing is to end (Yes in step S111), the information processing unit 10 terminates the processing procedure shown in FIG. 5.

 また、情報処理ユニット10は、手Hの姿勢が把握姿勢P2ではないと判定した場合(ステップS103でNo)、処理をステップS112に進める。情報処理ユニット10は、筋電センサ4による検出を開始したか否かを判定する(ステップS112)。情報処理ユニット10は、筋電センサ4による検出を開始していないと判定した場合(ステップS112でNo)、処理を既に説明したステップS110に進める。また、情報処理ユニット10は、筋電センサ4による検出を開始したと判定した場合(ステップS112でYes)、処理をステップS113に進める。 If the information processing unit 10 determines that the posture of the hand H is not the grasping posture P2 (No in step S103), the information processing unit 10 proceeds to step S112. The information processing unit 10 determines whether or not the detection by the myoelectric sensor 4 has started (step S112). When the information processing unit 10 determines that the detection by the myoelectric sensor 4 has not started (No in step S112), the information processing unit 10 proceeds to step S110 already described. If the information processing unit 10 determines that the detection by the myoelectric sensor 4 has started (Yes in step S112), the information processing unit 10 proceeds to step S113.

 情報処理ユニット10は、筋電センサ4による検出を終了する(ステップS113)。例えば、情報処理ユニット10は、電源1Bから筋電センサ4へ電力供給を停止させる。その結果、筋電センサ4は、複数の電極41によって筋電位情報の検出を終了する。情報処理ユニット10は、筋電センサ4による検出を終了すると、処理を既に説明したステップS111に進める。 The information processing unit 10 ends the detection by the myoelectric sensor 4 (step S113). For example, the information processing unit 10 stops the power supply from the power supply 1B to the myoelectric sensor 4. As a result, the myoelectric sensor 4 ends the detection of myoelectric potential information by the plurality of electrodes 41. When the information processing unit 10 finishes the detection by the myoelectric sensor 4, the information processing unit 10 proceeds to step S111 already described.
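The control flow of FIG. 5 (steps S101 to S113) can be condensed into the following sketch; sensor I/O is abstracted away, the judgment time is modeled as a loop count rather than wall-clock time, and all names are assumptions made for illustration.

```python
def process_step(state: dict, posture_is_p2: bool, judgment_count: int = 3) -> str:
    """One loop iteration of the Fig. 5 procedure; returns the action taken."""
    if posture_is_p2:                            # S103 Yes
        state["held"] += 1
        if state["sensing"]:                     # S104 Yes
            return "estimate"                    # S107-S109: estimate and output
        if state["held"] >= judgment_count:      # S105: P2 continued long enough
            state["sensing"] = True              # S106: start EMG detection
            return "estimate"
        return "no_estimate"                     # S110: no estimation yet
    state["held"] = 0                            # S103 No
    if state["sensing"]:                         # S112 Yes
        state["sensing"] = False                 # S113: stop EMG detection
        return "stopped"
    return "no_estimate"                         # S112 No -> S110
```

Calling this repeatedly with `state = {"held": 0, "sensing": False}` reproduces the behavior described in the text: estimation starts only after P2 persists, continues while P2 holds, and detection stops when the posture leaves P2.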

 なお、上述の図5に示す処理手順では、情報処理ユニット10は、ステップS102からステップS106及びステップS108からステップS111の処理を実行することで、推定部13として機能する。 In the processing procedure shown in FIG. 5 above, the information processing unit 10 functions as the estimation unit 13 by executing the processes of steps S102 to S106 and steps S108 to S111.

 また、上述の図5に示す処理手順は、ステップS110の処理を削除してもよい。すなわち、情報処理ユニット10は、手Hの把握状態の推定を行っていない場合、推定結果の出力を行わない処理手順としてもよい。また、当該処理手順は、把握姿勢P2が判定時間継続したか否かの判定処理(ステップS105)を削除してもよい。 Further, in the processing procedure shown in FIG. 5 above, the processing in step S110 may be deleted. That is, if the information processing unit 10 has not estimated the grasped state of the hand H, the processing procedure may be such that the estimation result is not output. Further, in the processing procedure, the determination process (step S105) of whether or not the grasping posture P2 has continued for the determination time may be deleted.

 以上のように、ウェアラブル装置1は、姿勢センサ3によって手Hの姿勢が非把握姿勢P1から把握姿勢P2になると、筋電センサ4が筋電位情報の検出を開始する。これにより、ウェアラブル装置1は、手Hの姿勢が非把握姿勢P1の場合は、筋電センサ4を駆動させる必要がなくなる。その結果、ウェアラブル装置1は、手Hの姿勢が非把握姿勢P1を筋電センサ4の待機状態とすることで、電力消費を低減させることができる。 As described above, in the wearable device 1, when the posture sensor 3 detects that the posture of the hand H has changed from the non-grasping posture P1 to the grasping posture P2, the myoelectric sensor 4 starts detecting myoelectric potential information. As a result, the wearable device 1 does not need to drive the myoelectric sensor 4 while the posture of the hand H is the non-grasping posture P1. Consequently, the wearable device 1 can reduce power consumption by treating the non-grasping posture P1 of the hand H as a standby state of the myoelectric sensor 4.

 また、ウェアラブル装置1は、姿勢センサ3によって検出した手Hの姿勢が把握姿勢P2ではない場合、推定部13が手Hの把握状態を推定しない。これにより、ウェアラブル装置1は、筋電センサ4の筋電位情報にノイズ等が含まれる可能性がある非把握姿勢P1の場合に、手Hの把握状態を推測することを回避できる。その結果、ウェアラブル装置1は、筋電センサ4の検出結果に基づく手Hの把握状態の推定精度を向上させることができる。 Further, in the wearable device 1, if the posture of the hand H detected by the posture sensor 3 is not the grasping posture P2, the estimation unit 13 does not estimate the grasping state of the hand H. As a result, the wearable device 1 can avoid estimating the grasping state of the hand H in the non-grasping posture P1, in which the myoelectric potential information of the myoelectric sensor 4 may include noise or the like. Consequently, the wearable device 1 can improve the estimation accuracy of the grasping state of the hand H based on the detection result of the myoelectric sensor 4.

 また、ウェアラブル装置1は、手Hの把握姿勢P2が判定時間にわたって継続した場合に、手Hの把握状態を推定部13によって推定する。これにより、ウェアラブル装置1は、手Hが一時的に把握姿勢P2に変化した場合は、把握状態の推定を行わない。その結果、ウェアラブル装置1は、ユーザが手Hを把握姿勢P2に変化させたことの検出精度を向上できるので、ユーザの手Hが把握姿勢P2ではないにもかかわらず手Hの把握状態の推定を行うことを抑制できる。 Further, the wearable device 1 causes the estimation unit 13 to estimate the grasping state of the hand H when the grasping posture P2 of the hand H continues for the determination time. As a result, the wearable device 1 does not estimate the grasping state when the hand H only temporarily changes to the grasping posture P2. Consequently, the wearable device 1 can improve the accuracy of detecting that the user has changed the hand H to the grasping posture P2, and can thus suppress estimating the grasping state of the hand H even though the user's hand H is not in the grasping posture P2.

 また、ウェアラブル装置1は、手Hの把握姿勢P2の開始を検出したときの筋電センサ4の筋電位情報を初期値とし、当該初期値と把握姿勢P2の開始後に検出した筋電位情報とに基づいて手Hの把握状態を推定する。これにより、ウェアラブル装置1は、手Hの把握姿勢P2の開始時の筋電位情報を基準とし、当該基準からの変化量に基づいて把握状態を推定することができる。その結果、ウェアラブル装置1は、筋電センサ4と手Hとの接触状態を考慮して把握状態を推定できるので、把握状態の推定精度を向上させることができる。 Further, the wearable device 1 sets the myoelectric potential information of the myoelectric sensor 4 at the time the start of the grasping posture P2 of the hand H is detected as an initial value, and estimates the grasping state of the hand H based on the initial value and the myoelectric potential information detected after the start of the grasping posture P2. As a result, the wearable device 1 can use the myoelectric potential information at the start of the grasping posture P2 of the hand H as a reference and estimate the grasping state based on the amount of change from that reference. Consequently, the wearable device 1 can estimate the grasping state in consideration of the contact state between the myoelectric sensor 4 and the hand H, so that the estimation accuracy of the grasping state can be improved.

 また、ウェアラブル装置1は、手Hの把握状態を推定している場合に、手Hの把握姿勢P2の終了を検出すると、当該把握状態の推定を終了する。これにより、ウェアラブル装置1は、手Hの把握姿勢P2が他の姿勢に変わると、筋電センサ4を用いた手Hの把握状態の推定を回避できる。その結果、ウェアラブル装置1は、筋電センサ4と手Hとのずれによる影響を抑制し、筋電センサ4を用いた手Hの筋力の推定精度を向上させることができる。 Further, when the wearable device 1 detects the end of the grasping posture P2 of the hand H while estimating the grasping state of the hand H, it ends the estimation of the grasping state. As a result, the wearable device 1 can avoid estimating the grasping state of the hand H using the myoelectric sensor 4 when the grasping posture P2 of the hand H changes to another posture. Consequently, the wearable device 1 can suppress the influence of the displacement between the myoelectric sensor 4 and the hand H, and can improve the estimation accuracy of the muscle strength of the hand H using the myoelectric sensor 4.

 また、ウェアラブル装置1は、姿勢センサ3によって検出した手Hの姿勢が、記憶部12に記憶している姿勢データ12Aが示す把握姿勢である場合に、筋電センサ4によって検出した筋電位情報に基づいて、ユーザの手Hの把握状態を推定する。これにより、ウェアラブル装置1は、姿勢データ12Aが示す手Hの把握姿勢P2である場合に、手Hの把握状態を推定することができる。その結果、ウェアラブル装置1は、種々異なる把握姿勢P2を設定することで、当該設定された把握姿勢P2に応じた手Hの把握状態を推定することができる。 Further, the wearable device 1 estimates the grasping state of the user's hand H based on the myoelectric potential information detected by the myoelectric sensor 4 when the posture of the hand H detected by the posture sensor 3 is the grasping posture indicated by the posture data 12A stored in the storage unit 12. As a result, the wearable device 1 can estimate the grasping state of the hand H when the posture is the grasping posture P2 of the hand H indicated by the posture data 12A. Consequently, by setting various different grasping postures P2, the wearable device 1 can estimate the grasping state of the hand H according to the set grasping posture P2.

 上述の第1の実施形態は一例を示したものであり、種々の変更及び応用が可能である。 The above-mentioned first embodiment shows an example, and various modifications and applications are possible.

 上述の第1の実施形態に係るウェアラブル装置1は、姿勢センサ3によって検出した手Hの姿勢が把握姿勢である場合に、筋電センサ4によって検出した筋電位情報に基づいて手Hの把握状態を推定するが、これに限定されない。例えば、ウェアラブル装置1は、手Hの把握状態の推定に、姿勢センサ3の検出結果を加味してもよい。 The wearable device 1 according to the first embodiment described above estimates the grasping state of the hand H based on the myoelectric potential information detected by the myoelectric sensor 4 when the posture of the hand H detected by the posture sensor 3 is the grasping posture; however, the present disclosure is not limited to this. For example, the wearable device 1 may take the detection result of the posture sensor 3 into account when estimating the grasping state of the hand H.

[第1の実施形態の変形例(1)]
 例えば、第1の実施形態に係るウェアラブル装置1は、姿勢センサ3の配置を変更することができる。
[Modified example of the first embodiment (1)]
For example, in the wearable device 1 according to the first embodiment, the arrangement of the posture sensor 3 can be changed.

 図6は、第1の実施形態の変形例(1)に係るウェアラブル装置1の一例を説明するための図である。図6に示すように、ウェアラブル装置1は、ユーザの手Hに装着された場合に、ユーザの手Hの指、指先等が露出するオープンフィンガーグローブ、指ぬきグローブ等となっている。なお、ウェアラブル装置1は、例えば、手Hの平を露出してもしなくてもよい。 FIG. 6 is a diagram for explaining an example of the wearable device 1 according to the modified example (1) of the first embodiment. As shown in FIG. 6, the wearable device 1 is an open finger glove, a thimble glove, or the like that exposes the fingers, fingertips, etc. of the user's hand H when worn on the user's hand H. The wearable device 1 may or may not expose the palm of the hand H, for example.

 第1の実施形態の変形例(1)に係るウェアラブル装置1は、装着部2Aと、姿勢センサ3と、筋電センサ4と、情報処理ユニット10と、を備える。第1の実施形態の変形例(1)では、ウェアラブル装置1は、手Hの5本の指の中手骨に跨がる1つの姿勢センサ3と、3つの筋電センサ4とを有する場合について説明するが、センサの数を限定するものではない。 The wearable device 1 according to the modified example (1) of the first embodiment includes a mounting portion 2A, a posture sensor 3, a myoelectric sensor 4, and an information processing unit 10. In the modification (1) of the first embodiment, the wearable device 1 has one posture sensor 3 straddling the metacarpal bones of five fingers of the hand H and three myoelectric sensors 4. However, the number of sensors is not limited.

 装着部2Aは、ウェアラブル装置1をユーザの手Hの指、指先等を露出する手袋である。装着部2Aは、例えば、布、合成繊維、皮等によって形成されている。装着部2Aは、ユーザの手Hの甲を覆い、当該手Hの動作に応じて変形する。装着部2Aは、姿勢センサ3、筋電センサ4及び情報処理ユニット10が設けられている。装着部2Aは、例えば、ユーザの手Hに装着されると、姿勢センサ3及び筋電センサ4を直接的または間接的に手Hの皮膚に接触させる。装着部2Aは、手Hに装着された状態において、ユーザが手Hを握っていない姿勢を示す非把握姿勢P1と、ユーザが手Hを握った姿勢を示す把握姿勢P2と、を有する。例えば、装着部2Aが把握姿勢P2の場合、ユーザの手Hは、指の全部または一部を内側に曲げた状態になっている。この場合、手Hの甲は、指が並んだ方向において、丸みを帯びた状態に変化する。このため、装着部2Aは、1つの姿勢センサ3を複数の指の中手骨に跨がる手Hの甲に配置し、その両側に筋電センサ4を配置している。 The mounting portion 2A is a glove of the wearable device 1 that exposes the fingers, fingertips, and the like of the user's hand H. The mounting portion 2A is formed of, for example, cloth, synthetic fibers, leather, or the like. The mounting portion 2A covers the back of the user's hand H and deforms according to the movement of the hand H. The mounting portion 2A is provided with the posture sensor 3, the myoelectric sensors 4, and the information processing unit 10. When the mounting portion 2A is mounted on the user's hand H, for example, the posture sensor 3 and the myoelectric sensors 4 are brought into direct or indirect contact with the skin of the hand H. The mounting portion 2A, in the state of being mounted on the hand H, has a non-grasping posture P1 indicating a posture in which the user does not grip the hand H and a grasping posture P2 indicating a posture in which the user grips the hand H. For example, when the mounting portion 2A is in the grasping posture P2, the user's hand H is in a state where all or part of the fingers are bent inward. In this case, the back of the hand H changes to a rounded state in the direction in which the fingers are lined up. Therefore, in the mounting portion 2A, one posture sensor 3 is arranged on the back of the hand H straddling the metacarpal bones of a plurality of fingers, and the myoelectric sensors 4 are arranged on both sides thereof.

 姿勢センサ3は、ユーザの手Hの甲の曲げを検出するように構成されている。姿勢センサ3は、手Hの甲の曲げを検出し、当該曲げが増加すると、抵抗値も増加する。姿勢センサ3は、一方がウェアラブル装置1の電源1Bに接続され、他方が接地されている。本実施形態では、姿勢センサ3は、情報処理ユニット10と電気的に接続されており、検出結果を情報処理ユニット10に出力する。 The posture sensor 3 is configured to detect the bending of the back of the user's hand H. The posture sensor 3 detects the bending of the back of the hand H, and as the bending increases, its resistance value also increases. One end of the posture sensor 3 is connected to the power supply 1B of the wearable device 1, and the other end is grounded. In the present embodiment, the posture sensor 3 is electrically connected to the information processing unit 10 and outputs its detection result to the information processing unit 10.

 本実施形態では、ウェアラブル装置1は、1つの姿勢センサ3を5本の指の中手骨に跨がる長さで手Hの甲に配置する場合について説明するが、これに限定されない。例えば、ウェアラブル装置1は、複数の姿勢センサ3のそれぞれを5本の指の中手骨のそれぞれに跨がるように並べて配置してもよい。また、ウェアラブル装置1は、把握姿勢P2のときに曲がる手Hの甲の一部に姿勢センサ3を配置する構成としてもよい。 In the present embodiment, a case will be described in which the wearable device 1 arranges one posture sensor 3 on the back of the hand H with a length straddling the metacarpal bones of the five fingers; however, the present disclosure is not limited to this. For example, in the wearable device 1, each of a plurality of posture sensors 3 may be arranged side by side so as to straddle each of the metacarpal bones of the five fingers. Further, the wearable device 1 may have a configuration in which the posture sensor 3 is arranged on a part of the back of the hand H that bends in the grasping posture P2.

 情報処理ユニット10は、1つの姿勢センサ3と3つの筋電センサ4とが検出した情報を処理する装置である。情報処理ユニット10は、例えば、ユーザの手Hの姿勢を判別する機能を有する。情報処理ユニット10は、例えば、ユーザの手Hの把握状態を推定する機能を有する。 The information processing unit 10 is a device that processes information detected by one posture sensor 3 and three myoelectric sensors 4. The information processing unit 10 has, for example, a function of determining the posture of the user's hand H. The information processing unit 10 has, for example, a function of estimating the grasping state of the user's hand H.

 図6に示すように、ウェアラブル装置1は、ユーザの手Hに装着されると、姿勢センサ3の検出結果に基づいて手Hの姿勢を判別する。例えば、ユーザの手Hは、手Hを開いた非把握姿勢P1であるとする。この場合、ウェアラブル装置1は、姿勢センサ3が検出する抵抗値が基準抵抗値Rthを超えないので、ユーザの手Hの姿勢が非把握姿勢P1と判別し、筋電センサ4の検出を行わない。その後、ユーザの手Hは、例えば、道具を握る等により、非把握姿勢P1から把握姿勢P2に変化したとする。この場合、ウェアラブル装置1は、姿勢センサ3が検出する抵抗値が基準抵抗値Rthを超えるので、ユーザの手Hの姿勢が把握姿勢P2と判別する。そして、ウェアラブル装置1は、筋電センサ4の検出を開始し、当該筋電センサ4の検出結果に基づいて手Hの把握状態を推定する。例えば、ウェアラブル装置1は、筋電センサ4の電極41ごとの筋電位情報の周波数成分に基づいて、手指の状態、握力等を推定する。 As shown in FIG. 6, when the wearable device 1 is attached to the user's hand H, the wearable device 1 determines the posture of the hand H based on the detection result of the posture sensor 3. For example, it is assumed that the user's hand H is in the non-grasping posture P1 with the hand H open. In this case, since the resistance value detected by the posture sensor 3 does not exceed the reference resistance value Rth, the wearable device 1 determines that the posture of the user's hand H is the non-grasping posture P1 and does not perform detection with the myoelectric sensor 4. After that, it is assumed that the user's hand H changes from the non-grasping posture P1 to the grasping posture P2 by, for example, gripping a tool. In this case, since the resistance value detected by the posture sensor 3 exceeds the reference resistance value Rth, the wearable device 1 determines that the posture of the user's hand H is the grasping posture P2. Then, the wearable device 1 starts the detection of the myoelectric sensor 4, and estimates the grasping state of the hand H based on the detection result of the myoelectric sensor 4. For example, the wearable device 1 estimates the state of the fingers, the grip strength, and the like based on the frequency components of the myoelectric potential information for each electrode 41 of the myoelectric sensor 4.
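The threshold logic in this paragraph can be read as a small state machine: EMG detection is gated on the posture determination. The sketch below is one illustrative reading, not the disclosed implementation; the threshold value and the `estimate_grip` placeholder are assumptions:

```python
# Illustrative state machine (assumed logic): myoelectric detection runs only
# while the posture sensor's resistance exceeds the reference value Rth, i.e.
# while the hand is determined to be in grasping posture P2.
R_TH = 25_000.0  # reference resistance Rth in ohms (hypothetical value)

def estimate_grip(emg_samples):
    # Placeholder estimator: rectified mean of the EMG samples.
    return sum(abs(s) for s in emg_samples) / len(emg_samples)

class WearableDevice:
    def __init__(self):
        self.emg_active = False   # True while myoelectric detection is running
        self.estimates = []       # grasping-state estimates made during P2

    def on_posture_sample(self, resistance, read_emg):
        if resistance > R_TH:         # grasping posture P2
            self.emg_active = True    # P1 -> P2 starts EMG detection
            self.estimates.append(estimate_grip(read_emg()))
        else:                         # non-grasping posture P1
            self.emg_active = False   # P2 -> P1 ends EMG detection
```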

 例えば、ウェアラブル装置1は、筋電センサ4の筋電位情報が示す筋電信号が手Hの動作によって相異なる周波数特性を示すという特性を利用して手Hの動作、握力等を推測する。例えば、ウェアラブル装置1は、筋電信号のパラメータとして、積分値平均電位、周波数スペクトル等を抽出することができる。積分値平均電位は、筋電信号を整流、平均化したものであり、筋力が大きいほど筋活動が活発になって大きな値をとることが知られている。したがって、ウェアラブル装置1は、積分値平均電位を用いた検出方法によって手Hの力を検出する。また、ウェアラブル装置1は、周波数スペクトルを用いたジェスチャ検出方法を用いて、手Hの細かい動作を推定してもよい。 For example, the wearable device 1 estimates the movement, grip strength, etc. of the hand H by utilizing the characteristic that the myoelectric signal indicated by the myoelectric potential information of the myoelectric sensor 4 shows different frequency characteristics depending on the movement of the hand H. For example, the wearable device 1 can extract an integrated average potential, a frequency spectrum, and the like as parameters of the myoelectric signal. The integrated average potential is obtained by rectifying and averaging the myoelectric signal, and it is known that the greater the muscle strength, the more active the muscle activity and the larger the value. Therefore, the wearable device 1 detects the force of the hand H by a detection method using the integrated average potential. Further, the wearable device 1 may estimate fine movements of the hand H by using a gesture detection method using the frequency spectrum.
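The two signal parameters named here admit simple discrete-time forms. The sketch below is a minimal illustration; the exact windowing and normalization used by the device are not specified in the disclosure, so these forms are assumptions:

```python
# Discrete-time sketches of the two EMG parameters (assumed forms): the
# integrated average potential — the rectified, averaged myoelectric signal,
# which grows with stronger muscle activity — and a magnitude spectrum whose
# shape differs between hand movements.
import cmath

def integrated_average_potential(samples):
    """Rectify and average the myoelectric signal; grows with exerted force."""
    return sum(abs(s) for s in samples) / len(samples)

def magnitude_spectrum(samples):
    """Naive DFT magnitudes of the EMG window (O(n^2), for illustration only)."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n)]
```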

 その後、ユーザの手Hは、把握姿勢P2から非把握姿勢P1に遷移したとする。この場合、ウェアラブル装置1は、姿勢センサ3が検出する抵抗値が基準抵抗値Rthを超えないので、ユーザの手Hの姿勢が非把握姿勢P1と判別し、筋電センサ4の検出を終了する。 After that, it is assumed that the user's hand H transitions from the grasping posture P2 to the non-grasping posture P1. In this case, since the resistance value detected by the posture sensor 3 does not exceed the reference resistance value Rth, the wearable device 1 determines that the posture of the user's hand H is the non-grasping posture P1, and ends the detection of the myoelectric sensor 4.

 以上のように、第1の実施形態の変形例(1)に係るウェアラブル装置1は、装着部2Aがオープンフィンガーグローブであり、姿勢センサ3が装着部2Aによって手Hの甲に位置付けられる。ウェアラブル装置1は、装着部2Aが手Hに装着された場合、手Hの指先が露出するので、ユーザが指先を自由に使えるようになる。その結果、ウェアラブル装置1は、姿勢センサ3によって手Hの姿勢を検出する精度を向上させるとともに、ユーザの操作性を向上させることができる。 As described above, in the wearable device 1 according to the modified example (1) of the first embodiment, the mounting portion 2A is an open finger glove, and the posture sensor 3 is positioned on the back of the hand H by the mounting portion 2A. In the wearable device 1, when the mounting portion 2A is mounted on the hand H, the fingertips of the hand H are exposed, so that the user can freely use the fingertips. As a result, the wearable device 1 can improve the accuracy of detecting the posture of the hand H by the posture sensor 3 and improve the operability of the user.

[第1の実施形態の変形例(2)]
 例えば、第1の実施形態に係るウェアラブル装置1は、手Hの種々異なる姿勢に対応することができる。
[Modified Example (2) of the First Embodiment]
For example, the wearable device 1 according to the first embodiment can correspond to various different postures of the hand H.

 例えば、日常生活動作は、日々の生活を営む上で不可欠な基本的行動への指標となる。食事や排泄、入浴などの日常的な動作には、複雑な動作が組み合わされるケースが多く、握力が低下していると様々な問題が生じる。問題としては、例えば、重いものを長時間もてない、瓶やペットボトルの蓋が開けられない、撚るタイプのドアが開けにくい、食事を摂るのが難しい等が挙げられる。このため、第1の実施形態の変形例(2)に係るウェアラブル装置1は、手Hの姿勢の検出と握力推定とを組み合わせることで、ユーザの動作を把握して支援することができる。 For example, activities of daily living are indicators of basic behaviors that are indispensable for conducting daily living. In many cases, complicated movements are combined with daily movements such as eating, excreting, and bathing, and various problems occur when the grip strength is weakened. Problems include, for example, not being able to hold heavy objects for a long time, being unable to open the lids of bottles and PET bottles, having difficulty opening twist-type doors, and having difficulty eating. Therefore, the wearable device 1 according to the modified example (2) of the first embodiment can grasp and support the user's movement by combining the detection of the posture of the hand H and the estimation of the grip strength.

 第1の実施形態の変形例(2)に係るウェアラブル装置1は、装着部2と、姿勢センサ3と、筋電センサ4と、情報処理ユニット10と、を備える。なお、第1の実施形態の変形例(2)に係るウェアラブル装置1は、装着部2を装着部2Aに置き換えてもよい。情報処理ユニット10は、取得部11と、記憶部12と、推定部13と、通信部14と、を備える。 The wearable device 1 according to the modified example (2) of the first embodiment includes a mounting portion 2, a posture sensor 3, a myoelectric sensor 4, and an information processing unit 10. In the wearable device 1 according to the modification (2) of the first embodiment, the mounting portion 2 may be replaced with the mounting portion 2A. The information processing unit 10 includes an acquisition unit 11, a storage unit 12, an estimation unit 13, and a communication unit 14.

 図7は、第1の実施形態の変形例(2)に係るウェアラブル装置1の把握姿勢P2の一例を示す図である。図7に示すように、ウェアラブル装置1は、姿勢データ12Aによってユーザのシーンと把握姿勢P2とを紐付けている。シーンは、例えば、荷物を持つ、蓋を開ける、ドアを開く、食事を摂る等のシーンを含む。把握姿勢P2には、例えば、手Hの姿勢に対応した指の曲げ量、形状等の情報が紐付けられている。 FIG. 7 is a diagram showing an example of the grasping posture P2 of the wearable device 1 according to the modified example (2) of the first embodiment. As shown in FIG. 7, the wearable device 1 associates the user's scene with the grasping posture P2 by the posture data 12A. The scenes include, for example, holding luggage, opening a lid, opening a door, eating a meal, and the like. The grasping posture P2 is associated with, for example, information such as the bending amount and shape of the fingers corresponding to the posture of the hand H.

 図7に示す例では、姿勢データ12Aは、荷物を持つシーンには、手Hで袋をぶら下げる姿勢に関する情報を紐付けている。姿勢データ12Aは、蓋を開けるシーンには、指で蓋を握る姿勢に関する情報を紐付けている。姿勢データ12Aは、ドアを開くシーンには、ドアノブを掴む姿勢に関する情報を紐付けている。姿勢データ12Aは、食事を摂るシーンには、食器を掴む姿勢に関する情報を紐付けている。 In the example shown in FIG. 7, the posture data 12A associates information on the posture of hanging the bag with the hand H with the scene of holding the luggage. The posture data 12A associates information on the posture of holding the lid with a finger with the scene of opening the lid. The posture data 12A associates information on the posture of grasping the doorknob with the scene of opening the door. The posture data 12A associates information on the posture of grasping the tableware with the scene of eating a meal.

 情報処理ユニット10の推定部13は、推定対象のシーンに対応する姿勢を姿勢データ12Aに基づいて特定する。推定部13は、姿勢センサ3の検出結果に基づいて判別した手Hの姿勢と、対象のシーンに対応する姿勢とが一致または類似している場合、筋電センサ4を用いて手Hの握力等の把握状態を推定する。推定部13は、推定した握力をユーザの対象シーンにおける動作結果として出力する機能を提供できる。 The estimation unit 13 of the information processing unit 10 specifies the posture corresponding to the scene to be estimated based on the posture data 12A. When the posture of the hand H determined based on the detection result of the posture sensor 3 matches or resembles the posture corresponding to the target scene, the estimation unit 13 estimates the grasping state, such as the grip strength of the hand H, using the myoelectric sensor 4. The estimation unit 13 can provide a function of outputting the estimated grip strength as an operation result in the target scene of the user.
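The scene-matching step can be illustrated as a lookup against the posture data 12A. The dictionary below reduces each stored posture to a single expected bend value with a tolerance, which is a deliberate simplification: the disclosed data ties scenes to finger bend amounts and shapes, and every number here is an assumed example:

```python
# Illustrative lookup against posture data 12A (all values hypothetical).
POSTURE_DATA_12A = {      # scene -> expected bend of the grasping posture P2
    "hold_luggage": 0.8,
    "open_lid": 0.6,
    "open_door": 0.7,
    "eat_meal": 0.5,
}

def matches_scene(detected_bend, scene, tolerance=0.1):
    """True when the detected posture matches or resembles the scene's posture."""
    return abs(detected_bend - POSTURE_DATA_12A[scene]) <= tolerance

def estimate_for_scene(detected_bend, scene, emg_samples):
    """Estimate grip strength only when the posture matches the target scene."""
    if not matches_scene(detected_bend, scene):
        return None
    # Rectified-mean EMG as a stand-in grip-strength estimate.
    return sum(abs(s) for s in emg_samples) / len(emg_samples)
```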

 例えば、対象のシーンが荷物を持つシーンであるとする。この場合、ウェアラブル装置1は、ユーザの手Hに装着されると、姿勢センサ3によって手Hの姿勢を検出する。ウェアラブル装置1は、対象のシーンの把握姿勢P2を姿勢データ12Aに基づいて特定する。ウェアラブル装置1は、姿勢センサ3によって検出した姿勢が当該把握姿勢P2である場合、筋電センサ4によって検出した筋電位情報に基づいて、ユーザの手Hの握力(把握状態)を推定する。ウェアラブル装置1は、推定した握力を対象のシーンを示す識別情報に紐付けて記憶部12に記憶するとともに、外部の電子機器に送信する。その結果、ウェアラブル装置1は、対象のシーンのユーザの握力を示すことで、ユーザの握力が衰えているか否か等を認識させることができる。 For example, assume that the target scene is a scene of holding luggage. In this case, when the wearable device 1 is attached to the user's hand H, the posture sensor 3 detects the posture of the hand H. The wearable device 1 specifies the grasping posture P2 of the target scene based on the posture data 12A. When the posture detected by the posture sensor 3 is the grasping posture P2, the wearable device 1 estimates the grip strength (grasping state) of the user's hand H based on the myoelectric potential information detected by the myoelectric sensor 4. The wearable device 1 associates the estimated grip strength with identification information indicating the target scene, stores it in the storage unit 12, and transmits it to an external electronic device. As a result, by indicating the user's grip strength in the target scene, the wearable device 1 can make it recognizable whether or not the user's grip strength is declining.

 以上のように、第1の実施形態の変形例(2)に係るウェアラブル装置1は、姿勢データ12Aがシーンに応じた把握姿勢P2を含み、姿勢センサ3によって検出した手Hの姿勢がシーンに対応した把握姿勢である場合、手Hの把握状態を推定する。これにより、ウェアラブル装置1は、ユーザが日常生活で装着されることで、日常生活のシーンに応じた手Hの把握状態を推定することができる。例えば、ユーザが重いものを長時間持てない場合、姿勢データ12Aのシーンとして「荷物を持つ」を設定することで、ウェアラブル装置1は、荷物を持っているときの把握姿勢P2である場合に、把握状態を推定する。その結果、ウェアラブル装置1は、シーンで推定した手Hの把握状態を出力することで、ユーザの把握状態に基づく日常生活の支援に貢献することができる。 As described above, in the wearable device 1 according to the modified example (2) of the first embodiment, the posture data 12A includes the grasping posture P2 according to the scene, and when the posture of the hand H detected by the posture sensor 3 is the grasping posture corresponding to the scene, the grasping state of the hand H is estimated. As a result, when the user wears the wearable device 1 in daily life, the wearable device 1 can estimate the grasping state of the hand H according to the scene of daily life. For example, when the user cannot hold a heavy object for a long time, by setting "holding luggage" as a scene of the posture data 12A, the wearable device 1 estimates the grasping state when the hand is in the grasping posture P2 of holding luggage. As a result, by outputting the grasping state of the hand H estimated in the scene, the wearable device 1 can contribute to supporting the user's daily life based on the grasping state.

 なお、第1の実施形態の変形例(2)は、他の実施形態、変形例のウェアラブル装置1に適用してもよい。 Note that the modified example (2) of the first embodiment may be applied to the wearable device 1 of another embodiment or modified example.

(第2の実施形態)
[第2の実施形態に係るウェアラブル装置1を用いたシステムの概要]
 次に、第2の実施形態について説明する。図8は、第2の実施形態に係るシステムの一例を説明するための図である。図8に示すように、システム100は、例えば、ビデオゲームのインタラクティブなゲームプレイのためのシステムである。システム100は、第1の実施形態に係るウェアラブル装置1と、ヘッドマウントディスプレイ(HMD:Head Mounted Display)110と、を備える。HMD110は、例えば、眼鏡、ゴーグル、またはヘルメットと類似の方法で着用され、ユーザUにビデオゲームまたは他のコンテンツを表示するように構成される。HMD110は、ユーザUの目に近接する表示機構の提供により、ユーザUが没入できる体験を提供する。HMD110は、ユーザUの視界の大部分または全体を占める、ユーザUの目のそれぞれに対する表示領域を提供することができる。
(Second Embodiment)
[Outline of the system using the wearable device 1 according to the second embodiment]
Next, the second embodiment will be described. FIG. 8 is a diagram for explaining an example of the system according to the second embodiment. As shown in FIG. 8, system 100 is, for example, a system for interactive gameplay of video games. The system 100 includes a wearable device 1 according to the first embodiment, and a head-mounted display (HMD) 110. The HMD 110 is worn, for example, in a manner similar to eyeglasses, goggles, or helmets and is configured to display a video game or other content to User U. The HMD 110 provides an immersive experience for the user U by providing a display mechanism that is close to the eyes of the user U. The HMD 110 can provide a display area for each of the user U's eyes, which occupies most or the entire field of view of the user U.

 HMD110は、有線または無線によってコンピュータ120に接続することができる。コンピュータ120は、例えば、ゲーム機、パーソナルコンピュータ、ラップトップ、タブレットコンピュータ、携帯機器、携帯電話、タブレット、シンクライアント、セットトップボックス、メディアストリーミングデバイスを含むが、これらに限定されない。本実施形態では、コンピュータ120は、ビデオゲームを実行し、HMD110によってレンダリングするためにビデオゲームからのビデオ及びオーディオを出力するように構成されている。 The HMD 110 can be connected to the computer 120 by wire or wirelessly. Computer 120 includes, but is not limited to, for example, game consoles, personal computers, laptops, tablet computers, mobile devices, mobile phones, tablets, thin clients, set-top boxes, media streaming devices. In this embodiment, the computer 120 is configured to run a video game and output video and audio from the video game for rendering by the HMD 110.

 システム100は、ユーザUがビデオゲームの入力を行う手段として、ウェアラブル装置1を用いることができる。図8に示す例では、システム100は、ユーザUの左右の手Hにウェアラブル装置1を装着している。そして、ウェアラブル装置1は、推測した手Hの把握状態、握力等をHMD110、コンピュータ120等の連携装置に出力する。例えば、コンピュータ120は、手Hの把握状態、握力等をAR、VR、ビデオゲーム等に反映する処理を実行することができる。例えば、ボクシングのゲームの場合、コンピュータ120は、ウェアラブル装置1から入力されたユーザUの握力に応じて、相手に与えるダメージを変化させ、その結果を示すビデオ及びオーディオをHMD110に出力する。HMD110は、コンピュータ120からのビデオ及びオーディオをユーザUに出力する。 The system 100 can use the wearable device 1 as a means for the user U to input a video game. In the example shown in FIG. 8, the system 100 has the wearable device 1 attached to the left and right hands H of the user U. Then, the wearable device 1 outputs the estimated grasping state of the hand H, the grip strength, and the like to the cooperation device such as the HMD 110 and the computer 120. For example, the computer 120 can execute a process of reflecting the grasping state of the hand H, the grip strength, and the like in AR, VR, a video game, and the like. For example, in the case of a boxing game, the computer 120 changes the damage given to the opponent according to the grip strength of the user U input from the wearable device 1, and outputs a video and audio showing the result to the HMD 110. The HMD 110 outputs video and audio from the computer 120 to the user U.

 コンピュータ120には、カメラ130が接続されている。カメラ130は、ユーザUが位置するインタラクティブ環境の画像をキャプチャするように構成することができる。これらのキャプチャされた画像は、ユーザU、HMD110、及びウェアラブル装置1の位置及び移動を判定するために分析することができる。HMD110は、HMD110の位置及び方向を判定するためにトラッキングすることができる1つまたは複数のライトを含んでもよい。カメラ130は、インタラクティブ環境からの音をキャプチャするための1つまたは複数のマイクロフォンを含むことができる。カメラ130は、複数の画像キャプチャデバイス(たとえば、カメラの立体視対)、IRカメラ、深度カメラ、及びその組合せを含むように構成してもよい。本実施形態では、カメラ130は、IRカメラを含む場合について説明する。 The camera 130 is connected to the computer 120. The camera 130 can be configured to capture an image of the interactive environment in which the user U is located. These captured images can be analyzed to determine the positions and movements of the user U, the HMD 110, and the wearable device 1. The HMD 110 may include one or more lights that can be tracked to determine the position and orientation of the HMD 110. The camera 130 can include one or more microphones for capturing sound from the interactive environment. The camera 130 may be configured to include a plurality of image capture devices (for example, a stereoscopic pair of cameras), an IR camera, a depth camera, and combinations thereof. In the present embodiment, a case where the camera 130 includes an IR camera will be described.

 コンピュータ120は、ネットワーク200上でクラウドゲーミングプロバイダ150と通信するシンクライアントとして機能する。クラウドゲーミングプロバイダ150は、ユーザUによってプレイされているビデオゲームを保守及び実行する。コンピュータ120は、ウェアラブル装置1、HMD110、及びカメラ130からの入力をクラウドゲーミングプロバイダに伝送する。クラウドゲーミングプロバイダ150は、実行しているビデオゲームのゲーム状態に影響を及ぼす入力を処理する。ビデオデータ、オーディオデータ、及び触覚フィードバックデータなどの、実行しているビデオゲームからの出力は、コンピュータ120に伝送される。 The computer 120 functions as a thin client that communicates with the cloud gaming provider 150 on the network 200. The cloud gaming provider 150 maintains and runs the video game being played by user U. The computer 120 transmits the inputs from the wearable device 1, the HMD 110, and the camera 130 to the cloud gaming provider. The cloud gaming provider 150 processes inputs that affect the game state of the video game being run. The output from the running video game, such as video data, audio data, and tactile feedback data, is transmitted to the computer 120.

 図9は、第2の実施形態に係るシステム100におけるウェアラブル装置1の一例を説明するための図である。図9に示すように、ウェアラブル装置1は、ユーザUの手Hに装着されている。そして、ユーザUは、ウェアラブル装置1を装着した手Hでペン160を握っている。ペン160は、ペン先に反射マーカーが設けられている。ウェアラブル装置1は、姿勢センサ3によって検出した手Hの姿勢が把握姿勢である場合、筋電センサ4によって検出した筋電位情報に基づいて手Hの把握状態を推定する。この場合、ウェアラブル装置1は、親指と人差し指とで物を把握している把握状態と握力とを推定する。 FIG. 9 is a diagram for explaining an example of the wearable device 1 in the system 100 according to the second embodiment. As shown in FIG. 9, the wearable device 1 is attached to the hand H of the user U. Then, the user U holds the pen 160 with the hand H equipped with the wearable device 1. The pen 160 is provided with a reflection marker at the pen tip. When the posture of the hand H detected by the posture sensor 3 is the grasping posture, the wearable device 1 estimates the grasping state of the hand H based on the myoelectric potential information detected by the myoelectric sensor 4. In this case, the wearable device 1 estimates the grasping state and the grip strength in which the object is grasped by the thumb and the index finger.

 コンピュータ120は、カメラ130がユーザUを撮像した画像を解析することで、手Hとペン160のペン先との空間における位置を検出する。例えば、コンピュータ120は、HMD110、カメラ130等から赤外線を照射させ、その反射光に基づいて空間におけるペン160のペン先の位置を検出する。そして、コンピュータ120は、ウェアラブル装置1が推定した把握状態と握力とに基づいてユーザUがペン160を握っているときの握力を求め、当該握力に応じた線の太さを決定する。例えば、コンピュータ120は、ユーザUがペン160を握る力が強くなるにしたがって描画する線を太くする。コンピュータ120は、空間を移動したペン160のペン先の位置と手Hの握力とに基づいて描いた画像をHMD110に表示させる。その結果、システム100は、ウェアラブル装置1が推定した手Hの把握状態に基づいて、ユーザUがフリーハンドで空間内に描いた画像をHMD110に表示させることができる。 The computer 120 detects the positions of the hand H and the pen tip of the pen 160 in space by analyzing the image of the user U captured by the camera 130. For example, the computer 120 causes the HMD 110, the camera 130, or the like to emit infrared rays, and detects the position of the pen tip of the pen 160 in space based on the reflected light. Then, the computer 120 obtains the grip strength when the user U is gripping the pen 160 based on the grasping state and the grip strength estimated by the wearable device 1, and determines the thickness of a line according to the grip strength. For example, the computer 120 makes the drawn line thicker as the force with which the user U grips the pen 160 increases. The computer 120 causes the HMD 110 to display an image drawn based on the position of the pen tip of the pen 160 moved through space and the grip strength of the hand H. As a result, the system 100 can cause the HMD 110 to display an image drawn freehand in space by the user U based on the grasping state of the hand H estimated by the wearable device 1.
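The mapping from grip strength to line thickness could, for instance, be a clamped linear ramp. The sketch below is only one plausible choice; the width range and the saturation point are assumed values, not taken from the disclosure:

```python
# Possible grip-to-thickness mapping for the drawing example: a clamped linear
# ramp so that stronger grips on the pen 160 draw thicker lines.
MIN_WIDTH, MAX_WIDTH = 1.0, 10.0  # stroke width in pixels (assumed)
MAX_GRIP = 50.0                   # grip strength at which width saturates (assumed)

def stroke_width(grip: float) -> float:
    """Thicker lines for stronger grips, clamped to [MIN_WIDTH, MAX_WIDTH]."""
    ratio = max(0.0, min(grip / MAX_GRIP, 1.0))
    return MIN_WIDTH + ratio * (MAX_WIDTH - MIN_WIDTH)
```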

 本実施形態では、ウェアラブル装置1、HMD110、及びカメラ130はそれ自体が、クラウドゲーミングプロバイダ150と通信するためにネットワーク200に接続するネットワーク化されたデバイスであってもよい。例えば、コンピュータ120は、ビデオゲーム処理を別様に実行しないが、ネットワークトラフィックの通過を容易にするルータなどのローカルネットワークデバイスであってもよい。ウェアラブル装置1、HMD110、及びカメラ130によるネットワークへの接続は、有線またはワイヤレスでもよい。 In this embodiment, the wearable device 1, the HMD 110, and the camera 130 may themselves be networked devices that connect to the network 200 to communicate with the cloud gaming provider 150. For example, computer 120 may be a local network device, such as a router, that does not perform video game processing separately, but facilitates the passage of network traffic. The connection to the network by the wearable device 1, the HMD 110, and the camera 130 may be wired or wireless.

 以上のように、システム100は、ウェアラブル装置1の姿勢センサ3によって検出した手Hの姿勢が把握姿勢である場合、筋電センサ4によって検出した筋電位情報に基づいて手Hの把握状態を推定する。システム100は、ウェアラブル装置1が推定した推定結果をコンピュータ120に出力し、コンピュータ120の処理結果をHMD110で出力する。これにより、システム100は、ウェアラブル装置1の筋電センサ4を用いた手Hの筋力の推定精度が向上しているので、HMD110の出力のリアリティを向上させることができる。また、システム100は、手Hの姿勢が把握姿勢の場合に、カメラ130によって手Hを撮像すればよいので、電力の消費を低減させることができる。 As described above, when the posture of the hand H detected by the posture sensor 3 of the wearable device 1 is the grasping posture, the system 100 estimates the grasping state of the hand H based on the myoelectric potential information detected by the myoelectric sensor 4. The system 100 outputs the estimation result estimated by the wearable device 1 to the computer 120, and outputs the processing result of the computer 120 on the HMD 110. As a result, since the accuracy of estimating the muscle strength of the hand H using the myoelectric sensor 4 of the wearable device 1 is improved, the system 100 can improve the reality of the output of the HMD 110. Further, in the system 100, since the hand H only needs to be imaged by the camera 130 when the posture of the hand H is the grasping posture, power consumption can be reduced.

 なお、第2の実施形態に、他の実施形態、変形例の技術思想を組み合わせてもよい。 Note that the second embodiment may be combined with the technical ideas of other embodiments and modifications.

 以上、添付図面を参照しながら本開示の好適な実施形態について詳細に説明したが、本開示の技術的範囲はかかる例に限定されない。本開示の技術分野における通常の知識を有する者であれば、請求の範囲に記載された技術的思想の範疇内において、各種の変更例または修正例に想到し得ることは明らかであり、これらについても、当然に本開示の技術的範囲に属するものと了解される。 Although the preferred embodiments of the present disclosure have been described in detail with reference to the accompanying drawings, the technical scope of the present disclosure is not limited to such examples. It is clear that a person with ordinary knowledge in the technical field of the present disclosure can come up with various modifications or modifications within the scope of the technical ideas described in the claims. Of course, it is understood that it belongs to the technical scope of the present disclosure.

 また、本明細書に記載された効果は、あくまで説明的または例示的なものであって限定的ではない。つまり、本開示に係る技術は、上記の効果とともに、または上記の効果に代えて、本明細書の記載から当業者には明らかな他の効果を奏しうる。 Further, the effects described in the present specification are merely explanatory or exemplary and are not limited. That is, the technique according to the present disclosure may exhibit other effects apparent to those skilled in the art from the description herein, in addition to or in place of the above effects.

 また、コンピュータに内蔵されるCPU、ROMおよびRAMなどのハードウェアに、情報処理ユニット10が有する構成と同等の機能を発揮させるためのプログラムも作成可能であり、当該プログラムを記録した、コンピュータに読み取り可能な記録媒体も提供され得る。 In addition, it is also possible to create a program for causing hardware such as a CPU, ROM, and RAM built into a computer to exhibit functions equivalent to the configuration of the information processing unit 10, and a computer-readable recording medium on which the program is recorded may also be provided.

 また、本明細書のウェアラブル装置1の処理に係る各ステップは、必ずしもフローチャートに記載された順序に沿って時系列に処理される必要はない。例えば、情報処理ユニットの処理に係る各ステップは、フローチャートに記載された順序と異なる順序で処理されても、並列的に処理されてもよい。 Further, each step related to the processing of the wearable device 1 in the present specification does not necessarily have to be processed in chronological order in the order described in the flowchart. For example, each step related to the processing of the information processing unit may be processed in an order different from the order described in the flowchart, or may be processed in parallel.

 なお、以下のような構成も本開示の技術的範囲に属する。
(1)
 ユーザの手の姿勢を検出する姿勢センサと、
 前記手の筋電位情報を検出する筋電センサと、
 前記姿勢センサによって検出した前記姿勢が把握姿勢である場合に、前記筋電センサによって検出した前記筋電位情報に基づいて、前記ユーザの前記手の把握状態を推定する推定部と、
 を備えるウェアラブル装置。
(2)
 前記筋電センサは、前記手に接触し、当該手の筋肉で発生する活動電位を示す前記筋電位情報を検出する乾式電極を有する
 前記(1)に記載のウェアラブル装置。
(3)
 前記姿勢センサは、前記手の指の動きを検出する曲げセンサを含む
 前記(1)または(2)に記載のウェアラブル装置。
(4)
 前記筋電センサは、前記姿勢センサによって検出した前記姿勢が前記把握姿勢になると、前記筋電位情報の検出を開始する
 前記(1)に記載のウェアラブル装置。
(5)
 前記推定部は、前記姿勢センサによって検出した前記姿勢が前記把握姿勢でない場合、前記ユーザの前記手の前記把握状態を推定しない
 前記(1)に記載のウェアラブル装置。
(6)
 前記推定部は、前記把握姿勢が判定時間にわたって継続した場合に前記把握状態を推定する
 前記(1)から(5)のいずれかに記載のウェアラブル装置。
(7)
 前記推定部は、前記把握姿勢の開始を検出したときの前記筋電センサの前記筋電位情報を初期値とし、当該初期値と前記把握姿勢の開始後に検出した前記筋電位情報とに基づいて前記把握状態を推定する
 前記(1)から(6)のいずれかに記載のウェアラブル装置。
(8)
 前記推定部は、前記把握状態を推定している場合に、前記把握姿勢の終了を検出すると、当該把握状態の推定を終了する
 前記(1)から(7)のいずれかに記載のウェアラブル装置。
(9)
 前記ユーザの前記手に装着される装着部をさらに備え、
 前記装着部は、前記手に装着された場合に、前記把握姿勢を検出可能な前記手の位置に前記姿勢センサを位置付ける
 前記(1)から(8)のいずれかに記載のウェアラブル装置。
(10)
 前記装着部は、前記ユーザの少なくとも指先を露出する手袋であり、
 前記姿勢センサは、前記装着部によって前記手の甲に位置付けられる
 前記(9)に記載のウェアラブル装置。
(11)
 前記ユーザの手の前記把握姿勢を示す姿勢データを記憶する記憶部をさらに備え、
 前記推定部は、前記姿勢センサによって検出した前記姿勢が、前記記憶部に記憶している前記姿勢データが示す前記把握姿勢である場合に、前記筋電センサによって検出した前記筋電位情報に基づいて、前記ユーザの前記手の把握状態を推定する
 前記(1)から(10)のいずれかに記載のウェアラブル装置。
(12)
 前記姿勢データは、前記ユーザが動作するシーンに応じた前記把握姿勢を示すデータを含む
 前記(11)に記載のウェアラブル装置。
(13)
 ユーザの手の姿勢を検出する姿勢センサ及び前記手の筋電位情報を検出する筋電センサの検出結果を取得する取得部と、
 前記姿勢センサによって検出した前記姿勢が把握姿勢である場合に、前記筋電センサによって検出した前記筋電位情報に基づいて、前記ユーザの前記手の把握状態を推定する推定部と、
 を備える情報処理ユニット。
(14)
 コンピュータが、
 ユーザの手の姿勢を検出する姿勢センサ及び前記手の筋電位情報を検出する筋電センサの検出結果を取得するステップと、
 前記姿勢センサによって検出した前記姿勢が把握姿勢である場合に、前記筋電センサによって検出した前記筋電位情報に基づいて、前記ユーザの前記手の把握状態を推定するステップと、
 を含む情報処理方法。
(15)
 ユーザの手に装着されるウェアラブル装置と、
 前記ウェアラブル装置と連携する連携装置と、
 を備え、
 前記ウェアラブル装置は、
 ユーザの手の姿勢を検出する姿勢センサと、
 前記手の筋電位情報を検出する筋電センサと、
 前記姿勢センサによって検出した前記姿勢が把握姿勢である場合に、前記筋電センサによって検出した前記筋電位情報に基づいて、前記ユーザの前記手の把握状態を推定する推定部と、
 を備え、
 前記連携装置は、前記ウェアラブル装置が推定した前記把握状態に基づく処理を実行するシステム。
The following configurations also belong to the technical scope of the present disclosure.
(1)
A posture sensor that detects the posture of the user's hand and
An electromyographic sensor that detects the myoelectric potential information of the hand and
When the posture detected by the posture sensor is the grasping posture, an estimation unit that estimates the grasping state of the hand of the user based on the myoelectric potential information detected by the myoelectric sensor.
Wearable device with.
(2)
The wearable device according to (1) above, wherein the myoelectric sensor has a dry electrode that comes into contact with the hand and detects the myoelectric potential information indicating the action potential generated in the muscle of the hand.
(3)
The wearable device according to (1) or (2) above, wherein the posture sensor includes a bending sensor that detects the movement of the fingers of the hand.
(4)
The wearable device according to (1), wherein the myoelectric sensor starts detecting myoelectric potential information when the posture detected by the posture sensor becomes the grasping posture.
(5)
The wearable device according to (1), wherein the estimation unit does not estimate the grasping state of the hand of the user when the posture detected by the posture sensor is not the grasping posture.
(6)
The wearable device according to any one of (1) to (5) above, wherein the estimation unit estimates the grasping state when the grasping posture continues for a determination time.
(7)
The estimation unit uses the myoelectric potential information of the myoelectric sensor when the start of the grasping posture is detected as an initial value, and the estimation unit is based on the initial value and the myoelectric potential information detected after the start of the grasping posture. The wearable device according to any one of (1) to (6) above, which estimates the grasping state.
(8)
The wearable device according to any one of (1) to (7) above, wherein when the estimation unit detects the end of the grasping posture when estimating the grasping state, the estimation of the grasping state ends.
(9)
Further provided with a mounting portion to be mounted on the hand of the user.
The wearable device according to any one of (1) to (8), wherein the mounting portion positions the posture sensor at a position of the hand that can detect the grasping posture when the mounting portion is mounted on the hand.
(10)
The wearing portion is a glove that exposes at least the fingertips of the user.
The wearable device according to (9), wherein the posture sensor is positioned on the back of the hand by the mounting portion.
(11)
Further, a storage unit for storing posture data indicating the grasping posture of the user's hand is provided.
The estimation unit is based on the myoelectric potential information detected by the myoelectric sensor when the posture detected by the posture sensor is the grasping posture indicated by the posture data stored in the storage unit. The wearable device according to any one of (1) to (10), which estimates the grasping state of the hand of the user.
(12)
The wearable device according to (11), wherein the posture data includes data indicating the grasping posture according to a scene in which the user operates.
(13)
An acquisition unit that acquires the detection results of a posture sensor that detects the posture of the user's hand and an electromyographic sensor that detects the myoelectric potential information of the hand.
When the posture detected by the posture sensor is the grasping posture, an estimation unit that estimates the grasping state of the hand of the user based on the myoelectric potential information detected by the myoelectric sensor.
Information processing unit equipped with.
(14)
The computer
A step of acquiring the detection results of the posture sensor that detects the posture of the user's hand and the myoelectric sensor that detects the myoelectric potential information of the hand, and
When the posture detected by the posture sensor is the grasping posture, the step of estimating the grasping state of the hand of the user based on the myoelectric potential information detected by the myoelectric sensor, and
Information processing methods including.
(15)
Wearable devices worn in the user's hands
A cooperation device that cooperates with the wearable device, and
With
The wearable device is
A posture sensor that detects the posture of the user's hand and
An electromyographic sensor that detects the myoelectric potential information of the hand and
When the posture detected by the posture sensor is the grasping posture, an estimation unit that estimates the grasping state of the hand of the user based on the myoelectric potential information detected by the myoelectric sensor.
With
The cooperation device is a system that executes processing based on the grasping state estimated by the wearable device.

 1 ウェアラブル装置
 2 装着部
 2A 装着部
 3 姿勢センサ
 4 筋電センサ
 10 情報処理ユニット
 11 取得部
 12 記憶部
 12A 姿勢データ
 13 推定部
 14 通信部
 100 システム
 110 ヘッドマウントディスプレイ(HMD)
 120 コンピュータ
 130 カメラ
 H 手
 P1 非把握姿勢
 P2 把握姿勢
 U ユーザ
1 Wearable device
2 Mounting portion
2A Mounting portion
3 Posture sensor
4 Myoelectric sensor
10 Information processing unit
11 Acquisition unit
12 Storage unit
12A Posture data
13 Estimation unit
14 Communication unit
100 System
110 Head-mounted display (HMD)
120 Computer
130 Camera
H Hand
P1 Non-grasping posture
P2 Grasping posture
U User

Claims (14)

 ユーザの手の姿勢を検出する姿勢センサと、
 前記手の筋電位情報を検出する筋電センサと、
 前記姿勢センサによって検出した前記姿勢が把握姿勢である場合に、前記筋電センサによって検出した前記筋電位情報に基づいて、前記ユーザの前記手の把握状態を推定する推定部と、
 を備えるウェアラブル装置。
A posture sensor that detects the posture of the user's hand and
An electromyographic sensor that detects the myoelectric potential information of the hand and
When the posture detected by the posture sensor is the grasping posture, an estimation unit that estimates the grasping state of the hand of the user based on the myoelectric potential information detected by the myoelectric sensor.
Wearable device with.
 前記筋電センサは、前記手に接触し、当該手の筋肉で発生する活動電位を示す前記筋電位情報を検出する乾式電極を有する
 請求項1に記載のウェアラブル装置。
The wearable device according to claim 1, wherein the myoelectric sensor has a dry electrode that comes into contact with the hand and detects the myoelectric potential information indicating the action potential generated in the muscle of the hand.
 前記姿勢センサは、前記手の指の動きを検出する曲げセンサを含む
 請求項2に記載のウェアラブル装置。
The wearable device according to claim 2, wherein the posture sensor includes a bending sensor that detects the movement of the fingers of the hand.
The wearable device according to claim 1, wherein the myoelectric sensor starts detecting the myoelectric potential information when the posture detected by the posture sensor becomes the grasping posture.
The wearable device according to claim 1, wherein the estimation unit does not estimate the grasping state of the user's hand when the posture detected by the posture sensor is not the grasping posture.
The wearable device according to claim 1, wherein the estimation unit estimates the grasping state when the grasping posture has continued for a determination time.
The wearable device according to claim 1, wherein the estimation unit takes the myoelectric potential information of the myoelectric sensor at the time the start of the grasping posture is detected as an initial value, and estimates the grasping state based on the initial value and the myoelectric potential information detected after the start of the grasping posture.
The wearable device according to claim 1, wherein, while estimating the grasping state, the estimation unit ends the estimation of the grasping state upon detecting the end of the grasping posture.
The wearable device according to claim 1, further comprising a mounting unit to be worn on the user's hand,
wherein the mounting unit, when worn on the hand, positions the posture sensor at a position on the hand at which the grasping posture can be detected.
The wearable device according to claim 9, wherein the mounting unit is a glove that exposes at least the user's fingertips, and
the posture sensor is positioned on the back of the hand by the mounting unit.
The wearable device according to claim 1, further comprising a storage unit that stores posture data indicating the grasping posture of the user's hand,
wherein the estimation unit estimates the grasping state of the user's hand based on the myoelectric potential information detected by the myoelectric sensor when the posture detected by the posture sensor is the grasping posture indicated by the posture data stored in the storage unit.
The wearable device according to claim 11, wherein the posture data includes data indicating the grasping posture corresponding to a scene in which the user operates.
An information processing unit comprising:
an acquisition unit that acquires detection results of a posture sensor that detects a posture of a user's hand and of a myoelectric sensor that detects myoelectric potential information of the hand; and
an estimation unit that, when the posture detected by the posture sensor is a grasping posture, estimates a grasping state of the user's hand based on the myoelectric potential information detected by the myoelectric sensor.
An information processing method in which a computer performs:
a step of acquiring detection results of a posture sensor that detects a posture of a user's hand and of a myoelectric sensor that detects myoelectric potential information of the hand; and
a step of estimating, when the posture detected by the posture sensor is a grasping posture, a grasping state of the user's hand based on the myoelectric potential information detected by the myoelectric sensor.
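The estimation flow recited in the claims above can be sketched in a few lines: EMG-based grasp estimation starts only after the grasping posture has persisted for a determination time (claim 6), uses the EMG value at posture onset as an initial baseline (claim 7), and stops when the posture ends (claims 5 and 8). All names, sample counts, and thresholds below are illustrative assumptions, not part of the application.

```python
# Hypothetical sketch of the claimed estimation flow: posture-gated,
# debounced, baseline-referenced grasp-state estimation from EMG samples.

class GraspEstimator:
    def __init__(self, judgment_samples=3, strong_delta=0.3):
        self.judgment_samples = judgment_samples  # determination time (claim 6)
        self.strong_delta = strong_delta          # EMG rise taken as a strong grip
        self.posture_count = 0                    # consecutive grasp-posture samples
        self.baseline = None                      # initial EMG value (claim 7)

    def update(self, is_grasp_posture, emg):
        if not is_grasp_posture:
            # Posture ended or absent: no estimation, state reset (claims 5, 8).
            self.posture_count = 0
            self.baseline = None
            return None
        self.posture_count += 1
        if self.baseline is None:
            # First sample of the grasping posture becomes the initial value.
            self.baseline = emg
        if self.posture_count < self.judgment_samples:
            # Still within the determination time: do not estimate yet.
            return None
        # Estimate grasp strength from the rise in EMG above the baseline.
        return "strong" if emg - self.baseline >= self.strong_delta else "weak"


est = GraspEstimator()
samples = [(True, 0.10), (True, 0.15), (True, 0.50), (True, 0.55), (False, 0.0)]
states = [est.update(p, e) for p, e in samples]
print(states)  # [None, None, 'strong', 'strong', None]
```

The first two samples fall inside the determination time, so no state is produced; once the posture has persisted, the rise above the onset baseline (0.10) yields a "strong" estimate, and the final non-grasping sample ends the estimation.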
PCT/JP2020/003966 2019-03-27 2020-02-03 Wearable device, information processing unit, and information processing method Ceased WO2020195172A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-061182 2019-03-27
JP2019061182 2019-03-27

Publications (1)

Publication Number Publication Date
WO2020195172A1 true WO2020195172A1 (en) 2020-10-01

Family

ID=72608887

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/003966 Ceased WO2020195172A1 (en) 2019-03-27 2020-02-03 Wearable device, information processing unit, and information processing method

Country Status (1)

Country Link
WO (1) WO2020195172A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060248478A1 (en) * 2005-01-18 2006-11-02 Forrest Liau Sensing input actions
JP2007136041A (en) * 2005-11-22 2007-06-07 Tokyo Institute Of Technology Learning support device, learning support method, virtual human interface device, virtual human interface method, virtual human interface system, program for realizing these devices, and recording medium
JP2015512550A (en) * 2012-04-09 2015-04-27 クアルコム,インコーポレイテッド Gesture-based remote device control

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060248478A1 (en) * 2005-01-18 2006-11-02 Forrest Liau Sensing input actions
JP2007136041A (en) * 2005-11-22 2007-06-07 Tokyo Institute Of Technology Learning support device, learning support method, virtual human interface device, virtual human interface method, virtual human interface system, program for realizing these devices, and recording medium
JP2015512550A (en) * 2012-04-09 2015-04-27 クアルコム,インコーポレイテッド Gesture-based remote device control

Similar Documents

Publication Publication Date Title
EP3860527B1 (en) Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment
US10761575B2 (en) Detecting a gesture made by a person wearing a wearable electronic device
KR102302640B1 (en) Detecting and using body tissue electrical signals
US10579151B2 (en) Controller for finger gesture recognition and method for recognizing finger gesture
US20200097081A1 (en) Neuromuscular control of an augmented reality system
US20190228330A1 (en) Handstate reconstruction based on multiple inputs
US11307671B2 (en) Controller for finger gesture recognition and method for recognizing finger gesture
US20220155866A1 (en) Ring device having an antenna, a touch pad, and/or a charging pad to control a computing device based on user motions
KR20130027006A (en) Method and apparatus for hand gesture control in a minimally invasive surgical system
KR20120102647A (en) A master finger tracking device and method of use in a minimally invasive surgical system
KR20140015144A (en) Method and system for hand presence detection in a minimally invasive surgical system
KR20120115487A (en) Method and system for hand control of a teleoperated minimally invasive slave surgical instrument
JPWO2016038953A1 (en) DETECTING DEVICE, DETECTING METHOD, CONTROL DEVICE, AND CONTROL METHOD
Heo et al. A realistic game system using multi-modal user interfaces
KR102048551B1 (en) System and Method for Virtual reality rehabilitation training using Smart device
US11640203B2 (en) Operation system and operation method
WO2020195172A1 (en) Wearable device, information processing unit, and information processing method
US20220253140A1 (en) Myoelectric wearable system for finger movement recognition
US11592901B2 (en) Control device and control method for robot arm
KR102901323B1 (en) Resting state-based user authentication device and method thereof using signal of electromyogram and inertial measurement unit
KR102048546B1 (en) System and Method for rehabilitation training using Virtual reality device
RU196281U1 (en) Information input device
CN115857660A (en) An Image-Based Gesture Recognition Smart Ring
GB2552219A (en) Wearable input device
CN117133045A (en) Gesture recognition methods, devices, equipment and media

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20779274

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20779274

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP