
WO2012011350A1 - Walking posture evaluation device - Google Patents

Walking posture evaluation device

Info

Publication number
WO2012011350A1
Authority
WO
WIPO (PCT)
Prior art keywords
feature point
feature
factor
walking posture
locus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2011/064130
Other languages
English (en)
Japanese (ja)
Inventor
森 健太郎
佐藤 哲也
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Omron Healthcare Co Ltd
Original Assignee
Omron Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Omron Healthcare Co Ltd filed Critical Omron Healthcare Co Ltd
Publication of WO2012011350A1
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1116 Determining posture transitions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/112 Gait analysis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813 Specially adapted to be attached to a specific body part
    • A61B5/6823 Trunk, e.g., chest, back, abdomen, hip

Definitions

  • the present invention relates to a walking posture determination device, and more particularly to a walking posture determination device suitable for determining a walking posture of a user who wears the device at a predetermined site.
  • the present invention has been made to solve the above-described problems, and one of its objects is to provide a walking posture determination apparatus capable of evaluating the walking posture in detail with higher accuracy.
  • Another object of the present invention is to provide a walking posture determination device capable of displaying the walking posture in an easily understandable manner.
  • a walking posture determination device includes a main body, an acceleration sensor for detecting the acceleration of the main body, and a control unit, and is an apparatus for determining the walking posture of a user who wears the main body at a predetermined site.
  • the three-dimensional trajectory, from which the moving component in the traveling direction has been removed, of the predetermined site to which the main body is attached traces a pattern during walking.
  • the pattern includes a plurality of feature points that define the features of the pattern.
  • based on the acceleration detected by the acceleration sensor, the control unit computes, on each of the planes perpendicular to the three orthogonal directions (the vertical direction, the traveling direction, and the left-right direction), the trajectory from which the moving component in the traveling direction has been removed.
  • the control unit includes a specifying unit that specifies the positions of the feature points, a first calculation unit that calculates the values of the feature factors of the trajectory based on the positions specified by the specifying unit, a second calculation unit that calculates the index value from the feature factor values calculated by the first calculation unit according to a correlation, obtained in advance, between feature factor values and index values indicating the walking posture, and a determination unit that determines the walking posture based on the index value calculated by the second calculation unit.
  • the walking posture determination device further includes a display unit.
  • the control unit further includes a display control unit that causes the display unit to display the walking posture determined by the determination unit and the target walking posture in a comparable manner.
  • the walking posture determination device further includes a display unit.
  • the control unit further includes a display control unit that causes the display unit to display advice for improving the walking posture determined by the determination unit.
  • the correlation is represented by a multiple regression equation, obtained by multiple regression analysis, that relates the index value as the objective variable to the feature factor values as explanatory variables.
  • the feature points include a first feature point when the first foot contacts the ground, a second feature point when the locus reaches its highest position while the user stands on the first foot, a third feature point when the second foot contacts the ground, and a fourth feature point when the locus reaches its highest position while the user stands on the second foot.
  • the feature factors include a first feature factor, which is the vertical distance between the first feature point and the second feature point in the locus projected on the plane perpendicular to the traveling direction, and a second feature factor, which is calculated from the distance between the first feature point and the second feature point and the distance between the third feature point and the fourth feature point in the locus projected on the plane perpendicular to the left-right direction.
  • the index includes stride.
  • the multiple regression equation is a formula that calculates the sum of the product of the first partial regression coefficient and the first feature factor, the product of the second partial regression coefficient and the second feature factor, and the third partial regression coefficient, each coefficient being obtained by the multiple regression analysis.
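  As a concrete illustration of the step above, the correlation can be obtained in advance by ordinary least squares and the fitted equation then evaluated as the stated sum of products. The following NumPy sketch uses invented calibration values (the feature factor and stride numbers are not from the patent; here the stride values happen to satisfy stride = 20·f1 + 5·f2 + 2 exactly so that the fit is easy to check):

```python
import numpy as np

# Illustrative calibration data (invented for this sketch): first and
# second feature factor values and the measured index (stride, in cm)
# for five recorded walks.
f1 = np.array([3.1, 2.8, 3.5, 2.6, 3.0])
f2 = np.array([1.2, 1.0, 1.4, 0.9, 1.1])
stride = np.array([70.0, 63.0, 79.0, 58.5, 67.5])

# Multiple regression analysis: fit stride = b1*f1 + b2*f2 + b3 by
# least squares, giving the partial regression coefficients.
A = np.column_stack([f1, f2, np.ones_like(f1)])
(b1, b2, b3), *_ = np.linalg.lstsq(A, stride, rcond=None)

# The fitted multiple regression equation is then evaluated for a new
# pair of feature factor values to estimate the index value.
estimated = b1 * 3.2 + b2 * 1.3 + b3
```

  In the device the coefficients would be stored after a prior calibration study, so only the final one-line evaluation runs on the activity meter itself.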
  • the feature points include a first feature point when the first foot contacts the ground, a second feature point when the locus reaches its highest position while the user stands on the first foot, a third feature point at the rightmost position of the locus, a fourth feature point at the leftmost position of the locus, a fifth feature point at the foremost position on the right side of the locus, and a sixth feature point at the foremost position on the left side of the locus.
  • the feature factors include a first feature factor, which is the vertical distance between the first feature point and the second feature point in the trajectory projected on the plane perpendicular to the traveling direction, divided by the left-right distance between the third feature point and the fourth feature point, and a second feature factor, which is the left-right distance between the fifth feature point and the sixth feature point in the trajectory projected on the plane perpendicular to the vertical direction, divided by the left-right distance between the seventh feature point and the eighth feature point.
  • the index includes the step interval.
  • the multiple regression equation is a formula that calculates the sum of the product of the first partial regression coefficient and the first feature factor, the product of the second partial regression coefficient and the second feature factor, and the third partial regression coefficient, each coefficient being obtained by the multiple regression analysis.
  • the three-dimensional trajectory, from which the moving component in the advancing direction has been removed, of the predetermined part to which the main body is attached traces a pattern during walking, and the pattern includes feature points that define the features of the pattern.
  • based on the acceleration detected by the acceleration sensor, the trajectory on each of the planes perpendicular to the vertical direction, the traveling direction, and the left-right direction is computed; the positions of the feature points are specified; the values of the feature factors are calculated based on the specified positions; the index value is calculated from the calculated feature factor values according to a correlation, obtained in advance, between feature factor values and index values indicating the walking posture; and the walking posture is determined based on the calculated index value.
  • the walking posture determination device measures not only the number of steps but also the amount of activity (also referred to as the amount of exercise) during exercise and daily activities (for example, vacuuming, carrying light luggage, or cooking).
  • the embodiment will be described as an activity meter capable of such measurement. However, the present invention is not limited to this, and the walking posture determination device may be a pedometer capable of measuring the number of steps.
  • FIG. 1 is an external view of an activity meter 100 according to the embodiment of the present invention.
  • the activity meter 100 is mainly composed of a main body portion 191 and a clip portion 192.
  • the clip unit 192 is used to fix the activity meter 100 to a user's clothes or the like.
  • the main body 191 includes a display change/decision switch 131, a left operation/memory switch 132, a right operation switch 133, and a display 141, which is a part of a display unit 140 described later.
  • display 141 is configured as a liquid crystal display (LCD), but is not limited thereto, and may be another type of display such as an EL (electroluminescence) display.
  • FIG. 2 is a diagram showing a usage state of the activity meter 100 in this embodiment.
  • activity meter 100 is attached to a belt on a user's waist using clip portion 192, for example. In this embodiment, it is desirable that the activity meter 100 is fixedly mounted near the user's waist.
  • a coordinate system is used in which the user's traveling direction during walking is the Z axis (forward being the positive direction), the user's left-right direction during walking is the X axis (rightward being the positive direction), and the vertical direction is the Y axis (vertically upward being the positive direction).
  • FIG. 3 is a diagram showing a first example in which the hip locus during walking of the user is viewed from the direction of walking.
  • FIG. 4 is a diagram illustrating a second example in which the hip locus during walking of the user is viewed from the walking direction.
  • FIG. 3A and FIG. 4A are diagrams in which the hip locus during walking is superimposed on the user's image.
  • FIG. 3B and FIG. 4B are graphs showing the locus of the waist when the user walks.
  • this locus is a locus projected on the XY plane, which is a plane perpendicular to the Z-axis, during walking.
  • during normal walking, after the right foot leaves the ground the waist reaches its highest position and the right foot then touches the ground; likewise, after the left foot leaves the ground the waist again reaches its highest position and the left foot then touches the ground.
  • the locus of the user's waist first moves from the lower right toward the upper left, reaches the highest position at the upper left, then descends to the lowest position at the lower left, then rises toward the upper right, reaches the highest position at the upper right, and finally descends to the lowest position at the lower right; this forms a specific pattern.
  • FIG. 5 is a diagram showing a plurality of examples in which the hip locus during walking of the user is viewed from the direction of walking.
  • FIG. 5 (A) is a diagram similar to the diagram shown in FIG. 3 (B).
  • FIG. 5A shows the locus of the waist when the user walks in the normal walking posture.
  • FIG. 5B is a diagram similar to the diagram shown in FIG. 3B.
  • FIG. 5B shows the locus of the waist when the user walks in a case where the step is wider than in the case of FIG.
  • FIG. 5C shows the locus of the waist when the user walks when the step is narrower than in the case of FIG.
  • FIG. 5 (D) shows the locus of the waist when the user walks with a shuffling gait, compared with the case of FIG. 5 (A).
  • FIG. 5 (E) shows the locus of the waist when the user walks with a hunched back, compared with the case of FIG. 5 (A).
  • FIG. 5 (F) shows the locus of the waist when the user walks leaning forward, compared with the case of FIG. 5 (A).
  • although the loci in FIG. 5A to FIG. 5F look different, they all have the specific pattern described with reference to FIG. 3 and FIG. 4.
  • FIG. 6 is a diagram showing, in this embodiment, the correlation between the user's waist trajectory calculated from acceleration data and the actually measured waist trajectory.
  • FIG. 6 (A) is a diagram of an actually measured waist trajectory of the user when viewed from the direction of walking.
  • FIG. 6A is a view similar to FIG. 3B, FIG. 4B, and FIG. 5A to FIG. 5F.
  • the locus shown in FIG. 6A is obtained by, for example, photographing the walking user from the direction of travel with a camera and connecting, by image processing, the movements of a fixed point near the waist.
  • FIG. 6 (B) is a view of the hip locus during walking of the user, calculated from the acceleration data, as seen from the direction of walking.
  • a method of calculating the hip locus during walking of the user based on the triaxial acceleration data detected by the acceleration sensor of the activity meter 100 will be described. This locus is calculated by the control unit of the activity meter 100.
  • the accelerations Ax (t), Ay (t), and Az (t) in the X-axis, Y-axis, and Z-axis directions described with reference to FIG. 2 are specified.
  • the detection values obtained by the acceleration sensor may be used directly as the accelerations Ax(t), Ay(t), and Az(t) in the X-axis, Y-axis, and Z-axis directions.
  • alternatively, the accelerations Ax(t), Ay(t), and Az(t) in the X-axis, Y-axis, and Z-axis directions are calculated by coordinate conversion of the detection values obtained by the acceleration sensor.
  • the velocities excluding the short-time average velocity component over the time of one step, that is, the relative velocities Vx′(t), Vy′(t), and Vz′(t), are calculated.
  • the time for one step is T seconds; T is calculated, for example, from the time between the acceleration peaks of successive steps.
  • the points (X(t), Y(t)) having the calculated positions X(t) and Y(t) as X and Y coordinate values are plotted on the XY plane while varying t.
  • in this way, the trajectory of the user's waist projected on the XY plane can be obtained.
  • An example of this locus is the locus shown in FIG. 6(B).
  • These trajectories are traces of patterns as shown in FIGS. 7A to 9A, which will be described later.
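  The calculation just described (integrating acceleration to velocity, subtracting the short-time average velocity over roughly one step to obtain the relative velocities V′(t), and integrating again to obtain the relative position) can be sketched as follows. This is an illustrative NumPy implementation, not the patent's code; the sampling rate, the centered moving-average filter, and the function name are assumptions of this sketch:

```python
import numpy as np

def waist_trajectory(acc, fs, step_period_s):
    """Waist trajectory projected on the XY plane.

    acc: (N, 3) array of accelerations [Ax, Ay, Az] in m/s^2, sampled
         at fs Hz (gravity assumed already removed from Ay).
    step_period_s: duration T of one step in seconds, e.g. obtained
         from the interval between acceleration peaks.
    Returns X(t), Y(t): the relative position in the left-right and
    vertical directions.
    """
    dt = 1.0 / fs
    vel = np.cumsum(acc, axis=0) * dt        # acceleration -> velocity
    win = max(1, int(round(step_period_s * fs)))
    # Remove the short-time average velocity (the steady drift) with a
    # centered moving average over ~one step, leaving the relative
    # velocities V'(t) described in the text.
    kernel = np.ones(win) / win
    mean_vel = np.vstack([np.convolve(vel[:, i], kernel, mode="same")
                          for i in range(3)]).T
    rel_vel = vel - mean_vel
    pos = np.cumsum(rel_vel, axis=0) * dt    # relative velocity -> position
    return pos[:, 0], pos[:, 1]              # X(t), Y(t)
```

  Plotting the returned X(t) against Y(t) gives the projection onto the plane perpendicular to the traveling direction, i.e. the kind of locus shown in FIG. 6(B).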
  • FIG. 7 is a diagram for explaining the feature points included in the locus pattern projected on the XY plane in this embodiment.
  • FIG. 8 is a diagram for explaining the feature points included in the locus pattern projected on the XZ plane in this embodiment.
  • FIG. 9 is a diagram for explaining the feature points included in the locus pattern projected on the YZ plane in this embodiment.
  • feature point (1) is a point when the right foot touches down in the walking cycle.
  • the condition for specifying the feature point (1) is that the point is rightmost in the left-right direction and lowest in the vertical direction.
  • Feature point (2) is a point when the right foot is standing in the walking cycle (particularly a point when the user's waist is at the highest position in the vertical direction).
  • the condition for specifying the feature point (2) is a condition that it is after the feature point (1) and is the highest in the vertical direction.
  • Feature point (3) is the point when the left foot touches down in the walking cycle.
  • the condition for specifying the feature point (3) is a condition that it is after the feature point (2) and is the lowest in the vertical direction.
  • Feature point (4) is a point when the left foot is standing in the walking cycle (particularly a point when the user's waist is at the highest position in the vertical direction).
  • the condition for specifying the feature point (4) is a condition that it is after the feature point (3) and is the highest in the vertical direction.
  • Feature point (5) is the point when the right foot touches down in the walking cycle.
  • the condition for specifying the feature point (5) is the condition that it is after the feature point (4) and is the lowest in the vertical direction.
  • This feature point (5) is the feature point (1) of the next one cycle.
  • Feature point (6) is a point when the user's waist is on the rightmost side in the walking cycle.
  • the condition for specifying the feature point (6) is a condition that the value of X (t) calculated by Expression 7 is maximum when X (t) ⁇ 0 in one cycle.
  • Feature point (7) is a point when the user's waist is on the leftmost side in the walking cycle.
  • the condition for specifying the feature point (7) is a condition that the value of X (t) calculated by Expression 7 is minimum in one cycle when X (t) ⁇ 0.
  • Feature point (8) is the intersection of the loci of the waist in one walking cycle in the walking cycle.
  • the condition for specifying the feature point (8) is that it is the intersection, on the XY plane, of the waist locus from the feature point (2) to the feature point (3) and the waist locus from the feature point (4) to the feature point (5).
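  A crossing point of two portions of the locus, such as feature point (8), can be located with a standard parametric segment-intersection test. The sketch below checks one pair of segments; a full implementation would test each pair of polyline segments between the stated feature points. The function name is ours:

```python
def segment_intersection(p1, p2, p3, p4):
    """Intersection of segments p1-p2 and p3-p4 (each an (x, y) tuple),
    or None if they do not cross within both segments."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    # Denominator of the parametric solution; zero means parallel.
    d = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if d == 0:
        return None
    # Parameters along each segment; both must lie in [0, 1].
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / d
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / d
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None
```

  Applied between the sampled locus points from (2) to (3) and from (4) to (5), the first pair of segments that returns a point gives feature point (8); the same test on the XZ plane would locate feature point (14), defined later.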
  • the feature point (9) is a point when the right foot touches down in the walking cycle.
  • the condition for specifying the feature point (9) is that the point is rightmost in the left-right direction and rearmost in the front-rear direction.
  • the feature point (10) is a point when the user stands on the right foot in the walking cycle (specifically, the point at which the user's waist is at its foremost position relative to the short-time average position in the traveling direction).
  • the condition for specifying the feature point (10) is that it is after the feature point (9) and is the foremost in the front-rear direction.
  • feature point (11) is the point when the left foot touches down in the walking cycle.
  • the condition for specifying the feature point (11) is that it is after the feature point (10) and is the rearmost in the front-rear direction.
  • the feature point (12) is a point when the user stands on the left foot in the walking cycle (specifically, the point at which the user's waist is at its foremost position relative to the short-time average position in the traveling direction).
  • the condition for specifying the feature point (12) is that it is after the feature point (11) and is the foremost in the front-rear direction.
  • feature point (13) is the point when the right foot touches down in the walking cycle.
  • the condition for specifying the feature point (13) is that it is after the feature point (12) and is the rearmost in the front-rear direction.
  • This feature point (13) is the feature point (9) of the next one cycle.
  • Feature point (14) is the intersection of the loci of the waist in one walking cycle in the walking cycle.
  • the condition for specifying the feature point (14) is that it is the intersection, on the XZ plane, of the waist locus from the feature point (10) to the feature point (11) and the waist locus from the feature point (12) to the feature point (13).
  • feature points (1), (3), and (5) described in FIG. 7 are the lowest points in the locus pattern projected on the YZ plane.
  • the feature points (2) and (4) are the uppermost points in the locus pattern projected on the YZ plane.
  • FIG. 10 is a diagram for explaining the feature factor calculated based on the position of the feature point included in the locus pattern projected on the XY plane in this embodiment.
  • FIG. 11 is a diagram for explaining the feature factor calculated based on the position of the feature point included in the locus pattern projected on the XZ plane in this embodiment.
  • FIG. 12 is a diagram for explaining the feature factor calculated based on the position of the feature point included in the locus pattern projected on the YZ plane in this embodiment.
  • the feature factor Wu is the distance in the X-axis direction between the feature point (2) and the feature point (4) on the XY plane (referred to as the "upper left-right width"), and is calculated by subtracting the X coordinate value of the feature point (4) from the X coordinate value of the feature point (2).
  • the feature factor Wd is the distance in the X-axis direction between the feature point (1) and the feature point (3) on the XY plane (referred to as the "lower left-right width"), and is calculated by subtracting the X coordinate value of the feature point (3) from the X coordinate value of the feature point (1).
  • the feature factor W is the distance in the X-axis direction between the feature point (6) and the feature point (7) on the XY plane (referred to as the "left-right width"), and is calculated by subtracting the X coordinate value of the feature point (7) from the X coordinate value of the feature point (6).
  • the feature factor Hl is the distance in the Y-axis direction between the feature point (4) and the feature point (3) on the XY plane (referred to as the "left vertical width"), and is calculated by subtracting the Y coordinate value of the feature point (3) from the Y coordinate value of the feature point (4).
  • the feature factor Hr is the distance in the Y-axis direction between the feature point (2) and the feature point (1) on the XY plane (referred to as the "right vertical width"), and is calculated by subtracting the Y coordinate value of the feature point (1) from the Y coordinate value of the feature point (2).
  • the feature factor H is the average of the feature factor Hl and the feature factor Hr on the XY plane (referred to as the "vertical width"), and is calculated by adding Hl and Hr and dividing the sum by 2.
  • the feature factor Hcl is the height of the feature point (8) relative to the feature point (3) on the XY plane (referred to as the "left cross-point height"), and is calculated by subtracting the Y coordinate value of the feature point (3) from the Y coordinate value of the feature point (8).
  • the feature factor Hcr is the height of the feature point (8) relative to the feature point (1) on the XY plane (referred to as the "right cross-point height"), and is calculated by subtracting the Y coordinate value of the feature point (1) from the Y coordinate value of the feature point (8).
  • the feature factor ISO is the height of the feature point (8) relative to the vertical width of the trajectory on the XY plane (referred to as the "phase"), and is calculated by dividing the feature factor Hcl by the feature factor Hl, dividing the feature factor Hcr by the feature factor Hr, adding the two quotients, and dividing the sum by 2.
  • the feature factor Vlev indicates whether the trajectory on the XY plane is open at the top or open at the bottom (referred to as the "V or Λ shape"), and is calculated by dividing the feature factor Wu by the feature factor Wd.
  • the feature factor Ilev is a factor for specifying whether the shape of the locus on the XY plane is vertically long or horizontally long (referred to as the "shape I"), and is calculated by dividing the feature factor H by the feature factor W.
  • the feature factor Hb is the ratio of the left and right vertical widths on the XY plane (referred to as the "left-right vertical width ratio"), and is calculated by dividing the feature factor Hr by the feature factor Hl.
  • the feature factor Yb is the ratio of the left and right heights on the XY plane (referred to as the "left-right height ratio"), and is calculated by dividing the difference between the Y coordinate value of the feature point (4) and the Y coordinate value of the feature point (1) by the difference between the Y coordinate value of the feature point (2) and the Y coordinate value of the feature point (3).
  • the feature factor Wb is the ratio of the left and right widths on the XY plane (referred to as the "left-right width ratio"), and is calculated by dividing the difference between the X coordinate value of the feature point (6) and the X coordinate value of the feature point (8) by the difference between the X coordinate value of the feature point (8) and the X coordinate value of the feature point (7).
  • the feature factor Stl is the sum of the vertical amplitudes from right-foot contact until the left foot touches the ground on the XY plane (referred to as the "vertical amplitude from right-foot contact to left-foot contact"), and is calculated by adding the value obtained by subtracting the Y coordinate value of the feature point (1) from the Y coordinate value of the feature point (2) and the value obtained by subtracting the Y coordinate value of the feature point (3) from the Y coordinate value of the feature point (2).
  • the feature factor Str is the sum of the vertical amplitudes from left-foot contact until the right foot touches the ground on the XY plane (referred to as the "vertical amplitude from left-foot contact to right-foot contact"), and is calculated by adding the value obtained by subtracting the Y coordinate value of the feature point (3) from the Y coordinate value of the feature point (4) and the value obtained by subtracting the Y coordinate value of the feature point (5) from the Y coordinate value of the feature point (4).
  • the feature factor "jun" is a factor indicating whether the locus is traced clockwise or counterclockwise (referred to as the "tracing order"), and is calculated by determining the sign of the difference between the X coordinates of the feature point (2) and the feature point (4).
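  The XY-plane feature factors above are simple coordinate differences and ratios of the feature-point positions. A sketch, assuming the points are available as (x, y) pairs keyed by feature-point number (the dictionary layout and function name are ours, and Ilev is computed as H divided by W, inferred from the XZ-plane analog IlevSu):

```python
def xy_feature_factors(p):
    """Compute several XY-plane feature factors from feature-point
    coordinates, following the definitions in the text.
    p: dict mapping feature-point numbers (1),(2),(3),(4),(6),(7)
    to (x, y) tuples."""
    Wu = p[2][0] - p[4][0]          # upper left-right width
    Wd = p[1][0] - p[3][0]          # lower left-right width
    W = p[6][0] - p[7][0]           # left-right width
    Hl = p[4][1] - p[3][1]          # left vertical width
    Hr = p[2][1] - p[1][1]          # right vertical width
    H = (Hl + Hr) / 2               # vertical width
    return {"Wu": Wu, "Wd": Wd, "W": W, "Hl": Hl, "Hr": Hr, "H": H,
            "Vlev": Wu / Wd,        # open-up vs open-down shape
            "Ilev": H / W,          # vertically vs horizontally long
            "Hb": Hr / Hl}          # left-right vertical width ratio
```

  The XZ-plane factors (WuSu, WdSu, and so on, defined next) follow the same pattern with Z coordinates substituted for Y and feature points (9) to (12) substituted for (1) to (4).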
  • the feature factor WuSu is the distance in the X-axis direction between the feature point (10) and the feature point (12) on the XZ plane (referred to as the "upper left-right width"), and is calculated by subtracting the X coordinate value of the feature point (12) from the X coordinate value of the feature point (10).
  • the feature factor WdSu is the distance in the X-axis direction between the feature point (9) and the feature point (11) on the XZ plane (referred to as the "lower left-right width"), and is calculated by subtracting the X coordinate value of the feature point (11) from the X coordinate value of the feature point (9).
  • the feature factor Wsu is the distance in the X-axis direction between the feature point (6) and the feature point (7) on the XZ plane (referred to as the "left-right width"), and is calculated by subtracting the X coordinate value of the feature point (7) from the X coordinate value of the feature point (6).
  • the feature factor HlSu is the distance in the Z-axis direction between the feature point (12) and the feature point (11) on the XZ plane (referred to as the "left vertical width"), and is calculated by subtracting the Z coordinate value of the feature point (11) from the Z coordinate value of the feature point (12).
  • the feature factor HrSu is the distance in the Z-axis direction between the feature point (10) and the feature point (9) on the XZ plane (referred to as the "right vertical width"), and is calculated by subtracting the Z coordinate value of the feature point (9) from the Z coordinate value of the feature point (10).
  • the feature factor Hsu is the average of the feature factor HlSu and the feature factor HrSu on the XZ plane (referred to as the "vertical width"), and is calculated by adding HlSu and HrSu and dividing the sum by 2.
  • the feature factor HclSu is the height of the feature point (8) relative to the feature point (11) on the XZ plane (referred to as the "left cross-point height"), and is calculated by subtracting the Z coordinate value of the feature point (11) from the Z coordinate value of the feature point (8).
  • the feature factor HcrSu is the height of the feature point (8) relative to the feature point (9) on the XZ plane (referred to as the "right cross-point height"), and is calculated by subtracting the Z coordinate value of the feature point (9) from the Z coordinate value of the feature point (8).
  • the feature factor ISOSu is the height of the feature point (14) relative to the vertical width of the trajectory on the XZ plane (referred to as the "phase"), and has the same value as the ISO of the XY plane described above.
  • the feature factor VlevSu indicates whether the trajectory on the XZ plane is open at the top or open at the bottom (referred to as the "V or Λ shape"), and is calculated by dividing the feature factor WuSu by the feature factor WdSu.
  • the feature factor IlevSu is a factor for specifying whether the shape of the trajectory on the XZ plane is vertically long or horizontally long (referred to as the "shape I"), and is calculated by dividing the feature factor Hsu by the feature factor Wsu.
  • the characteristic factor HbSu is a ratio of the left and right vertical widths in the XZ plane (referred to as “left / right vertical width ratio”), and is calculated by dividing the characteristic factor HrSu by the characteristic factor HlSu.
  • the feature factor YbSu is the ratio of the left and right heights in the XZ plane (referred to as “left-right height ratio”), and the difference between the Z coordinate value of the feature point (13) and the Z coordinate value of the feature point (9). Is divided by the difference between the Z coordinate value of the feature point (10) and the Z coordinate value of the feature point (11).
  • the feature factor WbSu is the ratio of the left and right widths on the XZ plane (referred to as the "left-right width ratio"), and has the same value as the Wb of the XY plane described above.
  • the characteristic factor StlSu is the sum of the front and rear amplitudes (referred to as “front and rear amplitudes from the right foot grounding to the left foot grounding”) until the left foot touches the ground in the XZ plane, and is the Z coordinate of the feature point (10). It is calculated by adding the value obtained by subtracting the Z coordinate value of the feature point (9) and the value obtained by subtracting the Z coordinate value of the feature point (11) from the Z coordinate value of the feature point (10).
  • The feature factor StrSu is the sum of the front-rear amplitudes on the XZ plane from left-foot contact until the right foot contacts the ground (referred to as the “front-rear amplitude from left-foot contact to right-foot contact”), and is calculated by adding the value obtained by subtracting the Z coordinate value of the feature point (11) from the Z coordinate value of the feature point (12) to the value obtained by subtracting the Z coordinate value of the feature point (13) from the Z coordinate value of the feature point (12).
  • The feature factor Zfl is the width by which the position of the waist moves backward after reaching its highest point while standing on the left foot on the XZ plane (referred to as the “backward movement of the waist from the uppermost point of the left stance”), and is calculated by subtracting the Z coordinate value of the feature point (4) from the Z coordinate value of the feature point (12).
  • The feature factor Zfr is the width by which the position of the waist moves backward after reaching its highest point while standing on the right foot on the XZ plane (referred to as the “backward movement of the waist from the uppermost point of the right stance”), and is calculated by subtracting the Z coordinate value of the feature point (2) from the Z coordinate value of the feature point (10).
  • The feature factor Zf is the width by which the position of the waist moves backward after reaching its highest point during stance on the XZ plane (referred to as the “backward movement of the waist from the highest point of the stance”), and is calculated by adding the feature factor Zfl and the feature factor Zfr and dividing by 2.
  • The feature factor Zbl is the width by which the position of the waist moves backward after the left foot contacts the ground on the XZ plane (referred to as the “backward movement of the waist from left-foot contact”), and is obtained by subtracting the Z coordinate value of the feature point (3).
  • The feature factor Zbr is the width by which the position of the waist moves backward after the right foot contacts the ground on the XZ plane (referred to as the “backward movement of the waist from right-foot contact”), and is calculated by subtracting the Z coordinate value of the feature point (5) from the Z coordinate value of the feature point (9).
  • The feature factor Zb is the width by which the position of the waist moves backward after a foot contacts the ground on the XZ plane (referred to as the “backward movement of the waist from ground contact”), and is calculated by adding the feature factor Zbl and the feature factor Zbr and dividing by 2.
  • The feature factor dZ is the forward/backward tilt on the YZ plane (referred to as the “front-back tilt”), and is calculated by dividing the value obtained by subtracting the Y coordinate value of the feature point (1) from the Y coordinate value of the feature point (2) by the value obtained by subtracting the Z coordinate value of the feature point (1) from the Z coordinate value of the feature point (2).
  • The feature factor StlShi is the sum of the amplitudes in the left oblique direction on the YZ plane (referred to as the “left front-rear amplitude”), and is calculated by adding the distance between the feature points (2) and (1) on the YZ plane to the distance between the feature points (2) and (3) on the YZ plane.
  • The distance between the feature points (2) and (1) on the YZ plane is calculated as the square root of the sum of the square of the value obtained by subtracting the Z coordinate value of the feature point (1) from the Z coordinate value of the feature point (2) and the square of the value obtained by subtracting the Y coordinate value of the feature point (1) from the Y coordinate value of the feature point (2).
  • The distance between the feature points (2) and (3) on the YZ plane is calculated as the square root of the sum of the square of the value obtained by subtracting the Z coordinate value of the feature point (3) from the Z coordinate value of the feature point (2) and the square of the value obtained by subtracting the Y coordinate value of the feature point (3) from the Y coordinate value of the feature point (2).
  • The feature factor StrShi is the sum of the amplitudes in the right oblique direction on the YZ plane (referred to as the “right front-rear amplitude”), and is calculated by adding the distance between the feature points (4) and (3) on the YZ plane to the distance between the feature points (4) and (1) on the YZ plane.
  • The distance between the feature points (4) and (3) on the YZ plane is calculated as the square root of the sum of the square of the value obtained by subtracting the Z coordinate value of the feature point (3) from the Z coordinate value of the feature point (4) and the square of the value obtained by subtracting the Y coordinate value of the feature point (3) from the Y coordinate value of the feature point (4).
  • The distance between the feature points (4) and (1) on the YZ plane is calculated as the square root of the sum of the square of the value obtained by subtracting the Z coordinate value of the feature point (1) from the Z coordinate value of the feature point (4) and the square of the value obtained by subtracting the Y coordinate value of the feature point (1) from the Y coordinate value of the feature point (4).
  • The feature factor StShi is the sum of the amplitudes in the oblique directions on the YZ plane (referred to as the “front-rear amplitude”), and is calculated by adding the feature factor StlShi and the feature factor StrShi and dividing by 2.
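The YZ-plane distances above are ordinary Euclidean distances between projected feature points. As an illustration only (the coordinate values and the (y, z) tuple convention below are assumptions for the example, not values from the embodiment), the factor StShi could be computed as follows:

```python
import math

def yz_distance(p, q):
    # Distance between two feature points projected onto the YZ plane:
    # the square root of the squared Z difference plus the squared Y
    # difference. Points are given as (y, z) tuples.
    return math.sqrt((p[1] - q[1]) ** 2 + (p[0] - q[0]) ** 2)

def st_shi(fp1, fp2, fp3, fp4):
    # StlShi: distance (2)-(1) plus distance (2)-(3)  ("left front-rear amplitude")
    # StrShi: distance (4)-(3) plus distance (4)-(1)  ("right front-rear amplitude")
    # StShi:  (StlShi + StrShi) / 2                   ("front-rear amplitude")
    stl_shi = yz_distance(fp2, fp1) + yz_distance(fp2, fp3)
    str_shi = yz_distance(fp4, fp3) + yz_distance(fp4, fp1)
    return (stl_shi + str_shi) / 2
```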
  • FIG. 13 is a first diagram for explaining the correlation between a feature factor and the stride, one of the indices indicating the walking posture, in this embodiment.
  • FIG. 14 is a second diagram for explaining the correlation between a feature factor and the stride, one of the indices indicating the walking posture, in this embodiment.
  • In FIG. 13, the feature factor Hr of the trajectory pattern projected onto the XZ plane described in FIG. 10 is taken as the vertical axis (y), plotted against the actually measured value of the stride, which is one of the indices indicating the walking posture.
  • In FIG. 14, the feature factor StShi of the trajectory pattern projected onto the YZ plane described in FIG. 12 is taken as the vertical axis (y), plotted against the actually measured value of the stride, which is one of the indices indicating the walking posture.
  • As shown in these figures, the stride, which is an index indicating the walking posture, is highly correlated with the feature factor Hr and the feature factor StShi. Therefore, by performing multiple regression analysis with the feature factors Hr and StShi as objective variables and the stride value as an explanatory variable, a multiple regression equation of the form Stride = α × Hr + β × StShi + γ can be obtained and used to calculate the stride value.
  • ⁇ , ⁇ , and ⁇ are partial regression coefficients obtained by multiple regression analysis.
  • FIG. 15 is a first diagram for explaining the correlation between a feature factor and the step width, one of the indices indicating the walking posture, in this embodiment.
  • FIG. 16 is a second diagram for explaining the correlation between a feature factor and the step width, one of the indices indicating the walking posture, in this embodiment.
  • The step width, which is an index indicating the walking posture, is obtained by performing multiple regression analysis with the feature factor Hr/W and the feature factor WuSu/WdSu as objective variables and the step width value as an explanatory variable; the resulting multiple regression equation, Step width = α × Hr/W + β × WuSu/WdSu + γ, can be used to calculate the step width value. Note that α, β, and γ are partial regression coefficients obtained by the multiple regression analysis.
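For illustration, the two feature factors and the step-width equation could be evaluated as below; the feature-point coordinates, the dictionary convention, and the coefficient values are assumptions for the example only:

```python
def step_width_factors(points):
    # `points` maps feature-point numbers to (x, y, z) tuples, numbered as in
    # the text: (1) right-foot contact, (2) highest point of the right stance,
    # (6)/(7) rightmost/leftmost points, (10)/(12) foremost points on the
    # right/left, (9)/(11) rearmost points on the right/left.
    hr = abs(points[2][1] - points[1][1])      # vertical (Y) distance, XY plane
    w = abs(points[6][0] - points[7][0])       # left-right (X) width
    wusu = abs(points[10][0] - points[12][0])  # front width on the XZ plane
    wdsu = abs(points[9][0] - points[11][0])   # rear width on the XZ plane
    return hr / w, wusu / wdsu

def estimate_step_width(points, alpha, beta, gamma):
    # Step width = alpha * (Hr/W) + beta * (WuSu/WdSu) + gamma, where the
    # partial regression coefficients come from a prior multiple regression.
    hr_over_w, width_ratio = step_width_factors(points)
    return alpha * hr_over_w + beta * width_ratio + gamma
```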
  • FIG. 17 is a block diagram showing an outline of the configuration of the activity meter 100 in this embodiment.
  • activity meter 100 includes a control unit 110, a memory 120, an operation unit 130, a display unit 140, an acceleration sensor 170, and a power source 190. Further, the activity meter 100 may include a sound report unit for outputting sound and an interface for communicating with an external computer.
  • The control unit 110, the memory 120, the operation unit 130, the display unit 140, the acceleration sensor 170, and the power source 190 are incorporated in the main body unit 191 described with reference to FIG. 1.
  • The operation unit 130 includes the display change/decision switch 131, the left operation/memory switch 132, and the right operation switch 133 described with reference to FIG. 1, and sends to the control unit 110 operation signals indicating that these switches have been operated.
  • The acceleration sensor 170 is a semiconductor sensor using MEMS (Micro Electro Mechanical Systems) technology, but is not limited to this and may be of another type, such as a mechanical or optical type. In the present embodiment, the acceleration sensor 170 outputs detection signals indicating the acceleration in each of three axial directions to the control unit 110. However, the acceleration sensor 170 is not limited to a three-axis type, and may be a one-axis or two-axis type.
  • The memory 120 includes nonvolatile memory such as ROM (Read Only Memory) (for example, flash memory) and volatile memory such as RAM (Random Access Memory) (for example, SDRAM (Synchronous Dynamic Random Access Memory)).
  • The memory 120 stores program data for controlling the activity meter 100, data used for controlling the activity meter 100, setting data for setting various functions of the activity meter 100, and measurement result data such as the number of steps and the amount of activity for each predetermined period (for example, every day). The memory 120 is also used as a work memory when the program is executed.
  • The control unit 110 includes a CPU (Central Processing Unit) and, in accordance with the program for controlling the activity meter 100 stored in the memory 120, controls the memory 120 and the display unit 140 on the basis of operation signals from the operation unit 130 and detection signals from the acceleration sensor 170 and the atmospheric pressure sensor 180.
  • the display unit 140 includes the display 141 described with reference to FIG. 1 and controls the display 141 to display predetermined information according to a control signal from the control unit 110.
  • the power source 190 includes a replaceable battery, and supplies power from the battery to each unit that requires power to operate, such as the control unit 110 of the activity meter 100.
  • FIG. 18 is a functional block diagram showing an outline of the function of the activity meter 100 in this embodiment.
  • The control unit 110 of the activity meter 100 includes an acceleration reading control unit 111, a feature point position specifying unit 112, a feature factor calculation unit 113, an index calculation unit 114, a walking posture determination unit 115, and a display control unit 116.
  • the storage unit 120 of the activity meter 100 includes an acceleration data storage unit 121, a feature point position storage unit 122, a feature factor storage unit 123, a correlation storage unit 124, and an index storage unit 125.
  • In this embodiment, these units included in the control unit 110 are configured in software, by the control unit 110 executing a program for performing the processing of FIG. 23 described later.
  • the present invention is not limited to this, and each of these units included in the control unit 110 may be configured inside the control unit 110 as a hardware circuit.
  • Each of these units included in the storage unit 120 is temporarily configured in the storage unit 120 when the control unit 110 executes the software for performing the processing of FIG. 23.
  • the present invention is not limited to this, and each of these units included in the storage unit 120 may be configured as a dedicated storage device.
  • each of these units included in the storage unit 120 may be temporarily configured in a built-in memory of the control unit 110 such as a register instead of being configured in the storage unit 120.
  • The acceleration reading control unit 111 reads, from the acceleration sensor 170, detection signals indicating the accelerations Ax(t), Ay(t), and Az(t) in the three axial directions.
  • When the activity meter 100 is worn so that the detection axes of the acceleration sensor 170 coincide with the X-axis, Y-axis, and Z-axis directions, the detected values are used directly as the acceleration data Ax(t), Ay(t), and Az(t). Otherwise, the acceleration data Ax(t), Ay(t), and Az(t) in the X-axis, Y-axis, and Z-axis directions are calculated by coordinate conversion of the detection values obtained by the acceleration sensor 170.
  • the acceleration reading control unit 111 stores the acceleration data Ax (t), Ay (t), and Az (t) calculated for each sampling period in the acceleration data storage unit 121 of the storage unit 120.
  • The feature point position specifying unit 112, based on the acceleration data Ax(t), Ay(t), and Az(t) stored in the acceleration data storage unit 121, calculates, as described earlier, the relative positions X(t), Y(t), and Z(t) of the activity meter 100 in the X-axis, Y-axis, and Z-axis directions with respect to the average position over a short time (here, the time of ±1 step (±T seconds)).
  • The feature point position specifying unit 112 then specifies the coordinate values of the positions of the feature points using the method described with reference to FIGS. 7 to 9, based on the calculated positions X(t), Y(t), and Z(t).
  • That is, the feature point position specifying unit 112, based on the acceleration detected by the acceleration sensor 170, specifies the positions of the feature points of the trajectory projected, with the moving component in the Z-axis direction removed, onto the XZ plane, the XY plane, and the YZ plane, which are the planes perpendicular to the three mutually orthogonal axes: the Y-axis direction (vertical direction), the Z-axis direction (traveling direction), and the X-axis direction (left-right direction).
  • the feature point position specifying unit 112 causes the feature point position storage unit 122 to store the calculated position of the feature point.
  • the feature factor calculation unit 113 calculates the value of the feature factor according to the calculation formulas described with reference to FIGS. 10 to 12 based on the position of the feature point stored in the feature point position storage unit 122. Then, the feature factor calculation unit 113 stores the calculated feature factor value in the feature factor storage unit 123.
  • The correlation storage unit 124 stores in advance the multiple regression equations described above.
  • The index calculation unit 114, based on the feature factor values stored in the feature factor storage unit 123, calculates the values of the indices indicating the walking posture (for example, the stride, step width, rotation of the hips, foot-lifting height, stretch of the back, and center-of-gravity balance) in accordance with the multiple regression equations stored in the correlation storage unit 124. Then, the index calculation unit 114 stores the calculated index values in the index storage unit 125.
  • the walking posture determination unit 115 determines the walking posture based on the index value stored in the index storage unit 125.
  • FIG. 19 is a first diagram showing an example of the determination of the walking posture in this embodiment. Referring to FIG. 19, when the stride is wider than a predetermined threshold as shown in FIG. 19(A), the walking posture may be judged to be better than when the stride is narrower than the threshold as shown in FIG. 19(B).
  • Similarly, it is possible to judge that the walking posture is better when the step width is narrower than a predetermined threshold than when the step width is wider than the threshold.
  • FIG. 20 is a second diagram showing an example of the determination of the walking posture in this embodiment.
  • In FIG. 20, the stride length is denoted by L, the step interval by W, the hip rotation by D, the foot-lifting height by H, and the back stretch by B.
  • Threshold values for classifying the stride length L are a, b, c (a ⁇ b ⁇ c).
  • thresholds for classifying the step W are d, e, f (d ⁇ e ⁇ f).
  • the thresholds for classifying the hip rotation D are set as h, i, j (h ⁇ i ⁇ j).
  • Threshold values for classifying the foot height H are k, l, and m (k ⁇ l ⁇ m).
  • the thresholds for classifying the back stretch B are n, o, and p (n ⁇ o ⁇ p).
  • By combining various indices in this way, the determination of the walking posture can be made in detail in accordance with the user.
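The four-stage classification implied by the three ascending thresholds per index could be sketched as follows (the stage numbering, index keys, and threshold values are illustrative assumptions, not part of the embodiment):

```python
def classify(value, thresholds):
    # Map a value to one of four stages using three ascending thresholds,
    # in the style of a < b < c for the stride length L.
    t1, t2, t3 = thresholds
    if value < t1:
        return 1
    if value < t2:
        return 2
    if value < t3:
        return 3
    return 4

def judge_walking_posture(indices, threshold_table):
    # Classify every index; keys might be L (stride length), W (step
    # interval), D (hip rotation), H (foot-lifting height), B (back stretch).
    return {name: classify(indices[name], threshold_table[name])
            for name in indices}
```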
  • the display control unit 116 controls the display unit 140 to display the determination result of the walking posture determined by the walking posture determination unit 115.
  • the activity meter 100 may be connected to an external device such as a personal computer, and the determination result may be displayed on the display unit of the external device.
  • FIG. 21 is a diagram showing an example of the display of the walking posture determination result in this embodiment. Referring to FIG. 21, the target, such as “I want to lose weight”, “I want to maintain physical strength”, or “I want to walk younger”, is displayed at the top of the right column. This target is set in advance by the user selecting from several target candidates.
  • an image of the target walking posture viewed from the front and from the side is displayed.
  • an image of the user's walking posture generated based on the calculated index indicating the walking posture as viewed from the front and an image as viewed from the side are displayed.
  • A mark (an oval box in the figure) is displayed on a body part related to an index having a low evaluation among the indices indicating the walking posture.
  • In the left column, a radar chart for the stride, step width, and hip rotation, and a chart showing where the center-of-gravity balance lies between left and right among the indices indicating the walking posture, are displayed at the top.
  • the target walking posture index value is indicated by a rhombus plot
  • the user walking posture index value is indicated by a square plot.
  • The target values for the stride, step width, and hip rotation are each the third of four stages, whereas the user's values are the second, third, and first stages, respectively.
  • The center-of-gravity balance chart shows that the user's balance is to the right while the target balance is at the center.
  • the waist position is marked in the image of the walking posture of the user in the right column.
  • the foot color may be red.
  • the advice to the user is displayed under the chart in the left column.
  • For example, advice such as “Widen the stride and walk while rotating the hips” is displayed.
  • Such advice is stored in advance in the storage unit 120 of the activity meter 100 in accordance with the degree of deviation from the target; the walking posture determination unit 115 selects the advice corresponding to the determined degree of deviation, and the display control unit 116 displays the selected advice on the display unit 140.
  • training content for approaching the target walking posture may be displayed instead of or together with the advice.
  • FIG. 22 is a diagram showing an example of the display of the user's walking posture image in this embodiment.
  • FIG. 22 (A) is an image showing a standard walking posture.
  • FIG. 22B is an image showing a walking posture when the stride is large.
  • FIG. 22C is an image showing a walking posture when the center of gravity balance is on the right side.
  • FIG. 23 is a flowchart showing the flow of the walking posture determination process executed by the control unit 110 of the activity meter 100 in this embodiment.
  • In step S101, the control unit 110 reads the detection values from the acceleration sensor 170 and, as described for the acceleration reading control unit 111 in FIG. 18, stores the acceleration data Ax(t), Ay(t), and Az(t) in the storage unit 120 for each sampling period.
  • In step S102, the control unit 110 determines whether or not one step of walking has been detected. Here, one step is determined to have been detected when the feature point (1) (feature point (5)) described earlier is detected. If it is determined that one step has not been detected (NO in step S102), the control unit 110 advances the process to step S111.
  • If one step has been detected (YES in step S102), in step S103 the control unit 110 reads the acceleration data Ax(t), Ay(t), and Az(t) for one step stored in the storage unit 120 in step S101, and calculates the coordinate values of the positions of the feature points as described for the feature point position specifying unit 112 in FIG. 18.
  • In step S104, the control unit 110 calculates the values of the feature factors based on the coordinate values of the positions of the feature points calculated in step S103, as described for the feature factor calculation unit 113 in FIG. 18.
  • In step S105, the control unit 110 calculates the index values in accordance with the correlation between the feature factors and the walking posture indices, as described for the index calculation unit 114 of FIG. 18, based on the feature factor values calculated in step S104, and stores them in the storage unit 120. Then, the control unit 110 advances the process to step S111.
  • In step S111, the control unit 110 determines whether or not an instruction to display the walking posture determination result has been received through operation of the operation unit 130 by the user. If it is determined that the result display instruction has not been received (NO in step S111), the control unit 110 returns the process to step S101.
  • If the result display instruction has been received (YES in step S111), in step S112 the control unit 110 reads the indices indicating the walking posture stored in the storage unit 120 in step S105 and determines the walking posture based on the indices, as described for the walking posture determination unit 115 of FIG. 18.
  • In step S113, the control unit 110 controls the display unit 140 to display the determination result of the walking posture determined in step S112, as described for the display control unit 116 of FIG. 18. Then, the control unit 110 returns the process to step S101.
  • As described above, the activity meter 100 in this embodiment includes the main body 191, the acceleration sensor 170, and the control unit 110, and is an apparatus for determining the walking posture of a user who wears the main body 191 on the waist.
  • The three-dimensional trajectory of the waist to which the main body 191 is attached, with the moving component in the advancing direction (Z-axis direction) during walking removed, has the pattern described earlier. The pattern includes a plurality of feature points that define the features of the pattern.
  • The control unit 110 includes: the feature point position specifying unit 112, which, based on the acceleration detected by the acceleration sensor 170, specifies the positions of the feature points of the trajectory projected, with the moving component in the traveling direction (Z-axis direction) removed, onto the XZ plane, the XY plane, and the YZ plane, which are the planes perpendicular to the vertical direction (Y-axis direction), the traveling direction (Z-axis direction), and the left-right direction (X-axis direction), respectively; the feature factor calculation unit 113, which calculates the values of the feature factors of the trajectory based on the positions specified by the feature point position specifying unit 112; the index calculation unit 114, which calculates the index values based on the feature factor values calculated by the feature factor calculation unit 113, in accordance with the correlation obtained in advance between the feature factor values and the values of the indices indicating the walking posture; and the walking posture determination unit 115, which determines the walking posture based on the index values calculated by the index calculation unit 114.
  • As a result, the indices indicating the walking posture are calculated accurately on the basis of accurate correlations, and since various indices indicating the walking posture are calculated, the walking posture can be evaluated in detail and more accurately.
  • the activity meter 100 further includes a display unit 140.
  • The control unit 110 includes the display control unit 116, which causes the display unit 140 to display the walking posture determined by the walking posture determination unit 115 and the target walking posture so that they can be compared. For this reason, the user's walking posture can be presented in an easily understandable manner.
  • The control unit 110 also includes the display control unit 116, which displays on the display unit 140 advice for improving the walking posture determined by the walking posture determination unit 115. For this reason, advice for improving the user's walking posture can be easily presented.
  • In this embodiment, the correlation is a relational expression obtained by multiple regression analysis, namely a multiple regression equation between the feature factor values as objective variables and the index values as explanatory variables.
  • In this embodiment, the feature points include the feature point (1) when the right foot contacts the ground, the feature point (2) when the trajectory reaches the highest position while standing on the right foot, the feature point (3) when the left foot contacts the ground, and the feature point (4) when the trajectory reaches the highest position while standing on the left foot.
  • The feature factors include the feature factor Hr, which is the distance in the vertical direction (Y-axis direction) between the feature points (1) and (2) in the trajectory projected onto the XY plane perpendicular to the traveling direction (Z-axis direction), and the feature factor StShi, which is calculated from the distance between the feature points (1) and (2) and the distance between the feature points (3) and (4) in the trajectory projected onto the YZ plane perpendicular to the left-right direction (X-axis direction).
  • the index includes stride.
  • Further, in this embodiment, the feature points include the feature point (1) when the right foot contacts the ground, the feature point (2) when the trajectory reaches the highest position while standing on the right foot, the rightmost feature point (6) of the trajectory, the leftmost feature point (7) of the trajectory, the foremost feature point (10) on the right side of the trajectory, the foremost feature point (12) on the left side of the trajectory, the rearmost feature point (9) on the right side of the trajectory, and the rearmost feature point (11) on the left side of the trajectory.
  • The feature factors include the feature factor Hr/W, which is the quotient obtained by dividing the distance Hr in the vertical direction (Y-axis direction) between the feature points (1) and (2) in the trajectory projected onto the XY plane perpendicular to the traveling direction (Z-axis direction) by the distance W in the left-right direction (X-axis direction) between the feature points (6) and (7), and the feature factor WuSu/WdSu, which is the quotient obtained by dividing the distance WuSu in the left-right direction (X-axis direction) between the feature points (10) and (12) in the trajectory projected onto the XZ plane perpendicular to the vertical direction (Y-axis direction) by the distance WdSu in the left-right direction (X-axis direction) between the feature points (9) and (11). The indices include the step interval.
  • the walking posture is determined based on the relationship between the index value and the threshold value.
  • However, the present invention is not limited to this; the walking posture may be determined based on the similarity between a combination of indices whose relationship with the walking posture has been obtained in advance and the calculated combination of indices.
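One conceivable form of such a similarity-based determination, using a Euclidean distance between index combinations (the reference combinations, index keys, and similarity measure below are assumptions for illustration, not part of the embodiment):

```python
import math

def posture_similarity(indices, reference):
    # Similarity between a measured index combination and a reference
    # combination: the smaller the Euclidean distance, the higher the value.
    distance = math.sqrt(sum((indices[k] - reference[k]) ** 2 for k in reference))
    return 1.0 / (1.0 + distance)

def closest_posture(indices, references):
    # Pick the reference posture whose index combination is most similar.
    return max(references,
               key=lambda name: posture_similarity(indices, references[name]))
```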
  • the threshold value of the index indicating the walking posture may be determined based on actually measured data when a person with a good walking posture actually walks.
  • In this embodiment, the target walking posture and the user's walking posture are displayed separately, as described with reference to FIG. 21.
  • the present invention is not limited to this, and the target walking posture and the user's walking posture may be displayed in an overlapping manner.
  • In this embodiment, the average speed component removed from the trajectory is the average speed component over the time of ±1 step. However, the present invention is not limited to this; it may be the average speed component over the time of ±n steps (n is a predetermined number), the average speed component over the time of the preceding n steps, the average speed component over ±s seconds (s is a predetermined number), or the average speed component over the preceding s seconds.
  • In this embodiment, the invention has been described as an apparatus, the activity meter 100. However, the present invention is not limited to this and can also be understood as an invention of a control method for controlling the activity meter 100.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Dentistry (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Physiology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention relates to a walking posture evaluation device that includes a main unit, an acceleration sensor for detecting the acceleration of the main unit, and a control unit, and that evaluates the walking posture of a user wearing the main unit on a predetermined body region. A three-dimensional trajectory of the predetermined region, with the movement component in the traveling direction during walking excluded, has a pattern, and the pattern contains a plurality of feature points that define its features. Based on the acceleration detected by the acceleration sensor, the controller identifies the positions of the feature points of the trajectory projected, after removal of the movement component in the traveling direction, onto planes perpendicular to each of three mutually orthogonal axes, namely the vertical direction, the traveling direction, and the left-right direction (S103); calculates the values of feature factors of the trajectory based on the identified positions (S104); in accordance with the correlation obtained in advance between the values of the feature factors and the values of indices indicating the walking posture, calculates the values of the indices based on the calculated feature factor values (S105); and evaluates the walking posture based on the calculated index values (S112). Consequently, a more accurate and detailed evaluation of the walking posture can be performed.
PCT/JP2011/064130 2010-07-22 2011-06-21 Walking posture evaluation device Ceased WO2012011350A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010164954A JP2012024275A (ja) 2010-07-22 2010-07-22 Walking posture determination device
JP2010-164954 2010-07-22

Publications (1)

Publication Number Publication Date
WO2012011350A1 true WO2012011350A1 (fr) 2012-01-26

Family

ID=45496779

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/064130 Ceased WO2012011350A1 (fr) 2010-07-22 2011-06-21 Walking posture evaluation device

Country Status (2)

Country Link
JP (1) JP2012024275A (fr)
WO (1) WO2012011350A1 (fr)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2013226907B2 (en) * 2012-02-29 2015-10-29 Mizuno Corporation Running form diagnostic system and method for scoring running form
JP5888309B2 (ja) * 2013-10-31 2016-03-22 カシオ計算機株式会社 Training support device and system, form analysis device and method, and program
WO2016024565A1 (fr) * 2014-08-13 2016-02-18 株式会社村田製作所 Motion capture device, motion capture method, motion performance diagnosis method, and body-worn device for motion capture
JP6369811B2 (ja) * 2014-11-27 2018-08-08 パナソニックIpマネジメント株式会社 Walking analysis system and walking analysis program
JP6527023B2 (ja) * 2015-05-29 2019-06-05 株式会社早稲田エルダリーヘルス事業団 Locomotion analysis device, system, and program
JP6527024B2 (ja) * 2015-05-29 2019-06-05 株式会社早稲田エルダリーヘルス事業団 Locomotion analysis device, system, and program
JP6592988B2 (ja) * 2015-06-30 2019-10-23 富士通株式会社 Evaluation system and evaluation method
WO2017061005A1 (fr) * 2015-10-08 2017-04-13 株式会社ジェイアイエヌ Information processing method, information processing device, and program
KR102070568B1 (ko) 2015-11-19 2020-01-29 파나소닉 아이피 매니지먼트 가부시키가이샤 Walking motion display system and program stored on a computer-readable recording medium
CN109310913B (zh) * 2016-08-09 2021-07-06 株式会社比弗雷斯 Three-dimensional simulation method and device
US11883183B2 (en) 2016-10-07 2024-01-30 Panasonic Intellectual Property Management Co., Ltd. Cognitive function evaluation device, cognitive function evaluation method, and recording medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008229266A (ja) * 2007-03-23 2008-10-02 Aisin Seiki Co Ltd System and method for proposing motor function improvement menus based on walking ability
JP2009000391A (ja) * 2007-06-23 2009-01-08 Tanita Corp Walking evaluation system, pedometer, walking evaluation program, and recording medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
KOYO TAKENOSHITA ET AL.: "Quantitative Assessment System for Gait of the Elderly Using a Portable Acceleration Monitor Device", TRANSACTIONS OF JAPANESE SOCIETY FOR MEDICAL AND BIOLOGICAL ENGINEERING, vol. 43, no. 1, 10 March 2005 (2005-03-10), pages 140 - 150 *
TATSUNORI NISHI: "Quantification and Pattern Classification of Hemiplegic Gait by Accelerometer", HUMAN INTERFACE SYMPOSIUM 2009 RONBUNSHU, 1 September 2009 (2009-09-01), pages 647 - 652 *
TEPPEI KOBAYASHI: "Kinematic Analysis System of Walking by Acceleration Sensor : An Estimation of Walk-Mate in Post-operation Rehabilitation of Hip-joint Disease", TRANSACTIONS OF THE SOCIETY OF INSTRUMENT AND CONTROL ENGINEERS, vol. 42, no. 5, 31 May 2006 (2006-05-31), pages 567 - 576 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014181606A1 (fr) * 2013-05-10 2014-11-13 オムロンヘルスケア株式会社 Walking posture meter and program
JP2014217696A (ja) 2013-05-10 2014-11-20 オムロンヘルスケア株式会社 Walking posture meter and program
CN103720476A (zh) * 2013-12-13 2014-04-16 天津大学 A stability evaluation method for curved-path motion patterns
CN106456060A (zh) 2014-07-07 2017-02-22 欧姆龙健康医疗事业株式会社 Activity amount measurement device, activity amount measurement method, and activity amount measurement program
CN106456060B (zh) * 2014-07-07 2019-01-08 欧姆龙健康医疗事业株式会社 Activity amount measurement device and activity amount measurement method
CN112289404A (zh) * 2020-10-22 2021-01-29 中国医学科学院生物医学工程研究所 Method, apparatus, device, and storage medium for generating a gait training plan
CN119339412A (zh) * 2024-12-20 2025-01-21 北京南口鸭育种科技有限公司 Breeder duck state image recognition method and system based on video surveillance

Also Published As

Publication number Publication date
JP2012024275A (ja) 2012-02-09

Similar Documents

Publication Publication Date Title
WO2012011350A1 (fr) Walking posture evaluation device
JP5724237B2 (ja) Walking change determination device
JP6288706B2 (ja) Upper-body motion measurement system and upper-body motion measurement method
KR101252634B1 (ko) Walking posture analysis system
JP6131706B2 (ja) Walking posture meter and program
JP5504810B2 (ja) Walking posture determination device, control program, and control method
CN107174255A (zh) Three-dimensional gait information collection and analysis method based on Kinect motion-sensing technology
JP6319446B2 (ja) Motion capture device, motion capture method, motion performance diagnosis method, and body-worn equipment for motion capture
CN105311806A (zh) Motion analysis method, device, and system, and physical activity assistance method and device
CN101504424A (zh) Miniature multifunctional intelligent human posture detector and detection method
CN106572816A (zh) Walking analysis system and walking analysis program
CN105435438B (zh) Motion analysis device and motion analysis method
US20160058373A1 (en) Running Energy Efficiency
JP2018069035A (ja) Walking analysis system and method
JP2018121930A (ja) Gait evaluation method
KR20180085916A (ko) Method for estimating the position of a wearable device and device using the same
WO2014181605A1 (fr) Walking posture measurement device and program
JP6593683B2 (ja) Walking evaluation system
WO2014181602A1 (fr) Walking posture measurement device and program
CN112839569B (zh) Method and system for assessing human movements
US20230371849A1 (en) Gait information generation device, gait information generation method, and recording medium
US20230397879A1 (en) Pelvic inclination estimation device, estimation system, pelvic inclination estimation method, and recording medium
EP3238619A1 (fr) Procédé d'évaluation de démarche et son dispositif électronique
JP2012191962A (ja) Position variation detection device, system including the same, and operating method of the position variation detection device
HK40052123A (en) Method and system for assessing human movements

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11809524

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11809524

Country of ref document: EP

Kind code of ref document: A1