WO2023127870A1 - Care support device, care support program, and care support method
- Publication number
- WO2023127870A1 (PCT/JP2022/048136)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- evaluation
- user
- unit
- tool
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/22—Social work or social welfare, e.g. community support activities or counselling services
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
Definitions
- the present disclosure relates to a care support device, a care support program, and a care support method.
- the above-mentioned technology aims to present a suitable rehabilitation menu based on information on the equipment and personnel of nursing care facilities, as well as the care level and dementia level of the user receiving care.
- the conditions of users are completely different for each individual, and it is difficult to present an appropriate rehabilitation menu to each user based on the above information.
- the present disclosure has been made in view of the above problems, and its purpose is to provide a technology that can easily and accurately analyze a user's body movement and thereby support nursing care work.
- a nursing care support device that supports nursing care work includes: an evaluation unit that evaluates a user's physical exercise based on first image information that includes the physical exercise; an improvement measure information transmission unit that selects a suitable training menu from the evaluation of the physical exercise and presents it to a supporter; and a document information generation unit that generates document information based on at least the state of the user's implementation of the training menu.
- FIG. 2 is a diagram showing an example hardware configuration of a user terminal 10;
- FIG. 3 is a diagram showing an example of the software configuration of the user terminal 10;
- FIG. 4 is a diagram showing a configuration example of physical information stored in a physical information storage unit 130.
- FIG. 4 is a diagram showing a configuration example of an evaluation request transmitted by an evaluation request transmission unit 112 to the server device 20;
- FIG. 4 is a diagram showing a configuration example of evaluation information received by the evaluation information receiving unit 113 from the server device 20.
- FIG. 11 is a diagram showing a configuration example of an improvement measure request that the improvement measure request transmission unit 116 transmits to the server device 20.
- Another figure shows a configuration example of the improvement measure information.
- 2 is a diagram illustrating an example hardware configuration of a server device 20;
- FIG. 3 is a diagram showing an example of the software configuration of the server device 20;
- FIG. 3 is a diagram showing a configuration example of image information stored in an image data storage unit 231.
- FIG. 4 is a diagram showing a configuration example of reference information stored in a reference information storage unit 232;
- FIG. 3 is a diagram showing a configuration example of evaluation condition information stored in an evaluation condition information storage unit 233;
- FIG. 6 is a diagram showing a configuration example of improvement plan information stored in an improvement plan information storage unit 234;
- FIG. 10 is a diagram showing an example of a video evaluation screen in the weightlifting mode.
- FIG. 11 is another diagram showing an example of a video evaluation screen in the weightlifting mode.
- FIG. 11 is another diagram showing an example of a video evaluation screen in the weightlifting mode.
- FIG. 11 is another diagram showing an example of a video evaluation screen in the weightlifting mode.
- FIG. 11 is another diagram showing an example of a video evaluation screen in the weightlifting mode.
- a care support device according to an embodiment of the present invention has the following configuration.
- [Item 1] A care support device that supports care work, comprising: an evaluation unit that evaluates a user's physical exercise based on first image information that includes the physical exercise; an improvement measure information transmission unit that selects a suitable training menu from the evaluation of the physical exercise and presents it to a supporter; and a document information generation unit that generates document information based on at least the state of the user's implementation of the training menu.
- [Item 2] The care support device according to Item 1, further comprising an implementation status information acquisition unit that acquires implementation status information of the training menu based on at least second image information of the user, wherein the document information generation unit generates the document information based on at least the implementation status information.
- [Item 3] The care support device according to Item 2, wherein the improvement measure information transmission unit reselects the training menu based on the first image information and the second image information.
- [Item 4] A care support program that supports care work and causes a processor to execute: an evaluation step of evaluating a user's physical exercise based on first image information that includes the physical exercise; an improvement measure information transmission step of selecting a suitable training menu from the evaluation of the physical exercise and presenting it to a supporter; and a document information generation step of generating document information based on at least the state of the user's implementation of the training menu.
- [Item 5] A nursing care support method for supporting nursing care work, wherein a processor performs: an evaluation step of evaluating a user's physical exercise based on first image information that includes the physical exercise; an improvement measure information transmission step of selecting a suitable training menu from the evaluation of the physical exercise and presenting it to a supporter; and a document information generation step of generating document information based on at least the state of the user's implementation of the training menu.
- a care support device evaluates the physical exercise of a care service user (hereinafter referred to as a user) and presents a suitable menu of training, rehabilitation, and the like.
- the care support device of the present embodiment evaluates physical exercise based on, for example, an image of the user exercising (either a still image or a moving image; in the present embodiment, a moving image). It should be noted that physical exercise also includes exercise performed using tools, as well as exercise performed while receiving support from a supporter.
- the care support device of the present embodiment identifies body parts from the image and evaluates the movement of the body based on the relative positional relationships of those parts. Note that the care support device of the present embodiment may also identify a tool and its parts from an image of the user exercising with the tool, and evaluate the movement of the body based on the absolute positions of those parts, the relative positional relationships among them, and their relationships to the body parts.
- FIG. 1 is a diagram showing an example of the overall configuration of a care support device according to this embodiment.
- the care support device of this embodiment includes a user terminal 10 and a server device 20, and may also include an imaging terminal 30.
- the user terminal 10, the server device 20, and the imaging terminal 30 are connected to each other via a communication network 40 so as to be able to communicate with each other.
- the communication network 40 is, for example, the Internet or a LAN (Local Area Network), and is constructed by a public telephone line network, a dedicated telephone line network, a mobile telephone line network, Ethernet (registered trademark), a wireless communication path, and the like.
- the user terminal 10 is a computer operated by the user who performs the physical exercise or by a supporter of the user. Supporters include not only the user's family members but also trainers, physical therapists, caregivers, and anyone else who guides, instructs, explains to, or otherwise supports the exercising user.
- the user terminal 10 is, for example, a smart phone, a tablet computer, a personal computer, or the like.
- the user terminal 10 is equipped with an imaging device such as a camera, which can capture an image of the user's body during exercise. In this embodiment, it is assumed that a moving image of the user's body during exercise is transmitted from the user terminal 10 to the server device 20 .
- the user terminal 10 may also serve as the imaging terminal 30. Although only one user terminal 10 is shown in FIG. 1, a plurality of user terminals may of course be provided.
- the user terminal 10 may be a sensor worn by the user (a wearable sensor in the form of clothing or footwear, or a sensor attached to clothing or to a part of the body). Such a sensor may measure not only exercise but also activity amount, conversation amount, sleep time, pulse, UV exposure, pulse interval (PPI), skin temperature, heartbeat, and the like, and may send those data to the server device 20 via the communication network 40.
- the server device 20 is a computer that evaluates physical exercise.
- the server device 20 is, for example, a workstation, a personal computer, a virtual computer that is logically implemented by cloud computing, or the like.
- the server device 20 receives the moving image captured by the user terminal 10 or the imaging terminal 30, analyzes the received moving image, and evaluates the body exercise.
- the server device 20 also makes proposals regarding improvement measures for physical exercise. Details of evaluation of physical exercise and proposal of improvement measures will be described later.
- the imaging terminal 30 is, for example, a smart phone, a tablet computer, a personal computer, or the like.
- the image capturing terminal 30 includes an image capturing device such as a camera, which can capture an image of the user's body during physical exercise. In the present embodiment, it is assumed that a moving image obtained by imaging the user's body during exercise is transmitted from the imaging terminal 30 to the server device 20 .
- the image data of the user stored in the imaging terminal 30 may be input to the server device 20 directly by the user, the user's supporter, or a business operator using the care support device, or may be input via the communication network 40.
- the supporter terminal 50 is, for example, a smart phone, a tablet computer, a personal computer, or the like.
- the supporter terminal 50 may analyze an image obtained by capturing the user's body movement.
- in this embodiment, an image captured by the user terminal 10 is processed by each processing unit and stored by each storage unit, but the image may instead be captured by the imaging terminal 30.
- the supporter terminal 50 may have functions equivalent to those of the user terminal 10, so that the supporter, instead of the user, can capture the user's body movement and request improvement measures.
- the supporter may also use those functions of the supporter terminal 50 to present the evaluation results and improvement measures to the user. Furthermore, all the processing that the server device 20 performs for the user terminal 10 may be performed for the supporter terminal 50, and all the processing that the user terminal 10 performs with respect to the server device 20 may be performed by the supporter terminal 50.
- FIG. 2 is a diagram showing a hardware configuration example of the user terminal 10.
- the user terminal 10 includes a CPU 101 , a memory 102 , a storage device 103 , a communication interface 104 , a touch panel display 105 and a camera 106 .
- the storage device 103 is, for example, a hard disk drive, solid state drive, flash memory, etc., which stores various data and programs.
- the communication interface 104 is an interface for connecting to the communication network 40 and includes, for example, an adapter for connecting to Ethernet (registered trademark), a modem for connecting to a public telephone network, a wireless communication device for wireless communication, a USB (Universal Serial Bus) connector, and an RS232C connector for serial communication.
- the touch panel display 105 is a device that inputs and outputs data.
- the user terminal 10 may further include input devices such as a keyboard, mouse, buttons, and microphone, and output devices such as speakers and printers.
- FIG. 3 is a diagram showing a software configuration example of the user terminal 10.
- the user terminal 10 includes an imaging unit 111, an evaluation request transmission unit 112, an evaluation information reception unit 113, an evaluation display unit 114, a checkpoint display unit 115, an improvement request transmission unit 116, an improvement measure information reception unit 117, and an improvement measure information. It includes functional units of display unit 118 , physical information storage unit 130 , image storage unit 131 , evaluation information storage unit 132 , and improvement measure storage unit 133 .
- each of the above functional units is implemented by the CPU 101 of the user terminal 10 reading a program stored in the storage device 103 into the memory 102 and executing it, and each of the above storage units is implemented as part of the storage area provided by the memory 102 and the storage device 103 of the user terminal 10.
- the imaging unit 111 captures images, including moving images, while the user is exercising. By controlling the camera 106, the imaging unit 111 can obtain a moving image of the user's body movement.
- the user or the user's supporter places the user terminal 10 on a flat surface or mounts it on a wall, directs the optical axis of the camera 106 toward the place where the user exercises, and instructs the start of video recording.
- the imaging unit 111 may operate the camera 106 in response to this to obtain a moving image.
- the imaging unit 111 stores the acquired moving image in the image storage unit 131 .
- the image storage unit 131 stores images captured by the imaging unit 111 .
- the image is a moving image in this embodiment, it is not limited to this.
- the image storage unit 131 can store moving images, for example, as files.
- the imaging unit 111 captures an image of the user's physical exercise, which may occur at the time of the initial evaluation or at the time the user implements a training menu presented by the functions of the server device 20, described later. In either case, the image captured by the imaging unit 111 is stored in the image storage unit 131.
- the physical information storage unit 130 stores information (hereinafter referred to as physical information) regarding factors affecting the user's body, physical ability, and training effect.
- FIG. 4 is a diagram showing a configuration example of physical information stored in the physical information storage unit 130.
- the physical information includes height, weight, gender, dominant hand, arm length, leg length, hand size, finger length, grip strength, muscle strength, flexibility, shoulder strength, and the like.
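As a minimal sketch, the physical information described above could be held in a simple record; every field name here is an illustrative assumption, since the disclosure only lists the kinds of items stored:

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class PhysicalInfo:
    # Field names are assumptions; the disclosure only lists the kinds of
    # items stored (height, weight, gender, dominant hand, grip strength, ...).
    height_cm: float
    weight_kg: float
    gender: str
    dominant_hand: str
    grip_strength_kg: Optional[float] = None
    flexibility_score: Optional[int] = None

info = PhysicalInfo(height_cm=158.0, weight_kg=52.5, gender="F", dominant_hand="right")
record = asdict(info)  # e.g. the form kept in the physical information storage unit 130
```

A plain dataclass keeps the record serializable, which matters because the same fields are later copied into the evaluation request.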
- the evaluation request transmission unit 112 transmits to the server device 20 a request to evaluate physical exercise (hereinafter referred to as an evaluation request) based on the image captured by the imaging unit 111 .
- FIG. 5 is a diagram showing a configuration example of an evaluation request that the evaluation request transmission unit 112 transmits to the server device 20.
- the evaluation request includes user ID, mode, physical information and image data.
- a user ID is information that identifies a user.
- a mode is information indicating an exercise performed by a user. Modes can be, for example, "strength training", “joint range of motion training", “gait rehabilitation”, “improvement of frailty (improvement of specific symptoms and medical conditions)", and the like. It is assumed that the mode is selected from predetermined options.
- the physical information is physical information stored in the physical information storage unit 130 .
- the image data is data of a moving image acquired by the imaging unit 111 .
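The evaluation request described above (user ID, mode, physical information, and image data) could be serialized for transmission to the server device 20 as follows; the field names and JSON encoding are assumptions for illustration, not specified by the disclosure:

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class EvaluationRequest:
    # Field names are illustrative assumptions; the disclosure specifies only
    # that the request carries a user ID, a mode, physical info, and image data.
    user_id: str
    mode: str                      # e.g. "strength training", "gait rehabilitation"
    physical_info: dict = field(default_factory=dict)
    image_ref: str = ""            # reference to the captured moving image

    def to_json(self) -> str:
        return json.dumps(asdict(self))

req = EvaluationRequest(user_id="u001", mode="strength training",
                        physical_info={"height_cm": 158}, image_ref="video_0001.mp4")
payload = req.to_json()
```

In practice the moving image itself would likely be uploaded separately and referenced here, since embedding video bytes in a JSON request would be impractical.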
- the evaluation information receiving unit 113 receives information on the evaluation of the physical exercise using a tool (hereinafter referred to as evaluation information) that the server device 20 returns in response to an evaluation request.
- the evaluation information receiving unit 113 registers the received evaluation information in the evaluation information storage unit 132.
- FIG. 6 is a diagram showing a configuration example of evaluation information received by the evaluation information receiving unit 113 from the server device 20.
- the evaluation information includes mode, user ID, tool position information, body part position information, posture information, motion information and checkpoint information.
- the user ID and mode are the user ID and mode included in the evaluation request.
- the photographed image shows the body of the user performing the exercise indicated by the mode.
- Tool position information indicates the position in the image of each part of the tool (for example, for a baseball bat, the entire bat, both ends, the grip, the point where the ball meets the bat, the center of gravity, or any other arbitrary part).
- the tool position information includes each part of the tool and the position of that part in association with a time point on the time axis of the moving image. Based on the tool position information, the movement of the tool and its relationship to parts of the body can be displayed. That is, for example, a figure (such as a circle) indicating a part can be superimposed on the image at the position indicated by the tool position information.
- tool position information need not be included for a portion that connects two parts (for example, the midpoint between the positions where the barbell is gripped with the right and left hands). In this case, the portion connecting the two parts can be represented by connecting a pair of marks (such as circles) indicating the two parts with a line.
- the tool position information may be included for each frame constituting the moving image, for each key frame (including frames related to the checkpoints described later), for every arbitrary number of frames, or for arbitrary time points. If position information is not included for every frame, the figure can be displayed based on the position information at the most recent past time.
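The fallback described above, displaying a figure from the most recent past time when position information is not recorded for every frame, can be sketched as a simple lookup; the time-keyed data layout is an assumption for illustration:

```python
import bisect

def position_at(position_info, t):
    """Return the position recorded at the most recent time <= t, or None.

    position_info maps a time point (seconds on the video's time axis) to an
    (x, y) position for one part; this keying is an illustrative assumption.
    """
    times = sorted(position_info)
    i = bisect.bisect_right(times, t)
    if i == 0:
        return None  # no position recorded at or before t
    return position_info[times[i - 1]]

# Positions recorded only at key frames; intermediate times reuse the last one.
grip = {0.0: (120, 340), 1.5: (122, 300), 3.0: (125, 260)}
```

The same lookup serves tool position information, body position information, and the relationship information, all of which share this sparse, time-keyed structure.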
- the body position information indicates the position in the image of each part of the body (eg, head, shoulders, elbows, waist, knees, ankles, etc.).
- the body position information includes a part and the position of the part in association with a point in time on the time axis of the moving image.
- based on the body position information, the state of the skeleton of the body (bones) can be displayed. That is, for example, a figure (such as a circle) indicating a part can be superimposed on the image at the position indicated by the body position information. Note that multiple part positions may be included for one time point.
- position information need not be included for a part that connects two parts (for example, a forearm connecting a wrist and an elbow, or a thigh connecting a waist and a knee). In this case, the part connecting the two parts can be represented by connecting a pair of marks (such as circles) indicating the two parts with a line.
- Position information may be included for each frame constituting the moving image, for each key frame (including frames related to the checkpoints described later), for every arbitrary number of frames, or for arbitrary time points. If position information is not included for every frame, the bones can be displayed based on the position information at the most recent past time.
- the tool orientation information is information related to the orientation of the tool used by the user and the direction in which the part of the tool faces.
- the tool orientation information includes the part of the tool to be evaluated, a tool orientation value, an evaluation rank, an evaluation comment, and the like, in association with a time point on the time axis of the moving image.
- the tool orientation value is a value representing the orientation of the tool.
- a tool orientation value can be, for example, the distance from the ground to a part of the tool, the distance between two parts of the tool, or the angle at a part (for example, the angle formed between a first part of the tool and the part where the user grips the tool).
- An evaluation rank is a value representing an evaluation value by a rank.
- the evaluation rank is expressed, for example, on a five-point scale from 1 to 5, or by letters such as A, B, and C.
- the evaluation comment is a comment related to the evaluation of posture. For example, if the mode is "upright row” and the right and left ends of the barbell are at different distances from the ground, an evaluation comment such as "different forces are applied to the left and right" may be included.
- Tool movement information is information related to the movement of the tool used by the user.
- the tool movement information includes the portion of the tool to be evaluated, the list of tool orientation values, the evaluation rank, and the evaluation comment in association with the period on the time axis of the moving image.
- a list of tool orientation values is a time series of tool orientation values within a time period.
- the evaluation comment is a comment related to the evaluation regarding the movement of the tool. For example, if the mode is "upright row” and there is not enough up and down movement of the barbell, a rating comment such as "the barbell is not lifted enough" may be included.
- the posture information is information related to the posture of the user's body.
- the posture information includes a part to be evaluated, a posture value, an evaluation rank, and an evaluation comment in association with a time point on the time axis of the moving image.
- a posture value is a value representing a posture.
- the posture value can be, for example, the distance from the ground to a part, the distance between two parts, or the angle at a joint (the angle formed by a straight line from a first end part to the joint and a straight line from a second end part to the joint).
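A posture value such as the angle at a joint can be computed from three 2-D part positions; this helper is an illustrative sketch of that geometry, not an implementation from the disclosure:

```python
import math

def joint_angle(end1, joint, end2):
    """Angle in degrees at `joint`, formed by the segments joint->end1 and
    joint->end2. Each argument is an (x, y) part position in the image."""
    ax, ay = end1[0] - joint[0], end1[1] - joint[1]
    bx, by = end2[0] - joint[0], end2[1] - joint[1]
    dot = ax * bx + ay * by
    na = math.hypot(ax, ay)
    nb = math.hypot(bx, by)
    cos_t = max(-1.0, min(1.0, dot / (na * nb)))  # clamp for float safety
    return math.degrees(math.acos(cos_t))

# e.g. a knee angle from hip, knee, and ankle positions (coordinates are examples)
angle = joint_angle((0, 0), (0, 10), (10, 10))
```

The same function covers tool orientation values expressed as angles, with tool part positions substituted for body part positions.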
- An evaluation rank is a value representing an evaluation value by a rank.
- the evaluation rank is expressed, for example, on a five-point scale from 1 to 5, or by letters such as A, B, and C.
- the evaluation comment is a comment related to the evaluation of posture.
- Body movement information is information related to the movement of the user's body.
- the body movement information includes a part to be evaluated, a list of posture values, an evaluation rank, and an evaluation comment in association with a period on the time axis of the moving image.
- the list of attitude values is the time-series attitude values within the period.
- An evaluation comment is a comment related to an evaluation regarding motion. For example, if the mode is "lifting" and the knee extension is not smooth, an evaluation comment such as "knee movement is not smooth" may be included.
- the relationship information indicates one or more positional relationships between the tool and the body.
- the relationship information includes, in association with a time point on the time axis of the moving image, information on the relationship between tool-side information such as the position, orientation, and movement of a part of the tool and body-side information such as a body part, its posture, and its movement. For example, based on the tool position information and the body position information, the positional relationship between the two can be displayed: a figure (such as a circle) indicating the tool part can be superimposed on the image at the position indicated by the tool position information, and a figure (such as a circle) indicating the body part can be displayed at the position indicated by the body position information. Note that a plurality of parts and part positions may be included for one time point.
- Position information need not be included for a line connecting a tool part and a body part (for example, between the tip of the bat and the center of the part where the user grips the bat). In this case, the connection can be represented by joining a pair of figures (such as circles) indicating the two with a line.
- the relationship information may be included for each frame constituting the moving image, for each key frame, for each checkpoint (described later), for every arbitrary number of frames, or for arbitrary time points. If position information is not included for every frame, the figures indicating parts and the lines connecting them can be displayed based on the position information at the most recent past time.
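The superimposed display described above, a mark at each recorded part position and a line for each tool-body relationship, reduces to producing drawing primitives for one frame. A dependency-free sketch, with the data layout and names assumed for illustration:

```python
def overlay_primitives(tool_pos, body_pos, pairs):
    """Build drawing primitives for one frame: a circle per recorded part
    and a line per (tool_part, body_part) pair in `pairs`.

    tool_pos / body_pos map part names to (x, y) image positions; `pairs`
    lists the relationships to connect. Layout and names are assumptions.
    """
    prims = [("circle", name, xy) for name, xy in tool_pos.items()]
    prims += [("circle", name, xy) for name, xy in body_pos.items()]
    prims += [("line", tool_pos[a], body_pos[b])
              for a, b in pairs if a in tool_pos and b in body_pos]
    return prims

prims = overlay_primitives({"grip": (120, 300)}, {"right_wrist": (118, 305)},
                           [("grip", "right_wrist")])
```

Keeping the primitives as plain data separates the evaluation logic from whatever rendering layer (a canvas, OpenCV, or a web UI) actually draws them over the video.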
- Checkpoint information is information indicating the points (hereinafter referred to as checkpoints) at which the orientation of the tool and the posture of the body should be checked within the movement of the tool used by the user or within the user's series of body movements.
- checkpoints include, for example, the moment the barbell reaches its highest position, the moment it reaches its lowest position, and the moment it is being lifted.
- if the mode is "pitching", checkpoints include when the foot is lifted, when the lifted foot is put down and the weight is shifted, when the ball is released, and so on.
- the checkpoint information stores information indicating each checkpoint (hereinafter referred to as a checkpoint ID) in association with a time point on the time axis of the moving image. That is, the frame (still image) in which the checkpoint indicated by a checkpoint ID is displayed can be specified within the moving image.
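Specifying the frame for a checkpoint from its time point on the video's time axis is a frame-rate conversion; a minimal sketch, assuming the checkpoint information stores times in seconds (the checkpoint IDs and values below are illustrative assumptions):

```python
def checkpoint_frame_index(time_sec, fps=30.0):
    """Index of the frame displayed at `time_sec` for a video at `fps`."""
    if time_sec < 0:
        raise ValueError("time point must be non-negative")
    return int(round(time_sec * fps))

# Checkpoint IDs mapped to time points (both are illustrative assumptions).
checkpoints = {"barbell_highest": 2.4, "barbell_lowest": 4.1}
frames = {cid: checkpoint_frame_index(t) for cid, t in checkpoints.items()}
```

The checkpoint display unit could then seek to the computed frame index and decode that single frame as the still image to present.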
- the evaluation display unit 114 displays the evaluation information. For example, the evaluation display unit 114 superimposes on the moving image the tool position, tool orientation, tool movement information, body position, posture, body movement information, and the like included in the evaluation information. By displaying figures representing tool parts and body parts (for example, circles representing the ends of the tool or the center of gravity of the body, and lines connecting them), the movements of tool parts and body parts can be overlaid on the video.
- the evaluation display unit 114 can graphically display chronological changes in the position of each tool part, the orientation of the tool, the movement of the tool, the position of each body part, the posture, the movement of the body, and the like.
- the evaluation display unit 114 can display the evaluation rank and the evaluation comment together with the moving image, based on the tool orientation information and the tool movement information included in the evaluation information. For example, when the playback time of the moving image comes near a time point included in the tool orientation information (the margin can be set to any length, such as about 5 seconds), the evaluation display unit 114 can display the evaluation rank and evaluation comment contained in that tool orientation information. Moreover, when the playback time of the moving image falls within a period included in the tool movement information, the evaluation display unit 114 can display the evaluation rank and evaluation comment included in that tool movement information. The evaluation display unit 114 can also graphically display changes in the tool orientation values over time based on the list of tool orientation values included in the tool movement information.
- similarly, the evaluation display unit 114 can display the evaluation rank and the evaluation comment together with the moving image, based on the posture information and motion information included in the evaluation information. For example, when the playback time of the moving image comes near a time point included in the posture information (the margin can be set to any length, such as about 5 seconds), the evaluation display unit 114 can display the evaluation rank and evaluation comment included in that posture information. Moreover, when the playback time of the moving image falls within a period included in the motion information, the evaluation display unit 114 can display the evaluation rank and evaluation comment included in that motion information. The evaluation display unit 114 can also display the posture value included in the posture information.
- the evaluation display unit 114 can display chronological changes in posture values in a graph based on the list of posture values included in the motion information.
- these evaluation ranks and evaluation comments may be displayed together with the evaluation ranks and comments displayed in accordance with the moving image based on the tool orientation information and the tool movement information described in the previous paragraph.
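The display timing described above, showing a comment when the playback time comes near an evaluated time point (e.g. within about 5 seconds) or falls inside an evaluated period, can be sketched as a single predicate; the list-of-tuples layout is an assumption for illustration:

```python
def comments_to_show(playback_t, point_evals, period_evals, window=5.0):
    """Collect evaluation comments active at playback time `playback_t`.

    point_evals:  list of (time_point, comment) pairs, shown while the
                  playback time is within `window` seconds of the time point.
    period_evals: list of ((start, end), comment) pairs, shown while the
                  playback time is inside the period.
    """
    shown = [c for t, c in point_evals if abs(playback_t - t) <= window]
    shown += [c for (s, e), c in period_evals if s <= playback_t <= e]
    return shown

points = [(10.0, "different forces are applied to the left and right")]
periods = [((20.0, 28.0), "the barbell is not lifted enough")]
```

Calling this once per rendered frame keeps the comment overlay in sync with video playback without any per-comment timers.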
- the checkpoint display unit 115 can extract and display checkpoint images from the video.
- the checkpoint display unit 115 can read a frame corresponding to the time point included in the checkpoint information from the moving image data stored in the image storage unit 131 and display it as a still image. Also, the checkpoint display unit 115 may, for example, extract and display only body parts from the read frames.
- the remedy request transmission unit 116 transmits to the server device 20 a request for acquiring an improvement measure regarding tool handling or physical exercise (hereinafter referred to as an improvement request).
- FIG. 7 is a diagram showing a configuration example of an improvement request.
- the improvement request includes a user ID, mode, purpose, and the like.
- the purpose indicates what the user aims to improve.
- the purpose can be, for example, "increase joint range of motion", “increase muscle strength”, “stabilize walking”, “improve symptoms/medical condition", and the like.
- the purpose may also be selected from predetermined options, and the symptoms and medical conditions may also be selected from predetermined options.
- the improvement plan information receiving unit 117 receives information about the improvement plan (hereinafter referred to as improvement plan information) transmitted from the server device 20 in response to the improvement request.
- the improvement plan information receiving unit 117 stores the received improvement plan information in the improvement plan storage unit 133 .
- FIG. 8 shows a configuration example of improvement measure information.
- the improvement plan information includes the purpose, advice, reference information, a training menu (the training content, appropriate training intensity, appropriate number of repetitions, recommended posture such as standing or sitting, adjusted to the individual's physical information), and the number of times performed.
- the advice is assumed to be a character string representing the improvement measure, but may be content that presents the improvement measure using an image, video, or the like.
- the reference information is suitable orientation and movement of the tool (position, orientation, movement, speed, etc. of each part) and body posture (position, angle, etc. of each part).
- the improvement measure information receiving unit 117 may receive improvement measure information transmitted by the improvement measure information transmission unit 216 based on the evaluation result and the reference values even when no improvement request has been made.
- the improvement plan information display unit 118 displays the improvement plan.
- the remedy information display unit 118 displays the advice included in the remedy information. Further, when the improvement measure information includes a suitable position and angle of a part, the improvement measure information display unit 118 may extract from the moving image a frame in which the part is at the suitable angle and display it on the user terminal 10.
- FIG. 9 is a diagram showing a hardware configuration example of the server device 20.
- the server device 20 includes a CPU 201, a memory 202, a storage device 203, a communication interface 204, an input device 205, and an output device 206.
- the storage device 203 is, for example, a hard disk drive, solid state drive, flash memory, etc., which stores various data and programs.
- the communication interface 204 is an interface for connecting to the communication network 40, and includes, for example, an adapter for connecting to Ethernet (registered trademark), a modem for connecting to a public telephone network, a wireless communication device for performing wireless communication, a USB (Universal Serial Bus) connector, and an RS232C connector for serial communication.
- the input device 205 is, for example, a keyboard, mouse, touch panel, button, microphone, etc. for inputting data.
- the output device 206 is, for example, a display, printer, speaker, or the like that outputs data.
- FIG. 10 is a diagram showing a software configuration example of the server device 20.
- the server device 20 includes an evaluation request reception unit 211, an image analysis unit 212, an evaluation unit 213, an evaluation information transmission unit 214, an improvement measure request reception unit 215, an improvement measure information transmission unit 216, an implementation status information acquisition unit 217, a document information generation unit 218, an image data storage unit 231, a reference information storage unit 232, an evaluation condition information storage unit 233, an improvement measure information storage unit 234, an implementation status information storage unit 235, a template storage unit 236, and a generated document storage unit 237.
- each of the functional units described above is implemented by the CPU 201 of the server device 20 reading a program stored in the storage device 203 into the memory 202 and executing it, and each of the storage units is implemented by part of the storage areas provided by the memory 202 and the storage device 203.
- the evaluation request receiving unit 211 receives an evaluation request transmitted from the user terminal 10.
- the evaluation request reception unit 211 registers information including image data included in the received evaluation request (hereinafter referred to as image information) in the image data storage unit 231 .
- FIG. 11 is a diagram showing a configuration example of image information stored in the image data storage unit 231. As shown in the figure, the image information includes image data in association with a user ID indicating the user whose image was captured. The image data is the data included in the evaluation request.
- the reference information storage unit 232 stores information (hereinafter referred to as reference information) including reference values related to tool positions, tool movements (tool orientation, movement, etc.), postures (positions and angles of body parts, etc.), and body movements derived from the relationship between the tool and the body.
- FIG. 12 is a diagram showing a configuration example of reference information stored in the reference information storage unit 232.
- the reference information includes, but is not limited to, reference information on the absolute position of a part of the tool and on how the tool moves (moving speed, moving distance, direction of movement, etc.); reference information on the absolute position of a body part or its position relative to another part or other reference object (hereinafter referred to as position reference information); reference information on the angle formed, among three parts including a joint part, by the straight lines connecting each of the two other parts to the joint part (hereinafter referred to as angle reference information); and information on the relationship between a part of the tool and a part of the body.
- the tool position reference information includes the part of the tool and the reference position of that part in association with the mode and checkpoint ID.
- the vertical position may be, for example, the height above the ground or the distance from either foot, or a distance from a body part or from a line connecting body parts, such as the distance between the line connecting both shoulders and the shaft.
- the horizontal position of the part may be the distance from a predetermined reference object (for example, a mound plate or a mark on the floor) or the distance from a reference part such as the shoulder, chest, or leg. It is assumed that the position reference information is registered in advance.
- the tool movement reference information includes, in association with the mode and checkpoint ID, reference values for information such as the movement speed and movement distance of parts of the tool, the direction of movement at a certain point in time, and the trajectory of movement during a certain period.
- the position reference information includes the part and the reference position of the part in association with the mode and checkpoint ID. There may be multiple parts. Regarding position, the vertical position may be, for example, the height from the ground or the distance from either foot. The horizontal position may be the distance from a predetermined reference object (for example, a mound plate or a mark on the floor) or the distance from a reference part such as the shoulder, chest, or leg. It is assumed that the position reference information is registered in advance.
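- comparing a detected part position against the position reference information amounts to a distance check. A minimal illustrative sketch (the patent does not prescribe a metric; Euclidean distance and the tolerance parameter are assumptions for this example):

```python
import math

def position_deviation(part_xy, reference_xy):
    """Euclidean distance between a detected part position and its reference position."""
    return math.hypot(part_xy[0] - reference_xy[0],
                      part_xy[1] - reference_xy[1])

def within_reference(part_xy, reference_xy, tolerance):
    """True if the detected position is within the tolerance of the reference position."""
    return position_deviation(part_xy, reference_xy) <= tolerance

# Example: a wrist detected at (3, 4) against a reference at the origin
ok = within_reference((3, 4), (0, 0), tolerance=5.0)
```

The same check applies whether the coordinates are heights from the ground or distances from a reference object, as long as both values use the same coordinate system.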
- the angle reference information includes, in association with the mode and checkpoint ID, two parts (part 1 and part 2), one joint part, and the reference value of the angle formed by the straight line connecting part 1 and the joint part and the straight line connecting part 2 and the joint part.
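- the angle between the two straight lines meeting at the joint part can be computed from the three part positions. An illustrative sketch of that computation for 2D coordinates (function and argument names are hypothetical):

```python
import math

def joint_angle_deg(p1, joint, p2):
    """Angle in degrees at `joint` between the lines joint->p1 and joint->p2."""
    v1 = (p1[0] - joint[0], p1[1] - joint[1])   # vector joint -> part 1
    v2 = (p2[0] - joint[0], p2[1] - joint[1])   # vector joint -> part 2
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_a))

# Example: shoulder at (0, 1), elbow (joint) at (0, 0), wrist at (1, 0)
elbow_angle = joint_angle_deg((0, 1), (0, 0), (1, 0))
```

This value would then be compared with the reference value stored in the angle reference information.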
- the relational reference information includes information on the reference represented by the relation between the part of the tool and the part of the body in association with the mode and checkpoint ID.
- the relational reference information includes information obtained from movement speed, movement distance, angle, etc. in one or more parts and regions in association with the mode and checkpoint ID. For example, when the mode is batting, the reference information includes the movement speed of the tip of the bat at the time of contact with the ball, the angle formed by the bat and the dominant arm holding the bat, and the like.
- the evaluation condition information storage unit 233 stores information for evaluation (hereinafter referred to as evaluation condition information).
- FIG. 13 is a diagram showing a configuration example of evaluation condition information stored in the evaluation condition information storage unit 233.
- the evaluation condition information includes categories, conditions, evaluation ranks, and comments.
- Category is the category of evaluation. Categories may include, for example, "muscle strength", “range of motion”, and "endurance”.
- the conditions include the position, orientation, or movement of each part of the tool in the image (change in position in time series), or the position or movement of each part of the body (change in position in time series).
- For example, when analyzing a weightlifting movement, conditions such as the elbow angle and arm extension speed at the checkpoint of the moment of lifting the barbell, and the shaft movement and vertical speed during the period of lifting and lowering the barbell, can be set in the evaluation condition information. Likewise, when analyzing a pitching form, conditions such as the angle of the elbow and the linear speed of the arm can be set in the evaluation condition information for the checkpoint of releasing the ball.
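- matching measured values against evaluation condition information can be sketched as a lookup over condition entries, each pairing a predicate with a rank and comment in the manner of FIG. 13. This is a hypothetical Python sketch; the condition table, thresholds, and ranks are invented for illustration:

```python
# Hypothetical evaluation condition table: category, condition, rank, comment
EVALUATION_CONDITIONS = [
    {"category": "range of motion",
     "condition": lambda m: m["elbow_angle"] >= 160,
     "rank": "A", "comment": "Arm fully extended at release."},
    {"category": "range of motion",
     "condition": lambda m: 140 <= m["elbow_angle"] < 160,
     "rank": "B", "comment": "Arm slightly bent; extend further."},
]

def evaluate(measurements):
    """Return (rank, comment) of the first satisfied condition, else None."""
    for entry in EVALUATION_CONDITIONS:
        if entry["condition"](measurements):
            return entry["rank"], entry["comment"]
    return None

result = evaluate({"elbow_angle": 170})
```

The evaluation unit 213 described below performs essentially this kind of search over the evaluation condition information storage unit 233.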
- the evaluation rank is an evaluation value when the above conditions are satisfied.
- the comment is a description of the body's posture and movement when the above conditions are met.
- the image analysis unit 212 (part/part identification unit) analyzes the image data.
- the image analysis unit 212 analyzes the image data, extracts the feature amount of each part of the tool and each part of the body, and specifies the position of each part and each part in the image. Also, the image analysis unit 212 analyzes the image data, extracts the feature amount of each part of the tool, and identifies the direction in which each part faces. It should be noted that the image analysis method by the image analysis unit 212 is assumed to employ a general method, and detailed description thereof will be omitted here.
- the image analysis unit 212 may analyze image data for each frame or key frame, may analyze image data for each checkpoint, or may analyze image data at arbitrary timing.
- the image analysis unit 212 also compares the position of each part extracted from the image data with the position reference information stored in the reference information storage unit 232 for each checkpoint ID, and identifies the closest time as the checkpoint time point.
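- the checkpoint time identification just described can be sketched as a nearest-match search: for each frame, sum the distances between the detected part positions and the reference positions, and pick the frame time with the smallest total. An illustrative sketch under that assumption (names are hypothetical):

```python
def find_checkpoint_time(frames, reference_positions):
    """Pick the frame time whose part positions best match the reference.

    frames: list of (time, {part_name: (x, y)}) from the analyzed video.
    reference_positions: {part_name: (x, y)} from the position reference info.
    """
    def total_distance(parts):
        return sum(((parts[p][0] - rx) ** 2 + (parts[p][1] - ry) ** 2) ** 0.5
                   for p, (rx, ry) in reference_positions.items())

    best_time, _ = min(((t, total_distance(parts)) for t, parts in frames),
                       key=lambda tv: tv[1])
    return best_time

frames = [(0.0, {"wrist": (0, 0)}),
          (1.0, {"wrist": (5, 5)}),
          (2.0, {"wrist": (9, 9)})]
t = find_checkpoint_time(frames, {"wrist": (6, 5)})
```

A real implementation might weight parts differently or restrict the search window, which the patent leaves open.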
- the user's physical exercise in the present embodiment also includes what the user does with the assistance of a supporter.
- the image analysis unit 212 may capture the user's body part or the movement of the tool associated with the user's body movement from the image capturing such body movement.
- the image analysis unit 212 may detect a person by analyzing image data and extracting a feature amount, and after dividing an area for each person, specify each part and each part of the user.
- the image analysis unit 212 may perform posture estimation after estimating the region of each person in units of pixels using instance segmentation.
- the image analysis unit 212 distinguishes between the user and the supporter and analyzes the posture of the user.
- the user may be determined to be the person who appears largest in the moving image, or the person appearing closest to the center of the moving image may be determined to be the user.
- the image analysis unit 212 may determine the user based on the markers attached to the user's clothing, body surface, hair, etc. Conversely, the image analysis unit 212 may attach the marker to the supporter to determine the user.
- the image analysis unit 212 may determine that a person holding or wearing a device used for training or the like is a user. Also, the image analysis unit 212 may identify the user using a general face authentication technique. In addition, the image analysis unit 212 may determine the user by recognizing the features of the user (for example, an elderly person, a care recipient, etc.). In this case, for example, the image analysis unit 212 generates a determination model that has learned the walking and movement characteristics of the elderly and people with problems in specific parts, and uses the model to determine the characteristics of the elderly and care recipients.
- the image analysis unit 212 may analyze the postures of all the people appearing in the moving image, then present on the user terminal or the supporter terminal a function for selecting the user, and accept the selection to identify the user. Note that the user determination methods described above are not limited to the case where both the user and the supporter are included in the image, and may also be used to determine the user in other cases.
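- the "largest person" and "closest to center" heuristics mentioned above can be sketched over bounding boxes of detected people. This is an illustrative sketch only; bounding-box detections and the `by` switch are assumptions, not part of the patent:

```python
def pick_user(detections, frame_w, frame_h, by="area"):
    """Choose the likely user among detected people.

    detections: list of (person_id, (x, y, w, h)) bounding boxes.
    by="area" picks the largest person; by="center" picks the person
    whose box center is nearest the frame center.
    """
    cx, cy = frame_w / 2, frame_h / 2
    if by == "area":
        key = lambda d: -(d[1][2] * d[1][3])                 # larger box first
    else:
        key = lambda d: ((d[1][0] + d[1][2] / 2 - cx) ** 2 +  # squared distance
                         (d[1][1] + d[1][3] / 2 - cy) ** 2)   # from frame center
    return min(detections, key=key)[0]

dets = [("supporter", (0, 0, 50, 100)), ("user", (100, 100, 120, 240))]
who = pick_user(dets, 640, 480, by="area")
```

Markers, face authentication, or learned characteristics, as described above, could replace or refine this heuristic.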
- the image analysis unit 212 identifies the parts of the supporter without mistaking them for the parts of the user even when the user and the supporter are included in the image and the two overlap on the image.
- the image analysis unit 212 may, for example, output a plurality of joint point candidates for each part and group which candidates belong to the same person as other part candidates in post-processing. Further, the image analysis unit 212 may acquire depth using a depth sensor, and group joint points with those of the same person based on the depth information.
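- grouping joint point candidates by depth, as suggested above, can be sketched as clustering candidates whose sensor depths fall within a tolerance of each other. A minimal illustrative sketch (the tolerance value and single-pass clustering are assumptions for this example):

```python
def group_joints_by_depth(joint_candidates, depth_tolerance):
    """Group joint candidates assumed to belong to the same person by depth.

    joint_candidates: list of (joint_name, depth_in_meters) from a depth sensor.
    Returns a list of groups, each a list of joint names at similar depth.
    """
    groups = []  # each entry: [anchor_depth, [joint_names]]
    for name, depth in sorted(joint_candidates, key=lambda jd: jd[1]):
        for group in groups:
            if abs(group[0] - depth) <= depth_tolerance:
                group[1].append(name)
                break
        else:
            groups.append([depth, [name]])
    return [names for _, names in groups]

candidates = [("wrist_a", 1.0), ("elbow_a", 1.1), ("wrist_b", 2.5)]
people = group_joints_by_depth(candidates, depth_tolerance=0.3)
```

A production system would combine this with the skeletal-consistency and time-series checks described in the surrounding paragraphs.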
- the image analysis unit 212 may analyze images acquired from a plurality of imaging terminals, check whether there is any contradiction in the same part among the parts determined from each image, and integrate them to suppress false detection.
- the image analysis unit 212 may receive, from the user's terminal or the supporter's terminal, the user's or supporter's manual designation of the detected joint points, and group them with those of the same person. In addition, the image analysis unit 212 may group the detected parts as those of the same person using physical characteristics such as the length between the specified joint points, the range of motion of the joints (restrictions on joint angles), and the degree of bending of the waist. The image analysis unit 212 may also track each joint using time-series information (with the supporter kept away from the user at the start of imaging). Furthermore, the user or supporter may wear a marker such as a glove on limbs that are likely to be detected incorrectly while the movement is imaged, and the image analysis unit 212 may use such markers for grouping joint points or parts when a person is captured.
- the evaluation unit 213 evaluates the movement of the tool used by the user based on the image data.
- the evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information including conditions satisfied by the position and motion of each part of the tool specified from the image data, and if there is such evaluation condition information, acquires the evaluation rank and comments included in it. Note that the evaluation unit 213 may evaluate the movement of the tool and count the number of body movements.
- the evaluation unit 213 evaluates the movement of the user's body based on the image data.
- the evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information including conditions satisfied by the position and movement of each part specified from the image data, and searches the evaluation condition information that satisfies the conditions. Get the rating rank and comments contained in the information, if any. Note that the evaluation unit 213 may evaluate the body movement and count the number of body movements.
- the evaluation unit 213 evaluates the tool used by the user and the movement of the user's body based on the image data.
- the evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information including conditions satisfied by the positions of each portion of the tool and each part of the body specified from the image data, and by the movement of, or relationship between, the portions and the parts; if there is evaluation condition information that satisfies the conditions, the evaluation rank and comment included therein are acquired.
- the evaluation unit 213 may count the number of physical exercises by evaluating the movement of the tool and the body.
- the evaluation information transmission unit 214 transmits the evaluation information to the user terminal 10.
- the evaluation information transmission unit 214 generates tool position information including the points in time on the time axis of the moving image specified by the image analysis unit 212 and the positions of each part of the tool.
- if the position or orientation of a part of the tool satisfies a condition, the evaluation information transmission unit 214 generates tool orientation information including the time point, the part, the tool orientation value, and the evaluation rank and comment acquired by the evaluation unit 213; if the motion of the part (position change in chronological order) satisfies a condition, it generates tool movement information including a list of time points, parts, and tool orientation values, as well as the evaluation rank and comment.
- the evaluation information transmission unit 214 generates checkpoint information including the time point corresponding to each checkpoint analyzed by the image analysis unit 212 and the checkpoint ID indicating the checkpoint.
- the evaluation information transmission unit 214 creates evaluation information including the generated tool position information, tool orientation information, tool movement information, and checkpoint information, and transmits the evaluation information to the user terminal 10 .
- the evaluation unit 213 and the evaluation information transmission unit 214 can correspond to the comment output unit of the present invention.
- the evaluation information transmission unit 214 transmits the evaluation information to the user terminal 10.
- the evaluation information transmission unit 214 generates position information including the time point on the time axis of the moving image specified by the image analysis unit 212 and the position of each part.
- if the position of a part satisfies a condition, posture information including the time point, the part, the posture value, the evaluation rank, and the comment is generated; if the movement of the part (position change in chronological order) satisfies a condition, motion information including a list of time points, parts, and posture values, as well as the evaluation rank and comment, is generated.
- the evaluation information transmission unit 214 generates checkpoint information including the time point corresponding to each checkpoint analyzed by the image analysis unit 212 and the checkpoint ID indicating the checkpoint.
- the evaluation information transmission unit 214 creates evaluation information including the generated position information, posture information, motion information, and checkpoint information, and transmits the evaluation information to the user terminal 10 .
- the evaluation unit 213 and the evaluation information transmission unit 214 can correspond to the comment output unit of the present invention.
- the improvement plan information storage unit 234 stores information related to improvement plans (hereinafter referred to as improvement plan information).
- FIG. 14 is a diagram showing a configuration example of improvement plan information stored in the improvement plan information storage unit 234.
- the improvement plan information includes advice in association with purposes, categories and conditions.
- the conditions may be conditions for the tool itself (weight of the barbell, etc.), how to use the tool, conditions for the body (flexibility, etc.), conditions for the position, orientation, and movement of parts of the tool, or conditions for the position or movement of parts of the body.
- the improvement request reception unit 215 receives the improvement request sent from the user terminal 10 .
- the improvement request reception unit 215 may also receive the improvement request from the supporter terminal 50.
- the improvement measure information transmission unit 216 searches, among the improvement measure information corresponding to the mode and purpose included in the improvement request, for items whose conditions are satisfied by the user's physical information included in the evaluation request and by the position, orientation, movement, etc. of each portion and each part specified by the image analysis unit 212.
- the improvement measure information transmission unit 216 acquires the advice of the searched improvement measure information, creates the improvement measure information in which the purpose and the advice are set, and responds to the user terminal 10 with the created improvement measure information.
- the improvement measure information transmission unit 216 also includes in the improvement measure information the position, orientation, speed, angle, and the like of each portion and each part contained in the reference information. Note that the improvement measure information transmission unit 216 may search for an improvement measure based on the evaluation information and the reference values without an improvement request, and transmit it to the user terminal 10.
- the improvement measure information transmission unit 216 may transmit to the user terminal 10 the improvement measure information corresponding to the mode (symptoms, disease name, condition, etc.) and/or purpose (desires, goals, needs, etc.).
- the improvement measure information transmission unit 216 may search, among the improvement measure information corresponding to the mode (symptoms, disease name, condition, etc.) and/or the purpose (hopes, goals, needs, etc.), for information whose conditions are satisfied by the user's physical information included in the evaluation request, particularly information such as ADL, joint range of motion, degree of support required, and degree of care required, or by the position, orientation, movement, etc. of each portion and each part specified by the image analysis unit 212.
- the improvement measure information transmission unit 216 can search, among the improvement measure information corresponding to the user's motor function, life function, and cognitive function, for items whose conditions are satisfied by the user's physical information included in the evaluation request, particularly information such as ADL, joint range of motion, degree of support required, and degree of care required, and by the position, orientation, movement, etc. of each portion and each part specified by the image analysis unit 212.
- the improvement measure information transmission unit 216 may search, among the improvement measure information corresponding to the mode (symptoms, disease name, condition, etc.) and/or the purpose (hopes, goals, needs, etc.) and to the user's motor function, life function, and cognitive function, for items whose conditions are satisfied by the user's physical information included in the evaluation request, particularly information such as ADL, joint range of motion, degree of support required, and degree of care required, and by the position, orientation, and movement of each portion and each part specified by the image analysis unit 212.
- the improvement measure information transmission unit 216 may search, among the improvement measure information corresponding to the mode (symptoms, disease name, condition, etc.) and/or the purpose (desires, goals, needs, etc.) and to items with low evaluations among the user's motor, life, and cognitive functions, for items whose conditions are satisfied by the user's physical information included in the evaluation request, particularly information such as ADL, joint range of motion, degree of support required, and degree of care required, and by the position, orientation, and movement of each portion and each part specified by the image analysis unit 212.
- the implementation status information acquisition unit 217 acquires, from the user terminal 10, the implementation status of the improvement measures (a training menu such as a rehabilitation menu or an exercise menu) transmitted to the user terminal 10, and stores it in the implementation status information storage unit 235.
- the implementation status information acquisition unit 217 presents on the user terminal 10 a form or the like for inputting the implementation date, the number of implementations, etc. for each training menu, and acquires the information input by the user and the supporter as implementation status information.
- the implementation status information acquisition unit 217 may acquire an image captured by the user terminal 10, have the image analysis unit 212 analyze the image to determine whether or not the specified training menu has been implemented, and acquire information on the number of implementations.
- the image analysis unit 212 stores, for each training menu, information on how each part of the body or each part of the tool should move, and by comparing it with each part obtained by analyzing the image of the user and the movement of each part, can analyze whether or not the specified training menu has been implemented and count the number of times the movement of each portion and each part has been repeated. Further, the evaluation unit 213 may evaluate the image to determine whether the training was effective, and the evaluation information transmission unit 214 may display the evaluation result on the user terminal 10.
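- counting repetitions of a movement, as described above, can be sketched as counting full cycles of a tracked quantity such as a joint angle. This sketch assumes a simple two-threshold state machine (the thresholds and the angle-based criterion are illustrative assumptions, not taken from the patent):

```python
def count_repetitions(angle_series, low=90.0, high=160.0):
    """Count repetitions as full low -> high -> low cycles of a joint angle.

    angle_series: joint angles (degrees) sampled over time.
    A repetition is counted each time the angle rises above `high`
    and then falls back below `low` (e.g. extend then flex the arm).
    """
    reps = 0
    state = "down"  # waiting for the angle to rise above `high`
    for angle in angle_series:
        if state == "down" and angle >= high:
            state = "up"
        elif state == "up" and angle <= low:
            state = "down"
            reps += 1
    return reps

# Example: two extend-and-flex cycles
n = count_repetitions([80, 170, 85, 165, 80])
```

Using two thresholds instead of one avoids double-counting when the angle jitters around a single boundary.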
- the document information generation unit 218 generates documents to be submitted to departments related to health and welfare established in each local government, etc., in order to receive the application of nursing care insurance and the payment of subsidies in the nursing care business.
- Such documents include, but are not limited to, documents for applying for and notifying long-term care insurance, such as documents related to designation applications, remuneration claims, and instruction audits, as well as documents related to the ADL maintenance addition, the living function improvement cooperation addition, the scientific nursing care promotion addition, and the like.
- the documents may also include handovers, transfers, charts (user's condition, monitoring, etc.), training implementation records, diaries, care plans, assessment results, various plans, care provision charts, nursing care benefit statements (records leading to remuneration claims), and other documents created and stored by the organization to which the supporter belongs.
- the document information generation unit 218 may acquire the template of the document to be generated, which is stored in the template storage unit 236, pour into the template the corresponding information on the user, the content of care provided, and the training implementation status (implementation content, number of times, etc.) stored in the implementation status information storage unit 235, and output it to the supporter terminal 50 as a Word file, PDF file, or the like.
- the document information generation unit 218 may cooperate with a management system or the like and store relevant information, such as the user information, implementation details, and implementation frequency stored in the implementation status information storage unit 235, into the database of the management system.
- FIG. 15 is a diagram showing an example of the flow of processing executed by the care support device of this embodiment.
- the imaging unit 111 of the user terminal 10 receives mode input, images the user's body during exercise, and acquires video data (S321).
- the evaluation request transmission unit 112 transmits to the server device 20 an evaluation request including the user ID indicating the user, the received mode, the physical information and the video data (S322).
- the image analysis unit 212 analyzes the moving image data to extract feature amounts (S323), and specifies the position of each part and each part (S324).
- the image analysis unit 212 may specify the position on the image, or may use the physical information to specify the actual position (height from the ground, distance from a reference point such as the center of gravity of the body, etc.).
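- converting an image-space measurement to an actual position can use the physical information as a scale reference. A minimal sketch under the assumption that the user's real height (from the physical information) and pixel height (from the image) are both known:

```python
def pixels_to_cm(pixel_length, user_height_cm, user_height_px):
    """Convert a length measured in pixels to centimetres.

    user_height_cm: the user's height from the physical information
    in the evaluation request.
    user_height_px: the user's height as measured on the image.
    """
    scale = user_height_cm / user_height_px  # centimetres per pixel
    return pixel_length * scale

# Example: a 100 px displacement for a 170 cm user who is 340 px tall on screen
real_cm = pixels_to_cm(100, user_height_cm=170, user_height_px=340)
```

This assumes the measured length lies roughly in the user's image plane; camera perspective would otherwise distort the scale.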
- the evaluation unit 213 acquires an evaluation rank and a comment from the evaluation condition information whose conditions are satisfied by the position of each portion and each part and by the movement of the parts (change in position over time) (S325).
- the evaluation information transmission unit 214 creates evaluation information and transmits it to the user terminal 10 (S326).
- the evaluation display unit 114 of the user terminal 10 displays the position, orientation, movement, etc. of the tool on the video data based on the received evaluation information (S327).
- the evaluation display unit 114 of the user terminal 10 may display the position (bone) of each part indicating the posture of the body, as well as the evaluation rank and comments (S327).
- the evaluation display unit 114 may graphically display the position, orientation, movement, etc. of the part, and time-series changes in the position and movement of the part.
- the checkpoint display unit 115 may extract and display images of checkpoints from the moving image.
- the remedy request transmission unit 116 transmits the remedy request to the server device 20 according to the instruction from the user (S328).
- The improvement measure information transmission unit 216 searches for improvement measure information that satisfies the conditions, acquires the advice included in that information (S329), creates improvement measure information including the acquired advice, and transmits it to the user terminal 10 (S330).
- The improvement measure information display unit 118 displays the advice included in the received improvement measure information, and can superimpose the preferred tool usage on the video data (S331).
- The improvement measure information display unit 118 displays the advice included in the received improvement measure information, and also displays the preferred body posture in the form of bones (S331).
- the imaging unit 111 of the user terminal 10 captures an image of the user's body while exercising, and acquires other moving image data.
- The user terminal 10 may transmit, via the evaluation request transmission unit 112, an evaluation request including the user ID indicating the user, the received mode, the physical information, and the video data to the server device 20, or may transmit only the moving image data capturing the body movement to the server device 20 (S332).
- the implementation status information acquisition unit 217 acquires implementation status information based on the analysis of the moving image data by the image analysis unit 212 and the evaluation by the evaluation unit 213 (S333).
- the document information generation unit 218 generates document information based on at least the implementation status information and the document template stored in the server device 20 (S334).
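Filling a stored document template with the implementation status can be sketched with simple string substitution. The template fields shown are assumptions for illustration, not the actual fields of the embodiment's document template.

```python
from string import Template

# A document template as it might be stored on the server side.
DOCUMENT_TEMPLATE = Template(
    "Functional training report for $name\n"
    "Menu: $menu / performed $count times\n"
    "Evaluation: rank $rank"
)

def generate_document(implementation_status):
    """Fill the stored template with the user's implementation status."""
    return DOCUMENT_TEMPLATE.substitute(implementation_status)

doc = generate_document(
    {"name": "User A", "menu": "squat", "count": 10, "rank": "B"}
)
```

In practice the implementation status would come from the implementation status information storage unit 235 rather than a literal dictionary.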
- With the care support device of the present embodiment, it is possible to easily evaluate physical exercise. In particular, for physical exercise related to sports, it is possible to evaluate the positional relationship and movement between each part of the tool and each part of the body. In addition, since the care support device of the present embodiment also provides comments and advice, the user can easily grasp the current situation and the measures for improvement.
- FIG. 16 is a diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
- FIG. 16 illustrates a case where a moving image is captured in the weightlifting mode.
- the screen 41 displays a mark 411 indicating the position of the barbell shaft.
- the movement of the barbell shaft is indicated by line 412 .
- FIG. 17 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
- FIG. 17 illustrates a case where a moving image is captured in the weightlifting mode.
- As the result of the evaluation performed by the evaluation unit 213 in the weightlifting mode, the inclination of the barbell shaft, the movement distance of the shaft, the movement speed (line 421), and the like are displayed.
- the reference value (line 422) is displayed.
- The actual measurement result (line 423) may be displayed, and the difference from the reference value may further be displayed numerically or graphically. The user can consider the movement and posture that should be corrected by referring to this.
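The numerical display of the difference between the measurement and the reference value can be sketched as follows; the metric names and values are illustrative.

```python
def deviation_report(measured, reference):
    """Return per-metric differences (measured minus reference),
    rounded for numerical display."""
    return {k: round(measured[k] - reference[k], 2) for k in reference}

# Example: the shaft rose 5 cm less and moved 0.2 m/s slower than reference.
diff = deviation_report(
    measured={"shaft_rise_m": 0.55, "speed_m_s": 0.80},
    reference={"shaft_rise_m": 0.60, "speed_m_s": 1.00},
)
```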
- FIG. 18 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
- FIG. 18 illustrates a case where moving images are captured in the weightlifting mode.
- Reference values, such as the angle of a body joint at the lowest point of the barbell, and the evaluation result (line 432) are displayed.
- the difference from the reference value may be displayed numerically, or may be displayed in a graph or the like.
- the evaluation result based on the relationship between the body and the tool may be displayed. The user can consider the movement and posture that should be corrected by referring to this.
- FIG. 19 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
- FIG. 19 illustrates a case where a moving image is captured in the weightlifting mode.
- Line 441 shows a bone drawn by connecting predetermined positions of the parts of the tool and the body parts identified from the image. Note that the bone may be displayed superimposed on the captured image.
- a line 442 represents the acceleration of each part of the body.
- a count result such as how many times the weight has been lifted may be displayed.
- The evaluation result, and the next training content according to the purpose or the like, may also be indicated.
- FIG. 20 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
- FIG. 20 illustrates a case where a moving image is captured in the weightlifting mode.
- an evaluation (line 441) obtained by comparison with various reference values is displayed.
- In the above description, the server device 20 analyzes the image; however, the present invention is not limited to this, and the user terminal 10 may analyze the image and specify the positional relationship between each part of the tool and each part of the body.
- In the above description, the positions of the parts of the tool and the body parts are positions on a two-dimensional image; however, they are not limited to this, and may be three-dimensional positions.
- For example, the three-dimensional position of a part of the tool or a body part can be specified based on the image from the camera 106 and the depth map from a depth camera.
- The three-dimensional position may also be specified by estimating a three-dimensional image from the two-dimensional image. It is also possible to provide a depth camera in place of the camera 106 and specify the three-dimensional position only from the depth map of the depth camera. In this case, the depth map is transmitted from the user terminal 10 to the server device 20 together with, or instead of, the image data, and the image analysis unit 212 of the server device 20 can analyze the three-dimensional position.
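Back-projecting a pixel and its depth-map value into a three-dimensional position can be sketched with the standard pinhole camera model. The intrinsic parameters fx, fy, cx, cy are assumed calibration values, not part of the embodiment.

```python
def backproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth (in meters) to camera-space XYZ
    using a pinhole model with focal lengths fx, fy and principal point cx, cy."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel at the principal point maps straight onto the optical axis.
p = backproject(320, 240, depth_m=2.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
```

Applying this to each identified part of the tool or body yields the three-dimensional positions described above.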
- an image of the user's body during exercise using a tool is transmitted from the user terminal 10 to the server device 20.
- a feature amount may be extracted from the image, and the feature amount may be transmitted to the server device 20.
- The user terminal 10 may also estimate the parts of the tool or the parts of the body based on the feature amounts, acquire the absolute position of each part (on the XY coordinates of the image, as an actual distance from a reference position such as the ground, the tip of the foot, the head, or the center of gravity of the body, or in any other coordinate system), or acquire the relative positional relationships between multiple parts of the tool, between multiple body parts, and between parts of the tool and body parts, and transmit these absolute positions and relative positional relationships to the server device 20.
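Computing the relative positional relationships between identified parts on the terminal side before transmission can be sketched as pairwise offsets; the part names and coordinates are illustrative.

```python
from itertools import combinations

def relative_positions(parts):
    """Return the offset vector between every pair of identified parts,
    keyed by the (alphabetically ordered) pair of part names."""
    return {
        (a, b): (parts[b][0] - parts[a][0], parts[b][1] - parts[a][1])
        for a, b in combinations(sorted(parts), 2)
    }

# Example: the wrist relative to the barbell shaft on image XY coordinates.
rel = relative_positions({"wrist": (120, 300), "shaft": (125, 280)})
```

Transmitting only these compact relationships, rather than full video, reduces the data sent to the server device 20.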
- content prepared on the server device 20 side is provided as the improvement measure information, but this is not restrictive.
- Marks and bones representing movements and postures may be superimposed and displayed on moving images or still images extracted from moving images. This makes it possible to easily grasp what kind of movement and posture should be taken.
- In the above description, the evaluation covers the position and orientation of the parts of the tool and the position or movement of the body parts (the position over time); however, the present invention is not limited to this, and the position of a tool worn by the user may also be specified and evaluated.
- For example, the server device 20 stores the tool and a reference value for the size (length, etc.) of the tool in association with the user's physical information (height, weight, etc.). It identifies the shape of the tool in use by extracting the tool's feature amounts, estimates the size of the tool based on that shape and the size of the user (such as the height) included in the physical information, and, if the difference between the estimated size and the reference value is greater than or equal to a predetermined threshold, can recommend a tool having the size of the reference value.
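The recommendation step can be sketched as a threshold check of the estimated tool size against the stored reference value. The sizes and threshold are illustrative assumptions.

```python
def recommend_tool_size(estimated_size_m, reference_size_m, threshold_m=0.05):
    """Recommend the reference-size tool when the estimated size of the tool
    in use deviates from the reference by at least the threshold."""
    if abs(estimated_size_m - reference_size_m) >= threshold_m:
        return f"A tool of size {reference_size_m} m is recommended."
    return None

# A 1.10 m tool against a 1.00 m reference exceeds the 5 cm threshold.
msg = recommend_tool_size(estimated_size_m=1.10, reference_size_m=1.00)
```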
- content such as advice is provided as an improvement measure, but for example, the user may be suspended from exercising.
- the server device 20 stores a reference value at which physical exercise should be interrupted in association with the user's physical information (purpose, height, weight, etc.).
- When the number or speed of the exercise deviates from the reference value, for example when the speed of lifting the barbell is extremely slow or the number of repetitions in one session is too large, the physical exercise is discontinued.
- To discontinue the exercise, a comment asking the user to stop may be issued to the user terminal 10, the user may be notified by changing the display, such as turning off the screen, a sound such as an alert may be emitted, or the user may be notified by vibration.
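The decision to discontinue the exercise can be sketched as a check of the observed speed and repetition count against stored limits. The limit values are illustrative assumptions, not reference values from the embodiment.

```python
def should_interrupt(lift_speed_m_s, reps_in_session,
                     min_speed_m_s=0.15, max_reps=30):
    """Return a stop message when the lift is extremely slow (possible
    exhaustion) or the repetitions in one session are too many;
    return None when the exercise may continue."""
    if lift_speed_m_s < min_speed_m_s:
        return "Lifting speed is extremely slow; please stop and rest."
    if reps_in_session > max_reps:
        return "Too many repetitions in one session; please stop."
    return None
```

The returned message corresponds to the comment issued to the user terminal 10; the other notification channels (screen, sound, vibration) would be triggered alongside it.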
- In the above description, content such as advice is provided as the improvement measure; however, for example, a determination of illness and physical exercise for improvement may also be presented.
- For example, the server device 20 extracts candidates for a disease that the user is assumed to have developed from the symptoms the user entered in the physical information and from the evaluation information, and presents a screening test for narrowing down the disease. After the user has taken the screening test and the name of the disease has been narrowed down, the service recommends that the user receive a medical examination, and recommends physical exercise for improvement, tools for the physical exercise, and items such as meals.
- the server device 20 can estimate the speed, acceleration, movement distance, trajectory, etc. of the tool.
- The server device 20 can also count the number of actions performed with the tool by extracting repeating patterns in the time-series changes of the tool's position.
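Counting actions from the time-series position of the tool can be sketched by counting upward crossings of a mid-level height, a simple stand-in for the pattern extraction described; the threshold is an assumption.

```python
def count_reps(heights, threshold):
    """Count one repetition each time the tool's height rises
    through the threshold from below."""
    reps = 0
    below = True
    for h in heights:
        if below and h >= threshold:
            reps += 1
            below = False
        elif h < threshold:
            below = True
    return reps

# Three lifts: the shaft height rises above 1.0 m three times.
n = count_reps([0.2, 0.8, 1.5, 0.3, 1.2, 0.4, 1.6, 0.5], threshold=1.0)
```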
- the server device 20 may store assignments instead of evaluation comments in association with one or a series of postures or movements, and output the assignments.
- the exercise is evaluated, but the present invention is not limited to this.
- Content for improving physical exercise may be presented according to the purpose, such as performance, or to preparatory stages such as stretching, strength training, and posture.
- In this case, it is sufficient for the server device 20 to store content such as training content in association with one or a series of movements of a part of the tool, orientations of a part of the tool, body postures, or movements of a body part, instead of the evaluation comment, and to output that content.
- In the above description, the exercise is evaluated; however, the present invention is not limited to this, and the action performed by the user may also be detected automatically.
- For example, the server device 20 stores, as reference information, the positions of the parts of the tool and the body postures (positions of each body part) for a predetermined action such as a shot or a pass. By comparing the positions of the parts of the tool and the body parts analyzed from the image with the reference information, the action performed by the user in the image can be identified.
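Identifying the performed action by comparing the analyzed positions with the stored reference information can be sketched as a nearest-reference match; the actions, keypoints, and coordinates are illustrative.

```python
def closest_action(observed, references):
    """Return the reference action whose keypoint positions are closest
    (by sum of squared distances) to the observed positions."""
    def distance(ref):
        return sum(
            (observed[k][0] - ref[k][0]) ** 2 + (observed[k][1] - ref[k][1]) ** 2
            for k in ref
        )
    return min(references, key=lambda name: distance(references[name]))

# Reference positions of tool and body keypoints per action.
REFERENCES = {
    "shoot": {"wrist": (0.9, 1.8), "ball": (1.0, 2.0)},
    "pass":  {"wrist": (0.6, 1.2), "ball": (0.8, 1.1)},
}
action = closest_action({"wrist": (0.88, 1.75), "ball": (1.02, 1.95)}, REFERENCES)
```

A production system would normalize for body size and camera viewpoint before comparing, which is omitted here.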
- In the above description, an image captured in the past is analyzed to evaluate the motion; however, the present invention is not limited to this. The analysis processing may be performed in real time, and when a predetermined motion is detected, the posture to take next, the parts and directions to be moved, the number of repetitions, and the like may be recommended. In this case, it is sufficient to store correct postures and movements in association with postures or movements instead of evaluation comments, compute in real time the difference between the posture at that moment and the correct posture or movement, and output the actions needed to fill that difference.
- The improvement measure information is obtained by extracting the improvement measures stored in the improvement measure information storage unit 234 based on the mode, the purpose, the physical information, or the evaluation result produced by the evaluation unit 213, and is presented to the user terminal 10.
- The evaluation unit 213 may evaluate images of training performed a plurality of times at different times and determine whether the evaluation result has improved.
- Depending on that determination, the improvement measure information transmission unit 216 may present to the user terminal 10 a training menu different from the previous one, or an increase or decrease in the number of repetitions.
- Specifically, the improvement measure information storage unit 234 compares the result of evaluating the user's physical exercise based on a first image including physical exercise performed by the user with the result of evaluating the user's physical exercise based on a second image including physical exercise performed at a different, later time. When the evaluation based on the second image is closer to the reference value than the evaluation based on the first image, it determines that the exercise has improved; when the evaluation based on the second image is farther from the reference value than the evaluation based on the first image, it determines that there is no improvement.
- When the evaluation based on the second image is neither closer to nor farther from the reference value than the evaluation based on the first image, the improvement measure information storage unit 234 determines that there is no change.
- The improvement measure information storage unit 234 may re-extract the improvement measures based on the mode, the purpose, the physical information, the evaluation results obtained by the evaluation unit 213, and the determination results such as improved, not improved, or no change, and present them on the user terminal 10.
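The improved / not improved / no change determination can be sketched by comparing each evaluation's distance from the reference value:

```python
def judge_progress(first_eval, second_eval, reference):
    """Compare two evaluations of the same exercise performed at different
    times: the evaluation closer to the reference value is the better one."""
    d1 = abs(first_eval - reference)
    d2 = abs(second_eval - reference)
    if d2 < d1:
        return "improved"
    if d2 > d1:
        return "not improved"
    return "no change"
```

The resulting label would feed back into the re-extraction of improvement measures described above.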
- the user terminal 10 or the server device 20 executes the predetermined function and stores the information.
- the functional unit and the storage unit may be provided separately.
Landscapes
- Health & Medical Sciences (AREA)
- Business, Economics & Management (AREA)
- Primary Health Care (AREA)
- Engineering & Computer Science (AREA)
- Tourism & Hospitality (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Human Resources & Organizations (AREA)
- Strategic Management (AREA)
- Economics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Child & Adolescent Psychology (AREA)
- Theoretical Computer Science (AREA)
- Marketing (AREA)
- Biophysics (AREA)
- Physical Education & Sports Medicine (AREA)
- Epidemiology (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Medical Treatment And Welfare Office Work (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
A care support device characterized by comprising: an evaluation unit that evaluates physical exercise on the basis of first image information including physical exercise performed by a user; an improvement measure information transmission unit that selects a suitable training menu from the evaluation of the physical exercise and presents the training menu to a support person; and a document information generation unit that generates document information on the basis of at least the status of the user's implementation of the training menu.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2021213725A JP2023097545A (ja) | 2021-12-28 | 2021-12-28 | Care support device, care support program, care support method |
| JP2021-213725 | 2021-12-28 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023127870A1 true WO2023127870A1 (fr) | 2023-07-06 |
Family
ID=86999037
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/048136 Ceased WO2023127870A1 (fr) | 2022-12-27 | 2021-12-28 | Care support device, care support program, and care support method |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2023097545A (fr) |
| WO (1) | WO2023127870A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240013499A1 (en) * | 2022-07-08 | 2024-01-11 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, recording medium, and image processing system |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019008771A1 (fr) * | 2017-07-07 | 2019-01-10 | りか 高木 | Guidance process management system for therapy and/or physical exercise, and program, computing device, and method for managing guidance process for therapy and/or physical exercise |
| WO2019022102A1 (fr) * | 2017-07-25 | 2019-01-31 | パナソニックIpマネジメント株式会社 | Activity assist method, program, and activity assist system |
| WO2020107097A1 (fr) * | 2018-11-27 | 2020-06-04 | Bodybuddy Algorithms Inc. | Systems and methods for providing personalized exercise and diet regimens |
| JP2021117553A (ja) * | 2020-01-22 | 2021-08-10 | 株式会社ジェイテクト | Exercise evaluation system and server system |
| JP2021529368A (ja) * | 2018-06-21 | 2021-10-28 | International Business Machines Corporation | Virtual environment for physical therapy |
-
2021
- 2021-12-28 JP JP2021213725A patent/JP2023097545A/ja active Pending
-
2022
- 2022-12-27 WO PCT/JP2022/048136 patent/WO2023127870A1/fr not_active Ceased
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240013499A1 (en) * | 2022-07-08 | 2024-01-11 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, recording medium, and image processing system |
| US12469241B2 (en) * | 2022-07-08 | 2025-11-11 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, recording medium, and image processing system, with change instruction to change form of 3D data |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2023097545A (ja) | 2023-07-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10973439B2 (en) | Systems and methods for real-time data quantification, acquisition, analysis, and feedback | |
| US10314536B2 (en) | Method and system for delivering biomechanical feedback to human and object motion | |
| JP7620969B2 (ja) | Exercise support system | |
| Yurtman et al. | Automated evaluation of physical therapy exercises using multi-template dynamic time warping on wearable sensor signals | |
| JP7656920B2 (ja) | Exercise evaluation system | |
| CN111883229B (zh) | Intelligent exercise guidance method and system based on visual AI | |
| US20150003687A1 (en) | Motion information processing apparatus | |
| US9510789B2 (en) | Motion analysis method | |
| JP7492722B2 (ja) | Exercise evaluation system | |
| Shi et al. | A VR-based user interface for the upper limb rehabilitation | |
| Gauthier et al. | Human movement quantification using Kinect for in-home physical exercise monitoring | |
| JP2016035651A (ja) | Home rehabilitation system | |
| WO2023127870A1 (fr) | Care support device, care support program, and care support method | |
| AU2023318948A1 (en) | Approaches to independently detecting presence and estimating pose of body parts in digital images and systems for implementing the same | |
| JP2021068069A (ja) | Method for providing unmanned training | |
| WO2022030619A1 (fr) | Guidance support system | |
| JP6439106B2 (ja) | Body distortion checker, body distortion check method, and program | |
| JP7659293B2 (ja) | Physical exercise support system | |
| JP7713250B2 (ja) | Posture estimation device, posture estimation system, and posture estimation method | |
| JP7561357B2 (ja) | Determination method, determination device, and determination system | |
| EP3830811B1 (fr) | Système et procédé d'entraînement physique d'une partie du corps | |
| Jurado et al. | An IoT Monitoring System Based on Artificial Intelligence Image Recognition and EMG Signal Processing for Abdominal Exercise Performance | |
| Chung et al. | An Integrated Rehabilitation APP for Stroke Patients | |
| Ekambaram et al. | Real-Time Error Analysis of Exercise Posture for Musculoskeletal Disorder–A Machine Vision Approach | |
| CN118785851A (zh) | Method and system for labeling exercise data and generating an exercise evaluation model | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22916093 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20/09/2024) |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 22916093 Country of ref document: EP Kind code of ref document: A1 |