
WO2018122956A1 - System, method, and program for supporting sports motion analysis - Google Patents


Info

Publication number
WO2018122956A1
WO2018122956A1 (application PCT/JP2016/088884)
Authority
WO
WIPO (PCT)
Prior art keywords
time
time segment
model
data
sports
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2016/088884
Other languages
English (en)
Japanese (ja)
Inventor
Yosuke Motohashi (本橋 洋介)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to PCT/JP2016/088884 (WO2018122956A1)
Priority to JP2018558560A (JP6677319B2)
Publication of WO2018122956A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion

Definitions

  • the present invention relates to a sports motion analysis support system, a sports motion analysis support method, and a sports motion analysis support program that support motion analysis in sports.
  • Motion capture is known as a technique that can be used for motion analysis in sports.
  • application software that displays a trajectory of a specific part of the body (for example, an ankle) based on a moving image obtained by imaging a person who is playing sports.
  • application software that displays the difference between a model form and the form of a sports person.
  • Patent Document 1 describes that a swing stage and a main position of golf are specified from a measurement value of a frame difference of an image using a rule-based model. Further, Patent Document 1 describes that in this case, a hidden Markov model, a state space model, a finite state machine, a regression method, a support vector machine, a neural network, a fuzzy theory, and the like may be used.
  • Patent Document 2 collects motion image data of 10 trials, uses 5 trials as learning data, estimates the parameters of the hidden Markov model, and performs a recognition experiment using the remaining 5 trials as test data. It is described. Patent Document 2 describes that a discrete cosine transform coefficient having a relatively low frequency component has been found to be effective as a feature amount for image recognition of human motion.
  • However, the form and the result of the action performed with that form do not always correspond completely. For example, even if a person's form is good, the result may be poor because the person's physical condition was poor that day, or because of external factors such as rain or wind.
  • The inventors therefore considered that the form in a certain time zone, within the time during which a person performs the motion, is an important form that strongly influences the result.
  • One example of a result is a numerical value indicating sports performance.
  • An example of a result other than a numerical value is an event.
  • A specific example of an event is a matter relating to the movement of equipment used in the sport, such as the direction in which the ball flew after a soccer PK (penalty kick).
  • An object of the present invention is to provide a sports motion analysis support system, a sports motion analysis support method, and a sports motion analysis support program capable of assisting a user in grasping, among the times during which a series of motions is performed in a sport, the time zone in which the form has a great influence on the result.
  • A sports motion analysis support system according to the present invention includes: a data storage unit that stores a plurality of data items in which moving-image data representing a series of motions in a sport is associated with the result of those motions; a learning unit that, using the plurality of data items, learns, for each of a plurality of time segments defined with reference to a time point representing a predetermined motion, a model representing the relationship between the motion in that time segment and the corresponding result; and an evaluation unit that calculates, for each time segment, the prediction accuracy of results predicted using the model.
  • Another sports motion analysis support system according to the present invention includes: a data storage unit that stores a plurality of data items in which moving-image data representing a series of motions in a sport is associated with the result of those motions; a learning unit that, using the plurality of data items, learns, for each of a plurality of time segments defined with reference to a time point representing a predetermined motion, a model representing the relationship between the motion in that time segment and the corresponding result; and a specifying unit that specifies a time segment from which the time zone in which the degree of improvement in the prediction accuracy of results predicted using the models is large can be identified.
  • In a sports motion analysis support method according to the present invention, a computer including a data storage unit that stores a plurality of data items in which moving-image data representing a series of motions in a sport is associated with the result of those motions learns, using the plurality of data items, for each of a plurality of time segments defined with reference to a time point representing a predetermined motion, a model representing the relationship between the motion in that time segment and the corresponding result, and calculates, for each time segment, the prediction accuracy of results predicted using the model.
  • In another sports motion analysis support method according to the present invention, a computer including the same data storage unit learns, using the plurality of data items, for each of a plurality of time segments defined with reference to a time point representing a predetermined motion, a model representing the relationship between the motion in that time segment and the corresponding result, and specifies a time segment from which the time zone in which the degree of improvement in the prediction accuracy of results predicted using the models is large can be identified.
  • A sports motion analysis support program according to the present invention is installed in a computer including a data storage unit that stores a plurality of data items in which moving-image data representing a series of motions in a sport is associated with the result of those motions. The program causes the computer to execute: a learning process of learning, using the plurality of data items, for each of a plurality of time segments defined with reference to a time point representing a predetermined motion, a model representing the relationship between the motion in that time segment and the corresponding result; and an evaluation process of calculating, for each time segment, the prediction accuracy of results predicted using the model.
  • Another sports motion analysis support program according to the present invention is installed in a computer including the same data storage unit. The program causes the computer to execute: a learning process of learning, using the plurality of data items, for each of a plurality of time segments defined with reference to a time point representing a predetermined motion, a model representing the relationship between the motion in that time segment and the corresponding result; and a specifying process of specifying a time segment from which the time zone in which the degree of improvement in the prediction accuracy of results predicted using the models is large can be identified.
  • According to the present invention, it is possible to assist the user in grasping, among the times during which a series of motions in a sport is performed, the time zone in which the form has a great influence on the result.
  • FIG. 1 is a block diagram illustrating an example of a sports motion analysis support system according to a first embodiment of the present invention. FIG. 2 is a schematic diagram showing a series of motions of a person performing a long jump.
  • In the following, a long jump will be described as an example.
  • However, the present invention is applicable to sports other than the long jump.
  • A case where the result of the long jump motion is a numerical result (in this example, the jump distance) will be described as an example.
  • In this case, the result is represented numerically.
  • The result of a motion in sports may instead be an event, such as the direction in which the ball flew.
  • In that case, the result is represented by a value according to the content of the event (for example, a binary value of “0” and “1”).
  • FIG. 1 is a block diagram showing an example of the sports motion analysis support system according to the first embodiment of the present invention.
  • The sports motion analysis support system 1 according to the first embodiment includes a data storage unit 2, a time segment image extraction unit 3, a learning unit 4, a prediction unit 5, an evaluation unit 6, and a display unit 7.
  • The data storage unit 2 is a storage device that stores a plurality of data items in which image data of a moving image representing a series of motions in a sport is associated with the result of those motions.
  • In this example, moving-image data representing a series of motions of a person performing a long jump is associated with the result (the jump distance) obtained by those motions.
  • The data storage unit 2 stores a plurality of such data items.
  • The manner of associating the image data with the result is not particularly limited.
  • The result may exist as data separate from the image data, with the image data and the result associated with each other.
  • Alternatively, the image data and the result may be associated in a manner in which the result is included in the moving image itself (in other words, the result is displayed within the moving image). This also applies to the embodiments described later.
  • FIG. 2 is a schematic diagram showing a series of motions of a person performing a long jump (hereinafter referred to as a player).
  • In FIG. 2, to simplify the drawing, the same posture is illustrated for the player while running, but the actual player performs a series of motions while moving his or her limbs.
  • This also applies to the schematic diagram shown in the figure described later.
  • As shown in FIG. 2, the player makes a run-up, takes off at the take-off board 11, and then lands.
  • By imaging this, moving-image data representing the series of motions can be obtained.
  • The image data and the result obtained at that time, associated with each other, constitute one data item.
  • The data storage unit 2 stores a plurality of such data items in advance.
  • the data storage unit 2 stores a plurality of data related to a specific player.
  • the specific player may be a player who uses the sports motion analysis support system 1 of the present invention, or a player who is instructed by a coach using the sports motion analysis support system 1.
  • the specific player may be a player who is a competitor for the user who uses the sports motion analysis support system 1 of the present invention. This also applies to other embodiments described later.
  • Since the moving image represents a series of movements of a player, it also represents the player's form at each point in time.
  • A time segment in the present invention is a time segment defined with reference to the time point at which the moving image represents a specific motion of the player.
  • In this example, the time point at which the moving image represents the take-off motion is used as the reference, and this reference time is set to zero.
  • Times before the reference (time 0) are expressed as negative, and times after the reference are expressed as positive.
  • A time segment can then be defined by setting a start time and an end time, with the time at which the moving image represents the take-off motion as time 0.
  • A plurality of time segments are determined in advance.
  • The lengths of the time segments may differ from one another, and the time segments may share a common time zone.
  • FIG. 3 is a schematic diagram showing an example of a plurality of time segments.
  • In FIG. 3, a plurality of time segments a to g are shown.
  • “a” to “g” in FIG. 3 are time segment identification information.
  • The time segment a is the range from time “−2.5” to time “−2.0” (see FIG. 3).
  • The time segment b is the range from time “−2.5” to time “−1.5” (see FIG. 3).
  • Start and end times are likewise determined for the other time segments c, d, e, and so on.
  • In this example, the start time of each time segment is “−2.5”, but this is only an example.
  • The start time of each time segment may be earlier or later than “−2.5”.
  • In this example, the start time of each time segment is common, but the start times need not be common.
  • For example, the time segments may be determined to be contiguous, without sharing a common time zone with one another: the range from time “−2.5” to time “−2.0” may be set as time segment a, the range from time “−2.0” to time “−1.5” as time segment b, and the range from time “−1.5” to time “−1.0” as time segment c.
  • Alternatively, the time segments may be determined so that their start times differ while their end times are the same.
  • In that case, the reference time “0” may be set as the common end time.
  • The plurality of time segments are determined so that they can be ordered based on time.
  • In the example shown in FIG. 3, the start time of each time segment is common and the end times differ, so the time segments can be ordered as a, b, c, d, e, f, g in order of their end times.
  • In the following, the case where the time segments are determined so as to be ordered based on time will be described as an example.
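The segment layout of FIG. 3 can be sketched as follows. This is a minimal illustration, assuming a common start time of −2.5 s and end times spaced 0.5 s apart; the actual end times of segments c through g are not stated in the text and are assumed here.

```python
# Each time segment is (identifier, start_time, end_time), relative to the
# reference time 0 (the take-off motion). End times beyond segment b are
# assumptions for illustration.
TIME_SEGMENTS = [
    ("a", -2.5, -2.0),
    ("b", -2.5, -1.5),
    ("c", -2.5, -1.0),
    ("d", -2.5, -0.5),
    ("e", -2.5, 0.0),
    ("f", -2.5, 0.5),
    ("g", -2.5, 1.0),
]

def ordered_by_end_time(segments):
    """Order segments by end time, as done before displaying them."""
    return sorted(segments, key=lambda seg: seg[2])

print([s[0] for s in ordered_by_end_time(TIME_SEGMENTS)])
```

Because all segments share the start time, ordering by end time reproduces the order a through g used in the text.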
  • The time segment image extraction unit 3 identifies, for each piece of image data stored in the data storage unit 2, the range corresponding to each of the plurality of predetermined time segments. The time segment image extraction unit 3 then extracts still images from the range corresponding to each time segment and generates, for each time segment of each piece of image data, a set of still images.
  • In the following, the first piece of image data is referred to as image data #1, and an arbitrary n-th piece as image data #n.
  • For example, the time segment image extraction unit 3 specifies, for each time segment, the range in image data #1 corresponding to that time segment: the range corresponding to time segment a, the range corresponding to time segment b, and so on. It then extracts still images from the range corresponding to time segment a in image data #1 and generates the set of still images corresponding to time segment a in image data #1.
  • When extracting still images from moving-image data, the time segment image extraction unit 3 may extract a still image at predetermined time intervals.
  • For example, it may extract a still image from the range corresponding to time segment a every 0.1 seconds.
  • Although 0.1 seconds is given as an example, the predetermined interval is not limited to 0.1 seconds.
  • In this way, the time segment image extraction unit 3 extracts still images from the range corresponding to each time segment in image data #1 and generates a set of still images for each time segment.
  • The time segment image extraction unit 3 performs the same processing on each piece of image data other than image data #1, generating a set of still images for each time segment in each piece of image data.
  • The set of still images obtained for each time segment in each piece of image data represents the player's motion and form in the corresponding time segment.
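One way the extraction step could map a time segment to concrete video frames is sketched below. This assumes the frame index at which the reference motion (the take-off) appears is already known, and samples stills every 0.1 s as in the text; the function name and parameters are illustrative, not from the patent.

```python
def frames_for_segment(fps, reference_frame, start_time, end_time, step=0.1):
    """Return the frame indices covering [start_time, end_time], where times
    are in seconds relative to the reference motion at time 0, sampled every
    `step` seconds."""
    indices = []
    t = start_time
    while t <= end_time + 1e-9:  # tolerance for floating-point drift
        indices.append(reference_frame + round(t * fps))
        t += step
    return indices

# Time segment a (time -2.5 to -2.0) at 30 fps, take-off at frame 300:
print(frames_for_segment(30, 300, -2.5, -2.0))
# → [225, 228, 231, 234, 237, 240]
```

At 30 fps a 0.1 s sampling interval corresponds to every third frame, so a 0.5 s segment yields six stills including both endpoints.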
  • FIG. 4 is an explanatory diagram showing identification information of a set of still images obtained for each time segment in each image data.
  • For example, the set of still images extracted from time segment a in image data #1 is represented as “#1, a” (see FIG. 4).
  • the identification information of the other set of still images is also expressed by the same rule.
  • FIG. 4 also shows results corresponding to each image data.
  • The learning unit 4 learns (in other words, generates), for each of the plurality of time segments, a model representing the relationship between the motion in that time segment and the result (jump distance) corresponding to the motion.
  • This model represents the relationship between the form in a time segment (the form represented by the set of still images corresponding to that time segment) and the result.
  • Specifically, the learning unit 4 learns a model in which the set of still images representing the form is the explanatory variable and the result is the objective variable. A predicted value of the result can therefore be calculated using the learned model and a set of still images.
  • The learning unit 4 determines, for each piece of image data, the combination of the set of still images corresponding to the time segment of interest and the result, and, using the combinations determined for the individual pieces of image data as learning data, learns the model corresponding to that time segment by machine learning.
  • The machine learning algorithm may be any algorithm that can learn a model for calculating a predicted value of the result as described above.
  • For example, the learning unit 4 learns the model corresponding to time segment a as follows.
  • The learning unit 4 uses as learning data the combination of the set of still images “#1, a” and the result “5.8 m”, the combination of “#2, a” and “6.5 m”, the combination of “#3, a” and “7.1 m”, the combination of “#4, a” and “6.2 m”, and so on (see FIG. 4), and learns a model indicating the relationship between the form in time segment a (the set of still images corresponding to time segment a) and the result.
  • The learning unit 4 similarly learns the models corresponding to the other time segments b, c, d, and so on.
  • A model is thus obtained for each time segment by the processing of the learning unit 4.
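The per-segment learning step can be sketched as follows. The patent leaves the algorithm open, so this sketch stands in a simple ordinary-least-squares regression: each trial's still-image set is reduced to a placeholder feature vector, and one model is fitted per time segment. The feature values are invented for illustration; only the results (5.8 m, 6.5 m, 7.1 m, 6.2 m) come from FIG. 4.

```python
import numpy as np

def learn_segment_model(features, results):
    """Fit result ≈ features @ w + intercept for one time segment."""
    X = np.hstack([features, np.ones((len(features), 1))])  # append intercept column
    w, *_ = np.linalg.lstsq(X, np.asarray(results, float), rcond=None)
    return w

def predict(w, x):
    """Apply a learned segment model to one feature vector."""
    return float(np.append(x, 1.0) @ w)

# Toy learning data for time segment a: one (hypothetical) feature per trial,
# paired with the jump distances of FIG. 4.
features = np.array([[0.2], [0.5], [0.8], [0.4]])
results = [5.8, 6.5, 7.1, 6.2]
models = {"a": learn_segment_model(features, results)}
print(predict(models["a"], np.array([0.5])))
```

In a real system the features would be derived from the still images themselves (e.g., pose or frame-difference features), and a separate model would be stored for each of the segments a through g.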
  • The prediction unit 5 calculates, for each piece of image data, the predicted value of the result using the model corresponding to each time segment.
  • For example, the prediction unit 5 calculates a predicted value of the result for image data #1 using the model corresponding to time segment a.
  • Specifically, the prediction unit 5 calculates the predicted value by applying the set of still images “#1, a” corresponding to time segment a in image data #1 to the model corresponding to time segment a.
  • The prediction unit 5 similarly calculates a predicted value of the result for image data #1 based on the model corresponding to each of the other time segments. That is, the prediction unit 5 calculates a predicted value for image data #1 from each model.
  • In other words, the prediction unit 5 calculates as many predicted values for one piece of image data as there are time segments.
  • The prediction unit 5 similarly calculates predicted values for each piece of image data other than image data #1.
  • If there are N pieces of image data, the prediction unit 5 calculates N predicted values corresponding to time segment a, and likewise N predicted values corresponding to each of the other time segments.
  • The evaluation unit 6 calculates, for each time segment, the prediction accuracy using the plurality of predicted values corresponding to that time segment (one per piece of image data) and the true values of the result. In other words, the evaluation unit 6 evaluates the prediction accuracy of the model for each time segment.
  • The true value of the result is the result value stored in the data storage unit 2 in association with the image data.
  • For example, the evaluation unit 6 may calculate, for each time segment, the average of the values obtained by dividing the predicted value of the result by the true value of the result. This value can be regarded as representing the prediction accuracy. Specifically, the evaluation unit 6 divides the predicted value calculated from time segment a of image data #1 by the result value (true value) associated with image data #1, performs the same division for each of the other pieces of image data, and calculates the average of these division results as the prediction accuracy corresponding to time segment a (more precisely, the prediction accuracy of the model corresponding to time segment a). The evaluation unit 6 calculates the prediction accuracy corresponding to each of the other time segments in the same manner.
  • The method of calculating the prediction accuracy is not limited to the above example.
  • For example, the evaluation unit 6 may calculate, for each time segment, the average of the values obtained by dividing the absolute value of the difference between the predicted value and the true value by the true value. This value can also be regarded as representing the prediction accuracy.
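The two accuracy measures described above can be written directly. The predicted values below are invented for illustration; the true values are the jump distances of FIG. 4.

```python
def accuracy_ratio(predicted, true):
    """First measure: mean of predicted/true over all image data.
    Values near 1.0 indicate good accuracy."""
    return sum(p / t for p, t in zip(predicted, true)) / len(true)

def mean_relative_error(predicted, true):
    """Second measure: mean of |predicted - true| / true. Smaller is better."""
    return sum(abs(p - t) / t for p, t in zip(predicted, true)) / len(true)

true = [5.8, 6.5, 7.1, 6.2]                # true results (jump distances, m)
predicted_seg_a = [6.0, 6.3, 7.0, 6.4]     # hypothetical predictions of segment a's model
print(round(accuracy_ratio(predicted_seg_a, true), 3))
print(round(mean_relative_error(predicted_seg_a, true), 3))
```

Running the same calculation once per time segment yields the per-segment accuracy values that the display unit plots.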
  • The display unit 7 displays the relationship between each time segment and the prediction accuracy corresponding to it (more precisely, the prediction accuracy of the model corresponding to the time segment).
  • For example, the display unit 7 may display time information as text in association with the prediction accuracy of the model corresponding to that time information.
  • In the following, the case where the display unit 7 displays the relationship between the time segments and the prediction accuracy of the corresponding models as a graph, which is easier for a viewer to understand, will be described.
  • FIG. 5 is an explanatory diagram illustrating an example of a graph displayed by the display unit 7, and illustrates a graph indicating a relationship between a time segment and a prediction accuracy of a model corresponding to the time segment.
  • the vertical axis of the graph shown in FIG. 5 represents the prediction accuracy of the model.
  • the horizontal axis of the graph represents time segments.
  • In this example, the start time of each time segment is common and the end times differ, so the time segments can be ordered by end time.
  • The display unit 7 orders the time segments by end time and displays the identification information of each time segment along the horizontal axis in that order (see FIG. 5).
  • The display unit 7 then displays a graph showing the change in the prediction accuracy corresponding to each time segment, as illustrated in FIG. 5.
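A minimal text rendering of such a graph is sketched below: segments ordered by end time on one axis, model accuracy on the other. A real display unit would use a plotting library; the accuracy values here are invented to echo the shape of FIG. 5 discussed later (a jump between segments b and c).

```python
def render_accuracy_chart(accuracies, width=40):
    """Render one horizontal bar per time segment, in the given order."""
    lines = []
    for seg_id, acc in accuracies:
        bar = "#" * int(acc * width)
        lines.append(f"{seg_id} | {bar} {acc:.2f}")
    return "\n".join(lines)

# Hypothetical per-segment model accuracies, ordered by segment end time.
accuracies = [("a", 0.52), ("b", 0.55), ("c", 0.78),
              ("d", 0.80), ("e", 0.81), ("f", 0.81), ("g", 0.82)]
print(render_accuracy_chart(accuracies))
```

The sharp lengthening of the bar from b to c is the visual cue the user looks for: the time zone added by segment c is where the form strongly influences the result.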
  • The time segment image extraction unit 3, the learning unit 4, the prediction unit 5, the evaluation unit 6, and the display unit 7 are realized by, for example, the CPU (Central Processing Unit) of a computer having a display device (not shown in FIG. 1).
  • For example, the CPU may read a sports motion analysis support program from a program recording medium such as a program storage device of the computer (not shown in FIG. 1) and operate as the time segment image extraction unit 3, the learning unit 4, the prediction unit 5, the evaluation unit 6, and the display unit 7 according to that program.
  • Of the display unit 7, the part that determines the display contents (for example, a graph) and causes them to be displayed is realized by the CPU, while the part that actually performs the display is realized by the display device.
  • the computer may be a personal computer or a portable computer such as a smartphone. These points are the same in other embodiments described later.
  • the sports motion analysis support system 1 may have a configuration in which two or more physically separated devices are connected by wire or wirelessly.
  • the sports motion analysis support system 1 may be realized as a system in which a portable computer such as a smartphone and a server cooperate. This also applies to other embodiments described later.
  • FIG. 6 is a flowchart showing an example of processing progress of the first embodiment of the present invention. Since the details of the operation of each step shown below have already been described, detailed description thereof will be omitted here.
  • The time segment image extraction unit 3 identifies, for each piece of image data stored in the data storage unit 2, the range corresponding to each of the plurality of predetermined time segments, extracts still images from the range corresponding to each time segment, and generates a set of still images for each time segment of each piece of image data (step S1).
  • Next, the learning unit 4 learns, by machine learning, for each of the plurality of time segments, a model representing the relationship between the form in that time segment and the result (step S2).
  • Next, the prediction unit 5 calculates the predicted value of the result for each piece of image data using the model corresponding to each time segment (step S3).
  • Next, the evaluation unit 6 calculates, for each time segment, the prediction accuracy of the corresponding model using the plurality of predicted values corresponding to the time segment and the true values of the result (step S4).
  • Next, the display unit 7 displays the relationship between the time segments and the prediction accuracy of the corresponding models (step S5).
  • In step S5, the display unit 7 may display time information as text in association with the prediction accuracy of the model corresponding to that time information.
  • In the following description, it is assumed that the display unit 7 displays the relationship between the time segments and the prediction accuracy of the corresponding models as a graph, as illustrated in FIG. 5.
  • A model corresponding to one time segment is generated using a plurality of combinations of a set of still images and a result. Therefore, if the prediction accuracy of the model is good, the predicted value calculated by the model can be said to statistically represent the tendency of the result according to the form.
  • The evaluation unit 6 calculates the prediction accuracy of the model corresponding to each time segment.
  • The display unit 7 then displays the relationship between the time segments and the prediction accuracy of the corresponding models.
  • The user of the sports motion analysis support system 1 (the player whose image data and results are stored in the data storage unit 2, or that player's coach) can thus check the prediction accuracy for each time segment ordered based on time and, from the change in the prediction accuracy, can grasp the time zone in which the degree of improvement in the prediction accuracy of the results predicted using the models is large. As described above, if the prediction accuracy of a model is good, the predicted values obtained using the model statistically represent the tendency of the result according to the form. Being able to grasp the time zone in which the prediction accuracy improves greatly therefore means being able to grasp the time zone in which the form has a great influence on the result.
  • In this way, this embodiment can assist the user in grasping the time zone in which the form has a great influence on the result.
  • In the example shown in FIG. 5, the prediction accuracy of the models corresponding to time segments a and b increases only slightly from one to the next.
  • In contrast, the degree of improvement of the prediction accuracy of the model corresponding to time segment c over that of the model corresponding to time segment b is large. The user can therefore recognize the time zone corresponding to the difference between time segments c and b (specifically, the time zone from time “−1.5” to time “−1.0”; see FIG. 3) as a time zone in which the degree of improvement in the prediction accuracy of the results predicted using the models is large. The user can thus grasp that this time zone is one in which, among the times during which the long jump is performed, the form has a great influence on the result.
  • By focusing on and checking the form in that time zone (in this example, the time zone from time “−1.5” to time “−1.0”), the user can contribute to improving the player's performance.
  • Embodiment 2.
  • The sports motion analysis support system 1 according to the first embodiment displays, for example, the graph illustrated in FIG. 5, so that the user can grasp the time zone in which the degree of improvement in the prediction accuracy of results predicted using the models is large.
  • The sports motion analysis support system according to the second embodiment of the present invention specifies and displays a time segment from which such a time zone can be identified.
  • the sports motion analysis support system according to the second embodiment of the present invention can be represented by the blocks shown in FIG. 1 similarly to the sports motion analysis support system 1 according to the first embodiment.
  • In the following, the second embodiment will be described, omitting as appropriate explanations of matters similar to those in the first embodiment.
  • The data storage unit 2, the time segment image extraction unit 3, the learning unit 4, and the prediction unit 5 are the same as those of the first embodiment, and their description is omitted.
  • the evaluation unit 6 of the second embodiment performs the same operation as the evaluation unit 6 of the first embodiment, and further performs the following operations.
  • the evaluation unit 6 identifies a time segment that can identify a time zone in which the degree of improvement in prediction accuracy of the results predicted using the model is large.
  • The evaluation unit 6 orders the time segments by time and calculates the difference in prediction accuracy between the models corresponding to each pair of adjacent time segments. For the pair of time segments a and b, the evaluation unit 6 calculates the value obtained by subtracting the prediction accuracy of the model corresponding to time segment a from that of the model corresponding to time segment b. For the other adjacent pairs, such as the pair of time segments b and c and the pair of time segments c and d, the evaluation unit 6 likewise subtracts the prediction accuracy of the model corresponding to the earlier time segment from that of the model corresponding to the later time segment.
  • The evaluation unit 6 may then identify the later time segment of the pair with the largest difference in prediction accuracy as the time segment from which a time zone with a large degree of improvement in prediction accuracy can be identified. For example, suppose that, among the differences in prediction accuracy, the difference for the pair of time segments b and c is the largest. In this case, the evaluation unit 6 identifies time segment c as the time segment from which such a time zone can be identified.
  • Alternatively, for any adjacent pair whose difference in prediction accuracy exceeds a threshold, the later time segment may be identified as a time segment from which a time zone with a large degree of improvement in prediction accuracy can be identified. The threshold value may be determined in advance.
  • Note that the method of identifying a time segment from which a time zone with a large degree of improvement in the prediction accuracy of results predicted using a model can be identified is not limited to the above; another method may be used.
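The adjacent-pair comparison described above can be sketched as follows. This is only an illustrative sketch: the function name, segment labels, and accuracy values are assumptions for illustration and are not taken from the application.

```python
# Hypothetical sketch: order the time segments by time, compute the difference
# in prediction accuracy between each adjacent pair (later minus earlier), and
# return the later segment of the pair with the largest difference.

def segment_with_largest_gain(segments, accuracies):
    """segments: time-ordered segment labels; accuracies: matching accuracies."""
    best_gain, best_segment = None, None
    for prev, curr, prev_acc, curr_acc in zip(
            segments, segments[1:], accuracies, accuracies[1:]):
        gain = curr_acc - prev_acc  # later segment minus earlier segment
        if best_gain is None or gain > best_gain:
            best_gain, best_segment = gain, curr
    return best_segment, best_gain

# Illustrative values: the jump from segment b to segment c is the largest,
# so segment c is identified.
segments = ["a", "b", "c", "d"]
accuracies = [0.52, 0.58, 0.81, 0.83]
seg, gain = segment_with_largest_gain(segments, accuracies)
print(seg, round(gain, 2))  # → c 0.23
```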
  • the display unit 7 displays the time division specified by the evaluation unit 6.
  • FIG. 7 is a flowchart showing an example of processing progress of the second embodiment of the present invention. Steps S1 to S4 (see FIG. 7) in the second embodiment are the same as steps S1 to S4 (see FIG. 6) in the first embodiment, and a description thereof will be omitted.
  • After step S4, the evaluation unit 6 identifies a time segment from which a time zone in which the degree of improvement in the prediction accuracy of results predicted using the model is large can be identified (step S11). An example of this operation of the evaluation unit 6 has already been described, so its description is omitted here.
  • After step S11, the display unit 7 displays the time segment identified in step S11 (step S12).
  • FIG. 8 is an explanatory diagram showing an example of the display mode of the time segment in step S12.
  • the display unit 7 displays an icon 21 representing the time segment identification information for each time segment.
  • the display unit 7 also displays the start time and end time of the time segment.
  • The display unit 7 displays the icon 21 of the time segment identified in step S11 (in this example, the icon 21 of time segment c) in a form different from the icons 21 of the other time segments.
  • In the example shown in FIG. 8, the display unit 7 indicates that time segment c is the time segment identified in step S11 by displaying the frame line of its icon 21 thicker than the frame lines of the other icons 21.
  • FIG. 8 is an example of the display mode in step S12, and the display mode in step S12 is not limited to the example shown in FIG.
  • As described above, in the second embodiment the evaluation unit 6 identifies a time segment from which a time zone in which the degree of improvement in the prediction accuracy of results predicted using a model is large can be identified, and the display unit 7 displays that time segment. Therefore, as in the first embodiment, the user can grasp the time zone in which the form has a great influence on the result during the long jump. For example, suppose the evaluation unit 6 identifies time segment c in step S11 and the display unit 7 displays time segment c. The user can then recognize the time zone corresponding to the difference between time segment c and the preceding time segment b (specifically, the time zone from time "−1.5" to time "−1.0"; see FIG. 3) as a time zone in which the degree of improvement in the prediction accuracy of results predicted using the model is large.
  • The user can therefore grasp that, of the time during which the long jump is performed, this time zone is one in which the form has a great influence on the result.
  • In step S12, the display unit 7 may also perform the operation of step S5 of the first embodiment. That is, the display unit 7 may display the time segment identified in step S11 together with the relationship between the time segments and the prediction accuracy of the corresponding models. For example, the display unit 7 may display time segment c in the manner illustrated in FIG. 8 as well as the graph illustrated in FIG. 5.
  • FIG. 9 is a block diagram showing one modification of the second embodiment.
  • the sports motion analysis support system 1 shown in FIG. 9 further includes an operation unit 8 in addition to the components shown in FIG.
  • The operation up to step S12 is the same as that described in the second embodiment or the modification described above.
  • A case where, in step S12, the display unit 7 displays the icon 21 of each time segment in the manner illustrated in FIG. 8 will be described as an example.
  • the operation unit 8 is a user interface for the user to specify a time segment, and is realized by, for example, a mouse.
  • The user operates the operation unit 8 to designate one of the icons 21 displayed as illustrated in FIG. 8.
  • the display unit 7 accepts designation of a time segment according to the operation.
  • Here, description will be made assuming that the time segment identified in step S11 is designated; for example, suppose that an operation such as a click is performed on the icon 21 of time segment c among the icons 21 shown in FIG. 8.
  • When the display unit 7 receives the designation of time segment c from the outside, it displays the prediction accuracy of the model corresponding to time segment c.
  • Furthermore, the display unit 7 identifies the image data for which the predicted value of the result calculated in step S3 using the model corresponding to the designated time segment (here, the model corresponding to time segment c) is the maximum, and the image data for which that predicted value is the minimum.
  • In other words, the display unit 7 identifies the image data predicted to have the best result and the image data predicted to have the worst result.
  • Here, the maximum and the minimum mean, respectively, the maximum and the minimum among the predicted values calculated using the model corresponding to the designated time segment c.
  • Then, the display unit 7 displays both the moving image in the range corresponding to the designated time segment c in the image data with the maximum predicted value, and the moving image in the range corresponding to time segment c in the image data with the minimum predicted value.
  • FIG. 10 is a schematic diagram illustrating an example of a screen displayed by the display unit 7 when a time segment is designated after step S12.
  • the prediction accuracy display column 31 is a column for displaying the prediction accuracy of the model corresponding to the designated time segment.
  • the first image display field 32 is a field for displaying a moving image in a range corresponding to a designated time segment in the image data having the maximum predicted value.
  • the second image display field 32 is a field for displaying a moving image in a range corresponding to a designated time segment in the image data having the smallest predicted value.
  • In this modification, the display unit 7 displays the screen illustrated in FIG. 10.
  • In the above description, the display unit 7 identifies the image data having the maximum predicted value and the image data having the minimum predicted value; however, the present invention is not limited to this.
  • For example, the display unit 7 may identify the pieces of image data whose predicted values of the result, calculated using the model corresponding to the designated time segment, rank from first down to a predetermined upper rank, and the pieces of image data whose predicted values rank from last up to a predetermined lower rank.
  • The display unit 7 may then display, for each of these pieces of image data, the moving image in the range corresponding to the designated time segment.
  • Values representing the upper predetermined number and the lower predetermined number may be determined in advance.
  • the sports motion analysis support system 1 may be provided with an input device (user interface) for the user to input values representing the upper predetermined order and the lower predetermined order into the sport motion analysis support system 1.
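As a non-authoritative sketch of this selection, assuming the predicted values for the designated time segment have already been computed per piece of image data (the function name, image-data IDs, and scores below are hypothetical):

```python
# Hypothetical sketch: given a mapping from image-data ID to the predicted
# result under the model of the designated time segment, pick the top-k
# (best-predicted) and bottom-k (worst-predicted) pieces of image data so
# their per-segment videos can be displayed side by side.

def select_extremes(predictions, k=1):
    ranked = sorted(predictions, key=predictions.get, reverse=True)
    return ranked[:k], ranked[-k:]  # best-predicted IDs, worst-predicted IDs

predictions = {"jump01": 7.4, "jump02": 6.1, "jump03": 8.2, "jump04": 5.5}
best, worst = select_extremes(predictions, k=1)
print(best, worst)  # → ['jump03'] ['jump04']
```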
  • According to this modification, it is possible to display, for the time zone in which the form has a great influence on the result, both the moving image of the performance predicted to have the best result and the moving image of the performance predicted to have the worst result. By analyzing or comparing these moving images, the user can improve the form, find habits in the form, and so on.
  • the display unit 7 may perform the above display operation only when the time segment specified in step S11 is designated. In addition, when an arbitrary time segment is designated, the display unit 7 may perform the display operation according to the designated time segment.
  • In the embodiments described above, the result of the motion is a grade represented by a numerical value. However, the result of a motion in sports may instead be an event.
  • Here, a case where the present invention is applied to a PK (penalty kick) scene in soccer will be described as an example.
  • In this case, the result of the PK motion is one of two events: "the ball flew to the right" or "the ball flew to the left". Description of matters similar to those of the first embodiment, the second embodiment, and their modifications is omitted.
  • In this example, the result (event) of the PK motion is represented by a binary value.
  • The data storage unit 2 stores in advance a plurality of pieces of data in which moving-image data representing a series of motions of a person performing a PK (hereinafter referred to as a player) is associated with the result of the motion (event "1" or event "0").
  • The result of this motion can be said to be, for example, a nominal scale.
  • a plurality of time segments may be determined based on the time point when the video represents the player's kicking motion.
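One hedged sketch of how such reference-relative time segments could be mapped to video frames is the following; the frame rate, frame indices, and function name are illustrative assumptions, not details from the application.

```python
# Hypothetical sketch: map a time segment, given in seconds relative to a
# reference time (here, the frame at which the kick occurs), to the frame
# indices of the video, clamped to the valid frame range.

def frames_in_segment(ref_frame, fps, start_s, end_s, n_frames):
    """Frame indices covering [start_s, end_s) seconds around ref_frame."""
    lo = max(0, ref_frame + int(round(start_s * fps)))
    hi = min(n_frames, ref_frame + int(round(end_s * fps)))
    return list(range(lo, hi))

# e.g. the time zone from time "-1.5" to time "-1.0" before a kick at frame 120
frames = frames_in_segment(ref_frame=120, fps=30, start_s=-1.5, end_s=-1.0,
                           n_frames=300)
print(frames[0], frames[-1], len(frames))  # → 75 89 15
```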
  • the learning unit 4 learns, for each of a plurality of time segments, a model representing the relationship between the operation in the time segment and the probability that the event “1” occurs or the event “0” occurs.
  • The probability that event "1" occurs or that event "0" occurs is represented by a single objective variable, whose range is 0 to 1.
  • When the value of the objective variable is larger than 0.5, it represents the probability that event "1" (that is, the event "the ball flies to the right") occurs: the closer the value is to 1, the higher the probability that event "1" occurs, and the closer the value is to 0.5, the lower that probability.
  • the learning unit 4 may use, for example, logistic regression analysis as a machine learning algorithm.
  • Specifically, the learning unit 4 determines, for each piece of image data, the combination of the set of still images corresponding to the time segment of interest and the result ("1" or "0"), and may learn the model corresponding to that time segment by applying machine learning (for example, logistic regression analysis) to the combinations determined for the individual pieces of image data.
  • the learning unit 4 learns the model for each time segment.
  • the prediction unit 5 calculates a predicted value of the result for each image data using a model corresponding to each time segment.
  • the predicted value is the value of the objective variable and represents the probability that the event “1” will occur or the probability that the event “0” will occur.
  • The probability (the value of the objective variable) calculated by the prediction unit 5 is a continuous value in the range 0 to 1. It can therefore be said to be, for example, an ordinal scale.
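As an illustrative, non-authoritative sketch of this per-segment learning, a plain logistic regression trained by gradient descent could look as follows. The random placeholder features, sample counts, and helper names are assumptions; the application mentions logistic regression analysis as one example but does not prescribe any particular implementation.

```python
# Hypothetical sketch: learn, for each time segment, a logistic-regression
# model relating per-segment features (e.g. flattened still-image or pose
# features, extracted elsewhere) to the binary result, then compute the
# objective variable: the predicted probability of event "1" per sample.
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=500):
    """Fit logistic regression by batch gradient descent; returns (w, b)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def predict_proba(X, w, b):
    """Objective variable: predicted probability of event "1", in [0, 1]."""
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

rng = np.random.default_rng(0)
segments = ["a", "b", "c"]
# one placeholder feature row per piece of image data, per time segment
features = {s: rng.normal(size=(40, 6)) for s in segments}
events = rng.integers(0, 2, size=40)  # true results ("1" or "0")

models = {s: fit_logistic(features[s], events) for s in segments}
probabilities = {s: predict_proba(features[s], *models[s]) for s in segments}
```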
  • the evaluation unit 6 may perform the following processing.
  • the true value of the result is “1” or “0”, and is stored in the data storage unit 2 in association with the image data.
  • The evaluation unit 6 calculates the prediction accuracy of the model corresponding to time segment a as follows. Suppose the prediction unit 5 has calculated a predicted value for each piece of image data using the model corresponding to time segment a; the number of predicted values calculated for time segment a equals the number of pieces of image data. The evaluation unit 6 regards each predicted value as "1" or "0": if a predicted value is greater than 0.5, it is regarded as "1", and if it is smaller than 0.5, it is regarded as "0". The evaluation unit 6 then determines, for each predicted value regarded as "1" or "0", whether it matches the true value of the result, and counts the number of predicted values that match.
  • For example, if the true value is "1", the evaluation unit 6 regards the predicted value "0.8" as "1" and determines that it matches the true value "1". If, instead, the true value is "0", the evaluation unit 6 regards the predicted value "0.8" as "1" and determines that it does not match the true value "0".
  • Similarly, if the true value is "0", the evaluation unit 6 regards the predicted value "0.3" as "0" and determines that it matches the true value "0". If, instead, the true value is "1", the evaluation unit 6 regards the predicted value "0.3" as "0" and determines that it does not match the true value "1".
  • In this manner, the evaluation unit 6 regards each predicted value corresponding to the image data as "1" or "0" and counts the number of predicted values that match the true value. The evaluation unit 6 then takes the value obtained by dividing this count by the number of predicted values calculated for time segment a as the prediction accuracy of the model corresponding to time segment a. The value obtained by dividing the count by the number of predicted values can thus also be called the match rate.
  • Similarly, for each of the other time segments, the evaluation unit 6 calculates the prediction accuracy of the corresponding model.
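The match-rate computation just described can be sketched as follows; the predicted values and true results are illustrative.

```python
# Hypothetical sketch of the match rate: each predicted probability is
# regarded as event "1" if it exceeds 0.5 and as event "0" otherwise, and
# the prediction accuracy is the fraction of predictions matching the truth.

def match_rate(predicted_probs, true_events):
    hits = sum(
        (1 if p > 0.5 else 0) == t
        for p, t in zip(predicted_probs, true_events)
    )
    return hits / len(true_events)

probs = [0.8, 0.3, 0.6, 0.4]   # predicted values for one time segment
truth = [1, 0, 0, 0]           # true results stored with the image data
print(match_rate(probs, truth))  # → 0.75
```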
  • The result of the motion may be an event also in the case where the designation of a time segment is received from the outside after step S12.
  • For example, suppose that in step S12 the display unit 7 displays the icon 21 of each time segment in the manner illustrated in FIG. 8, and that time segment c is then designated. In this case, the display unit 7 displays the prediction accuracy of the model corresponding to time segment c.
  • the display unit 7 specifies image data having a maximum predicted value calculated using the model and image data having a minimum predicted value calculated using the model.
  • The higher the predicted value, the higher the probability that the ball flies to the right after the kick; the lower the value, the higher the probability that it flies to the left. That is, the display unit 7 identifies the image data of the form predicted to give the highest probability that the ball flies to the right after the kick, and the image data of the form predicted to give the highest probability that the ball flies to the left after the kick.
  • Then, the display unit 7 displays both the moving image in the range corresponding to the designated time segment c in the image data with the maximum predicted value, and the moving image in the range corresponding to time segment c in the image data with the minimum predicted value.
  • the screen displayed by the display unit 7 when the time division is designated may be the same as the screen illustrated in FIG.
  • The text information such as "score: good" and "score: bad" shown in FIG. 10 may be replaced with, for example, "ball travel direction: right" and "ball travel direction: left", respectively.
  • the sports motion analysis support system 1 may include a data acquisition unit that acquires data to be stored in the data storage unit 2 from the outside.
  • FIG. 11 is a block diagram illustrating a configuration example in the case where a data acquisition unit is provided.
  • The data storage unit 2, time segment image extraction unit 3, learning unit 4, prediction unit 5, evaluation unit 6, and display unit 7 are the same as those elements in the first embodiment and in the second embodiment and its modifications, and their description is omitted.
  • The data acquisition unit 9 acquires a plurality of pieces of data in which moving-image data representing a series of motions in sports is associated with the results of the motions, and stores them in the data storage unit 2.
  • For example, when such data is held in an external device, the data acquisition unit 9 may access the device, acquire the plurality of data from it, and store the data in the data storage unit 2.
  • The processing after the data acquisition unit 9 stores the plurality of data in the data storage unit 2 is the same as the processing described in the first embodiment, or in the second embodiment and its modifications.
  • the data acquisition unit 9 is realized by, for example, a CPU of a computer that operates according to a sports motion analysis support program.
  • The sports motions to which the present invention is applied are not limited to those described above.
  • For example, in the case of golf, the time point at which the ball is hit with the golf club may be set as the reference time.
  • Also, the data storage unit 2 may store data in which image data of a toss motion by a volleyball setter is associated with a result indicating whether the ball flew to the right or to the left.
  • In that case, the time point at which the setter tosses the ball may be set as the reference time.
  • In addition, the present invention can be applied to a pitcher's pitching motion in baseball, a rugby formation, an American football formation, and so on.
  • As described above, the present invention can be applied to the motions of various sports.
  • FIG. 12 is a schematic block diagram showing a configuration example of a computer according to each embodiment of the present invention.
  • the computer 1000 includes a CPU 1001, a main storage device 1002, an auxiliary storage device 1003, an interface 1004, a display device 1005, and an input device 1006.
  • the input device 1006 corresponds to the operation unit 8 illustrated in FIG.
  • the sports motion analysis support system 1 is implemented in a computer 1000.
  • The operation of the sports motion analysis support system 1 is stored in the auxiliary storage device 1003 in the form of a program (a sports motion analysis support program).
  • the CPU 1001 reads out the program from the auxiliary storage device 1003, develops it in the main storage device 1002, and executes the above processing according to the program.
  • the auxiliary storage device 1003 is an example of a tangible medium that is not temporary.
  • Other examples of the non-temporary tangible medium include a magnetic disk, a magneto-optical disk, a CD-ROM, a DVD-ROM, and a semiconductor memory connected via the interface 1004.
  • When this program is distributed to the computer 1000 via a communication line, the computer 1000 that has received the distribution may develop the program in the main storage device 1002 and execute the above processing.
  • the program may be for realizing a part of the above-described processing.
  • the program may be a differential program that realizes the above-described processing in combination with another program already stored in the auxiliary storage device 1003.
  • Part or all of each component of the sports motion analysis support system 1 may be realized by general-purpose or dedicated circuitry, processors, or combinations thereof. These may be configured by a single chip or by a plurality of chips connected via a bus. Part or all of each component may also be realized by a combination of the above-described circuitry and the like and a program.
  • When part or all of each component is realized by a plurality of information processing devices, circuits, and the like, the plurality of information processing devices, circuits, and the like may be arranged in a centralized manner or in a distributed manner.
  • The information processing devices, circuits, and the like may be realized in a form in which they are connected to one another via a communication network, such as a client-server system or a cloud computing system.
  • FIG. 13 is a block diagram showing an outline of the present invention.
  • the sports motion analysis support system of the present invention includes a data storage unit 2, a learning unit 4, and an evaluation unit 6.
  • the data storage unit 2 stores a plurality of pieces of data in which image data of moving images representing a series of motions in sports and the motion results are associated with each other.
  • Using the plurality of data, the learning unit 4 learns, for each of a plurality of time segments determined based on a time point representing a predetermined motion, a model representing the relationship between the motion in the time segment and the result corresponding to the motion.
  • the evaluation unit 6 calculates the prediction accuracy of the result predicted using the model for each time segment.
  • FIG. 14 is another block diagram showing the outline of the present invention.
  • the sports motion analysis support system of the present invention may include a data storage unit 2, a learning unit 4, and a specifying unit 16.
  • the data storage unit 2 and the learning unit 4 are the same as the data storage unit 2 and the learning unit 4 shown in FIG.
  • the specifying unit 16 (for example, the evaluation unit 6 in the second embodiment) specifies a time segment in which a time zone in which the degree of improvement in prediction accuracy of a result predicted using a model is large can be specified.
  • A sports motion analysis support system comprising: a data storage unit that stores a plurality of pieces of data in which image data of a moving image representing a series of motions in sports is associated with the results of the motions; a learning unit that, using the plurality of data, learns, for each of a plurality of time segments determined based on a time point representing a predetermined motion, a model representing the relationship between the motion in the time segment and the result corresponding to the motion; and an evaluation unit that calculates, for each time segment, the prediction accuracy of a result predicted using the model.
  • The sports motion analysis support system according to any one of Supplementary Note 1 to Supplementary Note 3, wherein the display unit displays the prediction accuracy of a model corresponding to a time segment designated from the outside, identifies a predetermined number of pieces of image data based on the predicted values predicted using the model, and displays the moving image of the time segment in each identified piece of image data.
  • The sports motion analysis support system according to any one of Supplementary Note 1 to Supplementary Note 4, wherein the result of the motion associated with the image data is a numerical value indicating a grade, and the learning unit learns, for each time segment, a model representing the relationship between the motion in the time segment and the numerical value indicating the result.
  • The sports motion analysis support system according to any one of Supplementary Note 1 to Supplementary Note 4, wherein the result of the motion associated with the image data is an event, and the learning unit learns, for each time segment, a model representing the relationship between the motion in the time segment and the probability that the event occurs.
  • A sports motion analysis support system comprising: a data storage unit that stores a plurality of pieces of data in which image data of a moving image representing a series of motions in sports is associated with the results of the motions; a learning unit that, using the plurality of data, learns, for each of a plurality of time segments determined based on a time point representing a predetermined motion, a model representing the relationship between the motion in the time segment and the result corresponding to the motion; and a specifying unit that specifies a time segment from which a time zone in which the degree of improvement in the prediction accuracy of a result predicted using the model is large can be identified.
  • A sports motion analysis support method in which a computer, comprising a data storage unit that stores a plurality of pieces of data in which image data of a moving image representing a series of motions in sports is associated with the results of the motions, learns, using the plurality of data and for each of a plurality of time segments determined based on a time point representing a predetermined motion, a model representing the relationship between the motion in the time segment and the result corresponding to the motion, and calculates, for each time segment, the prediction accuracy of a result predicted using the model.
  • A sports motion analysis support method in which a computer, comprising a data storage unit that stores a plurality of pieces of data in which image data of a moving image representing a series of motions in sports is associated with the results of the motions, learns, using the plurality of data and for each of a plurality of time segments determined based on a time point representing a predetermined motion, a model representing the relationship between the motion in the time segment and the result corresponding to the motion, and specifies a time segment from which a time zone in which the degree of improvement in the prediction accuracy of a result predicted using the model is large can be identified.
  • A sports motion analysis support program installed in a computer comprising a data storage unit that stores a plurality of pieces of data in which image data of a moving image representing a series of motions in sports is associated with the results of the motions, the program causing the computer to execute:
  • a learning process of learning, using the plurality of data and for each of a plurality of time segments determined based on a time point representing a predetermined motion, a model representing the relationship between the motion in the time segment and the result corresponding to the motion; and
  • an evaluation process of calculating, for each time segment, the prediction accuracy of a result predicted using the model.
  • A sports motion analysis support program installed in a computer comprising a data storage unit that stores a plurality of pieces of data in which image data of a moving image representing a series of motions in sports is associated with the results of the motions, the program causing the computer to execute:
  • a learning process of learning, using the plurality of data and for each of a plurality of time segments determined based on a time point representing a predetermined motion, a model representing the relationship between the motion in the time segment and the result corresponding to the motion; and
  • a specifying process of specifying a time segment from which a time zone in which the degree of improvement in the prediction accuracy of a result predicted using the model is large can be identified.
  • the present invention can be suitably applied to a sports motion analysis support system that supports motion analysis in sports.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

In the present invention, a data storage unit 2 stores a plurality of pieces of video data showing a series of motions in a sport, associated with the results of the motions. A learning unit 4 uses the plurality of pieces of data to learn, for each of a plurality of time intervals defined with reference to a point in time representing a predetermined motion, a model representing the relationship between the motions in the time interval and the results corresponding to those motions. An evaluation unit 6 calculates, for each time interval, the prediction accuracy of the results predicted using the model.
PCT/JP2016/088884 2016-12-27 2016-12-27 Système, procédé et programme de support d'analyse de mouvement sportif Ceased WO2018122956A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2016/088884 WO2018122956A1 (fr) 2016-12-27 2016-12-27 Système, procédé et programme de support d'analyse de mouvement sportif
JP2018558560A JP6677319B2 (ja) 2016-12-27 2016-12-27 スポーツ動作解析支援システム、方法およびプログラム

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/088884 WO2018122956A1 (fr) 2016-12-27 2016-12-27 Système, procédé et programme de support d'analyse de mouvement sportif

Publications (1)

Publication Number Publication Date
WO2018122956A1 true WO2018122956A1 (fr) 2018-07-05

Family

ID=62707143

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/088884 Ceased WO2018122956A1 (fr) 2016-12-27 2016-12-27 Système, procédé et programme de support d'analyse de mouvement sportif

Country Status (2)

Country Link
JP (1) JP6677319B2 (fr)
WO (1) WO2018122956A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021225007A1 (fr) * 2020-05-08 2021-11-11 株式会社電通 Système de prédiction de résultat
JPWO2022049694A1 (fr) * 2020-09-03 2022-03-10
WO2022230504A1 (fr) * 2021-04-28 2022-11-03 オムロン株式会社 Dispositif d'amélioration de mouvement, procédé d'amélioration de mouvement, programme d'amélioration de mouvement et système d'amélioration de mouvement

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7344510B2 (ja) 2019-11-05 2023-09-14 テンソル・コンサルティング株式会社 動作分析システム、動作分析方法、および動作分析プログラム
KR102284802B1 (ko) * 2020-10-06 2021-08-02 김세원 스포츠 경기에 관련된 선수의 상태 정보를 제공하는 장치 및 방법

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080312010A1 (en) * 2007-05-24 2008-12-18 Pillar Vision Corporation Stereoscopic image capture with performance outcome prediction in sporting environments
WO2015080063A1 (fr) * 2013-11-27 2015-06-04 Nikon Corporation Electronic device
JP2015119833A (ja) * 2013-12-24 2015-07-02 Casio Computer Co., Ltd. Exercise support system, exercise support method, and exercise support program

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021225007A1 (fr) * 2020-05-08 2021-11-11 Dentsu Inc. Result prediction system
JP2021177316A (ja) * 2020-05-08 2021-11-11 Dentsu Inc. Win/loss prediction system
CN113939837A (zh) * 2020-05-08 2022-01-14 Dentsu Inc. Win/loss prediction system
JP7078667B2 (ja) 2020-05-08 2022-05-31 Dentsu Inc. Win/loss prediction system
JPWO2022049694A1 (fr) * 2020-09-03 2022-03-10
WO2022049694A1 (fr) * 2020-09-03 2022-03-10 Nippon Telegraph and Telephone Corporation Training device, estimation device, training method, and training program
JP7393701B2 (ja) 2020-09-03 2023-12-07 Nippon Telegraph and Telephone Corporation Learning device, estimation device, learning method, and learning program
WO2022230504A1 (fr) * 2021-04-28 2022-11-03 Omron Corporation Motion improvement device, motion improvement method, motion improvement program, and motion improvement system
JP2022170184A (ja) * 2021-04-28 2022-11-10 Omron Corporation Motion improvement device, motion improvement method, motion improvement program, and motion improvement system
JP7622542B2 (ja) 2021-04-28 2025-01-28 Omron Corporation Motion improvement device, motion improvement method, motion improvement program, and motion improvement system

Also Published As

Publication number Publication date
JP6677319B2 (ja) 2020-04-08
JPWO2018122956A1 (ja) 2019-03-28

Similar Documents

Publication Publication Date Title
US12370429B2 (en) Computer vision and artificial intelligence applications for performance evaluation and/or skills development
US11967086B2 (en) Player trajectory generation via multiple camera player tracking
Blank et al. Sensor-based stroke detection and stroke type classification in table tennis
US10803762B2 (en) Body-motion assessment device, dance assessment device, karaoke device, and game device
JP6704606B2 Determination system and determination method
JP6677319B2 Sports motion analysis support system, method and program
US20210170230A1 (en) Systems and methods for training players in a sports contest using artificial intelligence
CN112204570B (zh) 动态地确定区域
US10664691B2 (en) Method and system for automatic identification of player
JP6653423B2 Play segment extraction method and play segment extraction device
Tanaka et al. Automatic edge error judgment in figure skating using 3D pose estimation from a monocular camera and IMUs
JP6677320B2 Sports motion analysis support system, method and program
JP6760610B2 Position measurement system and position measurement method
CN119478791A Player and ball tracking detection method and system, electronic device, and storage medium
US20220343649A1 (en) Machine learning for basketball rule violations and other actions
JP7740693B2 Tactical analysis device, control method therefor, and control program
RU2599699C1 Method for recording and analyzing competitive game actions of athletes
CN110314368B Billiards shot assistance method, apparatus, device, and readable medium
Kuruppu et al. Comparison of different template matching algorithms in high speed sports motion tracking
JP2020054748A Play analysis device and play analysis method
US20250010132A1 (en) Evaluation apparatus, evaluation method, and program
Malawski et al. Automatic analysis of techniques and body motion patterns in sport
EP4325448A1 Data processing apparatus and method
JP2023178888A Determination device, determination method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16924989

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018558560

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16924989

Country of ref document: EP

Kind code of ref document: A1