
WO2025177697A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2025177697A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
information processing
processing device
athlete
athletes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2024/045826
Other languages
English (en)
Japanese (ja)
Inventor
Stefan Aust (シュテファン アウスト)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Communication Systems Ltd
Original Assignee
NEC Communication Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Communication Systems Ltd
Publication of WO2025177697A1 publication Critical patent/WO2025177697A1/fr

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • This disclosure relates to an information processing device, an information processing method, and a program.
  • Patent Document 1 describes a technology in which a device worn by the user provides feedback on the user's movements at the location where the device is worn.
  • the information processing device 1 is configured to acquire input data including at least one of image data, audio data, and haptic data, and generate control data to be supplied to one or more haptic devices worn by the athlete by referring to the input data. Therefore, the information processing device 1 has the effect of improving the athlete's play by providing feedback on the athlete's movements.
  • Fig. 2 is a flow diagram showing the flow of the information processing method S1.
  • the information processing method S1 includes an acquisition process (acquisition step) S11 and a generation process (generation step) S12.
  • the information processing method S1 is executed, as an example, by the information processing device 1 described above.
  • In step S11, the acquisition unit 11 of the information processing device 1 acquires input data including at least one of image data, audio data, and haptic data.
  • In step S12, the generation unit 12 of the information processing device 1 generates control data to be supplied to one or more haptic devices worn by the athlete, by referring to the input data acquired by the acquisition unit 11 in step S11.
  • information processing method S1 is configured to acquire input data including at least one of image data, audio data, and haptic data, and generate control data to be supplied to one or more haptic devices worn by the athlete by referring to the input data. Therefore, information processing method S1 has the effect of improving the athlete's play by providing feedback on the athlete's movements.
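  • The two-step structure of the information processing method S1 can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation; all names (`InputData`, `acquire`, `generate_control_data`) and the placeholder intensity rule are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class InputData:
    """Multimodal input: any subset of image, audio, and haptic data."""
    image: list = field(default_factory=list)
    audio: list = field(default_factory=list)
    haptic: list = field(default_factory=list)

def acquire(image=None, audio=None, haptic=None) -> InputData:
    """Step S11: acquire input data containing at least one modality."""
    data = InputData(image or [], audio or [], haptic or [])
    if not (data.image or data.audio or data.haptic):
        raise ValueError("input data must include at least one modality")
    return data

def generate_control_data(data: InputData, num_devices: int) -> list:
    """Step S12: generate control data for each haptic device worn by the
    athlete, by referring to the input data. Here a trivial placeholder
    derives one vibration intensity from the haptic samples."""
    intensity = min(1.0, sum(data.haptic) / (len(data.haptic) or 1))
    return [{"device": i, "intensity": intensity} for i in range(num_devices)]
```

  • As a usage example, `generate_control_data(acquire(haptic=[0.2, 0.4]), 2)` yields one control-data record per worn haptic device.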
  • the configuration according to this exemplary embodiment simultaneously records multimodal data consisting of audio, video, tactile data, etc., analyzes it using AI-based technology, and provides real-time tactile feedback to the athlete via a haptic device.
  • FIG. 3 is a block diagram showing the configuration of the information processing system 100A.
  • the information processing system 100A includes, as an example, an information processing device 1A, an information processing device 2A, a camera 16, a microphone 17, a haptic device 18, and an input/output device 19.
  • the information processing device 2A has the same configuration as the information processing device 1A described below.
  • the camera 16 captures an image of a subject and supplies image data including the subject in its angle of view to the information processing device 1A or other devices included in the information processing system 100A.
  • the camera 16 may be composed of one or more cameras installed in the competition area.
  • the camera 16 may be composed of a front camera that captures images of the competition area from the front, a rear camera that captures images of the competition area from the opposite side to the front camera, and other cameras that capture images of important locations in the competition area.
  • the camera 16 may include a camera attached to the body of an athlete, coach, or referee.
  • the camera 16 may include a camera attached to or above the head of an athlete, coach, or referee.
  • Such a camera is suitable for capturing images of the entire playing area, and by acquiring image data from such a camera, the information processing device 1A can record and analyze the entire playing area.
  • the microphone 17 collects sounds from the target area and supplies sound data containing the sounds to the information processing device 1A included in the information processing system 100A or other devices.
  • the microphone 17 may be composed of one or more microphones installed in the competition area.
  • the microphone 17 may be composed of a front microphone that collects sounds from the front of the competition area, a back microphone that collects sounds from the opposite side of the competition area from the front microphone, and other microphones that collect sounds from important locations in the competition area.
  • the microphone 17 may include a microphone attached to the body of an athlete, coach, or referee.
  • the microphone 17 may include a microphone attached to the head of an athlete, coach, or referee.
  • Such a microphone is suitable for collecting sounds from a specific subject, and by acquiring sound data from such a microphone, the information processing device 1A can suitably perform recording and analysis related to that subject.
  • the haptic device 18 is, for example, a device that includes components used for driving and generates a motion that can be recognized by an athlete wearing the device through the sense of touch.
  • the control data supplied to the haptic device 18 from the information processing device 1A included in the information processing system 100A or another device may be data used to control the haptic device 18.
  • the haptic device 18 may be worn on the body of an athlete or the like. Specifically, the haptic device 18 may be worn on the arm, hand, leg, finger, footwear, etc. of the athlete or the like.
  • the haptic device 18 further includes a tactile sensor capable of detecting contact with an object, and supplies tactile data, which is data relating to the contact of the object with the tactile sensor, to the information processing device 1A included in the information processing system 100A or to another device.
  • Such haptic devices are suitable for detecting objects that come into contact with athletes, etc., and by acquiring haptic data from such haptic devices, the information processing device 1A can suitably perform recording and analysis related to the athletes, etc.
  • the component used to drive the haptic device 18 may be a motor that generates vibrations.
  • the control data supplied to the haptic device 18 may include, for example, information regarding the pattern of electrical signals that drive a motor or the like provided in the haptic device 18.
  • the control data supplied to the haptic device 18 may include, for example, information regarding the vibration pattern generated by a motor or the like provided in the haptic device 18.
  • the vibration pattern may include, for example, information transmitted from the information processing device 1A to the athlete via the haptic device 18.
  • Such haptic devices are suitable for generating actions that can be recognized by athletes and others through their sense of touch, and by supplying control data to such haptic devices, the information processing device 1A can effectively transmit information to the athletes and others through their sense of touch.
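  • The idea that the vibration pattern itself carries "information transmitted from the information processing device 1A to the athlete" can be sketched as a mapping from events to pulse sequences. The event names and pulse shapes below are assumptions for illustration only; the patent does not specify a concrete encoding.

```python
# Each event maps to a sequence of (duration_ms, amplitude) pulses that
# would drive the motor provided in the haptic device 18.
PATTERNS = {
    "correct_form": [(100, 0.4)],                        # one short, gentle pulse
    "adjust_swing": [(80, 0.8), (80, 0.0), (80, 0.8)],   # double strong pulse
    "stop":         [(400, 1.0)],                        # one long, strong pulse
}

def control_data_for(event: str) -> dict:
    """Build control data containing the vibration pattern for an event."""
    if event not in PATTERNS:
        raise KeyError(f"no vibration pattern defined for event {event!r}")
    return {"event": event, "pattern": PATTERNS[event]}
```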
  • the input/output device 19 receives instructions from the user, supplies information indicating the received instructions to the information processing device 1A or other devices, and presents information obtained from the information processing device 1A or other devices to the user.
  • the input/output device 19 is configured to include at least one of the following input/output devices: a keyboard, a mouse, a display, a printer, a touch panel, etc.
  • the input/output device 19 visually presents information acquired from the information processing device 1A or other devices to the user via a touch panel.
  • the input/output device 19 may also be configured to display a GUI (Graphical User Interface) on the touch panel and acquire instructions from the user as input to the GUI.
  • the input/output device 19 may be configured to include, as an example, a head-mounted display or eyeglass-type display, and to present information acquired from the information processing device 1A or other devices to the user as virtual reality or mixed reality by displaying the information on the head-mounted display or eyeglass-type display. More specifically, the input/output device 19 may be configured to present a digital copy of the athlete generated by the generation unit 12 (described below) as virtual reality or mixed reality.
  • the communication unit 15A communicates with devices external to the information processing device 1A.
  • the communication unit 15A communicates with the camera 16, the microphone 17, the haptic device 18, the input/output device 19, and the information processing device 2A.
  • the communication unit 15A transmits data supplied from the control unit 10A to the input/output device 19 and the information processing device 2A, and supplies data received from the camera 16, the microphone 17, the haptic device 18, the input/output device 19, and the information processing device 2A to the control unit 10A.
  • the storage unit 14A stores various data referenced by the control unit 10A and various data generated by the control unit 10A.
  • Input data IN referenced by the control unit 10A
  • Athlete action data PA referenced by the control unit 10A
  • Analysis result RR, which is an example of a processing result by the control unit 10A
  • Library data LI referenced by the control unit 10A
  • Control data COD which is an example of the processing result by the control unit 10A
  • Presentation data DD which is an example of the processing result by the control unit 10A
  • the learned model LM used in the processing by the control unit 10A is stored.
  • the input data IN may be multimodal data including at least one of image data IM, audio data AU, and haptic data HAP.
  • the input data IN may also include at least one of image data captured by the camera 16, audio data collected by the microphone 17, and haptic data detected by the haptic device 18.
  • the image data IM, audio data AU, and haptic data HAP included in the input data IN may be synchronized with each other in real time, for example.
  • at least one of the image data IM, audio data AU, and haptic data HAP included in the input data IN may include information regarding the movements of one or more athletes.
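  • One way the image data IM, audio data AU, and haptic data HAP could be kept "synchronized with each other in real time" is to bucket timestamped samples into fixed-length frames. This is a sketch under assumptions: the frame length, sample tuple shape, and modality keys are illustrative, not from the disclosure.

```python
def synchronize(samples, frame_ms=50):
    """Group (timestamp_ms, modality, value) samples into frames keyed by
    frame index, so each frame holds co-occurring image/audio/haptic data."""
    frames = {}
    for t, modality, value in samples:
        frame = frames.setdefault(t // frame_ms,
                                  {"image": [], "audio": [], "haptic": []})
        frame[modality].append(value)
    return frames
```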
  • the audio data AU is, for example, data containing sounds generated by athletes and the like in and around the competition area.
  • the audio data AU may also, for example, be collected by microphone 17.
  • The haptic data HAP is, for example, data relating to the contact of an object with a tactile sensor capable of detecting such contact.
  • the tactile sensor may be provided in a tactile device 18 worn by an athlete or the like.
  • the athlete movement data PA is data indicating information about the movements of one or more athletes included in at least one of the image data IM, audio data AU, and haptic data HAP in the input data IN.
  • the athlete movement data PA may be data indicating the results of the generation unit 12 detecting the movements of one or more athletes in the input data using a machine learning model (for example, a learned model LM).
  • the analysis result RR includes information indicating the result of the analysis process performed by the generation unit 12, which will be described later.
  • the analysis result RR includes the following information:
  • the data may include the results of an analysis of the performance of at least one of one or more athletes, with reference to information on the movements of one or more athletes in the input data.
  • the analysis may be performed, for example, on each of a plurality of areas (area 1, area 2, ...) in the competition area shown in the image data (not shown).
  • the analysis result RR may also include gap information GI.
  • The gap information GI may be, for example, information about the difference between the athlete's performance and a comparison performance.
  • the library data LI is, for example, a library related to the training of one or more athletes.
  • the library data LI may include information related to the improvement of the athletes' skills in each training item.
  • the control data COD is, for example, data used to control the operation of the haptic device 18.
  • the control data COD may include information regarding the vibration pattern generated by a motor or the like provided in the haptic device 18.
  • the vibration pattern may include, for example, information transmitted from the information processing device 1A to the athlete via the haptic device 18.
  • the control data COD may include, for example, information regarding the pattern of the electrical signal that vibrates a motor or the like provided in the haptic device 18.
  • the presentation data DD is, for example, data for presenting a digital copy of the athlete in virtual reality or mixed reality.
  • the digital copy may be, for example, a data set that represents (or simulates) the athlete's movements in two or three dimensions.
  • the model LM has multiple parameters learned based on a deep learning algorithm and detects the actions of one or more athletes in the input data IN.
  • As shown in FIG. 3, the control unit 10A includes an acquisition unit 11, a generation unit 12, and an output unit 13.
  • the generation unit 12 generates control data COD to be supplied to one or more haptic devices 18 worn by the athlete, with reference to the input data IN.
  • the control data COD may include information regarding a vibration pattern generated by a motor or the like provided in the haptic device 18.
  • the vibration pattern may include, for example, information transmitted from the information processing device 1A to the athlete via the haptic device 18.
  • the generation unit 12 may generate control data COD in response to the actions of one or more athletes included in the input data IN described above.
  • the generation unit 12 may generate control data COD that serves as feedback for the actions of one or more athletes, for example.
  • the information processing device 1A equipped with a feedback loop will be described later with reference to FIG. 6.
  • the generation unit 12 may detect the movements of one or more athletes in the input data IN using a machine learning model (for example, a learned model LM) and generate the control data COD by referring to the detection results.
  • the specific algorithm used by the generation unit 12 to detect the athletes' movements is not limited to this exemplary embodiment, but as an example, the process of detecting the athletes' movements may use an algorithm based on machine learning.
  • the generation unit 12 may generate the control data COD by referencing library data LI, which is a library related to the training of one or more athletes.
  • the control data COD generated by the generation unit 12 by referencing library data LI may, for example, contribute to improving the athletes' skills in each training item.
  • the generation of control data COD by referencing library data LI will be described later with reference to Figures 7 and 8.
  • the generation unit 12 may also refer to the control data COD to generate presentation data DD for presenting a digital copy of the athlete in virtual reality or mixed reality.
  • the digital copy is, for example, a data set that represents (or simulates) the athlete's movements in two or three dimensions, and is presented as virtual reality or mixed reality via the input/output device 19 described above.
  • the generation unit 12 includes an analysis unit 12-1.
  • the analysis unit 12-1 performs an analysis process by referring to information relating to the movements of one or more athletes in the input data.
  • the analysis unit 12-1 performs an analysis process by referring to information relating to the movements of one or more athletes in the input data IN, and analyzes the performance of at least one of the one or more athletes.
  • the generation unit 12 may generate control data COD by referring to the analysis results by the analysis unit 12-1.
  • the analysis results by the analysis unit 12-1 include:
  • the data may include information about the difference between the athlete's performance and a comparison performance.
  • the comparison performance may be the athlete's ideal performance, or a provisional performance aimed at approaching the ideal performance.
  • The comparison performance may be set in advance by a user such as a coach or trainer, who refers to the athlete's past performances, the past performances of the athlete's team, and the like. Alternatively, a learning process referencing multiple past performances of the athlete and of the team to which the athlete belongs may be applied to the above-mentioned model LM or another machine learning model, and the learned model may output the comparison performance.
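  • Computing the gap information GI from an athlete's performance and a comparison performance can be sketched as a per-metric difference. Representing a performance as a dict of named metrics is an assumption for illustration; the metric names are hypothetical.

```python
def gap_information(performance: dict, comparison: dict) -> dict:
    """Per-metric difference between the comparison performance and the
    athlete's measured performance (positive = room for improvement)."""
    return {metric: comparison[metric] - performance.get(metric, 0.0)
            for metric in comparison}
```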
  • The analysis result by the analysis unit 12-1 may also include information to support the coach's decision-making for each of one or more athletes.
  • the information to support the coach's decision-making for each of one or more athletes may be output by a machine-learned model.
  • The above-mentioned model LM or another machine-learned model may be trained in advance by referring to the athlete's past performances and the past performances of the team to which the athlete belongs, and the trained model may then be configured to output information to support the coach's decision-making for each of the one or more athletes.
  • the analysis results by the analysis unit 12-1 may also include areas in which the athlete should improve.
  • areas in which the athlete should improve can be generated by a process similar to or identical to the process for generating information to support coach decision-making described above.
  • the output unit 13 outputs the results of the analysis process performed by the generation unit 12. More specifically, the output unit 13 supplies the results of the analysis process performed by the generation unit 12 to the input/output device 19 and controls the input/output device 19 to output the results. The output unit 13 also stores the results of the analysis process performed by the generation unit 12 in the storage unit 14A.
  • the output unit 13 outputs, as an example, the virtual reality or mixed reality indicated by the presentation data.
  • the presentation data DD generated by the generation unit 12 may, as an example, be presented via a touch panel, or presented as virtual reality or mixed reality via a head-mounted display or eyeglass-type display.
  • the information processing device 2A has, as an example, a configuration similar to that of the information processing device 1A described above. However, the user of the information processing device 2A may be different from the user of the information processing device 1A.
  • the information processing device 1A may be installed on-premise (for example, in a facility having a competition area), and the information processing device 2A may be installed remotely (at a remote location). When deployed in this way, the information processing device 1A can exchange information with athletes, teams, coaches (including managers and trainers), referees, etc. participating in the competition, and the information processing device 2A can exchange information with other coaches (including managers and trainers), other referees, etc. located in remote locations.
  • FIG. 4 is a diagram illustrating application example 1 of the information processing system 100A.
  • at least two cameras (16-1, 16-2) and at least two microphones (17-1, 17-2) are installed in the competition area CA.
  • at least two haptic devices (18-1, 18-2) are attached to the players competing in the competition area CA.
  • the competition area CA is composed of, for example, an area CA1 where player P1 plays and an area CA2 where player P2 plays.
  • area CA1 is composed of three areas
  • area CA2 is also composed of three areas. Each of these multiple areas is identified by the generation unit 12 described above, and each area is associated with the results of analysis of the players in that area.
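  • The association between each identified sub-area and the analysis results of the players in that area can be sketched as a simple grouping. The area identifiers and result payloads below are assumptions for illustration.

```python
def results_by_area(observations):
    """Group (area_id, player, result) observations so that each area
    carries the latest analysis result of each player seen in it."""
    areas = {}
    for area_id, player, result in observations:
        areas.setdefault(area_id, {})[player] = result
    return areas
```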
  • an input/output device 19 is placed on the premises and used by an operator or coach who is a user on the premises. Via the input/output device 19, the user can input instructions to the information processing device 1A and view the results of the analysis performed by the information processing device 1A.
  • information processing device 2A is placed in a remote location and used by a remote user, such as an operator or coach.
  • the user can input instructions to information processing device 2A or information processing device 1A, and view the results of the analysis performed by information processing device 2A or information processing device 1A.
  • communication between the devices that make up information processing system 100A is carried out via a wide area network, such as the Tactile Internet, 5G, Beyond 5G, or 6G, for example.
  • Coaches in remote locations can check the analysis results in real time, allowing them to give instructions to athletes or teams in real time, while the analysis results are stored in a database (for example, the storage unit 14A) for future evaluation.
  • FIG. 5 is a diagram illustrating application example 2 of the information processing system 100A.
  • the camera 16 includes a front camera ("Front” in the group of cameras shown in FIG. 5 ), a rear camera ("Back” in the group of cameras shown in FIG. 5 ), and side cameras ("Side A” and "Side B” in the group of cameras shown in FIG. 5 ).
  • the microphones 17 include a front microphone ("Front" in the group of microphones shown in FIG. 5 ), a rear microphone ("Back" in the group of microphones shown in FIG. 5 ), and side microphones.
  • the haptic devices 18 include haptic devices worn by each player ("Player A," "Player B," "Player C," and "Player D" in the group of haptic devices shown in FIG. 5 ).
  • the camera 16 is configured with an array of multiple cameras, including a front camera, a rear camera, and a side camera. The images captured by these cameras are then recorded.
  • the generation unit 12 can track and record the movements of each player (athlete) by referencing these images.
  • the generation unit 12 can analyze the image data acquired by the camera 16 to track and record the coordination of the entire team. It is also preferable to install cameras on the court in order to observe the entire court at once.
  • the microphone 17 is configured as an array of multiple microphones, including front microphones, back microphones, and side microphones. Sounds collected by these microphones are recorded. The coordination and conversations of the players (athletes) are recorded for real-time analysis and future use. In this way, referencing audio data in addition to video data is advantageous for the information processing device 1A to analyze the real-time coordination of the entire team, and for coaches to analyze team performance.
  • FIG. 6 is a diagram showing a processing example 1 by the information processing device 1A. More specifically, FIG. 6 is a diagram showing a processing example when the information processing device 1A is equipped with a feedback loop. As shown in FIG. 6, the information processing device 1A may be, for example, a tactile training system equipped with a feedback loop.
  • TR in FIG. 6 is a start/stop sequence triggered by a coach when the coach requests assistance from AI in the information processing device 1A, for example.
  • the acquisition unit 11 acquires input data IN.
  • the input data IN may include information regarding the movements of one or more athletes.
  • the analysis unit 12-1 may execute an analysis process of the performance of at least one of the one or more athletes by referencing information relating to the movements of the one or more athletes in the input data IN.
  • the analysis unit 12-1 may execute an analysis process by referencing, for example, at least one of the information relating to the movements of the athletes in the input data IN acquired by the acquisition unit 11 and the detection results of the movements of the athletes in the input data IN detected by the generation unit 12 using a machine learning model.
  • the analysis unit 12-1 may perform a performance analysis process for each action included in a group of actions PA in a sport.
  • the actions included in a group of actions PA in a sport may be, for example, actions related to skills in the sport.
  • Specific examples of actions included in a group of actions PA in a sport include serve, dig, set, spike, block, and rotation, as shown in FIG. 6.
  • the coach may use the AI assistant to decide which action of the actions included in a group of actions PA in a sport to focus on, or which actions to analyze in parallel.
  • the tactile training system TS may be set to either enabled (YES) or disabled (NO), for example, depending on the user's settings. If the tactile training system TS is enabled (YES), the generation unit 12 may generate control data COD, for example, by referring to the analysis results in the analysis unit 12-1. At this time, the athlete may train the relevant body part via a haptic device 18 worn on the body. The information processing device 1A may also respond to the athlete's movements via tactile feedback to the athlete's body, for example, to improve swing, rotation, or power transmission. Specific examples of body parts to be trained include the hand (HND) or arm (ARM), as shown in FIG. 6.
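  • One pass through the feedback loop of FIG. 6 can be sketched as follows, assuming the analysis result names a body part ("HND" or "ARM") and a deviation from the target motion. The field names and the deviation-to-intensity rule are illustrative assumptions, not the patented logic.

```python
def feedback_step(analysis_result: dict, tactile_enabled: bool):
    """If the tactile training system TS is enabled (YES), turn an analysis
    result into control data addressed to the relevant body part; if it is
    disabled (NO), produce no tactile feedback."""
    if not tactile_enabled:
        return None
    part = analysis_result.get("body_part", "HND")
    # Stronger deviation from the target motion -> stronger vibration.
    intensity = min(1.0, abs(analysis_result.get("deviation", 0.0)))
    return {"target": part, "intensity": intensity}
```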
  • the information processing device 1A may, for example, provide visual or auditory feedback to the athlete regarding the analysis results from the analysis unit 12-1.
  • the athlete's response RES to the above-mentioned feedback may be recorded as part of the input data IN, for example, after passing through an analysis process AN.
  • the analysis process AN may, for example, improve the assistance provided by the AI and fine-tune the training.
  • the analysis process AN may be executed, for example, by a configuration such as the analysis unit 12-1 in the information processing device 1A, or by a configuration in another device.
  • FIG. 7 is a diagram showing a first processing example by the information processing system 100A. More specifically, FIG. 7 is a diagram showing a processing example of a tactile training system for an athlete's hands.
  • the hand tactile training system TS provided in the information processing device 1A (or 2A) may include settings related to training sessions for one or more athletes.
  • the settings related to training sessions for one or more athletes in the example of FIG. 7 may define a training library, which will be described later.
  • the settings related to the training sessions may be, for example, related to a training session TR or a self-practice session SP.
  • a training session TR may be, for example, related to the movements of multiple athletes.
  • a self-practice session SP may be, for example, related to training specialized for one skill of one athlete.
  • a specific example of a self-practice session SP is practice using the hands to improve pitching or blocking.
  • the haptic training system TS is connected to the hand-related haptic device 18 via the controller CON.
  • the configuration of the controller CON may be included in the generation unit 12, for example.
  • the controller CON may generate the control data COD, for example, by referencing a library related to the training of one or more athletes.
  • FIG. 7 shows a warm-up haptic library LIW and a practice haptic library LIP.
  • the warm-up haptic library LIW may relate to, for example, training of at least one of the left hand, right hand, fingers, and thumb.
  • the practice haptic library LIP may relate to, for example, training of at least one of the serve, dig, set, spike, and block.
  • the hand-related haptic device 18 may, for example, as shown in FIG. 7, include a left-hand glove HAPL, a left-hand controller HAPCL, a right-hand glove HAPR, and a right-hand controller HAPCR.
  • the left-hand glove HAPL and the right-hand glove HAPR may, for example, include vibrotactile motors.
  • the left-hand controller HAPCL and the right-hand controller HAPCR may, for example, supply signals related to the control of the vibrotactile motors included in the left-hand glove HAPL and the right-hand glove HAPR to the motors.
  • the left-hand glove HAPL and the right-hand glove HAPR include vibrotactile motors shown in black at positions corresponding to the fingertips and palm, respectively.
  • the controller CON may generate control data COD by referencing at least one of the practice haptic library LIP and the warm-up haptic library LIW. Furthermore, the controller CON may, for example, supply the generated control data COD to the vibrating haptic motors provided in the left-hand glove HAPL and the right-hand glove HAPR via the left-hand controller HAPCL and the right-hand controller HAPCR, respectively. The vibrating haptic motors may, for example, be driven in accordance with the supplied control data COD to generate vibrations, thereby transmitting tactile information to the athlete wearing the haptic device 18.
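  • The routing performed by the controller CON can be sketched as: select a drill from one of the haptic libraries, generate the control data, and fan it out to both hand controllers (HAPCL driving the left glove, HAPCR the right). The library contents and drill names below are assumptions for illustration.

```python
# Illustrative stand-ins for the warm-up haptic library LIW and the
# practice haptic library LIP: drill name -> (duration_ms, amplitude) pulses.
WARMUP_LIBRARY = {"fingers": [(60, 0.3)] * 3}
PRACTICE_LIBRARY = {"serve": [(120, 0.9), (120, 0.9)]}

def dispatch(drill: str) -> dict:
    """Generate control data COD for a drill and address it to both hand
    controllers, which drive the vibrotactile motors in the gloves."""
    library = {**WARMUP_LIBRARY, **PRACTICE_LIBRARY}
    pattern = library[drill]
    return {"HAPCL": {"pattern": pattern}, "HAPCR": {"pattern": pattern}}
```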
  • FIG. 8 is a diagram showing a processing example 2 by the information processing system 100A. More specifically, FIG. 8 is a diagram showing a processing example of a tactile training system for an athlete's arm.
  • the arm tactile training system TS provided in the information processing device 1A (or 2A) may include settings related to training sessions of one or more athletes, similar to the hand tactile training system TS in the example of FIG. 7.
  • the settings related to training sessions of one or more athletes in the example of FIG. 8 are similar to those described with reference to FIG. 7, and therefore will not be described here.
  • the haptic training system TS is connected to the arm-related haptic device 18 via the controller CON.
  • the configuration of the controller CON may, for example, be included in the generation unit 12.
  • the controller CON may, for example, generate the control data COD by referencing a library related to the training of one or more athletes.
  • FIG. 8 shows a haptic library LIS related to strategy training and a haptic library LIT related to team play.
  • the haptic library LIS related to strategy training may, for example, relate to training for one athlete, two athletes, or three or more athletes.
  • the haptic library LIT related to team play may, for example, relate to training for at least one of defense, offense, arm swing, and blocking.
  • the arm-related haptic device 18 may include, for example, a vibrotactile motor HAPM attached via an arm band, and a left-arm or right-arm controller HAPC, as shown in FIG. 8.
  • the left-arm or right-arm controller HAPC may, for example, supply signals related to the control of the vibrotactile motor HAPM to the motor.
  • the controller CON may generate control data COD by referencing at least one of a haptic library LIS related to strategic training and a haptic library LIT related to team play. Furthermore, the controller CON may, for example, supply the generated control data COD to the vibrotactile motor HAPM via the left-arm or right-arm controller HAPC. The vibrotactile motor HAPM may, for example, be driven in accordance with the supplied control data COD to generate vibrations, thereby transmitting tactile information to the athlete wearing the haptic device 18.
  • FIG. 9 is a diagram illustrating a third processing example performed by the information processing system 100A. More specifically, FIG. 9 is a diagram illustrating an example of output of virtual reality in the information processing system 100A.
  • the generation unit 12 may, for example, refer to the control data COD to generate presentation data DD for presenting a digital copy of a player as virtual reality or mixed reality.
  • the output unit 13 may, for example, output the virtual reality or mixed reality indicated by the presentation data DD.
  • the output unit 13 may, for example, output the virtual reality or mixed reality indicated by the presentation data DD via an input/output device 19, as shown in FIG. 9.
  • the virtual reality or mixed reality indicated by the presentation data DD may include, for example, a scenario in which the player is present.
  • the scenario in which the player is present may be, for example, related to a match, or may include information regarding the actions of other team members.
  • the output unit 13 may present the virtual reality or mixed reality indicated by the presentation data DD to the player through at least one of the visual, auditory, and tactile sensations generated by the haptic device 18.
  • the tactile sensation generated by the haptic device 18 may indicate, for example, pitching or blocking of the ball.
  • the information processing system 100A configured as described above acquires input data including at least one of image data, audio data, and haptic data, and generates control data to be supplied to one or more haptic devices worn by the athlete by referring to the input data. This configuration allows the athlete's play to be improved by providing feedback on the athlete's movements.
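The acquire-then-generate flow summarized above can be sketched minimally as follows. The detection step is a toy stand-in for a real machine-learning model, and every name and threshold is an illustrative assumption:

```python
from dataclasses import dataclass, field


@dataclass
class InputData:
    image: list[float] = field(default_factory=list)   # e.g. pose keypoints
    audio: list[float] = field(default_factory=list)   # e.g. impact sounds
    haptic: list[float] = field(default_factory=list)  # e.g. palm pressure


def detect_action(data: InputData) -> str:
    """Toy stand-in for a trained classifier over the multimodal input."""
    if data.haptic and max(data.haptic) > 0.8:
        return "spike"
    return "idle"


def generate_feedback(action: str) -> list[tuple[str, float]]:
    """Map a detected action to (motor, amplitude) control-data pairs."""
    feedback = {
        "spike": [("R-palm", 1.0)],   # strong cue on the hitting hand
        "idle": [],
    }
    return feedback.get(action, [])


# One pass through the pipeline: acquired input -> detection -> control data.
sample = InputData(haptic=[0.2, 0.95, 0.4])
cod = generate_feedback(detect_action(sample))
```

The point of the sketch is the data flow, not the model: in the system described, the classifier would be a trained model over image, audio, and haptic streams, and the feedback table would come from the haptic libraries.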
  • the information processing system 100A can help athletes stay motivated during training sessions and improve their skills. Furthermore, by engaging the sense of touch during training, the information processing system 100A can deliver new stimuli directly to the athlete's body. Using the sense of touch, the athlete can select the correct strategy and route.
  • Haptic feedback can help reduce injuries, which is essential when athletes command high salaries and the performance of the entire team depends on them. Incorrect movements, such as when throwing (hitting) a ball, can be identified and corrected through immediate haptic feedback.
  • Coaches can gain a deeper understanding of player performance by further analyzing tactile data. Metrics such as jump height, force balance, and torque are essential for improving player performance. Coaches can access such data in real time or in playback mode via the information processing system 100A, identify problem areas, and suggest skill improvements, thereby improving game outcomes.
  • the information processing system 100A assists in recording force and torque data during sports and training sessions. Such recorded data is important for improving an athlete's skills. Such recorded data is also useful for healthcare after an injury, helping to improve the healing process for a faster recovery.
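How recorded force and torque data might be summarized into coach-facing metrics can be sketched as below. The record format (dictionaries with `event` and `force_n` fields) and the metric names are assumptions for illustration only:

```python
def summarize_session(samples: list[dict]) -> dict:
    """Aggregate recorded force samples into simple coach-facing metrics:
    the number of jumps detected and the peak ground-reaction force."""
    if not samples:
        return {"jumps": 0, "peak_force_n": 0.0}
    jumps = sum(1 for s in samples if s.get("event") == "jump")
    peak = max(s.get("force_n", 0.0) for s in samples)
    return {"jumps": jumps, "peak_force_n": peak}


# Hypothetical recorded data from one training session.
session = [
    {"event": "jump", "force_n": 820.0},
    {"event": "step", "force_n": 310.0},
    {"event": "jump", "force_n": 905.0},
]
report = summarize_session(session)
```

A real analysis would of course operate on time-series sensor streams rather than pre-labeled events; the sketch only shows where the recorded data feeds into review and rehabilitation use cases.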
  • the information processing system 100A can contribute to the athlete's training and recovery through tactile feedback during the recovery process from an injury.
  • Information processing system 100A supports tactile sensations in virtual reality (VR) or mixed reality (MR) online games. In addition to visual and audio information, tactile information completes the entire immersive experience. This allows game providers to offer new sensations and connect customers with new experiences.
  • the information processing system 100A can provide tactile sensations that can be perceived even by athletes with disabilities, such as visually impaired athletes. Players can team up with athletes with various disabilities and play while feeling the action.
  • the information processing system 100A is a useful system for training volleyball teams. By recording multimodal data related to audio, video, tactile sensations, etc., a feedback mechanism can be applied to improve each player's performance.
  • the information processing system 100A provides a new method for improving volleyball team training. It can provide players with immediate tactile feedback to focus on specific skills that require specific forces or movements, such as pitching or blocking.
  • By recording multimodal data such as audio, video, and tactile data, the information processing system 100A allows each player to recreate their play more realistically, making training more enjoyable and improving performance. Utilizing tactile data in training sessions provides new stimuli that can be enjoyed and recognized, increasing motivation.
  • the information processing system 100A can record multimodal data, such as audio, video, and haptic data, that is useful for playback sessions in virtual reality (VR) or mixed reality (MR). Furthermore, matches and training sessions can be experienced in VR or MR via the information processing system 100A. As an example, a specific training session, such as a blocking skill that can be realized using data recorded by the present invention, can be played back in VR.
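The VR/MR playback idea above amounts to replaying recorded per-modality streams in timestamp order. A minimal sketch follows; the record format (lists of `(timestamp, payload)` events per modality) is an assumption:

```python
def merge_streams(**streams: list[tuple[float, str]]) -> list[tuple[float, str, str]]:
    """Merge per-modality (timestamp, payload) streams into one timeline
    of (timestamp, modality, payload) events, sorted by timestamp."""
    merged = []
    for name, events in streams.items():
        merged.extend((t, name, payload) for t, payload in events)
    return sorted(merged)


# Hypothetical fragment of a recorded blocking drill.
timeline = merge_streams(
    video=[(0.0, "frame0"), (0.04, "frame1")],
    haptic=[(0.02, "palm:0.9")],
)
```

A playback session would walk this merged timeline, rendering video frames to the headset and dispatching haptic payloads to the glove or arm-band motors at the recorded instants.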
  • the information processing system 100A can train an entire team remotely using all necessary data, including audio, video, and haptics. Matches can be played remotely and can be enjoyed by children, minors, and even people with disabilities. Using the information processing system 100A, players from different regions and with different skill levels can participate in training sessions or live games, for example, via haptics.
  • Some or all of the functions of the information processing devices 1, 1A, 2A, and input/output device 19 may be realized by hardware such as an integrated circuit (IC chip), or by software.
  • each of the above devices is realized, for example, by a computer that executes program instructions, which are software that realizes each function.
  • An example of such a computer (hereinafter referred to as computer C) is shown in Figure 10.
  • Figure 10 is a block diagram showing the hardware configuration of computer C that functions as each of the above devices.
  • Computer C has at least one processor C1 and at least one memory C2.
  • Memory C2 stores a program P for operating computer C as each of the above devices.
  • processor C1 reads and executes program P from memory C2, thereby realizing the functions of each of the above devices.
  • the processor C1 may be, for example, a CPU (Central Processing Unit), GPU (Graphics Processing Unit), DSP (Digital Signal Processor), MPU (Micro Processing Unit), FPU (Floating Point Unit), PPU (Physics Processing Unit), TPU (Tensor Processing Unit), quantum processor, microcontroller, or a combination of these.
  • the memory C2 may be, for example, a flash memory, HDD (Hard Disk Drive), SSD (Solid State Drive), or a combination of these.
  • Computer C may further include RAM (Random Access Memory) for expanding program P during execution and for temporarily storing various data.
  • Computer C may also include a communications interface for sending and receiving data to and from other devices.
  • Computer C may also include an input/output interface for connecting input/output devices such as a keyboard, mouse, display, or printer.
  • the program P can be recorded on a non-transitory, tangible recording medium M that can be read by the computer C.
  • a recording medium M can be, for example, a tape, disk, card, semiconductor memory, or programmable logic circuit.
  • the computer C can acquire the program P via such a recording medium M.
  • the program P can also be transmitted via a transmission medium.
  • a transmission medium can be, for example, a communications network or broadcast waves.
  • the computer C can also acquire the program P via such a transmission medium.
  • Appendix A1 An information processing device comprising: an acquisition means for acquiring input data including at least one of image data, audio data, and tactile data; and a generation means for generating, by referring to the input data, control data to be supplied to one or more haptic devices worn by an athlete.
  • Appendix A2 The information processing device according to Appendix A1, wherein the input data includes information regarding the actions of one or more athletes, and the generating means generates the control data in accordance with the actions of the one or more athletes.
  • Appendix A3 The information processing device described in Appendix A2, wherein the generation means detects the movements of the one or more athletes in the input data using a machine learning model and generates the control data by referring to the detection results.
  • Appendix A4 The information processing device according to Appendix A2 or A3, wherein the generation means further performs an analysis process of the performance of at least one of the one or more athletes by referring to information regarding the movements of the one or more athletes in the input data, and generates the control data by referring to the analysis results.
  • Appendix A5 The information processing device according to Appendix A4, further comprising an output means that outputs a result of the analysis process performed by the generation means.
  • Appendix A6 The information processing device according to Appendix A4, wherein the generating means generates the control data by further referring to a library related to training of the one or more athletes.
  • Appendix A7 The information processing device according to Appendix A5, wherein the generating means further generates, with reference to the control data, presentation data for presenting a digital copy of the athlete in virtual reality or mixed reality, and the output means outputs the virtual reality or mixed reality indicated by the presentation data.
  • Appendix A8 An information processing device as described in Appendix A4, wherein the analysis results include information regarding the difference between the athlete's performance and a comparison performance.
  • Appendix A9 The information processing device according to Appendix A4, wherein the analysis results include information to support a coach's decision-making for each of the one or more athletes.
  • Appendix B1 An information processing method comprising: an acquisition process in which at least one processor acquires input data including at least one of image data, audio data, and haptic data; and a generation process in which the at least one processor generates, by referring to the input data, control data to be supplied to one or more haptic devices worn by an athlete.
  • Appendix B2 The information processing method according to Appendix B1, wherein the input data includes information regarding the actions of one or more athletes, and in the generation process, the at least one processor generates the control data in accordance with the actions of the one or more athletes.
  • Appendix B3 An information processing method described in Appendix B2, in which, in the generation process, the at least one processor detects the movements of the one or more athletes in the input data using a machine learning model and generates the control data by referring to the detection results.
  • Appendix B4 The information processing method according to Appendix B2 or B3, wherein in the generation process, the at least one processor further performs an analysis process of the performance of at least one of the one or more athletes by referring to information regarding the movements of the one or more athletes in the input data, and generates the control data by referring to the analysis results.
  • Appendix B5 The information processing method according to Appendix B4, further comprising an output process in which the at least one processor outputs a result of the analysis process performed by the generation process.
  • Appendix B6 An information processing method as described in Appendix B4, wherein in the generation process, the at least one processor further references a library related to the training of the one or more athletes to generate the control data.
  • Appendix B7 The information processing method according to Appendix B5, wherein in the generation process, the at least one processor further generates, with reference to the control data, presentation data for presenting a digital copy of the athlete in virtual reality or mixed reality, and in the output process, the at least one processor outputs the virtual reality or mixed reality indicated by the presentation data.
  • Appendix B8 An information processing method as described in Appendix B4, wherein the analysis results include information regarding the difference between the athlete's performance and a comparison performance.
  • Appendix B9 An information processing method as described in Appendix B4, wherein the analysis results include information to support a coach's decision-making for each of the one or more athletes.
  • Appendix C1 A program that causes a computer to function as an information processing device, the program causing the computer to function as: an acquisition means for acquiring input data including at least one of image data, audio data, and tactile data; and a generation means for generating, by referring to the input data, control data to be supplied to one or more haptic devices worn by an athlete.
  • Appendix C2 The information processing program according to Appendix C1, wherein the input data includes information regarding the actions of one or more athletes, and the generation means generates the control data in accordance with the actions of the one or more athletes.
  • Appendix C3 An information processing program as described in Appendix C2, wherein the generation means detects the movements of the one or more athletes in the input data using a machine learning model and generates the control data by referring to the detection results.
  • Appendix C4 The information processing program according to Appendix C2 or C3, wherein the generation means further performs an analysis process of the performance of at least one of the one or more athletes by referring to information regarding the movements of the one or more athletes in the input data, and generates the control data by referring to the analysis results.
  • Appendix C5 The information processing program according to Appendix C4, further causing the computer to function as an output means for outputting a result of the analysis process performed by the generation means.
  • Appendix C6 An information processing program according to appendix C4, wherein the generation means generates the control data by further referring to a library related to the training of the one or more athletes.
  • Appendix C7 The information processing program according to Appendix C5, wherein the generating means further generates, with reference to the control data, presentation data for presenting a digital copy of the athlete in virtual reality or mixed reality, and the output means outputs the virtual reality or mixed reality indicated by the presentation data.
  • Appendix C8 An information processing program as described in Appendix C4, wherein the analysis results include information regarding the difference between the athlete's performance and a comparison performance.
  • Appendix C9 An information processing program as described in Appendix C4, wherein the analysis results include information to support a coach's decision-making for each of the one or more athletes.
  • Appendix D1 An information processing device comprising at least one processor that executes: an acquisition process for acquiring input data including at least one of image data, audio data, and haptic data; and a generation process for generating, by referring to the input data, control data to be supplied to one or more haptic devices worn by an athlete.
  • the information processing device may further include a memory.
  • the memory may also store a program for causing the at least one processor to execute each of the processes.
  • Appendix D2 The information processing device according to Appendix D1, wherein the input data includes information regarding the actions of one or more athletes, and in the generation process, the at least one processor generates the control data in accordance with the actions of the one or more athletes.
  • Appendix D3 An information processing device as described in Appendix D2, wherein, in the generation process, the at least one processor detects the movements of the one or more athletes in the input data using a machine learning model and generates the control data by referring to the detection results.
  • Appendix D4 The information processing device according to Appendix D2 or D3, wherein in the generation process, the at least one processor further performs an analysis process of the performance of at least one of the one or more athletes by referring to information regarding the movements of the one or more athletes in the input data, and generates the control data by referring to the analysis results.
  • Appendix D5 The information processing device according to Appendix D4, wherein the at least one processor executes an output process that outputs a result of the analysis process performed in the generation process.
  • Appendix D6 An information processing device as described in Appendix D4, wherein in the generation process, the at least one processor further references a library related to the training of the one or more athletes to generate the control data.
  • Appendix D7 The information processing device according to Appendix D5, wherein in the generation process, the at least one processor further generates, with reference to the control data, presentation data for presenting a digital copy of the athlete in virtual reality or mixed reality, and in the output process, the at least one processor outputs the virtual reality or mixed reality indicated by the presentation data.
  • Appendix D8 An information processing device as described in Appendix D4, wherein the analysis results include information regarding the difference between the athlete's performance and a comparison performance.
  • Appendix D9 An information processing device according to appendix D4, wherein the analysis results include information to support a coach's decision-making for each of the one or more athletes.
  • Appendix E1 A non-transitory recording medium on which is recorded a program that causes a computer to function as an information processing device, the program causing the computer to execute: an acquisition process for acquiring input data including at least one of image data, audio data, and haptic data; and a generation process for generating, by referring to the input data, control data to be supplied to one or more haptic devices worn by an athlete.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides an information processing device capable of improving an athlete's play by feeding back the athlete's movements. The information processing device comprises: an acquisition means for acquiring input data including at least one of image data, audio data, and tactile data; and a generation means for generating, by referring to the input data, control data to be supplied to one or more haptic devices worn by an athlete.
PCT/JP2024/045826 2024-02-21 2024-12-25 Information processing device, information processing method, and program Pending WO2025177697A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2024-024897 2024-02-21
JP2024024897 2024-02-21

Publications (1)

Publication Number Publication Date
WO2025177697A1 true WO2025177697A1 (fr) 2025-08-28

Family

ID=96846933

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/045826 Pending WO2025177697A1 (fr) 2024-02-21 2024-12-25 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2025177697A1 (fr)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018511450A (ja) * 2015-02-02 2018-04-26 GN IP Pty Ltd Frameworks, devices and methodologies configured to enable delivery of interactive skills training content, including delivery of adaptive training programs based on analysis of performance sensor data

Similar Documents

Publication Publication Date Title
US10821347B2 (en) Virtual reality sports training systems and methods
US11826628B2 (en) Virtual reality sports training systems and methods
US11783721B2 (en) Virtual team sport trainer
Miles et al. A review of virtual environments for training in ball sports
US12377317B2 (en) Sporting sensor-based apparatus, system, method, and computer program product
US11446550B2 (en) Entertainment forum digital video camera, audio microphone, speaker and display device enabling entertainment participant and remote virtual spectator interaction, apparatus, system, method, and computer program product
US10821345B2 (en) Sporting device for analyzing an element in a tunnel, apparatus, system, method, and computer program product
US20160049089A1 (en) Method and apparatus for teaching repetitive kinesthetic motion
US11951376B2 (en) Mixed reality simulation and training system
Yeo et al. Augmented learning for sports using wearable head-worn and wrist-worn devices
JP2014151027A (ja) Exercise and/or game device
JP6571701B2 (ja) Sensory experience device and method
WO2025177697A1 (fr) Information processing device, information processing method, and program
WO2020241735A1 (fr) System, method, and program for assisting production improvement
JP7760544B2 (ja) Program, device, system, and method for proposing a treatment for a treatment target
WO2025177696A1 (fr) Information processing device, information processing method, and program
JP7764419B2 (ja) Motion support program, device, and method using gyro moment, and force sense presentation device
US20250375662A1 (en) Ball feeding systems and methods
TWI827134B (zh) Team vision training system and method with extended reality, voice, and motion recognition
JP7392739B2 (ja) Motor learning system
TW202419135A (zh) System and method for analyzing a user's swing motion to determine a ball-striking trajectory
Katz et al. Sport Technology Research Laboratory, University of Calgary

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24925731

Country of ref document: EP

Kind code of ref document: A1