WO2022231453A1 - Training method for training a robot to perform an ultrasound examination
- Publication number
- WO2022231453A1 (application PCT/RU2021/000176)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot arm
- robot
- ultrasound
- model
- training
- Prior art date
- Legal status
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/085—Force or torque sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4209—Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames
- A61B8/4218—Details of probe positioning or probe attachment to the patient by using holders, e.g. positioning frames characterised by articulated arms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/54—Control of the diagnostic device
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/42—Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
- G05B19/423—Teaching successive positions by walk-through, i.e. the tool head or end effector being grasped and guided directly, with or without servo-assistance, to follow a path
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37558—Optical sensor, scanner
Definitions
- The present invention generally relates to the healthcare or medical industry, in particular to the use of robots for examining or treating patients, and more particularly to the use of robots for performing ultrasound diagnostics of a patient. More specifically, the present invention relates to a training method for training robots to perform ultrasound examinations.
- Ultrasound imaging is a non-invasive diagnostic imaging technique using high-frequency sound waves to image the inside of a patient body.
- Ultrasound imaging is used for examining many internal organs of the patient body, including but not limited to the following: heart, liver, gallbladder, spleen, pancreas, kidneys, urinary bladder, uterus, ovaries, eyes, thyroid, parathyroid glands, scrotum (testicles), brain in infants, hips in infants, spine in infants, etc.
- Ultrasound imaging is also used for examining some other structures of the patient body, including but not limited to the following: tendons, muscles, nerves, ligaments, joints, blood vessels, soft tissue masses, bone surfaces, etc.
- Ultrasound imaging is frequently used to: (i) diagnose a variety of conditions and assess organ damage following illness; (ii) examine an unborn child (fetus) in pregnant patients; (iii) guide procedures such as needle biopsies, in which needles are used to sample cells from an abnormal area for laboratory testing; (iv) image the breasts and guide biopsy of breast cancer; (v) diagnose a variety of heart conditions, including valve problems and congestive heart failure, and assess damage after a heart attack; (vi) evaluate symptoms such as pain, swelling or infection; (vii) evaluate blockages to blood flow (such as clots), narrowing of vessels, tumors and congenital vascular malformations, reduced or absent blood flow to various organs (such as the testes or ovary), increased blood flow (which may be a sign of an infection), etc.
- Ultrasound images, also known in the art as ultrasonic images or sonograms, are made by sending ultrasound pulses into tissues using an ultrasound probe. The ultrasound pulses echo off tissues with different reflection properties, and the returned echoes are recorded and displayed as an image.
- Ultrasound images are generally captured in real-time, so that they can also show movement of body internal organs as well as blood flowing through blood vessels. Unlike X-ray imaging, no ionizing radiation exposure is associated with ultrasound imaging.
- In ultrasound imaging, the ultrasound probe is generally placed directly on the patient's skin. To optimize image quality, the ultrasound probe may also be placed inside the patient body, in particular via the gastrointestinal tract, vagina or blood vessels. A thin layer of a water-based gel is applied to the skin area to be examined; ultrasound waves are transmitted from the ultrasound probe through the applied gel into the patient body. The applied gel allows the ultrasound probe to securely contact the examined skin and eliminates air pockets between the ultrasound probe and the skin, such air pockets otherwise blocking sound waves from passing into the patient body.
- Several types of ultrasound images can be formed.
- The most common ultrasound image is a B-mode (brightness) image displaying the acoustic impedance of a two-dimensional cross-section of tissue.
- Other types of ultrasound images may show a blood flow, motion of a tissue over time, the location of blood, the presence of specific molecules, the stiffness of a tissue, or the anatomy of a three-dimensional region.
- Sonographers are medical professionals performing scans, wherein the scans are then interpreted by radiologists and further used by clinicians (i.e. physicians and other healthcare professionals who provide direct patient care) for diagnostic or treatment purposes.
- Ultrasound imaging is increasingly used in medical diagnostics and interventions.
- One of the disadvantages of ultrasound imaging is the high inter-observer variability when acquiring ultrasound images, which calls for trained sonographers to guarantee clinically relevant images.
- Receiving a reliable diagnosis therefore generally depends on the availability of a specially trained technician or a qualified medical doctor. The lack of such specially trained technicians and doctors, as well as the cost of using radiologists for producing ultrasound images, creates the need for robotic ultrasound-imaging techniques.
- Robots, i.e. a combination of ultrasound imaging technology with a computer-based robotic system, or a robot controlled by a computer-based control system, are highly integrated into the medical workspace, thereby allowing clinicians to treat individual patients in a more efficient, safer and less morbid way.
- Robots are often uniquely suited for ultrasound examinations.
- Robots may be pre-trained by expert sonographers to perform the best ultrasound examinations, thereby allowing every person (especially in remote regions) to gain access to the expertise of the expert sonographers.
- US 2021015453 discloses a training method for training a robot to perform an ultrasound examination, the training method including obtaining a motion control configuration for manually repositioning a robot arm provided with an ultrasound-imaging probe from a first imaging position to a second imaging position with respect to a patient body, wherein the motion control configuration is based on a prediction-convolutional neural network.
- The prediction-convolutional neural network is trained by performing the following operations: (i) providing a plurality of images obtained by the ultrasound-imaging probe from at least two imaging positions to obtain a target image view; (ii) obtaining a plurality of motion control configurations based on an ultrasound-imaging probe orientation or movement associated with the at least two imaging positions; and (iii) assigning a score to a relationship between the plurality of motion control configurations and the plurality of images with respect to the target image view.
- A main disadvantage of the training method of US 2021015453 and other similar training methods known in the art is that it does not actually allow an ultrasound examination to be performed in a safe, precise and repeatable manner, since the known training method at least does not take into account that a patient may reposition the patient's body during the ultrasound examination, and/or may have personal or patient-specific anatomical features (for example, organ sizes or locations inside the patient body) or body structure features, and/or may be damaged by the ultrasound-imaging probe excessively pressing on the patient body.
- The improved training method to be developed in the art therefore has to allow the trained robot to be used for performing a safe, precise and repeatable ultrasound examination.
- A technical problem to be solved by the present invention is to develop a training method for training a robot to perform an ultrasound examination that would allow the above disadvantage of the prior art training method to be at least partly eliminated.
- A training method for training a robot to perform an ultrasound examination comprises: providing a patient-specific anatomy 3D model; creating, by a 3D scanner, a 3D model of a patient body surface; manually moving a robot arm from a starting position to at least one predetermined training position on the patient body, the robot arm holding an ultrasound-imaging probe and being provided with a robot arm position tracker and at least one force sensor; manually actuating the ultrasound-imaging probe to produce at least one ultrasound image when the robot arm is manually moved to each of the training positions; sensing, by the force sensors, a plurality of forces applied by the robot arm to the ultrasound-imaging probe to hold thereof when the robot arm is positioned at least in the starting position and each of the training positions; computing a movement trajectory of the robot arm based on a plurality of robot arm locations on the created body surface 3D model, the locations corresponding to the robot arm positions tracked by the robot arm position tracker during the manual movement of the robot arm and being associated with the provided anatomy 3D model; and creating a robot-training model by associating the computed movement trajectory with the sensed forces and the produced ultrasound images.
- The provided training method further comprises sensing a plurality of forces applied by the robot arm to the ultrasound-imaging probe to hold thereof when the robot arm is moved between the starting position and the training positions.
- The provided training method further comprises updating, by the 3D scanner, the created body surface 3D model during the manual movement of the robot arm and correcting the robot arm locations according to the updated body surface 3D model.
- The provided training method further comprises saving the created robot-training model in a robot data storage as robot control instructions.
- The provided training method further comprises displaying, by a display, the produced ultrasound images to a user and accepting, by the user, at least one particular ultrasound image among the displayed ultrasound images.
- The provided training method further comprises manually moving the robot arm from an initial position to the starting position; and computing a positioning trajectory of the robot arm based on a plurality of robot arm spatial locations in relation to at least one reference point on the created body surface 3D model, the spatial locations corresponding to the robot arm positions tracked by the robot arm position tracker during the initial movement of the robot arm and associated with the created body surface 3D model.
- The robot arm spatial locations corresponding to the patient body in the provided training method may be further associated with the provided anatomy 3D model.
- The provided training method further comprises sensing, by the force sensors, a plurality of forces applied by the robot arm to the ultrasound-imaging probe to hold thereof during the initial movement of the robot arm; and creating a robot-positioning model by associating the computed positioning trajectory with the forces sensed during the initial movement of the robot arm.
- The training method according to the present invention allows the trained robot to effectively perform an ultrasound examination, in particular due to the fact that the training method provides an improved robot-training model associating the patient-specific anatomy 3D model, the body surface 3D model of the patient body, the tracked robot arm positions, the forces applied by the robot arm to the ultrasound-imaging probe, and the produced ultrasound images.
- FIG. 1 shows a flow diagram of a training method for training a robot to perform an ultrasound examination according to the present invention.
- The term “patient” means first of all a potentially sick person (a member of the mammalian class) seeking medical advice or remaining under medical observation to have a disease diagnosed and/or treated, wherein the term “patient” also means potentially sick mammalian animals remaining under medical observation to diagnose and/or treat their disease.
- The term “mammal” means a human or an animal, in particular anthropoid and non-human primates, dogs, cats, horses, camels, donkeys, cows, sheep, pigs, and other well-known mammals.
- The term “user” means a sonographer or any suitably skilled healthcare professional authorized to place an ultrasound probe on a patient body skin or inside a patient body (in particular, via a gastrointestinal tract, vagina or blood vessels) and/or manipulate the ultrasound probe placed on the patient body skin or inserted inside the patient body, and/or remove the ultrasound probe from the patient body skin or the inner space of the patient body, wherein the healthcare professional may be, for example, a surgeon, oncologist, endoscopist, thoracic surgeon, angiosurgeon, urologist, veterinarian, etc.
- The term “patient-specific anatomy 3D model” means an anatomy 3D model corresponding to a particular patient type, wherein the patient type may be defined by a patient age, a patient gender, a mammal type and/or other similar patient features. Therefore, the anatomy 3D model is designated in the present document as patient-specific since it may correspond to a particular patient who could be a human, animal or another mammal relating to a particular age group and/or having certain body dimensions or body features, in particular a male or female human being an infant, teenager, full-aged person, adult, middle-aged person, old person, etc.
- FIG. 1 illustrates a flow diagram of a training method 10 for teaching or training a robot to perform an ultrasound examination according to the present invention.
- The robot to be trained by the training method 10 of FIG. 1 may be implemented as any mechanically appropriate robotic system or robot known in the art, the robot comprising or being provided with the following: (a) a driven robot manipulator or robot arm which comprises at least six joints representing the number of degrees of freedom of the robot arm (i.e. the robot arm has six or more degrees of freedom); (b) a driving unit or module for driving the robot arm; (c) a control unit or module for controlling the operation of the robot, including the operation of the robot arm, processing data used or collected during the operation of the robot and controlling a data recording procedure, the recorded data being collected during the operation of the robot; and (d) a long-term memory or a local data storage for storing executable program instructions or commands controlling the operation of the robot (in particular, the operation of functional modules integrated into the robot and mentioned in the present document and, if required, the operation of external devices communicatively connected to the robot and mentioned in the present document) and allowing the functional modules to implement their functionalities. Meanwhile, the local data storage of the robot further stores different additional or supplemental data used by the functional modules to provide their outputs.
- The robot arm is used in the robot for holding an ultrasound-imaging probe which performs ultrasound scans or ultrasound examinations.
- The robot arm is provided with a robot arm position tracker for tracking positions of the robot arm during movement thereof, including angles between the ultrasound-imaging probe and the body surface to be examined.
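- Purely as an illustration of the probe-to-surface angle tracking mentioned above (not part of the patent), the following minimal Python sketch computes the tilt of the probe axis relative to the local surface normal of the body-surface model; the vector representation and function names are assumptions.

```python
import numpy as np

def probe_tilt_angle(probe_axis: np.ndarray, surface_normal: np.ndarray) -> float:
    """Angle in degrees between the probe axis and the local surface normal.

    Both inputs are 3-vectors; they are normalized here, so any scale is accepted.
    """
    a = probe_axis / np.linalg.norm(probe_axis)
    n = surface_normal / np.linalg.norm(surface_normal)
    cos_angle = np.clip(np.dot(a, n), -1.0, 1.0)  # clip guards against rounding error
    return float(np.degrees(np.arccos(cos_angle)))

# Example: a probe tilted 30 degrees away from the surface normal in the x-z plane
tilt = probe_tilt_angle(
    np.array([np.sin(np.radians(30.0)), 0.0, np.cos(np.radians(30.0))]),
    np.array([0.0, 0.0, 1.0]),
)
print(round(tilt, 1))  # 30.0
```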
- The robot arm is also provided with at least one force sensor for sensing a plurality of forces applied by the robot arm to the ultrasound-imaging probe to hold thereof when the ultrasound-imaging probe is manipulated on the patient body skin or inside the patient body.
- The force sensors are used in combination to detect a plurality of forces applied by the ultrasound-imaging probe to the patient body when a user manually manipulates the ultrasound-imaging probe in a teaching or training mode.
- The force sensors may each be implemented as a strain gauge sensor.
- The position tracker is used for tracking a robot arm position in a three-dimensional (3D) system.
- The position tracker may be implemented as any appropriate position tracking system known in the art, including a position tracking system built into the robot arm.
- The force sensors may be incorporated into the robot arm or may be provided in an effector used for adapting the ultrasound-imaging probe to the robot arm (for example, the force sensors may be incorporated between an inner housing and an outer housing of the effector), the effector allowing the robot arm in an operating mode to imitate a natural human hand grasp used by the user to manipulate the ultrasound-imaging probe.
- The effector also allows the robot arm in the training mode to be trained to simulate natural movements of the user.
- The ultrasound-imaging probe (also interchangeably referred to in the art as an ultrasound transducer or an ultrasound scanner) held by the robot arm may be implemented as any appropriate ultrasound-imaging probe known in the art.
- The ultrasound-imaging probe can both generate or emit ultrasound waves and detect ultrasound echoes reflected back thereto (i.e. returned signals).
- Active elements in the ultrasound-imaging probe are made of special ceramic crystal materials called piezoelectrics.
- The piezoelectrics are able to produce sound waves when an electric field is applied to them, and they can work in reverse, producing an electric field when a sound wave hits them.
- When used in the robot, the ultrasound-imaging probe sends out a beam of ultrasound waves into the patient body; the ultrasound waves are reflected back to the ultrasound-imaging probe by boundaries between body tissues in the path of the beam (e.g., a boundary between a fluid and a soft tissue or between a tissue and a bone).
- When these ultrasound echoes hit the ultrasound-imaging probe, they generate electrical signals, and the ultrasound-imaging probe computes or calculates the distance from the ultrasound-imaging probe to the tissue boundary based on the speed of the ultrasound waves in tissue and the time of each echo's return. These distances are then used to generate two-dimensional (2D) images or three-dimensional (3D) images of tissues and organs of the patient body.
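- As a minimal numeric illustration of the pulse-echo ranging described above (an illustration only, not part of the patent), the one-way distance to a reflecting boundary follows from the round-trip echo time and an assumed average speed of sound in soft tissue of about 1540 m/s:

```python
SPEED_OF_SOUND_SOFT_TISSUE = 1540.0  # m/s, a commonly assumed average for soft tissue

def echo_depth_mm(round_trip_time_s: float, c: float = SPEED_OF_SOUND_SOFT_TISSUE) -> float:
    """Depth of the reflecting boundary: the pulse travels to the boundary and back,
    so the one-way distance is c * t / 2."""
    return 1000.0 * c * round_trip_time_s / 2.0

print(echo_depth_mm(65e-6))  # about 50 mm for a 65-microsecond round trip
```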
- The training method 10 of FIG. 1 comprises at least the following main actions or operations (also interchangeably referred to in the art as stages or steps): (1) providing a patient-specific anatomy 3D model; (2) creating, by the 3D scanner, a 3D model of the patient body surface; (3) manually moving the robot arm from a starting position to at least one predetermined training position on the patient body; (4) manually actuating the ultrasound-imaging probe to produce at least one ultrasound image in each of the training positions; (5) sensing, by the force sensors, the forces applied by the robot arm to the ultrasound-imaging probe; (6) computing a movement trajectory of the robot arm; and (7) creating a robot-training model.
- The above operation (1) includes the following sub-operations: (i) communicating data on a patient type used for training the robot to the control module of the robot, wherein the patient type may be defined by a patient age, a patient gender, a mammal type and/or other similar patient features; and (ii) automatically extracting, by the control module of the robot, a patient-specific anatomy 3D model from the data storage of the robot, the extracted anatomy 3D model depending on the patient type communicated to the control module.
- The anatomy 3D model provided in the operation (1) may correspond to a human, animal or another mammal relating to a particular age group and/or having certain body dimensions or body features, in particular to a male or female human being an infant, teenager, full-aged person, adult, middle-aged person, old person, etc.
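- A minimal sketch of how the control module might key stored anatomy 3D models by patient type is given below; the patient-type fields, catalogue structure and file paths are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PatientType:
    mammal: str      # e.g. "human", "dog"
    gender: str      # e.g. "female"
    age_group: str   # e.g. "adult", "infant"

# Illustrative catalogue: patient type -> path of a stored anatomy 3D model
ANATOMY_MODEL_CATALOGUE = {
    PatientType("human", "female", "adult"): "models/human_female_adult.obj",
    PatientType("human", "male", "infant"): "models/human_male_infant.obj",
}

def extract_anatomy_model(patient_type: PatientType) -> str:
    """Return the stored anatomy 3D model matching the communicated patient type."""
    try:
        return ANATOMY_MODEL_CATALOGUE[patient_type]
    except KeyError:
        raise LookupError(f"No anatomy 3D model stored for {patient_type}") from None

print(extract_anatomy_model(PatientType("human", "female", "adult")))
```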
- The patient type may be provided by the user as a text or voice input communicated to the control module of the robot, wherein the user may use standard I/O means (e.g. a keyboard, a mouse-pointing device, a touch-screen display, a microphone, etc.) connected to the robot.
- The patient-specific anatomy 3D model corresponding to the current patient type used for training the robot is stored for the current training session in the data storage of the robot, wherein the stored patient-specific anatomy 3D model may be communicated to the control module of the robot to provide the operation or the training of the robot.
- The patient type used for training the robot may be preliminarily communicated to the robot to be trained; in particular, the patient type may be originally stored in the local data storage of the robot and automatically extracted therefrom by the control module of the robot when the patient type is required to provide the operation of the robot.
- The patient type may be received, by the control module of the robot, from a data server, a cloud storage, an external storage or a similar external storage device used for storing data on the patient type to be used for training the robot.
- The robot is further provided with a communication module communicatively connected (e.g. in a wireless manner via a communication network or in a wired manner via a physical cable) to the data server, cloud storage, external storage or similar storage device to receive data on the patient type therefrom, wherein the robot is also provided with a communication bus communicatively coupled both to the communication module and the control module.
- The patient-specific anatomy 3D model to be used for training the robot may likewise be received, by the control module of the robot, from a data server, a cloud storage, an external storage or a similar external storage device used for storing data on the patient type to be used for training the robot.
- The robot is further provided with a communication module communicatively connected (e.g. in a wireless manner via a communication network or in a wired manner via a physical cable) to the data server, cloud storage, external storage or similar storage device to receive the anatomy 3D model therefrom, wherein the robot is also provided with a communication bus communicatively coupled both to the communication module and the control module.
- The communication module may be implemented as a network adapter provided with slots appropriate for connecting physical cables of desired types thereto if wired connections are provided between the robot and any external devices mentioned in the present document, or as a network adapter in the form of a WiFi adapter, 3G/4G/5G adapter, LTE adapter, or any other appropriate adapter supporting any known wireless communication technology if wireless connections are provided between the robot and any external storage devices mentioned in the present document.
- Such a communication module may also be implemented as a network adapter supporting a combination of the above-mentioned wired or wireless communication technologies depending on the types of connections provided between the robot and any external storage devices mentioned in the present document.
- The 3D scanner used for performing the above operation (2) may be implemented as any appropriate stationary, handheld or mobile 3D scanner or 3D scanning device used in healthcare applications for producing body-surface 3D scans.
- The body-surface 3D scans produced by the 3D scanner allow the control module to create a 3D whole-body model (also referred to in the art as a 3D avatar).
- The 3D scanner allows capturing in 3D a full patient body or particular parts of the patient body, wherein the created 3D body model represents accurate contours of the patient body, taking into account body sizes, body shapes, body features being specific or individual for a particular patient, body textures, and a patient posture.
- The created 3D model of the patient body is communicated by the 3D scanner to the robot, wherein the 3D scanner is communicatively connected to the robot, and the communication module of the robot is configured to provide data transfer between the robot and the 3D scanner.
- The 3D body model is stored in the local data storage of the robot, the 3D body model being associated with the examined patient used for training the robot and with the current training session, wherein the stored 3D body model may be used by the control module of the robot to provide the operation or the training of the robot.
- The 3D scanner used for performing the above operation (2) may be integrated with or mounted on the robot arm, thereby being another functional module of the robot.
- The 3D scanner may be a scanning module of the robot, the scanning module being controlled by the control module of the robot, wherein the created body surface 3D model may be stored in the local data storage of the robot.
- The user personally helps the robot arm to hold the ultrasound-imaging probe in an appropriate manner (i.e. manually providing an appropriate orientation of the ultrasound-imaging probe in relation to the patient body and manually controlling the forces applied by the robot arm to the ultrasound-imaging probe to hold thereof) and manually moves the handheld robot arm from the starting position to at least one predetermined training position on the patient body, wherein the patient used for training the robot takes a particular position relating to the particular ultrasound examination to be subsequently performed with the trained robot.
- When the ultrasound-imaging probe is positioned directly in each of the training positions, the ultrasound-imaging probe at least takes a required position in relation to a particular examined area or portion of the patient body, has a required orientation in relation to that examined body area or portion, and is subjected to the required forces applied by the user, thereby allowing the ultrasound-imaging probe to capture the most representative ultrasound images corresponding to a particular ultrasound examination.
- In case there are two or more training positions to be taken by the robot arm in relation to the patient body in order to perform a particular ultrasound examination, the robot arm is manually moved by the user in the above-described manner from the starting position to a first training position, then manually moved from the first training position to the second training position, and so on. In other words, in this case the robot arm needs to be successively manually moved by the user from the starting position through the training positions.
- The robot arm position tracker detects or tracks robot arm positions in a three-dimensional system, so that each of the robot arm positions taken by the robot arm during the manual movement thereof (i.e. the starting position and each of the training positions) corresponds to particular coordinates in the three-dimensional system.
- The tracked robot arm positions are stored in the local data storage of the robot, wherein the stored robot arm positions, each corresponding to particular coordinates in the three-dimensional system, may be used by the control module of the robot to provide the operation or the training of the robot.
- The 3D scanner may produce additional or supplemental body-surface 3D scans during the manual movement of the robot arm and update the created whole-body surface 3D model based on the supplemental body-surface 3D scans, wherein the control module of the robot may correct the robot arm locations according to the updated body surface 3D model.
- The user manually actuates the ultrasound-imaging probe, and the actuated ultrasound-imaging probe automatically generates or produces at least one ultrasound image in each of the training positions.
- The ultrasound images produced by the ultrasound-imaging probe are stored in the data storage of the robot, wherein each ultrasound image is associated with a corresponding one of the training positions. Meanwhile, the stored ultrasound images may be used by the control module of the robot to provide the operation or the training of the robot.
- The ultrasound-imaging probe may be further actuated by the user to produce supplemental or additional ultrasound images when the robot arm is moved between the starting position and the training positions and/or moved between the training positions, and/or is directly located in the starting position.
- The force sensors sense a plurality of forces or a set of forces (in particular, the sensed forces may be pressing forces) applied by the robot arm to the ultrasound-imaging probe to hold the ultrasound-imaging probe when the robot arm is directly located in each of the above-stated robot arm positions.
- Sets of forces, each associated with a corresponding one of the above-stated robot arm positions, are stored in the local data storage of the robot. Meanwhile, the stored sets of forces may then be used by the control module of the robot to provide the operation or the training of the robot.
- The force sensors may be automatically activated when actuating the robot and may perform force measurements from the robot actuation moment to the moment when the robot is deactivated, so that the above-mentioned forces applied by the robot arm to the ultrasound-imaging probe may be sensed by the force sensors for each position taken by the robot arm during the manual movement thereof in relation to the patient body, including the starting position and each of the training positions.
- The force sensors may alternatively be manually activated by the user when the robot arm is located in the starting position and may then keep performing force measurements until the robot is deactivated.
- The force sensors may further sense forces applied by the robot arm to the ultrasound-imaging probe to hold thereof when the robot arm is moved between the starting position and the training positions, i.e. in the process of movement between the starting position and the first training position, then in the process of movement between the first training position and the next second training position, etc. Therefore, in the present embodiment, each set of forces sensed by the force sensors during the movement of the robot arm between the starting position and one of the training positions corresponds to a particular robot arm intermediate position between the starting position and said training position.
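- One possible way to record such synchronized pose and force samples for a movement segment is sketched below; the sampling rate, data fields and sensor-reading callbacks are assumptions rather than details taken from the patent.

```python
import time
from dataclasses import dataclass

@dataclass
class TrainingSample:
    t: float                 # seconds since the start of the recording
    position: tuple          # (x, y, z) from the robot arm position tracker
    orientation: tuple       # probe orientation, e.g. a quaternion (w, x, y, z)
    forces: tuple            # readings of the force sensors holding the probe

def record_segment(read_pose, read_forces, duration_s: float, rate_hz: float = 50.0):
    """Sample pose and holding forces at a fixed rate for one movement segment."""
    samples, period, t0 = [], 1.0 / rate_hz, time.monotonic()
    while (now := time.monotonic()) - t0 < duration_s:
        position, orientation = read_pose()
        samples.append(TrainingSample(now - t0, position, orientation, read_forces()))
        time.sleep(period)
    return samples

# Usage with stand-in readers (real readers would query the tracker and force sensors)
demo = record_segment(lambda: ((0.1, 0.2, 0.3), (1.0, 0.0, 0.0, 0.0)),
                      lambda: (1.2, 0.8, 0.5), duration_s=0.1)
print(len(demo), "samples recorded")
```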
- To compute the movement trajectory of the robot arm, the control module of the robot performs at least the following sub-operations: (i) extracting, from the data storage of the robot, data on robot arm positions tracked by the robot arm position tracker during the manual movement of the robot arm; (ii) extracting, from the data storage of the robot, data on the body surface 3D model created by the 3D scanner and data on the anatomy 3D model; (iii) associating the extracted robot arm positions with the extracted body surface 3D model in order to have precise robot arm locations on the created body surface 3D model, wherein the robot arm locations are each associated with the body surface 3D model; (iv) associating the robot arm locations with the anatomy 3D model in order to provide the correlation between the robot arm locations and the anatomy 3D model, so that the robot arm locations are each further associated with the anatomy 3D model; and (v) computing a movement trajectory of the robot arm based on the robot arm locations associated with both the anatomy 3D model and the body surface 3D model.
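- A minimal sketch of sub-operations (iii)-(v) is given below, assuming the body surface 3D model is available as a point cloud of vertices; snapping each tracked robot arm position to its nearest model vertex is just one possible way of associating the positions with the model.

```python
import numpy as np
from scipy.spatial import cKDTree

def locate_on_surface(tracked_positions: np.ndarray, surface_vertices: np.ndarray):
    """Snap each tracked robot arm position (N x 3) to its nearest vertex of the
    body-surface 3D model (M x 3), returning vertex indices and snapped points."""
    tree = cKDTree(surface_vertices)
    _, idx = tree.query(tracked_positions)
    return idx, surface_vertices[idx]

def movement_trajectory(tracked_positions: np.ndarray, surface_vertices: np.ndarray):
    """Ordered sequence of robot arm locations on the surface model, i.e. the
    trajectory from the starting position through the training positions."""
    _, located = locate_on_surface(tracked_positions, surface_vertices)
    return located

# Toy example: a flat 'body surface' grid and three tracked positions above it
grid = np.array([[x, y, 0.0] for x in range(5) for y in range(5)], dtype=float)
tracked = np.array([[0.1, 0.2, 0.05], [2.1, 1.9, 0.04], [3.9, 4.2, 0.06]])
print(movement_trajectory(tracked, grid))
```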
- The computed movement trajectory of the robot arm is stored in the data storage of the robot, wherein the stored movement trajectory may be used by the control module to provide the operation or the training of the robot.
- To create the robot-training model, the control module of the robot performs at least the following sub-operations: (i) extracting, from the data storage of the robot, data on the robot arm movement trajectory computed by the control module; (ii) extracting, from the data storage of the robot, data on the forces sensed by the force sensors for the robot arm training positions; (iii) extracting, from the data storage of the robot, data on the ultrasound images produced by the ultrasound-imaging probe for the robot arm training positions; and (iv) associating the extracted robot arm movement trajectory with both the extracted forces and the extracted ultrasound images to form or create the robot-training model.
- The created robot-training model is stored in the data storage of the robot as robot control instructions, wherein the stored robot-training model may then be used by the control module of the robot to provide the most precise and effective control of the robot when it is used for performing the same or a similar ultrasound examination for a patient having the same or similar body features.
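- The robot-training model could, for example, be represented and persisted as sketched below; the record fields and the pickle-based storage are illustrative assumptions rather than the patent's actual control-instruction format.

```python
import pickle
from dataclasses import dataclass, field
from typing import List

@dataclass
class TrainingPosition:
    location: tuple           # robot arm location on the body-surface 3D model
    holding_forces: tuple     # forces applied by the robot arm to hold the probe
    ultrasound_images: List[bytes] = field(default_factory=list)  # raw image frames

@dataclass
class RobotTrainingModel:
    patient_type: str                     # keys the patient-specific anatomy 3D model
    trajectory: List[tuple]               # ordered robot arm locations (movement trajectory)
    training_positions: List[TrainingPosition]

def save_as_control_instructions(model: RobotTrainingModel, path: str) -> None:
    """Persist the robot-training model to the robot data storage."""
    with open(path, "wb") as f:
        pickle.dump(model, f)

model = RobotTrainingModel(
    patient_type="human/female/adult",
    trajectory=[(0.0, 0.0, 0.0), (0.1, 0.2, 0.0)],
    training_positions=[TrainingPosition((0.1, 0.2, 0.0), (1.1, 0.9, 0.4), [b"\x00" * 16])],
)
save_as_control_instructions(model, "robot_training_model.pkl")
```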
- Before moving the robot arm from the starting position to a first training position, the user needs to manually move the robot arm with the ultrasound-imaging probe held in the above-described manner from an initial position to the starting position, wherein the robot arm in the initial position may be spaced from the patient body, in particular from the starting position corresponding to a particular place or point on the patient body.
- The control module of the robot may compute a positioning trajectory of the robot arm based on a plurality of robot arm spatial locations in relation to at least one reference point on the body surface 3D model created by the 3D scanner (such reference points may be preliminarily set by the user and communicated to the control module of the robot), the spatial locations corresponding to the robot arm positions tracked by the robot arm position tracker during the initial movement of the robot arm (i.e. while the robot arm is manually moved by the user from the initial position to the starting position) and associated with the created body surface 3D model.
- The robot arm spatial locations corresponding to the patient body may be further associated with the anatomy 3D model.
- The force sensors may further sense a plurality of forces applied by the robot arm to the ultrasound-imaging probe to hold thereof during the initial movement of the robot arm, i.e. during the movement of the robot arm from the initial position to the starting position, and the control module of the robot may form or create a robot-positioning model by associating the computed positioning trajectory with the forces sensed during the initial movement of the robot arm.
- The robot may be further provided with a display or communicatively connected to a display, wherein the display may be configured to display to the user the ultrasound images produced by the ultrasound-imaging probe when performing the operation (4).
- The user may accept at least one particular ultrasound image among the displayed ultrasound images, wherein such accepted ultrasound images may correspond to ultrasound images being the most representative for a particular ultrasound examination and/or a particular patient. It is to be noted that the most appropriate ultrasound images may be accepted by inputting a user text or voice command and communicating such command to the control module of the robot, wherein the user may use standard I/O means (e.g. a keyboard, a mouse-pointing device, a touch-screen display, a microphone, etc.) connected to the robot.
- Robot-training models stored in the data storage of the robot as robot control instructions to be communicated to the control module of the robot may then be used by the robot for performing an ultrasound examination for a new patient in a safe, precise and effective way.
- The control module of the robot may use the most suitable robot-training model, chosen by the control module based on initial input data relating to the patient to be examined (for example, a patient age, a patient gender, a mammal type and/or other appropriate patient parameters defining patient body features) and on the 3D body model initially produced by the 3D scanner for the new patient, thereby allowing the ultrasound examination to be performed for the new patient such that it simulates the best user practice being the most suitable for the new patient, with due consideration of individual body features of the patient.
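- A minimal sketch of such a selection step is shown below, assuming stored training sessions are keyed by patient type and compared through rough bounding-box dimensions of the recorded body-surface scans; the similarity measure and data layout are illustrative assumptions.

```python
import numpy as np

def body_extent(vertices: np.ndarray) -> np.ndarray:
    """Rough body descriptor: bounding-box dimensions of a surface point cloud."""
    return vertices.max(axis=0) - vertices.min(axis=0)

def choose_training_model(new_patient_type, new_body_vertices, stored_sessions):
    """stored_sessions: list of (patient_type, body_vertices, training_model) tuples.
    Keep sessions recorded for the same patient type, then pick the one whose
    recorded body surface is dimensionally closest to the freshly scanned one."""
    candidates = [s for s in stored_sessions if s[0] == new_patient_type]
    if not candidates:
        raise LookupError("No robot-training model stored for this patient type")
    target = body_extent(new_body_vertices)
    best = min(candidates, key=lambda s: np.linalg.norm(body_extent(s[1]) - target))
    return best[2]

# Toy usage with random stand-in scans
rng = np.random.default_rng(0)
stored = [("human/female/adult", rng.random((200, 3)) * scale, f"training_model_{i}")
          for i, scale in enumerate((1.6, 1.8))]
print(choose_training_model("human/female/adult", rng.random((200, 3)) * 1.78, stored))
```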
- The control module may use two or more robot-training models in the above-mentioned way, wherein each robot-training model chosen by the control module of the robot relates to the most suitable real user practice for a particular part of the ultrasound examination.
- The created robot-training model, representing the collected or accumulated data merged together, may also be used for creating special training algorithms based thereon, including but not limited to neural networks having different known topologies, thereby allowing the robot using such training algorithms, when used in practice, to generate an individual movement trajectory for the robot arm and individual force vectors for each individual patient, taking into account an individual patient anatomy and body features.
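- As one illustration of such a training algorithm (an assumption, not the patent's specific method), a small PyTorch regressor could map a patient descriptor derived from the anatomy and body-surface models to a set of probe waypoints and per-waypoint holding forces:

```python
import torch
from torch import nn

N_WAYPOINTS = 8   # assumed number of trajectory waypoints to predict
FEATURES = 6      # assumed patient descriptor length (age, height, weight, girth, ...)

class TrajectoryForceNet(nn.Module):
    """Maps a patient descriptor to N waypoints (x, y, z) plus a holding force each."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(FEATURES, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_WAYPOINTS * 4),   # 3 coordinates + 1 force per waypoint
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).view(-1, N_WAYPOINTS, 4)

# One gradient step on stand-in data drawn from accumulated robot-training models
model = TrajectoryForceNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
patients = torch.randn(32, FEATURES)           # stand-in patient descriptors
targets = torch.randn(32, N_WAYPOINTS, 4)      # stand-in recorded trajectories + forces
loss = nn.functional.mse_loss(model(patients), targets)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(float(loss))
```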
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Surgery (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Human Computer Interaction (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/RU2021/000176 WO2022231453A1 (en) | 2021-04-27 | 2021-04-27 | Training method for training a robot to perform an ultrasound examination |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/RU2021/000176 WO2022231453A1 (en) | 2021-04-27 | 2021-04-27 | Training method for training a robot to perform an ultrasound examination |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022231453A1 (en) | 2022-11-03 |
Family
ID=83848433
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/RU2021/000176 (WO2022231453A1, Ceased) | Training method for training a robot to perform an ultrasound examination | 2021-04-27 | 2021-04-27 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2022231453A1 (en) |
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20150297177A1 (en) * | 2014-04-17 | 2015-10-22 | The Johns Hopkins University | Robot assisted ultrasound system |
| US20210059772A1 (en) * | 2018-02-27 | 2021-03-04 | Intuitive Surgical Operations, Inc. | Artificial intelligence guidance system for robotic surgery |
Non-Patent Citations (1)
| Title |
|---|
| FARSONI, SAVERIO; ASTOLFI, LUCA; BONFE, MARCELLO; SPADARO, SAVINO; VOLTA, CARLO ALBERTO: "A Versatile Ultrasound Simulation System for Education and Training in High-Fidelity Emergency Scenarios", IEEE Journal of Translational Engineering in Health and Medicine, vol. 5, pages 1-9, XP011640079, DOI: 10.1109/JTEHM.2016.2635635 * |
Similar Documents
| Publication | Title |
|---|---|
| Li et al. | An overview of systems and techniques for autonomous robotic ultrasound acquisitions |
| Neubach et al. | Ultrasound-guided robot for flexible needle steering |
| US11264135B2 (en) | Machine-aided workflow in ultrasound imaging |
| RU2639026C2 (en) | Method and device for interactive display of three-dimensional ultrasound images |
| CN107157512B (en) | Ultrasonic diagnostic apparatus and ultrasonic diagnostic support apparatus |
| JP6104543B2 (en) | Ultrasonic diagnostic apparatus, ultrasonic image display apparatus, and ultrasonic image display method |
| JP5208415B2 (en) | Method, system and computer program for generating ultrasound images |
| TWI476403B (en) | Automated ultrasonic scanning system and scanning method thereof |
| US20080021317A1 (en) | Ultrasound medical imaging with robotic assistance for volume imaging |
| US20110301461A1 (en) | Self-administered breast ultrasonic imaging systems |
| JP7462624B2 (en) | Deep learning based ultrasound imaging guidance and associated devices, systems, and methods |
| EP2836133B1 (en) | Cohesive robot-ultrasound probe for prostate biopsy |
| CN115210761A (en) | Multi-modality medical image registration and associated devices, systems, and methods |
| JP2021520939A (en) | Adaptive ultrasonic scanning |
| JP6968576B2 (en) | Ultrasonic diagnostic device and ultrasonic diagnostic support device |
| CN114845642A (en) | Intelligent measurement assistance for ultrasound imaging and associated devices, systems, and methods |
| JP2011505951A (en) | Robot ultrasound system with fine adjustment and positioning control using a feedback responsive to the acquired image data |
| CN104815399A (en) | High-strength focusing ultrasonic treatment guiding and control system and method based on six-shaft mechanical arm |
| EP2977012A1 (en) | Ultrasound imaging apparatus and controlling method thereof |
| RU2478340C2 (en) | Systems and methods for mechanical transfer of single-piece matrix lattice |
| JP7538705B2 (en) | Ultrasound diagnostic system and operation support method |
| KR20200107615A (en) | Ultrasound imaging apparatus, method for controlling the same, and computer program product |
| CN112137643A (en) | Region of interest localization for longitudinal monitoring in quantitative ultrasound |
| WO2022231453A1 (en) | Training method for training a robot to perform an ultrasound examination |
| Li et al. | A virtual scanning framework for robotic spinal sonography with automatic real-time recognition of standard views |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21939483; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 21939483; Country of ref document: EP; Kind code of ref document: A1 |
| | 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17.04.2024) |