
WO2020210967A1 - Optical tracking system and training system for medical appliances - Google Patents


Info

Publication number
WO2020210967A1
WO2020210967A1 (application PCT/CN2019/082803, CN2019082803W)
Authority
WO
WIPO (PCT)
Prior art keywords
medical
surgical
optical
medical appliance
presentation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2019/082803
Other languages
English (en)
Chinese (zh)
Inventor
孙永年
周一鸣
朱敏慈
沈庭立
邱昌逸
蔡博翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to PCT/CN2019/082803 priority Critical patent/WO2020210967A1/fr
Publication of WO2020210967A1 publication Critical patent/WO2020210967A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 - Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/12 - Arrangements for detecting or locating foreign bodies
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges

Definitions

  • the invention relates to an optical tracking system and a training system, in particular to an optical tracking system and a training system for medical appliances.
  • the object of the present invention is to provide an optical tracking system and training system for medical devices, which can assist or train users to operate medical devices.
  • An optical tracking system for medical appliances which includes a plurality of optical markers, a plurality of optical sensors, and a computer device.
  • the optical markers are arranged on the medical appliance.
  • the optical sensors optically sense the optical markers to respectively generate a plurality of sensing signals.
  • the computer device is coupled to the optical sensors to receive the sensing signals, has a three-dimensional model of the surgical situation, and adjusts the relative position between the medical appliance presentation and the surgical target presentation in the three-dimensional model of the surgical situation according to the sensing signals.
  • the computer device and the optical sensor perform a pre-operation procedure, and the pre-operation procedure includes: calibrating the coordinate system of the optical sensor; and adjusting the scaling ratio of the medical appliance and the surgical target object.
  • the computer device and the optical sensor perform a coordinate calibration program
  • the calibration program includes an initial calibration step, an optimization step, and a correction step.
  • the initial correction step is to perform initial correction between the coordinate system of the optical sensor and the coordinate system of the three-dimensional model of the surgical situation to obtain initial conversion parameters.
  • the optimization step is to optimize the degrees of freedom of the initial conversion parameters to obtain the optimized conversion parameters.
  • the correction step is to correct the setting error caused by the optical marker in the optimized conversion parameter.
  • the initial calibration step is to use singular value decomposition (SVD), triangular coordinate registration, or linear least squares estimation.
  • the initial calibration step is to use singular value decomposition to find the transformation matrix between the feature points of the medical appliance and the optical sensor as the initial transformation parameter.
  • the transformation matrix includes a covariance matrix and a rotation matrix.
  • the optimization step is to obtain multiple Euler angles with multiple degrees of freedom from the rotation matrix, and use Gauss-Newton method to iteratively optimize the parameters of the multiple degrees of freedom to obtain optimized conversion parameters.
  • the computer device sets the positions of the medical appliance presentation and the surgical target presentation in the three-dimensional model of the surgical situation according to the optimized conversion parameters and the sensing signal.
  • the correcting step is to use the reverse conversion and sensing signals to correct the positions of the medical appliance presentation and the surgical target presentation in the three-dimensional model of the surgical situation.
  • the computer device outputs display data, and the display data is used to present the 3D images of the medical appliance presentation and the surgical target presentation.
  • the computer device generates the medical image according to the three-dimensional model of the surgical situation and the medical image model.
  • the surgical target object is an artificial limb
  • the medical image is an artificial medical image for the surgical target object.
  • the computer device deduces the position of the medical appliance inside and outside the surgical target object, and adjusts the relative position between the medical appliance presentation and the surgical target presentation in the three-dimensional model of the surgical situation accordingly.
  • a training system for operating medical appliances includes medical appliances and the aforementioned optical tracking system for medical appliances.
  • the medical appliance includes a medical probe and a surgical appliance
  • the medical appliance presentation includes a medical probe presentation and a surgical appliance presentation.
  • the computer device scores the detection object found by the medical probe presentation and the operation of the surgical appliance presentation.
  • a method for calibrating an optical tracking system for medical appliances includes a sensing step, an initial calibration step, an optimization step, and a correction step.
  • the sensing step uses a plurality of optical sensors of the optical tracking system to optically sense a plurality of optical markers of the optical tracking system arranged on the medical appliance, to generate a plurality of sensing signals;
  • the initial calibration step performs, according to the sensing signals, an initial calibration between the coordinate system of the optical sensors and the coordinate system of the three-dimensional model of the surgical situation to obtain the initial conversion parameters;
  • the optimization step is to optimize the degrees of freedom of the initial conversion parameters to obtain the optimized conversion parameters;
  • the correction step is to correct, in the optimized conversion parameters, the setting error caused by the optical markers.
  • the calibration method further includes a pre-operation procedure, the pre-operation procedure includes calibrating the coordinate system of the optical sensor; and adjusting the zoom ratio for the medical appliance and the surgical target object.
  • the initial calibration step is to use singular value decomposition (SVD), triangular coordinate registration, or linear least squares estimation.
  • the initial calibration step is to use singular value decomposition to find the transformation matrix between the feature points of the medical appliance presentation of the three-dimensional model of the surgical situation and the optical sensor as the initial transformation parameter.
  • the transformation matrix includes a covariance matrix and a rotation matrix, which are used as the initial transformation parameter.
  • the optimization step is to obtain multiple Euler angles with multiple degrees of freedom from the rotation matrix, and use the Gauss-Newton method to iteratively optimize the parameters of the multiple degrees of freedom to obtain the optimized conversion parameters.
  • the positions of the medical appliance presentation and the surgical target presentation in the three-dimensional model of the surgical situation are set according to the optimized conversion parameters and the sensing signal.
  • the correction step is to use the reverse conversion and sensing signals to correct the positions of the medical appliance presentation and the surgical target presentation in the three-dimensional model of the surgical situation.
  • the optical tracking system of the present disclosure can assist or train users to operate medical appliances, and the training system of the present disclosure can provide a realistic surgical training environment for the trainees to effectively assist the trainees in completing surgical training.
  • FIG. 1A is a block diagram of the optical tracking system of the embodiment.
  • FIGS. 1B and 1C are schematic diagrams of the optical tracking system of the embodiment.
  • Fig. 1D is a schematic diagram of a three-dimensional model of the surgical situation of the embodiment.
  • Fig. 2 is a flow chart of the pre-operation procedure of the optical tracking system of the embodiment.
  • FIG. 3A is a flowchart of the coordinate correction program of the optical tracking system of the embodiment.
  • Fig. 3B is a schematic diagram of the coordinate system correction of the embodiment.
  • Fig. 3C is a schematic diagram of the degrees of freedom of the embodiment.
  • Fig. 4 is a block diagram of the training system for medical appliance operation according to the embodiment.
  • Fig. 5A is a schematic diagram of a three-dimensional model of the operation situation of the embodiment.
  • FIG. 5B is a schematic diagram of a three-dimensional model of an entity medical image according to an embodiment.
  • FIG. 5C is a schematic diagram of the three-dimensional model of the artificial medical image of the embodiment.
  • FIGS. 6A to 6D are schematic diagrams of the direction vectors of the medical appliances of the embodiment.
  • FIGS. 7A to 7D are schematic diagrams of the training process of the training system of the embodiment.
  • Fig. 8A is a schematic diagram of the finger structure of the embodiment.
  • FIG. 8B is a schematic diagram of applying principal component analysis on bones from computed tomography images in this embodiment.
  • Fig. 8C is a schematic diagram of applying principal component analysis on the skin from a computed tomography image in an embodiment.
  • Fig. 8D is a schematic diagram of calculating the distance between the bone spindle and the medical appliance according to the embodiment.
  • Fig. 8E is a schematic diagram of the artificial medical image of the embodiment.
  • Fig. 9A is a block diagram for generating artificial medical images according to an embodiment.
  • Fig. 9B is a schematic diagram of the artificial medical image of the embodiment.
  • FIGS. 10A and 10B are schematic diagrams of the artificial hand model and the correction of the ultrasonic volume of the embodiment.
  • Fig. 10C is a schematic diagram of ultrasonic volume and collision detection of the embodiment.
  • Fig. 10D is a schematic diagram of an artificial ultrasound image of the embodiment.
  • FIG. 1A is a block diagram of the optical tracking system of the embodiment.
  • the optical tracking system 1 for medical appliances includes a plurality of optical markers 11, a plurality of optical sensors 12, and a computer device 13.
  • the optical markers 11 are arranged on one or more medical appliances; here there are a plurality of medical appliances 21-24.
  • the optical marker 11 can also be set on the surgical target object 3.
  • the medical appliances 21-24 and the surgical target object 3 are placed on the platform 4, and the optical sensors 12 optically sense the optical markers 11 to generate multiple sensing signals.
  • the computer device 13 is coupled to the optical sensors 12 to receive the sensing signals, has a three-dimensional model 14 of the surgical context, and adjusts the relative positions between the medical appliance presentations 141-144 and the surgical target presentation 145 in the three-dimensional model 14 of the surgical context according to the sensing signals.
  • the medical appliance presentation objects 141 to 144 and the surgery target presentation object 145 are shown in FIG. 1D, and represent the medical appliances 21 to 24 and the surgery target object 3 in the three-dimensional model 14 of the surgery situation.
  • the three-dimensional model 14 of the surgical situation can obtain the current positions of the medical appliances 21-24 and the surgical target object 3 and reflect the medical appliance presentation and the surgical target presentation accordingly.
  • FIG. 1B is a schematic diagram of the optical tracking system of the embodiment: four optical sensors 121 to 124 are installed on the ceiling, facing the optical markers 11, the medical appliances 21 to 24, and the surgical target object 3 on the platform 4.
  • the medical tool 21 is a medical probe, such as a probe for ultrasonic imaging detection or another device that can detect the inside of the surgical target object 3; these devices are actually used clinically, and the probe for ultrasonic imaging detection is, for example, an ultrasonic transducer.
  • the medical appliances 22-24 are surgical appliances, such as needles, scalpels, and hooks, which are actually used clinically. When used for surgical training, the medical probe can be a device actually used in clinical practice or a simulated device that imitates the clinical one, and likewise the surgical instruments can be clinical devices or simulated devices.
  • Figure 1C is a schematic diagram of the optical tracking system of the embodiment.
  • the medical appliances 21-24 and the surgical target 3 on the platform 4 are used for surgical training, such as minimally invasive finger surgery, e.g., trigger finger release surgery.
  • the material of the clamps of the platform 4 and the medical appliances 21-24 can be wood.
  • the medical appliance 21 is a realistic ultrasonic transducer (or probe), and the medical appliances 22-24 include a plurality of surgical instruments, such as a dilator, a needle, and a hook blade.
  • the surgical target 3 is a hand phantom.
  • Three or four optical markers 11 are installed on each medical appliance 21-24, and three or four optical markers 11 are also installed on the surgical target object 3.
  • the computer device 13 is connected to the optical sensor 12 to track the position of the optical marker 11 in real time.
  • there are 17 optical markers 11 in total: 4 are attached on or around the surgical target object 3, and 13 are on the medical appliances 21-24.
  • the optical sensor 12 continuously transmits real-time information to the computer device 13.
  • the computer device 13 also uses a movement judgment function to reduce the computational burden: if the moving distance of an optical marker 11 is less than a threshold value, the position of that optical marker 11 is not updated.
  • the threshold value is, for example, 0.7mm.
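As an illustration, the movement-judgment function described above can be sketched as follows; the class and names are hypothetical, and only the 0.7 mm threshold comes from the text.

```python
import numpy as np

MOVE_THRESHOLD_MM = 0.7  # threshold value from the description above

class MarkerTracker:
    """Hypothetical sketch: skip position updates for near-static markers."""

    def __init__(self, num_markers: int):
        self.positions = np.zeros((num_markers, 3))  # last accepted positions (mm)

    def update(self, marker_id: int, new_pos: np.ndarray) -> bool:
        """Update a marker position only if it moved beyond the threshold."""
        if np.linalg.norm(new_pos - self.positions[marker_id]) < MOVE_THRESHOLD_MM:
            return False  # movement below threshold: keep old position, save work
        self.positions[marker_id] = new_pos
        return True
```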
  • the computer device 13 includes a processing core 131, a storage element 132, and a plurality of I/O interfaces 133, 134.
  • the processing core 131 is coupled to the storage element 132 and I/O interfaces 133, 134.
  • the I/O interface 133 receives the sensing signals of the optical sensors 12, the I/O interface 134 communicates with the output device 5, and the computer device 13 can output the processing results to the output device 5 through the I/O interface 134.
  • the I/O interfaces 133 and 134 are, for example, peripheral transmission ports or communication ports.
  • the output device 5 is a device capable of outputting images, such as a display, a projector, a printer, and so on.
  • the storage element 132 stores program codes for execution by the processing core 131.
  • the storage element 132 includes a non-volatile memory and a volatile memory.
  • the non-volatile memory is, for example, a hard disk, a flash memory, a solid state disk, an optical disc, etc.
  • the volatile memory is, for example, dynamic random access memory, static random access memory, and so on.
  • the program code is stored in the non-volatile memory, and the processing core 131 can load the program code from the non-volatile memory to the volatile memory, and then execute the program code.
  • the storage component 132 stores the program code and data of the operation situation three-dimensional model 14 and the tracking module 15, and the processing core 131 can access the storage component 132 to execute and process the operation situation three-dimensional model 14 and the program code and data of the tracking module 15.
  • the processing core 131 is, for example, a processor, a controller, etc., and the processor includes one or more cores.
  • the processor may be a central processing unit or a graphics processor, and the processing core 131 may also be the core of a processor or a graphics processor.
  • the processing core 131 may also be a processing module, and the processing module includes multiple processors.
  • the operation of the optical tracking system includes the connection between the computer device 13 and the optical sensor 12, the pre-operation program, the coordinate correction program of the optical tracking system, the real-time rendering program, etc.
  • the tracking module 15 represents the relevant program codes of these operations and Data
  • the storage element 132 of the computer device 13 stores the tracking module 15
  • the processing core 131 executes the tracking module 15 to perform these operations.
  • after the optimized conversion parameters are found, the computer device 13 can set the positions of the medical appliance presentations 141-144 and the surgical target presentation 145 in the three-dimensional model 14 of the surgical situation according to the optimized conversion parameters and the sensing signals.
  • the computer device 13 can deduce the position of the medical appliance 21 inside and outside the surgical target object 3, and adjust the relative position between the medical appliance presenting objects 141 to 144 and the surgical target presenting object 145 in the three-dimensional model 14 of the operation situation accordingly.
  • the medical appliances 21-24 can be tracked in real time from the detection results of the optical sensor 12 and correspondingly presented in the three-dimensional model 14 of the surgical context.
  • the presentation of the three-dimensional model 14 in the surgical context is shown in FIG. 1D, for example.
  • the three-dimensional model 14 of the operation situation is a native model, which includes models established for the surgical target object 3 and also includes models established for the medical appliances 21-24.
  • the method of establishment can be that the developer directly uses computer graphics technology to construct it on the computer, such as using drawing software or special application development software.
  • the computer device 13 can output the display data 135 to the output device 5.
  • the display data 135 is used to present 3D images of the medical appliance presentation objects 141-144 and the surgical target presentation object 145.
  • the output device 5 can output the display data 135.
  • the output method is, for example, displaying or printing; the result of outputting in display mode is shown in FIG. 1D, for example.
  • FIG. 2 is a flowchart of the pre-operation procedure of the optical tracking system of the embodiment.
  • the computer device 13 and the optical sensor 12 perform a pre-operation procedure.
  • the pre-operation procedure includes steps S01 and S02, which calibrate the optical sensors 12 and rescale all the medical appliances 21-24.
  • Step S01 is to calibrate the coordinate system of the optical sensor 12.
  • a plurality of calibration sticks carry a plurality of optical markers, and the area they enclose is used to define the working area.
  • the optical sensors 12 sense the optical markers on the calibration sticks; when all optical markers are detected by each optical sensor 12, the area enclosed by the calibration sticks is the effective working area.
  • the calibration sticks are manually placed by the user, who can adjust their positions to modify the effective working area.
  • the sensitivity detected by the optical sensor 12 can be about 0.3 mm.
  • the coordinate system where the detection result of the optical sensor 12 is located is called the tracking coordinate system.
  • Step S02 is to adjust the scaling ratio of the medical appliances 21 to 24 and the surgical target object 3.
  • since the medical appliances 21-24 are usually rigid bodies, the coordinate correction adopts a rigid-body correction method to avoid distortion; therefore, the medical appliances 21-24 must be rescaled to the tracking coordinate system to obtain correct calibration results.
  • the calculation of the scaling ratio can be obtained by the following formula: scale = (1/n) Σi ( ||Mesh_i - Mesh_G|| / ||Track_i - Track_G|| ), where n is the number of optical markers,
  • Track_G is the center of gravity in the tracking coordinate system,
  • Track_i is the position of the i-th optical marker in the tracking coordinate system,
  • Mesh_G is the center of gravity in the mesh coordinate system, and Mesh_i is the corresponding feature point in the mesh coordinate system.
  • the tracking coordinate system is the coordinate system adopted by the detection results of the optical sensors 12, and the mesh coordinate system is the coordinate system adopted by the three-dimensional model 14 of the surgical situation.
  • Step S02 first calculates the center of gravity in the tracking coordinate system and in the mesh coordinate system, then calculates the distance between each optical marker and the center of gravity in both coordinate systems, forms the individual mesh-to-tracking ratios, and finally adds up all the individual ratios and divides by the number of optical markers to obtain the ratio of the mesh coordinate system to the tracking coordinate system.
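A minimal sketch of the step S02 computation in Python with NumPy; the function and argument names are illustrative, not from the patent.

```python
import numpy as np

def scaling_ratio(track_pts: np.ndarray, mesh_pts: np.ndarray) -> float:
    """Average of the individual mesh-to-tracking distance ratios.

    track_pts: (n, 3) optical-marker positions in the tracking coordinate system.
    mesh_pts:  (n, 3) corresponding feature points in the mesh coordinate system.
    """
    track_g = track_pts.mean(axis=0)  # center of gravity, tracking frame
    mesh_g = mesh_pts.mean(axis=0)    # center of gravity, mesh frame
    d_track = np.linalg.norm(track_pts - track_g, axis=1)
    d_mesh = np.linalg.norm(mesh_pts - mesh_g, axis=1)
    # sum of the individual ratios divided by the number of optical markers
    return float(np.mean(d_mesh / d_track))
```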
  • FIG. 3A is a flowchart of the coordinate correction program of the optical tracking system of the embodiment.
  • the computer device and the optical sensor perform a coordinate calibration program, and the calibration program includes an initial calibration step S11, an optimization step S12, and a correction step S13.
  • the initial calibration step S11 performs an initial calibration between the coordinate system of the optical sensor 12 and the coordinate system of the three-dimensional model 14 of the surgical situation to obtain the initial conversion parameters.
  • the calibration between the coordinate systems is shown in FIG. 3B for example.
  • the optimization step S12 is to optimize the degrees of freedom of the initial conversion parameters to obtain the optimized conversion parameters. For example, the degrees of freedom are shown in FIG. 3C.
  • the correcting step S13 is to correct the setting error caused by the optical marker in the optimized conversion parameter.
  • the optical marker attached to the platform 4 can be used to correct the two coordinate systems.
  • the initial correction step S11 is to find the transformation matrix between the feature points of the medical appliance and the optical sensor as the initial transformation parameter.
  • the initial correction step uses, for example, singular value decomposition (SVD), triangular coordinate registration, or linear least squares estimation.
  • the transformation matrix includes, for example, a covariance matrix and a rotation matrix.
  • in step S11, singular value decomposition can be used to find the optimal transformation matrix between the feature points of the medical appliance presentations 141 to 144 and those of the optical sensors as the initial transformation parameter; the covariance matrix H can be obtained from these feature points and can be regarded as the objective function to be optimized: H = Σi (V_S,i - c_S)(V_T,i - c_T)^T, where V_S,i and V_T,i are corresponding source and target feature points and c_S, c_T are their centers of gravity.
  • from the singular value decomposition H = U S V^T, the rotation matrix M can be found by the following formula: M = V U^T.
  • after obtaining the rotation matrix M, the translation matrix T can be found by the following formula: T = c_T - M c_S.
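This initial calibration matches the standard SVD-based rigid registration; the sketch below works under that assumption (the determinant guard against reflections is an implementation detail not mentioned in the text).

```python
import numpy as np

def initial_calibration(src: np.ndarray, dst: np.ndarray):
    """Rotation M and translation T mapping source feature points to targets.

    src, dst: (n, 3) corresponding feature points in the two coordinate systems.
    """
    c_s, c_t = src.mean(axis=0), dst.mean(axis=0)      # centers of gravity
    H = (src - c_s).T @ (dst - c_t)                    # 3x3 covariance matrix
    U, _, Vt = np.linalg.svd(H)
    M = Vt.T @ U.T                                     # rotation matrix M = V U^T
    if np.linalg.det(M) < 0:                           # avoid a reflection solution
        Vt[-1] *= -1
        M = Vt.T @ U.T
    T = c_t - M @ c_s                                  # translation matrix T
    return M, T
```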
  • the optimization step S12 is to obtain multiple Euler angles with multiple degrees of freedom from the rotation matrix M, and use Gauss-Newton algorithm to iteratively optimize the parameters of multiple degrees of freedom to obtain optimized conversion parameters.
  • the multiple degrees of freedom are, for example, six degrees of freedom; other numbers of degrees of freedom, such as nine, are also possible with appropriate modification of the expressions. Since the conversion result obtained from the initial calibration step S11 may not be accurate enough, performing the optimization step S12 can improve the accuracy and obtain a more accurate conversion result.
  • the rotation matrix M can be obtained from the above formula.
  • multiple Euler angles can be obtained from the rotation matrix M; for example, with the ZYX convention, θx = atan2(M32, M33), θy = atan2(-M31, sqrt(M32² + M33²)), and θz = atan2(M21, M11).
  • the rotation of the world coordinate system is assumed to be orthogonal. Since the parameters of the six degrees of freedom (θx, θy, θz and the three translations) have been obtained, these parameters can be iteratively optimized by the Gauss-Newton method to obtain the optimized conversion parameters. The objective function to be minimized is the squared error E = ||b||²,
  • where b represents the least-square error between the reference target points and the current transformed points,
  • n is the number of feature points,
  • and the transformation parameter vector p contains the translation and rotation parameters; the Gauss-Newton method iteratively adjusts p to find the best parameter values.
  • the update function is as follows: p_{k+1} = p_k + (J^T J)^(-1) J^T b,
  • where J is the Jacobian matrix of the objective function.
  • the stop condition is defined, for example, by the update ||Δp|| (or the decrease of the error) falling below a small tolerance, or by reaching a maximum number of iterations.
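A sketch of the six-degree-of-freedom Gauss-Newton refinement follows. The Euler convention, the numerical Jacobian, and the tolerance values are assumptions; the description only fixes the update form and that a stop condition exists.

```python
import numpy as np

def rot_zyx(rx, ry, rz):
    """Rotation matrix from Euler angles (ZYX convention assumed)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def residual(p, src, dst):
    """b: least-square error between reference targets and transformed points."""
    return (dst - (src @ rot_zyx(*p[:3]).T + p[3:])).ravel()

def gauss_newton(p0, src, dst, tol=1e-8, max_iter=50):
    p = np.asarray(p0, dtype=float)
    for _ in range(max_iter):
        b = residual(p, src, dst)
        J = np.empty((b.size, 6))              # numerical Jacobian over the 6 DoF
        for k in range(6):
            dp = np.zeros(6); dp[k] = 1e-6
            J[:, k] = (residual(p + dp, src, dst) - b) / 1e-6
        step = np.linalg.solve(J.T @ J, -J.T @ b)   # Gauss-Newton update
        p += step
        if np.linalg.norm(step) < tol:         # stop condition: update small enough
            break
    return p
```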
  • the correcting step S13 is to correct the setting error caused by the optical marker in the optimized conversion parameter.
  • the correction step S13 includes a determination step S131 and an adjustment step S132.
  • the source feature point correction procedure can be used to overcome the error caused by manually selecting feature points: there is an error between the feature points of the medical appliance presentations 141 to 144 and the surgical target presentation 145 of the surgical scene three-dimensional model 14 and the feature points of the medical appliances 21-24 and the surgical target object 3, because these feature points are selected by the user. The feature points of the medical appliances 21-24 and the surgical target object 3 may include the points where the optical markers 11 are set. Since the optimal transformation is obtained from step S12, the target position transformed from the source point V_S approaches the reference target point V_T after the n-th iteration.
  • the source point correction step first calculates the inverse transformation of the transformation matrix, and then obtains the new source point from the reference target point.
  • the calculation formula is as follows: V_S' = M^(-1) (V_T - T), i.e., the optimized rotation M and translation T are inverted and applied to the reference target point.
  • each iteration can set a constraint step size c1, and a constraint region box size c2 that can be a constant value, to limit the distance moved by the original source point: each correction moves the source point by at most c1 per iteration, and the corrected point is kept within a box of size c2 around the original source point.
  • V_T is the target point obtained after transforming the source point V_S.
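The correction can be sketched as one clamped update per iteration; the values of c1 and c2 below are placeholders, since the text only says they can be constants.

```python
import numpy as np

def correct_source_points(V_S, V_T, M, T, c1=0.5, c2=2.0):
    """One source-point correction step (c1, c2 in tracking units, illustrative).

    V_S, V_T: (n, 3) source and reference target points; M, T: optimized
    rotation and translation mapping source points to target points.
    """
    new_src = (V_T - T) @ M                 # inverse transform, row form of M^T (v - T)
    step = new_src - V_S
    norms = np.linalg.norm(step, axis=1, keepdims=True)
    # constraint step size c1: each point moves at most c1 per iteration
    step = np.where(norms > c1, step * (c1 / norms), step)
    # constraint region box size c2: stay inside a box around the original points
    return np.clip(V_S + step, V_S - c2, V_S + c2)
```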
  • the coordinate position of the three-dimensional model 14 of the surgical situation can be accurately transformed to correspond to the optical marker 11 in the tracking coordinate system, and vice versa.
  • the medical tools 21-24 and the surgical target object 3 can be tracked in real time based on the detection results of the optical sensors 12, and after the aforementioned processing, their positions in the tracking coordinate system map accurately to the medical appliance presentations 141-144 and the surgical target presentation 145.
  • the medical appliance presentations 141-144 and the surgical target presentation 145 in the three-dimensional model 14 of the operation situation will thus follow the movements of the real objects in real time.
  • FIG. 4 is a block diagram of the training system for the operation of the medical appliance according to the embodiment.
  • the training system for medical appliance operation (hereinafter referred to as the training system) can truly simulate the surgical training environment.
  • the training system includes an optical tracking system 1a, one or more medical appliances 21-24, and the surgical target object 3.
  • the optical tracking system 1a includes a plurality of optical markers 11, a plurality of optical sensors 12, and a computer device 13.
  • the optical markers 11 are arranged on medical appliances 21-24 and surgical target objects 3, and medical appliances 21-24 and surgical target objects 3 are placed On platform 4.
  • the medical appliance presents 141 to 144 and the surgical target presents 145 are correspondingly presented on the three-dimensional model 14a of the surgical context.
  • the medical tools 21-24 include medical probes and surgical tools.
  • the medical tools 21 are medical probes
  • the medical tools 22-24 are surgical tools.
  • the medical appliance presentations 141-144 include medical probe presentations and surgical appliance presentations.
  • the medical appliance presentation 141 is a medical probe presentation
  • the medical appliance presentations 142-144 are surgical appliance presentations.
  • the storage component 132 stores the program code and data of the operation situation three-dimensional model 14a and the tracking module 15, and the processing core 131 can access the storage component 132 to execute and process the operation situation three-dimensional model 14a and the program code and data of the tracking module 15.
  • the surgical target object 3 is an artificial limb, such as an artificial upper limb, a hand phantom, an artificial palm, artificial fingers, an artificial arm, an artificial upper arm, an artificial forearm, an artificial elbow, an artificial foot, artificial toes, an artificial ankle, an artificial calf, an artificial thigh, an artificial knee, an artificial torso, an artificial neck, an artificial head, an artificial shoulder, an artificial chest, an artificial abdomen, an artificial waist, an artificial hip, or other artificial body parts.
  • the training system takes the minimally invasive surgery training of the fingers as an example.
  • the surgical target 3 is a prosthetic hand, the surgery is for example a trigger finger release surgery, the medical probe 21 is a realistic ultrasonic transducer (or probe), and the surgical instruments 22-24 are a needle, a dilator, and a hook blade. In other embodiments, other surgical target objects 3 may be used for other surgical training.
  • the storage element 132 also stores the program codes and data of the physical medical image 3D model 14b, the artificial medical image 3D model 14c, and the training module 16.
  • the processing core 131 can access the storage element 132 to execute and process the program codes and data of the physical medical image 3D model 14b, the artificial medical image 3D model 14c, and the training module 16.
  • the training module 16 is responsible for the following surgical training procedures and the processing, integration and calculation of related data.
  • FIG. 5A is a schematic diagram of the three-dimensional model of the surgical scene of the embodiment
  • FIG. 5B is a schematic diagram of the three-dimensional physical medical image model of the embodiment
  • FIG. 5C is a schematic diagram of the three-dimensional model of the artificial medical image.
  • the content of these three-dimensional models can be output or printed by the output device 5.
  • the physical medical image three-dimensional model 14b is a three-dimensional model established from medical images, i.e., a model established for the surgical target object 3, such as the three-dimensional model shown in FIG. 5B.
  • the medical image is, for example, a computed tomography image; the images actually generated by computed tomography of the surgical target object 3 are used to build the physical medical image three-dimensional model 14b.
  • the artificial medical image three-dimensional model 14c contains an artificial medical image model.
  • the artificial medical image model is a model established for the surgical target object 3, such as the three-dimensional model shown in FIG. 5C.
  • the artificial medical imaging model is a three-dimensional model of artificial ultrasound images. Since the surgical target 3 is not a real living body, computed tomography can obtain images of its physical structure, but other medical imaging equipment such as ultrasound imaging cannot directly obtain effective or meaningful images from the surgical target object 3. Therefore, the ultrasound image model of the surgical target object 3 must be generated artificially. Selecting an appropriate position or plane from the three-dimensional model of artificial ultrasound images can generate two-dimensional artificial ultrasound images.
  • the computer device 13 generates a medical image 136 according to the three-dimensional model 14a of the surgical situation and the medical image model.
  • the medical image model is, for example, the physical medical image three-dimensional model 14b or the artificial medical image three-dimensional model 14c.
  • the computer device 13 generates a medical image 136 based on the three-dimensional model 14a of the surgical situation and the three-dimensional model 14c of an artificial medical image.
  • the medical image 136 is a two-dimensional artificial ultrasound image.
  • the computer device 13 scores the detection object found by the medical probe presentation 141, for example a specific surgical site, and the operation of the surgical appliance presentations 142-144.
  • FIGS. 6A to 6D are schematic diagrams of the direction vectors of the medical appliances of the embodiment.
  • the direction vectors of the medical device presentation objects 141-144 corresponding to the medical devices 21-24 will be rendered in real time.
  • the direction vector of the medical probe presentation can be obtained by calculating the center of gravity of the optical markers, projecting another marker point onto the xz plane, and computing the vector from the center of gravity to the projection point.
  • the other medical appliance presentations 142-144 are relatively simple, and their direction vectors can be calculated using the sharp points in the models.
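A minimal sketch of the probe direction-vector computation described above; the marker index and the y-up convention for the xz plane are assumptions.

```python
import numpy as np

def probe_direction(markers: np.ndarray, ref: int = 0) -> np.ndarray:
    """Unit direction vector of the probe presentation.

    markers: (n, 3) positions of the probe's optical markers.
    """
    g = markers.mean(axis=0)      # center of gravity of the optical markers
    proj = markers[ref].copy()
    proj[1] = 0.0                 # project the chosen marker onto the xz plane
    d = proj - g                  # vector from the center of gravity to the projection
    return d / np.linalg.norm(d)
```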
  • the training system can draw only the model of the area where the surgical target presentation 145 is located, instead of drawing all of the medical appliance presentations 141-144.
  • the transparency of the skin model can be adjusted to observe the internal anatomical structure of the surgical target presentation 145, and to view ultrasound image slices or computed tomography image slices of different cross-sections, such as the transverse (axial) plane, the sagittal plane, or the coronal plane, which can help the operator during the operation.
  • the bounding boxes of each model are constructed for collision detection.
  • the surgical training system can determine which medical appliances have contacted tendons, bones and/or skin, and can determine when to start scoring.
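Axis-aligned bounding boxes are one straightforward way to implement this collision detection; a sketch follows, with illustrative model names in the usage comment.

```python
import numpy as np

def aabb(points: np.ndarray):
    """Axis-aligned bounding box (min corner, max corner) of a vertex array."""
    return points.min(axis=0), points.max(axis=0)

def aabb_overlap(box_a, box_b) -> bool:
    """True if two axis-aligned bounding boxes intersect on all three axes."""
    (min_a, max_a), (min_b, max_b) = box_a, box_b
    return bool(np.all(max_a >= min_b) and np.all(max_b >= min_a))

# usage sketch: start scoring once the needle's box touches the tendon's box
# if aabb_overlap(aabb(needle_vertices), aabb(tendon_vertices)): start_scoring()
```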
  • the optical markers 11 attached to the surgical target object 3 must be clearly seen or detected by the optical sensors 12; if an optical marker 11 is covered, the accuracy of its detected position will decrease, and at least two optical sensors 12 must see all the optical markers at the same time.
  • the calibration procedure is as described above, for example, three-stage calibration, which is used to accurately calibrate two coordinate systems.
  • the correction error, the iteration count and the final position of the optical marker can be displayed in the window of the training system, for example by the output device 5.
  • the accuracy and reliability information can be used to remind users that the system needs to be recalibrated when the error is too large.
  • the three-dimensional model is drawn at a frequency of 0.1 times per second, and the drawn result can be output to the output device 5 for display or printing.
  • the user can start the surgical training process.
  • in the training process, a medical probe is first used to find the site to be operated on; after the site is found, it is anesthetized; a path is then expanded from the outside to the surgical site, and after the expansion the scalpel is advanced along this path to the surgical site.
  • FIGS. 7A to 7D are schematic diagrams of the training process of the training system of the embodiment.
  • the surgical training process includes four stages and the minimally invasive surgery training of the fingers is taken as an example for illustration.
  • in the first stage, the medical probe 21 is used to find the site to be operated on, and the training system confirms that the site has been found.
  • the surgical site is, for example, the pulley area, which can be judged by locating the position of the metacarpophalangeal joints and the anatomical structures of the finger bones and tendons; the focus at this stage is whether the first pulley area (A1 pulley) is found.
  • the training system will automatically enter the next stage of scoring.
  • the medical probe 21 is placed on the skin and kept in contact with the skin at the metacarpophalangeal (MCP) joints along the midline of the flexor tendon.
  • the surgical instrument 22 is used to open the path of the surgical area.
  • the surgical instrument 22 is, for example, a needle.
  • the needle is inserted to inject local anesthetic and expand the space.
  • the process of inserting the needle can be performed under the guidance of continuous ultrasound images.
  • This continuous ultrasound image is an artificial ultrasound image, which is like the aforementioned medical image 136. Since it is difficult to simulate regional anesthesia with a prosthetic hand, anesthesia is not specifically simulated.
  • the surgical instrument 23 is pushed in along the same path as the surgical instrument 22 in the second stage to create the trajectory required for hooking the knife in the next stage.
  • the surgical instrument 23 is, for example, a dilator.
  • the training system will automatically enter the next stage of scoring.
  • the surgical instrument 24 is inserted along the trajectory created in the third stage, and the pulley is divided by the surgical instrument 24.
  • the surgical instrument 24 is, for example, a hook blade.
  • the focus of the third stage is similar to that of the fourth stage. During the surgical training process, the vessels and nerves near both sides of the flexor tendon can easily be miscut. Therefore, the focus of the third and fourth stages is not only to avoid contacting tendons, nerves, and blood vessels, but also to open a track at least 2mm larger than the first pulley area, to leave space for the hook blade to cut the pulley area.
  • the operations of each training phase must be quantified.
  • the operation area during the operation is defined by the finger anatomy as shown in Figure 8A, which can be divided into an upper boundary and a lower boundary. Because most of the tissue on the tendon is fat and does not cause pain, the upper boundary of the surgical area can be defined by the skin of the palm, and the lower boundary is defined by the tendon.
  • the proximal depth boundary is 10mm (the average length of the first pulley area) from the metacarpophalangeal (MCP) joint.
  • the distal depth boundary is not important, because it does not involve tendons, blood vessels, or nerves.
  • the left and right boundaries are defined by the width of the tendon, and nerves and blood vessels are located on both sides of the tendon.
  • the scoring method for each training stage is as follows.
  • in the first stage, the focus of the training is to find the target to be excised,
  • for example the first pulley area (A1 pulley).
  • the scoring formula for the first stage is as follows:
  • first-stage score = (target object score × its weight) + (probe angle score × its weight)
  • in the second stage, the focus of training is to use the needle to open the path to the surgical area. Since the pulley area surrounds the tendon, the distance between the main axis of the bone and the needle should be small. Therefore, the calculation formula for the second-stage score is as follows:
  • second-stage score = (opening score × its weight) + (needle angle score × its weight) + (bone main-axis distance score × its weight)
  • the focus of training is to insert a dilator that enlarges the surgical area into the finger.
  • the trajectory of the dilator must be close to the main axis of the bone.
  • the angle between the expander and the main axis of the bone should be approximately parallel, with an allowable angle deviation of ±30°. To leave space for the hook blade to cut the first pulley area, the expander must be opened at least 2mm higher than the first pulley area.
  • third-stage score = (above-the-pulley-area score × its weight) + (expander angle score × its weight) + (bone main-axis distance score × its weight) + (not-leaving-the-surgical-area score × its weight)
  • in the fourth stage, the scoring conditions are similar to those in the third stage, except that the hook needs to be rotated 90°; this rule is added to the scoring at this stage.
  • the scoring formula is as follows:
  • fourth-stage score = (above-the-pulley-area score × its weight) + (hook angle score × its weight) + (bone main-axis distance score × its weight) + (not-leaving-the-surgical-area score × its weight) + (rotating-the-hook score × its weight)
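All four stage scores share the same weighted-sum form; a sketch follows, with invented sub-score names and weight values, since the description does not list them.

```python
def stage_score(subscores: dict, weights: dict) -> float:
    """Stage score = sum over sub-scores of (score x its weight)."""
    return sum(subscores[name] * weights[name] for name in subscores)

# fourth stage, with illustrative numbers only
fourth = stage_score(
    {"above_pulley": 1.0, "hook_angle": 0.8, "axis_distance": 0.9,
     "stay_in_area": 1.0, "hook_rotated_90": 1.0},
    {"above_pulley": 0.2, "hook_angle": 0.2, "axis_distance": 0.2,
     "stay_in_area": 0.2, "hook_rotated_90": 0.2},
)
```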
  • these angle scores are calculated in the same way as the angle between the palm normal and the direction vector of the medical appliance.
  • the three axes of the bone can be found by using Principal Components Analysis (PCA) on the bone from the computed tomography image.
  • the longest axis is taken as the main axis of the bone.
  • the shape of the bone in the computed tomography image is not uniform, which causes the axis found by the principal component analysis and the palm normal to be not perpendicular to each other.
  • as shown in FIG. 8C, instead of using principal component analysis on the bone, principal component analysis can be applied to the skin above the bone to find the palm normal. Then, the angle between the bone main axis and the medical appliance can be calculated.
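A sketch of the principal-component-analysis step follows; how the palm normal is chosen among the skin's axes is an assumption, since the text only says PCA is applied on the skin.

```python
import numpy as np

def principal_axes(points: np.ndarray) -> np.ndarray:
    """Principal axes of an (n, 3) point cloud, rows sorted by decreasing variance."""
    centered = points - points.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))  # PCA via covariance
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order].T

# bone main axis: the longest principal axis of the bone voxels
# palm normal (assumption): the least-variance axis of a locally flat skin patch
```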
  • the distance between the bone main axis and the medical appliance also needs to be calculated.
  • the distance calculation is similar to calculating the distance between the tip of the medical appliance and a plane.
  • the plane refers to the plane containing the bone main axis vector and the palm normal.
  • the schematic diagram of the distance calculation is shown in Figure 8D. The normal of this plane can be obtained by the cross product of the palm normal vector D2 and the bone principal axis vector D1; since these two vectors are available from the previous calculations, the distance between the main axis of the bone and the appliance can be easily calculated.
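The distance itself is then a point-to-plane computation; in the sketch below, `origin` is assumed to be any point on the bone main axis.

```python
import numpy as np

def distance_to_axis_plane(tip, origin, d1, d2) -> float:
    """Distance from the appliance tip to the plane spanned by the bone main
    axis D1 and the palm normal D2, passing through `origin`."""
    n = np.cross(d2, d1)                  # plane normal = D2 x D1
    n = n / np.linalg.norm(n)
    return float(abs(np.dot(np.asarray(tip) - np.asarray(origin), n)))
```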
  • FIG. 8E is a schematic diagram of the artificial medical image of the embodiment, and the tendon section and the skin section in the artificial medical image are marked with dotted lines.
  • the tendon section and the skin section can be used to construct the model and the bounding box, the bounding box is used for collision detection, and the pulley area can be defined in the static model.
  • with collision detection, it is possible to determine the surgical area and to determine whether the medical appliance crosses the pulley area.
  • the average length of the first pulley area is about 10mm, and the first pulley area is located at the proximal end of the metacarpophalangeal (MCP) joint.
  • the average thickness of the pulley area is about 0.3mm and surrounds the tendons.
  • FIG. 9A is a flowchart of generating artificial medical images according to an embodiment; as shown in FIG. 9A, the generation flow includes steps S21 to S24.
  • Step S21 is to extract the first set of bone skin features from the cross-sectional image data of the artificial limb.
  • An artificial limb is the aforementioned surgical target object 3, which can be used as a limb for minimally invasive surgery training, such as a prosthetic hand.
  • the cross-sectional image data contains multiple cross-sectional images, and the cross-sectional images are computed tomography images or physical cross-sectional images.
  • Step S22 is to extract the second set of bone skin features from the medical image data.
  • the medical image data is a three-dimensional ultrasound image, such as the three-dimensional ultrasound image of FIG. 9B, which is created by multiple planar ultrasound images.
  • Medical image data are medical images taken of real organisms, not artificial limbs.
  • the first group of bone skin features and the second group of bone skin features include multiple bone feature points and multiple skin feature points.
  • Step S23 is to establish feature registration data (registration) based on the first set of bone and skin features and the second set of bone and skin features.
  • Step S23 includes: taking the first set of bone-skin features as a reference target; and finding a correlation function as the spatial registration data, where the correlation function makes the second set of bone-skin features align with the reference target without being disturbed by noise in the first set of bone-skin features and the second set of bone-skin features.
  • the correlation function is found by formulating a maximum likelihood estimation problem and solving it with the expectation-maximization (EM) algorithm.
  • Step S24 is to perform deformation processing on the medical image data according to the feature alignment data to generate artificial medical image data suitable for artificial limbs.
  • the artificial medical image data is, for example, a three-dimensional ultrasound image, which still retains the characteristics of the organism in the original ultrasound image.
  • Step S24 includes: generating a deformation function based on the medical image data and the feature registration data; applying a grid to the medical image data and obtaining multiple grid-point positions accordingly; deforming the grid-point positions according to the deformation function; and, based on the deformed grid-point positions, filling the medical image data with corresponding pixels to generate a deformed image, which is used as the artificial medical image data.
  • the deformation function is generated using the moving least square (MLS) method.
  • the deformed image is generated using affine transform.
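For the deformation function, here is a sketch of affine moving least squares for a single 2-D point, after Schaefer et al.'s MLS formulation; it is an illustration of the named technique, not the patent's exact procedure.

```python
import numpy as np

def mls_affine(v, p, q, alpha=1.0, eps=1e-8):
    """Deform point v given control points p (n, 2) and their targets q (n, 2)."""
    w = 1.0 / (np.sum((p - v) ** 2, axis=1) ** alpha + eps)  # MLS weights
    p_star = w @ p / w.sum()              # weighted centroid of the sources
    q_star = w @ q / w.sum()              # weighted centroid of the targets
    p_hat, q_hat = p - p_star, q - q_star
    # affine matrix minimizing sum_i w_i |p_hat_i A - q_hat_i|^2
    A = np.linalg.solve((p_hat * w[:, None]).T @ p_hat,
                        (p_hat * w[:, None]).T @ q_hat)
    return (v - p_star) @ A + q_star      # affine-transformed grid point
```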
  • in steps S21 to S24, the image features of the real ultrasound image and of the computed tomography image of the artificial hand are captured, the corresponding point relationship for the deformation is obtained by image registration, and an image close to a real ultrasound image is then generated for the artificial hand through the deformation;
  • the generated ultrasound image thus retains the characteristics of the original live ultrasound image.
  • since the artificial medical image data is a three-dimensional ultrasound image, a planar ultrasound image of a specific position or a specific section can be generated from the position or section mapped in the three-dimensional ultrasound image.
  • FIG. 10A and FIG. 10B are schematic diagrams of the correction of the artificial hand model and the ultrasound volume of the embodiment.
  • the physical medical image 3D model 14b and the artificial medical image 3D model 14c are related to each other: since the model of the prosthetic hand is constructed from the computed tomography image volume, the positional relationship between the computed tomography image volume and the ultrasound volume can be directly used to establish the correlation between the artificial hand and the ultrasound volume.
  • FIG. 10C is a schematic diagram of ultrasonic volume and collision detection according to an embodiment
  • FIG. 10D is a schematic diagram of an artificial ultrasound image according to an embodiment.
  • the training system must be able to simulate a real ultrasonic transducer (or probe) and generate slice image fragments from the ultrasonic volume. Regardless of the angle of the transducer (or probe), the simulated transducer (or probe) must depict the corresponding image segment.
  • the angle between the medical probe 21 and the ultrasonic volume is first detected, and then collision detection between the slice surface, based on the width of the medical probe 21, and the ultrasonic volume is performed; this can be used to find the corresponding image segment to be drawn.
  • the resulting image is shown in Figure 10D.
  • the artificial medical image data is a three-dimensional ultrasound image
  • the three-dimensional ultrasound image has a corresponding ultrasound volume
  • the content of the image segment to be depicted by the simulated transducer (or probe) can be generated according to the position of the three-dimensional ultrasound image mapping.
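Slice generation from the ultrasound volume can be sketched as oblique resampling along the probe's image plane; the nearest-neighbour sampling and all parameter names below are assumptions.

```python
import numpy as np

def sample_slice(volume, origin, u_dir, v_dir, width, height, step=1.0):
    """Resample an oblique slice from a 3-D ultrasound volume (voxel space).

    origin: slice corner; u_dir, v_dir: unit vectors spanning the probe plane.
    """
    uu, vv = np.meshgrid(np.arange(0, width, step), np.arange(0, height, step))
    pts = origin + uu[..., None] * u_dir + vv[..., None] * v_dir
    idx = np.rint(pts).astype(int)        # nearest-neighbour voxel lookup
    for axis, size in enumerate(volume.shape):
        idx[..., axis] = np.clip(idx[..., axis], 0, size - 1)  # clamp to bounds
    return volume[idx[..., 0], idx[..., 1], idx[..., 2]]
```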

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Optics & Photonics (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Instructional Devices (AREA)

Abstract

An optical tracking system (1) for medical appliances (21-24) comprises multiple optical markers (11), multiple optical sensors (12), and a computer device (13). The optical markers (11) are provided on the medical appliances (21-24). The respective optical sensors (12) optically sense the optical markers (11) to obtain multiple sensing signals. The computer device (13) has a three-dimensional surgical scenario model (14) and is connected to the optical sensors (12) to receive the sensing signals. The computer device adjusts, according to the sensing signals, the relative positions of the medical appliance presentations (141-144) and a surgical target presentation (145) in the three-dimensional surgical scenario model (14).
PCT/CN2019/082803 2019-04-16 2019-04-16 Optical tracking system and training system for medical appliances Ceased WO2020210967A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/082803 WO2020210967A1 (fr) Optical tracking system and training system for medical appliances

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/082803 WO2020210967A1 (fr) Optical tracking system and training system for medical appliances

Publications (1)

Publication Number Publication Date
WO2020210967A1 true WO2020210967A1 (fr) 2020-10-22

Family

ID=72837685

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/082803 Ceased WO2020210967A1 (fr) Optical tracking system and training system for medical appliances

Country Status (1)

Country Link
WO (1) WO2020210967A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090087050A1 (en) * 2007-08-16 2009-04-02 Michael Gandyra Device for determining the 3D coordinates of an object, in particular of a tooth
CN101467887A (zh) * 2007-12-29 2009-07-01 Fudan University Method for calibrating X-ray fluoroscopic images in a surgical navigation system
CN102860841A (zh) * 2012-09-25 2013-01-09 陈颀潇 Auxiliary navigation system and method for puncture surgery under ultrasound image guidance
CN106859767A (zh) * 2017-03-29 2017-06-20 Shanghai Linyan Network Technology Co., Ltd. A surgical navigation method
CN106890025A (zh) * 2017-03-03 2017-06-27 Zhejiang University A minimally invasive surgical navigation system and navigation method
CN107970074A (zh) * 2016-10-25 2018-05-01 Biosense Webster (Israel) Ltd. Head alignment using a personalized jig
CN109195527A (zh) * 2016-03-13 2019-01-11 Vuze Medical Ltd. Apparatus and methods for use with skeletal surgery


Similar Documents

Publication Publication Date Title
TWI711428B (zh) Optical tracking system and training system for medical appliances
AU2024203340B2 (en) Systems for assisting surgery
CN113842213B (zh) Surgical robot navigation and positioning method and system
AU2020311392B2 (en) Augmented reality assisted joint arthroplasty
US7715602B2 (en) Method and apparatus for reconstructing bone surfaces during surgery
US9101394B2 (en) Implant planning using captured joint motion information
CN109984843B (zh) Navigation system and method for closed reduction of fractures
JP5866346B2 (ja) Method for determining deformity resection of joint bones using motion patterns
US20210007806A1 (en) A method for obtaining 3-d deformity correction for bones
US12433679B2 (en) Bone registration methods for robotic surgical procedures
US11344180B2 (en) System, apparatus, and method for calibrating oblique-viewing rigid endoscope
US10621736B2 (en) Method and system for registering a patient with a 3D image using a robot
JP2004512136A (ja) System for determining the position of a knee prosthesis
WO2013083298A1 (fr) Acquisition of contact position parameters and detection of contact of a joint
KR20160133367A (ko) Device and method for computer-assisted simulation of surgical operations
Pettersson et al. Simulation of patient specific cervical hip fracture surgery with a volume haptic interface
TWI707660B (zh) Wearable image display device for surgery and real-time surgical information presentation system
JP4319043B2 (ja) Method and apparatus for reconstructing bone surfaces during surgery
CN109350059B (zh) Combined steering engine and landmark engine for automatic elbow alignment
Liu et al. Augmented reality system training for minimally invasive spine surgery
JP2021153773A (ja) Robotic surgery support device, surgery support robot, robotic surgery support method, and program
WO2020210967A1 (fr) Optical tracking system and training system for medical appliances
JP7495216B2 (ja) Endoscopic surgery support device, endoscopic surgery support method, and program
WO2020210972A1 (fr) Wearable image display device for surgery and system for real-time presentation of surgical information
CN119344869B (zh) Intramedullary nail locking method, device, and program product based on an optical tracking system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19925185

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19925185

Country of ref document: EP

Kind code of ref document: A1