WO2017107116A1 - Système de navigation pour une opération à effraction minimale (Navigation system for minimally invasive surgery) - Google Patents
- Publication number
- WO2017107116A1 (PCT application PCT/CN2015/098590)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- human body
- module
- data
- positioner
- minimally invasive
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
Definitions
- the invention relates to the field of medical device technology, in particular to a minimally invasive surgery navigation system.
- minimally invasive surgery is a procedure that allows surgeons to operate primarily through endoscopes and various imaging techniques, without making large wounds in the patient. Compared with traditional surgery, it greatly reduces the trauma to the patient, which is a major advance in medicine. However, minimally invasive surgery also imposes higher technical requirements.
- because the surgical site cannot be observed directly in minimally invasive surgery, doctors must rely on the surgical navigation system to obtain the spatial position and posture of the surgical site and the surgical instruments. The stability of the navigation system directly affects the result of the operation, and the ease with which doctors obtain information during surgery also affects its quality. Effective operation of the navigation system is therefore key to the success of minimally invasive surgery.
- a device tracking system, for intraoperative tracking of the real-time pose of the surgical instruments, other surgical equipment and the human body;
- preoperative modeling, used to determine the physiological structure of the human body (a dynamic process) for reference during the operation;
- intraoperative imaging, used for the doctor's reference during surgery: interim angiography provides real-time image information of the human body, avoiding errors caused by differences between the real-time state of the body and the preoperative human body model.
- the "Lobster" minimally invasive surgery navigation system mainly includes the following parts: the robotic arm console, the binocular tracker, the auxiliary robotic arm and the surgical instrument.
- the robotic arm console controls the movement of the robotic arm;
- the real-time spatial positioning information about the tracked objects provided by the binocular tracker indicates the moving target of the auxiliary robotic arm;
- the display on the console is used by the doctor for real-time operation during surgery.
- the binocular tracker tracks, through marker points attached to the tracked objects (as in the NDI products described below), the real-time spatial pose of the human body, the auxiliary robotic arm and the surgical instrument, and provides the relevant information to the robotic arm console for analysis and processing; the surgical instrument is clamped by the auxiliary robotic arm, which helps the doctor perform auxiliary operations on the patient.
- this patent has the following disadvantages: it is suitable for surgery on a relatively fixed body part such as the skull, but it cannot be applied to body parts that move in real time with natural respiration, because the scheme cannot effectively superimpose a fixed preoperative computer-generated model on a body part that changes in real time with natural breathing; moreover, when the patient's breathing becomes abnormal during surgery, the organ displacement during breathing differs greatly from that captured before surgery, which affects the minimally invasive operation.
- the embodiment of the invention provides a minimally invasive surgery navigation system, which can effectively superimpose and display a fixed computer-generated preoperative model on a body part that changes in real time with natural breathing, and which addresses the problem that, when breathing is abnormal, the organ displacement differs greatly from the preoperative image and affects the minimally invasive operation.
- the minimally invasive surgical navigation system includes:
- the preoperative data module is configured to reconstruct the four-dimensional CT data collected before surgery at different moments of a respiratory cycle, obtaining a three-dimensional human body model for each moment of the breathing cycle;
- the intraoperative data module is configured to collect the pose of the human body locator during the operation using a tracker, encode that pose to obtain a pose code, and find the preoperative moment corresponding to the pose code in the pose lookup table;
- a positioning module configured to collect position data of the surgical instrument positioner and position data of the human body positioner through the tracker, and transmit the position data of the surgical instrument positioner and the position data of the human body positioner to the navigation module;
- a navigation module configured to receive and store the preoperative moment, and obtain and display the three-dimensional human body model from the preoperative data module according to that moment; to receive the position data of the surgical instrument positioner and of the human body positioner, and obtain the surgical instrument model according to the position data of the surgical instrument positioner; and, based on the position data of the human body positioner, to display the surgical instrument model and the three-dimensional human body model using the augmented reality display device.
- the intraoperative angiography module is used for real-time angiography of the surgical site when the body posture is abnormal during surgery, obtaining angiographic data and transmitting the angiographic data to the navigation module;
- the navigation module is further configured to adjust and display the three-dimensional human body model according to the contrast data.
- the intraoperative contrast module is specifically configured to determine that the posture of the human body is abnormal during the operation when Δx0 is greater than or equal to Δx, where:
- Δx0 is the combined error distance produced by the n body positioners;
- Δxi (i = 1, 2, ..., n) is the error distance produced by the i-th body positioner;
- Δx is a preset critical error distance.
- the intraoperative contrast module performs real-time imaging of the surgical site through a C-arm or an ultrasound device.
- the minimally invasive surgical navigation system further includes:
- An independent navigation device for acquiring position data of the human body positioner when the body positioner exceeds the tracking range of the tracker or when there is occlusion between the body positioner and the tracker;
- the independent navigation device includes a gyroscope, an accelerometer, and a wireless communication module;
- the gyroscope is configured to acquire real-time spatial angular acceleration data of a human body locator and/or a surgical instrument locator;
- the accelerometer is configured to acquire acceleration data in three coordinate directions of a real-time space of a human body positioner and/or a surgical instrument positioner;
- the wireless communication module is configured to transmit acceleration data acquired by the gyroscope and the accelerometer to the navigation module in real time.
- the independent navigation device further includes:
- a power supply for independently powering the gyroscope, the accelerometer and the wireless communication module.
- the gyroscope is a spatial three-phase gyroscope.
- the accelerometer is a spatial three-way acceleration sensor.
- the body positioner and surgical instrument positioner are optical positioners.
- the tracker is an optical tracker.
- the three-dimensional model of the human body at different moments of the breathing cycle is obtained through the preoperative data module; the pose of the human body locator is encoded by the intraoperative data module to obtain a pose code, and the preoperative moment corresponding to that code is found in the pose lookup table;
- the navigation module receives the preoperative moment and retrieves the three-dimensional human body model of the corresponding moment from the preoperative data module; it obtains the surgical instrument model according to the received position data of the surgical instrument locator, and displays the surgical instrument model and the three-dimensional human body model with the augmented reality display device based on the position data of the human body locator. In this way the computer-generated fixed model and the body part that changes in real time with natural breathing can be effectively superimposed and displayed.
- FIG. 1 is a diagram showing the operation of a nasal endoscopic minimally invasive surgery navigation system based on augmented reality technology according to an embodiment of the present invention
- FIG. 2 is a schematic structural view of a minimally invasive surgery navigation system according to an embodiment of the present invention
- FIG. 3 is a system operation diagram of a minimally invasive surgery navigation system according to an embodiment of the present invention.
- FIG. 4 is a diagram showing the composition of a Marker with an independent navigation device in an embodiment of the present invention.
- FIG. 5 is a working mechanism diagram of a minimally invasive surgery navigation system according to an embodiment of the present invention.
- the surgical site can be re-imaged during the operation;
- the problems in the prior art described above can thus be solved;
- the present invention proposes a minimally invasive surgical navigation system.
- the minimally invasive surgery navigation system includes:
- the pre-operative data module 100 is configured to reconstruct the four-dimensional CT data collected before surgery at different moments of a respiratory cycle, obtaining a three-dimensional human body model for each moment of the breathing cycle;
- the intraoperative data module 200 is configured to collect the pose of the human body locator during the operation using a tracker, encode that pose to obtain a pose code, and find the preoperative moment corresponding to the pose code in the pose lookup table;
- the positioning module 300 is configured to collect position data of the surgical instrument positioner and position data of the human body positioner through the tracker, and transmit the position data of the surgical instrument positioner and the position data of the human body positioner to the navigation module;
- the navigation module 400 is configured to receive and store the preoperative moment, obtain and display the three-dimensional human body model from the preoperative data module according to that moment, and receive the position data of the surgical instrument positioner and of the human body positioner, obtaining the surgical instrument model according to the position data of the surgical instrument positioner; based on the position data of the human body positioner, it displays the surgical instrument model and the three-dimensional human body model using the augmented reality display device;
- the intraoperative contrast module 500 is configured to perform real-time angiography on the surgical site when the posture of the human body is abnormal during the operation, obtain angiographic data, and send the angiographic data to the navigation module;
- the navigation module 400 is further configured to adjust the three-dimensional human body model according to the contrast data.
- the pre-operative data module 100 includes a 3D database 101, a file lookup table 102, and an input and output module 103.
- the specific function of the preoperative data module 100 is: it reconstructs the 4D CT data (generally covering one breathing cycle, at 0.1 s intervals), the result being one 3D data set for each 0.1 s moment of the breathing cycle, stored in the 3D database 101 indexed by imaging time; the file lookup table 102 locates the file to be extracted in the 3D database 101 according to the recorded moment provided by the intraoperative data module 200; the input and output module 103 performs data entry and, after a requested search, provides the file output.
- the intraoperative data module 200 includes a body locator 201, a tracker, a pose encoding module 203, and a pose lookup table 204.
- the tracker tracks the state of the body locator 201, and the pose is encoded by the pose encoding module 203;
- the code is used to find, in the pose lookup table 204, the pre-recorded moment corresponding to the current state code of the body locator 201.
- pre-stored in the pose lookup table 204 is a code sequence generated by continuously tracking the state of the human body locator 201 in synchronization with the dynamic CT scan; the recording time of each code is consistent with the preoperative imaging time (all system clocks are of course required to be fully synchronized here).
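The synchronized code sequence amounts to a mapping from pose code to pre-operative imaging time. A minimal sketch, in which the function names and the 0.1 s frame-interval handling are illustrative assumptions rather than the patent's implementation:

```python
def build_pose_lookup(pose_codes, interval_s=0.1):
    """Map each recorded pose code to its pre-operative imaging time.

    pose_codes: the code sequence produced by tracking the body
    locator 201 in sync with the dynamic CT scan (one code per frame,
    assumed one frame every interval_s seconds).
    """
    table = {}
    for frame, code in enumerate(pose_codes):
        # record the earliest moment each code was observed
        table.setdefault(code, round(frame * interval_s, 3))
    return table

def preoperative_time(table, code):
    """Intraoperative lookup: pose code -> pre-operative moment (s)."""
    return table.get(code)

table = build_pose_lookup(["c0", "c1", "c2", "c1", "c0"])
print(preoperative_time(table, "c2"))  # 0.2
```

Intraoperatively, the current pose code is looked up here and the returned moment is used to fetch the matching 3D model from the 3D database 101.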
- the number of human body positioners 201 is three or more, and position locators are used.
- the position locator is preferably a patch external locator, and the patch external locator is preferably a magnetic patch locator or an optical patch locator.
- prior to surgery, the body positioner 201 should be attached, as far as possible, to places that change greatly with breathing, such as the ribs or the sternum under the chest.
- the tracker is preferably a magnetic tracker or an optical tracker.
- the positioning module 300 includes a surgical instrument positioner 301, a human body positioner 302, an augmented reality display device positioner 303, and a tracker.
- the specific function of the positioning module 300 is: tracking the surgical instrument positioner 301 through the tracker to obtain its position data; tracking the body positioner 302 through the tracker to obtain its position data; and tracking the augmented reality display device positioner 303 through the tracker to obtain the position data of the augmented reality display device.
- the positioning module 300 transmits the position data of the surgical instrument positioner and the position data of the human body positioner to the navigation module for correlation operation and display of the navigation module.
- the augmented reality display device locator 303 is tracked by the tracker to obtain the position data of the augmented reality display device, which is transmitted to the navigation module for related operations and display. Since an existing augmented reality display device may itself have a positioning function, the augmented reality display device locator 303 can also be omitted.
- the tracker is preferably a magnetic tracker or an optical tracker
- the surgical instrument positioner 301 and the body positioner 302 are preferably magnetic or optical positioners.
- the body positioner 302 is mounted in an area that does not move with the human body (such as a hip or shoulder joint).
- the human body positioner 201 and the human body positioner 302 described above are collectively referred to as a human body positioner.
- the navigation module 400 is configured to receive and store the pre-operative time, obtain and display the human body three-dimensional model from the pre-operative data module 100 according to the pre-operative time; and receive the position data of the surgical instrument locator sent by the positioning module 300. And the position data of the human body positioner, the surgical instrument model is obtained according to the position data of the surgical instrument positioner; based on the position data of the human body positioner, the surgical instrument model and the human body three-dimensional model are displayed by using the augmented reality display device.
- when noise is mixed in during image acquisition, the navigation module 400 also needs to perform denoising and image enhancement operations on the image.
- before surgery, the navigation module 400 also needs to delineate, in the three-dimensional human body model, the lesions and the nearby blood vessels, nerves or organs that need attention during the operation.
- the augmented reality display module included in the navigation module 400 is used to display the surgical instrument model and the human body three-dimensional model.
- the augmented reality display module is mainly an augmented reality display device, such as wearable smart glasses or an augmented reality (AR) helmet; augmented reality is a technique for calculating the position and angle of the camera image in real time and overlaying corresponding images;
- the goal is to put the virtual world on the screen over the real one and interact with it;
- the effect of the display is that the delineated, denoised and enhanced images coincide with the real scene, so that doctors can quickly obtain real-time and intuitive surgical navigation information.
- the intraoperative contrast module 500 is used for real-time angiography of the surgical site when the posture of the human body is abnormal during the operation (whether due to breathing or other conditions), obtaining angiographic data and transmitting it to the navigation module.
- ⁇ x 0 is greater than or equal to ⁇ x, it is determined that the critical error distance is exceeded.
- Intraoperative angiography is required for abnormalities in human breathing.
- the critical error distance Δx is set according to the number of locators and the specific surgery.
- the error source is mainly the navigation error.
- ⁇ x should be set to 1 cm. Left and right, that is, the average navigation error of each marker point cannot exceed 1 mm.
- ⁇ x should be set to about 3 cm.
- the navigation module 400 is configured to adjust and display the three-dimensional human body model according to the contrast data.
- FIG. 3 is a system operation diagram of a minimally invasive surgery navigation system according to an embodiment of the present invention. As shown in FIG. 3, the specific execution flow of the minimally invasive surgery navigation system of the present invention is as follows:
- the human body locator 302 and the human body locator 201 are attached to the patient's body, and the 4D-CT (Four-Dimensional Computed Tomography) scan and the pose tracking of the human body locator 201 are carried out simultaneously; the CT data are reconstructed according to imaging time into a three-dimensional volume data model for each time interval (recommended interval 0.1 s), and the real-time tracked positioner pose sequence is encoded and archived by the corresponding time intervals.
- the three-dimensional volume data are processed during preoperative planning: the navigation module 400 of the system is used to outline the lesions and the nearby blood vessels, nerves or organs needing attention during the operation, before the data are stored in the 3D database 101.
- the pose state combination information of the human body positioner 201 is placed in the pose lookup table 204.
- the encoding method for the locator pose sequence can be customized.
- an encoding method for four pose locators is provided as an example: since a pose has directionality, the four locators can be identified and numbered separately as A1, A2, A3, A4;
- this method codes the locators pairwise, in "A1-A2", "A2-A3", "A3-A4" fashion, where "A?-A?" represents the spatial orientation relationship between two locators;
- the simplest representation uses the coordinate transformation matrix directly, but that coding is inconvenient to retrieve; the relationship can instead be stored using the "xyz" spatial-angle recording method, whose advantage is that it does not require exact data matching.
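The pairwise angle encoding can be sketched as follows. Quantized azimuth/elevation angles are used here as an illustrative stand-in for the patent's "xyz" spatial-angle recording method (the function names and 5-degree quantization step are assumptions); quantizing is what makes the codes retrievable without exact data matching:

```python
import math

def pair_angles(p, q, step_deg=5.0):
    """Encode the spatial orientation of the locator pair p -> q as
    quantized azimuth/elevation angles (illustrative stand-in for the
    "xyz" spatial-angle recording method)."""
    dx, dy, dz = (q[0] - p[0], q[1] - p[1], q[2] - p[2])
    az = math.degrees(math.atan2(dy, dx))
    el = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    # quantize so nearly identical poses map to the same code
    return (round(az / step_deg) * step_deg,
            round(el / step_deg) * step_deg)

def encode_pose(a1, a2, a3, a4):
    """Code the four locators pairwise: A1-A2, A2-A3, A3-A4."""
    return (pair_angles(a1, a2), pair_angles(a2, a3), pair_angles(a3, a4))

code = encode_pose((0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1))
print(code)  # ((0.0, 0.0), (90.0, 0.0), (0.0, 90.0))
```

The resulting tuple is hashable, so it can serve directly as the key in the pose lookup table.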
- the tracker of the intraoperative data module 200 tracks the pose state of the human body locator 201, and the pose is encoded by the pose encoding module 203; using this code, the preoperative moment corresponding to the current state code of the human body locator 201 is looked up in the pose lookup table 204;
- the navigation module 400 receives and stores the pre-operative time, and the file lookup table 102 of the pre-operative data module 100 locates the location of the file to be extracted in the 3D database 101 according to the pre-operative time.
- the input and output module 103 transmits the found human body three-dimensional model data file to the navigation module 400;
- the positioning module 300 uses the tracker to calculate the real-time position data (position and posture coordinates) of the surgical instrument positioner 301, the human body positioner 302 and the augmented reality device positioner 303 in a unified coordinate system, and transmits the real-time positioning information to the navigation module 400;
- the navigation module 400 receives the position data of the surgical instrument positioner and of the human body positioner from the positioning module 300, obtains the surgical instrument model according to the position data of the surgical instrument positioner, and, based on the position data of the human body positioner, displays the surgical instrument model and the three-dimensional human body model with the augmented reality display device; image denoising and image enhancement operations are also performed when required;
- the processed image is displayed by the augmented reality display device; if the display module uses wearable smart glasses or an augmented reality helmet, the delineated, denoised and enhanced images coincide with the real scene, allowing doctors to quickly access real-time and intuitive surgical navigation information;
- the intraoperative imaging module 500 (mainly using a C-arm or an ultrasound device) performs real-time imaging on the surgical site, and transmits the contrast data to the navigation module 400;
- the navigation module 400 adjusts and displays the three-dimensional human body model according to the contrast data, and continues the operation according to the adjusted three-dimensional human body model.
- the system of the present invention further includes an independent navigation device, which can continue effective pose tracking when the tracked device (i.e., the human body locator or surgical instrument locator) leaves the tracking range of the binocular tracker or is occluded from it.
- the independent navigation device includes: a power source, a gyroscope, an accelerometer, and a wireless communication module, wherein:
- the power supply is used to independently supply power to the gyroscope, accelerometer, and wireless communication module;
- the gyroscope is used as an angular acceleration sensor to acquire the real-time spatial angular acceleration data of the Marker;
- the accelerometer is used as a spatial acceleration sensor to acquire acceleration data in three coordinate directions of the Marker real-time space;
- the wireless communication module is used to transmit the data acquired by the gyroscope and the accelerometer in real time (the relevant data is received by the navigation system host).
- the navigation system host must have a wireless receiving module that wirelessly connects to the wireless communication module, which is used to collect data of the gyroscope and the accelerometer in real time.
- the navigation system host uses the data of the gyroscope and the accelerometer to calculate the relative spatial angle change and spatial position change of the Marker relative to the initial position of the motion.
- the relative spatial angle change can be obtained by double integration of the gyroscope data, and similarly the relative spatial position change can be obtained by double integration of the accelerometer data.
- taking the Marker spatial pose at the moment of occlusion as the starting pose, the real-time spatial pose of the Marker can then be obtained from the real-time data of the gyroscope and the accelerometer.
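The double-integration dead reckoning can be sketched per axis. This is a minimal illustration of the scheme under assumed names and a 100 Hz sampling rate, not the patent's implementation; the same update applies to each position axis, and to the gyroscope's angular data:

```python
def dead_reckon(p0, v0, accel_samples, dt):
    """Propagate one axis of the Marker pose from the last Tracker fix
    by double integration of accelerometer samples.

    p0, v0: position and velocity at the moment the optical path was
    occluded (the last Tracker-provided pose is the starting pose).
    accel_samples: acceleration readings for that axis, one per dt.
    """
    p, v = p0, v0
    for a in accel_samples:
        v += a * dt  # first integration: acceleration -> velocity
        p += v * dt  # second integration: velocity -> position
    return p

# constant 1 m/s^2 for 1 s at 100 Hz, starting at rest
print(round(dead_reckon(0.0, 0.0, [1.0] * 100, 0.01), 3))  # 0.505
```

The printed value differs slightly from the continuous-time ½at² = 0.5 because of the coarse rectangular integration; this discretization error, together with sensor drift, is why the accumulated error grows and periodic recalibration against the Tracker is needed.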
- a gyroscope is an angular motion detecting device that uses the moment of momentum of a high-speed rotating body, sensed through its housing, to measure rotation about one or two axes orthogonal to the spin axis with respect to inertial space.
- the gyroscope is an inertial navigation component: it can accurately measure the angular acceleration of motion, and together with the accelerometer it can measure the acceleration and velocity of motion; multiplying speed by time gives the distance travelled. Hence, in the most demanding inertial navigators of aircraft and missiles, the high-performance gyroscope is one of the most important components; its accuracy determines flight safety and the ability to hit the target accurately.
- Gyroscopes are widely used. For example, most of today's smartphones are also equipped with low-cost gyroscopes.
- the main problem of gyroscopes is that they cannot navigate in real time over long periods, because the gyroscope error gradually accumulates over time, producing ever larger navigation errors; therefore, current aircraft and the like generally combine inertial navigation (gyro navigation) with GPS navigation (satellite navigation).
- using low-cost gyroscopes and accelerometers as sensors, fairly accurate navigation can be performed over a short period, but once the accumulated error reaches a certain level, calibration is needed (to eliminate the accumulated error), thus ensuring the accuracy of motion tracking.
- the navigation system in the surgical system includes a binocular tracker (Tracker); the patient and the surgical instrument each carry a tracked device (hereinafter collectively referred to as a Marker).
- the working principle of the Marker and the working mechanism of the surgical navigation system are analyzed as follows:
- the traditional Marker is an optical positioning lattice consisting of four reflective spheres in a specific coplanar configuration;
- the Tracker calculates the spatial location and posture of the optical positioning lattice by tracking it; if the light path between the Tracker and the positioning points is occluded, or a positioning point moves out of the Tracker's tracking range, the Tracker loses track of the Marker and the navigation system fails.
- the invention focuses on the tracking implementation of the Marker spatial position and posture in this case.
- an independent navigation device built from low-cost sensors (gyroscope + acceleration sensor) is fixed to the Marker.
- the Marker is thus able to independently provide its own spatial position and attitude when the optical path between the Tracker and the Marker is occluded (the spatial pose at the moment the independent navigation device takes over needs to be known in advance).
- the system preferentially uses the navigation data of the Tracker (high precision and stability).
- at the moment of occlusion, the navigation system host records the last spatial pose data provided by the Tracker as the starting pose of the Marker, and from that moment uses the navigation data provided by the Marker's independent navigation device to calculate (by double integration) the Marker's real-time spatial pose.
- since the hold time of the Marker's independent navigation device is limited, the Marker-Tracker optical path needs to be restored at specific intervals, and it must be ensured that each optical path is not interrupted for longer than that specific period; simply put, the Tracker is placed in a position where it will not be touched (keeping the world coordinate system fixed), and the Tracker is used to recalibrate the Marker's spatial pose;
- the specific length of this period depends on actual measurements with the sensors used.
- the measurement methods are as follows:
- the navigation system host takes the real-time spatial pose data (x0, y0, z0, θx0, θy0, θz0) tracked by the Tracker as the reference, and computes the error between it and the spatial pose data (x1, y1, z1, θx1, θy1, θz1) obtained simultaneously by the Marker's independent navigation device.
- the error calculation is defined as follows:
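The error formula itself is not reproduced at this point in the text. A plausible sketch, comparing positions by Euclidean distance and angles per axis, is the following; the metric is an assumption, since the source states only that the two poses are compared:

```python
import math

def pose_error(ref, est):
    """Compare the Tracker reference pose (x0, y0, z0, tx0, ty0, tz0)
    with the independent-device pose (x1, y1, z1, tx1, ty1, tz1).

    Returns (position error as Euclidean distance, maximum per-axis
    angle error). The exact metric is an assumption.
    """
    dp = math.sqrt(sum((r - e) ** 2 for r, e in zip(ref[:3], est[:3])))
    da = max(abs(r - e) for r, e in zip(ref[3:], est[3:]))
    return dp, da

ref = (0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
est = (0.3, 0.4, 0.0, 1.0, -2.0, 0.5)
dp, da = pose_error(ref, est)
print(round(dp, 6), da)  # 0.5 2.0
```

Measuring how quickly these errors grow during occlusion is what determines the maximum allowed interruption time of the optical path.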
- the above problem can be solved by the augmented reality display module 404 (wearable smart glasses or an augmented reality helmet) in the navigation module 400.
- the "Google Glass" smart glasses combine a smartphone, a GPS and a camera to present real-time information in front of the user's eyes and enable voice control. Because "Google Glass" is a wearable device, it can be worn lightly without obscuring the real view or affecting the normal movement of the human body, provides a transparent display, and also effectively supports voice calls and video recording. "Google Glass" is an ideal device for human-computer interaction by doctors in surgery.
- the surgeon can view the transmitted intraoperative angiography or preoperative images through "Google Glass" and perform routine surgery without changing the angle of view; during the operation, the images captured by the camera on "Google Glass" can be transmitted in real time to online expert panel members, who can provide real-time surgical guidance or advice to the surgeon via online audio.
- the entire surgical procedure can also be recorded from the doctor's perspective, and used for postoperative review or for surgical training.
- the navigation information is presented to the doctor through the smart glasses; the smart glasses communicate with other devices via wireless TCP/IP access (for navigation images and related data transmission) and can also perform real-time voice transmission.
- Preoperative image modeling method and improved intraoperative human-body tracking: 4D CT imaging is used with external marker points attached to the human body, so that by tracking the marker group the preoperative images are quickly brought into correspondence with the intraoperative human body model, and the spatial positional relationship between the surgical instrument and the human body model is provided for the surgeon's reference during surgery. This applies not only to operations on body parts that do not move with respiration, but also to operations on parts with large respiratory displacement;
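The patent describes bringing the preoperative images into correspondence with the intraoperative body by tracking the external marker group, but does not spell out the registration algorithm. A standard choice for corresponding point sets is a least-squares rigid fit (the Kabsch algorithm); the sketch below is under that assumption, not a statement of the patent's method:

```python
import numpy as np

def register_markers(preop_pts, intraop_pts):
    """Rigidly register preoperative marker points to intraoperative ones.

    preop_pts, intraop_pts: (N, 3) arrays of corresponding marker
    coordinates. Returns rotation R and translation t such that
    R @ p + t maps preoperative points onto intraoperative points
    in the least-squares sense (Kabsch algorithm).
    """
    p = np.asarray(preop_pts, dtype=float)
    q = np.asarray(intraop_pts, dtype=float)
    pc, qc = p.mean(axis=0), q.mean(axis=0)    # marker-set centroids
    H = (p - pc).T @ (q - qc)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = qc - R @ pc
    return R, t
```

With at least three non-collinear markers this yields the rigid transform that overlays the preoperative model on the tracked intraoperative pose; for body parts that deform with breathing it would be applied per respiratory phase of the 4D CT.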
- an intraoperative contrast module has been added.
- the surgical site can be re-imaged during the operation, and the three-dimensional human body model adjusted according to the contrast data so that the operation can continue.
- intraoperative interaction is more convenient.
- the smart glasses provide the doctor with real-time surgical images and related prompts.
- real-time intraoperative images can be transmitted over the wireless network to the expert group assisting the operation, and the doctors and the expert group can communicate in real time by voice. (3D display technology is currently at an early stage of development and does not yet meet medical-grade requirements.)
- the modules or steps of the embodiments of the present invention can be implemented by a general-purpose computing device; they may be concentrated on a single computing device or distributed across multiple computing devices. Alternatively, they may be implemented by program code executable by a computing device, so that they may be stored in a storage device and executed by the computing device; in some cases, the steps may be performed in an order different from that shown or described, or they may be fabricated separately as individual integrated circuit modules, or multiple modules or steps thereof may be fabricated as a single integrated circuit module. Thus, embodiments of the invention are not limited to any specific combination of hardware and software.
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Robotics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Processing Or Creating Images (AREA)
- Endoscopes (AREA)
Abstract
The present invention relates to a navigation system for minimally invasive surgery, comprising: a preoperative data module (100) for acquiring a 3D model of the human body; an intraoperative data module (200) for acquiring a pose code and retrieving the corresponding preoperative time phase according to the pose code; a positioning module (300) for acquiring position data of the surgical instrument positioner and of the human body positioners (301, 201, 302); a navigation module (400) for receiving the 3D human body model and the position data of the surgical instrument and human body positioners (301, 201, 302), obtaining a model of the surgical instrument from the position data of the surgical instrument positioner (301), and displaying the surgical instrument model and the 3D human body model, based on the position data of the human body positioners (201, 302), by means of an augmented reality display device; and an intraoperative imaging module (500) for acquiring imaging data of the surgical site when an abnormality occurs, the navigation module (400) adjusting the 3D human body model according to the imaging data. This solution effectively superimposes, in real time during the operation, the fixed model generated before the operation onto the natural respiratory movements of the body parts and displays the result, thereby solving the problem that abnormal human breathing during the operation affects minimally invasive surgery.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2015/098590 WO2017107116A1 (fr) | 2015-12-24 | 2015-12-24 | Navigation system for minimally invasive surgery |
| CN201580001146.0A CN107182200B (zh) | 2015-12-24 | 2015-12-24 | Minimally invasive surgery navigation system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2015/098590 WO2017107116A1 (fr) | 2015-12-24 | 2015-12-24 | Navigation system for minimally invasive surgery |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017107116A1 true WO2017107116A1 (fr) | 2017-06-29 |
Family
ID=59088744
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2015/098590 Ceased WO2017107116A1 (fr) | 2015-12-24 | 2015-12-24 | Système de navigation pour une opération à effraction minimale |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN107182200B (fr) |
| WO (1) | WO2017107116A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107432766A (zh) * | 2017-07-04 | 2017-12-05 | 厦门强本宇康科技有限公司 | Precise minimally invasive surgery navigation system |
| WO2021190421A1 (fr) * | 2020-03-27 | 2021-09-30 | 海信视像科技股份有限公司 | Virtual-reality-based method for tracking a controller's light ball, and virtual reality device |
| CN115252128A (zh) * | 2022-07-28 | 2022-11-01 | 上海霖晏医疗科技有限公司 | X-ray image registration method and device |
Families Citing this family (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108629845B (zh) * | 2018-03-30 | 2022-07-12 | 湖南沛健医疗科技有限责任公司 | Surgical navigation device, apparatus, system and readable storage medium |
| CN109087695A (zh) * | 2018-08-06 | 2018-12-25 | 广州高通影像技术有限公司 | Data transmission system for intelligent endoscope images based on the Internet of Things |
| CN109875646A (zh) * | 2019-01-22 | 2019-06-14 | 北京金智拓科技有限公司 | Ultrasonic energy platform control system |
| WO2021007803A1 (fr) * | 2019-07-17 | 2021-01-21 | 杭州三坛医疗科技有限公司 | Positioning and navigation method for fracture reduction and closed surgery, and positioning device for use in the method |
| CN111419399A (zh) | 2020-03-17 | 2020-07-17 | 京东方科技集团股份有限公司 | Positioning tracking member, positioning-ball identification method, storage medium, and electronic device |
| CN111427452B (zh) * | 2020-03-27 | 2023-10-20 | 海信视像科技股份有限公司 | Controller tracking method and VR system |
| CN113456229A (zh) * | 2020-03-31 | 2021-10-01 | 北京图灵微创医疗科技有限公司 | Robotic system for abdominal surgery |
| FR3114957B1 (fr) * | 2020-10-08 | 2022-09-30 | Quantum Surgical | Augmented reality navigation system for a medical robot |
| CN114565741A (zh) * | 2021-12-28 | 2022-05-31 | 杭州堃博生物科技有限公司 | Surgery-assistance data processing method, apparatus, device, medium, and system |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20010007919A1 (en) * | 1996-06-28 | 2001-07-12 | Ramin Shahidi | Method and apparatus for volumetric image navigation |
| CN101019771A (zh) * | 2007-03-28 | 2007-08-22 | 新奥博为技术有限公司 | Navigation system and navigation method supporting multiple modes |
| CN101797182A (zh) * | 2010-05-20 | 2010-08-11 | 北京理工大学 | Augmented-reality-based navigation system for minimally invasive nasal endoscopic surgery |
| CN103479431A (zh) * | 2013-09-26 | 2014-01-01 | 中国科学院深圳先进技术研究院 | Non-intrusive minimally invasive surgery navigation system |
| CN103735312A (zh) * | 2013-12-11 | 2014-04-23 | 中国科学院深圳先进技术研究院 | Multimodal-image ultrasound-guided surgical navigation system |
| CN103860268A (zh) * | 2012-12-13 | 2014-06-18 | 中国科学院深圳先进技术研究院 | Marker point registration method and device, and surgical navigation system |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7206627B2 (en) * | 2002-03-06 | 2007-04-17 | Z-Kat, Inc. | System and method for intra-operative haptic planning of a medical procedure |
| JP5354981B2 (ja) * | 2008-07-14 | 2013-11-27 | キヤノン株式会社 | Document management apparatus, document management method, and program |
| US9572592B2 (en) * | 2012-05-31 | 2017-02-21 | Ethicon Endo-Surgery, Llc | Surgical instrument with orientation sensing |
| CN103040525B (zh) * | 2012-12-27 | 2016-08-03 | 深圳先进技术研究院 | Multimodal medical image surgical navigation method and system |
| CN103247056B (zh) * | 2013-05-29 | 2016-01-13 | 中国人民解放军第三军医大学第一附属医院 | Spatial registration method between a 3D model of the human bone-joint system and 2D images |
| CN103479430A (zh) * | 2013-09-22 | 2014-01-01 | 江苏美伦影像系统有限公司 | Image-guided interventional surgery navigation system |
- 2015-12-24: WO PCT/CN2015/098590 patent/WO2017107116A1/fr (not_active, Ceased)
- 2015-12-24: CN CN201580001146.0A patent/CN107182200B/zh (not_active, Expired - Fee Related)
Also Published As
| Publication number | Publication date |
|---|---|
| CN107182200B (zh) | 2019-12-06 |
| CN107182200A (zh) | 2017-09-19 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN107182200B (zh) | Minimally invasive surgery navigation system | |
| US12336771B2 (en) | Augmented reality navigation systems for use with robotic surgical systems and methods of their use | |
| TWI741359B (zh) | Mixed reality system integrated with a surgical navigation system | |
| US12178520B2 (en) | Model registration system and method | |
| KR101647467B1 (ko) | 3D glasses system for surgical operations using augmented reality | |
| CN103479431B (zh) | Non-intrusive minimally invasive surgery navigation system | |
| CN109925057A (zh) | Augmented-reality-based navigation method and system for minimally invasive spinal surgery | |
| US20230233259A1 (en) | Augmented reality headset systems and methods for surgical planning and guidance for knee surgery | |
| CN109758230A (zh) | Neurosurgical navigation method and system based on augmented reality technology | |
| CN104939925A (zh) | Depth and surface visualization based on triangulation | |
| EP3223677A1 (fr) | Model registration system and method | |
| TWI697317B (zh) | Digital image-reality alignment kit and method for mixed reality systems integrated with surgical navigation | |
| US12042171B2 (en) | Systems and methods for surgical port positioning | |
| US20120004541A1 (en) | Surgery assistance system | |
| CN111936074A (zh) | Monitoring moving objects in an operating room | |
| CN118648066A (zh) | System, method and apparatus for providing an enhanced display | |
| WO2023065495A1 (fr) | Intracranial hematoma puncture and drainage operation system using a robotic arm for puncture | |
| Harders et al. | Multimodal augmented reality in medicine | |
| Zhang et al. | From AR to AI: augmentation technology for intelligent surgery and medical treatments | |
| US20240164844A1 (en) | Bone landmarks extraction by bone surface palpation using ball tip stylus for computer assisted surgery navigation | |
| CN118787449A (zh) | Surgical instrument navigation device and navigation method based on multimodal data fusion | |
| CN111374784A (zh) | Augmented reality (AR) positioning system and method | |
| CN109620409A (zh) | System and method for real-time optimization of endoscope extrinsic parameters | |
| TWM484404U (zh) | Imaging projection system apparatus application | |
| JP2025501263A (ja) | Two-dimensional image registration |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15911115; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 04.10.2018) |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15911115; Country of ref document: EP; Kind code of ref document: A1 |