EP4362844A1 - Système d'assistance chirurgicale à enregistrement amélioré et procédé d'enregistrement - Google Patents
Info
- Publication number
- EP4362844A1 (application number EP22741198.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- registration
- intracorporeal
- patient
- recording
- dimensional
- Prior art date: 2021-07-01
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/37—Determination of transform parameters for the alignment of images, i.e. image registration using transform domain methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2048—Tracking techniques using an accelerometer or inertia sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/20—Surgical microscopes characterised by non-optical aspects
Definitions
- the present disclosure relates to a surgical assistance system for use in a surgical intervention, comprising: at least one, in particular medical, imaging 3D recording device, which is intended and adapted to create a three-dimensional, in particular enlarged, intracorporeal recording of a patient and to make it available in computer-readable form; a tracking system, in particular a surgical navigation system, which is intended and adapted to record and track a surgical procedure area of the patient, in particular the patient with an outer surface/outer structure, preferably together with at least a portion of the 3D recording device; a data provision unit, in particular a storage unit, which is adapted to provide digital 3D recording data, in particular preoperative 3D recording data, of the patient; and a control unit adapted to process the three-dimensional intracorporeal image, the tracking system data and the provided 3D image data.
- the disclosure relates to a registration method and a computer-readable storage medium according to the preambles of the independent claims.
- Surgical navigation systems are now a standard in neurosurgery. They can be used to navigate a procedure and guide an instrument to reach a target surgical site.
- although the navigation systems allow the surgeon to identify anatomical structures during the operation, their accuracy is disadvantageously often severely limited.
- the accuracy is largely defined or specified by the registration of the digital (3D) recording data to the patient, i.e. in particular the assignment of a reference recording/reference image of the preoperative 3D recording data (e.g. MRI recording data or CT recording data) to the intraoperative, i.e. internal, reference recording of the patient.
- preoperative 3D recording data: e.g. MRI recording data or CT recording data
- Such a registration is usually carried out by detecting points on an outer surface of the patient (for example points on the patient's face) as reference points and also assigning these reference points to the corresponding points in the preoperative 3D recording data.
- Various registration methods are available, in particular a point-to-point comparison or a surface comparison between the 3D recording data and the patient.
- the reference points to be measured can be recorded by palpation or contact-free with sensors such as video cameras or lasers.
- the registration can also be carried out by using so-called "fiducials" (optical reference points) on the patient, which are visible in the 3D recording data/in the 3D image set and enable automatic registration.
- An alternative approach to determining a deviation and fitting the 3D acquisition data to the actual location of the tissue is to use intraoperative (during surgery) ultrasound. Loss of accuracy can be at least partially corrected with ultrasound.
- the ultrasound recordings have the disadvantage that they are difficult to interpret and do not provide the required high level of accuracy. In the craniotomy area in particular, navigation must be carried out with the greatest precision.
- the use of an ultrasound device also increases operation time as well as complexity of the operation. An ultrasound device also requires a standing area in the operating room, which is already quite cramped, and reduces accessibility to the operating field.
- Another solution provides a method of compensating for a brain shift, for example, by using the microscope image and well-defined anatomical structures (especially vessels) to manually shift the 3D image data/3D image dataset in such a way that it ultimately at least roughly matches the microscope image.
- this method is very subjective and requires manual adjustment, further delaying intervention.
- this method does not take into account all six possible degrees of freedom (three degrees of freedom of a translation and three degrees of freedom of a rotation), since only a translational correction is performed.
- US Pat. No. 9,336,592 B2 discloses a system that is based on a comparison of 3D surfaces extracted from preoperative 3D recording data/a 3D data set with a (real) 3D surface that was extracted from a stereo camera system.
- a fundamental task can be seen in particular in maintaining a high level of accuracy of a navigation throughout the entire intervention, in particular when tissue is spatially changed and moves as a result of the intervention.
- Another sub-task can be seen in carrying out a simple intraoperative registration without the need for a biomechanical model.
- another sub-task can be seen as being able to carry out a new registration at any time intraoperatively (ie during the intervention) in order to increase accuracy as required and, above all, to provide rapid registration.
- Another sub-task can also be seen as minimizing the computing effort for a registration.
- the objects are achieved with regard to a generic surgical assistance system by the features of claim 1, with regard to a generic registration method by the features of claim 8 and with regard to a computer-readable storage medium by the features of claim 10.
- a two-stage registration is proposed for the surgical assistance system and also for the registration method, in order in particular to simplify and reduce the computational complexity of the registration and to limit an area of error.
- a basic idea of the present disclosure is that the assistance system or the registration method is adapted to carry out the registration in at least two steps (hereinafter referred to as first (registration) step A and second (registration) step B). While in the first registration step A an outer surface or structure of the patient, in particular a face, is used for a first registration, in the second registration step B an inner/intracorporeal structure or tissue of the patient is used for a refined registration that takes place later in time.
- the first registration step A leads to a (standardized) first registration, which is initially quite accurate before the intervention.
- This first registration step A or the first registration is carried out in particular before the start of the intervention, i.e. before the patient is opened and the intervention begins (for example in the case of an intervention with a skull opening, a dural opening, a tissue resection or a tumor removal).
- the second registration step B is carried out in order to adapt the registration to the changed tissue position and, so to speak, to refine it even further.
- the registration from registration step A is used as a basis.
- the adaptive second registration step B can be carried out at any time during the intervention, usually when a critical resection begins (e.g. resection of tumor margins) and when the surgeon needs high precision in locating the position of the resection instrument.
- the registration of the second registration step B is based on an intraoperative visualization system / a 3D recording device that creates three-dimensional (3D) intracorporeal images and enables the surgeon to locate identifiable anatomical landmarks of the operated soft tissue within the surgical field and preferably also to set them. These landmarks can be sulcus structures, vessels or lesions, for example.
- the surgeon can use the visualization system/3D recording device to capture selected landmarks that correspond to the landmarks identified in the 3D recording data, in particular the preoperative 3D recording data, or vice versa.
- the 3D recording device can have a stereo camera (i.e. two optical systems spaced apart from one another) in order to calculate and create a three-dimensional intracorporeal recording using the two offset (two-dimensional) intracorporeal recordings or from two different viewing positions.
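- For illustration only (not part of the patent text): with a rectified stereo pair, the depth of a point follows from the disparity between the two offset recordings according to the pinhole model; the function and parameter names below are assumptions.

```python
# Minimal sketch: depth from stereo disparity for a rectified image pair (pinhole model).
# Illustrative assumption only; parameter names are not taken from the patent.
def depth_from_disparity(focal_length_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Return the depth in millimetres of a point visible in both stereo images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_length_px * baseline_mm / disparity_px
```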
- a surgical assistance system having a tracking system, in particular a surgical one, the 3D visualization system being adapted to generate three-dimensional, i.e. spatial, intracorporeal recordings of the internal structures of the patient, and the tracking system being adapted to track the 3D visualization system.
- the system is adapted to register 3D recording data, which were recorded in particular before the intervention (i.e. preoperatively), to this patient using an outer surface of the patient.
- the 3D visualization system (3D recording system with 3D recording device) generates intracorporeal three-dimensional recordings/3D recordings with at least one three-dimensional position (3D position with three coordinates) and/or at least one position of intracorporeal structures from the inside or from internal tissue of the patient (near the surgical field).
- Structures from the three-dimensional intracorporeal recording with the at least one three-dimensional position are correlated with corresponding (intracorporeal) structures from the 3D recording data, in particular from the preoperative 3D recording data/3D image set, in order to improve the registration and to increase the registration accuracy for the area of interest.
- this assistance system and also the registration method allow a correction of accuracy errors resulting from the surgical workflow (e.g. a slight displacement of a tracker/marker during draping) and are not limited to a correction of a tissue displacement, in particular a brain shift.
- the surgical assistance system can also be used continuously during the intervention in order to increase the accuracy of a registration.
- Another advantage is that not only intraoperative two-dimensional images with reduced information content (2D data) are used, which would require a large exposure of the brain and a biomechanical model of the brain; rather, the three-dimensional intracorporeal image means that only a small local intervention area is needed, without requiring a biomechanical model.
- a surgical assistance system and registration method that ensures accurate registration during the use of surgical navigation by using a two-step registration process with initial pre-procedure registration using the patient's external surface and with refined registration after the start of the procedure using the internal structure of the patient in the vicinity of the surgical site.
- the surgical assistance system, in particular the control unit, is adapted to register the 3D recording data to an outer surface of the patient as a first registration and in particular to store this first registration in the memory unit; and the control unit is adapted to determine at least one landmark and/or a surface and/or a three-dimensional volume of an intracorporeal tissue as an intracorporeal reference from/in the three-dimensional intracorporeal image and/or the 3D image data, and, on the basis of the first registration and of the determined intracorporeal reference, to register/correlate the 3D recording data to the (inner) intracorporeal tissue/structure of the three-dimensional intracorporeal recording (IA) as a second registration in order to increase the accuracy of the registration.
- a correlation can be carried out particularly quickly and efficiently in the second registration. Possible misregistration errors that can arise during the calculations are also prevented or at least minimized.
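- The two-stage scheme described above can be summarised in the following minimal sketch (an illustration under assumptions, not the claimed implementation); `first_registration` and `rigid_fit` stand for any suitable surface-matching and landmark-fitting routines that return 4x4 homogeneous transforms.

```python
import numpy as np

def two_stage_registration(skin_surface_pts, model_surface_pts,
                           intraop_landmarks, model_landmarks,
                           first_registration, rigid_fit):
    """Sketch of the two-stage scheme: step A registers the preoperative 3D data
    to the outer surface; step B refines that result with intracorporeal landmarks."""
    # Step A: first registration on the outer surface (e.g. the face).
    T_first = first_registration(skin_surface_pts, model_surface_pts)
    # Map the preoperative landmarks through the first registration.
    mapped = np.asarray(model_landmarks) @ T_first[:3, :3].T + T_first[:3, 3]
    # Step B: rigid refinement against the landmarks seen in the intracorporeal recording.
    T_delta = rigid_fit(mapped, np.asarray(intraop_landmarks))
    return T_delta @ T_first  # the refined second registration
```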
- 3D recording data means three-dimensional, spatial recording data, i.e. data with three dimensions, for example in an X, Y and Z direction.
- external surface means a patient's body surface that is normally visible without an opening or incision.
- the outer surface is the patient's skin. An external visible structure of the patient is thus recorded.
- position means a geometric position in three-dimensional space, which is specified in particular by means of coordinates of a Cartesian coordinate system. In particular, the position can be given by the three coordinates X, Y and Z.
- orientation in turn indicates an alignment (e.g. at the position) in space.
- orientation indicates an alignment with a direction or rotation specification in three-dimensional space.
- orientation can be specified using three angles.
- location (pose) includes both a position and an orientation.
- location can be specified using six coordinates, three position coordinates X, Y and Z and three angular coordinates for orientation.
- control unit may be adapted to perform the second registration based on a rigid transformation.
- a rigid transformation means that the three-dimensional intracorporeal recording is only translationally displaced and/or rotationally rotated in relation to the 3D recording data. This reduces the required computing effort. The 3D recording data is therefore not adjusted in such a way that local distortion or deformation would be taken into account.
- the registration correction performed is thus a rigid transformation that does not take into account elastic deformation of the brain.
- This rigid transformation hardly affects the surgical accuracy since the registration correction is indeed performed precisely in the area of the surgical field, so the rigid transformation provides the required precision in this area of interest, while outside this area in case of elastic deformation the precision is irrelevant
- the advantage here is that no biomechanical model is required for an adjustment.
- the correlated landmarks and/or surfaces can be adjusted using a rigid transformation, in particular, in order to refine the registration and obtain a high level of navigation accuracy in the intervention area.
- the surgical assistance system can have a surgical 3D microscope and/or a surgical 3D endoscope and/or a medical instrument with an optical 3D camera as a 3D recording device for creating the three-dimensional intracorporeal recording.
- All three medical devices, i.e. the surgical (3D) microscope, the surgical (3D) endoscope and the instrument with the 3D camera, are adapted to create three-dimensional intracorporeal images from inside the body of the patient.
- a 3D microscope, a 3D endoscope or an optical camera system can be used as the intraoperative 3D recording device (the intraoperative imaging device), which takes the (enlarged) picture from inside the patient, in particular in the form of 3D point cloud data (i.e. a set of points in three-dimensional space).
- the control unit can be adapted to determine at least one point-like landmark and/or at least one surface for the second registration/correlation in the three-dimensional intracorporeal recording. Surfaces can also be used as an alternative or in addition to landmarks. Both surfaces and landmarks can be automatically extracted and compared, in particular using standard image processing algorithms. To do this, corresponding regions between the 3D recording data and the intraoperative image of the visualization system must be selected, with the surgical assistance system allowing manual input via a touch display and/or the control unit being adapted to carry out the correlation or registration automatically. Point-like landmarks and/or surfaces can thus be determined in the intracorporeal structure of the patient's inner tissue.
- the control unit can be adapted to detect or determine exactly a single landmark, in particular a single landmark point in space, and, based on this one landmark, in particular the landmark point, to perform the registration, in particular the second registration, with a transformation correction with respect to translation only, while the rotation is kept unchanged.
- This configuration represents the simplest and, from a computational point of view, the most efficient and fastest correction of the registration.
- the second registration can thus be carried out with only a single set landmark point.
- the registration offset can be corrected in at least three degrees of freedom (translation only). In particular, therefore, only one landmark can be used to carry out a transformation correction, while the rotation remains unchanged.
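- A minimal sketch of this translation-only correction (an illustrative assumption, not the patent's implementation): the residual offset of the single landmark is added to the translation part of the first registration, while the rotation block is left untouched.

```python
import numpy as np

def refine_with_single_landmark(T_first: np.ndarray,
                                landmark_model: np.ndarray,
                                landmark_intraop: np.ndarray) -> np.ndarray:
    """Translation-only correction of a first registration (4x4 homogeneous matrix)
    so that one preoperative landmark maps exactly onto its intraoperative position."""
    predicted = T_first[:3, :3] @ landmark_model + T_first[:3, 3]  # where the first registration puts the landmark
    offset = np.asarray(landmark_intraop) - predicted              # residual in the three translational degrees of freedom
    T_second = T_first.copy()
    T_second[:3, 3] += offset                                      # rotation is kept unchanged
    return T_second
```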
- control unit can be adapted to detect or determine exactly two landmarks, in particular two landmark points spaced apart from one another, in space and to carry out the registration on the basis of the two landmarks.
- with two landmarks, five degrees of freedom can be corrected (translation and two rotations).
- control unit can be adapted to detect or determine at least three landmarks, in particular at least three landmark points spaced apart from one another, in space and to carry out the registration on the basis of the at least three landmarks.
- all six degrees of freedom can be corrected (translation and three rotations).
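- With three or more corresponding landmark points, such a rigid correction (three translations, three rotations) can, for example, be estimated with a Kabsch/SVD fit, as in the following hedged sketch (the point arrays and function name are assumptions):

```python
import numpy as np

def rigid_fit(points_model: np.ndarray, points_intraop: np.ndarray) -> np.ndarray:
    """Estimate a 4x4 rigid transform (rotation + translation, no deformation)
    mapping >= 3 preoperative landmark points onto their intraoperative counterparts."""
    src = np.asarray(points_model, dtype=float)
    dst = np.asarray(points_intraop, dtype=float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)                            # cross-covariance of the centred point sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])    # guard against reflections
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```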
- the surgical assistance system can have a surgical 3D microscope, with the focus point of the microscope being recorded and processed by the control unit, and the landmark (position) being recorded intraoperatively on the basis of the focus point.
- the control unit is adapted in particular to use the tracking system, in particular the navigation system, to track the microscope and to determine a position and/or orientation of a microscope head with an optical system in order to determine the position of the focal point relative to the patient or to the patient reference system.
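- For illustration only (the pose conventions and names below are assumptions): given the tracked pose of the microscope head and of the patient reference, the focal point can be expressed in the patient reference system as follows.

```python
import numpy as np

def focal_point_in_patient_frame(T_world_from_scope: np.ndarray,
                                 T_world_from_patient: np.ndarray,
                                 focal_point_in_scope: np.ndarray) -> np.ndarray:
    """Transform the microscope focal point (given in the tracked microscope-head frame)
    into the patient reference frame, using 4x4 poses reported by the tracking system."""
    p_world = T_world_from_scope[:3, :3] @ focal_point_in_scope + T_world_from_scope[:3, 3]
    R_p, t_p = T_world_from_patient[:3, :3], T_world_from_patient[:3, 3]
    return R_p.T @ (p_world - t_p)  # apply the inverse of the patient pose to the world point
```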
- the surgical assistance system can have a surgical tool and be adapted to geometrically detect the surgical tool, in particular a suction tube, and track it through the tracking system, and the control unit can also be adapted, when a command is input by the user, to determine a position of a distal tip, in particular the distal tip of the suction tube, and to set the landmark on the basis of this position.
- a user can therefore preferably simply point to the desired landmark with the distal end of the suction tube and, for example, enter a command for setting a landmark on a touch display in order to record this landmark.
- the three-dimensional intracorporeal recording of the 3D recording device (the intraoperative visualization device) can be used for the detection, since the position of the landmark reflects the real position.
- the tracking system in particular the navigation system, can preferably have an infrared-based navigation system and/or an electromagnetic navigation system and/or an image-processing navigation system.
- infrared markers/infrared trackers are preferably provided on the patient and on the 3D recording device.
- further infrared markers can be provided at the intended surgical intervention area (either still closed or open) in order to precisely detect this area of particular interest and, for example, to carry out an initial registration based on this area.
- EM navigation system: electromagnetic navigation system
- EM sensors can be attached to the patient and/or 3D recording device.
- in an image-processing navigation system, navigation is performed using machine vision image analysis to detect movement such as translation and rotation.
- the tracking system in particular the surgical navigation system, can therefore be based on infrared tracking/infrared-based tracking, EM tracking/electromagnetic tracking and/or machine vision tracking/optical image processing tracking.
- the control unit can be adapted to perform the first registration by means of a point-to-point matching and/or a surface matching using the external surface/structure of the patient.
- the first registration or the registration method of the first registration step A can therefore be based in particular on a point-to-point registration and/or a surface comparison and/or a video registration and/or a fiducial comparison/mapping using an intraoperative 3D scanner such as a computed tomograph (CT) or a magnetic resonance tomograph (MRT).
- CT: computed tomograph
- MRT: magnetic resonance tomograph
- the first registration with preferably point-to-point adjustment and/or surface adjustment is carried out using the external structure or the external surface of the patient.
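- One common way of realising such a surface comparison is an iterative-closest-point style fit; the following is a hedged sketch under the assumption of roughly pre-aligned point clouds (the patent does not specify which matching algorithm is used). Inverting the resulting matrix gives the transform in the opposite direction, i.e. from the 3D recording data onto the patient.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_surface_match(surface_pts, model_pts, iterations: int = 20) -> np.ndarray:
    """Return a 4x4 rigid transform aligning measured skin-surface points onto the
    surface extracted from the 3D recording data (nearest-neighbour ICP sketch)."""
    src = np.asarray(surface_pts, dtype=float)
    model = np.asarray(model_pts, dtype=float)
    tree = cKDTree(model)
    T = np.eye(4)
    for _ in range(iterations):
        _, idx = tree.query(src)                    # nearest-neighbour correspondences
        dst = model[idx]
        c_s, c_d = src.mean(axis=0), dst.mean(axis=0)
        H = (src - c_s).T @ (dst - c_d)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = c_d - R @ c_s
        src = src @ R.T + t                         # apply the incremental rigid update
        T_inc = np.eye(4)
        T_inc[:3, :3], T_inc[:3, 3] = R, t
        T = T_inc @ T
    return T
```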
- the tasks are solved by the steps: acquiring 3D image data of a patient; registering the 3D acquisition data to an outer surface of the patient as a first registration; creating a three-dimensional intracorporeal recording of the patient using a 3D recording device; determining at least one landmark and/or a surface and/or a volume in the three-dimensional intracorporeal image and/or the 3D image data as an intracorporeal reference; and registering, on the basis of the first registration and the determined intracorporeal reference, the 3D image data to the internal structures detected by the three-dimensional intracorporeal image as a second registration in order to increase the accuracy of the registration.
- the registration method can also have the steps: detecting at least one landmark and/or one surface in the three-dimensional intracorporeal recording; and performing a rigid transformation to register the 3D acquisition data to the intracorporeal structure. This further reduces the amount of computation required by a processor, for example, since the rigid transformation does not require any complex mathematical calculations.
- a computer-readable storage medium comprising instructions which, when executed by a computer, cause it to carry out the method steps of the registration method according to the present disclosure.
- Any disclosure related to the surgical assistance system of the present disclosure also applies to the registration method of the present disclosure, just as any disclosure related to the registration method according to the present disclosure also applies to the surgical assistance system of the present disclosure.
- the features of the adaptation of the control unit can be used analogously for a method step, as well as a method step for a corresponding analog adaptation of the control unit.
- Fig. 1 is a schematic perspective view of a surgical instrument
- FIG. 3 shows a schematic representation to explain a registration method with two-stage registration
- FIG. 4 shows a flow chart of a registration method of a further preferred embodiment.
- FIG. 1 shows a schematic view of a surgical assistance system 1 (hereinafter only referred to as assistance system) of a first preferred embodiment.
- the assistance system 1 is used in a medical sterile room. An open procedure is performed on the patient in a central sterile surgical area.
- a rigid 3D video endoscope 2 (hereinafter simply called endoscope) with a handpiece/handle 4 and a spaced, front-side imaging 3D recording head 6 is positioned inside the patient's body on the endoscope shaft side.
- the 3D recording head 6 has a 3D endoscope video camera in the form of a distal stereo camera, which creates a three-dimensional intracorporeal recording IA of the body interior of the patient P as a current (live video) recording in the direction of a longitudinal axis of the endoscope 2 on the front side and makes it available digitally.
- a display device in the form of an OP monitor 8 can then be used to display visual content for those involved in the OP for navigation. It is pointed out that the endoscope 2 can alternatively also be guided by a robot arm at its handling section.
- the surgical assistance system 1 has a data supply unit 10 in the form of a memory, in which digital, preoperative 3D recording data 3DA of the patient P to be treated are stored in digital/computer-readable form.
- MRT recording data or CT recording data which virtually or digitally image a body of the patient P or at least a part of the body, can be stored as 3D recording data 3DA.
- the assistance system 1 has a control unit 12 for detection, processing and calculation as well as for control, which is adapted to process the three-dimensional intracorporeal recording IA of the endoscope 2 and the 3D recording data 3DA.
- the surgical assistance system 1 also has a tracking system 14, which, among other things, continuously detects the endoscope 2 and the recording head 6 and provides the control unit 12 with endoscope position data.
- the tracking system 14 has an infrared-based navigation system with an external 3D stereo camera 16 and infrared markers 18 that are provided on the handle.
- an endoscope-internal sensor, for example an IMU sensor, and/or an image analysis carried out by the control unit 12 can also be used for tracking.
- the surgical assistance system 1 is adapted to carry out a two-stage registration: an outer patient structure is used for the first registration, and then, on the basis of this first registration and of landmarks detected using the 3D recording device in the form of the endoscope 2, the internal structure of the patient P is used for an intraoperative, second, more refined registration.
- the tracking system 14 and the control unit 12 are adapted to initially register the 3D recording data 3DA, via the tracking system 14 with the external 3D stereo camera 16, to an outer surface of the patient P, here the human body with the (planned) surgical area and the head, as the first registration.
- This first registration usually takes place before the intervention, when the patient P has been placed in the operating room.
- the transformation matrix determined in the first registration is stored in the storage unit 20 . This completes the initial or first registration as registration step A and an overview registration or basic registration, so to speak, is available to the surgical assistance system throughout the entire operation.
- control unit 12 is adapted to detect anatomical landmarks 15 in the three-dimensional intracorporeal image IA of the endoscope 2 after the start of the intervention and to determine corresponding anatomical landmarks 15′ in the 3D image data provided.
- in this embodiment, the surgeon selects three predefined characteristic (real) landmarks 15.
- These three landmarks 15 in real space can also be found as three (virtual) landmarks 15' in virtual space, just as the endoscope 2 in virtual space can also be represented as a virtual endoscope 2'.
- the three-dimensional intracorporeal recording IA is correlated with the 3D recording data, ie registered, via the two times three corresponding anatomical landmarks 15, 15'.
- the control unit 12 is specially adapted to determine three landmarks and/or a surface of the internal structure or an intracorporeal tissue as an intracorporeal reference from the three-dimensional intracorporeal recording IA. Thereafter, based on the first registration and the determined intracorporeal reference, the control unit 12 registers the 3D recording data 3DA on the (inner) intracorporeal tissue/the inner structure as a second registration in order to increase the accuracy of the registration. For example, if the liver is moved, the surgeon can use the surgical assistance system 1 to perform the second registration again and re-register to the actual location of the liver in the surgical area of interest.
- the surgical assistance system 1 is therefore specially adapted to carry out a two-stage registration method, with the second registration being able to be repeated intraoperatively several times at any desired time.
- FIG. 2 shows a registration method for registering 3D recording data 3DA on a tissue of a patient P according to a first preferred embodiment.
- a first registration of the 3D recording data 3DA of the patient P is carried out on the front surface of the patient's face.
- the geometric structure of the 3D image data of the patient's face is correlated with the detected three-dimensional structure of the real face of the patient P in order to obtain a first transformation between the 3D image data and the patient P.
- this step is carried out before an intervention.
- the skull of patient P can be draped around the planned surgical site.
- an incision step follows, in which the skullcap is opened and the internal structure of the cerebrum is then visible. Due to the incision, the opening and manipulation with medical instruments, the position of the brain can change in some areas, so that the 3D recording data 3DA in the area of the target intervention no longer matches the actual internal structure of the patient P.
- a second registration is therefore carried out.
- the visible intracorporeal structure of the cortical surface is spatially recorded by the 3D recording device and compared with the 3D recording data 3DA via suitable landmarks and/or surfaces, correlated and thus registered as a second registration.
- a transformation matrix is determined that carries out a transformation only within defined limits compared to the first registration with its first transformation, in particular a first transformation matrix.
- the basis is thus specified by the first registration, and the second registration is only carried out in a specified range (a predefined tolerance range) compared to the first registration, in order to refine the registration intraoperatively at any time. Due to the basis of the first registration, a correlation is carried out particularly quickly and efficiently. It also minimizes potential misregistration errors that can occur computationally. For example, if there is only a small structure of 3D image data 3DA of the patient P's body, for example a muscle section in the left arm, only the muscle section in the left arm and not in the right arm can be determined via the correlation. In other words, the two-stage method with initial registration specifies a rough minimum accuracy with the first registration, which can be refined later but no longer coarsened.
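- A hedged sketch of such a limit check (the tolerance values and names are purely illustrative assumptions): the second registration is only accepted if it deviates from the first registration by less than a configurable translation and rotation tolerance.

```python
import numpy as np

def within_tolerance(T_first: np.ndarray, T_second: np.ndarray,
                     max_shift_mm: float = 15.0, max_angle_deg: float = 10.0) -> bool:
    """Check that the second registration stays inside a predefined tolerance band
    around the first registration (translation norm and rotation angle of the delta)."""
    delta = np.linalg.inv(T_first) @ T_second
    shift = np.linalg.norm(delta[:3, 3])
    cos_angle = np.clip((np.trace(delta[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    angle = np.degrees(np.arccos(cos_angle))        # rotation angle of the relative transform
    return shift <= max_shift_mm and angle <= max_angle_deg
```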
- FIG. 3 shows a schematic sequence of a registration method according to a further preferred embodiment.
- This registration method is characterized by the fact that based on the three-dimensional intracorporeal recording IA, even the 3D recording data 3DA can also be adjusted to the actual structure after opening the skull.
- the three-dimensional structure of the skin surface of the patient P's face, both in the 3D recording data 3DA and in reality, is used for a first registration.
- a craniological or cerebral area is spatially recognized or recorded by machine vision in the three-dimensional intracorporeal image IA and projected onto the preoperative 3D image data 3DA.
- a visible area of the brain can thus be determined in the 3D recording data 3DA.
- This reconstruction is used for the 3D recording data 3DA in order to carry out a cortical surface comparison.
- the preoperative 3D recording data 3DA are then adjusted such that the intervention area is updated via the three-dimensional reconstruction, ie the removed skullcap is removed from the 3D recording data 3DA and the position of the brain is adjusted.
- FIG. 4 shows a flow chart of a registration method of a further preferred embodiment with a two-stage registration process.
- preoperative 3D recording data 3DA is recorded, in particular MRT recording data or CT recording data.
- the navigation is prepared or set up.
- trackers are provided on the patient, on the 3D recording device (3D visualization system) and/or on an instrument.
- the registration structure is preferably defined.
- landmarks and/or surfaces are determined in the preoperative 3D recording data 3DA.
- a three-dimensional structure of the patient's face can be defined as the registration structure for the first registration and/or a cerebral region for the second registration.
- the first registration takes place in the next step S4. This takes place in particular on the basis of the previously defined outer surface of the face of patient P.
- the initial registration may be based on point-to-point registration and/or surface mapping and/or video surface acquisition such as a stereo camera and/or fiducial mapping.
- the first registration is complete and the surgical area can be draped and opened with an incision.
- a three-dimensional intracorporeal image IA is created in a step S5 using a 3D imager, which image is used to refine the registration.
- At least one landmark and/or surface is determined in the three-dimensional intracorporeal image IA, in particular the defined registration structure is used, and the intraoperative structure of the patient is compared with the preoperative structure of the patient for the second registration.
- in this step, in particular, only a rigid transformation (that is to say only a translation and a rotation of the three-dimensional recordings IA, 3DA relative to one another) is carried out in order to increase the accuracy of the registration for the target area of interest.
- if instruments are now used for an operation, they can be tracked via a tracking system and located precisely in the 3D recording data 3DA, so that navigation using the 3D recording data 3DA alone is possible in order to guide the instrument to the target application area.
- the position of the 3D recording device is continuously tracked/captured in step S7.
- the second registration can be repeated at any time, particularly if the tissue has moved. If the condition B1 of a new registration is met (Yes), the registration method goes back to step S5 and based on the three-dimensional intracorporeal recording IA of the 3D recording device, a correlation with the 3D recording data 3DA is determined.
- if no new registration is desired (No), the method proceeds to condition B2. As long as condition B2 does not determine that the method should be ended, step S7 is executed again. Otherwise the method is terminated.
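- The flow of FIG. 4, including the repeat conditions B1 and B2, can be outlined structurally as follows (every function name is a placeholder assumption, not part of the patent).

```python
def run_two_stage_workflow(acquire_3d_data, setup_navigation, define_structures,
                           register_outer_surface, refine_intracorporeal,
                           track_and_navigate, new_registration_requested, finished):
    """Structural sketch of the registration workflow of FIG. 4 (steps S1-S7, conditions B1/B2)."""
    data_3d = acquire_3d_data()              # S1: record preoperative 3D recording data
    setup_navigation()                        # S2: set up navigation, attach trackers
    define_structures(data_3d)                # S3: define registration structures / landmarks
    T = register_outer_surface(data_3d)       # S4: first registration on the outer surface
    T = refine_intracorporeal(data_3d, T)     # S5 and following: intracorporeal rigid refinement
    while not finished():                     # B2: terminate the method?
        track_and_navigate(T)                 # S7: continuous tracking and navigation
        if new_registration_requested():      # B1: new registration required -> back to S5
            T = refine_intracorporeal(data_3d, T)
    return T
```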
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Biomedical Technology (AREA)
- Veterinary Medicine (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Molecular Biology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Robotics (AREA)
- Theoretical Computer Science (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Pathology (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Image Processing (AREA)
Abstract
The invention relates to a surgical assistance system (1) for use in a surgical intervention, comprising: at least one imaging 3D recording device which is designed and adapted to be used on a patient (P) to create a three-dimensional intracorporeal recording (IA); a tracking system (14) which is designed and adapted to detect at least one surgical intervention area of the patient (P) and to track it; a data provision unit (10) which is adapted to provide digital 3D recording data (3DA) of the patient (P); and a control unit (12) which is adapted to process the three-dimensional intracorporeal recording (IA), the data of the tracking system (14) and the provided 3D recording data (3DA), wherein the surgical assistance system (1) is adapted to register the 3D recording data (3DA) to an outer surface of the patient (P) as a first registration; and the control unit (12) is adapted to determine, from the three-dimensional intracorporeal recording (IA) and/or the 3D recording data (3DA), at least one landmark and/or a surface and/or a three-dimensional volume of an intracorporeal structure as an intracorporeal reference, and to register, as a second registration, on the basis of the first registration and of the determined intracorporeal reference, the 3D recording data (3DA) to the intracorporeal structure of the patient (P) detected as the three-dimensional intracorporeal recording (IA). The invention further relates to a registration method and a computer-readable storage medium.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102021117004.4A DE102021117004A1 (de) | 2021-07-01 | 2021-07-01 | Chirurgisches Assistenzsystem mit verbesserter Registrierung und Registrierverfahren |
| PCT/EP2022/067925 WO2023275158A1 (fr) | 2021-07-01 | 2022-06-29 | Système d'assistance chirurgicale à enregistrement amélioré et procédé d'enregistrement |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP4362844A1 true EP4362844A1 (fr) | 2024-05-08 |
Family
ID=82493976
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP22741198.0A Pending EP4362844A1 (fr) | 2021-07-01 | 2022-06-29 | Système d'assistance chirurgicale à enregistrement amélioré et procédé d'enregistrement |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US20240285351A1 (fr) |
| EP (1) | EP4362844A1 (fr) |
| CN (1) | CN117580541A (fr) |
| DE (1) | DE102021117004A1 (fr) |
| WO (1) | WO2023275158A1 (fr) |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240020840A1 (en) * | 2022-07-15 | 2024-01-18 | Globus Medical, Inc. | REGISTRATION OF 3D and 2D IMAGES FOR SURGICAL NAVIGATION AND ROBOTIC GUIDANCE WITHOUT USING RADIOPAQUE FIDUCIALS IN THE IMAGES |
| CN117137450B (zh) * | 2023-08-30 | 2024-05-10 | 哈尔滨海鸿基业科技发展有限公司 | 一种基于皮瓣血运评估的皮瓣移植术成像方法和系统 |
| DE102024201661A1 (de) * | 2024-02-22 | 2025-08-28 | Carl Zeiss Meditec Ag | Verfahren zur Patientenregistrierung an einem medizinischen Visualisierungssystem und medizinisches Visualisierungssystem |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2013116694A1 (fr) * | 2012-02-03 | 2013-08-08 | The Trustees Of Dartmouth College | Procédé et appareil de détermination d'un décalage des tumeurs durant une opération chirurgicale à l'aide d'un système de cartographie de surface stéréo-optique en trois dimensions |
| US10568535B2 (en) | 2008-05-22 | 2020-02-25 | The Trustees Of Dartmouth College | Surgical navigation with stereovision and associated methods |
| US10314658B2 (en) | 2017-04-21 | 2019-06-11 | Biosense Webster (Israel) Ltd. | Registration of an anatomical image with a position-tracking coordinate system based on visual proximity to bone tissue |
| US11540767B2 (en) | 2017-07-03 | 2023-01-03 | Globus Medical Inc. | Intraoperative alignment assessment system and method |
| US20220215625A1 (en) | 2019-04-02 | 2022-07-07 | The Methodist Hospital System | Image-based methods for estimating a patient-specific reference bone model for a patient with a craniomaxillofacial defect and related systems |
-
2021
- 2021-07-01 DE DE102021117004.4A patent/DE102021117004A1/de active Pending
-
2022
- 2022-06-29 EP EP22741198.0A patent/EP4362844A1/fr active Pending
- 2022-06-29 WO PCT/EP2022/067925 patent/WO2023275158A1/fr not_active Ceased
- 2022-06-29 US US18/575,221 patent/US20240285351A1/en active Pending
- 2022-06-29 CN CN202280046581.5A patent/CN117580541A/zh active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| CN117580541A (zh) | 2024-02-20 |
| US20240285351A1 (en) | 2024-08-29 |
| DE102021117004A1 (de) | 2023-01-05 |
| WO2023275158A1 (fr) | 2023-01-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| DE102007013807B4 (de) | Verfahren zur Unterstützung der Navigation interventioneller Werkzeuge bei Durchführung von CT- bzw. MRT-gesteuerten Interventionen in einer vorgegebenen Interventionsebene | |
| DE10202091B4 (de) | Vorrichtung zur Ermittlung einer Koordinatentransformation | |
| DE10047382C2 (de) | Röntgenkalibrierphantom, Verfahren zur markerlosen Registrierung für navigationsgeführte Eingriffe unter Verwendung des Röntgenkalibrierphantoms und medizinisches System aufweisend ein derartiges Röntgenkalibrierphantom | |
| DE60032475T2 (de) | Navigationsführung über computergestützte fluoroskopische bildgebung | |
| DE10322739B4 (de) | Verfahren zur markerlosen Navigation in präoperativen 3D-Bildern unter Verwendung eines intraoperativ gewonnenen 3D-C-Bogen-Bildes | |
| EP4213755B1 (fr) | Système d'assistance chirurgicale | |
| EP2632382B2 (fr) | Accessoire de navigation pour appareils optiques en médecine et procédé associé | |
| DE10136709B4 (de) | Vorrichtung zum Durchführen von operativen Eingriffen sowie Verfahren zum Darstellen von Bildinformationen während eines solchen Eingriffs an einem Patienten | |
| DE19846687C2 (de) | Chirurgische Hilfsvorrichtung zur Verwendung beim Ausführen von medizinischen Eingriffen und Verfahren zum Erzeugen eines Bildes im Rahmen von medizinischen Eingriffen | |
| EP4362844A1 (fr) | Système d'assistance chirurgicale à enregistrement amélioré et procédé d'enregistrement | |
| DE102010020284A1 (de) | Bestimmung von 3D-Positionen und -Orientierungen von chirurgischen Objekten aus 2D-Röntgenbildern | |
| DE10323008A1 (de) | Verfahren zur automatischen Fusion von 2D-Fluoro-C-Bogen-Bildern mit präoperativen 3D-Bildern unter einmaliger Verwendung von Navigationsmarken | |
| DE102008044529A1 (de) | System und Verfahren zur Verwendung von Fluoroskop- und Computertomographie-Registrierung für Sinuplastie-Navigation | |
| DE102004004620A1 (de) | Verfahren zur Registrierung und Überlagerung von Bilddaten bei Serienaufnahmen in der medizinischen Bildgebung | |
| DE102014203097A1 (de) | Verfahren zum Unterstützen einer Navigation einer endoskopischen Vorrichtung | |
| DE10210646A1 (de) | Verfahren zur Bilddarstellung eines in einen Untersuchungsbereich eines Patienten eingebrachten medizinischen Instruments | |
| WO2002062250A1 (fr) | Procede et dispositif de navigation peroperatoire | |
| DE10210287A1 (de) | Verfahren und Vorrichtung zur markerlosen Registrierung für navigationsgeführte Eingriffe | |
| EP2461759A1 (fr) | Procédé de superposition d'une image intra-opératoire instantanée d'un champ opératoire avec une image préopératoire du champ opératoire | |
| DE102005059804A1 (de) | Verfahren und Vorrichtung zur Bewegungskorrektur bei der Bildgebung während einer medizinischen Intervention | |
| DE112021003530T5 (de) | System zur Unterstützung eines Benutzers bei der Platzierung einer Eindringungsvorrichtung in Gewebe | |
| DE19951502A1 (de) | System mit Mitteln zur Aufnahem von Bildern, medizinischer Arbeitsplatz aufweisend ein derartiges System und Verfahren zur Einblendung eines Abbildes eines zweiten Objektes in ein von einem ersten Objekt gewonnenes Bild | |
| DE102019200786A1 (de) | Bildgebendes medizinisches Gerät, Verfahren zum Unterstützen von medizinischem Personal, Computerprogrammprodukt und computerlesbares Speichermedium | |
| EP1464285B1 (fr) | Recalage en perspective et visualisation des régions corporelles internes | |
| DE10137914B4 (de) | Verfahren zur Ermittlung einer Koordinatentransformation für die Navigation eines Objekts |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| | 17P | Request for examination filed | Effective date: 20240201 |
| | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | DAV | Request for validation of the european patent (deleted) | |
| | DAX | Request for extension of the european patent (deleted) | |