
WO2025176636A1 - System and method for registration update based on single x-ray image - Google Patents

System and method for registration update based on single x-ray image

Info

Publication number
WO2025176636A1
Authority
WO
WIPO (PCT)
Prior art keywords
anatomical
navigation system
anatomical object
interest
ray image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2025/054283
Other languages
French (fr)
Inventor
Arno Blau
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Metamorphosis GmbH
Original Assignee
Metamorphosis GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Metamorphosis GmbH
Publication of WO2025176636A1

Classifications

    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/25: User interfaces for surgical systems
    • A61B 6/032: Transmission computed tomography [CT]
    • A61B 6/547: Control of apparatus or devices for radiation diagnosis involving tracking of position of the device or parts of the device
    • A61B 6/4441: Constructional features related to the mounting of source units and detector units, the rigid coupling structure being a C-arm or U-arm
    • A61B 6/4458: Constructional features related to the mounting of source units and detector units, the source unit or the detector unit being attached to robotic arms
    • A61B 6/461: Arrangements for interfacing with the operator or the patient; displaying means of special interest
    • A61B 17/1757: Guides or aligning means for drills, mills, pins or wires specially adapted for the spine
    • A61B 2034/2065: Tracking techniques using image or pattern recognition
    • A61B 2034/254: User interfaces for surgical systems being adapted depending on the stage of the surgical procedure
    • A61B 2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/367: Correlation of different images, creating a 3D dataset from 2D images using position information
    • A61B 2090/376: Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B 2090/3762: Surgical systems with images on a monitor during operation using X-rays, using computed tomography systems [CT]
    • A61B 2090/3764: Surgical systems with images on a monitor during operation using X-rays, using computed tomography systems [CT] with a rotating C-arm having a cone beam emitting source
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06T 19/006: Mixed reality
    • G06V 20/69: Microscopic objects, e.g. biological cells or cellular parts

Definitions

  • anatomical objects like fragments of a fractured bone or multiple vertebrae as part of the patient 50 may be provided as representations on the extended reality device 10. At least navigational information with respect to those objects can be provided on the extended reality device 10.
  • the navigational information as determined by the navigation system may be visualized, the relative position and orientation of two anatomical objects as derived from the X-ray image may be visualized, or both. When visualizing both together, the user may immediately recognize any differences between the positions and orientations in the visualizations. Further, the navigation system may provide an indication in case that the position and orientation of the anatomical object of interest is possibly not at a position and orientation within the patient at which the navigation system has calculated it to be, but at a slightly different position and orientation as detected by the X-ray image. It will be understood that the spatial information as determined on the basis of the X-ray image may be automatically utilized so as to update and/or correct the navigational information, and only the update and/or corrected information is provided to the user.
  • in step S1 of the method of assisting a surgical procedure, 3D navigational information of at least two anatomical objects is provided by a navigation system at a point in time previous to a step of the surgical procedure.
  • the at least two anatomical objects may include an anatomical object of reference and an anatomical object of interest.
  • in step S2, the position and orientation of the at least two anatomical objects may be determined by the navigation system during the step of the surgical procedure.
  • in other words, the position and orientation, i.e., movements, of the at least two anatomical objects may be tracked by the navigation system.
  • in step S3, an X-ray image is generated showing at least the anatomical object of interest after the step of the surgical procedure.
  • in steps S4a, S4b and S4c, spatial information of the anatomical objects is determined on the basis of the X-ray image.
  • either one of steps S4a, S4b and S4c, or a combination of two or three of them, can be performed in accordance with embodiments.
  • in step S5, new 3D navigational information of the at least two anatomical objects is generated by the navigation system at a point in time after the step of the surgical procedure, taking into account the spatial information of the anatomical objects and resulting in accurate 3D navigational information for the anatomical object of interest.
  • the spatial information may be determined utilizing a model of the anatomical object of interest and generating a plurality of virtual X-ray images of the model of the anatomical object of interest, with the model of the anatomical object of interest having a changed position, a changed orientation, or a changed position and a changed orientation in different X-ray images of the plurality of virtual X-ray images.
  • the X-ray image may further show the anatomical object of reference, and the spatial information is determined on the basis of the X-ray image showing the anatomical object of interest together with and in relation to the anatomical object of reference.
  • a transformation matrix between the finally matched models of the object of interest and the object of reference may be computed and applied to determine the new 3D navigational information in step S5.
  • the X-ray image further shows a non-anatomical element like a surgical tool or an already implanted implant, the element having a known position and orientation relative to the anatomical object of reference, and the spatial information is determined on the basis of the X-ray image showing the anatomical object of interest together with the non-anatomical element.
  • a transformation matrix between the finally matched models of the object of interest and the non-anatomical element may be computed and applied to determine the new 3D navigational information in step S5.
  • the navigation system tracks the position and orientation of the X-ray imaging device and the spatial information is determined based on the X-ray image taking into account the position and orientation of the X-ray imaging device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Biophysics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Multimedia (AREA)
  • Pulmonology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

A navigation system is suggested, providing 3D navigational information for at least two anatomical objects. This 3D navigational information may be provided before and after a surgical step is performed. The navigation system may, thus, be configured to track movements of objects, e.g., of anatomical objects like vertebrae. The navigation system is further configured to provide new 3D navigational information of the at least two anatomical objects at a point in time after the step of the surgical procedure. The new 3D navigational information is generated taking into account spatial information of at least the anatomical object of interest, wherein the spatial information is determined on the basis of an X-ray image showing at least the anatomical object of interest.

Description

System and method for registration update based on single X-ray image
FIELD OF INVENTION
The invention relates to the field of navigation systems supporting surgery. The systems and methods described herein can be applied to assist in performing steps in surgical procedures. In particular, the systems and methods aim at improved navigation, allowing a determination of the position and orientation of anatomical objects in an operation field. The methods can be implemented as a computer program product which can be executed on a processing unit of a system.
BACKGROUND OF THE INVENTION
For example, spinal surgery has become a commonly performed surgical procedure.
However, surgery on the spine also poses substantial risks to the patient. An incorrect drilling performed on the spine may damage the spinal cord, which can cause serious injuries to the nerves or the covering of the spinal cord, potentially leading to chronic pain, permanent paralysis, incontinence, or sexual dysfunction. In order to mitigate these risks due to incorrect drillings, computer-assisted navigation is already used with some frequency in spinal surgery. Computer assistance concerns navigating the surgeon, ensuring that drillings are performed in the correct location, pedicle screws are properly placed, and so on. This entails determining the precise relative 3D position and 3D orientation of an anatomical structure as well as of a surgical tool.
Almost all existing navigation systems require additional procedural steps and apparatuses like 3D cameras, trackers, reference bodies, and so on. In navigated spinal surgery, most current systems use optical tracking, where dynamic reference frames are attached to the spine (for patient tracking) and a reference body is attached to the surgical tool. All references must then be visible to a 3D camera at all times. Such an approach has multiple disadvantages, including but not limited to:
• Plausibility and accuracy of the registration must be continuously watched. Registration may have to be repeated in case of tracker movements.
• If a tracker movement goes unnoticed, navigation instructions will be incorrect, possibly resulting in harm to the patient.
• Accuracy decreases with increasing distance from the camera.
In summary, existing navigation systems are error-prone. The long chain of errors in the navigation workflow of registration, instrument calibration and real-time tracking means that navigation systems can only be as accurate as the weakest link in that chain. Therefore, even when working with navigation systems, surgeons need to constantly ensure the accuracy of the navigation instructions, e.g. by tactile and landmark checking, which is an uncertain validation procedure.
The main problem with conventional navigation systems is that they do not determine the actual relative 3D positions and orientations between objects like surgical tools and anatomy but rather infer those positions and orientations by tracking, with a camera, reference bodies externally affixed to tools (e.g., a drill) and anatomy. This camera, by its nature, can only see the externally fixated reference body but not the drill itself inside the bone. If either one of the reference bodies moves, or if the drill bends inside the bone, the navigation system will be oblivious to this fact and provide incorrect information, possibly resulting in harm to the patient.
For instance, in navigated spinal surgery, most current systems use optical tracking, where reference frames are attached to the spine (for patient tracking) and a reference body is attached to the instrument. Both references must then be visible to a 3D camera.
Disadvantages may be:
• A time-consuming “registration” procedure (lasting at least several minutes, but possibly up to 15 minutes) is necessary so that the system learns relative 3D positions and orientations.
• The registration may expose the patient to a significant amount of X-ray radiation. This makes application in a pediatric setting particularly problematic.
• Plausibility and accuracy of the registration must be continuously watched. Registration may have to be repeated in case of, e.g., movements between different vertebrae.
• If such a movement goes unnoticed, navigation instructions will be incorrect, possibly resulting in harm to the patient. The most likely reason for relative movements of vertebrae is the application of force while drilling or inserting the implants (e.g. pedicle screws). Typically, the spine and thus the relative positions of the vertebrae move back to the original position (elastic movement), but unfortunately this is not always the case. The only safe method to prevent the navigation system from becoming inaccurate for the above reason is to repeat the complete registration procedure, i.e., attaching the reference frame to the next vertebra of interest and running an intraoperative 3D scan. Considering the additional procedure time and X-ray exposure resulting from repeating the registration procedure for each vertebra (see above), this is typically not done in the daily OR routine, and thus it too often results in inaccurate navigational information potentially causing patient harm.
It is desirable to have a navigation system that is capable of reliably determining the actual 3D positions and orientations of objects like tools, implants and/or anatomical structures of a patient. While such a navigation system may be particularly useful for spinal surgery, it may also be used for many other surgical procedures where the spatial relation of objects must be precisely determined. It may be desirable to achieve a level of safety which may even enable autonomous robotic surgery.
SUMMARY OF THE INVENTION
At least one or the other of the mentioned problems is mitigated or solved by the subject-matter according to each of the independent claims. Further embodiments are described in the respective dependent claims.
Navigational information may be derived from attaching a reference frame to a first vertebra and running a registration procedure with an intraoperative 3D scan of all vertebrae of interest. Based on such navigational information, which typically is reliable for a first vertebra (which may be denoted as the anatomical object of reference) with an attached reference frame (which may be a non-anatomical reference object), an initial assumption of the navigational information for a neighboring vertebra (which may be denoted as the anatomy of interest) may be available.
As noted above, the initial assumption can be wrong in a case in which a force has been applied to the vertebra of reference, e.g. by inserting an instrument or implant (a surgical step of the procedure). In order to improve the accuracy of the navigational information for the neighboring, second vertebra after the surgical step might have caused a (partially) non-elastic movement between the vertebra of reference and the vertebra of interest, an X-ray projection image is acquired with an intraoperative X-ray imaging device like a C-arm, depicting at least the anatomy of interest, i.e., the second vertebra. It will be understood that the X-ray imaging device may be capable of acquiring both a 3D scan and a 2D X-ray image. A 2D X-ray image may be used to gather spatial information, meaning a relative 3D position and 3D orientation between the anatomical object of interest and the anatomical object of reference. There are different ways this may be accomplished, as further detailed below.
In general, a navigation system may provide 3D navigational information for at least two anatomical objects. This 3D navigational information may be provided before and after a surgical step is performed. The navigation system may, thus, be configured to track movements of objects, e.g., of anatomical objects like vertebrae. In the context of this disclosure, “tracking movements” is intended to mean a continuous as well as a discontinuous determination of position and orientation of an object. It will be understood by a person having skill in the art that a navigation system may “track movements” of an object even when the object does not move, i.e., even if no movement occurs. In other words, the navigation system repeatedly provides information on the current position and orientation of an object so that changes of the position and/or orientation can be detected over time, i.e., a movement of the object can be detected. Such a navigation system may be capable of tracking this movement based on visual observations, which then may be real time, or based on acquisition of 2D X-rays as described in, e.g., WO 2023/110124 A1 and in WO 2023/247328 A1.
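By way of illustration only, the following Python sketch (using NumPy; the function names and thresholds are assumptions made for this example, not part of the disclosed system) shows how two successive pose estimates of the same object could be compared in order to decide whether a movement has been detected:

```python
import numpy as np

def rotation_angle_deg(R_a: np.ndarray, R_b: np.ndarray) -> float:
    """Angle (in degrees) of the relative rotation between two 3x3 rotation matrices."""
    R_rel = R_a.T @ R_b
    # Clip guards against numerical noise slightly outside [-1, 1].
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_theta)))

def movement_detected(pose_prev, pose_curr,
                      max_translation_mm: float = 1.0,
                      max_rotation_deg: float = 1.0) -> bool:
    """Compare two (R, t) pose estimates of the same object taken at different times.

    Returns True if the object has moved beyond the (illustrative) thresholds.
    """
    R_prev, t_prev = pose_prev
    R_curr, t_curr = pose_curr
    translation_mm = float(np.linalg.norm(t_curr - t_prev))
    rotation_deg = rotation_angle_deg(R_prev, R_curr)
    return translation_mm > max_translation_mm or rotation_deg > max_rotation_deg
```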
According to an embodiment, a navigation system for assisting a surgical procedure provides 3D navigational information of at least two anatomical objects at a point in time previous to a step of the surgical procedure, wherein the at least two anatomical objects include an anatomical object of reference and an anatomical object of interest. The navigation system may be configured to track movements of the at least two anatomical objects during the step of the surgical procedure. Further, the navigation system is configured to provide new 3D navigational information of the at least two anatomical objects at a point in time after the step of the surgical procedure. According to this embodiment, the new 3D navigational information is generated taking into account spatial information of at least the anatomical object of interest, wherein the spatial information is determined on the basis of an X-ray image showing at least the anatomical object of interest. It may be noted that the spatial information may describe the position and the orientation of the anatomical object of interest relative to the anatomical object of reference. It may also describe the position and the orientation of the anatomical object of interest relative to a non-anatomical object of reference.
As it is intended that the navigation system detects any movement of at least one anatomical object due to a surgical intervention or step, the X-ray image is generated after the step of the surgical procedure.
According to an embodiment, the X-ray image may be a single X-ray image. That is, the amount of radiation to which the patient is exposed may be reduced. When utilizing only one X-ray image, it is possible to determine a position of an object at least in the imaging plane. The depth information may be estimated or may be determined with less accuracy. Pivoting or tilting of the object can be determined about all three space axes.
The spatial information may be determined utilizing a model of the object, e.g., of the anatomical object of interest. When doing so, a plurality of virtual X-ray images of the model may be generated. It will be understood that not only the model, but also further structures may be visualized in the virtual X-ray images. As the model of the anatomical object of interest may have a changed position, a changed orientation, or a changed position and a changed orientation in different X-ray images of the plurality of virtual X-ray images, it may be possible to identify a specific position and orientation of the model and, thus, of the imaged anatomy which provides the best fit to the real anatomical object. By identifying the virtual X-ray projection of the model that best matches the projection of the anatomy of interest in the real X-ray image, it is possible to draw conclusions with respect to the spatial position and orientation of the anatomical object of interest. In order to achieve this, a weighting algorithm may be applied to focus on the image area of interest (this area may be the area where the model is depicted).
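A minimal sketch of such a search over virtual X-ray images is given below. It assumes a hypothetical renderer render_virtual_xray(model, pose) that returns a virtual X-ray image together with a weight map marking the image area of interest; the weighted similarity measure and all names are illustrative choices, not a prescribed implementation:

```python
import numpy as np

def weighted_similarity(real_image: np.ndarray,
                        virtual_image: np.ndarray,
                        weights: np.ndarray) -> float:
    """Weighted normalized cross-correlation, emphasising the region where the model projects."""
    w = weights / (weights.sum() + 1e-12)
    a = real_image - (w * real_image).sum()
    b = virtual_image - (w * virtual_image).sum()
    num = (w * a * b).sum()
    den = np.sqrt((w * a * a).sum() * (w * b * b).sum()) + 1e-12
    return float(num / den)

def best_matching_pose(real_image, candidate_poses, render_virtual_xray, model):
    """Render one virtual X-ray per candidate pose of the model and keep the best match.

    render_virtual_xray(model, pose) is a hypothetical renderer (e.g. a DRR generator)
    returning (virtual_image, weights), where weights marks the image area of interest.
    """
    best_pose, best_score = None, -np.inf
    for pose in candidate_poses:
        virtual_image, weights = render_virtual_xray(model, pose)
        score = weighted_similarity(real_image, virtual_image, weights)
        if score > best_score:
            best_pose, best_score = pose, score
    return best_pose, best_score
```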
A navigation system in accordance with the disclosure may be at least one out of the group consisting of (i) positioning and/or movement sensory devices of a robot, (ii) a navigation system utilizing a non-anatomical element like a surgical tool or implant, (iii) a navigation system with optical trackers, (iv) a navigation system with infrared trackers, (v) a navigation system with EM tracking, (vi) a navigation system utilizing a 2D camera, (vii) a navigation system utilizing Lidar, (viii) a navigation system utilizing a 3D camera, (ix) a navigation system including a wearable tracking element like augmented reality glasses, (x) a navigation system utilizing intraoperative 2D X-ray images as in WO 2023/247327 A1 and in WO 2023/247328 A1, or any combination of the above.
In the following, different ways of generating the spatial information are provided, with the spatial information finally improving the accuracy of the navigational information provided by the navigation system.
A first way to compute the spatial information is for the X-ray image to depict not only the anatomical object of interest but also the anatomical object of reference. As soon as the X-ray image shows both objects, the spatial information may be determined on the basis of the position and orientation of the anatomical object of interest relative to the anatomical object of reference. As described above, a model of the anatomical object of interest as well as a model of the anatomical object of reference may be used, a plurality of virtual X-ray images may be generated thereof and compared with the real projection image.
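Assuming both objects have been matched to their respective models in the same projection image, the relative position and orientation may, for example, be obtained by chaining the two recovered poses. The 4x4 homogeneous-transform convention used below is an illustrative assumption; with it, the returned matrix directly encodes the spatial information referred to above:

```python
import numpy as np

def to_homogeneous(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a rotation matrix and a translation vector."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_pose(T_camera_from_reference: np.ndarray,
                  T_camera_from_interest: np.ndarray) -> np.ndarray:
    """Pose of the object of interest expressed in the frame of the object of reference.

    Both inputs are 4x4 poses recovered by matching the respective 3D models to the
    same X-ray projection image (the 'camera' being the X-ray source/detector geometry).
    """
    return np.linalg.inv(T_camera_from_reference) @ T_camera_from_interest
```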
Another way to compute the spatial information is for the X-ray image to depict not only the anatomical object of interest but also a non-anatomical object of reference like a spine clamp or an instrument attached to an already inserted pedicle screw, or even the head of an already inserted pedicle screw itself. In such a case, the navigation system may take into account the spatial information being determined on the basis of the X-ray image showing the anatomical object of interest together with the non-anatomical element. It is noted that the non-anatomical element may have a known position and orientation relative to the anatomical object of reference, for example derived from a previous registration providing the 3D navigational information before the surgical step.
Further, the navigation system may comprise an imaging device for generating the X-ray image, wherein the navigation system tracks the position and orientation of the X-ray imaging device, so that the spatial information may be determined based on the X-ray image taking into account the position and orientation of the X-ray imaging device, e.g. in relation to a spine clamp attached to the anatomical object of reference. By matching the anatomical object of interest to its model by applying a plurality of virtually generated X-ray images as described above, positioning information about the C-arm relative to the anatomical object of interest, and thus relative to the anatomical object of reference, may be retrieved. This may be accomplished by the navigation system, e.g. by optical trackers attached to the C-arm and knowledge about its geometrical/physical dimensions. Additionally and/or alternatively, positioning information about the C-arm may be known due to internal sensors of the C-arm based imaging device. It may also be possible to have the C-arm as part of a robotic system, always knowing its precise position provided by internal sensors.
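The following sketch illustrates, under assumed naming conventions, how a tracked C-arm pose could be chained with the image-based pose of the anatomical object of interest in order to express that object relative to the anatomical object of reference; all names follow an illustrative 'T_<target>_from_<source>' convention and are not part of the disclosure:

```python
import numpy as np

def pose_of_interest_in_reference(T_world_from_carm: np.ndarray,
                                  T_carm_from_interest: np.ndarray,
                                  T_world_from_reference: np.ndarray) -> np.ndarray:
    """Chain 4x4 homogeneous transforms.

    The navigation system tracks the C-arm in world coordinates, the X-ray image
    (matched against the 3D model) yields the object of interest relative to the
    C-arm, and the reference object (e.g. a clamped vertebra) is known in world
    coordinates from the earlier registration.
    """
    T_world_from_interest = T_world_from_carm @ T_carm_from_interest
    return np.linalg.inv(T_world_from_reference) @ T_world_from_interest
```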
A skilled person will understand that any combination of the alternative ways above may be appropriate. Having a respective 3D model of all objects depicted in the x-ray image may improve the accuracy of the determination of the spatial information.
Using the original navigational information and having a 3D model of the anatomical object of interest would allow computing a plurality of virtually generated X-ray images, based on a virtual 3D scenario, depicting at least the anatomical object of interest or the model thereof in the position and orientation expected based on the original navigational information, but also in other positions and orientations slightly different from these expected values, within a range that can be expected based on the use case. E.g., for the use case of a lumbar spine, neighboring vertebrae can be expected to change their relative position by up to 8 mm and their relative rotation by up to 7 degrees based on a partially non-elastic movement caused by applied forces. In other scenarios like the cervical spine or a fractured bone, this change of position and orientation between the anatomical object of interest and the anatomical object of reference may be larger. A comparison algorithm may be applied to choose the virtual X-ray image which fits best to the real X-ray image, thus selecting the 3D scenario and thus computing the spatial information.
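As an illustration of how such a set of virtual 3D scenarios might be generated, the sketch below samples random pose offsets within the lumbar-spine range quoted above (up to 8 mm translation and up to 7 degrees rotation); the sampling strategy, sample count and all names are assumptions made for this example:

```python
import numpy as np

def sample_candidate_offsets(n_samples: int = 500,
                             max_translation_mm: float = 8.0,
                             max_rotation_deg: float = 7.0,
                             seed: int = 0):
    """Randomly sample small pose offsets around the expected pose of the anatomical
    object of interest, within the motion range expected for the use case.
    Each offset is returned as a 4x4 homogeneous transform."""
    rng = np.random.default_rng(seed)
    offsets = []
    for _ in range(n_samples):
        t = rng.uniform(-max_translation_mm, max_translation_mm, size=3)
        # Small rotation: random axis, bounded angle, built via Rodrigues' formula.
        axis = rng.normal(size=3)
        axis /= np.linalg.norm(axis)
        angle = np.radians(rng.uniform(-max_rotation_deg, max_rotation_deg))
        K = np.array([[0.0, -axis[2], axis[1]],
                      [axis[2], 0.0, -axis[0]],
                      [-axis[1], axis[0], 0.0]])
        R = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = t
        offsets.append(T)
    return offsets
```

Each sampled offset would then be applied to the expected pose before rendering and scoring the corresponding virtual X-ray image.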
As examples, it is noted that (i) the anatomical object of reference may be a first vertebra and the anatomical object of interest may be a second vertebra, or (ii) the anatomical object of reference may be a sacrum and the anatomical object of interest may be an ilium, or (iii) the anatomical object of reference may be a first fragment of a fractured bone and the anatomical object of interest may be a second fragment of the fractured bone. Also any other pair/triple of neighboring bones that may need to be connected with an implant may be suitable for this procedure/method. It may even be applicable in a tumor resection situation.
The navigation systems described above may be utilized in accordance with a method as described in the following.
According to the disclosure, a method of assisting a surgical procedure may comprise the step of providing, by a navigation system, 3D navigational information of at least two anatomical objects at a point in time previous to a step of the surgical procedure. The method may comprise the step of tracking movements of the at least two anatomical objects by the navigation system. It is noted that a position and an orientation of each of the objects is determined so as to track any movement of the objects as soon as it occurs. After the step of the surgical procedure, an X-ray image is generated which shows at least the anatomical object of interest. Then, new or updated 3D navigational information is provided by the navigation system with respect to the at least two anatomical objects at the point in time after the step of the surgical procedure. The new 3D navigational information is generated taking into account spatial information of at least the anatomical object of interest, wherein the spatial information is determined on the basis of the X-ray image.
The spatial information may be determined utilizing a model of the anatomical object of interest and generating a plurality of virtual X-ray images of the model of the anatomical object of interest, with the model of the anatomical object of interest having a changed position, a changed orientation, or a changed position and a changed orientation in different X-ray images of the plurality of virtual X-ray images.
In a case in which the X-ray image shows not only the anatomical object of interest but also the anatomical object of reference, the spatial information may be determined on the basis of the X-ray image showing both the objects and a spatial relation between the objects.
In a case in which the X-ray image further shows a non-anatomical element having a known position and orientation relative to the anatomical object of reference, the spatial information of the anatomical objects relative to each other may be determined on the basis of the X-ray image showing the anatomical object of interest together with the non-anatomical element. The non-anatomical element may be a surgical tool, an implant or even a marker attached to either the anatomical object of reference, to a surgical tool or to an implant.
The method may further comprise the step of tracking the position and orientation of the X-ray imaging device which is used to generate the X-ray image after the surgical step has been performed. Thus, the spatial information may be determined based on the X-ray image taking into account the position and orientation of the X-ray imaging device.
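Combining the above, a high-level sketch of such a registration update after a surgical step might look as follows. The objects navigation_system and xray_device, the helper propose_candidate_poses and the methods called on them are hypothetical interfaces introduced only for illustration; best_matching_pose and render_virtual_xray refer to the earlier sketch:

```python
import numpy as np

def update_registration(navigation_system, xray_device, model_of_interest,
                        T_world_from_reference: np.ndarray) -> np.ndarray:
    """Illustrative end-to-end registration update (hypothetical interfaces)."""
    # Acquire a single X-ray image showing at least the anatomical object of interest.
    real_image = xray_device.acquire_image()

    # Determine the spatial information by matching virtual X-ray images of the model
    # against the real image (see the earlier sketches); candidate poses are generated
    # around the pose expected from the original navigational information.
    candidate_poses = propose_candidate_poses(navigation_system.expected_pose())
    T_carm_from_interest, _ = best_matching_pose(
        real_image, candidate_poses, render_virtual_xray, model_of_interest)

    # Combine with the tracked C-arm pose and update the navigational information.
    T_world_from_carm = navigation_system.tracked_carm_pose()
    T_world_from_interest = T_world_from_carm @ T_carm_from_interest
    navigation_system.update_object_pose("object_of_interest", T_world_from_interest)

    # Spatial information: pose of the object of interest relative to the reference.
    return np.linalg.inv(T_world_from_reference) @ T_world_from_interest
```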
It may be understood that the system in accordance with the disclosure may comprise a processing unit and that the method may be implemented as computer program product which can be executed on that processing unit.
It is noted that a processing unit may be realized by only one processor performing all the steps of the process, or by a group or a plurality of processors, which need not be located at the same place. For example, cloud computing allows a processor to be placed anywhere. For example, a processing unit may be divided into a first sub-processor that controls interactions with the user, including a monitor for visualizing results, and a second sub-processor (possibly located elsewhere) that performs all computations. The first sub-processor or another sub-processor may also control movements of, for example, a C-arm or a G-arm or an O-arm of an X-ray imaging device.
It will be understood that the system may further comprise a device for providing information to a user, wherein the information may include X-ray images and/or instructions regarding steps of a procedure. Such a device may be a monitor or an augmented reality device for visualization of the information, or it may be a loudspeaker for providing the information acoustically. The device may further comprise input means for manually determining or selecting a position or part of an anatomical structure in the X-ray image, such as a bone outline, for example for measuring a distance in the image. Such input means may be, for example, a computer keyboard, a computer mouse, or a touch screen, to control a pointing device like a cursor on a monitor screen, which may be included in the device. The device may also comprise a camera or a scanner to read the labeling of a packaging or otherwise identify a surgical object. A camera may also enable the user to communicate with the device visually by gestures or mimics, e.g., by virtually touching devices displayed by virtual reality. The device may also comprise a microphone and/or loudspeaker and communicate with the user acoustically.
A navigation system according to the disclosure may comprise any device on the extended reality spectrum, encompassing a superset of virtual reality, augmented reality and mixed reality devices. This represents all devices which can be worn by a user and can (1) alter the perception of the user via any sense - including vision, hearing, olfaction, gustation and tactile perception - and (2) track their environment via a variety of sensor information - including cameras, depth sensors, accelerometers, and magnetometers. Those sensors may track the environment including an imaging device and/or at least one object like an anatomy model, a surgical tool, or an implant. It is noted that the tracking information allows for a differentiation of objects and/or devices. For example, the extended reality device may be head-mounted and comprise a pair of glasses through which the user may see the environment, but which are configured to visualize information or a virtual representation of an object in the field of view of the user.
A computer program product may preferably be loaded into the random-access memory of a data processor. The data processor or processing unit of a system according to an embodiment may thus be equipped to carry out at least a part of the described process. Further, the disclosure may relate to a computer-readable medium such as a CD-ROM on which the disclosed computer program product may be stored. However, the computer program product may also be presented over a network like the World Wide Web and can be downloaded into the random-access memory of the data processor from such a network. Furthermore, the computer program product may also be executed on a cloud-based processor, with results presented over the network.
It is noted that prior information about a surgical object (e.g., the size and type of a drill bit) may be obtained by simply scanning a packaging (e.g., the barcode) or any writing on the surgical object itself, before or during surgery, so as to retrieve a model of the surgical object.
As should be clear from the above description, a main aspect is an improvement of navigational information provided by a navigation system. The methods described herein are to be understood as methods assisting in a surgical treatment of a patient. Consequently, the method may not include any step of treatment of an animal or human body by surgery. It has to be noted that embodiments are described with reference to different subject-matters. In particular, some embodiments are described with reference to method-type claims (computer program product) whereas other embodiments are described with reference to apparatus-type claims (system/device). However, a person skilled in the art will gather from the above and the following description that, unless otherwise specified, any combination of features belonging to one type of subject-matter as well as any combination between features relating to different subject-matters is considered to be disclosed with this application.
The aspects defined above and further aspects, features and advantages of the present invention can also be derived from the examples of the embodiments to be described hereinafter and are explained with reference to examples of embodiments also shown in the figures, but to which the invention is not limited.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 illustrates a situation in which a user utilizes a system as disclosed herein.
Fig. 2 is a flowchart of a method as disclosed herein.
DETAILED DESCRIPTION OF THE EMBODIMENTS
Figure 1 illustrates a situation in which the system and method as disclosed herein can be utilized. An extended reality device 10 may be a head-mounted device which comprises sensors for tracking objects in the room. The extended reality device may represent at least a part of the navigation system according to the disclosure. Objects tracked by the navigation system may, for example, be a tool like a drilling machine 20 with a drill bit 22, an operation table 30, a C-arm based X-ray imaging device 40 and/or a patient 50. It will be understood that it is not necessary that the whole of each object is tracked by the sensors of the navigation system. For example, it may be sufficient to detect the radiation source and/or the radiation detector of the imaging device to determine a 3D pose of the imaging device, i.e., a 3D position and 3D orientation of the imaging device. It may otherwise be sufficient to detect the radiation detector together with at least a part of the C-arm and the base of the imaging device to allow for a determination of the 3D pose of the device. Similarly, it is sufficient if only a part of the table 30 or of the tool 20 is detected by the tracking sensors.
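By way of illustration only, the following Python sketch indicates how a 3D pose of the imaging device might be assembled once a few parts of the device, such as the radiation source and two points on the radiation detector, have been located by the tracking sensors. The landmark names, the frame convention and the use of numpy are assumptions of this sketch and not features of the disclosure.

```python
import numpy as np

def c_arm_pose_from_landmarks(source, detector_center, detector_edge):
    """Assemble a rigid 3D pose (4x4 matrix) of an X-ray imaging device from
    three tracked landmarks, all given in the navigation frame (assumption:
    the device frame has its origin at the radiation source)."""
    source = np.asarray(source, float)
    detector_center = np.asarray(detector_center, float)
    detector_edge = np.asarray(detector_edge, float)

    # Viewing axis: from the source towards the detector centre.
    z = detector_center - source
    z /= np.linalg.norm(z)

    # In-plane axis: component of (edge - centre) orthogonal to the viewing axis.
    x = detector_edge - detector_center
    x -= np.dot(x, z) * z
    x /= np.linalg.norm(x)

    y = np.cross(z, x)

    pose = np.eye(4)
    pose[:3, 0], pose[:3, 1], pose[:3, 2] = x, y, z
    pose[:3, 3] = source
    return pose
```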
Figure 1 may be interpreted such that the dark portions of the objects and devices, i.e., the portions illustrated as black surfaces, are outside of the tracking, whereas the light portions, i.e., the portions illustrated by dotted outlines, are those portions which are detected by the sensors of the navigation system, so that the objects and devices can be tracked and 3D navigational information thereof can be determined.
Knowing the 3D pose of the imaging device 40 as well as the 3D pose of the patient 50 allows for the generation of an X-ray image supporting the generation of navigational information of objects in accordance with the disclosure.
For example, anatomical objects like fragments of a fractured bone or multiple vertebrae as part of the patient 50 may be provided as representations on the extended reality device 10. At least navigational information with respect to those objects can be provided on the extended reality device 10. The navigational information as determined by the navigation system may be visualized, the relative position and orientation of two anatomical objects as derived from the X-ray image may be visualized, or both. When both are visualized together, the user may immediately recognize any differences between the positions and orientations in the two visualizations. Further, the navigation system may provide an indication in case the anatomical object of interest is not at the position and orientation within the patient at which the navigation system has calculated it to be, but at a slightly different position and orientation as detected from the X-ray image. It will be understood that the spatial information determined on the basis of the X-ray image may also be utilized automatically so as to update and/or correct the navigational information, in which case only the updated and/or corrected information is provided to the user.
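Purely as an illustrative sketch of the indication mentioned above, the following Python snippet compares the pose of the anatomical object of interest as calculated by the navigation system with the pose derived from the X-ray image and flags a mismatch. The 4x4 homogeneous-matrix representation and the 2 degree / 2 mm thresholds are assumptions chosen for the example only.

```python
import numpy as np

def pose_discrepancy(T_nav, T_xray):
    """Rotation (degrees) and translation (same unit as the poses, e.g. mm)
    between two 4x4 homogeneous poses."""
    d = np.linalg.inv(T_nav) @ T_xray
    cos_angle = np.clip((np.trace(d[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    angle = np.degrees(np.arccos(cos_angle))
    shift = np.linalg.norm(d[:3, 3])
    return angle, shift

def check_registration(T_nav, T_xray, max_angle_deg=2.0, max_shift_mm=2.0):
    """Return a user indication when the navigation pose deviates from the
    pose derived from the X-ray image (thresholds are illustrative)."""
    angle, shift = pose_discrepancy(T_nav, T_xray)
    if angle > max_angle_deg or shift > max_shift_mm:
        return f"Registration mismatch: {angle:.1f} deg / {shift:.1f} mm - update suggested"
    return "Navigation consistent with X-ray image"
```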
In the following, a method is described with reference to figure 2 showing a flowchart. It will be understood that steps of methods described herein, and in particular of methods described in connection with workflows according to embodiments, some of which are visualized in the figures, are major steps, wherein these major steps might be differentiated or divided into several sub-steps. Furthermore, additional sub-steps might be inserted between these major steps. It will also be understood that only part of the whole method may constitute the invention, i.e., steps may be omitted or combined.
In step S1 of the method of assisting a surgical procedure according to the embodiment of figure 2, 3D navigational information of at least two anatomical objects is provided by a navigation system at a point in time previous to a step of the surgical procedure. The at least two anatomical objects may include an anatomical object of reference and an anatomical object of interest.
In step S2, the position and orientation of the at least two anatomical objects may be determined by the navigation system during the step of the surgical procedure. In other words, the positions and orientations, i.e., the movements, of the at least two anatomical objects may be tracked by the navigation system.
In step S3, an X-ray image is generated showing at least the anatomical object of interest after the step of the surgical procedure.
In at least one of steps S4a, S4b and S4c, spatial information of the anatomical objects is determined on the basis of the X-ray image. As will be understood, either one of steps S4a, S4b and S4c or a combination of two or three of them can be performed in accordance with embodiments.
In step S5, new 3D navigational information of the at least two anatomical objects is generated by the navigation system at a point in time after the step of the surgical procedure, taking into account the spatial information of the anatomical objects and resulting in accurate 3D navigational information for the anatomical object of interest.
Finally, the new 3D navigational information is provided in step S6, e.g., to the user.

According to each of steps S4a, S4b and S4c, the spatial information may be determined utilizing a model of the anatomical object of interest and generating a plurality of virtual X-ray images of the model of the anatomical object of interest, with the model of the anatomical object of interest having a changed position, a changed orientation, or a changed position and a changed orientation in different X-ray images of the plurality of virtual X-ray images.
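A minimal sketch of how such a plurality of virtual X-ray images might be used is given below, assuming a hypothetical rendering helper render_virtual_xray(model, pose) and a simple normalized cross-correlation as the similarity measure; an actual implementation may use any other renderer, similarity measure or an iterative optimization instead of this exhaustive search.

```python
import numpy as np

def normalized_cross_correlation(a, b):
    """Similarity of two 2D images of equal shape (illustrative choice)."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def estimate_pose_by_virtual_xrays(xray, model, render_virtual_xray, candidate_poses):
    """Return the candidate pose whose virtual X-ray image best matches the
    acquired X-ray image.

    xray                -- acquired 2D X-ray image (2D array)
    model               -- 3D model of the anatomical object of interest
    render_virtual_xray -- assumed callable (model, pose) -> 2D array
    candidate_poses     -- iterable of 4x4 pose hypotheses with changed
                           position and/or orientation
    """
    best_pose, best_score = None, -np.inf
    for pose in candidate_poses:
        virtual = render_virtual_xray(model, pose)
        score = normalized_cross_correlation(xray, virtual)
        if score > best_score:
            best_pose, best_score = pose, score
    return best_pose, best_score
```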
According to step S4a, the X-ray image may further show the anatomical object of reference, and the spatial information is determined on the basis of the X-ray image showing the anatomical object of interest together with and in relation to the anatomical object of reference. A transformation matrix between the finally matched models of the object of interest and the object of reference may be computed and applied to determine the new 3D navigational information in step S5.
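As a hedged illustration of this computation, the following sketch derives the relative transform between the matched models and re-anchors the object of interest to the pose of the object of reference as known to the navigation system; the 4x4 matrix conventions are assumptions of the example.

```python
import numpy as np

def relative_transform(T_reference, T_interest):
    """Pose of the object of interest expressed in the frame of the object of
    reference, both matched poses being 4x4 matrices in the same image-derived frame."""
    return np.linalg.inv(T_reference) @ T_interest

def update_navigation(T_reference_nav, T_ref_to_interest):
    """Re-anchor the object of interest to the trusted reference pose of the
    navigation system using the relative transform measured in the X-ray image."""
    return T_reference_nav @ T_ref_to_interest
```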
According to step S4b, the X-ray image further shows a non-anatomical element like a surgical tool or an already implanted implant, the element having a known position and orientation relative to the anatomical object of reference, and the spatial information is determined on the basis of the X-ray image showing the anatomical object of interest together with the non-anatomical element. A transformation matrix between the finally matched models of the object of interest and the non-anatomical element may be computed and applied to determine the new 3D navigational information in step S5.
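Analogously, a minimal sketch for step S4b, again assuming 4x4 homogeneous transforms, chains the known reference-to-element transform with the element-to-object-of-interest transform measured in the X-ray image:

```python
import numpy as np

def interest_relative_to_reference_via_element(T_reference_element, T_element_interest):
    """Chain the known pose of the non-anatomical element relative to the object of
    reference with the pose of the object of interest relative to that element as
    determined from the X-ray image."""
    return T_reference_element @ T_element_interest
```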
According to step S4c, the navigation system tracks the position and orientation of the X-ray imaging device and the spatial information is determined based on the X-ray image taking into account the position and orientation of the X-ray imaging device.
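For step S4c, a corresponding sketch (again only an illustration with assumed matrix conventions) places the object of interest in the navigation frame by chaining the tracked pose of the X-ray imaging device with the pose estimated from the image:

```python
import numpy as np

def object_pose_in_navigation_frame(T_nav_device, T_device_object):
    """Chain the tracked pose of the X-ray imaging device in the navigation frame
    with the pose of the anatomical object of interest estimated from the X-ray
    image in the device frame."""
    return T_nav_device @ T_device_object
```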
While embodiments have been illustrated and described in detail in the drawings and foregoing description, such illustrations and descriptions are to be considered illustrative or exemplary and not restrictive, and the invention is not limited to the disclosed embodiments.
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure and the appended claims. In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single processor or other unit may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims

1. A navigation system for assisting a surgical procedure, the navigation system providing 3D navigational information of at least two anatomical objects at a point in time previous to a step of the surgical procedure, wherein the at least two anatomical objects include an anatomical object of reference and an anatomical object of interest, wherein the navigation system is configured to provide new 3D navigational information of the at least two anatomical objects at a point in time after the step of the surgical procedure, wherein the new 3D navigational information is generated taking into account spatial information of at least the anatomical object of interest, the spatial information being determined on the basis of an X-ray image showing the anatomical object of interest, wherein the X-ray image is generated after the step of the surgical procedure.
2. The navigation system of claim 1, wherein the X-ray image is a single X-ray image and wherein the spatial information describes the position and the orientation of the anatomical object of interest relative to the anatomical object of reference.
3. The navigation system of claim 1 or 2, wherein the spatial information is determined utilizing a model of the anatomical object of interest and generating a plurality of virtual X-ray images of the model of the anatomical object of interest, with the model of the anatomical object of interest having a changed position, a changed orientation, or a changed position and a changed orientation in different X-ray images of the plurality of virtual X-ray images.
4. The navigation system of any one of claims 1 to 3, wherein the navigation system is at least one out of the group consisting of positioning and/or movement sensory devices of a robot, a navigation system utilizing a non-anatomical element, a navigation system with optical trackers, a navigation system with infrared trackers, a navigation system with EM tracking, a navigation system utilizing a 2D camera, a navigation system utilizing Lidar, a navigation system utilizing a 3D camera, a navigation system including a wearable tracking element like augmented reality glasses.
5. The navigation system of any one of claims 1 to 4, wherein the X-ray image further shows the anatomical object of reference, the spatial information being determined on the basis of the X-ray image showing the anatomical object of interest together with the anatomical object of reference.
6. The navigation system of any one of claims 1 to 5, wherein the X-ray image further shows a non-anatomical element having a known position and orientation relative to the anatomical object of reference, the spatial information being determined on the basis of the X-ray image showing the anatomical object of interest together with the non-anatomical element.
7. The navigation system of any one of claims 4 and 6, wherein the non-anatomical element is a surgical tool or an implant.
8. The navigation system of any one of claims 1 to 7, further comprising an imaging device for generating the X-ray image, wherein the navigation system tracks the position and orientation of the X-ray imaging device, wherein the spatial information is determined based on the X-ray image taking into account the position and orientation of the X-ray imaging device.
9. The navigation system of any one of claims 1 to 8, wherein the anatomical object of reference is a first vertebra and the anatomical object of interest is a second vertebra, or wherein the anatomical object of reference is a sacrum and the anatomical object of interest is an ilium, or wherein the anatomical object of reference is a first fragment of a fractured bone and the anatomical object of interest is a second fragment of the fractured bone.
10. A method of assisting a surgical procedure, the method comprising the steps of providing 3D navigational information by a navigation system of at least two anatomical objects at a point in time previous to a step of the surgical procedure, wherein the at least two anatomical objects include an anatomical object of reference and an anatomical object of interest, generating an X-ray image showing at least the anatomical object of interest after the step of the surgical procedure, providing new 3D navigational information by the navigation system of the at least two anatomical objects at a point in time after the step of the surgical procedure, wherein the new 3D navigational information is generated taking into account spatial information of at least the anatomical object of interest, the spatial information being determined on the basis of the X-ray image.
11. The method of claim 10, wherein the spatial information is determined utilizing a model of the anatomical object of interest and generating a plurality of virtual X-ray images of the model of the anatomical object of interest, with the model of the anatomical object of interest having a changed position, a changed orientation, or a changed position and a changed orientation in different X-ray images of the plurality of virtual X-ray images.
12. The method of any one of claims 10 to 11, wherein the X-ray image further shows the anatomical object of reference, the spatial information being determined on the basis of the X-ray image showing the anatomical object of interest together with the anatomical object of reference.
13. The method of any one of claims 10 to 12, wherein the X-ray image further shows a non-anatomical element having a known position and orientation relative to the anatomical object of reference, the spatial information being determined on the basis of the X-ray image showing the anatomical object of interest together with the non-anatomical element.
14. The method of any one of claims 10 to 13, wherein the X-ray image is generated by an X-ray imaging device, wherein the navigation system tracks the position and orientation of the X-ray imaging device, and wherein the spatial information is determined based on the X-ray image taking into account the position and orientation of the X-ray imaging device.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2024201199A AU2024201199A1 (en) 2024-02-22 2024-02-22 Accurate navigation system for assisting a surgical procedure
AU2024201199 2024-02-22

Publications (1)

Publication Number Publication Date
WO2025176636A1 true WO2025176636A1 (en) 2025-08-28

Family

ID=94820911

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2025/054283 Pending WO2025176636A1 (en) 2024-02-22 2025-02-18 System and method for registration update based on single x-ray image

Country Status (2)

Country Link
AU (1) AU2024201199A1 (en)
WO (1) WO2025176636A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018076109A1 (en) * 2016-10-24 2018-05-03 Torus Biomedical Solutions Inc. Systems and methods for producing real-time calibrated stereo long radiographic views of a patient on a surgical table
US20190133693A1 (en) * 2017-06-19 2019-05-09 Techmah Medical Llc Surgical navigation of the hip using fluoroscopy and tracking sensors
WO2020012479A1 (en) * 2018-07-12 2020-01-16 Deep Health Ltd. System method and computer program product, for computer aided surgery
WO2023110124A1 (en) 2021-12-17 2023-06-22 Metamorphosis Gmbh Precise 3d-navigation based on imaging direction determination of single intraoperative 2d x-ray images
WO2023247328A1 (en) 2022-06-21 2023-12-28 Metamorphosis Gmbh Systems and methods for effortless and reliable 3d navigation for musculoskeletal surgery based on single 2d x-ray images
WO2023247327A1 (en) 2022-06-21 2023-12-28 Metamorphosis Gmbh System and methods to achieve redundancy and diversification in computer assisted and robotic surgery in order to achieve maximum robustness and safety

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3429475B1 (en) * 2016-03-13 2021-12-15 Vuze Medical Ltd. Apparatus for use with skeletal procedures
US20220110698A1 (en) * 2018-11-22 2022-04-14 Vuze Medical Ltd. Apparatus and methods for use with image-guided skeletal procedures
EP3839890A1 (en) * 2019-12-17 2021-06-23 Metamorphosis GmbH Determining relative 3d positions and orientations between objects in 2d medical images

Also Published As

Publication number Publication date
AU2024201199A1 (en) 2025-09-11

Legal Events

121 EP: The EPO has been informed by WIPO that EP was designated in this application
Ref document number: 25708349
Country of ref document: EP
Kind code of ref document: A1