
EP4468963A1 - Mobile X-ray positioning system - Google Patents

Mobile X-ray positioning system

Info

Publication number
EP4468963A1
Authority
EP
European Patent Office
Prior art keywords
imaging device
coordinates
image
processor
interest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22922651.9A
Other languages
German (de)
English (en)
Other versions
EP4468963A4 (fr)
Inventor
Pengfei Cai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Warsaw Orthopedic Inc
Original Assignee
Warsaw Orthopedic Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Warsaw Orthopedic Inc filed Critical Warsaw Orthopedic Inc
Publication of EP4468963A1
Publication of EP4468963A4


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/44 Constructional features of apparatus for radiation diagnosis
    • A61B6/4405 Constructional features of apparatus for radiation diagnosis the apparatus being movable or portable, e.g. handheld or mounted on a trolley
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/54 Control of apparatus or devices for radiation diagnosis
    • A61B6/547 Control of apparatus or devices for radiation diagnosis involving tracking of position of the device or parts of the device
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/44 Constructional features of apparatus for radiation diagnosis
    • A61B6/4429 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
    • A61B6/4435 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure
    • A61B6/4441 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit and the detector unit being coupled by a rigid structure the rigid structure being a C-arm or U-arm

Definitions

  • the present disclosure is generally directed to surgical systems and relates more particularly to imaging devices for the surgical systems.
  • Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure or may complete one or more surgical procedures autonomously.
  • Imaging may be used by a medical provider for diagnostic, operational, and/or therapeutic purposes.
  • Providing controllable linked articulating members allows a surgical robot to reach areas of a patient anatomy during various medical procedures (e.g., using the imaging) .
  • Example aspects of the present disclosure include:
  • a robotic surgical imaging system comprising: a first imaging device; a second imaging device; a processor coupled with the first imaging device and the second imaging device; and a memory coupled with and readable by the processor and storing therein instructions that, when executed by the processor, cause the processor to: capture a first image of a target environment using the first imaging device, wherein the first image comprises an object included in the target environment; select coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object; generate a set of real-world coordinates corresponding to the portion of the object based at least in part on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates; position the second imaging device into a first location based at least in part on the set of real-world coordinates; and display a second image captured using the second imaging device, the second image comprising at least the portion of the object.
  • the instructions to select the coordinates of interest included in the first image cause the processor to: select a target line that passes through at least the portion of the object, wherein the target line comprises the coordinates of interest.
  • the instructions further cause the processor to: capture a third image of the target environment using the first imaging device after the second imaging device has been positioned into the first location; and verify the second imaging device is at the coordinates of interest based at least in part on the third image, the coordinates of interest, the set of real-world coordinates, the set of pixel coordinates, or a combination thereof.
  • the instructions further cause the processor to: position the second imaging device into a second location based at least in part on determining the second imaging device is not located at the coordinates of interest from the verifying.
  • the instructions further cause the processor to: display guiding information to assist an operator with positioning the second imaging device into the second location.
  • the instructions to generate the set of real-world coordinates corresponding to the portion of the object cause the processor to: calculate a distance to move the second imaging device into the first location to capture the second image comprising at least the portion of the object, wherein the distance is calculated based at least in part on the coordinates of interest and an initial location of the second imaging device.
  • the instructions further cause the processor to: display guiding information to assist an operator with positioning the second imaging device, wherein the guiding information comprises the calculated distance.
  • the guiding information is displayed to the operator based at least in part on the set of pixel coordinates associated with the first image.
  • the first imaging device comprises a camera and the second imaging device comprises an X-ray machine.
  • a system comprising: a processor; and a memory coupled with and readable by the processor and storing therein instructions that, when executed by the processor, cause the processor to: capture a first image of a target environment using a first imaging device, wherein the first image comprises an object included in the target environment; select coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object; generate a set of real-world coordinates corresponding to the portion of the object based at least in part on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates; position a second imaging device into a first location based at least in part on the set of real-world coordinates; and display a second image captured using the second imaging device, the second image comprising at least the portion of the object.
  • the instructions to select the coordinates of interest included in the first image cause the processor to: select a target line that passes through at least the portion of the object, wherein the target line comprises the coordinates of interest.
  • the instructions further cause the processor to: capture a third image of the target environment using the first imaging device after the second imaging device has been positioned into the first location; and verify the second imaging device is at the coordinates of interest based at least in part on the third image, the coordinates of interest, the set of real-world coordinates, the set of pixel coordinates, or a combination thereof.
  • the instructions further cause the processor to: position the second imaging device into a second location based at least in part on determining the second imaging device is not located at the coordinates of interest from the verifying.
  • the instructions further cause the processor to: display guiding information to assist an operator with positioning the second imaging device into the second location.
  • the instructions to generate the set of real-world coordinates corresponding to the portion of the object cause the processor to: calculate a distance to move the second imaging device into the first location to capture the second image comprising at least the portion of the object, wherein the distance is calculated based at least in part on the coordinates of interest and an initial location of the second imaging device.
  • the instructions further cause the processor to: display guiding information to assist an operator with positioning the second imaging device, wherein the guiding information comprises the calculated distance.
  • the guiding information is displayed to the operator based at least in part on the set of pixel coordinates associated with the first image.
  • the first imaging device comprises a camera and the second imaging device comprises an X-ray machine.
  • a method comprising: capturing a first image of a target environment using a first imaging device, wherein the first image comprises an object included in the target environment; generating a set of real-world coordinates corresponding to coordinates of interest included in the first image based at least in part on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates, wherein the coordinates of interest indicate at least a portion of the object; and displaying a second image captured using a second imaging device, the second image comprising at least the portion of the object, wherein the second imaging device is positioned into a first location based at least in part on the set of real-world coordinates.
  • the coordinates of interest comprise a target line that passes through at least the portion of the object.
  • each of the expressions “at least one of A, B and C” , “at least one of A, B, or C” , “one or more of A, B, and C” , “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • each one of A, B, and C in the above expressions refers to an element, such as X, Y, and Z, or class of elements, such as X1-Xn, Y1-Ym, and Z1-Zo
  • the phrase is intended to refer to a single element selected from X, Y, and Z, a combination of elements selected from the same class (e.g., X1 and X2) as well as a combination of elements selected from two or more classes (e.g., Y1 and Zo) .
  • Fig. 1 is a block diagram of a system according to at least one embodiment of the present disclosure
  • Fig. 2 is an imaging system diagram according to at least one embodiment of the present disclosure
  • Fig. 3 is an additional imaging system diagram according to at least one embodiment of the present disclosure.
  • Fig. 4 is a set of coordinate mapping diagrams according to at least one embodiment of the present disclosure.
  • Fig. 5 is a flowchart according to at least one embodiment of the present disclosure.
  • Fig. 6 is an additional flowchart according to at least one embodiment of the present disclosure.
  • Fig. 7 is an additional flowchart according to at least one embodiment of the present disclosure.
  • the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions) .
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer) .
  • Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or 10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • proximal and distal are used in this disclosure with their conventional medical meanings, proximal being closer to the operator or user of the system, and further from the region of surgical interest in or on the patient, and distal being closer to the region of surgical interest in or on the patient, and further from the operator or user of the system.
  • X-ray systems may be used for ensuring a position of an X-ray machine is correct (e.g., prior to and/or during the surgical procedures) .
  • the X-ray systems may include an X-ray machine and a display screen for displaying X-ray images captured from the X-ray machine, where these X-ray images are used to ensure the position of the X-ray machine is correct.
  • these X-ray systems may rely on trial-and-error methods for determining if the position of the X-ray system is correct.
  • an operator of one of these X-ray systems may place an X-ray machine at an approximate location needed to capture a specific portion of a patient (e.g., for which the surgical procedure is being performed) . Subsequently, the operator may capture an X-ray image of the patient from the X-ray machine at this approximate location and may determine if the X-ray image accurately captures the specific portion of the patient. If the operator determines the X-ray image does not accurately capture the specific portion of the patient, the operator may readjust or move the X-ray machine and repeat capturing X-ray images of the patient until the X-ray machine is accurately positioned (e.g., for capturing X-ray images of specific areas of the patient, for the surgical procedure, etc. ) .
  • these X-ray systems may have a limited field of view (FOV) . That is, the X-ray machines may only be capable of capturing narrow areas of interest (e.g., to limit any possible X-ray radiation to the patient and/or operator) .
  • the limited FOV may cause the operator to have to reposition the X-ray machine multiple times and to capture multiple X-ray images of the patient when checking whether the X-ray machine is accurately located, in order to ensure subsequent X-ray scans contain all portions of interest of the patient.
  • operators of these X-ray systems may try many times to accurately position the X-ray machine and system when ensuring a scan range is suitable for capturing X-ray images of the portions of interest of the patient for the surgical procedure.
  • As a result, a safety risk is introduced by possibly exposing the patient and/or the operator to unnecessary amounts of radiation.
  • a positioning system uses a camera (e.g., situated at the top of an operating room or in a different location of the operating room) for identifying current positions of both an X-ray machine and a patient to then calculate and provide an operator (e.g., a surgeon) with precise adjustments needed for adjusting a position of the X-ray machine in the operating room in order to obtain an accurate FOV containing the entire region of interest of the patient for imaging.
  • This positioning system may help operators to efficiently position an X-ray machine according to manual inputs from the operator indicating the regions of interest of the patient.
  • a mark on the top of the X-ray machine may be used to check the position of the X-ray machine in the operating room.
  • the operator may use an image captured from the camera to input a target location onto the image corresponding to an interested FOV or location on the patient.
  • the positioning system may then use the target location to calculate a precise move distance for moving the X-ray machine to that target location and feed the calculated distance back to the operator, and the operator can move the X-ray machine to the correct location according to the feedback.
  • This positioning system may save operators from having to locate and relocate X-ray machines many times and may reduce uncertainty about whether the X-ray scans are correct, leading to fewer X-ray images being taken and less exposure to the associated radiation.
  • Embodiments of the present disclosure provide technical solutions to one or more of the problems of (1) determining an accurate location for placing an X-ray system or machine, (2) exposing a patient and/or operator of the X-ray system to unnecessary amounts of radiation, and (3) prolonging procedure times for surgeries.
  • the positioning system described herein enables an operator of an X-ray system that employs the positioning system to place the X-ray system at the accurate location needed for capturing the correct areas of interest of the patient without having to take multiple X-rays of the patient.
  • With this expedited and accurate locating of the X-ray machine, fewer X-rays may be taken of the patient, or the amount of radiation to which the patient and/or the operator are exposed may otherwise be limited. Additionally, the amount of time needed for associated surgical procedures using the X-ray system may decrease as a result of using the described positioning system.
  • Turning first to Fig. 1, a block diagram of a system 100 is shown; the system 100 may be used to position an imaging device (e.g., an X-ray system or machine) to capture areas of interest of a patient based on images captured from an additional imaging device (e.g., a camera). Accordingly, a distance to move the imaging device may be calculated based on selecting the areas of interest of the patient on the images captured from the additional imaging device and calculating how far the imaging device needs to be moved to accurately capture those areas of interest.
  • the system 100 may be used to control, pose, and/or otherwise manipulate a surgical mount system, a surgical arm, surgical tools, and/or imaging devices attached thereto and/or carry out one or more other aspects of one or more of the methods disclosed herein.
  • the system 100 comprises a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud or other network 134.
  • Systems according to other embodiments of the present disclosure may comprise more or fewer components than the system 100.
  • the system 100 may not include the imaging device 112, the robot 114, the navigation system 118, one or more components of the computing device 102, the database 130, and/or the cloud 134.
  • the computing device 102 comprises a processor 104, a memory 106, a communication interface 108, and a user interface 110.
  • Computing devices according to other embodiments of the present disclosure may comprise more or fewer components than the computing device 102.
  • the processor 104 of the computing device 102 may be any processor described herein or any similar processor.
  • the processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud 134.
  • the memory 106 may be or comprise RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions.
  • the memory 106 may store information or data useful for completing, for example, any step of the methods 500, 600, and/or 700 described herein, or of any other methods.
  • the memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114.
  • the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enable image processing 120, segmentation 122, transformation 124, and/or registration 128.
  • Such content may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines.
  • the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the processor 104 to carry out the various methods and features described herein.
  • While various contents of the memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models.
  • the data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, and/or the cloud 134.
  • the computing device 102 may also comprise one or more user interfaces 110.
  • the user interface 110 may be or comprise a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
  • the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100.
  • the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102.
  • the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
  • the imaging device 112 may be operable to image anatomical feature (s) (e.g., a bone, veins, tissue, etc. ) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc. ) .
  • image data refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
  • the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof.
  • the image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
  • a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
  • the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
  • the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver) , an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine) , a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera) , a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae) , or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
  • the imaging device 112 may be contained entirely within a single housing or may comprise a transmitter/emitter and a receiver/detector that are in separate housings or are otherwise physically separated.
  • the imaging device 112 may comprise more than one imaging device 112.
  • a first imaging device may provide first image data and/or a first image
  • a second imaging device may provide second image data and/or a second image.
  • the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
  • the imaging device 112 may be operable to generate a stream of image data.
  • the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
  • image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
  • the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms. In some embodiments, one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112. In embodiments where the imaging device 112 comprises two or more physically separate components (e.g., a transmitter and receiver) , one robotic arm 116 may hold one such component, and another robotic arm 116 may hold another such component. Each robotic arm 116 may be positionable independently of the other robotic arm. The robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
  • the robot 114 may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
  • the robotic arm (s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm) .
  • the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras or other sensor (s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located.
  • the one or more cameras may be optical cameras, infrared cameras, or other cameras.
  • the navigation system 118 may comprise one or more electromagnetic sensors.
  • the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing) .
  • the navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
  • the system 100 can operate without the use of the navigation system 118.
  • the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system) .
  • the database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the system 100) ; one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information.
  • the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 134.
  • the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS) , a health information system (HIS) , and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
  • the cloud 134 may be or represent the Internet or any other wide area network.
  • the computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both.
  • the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
  • the system 100 or similar systems may be used, for example, to carry out one or more aspects of any of the methods 500, 600, and/or 700 described herein.
  • the system 100 or similar systems may also be used for other purposes.
  • Fig. 2 shows an imaging system diagram 200 according to at least one embodiment of the present disclosure.
  • the imaging system diagram 200 may include a first imaging device 202 and a second imaging device 204.
  • the first imaging device 202 may be a camera or camera system
  • the second imaging device 204 may be an X-ray machine or X-ray system.
  • One or both imaging devices 202, 204 may be similar or identical to the imaging device 112 depicted and described in connection with Fig. 1.
  • images captured by the first imaging device 202 may be used to determine a location for placing the second imaging device 204 in a room (e.g., an operating room or other type of room used for different medical procedures) .
  • the first imaging device 202 may be used to capture images of its surroundings.
  • the first imaging device may capture a first image of a target environment 206 to identify locations of different objects within the target environment 206, such as a patient 208 (e.g., object) , an operator of the imaging system, various equipment in the target environment (e.g., including the second imaging device 204) , etc. While shown as being located on the ceiling, the first imaging device 202 may be placed at other locations in the room as long as the first imaging device 202 is capable of capturing images of the target environment 206 and at least the patient 208.
  • coordinates of interest may be selected from the first image, where the coordinates of interest include specific areas of interest of the patient 208 needed for imaging (e.g., to perform subsequent surgical operations and/or for assisting surgical operations that are in progress) .
  • the operator of the imaging system may manually input or select the coordinates of interest from the first image.
  • the imaging system may identify and select the coordinates of interest autonomously (e.g., based on previously received instructions or data) .
  • the operator and/or the imaging system may select a target line that passes through the specific areas of interest of the patient 208.
  • a processor may calculate the position of the coordinates of interest (e.g., target line) and of the second imaging device 204 within the room. Additionally, the processor may calculate a distance and determine a direction of the coordinates of interest with respect to the second imaging device 204. For example, the processor may generate a set of real-world coordinates corresponding to the specific areas of interest of the patient 208 based on the coordinates of interest and may calculate the distance needed to move the second imaging device 204 to those real-world coordinates from an initial position at which the second imaging device 204 is located.
  • the processor may generate the set of real-world coordinates based on a mapping between a set of pixel coordinates and the real-world coordinates, where the set of pixel coordinates is associated with the first image captured from the first imaging device 202.
  • the mapping between the set of pixel coordinates and the real-world coordinates will be described in greater detail with reference to Fig. 4.
  • the imaging system may position the second imaging device 204 into a first location using the calculated distance and determined direction.
  • the imaging system may output this calculated distance (e.g., guiding information) to a user interface (e.g., the user interface 110 as described with reference to Fig. 1) for the operator of the imaging system to move the second imaging device 204 according to the output.
  • the imaging system may autonomously move the second imaging device 204 to the first location based on the coordinates of interest and the calculated distance.
  • the imaging system may verify that the first location corresponds to the coordinates of interest using one or more additional images captured by the first imaging device 202. That is, the imaging system may check whether the second imaging device 204 is accurately positioned based on the coordinates of interest and a current location of the second imaging device 204 (e.g., the first position) . If the first location does not correspond to the coordinates of interest, the processor of the imaging system may again calculate a distance (and determine a direction) for which the second imaging device 204 needs to be moved. Accordingly, the imaging system may position the second imaging device 204 according to the newly calculated distance (e.g., autonomously or by the operator based on the newly calculated distance being output to the operator) .
  • the second imaging device 204 may be used to capture and display a second image (e.g., X-ray image) of those areas of interest of the patient 208. Subsequently, any related surgical operations and/or other medical operations may occur with the second imaging device 204 now properly positioned.
  • Fig. 3 shows an imaging system diagram 300 according to at least one embodiment of the present disclosure.
  • the imaging system diagram 300 may include an imaging device 302, which may be an example of the second imaging device 204 as described with reference to Fig. 2 (e.g., an X-ray machine) .
  • the imaging device 302 may have a limited FOV, so it is important to ensure the imaging device 302 is properly positioned to accurately capture areas of interest of a patient 304 (e.g., or more generically, an “object” ) needed for imaging with the limited FOV.
  • an additional imaging device (e.g., the first imaging device 202 of Fig. 2, such as a camera) may be used to assist in placing the imaging device 302 at the correct location to capture an area or areas of interest of the patient 304.
  • the additional imaging device may capture a first image (e.g., digital image or video that is output to a user interface associated with the imaging system, such as the user interface 110 as described with reference to Fig. 1) of a target environment (e.g., operation room) , where the first image includes at least the imaging device 302 and the patient 304. Subsequently, coordinates of interest can be selected on the first image that correspond to the areas of interest of the patient 304. For example, an operator may draw a target line 306 on the first image showing the areas of interest of the patient 304 (e.g., a target location for the imaging device 302 to be placed for capturing subsequent images of the areas of interest, such as X-ray images) .
  • a computing device and/or processor associated with the imaging system may calculate or determine a position 308 of the target line 306 (e.g., a set of real-world coordinates) and a position 310 of the imaging device 302.
  • the computing device and/or processor associated with the imaging system may calculate or determine the position 308 and the position 310 based on a mapping between pixel coordinates associated with the first image and real-world coordinates of the target line 306 and the imaging device 302, respectively.
  • a distance 312 may be calculated between the position 308 of the target line 306 and the position 310 of the imaging device 302 (e.g., a distance of the target line 306 relative to a current or initial location of the imaging device 302) . Additionally, a direction for which the imaging device 302 needs to be moved to reach the target line 306 may be determined and/or calculated based on the positions 308 and 310. The distance 312 (and the determined direction) may then be used to position the imaging device 302 at a first location corresponding to the target line 306.
  • the computing device and/or processor may output the distance 312 (and the determined direction) to the operator for the operator to move the imaging device 302 into the first location according to the output. Additionally or alternatively, the computing device and/or processor may autonomously move the imaging device 302 to the first location according to the distance 312. In some examples, the distance 312 (and the direction for moving the imaging device 302) may be referred to as guiding information as described herein.
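  • A minimal sketch of this distance and direction calculation is shown below; the coordinate values and the output format are illustrative assumptions rather than details from the disclosure:

```python
import numpy as np

# Real-world positions (in metres) recovered from the overhead image, as in Fig. 3
# (numeric values are illustrative only):
target_line_pos = np.array([1.80, 0.45])    # position 308 of the target line 306
xray_machine_pos = np.array([0.60, 0.30])   # position 310 of the imaging device 302

move = target_line_pos - xray_machine_pos                 # vector from the machine to the target
distance_312 = np.linalg.norm(move)                       # the distance 312
heading_deg = np.degrees(np.arctan2(move[1], move[0]))    # direction relative to the room x-axis

# Guiding information for the operator (wording and format are assumptions):
print(f"Move the X-ray machine {distance_312:.2f} m, heading {heading_deg:.0f} deg from the room x-axis.")
```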
  • the imaging system may verify if the imaging device 302 is accurately positioned at the target line 306 after being moved to the first location using additional images (e.g., a second image, a third image, etc. ) captured by the additional imaging device (e.g., camera or camera system) . If the first location does not correspond to the target line 306, another distance may be calculated for adjusting the position of the imaging device 302, and the imaging device 302 may be moved according to this other distance (e.g., by the operator or autonomously) . These steps may be repeated until the imaging device 302 is accurately positioned with respect to the target line 306.
  • the imaging device 302 may be used to capture images (e.g., X-ray images) of the areas of interest of the patient 304 (e.g., for imaging and diagnostic procedures, for surgical procedures, etc. ) .
  • For an imaging system (e.g., a robotic surgical imaging system) that includes a first imaging device (e.g., a camera or camera system) and a second imaging device (e.g., an X-ray machine or X-ray system), images captured from the first imaging device and inputs on those captured images are used to accurately position the second imaging device so that the second imaging device can then capture additional images of areas of interest of an object in a target environment (e.g., a patient in an operating room).
  • the set of coordinate mapping diagrams 400 may be used to calculate real-world coordinates from different positions of the images captured from the first imaging device. For example, real-world coordinates of a target position for the second imaging device to be placed may be determined from the inputs on the captured images (e.g., coordinates of interest, a target line 306 as described with reference to Fig. 3, etc. ) , as well as real-world coordinates of the second imaging device (e.g., using a mark on the top of the second imaging device) . Subsequently, a distance between the real-world coordinates of the target position and the real-world coordinates of the second imaging device may be calculated to move the second imaging device according to the distance (e.g., autonomously or manually) .
  • the set of coordinate mapping diagrams 400 provided in the example of Fig. 4 may be used to map a set of pixel coordinates from the images captured by the first imaging device and corresponding to the target location and a current or initial location of the second imaging device to respective sets of real-world coordinates, and vice versa (e.g., from the sets of real-world coordinates to sets of pixel coordinates, for example, to display the distance for moving the second imaging device on a user interface) .
  • the set of coordinate mapping diagrams 400 may include a first rotation diagram 402, a second rotation diagram 404, and a third rotation diagram 406.
  • the first rotation diagram 402 may represent rotations about a first axis (e.g., x-axis) and may indicate how a second axis (e.g., y-axis) and a third axis (e.g., z-axis) are affected by the rotations about the first axis (e.g., by a rotation angle, denoted α below).
  • This rotation about the first axis may be given as Equation (1) below:
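A standard rotation matrix about the x-axis, consistent with the description of the first rotation diagram 402 (the angle symbol α is a placeholder, since the original symbol is not reproduced here), is:

$$R_x(\alpha) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & -\sin\alpha \\ 0 & \sin\alpha & \cos\alpha \end{bmatrix} \tag{1}$$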
  • the second rotation diagram 404 may represent rotations about the second axis (e.g., y-axis) and may indicate how the first axis (e.g., x-axis) and the third axis (e.g., z-axis) are affected by the rotations about the second axis (e.g., by a rotation angle, denoted β below).
  • This rotation about the second axis may be given as Equation (2) below:
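Similarly, a standard rotation matrix about the y-axis (with placeholder angle symbol β) is:

$$R_y(\beta) = \begin{bmatrix} \cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{bmatrix} \tag{2}$$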
  • the third rotation diagram 406 may represent rotations about the third axis (e.g., z-axis) and may indicate how the first axis (e.g., x-axis) and the second axis (e.g., y-axis) are affected by the rotations about the third axis (e.g., by a rotation angle, denoted γ below).
  • This rotation about the third axis may be given as Equation (3) below:
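And a standard rotation matrix about the z-axis (with placeholder angle symbol γ) is:

$$R_z(\gamma) = \begin{bmatrix} \cos\gamma & -\sin\gamma & 0 \\ \sin\gamma & \cos\gamma & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{3}$$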
  • After using one or more of Equations (1), (2), and (3) for the rotations about the respective axes, a whole movement matrix can be formed, which is given below by Equation (4):
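The standard homogeneous rigid-body form of such a movement matrix, consistent with the descriptions of ‘O’, ‘R3×3’, and ‘T3×1’ that follow (a sketch of the usual formulation rather than a verbatim reproduction of the original equation), is:

$$\begin{bmatrix} X_2 \\ Y_2 \\ Z_2 \\ 1 \end{bmatrix} = \begin{bmatrix} R_{3\times3} & T_{3\times1} \\ 0_{1\times3} & 1 \end{bmatrix} \begin{bmatrix} X_1 \\ Y_1 \\ Z_1 \\ 1 \end{bmatrix}, \qquad R_{3\times3} = R_z(\gamma)\,R_y(\beta)\,R_x(\alpha) \tag{4}$$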
  • a computing device and/or processor of the imaging system described herein can calculate an old position of the second imaging device (e.g., given by X1, Y1, and Z1) and/or the target location (e.g., target position, coordinates of interest, etc., given by X2, Y2, and Z2).
  • ‘O’ may represent a ‘realized position’ of a given set of coordinates
  • the ‘R3×3’ matrix may represent real-time coordinates (e.g., XYZ coordinates determined from the rotation diagrams and corresponding equations).
  • the ‘T3×1’ matrix may represent movement to a target location (e.g., moving from the real-time coordinates to the target location).
  • the set of coordinate mapping diagrams 400 may also include a first coordinate mapping diagram 408 and a second coordinate mapping diagram 410 that can be used to map image pixel coordinates to/from real-world coordinates (e.g., using camera and/or image coordinates).
  • From the first coordinate mapping diagram 408, the following image pixel coordinate mapping relationships can be determined, as given below in Equations (5), (6), and (7):
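A standard pinhole-camera formulation, consistent with the pixel, image, camera, and real-world coordinate frames discussed below, may stand in for these relationships; the symbols f, d_x, d_y, u_0, and v_0 (focal length, pixel pitch, and principal point) are placeholders rather than symbols taken from the disclosure:

$$u = \frac{x}{d_x} + u_0, \qquad v = \frac{y}{d_y} + v_0 \tag{5}$$

$$x = f\,\frac{X_c}{Z_c}, \qquad y = f\,\frac{Y_c}{Z_c} \tag{6}$$

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{7}$$

where (u, v) are pixel coordinates, (x, y) are image-plane coordinates, (X_c, Y_c, Z_c) are camera coordinates, and (X_w, Y_w, Z_w) are real-world coordinates.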
  • pixel positions of specific areas of images captured by the first imaging device can be mapped to real-world coordinates, where a difference (e.g., distance) between the positions can be determined accurately.
  • the pixel coordinates/positions may be output to the operator of the imaging system to indicate how far the operator needs to adjust the imaging system (e.g., how far to move the second imaging device) .
  • the transformation between real-world coordinates and the pixel coordinates may be given by Equation (9) below:
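Chaining relationships of this kind gives the usual projection between real-world and pixel coordinates; again, this is a standard formulation offered as a sketch rather than the exact expression from the disclosure:

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f/d_x & 0 & u_0 \\ 0 & f/d_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R & T \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{9}$$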
  • the computing device and/or processor of the imaging system may map between real-world coordinates, camera coordinates, image coordinates, and pixel coordinates (e.g., converting between real-world coordinates, image coordinates, and pixel coordinates).
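  • As an illustration of this mapping chain, the following minimal Python sketch back-projects two pixels of the first image (one on the selected target line, one on the mark on top of the second imaging device) onto the table plane and computes the move vector; the intrinsics, extrinsics, plane height, and pixel values are illustrative assumptions:

```python
import numpy as np

# Assumed intrinsics of the overhead camera (focal length in pixels, principal point);
# in practice these come from a camera calibration step.
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])

# Assumed extrinsics mapping real-world (room) coordinates into camera coordinates.
R = np.eye(3)                      # camera axes aligned with the room axes (illustrative)
t = np.array([0.0, 0.0, 2.5])      # illustrative translation between the two frames

def pixel_to_world(u, v, z_world=0.0):
    """Back-project a pixel of the first image onto the plane Z = z_world (e.g., the table top)."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])    # viewing ray in camera coordinates
    ray_world = R.T @ ray_cam                             # the same ray in real-world coordinates
    cam_center_world = -R.T @ t                           # camera centre in real-world coordinates
    s = (z_world - cam_center_world[2]) / ray_world[2]    # intersect the ray with the plane
    return cam_center_world + s * ray_world

# Pixel selected on the target line and pixel of the mark on the X-ray machine (illustrative values).
target_world = pixel_to_world(1012, 388)
machine_world = pixel_to_world(640, 720)

move_vector = target_world - machine_world                 # direction to move the X-ray machine
move_distance = float(np.linalg.norm(move_vector[:2]))     # in-plane distance to travel (metres)
print(f"Move {move_distance:.2f} m along direction {move_vector[:2]}")
```

  • In an actual system, the calibration values would be measured for the installed camera, and the two pixels would come from the operator's selection on the displayed first image and from detecting the mark on top of the second imaging device.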
  • the imaging system described herein may give the operator of the imaging system clearer guidance about how to move the second imaging device (e.g., X-ray system or X-ray machine).
  • Fig. 5 depicts a method 500 that may be used, for example, to identify a current position of an imaging device with respect to areas of interest of an object and to calculate a distance for moving the imaging device to capture those areas of interest more accurately.
  • the method 500 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor (s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118) .
  • a processor other than any processor described herein may also be used to execute the method 500.
  • the at least one processor may perform the method 500 by executing elements stored in a memory such as the memory 106.
  • the elements stored in the memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 500.
  • One or more portions of the method 500 may be performed by the processor executing any of the contents of the memory 106, such as the image processing 120, the segmentation 122, the transformation 124, and/or the registration 128.
  • the method 500 comprises capturing a first image of a target environment using a first imaging device, where the first image includes an object included in the target environment (step 502) .
  • the target environment may include an operating room, where the first image includes at least an image of a patient.
  • the first imaging device may include a camera or camera system.
  • the first imaging device may be situated at the top of the target environment (e.g., on the ceiling of the operating room) or may be located elsewhere in the target environment such that the first image still includes the object.
  • the method 500 also comprises selecting coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object (step 504) .
  • a target line may be selected that passes through the portion of the object, where the coordinates of interest include the target line.
  • the method 500 also comprises generating a set of real-world coordinates corresponding to the portion of the object based on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates (step 506) .
  • the set of real-world coordinates may be generated as described with reference to Fig. 4.
  • the method 500 also comprises positioning a second imaging device into a first location based on the set of real-world coordinates (step 508). That is, the second imaging device may be placed at a location corresponding to the generated real-world coordinates that should, in turn, correspond to the coordinates of interest.
  • the method 500 also comprises displaying a second image captured using the second imaging device, where the second image includes at least the portion of the object (step 510) .
  • the second imaging device may then be used to capture images (e.g., X-ray images) of the portion of the object (e.g., areas of interest of the patient) .
  • the present disclosure encompasses embodiments of the method 500 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • Fig. 6 depicts a method 600 that may be used, for example, to verify a location of the second imaging device as described herein with respect to a given set of coordinates of interest.
  • the method 600 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor (s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118) .
  • a processor other than any processor described herein may also be used to execute the method 600.
  • the at least one processor may perform the method 600 by executing elements stored in a memory such as the memory 106.
  • the elements stored in memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 600.
  • One or more portions of the method 600 may be performed by the processor executing any of the contents of the memory 106, such as the image processing 120, the segmentation 122, the transformation 124, and/or the registration 128.
  • the method 600 comprises capturing a first image of a target environment using a first imaging device, where the first image includes an object included in the target environment (step 602) .
  • the method 600 also comprises selecting coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object (step 604) .
  • the method 600 also comprises generating a set of real-world coordinates corresponding to the portion of the object based on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates (step 606) .
  • the method 600 also comprises positioning the second imaging device into a first location based on the set of real-world coordinates (step 608) .
  • the method 600 also comprises capturing a third image of the target environment using the first imaging device after the second imaging device has been positioned into the first location (step 610) .
  • the method 600 also comprises verifying the second imaging device is at the coordinates of interest based on the third image, the coordinates of interest, the set of real-world coordinates, the set of pixel coordinates, or a combination thereof (step 612) .
  • the first imaging device may be used to capture an additional image of the moved second imaging device to verify whether the second imaging device has been accurately moved to the coordinates of interest. If the second imaging device is accurately positioned (e.g., its position has been verified to be correct), the method 600 may continue to step 614.
  • if the position is not verified, the second imaging device may be repositioned into a second location (e.g., autonomously, or manually by an operator based on guiding information displayed for the operator).
  • the method 600 also comprises displaying a second image captured using the second imaging device, where the second image includes at least the portion of the object (step 614) .
  • the second imaging device may be used to capture and display the second image after a position of the second imaging device has been verified to be accurate.
  • the present disclosure encompasses embodiments of the method 600 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
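  • a minimal sketch of the verification at step 612 is shown below, assuming the second imaging device's actual position can be extracted from the third image (for example, by detecting a marker mounted on the device); the tolerance value and the function names are illustrative assumptions.

```python
import numpy as np

# Illustrative tolerance within which the second imaging device is treated as
# being "at" the coordinates of interest.
POSITION_TOLERANCE_M = 0.01

def verify_position(detected_device_world, target_world, tolerance_m=POSITION_TOLERANCE_M):
    """Compare the device position detected in the third image against the target.

    Returns the residual offset (world frame) and whether it is within tolerance,
    so the caller can either continue to imaging or reposition the device.
    """
    residual = np.asarray(target_world, dtype=float) - np.asarray(detected_device_world, dtype=float)
    return residual, bool(np.linalg.norm(residual) <= tolerance_m)

if __name__ == "__main__":
    # These positions would normally come from processing the third image and
    # from the pixel-to-world mapping; the numbers here are made up.
    residual, ok = verify_position([0.502, 0.118, 0.747], [0.500, 0.120, 0.750])
    if ok:
        print("position verified; capture and display the second image (step 614)")
    else:
        print(f"reposition the device by {residual} metres and re-verify")
```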
  • Fig. 7 depicts a method 700 that may be used, for example, to guide an operator of an imaging system described herein when the operator is manually moving the second imaging device to a target location.
  • the method 700 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor (s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118) .
  • a processor other than any processor described herein may also be used to execute the method 700.
  • the at least one processor may perform the method 700 by executing elements stored in a memory such as the memory 106.
  • the elements stored in memory and executed by the processor may cause the processor to execute one or more steps of the method 700.
  • One or more portions of the method 700 may be performed by the processor executing any of the contents of memory, such as the image processing 120, the segmentation 122, the transformation 124, and/or the registration 128.
  • the method 700 comprises capturing a first image of a target environment using a first imaging device, where the first image includes an object included in the target environment (step 702) .
  • the method 700 also comprises selecting coordinates of interest included in the first image, the coordinates of interest including at least a portion of the object (step 704) .
  • the method 700 also comprises generating a set of real-world coordinates corresponding to the portion of the object based on a mapping between a set of pixel coordinates associated with the first image and the set of real-world coordinates (step 706) .
  • the method 700 also comprises calculating a distance to move the second imaging device into a first location to capture a second image that includes at least the portion of the object, where the distance is calculated based on the coordinates of interest, an initial location of the second imaging device, the real-world coordinates, or a combination thereof (step 708) .
  • guiding information may be displayed to assist the operator with positioning the second imaging device, where the guiding information includes the calculated distance. Additionally, the guiding information may be displayed to the operator based on the set of pixel coordinates associated with the first image (e.g., the calculated distance may be converted between pixel coordinates, real-world coordinates, camera coordinates, and image coordinates as described with reference to Fig. 4). A minimal guidance sketch follows the method 700 discussion below.
  • the method 700 also comprises positioning the second imaging device into the first location based on the set of real-world coordinates (and, e.g., the calculated distance) (step 710).
  • the method 700 also comprises displaying a second image captured using the second imaging device, the second image including at least the portion of the object (step 712) .
  • the present disclosure encompasses embodiments of the method 700 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
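  • a minimal sketch of the operator-guidance calculation at step 708 follows, under the same simplified pinhole-camera assumptions as the earlier sketch: the world-frame move vector and its length are derived from the device's initial location and the target real-world coordinates, and the target is projected back into first-image pixels so it can be overlaid for the operator. All names and numbers are illustrative assumptions.

```python
import numpy as np

def guidance_vector(initial_device_world, target_world):
    """World-frame offset (and its length) the second imaging device must move."""
    move = np.asarray(target_world, dtype=float) - np.asarray(initial_device_world, dtype=float)
    return move, float(np.linalg.norm(move))

def world_to_pixel(world_xyz, K, R=np.eye(3), t=np.zeros(3)):
    """Project a world-frame point back into first-image pixels for an on-screen overlay."""
    point_cam = R.T @ (np.asarray(world_xyz, dtype=float) - t)  # world -> camera frame
    u, v, w = K @ point_cam
    return np.array([u / w, v / w])

if __name__ == "__main__":
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    target_world = np.array([0.30, 0.10, 1.20])    # from the pixel-to-world mapping
    initial_world = np.array([0.00, -0.10, 1.20])  # current device location (made up)
    move, distance = guidance_vector(initial_world, target_world)
    print(f"guide operator: move {distance:.3f} m along {move}")
    print("overlay target at pixel:", world_to_pixel(target_world, K))
```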
  • the present disclosure encompasses methods with fewer than all of the steps identified in Figs. 5, 6, and 7 (and the corresponding description of the methods 500, 600, and 700) , as well as methods that include additional steps beyond those identified in Figs. 5, 6, and 7 (and the corresponding description of the methods 500, 600, and 700) .
  • the present disclosure also encompasses methods that comprise one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or comprise a registration or any other correlation.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Gynecology & Obstetrics (AREA)
  • Robotics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

A robotic surgical imaging system (100, 200, 300) includes a first imaging device (202) and a second imaging device (204, 302). The first imaging device (202) may be used to capture a first image of a target environment, the first image including a subject in the target environment (502, 602, 702). Coordinates of interest, associated with at least a portion of the subject (504), may then be selected in the first image. Real-world coordinates corresponding to the coordinates of interest and to the portion of the subject (506, 606) may then be generated, and the second imaging device (204) may be positioned at a location based on the real-world coordinates (508, 608). After verifying that the location of the second imaging device (204) does correspond to the coordinates of interest (e.g., after making any necessary adjustments to the location), the second imaging device (204) may be used to capture a second image of the portion of the subject (510, 610, 710).
EP22922651.9A 2022-01-26 2022-01-26 Système de positionnement de rayons x mobile Pending EP4468963A4 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/073939 WO2023141800A1 (fr) 2022-01-26 2022-01-26 Système de positionnement de rayons x mobile

Publications (2)

Publication Number Publication Date
EP4468963A1 true EP4468963A1 (fr) 2024-12-04
EP4468963A4 EP4468963A4 (fr) 2025-11-12

Family

ID=87470130

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22922651.9A Pending EP4468963A4 (fr) 2022-01-26 2022-01-26 Système de positionnement de rayons x mobile

Country Status (3)

Country Link
EP (1) EP4468963A4 (fr)
CN (1) CN118660668A (fr)
WO (1) WO2023141800A1 (fr)

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140364720A1 (en) * 2013-06-10 2014-12-11 General Electric Company Systems and methods for interactive magnetic resonance imaging
EP3273861A1 (fr) * 2015-03-27 2018-01-31 3Shape A/S Procédé de réduction de dose de rayons x dans un système à rayons x
EP3461415A1 (fr) * 2017-09-27 2019-04-03 Koninklijke Philips N.V. Système et procédé de positionnement d'un système d'imagerie médicale mobile
CN209392094U (zh) * 2018-06-20 2019-09-17 深圳大学 一种增强现实的手术系统
CN111374690B (zh) * 2018-12-28 2025-04-11 通用电气公司 医学成像方法及系统
CN111658142A (zh) * 2019-03-07 2020-09-15 重庆高新技术产业开发区瑞晟医疗科技有限公司 一种基于mr的病灶全息导航方法及系统
WO2020220208A1 (fr) * 2019-04-29 2020-11-05 Shanghai United Imaging Healthcare Co., Ltd. Systèmes et procédés pour le positionnement d'objet et chirurgie guidée par image
CN112543623A (zh) * 2019-07-22 2021-03-23 京东方科技集团股份有限公司 手术机器人系统及其控制方法
JP2021074275A (ja) * 2019-11-08 2021-05-20 キヤノンメディカルシステムズ株式会社 撮像支援装置
EP4489023A3 (fr) * 2020-07-16 2025-03-26 Mazor Robotics Ltd. Procédé de vérification d'enregistrement et dispositif d'enregistrement de modèle
EP4167861A4 (fr) * 2020-07-27 2023-08-16 Shanghai United Imaging Healthcare Co., Ltd. Systèmes et procédés d'imagerie
EP4178446B1 (fr) * 2020-08-10 2024-11-20 Shanghai United Imaging Healthcare Co., Ltd. Systèmes et procédés d'imagerie
CN112348851B (zh) * 2020-11-04 2021-11-12 无锡蓝软智能医疗科技有限公司 移动目标追踪系统及混合现实手术辅助系统
CN113229836A (zh) 2021-06-18 2021-08-10 上海联影医疗科技股份有限公司 一种医学扫描方法和系统
CN113647967A (zh) * 2021-09-08 2021-11-16 上海联影医疗科技股份有限公司 一种医学扫描设备的控制方法、装置及系统

Also Published As

Publication number Publication date
CN118660668A (zh) 2024-09-17
WO2023141800A1 (fr) 2023-08-03
EP4468963A4 (fr) 2025-11-12

Similar Documents

Publication Publication Date Title
US20230389991A1 (en) Spinous process clamp registration and methods for using the same
US12419692B2 (en) Robotic arm navigation using virtual bone mount
US12295797B2 (en) Systems, methods, and devices for providing an augmented display
US20230240755A1 (en) Systems and methods for registering one or more anatomical elements
US20240382265A1 (en) Hybrid localization for minimally invasive surgery and cervical spinal referencing, and methods for using the same
US12067653B2 (en) Systems, methods, and devices for generating a corrected image
US20230255694A1 (en) Systems and methods for validating a pose of a marker
WO2023141800A1 (fr) Système de positionnement de rayons x mobile
US12249099B2 (en) Systems, methods, and devices for reconstructing a three-dimensional representation
US12004821B2 (en) Systems, methods, and devices for generating a hybrid image
US20230278209A1 (en) Systems and methods for controlling a robotic arm
US11847809B2 (en) Systems, devices, and methods for identifying and locating a region of interest
US20230240659A1 (en) Systems, methods, and devices for tracking one or more objects
WO2025120636A1 (fr) Systèmes et procédés de détermination du mouvement d'un ou plusieurs éléments anatomiques
WO2025186761A1 (fr) Systèmes et procédés de détermination d'une position d'un objet par rapport à un dispositif d'imagerie
WO2024180545A1 (fr) Systèmes et procédés d'enregistrement d'un élément anatomique cible
WO2025046505A1 (fr) Systèmes et procédés d'enregistrement de patient à l'aide de plans d'image 2d
WO2025109596A1 (fr) Systèmes et procédés d'enregistrement à l'aide d'un ou de plusieurs repères
WO2025120637A1 (fr) Systèmes et procédés de planification et de mise à jour de trajectoires pour dispositifs d'imagerie
WO2025133940A1 (fr) Systèmes et procédés de recalage de patient à l'aide de motifs lumineux

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20240826

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20251010

RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 6/12 20060101AFI20251006BHEP

Ipc: A61B 34/20 20160101ALI20251006BHEP

Ipc: A61B 6/03 20060101ALI20251006BHEP

Ipc: A61B 6/00 20240101ALI20251006BHEP

Ipc: A61B 90/00 20160101ALI20251006BHEP

Ipc: A61B 34/30 20160101ALI20251006BHEP