
WO2025037243A1 - Systems and methods for real-time visualization of anatomy in navigation procedures - Google Patents

Systems and methods for real-time visualization of anatomy in navigation procedures

Info

Publication number
WO2025037243A1
Authority
WO
WIPO (PCT)
Prior art keywords
voxels
image
surgical
processor
surgical instrument
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IB2024/057858
Other languages
English (en)
Inventor
Nikhil MAHENDRA
Victor SNYDER
Jinglin LIU
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Medtronic Navigation Inc
Original Assignee
Medtronic Navigation Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US18/787,810 (published as US20250057603A1)
Application filed by Medtronic Navigation Inc filed Critical Medtronic Navigation Inc
Publication of WO2025037243A1


Classifications

    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/102 Modelling of surgical devices, implants or prosthesis
    • A61B 2034/104 Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
    • A61B 2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2051 Electromagnetic tracking systems
    • A61B 2034/2055 Optical tracking systems
    • A61B 34/30 Surgical robots
    • G06T 7/11 Region-based segmentation

Definitions

  • the present disclosure is generally directed to surgical navigation, and relates more particularly to visualization of anatomy during navigated surgeries or surgical procedures.
  • Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure, or may complete one or more surgical procedures autonomously. Imaging may be used by a medical provider for diagnostic and/or therapeutic purposes. Patient anatomy can change over time, particularly following placement of a medical implant in the patient anatomy.
  • Example aspects of the present disclosure include:
  • a system comprises: a processor; and a memory storing data thereon that, when processed by the processor, enable the processor to: receive an image depicting an anatomical element; segment the image into a segmented image that includes a plurality of voxels; track a portion of a surgical instrument as the portion of the surgical instrument interacts with the anatomical element; identify, based on the tracking, an area from the segmented image representative of a section of the anatomical element that interacts with the portion of the surgical instrument; modify one or more voxels from the plurality of voxels that reside within the area identified from the segmented image as being representative of the section of the anatomical element that interacts with the portion of the surgical instrument; and render, to a display, the segmented image showing the modified one or more voxels.
  • the portion of the surgical instrument is capable of resecting anatomical tissue.
  • the tracking comprises determining a pose of the portion of the surgical instrument relative to the anatomical element as the portion of the surgical instrument interacts with the anatomical element.
  • a system comprises: a processor; and a memory coupled with the processor and storing data thereon that, when processed by the processor, enable the processor to: receive a segmented image depicting an anatomical element segmented into a plurality of voxels; render, to a display, the segmented image; track a surgical tool as the surgical tool interacts with the anatomical element; determine, based on the tracking, a voxel of the plurality of voxels representative of a portion of the anatomical element that interacts with the surgical tool; and update a visual depiction of the voxel shown in the segmented image on the display.
  • segmented image comprises a two-dimensional image or a three-dimensional image.
  • the artificial intelligence data model comprises a convolutional neural network that receives image data as an input and outputs the segmented image.
  • the tracking comprises determining a pose of the surgical tool relative to the anatomical element as the surgical tool interacts with the anatomical element.
  • a system comprises: a processor; and a memory coupled with the processor and storing data thereon that, when processed by the processor, enable the processor to: receive image data associated with an anatomical element; segment the image data into a plurality of voxels; render, to a display, a visual depiction of the plurality of voxels; track an operative portion of a surgical instrument as the operative portion of the surgical instrument interacts with the anatomical element; identify, based on the tracking, a voxel of the plurality of voxels associated with the operative portion of the surgical instrument; and render, based on the tracking, an updated visual depiction of the image data that includes a modified version of the voxel.
  • the modified version of the voxel is rendered in the updated visual depiction with a first visual indicator when the surgical instrument is a first type of surgical instrument, and wherein the voxel is rendered with a second visual indicator when the surgical instrument is a second type of surgical instrument.
  • each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • each one of A, B, and C in the above expressions refers to an element, such as X, Y, and Z, or a class of elements, such as X1-Xn, Y1-Ym, and Z1-Zo.
  • the phrase is intended to refer to a single element selected from X, Y, and Z, a combination of elements selected from the same class (e.g., X1 and X2), as well as a combination of elements selected from two or more classes (e.g., Y1 and Zo).
  • Fig. 1 is a block diagram of a system according to at least one embodiment of the present disclosure.
  • Fig. 2B is a diagram of the surgical tool moving relative to the anatomical element according to at least one embodiment of the present disclosure.
  • Fig. 2C is a diagram of the surgical tool moving relative to the anatomical element according to at least one embodiment of the present disclosure.
  • Fig. 2D is a depiction of segmented voxels of the anatomical element according to at least one embodiment of the present disclosure.
  • Fig. 2E is a depiction of segmented voxels of the anatomical element according to at least one embodiment of the present disclosure.
  • Fig. 3 is a flowchart according to at least one embodiment of the present disclosure.
  • the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions).
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
  • processors such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or A10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry
  • preparation for spinal decompression can involve removal of some bony anatomy, such as lamina, facet joints, bone spurs, and/or the like.
  • partial layers of bone on the surfaces of posterior sections of a vertebra are removed before adding bone graft.
  • a pilot hole in a vertebra can be created with surgical instruments (e.g., a drill, such as a Midas, or an awl), and threads can be created with instruments such as a tap.
  • Such examples involve resecting bone from vertebrae, but the depiction of the vertebrae on the navigation screen is not updated without new imaging.
  • In some embodiments, navigation is used to identify vertebral bone that has been removed during a surgery or surgical procedure, such as a spine surgery, so that the state of the vertebral anatomy (e.g., the state of a vertebra after facet joint resection) is reflected on the display.
  • Navigation and robotics systems may assume that the spine is rigid, even though the goal of spinal surgery is often to alter the spine’s shape. The accuracy of the navigation or robotics therefore decreases as the spine is manipulated. Conventional approaches include workflow modifications, rescanning, re-registration, and guesswork, and lack techniques for objectively assessing anatomical correction intraoperatively. Segmental tracking updates clinical images based on real-time knowledge of each vertebra’s unique position and orientation, maintaining accuracy and enabling intraoperative assessment of anatomical correction.
  • segmental tracking and tracked instruments such as taps and drills can be used to update the visualization of the vertebrae to show resected bony voxels of pilot holes, tapped screw holes, combinations thereof, and/or the like.
  • Using segmental tracking and tracked instruments such as drills (e.g., Midas MR8™ drills or the Midas Rex™ Mazor™ Facet Decortication Acorn Tool), voxels of vertebrae where a drill bit has been used to decorticate a facet or lamina can be visualized with a different color to indicate an updated vertebral anatomy state.
  • information about an instrument in conjunction with image processing methods such as connected components analysis can be used to find resected bony anatomy such as facet joints and update visualization.
  • Embodiments of the present disclosure provide technical solutions to one or more of the problems of (1) inaccurate visual depictions of anatomical elements during surgeries or surgical procedures and (2) additional or excessive radiation exposure due to additional intraoperative imaging.
  • a block diagram of a system 100 according to at least one embodiment of the present disclosure is shown.
  • the system 100 may be used to navigate surgical tools and/or image anatomical elements during surgical procedures; to update anatomical element visualization to account for changes in the state of anatomical element(s) during a surgery or surgical procedure; to control, pose, and/or otherwise manipulate a surgical mount system, a surgical arm, and/or surgical tools attached thereto; and/or carry out one or more other aspects of one or more of the methods disclosed herein.
  • the system 100 comprises a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud or other network 134.
  • Systems according to other embodiments of the present disclosure may comprise more or fewer components than the system 100.
  • the system 100 may not include one or more components of the computing device 102, the database 130, and/or the cloud 134.
  • the computing device 102 comprises a processor 104, a memory 106, a communication interface 108, and a user interface 110.
  • Computing devices according to other embodiments of the present disclosure may comprise more or fewer components than the computing device 102.
  • the processor 104 of the computing device 102 may be any processor described herein or any similar processor.
  • the processor 104 may be configured to execute instructions or other data stored in the memory 106, which instructions or data may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud 134.
  • the memory 106 may be or comprise RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer- readable data and/or instructions.
  • the memory 106 may store information or data useful for completing, for example, any step of the method 300 described herein, or of any other methods.
  • the memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114.
  • the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enable image processing 120 and/or segmentation 122.
  • Such content may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines.
  • the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the processor 104 to carry out the various method and features described herein.
  • various contents of memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models.
  • the data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, and/or the cloud 134.
  • the computing device 102 may also comprise a communication interface 108.
  • the communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100), and/or for transmitting instructions, images, or other information to an external system or device (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100).
  • an external system or device e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100.
  • the communication interface 108 may comprise one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth).
  • the communication interface 108 may be useful for enabling the device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
  • the computing device 102 may also comprise one or more user interfaces 110.
  • the user interface 110 may be or comprise a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
  • the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100.
  • the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
  • While the user interface 110 is shown as part of the computing device 102, in some embodiments, the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102. In some embodiments, the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
  • the imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.).
  • image data refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
  • the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof.
  • the image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
  • the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
  • the imaging device 112 may be contained entirely within a single housing, or may comprise a transmitter/emitter and a receiver/detector that are in separate housings or are otherwise physically separated.
  • the imaging device 112 may comprise more than one imaging device 112.
  • a first imaging device may provide first image data and/or a first image
  • a second imaging device may provide second image data and/or a second image.
  • the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
  • the imaging device 112 may be operable to generate a stream of image data.
  • the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
  • image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
  • the robot 114, together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
  • the robotic arm(s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm 116 (as well as any object or element held by or secured to the robotic arm 116).
  • reference markers may be placed on the robot 114 (including, e.g., on the robotic arm 116), the imaging device 112, or any other object in the surgical space.
  • the reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof.
  • the navigation system 118 can be used to track other components of the system (e.g., imaging device 112) and the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example).
  • the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras or other sensor(s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located.
  • the one or more cameras may be optical cameras, infrared cameras, or other cameras.
  • the navigation system 118 may comprise one or more electromagnetic sensors.
  • the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing).
  • the navigation system 118 may include or be connected to a navigation display 124 for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source), or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
  • the navigation display 124 may be similar to or the same as the user interface 110, and may communicate with one or more other components of the system 100 (e.g., via the communication interface 108, via the cloud 134, etc.).
  • the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
  • Such guidance may be provided on the navigation display 124.
  • the navigation display 124 is also configured to be updated with modified depictions of patient anatomy throughout a step, a portion, or the entirety of the surgery or surgical procedure, as discussed in further detail below.
  • the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system).
  • the database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the system 100); one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information.
  • the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 134.
  • the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
  • the cloud 134 may be or represent the Internet or any other wide area network.
  • the computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both.
  • the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
  • the system 100 comprises a surgical tool 136.
  • the surgical tool 136 may be configured to drill, burr, mill, cut, saw, ream, tap, etc. into anatomical tissues such as patient anatomy (e.g., soft tissues, bone, etc.).
  • the system 100 may comprise multiple surgical tools, with each surgical tool performing a different surgical task (e.g., a surgical drill for drilling, a surgical mill for milling, an osteotome for cutting bone, etc.).
  • the surgical tool 136 may provide an adapter interface to which different working ends can be attached to perform multiple different types of surgical maneuvers (e.g., the surgical tool 136 may be able to receive one or more different tool bits, such that the surgical tool 136 can drill, mill, cut, saw, ream, tap, etc. depending on the tool bit coupled with the surgical tool 136).
  • the surgical tool 136 may be operated autonomously or semi-autonomously.
  • the surgical tool 136 may be attached to a robotic arm 116, such that movement of the robotic arm 116 correspondingly causes movement in the surgical tool 136.
  • the surgical tool 136 may be gripped, held, or otherwise coupled to and controlled by the robotic arm 116.
  • the pose (e.g., position and orientation) of the surgical tool 136 may be controlled by the pose of the robotic arm 116.
  • the surgical tool 136 can be controlled by one or more components of the system 100, such as the computing device 102.
  • the computing device 102 may be capable of receiving or retrieving data or other information (e.g., from the database 130, from one or more sensors, from the imaging device 112, etc.), process the information, and control the surgical tool 136 based on the processed information.
  • the navigation system 118 may track the position of and/or navigate the surgical tool 136. Such tracking may enable the system 100 or components thereof (e.g., the computing device 102) to determine how the surgical tool 136 interacts with anatomical tissue and render updated depictions of the anatomical tissue to one or more displays (e.g., the navigation display 124) as discussed in further detail below.
  • the system 100 or similar systems may be used, for example, to carry out one or more aspects of the method 300 described herein.
  • the system 100 or similar systems may also be used for other purposes.
  • Figs. 2A-2E depict aspects of a surgical tool 136 moving relative to a vertebra 204 according to at least one embodiment of the present disclosure.
  • the movement of the surgical tool 136 relative to the vertebra 204 may occur when the surgery or surgical procedure comprises, for example, removing anatomical tissue from the vertebra 204.
  • the anatomical tissue may be removed to form an autograft that can be used by a user (e.g., a surgeon) during the course of a spinal fusion surgery.
  • anatomical tissue may be removed to insert pedicle and/or cortical screws.
  • anatomical tissue may be removed to gain access to intervertebral disc space to perform disc decompression.
  • the surgical tool 136 may interact with any other anatomical element (e.g., any other bone in the patient).
  • the vertebra 204 comprises at least one pedicle 208, a vertebral foramen 212, a spinous process 216, a transverse process 218, lamina 220, nerves 224, and a vertebral body area 228.
  • each figure depicts a superior view 202 and a lateral view 206 of the vertebra 204.
  • the superior view 202 may depict the vertebra 204 from the top of the patient (e.g., viewing the vertebra 204 while looking down on the patient’s head), while the lateral view 206 may depict the vertebra 204 from a side of the patient (e.g., from the patient’s right-hand side or from the patient’s left-hand side).
  • a tool tip 236 of the surgical tool 136 is placed on or proximate to the vertebra 204.
  • the tool tip 236 may be placed on any one or more portions of the outside surface of the vertebra 204, such as the at least one pedicle 208, the lamina 220, the spinous process 216, the transverse process 218, or the like.
  • the tool tip 236 may be placed on the lamina 220 of the vertebra 204.
  • the tool tip 236 may be placed elsewhere depending on the type of surgical tool or tool tip used, the type of surgery or surgical procedure, surgeon preference, combinations thereof, and the like.
  • When the surgical tool 136 comprises a drill used to insert a pedicle screw, the tool tip 236 of the drill may be placed proximate the pedicle 208 so that the surgical tool 136 can drill down into the vertebra 204 to form a hole, as depicted in Fig. 2C.
  • another tool tip 236 may then be inserted into a drilled hole in the vertebra 204 to thread the hole for insertion of the pedicle screw.
  • the tool tip 236 may be or comprise an operational portion of the surgical tool 136 such as a drill, saw, cutter, reamer, burr, tap, or the like that enables the surgical tool to interact with the vertebra 204.
  • the surgical tool 136 may be or comprise a drill capable of drilling through bone, and the tool tip 236 comprises the surgical tip of the drill that can decorticate or resect anatomical tissue from the lamina 220 or facet of the vertebra 204.
  • the surgical tool 136 may be or comprise an osteotome capable of cutting bone, and the tool tip 236 can decorticate or resect anatomical tissue from the lamina 220 or facet of the vertebra 204.
  • Information about the surgical tool 136 and/or the tool tip 236, as well as information about the surgical procedure (e.g., a spinal procedure) and/or the surgical workflow, may be stored in the database 130 and may be accessed during the course of the surgery or surgical procedure.
  • the information may comprise information about the type, dimensions, and/or operating parameters of surgical tool 136 and/or the tool tip 236; information about whether or not the surgical tool 136 and/or the tool tip 236 is designed to decorticate or resect anatomical tissue; information about the surgical procedure and the surgical workflow; combinations thereof; and the like.
  • Such information may be used, for example, by the navigation system 118 when tracking the surgical tool 136 to determine the locations on the vertebra 204 that interact with the surgical tool 136 and/or the tool tip 236 (e.g., to determine if the locations of the vertebra 204 have been resected).
  • the navigation system 118 may use information about the surgical tool tip in conjunction with the current step in the surgical workflow to identify decorticated or resected anatomy.
  • the information about the surgical tool 136 and/or the tool tip 236, information related to the navigation tracking of the surgical tool 136 and/or the tool tip 236 by the navigation system 118, one or more images of the vertebra 204, and/or information about the current step in a surgical workflow may be rendered to the navigation display 124.
  • the surgical tool 136 and the tool tip 236 may move across the vertebra 204 for the purposes of carrying out a surgical task performed during a surgery or surgical procedure.
  • the surgical tool 136 may comprise a drill, and the movement of the surgical tool 136 and the tool tip 236 across the vertebra 204 may occur when the surgical tool 136 is being used to resect anatomical tissues (e.g., bone) from the vertebra 204, such as when a surgeon is gathering autograft to be used in a spinal fusion procedure.
  • the surgical tool 136 and the tool tip 236 may drill into a pedicle 208 of the vertebra 204 to create a hole into which a pedicle screw can be placed. While Fig. 2B depicts the tool tip 236 moving across the lamina 220 of the vertebra 204 in the direction of the arrow 240, and while Fig. 2C depicts the tool tip 236 moving into the pedicle 208 of the vertebra 204, it is to be understood that more generally the surgical tool 136 and/or the tool tip 236 may move across, move into, and/or interact with one or more other portions of the vertebra 204.
  • the surgical tool 136 and/or the tool tip 236 may interact with one or more facet joints of the vertebra 204, one or more spinous processes of the vertebra 204, one or more laminae of the vertebra 204, combinations thereof, and/or the like. Additionally or alternatively, the surgical tool 136 and/or the tool tip 236 may move across or interact with other vertebrae, anatomical elements proximate the vertebra 204, or any other portion of patient anatomy.
  • The navigation system 118 may track the position of the surgical tool 136 and/or the tool tip 236 as the surgical tool 136 and the tool tip 236 interact with the vertebra 204.
  • the navigation system 118 may use localizers (e.g., components that localize the location of the patient, the vertebra 204, the imaging device 112, etc. in a known coordinate space) and the imaging device 112 to track the position of the surgical tool 136 and/or the tool tip 236.
  • the surgical tool 136 may comprise navigation markers that can be tracked by the navigation system 118.
  • the tracking of the surgical tool 136 and/or the tool tip 236 may be rendered to the navigation display 124 for the user to view.
  • the navigation system 118 may render a visualization of the surgical tool 136 moving across a rendered visualization of the vertebra 204, such that the user can view in real-time or near real-time a depiction of the interaction between the vertebra 204 and the surgical tool 136.
  • Each voxel of the plurality of voxels 244A-244N may be a section of an image depicting the vertebra 204 that is representative of a section (e.g., a 2D area or a 3D volume) of the vertebra 204 at that point in space.
  • the image of the vertebra 204 may be segmented into the plurality of voxels 244A-244N, with each voxel of the plurality of voxels 244A-244N representing a portion of the vertebra 204 in 3D space (or, in some cases, in 2D space).
  • the plurality of voxels 244A-244N may cover the entirety of the image, while in other embodiments one or more portions of the image of the vertebra 204 may be segmented into voxels.
  • Each of the voxels includes an attenuation value.
  • the attenuation value may reflect a propensity of the area (or volume) represented by the voxel to be penetrated by energy (e.g., radiation from an X-ray).
  • the attenuation value may be based on Hounsfield units (HU).
  • Hounsfield units are dimensionless units universally used in CT scanning to express CT numbers in a standardized and convenient form.
  • Hounsfield units are obtained from a linear transformation of measured attenuation coefficients. The transformations are based on the arbitrarily-assigned densities of air and pure water.
  • the radiodensity of distilled water at a standard temperature and pressure (STP) of zero degrees Celsius and 10⁵ pascals is 0 HU; the radiodensity of air at STP is -1000 HU.
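  • For reference, the linear transformation behind this scale maps a measured linear attenuation coefficient μ to Hounsfield units as HU = 1000 × (μ − μ_water) / (μ_water − μ_air), so that water falls at 0 HU and air at approximately -1000 HU.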
  • While attenuation values of the voxels are discussed qualitatively (e.g., low attenuation, medium attenuation, high attenuation, etc.) and/or quantitatively (e.g., based on values in HU) herein, it is to be understood that the values of the voxels discussed herein are in no way limiting.
  • Images of the vertebra 204 may be captured and segmented into the plurality of voxels 244A-244N.
  • the segmenting may be performed manually, with the user providing input (e.g., via the user interface 110) to create the plurality of voxels 244A-244N. Additionally or alternatively, the segmenting may be performed by the processor 104 using, for example, segmentation 122.
  • the segmentation 122 may comprise one or more Artificial Intelligence (AI) and/or Machine Learning (ML) models or algorithms, such as a convolutional neural network (CNN), a deep neural network (DNN), an autoencoder algorithm, a recurrent neural network (RNN) algorithm, a transformer neural network algorithm, a generative adversarial network (GAN) algorithm, a linear regression algorithm, a support vector machine (SVM) algorithm, a random forest algorithm, a hidden Markov model, and/or any combination thereof, trained on data sets to segment an image of the vertebra 204 into the plurality of voxels 244A-244N.
  • the at least one processor may be configured to utilize a combination of a CNN algorithm in conjunction with an SVM algorithm.
  • the segmentation 122 data model(s) may be trained on historical data sets of similar anatomical elements and/or similar surgeries or surgical procedures to identify one or more regions of interest and superimpose the plurality of voxels 244A-244N on the image of the vertebra 204.
  • the segmentation 122 may be semiautomatic, with the user capable of modifying the results of the segmentation 122 manually.
  • the segmentation 122 may segment the image of the vertebra 204 and output a segmented image, and the user may be able to adjust the voxel dimensions in the segmented image, the position of one or more voxels of the plurality of voxels 244A-244N, combinations thereof, and the like manually via input into the user interface 110.
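  • As a purely illustrative sketch of what a voxel-wise segmentation model of the kind listed above might look like (the network architecture, class labels, and array shapes below are assumptions chosen for brevity, not details taken from this disclosure; PyTorch is used only as an example framework):

```python
import torch
import torch.nn as nn

class TinyVoxelSegmenter(nn.Module):
    """Minimal 3D CNN that assigns a class score to every voxel of an input volume."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(8, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(8, num_classes, kernel_size=1),   # per-voxel class scores
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)  # shape: (batch, num_classes, depth, height, width)

model = TinyVoxelSegmenter(num_classes=2).eval()
volume = torch.randn(1, 1, 32, 32, 32)        # stand-in for image data of a vertebra
with torch.no_grad():
    scores = model(volume)
labels = scores.argmax(dim=1)                  # 0 = background, 1 = bone (illustrative labels)
```

  The resulting labels could then be adjusted manually by the user, consistent with the semiautomatic workflow described above.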
  • the segmentation 122 comprises labeling each voxel of the plurality of voxels 244A-244N as either having a first volume type or a second volume type.
  • voxels representing portions of the vertebra 204 such as a facet (e.g., the lamina 220, the spinous process 216, the transverse process 218, etc.) may be labeled as having the first volume type.
  • Voxels representing portions of adipose tissue (e.g., tissue along the approach trajectory of the tool tip 236 to the vertebra 204), in contrast, may be labeled as having the second volume type.
  • the voxels with the first volume type may represent volumes of anatomical tissue that comprise bone, while the voxels with the second volume type may represent volumes of anatomical tissue that comprise non- bony tissue (e.g., fat).
  • the voxels may be labeled based on the attenuation values of the voxels.
  • bone has a greater attenuation value than fat due to the higher density of bone, so voxels that represent areas with high attenuation values (e.g., values above a predetermined threshold value stored in the database 130) may be labeled as having the first volume type, while voxels that represent areas with low attenuation values (e.g., values below the predetermined threshold value) may be labeled as having the second volume type.
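  • A minimal sketch of this attenuation-based labeling, assuming the segmented image is available as a NumPy array of HU values (the threshold, array names, and label codes here are illustrative assumptions, not values from this disclosure):

```python
import numpy as np

# Stand-in HU volume covering a vertebra and nearby soft tissue (depth, height, width).
hu_volume = np.random.randint(-200, 1500, size=(64, 64, 64)).astype(np.int16)

BONE_HU_THRESHOLD = 250   # example predetermined threshold; a real system might store this in a database
FIRST_VOLUME_TYPE = 1     # bony tissue (high attenuation)
SECOND_VOLUME_TYPE = 2    # non-bony tissue such as fat (low attenuation)

# Label every voxel by comparing its attenuation value to the threshold.
volume_type = np.where(hu_volume >= BONE_HU_THRESHOLD,
                       FIRST_VOLUME_TYPE,
                       SECOND_VOLUME_TYPE).astype(np.uint8)
```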
  • the computing device 102 or components thereof may determine which voxels of the plurality of voxels 244A-244N were occupied by the tool tip 236 when the tool tip 236 moved across the vertebra 204.
  • the computing device 102 would determine that a first voxel 244A, a second voxel 244B, a third voxel 244C, a sixth voxel 244F, a seventh voxel 244G, and an eighth voxel 244H were all occupied by the tool tip 236. Additionally or alternatively, the computing device 102 may identify voxels of the plurality of voxels 244A-244N that were not occupied by or did not otherwise interact with the tool tip 236.
  • the computing device 102 may determine that a fourth voxel 244D, a fifth voxel 244E, a ninth voxel 244I, a tenth voxel 244J, and an eleventh voxel 244K did not interact with the tool tip 236.
  • the computing device 102 would determine that a twelfth voxel 244L was not occupied by the tool tip 236, but that a thirteenth voxel 244M and a fourteenth voxel 244N were occupied by the tool tip 236.
  • information from the computing device 102 about which voxels correspond to regions of the vertebra 204 that have interacted with the tool tip 236 and/or information about which voxels comprise the first volume type and/or the second volume type may be rendered to the navigation display 124 for the user (e.g., the surgeon) to see.
  • the computing device 102 or components thereof may only count a voxel as having interacted with the tool tip 236 when the voxel represents an area of the vertebra 204 that corresponds to bone.
  • Other areas of the segmented image that represent non-bony regions (e.g., voxels representing adipose tissue proximate the vertebra 204) may be excluded when determining which voxels interacted with the tool tip 236.
  • the computing device 102 may determine whether or not the region of the segmented image represents a bony or non-bony region based on HU values, and/or based on manual or automatic segmentation methods.
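  • One way to implement this bookkeeping is to convert each tracked tool-tip position into a voxel index and flag only bony voxels; the sketch below assumes an axis-aligned voxel grid and uses illustrative names and values that are not taken from this disclosure:

```python
import numpy as np

voxel_size_mm = np.array([1.0, 1.0, 1.0])             # edge lengths of each voxel
volume_origin_mm = np.array([0.0, 0.0, 0.0])           # position of voxel (0, 0, 0) in image space
volume_type = np.ones((64, 64, 64), dtype=np.uint8)    # 1 = bone, 2 = non-bony tissue (from segmentation)
interacted = np.zeros(volume_type.shape, dtype=bool)   # voxels touched by the tool tip

def mark_tip_sample(tip_position_mm: np.ndarray) -> None:
    """Flag the voxel containing one tracked tool-tip sample, counting bony voxels only."""
    index = np.floor((tip_position_mm - volume_origin_mm) / voxel_size_mm).astype(int)
    if np.all(index >= 0) and np.all(index < volume_type.shape):
        if volume_type[tuple(index)] == 1:              # exclude non-bony (e.g., adipose) voxels
            interacted[tuple(index)] = True

# Example: three tracked samples of the tool tip as it moves across the anatomy.
for sample in np.array([[10.2, 10.9, 11.3], [11.4, 11.0, 11.5], [12.6, 11.1, 11.8]]):
    mark_tip_sample(sample)
```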
  • the computing device 102 may update the visual display of the navigation display 124 based on which voxels interacted with the tool tip 236, with the update including a change or modification to the visual indicator of one or more voxels.
  • the displayed information on the navigation display 124 may depend on the type of surgical tool 136 that was used and tracked by the navigation system 118, surgical plan information, user input, combinations thereof, and/or the like.
  • When the surgical tool 136 comprises a pointer probe, such as a navigated probe that is moved by the physician or other user when probing the vertebra 204, the computing device 102 may determine that no tissue has been removed, and may not consider the voxels through which the pointer probe has moved as being changed or modified.
  • the computing device 102 may count the voxel through which the tool tip 236 passes as being changed or modified. In some embodiments, the computing device 102 may count those voxels with the first volume type through which the tool tip 236 passes as being changed or modified, while not counting voxels with the second volume type. In other words, the computing device 102 may not count voxels that have little or no bone content as being modified by the surgical tool 136.
  • the surgical tool 136 may comprise a drill that drills through the pedicle 208 of the vertebra 204 to create a pilot hole for a pedicle screw.
  • the computing device 102 may use information associated with the drill (e.g., the trajectory of the drill with respect to the vertebra 204, the radius of the planned pilot hole, etc.) and the tracking of the drill by the navigation system 118 to determine which portions of the vertebra 204 have interacted with the surgical tool. As depicted in the figures, the computing device 102 may determine that the drill interacts with areas of the vertebra 204 represented by the thirteenth voxel 244M and the fourteenth voxel 244N, and may cause the visual depictions of the thirteenth voxel 244M and the fourteenth voxel 244N to be updated on the navigation display 124.
  • the computing device 102 may also determine that the twelfth voxel 244L has not interacted with the tool tip 236, and may not change the visual depiction of the twelfth voxel 244L on the navigation display 124.
  • the surgical tool 136 may comprise an osteotome, in which case the tool tip 236 may be or comprise a blade.
  • the computing device 102 may use information associated with the blade (e.g., the trajectory of the blade with respect to the vertebra 204, the width of the blade, etc.) when the blade is docked on the vertebra 204 to define a cutting plane.
  • the computing device 102 may then define the smaller of the connected components separated by the plane to be changed or modified. In other words, the computing device 102 may expect that the volume of bone removed is smaller than the volume of the vertebra 204, and may identify the smaller voxel volume as being changed or modified.
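  • A sketch of how a cutting plane and connected-component analysis could be combined for this purpose (the plane parameters, bone mask, and use of SciPy here are illustrative assumptions rather than details of the disclosed system):

```python
import numpy as np
from scipy import ndimage

# Stand-in bone mask for a vertebra (True = bone voxel).
bone = np.zeros((40, 40, 40), dtype=bool)
bone[5:35, 5:35, 5:35] = True

# Cutting plane derived from the tracked, docked blade: a point on the plane and a unit normal.
plane_point = np.array([20.0, 20.0, 12.0])
plane_normal = np.array([0.0, 0.0, 1.0])

# Signed distance of every voxel center from the cutting plane.
zz, yy, xx = np.meshgrid(*(np.arange(s) for s in bone.shape), indexing="ij")
coords = np.stack([zz, yy, xx], axis=-1).astype(float)
signed = (coords - plane_point) @ plane_normal

# Split the bone on either side of the plane and treat the smaller side as the removed volume.
side_a = bone & (signed < 0)
side_b = bone & (signed >= 0)
removed = side_a if side_a.sum() < side_b.sum() else side_b

# Connected-component labeling can restrict "removed" to a single contiguous fragment if needed.
labels, num_components = ndimage.label(removed)
```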
  • the computing device 102 may update the visual display of the navigation display 124 based on information about the type of surgery or surgical procedure, information about the current step of the surgery or surgical procedure, combinations thereof, and/or the like.
  • the computing device 102 may receive information from the database 130 such as the current step of the surgical workflow (which may include information about the current type of surgical tool in use), and use such information to determine whether or not interaction between the patient anatomy and the surgical tool warrants an update to the visual depiction of the patient anatomy on the navigation display 124.
  • the surgical procedure may include a step where a navigated probe is used on the vertebra 204.
  • the computing device 102 may access the surgical workflow in the database 130, determine that the current step is a navigated probe step, and determine that the portions of the vertebra 204 that interact with the surgical tool 136 during this step have not been resected or decorticated. As a result, the visual depiction of the vertebra 204 on the navigation display 124 may remain unchanged during the navigated probe step.
  • the workflow may further include another step where the surgical tool 136 drills through the pedicle 208 of the vertebra 204 to create a pilot hole for a pedicle screw.
  • the computing device 102 may access the surgical workflow in the database 130, determine that the current step is a drilling step, and determine that any portions of the vertebra that interact with the tool tip 236 of the surgical tool 136 during this step should be considered resected. Then, based on the tracking of the surgical tool 136 during the drilling step, the computing device 102 may update the depiction of the voxels on the navigation display 124 that represent the portions of the vertebra 204 that interacted with the surgical tool 136 during the drilling step to indicate the portions of the vertebra 204 have been resected.
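  • This workflow-based gating can be as simple as a lookup from the current step to a flag indicating whether tool contact should be rendered as resection; the step names and table structure below are illustrative assumptions only:

```python
# Hypothetical workflow table; a real system might read this from a surgical plan database.
WORKFLOW_STEPS = {
    "navigated_probe": {"tool": "pointer probe", "marks_resection": False},
    "pilot_hole_drilling": {"tool": "drill", "marks_resection": True},
    "tapping": {"tool": "tap", "marks_resection": True},
}

def contact_counts_as_resection(current_step: str) -> bool:
    """Return True when voxels touched during this step should be shown as resected."""
    return WORKFLOW_STEPS.get(current_step, {}).get("marks_resection", False)

assert contact_counts_as_resection("navigated_probe") is False
assert contact_counts_as_resection("pilot_hole_drilling") is True
```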
  • the computing device 102 may update the depiction of voxels on the navigation display 124 that have been identified as corresponding to portions or sections of the vertebra 204 that interacted with the tool tip 236.
  • the update may be or comprise a change in the rendered color of one or more of the voxels, a rendered border of one or more of the voxels, an addition of one or more visual labels indicating the state of the portion (e.g., “bone resected,” “bone decorticated,” etc.), combinations thereof, and/or the like.
  • When the surgical tool 136 comprises a drill that is used to drill a pilot hole for a pedicle screw, the voxels associated with the area of the vertebra 204 that is drilled into by the surgical tool 136 may be modified with an outline to indicate that the area of the vertebra 204 has been resected.
  • When the surgical tool 136 comprises a drill that is used to decorticate a facet or a lamina of the vertebra 204, the voxels associated with the facet or lamina may be rendered in a different color to indicate that the facet or lamina has been decorticated.
  • the update of the voxel depiction on the navigation display 124 may include updated or modified versions of the voxels, which may indicate to the user that such sections of the vertebra 204 have been altered by interactions with the tool tip 236 of the surgical tool 136.
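  • One simple way to realize such an update is to tint the modified voxels when compositing the displayed slice; the colors, alpha value, and array shapes below are illustrative assumptions, not details of the disclosed display pipeline:

```python
import numpy as np

# Grayscale slice of the segmented image and a boolean mask of modified (resected) voxels.
slice_gray = np.random.randint(0, 256, size=(128, 128), dtype=np.uint8)
modified_mask = np.zeros((128, 128), dtype=bool)
modified_mask[40:60, 50:70] = True

# Convert to RGB and blend a tint (here red) over the modified voxels before display.
rgb = np.stack([slice_gray] * 3, axis=-1).astype(np.float32)
tint = np.array([255.0, 0.0, 0.0])
alpha = 0.5
rgb[modified_mask] = (1.0 - alpha) * rgb[modified_mask] + alpha * tint
frame = rgb.astype(np.uint8)   # frame that would be pushed to the navigation display
```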
  • the computing device 102 may update the visual depiction of the voxels based on input information, which may be based on inputs from the user (e.g., via the user interface 110).
  • the surgery or surgical procedure may include the surgeon performing an operation with the surgical tool 136.
  • the user may perform the operation (e.g., drilling into the pedicle 208 for the later insertion of a pedicle screw), and may manually update the segmented image (e.g., by manipulating the voxels rendered to the navigation display 124, by inputting a command via the user interface 110 that the drilling step has been completed, etc.).
  • the computing device 102 may modify or otherwise update the depiction of the voxels accordingly.
  • Fig. 3 depicts a method 300 that may be used, for example, to update a display based on changes to the state of an anatomical element during a surgery or surgical procedure.
  • the method 300 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
  • a processor other than any processor described herein may also be used to execute the method 300.
  • the at least one processor may perform the method 300 by executing elements stored in a memory such as the memory 106.
  • the elements stored in memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 300.
  • One or more portions of the method 300 may be performed by the processor executing any of the contents of memory, such as the image processing 120 and/or the segmentation 122.
  • the method 300 comprises receiving an image depicting an anatomical element (step 304).
  • the image may be captured by the imaging device 112, and may depict the anatomical element that may be similar to or the same as the vertebra 204.
  • the image may depict additional anatomical elements, such as vertebrae adjacent to the vertebra 204.
  • the image may be captured during the course of a spinal fusion surgical procedure.
  • the method 300 also comprises segmenting the image depicting the anatomical element into a plurality of voxels (step 308).
  • the plurality of voxels may be similar to or the same as the plurality of voxels 244A-244N.
  • the segmenting may be performed by the processor 104 using, for example, segmentation 122.
  • the segmentation 122 may comprise one or more data models (e.g., CNNs, DNNs, etc.) trained on data sets to receive an image or image data associated with the vertebra 204, segment the image of the vertebra 204 into the plurality of voxels 244A-244N, and output the segmented image to a display (e.g., to the navigation display 124).
  • the segmentation 122 data model(s) may be trained on historical data sets of similar anatomical elements and/or similar surgeries or surgical procedures to identify one or more regions of interest and superimpose the plurality of voxels 244A-244N on the image of the vertebra 204.
  • the segmentation 122 may be semiautomatic, with the user capable of modifying the results of the segmentation 122 manually.
  • the segmentation 122 may segment the image of the vertebra 204, and the user may be able to adjust the segments, the position of one or more voxels of the plurality of voxels 244A-244N, combinations thereof, and the like manually via input into the user interface 110.
  • the method 300 also comprises rendering, to a display, the segmented image (step 312).
  • the segmented image may be rendered to a display such as the navigation display 124 for the user (e.g., the surgeon) to see.
  • the user may be able to provide inputs into the display to alter, manipulate, or otherwise interact with the segmented image.
  • the method 300 also comprises tracking a portion of a surgical tool as the portion of the surgical tool interacts with the anatomical element (step 316).
  • the surgical tool may be similar to or the same as the surgical tool 136.
  • the tracking may be performed by the navigation system 118 tracking the surgical tool 136 (and/or the tool tip 236 of the surgical tool 136) using one or more navigation markers attached to the surgical tool 136.
  • the navigation system 118 may receive image data from the imaging device 112 that images the navigation markers on the surgical tool 136, and the navigation system 118 may use the processor 104 to determine the pose of the surgical tool 136 as well as changes thereto. Then, based on the movement of the surgical tool 136 relative to one or more localizers, the navigation system 118 may determine the pose and the change in pose of the surgical tool 136 relative to the vertebra 204.
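  • As an illustrative, non-limiting sketch of the pose arithmetic described above, the code below composes rigid transforms to express a tracked tool tip in the anatomy's coordinate frame. The frame names (tracker-to-tool, tracker-to-vertebra), the helper functions, and the example values are assumptions for the sketch and are not drawn from an actual implementation of the navigation system 118.

```python
# Minimal sketch (illustrative assumption): map a tool-tip offset into the
# vertebra's coordinate frame by composing 4x4 rigid transforms.
import numpy as np

def invert_rigid(T: np.ndarray) -> np.ndarray:
    """Invert a 4x4 rigid (rotation + translation) transform."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def tip_in_vertebra_frame(T_tracker_tool: np.ndarray,
                          T_tracker_vertebra: np.ndarray,
                          tip_in_tool: np.ndarray) -> np.ndarray:
    """Express the tool-tip position (given in the tool frame) in vertebra coordinates."""
    T_vertebra_tool = invert_rigid(T_tracker_vertebra) @ T_tracker_tool
    tip_h = np.append(tip_in_tool, 1.0)  # homogeneous point
    return (T_vertebra_tool @ tip_h)[:3]

# Example poses reported by a tracker (identity rotations, translations in mm).
T_tracker_tool = np.eye(4);     T_tracker_tool[:3, 3] = [100.0, 20.0, 50.0]
T_tracker_vertebra = np.eye(4); T_tracker_vertebra[:3, 3] = [90.0, 25.0, 45.0]
print(tip_in_vertebra_frame(T_tracker_tool, T_tracker_vertebra, np.array([0.0, 0.0, 120.0])))
```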
  • the method 300 also comprises identifying, based on the tracking, an area from the segmented image representative of a section of the anatomical element that interacts with the portion of the surgical tool (step 320).
  • the computing device 102 may use pose information of the surgical tool 136 tracked by the navigation system 118 as the surgical tool 136 interacts with the vertebra 204 and the known pose of the vertebra 204 (e.g., based on navigation markers placed in known locations relative to the patient and registration between the navigation markers and the surgical tool 136) to determine which areas of the vertebra 204 interact with the tool tip 236 of the surgical tool 136.
  • the computing device 102 may then determine which voxels of the plurality of voxels 244A-244N correspond to the area of the segmented image.
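  • A minimal sketch of one way the touched region might be mapped to voxel indices is shown below: given a tip position in the anatomy's coordinates, it returns the voxels whose centers fall within a small radius of the tip. The grid origin, spacing, radius, and the function name `touched_voxels` are illustrative assumptions, not parameters defined by the disclosure.

```python
# Minimal sketch (illustrative assumption): find the voxels whose centers lie
# within a given radius of the tracked tool-tip position.
import numpy as np

def touched_voxels(tip_mm, origin_mm, spacing_mm, grid_shape, radius_mm=2.0):
    """Return (k, j, i) indices of voxels whose centers lie within radius of the tip."""
    kji = np.indices(grid_shape).reshape(3, -1).T            # every voxel index
    centers = origin_mm + (kji + 0.5) * spacing_mm           # voxel centers in mm
    dist = np.linalg.norm(centers - np.asarray(tip_mm), axis=1)
    return [tuple(int(v) for v in idx) for idx in kji[dist <= radius_mm]]

hits = touched_voxels(tip_mm=[10.0, 6.0, 6.0],
                      origin_mm=np.array([0.0, 0.0, 0.0]),
                      spacing_mm=np.array([4.0, 4.0, 4.0]),
                      grid_shape=(8, 8, 8))
print(hits)  # voxels whose centers fall within 2 mm of the tip
```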
  • the method 300 also comprises modifying one or more voxels from the plurality of voxels that reside within the area identified from the segmented image as being representative of the section of the anatomical element that interacts with the portion of the surgical instrument (step 324).
  • the computing device 102 may update the depiction of voxels on the navigation display 124 that have been identified as corresponding to portions or sections of the vertebra 204 that interacted with the tool tip 236.
  • the update may be or comprise a change in the rendered color of one or more of the voxels, a rendered border of one or more of the voxels, an addition of one or more visual labels indicating the state of the portion (e.g., “bone resected,” “bone decorticated,” etc.), combinations thereof, and/or the like.
  • where the surgical tool 136 comprises a drill that is used to drill a pilot hole for a pedicle screw, the voxels associated with the area of the vertebra 204 that is drilled into by the surgical tool 136 may be modified with an outline to indicate that the area of the vertebra 204 has been resected.
  • where the surgical tool 136 comprises a drill that is used to decorticate a facet or a lamina of the vertebra 204, the voxels associated with the facet or lamina of the vertebra 204 may be rendered in a different color to indicate that the facet or lamina has been decorticated.
  • the update of the voxel depiction on the navigation display 124 may indicate to the user that such sections of the vertebra 204 have been modified.
  • the computing device 102 may update the visual depiction of the voxels based on input information, such as inputs received from the user (e.g., via the user interface 110).
  • the surgery or surgical procedure may include the surgeon performing an operation with the surgical tool 136.
  • the user may perform the operation (e.g., drilling into the pedicle 208 for the later insertion of a pedicle screw), and may manually update the segmented image (e.g., by manipulating the voxels rendered to the navigation display 124, by inputting a command via the user interface 110 that the drilling step has been completed, etc.).
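  • The sketch below illustrates, under assumed state codes and RGB values, how identified voxels might be re-labeled (e.g., resected, decorticated) and mapped to display colors; it is not a prescribed rendering scheme for the navigation display 124.

```python
# Minimal sketch (illustrative assumption): keep a per-voxel state grid and map
# states to colors for display once voxels are identified as resected/decorticated.
import numpy as np

STATE_COLORS = {
    0: (200, 200, 200),   # untouched bone - grey
    1: (220, 60, 60),     # resected - red
    2: (60, 120, 220),    # decorticated - blue
}

def update_voxel_states(states: np.ndarray, touched, new_state: int) -> np.ndarray:
    """Set the state of every touched voxel index (k, j, i)."""
    for k, j, i in touched:
        states[k, j, i] = new_state
    return states

def render_colors(states: np.ndarray) -> np.ndarray:
    """Map the per-voxel state grid to an RGB volume for display."""
    rgb = np.zeros(states.shape + (3,), dtype=np.uint8)
    for state, color in STATE_COLORS.items():
        rgb[states == state] = color
    return rgb

states = np.zeros((8, 8, 8), dtype=np.uint8)                               # all untouched
states = update_voxel_states(states, [(2, 1, 1), (2, 1, 2)], new_state=1)  # drilled voxels
colors = render_colors(states)
print(colors[2, 1, 1], colors[2, 1, 2], colors[0, 0, 0])
```

  • A lookup-table approach of this kind keeps the per-voxel state (the record of what has been done to the anatomy) separate from how that state is colored, so a display scheme could be changed, or varied per tool type or workflow, without recomputing the states themselves.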
  • the method 300 also comprises rendering, to the display, the segmented image showing the modified one or more voxels (step 328). Once the voxels have been modified, the segmented image may be rendered to the display (e.g., the navigation display 124) as an updated segmented image that depicts the modified one or more voxels. The updated segmented image may enable the user to visualize the updated state of the vertebra 204, without the need for additional interoperative imaging.
  • the present disclosure encompasses embodiments of the method 300 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • the present disclosure encompasses methods with fewer than all of the steps identified in Fig. 3 (and the corresponding description of the method 300), as well as methods that include additional steps beyond those identified in Fig. 3 (and the corresponding description of the method 300).
  • the present disclosure also encompasses methods that comprise one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or comprise a registration or any other correlation.
  • Example 1 A system, comprising: a processor; and a memory storing data thereon that, when processed by the processor, enable the processor to: receive an image depicting an anatomical element; segment the image into a segmented image that includes a plurality of voxels; track a portion of a surgical instrument as the portion of the surgical instrument interacts with the anatomical element; identify, based on the tracking, an area from the segmented image representative of a section of the anatomical element that interacts with the portion of the surgical instrument; modify one or more voxels from the plurality of voxels that reside within the area identified from the segmented image as being representative of the section of the anatomical element that interacts with the portion of the surgical instrument; and render, to a display, the segmented image showing the modified one or more voxels.
  • Example 2 The system of example 1, wherein the one or more voxels are rendered with a first visual depiction a first time, and wherein the one or more voxels are rendered with a second visual depiction at a second time later than the first time.
  • Example 3 The system of example 1, wherein the modified one or more voxels are rendered with at least one of a different color and a different border than the plurality of voxels.
  • Example 4 The system of example 1, wherein the portion of the surgical instrument is capable of resecting anatomical tissue.
  • Example 5 The system of example 1, wherein the image is a two-dimensional image or a three-dimensional image.
  • Example 6 The system of example 1, wherein the tracking comprises determining a pose of the portion of the surgical instrument relative to the anatomical element as the portion of the surgical instrument interacts with the anatomical element.
  • Example 7 The system of example 1, wherein the modified one or more voxels indicate that the portion of the anatomical element has been resected.
  • Example 8 A system, comprising: a processor; and a memory coupled with the processor and storing data thereon that, when processed by the processor, enable the processor to: receive a segmented image depicting an anatomical element segmented into a plurality of voxels; render, to a display, the segmented image; track a surgical tool as the surgical tool interacts with the anatomical element; determine, based on the tracking, a voxel of the plurality of voxels representative of a portion of the anatomical element that interacts with the surgical tool; and update a visual depiction of the voxel shown in the segmented image on the display.
  • Example 9 The system of example 8, wherein the update of the visual depiction of the voxel comprises a change in at least one of a color and a border of the voxel.
  • Example 10 The system of example 8, wherein the update of the visual depiction of the voxel comprises an indicator that the portion of the anatomical element has been resected.
  • Example 11 The system of example 8, wherein the segmented image comprises a two-dimensional image or a three-dimensional image.
  • Example 12 The system of example 8, wherein the segmented image is received from an artificial intelligence data model.
  • Example 13 The system of example 12, wherein the artificial intelligence data model comprises a convolutional neural network that receives image data as an input and outputs the segmented image.
  • Example 14 The system of example 8, wherein the tracking comprises determining a pose of the surgical tool relative to the anatomical element as the surgical tool interacts with the anatomical element.
  • Example 15 The system of example 8, wherein the update of the visual depiction of the voxel is based on at least one of a type of surgical tool and a surgical workflow.
  • Example 16 A system, comprising: a processor; and a memory coupled with the processor and storing data thereon that, when processed by the processor, enable the processor to: receive image data associated with an anatomical element; segment the image data into a plurality of voxels; render, to a display, a visual depiction of the plurality of voxels; track an operative portion of a surgical instrument as the operative portion of the surgical instrument interacts with the anatomical element; identify, based on the tracking, a voxel of the plurality of voxels associated with the operative portion of the surgical instrument; and render, based on the tracking, an updated visual depiction of the image data that includes a modified version of the voxel.
  • Example 17 The system of example 16, wherein the operative portion of the surgical instrument is capable of resecting anatomical tissue.
  • Example 18 The system of example 17, wherein the modified version of the voxel provides an indicator that a section of the anatomical element has been resected.
  • Example 19 The system of example 16, wherein the modified version of the voxel is rendered in the updated visual depiction with a first visual indicator when the surgical instrument is a first type of surgical instrument, and wherein the voxel is rendered with a second visual indicator when the surgical instrument is a second type of surgical instrument.
  • Example 20 The system of example 19, wherein the first visual indicator indicates that a section of the anatomical element has been resected, and wherein the second visual indicator indicates that the section of the anatomical element has been decorticated.

Abstract

A system according to one embodiment of the present disclosure includes a processor and a memory storing data thereon that, when processed by the processor, enable the processor to: receive an image depicting an anatomical element; segment the image into a segmented image that includes a plurality of voxels; track a portion of a surgical instrument as the portion of the surgical instrument interacts with the anatomical element; identify, based on the tracking, an area from the segmented image representative of a section of the anatomical element that interacts with the portion of the surgical instrument; modify one or more voxels from the plurality of voxels that reside within the area identified from the segmented image as being representative of the section of the anatomical element that interacts with the portion of the surgical instrument; and render, to a display, the segmented image showing the modified one or more voxels.
PCT/IB2024/057858 2023-08-16 2024-08-13 Systèmes et méthodes de visualisation en temps réel de l'anatomie dans des procédures de navigation Pending WO2025037243A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202363532974P 2023-08-16 2023-08-16
US63/532,974 2023-08-16
US18/787,810 US20250057603A1 (en) 2023-08-16 2024-07-29 Systems and methods for real-time visualization of anatomy in navigated procedures
US18/787,810 2024-07-29

Publications (1)

Publication Number Publication Date
WO2025037243A1 true WO2025037243A1 (fr) 2025-02-20

Family

ID=92746569

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2024/057858 Pending WO2025037243A1 (fr) 2023-08-16 2024-08-13 Systèmes et méthodes de visualisation en temps réel de l'anatomie dans des procédures de navigation

Country Status (1)

Country Link
WO (1) WO2025037243A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120095586A (ko) * 2011-02-21 2012-08-29 한양대학교 산학협력단 수술 환부의 실시간 그래픽 표현 방법
US20170330487A1 (en) * 2016-05-12 2017-11-16 Affera, Inc. Three-dimensional cardiac representation
US20190320878A1 (en) * 2017-01-09 2019-10-24 Intuitive Surgical Operations, Inc. Systems and methods for registering elongate devices to three dimensional images in image-guided procedures
US20190251755A1 (en) * 2018-02-09 2019-08-15 David Byron Douglas Interactive voxel manipulation in volumetric medical imaging for virtual motion, deformable tissue, and virtual radiological dissection
US11442534B1 (en) * 2018-04-10 2022-09-13 Red Pacs, Llc Smart glasses system

Similar Documents

Publication Publication Date Title
US20220395342A1 (en) Multi-arm robotic systems and methods for monitoring a target or performing a surgical procedure
US20250152262A1 (en) Path planning based on work volume mapping
US20250127572A1 (en) Methods and systems for planning a surgical procedure
EP4472547A1 (fr) Poursuite segmentaire combinant une poursuite optique et des mesures inertielles
US20250318886A1 (en) Automatic robotic procedure for skin cutting, tissue pathway, and dilation creation
EP4026511B1 (fr) Systèmes et procédés de mise à jour d'enregistrement d'image unique
US20240398584A1 (en) Systems and methods for bone graft mixture guidance based on navigation information
US20250057603A1 (en) Systems and methods for real-time visualization of anatomy in navigated procedures
US12446962B2 (en) Spine stress map creation with finite element analysis
US20240138932A1 (en) Systems and methods for controlling one or more surgical tools
WO2025037243A1 (fr) Systèmes et méthodes de visualisation en temps réel de l'anatomie dans des procédures de navigation
WO2024116018A1 (fr) Sélection et suggestion intelligentes d'instruments chirurgicaux
US20230240749A1 (en) Systems and methods for controlling surgical tools based on bone density estimation
EP4284289A1 (fr) Systèmes et procédés de vérification de point d'entrée d'os
WO2024246740A1 (fr) Systèmes et procédés de guidage de mélange de greffe osseuse sur la base d'informations de navigation
US12004821B2 (en) Systems, methods, and devices for generating a hybrid image
US20220241016A1 (en) Bone entry point verification systems and methods
US12295683B2 (en) Systems and methods for robotic collision avoidance using medical imaging
WO2025150040A1 (fr) Systèmes et procédés de résection par navigation chirurgicale d'éléments anatomiques
US20240358461A1 (en) Multi-arm robotic systems and methods for monitoring a target or performing a surgical procedure
US20230240659A1 (en) Systems, methods, and devices for tracking one or more objects
WO2024236563A1 (fr) Systèmes et procédés de génération et de mise à jour d'un plan chirurgical
WO2025120637A1 (fr) Systèmes et procédés de planification et de mise à jour de trajectoires pour dispositifs d'imagerie

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 24769062
Country of ref document: EP
Kind code of ref document: A1