[go: up one dir, main page]

WO2025150040A1 - Systems and methods for navigated surgical resection of anatomical elements - Google Patents

Systems and methods for navigated surgical resection of anatomical elements

Info

Publication number
WO2025150040A1
Authority
WO
WIPO (PCT)
Prior art keywords
surgical tool
robotic arm
anatomical element
surgical
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IL2025/050019
Other languages
French (fr)
Inventor
Elad Rotman
Ido ZUCKER
Adi ESS
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mazor Robotics Ltd
Original Assignee
Mazor Robotics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mazor Robotics Ltd filed Critical Mazor Robotics Ltd
Publication of WO2025150040A1 publication Critical patent/WO2025150040A1/en
Pending legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3983 Reference marker arrangements for use with image guided surgery

Definitions

  • the present disclosure is generally directed to surgical navigation, and relates more particularly to using navigation system information to control surgical tools in performing resection of anatomical tissues during a surgery or surgical procedure.
  • the memory stores further data that, when processed by the processor, enables the processor to: update, after at least one of the first surgical tool resecting the first portion of the anatomical element and the second surgical tool resecting the second portion of the anatomical element, a depiction of the anatomical element rendered to a display.
  • the memory stores further data that, when processed by the processor, enables the processor to: disable the first surgical tool when the first surgical tool reaches the plane.
  • the first surgical tool comprises a first set of navigation markers
  • the second surgical tool comprises a second set of navigation markers
  • tracking the first surgical tool comprises identifying a pose of the first set of navigation markers
  • tracking the second surgical tool comprises identifying a pose of the second set of navigation markers
  • the first surgical tool is connectable to a robotic arm, and wherein the robotic arm manipulates the first surgical tool to resect the first portion of the anatomical element.
  • the second surgical tool is connectable to the robotic arm, and wherein the robotic arm manipulates the second surgical tool to resect the second portion of the anatomical element.
  • the second surgical tool is connectable to a second robotic arm, and wherein the second robotic arm manipulates the second surgical tool to resect the second portion of the anatomical element.
  • any of the aspects herein further comprising: updating, after at least one of the resecting of the first portion and the resecting of the second portion, a depiction of the anatomical element rendered to a display.
  • the changed characteristics of the bone cut and/or the anatomical element may then be saved to the database 130, the surgical plan 136, combinations thereof, and/or the like.
  • the surgical tool information 140 may contain information about the parameters of one or more of the surgical tools used in the surgery or surgical procedure (e.g., the rotation speed of the operative end of the surgical tool, the electrical power requirements of the surgical tool, the dimensions of the surgical tool, etc.).
  • the surgical tool information 140 may also specify which type of surgical tool is to be used at each step in the surgery or surgical procedure.
  • the surgical tool information 140 may specify that the HSD 148 is to be used in a first surgical step of resecting cortical bone, and that the OSD 152 is to be used in a second surgical step of resecting cancellous/trabecular bone.
  • the surgical tool information 140 may be modifiable by the user (e.g., a physician) based on inputs to the user interface 110.
  • the surgical tool information 140 may be part of the surgical plan 136.
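To make the shape of such tool information concrete, the following minimal Python sketch shows one way the surgical tool information 140 might be organized as data. It is an editorial illustration only; every field name and value below is invented and is not drawn from the application.

```python
# A minimal sketch, assuming the tool information is a mapping from tools
# to parameters plus an ordered list of steps. All names and values are
# invented for illustration.
surgical_tool_information = {
    "tools": {
        "HSD": {"rotation_speed_rpm": 60000, "power_w": 120, "tip_diameter_mm": 3.0},
        "OSD": {"oscillation_hz": 200, "power_w": 80, "tip_diameter_mm": 3.0},
    },
    "steps": [
        {"step": 1, "task": "resect cortical bone", "tool": "HSD"},
        {"step": 2, "task": "resect cancellous/trabecular bone", "tool": "OSD"},
    ],
}

# e.g., look up which tool the plan calls for at step 2:
tool_for_step_2 = surgical_tool_information["steps"][1]["tool"]  # "OSD"
```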
  • the mesh model 144 may be or comprise a model of one or more anatomical elements that are the subject of the surgery or surgical procedure.
  • the mesh model 144 may be or comprise a mesh of the vertebra.
  • the mesh model 144 may be generated by the computing device 102 based on one or more images captured using the imaging device 112, image information taken from the surgical plan 136, information retrieved from the database 130, combinations thereof, and/or the like.
  • the imaging device 112 may generate a CT scan of the patient including the target vertebra.
  • the computing device 102 may receive the image data from the imaging device 112 and, using image processing 120 and segmentation 122, generate the mesh model 144.
  • a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
  • the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
  • the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
  • the imaging device 112 may be contained entirely within a single housing, or may comprise a transmitter/emitter and a receiver/detector that are in separate housings or are otherwise physically separated from one another.
  • the imaging device 112 may comprise more than one imaging device 112.
  • a first imaging device may provide first image data and/or a first image
  • a second imaging device may provide second image data and/or a second image.
  • the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
  • the imaging device 112 may be operable to generate a stream of image data.
  • the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
  • image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
  • the robot 114 together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
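As a hedged illustration of what "pose" means here (a position plus an orientation), the sketch below pairs a 3D position with a unit quaternion and maps a tool-frame point into the world frame. The class name and quaternion convention are invented for this note, not taken from the application.

```python
# A minimal sketch of a pose (position + orientation), assuming a unit
# quaternion (w, x, y, z) convention. Invented for illustration.
from dataclasses import dataclass

import numpy as np


@dataclass
class Pose:
    position: np.ndarray     # shape (3,), e.g. millimeters in a known frame
    orientation: np.ndarray  # shape (4,), unit quaternion (w, x, y, z)

    def transform_point(self, p_local: np.ndarray) -> np.ndarray:
        """Express a tool-frame point in the world frame."""
        w, x, y, z = self.orientation
        # Rotation matrix built from the unit quaternion.
        r = np.array([
            [1 - 2 * (y * y + z * z), 2 * (x * y - w * z), 2 * (x * z + w * y)],
            [2 * (x * y + w * z), 1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
            [2 * (x * z - w * y), 2 * (y * z + w * x), 1 - 2 * (x * x + y * y)],
        ])
        return r @ p_local + self.position
```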
  • the robotic arm(s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm).
  • reference markers may be placed on the robot 114 (including, e.g., on the robotic arm 116), the imaging device 112, or any other object in the surgical space.
  • the reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof.
  • the navigation system 118 can be used to track other components of the system (e.g., imaging device 112) and the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example).
  • the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras or other sensor(s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located.
  • the one or more cameras may be optical cameras, infrared cameras, or other cameras.
  • the navigation system 118 may comprise one or more electromagnetic sensors.
  • the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
  • the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 134.
  • the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
  • the cloud 134 may be or represent the Internet or any other wide area network.
  • the computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both.
  • the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
  • the HSD 148 may be configured to drill, burr, mill, cut, saw, ream, tap, etc. into anatomical tissues such as patient anatomy (e.g., soft tissues, bone, etc.).
  • the system 100 may comprise the HSD 148, the OSD 152, and/or multiple other surgical tools, with each surgical tool performing a different surgical task (e.g., the HSD 148 for drilling through trabecular bone of a vertebra, the OSD 152 for cutting through the distal cortex of the vertebra at a slower rate than the HSD 148, etc.).
  • the HSD 148 may be operated autonomously (e.g., when the HSD 148 is connected to and manipulated by the robotic arm 116) or semi-autonomously, such as when the HSD 148 is manipulated by a user with guidance from the navigation system 118.
  • the OSD 152 may be used to resect the distal cortex of a vertebra, and the oscillatory motion of the operative end of the OSD 152 may reduce the likelihood of breaching the nerve channel beneath the distal cortex.
  • the OSD 152 may be operated autonomously (e.g., when the OSD 152 is connected to and manipulated by the robotic arm 116) or semi-autonomously, such as when the OSD 152 is manipulated by a user with guidance from the navigation system 118.
  • the OSD 152 may be attached to a robotic arm 116, such that movement of the robotic arm 116 correspondingly causes movement in the OSD 152.
  • the OSD 152 may be gripped, held, or otherwise coupled to and controlled by the robotic arm 116.
  • the pose (e.g., position and orientation) of the OSD 152 may be controlled by the pose of the robotic arm 116.
  • the OSD 152 can be controlled by one or more components of the system 100, such as the computing device 102.
  • the computing device 102 may be capable of receiving or retrieving data or other information (e.g., from the database 130, from one or more sensors, from the imaging device 112, etc.), processing the information, and controlling the OSD 152 based on the processed information. Additionally or alternatively, the navigation system 118 may track the position of and/or navigate the OSD 152. Such tracking may enable the system 100 or components thereof (e.g., the computing device 102) to determine the pose of the OSD 152, the location of the OSD 152 relative to the planned bone cut, combinations thereof, and/or the like, as discussed in further detail below.
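One plausible reading of this control path is a polling loop that consumes tracked poses and disables the tool at the planned boundary. The sketch below assumes hypothetical `navigation`, `drill`, and `plane` objects exposing the minimal interface the passage implies; none of these names come from the application.

```python
# A minimal sketch, assuming invented `navigation`, `drill`, and `plane`
# interfaces. An editorial illustration, not the application's method.
import time


def control_drill(navigation, drill, plane, margin_mm=0.5, poll_s=0.02):
    while drill.is_enabled():
        tip = navigation.tool_tip_position()      # latest tracked tip (x, y, z)
        if plane.signed_distance(tip) <= margin_mm:
            drill.disable()                       # boundary reached: stop cutting
            break
        time.sleep(poll_s)                        # wait for the next pose update
```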
  • the HSD 148 may be connected to a first robotic arm, while the OSD 152 is connected to a second, different robotic arm. In such embodiments, each robotic arm may independently operate each surgical tool, with the navigation system 118 tracking both and the computing device 102 generating navigation paths of both robotic arms to avoid or mitigate the likelihood of collisions between the two.
  • the system 100 or similar systems may be used, for example, to carry out one or more aspects of the method 400 described herein.
  • the system 100 or similar systems may also be used for other purposes.
  • An HSD 204 and an OSD 208 may be positioned relative to an anatomical element 210.
  • the HSD 204 may be similar to or the same as the HSD 148
  • the OSD 208 may be similar to or the same as the OSD 152.
  • the HSD 204 and the OSD 208 may be used to perform a bone cut 228.
  • the bone cut 228 may be or comprise a cut through one or more portions of the anatomical element 210, such as through the outer cortical bone of a vertebra and through the distal cortex of the vertebra.
  • the HSD 204 may be used to perform the first cut on the outer cortical bone
  • the OSD 208 may be used to perform the second cut to remove the distal cortex.
  • the HSD 204 may comprise a navigated portion 212 and the OSD 208 may comprise a navigated portion 216.
  • the navigated portion 212 and the navigated portion 216 may respectively comprise navigation markers 220A-220D (including a first navigation marker 220A, a second navigation marker 220B, a third navigation marker 220C, and a fourth navigation marker 220D) and navigation markers 224A-224D (including a first navigation marker 224A, a second navigation marker 224B, a third navigation marker 224C, and a fourth navigation marker 224D).
  • the navigation markers 220A-220D and the navigation markers 224A-224D may enable a navigation camera of the navigation system 118 to track the pose (e.g., position and orientation) of the HSD 204 and the OSD 208, respectively, as the HSD 204 and the OSD 208 move relative to the anatomical element 210.
  • the HSD 204 may be moved (e.g., using a robotic arm such as the robotic arm 116, manually by a user, etc.) relative to the anatomical element 210 as the operative portion of the HSD 204 resects one or more portions of the anatomical element 210.
  • the navigation camera of the navigation system 118 may identify the navigation markers 220A-220D and track the pose of the navigated portion 212 (e.g., based on the pose of the navigation markers 220A-220D), such that the navigation system 118 can determine a pose of the HSD 204 in a known coordinate system.
  • the OSD 208 may be moved (e.g., using a robotic arm such as the robotic arm 116, manually by a user, etc.) relative to the anatomical element 210 as the operative portion of the OSD 208 resects one or more portions of the anatomical element 210.
  • the navigation camera of the navigation system 118 may identify the navigation markers 224A-224D and track the pose of the navigated portion 216 (e.g., based on the pose of the navigation markers 224A-224D), such that the navigation system 118 can determine a pose of the OSD 208 in a known coordinate system.
  • the navigation system 118 may provide the pose information of the HSD 204 and/or the OSD 208 to the computing device 102, which may use the pose information to determine when use of the HSD 204 and/or the OSD 208 should be discontinued or stopped, as discussed in more detail below.
  • the navigation system 118 may provide updated pose information at a predetermined or user-specified interval (e.g., every 10 milliseconds (ms), every 20 ms, every 50 ms, every 2 seconds, etc.).
  • the bone cut 228 may be or comprise an indication of the portions of the anatomical element 210 that are to be operated on by the HSD 204 and/or the OSD 208.
  • the bone cut 228 may comprise a multi-dimensional shape (e.g., 2D plane or 3D volume) that specifies the area or volume of the anatomical element 210 that is to be resected or otherwise operated on by the HSD 204 and/or the OSD 208.
  • the bone cut 228 comprises a plane 240 that divides the anatomical element 210 into a first portion 232 and a second portion 236.
  • the first portion 232 may be or comprise a section of the anatomical element 210 that is to be resected using the HSD 204
  • the second portion 236 may be or comprise a section of the anatomical element 210 that is to be resected using the OSD 208.
  • the plane 240 may be definable or editable by the user via, for example, the user interface 110.
  • the user may be able to change the shape, orientation, dimensions, and/or the like of the plane 240 to change the shape, orientation, dimensions, and/or the like of the first portion 232 and/or the second portion 236.
  • the user may be able to draw a shape of the plane 240 on a mesh model (e.g., mesh model 144) or the anatomical element 210 rendered to a display, and the mesh model may be updated to incorporate the plane 240.
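A common way to realize such a plane 240 computationally is as a point and a unit normal, with the sign of the point-to-plane distance deciding which portion a location belongs to. The sketch below is a generic geometric illustration under that assumption; the class name and API are invented.

```python
# A minimal sketch: a plane stored as (point, unit normal) that splits
# model points into a "first portion" and a "second portion". Invented
# names; not the application's code.
import numpy as np


class CutPlane:
    def __init__(self, point, normal):
        self.point = np.asarray(point, dtype=float)
        n = np.asarray(normal, dtype=float)
        self.normal = n / np.linalg.norm(n)  # normalize once

    def signed_distance(self, p) -> float:
        """Positive on the first-portion side, negative on the second."""
        return float(np.dot(np.asarray(p, dtype=float) - self.point, self.normal))

    def partition(self, vertices):
        """Split an (N, 3) vertex array into first/second portions."""
        v = np.asarray(vertices, dtype=float)
        d = (v - self.point) @ self.normal
        return v[d >= 0.0], v[d < 0.0]
```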
  • the HSD 204 may be used to resect anatomical tissue associated with the first portion 232 of the anatomical element 210. As depicted in Fig. 2B, the HSD 204 may move laterally across the anatomical element 210 such that the operative portion of the HSD 204 resects anatomical tissue. The pose of the HSD 204 may be tracked using a navigation camera of the navigation system 118 that identifies a pose of the navigation markers 220A-220D on the navigated portion 212. The HSD 204 may be used to resect the first portion 232 of the anatomical element 210 until the operative portion of the HSD 204 reaches the plane 240.
  • the computing device 102 may determine that the HSD 204 has reached the plane 240 based on the tracking of the pose of the HSD 204 as the HSD 204 resects the first portion 232.
  • the computing device 102 may receive tracking and/or pose information from the navigation system 118, and may use such information and information from the surgical plan 136 (e.g., the dimensions of the first portion 232) to determine that the HSD 204 has reached the plane 240.
  • the surgical plan 136 may specify that the first portion 232 has a first thickness, and the computing device 102 may determine the HSD 204 would have a first pose in a known coordinate system when the HSD 204 has drilled through the first thickness.
  • when the navigation system 118 tracking information specifies that the HSD 204 is in the first pose, the computing device 102 may determine that the first portion 232 has been resected and the HSD 204 has reached the plane 240.
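In other words, the decision reduces to comparing the drilled depth against the planned thickness. The helper below sketches that comparison, assuming tool-tip positions expressed in one known coordinate system; the function name and tolerance are invented for illustration.

```python
# A minimal sketch of the "has the tool reached the plane" decision,
# assuming entry and current tip positions in the same coordinate system.
# The tolerance is an invented illustrative parameter.
import numpy as np


def has_reached_plane(entry_tip, current_tip,
                      first_portion_thickness_mm, tolerance_mm=0.2) -> bool:
    drilled_mm = float(np.linalg.norm(np.asarray(current_tip) - np.asarray(entry_tip)))
    return drilled_mm + tolerance_mm >= first_portion_thickness_mm
```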
  • the computing device 102 may generate an alert.
  • the alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), a haptic alert (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like.
  • a warning message may be rendered to the display to indicate to the user that the first portion 232 has been resected and that the operative portion of the HSD 204 has arrived at the plane 240.
  • the plane 240 may represent the intersection of two different types of anatomical tissues.
  • the OSD 208 may be used to resect the second portion 236.
  • the computing device 102 may render the alert to the display instructing the user to change the surgical tool in use (e.g., to switch out the HSD 204 for the OSD 208).
  • the alert may instruct the user to switch out the surgical tool connected to the robotic arm.
  • the robotic arm may navigate the HSD 204 until the operative portion of the HSD 204 reaches the plane 240, at which point the alert may be generated.
  • the alert may be an instruction rendered to the display that instructs the user to detach the HSD 204 from the robotic arm and to attach the OSD 208.
  • the navigation system 118 may initiate and conduct an authentication process to ensure that the correct surgical tool (e.g., the OSD 208) has been connected to the robotic arm.
  • the navigation system 118 may identify the navigation markers 224A-224D (which may be distinguishable by the navigation system 118 from the navigation markers 220A-220D of the HSD 204) to verify that the HSD 204 has been disconnected from the robotic arm and that the OSD 208 has been connected.
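Because the two marker sets are distinguishable, such a swap can be verified by set membership alone. The sketch below assumes the navigation camera reports a set of visible marker identifiers; the identifiers reuse the figure's reference numerals purely as labels, and the check itself is an editorial illustration.

```python
# A minimal sketch of tool authentication by marker-set matching. Marker
# IDs reuse the figure numerals as labels; the logic is invented.
HSD_MARKERS = frozenset({"220A", "220B", "220C", "220D"})
OSD_MARKERS = frozenset({"224A", "224B", "224C", "224D"})


def swap_verified(visible_ids: set) -> bool:
    """True once the OSD's markers are seen and the HSD's are not."""
    return OSD_MARKERS <= visible_ids and not (HSD_MARKERS & visible_ids)
```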
  • the surgical plan 136 may specify that the second portion 236 has a first thickness, and the computing device 102 may determine the OSD 208 would have a first pose in a known coordinate system when the OSD 208 has drilled through the first thickness.
  • the computing device 102 may determine that the second portion 236 has been resected. Once the OSD 208 has resected the second portion 236, the computing device 102 may generate a second alert that indicates that the second portion 236 has been resected.
  • the alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), a haptic alert (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like.
  • the computing device 102 may disable use of the OSD 208 once the computing device 102 has determined that the second portion 236 has been resected and has generated the second alert.
  • a depiction of the anatomical element 210 on a display may be changed or otherwise updated to reflect the removal of anatomical tissue.
  • the updating may occur while the HSD 204 and the OSD 208 resect portions of the anatomical element 210 and/or after each step of the surgery or surgical procedure.
  • the mesh model 144 may be rendered to a display, and after the HSD 204 has resected the first portion 232, the mesh model 144 may be updated to depict the anatomical element 210 with the first portion 232 removed.
  • the mesh model 144 may be again updated to depict the anatomical element 210 with the second portion 236 removed.
  • the computing device 102 may generate a new model after each resection or step in the surgery or surgical procedure. For example, after the HSD 204 has resected the first portion 232, the computing device 102 may use image processing 120 and segmentation 122 to segment out the depiction of the first portion 232 from the mesh model 144, and render the mesh model 144 as a new model to the display.
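As one sketch of what "segmenting out" the resected portion could look like at the data level, the helper below drops the vertices on the resected side of the plane. A real pipeline would re-mesh the surface rather than merely filter vertices; every name here is invented for illustration.

```python
# A minimal sketch, assuming the mesh is reduced to an (N, 3) vertex
# array and the cut is a plane. Shows only the gist of removing the
# resected portion from the rendered depiction.
import numpy as np


def remove_resected_portion(vertices, plane_point, plane_normal):
    v = np.asarray(vertices, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    d = (v - np.asarray(plane_point, dtype=float)) @ n
    return v[d < 0.0]  # keep only the side that has not been resected
```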
  • the first robotic arm 316 and the second robotic arm 320 may each comprise a tool changer that enables the first robotic arm 316 and the second robotic arm 320 to be coupled to different surgical tools, such as to an HSD or to an OSD.
  • the first robotic arm 316 and the second robotic arm 320 can each provide both HSD and OSD drilling capabilities to a respective side of the patient.
  • the first robotic arm 316 provides HSD and OSD drilling capabilities to a first side of the patient 308, while the second robotic arm 320 provides HSD and OSD drilling capabilities to a second side of the patient 308.
  • the computing device 102 may lock or otherwise disable the use of one robotic arm while the other is in use, and vice versa.
  • the second robotic arm 320 and components thereof (e.g., the second surgical tool 328) may be locked in place within the working volume 312.
  • the second robotic arm 320 and components thereof may be moved outside of the working volume 312 so as to not interfere with navigation of the first robotic arm 316 and components thereof.
  • the first robotic arm 316 and the second robotic arm 320 may be used to perform an autonomous bone cut of an anatomical element of the patient 308.
  • the first robotic arm 316 and the second robotic arm 320 may be used to drill a hole for a pedicle screw in a vertebra 332 of the patient 308.
  • the planned drilling may comprise drilling down to a first depth 336 using the first surgical tool 324, and then drilling down to a second depth 340 using the second surgical tool 328.
  • the use of different surgical tools may account for differences in the composition and/or sensitivity of anatomical tissues of the vertebra 332 along the trajectory.
  • one or more multi-dimensional shapes may be used to represent the different depths, such as when a 2D plane bisects the drilling depths into two portions: a first portion through which the first surgical tool 324 is to drill, and a second portion through which the second surgical tool 328 is to drill.
  • the multi-dimensional shape may be definable or editable by the user via, for example, the user interface 110. For instance, the user may be able to change the shape, orientation, dimensions, and/or the like of the multi-dimensional shape to change the shape, orientation, dimensions, and/or the like of the drilling depths.
  • the user may be able to draw a shape of the multi-dimensional shape on a mesh model (e.g., mesh model 144) or the vertebra 332 rendered to a display, and the mesh model may be updated to incorporate the multi-dimensional shape. Additionally or alternatively, the user may be able to define and/or edit the trajectory of the drilling into the vertebra 332.
  • the computing device 102 may generate an alert.
  • the alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), a haptic alert (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like.
  • a warning message may be rendered to the display to indicate to the user that the first depth 336 has been reached by the first surgical tool 324.
  • the first depth 336 may be a depth at which a different type of anatomical tissue is encountered, such that a different surgical tool should be used.
  • the second surgical tool 328 may be used to drill past the first depth 336 and down to the second depth 340.
  • the second robotic arm 320 and/or the second surgical tool 328 may be tracked using a navigation camera of the navigation system 118 that identifies a pose of the navigation markers attached to the second robotic arm 320 and/or the second surgical tool 328.
  • the computing device 102 may determine that the second surgical tool 328 has drilled to the second depth 340 based on tracking the pose of the second surgical tool 328.
  • the computing device 102 may receive tracking and/or pose information from the navigation system 118, and may use such information and information from the surgical plan 136 (e.g., the depth of the second depth 340) to determine that the second surgical tool 328 has drilled to the second depth 340.
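The two-depth plan therefore reduces, at any instant, to asking which stage the current drilled depth falls in. The sketch below illustrates that with invented depth values; neither the numbers nor the stage labels are taken from the application.

```python
# A minimal sketch of the two-stage depth gate, with invented numbers.
def stage_for_depth(depth_mm, first_depth_mm=20.0, second_depth_mm=35.0):
    if depth_mm < first_depth_mm:
        return "first tool drilling"    # e.g., first surgical tool 324
    if depth_mm < second_depth_mm:
        return "second tool drilling"   # e.g., second surgical tool 328
    return "done"                       # second depth 340 reached: stop
```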
  • the method 400 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
  • a processor other than any processor described herein may also be used to execute the method 400.
  • the at least one processor may perform the method 400 by executing elements stored in a memory such as the memory 106.
  • the elements stored in memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 400.
  • One or more portions of the method 400 may be performed by the processor executing any of the contents of memory, such as the image processing 120, the segmentation 122, the transformation 124, and/or the registration 128.
  • the method 400 also comprises generating, when the first surgical tool has reached the plane, an alert (step 420).
  • the alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), a haptic alert (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like.
  • a warning message may be rendered to the display to indicate to the user that the first portion has been resected by the first surgical tool.
  • the alert may be an instruction rendered to the display that instructs the user to detach the first surgical tool from the robotic arm and to attach the second surgical tool.
  • the navigation system 118 may initiate and conduct an authentication process to ensure that the correct surgical tool (e.g., the second surgical tool) has been connected to the robotic arm. For instance, the navigation system 118 may identify navigation markers unique to the second surgical tool to verify that the first surgical tool has been disconnected from the robotic arm and that the second surgical tool has been connected.
  • the alert may indicate to the user that the plane has been reached, and that the fully autonomous system is moving on to the next step in the surgical process.
  • the computing device 102 may disable the first surgical tool once the computing device 102 has determined that the plane has been reached, and may move the first robotic arm and/or the first surgical tool away from the anatomical element. After the first robotic arm and/or the first surgical tool have been removed, the computing device 102 may cause the second robotic arm 320 to move relative to the anatomical element such that the second surgical tool can proceed with resecting the second portion of the anatomical element.
  • the second robotic arm may be manipulated such that the second surgical tool is disposed to continue resecting the anatomical element in a similar pose to that of the first surgical tool.
  • Example 10 A surgical system comprising: a processor (104); and a memory (106) storing data thereon that, when executed by the processor (104), enable the processor (104) to: receive a three-dimensional (3D) model (144) of an anatomical element; determine, based on the 3D model (144), a plane that at least partially bisects the anatomical element into a first portion and a second portion; track a first surgical tool (148, 204, 324) as the first surgical tool (148, 204, 324) resects the first portion of the anatomical element; determine, based on the tracking, that the first surgical tool (148, 204, 324) has reached the plane; generate, when the first surgical tool (148, 204, 324) has reached the plane, an alert; and track a second surgical tool (152, 208, 328) as the second surgical tool (152, 208, 328) resects the second portion of the anatomical element.
  • Example 13 The surgical system of any one of Examples 10-12, wherein the memory (106) stores further data that, when processed by the processor (104), enables the processor (104) to:

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Robotics (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Manipulator (AREA)

Abstract

A system according to at least one embodiment of the present disclosure includes: a robotic arm couplable to a first surgical tool; a processor; and a memory storing data thereon that, when processed by the processor, enable the processor to: receive a three-dimensional (3D) model of an anatomical element; determine, based on the 3D model, a plane that at least partially bisects the anatomical element into a first portion and a second portion; track the robotic arm as the first surgical tool resects the first portion of the anatomical element; determine, based on the tracking of the robotic arm, that the first surgical tool has reached the plane; and generate, when the first surgical tool has reached the plane, an alert.

Description

SYSTEMS AND METHODS FOR NAVIGATED SURGICAL RESECTION OF
ANATOMICAL ELEMENTS
BACKGROUND
[0001] The present disclosure is generally directed to surgical navigation, and relates more particularly to using navigation system information to control surgical tools in performing resection of anatomical tissues during a surgery or surgical procedure.
[0002] Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure, or may complete one or more surgical procedures autonomously. Providing controllable linked articulating members allows a surgical robot to reach areas of a patient anatomy during various medical procedures.
BRIEF SUMMARY
[0003] High speed drills often provide quick bone cuts, beneficially reducing the amount of time a patient spends under anesthesia and/or in the operating room. However, high speed drills can inadvertently cause damage to anatomical tissues surrounding the surgical site, such as when the high speed drill breaches the distal cortex of a vertebra. Such a breach could damage the underlying nerves, leading to less effective patient outcomes. Oscillating drills, in contrast, reduce the likelihood of such a breach, but are slower than high speed drills in resecting anatomical tissue. Embodiments of the present disclosure beneficially utilize high speed drills to perform an initial bone cut; then, when the high speed drill approaches a sensitive area, an alert is generated. The alert may instruct the user that the sensitive area has been reached, enabling the user to change out the high speed drill for an oscillating drill. Accordingly, embodiments of the present disclosure beneficially utilize high speed drills in conjunction with oscillating drills to keep operating room times low while also enhancing patient safety.
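Read as control flow, this summary is a small state machine: fast drilling, an alert and a tool swap at the boundary, then slower oscillating drilling. The Python sketch below is an editorial rendering of that flow under invented names, not an implementation from the application.

```python
# A minimal sketch of the summarized workflow as a state machine.
from enum import Enum, auto


class Phase(Enum):
    HSD_CUT = auto()  # high speed drill resects the first portion
    SWAP = auto()     # alert raised; HSD exchanged for the OSD
    OSD_CUT = auto()  # oscillating drill resects the second portion
    DONE = auto()


def next_phase(phase, reached_plane, swap_confirmed, second_portion_done):
    if phase is Phase.HSD_CUT and reached_plane:
        return Phase.SWAP            # this is where the alert is generated
    if phase is Phase.SWAP and swap_confirmed:
        return Phase.OSD_CUT
    if phase is Phase.OSD_CUT and second_portion_done:
        return Phase.DONE
    return phase
```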
[0004] Example aspects of the present disclosure include:
[0005] A system according to at least one embodiment of the present disclosure comprises: a robotic arm couplable to a first surgical tool; a processor; and a memory storing data thereon that, when processed by the processor, enable the processor to: receive a three-dimensional (3D) model of an anatomical element; determine, based on the 3D model, a plane that at least partially bisects the anatomical element into a first portion and a second portion; track the robotic arm as the first surgical tool resects the first portion of the anatomical element; determine, based on the tracking of the robotic arm, that the first surgical tool has reached the plane; and generate, when the first surgical tool has reached the plane, an alert.
[0006] Any of the aspects herein, wherein the robotic arm is further couplable to a second surgical tool, and wherein the memory stores further data that, when processed by the processor, enables the processor to: track the robotic arm as the second surgical tool resects the second portion of the anatomical element.
[0007] Any of the aspects herein, wherein the memory stores further data that, when processed by the processor, enables the processor to: update, after at least one of the first surgical tool resecting the first portion of the anatomical element and the second surgical tool resecting the second portion of the anatomical element, a depiction of the anatomical element rendered to a display.
[0008] Any of the aspects herein, wherein the second surgical tool comprises an oscillating drill.
[0009] Any of the aspects herein, wherein the first surgical tool comprises a high speed drill.
[0010] Any of the aspects herein, wherein the robotic arm comprises a plurality of navigation markers, and wherein tracking the robotic arm comprises identifying a pose of the plurality of navigation markers.
[0011] Any of the aspects herein, further comprising: a second robotic arm couplable to a second surgical tool, wherein the memory stores further data that, when processed by the processor, enables the processor to track the second robotic arm as the second surgical tool resects the second portion of the anatomical element.
[0012] Any of the aspects herein, wherein the alert is rendered to a display.
[0013] Any of the aspects herein, wherein the second portion of the anatomical element is positioned between the first portion of the anatomical element and a second anatomical element.
[0014] A surgical system according to at least one embodiment of the present disclosure comprises: a processor; and a memory storing data thereon that, when executed by the processor, enable the processor to: receive a three-dimensional (3D) model of an anatomical element; determine, based on the 3D model, a plane that at least partially bisects the anatomical element into a first portion and a second portion; track a first surgical tool as the first surgical tool resects the first portion of the anatomical element; determine, based on the tracking, that the first surgical tool has reached the plane; generate, when the first surgical tool has reached the plane, an alert; and track a second surgical tool as the second surgical tool resects the second portion of the anatomical element.
[0015] Any of the aspects herein, wherein the memory stores further data that, when processed by the processor, enables the processor to: update, after at least one of the first surgical tool resecting the first portion of the anatomical element and the second surgical tool resecting the second portion of the anatomical element, a depiction of the anatomical element rendered to a display.
[0016] Any of the aspects herein, wherein the first surgical tool comprises a high speed drill, and wherein the second surgical tool comprises an oscillating drill.
[0017] Any of the aspects herein, wherein the memory stores further data that, when processed by the processor, enables the processor to: disable the first surgical tool when the first surgical tool reaches the plane.
[0018] Any of the aspects herein, wherein the first surgical tool comprises a first set of navigation markers, wherein the second surgical tool comprises a second set of navigation markers, wherein tracking the first surgical tool comprises identifying a pose of the first set of navigation markers, and wherein tracking the second surgical tool comprises identifying a pose of the second set of navigation markers.
[0019] Any of the aspects herein, wherein the first surgical tool is connectable to a robotic arm, and wherein the robotic arm manipulates the first surgical tool to resect the first portion of the anatomical element.
[0020] Any of the aspects herein, wherein the second surgical tool is connectable to the robotic arm, and wherein the robotic arm manipulates the second surgical tool to resect the second portion of the anatomical element.
[0021] Any of the aspects herein, wherein the second surgical tool is connectable to a second robotic arm, and wherein the second robotic arm manipulates the second surgical tool to resect the second portion of the anatomical element.
[0022] A method according to at least one embodiment of the present disclosure comprises: receiving a three-dimensional (3D) model of an anatomical element; determining, based on a surgical plan and the 3D model, a plane that bisects the anatomical element into a first portion and a second portion; tracking a first surgical tool as the first surgical tool resects the first portion of the anatomical element; determining, based on tracking, that the first surgical tool has reached the plane;
[0023] generating, when the first surgical tool has reached the plane, an alert; and tracking a second surgical tool as the second surgical tool resects the second portion of the anatomical element.
[0024] Any of the aspects herein, further comprising: updating, after at least one of the resecting of the first portion and the resecting of the second portion, a depiction of the anatomical element rendered to a display.
[0025] Any of the aspects herein, wherein the first surgical tool comprises a high speed drill, and wherein the second surgical tool comprises an oscillating drill.
[0026] Any aspect in combination with any one or more other aspects.
[0027] Any one or more of the features disclosed herein.
[0028] Any one or more of the features as substantially disclosed herein.
[0029] Any one or more of the features as substantially disclosed herein in combination with any one or more other features as substantially disclosed herein.
[0030] Any one of the aspects/features/embodiments in combination with any one or more other aspects/features/embodiments.
[0031] Use of any one or more of the aspects or features as disclosed herein.
[0032] It is to be appreciated that any feature described herein can be claimed in combination with any other feature(s) as described herein, regardless of whether the features come from the same described embodiment.
[0033] The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
[0034] The phrases “at least one”, “one or more”, and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together. When each one of A, B, and C in the above expressions refers to an element, such as X, Y, and Z, or class of elements, such as X1-Xn, Y1-Ym, and Z1-Zo, the phrase is intended to refer to a single element selected from X, Y, and Z, a combination of elements selected from the same class (e.g., X1 and X2) as well as a combination of elements selected from two or more classes (e.g., Y1 and Zo).
[0035] The term “a” or “an” entity refers to one or more of that entity. As such, the terms “a” (or “an”), “one or more” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising”, “including”, and “having” can be used interchangeably.
[0036] The preceding is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor exhaustive overview of the disclosure and its various aspects, embodiments, and configurations. It is intended neither to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure but to present selected concepts of the disclosure in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other aspects, embodiments, and configurations of the disclosure are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
[0037] Numerous additional features and advantages of the present disclosure will become apparent to those skilled in the art upon consideration of the embodiment descriptions provided hereinbelow.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
[0038] The accompanying drawings are incorporated into and form a part of the specification to illustrate several examples of the present disclosure. These drawings, together with the description, explain the principles of the disclosure. The drawings simply illustrate preferred and alternative examples of how the disclosure can be made and used and are not to be construed as limiting the disclosure to only the illustrated and described examples. Further features and advantages will become apparent from the following, more detailed, description of the various aspects, embodiments, and configurations of the disclosure, as illustrated by the drawings referenced below.
[0039] Fig. 1A is a block diagram of aspects of a system according to at least one embodiment of the present disclosure;
[0040] Fig. IB is a block diagram of additional aspects of the system according to at least one embodiment of the present disclosure;
[0041] Fig. 2A is an illustration of surgical tools of the system for resecting portions of an anatomical element according to at least one embodiment of the present disclosure;
[0042] Fig. 2B is an illustration of a first surgical tool of the system resecting a first portion of an anatomical element according to at least one embodiment of the present disclosure;
[0043] Fig. 2C is an illustration of a second surgical tool of the system resecting a second portion of an anatomical element according to at least one embodiment of the present disclosure;
[0044] Fig. 3A depicts a two robotic arm configuration of the system according to at least one embodiment of the present disclosure;
[0045] Fig. 3B depicts the two robotic arm configuration resecting portions of an anatomical element according to at least one embodiment of the present disclosure; and
[0046] Fig. 4 is a flowchart according to at least one embodiment of the present disclosure.
DETAILED DESCRIPTION
[0047] It should be understood that various aspects disclosed herein may be combined in different combinations than the combinations specifically presented in the description and accompanying drawings. It should also be understood that, depending on the example or embodiment, certain acts or events of any of the processes or methods described herein may be performed in a different sequence, and/or may be added, merged, or left out altogether (e.g., all described acts or events may not be necessary to carry out the disclosed techniques according to different embodiments of the present disclosure). In addition, while certain aspects of this disclosure are described as being performed by a single module or unit for purposes of clarity, it should be understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a computing device and/or a medical device.
[0048] In one or more examples, the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions). Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
[0049] Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or 10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
[0050] Before any embodiments of the disclosure are explained in detail, it is to be understood that the disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Further, the present disclosure may use examples to illustrate one or more aspects thereof. Unless explicitly stated otherwise, the use or listing of one or more examples (which may be denoted by “for example,” “by way of example,” “e.g.,” “such as,” or similar language) is not intended to and does not limit the scope of the present disclosure.
[0051] The terms proximal and distal are used in this disclosure with their conventional medical meanings, proximal being closer to the operator or user of the system, and further from the region of surgical interest in or on the patient, and distal being closer to the region of surgical interest in or on the patient, and further from the operator or user of the system.
[0052] Bone cut drills can be prone to skiving, but tactile feedback may alert the user when reaching the distal cortex. High speed drills (HSDs) can reduce or prevent skiving issues, but may inadvertently breach the distal cortex of the vertebra. Oscillating drills (OSDs) operate more slowly than HSDs when performing bone cuts, but have a lower likelihood of breaching the distal cortex and puncturing the nerve sack.
[0053] According to embodiments of the present disclosure, a workflow for performing a surgical task (e.g., a bone cut) comprises scanning a patient on the operating room table. The scan may be used to generate a three-dimensional (3D) model mesh for the anatomical elements (e.g., vertebrae) of the patient. One or more bone cuts may be planned using the 3D model mesh. The planning may include estimating the width of one or more portions of the anatomical elements, such as the width of the distal cortex. A fast and accurate bone cut using an HSD may be performed until the distal cortex is reached. When the HSD is used manually by the user, a navigation system may track the pose of the HSD using navigation markers attached to the HSD. Based on the tracking and the known position of the patient, the navigation system may determine when the HSD has reached the distal cortex. When a robotic arm is used to navigate the HSD, the navigation system may track the pose of the robotic arm and/or the HSD using navigation markers connected to the robotic arm and/or the HSD. When the HSD reaches the distal cortex, the navigation system may mechanically stop the movement of the robotic arm. When two robotic arms are used in conjunction with the navigation system to perform the bone cut, a first robotic arm may operate the HSD as the HSD drills into the anatomical element until the distal cortex is reached.
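As a hedged sketch of the robot-held variant of this workflow, the loop below advances until the tracked drilled depth reaches the planned depth of the distal cortex, then stops the arm. The `arm`, `navigation`, and `plan` objects are invented placeholders, and tip positions are assumed to be numpy arrays expressed in one known frame.

```python
# A minimal sketch, assuming invented `arm`, `navigation`, and `plan`
# placeholders; tip positions are numpy arrays in a known frame.
import time

import numpy as np


def run_hsd_until_distal_cortex(arm, navigation, plan, poll_s=0.02):
    entry_tip = navigation.tool_tip_position()   # tip at the start of the cut
    arm.begin_planned_cut(plan)                  # start the planned motion
    while True:
        tip = navigation.tool_tip_position()
        drilled_mm = float(np.linalg.norm(tip - entry_tip))
        if drilled_mm >= plan.depth_to_distal_cortex_mm:
            arm.stop()                           # mechanical stop at the cortex
            break
        time.sleep(poll_s)                       # next tracking update
```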
[0054] Once the distal cortex is reached, an OSD may be used to remove the distal cortex. When the OSD is used manually by the user, a navigation system may track the pose of the OSD using navigation markers attached to the OSD. Based on the tracking and the known position of the patient, the navigation system may determine when the OSD has removed the distal cortex. When a robotic arm is used to navigate the OSD, the navigation system may track the pose of the robotic arm and/or the OSD using navigation markers connected to the robotic arm and/or the OSD. Once the OSD removes the distal cortex, the navigation system may mechanically stop the movement of the robotic arm. When two robotic arms are used in conjunction with the navigation system to perform the bone cut, a second robotic arm may operate the OSD as the OSD drills into the anatomical element until the distal cortex is removed.
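By way of non-limiting illustration, the control logic of this two-phase workflow may be sketched in Python as follows. The plan, navigation, and ui objects and their methods (e.g., track, tool_at, stop_arm, alert) are hypothetical placeholders assumed for illustration only; they do not correspond to the interface of any particular navigation system.

    import time

    def perform_bone_cut(plan, navigation, ui):
        """Run the HSD phase to the distal cortex, then the OSD phase.

        plan: planned cut, including the cortex boundary estimated from the
        3D model mesh; navigation: assumed object that tracks tool poses and
        can halt a robotic arm; ui: assumed object that raises user alerts.
        """
        navigation.track("HSD")                          # track the HSD markers
        while not navigation.tool_at(plan.cortex_boundary):
            time.sleep(0.01)                             # HSD resects; poses stream in
        navigation.stop_arm()                            # halt at the distal cortex
        ui.alert("Distal cortex reached: exchange the HSD for the OSD")

        navigation.track("OSD")                          # track the OSD markers
        while not navigation.tool_at(plan.cut_end):
            time.sleep(0.01)                             # OSD removes the distal cortex
        navigation.stop_arm()
        ui.alert("Distal cortex removed")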
[0055] Embodiments of the present disclosure provide technical solutions to one or more of the problems of (1) tool skiving, (2) extended time spent by the patient in an operating room, and (3) inaccurate resection of anatomical elements.

[0056] Turning first to Figs. 1A-1B, aspects of a system 100 according to at least one embodiment of the present disclosure are shown. The system 100 may be used to track and/or navigate surgical tools while the surgical tools perform one or more surgical tasks; control, pose, and/or otherwise manipulate a surgical mount system, a surgical arm, and/or surgical tools attached thereto; and/or carry out one or more other aspects of the method described herein. The system 100 comprises a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, a cloud or other network 134, a high speed drill (HSD) 148, and/or an oscillating drill (OSD) 152. Systems according to other embodiments of the present disclosure may comprise more or fewer components than the system 100. For example, the system 100 may not include the imaging device 112, the robot 114, one or more components of the computing device 102, the database 130, and/or the cloud 134.
[0057] The computing device 102 comprises a processor 104, a memory 106, a communication interface 108, and a user interface 110. Computing devices according to other embodiments of the present disclosure may comprise more or fewer components than the computing device 102.
[0058] The processor 104 of the computing device 102 may be any processor described herein or any similar processor. The processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud 134. The processor 104 may be or comprise one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or A10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
[0059] The memory 106 may be or comprise RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions. The memory 106 may store information or data useful for completing, for example, any step of the method 400 described herein, or of any other methods. The memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114. For instance, the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enable image processing 120, segmentation 122, transformation 124, and/or registration 128. Such content, if provided as instructions, may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines. Alternatively or additionally, the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the processor 104 to carry out the various methods and features described herein. Thus, although various contents of memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models. The data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud 134.
[0060] The surgical plan 136 may be or comprise patient information (e.g., images of anatomy of the patient, patient height and weight, etc.), planned surgical tasks (e.g., information about the types of intraoperative images to be captured, information about a path of one or more bone cuts, etc.), combinations thereof, and/or the like. Information from the surgical plan 136 may be rendered to a display (e.g., the user interface 110) for the user of the system 100 (e.g., a physician) to view. One or more aspects of the surgical plan 136 may be modifiable by the user via, for example, the user interface 110. For example, the planned bone cut of an anatomical element stored in the surgical plan 136 may be rendered in conjunction with a 3D model of the anatomical element to the display, and the user may be able to edit the characteristics of the bone cut (e.g., the trajectory of the bone cut, the dimensions of the bone cut, the type of surgical tool used for the bone cut, etc.) through inputs to the user interface 110. In one embodiment, the user may be able to manipulate a virtual representation of the bone cut and/or the anatomical element by interacting with a touchscreen (e.g., the user can draw in the shape of the desired bone cut along the 3D model of the anatomical element). The changed characteristics of the bone cut and/or the anatomical element may then be saved to the database 130, the surgical plan 136, combinations thereof, and/or the like.

[0061] The surgical tool information 140 may contain information about the parameters of one or more of the surgical tools used in the surgery or surgical procedure (e.g., the rotation speed of the operative end of the surgical tool, the electrical power requirements of the surgical tool, the dimensions of the surgical tool, etc.). The surgical tool information 140 may also specify which type of surgical tool is to be used at each step in the surgery or surgical procedure. For example, the surgical tool information 140 may specify that the HSD 148 is to be used in a first surgical step of resecting cortical bone, and that the OSD 152 is to be used in a second surgical step of resecting cancellous/trabecular bone. In some embodiments, the surgical tool information 140 may be modifiable by the user (e.g., a physician) based on inputs to the user interface 110. In some embodiments, the surgical tool information 140 may be part of the surgical plan 136.
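As a purely illustrative sketch of how the surgical tool information 140 might be organized in memory, consider the following Python example. The field names and numeric values are assumptions made for illustration only and are not specifications of any actual surgical tool.

    from dataclasses import dataclass

    @dataclass
    class SurgicalToolInfo:
        """Illustrative record mirroring the surgical tool information 140."""
        name: str                   # e.g., "HSD 148" or "OSD 152"
        operative_speed_rpm: float  # rotation speed of the operative end
        power_watts: float          # electrical power requirement
        tip_diameter_mm: float      # a dimension of the surgical tool
        assigned_step: str          # surgical step in which the tool is used

    # Hypothetical entries; the numbers are placeholders, not specifications.
    tool_info = [
        SurgicalToolInfo("HSD 148", 60000.0, 250.0, 3.0, "resect cortical bone"),
        SurgicalToolInfo("OSD 152", 12000.0, 180.0, 3.0, "resect cancellous/trabecular bone"),
    ]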
[0062] The mesh model 144 may be or comprise a model of one or more anatomical elements that are the subject of the surgery or surgical procedure. For example, when the surgery or surgical procedure comprises a bone cut performed on a vertebra, the mesh model 144 may be or comprise a mesh of the vertebra. The mesh model 144 may be generated by the computing device 102 based on one or more images captured using the imaging device 112, image information taken from the surgical plan 136, information retrieved from the database 130, combinations thereof, and/or the like. For example, the imaging device 112 may generate a CT scan of the patient including the target vertebra. The computing device 102 may receive the image data from the imaging device 112 and, using image processing 120 and segmentation 122, generate the mesh model 144. In some embodiments, the computing device 102 may use one or more artificial intelligence (AI) data models that take the image data as an input and output the mesh model 144. Such data models may be trained on historical data sets of similar image data. The mesh model 144 may be rendered to a display, where the user may be able to edit or manipulate the mesh model 144 via, for example, inputs to the user interface 110. In some embodiments, the mesh model 144 may be part of the surgical plan 136.
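By way of non-limiting example, a simplified mesh-generation step might resemble the following Python sketch. The sketch assumes the scikit-image library is available, and the simple intensity threshold stands in for the more robust segmentation (or trained AI data model) a clinical implementation would use.

    import numpy as np
    from skimage import measure  # assumption: scikit-image is available

    def ct_to_mesh(volume, bone_threshold_hu, voxel_spacing_mm):
        """Segment a CT volume and extract a triangle mesh (marching cubes).

        A minimal stand-in for image processing 120 and segmentation 122;
        the threshold-based bone segmentation is illustrative only.
        """
        mask = volume >= bone_threshold_hu            # crude bone segmentation
        verts, faces, _normals, _values = measure.marching_cubes(
            mask.astype(np.float32), level=0.5, spacing=voxel_spacing_mm)
        return verts, faces                           # vertices in mm, face indices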
[0063] The communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100), and/or for transmitting instructions, images, or other information to an external system or device (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100). The communication interface 108 may comprise one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth). In some embodiments, the communication interface 108 may be useful for enabling the device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
[0064] The computing device 102 may also comprise one or more user interfaces 110. The user interface 110 may be or comprise a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user. The user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100. In some embodiments, the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
[0065] Although the user interface 110 is shown as part of the computing device 102, in some embodiments, the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102. In some embodiments, the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
[0066] The imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.). “Image data” as used herein refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form. In various examples, the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof. The image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure. In some embodiments, a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time. The imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data. The imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient. The imaging device 112 may be contained entirely within a single housing, or may comprise a transmitter/emitter and a receiver/detector that are in separate housings or are otherwise physically separated.
[0067] In some embodiments, the imaging device 112 may comprise more than one imaging device 112. For example, a first imaging device may provide first image data and/or a first image, and a second imaging device may provide second image data and/or a second image. In still other embodiments, the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein. The imaging device 112 may be operable to generate a stream of image data. For example, the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images. For purposes of the present disclosure, unless specified otherwise, image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
[0068] The robot 114 may be any surgical robot or surgical robotic system. The robot 114 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system. The robot 114 may be configured to position the imaging device 112 at one or more precise position(s) and orientation(s), and/or to return the imaging device 112 to the same position(s) and orientation(s) at a later point in time. The robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task. In some embodiments, the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure. The robot 114 may comprise one or more robotic arms 116. In some embodiments, the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms. In some embodiments, one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112. In embodiments where the imaging device 112 comprises two or more physically separate components (e.g., a transmitter and receiver), one robotic arm 116 may hold one such component, and another robotic arm 116 may hold another such component. Each robotic arm 116 may be positionable independently of the other robotic arm. The robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
[0069] The robot 114, together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
[0070] The robotic arm(s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm).
[0071] In some embodiments, reference markers (e.g., navigation markers) may be placed on the robot 114 (including, e.g., on the robotic arm 116), the imaging device 112, or any other object in the surgical space. The reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof. In some embodiments, the navigation system 118 can be used to track other components of the system (e.g., imaging device 112) and the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example).

[0072] The navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation. The navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof. The navigation system 118 may include one or more cameras or other sensor(s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located. The one or more cameras may be optical cameras, infrared cameras, or other cameras. In some embodiments, the navigation system 118 may comprise one or more electromagnetic sensors. In various embodiments, the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing). The navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118. The navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
[0073] The database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system). The database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the system 100); one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information. The database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 134. In some embodiments, the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
[0074] The cloud 134 may be or represent the Internet or any other wide area network. The computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both. In some embodiments, the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
[0075] The HSD 148 may be configured to drill, burr, mill, cut, saw, ream, tap, etc. into anatomical tissues such as patient anatomy (e.g., soft tissues, bone, etc.). In some embodiments, the system 100 may comprise the HSD 148, the OSD 152, and/or multiple other surgical tools, with each surgical tool performing a different surgical task (e.g., the HSD 148 for drilling through trabecular bone of a vertebra, the OSD 152 for cutting through the distal cortex of the vertebra at a slower rate than the HSD 148, etc.). In other embodiments, the system 100 may comprise a surgical tool with an adapter interface to which different working ends can be attached to perform multiple different types of surgical maneuvers (e.g., the surgical tool may be able to receive one or more different tool bits, such that the surgical tool can drill, mill, cut, saw, ream, tap, etc. depending on the tool bit coupled with the surgical tool). In one embodiment, the surgical tool may comprise both the HSD 148 and the OSD 152, and may implement the HSD 148 during a first surgical task and the OSD 152 during a second, different surgical task. The HSD 148 may be operated autonomously (e.g., when the HSD 148 is connected to and manipulated by the robotic arm 116) or semi-autonomously, such as when the HSD 148 is manipulated by a user with guidance from the navigation system 118.
[0076] In some embodiments, the HSD 148 may be attached to a robotic arm 116, such that movement of the robotic arm 116 correspondingly causes movement in the HSD 148. In other words, the HSD 148 may be gripped, held, or otherwise coupled to and controlled by the robotic arm 116. As such, the pose (e.g., position and orientation) of the HSD 148 may be controlled by the pose of the robotic arm 116. The HSD 148 can be controlled by one or more components of the system 100, such as the computing device 102. In some embodiments, the computing device 102 may be capable of receiving or retrieving data or other information (e.g., from the database 130, from one or more sensors, from the imaging device 112, etc.), processing the information, and controlling the HSD 148 based on the processed information. Additionally or alternatively, the navigation system 118 may track the position of and/or navigate the HSD 148. Such tracking may enable the system 100 or components thereof (e.g., the computing device 102) to determine the pose of the HSD 148, the location of the HSD 148 relative to the planned bone cut, combinations thereof, and/or the like, as discussed in further detail below.
[0077] The OSD 152 may be configured to drill, burr, mill, cut, saw, ream, tap, etc. into anatomical tissues such as patient anatomy (e.g., soft tissues, bone, etc.). The OSD 152 may comprise an operative end that operates with oscillatory motion to resect anatomical tissue. In other words, the operative end may periodically reverse its direction of motion. For example, the operative end of the OSD 152 may alternate between rotating in a clockwise direction and a counterclockwise direction. Such oscillation may beneficially improve the resection of anatomical tissue while reducing the likelihood of damage to other proximate tissues. For example, the OSD 152 may be used to resect the distal cortex of a vertebra, and the oscillatory motion of the operative end of the OSD 152 may reduce the likelihood of breaching the nerve channel beneath the distal cortex. The OSD 152 may be operated autonomously (e.g., when the OSD 152 is connected to and manipulated by the robotic arm 116) or semi-autonomously, such as when the OSD 152 is manipulated by a user with guidance from the navigation system 118.
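A minimal sketch of such an alternating command signal, assuming illustrative amplitude and frequency values rather than specifications of the OSD 152, might look like the following Python example.

    import numpy as np

    def osd_speed_command(t_seconds, amplitude_rpm=3000.0, frequency_hz=80.0):
        """Angular-speed setpoint for an oscillating operative end.

        The sign alternates so the tip swings between clockwise and
        counterclockwise rotation; the amplitude and frequency shown are
        assumed placeholder values, not specifications of any actual drill.
        """
        return amplitude_rpm * np.sign(np.sin(2.0 * np.pi * frequency_hz * t_seconds))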
[0078] In some embodiments, the OSD 152 may be attached to a robotic arm 116, such that movement of the robotic arm 116 correspondingly causes movement in the OSD 152. In other words, the OSD 152 may be gripped, held, or otherwise coupled to and controlled by the robotic arm 116. As such, the pose (e.g., position and orientation) of the OSD 152 may be controlled by the pose of the robotic arm 116. The OSD 152 can be controlled by one or more components of the system 100, such as the computing device 102. In some embodiments, the computing device 102 may be capable of receiving or retrieving data or other information (e.g., from the database 130, from one or more sensors, from the imaging device 112, etc.), processing the information, and controlling the OSD 152 based on the processed information. Additionally or alternatively, the navigation system 118 may track the position of and/or navigate the OSD 152. Such tracking may enable the system 100 or components thereof (e.g., the computing device 102) to determine the pose of the OSD 152, the location of the OSD 152 relative to the planned bone cut, combinations thereof, and/or the like, as discussed in further detail below. In some embodiments, the HSD 148 may be connected to a first robotic arm, while the OSD 152 is connected to a second, different robotic arm. In such embodiments, each robotic arm may independently operate each surgical tool, with the navigation system 118 tracking both and the computing device 102 generating navigation paths of both robotic arms to avoid or mitigate the likelihood of collisions between the two.
[0079] The system 100 or similar systems may be used, for example, to carry out one or more aspects of the method 400 described herein. The system 100 or similar systems may also be used for other purposes.
[0080] With reference to Figs. 2A-2C, aspects of surgical tools of the system 100 performing resection of an anatomical element are shown in accordance with embodiments of the present disclosure. An HSD 204 and an OSD 208 may be positioned relative to an anatomical element 210. In some embodiments, the HSD 204 may be similar to or the same as the HSD 148, and the OSD 208 may be similar to or the same as the OSD 152. In some embodiments, the HSD 204 and the OSD 208 may be used to perform a bone cut 228. The bone cut 228 may be or comprise a cut through one or more portions of the anatomical element 210, such as through the outer cortical bone of a vertebra and through the distal cortex of the vertebra. In some embodiments, the HSD 204 may be used to perform the first cut on the outer cortical bone, while the OSD 208 may be used to perform the second cut to remove the distal cortex.
[0081] The HSD 204 may comprise a navigated portion 212 and the OSD 208 may comprise a navigated portion 216. The navigated portion 212 and the navigated portion 216 may respectively comprise navigation markers 220A-220D (including a first navigation marker 220A, a second navigation marker 220B, a third navigation marker 220C, and a fourth navigation marker 220D) and navigation markers 224A-224D (including a first navigation marker 224A, a second navigation marker 224B, a third navigation marker 224C, and a fourth navigation marker 224D). The navigation markers 220A-220D and the navigation markers 224A-224D may enable a navigation camera of the navigation system 118 to track the pose (e.g., position and orientation) of the HSD 204 and the OSD 208, respectively, as the HSD 204 and the OSD 208 move relative to the anatomical element 210. For example, the HSD 204 may be moved (e.g., using a robotic arm such as the robotic arm 116, manually by a user, etc.) relative to the anatomical element 210 as the operative portion of the HSD 204 resects one or more portions of the anatomical element 210. As the user or robotic arm 116 maneuvers the HSD 204 relative to the anatomical element 210, the navigation camera of the navigation system 118 may identify the navigation markers 220A-220D and track the pose of the navigated portion 212 (e.g., based on the pose of the navigation markers 220A-220D), such that the navigation system 118 can determine a pose of the HSD 204 in a known coordinate system. Similarly, the OSD 208 may be moved (e.g., using a robotic arm such as the robotic arm 116, manually by a user, etc.) relative to the anatomical element 210 as the operative portion of the OSD 208 resects one or more portions of the anatomical element 210. As the user or robotic arm 116 maneuvers the OSD 208 relative to the anatomical element 210, the navigation camera of the navigation system 118 may identify the navigation markers 224A-224D and track the pose of the navigated portion 216 (e.g., based on the pose of the navigation markers 224A-224D), such that the navigation system 118 can determine a pose of the OSD 208 in a known coordinate system. In some cases, the navigation system 118 may provide the pose information of the HSD 204 and/or the OSD 208 to the computing device 102, which may use the pose information to determine when use of the HSD 204 and/or the OSD 208 should be discontinued or stopped, as discussed in more detail below. In some embodiments, the navigation system 118 may provide updated pose information at a predetermined or user-specified interval (e.g., every 10 milliseconds (ms), every 20 ms, every 50 ms, every 2 seconds, etc.).
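By way of non-limiting example, a pose of a navigated portion may be recovered from corresponding marker positions using a rigid-body fit such as the Kabsch algorithm, as in the following Python sketch. The function name and array conventions are assumptions made for illustration, not a description of any particular navigation system's internals.

    import numpy as np

    def estimate_tool_pose(reference_markers, observed_markers):
        """Rigid transform (R, t) aligning a tool's known marker geometry to
        the marker positions seen by the navigation camera (Kabsch/SVD).

        reference_markers / observed_markers: corresponding (N, 3) arrays,
        e.g., the four markers 220A-220D; observed ~= reference @ R.T + t.
        """
        ref_c = reference_markers.mean(axis=0)
        obs_c = observed_markers.mean(axis=0)
        H = (reference_markers - ref_c).T @ (observed_markers - obs_c)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T                 # guard against reflections
        t = obs_c - R @ ref_c
        return R, t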
[0082] The bone cut 228 may be or comprise an indication of the portions of the anatomical element 210 that are to be operated on by the HSD 204 and/or the OSD 208. For example, the bone cut 228 may comprise a multi-dimensional shape (e.g., a 2D plane or a 3D volume) that specifies the area or volume of the anatomical element 210 that is to be resected or otherwise operated on by the HSD 204 and/or the OSD 208. In one example, the bone cut 228 comprises a plane 240 that divides the anatomical element 210 into a first portion 232 and a second portion 236. The first portion 232 may be or comprise a section of the anatomical element 210 that is to be resected using the HSD 204, while the second portion 236 may be or comprise a section of the anatomical element 210 that is to be resected using the OSD 208. In some embodiments, the plane 240 may be definable or editable by the user via, for example, the user interface 110. For instance, the user may be able to change the shape, orientation, dimensions, and/or the like of the plane 240 to change the shape, orientation, dimensions, and/or the like of the first portion 232 and/or the second portion 236. In some examples, the user may be able to draw a shape of the plane 240 on a mesh model (e.g., mesh model 144) or the anatomical element 210 rendered to a display, and the mesh model may be updated to incorporate the plane 240.
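A minimal Python sketch of partitioning a mesh model by such a plane follows, assuming an illustrative sign convention for which side counts as the first portion 232.

    import numpy as np

    def split_by_plane(vertices, plane_point, plane_normal):
        """Label mesh vertices as falling in the first or second portion.

        The plane (a point plus a normal) plays the role of the plane 240:
        vertices with positive signed distance are assigned to the first
        portion 232 and the remainder to the second portion 236. The sign
        convention is an assumption for illustration.
        """
        n = np.asarray(plane_normal, dtype=float)
        n /= np.linalg.norm(n)
        signed_distance = (np.asarray(vertices, dtype=float) - plane_point) @ n
        return signed_distance > 0.0       # boolean mask: True = first portion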
[0083] Once the bone cut 228 has been planned, the HSD 204 may be used to resect anatomical tissue associated with the first portion 232 of the anatomical element 210. As depicted in Fig. 2B, the HSD 204 may move laterally across the anatomical element 210 such that the operative portion of the HSD 204 resects anatomical tissue. The pose of the HSD 204 may be tracked using a navigation camera of the navigation system 118 that identifies a pose of the navigation markers 220A-220D on the navigated portion 212. The HSD 204 may be used to resect the first portion 232 of the anatomical element 210 until the operative portion of the HSD 204 reaches the plane 240. The computing device 102 may determine that the HSD 204 has reached the plane 240 based on the tracking of the pose of the HSD 204 as the HSD 204 resects the first portion 232. The computing device 102 may receive tracking and/or pose information from the navigation system 118, and may use such information and information from the surgical plan 136 (e.g., the dimensions of the first portion 232) to determine that the HSD 204 has reached the plane 240. For example, the surgical plan 136 may specify that the first portion 232 has a first thickness, and the computing device 102 may determine the HSD 204 would have a first pose in a known coordinate system when the HSD 204 has drilled through the first thickness. When the navigation system 118 tracking information specifies that the HSD 204 is in the first pose, the computing device 102 may determine that the first portion 232 has been resected and the HSD 204 has reached the plane 240.
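The stop condition itself may reduce to a signed-distance test, as in the following illustrative Python sketch; the tolerance value and the normal's orientation are assumptions for illustration, not prescribed parameters.

    import numpy as np

    def tool_reached_plane(tip_position, plane_point, plane_normal, tol_mm=0.5):
        """Return True when the tracked operative tip is at (or past) the plane 240.

        tip_position: tool tip in patient coordinates, derived from the tracked
        markers 220A-220D; the normal is assumed to point back toward the tool's
        approach side, so the signed distance shrinks as the tool advances.
        """
        n = np.asarray(plane_normal, dtype=float)
        n /= np.linalg.norm(n)
        return float((np.asarray(tip_position, dtype=float) - plane_point) @ n) <= tol_mm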
[0084] In response to determining that the HSD 204 has reached the plane 240, the computing device 102 may generate an alert. The alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), a haptic alert (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like. In one embodiment, a warning message may be rendered to the display to indicate to the user that the first portion 232 has been resected and that the operative portion of the HSD 204 has arrived at the plane 240. In some cases, the plane 240 may represent the boundary between two different types of anatomical tissue. For instance, the first portion 232 may comprise cortical bone that can be resected using the HSD 204, while the second portion 236 comprises anatomical tissues within the vicinity of critical anatomical structures (e.g., nerve bundles, the spinal cord, etc.) that should be resected using a different tool such as the OSD 208. In some embodiments, the computing device 102 may disable the HSD 204 once the computing device 102 has determined that the HSD 204 has reached the plane 240, and may prevent the HSD 204 from being used again unless the user overrides the disabling of the HSD 204 (e.g., via the user interface 110).
[0085] Once the plane 240 has been reached, the OSD 208 may be used to resect the second portion 236. In cases where the HSD 204 and/or the OSD 208 are operated manually by a user (e.g., a physician), the computing device 102 may render the alert to the display instructing the user to change the surgical tool in use (e.g., to switch out the HSD 204 for the OSD 208). In semi-autonomous cases, such as when a robotic arm such as the robotic arm 116 navigates the HSD 204 and the OSD 208, the alert may instruct the user to switch out the surgical tool connected to the robotic arm. For example, the robotic arm may navigate the HSD 204 until the operative portion of the HSD 204 reaches the plane 240, at which point the alert may be generated. The alert may be an instruction rendered to the display that instructs the user to detach the HSD 204 from the robotic arm and to attach the OSD 208. In some cases, the navigation system 118 may initiate and conduct an authentication process to ensure that the correct surgical tool (e.g., the OSD 208) has been connected to the robotic arm. For instance, the navigation system 118 may identify the navigation markers 224A-224D (which may be distinguishable by the navigation system 118 from the navigation markers 220A-220D of the HSD 204) to verify that the HSD 204 has been disconnected from the robotic arm and that the OSD 208 has been connected.
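One simple way such an authentication process might distinguish marker arrays is by their inter-marker distances, as in the following illustrative Python sketch; the function names and the matching tolerance are assumptions made for illustration.

    import numpy as np
    from itertools import combinations

    def marker_signature(points):
        """Sorted pairwise inter-marker distances: a simple geometric fingerprint."""
        return np.sort([np.linalg.norm(np.asarray(a) - np.asarray(b))
                        for a, b in combinations(points, 2)])

    def verify_attached_tool(observed_markers, expected_markers, tol_mm=0.5):
        """Check that the marker array seen by the navigation camera matches
        the geometry registered for the expected tool (e.g., the markers
        224A-224D of the OSD 208). tol_mm is an assumed tolerance.
        """
        delta = marker_signature(observed_markers) - marker_signature(expected_markers)
        return bool(np.all(np.abs(delta) <= tol_mm))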
[0086] The OSD 208 may then be used to resect the second portion 236. The pose of the OSD 208 may be tracked using the navigation camera of the navigation system 118 that identifies a pose of the navigation markers 224A-224D on the navigated portion 216. The computing device 102 may determine that the OSD 208 has sufficiently resected the second portion 236 based on tracking the pose of the OSD 208. The computing device 102 may receive tracking and/or pose information from the navigation system 118, and may use such information and information from the surgical plan 136 (e.g., the dimensions of the second portion 236) to determine that the OSD 208 has sufficiently resected the second portion 236. For example, the surgical plan 136 may specify that the second portion 236 has a first thickness, and the computing device 102 may determine the OSD 208 would have a first pose in a known coordinate system when the OSD 208 has drilled through the first thickness. When the navigation system 118 tracking information specifies that the OSD 208 is in the first pose, the computing device 102 may determine that the second portion 236 has been resected. Once the OSD 208 has resected the second portion 236, the computing device 102 may generate a second alert that indicates that the second portion 236 has been resected. The alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), a haptic alert (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like. In some embodiments, the computing device 102 may disable use of the OSD 208 once the computing device 102 has determined that the second portion 236 has been resected and has generated the second alert.
[0087] As the HSD 204 and/or the OSD 208 are used to resect anatomical tissue, a depiction of the anatomical element 210 on a display may be changed or otherwise updated to reflect the removal of anatomical tissue. The updating may occur while the HSD 204 and the OSD 208 resect portions of the anatomical element 210 and/or after each step of the surgery or surgical procedure. For example, the mesh model 144 may be rendered to a display, and after the HSD 204 has resected the first portion 232, the mesh model 144 may be updated to depict the anatomical element 210 with the first portion 232 removed. Similarly, once the OSD 208 has resected the second portion 236, the mesh model 144 may be again updated to depict the anatomical element 210 with the second portion 236 removed. In some embodiments, the computing device 102 may generate a new model after each resection or step in the surgery or surgical procedure. For example, after the HSD 204 has resected the first portion 232, the computing device 102 may use image processing 120 and segmentation 122 to segment out the depiction of the first portion 232 from the mesh model 144, and render the mesh model 144 as a new model to the display.
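By way of non-limiting example, a display-side model update might be sketched in Python as follows, where the mesh is represented by vertex and face arrays and the resected portion is removed by masking. This is a deliberate simplification of re-running segmentation 122 on the updated model; the function and parameter names are illustrative assumptions.

    import numpy as np

    def remove_resected_portion(vertices, faces, keep_mask):
        """Drop a resected portion from a rendered triangle mesh.

        keep_mask flags the vertices that remain after resection (e.g., the
        complement of the first portion 232). Faces that touch any removed
        vertex are discarded and the surviving vertex indices are remapped.
        """
        keep_mask = np.asarray(keep_mask, dtype=bool)
        faces = np.asarray(faces)
        surviving = keep_mask[faces].all(axis=1)   # face survives only if all
        faces = faces[surviving]                   # three vertices survive
        remap = np.cumsum(keep_mask) - 1           # old index -> new index
        return np.asarray(vertices)[keep_mask], remap[faces]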
[0088] With reference to Figs. 3A-3B, a two robotic arm configuration of the system 100 is shown in accordance with embodiments of the present disclosure. The two robotic arm configuration may be disposed proximate a patient 308 disposed on a table 304. The table 304 is configured to support the patient 308 during a surgery or surgical procedure. The table 304 may include any accessories mounted to or otherwise coupled to the table 304 such as, for example, a bed rail, a bed rail adaptor, an arm rest, an extender, or the like. The table 304 may be stationary or may be operable to maneuver the patient 308 (e.g., the table 304 may be able to move). In some embodiments, the table 304 has two positioning degrees of freedom and one rotational degree of freedom, which allows positioning of the specific anatomy of the patient anywhere in space (within a volume defined by the limits of movement of the table 304). For example, the table 304 can slide forward and backward and from side to side, and can tilt (e.g., around an axis positioned between the head and foot of the table 304 and extending from one side of the table 304 to the other) and/or roll (e.g., around an axis positioned between the two sides of the table 304 and extending from the head of the table 304 to the foot thereof). In other embodiments, the table 304 can bend at one or more areas (which bending may be possible due to, for example, the use of a flexible surface for the table 304, or by physically separating one portion of the table 304 from another portion of the table 304 and moving the two portions independently). In at least some embodiments, the table 304 may be manually moved or manipulated by, for example, a surgeon or other user, or the table 304 may comprise one or more motors, actuators, and/or other mechanisms configured to enable movement and/or manipulation of the table 304 by a processor such as the processor 104.
[0089] The two robotic arm configuration comprises a first robotic arm 316 and a second robotic arm 320. Each of the first robotic arm 316 and the second robotic arm 320 may be similar to or the same as the robotic arm 116. The first robotic arm 316 may be couplable to a first surgical tool 324, and the second robotic arm 320 may be couplable to the second surgical tool 328. The first surgical tool 324 may be similar to or the same as the HSD 148 and/or the HSD 204, while the second surgical tool 328 may be similar to or the same as the OSD 152 and/or the OSD 208. In some cases, the first robotic arm 316 and the second robotic arm 320 may each be couplable to one or more other surgical tools. In one example, the first robotic arm 316 and the second robotic arm 320 may each comprise a tool changer that enables the first robotic arm 316 and the second robotic arm 320 to be coupled to different surgical tools, such as to an HSD or to an OSD. In this way, the first robotic arm 316 and the second robotic arm 320 can each provide both HSD and OSD drilling capabilities to a respective side of the patient. In other words, the first robotic arm 316 provides HSD and OSD drilling capabilities to a first side of the patient 308, while the second robotic arm 320 provides HSD and OSD drilling capabilities to a second side of the patient 308. In some examples, the first robotic arm 316, the second robotic arm 320, the first surgical tool 324, and/or the second surgical tool 328 may be controlled by the computing device 102 based on information received from the navigation system 118, without requiring additional user input. In some cases, however, the user may be able to monitor the operation of the system 100 and intervene with the operation of the first robotic arm 316, the second robotic arm 320, the first surgical tool 324, and/or the second surgical tool 328 at any time. In other words, the two robotic arm configuration may comprise manual override mechanisms that enable the user of the system 100 to change or stop operation of the various components of the two robotic arm configuration at any time.
[0090] The first surgical tool 324 and the second surgical tool 328 may be navigated by the computing device 102 and/or the navigation system 118 within a working volume 312. The working volume 312 may be or comprise a predefined volume in which the first surgical tool 324 and the second surgical tool 328 operate. In some embodiments, the computing device 102 may generate one or more navigation paths for the first robotic arm 316 and/or the second robotic arm 320 to avoid or mitigate the risk of collision between the first robotic arm 316 (or components thereof) and the second robotic arm 320 (or components thereof) within the working volume 312. In some embodiments, the first robotic arm 316 may be disposed on a first side of the patient 308, and the second robotic arm 320 may be disposed on an opposite side of the patient 308. The computing device 102 may lock or otherwise disable the use of one robotic arm while the other is in use, and vice versa. For example, when the first robotic arm 316 is used to perform a drilling step in the surgical procedure, the second robotic arm 320 and components thereof (e.g., the second surgical tool 328) may be locked in place within the working volume 312. Additionally or alternatively, the second robotic arm 320 and components thereof may be moved outside of the working volume 312 so as to not interfere with navigation of the first robotic arm 316 and components thereof.
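A deliberately simplified Python sketch of such mutual exclusion over the working volume 312 follows; a real system would combine an interlock of this kind with planned collision-free navigation paths rather than substitute for them, and the class and method names are assumptions made for illustration.

    import threading

    class WorkingVolumeLock:
        """Grant the shared working volume 312 to one robotic arm at a time."""

        def __init__(self):
            self._lock = threading.Lock()
            self.active_arm = None

        def acquire(self, arm_name):
            self._lock.acquire()          # blocks while the other arm drills
            self.active_arm = arm_name

        def release(self):
            self.active_arm = None
            self._lock.release()

    # Hypothetical usage: the first arm drills while the second is locked out.
    # volume = WorkingVolumeLock()
    # volume.acquire("first robotic arm 316"); drill(); volume.release()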
[0091] The first robotic arm 316 and the second robotic arm 320 may be used to perform an autonomous bone cut of an anatomical element of the patient 308. For example, the first robotic arm 316 and the second robotic arm 320 may be used to drill a hole for a pedicle screw in a vertebra 332 of the patient 308. The planned drilling may comprise drilling down to a first depth 336 using the first surgical tool 324, and then drilling down to a second depth 340 using the second surgical tool 328. In some embodiments, the use of different surgical tools may account for differences in the composition and/or sensitivity of anatomical tissues of the vertebra 332 along the trajectory. In some embodiments, one or more multi-dimensional shapes (e.g., a 2D plane or a 3D volume) may be used to represent the different depths, such as when a 2D plane bisects the drilling depths into two portions: a first portion through which the first surgical tool 324 is to drill, and a second portion through which the second surgical tool 328 is to drill. In some embodiments, the multi-dimensional shape may be definable or editable by the user via, for example, the user interface 110. For instance, the user may be able to change the shape, orientation, dimensions, and/or the like of the multi-dimensional shape to change the shape, orientation, dimensions, and/or the like of the drilling depths. For example, the user may be able to draw the multi-dimensional shape on a mesh model (e.g., mesh model 144) or the vertebra 332 rendered to a display, and the mesh model may be updated to incorporate the multi-dimensional shape. Additionally or alternatively, the user may be able to define and/or edit the trajectory of the drilling into the vertebra 332.
[0092] Once the drilling is planned, the first surgical tool 324 may be used to drill through the anatomical tissue in the vertebra 332 along the planned trajectory. As depicted in Fig. 3B, the first surgical tool 324 may be aligned with the planned trajectory (e.g., by the computing device 102 adjusting the pose of the first robotic arm 316 by actuating one or more motors in the joints of the first robotic arm 316) and the first surgical tool 324 may begin drilling into the vertebra 332. The pose of the first robotic arm 316 and/or the first surgical tool 324 may be tracked using a navigation camera of the navigation system 118 that identifies a pose of the navigation markers attached to the first robotic arm 316 and/or the first surgical tool 324. The first surgical tool 324 may drill into the vertebra 332 until the first depth 336 is reached.
[0093] The computing device 102 may determine that the first surgical tool 324 has reached the first depth 336 based on the tracking of the pose of the first surgical tool 324 as the first surgical tool 324 drills into the vertebra 332. The computing device 102 may receive tracking and/or pose information from the navigation system 118, and may use such information and information from the surgical plan 136 (e.g., the first depth 336) to determine that the first surgical tool 324 has reached the first depth 336. For example, the surgical plan 136 may specify the first depth 336, and the computing device 102 may determine that the first surgical tool 324 would have a first pose in a known coordinate system when the first surgical tool 324 has drilled to the first depth 336. When the navigation system 118 tracking information specifies that the first surgical tool 324 is in the first pose, the computing device 102 may determine that the first surgical tool 324 has drilled down to the first depth 336.
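Because the trajectory is planned, the depth determination may reduce to a scalar projection of the tracked tip position onto the trajectory direction, as in the following illustrative Python sketch; the function and parameter names are assumptions made for illustration.

    import numpy as np

    def drilled_depth_mm(tip_position, entry_point, trajectory_direction):
        """Current depth of the tracked tool tip along the planned trajectory.

        tip_position: tip in patient coordinates from navigation tracking;
        entry_point / trajectory_direction: the planned entry and drilling
        direction into the vertebra 332. Comparing the returned value with
        the first depth 336 (and later the second depth 340) reduces the
        stop condition to a scalar projection.
        """
        d = np.asarray(trajectory_direction, dtype=float)
        d /= np.linalg.norm(d)
        return float((np.asarray(tip_position, dtype=float) - entry_point) @ d)

    # Hypothetical handoff condition for the two robotic arm configuration:
    # if drilled_depth_mm(tip, entry, direction) >= first_depth_mm:
    #     stop and retract the first robotic arm 316; bring in the second.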
[0094] In response to determining that the first surgical tool 324 has reached the first depth 336, the computing device 102 may generate an alert. The alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), a haptic alert (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like. In one embodiment, a warning message may be rendered to the display to indicate to the user that the first depth 336 has been reached by the first surgical tool 324. In some cases, the first depth 336 may be a depth at which a different type of anatomical tissue is encountered, such that a different surgical tool should be used. For instance, up to the first depth 336 the first surgical tool 324 may be drilling through pedicle bone, while anatomical tissue encountered along the trajectory after the first depth 336 may comprise softer anatomical tissues (e.g., cancellous bone) and/or may be within a threshold distance of critical anatomical structures (e.g., nerve bundles), such that a different surgical tool such as the second surgical tool 328 should be used. In some embodiments, the computing device 102 may disable the first surgical tool 324 once the computing device 102 has determined that the first depth 336 has been reached, and may move the first robotic arm 316 and/or the first surgical tool 324 out of the working volume 312. After the first robotic arm 316 and/or the first surgical tool 324 have been removed from the working volume 312, the computing device 102 may cause the second robotic arm 320 to move into the working volume 312 such that the second surgical tool 328 can continue drilling along the trajectory of the planned drilling step. In other words, the second robotic arm 320 may be manipulated such that the second surgical tool 328 is disposed to drill along the planned trajectory past the first depth 336 to a second depth 340.
[0095] The second surgical tool 328 may be used to drill past the first depth 336 and down to the second depth 340. The second robotic arm 320 and/or the second surgical tool 328 may be tracked using a navigation camera of the navigation system 118 that identifies a pose of the navigation markers attached to the second robotic arm 320 and/or the second surgical tool 328. The computing device 102 may determine that the second surgical tool 328 has drilled to the second depth 340 based on tracking the pose of the second surgical tool 328. The computing device 102 may receive tracking and/or pose information from the navigation system 118, and may use such information and information from the surgical plan 136 (e.g., the second depth 340) to determine that the second surgical tool 328 has drilled to the second depth 340. For example, the surgical plan 136 may specify the second depth 340, and the computing device 102 may determine that the second surgical tool 328 would have a first pose in a known coordinate system when the second surgical tool 328 has drilled to the second depth 340. When the navigation system 118 tracking information specifies that the second surgical tool 328 is in the first pose, the computing device 102 may determine that the second surgical tool 328 has drilled to the second depth 340. Once the second surgical tool 328 has drilled to the second depth 340, the computing device 102 may generate a second alert that indicates that the second depth 340 has been reached. The alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), a haptic alert (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like. In some embodiments, the computing device 102 may disable use of the second surgical tool 328 once the computing device 102 has determined that the second depth 340 has been reached and has generated the second alert. Once the second surgical tool 328 has drilled to the second depth 340, the computing device 102 may cause the second robotic arm 320 to move out of the working volume 312. The surgery or surgical procedure may then move to the next step (e.g., inserting a pedicle screw along the trajectory drilled out by the first surgical tool 324 and the second surgical tool 328).

[0096] Fig. 4 depicts a method 400 that may be used, for example, to perform a bone cut using navigated surgical tools.
[0097] The method 400 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor. The at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above. The at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118). A processor other than any processor described herein may also be used to execute the method 400. The at least one processor may perform the method 400 by executing elements stored in a memory such as the memory 106. The elements stored in the memory and executed by the processor may cause the processor to execute one or more steps of the method 400. One or more portions of the method 400 may be performed by the processor executing any of the contents of memory, such as the image processing 120, the segmentation 122, the transformation 124, and/or the registration 128.
[0098] The method 400 comprises receiving a 3D model of an anatomical element (step 404). The anatomical element may be similar to or the same as the anatomical element 210 or the vertebra 332. The 3D model may be similar to or the same as the mesh model 144. The 3D model may be generated by the computing device 102 using image processing 120 and segmentation 122 to process and segment image data received from the imaging device 112, from the database 130, from the surgical plan 136, combinations thereof, and/or the like.
[0099] The method 400 also comprises determining, based on a surgical plan and the 3D model, a plane that at least partially bisects the anatomical element into a first portion and a second portion (step 408). The plane may be similar to or the same as the plane 240. The plane may be generated by a user based on one or more inputs into the user interface 110. For example, the user may use a touchscreen to generate a planned bone cut (e.g., bone cut 228) relative to the anatomical element. The bone cut may separate the anatomical element into the first portion, which may be similar to or the same as the first portion 232, and the second portion, which may be similar to or the same as the second portion 236. Based on the planned bone cut, the computing device 102 or a component thereof (e.g., the processor 104) may determine the pose of the first portion and the second portion. In other words, the processor 104 may use transformation 124 to transform the user input relative to the 3D model into a known coordinate system to determine the location of the bone cut, and may identify (e.g., using segmentation 122) the first portion and the second portion of the anatomical element. In another example, the first portion of the anatomical element may correspond to a portion of the anatomical element up to a first depth (e.g., first depth 336), while the second portion of the anatomical element corresponds to a portion of the anatomical element below the first depth and down to a second depth. In such examples, the plane may be the multi-dimensional plane that divides the anatomical element at the first depth, such that the first and second portions of the anatomical element are separated at the first depth.
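By way of non-limiting example, the transformation of user input into a known coordinate system may be expressed as a homogeneous transform applied to the sketched points, as in the following Python sketch. The transform is assumed to come from the registration 128 between the model and the patient; the function and parameter names are illustrative assumptions.

    import numpy as np

    def to_patient_coordinates(points_model, T_patient_from_model):
        """Map points sketched on the 3D model into a known patient coordinate
        system using a 4x4 homogeneous transform (illustrative of transformation 124).

        points_model: (N, 3) array of points defining the planned bone cut;
        T_patient_from_model: assumed 4x4 transform from registration 128.
        """
        points_model = np.asarray(points_model, dtype=float)
        homogeneous = np.c_[points_model, np.ones(len(points_model))]
        return (homogeneous @ np.asarray(T_patient_from_model).T)[:, :3]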
[0100] The method 400 also comprises tracking a first surgical tool as the first surgical tool resects a first portion of the anatomical element (step 412). The surgical plan may specify that the first surgical tool (e.g., the HSD 148, the HSD 204, the first surgical tool 324, etc.) is to resect the first portion of the anatomical element. The tracking may be performed based on information generated by the navigation system 118. For example, a navigation camera of the navigation system 118 may track navigation markers (e.g., navigation markers 220A-220D) of the first surgical tool as the first surgical tool resects the first portion. The computing device 102 or a component thereof (e.g., the processor 104) may then determine the pose of the first surgical tool based on, for example, the determined pose of the navigation markers and registration 128 between the navigation markers and the patient.
[0101] The method 400 also comprises determining, based on the tracking, that the first surgical tool has reached the plane (step 416). The computing device 102 may determine that the first surgical tool has reached the plane based on the tracking of the pose of the first surgical tool as the first surgical tool resects the first portion. The computing device 102 may receive tracking and/or pose information from the navigation system 118, and may use such information and information from the surgical plan 136 (e.g., the dimensions of the first portion) to determine that the first surgical tool has reached the plane. For example, the surgical plan 136 may specify that the first portion has a first thickness, and the computing device 102 may determine that the first surgical tool would have a first pose in a known coordinate system when the first surgical tool has drilled through the first thickness. When tracking information from the navigation system 118 specifies that the first surgical tool is in the first pose, the computing device 102 may determine that the first portion has been resected and the first surgical tool has reached the plane.
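One hedged way to sketch this plane-reached determination is to compare the tracked tool-tip position against the cut plane; the tolerance value below is purely illustrative and not taken from the disclosure.

```python
import numpy as np

def tool_reached_plane(tip_position, plane_point, plane_normal, tolerance_mm=0.5):
    """True once the tracked tool tip is at (or past) the cut plane."""
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    signed_distance = float((np.asarray(tip_position, dtype=float) - plane_point) @ n)
    # Positive distances lie on the first-portion side; at or below the
    # tolerance, the tool has drilled through the first thickness to the plane.
    return signed_distance <= tolerance_mm
```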
[0102] The method 400 also comprises generating, when the first surgical tool has reached the plane, an alert (step 420). The alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), a haptic alert (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like. In one embodiment, a warning message may be rendered to the display to indicate to the user that the first portion has been resected by the first surgical tool.
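A toy dispatch for the alert modalities described above might look as follows; the handler bodies are placeholders for whatever display, speaker, or motor interfaces the system actually exposes.

```python
def generate_alert(modalities=("visual", "audio", "haptic")):
    """Fire the plane-reached alert on each requested modality."""
    handlers = {
        "visual": lambda: print("DISPLAY: flashing warning - first portion resected"),
        "audio":  lambda: print("SPEAKER: alarm tone"),
        "haptic": lambda: print("DISPLAY MOTORS: vibrate"),
    }
    for modality in modalities:
        handlers[modality]()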
[0103] In cases where the first surgical tool and/or the second surgical tool are operated manually by a user (e.g., a physician), the computing device 102 may render the alert to the display instructing the user to adjust the surgical tool in use (e.g., to switch out the first surgical tool for the second surgical tool). In semi-autonomous cases, such as when a robotic arm such as the robotic arm 116 navigates the first surgical tool and the second surgical tool, the alert may instruct the user to switch out the surgical tool connected to the robotic arm. For example, the robotic arm may navigate the first surgical tool until the operative portion of the first surgical tool reaches the plane, at which point the alert may be generated. The alert may be an instruction rendered to the display that instructs the user to detach the first surgical tool from the robotic arm and to attach the second surgical tool. In some cases, the navigation system 118 may initiate and conduct an authentication process to ensure that the correct surgical tool (e.g., the second surgical tool) has been connected to the robotic arm. For instance, the navigation system 118 may identify navigation markers unique to the second surgical tool to verify that the first surgical tool has been disconnected from the robotic arm and that the second surgical tool has been connected.
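One way to picture the authentication step is that each tool carries a marker constellation with a unique geometry, so the sorted pairwise distances between observed markers identify the mounted tool. The registry values below are invented for illustration and assume four markers per tool.

```python
import numpy as np

TOOL_GEOMETRIES = {  # sorted pairwise marker distances in mm (hypothetical values)
    "first_surgical_tool":  np.array([30.0, 40.0, 50.0, 58.3, 64.0, 70.7]),
    "second_surgical_tool": np.array([25.0, 35.0, 45.0, 52.2, 60.1, 66.9]),
}

def identify_tool(marker_positions, tolerance_mm=1.0):
    """Match observed marker geometry against the known tool registry."""
    pts = np.asarray(marker_positions, dtype=float)
    i, j = np.triu_indices(len(pts), k=1)
    observed = np.sort(np.linalg.norm(pts[i] - pts[j], axis=1))
    for name, expected in TOOL_GEOMETRIES.items():
        if len(expected) == len(observed) and \
                np.all(np.abs(observed - expected) <= tolerance_mm):
            return name
    return None  # unknown tool: withhold approval until resolved
```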
[0104] In some embodiments, such as when a first robotic arm navigates the first surgical tool and a second robotic arm navigates the second surgical tool, the alert may indicate to the user that the plane has been reached, and that the fully autonomous system is moving on to the next step in the surgical process. The computing device 102 may disable the first surgical tool once the computing device 102 has determined that the plane has been reached, and may move the first robotic arm and/or the first surgical tool away from the anatomical element. After the first robotic arm and/or the first surgical tool have been removed, the computing device 102 may cause the second robotic arm 320 to move relative to the anatomical element such that the second surgical tool proceeds with resecting the second portion of the anatomical element. In other words, the second robotic arm may be manipulated such that the second surgical tool is disposed to continue resecting the anatomical element in a similar pose to that of the first surgical tool.
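The dual-arm handoff can be sketched as below. The RoboticArm interface is a hypothetical stand-in for whatever control API the arms expose, not an API of the disclosed system, and the example pose is arbitrary.

```python
class RoboticArm:
    """Hypothetical minimal control interface for a surgical robotic arm."""
    def __init__(self, name: str):
        self.name = name
    def retract(self) -> None:
        print(f"{self.name}: retracting from the surgical field")
    def move_to(self, pose) -> None:
        print(f"{self.name}: moving tool to pose {pose}")

def hand_off(first_arm: RoboticArm, second_arm: RoboticArm, last_pose) -> None:
    """Retract the first arm, then position the second tool at a similar pose."""
    first_arm.retract()            # first tool is already disabled at the plane
    second_arm.move_to(last_pose)  # continue the resection where the first tool stopped

hand_off(RoboticArm("first robotic arm"), RoboticArm("second robotic arm"),
         last_pose=(0.0, 0.0, 5.0))
```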
[0105] The method 400 also comprises tracking a second surgical tool as the second surgical tool resects the second portion of the anatomical element (step 424). The second surgical tool (e.g., the OSD 152, the OSD 208, the second surgical tool 328, etc.) may be used to resect the second portion. The pose of the second surgical tool may be tracked using the navigation camera of the navigation system 118 that identifies a pose of the navigation markers on the second surgical tool. The computing device 102 may determine that the second surgical tool has sufficiently resected the second portion based on tracking the pose of the second surgical tool. The computing device 102 may receive tracking and/or pose information from the navigation system 118, and may use such information and information from the surgical plan 136 (e.g., the dimensions of the second portion) to determine that the second surgical tool has sufficiently resected the second portion.
[0106] The method 400 also comprises updating, after at least one of the resecting of the first portion and the resecting of the second portion, the 3D model of the anatomical element (step 428). As the first portion and the second portion are resected, the 3D model of the anatomical element on a display may be changed or otherwise updated to reflect the removal of anatomical tissue. The updating may occur while the first surgical tool and the second surgical tool resect respective portions of the anatomical element, after one or more steps of the surgery or surgical procedure, and/or after one or more steps of the method 400. For example, the 3D model may be rendered to a display, and after the first surgical tool has resected the first portion, the 3D model may be updated to depict the anatomical element with the first portion removed. Similarly, once the second surgical tool has resected the second portion, the 3D model may be again updated to depict the anatomical element with the second portion removed. In some embodiments, instead of updating the 3D model, the computing device 102 may generate a new model (e.g., based on updated image data captured by the imaging device 112).
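A minimal sketch of the model update, continuing the vertex-index bookkeeping assumed in the earlier sketches; a real mesh update would also rebuild faces and trigger a re-render of the display.

```python
import numpy as np

def remove_portion(vertices: np.ndarray, portion_indices: np.ndarray) -> np.ndarray:
    """Return the model vertices with a resected portion removed."""
    keep = np.setdiff1d(np.arange(len(vertices)), portion_indices)
    return np.asarray(vertices)[keep]
```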
[0107] In some embodiments, any one or more of the steps of the method 400 may be repeated one or more times, such as when the surgical plan specifies a plurality of different bone cuts and/or a plurality of different portions. For example, the surgical plan may define a third portion of the anatomical element, with a second plane separating the third portion from the second portion. In such an example, the steps 416, 420, and 424 may be repeated for the resection of the second portion and the third portion of the anatomical element.
[0108] The present disclosure encompasses embodiments of the method 400 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
[0109] As noted above, the present disclosure encompasses methods with fewer than all of the steps identified in Fig. 4 (and the corresponding description of the method 400), as well as methods that include additional steps beyond those identified in Fig. 4 (and the corresponding description of the method 400). The present disclosure also encompasses methods that comprise one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or comprise a registration or any other correlation.
[0110] The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. The features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternate aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
[0111] Moreover, though the foregoing has included description of one or more aspects, embodiments, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.
[0112] The following examples are provided as non-limiting examples for embodiments of the invention.
[0113] Example 1: A system comprising: a robotic arm (116, 316, 320) couplable to a first surgical tool (148, 204, 324); a processor (104); and a memory (106) storing data thereon that, when processed by the processor (104), enable the processor (104) to: receive a three-dimensional (3D) model (144) of an anatomical element; determine, based on the 3D model (144), a plane that at least partially bisects the anatomical element into a first portion and a second portion; track the robotic arm (116, 316, 320) as the first surgical tool (148, 204, 324) resects the first portion of the anatomical element; determine, based on the tracking of the robotic arm (116, 316, 320), that the first surgical tool (148, 204, 324) has reached the plane; and generate, when the first surgical tool (148, 204, 324) has reached the plane, an alert.
[0114] Example 2: The system of Example 1, wherein the robotic arm (116, 316, 320) is further couplable to a second surgical tool (152, 208, 328), and wherein the memory (106) stores further data that, when processed by the processor (104), enables the processor (104) to: track the robotic arm (116, 316, 320) as the second surgical tool (152, 208, 328) resects the second portion of the anatomical element.
[0115] Example 3: The system of Example 2, wherein the memory (106) stores further data that, when processed by the processor (104), enables the processor (104) to: update, after at least one of the first surgical tool (148, 204, 324) resecting the first portion of the anatomical element and the second surgical tool (152, 208, 328) resecting the second portion of the anatomical element, a depiction of the anatomical element rendered to a display.
[0116] Example 4: The system of Example 2 or 3, wherein the second surgical tool (152, 208, 328) comprises an oscillating drill.
[0117] Example 5: The system of any one of Examples 1-4, wherein the first surgical tool (148, 204, 324) comprises a high speed drill.
[0118] Example 6: The system of any one of Examples 1-5, wherein the robotic arm (116, 316, 320) comprises a plurality of navigation markers (220A-220D, 224A-224D), and wherein tracking the robotic arm (116, 316, 320) comprises identifying a pose of the plurality of navigation markers (220A-220D, 224A-224D).
[0119] Example 7: The system of any one of Examples 1-6, further comprising: a second robotic arm (116, 316, 320) couplable to a second surgical tool (152, 208, 328), wherein the memory (106) stores further data that, when processed by the processor (104), enables the processor (104) to track the second robotic arm (116, 316, 320) as the second surgical tool (152, 208, 328) resects the second portion of the anatomical element.
[0120] Example 8: The system of any one of Examples 1-7, wherein the alert is rendered to a display.
[0121] Example 9: The system of any one of Examples 1-8, wherein the second portion of the anatomical element is positioned between the first portion of the anatomical element and a second anatomical element.
[0122] Example 10: A surgical system comprising: a processor (104); and a memory (106) storing data thereon that, when executed by the processor (104), enable the processor (104) to: receive a three-dimensional (3D) model (144) of an anatomical element; determine, based on the 3D model (144), a plane that at least partially bisects the anatomical element into a first portion and a second portion; track a first surgical tool (148, 204, 324) as the first surgical tool (148, 204, 324) resects the first portion of the anatomical element; determine, based on the tracking, that the first surgical tool (148, 204, 324) has reached the plane; generate, when the first surgical tool (148, 204, 324) has reached the plane, an alert; and track a second surgical tool (152, 208, 328) as the second surgical tool (152, 208, 328) resects the second portion of the anatomical element.
[0123] Example 11: The surgical system of Example 10, wherein the memory (106) stores further data that, when processed by the processor (104), enables the processor (104) to: update, after at least one of the first surgical tool (148, 204, 324) resecting the first portion of the anatomical element and the second surgical tool (152, 208, 328) resecting the second portion of the anatomical element, a depiction of the anatomical element rendered to a display.
[0124] Example 12: The surgical system of Example 10 or 11, wherein the first surgical tool (148, 204, 324) comprises a high speed drill, and wherein the second surgical tool (152, 208, 328) comprises an oscillating drill.
[0125] Example 13: The surgical system of any one of Examples 10-12, wherein the memory (106) stores further data that, when processed by the processor (104), enables the processor (104) to: disable the first surgical tool (148, 204, 324) when the first surgical tool (148, 204, 324) reaches the plane.
[0127] Example 14: The surgical system of any one of Examples 10-13, wherein the first surgical tool (148, 204, 324) comprises a first set of navigation markers (220A-220D, 224A-224D), wherein the second surgical tool (152, 208, 328) comprises a second set of navigation markers (220A-220D, 224A-224D), wherein tracking the first surgical tool (148, 204, 324) comprises identifying a pose of the first set of navigation markers (220A-220D, 224A-224D), and wherein tracking the second surgical tool (152, 208, 328) comprises identifying a pose of the second set of navigation markers (220A-220D, 224A-224D).
[0128] Example 15: The surgical system of any one of Examples 10-14, wherein the first surgical tool (148, 204, 324) is connectable to a robotic arm (116, 316, 320), and wherein the robotic arm (116, 316, 320) manipulates the first surgical tool (148, 204, 324) to resect the first portion of the anatomical element.
[0129] Example 16: The surgical system of Example 15, wherein the second surgical tool (152, 208, 328) is connectable to the robotic arm (116, 316, 320), and wherein the robotic arm (116, 316, 320) manipulates the second surgical tool (152, 208, 328) to resect the second portion of the anatomical element.
[0130] Example 17: The surgical system of Example 15, wherein the second surgical tool (152, 208, 328) is connectable to a second robotic arm (116, 316, 320), and wherein the second robotic arm (116, 316, 320) manipulates the second surgical tool (152, 208, 328) to resect the second portion of the anatomical element.
[0131] Example 18: A method, comprising: receiving a three-dimensional (3D) model (144) of an anatomical element; determining, based on a surgical plan and the 3D model (144), a plane that bisects the anatomical element into a first portion and a second portion; tracking a first surgical tool (148, 204, 324) as the first surgical tool (148, 204, 324) resects the first portion of the anatomical element; determining, based on tracking, that the first surgical tool (148, 204, 324) has reached the plane; generating, when the first surgical tool (148, 204, 324) has reached the plane, an alert; and tracking a second surgical tool (152, 208, 328) as the second surgical tool (152, 208, 328) resects the second portion of the anatomical element.
[0132] Example 19: The method of Example 18, further comprising: updating, after at least one of the resecting of the first portion and the resecting of the second portion, a depiction of the anatomical element rendered to a display.
[0133] Example 20: The method of Example 18 or 19, wherein the first surgical tool (148, 204, 324) comprises a high speed drill, and wherein the second surgical tool (152, 208, 328) comprises an oscillating drill.

Claims

What is claimed is:
1. A system, comprising: a robotic arm (116, 316, 320) couplable to a first surgical tool (148, 204, 324); a processor (104); and a memory (106) storing data thereon that, when processed by the processor (104), enable the processor (104) to: receive a three-dimensional (3D) model (144) of an anatomical element; determine, based on the 3D model (144), a plane that at least partially bisects the anatomical element into a first portion and a second portion; track the robotic arm (116, 316, 320) as the first surgical tool (148, 204, 324) resects the first portion of the anatomical element; determine, based on the tracking of the robotic arm (116, 316, 320), that the first surgical tool (148, 204, 324) has reached the plane; and generate, when the first surgical tool (148, 204, 324) has reached the plane, an alert.
2. The system of claim 1, wherein the robotic arm (116, 316, 320) is further couplable to a second surgical tool (152, 208, 328), and wherein the memory (106) stores further data that, when processed by the processor (104), enables the processor (104) to: track the robotic arm (116, 316, 320) as the second surgical tool (152, 208, 328) resects the second portion of the anatomical element.
3. The system of claim 2, wherein the memory (106) stores further data that, when processed by the processor (104), enables the processor (104) to: update, after at least one of the first surgical tool (148, 204, 324) resecting the first portion of the anatomical element and the second surgical tool (152, 208, 328) resecting the second portion of the anatomical element, a depiction of the anatomical element rendered to a display.
4. The system of claim 2 or 3, wherein the second surgical tool (152, 208, 328) comprises an oscillating drill.
5. The system of any one of claims 1-4, wherein the first surgical tool (148, 204, 324) comprises a high speed drill.
6. The system of any one of claims 1-5, wherein the robotic arm (116, 316, 320) comprises a plurality of navigation markers (220A-220D, 224A-224D), and wherein tracking the robotic arm (116, 316, 320) comprises identifying a pose of the plurality of navigation markers (220A-220D, 224A-224D).
7. The system of any one of claims 1-6, further comprising: a second robotic arm (116, 316, 320) couplable to a second surgical tool (152, 208, 328), wherein the memory (106) stores further data that, when processed by the processor (104), enables the processor (104) to track the second robotic arm (116, 316, 320) as the second surgical tool (152, 208, 328) resects the second portion of the anatomical element.
8. The system of any one of claims 1-7, wherein the alert is rendered to a display, and wherein the second portion of the anatomical element is positioned between the first portion of the anatomical element and a second anatomical element.
9. A surgical system, comprising: a processor (104); and a memory (106) storing data thereon that, when executed by the processor (104), enable the processor (104) to: receive a three-dimensional (3D) model (144) of an anatomical element; determine, based on the 3D model (144), a plane that at least partially bisects the anatomical element into a first portion and a second portion; track a first surgical tool (148, 204, 324) as the first surgical tool (148, 204, 324) resects the first portion of the anatomical element; determine, based on the tracking, that the first surgical tool (148, 204, 324) has reached the plane; generate, when the first surgical tool (148, 204, 324) has reached the plane, an alert; and track a second surgical tool (152, 208, 328) as the second surgical tool (152, 208, 328) resects the second portion of the anatomical element.
10. The surgical system of claim 9, wherein the memory (106) stores further data that, when processed by the processor (104), enables the processor (104) to: update, after at least one of the first surgical tool (148, 204, 324) resecting the first portion of the anatomical element and the second surgical tool (152, 208, 328) resecting the second portion of the anatomical element, a depiction of the anatomical element rendered to a display.
11. The surgical system of claim 9 or 10, wherein the memory (106) stores further data that, when processed by the processor (104), enables the processor (104) to: disable the first surgical tool (148, 204, 324) when the first surgical tool (148, 204, 324) reaches the plane.
12. The surgical system of any one of claims 9-11, wherein the first surgical tool (148, 204, 324) comprises a first set of navigation markers (220A-220D, 224A-224D), wherein the second surgical tool (152, 208, 328) comprises a second set of navigation markers (220A-220D, 224A-224D), wherein tracking the first surgical tool (148, 204, 324) comprises identifying a pose of the first set of navigation markers (220A-220D, 224A-224D), and wherein tracking the second surgical tool (152, 208, 328) comprises identifying a pose of the second set of navigation markers (220A-220D, 224A-224D).
13. The surgical system of any one of claims 9-12, wherein the first surgical tool (148, 204, 324) is connectable to a robotic arm (116, 316, 320), and wherein the robotic arm (116, 316, 320) manipulates the first surgical tool (148, 204, 324) to resect the first portion of the anatomical element.
14. A method, comprising: receiving a three-dimensional (3D) model (144) of an anatomical element; determining, based on a surgical plan and the 3D model (144), a plane that bisects the anatomical element into a first portion and a second portion; tracking a first surgical tool (148, 204, 324) as the first surgical tool (148, 204, 324) resects the first portion of the anatomical element; determining, based on tracking, that the first surgical tool (148, 204, 324) has reached the plane; generating, when the first surgical tool (148, 204, 324) has reached the plane, an alert; and tracking a second surgical tool (152, 208, 328) as the second surgical tool (152, 208, 328) resects the second portion of the anatomical element.
15. The method of claim 14, further comprising: updating, after at least one of the resecting of the first portion and the resecting of the second portion, a depiction of the anatomical element rendered to a display, wherein the first surgical tool (148, 204, 324) comprises a high speed drill, and wherein the second surgical tool (152, 208, 328) comprises an oscillating drill.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463619165P 2024-01-09 2024-01-09
US63/619,165 2024-01-09

Publications (1)

Publication Number Publication Date
WO2025150040A1 (en) 2025-07-17

Family

ID=94605567

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2025/050019 Pending WO2025150040A1 (en) 2024-01-09 2025-01-07 Systems and methods for navigated surgical resection of anatomical elements

Country Status (1)

Country Link
WO (1) WO2025150040A1 (en)

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 25705341
Country of ref document: EP
Kind code of ref document: A1