
WO2025150040A1 - Systems and methods for surgical navigation resection of anatomical elements - Google Patents

Systems and methods for surgical navigation resection of anatomical elements

Info

Publication number
WO2025150040A1
Authority
WO
WIPO (PCT)
Prior art keywords
surgical tool
robotic arm
anatomical element
surgical
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IL2025/050019
Other languages
English (en)
Inventor
Elad Rotman
Ido ZUCKER
Adi ESS
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mazor Robotics Ltd
Original Assignee
Mazor Robotics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mazor Robotics Ltd
Publication of WO2025150040A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 90/39: Markers, e.g. radio-opaque or breast lesions markers
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2051: Electromagnetic tracking systems
    • A61B 2034/2055: Optical tracking systems
    • A61B 2090/3983: Reference marker arrangements for use with image guided surgery

Definitions

  • the present disclosure is generally directed to surgical navigation, and relates more particularly to using navigation system information to control surgical tools in performing resection of anatomical tissues during a surgery or surgical procedure.
  • the memory stores further data that, when processed by the processor, enables the processor to: update, after at least one of the first surgical tool resecting the first portion of the anatomical element and the second surgical tool resecting the second portion of the anatomical element, a depiction of the anatomical element rendered to a display.
  • the memory stores further data that, when processed by the processor, enables the processor to: disable the first surgical tool when the first surgical tool reaches the plane.
  • the first surgical tool comprises a first set of navigation markers
  • the second surgical tool comprises a second set of navigation markers
  • tracking the first surgical tool comprises identifying a pose of the first set of navigation markers
  • tracking the second surgical tool comprises identifying a pose of the second set of navigation markers
  • the first surgical tool is connectable to a robotic arm, and wherein the robotic arm manipulates the first surgical tool to resect the first portion of the anatomical element.
  • the second surgical tool is connectable to the robotic arm, and wherein the robotic arm manipulates the second surgical tool to resect the second portion of the anatomical element.
  • the second surgical tool is connectable to a second robotic arm, and wherein the second robotic arm manipulates the second surgical tool to resect the second portion of the anatomical element.
  • any of the aspects herein further comprising: updating, after at least one of the resecting of the first portion and the resecting of the second portion, a depiction of the anatomical element rendered to a display.
  • each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • processors such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or A10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry
  • the changed characteristics of the bone cut and/or the anatomical element may then be saved to the database 130, the surgical plan 136, combinations thereof, and/or the like.
  • the surgical tool information 140 may contain information about the parameters of one or more of the surgical tools used in the surgery or surgical procedure (e.g., the rotation speed of the operative end of the surgical tool, the electrical power requirements of the surgical tool, the dimensions of the surgical tool, etc.).
  • the surgical tool information 140 may also specify which type of surgical tool is to be used at each step in the surgery or surgical procedure.
  • the surgical tool information 140 may specify that the HSD 148 is to be used in a first surgical step of resecting cortical bone, and that the OSD 152 is to be used in a second surgical step of resecting cancellous/trabecular bone.
  • the surgical tool information 140 may be modifiable by the user (e.g., a physician) based on inputs to the user interface 110.
  • the surgical tool information 140 may be part of the surgical plan 136.
  • the mesh model 144 may be or comprise a model of one or more anatomical elements that are the subject of the surgery or surgical procedure.
  • the mesh model 144 may be or comprise a mesh of the vertebra.
  • the mesh model 144 may be generated by the computing device 102 based on one or more images captured using the imaging device 112, image information taken from the surgical plan 136, information retrieved from the database 130, combinations thereof, and/or the like.
  • the imaging device 112 may generate a CT scan of the patient including the target vertebra.
  • the computing device 102 may receive the image data from the imaging device 112 and, using image processing 120 and segmentation 122, generate the mesh model 144.
  • a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
  • the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
  • the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
  • the imaging device 112 may be contained entirely within a single housing, or may comprise a transmitter/emitter and a receiver/detector that are in separate housings or are otherwise physically separated.
  • the imaging device 112 may comprise more than one imaging device 112.
  • a first imaging device may provide first image data and/or a first image
  • a second imaging device may provide second image data and/or a second image.
  • the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
  • the imaging device 112 may be operable to generate a stream of image data.
  • the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
  • image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
  • the robot 114 together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
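The pose concept above (a position plus an orientation) can be sketched as a small data structure. This is a generic illustration, not the disclosure's implementation; the `Pose` class, the quaternion convention, and `as_matrix` are all assumed names.

```python
import numpy as np

# A minimal pose sketch: position (x, y, z) plus orientation as a unit quaternion.
# The representation is illustrative; the disclosure does not prescribe one.
class Pose:
    def __init__(self, position, quaternion):
        self.position = np.asarray(position, dtype=float)   # (3,)
        q = np.asarray(quaternion, dtype=float)             # (w, x, y, z)
        self.quaternion = q / np.linalg.norm(q)             # normalize to unit length

    def as_matrix(self):
        """Return the 4x4 homogeneous transform for this pose."""
        w, x, y, z = self.quaternion
        R = np.array([
            [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
            [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
            [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
        ])
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = self.position
        return T
```

With such a representation, a tool or imaging device held by the robotic arm can be placed at a specific position and orientation by commanding a target transform.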
  • the robotic arm(s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm).
  • reference markers may be placed on the robot 114 (including, e.g., on the robotic arm 116), the imaging device 112, or any other object in the surgical space.
  • the reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof.
  • the navigation system 118 can be used to track other components of the system (e.g., imaging device 112) and the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example).
  • the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras or other sensor(s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located.
  • the one or more cameras may be optical cameras, infrared cameras, or other cameras.
  • the navigation system 118 may comprise one or more electromagnetic sensors.
  • the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
  • the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 134.
  • the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
  • the cloud 134 may be or represent the Internet or any other wide area network.
  • the computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both.
  • the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
  • the HSD 148 may be configured to drill, burr, mill, cut, saw, ream, tap, etc. into anatomical tissues such as patient anatomy (e.g., soft tissues, bone, etc.).
  • the system 100 may comprise the HSD 148, the OSD 152, and/or multiple other surgical tools, with each surgical tool performing a different surgical task (e.g., the HSD 148 for drilling through trabecular bone of a vertebra, the OSD 152 for cutting through the distal cortex of the vertebra at a slower rate than the HSD 148, etc.).
  • the HSD 148 may be operated autonomously (e.g., when the HSD 148 is connected to and manipulated by the robotic arm 116) or semi-autonomously, such as when the HSD 148 is manipulated by a user with guidance from the navigation system 118.
  • the OSD 152 may be used to resect the distal cortex of a vertebra, and the oscillatory motion of the operative end of the OSD 152 may reduce the likelihood of breaching the nerve channel beneath the distal cortex.
  • the OSD 152 may be operated autonomously (e.g., when the OSD 152 is connected to and manipulated by the robotic arm 116) or semi-autonomously, such as when the OSD 152 is manipulated by a user with guidance from the navigation system 118.
  • the OSD 152 may be attached to a robotic arm 116, such that movement of the robotic arm 116 correspondingly causes movement in the OSD 152.
  • the OSD 152 may be gripped, held, or otherwise coupled to and controlled by the robotic arm 116.
  • the pose (e.g., position and orientation) of the OSD 152 may be controlled by the pose of the robotic arm 116.
  • the OSD 152 can be controlled by one or more components of the system 100, such as the computing device 102.
  • the computing device 102 may be capable of receiving or retrieving data or other information (e.g., from the database 130, from one or more sensors, from the imaging device 112, etc.), processing the information, and controlling the OSD 152 based on the processed information. Additionally or alternatively, the navigation system 118 may track the position of and/or navigate the OSD 152. Such tracking may enable the system 100 or components thereof (e.g., the computing device 102) to determine the pose of the OSD 152, the location of the OSD 152 relative to the planned bone cut, combinations thereof, and/or the like, as discussed in further detail below.
  • the HSD 148 may be connected to a first robotic arm, while the OSD 152 is connected to a second, different robotic arm. In such embodiments, each robotic arm may independently operate each surgical tool, with the navigation system 118 tracking both and the computing device 102 generating navigation paths of both robotic arms to avoid or mitigate the likelihood of collisions between the two.
  • the system 100 or similar systems may be used, for example, to carry out one or more aspects of the method 400 described herein.
  • the system 100 or similar systems may also be used for other purposes.
  • An HSD 204 and an OSD 208 may be positioned relative to an anatomical element 210.
  • the HSD 204 may be similar to or the same as the HSD 148
  • the OSD 208 may be similar to or the same as the OSD 152.
  • the HSD 204 and the OSD 208 may be used to perform a bone cut 228.
  • the bone cut 228 may be or comprise a cut through one or more portions of the anatomical element 210, such as through the outer cortical bone of a vertebra and through the distal cortex of the vertebra.
  • the HSD 204 may be used to perform the first cut on the outer cortical bone
  • the OSD 208 may be used to perform the second cut to remove the distal cortex.
  • the HSD 204 may comprise a navigated portion 212 and the OSD 208 may comprise a navigated portion 216.
  • the navigated portion 212 and the navigated portion 216 may respectively comprise navigation markers 220A-220D (including a first navigation marker 220A, a second navigation marker 220B, a third navigation marker 220C, and a fourth navigation marker 220D) and navigation markers 224A-224D (including a first navigation marker 224A, a second navigation marker 224B, a third navigation marker 224C, and a fourth navigation marker 224D).
  • the navigation markers 220A-220D and the navigation markers 224A-224D may enable a navigation camera of the navigation system 118 to track the pose (e.g., position and orientation) of the HSD 204 and the OSD 208, respectively, as the HSD 204 and the OSD 208 move relative to the anatomical element 210.
  • the HSD 204 may be moved (e.g., using a robotic arm such as the robotic arm 116, manually by a user, etc.) relative to the anatomical element 210 as the operative portion of the HSD 204 resects one or more portions of the anatomical element 210.
  • the navigation camera of the navigation system 118 may identify the navigation markers 220A-220D and track the pose of the navigated portion 212 (e.g., based on the pose of the navigation markers 220A-220D), such that the navigation system 118 can determine a pose of the HSD 204 in a known coordinate system.
  • the OSD 208 may be moved (e.g., using a robotic arm such as the robotic arm 116, manually by a user, etc.) relative to the anatomical element 210 as the operative portion of the OSD 208 resects one or more portions of the anatomical element 210.
  • the navigation camera of the navigation system 118 may identify the navigation markers 224A-224D and track the pose of the navigated portion 216 (e.g., based on the pose of the navigation markers 224A-224D), such that the navigation system 118 can determine a pose of the OSD 208 in a known coordinate system.
  • the navigation system 118 may provide the pose information of the HSD 204 and/or the OSD 208 to the computing device 102, which may use the pose information to determine when use of the HSD 204 and/or the OSD 208 should be discontinued or stopped, as discussed in more detail below.
  • the navigation system 118 may provide updated pose information at a predetermined or user- specified interval (e.g., every 10 milliseconds (ms), every 20 ms, every 50 ms, every 2 seconds, etc.).
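Consuming pose updates at a fixed interval (e.g., every 20 ms as mentioned above) might be sketched as a simple polling loop; `get_pose` and `on_pose` are hypothetical callbacks standing in for the navigation system and the computing device, respectively.

```python
import time

def track_tool(get_pose, on_pose, interval_s=0.02, iterations=3):
    """Illustrative polling loop: read the tracked tool pose at a fixed
    interval and forward each sample to a consumer. Both callbacks are
    assumptions, not APIs from the disclosure."""
    poses = []
    for _ in range(iterations):
        pose = get_pose()    # e.g., query the navigation system
        on_pose(pose)        # e.g., hand the pose to the computing device
        poses.append(pose)
        time.sleep(interval_s)
    return poses
```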
  • the bone cut 228 may be or comprise an indication of the portions of the anatomical element 210 that are to be operated on by the HSD 204 and/or the OSD 208.
  • the bone cut 228 may comprise a multi-dimensional shape (e.g., 2D plane or 3D volume) that specifies the area or volume of the anatomical element 210 that is to be resected or otherwise operated on by the HSD 204 and/or the OSD 208.
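A plane such as the plane 240 dividing an anatomical element into two portions can be illustrated with a signed-distance test. This is a hedged sketch: the function name, the point-cloud representation, and the sign convention are assumptions, not taken from the disclosure.

```python
import numpy as np

def classify_portions(points, plane_point, plane_normal):
    """Split points of an anatomical model into two portions by a dividing plane.
    Points on the normal side belong to the first portion (e.g., to be resected
    by the first tool); the rest to the second. All names are illustrative."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    signed = (np.asarray(points, dtype=float) - plane_point) @ n
    first = signed > 0.0
    return first, ~first

pts = np.array([[0, 0, 5.0], [0, 0, -3.0], [1, 1, 0.5]])
first, second = classify_portions(pts, plane_point=[0, 0, 0], plane_normal=[0, 0, 1])
```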
  • the bone cut 228 comprises a plane 240 that divides the anatomical element 210 into a first portion 232 and a second portion 236.
  • the first portion 232 may be or comprise a section of the anatomical element 210 that is to be resected using the HSD 204
  • the second portion 236 may be or comprise a section of the anatomical element 210 that is to be resected using the OSD 208.
  • the plane 240 may be definable or editable by the user via, for example, the user interface 110.
  • the user may be able to change the shape, orientation, dimensions, and/or the like of the plane 240 to change the shape, orientation, dimensions, and/or the like of the first portion 232 and/or the second portion 236.
  • the user may be able to draw a shape of the plane 240 on a mesh model (e.g., mesh model 144) or the anatomical element 210 rendered to a display, and the mesh model may be updated to incorporate the plane 240.
  • the HSD 204 may be used to resect anatomical tissue associated with the first portion 232 of the anatomical element 210. As depicted in Fig. 2B, the HSD 204 may move laterally across the anatomical element 210 such that the operative portion of the HSD 204 resects anatomical tissue. The pose of the HSD 204 may be tracked using a navigation camera of the navigation system 118 that identifies a pose of the navigation markers 220A-220D on the navigated portion 212. The HSD 204 may be used to resect the first portion 232 of the anatomical element 210 until the operative portion of the HSD 204 reaches the plane 240.
  • the computing device 102 may determine that the HSD 204 has reached the plane 240 based on the tracking of the pose of the HSD 204 as the HSD 204 resects the first portion 232.
  • the computing device 102 may receive tracking and/or pose information from the navigation system 118, and may use such information and information from the surgical plan 136 (e.g., the dimensions of the first portion 232) to determine that the HSD 204 has reached the plane 240.
  • the surgical plan 136 may specify that the first portion 232 has a first thickness, and the computing device 102 may determine the HSD 204 would have a first pose in a known coordinate system when the HSD 204 has drilled through the first thickness.
  • when the navigation system 118 tracking information specifies that the HSD 204 is in the first pose, the computing device 102 may determine that the first portion 232 has been resected and that the HSD 204 has reached the plane 240.
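The plane-reached determination described above (comparing the tracked pose against a planned thickness from the surgical plan) might look like the following sketch; the names and the simple scalar depth test are illustrative assumptions.

```python
import numpy as np

def reached_plane(tip_position, entry_point, drill_direction, portion_thickness):
    """Return True once the tracked tool tip has advanced through the planned
    thickness of the first portion, i.e., has reached the dividing plane.
    Positions would come from navigation tracking and the thickness from the
    surgical plan; this depth projection is an illustrative simplification."""
    d = np.asarray(drill_direction, dtype=float)
    d = d / np.linalg.norm(d)
    depth = (np.asarray(tip_position, dtype=float) - entry_point) @ d
    return depth >= portion_thickness
```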
  • the computing device 102 may generate an alert.
  • the alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), haptic (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like.
  • a warning message may be rendered to the display to indicate to the user that the first portion 232 has been resected and that the operative portion of the HSD 204 has arrived at the plane 240.
  • the plane 240 may represent the intersection of two different types of anatomical tissues.
  • the OSD 208 may be used to resect the second portion 236.
  • the computing device 102 may render the alert to the display instructing the user to change the surgical tool in use (e.g., to switch out the HSD 204 for the OSD 208).
  • the alert may instruct the user to switch out the surgical tool connected to the robotic arm.
  • the robotic arm may navigate the HSD 204 until the operative portion of the HSD 204 reaches the plane 240, at which point the alert may be generated.
  • the alert may be an instruction rendered to the display that instructs the user to detach the HSD 204 from the robotic arm and to attach the OSD 208.
  • the navigation system 118 may initiate and conduct an authentication process to ensure that the correct surgical tool (e.g., the OSD 208) has been connected to the robotic arm.
  • the navigation system 118 may identify the navigation markers 224A-224D (which may be distinguishable by the navigation system 118 from the navigation markers 220A-220D of the HSD 204) to verify that the HSD 204 has been disconnected from the robotic arm and that the OSD 208 has been connected.
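The authentication step, distinguishing the OSD's marker set from the HSD's, can be sketched as a set comparison. The marker identifiers mirror the reference numerals above but are otherwise hypothetical, as is the function interface.

```python
# Hypothetical marker identifiers, echoing the reference numerals in the text.
HSD_MARKERS = {"220A", "220B", "220C", "220D"}
OSD_MARKERS = {"224A", "224B", "224C", "224D"}

def authenticate_attached_tool(visible_markers, expected_tool):
    """Return True if the expected tool's full marker set is visible and no
    marker of the other tool's set remains, suggesting the swap completed."""
    visible = set(visible_markers)
    expected = OSD_MARKERS if expected_tool == "OSD" else HSD_MARKERS
    other = HSD_MARKERS if expected_tool == "OSD" else OSD_MARKERS
    return expected <= visible and not (other & visible)
```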
  • the surgical plan 136 may specify that the second portion 236 has a first thickness, and the computing device 102 may determine the OSD 208 would have a first pose in a known coordinate system when the OSD 208 has drilled through the first thickness.
  • the computing device 102 may determine that the second portion 236 has been resected. Once the OSD 208 has resected the second portion 236, the computing device 102 may generate a second alert that indicates that the second portion 236 has been resected.
  • the alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), haptic (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like.
  • the computing device 102 may disable use of the OSD 208 once the computing device 102 has determined that the second portion 236 has been resected and has generated the second alert.
  • a depiction of the anatomical element 210 on a display may be changed or otherwise updated to reflect the removal of anatomical tissue.
  • the updating may occur while the HSD 204 and the OSD 208 resect portions of the anatomical element 210 and/or after each step of the surgery or surgical procedure.
  • the mesh model 144 may be rendered to a display, and after the HSD 204 has resected the first portion 232, the mesh model 144 may be updated to depict the anatomical element 210 with the first portion 232 removed.
  • the mesh model 144 may be again updated to depict the anatomical element 210 with the second portion 236 removed.
  • the computing device 102 may generate a new model after each resection or step in the surgery or surgical procedure. For example, after the HSD 204 has resected the first portion 232, the computing device 102 may use image processing 120 and segmentation 122 to segment out the depiction of the first portion 232 from the mesh model 144, and render the mesh model 144 as a new model to the display.
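Updating the model by segmenting out a resected portion could be sketched as filtering vertices on one side of the dividing plane; a real pipeline would re-mesh and re-render, so this is only an illustrative reduction with assumed names.

```python
import numpy as np

def remove_resected_portion(vertices, plane_point, plane_normal):
    """Sketch of updating the rendered model after a resection: drop the
    vertices lying in the resected portion (positive side of the plane) and
    keep the remainder as the new model."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    signed = (np.asarray(vertices, dtype=float) - plane_point) @ n
    return np.asarray(vertices, dtype=float)[signed <= 0.0]
```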
  • the first robotic arm 316 and the second robotic arm 320 may each comprise a tool changer that enables the first robotic arm 316 and the second robotic arm 320 to be coupled to different surgical tools, such as to an HSD or to an OSD.
  • the first robotic arm 316 and the second robotic arm 320 can each provide both HSD and OSD drilling capabilities to a respective side of the patient.
  • the first robotic arm 316 provides HSD and OSD drilling capabilities to a first side of the patient 308, while the second robotic arm 320 provides HSD and OSD drilling capabilities to a second side of the patient 308.
  • the computing device 102 may lock or otherwise disable the use of one robotic arm while the other is in use, and vice versa.
  • the second robotic arm 320 and components thereof (e.g., the second surgical tool 328) may be locked in place within the working volume 312.
  • the second robotic arm 320 and components thereof may be moved outside of the working volume 312 so as to not interfere with navigation of the first robotic arm 316 and components thereof.
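The interlock described above, where one robotic arm is disabled while the other is in use, can be sketched as a simple mutual-exclusion object; the class and method names are assumptions.

```python
class ArmInterlock:
    """Illustrative interlock: only one robotic arm may be active at a time."""

    def __init__(self):
        self.active_arm = None

    def request_control(self, arm_id):
        """Grant control to arm_id only if no other arm currently holds it."""
        if self.active_arm is None or self.active_arm == arm_id:
            self.active_arm = arm_id
            return True
        return False  # the other arm is in use; this one stays disabled

    def release(self, arm_id):
        """Release control so the other arm may be enabled."""
        if self.active_arm == arm_id:
            self.active_arm = None
```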
  • the first robotic arm 316 and the second robotic arm 320 may be used to perform an autonomous bone cut of an anatomical element of the patient 308.
  • the first robotic arm 316 and the second robotic arm 320 may be used to drill a hole for a pedicle screw in a vertebra 332 of the patient 308.
  • the planned drilling may comprise drilling down to a first depth 336 using the first surgical tool 324, and then drilling down to a second depth 340 using the second surgical tool 328.
  • the use of different surgical tools may account for differences in the composition and/or sensitivity of anatomical tissues of the vertebra 332 along the trajectory.
  • one or more multi-dimensional shapes may be used to represent the different depths, such as when a 2D plane bisects the drilling depths into two portions: a first portion through which the first surgical tool 324 is to drill, and a second portion through which the second surgical tool 328 is to drill.
  • the multi-dimensional shape may be definable or editable by the user via, for example, the user interface 110. For instance, the user may be able to change the shape, orientation, dimensions, and/or the like of the multi-dimensional shape to change the shape, orientation, dimensions, and/or the like of the drilling depths.
  • the user may be able to draw a shape of the multi-dimensional shape on a mesh model (e.g., mesh model 144) or the vertebra 332 rendered to a display, and the mesh model may be updated to incorporate the multi-dimensional shape. Additionally or alternatively, the user may be able to define and/or edit the trajectory of the drilling into the vertebra 332.
  • the computing device 102 may generate an alert.
  • the alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), haptic (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like.
  • a warning message may be rendered to the display to indicate to the user that the first depth 336 has been reached by the first surgical tool 324.
  • the first depth 336 may be a depth at which a different type of anatomical tissue is encountered, such that a different surgical tool should be used.
  • the second surgical tool 328 may be used to drill past the first depth 336 and down to the second depth 340.
  • the second robotic arm 320 and/or the second surgical tool 328 may be tracked using a navigation camera of the navigation system 118 that identifies a pose of the navigation markers attached to the second robotic arm 320 and/or the second surgical tool 328.
  • the computing device 102 may determine that the second surgical tool 328 has drilled to the second depth 340 based on tracking the pose of the second surgical tool 328.
  • the computing device 102 may receive tracking and/or pose information from the navigation system 118, and may use such information and information from the surgical plan 136 (e.g., the depth of the second depth 340) to determine that the second surgical tool 328 has drilled to the second depth 340.
  • the method 400 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
  • a processor other than any processor described herein may also be used to execute the method 400.
  • the at least one processor may perform the method 400 by executing elements stored in a memory such as the memory 106.
  • the elements stored in the memory and executed by the processor may cause the processor to perform one or more steps of the method 400.
  • One or more portions of the method 400 may be performed by the processor executing any of the contents of the memory, such as image processing 120, segmentation 122, transformation 124, and/or registration 128.
  • the method 400 also comprises generating, when the first surgical tool has reached the plane, an alert (step 420).
  • the alert may be a visual alert (e.g., a flashing warning rendered to a display), an audio alert (e.g., a siren or other alarm), haptic (e.g., motors in the display are activated to vibrate the display), combinations thereof, and/or the like.
  • a warning message may be rendered to the display to indicate to the user that the first portion has been resected by the first surgical tool.
  • the alert may be an instruction rendered to the display that instructs the user to detach the first surgical tool from the robotic arm and to attach the second surgical tool.
  • the navigation system 118 may initiate and conduct an authentication process to ensure that the correct surgical tool (e.g., the second surgical tool) has been connected to the robotic arm. For instance, the navigation system 118 may identify navigation markers unique to the second surgical tool to verify that the first surgical tool has been disconnected from the robotic arm and that the second surgical tool has been connected.
  • the alert may indicate to the user that the plane has been reached, and that the fully autonomous system is moving on to the next step in the surgical process.
  • the computing device 102 may disable the first surgical tool once the computing device 102 has determined that the plane has been reached, and may move the first robotic arm and/or the first surgical tool away from the anatomical element. After the first robotic arm and/or the first surgical tool have been removed, the computing device 102 may cause the second robotic arm 320 to move relative to the anatomical element such that the second surgical tool proceeds with resecting the second portion of the anatomical element.
  • the second robotic arm may be manipulated such that the second surgical tool is disposed to continue resecting the anatomical element in a similar pose to that of the first surgical tool.
  • Example 10 A surgical system comprising: a processor (104); and a memory (106) storing data thereon that, when executed by the processor (104), enable the processor (104) to: receive a three-dimensional (3D) model (144) of an anatomical element; determine, based on the 3D model (144), a plane that at least partially bisects the anatomical element into a first portion and a second portion; track a first surgical tool (148, 204, 324) as the first surgical tool (148, 204, 324) resects the first portion of the anatomical element; determine, based on the tracking, that the first surgical tool (148, 204, 324) has reached the plane; generate, when the first surgical tool (148, 204, 324) has reached the plane, an alert; and track a second surgical tool (152, 208, 328) as the second surgical tool (152, 208, 328) resects the second portion of the anatomical element.
  • Example 13 The surgical system of any one of Examples 10-12, wherein the memory (106) stores further data that, when processed by the processor (104), enables the processor (104) to:
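The bisecting idea described above, splitting a planned drilling trajectory into a first-tool portion and a second-tool portion where the trajectory crosses a 2D plane, reduces to a line-plane intersection. The sketch below illustrates that geometry only; the function name and the plane representation (unit normal `normal`, offset `offset`, with `normal · x = offset`) are assumptions for illustration and are not taken from the disclosure:

```python
import numpy as np

def depth_to_plane(entry, direction, normal, offset):
    """Depth t (e.g., in mm) along the unit drilling direction at which the
    trajectory entry + t * direction crosses the plane normal . x = offset.
    Returns None if the trajectory is parallel to the plane or the plane
    lies behind the entry point."""
    denom = float(np.dot(normal, direction))
    if abs(denom) < 1e-9:
        return None  # trajectory parallel to the bisecting plane
    t = (offset - float(np.dot(normal, entry))) / denom
    return t if t >= 0 else None
```

For a trajectory entering at z = 20 mm and drilling straight down toward a plane at z = 10 mm, this yields a first-portion depth of 10 mm: the first surgical tool would drill those 10 mm (the first depth 336) and the second surgical tool the remainder.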
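Similarly, the determination that a tracked tool has reached the plane (and the alert that follows) amounts to a signed-distance test against the tracked tip pose. This sketch assumes the tip position is already expressed in the same coordinate frame as the plane; the tolerance value, function names, and alert payload are hypothetical:

```python
import numpy as np

def signed_distance(tip, normal, offset):
    """Signed distance from the tracked tool tip to the plane normal . x = offset;
    positive on the side the tool starts from, negative once it has crossed."""
    return float(np.dot(normal, tip)) - offset

def check_and_alert(tip, normal, offset, tolerance_mm=0.5):
    """Return an alert payload once the tip is within tolerance of (or past)
    the plane, mirroring the visual/audio/haptic alert described above."""
    if signed_distance(tip, normal, offset) <= tolerance_mm:
        return {"visual": "First portion complete", "audio": True, "haptic": True}
    return None
```

In a semi-autonomous workflow the returned payload could drive the display warning; in a fully autonomous workflow it could instead gate the hand-off to the second robotic arm.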
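The authentication step, verifying via navigation markers that the second surgical tool has replaced the first on the robotic arm, can be sketched as a set check over marker identifiers. The string marker IDs and the idea of representing each tool by a fixed marker set are illustrative assumptions; a real navigation camera distinguishes tools by their unique marker geometries:

```python
# Hypothetical marker-ID sets, one per detachable surgical tool.
TOOL_MARKERS = {
    "first_surgical_tool": {"M01", "M02", "M03"},
    "second_surgical_tool": {"M11", "M12", "M13"},
}

def authenticate_attached_tool(visible_markers, expected_tool):
    """True only if every marker of the expected tool is visible and no marker
    of any other tool is, i.e., the tool swap has actually taken place."""
    expected = TOOL_MARKERS[expected_tool]
    others = set().union(*(m for name, m in TOOL_MARKERS.items()
                           if name != expected_tool))
    visible = set(visible_markers)
    return expected <= visible and not (others & visible)
```

Requiring both conditions means the check fails not only when the second tool is absent, but also when the first tool has not yet been detached.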

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Robotics (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Manipulator (AREA)

Abstract

A system according to at least one embodiment of the present disclosure comprises: a robotic arm couplable to a first surgical tool; a processor; and a memory storing data thereon that, when processed by the processor, enable the processor to: receive a three-dimensional (3D) model of an anatomical element; determine, based on the 3D model, a plane that at least partially bisects the anatomical element into a first portion and a second portion; track the robotic arm as the first surgical tool resects the first portion of the anatomical element; determine, based on the tracking of the robotic arm, that the first surgical tool has reached the plane; and generate, when the first surgical tool has reached the plane, an alert.
PCT/IL2025/050019 2024-01-09 2025-01-07 Systems and methods for surgically navigated resection of anatomical elements Pending WO2025150040A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463619165P 2024-01-09 2024-01-09
US63/619,165 2024-01-09

Publications (1)

Publication Number Publication Date
WO2025150040A1 true WO2025150040A1 (fr) 2025-07-17

Family

ID=94605567

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2025/050019 Pending WO2025150040A1 (fr) Systems and methods for surgically navigated resection of anatomical elements

Country Status (1)

Country Link
WO (1) WO2025150040A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220273396A1 (en) * 2019-07-15 2022-09-01 Mako Surgical Corp. Robotic Hand-Held Surgical Instrument Systems And Methods
EP4070753A1 * 2021-04-09 2022-10-12 MinMaxMedical Handle for guiding a robotic arm of a computer-assisted surgery system and a surgical tool held by said robotic arm
US20220338935A1 (en) * 2019-06-18 2022-10-27 Smith & Nephew, Inc. Computer controlled surgical rotary tool
US20230248371A1 (en) * 2019-04-12 2023-08-10 Mako Surgical Corp Robotic Systems And Methods For Manipulating A Cutting Guide For A Surgical Instrument
WO2023230349A1 * 2022-05-26 2023-11-30 Stryker Corporation Alert system behavior based on localization awareness


Similar Documents

Publication Publication Date Title
US20250186152A1 (en) Systems, methods, and devices for defining a path for a robotic arm
US20220395342A1 (en) Multi-arm robotic systems and methods for monitoring a target or performing a surgical procedure
US20250152262A1 (en) Path planning based on work volume mapping
US20240138932A1 (en) Systems and methods for controlling one or more surgical tools
WO2025150040A1 (fr) Systems and methods for surgically navigated resection of anatomical elements
US20250057603A1 (en) Systems and methods for real-time visualization of anatomy in navigated procedures
US20230293244A1 (en) Systems and methods for hybrid motion planning
US12295683B2 (en) Systems and methods for robotic collision avoidance using medical imaging
US20250318886A1 (en) Automatic robotic procedure for skin cutting, tissue pathway, and dilation creation
US12329479B2 (en) Systems and methods for setting an implant
US11847809B2 (en) Systems, devices, and methods for identifying and locating a region of interest
US20230404692A1 (en) Cost effective robotic system architecture
US20240358461A1 (en) Multi-arm robotic systems and methods for monitoring a target or performing a surgical procedure
WO2025037243A1 (fr) Systems and methods for real-time visualization of anatomy in navigated procedures
WO2024236472A1 (fr) Systems and methods for bevel-cut prevention and detection
EP4472526A1 (fr) Systems for drilling and imaging an anatomical element

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25705341

Country of ref document: EP

Kind code of ref document: A1