WO2025172998A1 - Adaptive bone removal system and method - Google Patents

Adaptive bone removal system and method

Info

Publication number
WO2025172998A1
Authority
WO
WIPO (PCT)
Prior art keywords
segmented part
volume
bone removal
bone
surgical tool
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IL2025/050141
Other languages
French (fr)
Inventor
Ido ZUCKER
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mazor Robotics Ltd
Original Assignee
Mazor Robotics Ltd
Application filed by Mazor Robotics Ltd
Publication of WO2025172998A1

Classifications

    All classifications fall under A (Human necessities), A61 (Medical or veterinary science; hygiene), A61B (Diagnosis; surgery; identification):

    • A61B 34/00 Computer-aided surgery; manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A61B 2034/107 Visualisation of planned trajectories or target regions
    • A61B 34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2048 Tracking techniques using an accelerometer or inertia sensor
    • A61B 2034/2051 Electromagnetic tracking systems
    • A61B 2034/2055 Optical tracking systems
    • A61B 2034/2059 Mechanical position encoders
    • A61B 34/25 User interfaces for surgical systems
    • A61B 2034/254 User interfaces for surgical systems being adapted depending on the stage of the surgical procedure
    • A61B 34/30 Surgical robots
    • A61B 34/32 Surgical robots operating autonomously
    • A61B 17/16 Instruments for performing osteoclasis; drills or chisels for bones; trepans
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/373 Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
    • A61B 2090/3735 Optical coherence tomography [OCT]
    • A61B 2090/374 NMR or MRI
    • A61B 2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B 2090/3762 Surgical systems with images on a monitor during operation using computed tomography systems [CT]
    • A61B 2090/378 Surgical systems with images on a monitor during operation using ultrasound

Definitions

  • the active constraints 124 enables the processor 104 to restrict operations of the robotic arm 116 and the surgical tool 128, based in part on the adaptive bone removal methods disclosed herein. In some aspects, the active constraints 124 makes or uses determinations of bone removal made during surgical operations to plan and control movement for the robotic arm 116, as described herein.
  • the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the electronic processor 104 to carry out the various method and features described herein.
  • the data, algorithms, and/or instructions may cause the electronic processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, the sensors 126, and/or the cloud 134.
  • the electronic processor 104 sends and receives information (for example, from the memory 106, the communication interface 108, and/or the user interface 110) and processes the information by executing one or more software instructions or modules, capable of being stored in the memory 106, or another non-transitory computer readable medium.
  • the software can include firmware, one or more applications, program data, filters, rules, one or more program modules, and other executable instructions.
  • the electronic processor 104 is configured to retrieve from the memory 106 and execute, among other things, software for performing techniques and methods as described herein.
  • the communication interface 108 transmits and receives information from devices external to the computing device 102, for example, components of the robotic system 100.
  • the communication interface 108 receives input (for example, from the user interface 110), provides system output or a combination of both.
  • the communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the sensors 126, the database 130, the cloud 134, and/or any other system or component not part of the robotic system 100), and/or for transmitting instructions, images, or other information to an external system or device (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the sensors 126, the database 130, the cloud 134, and/or any other system or component not part of the robotic system 100).
  • the communication interface 108 may include one or more wired interfaces (e.g., a USB port, an Ethernet port, etc.) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth).
  • the communication interface 108 may be useful for enabling the computing device 102 to communicate with one or more other electronic processors or computing devices, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
  • the computing device 102 may also include one or more user interfaces 110.
  • the user interface 110 may be or include a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
  • the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the robotic system 100 (e.g., by the electronic processor 104 or another component of the robotic system 100) or received by the robotic system 100 from a source external to the robotic system 100.
  • the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the electronic processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
  • the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102.
  • the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
  • FIG. 1 illustrates only a single electronic processor 104, memory 106, communication interface 108, and user interface 110
  • alternative embodiments of the computing device 102 may include multiple electronic processors, memory modules, communication interfaces, and/or user interfaces.
  • the robotic system 100 may include other computing devices, each including similar components as, and configured similarly to, the computing device 102.
  • portions of the computing device 102 are implemented partially or entirely on a semiconductor chip (e.g., an application specific integrated circuit (ASIC), a field-programmable gate array (“FPGA”), and the like).
  • the various modules and controllers described herein may be implemented as individual controllers, as illustrated, or as components of a single controller. In some aspects, a combination of approaches may be used.
  • the imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy and/or objects such as the surgical tool 128 to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.).
  • image data refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
  • the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof, and/or objects such as the surgical tool 128.
  • the image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
  • the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
  • the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other suitable imaging device.
  • the sensors 126 are configured to provide sensor output.
  • the sensors 126 may include a position sensor, a proximity sensor, a magnetometer, or an accelerometer.
  • the sensors 126 may include a linear encoder, a rotary encoder, or an incremental encoder (e.g., positioned to sense movement or position of the robotic arm 116).
  • Sensor output or output data from the sensors 126 may be provided to an electronic processor of the robot 114, to the electronic processor 104 of the computing device 102, and/or to the navigation system 118.
  • Output data from the sensor(s) 126 may also be used to determine position information for the robot 114. It will be appreciated that in some embodiments, the sensors 126 may be a component separate from the robotic arm 116.
  • sensors 136, which may be the same as or similar to the sensors 126, may be integrated with the robot 114.
  • the sensors 126 may enable the electronic processor 104 (or an electronic processor of the robot 114) to determine a precise pose in space of a robotic arm 116 (as well as any object or element held by or secured to the robotic arm).
  • sensor output or output data from the sensors 126 may be used to calculate a position in space of the robotic arm 116 (and thus, the surgical tool 128) relative to one or more coordinate systems.
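As a concrete illustration of this kind of calculation, the following minimal sketch (not the platform's actual implementation) composes homogeneous transforms along a hypothetical kinematic chain to obtain the tool-tip position in a world coordinate system; all frame names and numeric values are invented for illustration:

```python
import numpy as np

def transform(rot_z_rad, translation_xyz):
    """Build a 4x4 homogeneous transform: rotation about Z, then translation."""
    c, s = np.cos(rot_z_rad), np.sin(rot_z_rad)
    t = np.eye(4)
    t[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    t[:3, 3] = translation_xyz
    return t

# Hypothetical chain: world -> robot base -> arm flange -> tool tip.
# In a real system, the base pose would come from registration/navigation,
# and the flange pose from the arm's joint encoders (sensors 126/136).
T_world_base = transform(0.0, [1.0, 0.5, 0.0])          # from registration
T_base_flange = transform(np.pi / 6, [0.2, 0.0, 0.4])   # from encoders
T_flange_tool = transform(0.0, [0.0, 0.0, 0.15])        # tool calibration

T_world_tool = T_world_base @ T_base_flange @ T_flange_tool
print("tool tip in world coordinates (m):", T_world_tool[:3, 3])
```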
  • the robot 114 may be any surgical robot or part of any robotic assisted surgery system, either of which is capable of operating as described herein.
  • the robot 114 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system, or any derivative thereof.
  • the robot 114 may be configured to position the surgical tool 128 at one or more precise poses (e.g., position(s) and orientation(s)).
  • the surgical tool 128 may be any tool capable of cutting, drilling, milling, and/or parting an anatomical element.
  • the surgical tool 128 may be, in one example, a drill bit.
  • the robot 114 may be configured to rotate and/or advance the surgical tool 128 using, for example, one or more motors.
  • the robot 114 may additionally or alternatively be configured to manipulate any component (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task.
  • the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure.
  • the robot 114 may comprise one or more robotic arms 116.
  • the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms.
  • one or more of the robotic arms 116 may be used to hold and/or maneuver the surgical tool 128.
  • Each robotic arm 116 may be positionable independently of the other robotic arm.
  • the robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
  • the robot 114 together with the robotic arm(s) 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, a surgical tool 128 or another object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
  • reference markers (e.g., navigation markers) may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the robotic system 100 or any component thereof.
  • the navigation system 118 provides navigation for a surgeon and/or a surgical robot during an operation.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras (e.g., the camera 210 illustrated in FIG. 2) or other sensor(s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the robotic system 100 is located.
  • the one or more cameras may be optical cameras, infrared cameras, or other cameras.
  • the navigation system 118 may comprise one or more electromagnetic sensors.
  • the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, the surgical tool 128, and/or one or more other tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to one or more of the foregoing).
  • the navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
  • the robotic system 100 can operate without the use of the navigation system 118.
  • the navigation system 118 may be configured to provide guidance to a surgeon or other user of the robotic system 100 or a component thereof, to the robot 114, or to any other element of the robotic system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
  • the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system).
  • the database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the robotic system 100); one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the robotic system 100; and/or any other useful information.
  • the database 130 stores patient data associated with one or more patients.
  • Patient data may include, for example, an age associated with a patient, a body mass index (“BMI”) associated with a patient, a bone density associated with a patient, a computed tomography scan associated with a patient, a sex associated with a patient, an implant history associated with a patient (for example, a date and a placement location associated with an implant such as a screw, a rod, a cage, a prosthesis, or the like), other aspects of a patient’s medical history (e.g., smoker/non- smoker, chronic and acute illnesses, and the like), or combinations of the foregoing.
  • the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the robotic system 100 or external to the robotic system 100, whether directly or via the cloud 134.
  • the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
  • the cloud 134 may be or represent the Internet or any other wide area network.
  • the computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both.
  • the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
  • the robotic system 100 or similar systems may be used, for example, to carry out one or more aspects of the methods described herein.
  • the robotic system 100 or similar systems may also be used for other purposes.
  • FIG. 2 illustrates a representative example system 200 of the robotic system 100.
  • the system 200 includes the computing device 102, the robot 114, and the navigation system 118.
  • the illustrated example of the robot 114 includes one robotic arm 116, and the surgical tool 128. As illustrated in FIG. 2, the robot 114 may be used to perform procedures on an anatomical element 204.
  • the navigation system 118 includes a camera 210, which has a field of view 206. As illustrated in FIG. 2, the field of view may encompass the anatomical element 204 (including a navigation marker 208) and the robotic arm 116.
  • FIG. 3 illustrates an example method 300 for operating the system of FIG. 1 to perform robotic surgery using adaptive bone removal. Although the method 300 is described in conjunction with the robotic system 100 as described herein, the method 300 could be used with other systems and devices. In addition, the method 300 may be modified or performed differently than the example provided. In particular, the method 300 is also applicable to surgeries using free hand surgical tools.
  • the method 300 is described as being performed by the computing device 102, and, in particular, the electronic processor 104. However, it should be understood that, in some examples, portions of the method 300 may be performed by other components of the robotic system 100, such as, for example, the robot 114 and the navigation system 118.
  • the electronic processor 104 receives a scan of an anatomical element.
  • the electronic processor 104 may receive a computed tomography scan from the database 130 (e.g., taken during preoperative evaluation).
  • the electronic processor 104 may control imaging devices 112, sensors 126, or another medical scanning device coupled to the system 100 to perform a scan of an anatomical element (e.g., the anatomical element 204) within the operating theater as part of the surgical operation.
  • other types of scans may be performed or received, including a magnetic resonance imaging scan, a positron emission tomography scan, an ultrasound scan, and a fluoroscopy scan.
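As a minimal illustration of receiving such a scan in software, the sketch below loads a volume with the open-source SimpleITK library and extracts the voxel array and spacing used by the later segmentation and volume calculations. The file name is a placeholder; the platform's actual scan interface is not specified here.

```python
import SimpleITK as sitk

# Placeholder path; in practice the scan might arrive from a PACS
# (database 130) or directly from an intraoperative imaging device 112.
image = sitk.ReadImage("vertebra_ct.nii.gz")

voxels = sitk.GetArrayFromImage(image)   # array of shape (slices, rows, cols)
spacing_mm = image.GetSpacing()          # (x, y, z) voxel size in mm

print("scan shape:", voxels.shape, "voxel spacing (mm):", spacing_mm)
```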
  • the electronic processor 104 performs a sub-anatomical segmentation on the scan to determine one or more segmented parts.
  • the anatomical element 204 shown in a scan includes many parts, including the pars interarticularis, pedicle, laminae, body, facets, and transverse processes.
  • four segmented parts 402 are identified: a pars interarticularis, a pedicle, and two laminae.
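To make the segmented parts usable for volume-based limits, the system needs each part's total volume. A minimal sketch follows, assuming the segmentation step produces an integer label volume; the label values and voxel spacing below are hypothetical placeholders:

```python
import numpy as np

# Hypothetical label values for sub-anatomical parts of one vertebra.
PART_LABELS = {1: "pars interarticularis", 2: "pedicle",
               3: "left lamina", 4: "right lamina"}

def part_volumes_mm3(labels, spacing_mm=(0.5, 0.5, 0.5)):
    """Return the volume (mm^3) of each labeled part in a segmentation array."""
    voxel_mm3 = float(np.prod(spacing_mm))
    return {name: int(np.count_nonzero(labels == value)) * voxel_mm3
            for value, name in PART_LABELS.items()}

# Toy stand-in for a real segmentation of a CT scan.
rng = np.random.default_rng(0)
toy_labels = rng.integers(0, 5, size=(64, 64, 64))
print(part_volumes_mm3(toy_labels))
```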
  • a bone removal limit may be a percentage of volume that can be removed without creating an unacceptable instability.
  • the bone removal limit is based on a desired post-operative spinal stability.
  • the bone removal limits may be based on standard rules (e.g., remove no more than 50% of a facet capsule). However, rules may be altered based on individual characteristics of the patient and the anatomical element.
  • the bone removal limits may be determined based on a bone mineral density for the segmented parts.
  • FIG. 5 illustrates a scan 500 showing bone mineral density for several vertebrae.
  • bone removal limits may also be determined based on one or more other characteristics. For example, bone removal limits may be based on a patient's sex, age, or another overall characteristic relevant to spinal stability. In another example, the bone removal limit may be based on an implant history associated with the patient that affects spinal stability (e.g., whether spinal fusions have been performed, whether screws, rods, cages, and the like have been installed, etc.). In another example, the bone removal limit may be based on the medical history of the patient (e.g., whether the patient is a smoker, whether the patient has suffered from acute or chronic conditions relevant to spinal stability, etc.).
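One way such characteristics could feed into a limit is to scale a baseline rule by patient-specific factors. The sketch below is purely illustrative: the thresholds and multipliers are invented placeholders, not clinical values, and a real system would derive them from validated stability models.

```python
def bone_removal_limit(base_fraction, bmd_mg_cm3, age_years, smoker):
    """Scale a baseline removal limit by patient factors.

    All coefficients below are hypothetical placeholders for
    illustration only; they are not clinical values.
    """
    limit = base_fraction
    if bmd_mg_cm3 < 120.0:   # hypothetical low bone-density threshold
        limit *= 0.8
    if age_years > 70:
        limit *= 0.9
    if smoker:
        limit *= 0.9
    return limit

# E.g., a standard "no more than 50% of the part" rule, tightened
# for an older smoker with low bone mineral density:
print(bone_removal_limit(0.50, bmd_mg_cm3=100.0, age_years=75, smoker=True))
```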
  • the electronic processor 104 controls the robotic arm according to a user input to remove a volume of bone from one or more of the segmented parts with the surgical tool 128.
  • a surgeon may provide user input by selecting a procedure from the user interface 110 or by manipulating the surgical tool 128 directly to perform bone removal assisted by the robotic arm.
  • the electronic processor 104 tracks the movement of the surgical tool 128.
  • the electronic processor 104 may use inputs from imaging devices 112 or sensors 126, 136 to register the anatomical element using the navigation marker 208.
  • the electronic processor 104 tracks the movement of the robotic arm 116, and in particular the surgical tool 128, relative to the navigation marker (e.g., using the navigation system 118).
  • the electronic processor 104 can use knowledge of the movements and the state of the surgical tool 128 (e.g., the type of tool head and how it is operating) to generate data representing the volume of bone being removed.
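A plausible way to turn the tracked tool poses and tool state into removed-volume data is to carve a voxelized model of the anatomical element along the sampled tool path. The sketch below assumes a spherical burr head of known radius and a voxel grid aligned with the anatomy, which are simplifying assumptions; the full-grid distance test is written for clarity rather than speed.

```python
import numpy as np

def carve_sphere(removed, center_ijk, radius_vox):
    """Mark all voxels within radius_vox of center_ijk as removed (in place)."""
    zz, yy, xx = np.indices(removed.shape)
    cz, cy, cx = center_ijk
    dist2 = (zz - cz) ** 2 + (yy - cy) ** 2 + (xx - cx) ** 2
    removed |= dist2 <= radius_vox ** 2

# Toy example: a 40^3 voxel grid, a burr radius of 2 voxels, and a
# short, straight tool path sampled from the tracking data.
removed = np.zeros((40, 40, 40), dtype=bool)
path_ijk = [(20, 20, k) for k in range(10, 30)]   # sampled tool-tip poses
for tip in path_ijk:
    carve_sphere(removed, tip, radius_vox=2.0)

voxel_mm3 = 0.5 ** 3                              # hypothetical 0.5 mm voxels
print("estimated removed volume (mm^3):", removed.sum() * voxel_mm3)
```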
  • the electronic processor 104 determines whether the volume of bone removed exceeds the bone removal limit for a segmented part (e.g., by comparing the data representing the volume to the bone removal limit). When the volume does not exceed the bone removal limit (at block 310), the electronic processor 104 continues to operate the robotic arm 116 and the surgical tool according to user inputs from the surgeon.
  • the electronic processor 104 When, at block 310, the volume does exceed the bone removal limit, the electronic processor 104, at block 314 prevents the surgical tool from interacting with the segmented part, for example, user inputs attempting to operate the tool head of the surgical tool 128 to remove bone from the segmented part may be ignored, while inputs attempting to operate the tool head of the surgical tool 128 to remove bone from areas outside the segmented part may be processed. In some aspects, the electronic processor 104 may stop operating the surgical tool. In some examples, responsive to determining that the volume exceeds the bone removal limit, the electronic processor 104 may control the user interface to generate an alert (e.g., audio, visual, haptic, or combinations of the forgoing) to inform the surgeon that a removal limit has been reached.
  • bone removal limits are not based on a single segmented part, but on combinations of bone removal.
  • a bone removal limit may dictate that no more than 50% of a pars volume or no more than 60% of a lamina volume may be removed.
  • a bone removal limit may dictate that no more than 30% of a pars volume and no more than 30% of a lamina volume may be removed.
  • the electronic processor 104 may prevent the surgical tool from interacting with one or both of two segmented parts when a combination of removed volumes exceeds the bone removal limit based on the two segmented parts.
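The exact semantics of such combined limits are not fully specified here; the following sketch adopts one possible reading, in which a single-entry rule acts as an ordinary per-part cap and a multi-part rule trips once every cap in the combination has been reached. The rule values echo the examples above but are otherwise hypothetical.

```python
# Each rule maps segmented parts to a maximum removed fraction. Under
# this reading, a rule is violated once every cap in it has been
# reached, so a single-entry rule is an ordinary per-part limit.
RULES = [
    {"pars": 0.50},
    {"lamina": 0.60},
    {"pars": 0.30, "lamina": 0.30},   # aggregate limit on the combination
]

def blocked_parts(removed_fraction):
    """Return the set of parts from which further removal should be blocked."""
    blocked = set()
    for rule in RULES:
        if all(removed_fraction.get(part, 0.0) >= cap
               for part, cap in rule.items()):
            blocked |= set(rule)
    return blocked

# 31% of the pars and 32% of the lamina removed: neither standalone limit
# is reached, but the combination rule blocks both parts.
print(blocked_parts({"pars": 0.31, "lamina": 0.32}))
```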
  • the adaptive bone removal method 300 may also be applicable to the use of non-robotic surgical tools.
  • embodiments of the system 100 may include a surgical tool, which is operable by a surgeon in a free hand mode.
  • the method 300 may be applied to track movement of the free hand operated surgical tool to determine the volume of bone removal and apply bone removal limits.
  • the electronic processor 104 may control the surgical tool to stop operating and/or generate an alert that a limit has been reached.
  • references herein to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the disclosure.
  • the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments necessarily mutually exclusive of other embodiments. The same applies to the term “implementation.”
  • the conjunction “if” may also or alternatively be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” which construal may depend on the corresponding specific context.
  • the phrase “if it is determined” or “if [a stated condition] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event].”
  • the terms “couple,” “coupling,” “coupled,” “connect,” “connecting,” or “connected” refer to any manner known in the art or later developed in which energy is allowed to be transferred between two or more elements, and the interposition of one or more additional elements is contemplated, although not required.
  • the terms “directly coupled,” “directly connected,” et cetera imply the absence of such additional elements.
  • the same distinction applies to the terms “attached” and “directly attached,” as applied to a description of a physical structure; a relatively thin layer of adhesive or other suitable binder can be used to implement such “direct attachment” of the two corresponding components in such physical structure.
  • processor or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), and nonvolatile storage. Other hardware, conventional and/or custom, may also be included.
  • any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
  • circuitry may refer to one or more or all of the following: (a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry); (b) combinations of hardware circuits and software, such as (as applicable): (i) a combination of analog and/or digital hardware circuit(s) with software/firmware and (ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and (c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.
  • This definition of circuitry applies to all uses of this term in this application, including in any claims.
  • Example 5 The medical system of any of Examples 1-4, wherein the electronic processor is further configured to: determine a bone removal limit for the segmented part based on at least one selected from a group consisting of a sex of a patient, an age of the patient, an implant history associated with the patient, and a medical history of the patient.
  • the electronic processor is further configured to: perform the sub-anatomical segmentation on the scan to determine a second segmented part; analyze the second segmented part to determine a second bone removal limit for the second segmented part; control the robotic arm according to a second user input to remove a second volume of bone from the second segmented part with the surgical tool; and when a combination of the volume and the second volume exceeds the bone removal limit, prevent the surgical tool from interacting with the segmented part and the second segmented part.
  • Example 12 The method of any of Examples 9-11, wherein determining the bone removal limit for the segmented part includes determining the bone removal limit based on a bone mineral density for the segmented part.
  • Example 16 The method of any of Examples 9-15, wherein receiving the scan of an anatomical element includes receiving one selected from a group consisting of a computed tomography scan, a magnetic resonance imaging scan, a positron emission tomography scan, an ultrasound scan, and a fluoroscopy scan.

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Manipulator (AREA)

Abstract

An example system may include a robot including a robotic arm, a surgical tool coupled to the robotic arm, and an electronic processor coupled to the robot and configured to: receive a scan of an anatomical element; perform a sub-anatomical segmentation on the scan to determine a segmented part; analyze the segmented part to determine a bone removal limit for the segmented part; control the robotic arm according to a user input to remove a volume of bone from the segmented part with the surgical tool; and, when the volume exceeds the bone removal limit, prevent the surgical tool from interacting with the segmented part.

Description

ADAPTIVE BONE REMOVAL SYSTEM AND METHOD
FIELD
[0001] This application relates generally to the field of robotics assisted surgery.
BACKGROUND
[0002] Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure or may complete one or more surgical procedures autonomously. During such surgical procedures, surgical tools may be used on one or more anatomical elements. The tools may be oriented and operated by the surgical robot and/or the surgeon or other medical provider.
BRIEF DESCRIPTION OF THE DRAWINGS
[0003] The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments, examples, aspects, and features of concepts that include the claimed subject matter and explain various principles and advantages of those embodiments, examples, aspects, and features.
[0004] FIG. 1 is a block diagram illustrating a robotic system according to various examples.
[0005] FIG. 2 illustrates aspects of the operation of the robotic system of FIG. 1 according to various examples.
[0006] FIG. 3 illustrates a flowchart of a method performed by the robotic system of FIG. 1 according to various examples.
[0007] FIG. 4 is a chart illustrating a segmentation performed by the robotic system of FIG. 1 according to various examples.
[0008] FIG. 5 illustrates aspects of the operation of the robotic system of FIG. 1 according to various examples.
[0009] Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of examples, aspects, and features illustrated.
[0010] In some instances, the apparatus and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the various embodiments, examples, aspects, and features, so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0011] When performing robotics assisted spinal surgery, it may be necessary to remove bone from portions of the spine. For example, bone removal may be necessary to accomplish spinal decompression. However, bone removal may result in reduced spinal stability. Restoring spinal stability may require performing vertebral fusion and/or the addition of screws and rods to the spine. Accordingly, during surgical intervention involving the spine, it is important that the necessary bone removal be performed without de-stabilizing the spine.
[0012] Spinal surgery must be performed according to rules that limit where and how much bone may be removed, to prevent spinal instability. The rules may limit how much volume (e.g., as a percentage) of a spinal segment may be removed from particular parts of the spinal segment without causing instability. The rules may also provide aggregate limits (e.g., a percentage of overall volume for a segment, or a combination of limits for different parts of the segment being reached). Existing robotic surgical platforms, however, operate according to surgical plans that are determined without taking the applicable rules into account.
[0013] Bone removal may be performed using a hand-guided robotic arm, which allows the surgeon to perform the bone removal only in designated areas, based on the rules. However, as the surgery progresses, circumstances may develop that make it advisable for the surgeon to remove more bone from a part of the spinal segment than originally planned, or to remove bone from different parts of the spinal segment than called for in the plan. When this occurs, the surgeon may not be able to determine how these changes affect whether the rules’ limits have been met.
[0014] To address these problems, embodiments and aspects presented herein provide an adaptive bone removal method. In some aspects, a robotic surgical platform incorporates an active constraint system, which allows a surgeon to operate the surgical tool but restricts efforts to remove too much bone. Using such embodiments, a sub-anatomical segmentation and a bone mineral density analysis are performed to create a plan for bone removal that does not de-stabilize the spine. As the surgery is performed, the robotic surgical platform calculates and records the bone removed. Where limits are met, the robotic surgical platform prevents the surgeon from removing further bone that would contribute to instability of the spine, while continuing to allow removal from areas of the spinal segment where such removal would not contribute to instability.
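The bookkeeping behind such an active constraint can be summarized in a few lines. The following is a minimal sketch under stated assumptions: class and method names are hypothetical, and a real platform would tie this logic into the robot's motion controller and safety systems rather than a simple boolean check.

```python
class ActiveConstraint:
    """Track removed volume per segmented part and gate tool operation.

    Hypothetical sketch: total part volumes come from segmentation,
    limits from the planning step, and removal amounts from the
    volume-estimation pipeline.
    """

    def __init__(self, part_volume_mm3, limit_fraction):
        self.part_volume = part_volume_mm3            # total volume per part
        self.limit = limit_fraction                   # allowed fraction per part
        self.removed = {part: 0.0 for part in part_volume_mm3}

    def record_removal(self, part, volume_mm3):
        """Accumulate removed volume as the surgery progresses."""
        self.removed[part] += volume_mm3

    def may_remove_from(self, part):
        """Consulted before executing a user input on this part."""
        return self.removed[part] / self.part_volume[part] < self.limit[part]

constraint = ActiveConstraint({"pedicle": 2000.0}, {"pedicle": 0.4})
constraint.record_removal("pedicle", 900.0)
print(constraint.may_remove_from("pedicle"))  # False: 45% removed vs 40% limit
```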
[0015] In some aspects, the examples presented herein calculate bone removal taking into account the volume of bone removed. In other aspects, the platform may take into account information beyond the volume of bone (e.g., patient characteristics). For example, bone mineral density of the vertebrae, patient age, patient smoker status, overall spinal alignment, and the like may be used to predict post-operative spinal stability as material is removed during surgery.
[0016] Using the examples presented herein, a surgeon has the freedom to address changing circumstances during a surgical procedure by deviating from a pre-operative surgical plan while staying within recommended tolerances for bone removal to prevent spinal instability.
[0017] In some aspects, the techniques described herein relate to a medical system including: a robot including a robotic arm; a surgical tool coupled to the robotic arm; an electronic processor coupled to the robot, and configured to: receive a scan of an anatomical element; perform a sub- anatomical segmentation on the scan to determine a segmented part; analyze the segmented part to determine a bone removal limit for the segmented part; control the robotic arm according to a user input to remove a volume of bone from the segmented part with the surgical tool; and when the volume exceeds the bone removal limit, prevent the surgical tool from interacting with the segmented part.
[0018] In some aspects, the techniques described herein relate to a method for operating a surgical robot, the method including: receiving a scan of an anatomical element; performing a sub- anatomical segmentation on the scan to determine a segmented part; analyzing the segmented part to determine a bone removal limit for the segmented part; controlling a robotic arm of the surgical robot according to a user input to remove a volume of bone from the segmented part with a surgical tool; and when the volume exceeds the bone removal limit, preventing the surgical tool from interacting with the segmented part.
[0019] Specific embodiments of the present disclosure are now described with reference to the figures, wherein like reference numbers indicate identical or functionally similar elements. The terms “distal” and “proximal” are used in the following description with respect to a position or direction relative to the surgical robot. “Distal” or “distally” refers to a position distant from, or a direction away from, the surgical robot toward the patient. “Proximal” or “proximally” refers to a position near, or a direction away from, the patient toward the surgical robot.

[0020] Before any examples are explained in detail, it is to be understood that the examples presented herein are not limited in their application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The examples are capable of other embodiments and of being practiced or of being carried out in various ways. For ease of description, the example systems presented herein may be illustrated with a single exemplar of each of their component parts. Some examples may not describe or illustrate all components of the systems. Other example embodiments may include more or fewer of each of the illustrated components, may combine some components, or may include additional or alternative components.
[0021] FIG. 1 is a block diagram of a robotic system 100. The robotic system 100 may be used to carry out robotic assisted surgery, including one or more aspects of one or more of the methods disclosed herein. The robotic system 100 includes a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, one or more sensors 126, a database 130, and/or a cloud (or another network) 134. Systems according to other embodiments of the present disclosure may comprise more or fewer components than the robotic system 100. The computing device 102 includes an electronic processor 104, a memory 106, a communication interface 108, and a user interface 110. In some aspects, the computing device 102 may include more or fewer components than illustrated in the example.
[0022] The computing device 102 includes an electronic processor 104 (for example, a microprocessor, an application specific integrated circuit, etc.), a memory 106, a communication interface 108, and a user interface 110. The electronic processor 104, the memory 106, the communication interface 108, and the user interface 110, as well as other various modules (not illustrated), are coupled directly, by one or more control or data buses (e.g., the bus 140), or by a combination thereof.
[0023] The memory 106 may be made up of one or more non-transitory computer-readable media and includes at least a program storage area and a data storage area. The program storage area and the data storage area can include combinations of several types of memory, such as read-only memory (“ROM”), random access memory (“RAM”) (for example, dynamic RAM (“DRAM”), synchronous DRAM (“SDRAM”), etc.), electrically erasable programmable read-only memory (“EEPROM”), flash memory, or any other suitable tangible, non-transitory memory for storing computer-readable data and/or instructions. The memory 106 may store information or data useful for completing, for example, any aspect of the method 300 described herein, or of any other methods. The memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114. For instance, the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the electronic processor 104, enables image processing 120, sensor processing 122, and/or active constraints 124. Such content may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines.
[0024] The image processing 120 enables the electronic processor 104 to process image data of an image (received from, for example, the imaging device 112, a camera of the navigation system 118, or any imaging device) for the purpose of, for example, identifying information about an anatomical element 204 (shown in FIG. 2) and/or objects in the image such as a surgical tool 128. The identifying information can be used to determine a three-dimensional location of the surgical tool 128, for example, relative to the anatomical element 204.
[0025] The sensor processing 122 enables the electronic processor 104 to process sensor output data (received from, for example, the one or more sensors 126) for the purpose of, for example, determining the location of the robotic arm 116 and/or the surgical tool 128. The sensor output may be received as signal(s) and may be processed by the electronic processor 104 using the sensor processing 122 to output data such as, for example, force data, acceleration data, pose data, time data, location data, etc.
[0026] The active constraints 124 enables the electronic processor 104 to restrict operations of the robotic arm 116 and the surgical tool 128, based in part on the adaptive bone removal methods disclosed herein. In some aspects, the active constraints 124 makes or uses determinations of bone removal made during surgical operations to plan and control movement of the robotic arm 116, as described herein.
[0027] Alternatively, or additionally, the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the electronic processor 104 to carry out the various methods and features described herein. Thus, although various contents of memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models. The data, algorithms, and/or instructions may cause the electronic processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, the sensors 126, and/or the cloud 134.

[0028] The electronic processor 104 sends and receives information (for example, from the memory 106, the communication interface 108, and/or the user interface 110) and processes the information by executing one or more software instructions or modules, capable of being stored in the memory 106, or another non-transitory computer readable medium. The software can include firmware, one or more applications, program data, filters, rules, one or more program modules, and other executable instructions. The electronic processor 104 is configured to retrieve from the memory 106 and execute, among other things, software for performing techniques and methods as described herein.
[0029] The communication interface 108 transmits and receives information from devices external to the computing device 102, for example, components of the robotic system 100. The communication interface 108 receives input (for example, from the user interface 110), provides system output, or both. The communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the sensors 126, the database 130, the cloud 134, and/or any other system or component not part of the robotic system 100), and/or for transmitting instructions, images, or other information to an external system or device (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the sensors 126, the database 130, the cloud 134, and/or any other system or component not part of the robotic system 100).
[0030] The communication interface 108 may include one or more wired interfaces (e.g., a USB port, an Ethernet port, etc.) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth). In some embodiments, the communication interface 108 may be useful for enabling the computing device 102 to communicate with one or more other electronic processors or computing devices, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
[0031] The computing device 102 may also include one or more user interfaces 110. The user interface 110 may be or include a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user. The user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the robotic system 100 (e.g., by the electronic processor 104 or another component of the robotic system 100) or received by the robotic system 100 from a source external to the robotic system 100. In some embodiments, the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the electronic processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
[0032] Although the user interface 110 is shown as part of the computing device 102, in some embodiments, the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102. In some embodiments, the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
[0033] It should be understood that although FIG. 1 illustrates only a single electronic processor 104, memory 106, communication interface 108, and user interface 110, alternative embodiments of the computing device 102 may include multiple electronic processors, memory modules, communication interfaces, and/or user interfaces. It should also be noted that the robotic system 100 may include other computing devices, each including similar components as, and configured similarly to, the computing device 102. In some embodiments, portions of the computing device 102 are implemented partially or entirely on a semiconductor chip (e.g., an application specific integrated circuit (ASIC), a field-programmable gate array (“FPGA”), and the like). Similarly, the various modules and controllers described herein may be implemented as individual controllers, as illustrated, or as components of a single controller. In some aspects, a combination of approaches may be used.
[0034] Continuing with other aspects of the robotic system 100, the imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy and/or objects such as the surgical tool 128 to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.). “Image data” as used herein refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form. In various examples, the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof, and/or objects such as the surgical tool 128. The image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure. The imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data. The imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient and/or objects such as the surgical tool 128.
[0035] The sensors 126 are configured to provide sensor output. The sensors 126 may include a position sensor, a proximity sensor, a magnetometer, or an accelerometer. In some embodiments, the sensors 126 may include a linear encoder, a rotary encoder, or an incremental encoder (e.g., positioned to sense movement or position of the robotic arm 116). Sensor output or output data from the sensors 126 may be provided to an electronic processor of the robot 114, to the electronic processor 104 of the computing device 102, and/or to the navigation system 118. Output data from the sensor(s) 126 may also be used to determine position information for the robot 114. It will be appreciated that in some embodiments, the sensors 126 may be a component separate from the robotic arm 116. In other embodiments, sensors 136 — which may be the same as or similar to the sensors 126 — may be integrated with the robot 114. The sensors 126 may enable the electronic processor 104 (or an electronic processor of the robot 114) to determine a precise pose in space of a robotic arm 116 (as well as any object or element held by or secured to the robotic arm). In other words, sensor output or output data from the sensors 126 may be used to calculate a position in space of the robotic arm 116 (and thus, the surgical tool 128) relative to one or more coordinate systems.
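As a concrete illustration of turning encoder readings into a tool pose, the sketch below chains homogeneous transforms for a two-joint planar arm. The two-link geometry and all names are assumptions chosen for brevity; they do not describe the kinematics of the robot 114.

```python
import numpy as np

def link_transform(theta: float, length: float) -> np.ndarray:
    """Rotate by the joint angle, then translate along the rotated link."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, length * c],
                     [s,  c, length * s],
                     [0,  0, 1.0]])

def tool_pose(joint_angles, link_lengths) -> np.ndarray:
    """Compose per-link transforms into the tool-tip pose in the base frame."""
    pose = np.eye(3)
    for theta, length in zip(joint_angles, link_lengths):
        pose = pose @ link_transform(theta, length)
    return pose

# Example: rotary-encoder readings of 30 and 45 degrees on two 0.3 m links.
tip_pose = tool_pose(np.radians([30.0, 45.0]), [0.3, 0.3])
tip_xy = tip_pose[:2, 2]   # tool-tip position relative to the robot base
```

A registration step (e.g., via the navigation system 118) would then map this base-frame pose into the patient or image coordinate system.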
[0036] The robot 114 may be any surgical robot or part of any robotic assisted surgery system, either of which is capable of operating as described herein. The robot 114 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system, or any derivative thereof. The robot 114 may be configured to position the surgical tool 128 at one or more precise poses (e.g., position(s) and orientation(s)). The surgical tool 128 may be any tool capable of cutting, drilling, milling, and/or parting an anatomical element. The surgical tool 128 may be, in one example, a drill bit. In some embodiments, the robot 114 may be configured to rotate and/or advance the surgical tool 128 using, for example, one or more motors.
[0037] The robot 114 may additionally or alternatively be configured to manipulate any component (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task. In some embodiments, the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure. The robot 114 may comprise one or more robotic arms 116. In some embodiments, the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms. In some embodiments, one or more of the robotic arms 116 may be used to hold and/or maneuver the surgical tool 128. Each robotic arm 116 may be positionable independently of the other robotic arm. The robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
[0038] The robot 114, together with the robotic arm(s) 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, a surgical tool 128 or another object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
[0039] In some embodiments, reference markers (e.g., navigation markers) may be placed on the robot 114 (including, e.g., on the robotic arm 116), the surgical tool 128, or any other object in the surgical space. The reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the robotic system 100 or any component thereof.
[0040] The navigation system 118 provides navigation for a surgeon and/or a surgical robot during an operation. The navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof. The navigation system 118 may include one or more cameras (e.g., the camera 210 illustrated in FIG. 2) or other sensor(s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the robotic system 100 is located. The one or more cameras may be optical cameras, infrared cameras, or other cameras. In some embodiments, the navigation system 118 may comprise one or more electromagnetic sensors. In various embodiments, the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, the surgical tool 128, and/or one or more other tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to one or more of the foregoing). The navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118. In some embodiments, the robotic system 100 can operate without the use of the navigation system 118. The navigation system 118 may be configured to provide guidance to a surgeon or other user of the robotic system 100 or a component thereof, to the robot 114, or to any other element of the robotic system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
[0041] The database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system). The database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the robotic system 100); one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the robotic system 100; and/or any other useful information.
[0042] In some implementations, the database 130 stores patient data associated with one or more patients. Patient data may include, for example, an age associated with a patient, a body mass index (“BMI”) associated with a patient, a bone density associated with a patient, a computed tomography scan associated with a patient, a sex associated with a patient, an implant history associated with a patient (for example, a date and a placement location associated with an implant such as a screw, a rod, a cage, a prosthesis, or the like), other aspects of a patient’s medical history (e.g., smoker/non-smoker, chronic and acute illnesses, and the like), or combinations of the foregoing. In some implementations, patient data is associated with a patient undergoing a surgical procedure wherein the systems and methods described herein are utilized. In some implementations, the patient data may be stored in the database 130 prior to a surgical procedure. In some implementations, the electronic processor 104 is configured to send, to the database 130, a query requesting patient data associated with a unique patient identifier and receive, from the database 130, patient data associated with the unique patient identifier.
[0043] The database 130 may be configured to provide any such information to the computing device 102 or to any other device of the robotic system 100 or external to the robotic system 100, whether directly or via the cloud 134. In some embodiments, the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
[0044] The cloud 134 may be or represent the Internet or any other wide area network. The computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both. In some embodiments, the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
[0045] The robotic system 100 or similar systems may be used, for example, to carry out one or more aspects of the methods described herein. The robotic system 100 or similar systems may also be used for other purposes.
[0046] FIG. 2 illustrates a representative example system 200 of the robotic system 100. The system 200 includes the computing device 102, the robot 114, and the navigation system 118. The illustrated example of the robot 114 includes one robotic arm 116 and the surgical tool 128. As illustrated in FIG. 2, the robot 114 may be used to perform procedures on an anatomical element 204.
[0047] The navigation system 118 includes a camera 210, which has a field of view 206. As illustrated in FIG. 2, the field of view may encompass the anatomical element 204 (including a navigation marker 208) and the robotic arm 116.

[0048] FIG. 3 illustrates an example method 300 for operating the system of FIG. 1 to perform robotic surgery using adaptive bone removal. Although the method 300 is described in conjunction with the robotic system 100 as described herein, the method 300 could be used with other systems and devices. In addition, the method 300 may be modified or performed differently than the example provided. For example, the method 300 is also applicable to surgeries using free-hand surgical tools.
[0049] As an example, the method 300 is described as being performed by the computing device 102, and, in particular, the electronic processor 104. However, it should be understood that, in some examples, portions of the method 300 may be performed by other components of the robotic system 100, such as, for example, the robot 114 and the navigation system 118.
[0050] At block 302, the electronic processor 104 receives a scan of an anatomical element. For example, the electronic processor 104 may receive a computed tomography scan from the database 130 (e.g., taken during preoperative evaluation). In some aspects, the electronic processor 104 may control the imaging devices 112, the sensors 126, or another medical scanning device coupled to the robotic system 100 to perform a scan of an anatomical element (e.g., the anatomical element 204) within the operating theater as part of the surgical operation. Additionally or alternatively, other types of scans may be performed or received, including a magnetic resonance imaging scan, a positron emission tomography scan, an ultrasound scan, and a fluoroscopy scan.
[0051] At block 304, the electronic processor 104 performs a sub-anatomical segmentation on the scan to determine one or more segmented parts. For example, as illustrated in FIG. 4, a scan of the anatomical element 204 includes many parts, including the pars interarticularis, pedicles, laminae, vertebral body, facets, and transverse processes. As illustrated, four segmented parts 402 are identified: a pars interarticularis, a pedicle, and two laminae.
[0052] Returning to FIG. 3, at block 306, the electronic processor 104 analyzes the segmented parts to determine a bone removal limit for the segmented parts. For example, a bone removal limit may be a percentage of volume that can be removed without creating an unacceptable instability. In some aspects, the bone removal limit is based on a desired post-operative spinal stability. The bone removal limits may be based on standard rules (e.g., remove no more than 50% of a facet capsule). However, rules may be altered based on individual characteristics of the patient and the anatomical element. For example, the bone removal limits may be determined based on a bone mineral density for the segmented parts. FIG. 5 illustrates a scan 500 showing bone mineral density for several vertebrae. As illustrated in FIG. 5, bone density may vary by patient and across the anatomical element. In some cases, it may be advisable to adjust the bone removal limit lower than the standard because of a lower-than-average bone density, while in other cases it may be permissible to allow more volume to be removed because of a higher-than-average bone density. Bone removal limits may also be determined based on one or more other characteristics. For example, bone removal limits may be based on a patient’s sex, age, or another overall characteristic relevant to spinal stability. In another example, the bone removal limit may be based on an implant history associated with the patient that affects spinal stability (e.g., whether spinal fusions have been performed, whether screws, rods, cages, and the like have been installed, etc.). In another example, the bone removal limit may be based on the medical history of the patient (e.g., whether the patient is a smoker, whether the patient has suffered from acute or chronic conditions relevant to spinal stability, etc.).
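A minimal sketch of this limit determination is given below. Only the 50% facet rule comes from the text; the other base percentages, the clamping bounds, and the smoker adjustment are assumptions made for illustration, not clinical guidance.

```python
# Assumed base limits: fraction of each part's volume that may be removed.
BASE_LIMITS = {
    "facet": 0.50,                  # from the text: "no more than 50% of a facet capsule"
    "lamina": 0.60,                 # assumed
    "pars_interarticularis": 0.50,  # assumed
    "pedicle": 0.30,                # assumed
}

def bone_removal_limit(part: str, bmd_ratio: float, is_smoker: bool) -> float:
    """Return the allowed removable fraction of a segmented part's volume.

    bmd_ratio is the part's bone mineral density divided by a population
    reference, so values below 1.0 tighten the limit and values above 1.0
    relax it, mirroring the adjustments described above.
    """
    limit = BASE_LIMITS[part] * min(1.2, max(0.5, bmd_ratio))
    if is_smoker:
        limit *= 0.9                # assumed conservative adjustment
    return min(limit, 0.95)         # never permit near-total removal
```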
[0053] At block 308, the electronic processor 104 controls the robotic arm according to a user input to remove a volume of bone from one or more of the segmented parts with the surgical tool 128. For example, a surgeon may provide user input by selecting a procedure from the user interface 110 or by manipulating the surgical tool 128 directly to perform bone removal assisted by the robotic arm.
[0054] While the surgical tool is being operated, the electronic processor 104 tracks the movement of the surgical tool 128. For example, the electronic processor 104 may use inputs from the imaging devices 112 or the sensors 126, 136 to register the anatomical element using the navigation marker 208. As the robotic arm is controlled, the electronic processor 104 tracks the movement of the robotic arm 116, and in particular the surgical tool 128, relative to the navigation marker (e.g., using the navigation system 118). By tracking the movements of the surgical tool 128, the electronic processor 104 can use knowledge of the movements and the state of the surgical tool 128 (e.g., the type of tool head and how it is operating) to generate data representing the volume of bone being removed.
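One plausible way to turn tracked tool poses into removed-volume data is voxel bookkeeping over the segmented scan. The sketch below assumes a spherical burr tip and a fixed voxel pitch; all names are illustrative, and none of this code is taken from the disclosure.

```python
import numpy as np

VOXEL_MM = 0.5  # isotropic voxel edge length; an assumption for this sketch

class RemovalTracker:
    """Accumulates removed bone per segmented part from tracked tip positions."""

    def __init__(self, labels: np.ndarray):
        # labels: integer voxel grid where 0 = background and k > 0 = part id
        self.labels = labels
        self.removed = np.zeros(labels.shape, dtype=bool)

    def update(self, tip_ijk: tuple[int, int, int], burr_radius_mm: float) -> None:
        """Mark voxels within the burr radius of the tracked tool tip as removed."""
        r = int(np.ceil(burr_radius_mm / VOXEL_MM))
        lo = [max(0, c - r) for c in tip_ijk]
        hi = [min(n, c + r + 1) for c, n in zip(tip_ijk, self.labels.shape)]
        window = tuple(slice(a, b) for a, b in zip(lo, hi))
        gi, gj, gk = np.ogrid[window]
        i, j, k = tip_ijk
        sphere = (gi - i) ** 2 + (gj - j) ** 2 + (gk - k) ** 2 <= r ** 2
        self.removed[window] |= sphere & (self.labels[window] > 0)

    def removed_fraction(self, part_id: int) -> float:
        """Fraction of a part's original volume marked as removed so far."""
        part = self.labels == part_id
        total = np.count_nonzero(part)
        return np.count_nonzero(self.removed & part) / total if total else 0.0
```

Calling update once per tracked pose sample accumulates the removed volume; removed_fraction then yields the per-part data compared against the limit at block 310.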
[0055] At block 310, the electronic processor 104 determines whether the volume of bone removed exceeds the bone removal limit for a segmented part (e.g., by comparing the data representing the volume to the bone removal limit). When the volume does not exceed the bone removal limit (at block 310), the electronic processor 104 continues to operate the robotic arm 116 and the surgical tool according to user inputs from the surgeon.

[0056] When, at block 310, the volume does exceed the bone removal limit, the electronic processor 104, at block 314, prevents the surgical tool from interacting with the segmented part. For example, user inputs attempting to operate the tool head of the surgical tool 128 to remove bone from the segmented part may be ignored, while inputs attempting to operate the tool head of the surgical tool 128 to remove bone from areas outside the segmented part may be processed. In some aspects, the electronic processor 104 may stop operating the surgical tool. In some examples, responsive to determining that the volume exceeds the bone removal limit, the electronic processor 104 may control the user interface to generate an alert (e.g., audio, visual, haptic, or combinations of the foregoing) to inform the surgeon that a removal limit has been reached.
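Continuing the sketch above, the gating at blocks 310-314 reduces to a per-command comparison, with commands aimed at an exhausted part blocked and all others allowed; the alert callback is an assumed interface, not the disclosed one.

```python
def gate_removal_command(tracker: "RemovalTracker", part_id: int,
                         limit_fraction: float, alert_fn) -> bool:
    """Return True if removal from this part may proceed, False to block it.

    alert_fn: an assumed callback that raises an audio, visual, or haptic
    alert through the user interface when a limit is reached.
    """
    if tracker.removed_fraction(part_id) >= limit_fraction:
        alert_fn(f"Bone removal limit reached for part {part_id}")
        return False   # ignore inputs targeting this part
    return True        # inputs targeting other areas are still processed
```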
[0057] In some examples, bone removal limits are not based on a single segmented part, but on combinations of bone removal. For example, a bone removal limit may dictate that no more than 50% of a pars volume or no more than 60% of a lamina volume may be removed. In another example, a bone removal limit may dictate that no more than 30% of a pars volume and no more than 30% of a lamina volume may be removed. In such examples, the electronic processor 104 may prevent the surgical tool from interacting with one or both of two segmented parts when a combination of removed volumes exceeds the bone removal limit based on the two segmented parts.
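A combined rule of this kind can be evaluated as a set of joint caps. In the hypothetical below, PARS_ID and LAMINA_ID are assumed segment identifiers, and the 30% figures follow the second example in the text.

```python
def blocked_parts(tracker: "RemovalTracker", caps: dict[int, float]) -> set[int]:
    """Return the ids of parts whose cap under a combined rule is reached."""
    return {pid for pid, cap in caps.items()
            if tracker.removed_fraction(pid) >= cap}

# e.g., the joint rule "at most 30% of the pars and 30% of the lamina":
# blocked = blocked_parts(tracker, {PARS_ID: 0.30, LAMINA_ID: 0.30})
# The tool would then be prevented from interacting with any part in blocked.
```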
[0058] It should be understood that the adaptive bone removal method 300 may also be applicable to the use of non-robotic surgical tools. For example, embodiments of the system 100 may include a surgical tool that is operable by a surgeon in a free-hand mode. In such embodiments, the method 300 may be applied to track movement of the free-hand-operated surgical tool to determine the volume of bone removal and apply bone removal limits. In some embodiments, where the volume exceeds the bone removal limit, the electronic processor 104 may control the surgical tool to stop operating and/or generate an alert that a limit has been reached.
[0059] With regard to the processes, systems, methods, heuristics, etc. described herein, it should be understood that, although the steps of such processes, etc. have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It further should be understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. In other words, the descriptions of processes herein are provided for the purpose of illustrating certain implementations and should in no way be construed to limit the claims.
[0060] Accordingly, it is to be understood that the above description is intended to be illustrative and not restrictive. Many embodiments and applications other than the examples provided would be apparent upon reading the above description. The scope should be determined, not with reference to the above description, but should instead be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the technologies discussed herein, and that the disclosed systems and methods will be incorporated into such future embodiments. In sum, it should be understood that the application is capable of modification and variation.
[0061] All terms used in the claims are intended to be given their broadest reasonable constructions and their ordinary meanings as understood by those knowledgeable in the technologies described herein unless an explicit indication to the contrary is made herein. In particular, use of the singular articles such as “a,” “the,” “said,” et cetera, should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary.
[0062] Unless explicitly stated otherwise, each numerical value and range should be interpreted as being approximate as if the word “about” or “approximately” preceded the value or range.
[0063] Reference herein to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments necessarily mutually exclusive of other embodiments. The same applies to the term “implementation.”
[0064] Unless otherwise specified herein, the use of the ordinal adjectives “first,” “second,” “third,” etc., to refer to an object of a plurality of like objects merely indicates that different instances of such like objects are being referred to, and is not intended to imply that the like objects so referred-to have to be in a corresponding order or sequence, either temporally, spatially, in ranking, or in any other manner.
[0065] Unless otherwise specified herein, in addition to its plain meaning, the conjunction “if” may also or alternatively be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” which construal may depend on the corresponding specific context. For example, the phrase “if it is determined” or “if [a stated condition] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event].”
[0066] Also, for purposes of this description, the terms “couple,” “coupling,” “coupled,” “connect,” “connecting,” or “connected” refer to any manner known in the art or later developed in which energy is allowed to be transferred between two or more elements, and the interposition of one or more additional elements is contemplated, although not required. Conversely, the terms “directly coupled,” “directly connected,” et cetera, imply the absence of such additional elements. The same type of distinction applies to the use of terms “attached” and “directly attached,” as applied to a description of a physical structure. For example, a relatively thin layer of adhesive or other suitable binder can be used to implement such “direct attachment” of the two corresponding components in such physical structure.
[0067] The described embodiments are to be considered in all respects as only illustrative and not restrictive. In particular, the scope of the disclosure is indicated by the appended claims rather than by the description and figures herein. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.
[0068] The functions of the various elements shown in the figures, including any functional blocks labeled as “processors” and/or “controllers,” may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), and nonvolatile storage. Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.

[0069] As used in this application, the term “circuitry” may refer to one or more or all of the following: (a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry); (b) combinations of hardware circuits and software, such as (as applicable): (i) a combination of analog and/or digital hardware circuit(s) with software/firmware and (ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and (c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation. This definition of circuitry applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware. The term circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in a server, a cellular network device, or other computing or network device.
[0070] It should be appreciated by those of ordinary skill in the art that any block diagrams herein represent conceptual views of illustrative circuitry embodying the principles of the disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
[0071] It should be understood that although certain figures presented herein illustrate hardware and software located within particular devices, these depictions are for illustrative purposes only. In some embodiments, the illustrated components may be combined or divided into separate software, firmware, and/or hardware. For example, instead of being located within and performed by a single electronic processor, logic and processing may be distributed among multiple electronic processors. Regardless of how they are combined or divided, hardware and software components may be located on the same computing device or may be distributed among different computing devices connected by one or more networks or other suitable communication links.

[0072] The following paragraphs provide various Examples reciting examples and alternatives disclosed herein.
[0073] Example 1. A medical system comprising: a robot including a robotic arm; a surgical tool coupled to the robotic arm; an electronic processor coupled to the robot, and configured to: receive a scan of an anatomical element; perform a sub-anatomical segmentation on the scan to determine a segmented part; analyze the segmented part to determine a bone removal limit for the segmented part; control the robotic arm according to a user input to remove a volume of bone from the segmented part with the surgical tool; and when the volume exceeds the bone removal limit, prevent the surgical tool from interacting with the segmented part.
[0074] Example 2. The medical system of Example 1, wherein the electronic processor is further configured to: responsive to controlling the robotic arm according to the user input, track a movement of the robotic arm relative to a navigation marker; generate data representing the volume based on the movement; and determine whether the volume exceeds the bone removal limit by comparing the data representing the volume to the bone removal limit.
[0075] Example 3. The medical system of any of Examples 1 and 2, wherein the electronic processor is further configured to: determine the bone removal limit for the segmented part based on a desired post-operative spinal stability.
[0076] Example 4. The medical system of any of Examples 1-3, wherein the electronic processor is further configured to: determine a bone removal limit for the segmented part based on a bone mineral density for the segmented part.
[0077] Example 5. The medical system of any of Examples 1-4, wherein the electronic processor is further configured to: determine a bone removal limit for the segmented part based on at least one selected from a group consisting of a sex of a patient, an age of the patient, an implant history associated with the patient, and a medical history of the patient.
[0078] Example 6. The medical system of any of Examples 1-5, further comprising: a user interface; wherein the electronic processor is further configured to, responsive to determining that the volume exceeds the bone removal limit, control the user interface to generate an alert.
[0079] Example 7. The medical system of any of Examples 1-6, wherein the electronic processor is further configured to receive the scan of an anatomical element by receiving one selected from a group consisting of a computed tomography scan, a magnetic resonance imaging scan, a positron emission tomography scan, an ultrasound scan, and a fluoroscopy scan.

[0080] Example 8. The medical system of any of Examples 1-7, wherein the electronic processor is further configured to: perform the sub-anatomical segmentation on the scan to determine a second segmented part; analyze the second segmented part to determine a second bone removal limit for the second segmented part; control the robotic arm according to a second user input to remove a second volume of bone from the second segmented part with the surgical tool; and when a combination of the volume and the second volume exceeds the bone removal limit, prevent the surgical tool from interacting with the segmented part and the second segmented part.
[0081] Example 9. A method for operating a surgical robot, the method comprising: receiving a scan of an anatomical element; performing a sub-anatomical segmentation on the scan to determine a segmented part; analyzing the segmented part to determine a bone removal limit for the segmented part; controlling a robotic arm of the surgical robot according to a user input to remove a volume of bone from the segmented part with a surgical tool; and when the volume exceeds the bone removal limit, preventing the surgical tool from interacting with the segmented part.
[0082] Example 10. The method of Example 9, further comprising: responsive to controlling the robotic arm according to the user input, tracking a movement of the robotic arm; generating data representing the volume based on the movement; and determining whether the volume exceeds the bone removal limit by comparing the data representing the volume to the bone removal limit.
[0083] Example 11. The method of any of Examples 9 and 10, wherein determining the bone removal limit for the segmented part includes determining the bone removal limit based on a desired post-operative spinal stability.
[0084] Example 12. The method of any of Examples 9-11, wherein determining the bone removal limit for the segmented part includes determining the bone removal limit based on a bone mineral density for the segmented part.
[0085] Example 13. The method of any of Examples 9-12, wherein determining the bone removal limit for the segmented part includes determining the bone removal limit based on at least one selected from a group consisting of a sex of a patient, an age of the patient, an implant history associated with the patient, and a medical history of the patient.
[0086] Example 14. The method of any of Examples 9-13, further comprising: performing the sub- anatomical segmentation on the scan to determine a second segmented part; analyzing the second segmented part to determine a second bone removal limit for the second segmented part; controlling the robotic arm according to a second user input to remove a second volume of bone from the second segmented part with the surgical tool; and preventing the surgical tool from interacting with the segmented part and the second segmented part when a combination of the volume and the second volume exceeds the bone removal limit.
[0087] Example 15. The method of any of Examples 9-14, further comprising: responsive to determining that the volume exceeds the bone removal limit, generating an alert.
[0088] Example 16. The method of any of Examples 9-15, wherein receiving the scan of an anatomical element includes receiving one selected from a group consisting of a computed tomography scan, a magnetic resonance imaging scan, a positron emission tomography scan, an ultrasound scan, and a fluoroscopy scan.
[0089] Example 17. A medical system comprising: a surgical tool; an electronic processor coupled to the surgical tool, and configured to: receive a scan of an anatomical element; perform a sub-anatomical segmentation on the scan to determine a segmented part; analyze the segmented part to determine a bone removal limit for the segmented part; control the surgical tool according to a user input to remove a volume of bone from the segmented part; responsive to controlling the surgical tool according to the user input, track a movement of the surgical tool; generate data representing the volume based on the movement; compare the data representing the volume to the bone removal limit; and when the volume exceeds the bone removal limit, control the surgical tool to stop operating.
[0090] Example 18. The medical system of Example 17, wherein the electronic processor is further configured to: perform the sub-anatomical segmentation on the scan to determine a second segmented part; analyze the second segmented part to determine a second bone removal limit for the second segmented part; control the surgical tool according to a second user input to remove a second volume of bone from the second segmented part; responsive to controlling the surgical tool according to the second user input, track a second movement of the surgical tool; generate data representing the second volume based on the second movement; compare the data representing the second volume to the bone removal limit; and when a combination of the volume and the second volume exceeds the bone removal limit, control the surgical tool to stop operating.
[0091] Example 19. The medical system of any of Examples 17 and 18, wherein the electronic processor is further configured to: determine a bone removal limit for the segmented part based on at least one selected from a group consisting of a desired post-operative spinal stability, a bone mineral density for the segmented part, a sex of a patient, an age of the patient, an implant history associated with the patient, and a medical history of the patient.
[0092] Example 20. The medical system of any of Examples 17-19, further comprising: a user interface; wherein the electronic processor is further configured to, responsive to determining that the volume exceeds the bone removal limit, control the user interface to generate an alert.
[0093] Various features and advantages of the examples and embodiments presented herein are set forth in the following claims.

Claims

CLAIMS

What is claimed is:
1. A medical system comprising: a robot (114) including a robotic arm (116); a surgical tool (128) coupled to the robotic arm (116); an electronic processor (104) coupled to the robot (114), and configured to: receive a scan (500) of an anatomical element (204); perform a sub-anatomical segmentation on the scan (500) to determine a segmented part; analyze the segmented part to determine a bone removal limit for the segmented part; control the robotic arm (116) according to a user input to remove a volume of bone from the segmented part with the surgical tool (128); and when the volume exceeds the bone removal limit, prevent the surgical tool (128) from interacting with the segmented part.
2. The medical system of claim 1, wherein the electronic processor (104) is further configured to: responsive to controlling the robotic arm (116) according to the user input, track a movement of the robotic arm (116) relative to a navigation marker (208); generate data representing the volume based on the movement; and determine whether the volume exceeds the bone removal limit by comparing the data representing the volume to the bone removal limit.
3. The medical system of any of claims 1 and 2, wherein the electronic processor (104) is further configured to: determine the bone removal limit for the segmented part based on a desired post-operative spinal stability.
4. The medical system of any of claims 1-3, wherein the electronic processor (104) is further configured to: determine a bone removal limit for the segmented part based on a bone mineral density for the segmented part.
5. The medical system of any of claims 1-4, wherein the electronic processor (104) is further configured to: determine a bone removal limit for the segmented part based on at least one selected from a group consisting of a sex of a patient, an age of the patient, an implant history associated with the patient, and a medical history of the patient.
6. The medical system of any of claims 1-5, further comprising: a user interface (110); wherein the electronic processor (104) is further configured to, responsive to determining that the volume exceeds the bone removal limit, control the user interface (110) to generate an alert.
7. The medical system of any of claims 1-6, wherein the electronic processor (104) is further configured to receive the scan (500) of an anatomical element (204) by receiving one selected from a group consisting of a computed tomography scan (500), a magnetic resonance imaging scan (500), a positron emission tomography scan (500), an ultrasound scan (500), and a fluoroscopy scan (500).
8. The medical system of any of claims 1-7, wherein the electronic processor (104) is further configured to: perform the sub-anatomical segmentation on the scan (500) to determine a second segmented part; analyze the second segmented part to determine a second bone removal limit for the second segmented part; control the robotic arm (116) according to a second user input to remove a second volume of bone from the second segmented part with the surgical tool (128); and when a combination of the volume and the second volume exceeds the bone removal limit, prevent the surgical tool (128) from interacting with the segmented part and the second segmented part.
9. A method (300) for operating a surgical robot (114), the method (300) comprising: receiving a scan (500) of an anatomical element (204); performing a sub-anatomical segmentation on the scan (500) to determine a segmented part; analyzing the segmented part to determine a bone removal limit for the segmented part; controlling a robotic arm (116) of the surgical robot (114) according to a user input to remove a volume of bone from the segmented part with a surgical tool (128); and when the volume exceeds the bone removal limit, preventing the surgical tool (128) from interacting with the segmented part.
10. The method (300) of claim 9, further comprising: responsive to controlling the robotic arm (116) according to the user input, tracking a movement of the robotic arm (116); generating data representing the volume based on the movement; and determining whether the volume exceeds the bone removal limit by comparing the data representing the volume to the bone removal limit.
11. The method (300) of any of claims 9 and 10, wherein determining the bone removal limit for the segmented part includes determining the bone removal limit based on at least one of a desired post-operative spinal stability and a bone mineral density for the segmented part.
12. The method (300) of any of claims 9-11, further comprising: performing the sub-anatomical segmentation on the scan (500) to determine a second segmented part; analyzing the second segmented part to determine a second bone removal limit for the second segmented part; controlling the robotic arm (116) according to a second user input to remove a second volume of bone from the second segmented part with the surgical tool (128); and preventing the surgical tool (128) from interacting with the segmented part and the second segmented part when a combination of the volume and the second volume exceeds the bone removal limit.
13. A medical system comprising: a surgical tool (128); an electronic processor (104) coupled to the surgical tool (128), and configured to: receive a scan (500) of an anatomical element (204); perform a sub-anatomical segmentation on the scan (500) to determine a segmented part; analyze the segmented part to determine a bone removal limit for the segmented part; control the surgical tool (128) according to a user input to remove a volume of bone from the segmented part; responsive to controlling the surgical tool (128) according to the user input, track a movement of the surgical tool (128); generate data representing the volume based on the movement; compare the data representing the volume to the bone removal limit; and when the volume exceeds the bone removal limit, control the surgical tool (128) to stop operating.
14. The medical system of claim 13, wherein the electronic processor (104) is further configured to: perform the sub-anatomical segmentation on the scan (500) to determine a second segmented part; analyze the second segmented part to determine a second bone removal limit for the second segmented part; control the surgical tool (128) according to a second user input to remove a second volume of bone from the second segmented part; responsive to controlling the surgical tool (128) according to the second user input, track a second movement of the surgical tool (128); generate data representing the second volume based on the second movement; compare the data representing the second volume to the bone removal limit; and when a combination of the volume and the second volume exceeds the bone removal limit, control the surgical tool (128) to stop operating.
15. The medical system of any of claims 13 and 14, wherein the electronic processor (104) is further configured to: determine a bone removal limit for the segmented part based on at least one selected from a group consisting of a desired post-operative spinal stability, a bone mineral density for the segmented part, a sex of a patient, an age of the patient, an implant history associated with the patient, and a medical history of the patient.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463552238P 2024-02-12 2024-02-12
US63/552,238 2024-02-12

Publications (1)

Publication Number Publication Date
WO2025172998A1 true WO2025172998A1 (en) 2025-08-21

Family

ID=95365530

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2025/050141 Pending WO2025172998A1 (en) 2024-02-12 2025-02-11 Adaptive bone removal system and method

Country Status (1)

Country Link
WO (1) WO2025172998A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190223962A1 (en) * 2018-01-24 2019-07-25 Think Surgical, Inc. Environmental mapping for robotic assisted surgery
US20210353374A1 (en) * 2014-12-02 2021-11-18 KB Medical SA Robot Assisted Volume Removal During Surgery
US20210369361A1 (en) * 2015-12-28 2021-12-02 Mako Surgical Corp. Apparatus And Methods For Robot Assisted Bone Treatment
US20230114040A1 (en) * 2018-06-15 2023-04-13 Mako Surgical Corp Techniques For Patient-Specific Milling Path Generation
US20230240749A1 (en) * 2022-02-01 2023-08-03 Mazor Robotics Ltd. Systems and methods for controlling surgical tools based on bone density estimation
US20240008934A1 (en) * 2016-07-15 2024-01-11 Mako Surgical Corp. Systems and methods for guiding a revision procedure
WO2025088616A1 (en) * 2023-10-27 2025-05-01 Mazor Robotics Ltd. Method and apparatus for procedure navigation



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25718073

Country of ref document: EP

Kind code of ref document: A1