WO2024252400A1 - Effecteur terminal de support osseux - Google Patents
Effecteur terminal de support osseux Download PDFInfo
- Publication number
- WO2024252400A1 WO2024252400A1 PCT/IL2024/050560 IL2024050560W WO2024252400A1 WO 2024252400 A1 WO2024252400 A1 WO 2024252400A1 IL 2024050560 W IL2024050560 W IL 2024050560W WO 2024252400 A1 WO2024252400 A1 WO 2024252400A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- bone mount
- bone
- robotic arm
- end effector
- interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/76—Manipulators having means for providing feel, e.g. force or tactile feedback
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00477—Coupling
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2068—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/03—Automatic limiting or abutting means, e.g. for safety
- A61B2090/031—Automatic limiting or abutting means, e.g. for safety torque limiting
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
- A61B2090/065—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension for measuring contact or contact pressure
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/08—Accessories or related features not otherwise provided for
- A61B2090/0807—Indication means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
Definitions
- the present disclosure is generally directed to surgical systems, and relates more particularly to robotic surgical devices.
- Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure, or may complete one or more surgical procedures autonomously. Providing a rigid connection between an end effector of the surgical robot and a bone mount placed on a patient allows for the most accurate registration of the patient to the surgical robot.
- a system comprises a robotic arm, an end effector, a bone mount and a bone mount interface.
- the end effect has a proximal end and a distal end with the proximal end of the end effector being connected to the robotic arm.
- the bone mount is attachable to an anatomical element at one end of the bone mount and the bone mount interface is coupled to the distal end of the end effector via a proximal end of the bone mount interface and attached to another end of the bone mount via a distal end of the bone mount interface.
- the bone mount interface is configured to add a degree of freedom to the end effector.
- anatomical element includes one or more vertebrae.
- the one or more sensors include a force/torque sensor disposed at the distal end of the end effector configured to output a force signal in accordance with a force exerted on the bone mount.
- control circuitry configured to receive the force signal and output a control signal based on the force exerted on the bone mount.
- control signal includes activating an alarm when the force signal exceeds a predetermined threshold.
- control signal includes releasing the bone mount from the bone mount interface when the force signal exceeds a predetermined threshold.
- the bone mount includes a locking mechanism, configured to be selectively and repeatedly coupled to and uncoupled from the bone mount interface.
- any of the aspects herein further comprising an arm guide provided between a proximal end and a distal end of the robotic arm and adjacent to the end effector, wherein the arm guide is configured to accommodate a surgical tool.
- a system comprises one or more processors, at least one robotic arm and a memory storing data for processing by the one or more processors that, when processed by the one or more processors, causes the one or more processors to determine, based on first sensor data, a first force exerted by a bone mount interface onto a bone mount attached to an anatomical element, determine, based on second sensor data, a second force different from the first force, exerted by the bone mount interface onto the bone mount and output a control signal based on the second force exerted on the bone mount.
- the bone mount interface is coupled to an end effector of the at least one robotic arm.
- control signal includes releasing the bone mount from the bone mount interface when the second force exceeds a predetermined threshold.
- a method comprises attaching one end of a bone mount to an anatomical element, coupling a bone mount interface to a distal end of an end effector of a robotic arm via a proximal end of the bone mount interface and attaching another end of the bone mount to a distal end of the bone mount interface.
- the bone mount interface is configured to add a degree of freedom to the end effector.
- each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
- each one of A, B, and C in the above expressions refers to an element, such as X, Y, and Z, or class of elements, such as Xl-Xn, Yl- Ym, and Zl-Zo
- the phrase is intended to refer to a single element selected from X, Y, and Z, a combination of elements selected from the same class (e.g., XI and X2) as well as a combination of elements selected from two or more classes (e.g., Y 1 and Zo).
- FIG. 1 is a block diagram of a system according to at least one embodiment of the present disclosure
- Fig. 2 is a block diagram of aspects of the system according to at least one embodiment of the present disclosure
- Fig. 3 is a perspective diagram of a robotic arm with a bone mount end effector according to at least one embodiment of the present disclosure
- Fig. 4 is a detail perspective view of a portion of the robotic arm with the bone mount end effector according to at least one embodiment of the present disclosure
- Fig. 5 is a detail perspective view of a portion of the robotic arm with the end effector including the bone mount interface according to at least one embodiment of the present disclosure
- Fig. 6 is a flowchart of a method for operating the robotic arm with the bone mount end effector according to at least one embodiment of the present disclosure.
- Fig. 7 is a flowchart of a method for measuring a location of an end effector of a robotic arm relative to a bone mount and navigating the robotic arm relative to a patient anatomy using the bone mount according to at least one embodiment of the present disclosure.
- the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions).
- Computer- readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
- processors such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or 10X Fusion processors; Apple Al l, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000- series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry
- DSPs digital signal processors
- proximal and distal are used in this disclosure with their conventional medical meanings, proximal being closer to the operator or user of the system, and further from the region of surgical interest in or on the patient, and distal being closer to the region of surgical interest in or on the patient, and further from the operator or user of the system.
- a robotic arm is attached rigidly to a patient’s bone (e.g., a vertebra) either directly to the bone, or on a structure attached rigidly to the bone, and a significant force is exerted by the robotic arm or the surgeon onto a surgical tool, there is a danger that the force to the surgical tool on the bone may be sufficient in some cases, to detach a bone-mounting element from the bone such that the robotic arm’s position is no longer defined relative to the bone. Even if complete detachment does not occur or even if no movement of the bone-mounting element occurs, a loss of the defined spatial relationship between the robotic arm and the bone still may exist. Issues with the above may be addressed with embodiments of the present disclosure presented herein.
- a robotic arm with a bone mount end effector is a device that may be used in surgical procedures to provide a greater precision and control during the surgical procedures involving the manipulation of bone tissue, for example.
- the robotic arm is typically mounted on the operating table and is usually controlled by a computer of a handheld device operated by a surgeon.
- the robotic arm can be programmed to move in specific patterns or follow a predetermined path, allowing the surgeon to perform complex surgical procedures with a high degree of accuracy.
- the bone mount end effector includes a bone mount interface and a bone mount.
- the bone mount is an attachment that can be affixed to the patient’s anatomy (e.g., a bone) allowing the robotic arm to precisely position surgical tools, for example, during a surgical procedure.
- the bone mount is designed to hold the bone in a specific position and orientation, allowing the surgeon to make accurate incisions based on pre-operative planning.
- the bone mount is designed to securely attach to the bone tissue, without causing any damage or undue stress.
- the bone mount is designed to interface with the robotic arm, allowing the surgeon to manipulate and control the robotic arm with precision and accuracy.
- the bone mount interface is a mechanism that acts as a connection point between the bone mount and the robotic arm to provide a secure and stable connection between the robotic arm and the patient’s bone.
- the bone mount interface is designated to provide a stable and secure attachment while still allowing for a wide range of motion and flexibility.
- the bone mount end effector can include various locking mechanisms between the bone mount interface and the bone mount for easy connection and reconnection. As such, the locking mechanisms are more flexible than conventional locking mechanisms.
- the locking mechanism may include a clamp, a screw, a ball and socket, a kinematic interface, etc. to ensure that the bone mount remains in place during the surgical procedure.
- the locking mechanism can be a manual locking mechanism or an electronic locking mechanism. Furthermore, the locking mechanism is easy to use and does not place excess force on the patient anatomy. Also, the locking mechanism connects easily and without forcing movement. According to an alternative embodiment of the present disclosure, a force or torque sensor is included that is configured to detect any skiving relative to the patient anatomy.
- the distal end of the robotic arm to which the bone mount interface is attached may include various sensors and feedback mechanisms to ensure accurate positioning and alignment during the surgical procedures by determining the location of the bone mount relative to the robotic arm.
- the bone mount interface provides the most accurate position of the patient, using feedback from the sensors to the robotic system.
- the location of where the bone mount is attached to the patient anatomy can be monitored with the sensors for maximum accuracy.
- the sensors provide feedback to the robotic arm, allowing the robotic arm to adjust its position and orientation based on movement of the patient, the breathing of the patient, the movement of the patient bone, etc.
- the feedback mechanism is provided such that the bone mount interface is released from the bone mount if a threshold level of force or distance is received by a processor of the robotic system. Also, the feedback mechanism may be provided such that an alarm is caused to be generated by the processor of the robotic system if a threshold level of force exerted by the surgeon or the robotic arm or a threshold distance is displaced by the bone mount is received by the processor.
- the bone mount interface is compatible with a variety of different bone mount designs in order to accommodate different surgical procedures and patient needs.
- a robotic arm with the bone mount end effector has numerous advantages over traditional surgical methods.
- the robotic arm with the bone mount end effector can provide greater precision and accuracy, reduce the risk of human error and allow for less invasive procedures that result in faster recovery times and fewer complications.
- mounting the robotic arm with reference to a patient’s anatomy allows the robotic systems to have a better accuracy than robotic systems that are not patient mounted.
- a higher degree of accuracy can be achieved. This higher degree of accuracy is achieved by the relative geometry of the patient anatomy (e.g., vertebra) and the end effector being locked in place.
- the bone mount interface enables rigid connection between the end effector and the bone mount with minimum length. Moreover, the bone mount interface enables a simple connection with the patient since the bone mount interface can be used with a variety of bone mounts. Furthermore, the bone mount interface, when attached to standard robotic arms, adds an additional degree of freedom (DOF) to the standard robotic arms with little to no reconfiguration of the standard robotic arms.
- DOF degree of freedom
- the additional DOF is an added rotational DOF at the end effector after the arm guide of the robotic arm. In this case the arm guide remains stationary.
- the additional DOF is an added cartesian DOF at the end effector after the arm guide. In this case also, the arm guide remains stationary.
- the bone mount includes orientation capabilities. Therefore, a registration process can register the location of the bone mount. In the registration process, the orientation of the vertebra to the bone mount is determined. This determination is made in several ways. After the registration process, the location of the bone mount interface relative to the bone mount is measured. Accordingly, a surgical procedure on one or two vertebrae above or below where the bone mount is located may be performed.
- the registration of the bone mount is not the key feature of the present disclosure.
- a reference frame to the bone mount is first registered.
- the navigation system such as an optical navigation system, is used to determine the location of the robotic arm.
- the incorporation of the bone mount interface enhances the features of the optically navigated robotic arm.
- various types of materials such as metal, plastic, etc. can be used for the bone mount.
- Embodiments of the present disclosure provide technical solutions to one or more of the problems of (1) accurate mounting of a robotic arm to a patient anatomy and (2) inaccurate tracking of the robotic arm relative to the patient anatomy.
- a block diagram of a system 100 according to at least one embodiment of the present disclosure is shown.
- the system 100 may be used to operate a robot 114 in order to provide a rigid connection between an end effector of the robot 114 and a bone mount placed on a patient to allow for the most accurate registration of the patient to the robot 114.
- the system 100 may control, pose, and/or otherwise manipulate a surgical mount system, a surgical arm, and/or surgical tools attached thereto and/or carry out one or more other aspects of one or more of the methods disclosed herein.
- the system 100 comprises a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud or other network 134.
- Systems according to other embodiments of the present disclosure may comprise more or fewer components than the system 100.
- the system 100 may not include the imaging device 112, the robot 114, the navigation system 118, one or more components of the computing device 102, the database 130, and/or the cloud 134.
- the computing device 102 comprises a processor 104, a memory 106, a communication interface 108, and a user interface 110.
- Computing devices according to other embodiments of the present disclosure may comprise more or fewer components than the computing device 102.
- the processor 104 of the computing device 102 may be any processor described herein or any similar processor.
- the processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud 134.
- the memory 106 may be or comprise RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions.
- the memory 106 may store information or data useful for completing, for example, any step of the methods described herein, or of any other methods.
- the memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114.
- the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enable image processing 120, segmentation 122, transformation 124, and/or registration 128.
- Such content may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines.
- the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the processor 104 to carry out the various method and features described herein.
- various contents of memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models.
- the data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, and/or the cloud 134.
- the computing device 102 may also comprise a communication interface 108.
- the communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100), and/or for transmitting instructions, images, or other information to an external system or device (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100).
- an external system or device e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100.
- the communication interface 108 may comprise one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth).
- the communication interface 108 may be useful for enabling the device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
- the computing device 102 may also comprise one or more user interfaces 110.
- the user interface 110 may be or comprise a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
- the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100.
- the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
- the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102.
- the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computer device 102.
- the imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.).
- image data refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
- the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof.
- the image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
- a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
- the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
- the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
- the imaging device 112 may be contained entirely within a single housing, or may comprise a transmitter/emitter and a receiver/detector that are in separate housings or are otherwise
- the imaging device 112 may comprise more than one imaging device 112.
- a first imaging device may provide first image data and/or a first image
- a second imaging device may provide second image data and/or a second image.
- the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
- the imaging device 112 may be operable to generate a stream of image data.
- the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
- image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
- the robot 114 may be any surgical robot or surgical robotic system.
- the robot 114 may be or comprise, for example, the Mazor XTM Stealth Edition robotic guidance system.
- the robot 114 may be configured to position the imaging device 112 at one or more precise position(s) and orientation(s), and/or to return the imaging device 112 to the same position(s) and orientation(s) at a later point in time.
- the robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task.
- the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure.
- the robot 114 may comprise one or more robotic arms 116.
- the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms. In some embodiments, one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112. In embodiments where the imaging device 112 comprises two or more physically separate components (e.g., a transmitter and receiver), one robotic arm 116 may hold one such component, and another robotic arm 116 may hold another such component. Each robotic arm 116 may be positionable independently of the other robotic arm. The robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
- the robot 114 together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
- the robotic arm(s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm).
- reference markers may be placed on the robot 114 (including, e.g., on the robotic arm 116), the imaging device 112, or any other object in the surgical space.
- the reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof.
- the navigation system 118 can be used to track other components of the system (e.g., imaging device 112) and the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example).
- the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
- the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStationTM S8 surgical navigation system or any successor thereof.
- the navigation system 118 may include one or more cameras or other sensor(s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located.
- the one or more cameras may be optical cameras, infrared cameras, or other cameras.
- the navigation system 118 may comprise one or more electromagnetic sensors.
- the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing).
- the navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
- the system 100 can operate without the use of the navigation system 118.
- the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
- the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system).
- the database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the system 100); one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information.
- one or more surgical plans including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the system 100
- the database 130 may comprise movement profiles for the robot 114 based on a select end effector that is attached to the robotic arm 116. These movement profiles may correspond to kinematic solutions for the robot 114 and/or defined positions of a surgical tool axis of select end effector relative to at least one of a surface of a tool block of the end effector and a rotation axis of a final joint/mount flange of the robotic arm 116.
- the database 130 may store identifications of specific tool blocks and surgical tool axis orientations. In any event, the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 134.
- the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
- a hospital image storage system such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
- the cloud 134 may be or represent the Internet or any other wide area network.
- the computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both.
- the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
- the system 100 or similar systems may be used, for example, to carry out one or more aspects of any of the methods (e.g., methods 600 and 700, etc.) described herein.
- the system 100 or similar systems may also be used for other purposes.
- Fig. 2 is a block diagram of aspects of the system 100 according to at least one embodiment of the present disclosure.
- Fig. 2 illustrates a surgical environment, such as an operating room, including a patient table 204 and a robotic table 208.
- the patient table 204 and the robotic table 208 may be positioned on a floor 212 of the surgical environment.
- the patient table 204 and/or the robotic table 208 may be mobile and capable of being moved around the surgical environment.
- the robotic table 208 may be or comprise a cart that moves relative to the patient table 204, allowing the robotic table 208 (and the robotic arm 116) to be brought in or otherwise introduced to the surgical environment after preparations for a surgery or surgical task have been performed.
- the robotic table 208 may be brought into the surgical environment after preoperative imaging has been performed, for example, to help lessen congestion of the surgical environment.
- the robotic arm 116 may be attached to the patient table 204 itself and the robotic table 208 would not be need.
- the robotic arm 116 could be free standing and include a base positioned on the floor 212.
- a patient 216 may be positioned on the patient table 204.
- the patient 216 may have anatomical elements 220A-220D, which may be the subject of the surgery or surgical procedure.
- the surgical procedure may be a spinal fusion
- the anatomical elements 220A- 220D may be vertebrae of the spine.
- the patient 216 may be securely positioned on the patient table 204, such that the patient 216 and/or the anatomical elements 220A-220D cannot move relative to the patient table 204.
- the discussion herein includes discussion of an anatomical element, it is to be understood that more or fewer anatomical elements may be present and may be identified and registered using methods discussed herein.
- the methods and embodiments discussed herein may alternatively apply to a portion of an anatomical element (e.g., a spinous process of a vertebra).
- the robotic table 208 includes the robotic arm 116 and an optical sensor 228.
- the robotic table 208 may include additional or alternative components.
- the optical sensor 228 may not be positioned on the robotic table 208, and may instead be disposed in another location in the surgical environment (e.g., on the floor 212, mounted on a wall, positioned on another surgical table, etc.).
- the robotic table 208 may include additional surgical components such as surgical tools, and may include one or more cabinets, drawers, trays, or the like to house the surgical components.
- the robotic table 208 may be mechanically decoupled or may be otherwise detached from the patient of the present disclosure table 204, such that the robotic table 208 and/or the surgical components thereon can move freely relative to the patient table 204, the patient 216, and/or the anatomical elements 220A-220C.
- the robotic table 208 may be disposed a first distance from the patient table 204 (e.g., 0.5 meters (m), Im, 1.5m, 2m, etc.).
- the optical sensor 228 may be or comprise a sensor capable of detecting and/or tracking optics-based targets (e.g., illuminated objects, visual targets, etc.).
- the optical sensor 228 may be or comprise a laser tracker.
- the laser tracker may project or emit a laser that may reflect off one or more targets and back toward the laser tracker.
- the reflected light may be received and processed by the computing device 102 and/or the navigation system 118 and may enable the computing device 102 and/or the navigation system 118 to determine the relative distance and/or pose of the target relative to the laser tracker based on, for example, the angle, intensity, frequency, and/or the like of the returning laser.
- the laser tracker may include a tracking system that tracks the target as the target moves, such that the laser tracker can continuously aim a laser at the target and receive the reflected laser.
- the information of the reflected laser may be processed (e.g., by the computing device 102, by the navigation system 118, etc.) to identify and determine a change in pose of the target as the target moves relative to the optical sensor 228.
- the optical sensor 228 may be or comprise a 3D camera capable of identifying one or more 3D optical tracking targets. The 3D camera may be able to identify the 3D optical tracking targets based on a number of faces, designs, or patterns displayed by the 3D optical tracking targets.
- the 3D camera may identify the optical tracking target based on different QR codes displayed on each surface of the 3D optical tracking target.
- the processor 104 may receive the identified faces, and may determine (e.g., using transformations 124) the pose of the optical tracking target within the surgical environment.
- the identified faces may be compared to a predetermined (e.g., preoperative) pose of the surfaces, with the changes in pose of each faces used to determine the pose of the optical tracking target.
- the optical sensor 228 may be disposed proximate the robotic arm 116 (e.g., disposed 0.1m, 0.2m, 0.5m, Im, 1.5m, 2m, etc. away from the robotic arm 116), such that the optical sensor 228 can view and track the robotic arm 116 in addition to any opticalbased targets in the environment.
- the optical sensor 228 may be disposed within a portion of the robotic arm 116 (e.g., within the end effector 224).
- the optical sensor 228 may be disposed in a predetermined configuration relative to the robotic arm 116, such that the pose of the robotic arm 116 may be determined based on internal readings generated by a sensor (e.g., using gyroscopes, accelerometers, etc.).
- the robotic arm 116 may include an end effector 224.
- the end effector 224 may be or comprise a receptacle, mount, gripper, or other mechanical interface for interacting with a surgical tool or instrument.
- the end effector 224 may interface with a surgical tool to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgery or surgical procedure or task.
- the end effector 224 may include a bone mount interface 250.
- the bone mount interface 250 is a mechanism that acts as a connection point between a bone mount 240 and the robotic arm 116 to provide a secure and stable connection between the robotic arm 116 and an anatomical element 220.
- the bone mount interface 250 is designated to provide a stable and secure attachment while still allowing for a wide range of motion and flexibility.
- the end effector 224 may include a tracking marker 232 disposed thereon.
- the positioning of the tracking marker 232 on the end effector 224 may enable the imaging devices 112 and/or the optical sensor 228 to track the pose of the end effector 224.
- the tracking marker 232 may be a QR code and the computing device 102 and/or the navigation system 118 can use the identified QR code to determine a pose of the tracking marker 232.
- the computing device 102 and/or the navigation system 118 may use the pose of the tracking marker 232 to determine a pose of the end effector 224 or, more generally, a pose of the robotic arm 116.
- the tracking marker 232 may be an optics-based target (e.g., illuminated objects, visual targets, etc.), whereby the optical sensor 228 may be or comprise a sensor capable of detecting and/or tracking the optics-based target.
- a bone mount 240 may be disposed on the patient 216.
- the bone mount 240 may be or comprise, for example, a clamp attached to a spinous process of a vertebra, a threaded rod capable of screwing into the patient table 204, or the like.
- the bone mount 240 may be connected to the patient 216 such that any movement of the patient 216 and/or the anatomical elements 220A- 220D may result in a signal being sent to the robotic arm 116 to which the bone mount interface 250 and the bone mount 240 are connected.
- a movement of an anatomical element 220D in a first distance in a first direction may, for example, cause a signal to be sent to the robotic arm 116 indicating that the patient and/or anatomical element 220D moved the first distance in the first direction.
- the bone mount end effector 290 includes the bone mount interface 250 and the bone mount 240 securely attached together.
- the bone mount interface 250 is configured to provide a stable and secure attachment while still allowing for a wide range of motion and flexibility.
- the bone mount end effector 290 can include various locking mechanisms between the bone mount interface 250 and the bone mount 240 for easy connection and reconnection.
- the locking mechanism may include a clamp, a screw, a ball and socket, a kinematic interface, etc. to ensure that the bone mount 240 remains in place during the surgical procedure.
- the kinematic interface may include the combination of a kinematic mount and a kinematic mount contact.
- the kinematic mount may correspond to one or more kinematic mounts including, but in no way limited to, chamfered slots, conical recesses, countersunk holes, counterbores, parallel dowel pins disposed in a slot offset a distance from one another, hardened slots, and/or combinations thereof.
- the kinematic mount contact may correspond to one or more contacts including, but in no way limited to, spherical balls, tooling balls with posts, dowel pins, and/or other protrusions.
- the locking mechanism can be a manual locking mechanism, or an electronic locking mechanism and the locking mechanism connects easily and without forcing movement.
- a force or torque sensor is included that is configured to detect any skiving relative to the patient anatomy.
- the robotic arm 116 with the bone mount end effector 290 surgeons can perform delicate and complex procedures with a greater degree of accuracy, reducing the risk of complications and improving patient outcomes.
- the bone mount interface 250 may be compatible with a variety of different bone mount 240 designs in order to accommodate different surgical procedures and patient needs.
- a robotic arm 116 with the bone mount end effector 290 has numerous advantages over traditional surgical methods.
- a robotic arm 116 with the bone mount end effector 290 can provide greater precision and accuracy, reduce the risk of human error and allow for less invasive procedures that result in faster recovery times and fewer complications.
- mounting the robotic arm 116 with reference to a patient’s anatomy allows the robotic systems to have a better accuracy than robotic systems that are not patient mounted.
- a higher degree of accuracy can be achieved. This higher degree of accuracy is achieved by the relative geometry of the patient anatomy 220 (e.g., vertebra) and the end effector 224 being locked in place.
- the bone mount interface 250 enables a simple connection with the patient 216 since the bone mount interface 250 can be used with a variety of bone mounts 240. Furthermore, the bone mount interface 250, when attached to standard robotic arms 116, adds an additional DOF to the standard robotic arms 116 with little to no reconfiguration of the standard robotic arms 116.
- the additional DOF is an added rotational DOF at the end effector 224 after an arm guide of the robotic arm. In this case the arm guide remains stationary.
- the additional DOF is an added cartesian DOF at the end effector 224 after the arm guide. In this case also, the arm guide remains stationary.
- the bone mount 240 includes orientation capabilities. Therefore, a registration process can register the location of the bone mount 240. In the registration process, the orientation of the vertebra 220 to the bone mount is determined. This determination is made in several ways. After the registration process, the location of the bone mount interface 250 relative to the bone mount 240 is measured. Accordingly, a surgical procedure on one or two vertebrae 220 above or below where the bone mount 240 is located may be performed.
- the registration of the bone mount 240 is not the key feature of the present disclosure.
- a reference frame to the bone mount 240 is first registered.
- the navigation system 118 such as an optical navigation system using the optical sensor 228, determines the location of the robotic arm 116.
- the incorporation of the bone mount interface 250 enhances the features of the optically navigated robotic arm 116.
- various types of materials such as metal, plastic, etc. can be used for the bone mount 240.
- Fig. 3 is a perspective diagram of the robotic arm 116 with a bone mount end effector 290 according to at least one embodiment of the present disclosure. More specifically, Fig. 3 shows the robotic arm 116 of the robot 114 connected to the end effector 224 including the bone mount interface 250 attached thereto. The bone mount interface 250 is positioned within tool block 332. Features of the robot 114 and/or robotic arm 116 may be described in conjunction with a coordinate system 302.
- the coordinate system 302, as shown in Fig. 3 includes three- dimensions comprising an X-axis, a Y-axis, and a Z-axis.
- the coordinate system 302 may be used to define planes (e.g., the XY-plane, the XZ-plane, and the YZ-plane) of the robot 114 and/or robotic arm 116. These planes may be disposed orthogonal, or at 90 degrees, to one another. While the origin of the coordinate system 302 may be placed at any point on or near the components of the robot 114, for the purposes of description, the axes of the coordinate system 302 are always disposed along the same directions from figure to figure, whether the coordinate system 302 is shown or not.
- planes e.g., the XY-plane, the XZ-plane, and the YZ-plane
- reference may be made to dimensions, angles, directions, relative positions, and/or movements associated with one or more components of the robot 114 and/or robotic arm 116 with respect to the coordinate system 302.
- the width of the robotic arm 116 e.g., running from the side shown in the foreground to the side in the background, into the page
- the height of the robotic arm 116 may be defined as a dimension along the Z-axis of the coordinate system 302
- the length of the robotic arm 116 e.g., running from a proximal end at the first link 304 to a distal end at the seventh link 324, etc.
- the height of the system 100 may be defined as a dimension along the Z-axis of the coordinate system 302
- a reach of the robotic arm 116 may be defined as a dimension along the Y-axis of the coordinate system 302
- a working area of the robotic arm 116 may be defined in the XY-plane with reference to the corresponding axes of the coordinate system 302.
- the robotic arm 116 may be comprised of a number of links 304, 308, 309, 312, 316, 320, 324 that interconnect with one another at respective axes of rotation 306, 310, 314, 318, 322, 326, 330, 334, or joints. There may be more or fewer links 304, 308, 309, 312, 316, 320, 324 and/or axes of rotation 306, 310, 314, 318, 322, 326, 330, 334 than are shown in Fig. 3.
- the robotic arm 116 may have a first link 304 disposed at a proximal end of the robotic arm 116 and an end mount flange 328 disposed furthest from the proximal end at a distal end of the robotic arm 116.
- the first link 304 may correspond to a base of the robotic arm 116.
- the first link 304 may rotate about first rotation axis 306.
- a second link 308 may be connected to the first link 304 at a second rotation axis 310, or joint.
- the second link 308 may rotate about the second rotation axis 310.
- the first rotation axis 306 and the second rotation axis 310 may be arranged parallel to one another.
- the first rotation axis 306 and the second rotation axis 310 are shown extending along the Z-axis in a direction perpendicular to the XY-plane.
- the robotic arm 116 may comprise a third link 309 that is rotationally interconnected to the second link 308 via the third rotation axis 314, or joint.
- the third rotation axis 314 is shown extending along the X-axis, or perpendicular to the first rotation axis 306 and second rotation axis 310. In this position, when the third link 309 is caused to move (e.g., rotate relative to the second link 308), the third link 309 (and the components of the robotic arm 116 extending from the third link 309) may be caused to move into or out of the XY-plane.
- the fourth link 312 is shown rotationally interconnected to the third link 309 via the fourth rotation axis 318, or joint.
- the fourth rotation axis 318 is arranged parallel to the third rotation axis 314.
- the fourth rotation axis 318 extends along the X-axis allowing rotation of the fourth link 312 into and out of the XY- plane.
- the robotic arm 116 may comprise one or more wrists 316, 324.
- the fifth link 316, or wrist is shown rotationally interconnected to the fourth link 312 via a fifth rotation axis 322, or wrist joint.
- the fifth rotation axis 322 is shown extending along the Y-axis, which is perpendicular to the X-axis and the Z-axis.
- causing the fifth link 316 to rotate about the fifth rotation axis 322 may cause the components of the robotic arm 116 distal the joint at the fifth rotation axis 322 (e.g., the fifth link 316, the sixth link 320, the seventh link 324, the end mount flange 328, and the end effector 224, etc.) to rotate about the Y-axis.
- the components of the robotic arm 116 distal the joint at the fifth rotation axis 322 e.g., the fifth link 316, the sixth link 320, the seventh link 324, the end mount flange 328, and the end effector 224, etc.
- the sixth link 320 is rotationally interconnected to the fifth link 316 via the sixth rotation axis 326.
- the sixth rotation axis 326 extends along the X-axis and provides for rotation of the sixth link 320 relative to the fifth link 316 (e.g., into and out of the XY-plane in the position shown).
- the seventh link 324 is shown rotationally interconnected to the sixth link 320 via a seventh rotation axis 330, or wrist joint.
- the seventh rotation axis 330 is shown extending along the Y-axis (e.g., perpendicular to the X-axis and the Z-axis).
- causing the seventh link 324 to rotate about the seventh rotation axis 330 may cause the components of the robotic arm 116 distal the joint at the seventh rotation axis 330 (e.g., the end mount flange 328, and the end effector 224, etc.) to rotate about the Y-axis.
- an end mount flange 328 may be rotationally interconnected to the end mount flange 328 via an eighth, or mount flange rotation, axis 334.
- the seventh link 324 is positioned rotationally about the seventh rotation axis 330 such that the end mount flange 328 is oriented where the mount flange rotation axis 334 is extending along the Z-axis for one type of robotic arm 116 with one type of movement kinematics.
- the seventh link 324 is positioned rotationally about the seventh rotation axis 330 such that the end mount flange 328 is oriented where the mount flange rotation axis 334 is extending along the X-axis for another type of robotic arm 116 having another type of movement kinematics. A simplified forward-kinematics sketch of this serial chain is given below.
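To make the serial-chain geometry above concrete, the following is a minimal forward-kinematics sketch (in Python) for the arm of Fig. 3, assuming the axis directions described above (first and second rotation axes along Z, third and fourth along X, fifth along Y, sixth along X, seventh along Y). The joint angles and link offsets in the example are hypothetical, as the disclosure does not specify arm dimensions; this is an illustration of the kinematic structure, not the actual control software of the robot 114.

```python
# Illustrative forward kinematics for the serial chain of Fig. 3.
# Axis directions follow the description above; link offsets are hypothetical.
import numpy as np

def rotation(axis: str, angle: float) -> np.ndarray:
    """Homogeneous 4x4 rotation about a principal axis ('x', 'y', or 'z')."""
    c, s = np.cos(angle), np.sin(angle)
    R = {"x": [[1, 0, 0], [0, c, -s], [0, s, c]],
         "y": [[c, 0, s], [0, 1, 0], [-s, 0, c]],
         "z": [[c, -s, 0], [s, c, 0], [0, 0, 1]]}[axis]
    T = np.eye(4)
    T[:3, :3] = R
    return T

def translation(v) -> np.ndarray:
    """Homogeneous 4x4 translation by vector v."""
    T = np.eye(4)
    T[:3, 3] = v
    return T

# Rotation axes 306, 310, 314, 318, 322, 326, 330 in the pose shown in Fig. 3.
JOINT_AXES = ["z", "z", "x", "x", "y", "x", "y"]

def end_mount_flange_pose(joint_angles, link_offsets) -> np.ndarray:
    """Compose joint rotations and link offsets from the base (first link 304)
    to the end mount flange 328."""
    T = np.eye(4)
    for axis, q, offset in zip(JOINT_AXES, joint_angles, link_offsets):
        T = T @ rotation(axis, q) @ translation(offset)
    return T

# Hypothetical configuration: every joint at 0.1 rad, 0.2 m links along Z.
pose = end_mount_flange_pose([0.1] * 7, [(0.0, 0.0, 0.2)] * 7)
print(pose[:3, 3])  # position of the end mount flange in the base frame
```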
- Robotic arm 116 may also include an arm guide 350 attached to, for example, the tool block 332.
- the arm guide 350 may include a surgical tool 370. While shown as a single surgical tool 370 in Fig. 3, the surgical tool 370 may correspond to different surgical tools used at different points in a surgical application.
- While the arm guide 350 is illustrated as being attached to the robotic arm 116 at one end of the tool block 332, the arm guide 350 may be attached at any location on the robotic arm 116 from which the surgical tool 370 can be used to perform surgical applications.
- the arm guide 350 may be provided in close proximity to and on the side of the end effector 224 of the robotic arm 116 at one or more locations.
- the bone mount interface 250, when attached to standard robotic arms 116 at or near the end effector 224, adds an additional DOF to the standard robotic arms 116 with little to no reconfiguration of the standard robotic arms 116.
- the additional DOF is an added rotational DOF at the end effector 224 after the arm guide 350 of the robotic arm 116. In this case the arm guide 350 remains stationary.
- the additional DOF is an added Cartesian DOF at the end effector 224 after the arm guide 350. In this case, too, the arm guide 350 remains stationary.
- although the bone mount 240 physically connects the patient to the robot 114 (e.g., the robotic system), operating procedures are still guided by the robotic system.
- computing device 102 determines the position of the robotic arm 116 in order to connect the bone mount 240 (the position of the bone mount 240 is determined by the navigation system 118, the imaging device(s) 112 or other systems) and in order to direct or point the robot 114 in the correct trajectory to allow an operating procedure to take place (e.g., place a pedicle screw).
- This is not an easy task, either mathematically or physically; a simplified sketch of the transform chaining involved is given below.
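As one illustration of the transform chaining involved, the sketch below maps a planned trajectory (an entry point and a unit direction, expressed in patient coordinates) to a target pose for the end effector in the robot base frame, assuming a registration from the patient frame to the robot base frame is already available (e.g., via the navigation system 118). The frame names, the convention that the tool axis is the local Z axis, and all numbers are assumptions for illustration, not part of the disclosure.

```python
# Minimal sketch: chain a patient-to-base registration with a planned
# trajectory to obtain a target end effector pose. All conventions assumed.
import numpy as np

def target_flange_pose(T_base_patient: np.ndarray,
                       entry_patient: np.ndarray,
                       direction_patient: np.ndarray) -> np.ndarray:
    """Return a 4x4 target pose in the robot base frame whose local Z axis
    is aligned with the planned trajectory and whose origin is the entry
    point (e.g., for placing a pedicle screw)."""
    # Map the entry point and trajectory direction into the base frame.
    entry = (T_base_patient @ np.append(entry_patient, 1.0))[:3]
    z = T_base_patient[:3, :3] @ direction_patient
    z = z / np.linalg.norm(z)
    # Complete an orthonormal, right-handed frame around the Z axis.
    helper = np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = np.cross(helper, z)
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, entry
    return T

# Hypothetical usage: identity registration, trajectory pointing along -Z.
pose = target_flange_pose(np.eye(4),
                          np.array([0.0, 0.0, 0.1]),
                          np.array([0.0, 0.0, -1.0]))
```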
- the bone mount end effector according to embodiments of the present disclosure is forgiving and flexible and can allow for many configurations, such as a ball and socket configuration.
- the robotic arm 116 can attach to the specific bone mount 240 in several different directions. Being able to attach to a specific bone mount 240 in several different directions allows for the physical connection (e.g., the robot 114 may be limited in motion and can connect to the bone mount 240 only in specific directions) and allows for multiple operating procedures (e.g., several screw placements) with the same bone mount interface 250, detaching and reattaching the same or different bone mounts 240.
- the bone mount end effector according to embodiments of the present disclosure allows for less risk of losing registration and for greater safety for the patient.
- the bone mount end effector also reduces the risk of loosening the connection of the bone mount 240.
- the bone mount 240 can be tracked via the navigation system 118, the imaging device(s) 112, or other systems using a tracking marker, for example.
- Fig. 4 is a detail perspective view of a portion of the robotic arm 116 with the bone mount end effector 290 according to at least one embodiment of the present disclosure.
- the end effector 224 may include the tool block 332 having a receptacle disposed therein (not shown) for receiving the bone mount interface 250.
- the receptacle may define a tool axis 338 of the tool block 332.
- an axis of the receptacle may coincide with the tool axis 338.
- the robotic arm 116 may be configured such that when the tool block 332 is attached to the end mount flange 328, the tool block 332 disposes the tool axis 338 parallel to the mount flange rotation axis 334.
- the robotic arm 116 may be configured such that when the tool block 332 is attached to the end mount flange 328, the tool block 332 disposes the tool axis 338 perpendicular to the mount flange rotation axis 334.
- the bone mount interface 250, when attached to the robotic arm 116 with the tool block 332 disposing the tool axis 338 either parallel or perpendicular to the mount flange rotation axis 334, adds the additional DOF to these different types of robotic arms 116 with little to no reconfiguration of the standard robotic arms 116.
- Fig. 5 is a detail perspective view of a portion of the robotic arm 116 with the end effector 224 including the bone mount interface 250 according to at least one embodiment of the present disclosure.
- As illustrated in Fig. 5, the portion of the robotic arm 116 is configured such that when the tool block 332 is attached to the end mount flange 328, the tool block 332 disposes the tool axis 338 perpendicular to the mount flange rotation axis 334.
- On one face 432 of the tool block 332, the bone mount interface 250 is provided along with one or more sensors 450.
- the one or more sensors 450 monitor the location of the bone mount 240 with respect to the bone mount interface 250.
- the one or more sensors 450 provide feedback to the robotic arm 116, allowing the robotic arm 116 to adjust its position and orientation based on movement of the patient, the breathing of the patient, the movement of the patient bone, etc.
- the one or more sensors 450, along with a feedback mechanism, are provided such that the bone mount interface 250 is released from the bone mount 240 if the processor 104 of the robot 114 receives an indication that a threshold level of force or displacement has been exceeded.
- the feedback mechanism may be provided such that an alarm is caused to be generated by the processor 104 of the robotic system 114 if the processor 104 receives an indication that a threshold level of force has been exerted by the surgeon or the robotic arm 116, or that the bone mount 240 has been displaced by a threshold distance. A minimal sketch of this safety logic is given below.
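The following is a minimal sketch of this safety logic, assuming the one or more sensors 450 report a force magnitude and a bone mount displacement once per control cycle. The threshold values and the release/alarm callbacks are hypothetical placeholders, not an actual robot or sensor API.

```python
# Sketch of the threshold-based feedback described above; values are assumed.
FORCE_LIMIT_N = 30.0          # hypothetical force threshold
DISPLACEMENT_LIMIT_MM = 2.0   # hypothetical displacement threshold

def feedback_step(force_n: float, displacement_mm: float,
                  release_interface, raise_alarm) -> None:
    """One control-loop iteration: alarm and release the bone mount
    interface 250 from the bone mount 240 when a threshold is exceeded."""
    if force_n > FORCE_LIMIT_N or displacement_mm > DISPLACEMENT_LIMIT_MM:
        raise_alarm(f"force={force_n:.1f} N, displacement={displacement_mm:.2f} mm")
        release_interface()

# Hypothetical usage with stand-in callbacks:
feedback_step(35.0, 0.5,
              release_interface=lambda: print("interface released"),
              raise_alarm=lambda msg: print("ALARM:", msg))
```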
- Fig. 6 depicts a method 600 that may be used, for example, in operating the robot 114 and/or one or more portions of the robotic arm 116 with the bone mount end effector 290.
- the method 600 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
- the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
- the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
- a processor other than any processor described herein may also be used to execute the method 600.
- the at least one processor may perform the method 600 by executing elements stored in a memory such as the memory 106.
- the elements stored in the memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 600.
- Fig. 6 is a flowchart of a method 600 for operating the robotic arm with the bone mount end effector 290 according to at least one embodiment of the present disclosure.
- the method 600 provides for a rigid connection between the end effector 224 of the robot 114 and a bone mount 240 placed on a patient 216 and allows for the most accurate registration of the patient 216 to the robotic arm 116 according to embodiments of the present disclosure. While a general order for the steps of the method 600 is shown in Fig. 6, the method 600 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 6.
- the method 600 starts with a START operation at step 604 and ends with an END operation at step 620.
- the method 600 can be executed as a set of computer-executable instructions executed by an assembly machine (e.g., robotic assembly system, automation assembly system, computer aided drafting (CAD) machine, etc.) and encoded or stored on a computer readable medium.
- the method 600 may begin with the START operation at step 604 and proceed to step 608, where one end of a bone mount 240 is attached to an anatomical element 220.
- the anatomical element 220 may be a vertebra. After one end of the bone mount 240 is attached to the anatomical element 220 at step 608, method 600 proceeds to step 612, where a bone mount interface 250 is coupled to a distal end of the end effector 224 of the robotic arm 116 via a proximal end of the bone mount interface 250. After the bone mount interface 250 is coupled to the distal end of the end effector 224 at step 612, method 600 proceeds to step 616, where the other end of the bone mount 240 is attached to a distal end of the bone mount interface 250. According to embodiments of the present disclosure, with the bone mount 240 and the bone mount interface 250 coupled together in this arrangement, a bone mount end effector 290 is configured. The bone mount interface 250 is configured to add an additional DOF to the end effector 224.
- method 600 may end with the END operation at step 620.
- one or more sensors 450, along with a feedback mechanism, are provided on the end effector 224 such that the bone mount interface 250 is released from the bone mount 240 if the processor 104 of the robot 114 receives an indication that a threshold level of force or displacement has been exceeded.
- the feedback mechanism may be provided such that an alarm is caused to be generated by the processor 104 of the robot 114 if the processor 104 receives an indication that a threshold level of force has been exerted by the surgeon or the robotic arm 116, or that the bone mount 240 has been displaced by a threshold distance.
- Fig. 7 is a flowchart of a method 700 for measuring a location of an end effector 224 of a robotic arm 116 relative to a bone mount 240 and navigating the robotic arm relative to a patient anatomy using the bone mount 240 according to at least one embodiment of the present disclosure.
- Fig. 7 depicts a method 700 that may be used, for example, in operating the robot 114 and/or one or more portions of the robotic arm 116 having a bone mount end effector 290.
- the method 700 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
- the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
- the at least one processor 104 may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
- a processor other than any processor described herein may also be used to execute the method 700.
- the at least one processor may perform the method 700 by executing elements stored in a memory such as the memory 106.
- the elements stored in the memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 700.
- One or more portions of the method 700 may be performed by the processor executing any of the contents of memory, such as image processing 120, segmentation 122, transformation 124, and/or registration 128 instructions.
- the method 700 can include more or fewer steps or can arrange the order of the steps differently than those shown in Fig. 7.
- the method 700 starts with a START operation at step 704 and ends with an END operation at step 728.
- the method 700 can be executed as a set of computer-executable instructions executed by an assembly machine (e.g., robotic assembly system, automation assembly system, computer aided drafting (CAD) machine, etc.) and encoded or stored on a computer readable medium.
- the method 700 may begin with the START operation at step 704 and proceed to step 708 where a bone mount 240 with a particular orientation is inserted into a patient anatomy 220.
- After the bone mount 240 is inserted into the patient anatomy 220 at step 708, method 700 proceeds to step 712, where a plurality of images of the bone mount 240 with the particular orientation inserted into the patient anatomy 220 are captured.
- the bone mount 240 may include a marker, such as one or more fluoroscopic markers.
- the plurality of images may be or comprise a plurality of fluoroscopic images or fluoroscopic image data captured by a fluoroscopic imaging device (e.g., an X-ray source and an X-ray detector).
- the images may be captured using an O-arm or other imaging device 112.
- the fluoroscopic imaging device may be positioned at a predetermined location when each image is captured, and each captured image of the plurality of images may depict the one or more fluoroscopic markers in a different pose (e.g., a different position and/or orientation).
- the bone mount 240 may not include markers.
- the plurality of images of the bone mount 240, which may be made of metal or plastic, are captured, and the bone mount 240 is identified from the images.
- the plurality of images may be captured preoperatively (e.g., before the surgery or surgical procedure begins).
- After the plurality of images are captured at step 712, method 700 proceeds to step 716, where a registration from the bone mount 240 with the particular orientation to the patient anatomy 220 is determined based on the plurality of images.
- the registration may be or comprise a map from the coordinates associated with one or more of the fluoroscopic markers in a first coordinate system to a second coordinate system containing the coordinates of the patient element 220, or vice versa.
- the registration may transform both sets of coordinates into a third coordinate system, such as a coordinate system used by the robotic arm 116.
- the plurality of images may depict additional patient elements 220 (e.g., multiple vertebrae of the spine), and the registration may include mapping coordinates associated with each of the additional patient elements 220 into a common coordinate system.
- the registration may be determined using image processing 120 and one or more registrations 128.
- the image processing 120 may be used to identify the one or more fluoroscopic markers and the patient element 220 in each image of the plurality of images.
- the image processing 120 may be or comprise one or more machine learning and/or artificial intelligence models that receive each image as an input and output coordinates associated with each identified fluoroscopic marker and patient element 220.
- the registration 128 may use the determined coordinates associated with each identified fluoroscopic marker and the patient element 220 to determine a pose of each fluoroscopic marker relative to the patient element 220 and may transform the coordinates associated with each fluoroscopic marker from a first coordinate system to a second coordinate system.
- the registration 128 may take coordinates associated with each of the identified fluoroscopic markers and map the coordinates into a coordinate system associated with the patient element 220 (or vice versa). Additionally, or alternatively, the registration 128 may map the fluoroscopic marker coordinates and the patient element 220 coordinates into a third coordinate system (e.g., a robotic space coordinate system) shared by other surgical tools or components in a surgical environment.
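One conventional way such a rigid registration could be computed from corresponding marker coordinates is a least-squares (Kabsch) fit. The disclosure does not name a particular algorithm, so the following sketch is an illustrative assumption, not the method actually used by registration 128.

```python
# Illustrative rigid registration between two marker coordinate sets.
import numpy as np

def rigid_registration(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation R and translation t with dst ~ R @ src + t,
    where src and dst are (N, 3) arrays of corresponding marker coordinates
    in the first and second coordinate systems."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical check: a pure translation between frames is recovered exactly.
pts = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
R, t = rigid_registration(pts, pts + np.array([1.0, 2.0, 3.0]))
```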
- After the registration is determined at step 716, method 700 proceeds to step 720, where an optical sensor 228 disposed proximate to the robotic arm 116 is used to track movement of the robotic arm 116.
- the optical sensor 228 may be pointed at the tracking marker 232 provided on the end effector 224 of the robotic arm 116 such that the tracking marker 232 can be identified and a pose of the tracking marker 232 can be determined.
- the optical sensor 228 may be or comprise a laser tracker that emits a laser that is captured by the tracking marker 232 (e.g., an optical tracking marker).
- the optical sensor 228 may be or comprise a 3D camera, and the tracking marker 232 may be or comprise a 3D tracking target.
- the optical sensor 228 may automatically identify the tracking marker 232 (e.g., the processor 104 may cause the optical sensor 228 to search the surrounding area until the tracking marker 232 is identified), or the optical sensor 228 may alternatively be aligned manually (e.g., by a physician, by a member of the surgical staff, etc.).
- the alignment of the optical sensor 228 with the tracking marker 232 may occur after preoperative images (e.g., the plurality of images) are captured.
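As a sketch of how the tracked pose could then be used: the pose of the end effector 224 in the sensor frame follows from composing the tracked pose of the tracking marker 232 with a fixed, pre-calibrated marker-to-end-effector transform. The offset values below are hypothetical, for illustration only.

```python
# Illustrative pose composition for optical tracking; offsets are assumed.
import numpy as np

def end_effector_pose(T_cam_marker: np.ndarray,
                      T_marker_ee: np.ndarray) -> np.ndarray:
    """Pose of end effector 224 in the optical sensor frame: the tracked
    pose of marker 232 composed with the calibrated marker-to-effector
    offset."""
    return T_cam_marker @ T_marker_ee

# Hypothetical usage: marker tracked 10 cm in front of the sensor, end
# effector offset 5 cm along the marker's Z axis.
T_cam_marker = np.eye(4); T_cam_marker[2, 3] = 0.10
T_marker_ee = np.eye(4);  T_marker_ee[2, 3] = 0.05
print(end_effector_pose(T_cam_marker, T_marker_ee)[:3, 3])  # [0. 0. 0.15]
```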
- After movement of the robotic arm 116 is tracked at step 720, method 700 proceeds to step 724, where the location of the end effector 224 of the robotic arm 116 relative to the bone mount 240 with the particular orientation is measured. After the location of the end effector 224 relative to the bone mount 240 is measured at step 724, method 700 ends with the END operation at step 728.
- the present disclosure encompasses methods with fewer than all of the steps identified in Figs. 6 and 7 (and the corresponding description of the methods 600 and 700), as well as methods that include additional steps beyond those identified in Figs. 6 and 7 (and the corresponding description of the methods 600 and 700).
- the present disclosure also encompasses methods that comprise one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or comprise a registration or any other correlation.
- Example 1 A system, comprising: a robotic arm; an end effector having a proximal end and a distal end, wherein the proximal end of the end effector is connected to the robotic arm; a bone mount attachable to an anatomical element at one end of the bone mount; and a bone mount interface coupled to the distal end of the end effector via a proximal end of the bone mount interface and attached to another end of the bone mount via a distal end of the bone mount interface, wherein the bone mount interface is configured to add a degree of freedom to the end effector.
- Example 2 The system of example 1, wherein the anatomical element includes one or more vertebrae.
- Example 3 The system of example 1, further comprising one or more sensors disposed at the distal end of the end effector.
- Example 4 The system of example 3, wherein the one or more sensors include a force/torque sensor disposed at the distal end of the end effector configured to output a force signal in accordance with a force exerted on the bone mount.
- Example 5 The system of example 4, further comprising control circuitry configured to receive the force signal and output a control signal based on the force exerted on the bone mount.
- Example 6 The system of example 5, wherein the control signal includes activating an alarm when the force signal exceeds a predetermined threshold.
- Example 7 The system of example 5, wherein the control signal includes releasing the bone mount from the bone mount interface when the force signal exceeds a predetermined threshold.
- Example 8 The system of example 3, wherein the one or more sensors provide feedback to the robotic arm, allowing the robotic arm to adjust its position and orientation based on movement of the anatomical element.
- Example 9 The system of example 1, wherein the bone mount includes a locking mechanism configured to be selectively and repeatedly coupled to and uncoupled from the bone mount interface.
- Example 10 The system of example 1, wherein the bone mount includes a ball and socket mechanism.
- Example 11 The system of example 1, wherein the degree of freedom is a rotational degree of freedom.
- Example 12 The system of example 1, wherein the degree of freedom is a translational degree of freedom.
- Example 13 The system of example 1, wherein the bone mount interface is mechanically coupled to the distal end of the end effector to provide a rigid connection between the end effector and the bone mount.
- Example 14 The system of example 1, wherein the bone mount includes a kinetic interface or a clamping interface.
- Example 15 The system of example 1, further comprising an arm guide provided between a proximal end and a distal end of the robotic arm and adjacent to the end effector, wherein the arm guide is configured to accommodate a surgical tool.
- Example 16 A system, comprising: one or more processors; at least one robotic arm; and a memory storing data for processing by the one or more processors that, when processed by the one or more processors, causes the one or more processors to: determine, based on first sensor data, a first force exerted by a bone mount interface onto a bone mount attached to an anatomical element, wherein the bone mount interface is coupled to an end effector of the at least one robotic arm; determine, based on second sensor data, a second force, different from the first force, exerted by the bone mount interface onto the bone mount; and output a control signal based on the second force exerted on the bone mount.
- Example 17 The system of example 16, wherein the control signal includes activating an alarm when the second force exceeds a predetermined threshold.
- Example 18 The system of example 16, wherein the control signal includes releasing the bone mount from the bone mount interface when the second force exceeds a predetermined threshold.
- Example 19 A method, comprising: attaching one end of a bone mount to an anatomical element; coupling a bone mount interface to a distal end of an end effector of a robotic arm via a proximal end of the bone mount interface; and attaching another end of the bone mount to a distal end of the bone mount interface, wherein the bone mount interface is configured to add a degree of freedom to the end effector.
- Example 20 The method of example 19, further comprising providing one or more sensors at the distal end of the end effector.
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- Robotics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Manipulator (AREA)
Abstract
A system according to at least one embodiment of the present disclosure comprises a robotic arm, an end effector, a bone mount, and a bone mount interface. The end effector has a proximal end and a distal end, the proximal end of the end effector being connected to the robotic arm. The bone mount is attachable to an anatomical element at one end of the bone mount, and the bone mount interface is coupled to the distal end of the end effector via a proximal end of the bone mount interface and attached to another end of the bone mount via a distal end of the bone mount interface. The bone mount interface is configured to add a degree of freedom to the end effector.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363471455P | 2023-06-06 | 2023-06-06 | |
| US63/471,455 | 2023-06-06 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024252400A1 true WO2024252400A1 (fr) | 2024-12-12 |
Family
ID=91853356
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IL2024/050560 Pending WO2024252400A1 (fr) | 2024-06-06 | Bone mount end effector |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2024252400A1 (fr) |
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180325608A1 (en) * | 2017-05-10 | 2018-11-15 | Mako Surgical Corp. | Robotic Spine Surgery System And Methods |
| EP4018957A1 (fr) * | 2020-12-21 | 2022-06-29 | Mazor Robotics Ltd. | Systèmes et procédés de positionnement d'un port chirurgical |
| WO2022149136A1 (fr) * | 2021-01-11 | 2022-07-14 | Mazor Robotics Ltd. | Systèmes et dispositifs de manipulation robotique de la colonne vertébrale |
Non-Patent Citations (2)
| Title |
|---|
| BUZA JOHN A ET AL: "Robotic-assisted cortical bone trajectory (CBT) screws using the Mazor X Stealth Edition (MXSE) system: workflow and technical tips for safe and efficient use", JOURNAL OF ROBOTIC SURGERY, vol. 15, no. 1, 28 February 2020 (2020-02-28) - 28 August 2020 (2020-08-28), pages 13 - 23, XP037365112, ISSN: 1863-2483, DOI: 10.1007/S11701-020-01147-7 * |
| SPINE CONNECTION: "Mazor X - Robotic Assisted Spine Surgery (How it Works)", 27 July 2018 (2018-07-27), XP093199345, Retrieved from the Internet <URL:https://www.youtube.com/watch?v=gD_2l62M2yM> [retrieved on 20240829] * |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12042171B2 (en) | Systems and methods for surgical port positioning | |
| US20230389991A1 (en) | Spinous process clamp registration and methods for using the same | |
| US12201377B2 (en) | Arm movement safety layer | |
| US12419692B2 (en) | Robotic arm navigation using virtual bone mount | |
| US20250288377A1 (en) | Multiple end effector interfaces coupled with different kinematics | |
| US12295678B2 (en) | Systems and methods for intraoperative re-registration | |
| US20230270503A1 (en) | Segemental tracking combining optical tracking and inertial measurements | |
| US20230240754A1 (en) | Tissue pathway creation using ultrasonic sensors | |
| EP4415634A1 (fr) | Systèmes pour définir une géométrie d'objet à l'aide de bras robotiques | |
| WO2024252400A1 (fr) | Effecteur terminal de support osseux | |
| WO2023156993A1 (fr) | Systèmes de validation d'une pose d'un marqueur | |
| US20230404692A1 (en) | Cost effective robotic system architecture | |
| US20250275818A1 (en) | Systems and methods for intraoperative re-registration | |
| US12354719B2 (en) | Touchless registration using a reference frame adapter | |
| US20230165653A1 (en) | Systems, methods, and devices for covering and tracking a surgical device | |
| WO2025120636A1 (fr) | Systems and methods for determining the movement of one or more anatomical elements | |
| US20230278209A1 (en) | Systems and methods for controlling a robotic arm | |
| US20230355325A1 (en) | Replaceable arm guide and end effector for surgical systems | |
| WO2025173000A1 (fr) | Multi-arm robotic systems and methods of calibration and calibration verification thereof | |
| EP4468963A1 | Mobile x-ray positioning system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24739730; Country of ref document: EP; Kind code of ref document: A1 |