WO2025222291A1 - Advanced planning for accessibility assessment in robotic assisted surgery - Google Patents
- Publication number
- WO2025222291A1 (PCT/CA2025/050583)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot arm
- processing unit
- computer
- program instructions
- readable program
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/10—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/10—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
- A61B90/11—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/90—Identification means for patients or instruments, e.g. tags
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
- A61B2034/104—Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2059—Mechanical position encoders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
- A61B2090/3762—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3937—Visible markers
- A61B2090/3945—Active visible markers, e.g. light emitting diodes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- A61B2090/502—Headgear, e.g. helmet, spectacles
Definitions
- the present application relates to planning associated with robotic-assisted surgical procedures and to subsequent navigation in robotic-assisted surgery, such as in stereotactic neurosurgery and other stereotactic types of surgery, orthopedic surgery, etc.
- in robotic-assisted surgery, also known as robotized computer-assisted surgery (CAS) among other names, tools may be supported by a robotic arm.
- the robotic arm may be used with tracking systems (e.g., optical tracking systems) and/or may have encoders and like joint tracking to provide additional tracking data.
- powered tools may still be triggered by a surgeon while held and positioned by a robotic arm.
- the tools may be entirely supported and operated by the robotic arm, often under the supervision of human operators.
- One of the challenges associated with robotic-assisted surgery is the volume taken by the robotic equipment.
- the robotic arm is typically constituted of numerous joints connected to a base and is therefore voluminous.
- a surgical robot may have additional components thereon such as optical trackers, that add to the volume of the robotic arm.
- robotic-assisted surgery may necessitate intraoperative planning and calibration, to reconcile the voluminous nature of the robotic arm with the available space. This may for example lengthen the duration of surgery.
- additional hardware components may be present and may be in contact with the operated body part, and thus may occupy some of the working volume surrounding the operated body part.
- the skull may be held immobile by a stereotactic frame system.
- the stereotactic frame system may therefore interfere with planned movements and positions of the robotic arm, resulting in intra-operative adjustments.
- a system for planning an end effector positioning in robotic-assisted surgery comprising: a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: obtaining a model of at least a portion of a robot arm of the surgical robot, obtaining a target location of an anatomical feature to be accessed, obtaining a model of a body part including the anatomical feature and a model of at least one hardware component in an environment of the body part, using the models, identifying a configuration of the robot arm avoiding the at least one hardware component and enabling access to the target location of the anatomical feature by an end effector of the robot arm, and outputting the configuration of the robot arm as a plan for driving the robot arm.
- the computer-readable program instructions are executable by the processing unit whereby: obtaining the location of the anatomical feature to be accessed includes obtaining at least one trajectory to access the target location of the anatomical feature.
- the computer-readable program instructions are executable by the processing unit whereby: obtaining the model of the body part including the anatomical feature and the model of at least one hardware component includes adjusting a relative position and/or location of the hardware component relative to the body part as a function of user input.
- the computer-readable program instructions are executable by the processing unit whereby: obtaining the model of the body part including the anatomical feature and the model of at least one hardware component includes switching the model of the at least one hardware component as a function of a selection of hardware component by a user.
- the computer-readable program instructions are executable by the processing unit whereby: obtaining the model of the body part including the anatomical feature and the model of at least one hardware component includes obtaining the model of a skull and of a stereotactic frame system.
- the computer-readable program instructions are executable by the processing unit whereby: obtaining the model of the skull and of the stereotactic frame system includes adjusting a relative position and/or location of the stereotactic frame system relative to the skull as a function of user input.
- the computer-readable program instructions are executable by the processing unit whereby: adjusting a relative position and/or location of the stereotactic frame system relative to the skull as a function of the user input includes displaying in real-time the adjusted relative position and/or location of the stereotactic frame system relative to the skull.
- the computer-readable program instructions are executable by the processing unit for: outputting the adjusted relative position and/or location of the stereotactic frame system relative to the skull.
- the computer-readable program instructions are executable by the processing unit whereby: obtaining the model of the skull and of the stereotactic frame system includes adjusting a relative position and/or location of at least one fixation post of the stereotactic frame system relative to a frame of the stereotactic frame system as a function of a user input.
- the computer-readable program instructions are executable by the processing unit whereby: the adjusting includes displaying in real-time the adjusted relative position and/or location of the at least one fixation post of the stereotactic frame system relative to the frame of the stereotactic frame system.
- the computer-readable program instructions are executable by the processing unit for: outputting the configuration of the stereotactic frame system.
- in an embodiment, the stereotactic frame system is included in the system.
- the computer-readable program instructions are executable by the processing unit whereby: identifying a configuration of the robot arm avoiding the at least one hardware component includes modifying a link arrangement of the robot arm dynamically.
- the computer-readable program instructions are executable by the processing unit for: displaying the models of the body part, of the at least one hardware component and of the robot arm in a graphical-user interface.
- the computer-readable program instructions are executable by the processing unit whereby: identifying a configuration of the robot arm avoiding the at least one hardware component includes simulating a configuration of the robot arm relative to the hardware component, and identifying an encroachment.
- the computer-readable program instructions are executable by the processing unit for: driving the robot arm as a function of the configuration of the robot arm.
- in an embodiment, the robot arm is included in the system.
- a system for planning an end effector positioning in robotic-assisted surgery comprising: a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: obtaining a model of at least a portion of a robot arm of the surgical robot, obtaining a target location of an anatomical feature to be accessed, obtaining a model of a body part including the anatomical feature, using the models, identifying a configuration of the robot arm defining a trajectory between an end effector of the robot arm and the target location of the anatomical feature, and outputting the configuration of the robot arm as a plan for driving the robot.
- the computer-readable program instructions are executable by the processing unit whereby: obtaining the location of the anatomical feature to be accessed includes obtaining at least one trajectory to access the target location of the anatomical feature.
- the computer-readable program instructions are executable by the processing unit whereby: identifying a configuration of the robot arm includes modifying a link arrangement of the robot arm dynamically.
- the computer-readable program instructions are executable by the processing unit for: displaying the models of the body part and of the robot arm in a graphical-user interface.
- the computer-readable program instructions are executable by the processing unit for: driving the robot arm as a function of the configuration of the robot arm.
- in an embodiment, the robot arm is included in the system.
- Fig. 1 is a schematic view of a robot arm of a robot-assisted surgery (RAS) system operated in advanced planning in accordance with the present disclosure, relative to a stereotactic frame system holding a skull of a patient;
- Fig. 2 is a block diagram of the RAS system of Fig. 1;
- Fig. 3 is a display of an exemplary graphical user interface (GUI) used in advanced planning of a procedure to be performed with the RAS system of Figs. 1 and 2;
- Fig. 4 is an exemplary display of a robot arm relative to the patient based on the advanced planning; and
- Fig. 5 is a flowchart showing an exemplary method for planning an end effector positioning in robotic-assisted surgery.
- an exemplary robot-assisted surgery (RAS) system is generally shown at 10, and is used to perform orthopedic surgery maneuvers on a patient. It may also be referred to or known as a robotic surgery system, robot surgery system, etc.
- the RAS system 10 may also be used in neurosurgery (i.e., brain surgery).
- a stereotactic frame system F is one of numerous types of patient procedures that may be assisted with the RAS system 10, and is merely given as an example as other types of procedures may benefit from computer-assisted navigation.
- the RAS system 10 may consequently have one or more processing units dedicated to operating a workflow of a surgical procedure.
- the RAS system 10 may therefore include a non-transitory computer-readable memory communicatively coupled to the one or more processing units and may have computer-readable program instructions executable by the one or more processing units to operate the workflow described herein.
- the RAS system 10 drives a surgical robot 20 used autonomously, and/or as an assistive or collaborative tool for an operator (e.g., surgeon).
- the RAS system 10 is shown relative to a patient’s skull B in some of the figures, but only as an example.
- the system 10 could be used to treat other pathologies or in other surgical procedures, and this may include stereotactic surgical procedures that may include ablations, biopsies, injections, radiosurgery, implantation, stereoelectroencephalography (SEEG) among other examples.
- the system 10 could be used for orthopedic surgery and may therefore be used with bones, including non-exhaustively tibia, femur, hip joint, spine, and shoulder bones, or in other applications, including dentistry, craniomaxillofacial, other non-orthopedic surgeries, etc.
- the RAS system 10 may include the surgical robot 20, optical trackers 30, a tracker device 40, a RAS controller 50 (also known as a super controller 50), a tracking module 60, and a robot controller 70 (also known as a robot driver), or any combination thereof:
- the surgical robot 20 is one possible working end of the system 10.
- the surgical robot 20 may be used to perform bone alterations as planned by an operator and/or the RAS controller 50 and as controlled by the RAS controller 50.
- the surgical robot 20 may support a tool that is operated by personnel, such as a surgeon.
- the tooling end, also known as end effector, may be manipulated by the operator while supported by the surgical robot 20.
- the robot 20 may be the coordinate measuring machine (CMM) of the robotic surgery system 10;
- the optical trackers 30 are optionally present and may be positioned on the robot 20, on patient tissue (e.g., bones B), and/or on the tool(s) and surgical instruments, and provide tracking data for the robot 20, the patient and/or tools.
- the tracking device 40, also known as a sensor device, apparatus, etc., is optionally present and performs optical tracking of the optical trackers 30, so as to enable the tracking in space (a.k.a., navigation) of the robot 20, the patient and/or tools.
- the tracking device 40 may be optional as the RAS system 10 may rely on self-tracking capacity of the robot 20 and/or on intraoperative imaging such as computerized tomography (CT) to track the robot 20, the tools, the stereotactic frame system F and/or the patient intraoperatively;
- the RAS controller 50, also known as the super controller, includes the processor(s) and appropriate hardware and software to run a computer-assisted surgery procedure in accordance with one or more workflows.
- the RAS controller 50 may include or operate the tracking device 40, the tracking module 60, and/or the robot controller 70.
- the RAS controller 50 may also drive the robot arm 20A through advanced planning in the surgical procedure.
- the RAS controller 50 may also guide an operator through the surgical procedure, by providing intraoperative data of position and orientation, and may therefore have the appropriate interfaces such as mouse, foot pedal, touchscreen, etc;
- the tracking module 60 is tasked with determining the position and/or orientation of the various relevant objects during the surgery procedure, such as the end effector of the robot arm 20A, stereotactic frame system F, bone(s) B and tool(s), using data acquired by the tracking device 40 (if present) and by the robot 20, and/or obtained from the robot controller 70.
- the position and/or orientation may be used by the RAS controller 50 to control the robot arm 20A;
- the robot controller 70 is tasked with powering or controlling the various joints of the robot arm 20A, based on operator demands or on surgery advanced planning in accordance with the present disclosure, and may also be referred to as a robot controller module that is part of the super controller 50.
- the robot controller 70 may also optionally calculate robot movements of the robot arm 20A, so as to control movements of the robot arm 20A autonomously in some instances, i.e., without intervention from the RAS controller 50;
- An additional camera(s) may be present, for instance as a complementary registration tool.
- the camera may for instance be mounted on the robot 20, such as on the robot arm 20A, such that the point of view of the camera is known in the frame of reference, also known as the coordinate system.
- the RAS system 10 may also include and/or track a variety of tools that are manipulated by the operator and/or by the surgical robot 20 in order to perform tasks on the patient.
- Tools T may include awls, drills, impactors, saws, to name a few of the one or more tools T that may be used, as numerous other tools may be used.
- the stereotactic frame F may also be referred to as a tool T.
- the surgical tool(s) may or may not be used depending on whether a surgical robot 20 is in the system 10. It is contemplated to have both the surgical robot 20 and the surgical tool(s) T in the RAS system 10.
- the surgical robot 20 may have a robot arm 20A (a.k.a., robotic arm, serial arm, etc) standing from a base 21, for instance in a fixed relation relative to the operating-room (OR) table supporting the patient.
- the fixed positioning of the surgical robot 20 relative to the patient is a determinative factor in the precision of the surgical procedure, whereby the base 21 may have its casters (shown) or any other support structure locked into place so as not to move.
- the robot arm 20A has a plurality of joints 22 and links 23, of any appropriate form, to support a tool head 24, also referred to as the end effector or as connected to the end effector, that interfaces with the patient or with a tool.
- the tool head 24 may be an appropriate type of tool and may be interchangeable, as a function of the nature of the surgical procedure.
- the tool head 24 may be a clamp that supports a tool, such as those shown at T, with the tools T being manipulated and/or actuated by a human operator.
- a support 24A may optionally be present, and may be used to secure the stereotactic frame system F to the robot 20, so as to limit or prevent any movement between the robot base 21 and the stereotactic frame system F during the surgical procedure.
- Other approaches and/or hardware may be used to ensure that there is no or limited/negligible movement between the robot base 21 and the stereotactic frame system F during the surgical procedure.
- the robot arm 20A of the surgical robot 20 is shown being a serial mechanism, arranged for the tool head 24 to be displaceable in a desired number of degrees of freedom (DOF).
- the surgical robot 20 controls movements of the tool head 24.
- the robot arm 20A is a 6-DOF articulated arm, i.e., X, Y, Z in the coordinate system, and pitch, roll and yaw. Fewer or additional DOFs may be present.
- only a generic illustration of the joints 22 and links 23 is provided, but more joints of different types may be present to move the tool head 24 in the manner described above.
- the joints 22 are powered for the robot arm 20A to move as controlled by the controller 50 in the six DOFs.
- the powering of the joints 22 is such that the robot arm 20A may execute precise movements, such as moving along a single direction in one translation DOF, or being restricted to moving along a plane, among possibilities.
- Such surgical robots 20 are known, for instance as described in United States Patent Application Serial Nos. 11/610,728 and 12/452,142, incorporated herein by reference.
- the robot arm 20A may include sensors 25 in its various joints 22 and links 23.
- the sensors 25 may be of any appropriate type, such as rotary encoders, optical sensors and position switches, as a non-exhaustive list of potential sensors, for the position and orientation of the end effector, and of the tool in the end effector, to be known.
- the tracking module 60 may determine the position and orientation of the robot 20 in a frame of reference of the robot 20, such as by obtaining the position (x, y, z) and orientation (phi, theta, rho) of the end effector 24 from the RAS controller 50 using the sensors 25 in the robot arm 20A.
- the determination of robot coordinates may be an integrated function of the robot 20, in that it may determine the position and orientation of its end effector 24 with respect to its coordinate system.
- the robot 20 may be the coordinate measuring machine (CMM) of the robotic surgery system 10, with a frame of reference (e.g., coordinate system, referential system) of the procedure being relative to the fixed position of the base 21 of the robot 20.
- the sensors 25 must provide the precision and accuracy appropriate for surgical procedures.
- the coupling of tools to the robot arm 20A may automatically cause a registration of the position and orientation of the tools in the frame of reference of the robot 20, though steps of calibration could be performed, as explained below.
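- As a loose illustration of the encoder-based pose determination described above, the following sketch chains per-joint transforms from the base 21 to the tool head 24. This is a minimal sketch: the joint axes and link offsets are hypothetical placeholders, not the actual kinematics of the robot arm 20A.

```python
import numpy as np

def rot_z(theta):
    """Rotation about the local z axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def joint_transform(theta, link_offset):
    """4x4 transform of one revolute joint 22 followed by the fixed
    translation of its link 23."""
    T = np.eye(4)
    T[:3, :3] = rot_z(theta)
    T[:3, 3] = link_offset
    return T

def end_effector_pose(joint_angles, link_offsets):
    """Pose of the end effector 24 in the frame of reference of the
    robot base 21, from the encoder readings (sensors 25)."""
    T = np.eye(4)
    for theta, offset in zip(joint_angles, link_offsets):
        T = T @ joint_transform(theta, offset)
    return T  # position is T[:3, 3], orientation is T[:3, :3]

# Toy 6-DOF example with illustrative 0.1 m link offsets.
angles = np.deg2rad([10.0, -20.0, 30.0, 0.0, 45.0, -15.0])
offsets = [np.array(v) for v in [(0, 0, 0.1), (0.1, 0, 0), (0.1, 0, 0),
                                 (0, 0, 0.1), (0.1, 0, 0), (0, 0, 0.1)]]
pose = end_effector_pose(angles, offsets)
```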
- trackers 30 may optionally be present and may be secured to the robot arm 20A, to the stereotactic frame system F, to the patient, to instruments.
- the trackers 30 may be known as trackable elements, markers, navigation markers, active sensors (e.g., wired or wireless) that may for example include infrared emitters.
- the trackers 30 are passive retro-reflective elements, that reflect light.
- the trackers 30 have a known geometry so as to be recognizable through detection by the tracker device 40.
- the trackers 30 may be retro-reflective lenses.
- the trackers 30 may be used with a tracker device 40 that may be embodied as an image capture device, capable of illuminating its environment.
- the tracker device 40 may have two (or more) points of view, such that triangulation can be used to determine the position of the trackers 30 in space, i.e., in the coordinate system of the robotic surgery system 10.
- the tracker device 40 may emit light, or use ambient light, to observe the trackers 30 from its points of view, so as to determine a position of the trackers 30 relative to itself.
- the tracker device 40 can produce navigation data enabling the locating of objects within the coordinate system of the robotic surgery system 10.
- the tracker device 40 is of the type known as the Polaris products by Northern Digital Inc.
- the tracker device 40 may form the complementary part of the CMM function of the robotic surgery system 10, with the trackers 30 on the robot base 21 for example.
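- A minimal sketch of the triangulation mentioned above, assuming two calibrated points of view with known 3x4 projection matrices (a standard linear method; the matrices and pixel measurements are assumptions, not values from the tracker device 40):

```python
import numpy as np

def triangulate_tracker(P1, P2, uv1, uv2):
    """Locate a tracker 30 in the coordinate system of the robotic
    surgery system 10 from two points of view.

    P1, P2: 3x4 projection matrices of the two points of view.
    uv1, uv2: pixel coordinates of the same retro-reflective tracker
    observed from each point of view.
    """
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # Homogeneous least-squares solution: last right singular vector.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # 3D position of the tracker
```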
- the image-capture device is configured to generate an image, such as image data, of a field of view of the image-capture device.
- a controller or processor of the image-capture device is configured to perform one or more operations based on or using the image data, such as one or more image processing functions.
- the one or more operations may include object detection, object recognition, and object tracking.
- the image-capture device may be configured to generate an output, such as the image data or an indicator of an operation or a result of an operation.
- though the expression “image-capture device” may be used in the singular, the image-capture device may include more than a single point of view, for example using triangulation with two points of view.
- the image-capture device is mounted onto a stationary structure, such as a ground stand, which may include a GUI, and a processor unit.
- the image-capture device may stand over the patient and look down on the surgical zone.
- the stand may be on casters or like supporting structure to fix the stand in position on the ground. It is also contemplated to use other types of structures or mechanisms to support the image-capture device, such as a ceiling-mounted arm, a wall-mounted arm, a table-mounted arm, a console-mounted arm, etc.
- a reference tracker, such as trackers 30, may be provided on the robot base 21, or the position and/or orientation of the robot base 21 may be calculated by tracking the end effector 24 of the robot arm and using encoder signals from the joints in the robot arm to determine the position and/or orientation of the robot base 21, as redundant information, and vice versa. Then, in situations where line of sight to the reference of the robot arm is compromised, the position and orientation of the end effector 24 of the robot arm may be calculated.
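- One way to read the redundancy described above, as a hedged sketch (the 4x4 pose matrices are placeholders): the pose of the robot base 21 can be recovered by composing the optically tracked end effector pose with the inverse of the encoder-derived arm transform, and the end effector pose can conversely be recovered when line of sight is lost.

```python
import numpy as np

def base_pose_from_tracking(T_cam_ee, T_base_ee):
    """Pose of the robot base 21 in the camera frame, from the
    optically tracked end effector pose (T_cam_ee) and the pose of
    the end effector relative to the base computed from the joint
    encoder signals (T_base_ee)."""
    return T_cam_ee @ np.linalg.inv(T_base_ee)

def ee_pose_when_occluded(T_cam_base, T_base_ee):
    """If line of sight to the end effector reference is compromised,
    its pose may still be computed from the base pose and encoders."""
    return T_cam_base @ T_base_ee
```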
- the RAS controller 50 has a processing unit 51 and a non-transitory computer-readable memory 52 communicatively coupled to the processing unit 51 and configured for executing computer-readable program instructions executable by the processing unit 51 to perform some functions, such as tracking the patient tissue and tools, using the position and orientation data from the robot 20 and the readings from the tracker device 40.
- the computer-readable program instructions may include an advanced planning system that may be viewed by a user or operator as a GUI on one or more of the interfaces of the robotic surgery system 10. It is via this or these interfaces that the user or operator may interface with the robotic surgery system, be guided by a surgical workflow, obtain navigation data, etc.
- the RAS controller 50 may also control the movement of the robot arm 20A via the robot controller module 70.
- the robotic surgery system 10 may comprise various types of interfaces I/F, for the information to be provided to the operator.
- the interfaces I/F may include monitors and/or screens, including wireless portable devices (e.g., phones, tablets), audio guidance, LED displays, head-mounted displays for virtual reality, augmented reality, mixed reality, among many other possibilities.
- the interface I/F comprises a graphic-user interface (GUI) operated by the system 10.
- the RAS controller 50 may also display images captured pre-operatively, or using cameras associated with the procedure (e.g., 3D camera, laparoscopic cameras, tool mounted cameras), for instance to be used in the collaborative/cooperative control mode of the system 10, or for visual supervision by the operator of the system 10, with augmented reality for example.
- the RAS controller 50 may drive the robot arm 20A, in performing the surgical procedure based on the surgery planning achieved pre-operatively, or in maintaining a given position and orientation to support a tool.
- the RAS controller 50 may run various modules, in the form of algorithms, code, non-transient executable instructions, etc, in order to operate the robotic surgery system 10 in the manner described herein.
- the RAS controller 50 may be part of any suitable processor unit, such as a personal computer or computers including laptops and desktops, tablets, server, etc.
- the tracking module 60 may be a subpart of the RAS controller 50, or an independent module or system.
- the tracking module 60 receives the position and orientation data from the robot 20 and the readings from the tracker device 40 if present.
- the tracking module 60 may hence determine the relative position of the objects relative to the robot arm 20A in a manner described below.
- the tracking module 60 may also be provided with models of the objects to be tracked. For example, the tracking module 60 may track bones and tools, and hence may use virtual bone models and tool models.
- the bone models may be acquired from pre-operative imaging (e.g., MRI, CT-scans), for example in 3D or in multiple 2D views, including with 2D X-ray to 3D bone model technologies, as described below in terms of advanced planning.
- the virtual bone models may also include some image processing done preoperatively, for example to remove soft tissue or refine the surfaces that will be exposed and tracked.
- the virtual bone models may be of greater resolution at the parts of the bone that will be tracked during surgery, such as the knee articulation in knee surgery.
- the bone models may also carry additional orientation data, such as various axes (e.g., longitudinal axis, mechanical axis, etc). The bone models may therefore be patient specific.
- the virtual tool models may be provided by the tool manufacturer, or may also be generated in any appropriate way so as to be a virtual 3D representation of the tool(s).
- the RAS controller 50 may have the robot controller 70 integrated therein.
- the robot controller 70 may be physically separated from the RAS controller 50, for instance by being integrated into the robot 20 (e.g., in the robot base 21).
- the robot controller 70 is tasked with powering and/or controlling the various joints of the robot arm 20A.
- the robot controller 70 may also optionally calculate robot movements of the robot arm 20A, so as to control movements of the robot arm 20A autonomously in some instances, i.e., without intervention from the RAS controller 50.
- the robot controller 70 may perform actions based on the surgery planning.
- the surgery planning may be a module programmed specifically for any given patient, according to the parameters of surgery desired by an operator such as an engineer and/or surgeon.
- the parameters may include geometry of selected, planned bone cuts, planned cut depths, sequence or workflow of alterations with a sequence of surgical steps and tools, tools used, etc.
- the use of the tracking system may provide tracking data to perform surgical navigation.
- the tracking system may assist in performing the calibration of the patient skull and stereotactic frame system with respect to the coordinate system (i.e., the locating of the bone in the coordinate system), for subsequent navigation in the X, Y, Z coordinate system.
- the tracking system includes an image capturing device, also known as a navigation camera, that optically sees and recognizes references (e.g., retro-reflective references, optically recognizable references) - that may be part of the tools used in surgery -, so as to track the robot arm 20A of the surgical robot 20 and body part in six DOFs, namely in position and orientation. Therefore, the RAS controller 50 continuously updates the position and/or orientation of the surgical robot 20, tools T, F and patient bones B in the X, Y, Z coordinate system using the data from the tracking system.
- a contemplated approach to advanced planning is set forth, with reference to Figs. 3 and 4, in the context of a neurological surgical procedure in which the stereotactic frame system F holds the patient’s skull B, such as in the manner shown in Fig. 1.
- the steps and maneuvers described for Fig. 3 are performed pre-operatively, i.e., before a surgical procedure actually commences.
- the pre-operative steps may be performed hours, days, weeks before the surgical procedure, or during the preparative stages of surgery (e.g., peri-operatively).
- the advanced planning may be done using a graphical user interface (GUI) display that may be displayed on a monitor(s) or any other type of screen(s) from the interface I/F in Fig. 2.
- the GUI display is operated by an advanced planning system that may or may not be integrated in the RAS system 10.
- the advanced planning system may be described as being a system for planning an end effector positioning in robotic-assisted surgery, in the various maneuvers performed by the robot arm based on target moves.
- the advanced planning system may include a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit.
- the processing unit may be the processing unit 51 in Fig. 2, and the non-transitory computer-readable memory may be the memory 52 in Fig. 2. It is however possible to have the advanced planning system in a dedicated computer, or any other computing environment.
- a GUI display is generally shown at 100.
- the GUI display 100 is an example of a GUI that can be used for advanced planning in accordance with the present disclosure, and that is an interface and/or output from the advanced planning system of the present disclosure.
- the GUI display 100 has one or more windows, with an example of four windows shown in Fig. 3.
- the display may include a first window 100A featuring a patient-specific image (e.g., a model of the patient’s head) in three dimensions (3D) in the top left corner, along with the model of the stereotactic frame system F (or other hardware component), in a simulation.
- the patient body part in window 100A is in the form of a model, such as a 3D model.
- the stereotactic frame system F is also a 3D model, with the window 100A displaying a simulated position and orientation of the model of the stereotactic frame system F relative to the model of the head, as if the stereotactic frame were secured to the patient’s head.
- the models are attached to one another, such that the modelized assembly of head B and stereotactic frame F may be rotated concurrently to change a point of view. Stated differently, the models may be displaced as a single assembly.
- the window 100A may also be used to change a relative position and/or orientation of the stereotactic frame relative to the model of the head. For example, three degrees of translation and three degrees of rotation may be possible between the hardware component and the body part, or between the hardware components, such as when a vertical bar of the stereotactic frame could be moved.
- the movements may be dictated by a user with appropriate interfaces, such as a mouse, a keyboard, a touch screen (e.g., a smart device), a control bar 101 with command buttons, to name a few examples.
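- As a loose illustration of such an adjustment, a user command may be applied as an incremental rigid transform (three degrees of translation, three degrees of rotation) to the pose of the frame model in the coordinate system of the head model. This is a sketch; the function names and the increment convention are assumptions.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def adjust_frame_pose(T_head_frame, d_xyz, d_rpy):
    """Apply a user-commanded increment to the pose of the stereotactic
    frame model F relative to the head model (4x4 matrix T_head_frame),
    with d_xyz the translation increment and d_rpy rotation increments
    in radians about the frame's own axes."""
    dT = np.eye(4)
    dT[:3, :3] = rot_z(d_rpy[2]) @ rot_y(d_rpy[1]) @ rot_x(d_rpy[0])
    dT[:3, 3] = d_xyz
    return T_head_frame @ dT
```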
- the model of the body part or of any component at the surgical site may result from imaging and modelling of the patient’s body part.
- the imaging may be from images acquired by a 3D camera that may for example use structured light.
- the model of the body part may be partially or completely obtained from a bone atlas, based on some anatomical features of the patient.
- the model from the bone atlas may be adjusted in different ways based on dimensions obtained from the patient, for the generic model to be morphed toward the patient’s anatomy.
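- As a crude illustration of such morphing, under the assumption that matching anatomical landmarks are available on the atlas and the patient, a generic model can be anisotropically scaled toward the patient's dimensions (real morphing would be more elaborate):

```python
import numpy as np

def morph_atlas(atlas_vertices, atlas_landmarks, patient_landmarks):
    """Scale a generic bone atlas mesh toward the patient's anatomy.

    atlas_vertices: (N, 3) vertices of the generic model.
    atlas_landmarks, patient_landmarks: (K, 3) matching anatomical
    landmark positions on the atlas and as measured on the patient.
    """
    atlas_span = atlas_landmarks.max(axis=0) - atlas_landmarks.min(axis=0)
    patient_span = patient_landmarks.max(axis=0) - patient_landmarks.min(axis=0)
    scale = patient_span / atlas_span  # per-axis scale factors
    centroid = atlas_landmarks.mean(axis=0)
    return (atlas_vertices - centroid) * scale + patient_landmarks.mean(axis=0)
```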
- the model of the hardware component may be obtained from the manufacturer, or may be generated in any appropriate way.
- there are numerous types of stereotactic frame systems that may be employed, in different configurations, dimensions, etc.
- the user of the GUI display 100 may be given the choice between two or more distinct stereotactic frame systems, with the advanced planning system for the GUI display 100 having the capacity to retrieve the appropriate model, and overlay it onto the skull.
- the stereotactic frame system F may have adjustable features in its tangible configuration.
- the stereotactic frame system F illustrated in Fig. 4 may have a rectangular frame F1 upon which are mounted fixation posts F2.
- the fixation posts F2 may for example move in a vertical direction relative to the frame F1, for a user to then lock the fixation posts F2 at a desired position.
- fixation screws F3 or equivalent skull abutments may be at the end of the fixation posts F2 and may be displaced in translation.
- fixation screws F3 are screwingly engaged to the fixation posts F2 such that a rotation of the fixation screws F3 results in an axial displacement, i.e., a displacement along a longitudinal axis thereof.
- the fixation screws F3 may not be backdrivable.
- additional components may be present in the stereotactic frame system F of Fig. 4, such as ear plugs that may be mounted to fixation posts F2 and that may also be displaceable relative to the rectangular frame F1.
- a similar stereotactic frame system F is shown as being modeled relative to the model of the patient’s head.
- the user may be given the possibility of modifying the arrangement of components of the modeled stereotactic frame.
- different entry fields may be provided in the control bar 101 to cause adjustments of the model, such as the adjustable features described above for Fig. 4.
- the models may be 3-D solid models, such that the advanced planning system may provide automated position arrangements for the modeled stereotactic frame as a function of interferences.
- the advanced planning system may adjust a relative height of the fixation post F2 when an interference (i.e., collision, encroachment) is detected.
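- A hedged sketch of such an automated arrangement: raise a fixation post by fixed increments until the solid-model interference test clears (the interference test is abstracted as a callable; the step and travel limit are illustrative values):

```python
def resolve_post_interference(post_height, in_collision, step=1.0, max_height=60.0):
    """Adjust the height of a fixation post F2 relative to the frame F1
    until no interference (collision, encroachment) is detected.

    post_height: current vertical position of the post (mm).
    in_collision: callable(height) -> bool, the 3-D solid-model
    interference test at a candidate height.
    """
    height = post_height
    while in_collision(height):
        height += step
        if height > max_height:
            raise ValueError("no interference-free post position within travel")
    return height
```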
- the two upper trajectory lines may be the sutterella antennae, and could represent trajectories that could be subject to collisions and should be avoided, as an example among others.
- a user may optionally perform different adjustments for the stereotactic frame using the GUI display 100, such as one or more of position and/or orientation adjustments for the model of the stereotactic frame relative to model of the head, stereotactic frame model selection, stereotactic frame configuration adjustments, to name a few.
- the advanced planning system may allow point-of-view selections by a user.
- the advanced planning system may provide compatibility suggestions for stereotactic frame models, stereotactic frame configurations, stereotactic frame locations, etc, based on patient-specific anatomical features.
- the GUI display 100 may also have other windows, three of which are shown as 100B.
- the three additional windows show magnetic resonance imaging (MRI) images from different viewpoints, by which a user can identify one or more target locations to reach during the surgery.
- MRI images are just one possibility among others, and if images are provided in the windows, they can be from CT scans, X-rays, or from any other imaging modality. The imaging modality may depend on the nature of the surgical intervention for which planning is done.
- the views may be along sagittal, transverse and/or frontal planes of the patient, and/or could be aligned with a trajectory (e.g., in a plane of the trajectory, perpendicular to trajectory as shown, along the trajectory as shown, axial view as shown).
- a trajectory e.g., in a plane of the trajectory, perpendicular to trajectory as shown, along the trajectory as shown, axial view as shown.
- the user of the GUI display 100 may then select one or more trajectories to reach the one or more target locations.
- target trajectories are displayed as 102 in Figs. 3 and 4.
- the user may use any one or more of the windows to set a target trajectory, and the other windows may then provide a corresponding position of the target trajectory.
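- One plausible representation of a target trajectory 102, shown as a sketch: an entry point and a target location in patient coordinates, from which each window may derive the segment to overlay (the class name and the sampling helper are assumptions, not the actual data structure of the system):

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Trajectory:
    """A planned trajectory 102, from an entry point toward the
    target location of the anatomical feature."""
    entry: np.ndarray   # (3,) entry point, patient coordinates
    target: np.ndarray  # (3,) target location, patient coordinates

    def direction(self):
        d = self.target - self.entry
        return d / np.linalg.norm(d)

    def sample(self, n=50):
        """Points along the trajectory, e.g. for overlay in a window."""
        t = np.linspace(0.0, 1.0, n)[:, None]
        return self.entry * (1.0 - t) + self.target * t
```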
- the user is assisted by the visual display, as the user can see where the stereotactic frame or other hardware component is located, such as relative to the trajectory. Accordingly, the user can avoid any interference or conflict by the visual guidance.
- the advanced planning system may provide recommendations of trajectories.
- the advanced planning system may perform a simulation to position a model of the robot arm relative to the body part and to the stereotactic frame, such as in the window featuring the 3D models.
- the advanced planning system may use a model of a part of the robot arm with an appropriate tool or guide at its end effector to effect the procedure along the trajectory or trajectories.
- the model could be adjusted to display the arrangement of its links required to provide a given orientation for the tool T, i.e., the robot arm configuration.
- the advanced planning system may add to the display a model of part of the robot arm.
- the advanced planning system can detect any encroachment, a.k.a., collision, interference, such as between part of the robot arm and the stereotactic frame, or other hardware component based on the nature of the surgical procedure. Accordingly, the user may be informed of any encroachment, such as between the outer envelope of the robot arm and the stereotactic frame. In the absence of encroachment, the user may be provided with a clearance distance value, i.e., the shortest measured distance between part of the robot arm (including any tool it may support, or the outer surface or envelope of the robot arm) and the stereotactic frame, or other hardware component. In a variant, the user or the advanced planning system may have entered or provided predefined clearance distance values, so that the system may then indicate that the minimum clearance distance value has not been respected.
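- A minimal sketch of the encroachment test and clearance distance value described above, using sampled surface points of the robot arm envelope (including any supported tool) and of the hardware component. Point sampling stands in for true mesh-to-mesh distance, and the minimum clearance is an assumed predefined value.

```python
import numpy as np
from scipy.spatial import cKDTree

def clearance_check(arm_points, hardware_points, min_clearance=10.0):
    """Shortest measured distance between the outer envelope of the
    robot arm and a hardware component, and whether the predefined
    minimum clearance distance value is respected.

    arm_points, hardware_points: (N, 3) sampled surface points (mm).
    """
    tree = cKDTree(hardware_points)
    d, _ = tree.query(arm_points)  # nearest hardware point per arm point
    shortest = float(d.min())
    return shortest, shortest >= min_clearance
```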
- the user may thus take various corrective actions, such as using a different stereotactic frame model (in the case of stereotactic neurosurgery), repositioning the stereotactic frame relative to the skull, finding another trajectory(ies), or suggesting a different robot arm configuration (i.e., orientation between links of the robot arm) if such an option is available.
- the advanced planning system may achieve a plan in which there is no encroachment and/or in which the minimum clearance values are respected.
- the advanced planning system may then identify the corresponding robot arm configuration.
- the robot arm configuration includes relative joint orientations whereby the end effector is properly positioned and/or oriented for the trajectory to be defined between the end effector and the target location.
- the robot arm configuration may include multiple joint orientations for multiple trajectories for multiple target locations.
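- As a loose illustration of identifying such a configuration, a numerical inverse-kinematics search may look for joint orientations that place the end effector on the planned trajectory and align its axis with the trajectory direction. This is a sketch; the forward-kinematics callable and the tolerance are placeholders, not the actual scheme of the robot controller 70.

```python
import numpy as np
from scipy.optimize import minimize

def find_arm_configuration(fk, entry_point, trajectory_dir, q0):
    """Search joint angles q so that the end effector sits at the
    trajectory entry point with its axis along the trajectory.

    fk: callable(q) -> (position (3,), tool axis (3,)), the forward
    kinematics of the modeled portion of the robot arm.
    q0: initial guess for the joint angles.
    """
    def cost(q):
        pos, axis = fk(q)
        position_error = np.linalg.norm(pos - entry_point)
        alignment_error = 1.0 - float(np.dot(axis, trajectory_dir))
        return position_error + alignment_error

    result = minimize(cost, q0, method="Nelder-Mead")
    if cost(result.x) > 1e-3:  # illustrative tolerance
        raise ValueError("no configuration found for this trajectory")
    return result.x  # the robot arm configuration (joint orientations)
```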
- the plan may also include the specific details pertaining to the hardware component, such as in the case of the stereotactic frame system F, the stereotactic frame model, stereotactic frame arrangement of adjustable components, the relative position and/or orientation of the stereotactic frame relative to the patient's head (such as in the form of images), etc.
- the plan may be in different forms, such as instructions for driving the robot arm 20A of the RAS system 10.
- an exemplary method for planning an end effector positioning in robotic-assisted surgery is generally shown at 200.
- the method 200 may be performed for example using the RAS system 10, with the surgical robot 20, during neurosurgery using a stereotactic frame system, such as that shown in Figs. 3 and 4.
- the method 200 may be performed with other types of RAS systems, in other types of surgery.
- the method 200 may be computer-readable program instructions in the non-transitory computer-readable memory 52 for example, executable by the processing unit 51 communicatively coupled to the memory 52.
- a model of at least a portion of a robot arm of the surgical robot is obtained.
- the surgical robot may be the surgical robot 20 described herein, for example.
- a target location of an anatomical feature to be accessed is obtained. This may include multiple target locations.
- the target location(s) is obtained in the form of coordinates from the use of the GUI display 100 of Fig. 3.
- the target location may be a target location of a brain within a skull, as an example.
- Obtaining the location of the anatomical feature to be accessed includes obtaining at least one trajectory to access the location of the anatomical feature.
- a model of a body part including the anatomical feature and a model of at least one hardware component in an environment of the body part are obtained, with the model of the body part optionally being a real image of the body part, or a composite of a model and real images.
- the hardware component is a stereotactic frame.
- Obtaining the model or real images of the body part including the anatomical feature and the model of at least one hardware component may also include adjusting a relative position and/or location of the hardware component relative to the body part as a function of user input.
- Obtaining the model of the body part including the anatomical feature and the model of at least one hardware component may include switching the model of the at least one hardware component as a function of a selection of hardware component by a user.
- Obtaining the model of the body part including the anatomical feature and the model of at least one hardware component includes obtaining the model of a skull and of a stereotactic frame system.
- a configuration of the robot arm is identified, which configuration avoids the hardware component(s) and defines a trajectory between an end effector of the robot arm and the target location of the anatomical feature. If there is an encroachment, the system may modify a link arrangement of the robot arm dynamically. Identifying a configuration of the robot arm avoiding the hardware component(s) includes simulating a configuration of the robot arm relative to the hardware component, simulating movements between links of the robot arm, and identifying an encroachment.
- the configuration of the robot arm may be output as a plan for driving the robot. Accordingly, the RAS system 10 may be driven as a function of the plan.
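- Tying the steps of method 200 together, a hedged sketch of the planning loop (every helper is a placeholder for functionality described above, not an actual API of the RAS system 10):

```python
def plan_end_effector_positioning(target, hardware_model, body_model,
                                  candidate_configs, encroaches, clearance_ok):
    """Identify a robot arm configuration avoiding the hardware
    component and enabling access to the target location, then output
    it as a plan for driving the robot arm.

    candidate_configs: iterable of simulated link arrangements of the
    robot arm reaching the target (e.g., from an inverse-kinematics
    search); encroaches and clearance_ok are callables implementing
    the interference test and the minimum clearance check.
    """
    for config in candidate_configs:
        if encroaches(config, hardware_model, body_model):
            continue  # modify the link arrangement dynamically
        if not clearance_ok(config, hardware_model, body_model):
            continue
        return {"configuration": config, "target": target}  # the plan
    raise ValueError("no accessible configuration; adjust frame or trajectory")
```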
- the planning performed by the method and/or system described herein simulates surgery elements.
- it may consist of proposing a model of the hardware component, such as a stereotactic frame, to simulate or emulate the one that will be used during the surgical procedure.
- different types of frames can be used intraoperatively (e.g., Leksell, CRW, Mayfield, Inomed).
- Models of the hardware components may be obtained in 3D during the planning phase to allow the surgeon to simulate their position and/or arrangement relative to a model of the patient’s head, and adapt the positioning of selected trajectories accordingly.
- the hardware component models provided may not be exactly and fully faithful to the hardware component used in surgery.
- the simulations and steps described herein allow planning to be performed to avoid encroachments during surgery.
- the advanced planning system may provide suitably precise guidance at a pre-operative stage to allow an operator such as a surgeon to plan and optimize the time and comfort of the surgery on the day.
- the advanced planning system could be used to simulate the position of the bone fiducials based on the different planned trajectories.
- the advanced planning system may allow an improvement of the trajectory accessibility verification by simulating the position of a tool and/or instrument holder at the end effector of a robot arm, and may thus permit modifications to a position and/or orientation of the tool and/or instrument holder at the end effector, so that a setup is optimized, notably by the absence of encroachment, or by having suitable clearance distances between the tool and/or instrument holder, as well as the robot arm, and the patient environment intra-operatively.
- the advanced planning system may allow a reduction of risk of redefining some trajectories during the surgery (and subsequent complications).
- the advanced planning system in some instances may reduce the surgery time.
- the use of the advanced planning system may also allow an increase in ergonomics during the surgical procedure, as the surgeon has more information on a planned surgery, and can anticipate how to position the stereotactic frame for the surgery.
- the advanced planning system may essentially be described as adding to the planning of robot positioning elements, the planning of the surgery elements.
- the plan may feature data on the end effector instrument and the planning of the surgeon’s instruments.
- examples of robot positioning elements are trajectories, and a surgery element can be the stereotactic frame (or other head or skull fixing frame).
- the tool or instrument holder orientation and surgeon’s instruments are planned.
- the advanced planning system may also perform an accessibility assessment.
- a 3D representation of the head frame is obtained, as well as like models of surgical instruments and part of the robot arm.
- Model representations can be schematic, can be the result of a conversion of the mechanical drawings of those objects, and/or can be a 3D model generated from a scan of the robot arm.
- the 3D models are used by the advanced planning system described above to translate and rotate these 3D objects relative to one another, to allow an operator to visualize relative position and/or orientation in addition to planning trajectories.
- the planning of the trajectories is realized by the advanced planning system via the visual representation (2D or 3D) of patient exams, upon which a 2D or 3D representation of the trajectories may be overlaid.
- the system described herein is for planning an end effector positioning in robotic-assisted surgery, and may include a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: obtaining a model of at least a portion of a robot arm of the surgical robot, obtaining a target location of an anatomical feature to be accessed, obtaining a model of a body part including the anatomical feature, using the models, identifying a configuration of the robot arm defining a trajectory between an end effector of the robot arm and the target location of the anatomical feature, and outputting the configuration of the robot arm as a plan for driving the robot.
- Having a hardware component in an environment of the body part is not mandatory. For example, without any hardware component, it is relevant for the surgeon to visualize the posture of the robot at the target location. Because this posture could hinder the surgeon in performing the surgical act, the surgeon could change the final posture of the robot, to find one more convenient for them.
- the advanced planning system may achieve a complete accessibility analysis, may measure distances, compute the shortest distance between objects (i.e., minimum clearance as described above), and detect encroachments between objects, for example by giving shapes to the objects with a customizable error margin.
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Heart & Thoracic Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Pathology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Robotics (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
Abstract
A system for planning an end effector positioning in robotic-assisted surgery, comprising: obtaining a model of at least a portion of a robot arm of the surgical robot, obtaining a target location of an anatomical feature to be accessed, obtaining a model of a body part including the anatomical feature and a model of at least one hardware component in an environment of the body part, using the models, identifying a configuration of the robot arm avoiding the at least one hardware component and defining a trajectory between an end effector of the robot arm and the target location of the anatomical feature, and outputting the configuration of the robot arm as a plan for driving the robot.
Description
ADVANCED PLANNING FOR ACCESSIBILITY ASSESSMENT IN ROBOTIC ASSISTED SURGERY
CROSS-REFERENCE TO RELATED APPLICATION
[0001] The present application claims the priority of United States Patent Application No. 63/637,686, filed on April 23, 2024 and incorporated herein in its entirety by reference.
TECHNICAL FIELD
[0002] The present application relates to planning associated with robotic-assisted surgical procedures and to subsequent navigation in robotic-assisted surgery, such as in stereotactic neurosurgery and other stereotactic types of surgery, orthopedic surgery, etc.
BACKGROUND
[0003] In robotic-assisted surgery, also known as robotized computer-assisted surgery (CAS) among other names, tools may be supported by a robotic arm. By this support, the positioning of a surgical tool relative to the operated body may be stabilized by the robotic assistance, removing the need for human muscle support. Moreover, the robotic arm may be used with tracking systems (e.g., optical tracking systems) and/or may have encoders and like joint tracking to provide additional tracking data. Thus, while surgeons may have developed an expertise in manipulations performed during surgery, some practitioners prefer robotized assistance. In some instances, powered tools may still be triggered by a surgeon while held and positioned by a robotic arm. In some other instances, the tools may be entirely supported and operated by the robotic arm, often under the supervision of human operators.
[0004] One of the challenges associated with robotic-assisted surgery is the volume taken by the robotic equipment. Notably, the robotic arm is typically constituted of numerous joints and links connected to a base, and is therefore voluminous. Moreover, a surgical robot may have additional components thereon, such as optical trackers, that add to the volume of the robotic arm. As surgical procedures commonly imply high levels of accuracy and precision, robotic-assisted surgery may require intraoperative planning and calibration to reconcile the voluminous nature of the robotic arm with the available space. This may for example lengthen the duration of surgery.
[0005] In some instances, additional hardware components may be present and may be in contact with the operated body part, and thus may occupy some of the working volume surrounding the operated body part. For example, in stereotactic surgery, the skull may be held immobile by a stereotactic frame system. The stereotactic frame system may therefore interfere with planned movements and positions of the robotic arm, resulting in intra-operative adjustments.
SUMMARY
[0006] In accordance with a first aspect of the present disclosure, there is provided a system for planning an end effector positioning in robotic-assisted surgery, comprising: a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: obtaining a model of at least a portion of a robot arm of the surgical robot, obtaining a target location of an anatomical feature to be accessed, obtaining a model of a body part including the anatomical feature and a model of at least one hardware component in an environment of the body part, using the models, identifying a configuration of the robot arm avoiding the at least one hardware component and enabling access to the target location of the anatomical feature by an end effector of the robot arm, and outputting the configuration of the robot arm as a plan for driving the robot arm.
[0007] Further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit whereby: obtaining the location of the anatomical feature to be accessed includes obtaining at least one trajectory to access the target location of the anatomical feature.
[0008] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit whereby: obtaining the model of the body part including the anatomical feature and the model of at least one hardware component includes adjusting a relative position and/or location of the hardware component relative to the body part as a function of user input.
[0009] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit whereby: obtaining the model of the body part including the anatomical feature and the model of at least one hardware component includes switching the model of the at least one hardware component as a function of a selection of hardware component by a user.
[0010] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit whereby: obtaining the model of the body part including the anatomical feature and the model of at least one hardware component includes obtaining the model of a skull and of a stereotactic frame system.
[0011] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit whereby: obtaining the model of the skull and of the stereotactic frame system includes adjusting a relative position and/or location of the stereotactic frame system relative to the skull as a function of user input.
[0012] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit whereby: adjusting a relative position and/or location of the stereotactic frame system relative to the skull as a function of the user input includes displaying in real-time the adjusted relative position and/or location of the stereotactic frame system relative to the skull.
[0013] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit for: outputting the adjusted relative position and/or location of the stereotactic frame system relative to the skull.
[0014] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit whereby: obtaining the model of the skull and of the stereotactic frame system includes adjusting a relative position and/or location of at least one fixation post of the stereotactic frame system relative to a frame of the stereotactic frame system as a function of a user input.
[0015] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit whereby: adjusting the relative position and/or location of the at least one fixation post includes displaying in real-time the adjusted relative position and/or location of the at least one fixation post of the stereotactic frame system relative to the frame of the stereotactic frame system.
[0016] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit for: outputting the configuration of the stereotactic frame system.
[0017] Still further in accordance with the first aspect, for instance, the stereotactic frame system is included.
[0018] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit whereby: identifying a configuration of the robot arm avoiding the at least one hardware component includes modifying a link arrangement of the robot arm dynamically.
[0019] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit for: displaying the models of the body part, of the at least one hardware component and of the robot arm on a graphical-user interface.
[0020] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit whereby: identifying a configuration of the robot arm avoiding the at least one hardware component includes simulating a configuration of the robot arm relative to the hardware component, and identifying an encroachment.
[0021] Still further in accordance with the first aspect, for instance, the computer-readable program instructions are executable by the processing unit for: driving the robot arm as a function of the configuration of the robot arm.
[0022] Still further in accordance with the first aspect, for instance, the robot arm is included.
[0023] In accordance with a second aspect of the present disclosure, there is provided a system for planning an end effector positioning in robotic-assisted surgery,
comprising: a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: obtaining a model of at least a portion of a robot arm of the surgical robot, obtaining a target location of an anatomical feature to be accessed, obtaining a model of a body part including the anatomical feature, using the models, identifying a configuration of the robot arm defining a trajectory between an end effector of the robot arm and the target location of the anatomical feature, and outputting the configuration of the robot arm as a plan for driving the robot.
[0024] Further in accordance with the second aspect, for instance, the computer-readable program instructions are executable by the processing unit whereby: obtaining the location of the anatomical feature to be accessed includes obtaining at least one trajectory to access the target location of the anatomical feature.
[0025] Still further in accordance with the second aspect, for instance, the computer-readable program instructions are executable by the processing unit whereby: identifying a configuration of the robot arm includes modifying a link arrangement of the robot arm dynamically.
[0026] Still further in accordance with the second aspect, for instance, the computer-readable program instructions are executable by the processing unit for: displaying the models of the body part and of the robot arm on a graphical-user interface.
[0027] Still further in accordance with the second aspect, for instance, the computer-readable program instructions are executable by the processing unit for: driving the robot arm as a function of the configuration of the robot arm.
[0028] Still further in accordance with the second aspect, for instance, the robot arm is included.
DESCRIPTION OF THE DRAWINGS
[0029] Fig. 1 is a schematic view of a robot arm of a robot-assisted surgery (RAS) system operated in advanced planning in accordance with the present disclosure, relative to a stereotactic frame system holding a skull of a patient;
[0030] Fig. 2 is a block diagram of the RAS system of Fig. 1;
[0031] Fig. 3 is a display of an exemplary graphical user interface (GUI) used in advanced planning of a procedure to be performed with the RAS system of Figs. 1 and 2;
[0032] Fig. 4 is an exemplary display of a robot arm relative to the patient based on the advanced planning; and
[0033] Fig. 5 is a flowchart showing an exemplary method for planning an end effector positioning in robotic-assisted surgery.
DETAILED DESCRIPTION
[0034] Referring to the drawings and more particularly to Figs. 1 and 2, an exemplary robot-assisted surgery (RAS) system is generally shown at 10, and is used to perform surgical maneuvers on a patient. It may also be referred to or known as a robotic surgery system, robot surgery system, etc. As observed from subsequent figures, neurosurgery, i.e., brain surgery, using a stereotactic frame system F is one of numerous types of patient procedures that may be assisted with the RAS system 10, and is merely given as an example, as other types of procedures may benefit from computer-assisted navigation. The RAS system 10 may consequently have one or more processing units dedicated to operating a workflow of a surgical procedure. The RAS system 10 may therefore include a non-transitory computer-readable memory communicatively coupled to the one or more processing units and may have computer-readable program instructions executable by the one or more processing units to operate the workflow described herein. In an embodiment, the RAS system 10 drives a surgical robot 20 used autonomously, and/or as an assistive or collaborative tool for an operator (e.g., surgeon).
[0035] The RAS system 10 is shown relative to a patient’s skull B in some of the figures, but only as an example. The system 10 could be used to treat other pathologies or in other surgical procedures, and this may include stereotactic surgical procedures such as ablations, biopsies, injections, radiosurgery, implantation, and stereoelectroencephalography (SEEG), among other examples. The system 10 could be used for orthopedic surgery and may therefore be used with bones, including non-exhaustively tibia, femur, hip joint, spine, and shoulder bones, or in other applications, including dentistry, craniomaxillofacial surgery, other non-orthopedic surgeries, etc.
[0036] The RAS system 10 may include the surgical robot 20, optical trackers 30, a tracker device 40, a RAS controller 50 (also known as a super controller 50), a tracking module 60, and a robot controller 70 (also known as a robot driver), or any combination thereof:
• The surgical robot 20 is one possible working end of the system 10. The surgical robot 20 may be used to perform bone alterations as planned by an operator and/or the RAS controller 50 and as controlled by the RAS controller 50. The surgical robot 20 may support a tool that is operated by personnel, such as a surgeon. For example, the tooling end, also known as the end effector, may be manipulated by the operator while supported by the surgical robot 20. The robot 20 may be the coordinate measuring machine (CMM) of the robotic surgery system 10;
• The optical trackers 30 are optionally present and may be positioned on the robot 20, on patient tissue (e.g., bones B), and/or on the tool(s) and surgical instruments, and provide tracking data for the robot 20, the patient and/or tools.
• The tracker device 40, also known as a sensor device, apparatus, etc., is optionally present and performs optical tracking of the optical trackers 30, so as to enable the tracking in space (a.k.a., navigation) of the robot 20, the patient and/or tools. The tracker device 40 may be optional as the RAS system 10 may rely on the self-tracking capacity of the robot 20 and/or on intraoperative imaging such as computerized tomography (CT) to track the robot 20, the tools, the stereotactic frame system F and/or the patient intraoperatively;
• The RAS controller 50, also known as the super controller, includes the processor(s) and appropriate hardware and software to run a computer-assisted surgery procedure in accordance with one or more workflows. The RAS controller 50 may include or operate the tracking device 40, the tracking module 60, and/or the robot controller 70. As described hereinafter, the RAS controller 50 may also drive the robot arm 20A through advanced planning in the surgical procedure. The RAS controller 50 may also guide an operator through the surgical procedure, by providing intraoperative data of position and orientation, and may therefore have the appropriate interfaces such as mouse, foot pedal, touchscreen, etc;
• The tracking module 60 is tasked with determining the position and/or orientation of the various relevant objects during the surgery procedure, such as the end effector of the robot arm 20A, the stereotactic frame system F, bone(s) B and tool(s), using data acquired by the tracking device 40 (if present) and by the robot 20, and/or obtained from the robot controller 70. The position and/or orientation may be used by the RAS controller 50 to control the robot arm 20A (see the data-flow sketch after this list);
• The robot controller 70 is tasked with powering or controlling the various joints of the robot arm 20A, based on operator demands or on surgery advanced planning in accordance with the present disclosure, and may also be referred to as a robot controller module that is part of the super controller 50. The robot controller 70 may also optionally calculate robot movements of the robot arm 20A, so as to control movements of the robot arm 20A autonomously in some instances, i.e., without intervention from the RAS controller 50;
• One or more additional cameras may be present, for instance as a complementary registration tool. The camera may for instance be mounted on the robot 20, such as on the robot arm 20A, such that the point of view of the camera is known in the frame of reference, also known as the coordinate system.
• The RAS system 10 may also include and/or track a variety of tools that are manipulated by the operator and/or by the surgical robot 20 in order to perform tasks on the patient. Tools T may include awls, drills, impactors, and saws, to name a few of the tools that may be used, as numerous other tools may be used. The stereotactic frame system F may also be referred to as a tool T. The surgical tool(s) may or may not be used depending on whether a surgical robot 20 is in the system 10. It is contemplated to have both the surgical robot 20 and the surgical tool(s) T in the RAS system 10.
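As an illustration of how the components listed above may interact, here is a minimal sketch of one pose-update cycle, in Python. All class and method names (read_encoders, locate_trackers, resolve_poses, update_navigation) are hypothetical placeholders, not the actual interfaces of the RAS system 10:

```python
# Illustrative data flow between robot, tracker device, tracking module
# and RAS controller. Names are placeholders, not the real interfaces.

def tracking_update_loop(robot, tracker_device, tracking_module, ras_controller):
    """One iteration of the pose-update cycle sketched above."""
    # Robot-side data: joint encoder readings give the arm configuration.
    joint_angles = robot.read_encoders()

    # Optional optical data: poses of trackers seen by the tracker device.
    tracker_poses = tracker_device.locate_trackers() if tracker_device else {}

    # The tracking module fuses both sources into object poses expressed
    # in the common coordinate system (e.g., the robot base frame).
    object_poses = tracking_module.resolve_poses(joint_angles, tracker_poses)

    # The RAS controller uses the poses to guide the operator and/or
    # drive the robot arm through the robot controller.
    ras_controller.update_navigation(object_poses)
```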
[0037] Still referring to Fig. 1, a schematic example of the surgical robot 20 is provided. The surgical robot 20 may have a robot arm 20A (a.k.a., robotic arm, serial arm, etc) standing from a base 21, for instance in a fixed relation relative to the operating-room (OR) table supporting the patient. The fixed positioning of the surgical robot 20 relative to the patient is a determinative factor in the precision of the surgical procedure, whereby the base 21 may have its casters (shown) or any other support structure locked into place so as not to move. The robot arm 20A has a plurality of joints 22 and links 23, of any appropriate form, to support a tool head 24, also referred to as the end effector or as connected to the end effector, that interfaces with the patient or with a tool. The tool head 24 may be an appropriate type of tool and may be interchangeable, as a function of the nature of the surgical procedure. In particular, the tool head 24 may be a clamp that supports a tool, such as those shown at T, with the tools T being manipulated and/or actuated by a human operator. A support 24A may optionally be present, and may be used to secure the stereotactic frame system F to the robot 20, so as to limit or prevent any movement between the robot base 21 and the stereotactic frame system F during the surgical procedure. Other approaches and/or hardware may be used to ensure that there is no or limited/negligible movement between the robot base 21 and the stereotactic frame system F during the surgical procedure.
[0038] The robot arm 20A of the surgical robot 20 is shown as being a serial mechanism, arranged for the tool head 24 to be displaceable in a desired number of degrees of freedom (DOF). For example, the surgical robot 20 controls movements of the tool head 24. In an embodiment, the robot arm 20A is a 6-DOF articulated arm, i.e., X, Y, Z in the coordinate system, and pitch, roll and yaw. Fewer or additional DOFs may be present. For simplicity, only a generic illustration of the joints 22 and links 23 is provided, but more joints of different types may be present to move the tool head 24 in the manner described above. The joints 22 are powered for the robot arm 20A to move as controlled by the controller 50 in the six DOFs. Therefore, the powering of the joints 22 is such that the robot arm 20A may execute precise movements, such as moving along a single direction in one translation DOF, or being restricted to moving along a plane, among possibilities. Such surgical robots 20 are known, for instance as described in United States Patent Application Serial Nos. 11/610,728 and 12/452,142, incorporated herein by reference.
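To illustrate the constrained movements mentioned in the paragraph above (motion along a single direction, or restricted to a plane), the following is a minimal sketch, not the actual controller implementation, of projecting a commanded Cartesian velocity onto an allowed motion subspace:

```python
import numpy as np

def constrain_velocity(v_desired, allowed_directions):
    """Project a commanded Cartesian velocity onto the allowed motion
    subspace, e.g. a single axis (one column) or a plane (two columns).

    v_desired: (3,) velocity requested by the operator or planner.
    allowed_directions: (3, k) matrix whose columns span the permitted
    translations (k = 1 for a line, k = 2 for a plane).
    """
    A = np.asarray(allowed_directions, dtype=float)
    # Orthogonal projection: v = A (A^T A)^-1 A^T v_desired
    projector = A @ np.linalg.inv(A.T @ A) @ A.T
    return projector @ np.asarray(v_desired, dtype=float)

# Example: restrict motion to the X-Y plane of the robot base frame.
v = constrain_velocity([1.0, 2.0, 3.0], np.array([[1, 0], [0, 1], [0, 0]]))
# v == [1., 2., 0.]: the Z component of the command is removed.
```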
[0039] The robot arm 20A may include sensors 25 in its various joints 22 and links 23. The sensors 25 may be of any appropriate type, such as rotary encoders, optical sensors, and position switches, as a non-exhaustive list of potential sensors, for the position and orientation of the end effector, and of the tool in the end effector 24, to be known. More particularly, the tracking module 60 may determine the position and orientation of the robot 20 in a frame of reference of the robot 20, such as by obtaining the position (x, y, z) and orientation (phi, theta, rho) of the end effector 24 from the RAS controller 50 using the sensors 25 in the robot arm 20A, i.e., robot coordinates may be an integrated function of the robot 20 in that it may determine the position and orientation of its end effector 24 with respect to its coordinate system. Using the data from the sensors 25, the robot 20 may be the coordinate measuring machine (CMM) of the robotic surgery system 10, with a frame of reference (e.g., coordinate system, referential system) of the procedure being relative to the fixed position of the base 21 of the robot 20. The sensors 25 must provide the precision and accuracy appropriate for surgical procedures. The coupling of tools to the robot arm 20A may automatically cause a registration of the position and orientation of the tools in the frame of reference of the robot 20, though steps of calibration could be performed, as explained below.
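The pose determination from the sensors 25 can be illustrated with standard forward kinematics, chaining one homogeneous transform per joint. The Denavit-Hartenberg convention and parameters below are assumptions for illustration, not the actual kinematics of the robot arm 20A:

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one joint, standard Denavit-Hartenberg."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def end_effector_pose(joint_angles, dh_params):
    """Chain the per-joint transforms to express the end effector pose
    in the robot base frame (the CMM frame of reference).

    joint_angles: encoder readings, one angle per joint.
    dh_params: list of (d, a, alpha) per joint; placeholder values here.
    """
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T  # 4x4 pose of the end effector relative to the base 21
```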
[0040] Referring to Fig. 2, trackers 30 may optionally be present and may be secured to the robot arm 20A, to the stereotactic frame system F, to the patient, and/or to instruments. The trackers 30 may be known as trackable elements, markers, navigation markers, or active sensors (e.g., wired or wireless) that may for example include infrared emitters. In a variant, the trackers 30 are passive retro-reflective elements that reflect light. The trackers 30 have a known geometry so as to be recognizable through detection by the tracker device 40. For example, the trackers 30 may be retro-reflective lenses. The trackers 30 may be used with a tracker device 40 that may be embodied as an image capture device, capable of illuminating its environment. In a variant, the tracker device 40 may have two (or more) points of view, such that triangulation can be used to determine the position of the trackers 30 in space, i.e., in the coordinate system of the robotic surgery system 10. The tracker device 40 may emit light, or use ambient light, to observe the trackers 30 from its points of view, so as to determine a position of the trackers 30 relative to itself. By knowing the geometry of the arrangements of trackers 30, the tracker device 40 can produce navigation data enabling the locating of objects within the coordinate system of the robotic surgery system 10. In an embodiment, the tracker device 40 is of the type known as the Polaris products by Northern Digital Inc. The tracker device 40 may form the complementary part of the CMM function of the robotic surgery system 10, with the trackers 30 on the robot base 21 for example.
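The triangulation from two points of view can be sketched as follows, using the common midpoint method. This is an assumed approach given for illustration, not necessarily the algorithm of the tracker device 40:

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Locate a tracker seen from two camera viewpoints.

    c1, c2: (3,) camera centers; d1, d2: (3,) unit ray directions toward
    the tracker. Returns the midpoint of the shortest segment between
    the two rays. Assumes the rays are not parallel.
    """
    c1, d1 = np.asarray(c1, float), np.asarray(d1, float)
    c2, d2 = np.asarray(c2, float), np.asarray(d2, float)
    # Solve for ray parameters t1, t2 minimizing |(c1+t1*d1)-(c2+t2*d2)|
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    # Midpoint of the closest-approach segment between the two rays.
    return ((c1 + t1 * d1) + (c2 + t2 * d2)) / 2.0
```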
[0041] Although an embodiment featuring retro-reflective trackers 30 has been described (e.g., Navitrack® system), other image-based tracking technologies may be used, such as depth cameras, 3D cameras, etc, without the need for trackers or like trackable references on objects, or with other types of trackers, such as QR codes, etc. The use of the expression “image-capture device” herein is deemed to incorporate all such imaging devices used for navigation. The image-capture device is configured to generate an image, such as image data, of a field of view of the image-capture device. In some implementations, a controller or processor of the image-capture device is configured to perform one or more operations based on or using the image data, such as one or more image processing functions. For example, the one or more operations may include object detection, object recognition, and/or object tracking. Additionally, or alternatively, the image-capture device may be configured to generate an output, such as the image data or an indicator of an operation or a result of an operation. Moreover, even though the expression “image-capture device” may be used in the singular, the image-capture device may include more than a single point of view, for example using triangulation as an option with two points of view.
[0042] In an embodiment, the image-capture device is mounted onto a stationary structure, such as a ground stand, which may include a GUI and a processor unit. For example, the image-capture device may stand over the patient and look down on the surgical zone. The stand may be on casters or a like supporting structure to fix the stand in position on the ground. It is also contemplated to use other types of structures or mechanisms to support the image-capture device, such as a ceiling-mounted arm, a wall-mounted arm, a table-mounted arm, a console-mounted arm, etc.
[0043] In a variant, one or more reference trackers may be provided on the robot base 21, or the position and/or orientation of the robot base 21 may be calculated by tracking the end effector 24 of the robot arm and using encoder signals from the joints in the robot arm to determine the position and/or orientation of the robot base 21, as redundant information, and vice versa. Then, in situations where line of sight to the reference of the robot arm is compromised, the position and orientation of the end effector 24 of the robot arm may still be calculated.
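This redundancy amounts to simple transform algebra on 4x4 homogeneous poses, as sketched below under that assumption (the function names are illustrative, not part of the system):

```python
import numpy as np

def base_pose_from_tracking(T_world_ee, T_base_ee):
    """Recover the robot base pose in the tracker (world) frame from the
    optically tracked end effector pose (T_world_ee) and the pose given
    by the joint encoders (T_base_ee):
    T_world_base = T_world_ee @ inv(T_base_ee)."""
    return T_world_ee @ np.linalg.inv(T_base_ee)

def ee_pose_without_line_of_sight(T_world_base, T_base_ee):
    """Fallback when line of sight to the end effector reference is
    compromised: combine the known base pose with encoder data."""
    return T_world_base @ T_base_ee
```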
[0044] The RAS controller 50 has a processor unit 51 and a non-transitory computer-readable memory 52 communicatively coupled to the processing unit 51 and configured for executing computer-readable program instructions executable by the processing unit 51 to perform some functions, such as tracking the patient tissue and tools, using the position and orientation data from the robot 20 and the readings from the tracker device 40. Accordingly, as part of the operation of the RAS controller 50, the computer-readable program instructions may include an advanced planning system that may be viewed by a user or operator as a GUI on one or more of the interfaces of the robotic surgery system 10. It is via this or these interfaces that the user or operator may interface with the robotic surgery system, be guided by a surgical workflow, obtain navigation data, etc. The RAS controller 50 may also control the movement of the robot arm 20A via the robot controller module 70. The robotic surgery system 10 may comprise various types of interfaces I/F for the information to be provided to the operator. The interfaces I/F may include monitors and/or screens, including wireless portable devices (e.g., phones, tablets), audio guidance, LED displays, and head-mounted displays for virtual reality, augmented reality, or mixed reality, among many other possibilities. For example, the interface I/F comprises a graphic-user interface (GUI) operated by the system 10. The RAS controller 50 may also display images captured pre-operatively, or using cameras associated with the procedure (e.g., 3D camera, laparoscopic cameras, tool-mounted cameras), for instance to be used in the collaborative/cooperative control mode of the system 10, or for visual supervision by the operator of the system 10, with augmented reality for example. The RAS controller 50 may drive the robot arm 20A, in performing the surgical procedure based on the surgery planning achieved pre-operatively, or in maintaining a given position and orientation to support a tool. The RAS controller 50 may run various modules, in the form of algorithms, code, non-transient executable instructions, etc, in order to operate the robotic surgery system 10 in the manner described herein. The RAS controller 50 may be part of any suitable processor unit, such as a personal computer or computers including laptops and desktops, tablets, servers, etc.
[0045] The tracking module 60 may be a subpart of the RAS controller 50, or an independent module or system. The tracking module 60 receives the position and orientation data from the robot 20 and the readings from the tracker device 40 if present. The tracking module 60 may hence determine the relative position of the objects relative to the robot arm 20A in a manner described below. The tracking module 60 may also be provided with models of the objects to be tracked. For example, the tracking module 60 may track bones and tools, and hence may use virtual bone models and tool models. The bone models may be acquired from pre-operative imaging (e.g., MRI, CT-scans), for example in 3D or in multiple 2D views, including with 2D X-ray to 3D bone model technologies, as described below in terms of advanced planning. The virtual bone models may also include some image processing done preoperatively, for example to remove soft tissue or refine the surfaces that will be exposed and tracked. The virtual bone models may be of greater resolution at the parts of the bone that will be tracked during surgery, such as the knee articulation in knee surgery. The bone models may also carry additional orientation data, such as various axes (e.g., longitudinal axis, mechanical axis, etc). The bone models may therefore be patient specific. It is also considered to obtain bone models from a bone model library, with the data obtained from the video images used to match a generated 3D surface of the bone with a bone from the bone atlas. The virtual tool models may be provided by the tool manufacturer, or may also be generated in any appropriate way so as to be a virtual 3D representation of the tool(s).
[0046] Still referring to Fig. 2, the RAS controller 50 may have the robot controller 70 integrated therein. However, the robot controller 70 may be physically separated from the RAS controller 50, for instance by being integrated into the robot 20 (e.g., in the robot base 21). The robot controller 70 is tasked with powering and/or controlling the various joints of the robot arm 20A. The robot controller 70 may also optionally calculate robot movements of the robot arm 20A, so as to control movements of the robot arm 20A autonomously in some instances, i.e., without intervention from the RAS controller 50. There may be some force feedback provided by the robot arm 20A to avoid damaging the bones, and to avoid impacting other parts of the patient or equipment and/or personnel. The robot controller 70 may perform actions based on the surgery planning. The surgery planning may be a module programmed specifically for any given patient, according to the parameters of surgery desired by an operator such as an engineer and/or surgeon. The parameters may include geometry of selected, planned bone cuts, planned cut depths, sequence or workflow of alterations with a sequence of surgical steps and tools, tools used, etc.
[0047] The use of the tracking system may provide tracking data to perform surgical navigation. For example, the tracking system may assist in performing the calibration of the patient skull and stereotactic frame system with respect to the coordinate system (i.e., the locating of the bone in the coordinate system), for subsequent navigation in the X, Y, Z coordinate system.
[0048] According to an embodiment, the tracking system includes an image capturing device, also known as a navigation camera, that optically sees and recognizes references (e.g., retro-reflective references, optically recognizable references), which may be part of the tools used in surgery, so as to track the robot arm 20A of the surgical robot 20 and the body part in six DOFs, namely in position and orientation. Therefore, the RAS controller 50 continuously updates the position and/or orientation of the surgical robot 20, tools T, F and patient bones B in the X, Y, Z coordinate system using the data from the tracking system.
[0049] Now that the various components of the RAS system 10 have been described, a contemplated approach to advanced planning is set forth, with reference to Figs. 3 and 4, in the context of a neurological surgical procedure in which the stereotactic frame system F holds the patient’s skull B, such as in the manner shown in Fig. 1. In an embodiment, the steps and maneuvers described for Fig. 3 are performed pre-operatively, i.e., before a surgical procedure actually commences. The pre-operative steps may be performed hours, days, or weeks before the surgical procedure, or during the preparative stages of surgery (e.g., peri-operatively). The advanced planning may be done using a graphical user interface (GUI) display that may be displayed on a monitor(s) or any other type of screen(s) from the interface I/F in Fig. 2. The GUI display is operated by an advanced planning system that may or may not be integrated in the RAS system 10. The advanced planning system may be described as being a system for planning an end effector positioning in robotic-assisted surgery, in the various maneuvers performed by the robot arm based on target moves. The advanced planning system may include a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit. For example, the processing unit may be the processing unit 51 in Fig. 2, and the non-transitory computer-readable memory may be the memory 52 in Fig. 2. It is however possible to have the advanced planning system in a dedicated computer, or any other computing environment.
[0050] Referring to Fig. 3, a GUI display is generally shown at 100. The GUI display 100 is an example of a GUI that can be used for advanced planning in accordance with the present disclosure, and that is an interface and/or output from the advanced planning system of the present disclosure. The GUI display 100 has one or more windows, with an example of four windows shown in Fig. 3. The display may include a first window 100A featuring a patient-specific image (e.g., a model of the patient’s head) in three dimensions (3D) in the top left corner, along with the model of the stereotactic frame system F (or other hardware component), in a simulation. In a variant, the patient body part in window 100A is in the form of a model, such as a 3D model. The stereotactic frame system F is also a 3D model, with the window 100A displaying a simulated position and orientation of the model of the stereotactic frame system F relative to the model of the head, as if the stereotactic frame were secured to the patient’s head. The models are attached to one another, such that the modeled assembly of head B and stereotactic frame F may be rotated concurrently to change a point of view. Stated differently, the models may be displaced as a single assembly. In a variant, the window 100A may also be used to change a relative position and/or orientation of the stereotactic frame relative to the model of the head. For example, three degrees of translation and three degrees of rotation may be possible between the hardware component and the body part, or between the hardware components, such as when a vertical bar of the stereotactic frame is moved.
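The behavior of the two models as a single assembly can be sketched with homogeneous transforms: the frame pose is stored relative to the head, so rotating the head for a new point of view carries the frame along. The function names below are illustrative assumptions, not the planning system's API:

```python
import numpy as np

def frame_pose_in_display(T_display_head, T_head_frame):
    """Pose of the stereotactic frame model in the display coordinate
    system, given the head pose and the user-adjusted head-to-frame
    offset (up to three translations and three rotations)."""
    return T_display_head @ T_head_frame

def rotate_assembly(T_display_head, T_head_frame, R_view):
    """Change the point of view: one rotation is applied to the head,
    and the frame follows because its pose is defined relative to it."""
    T_rot = np.eye(4)
    T_rot[:3, :3] = R_view  # 3x3 rotation chosen by the user
    new_head = T_rot @ T_display_head
    return new_head, frame_pose_in_display(new_head, T_head_frame)
```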
[0051] The movements may be dictated by a user with appropriate interfaces, such as a mouse, a keyboard, a touch screen (e.g., a smart device), a control bar 101 with command buttons, to name a few examples.
[0052] The model of the body part or of any component at the surgical site may result from imaging and modelling of the patient’s body part. The imaging may be from images
acquired by a 3D camera that may for example use structured light. In a variant, the model of the body part may be partially or completely obtained from a bone atlas, based on some anatomical features of the patient. The model from the bone atlas may be adjusted in different ways based on dimensions obtained from the patient, for the generic model to be morphed toward the patient’s anatomy.
[0053] The model of the hardware component may be obtained from the manufacturer, or may be generated in any appropriate way. There are numerous types of stereotactic frame systems that may be employed, in different configurations, dimensions, etc. In an embodiment, the user of the GUI display 100 may be given the choice between two or more distinct stereotactic frame systems, with the advanced planning system for the GUI display 100 having the capacity to retrieve the appropriate model and overlay it onto the skull.
[0054] In a variant, the stereotactic frame system F may have adjustable features in its tangible configuration. For example, the stereotactic frame system F illustrated in Fig. 4 may have a rectangular frame F1 upon which are mounted fixation posts F2. Other frame configurations are possible. The fixation posts F2 may for example move in a vertical direction relative to the frame F1, for a user to then lock the fixation posts F2 at a desired position. In a variant, it may also be possible for the fixation posts F2 to be displaceable horizontally. In the illustrated embodiment, fixation screws F3 or equivalent skull abutments may be at the end of the fixation posts F2 and may be displaced in translation. For example, the fixation screws F3 are screwingly engaged to the fixation posts F2 such that a rotation of the fixation screws F3 results in an axial displacement, i.e., a displacement along a longitudinal axis thereof. The fixation screws F3 may not be backdrivable. Although not shown, additional components may be present in the stereotactic frame system F of Fig. 4, such as ear plugs that may be mounted to the fixation posts F2 and that may also be displaceable relative to the rectangular frame F1.
[0055] As observed from Fig. 3, a similar stereotactic frame system F is shown as being modeled relative to the model of the patient’s head. The user may be given the possibility of modifying the arrangement of components of the modeled stereotactic frame. For example, different entry fields may be provided in the control bar 101 to cause adjustments of the model, such as the adjustable features described above for Fig. 4. Moreover, in an embodiment, the models may be 3D solid models, such that the advanced planning system may provide automated position arrangements for the modeled stereotactic frame as a function of interferences. For example, the advanced planning system may adjust a relative height of the fixation post F2 when an interference (i.e., collision, encroachment) is detected. In Fig. 4, the two upper trajectory lines could represent trajectories that would be subject to collisions and should be avoided, as an example among others.
[0056] Therefore, a user may optionally perform different adjustments for the stereotactic frame using the GUI display 100, such as one or more of position and/or orientation adjustments for the model of the stereotactic frame relative to model of the head, stereotactic frame model selection, stereotactic frame configuration adjustments, to name a few. The advanced planning system may allow point-of-view selections by a user. The advanced planning system may provide compatibility suggestions for stereotactic frame models, stereotactic frame configurations, stereotactic frame locations, etc, based on patient-specific anatomical features.
[0057] The GUI display 100 may also have other windows, three of which are shown as 100B. The three additional windows show magnetic resonance imaging (MRI) images from different viewpoints, by which a user can identify one or more target locations to reach during the surgery. In a variant, the target location(s) has been identified prior to the use of the GUI display 100, though the GUI display 100 could be used to identify the target location(s). MRI images are just one possibility among others, and if images are provided in the windows, they can be from CT scans, X-rays, or from any other imaging modality. The imaging modality may depend on the nature of the surgical intervention for which planning is done. The views may be along sagittal, transverse and/or frontal planes of the patient, and/or could be aligned with a trajectory (e.g., in a plane of the trajectory, perpendicular to the trajectory as shown, along the trajectory as shown, or in an axial view as shown).
[0058] Using the various data forms, windows, images, the user of the GUI display 100 may then select one or more trajectories to reach the one or more target locations. As observed, target trajectories are displayed as 102 in Figs. 3 and 4. In an embodiment, the user may use any one or more of the windows to set a target trajectory,
and the other windows may then provide a corresponding position of the target trajectory. In setting trajectories, the user is assisted by the visual display, as the user can see where the stereotactic frame or other hardware component is located, such as relative to the trajectory. Accordingly, the user can avoid any interference or conflict through this visual guidance. In an embodiment, the advanced planning system may provide recommendations of trajectories.
[0059] Once trajectories 102 are obtained, the advanced planning system may perform a simulation to position a model of the robot arm relative to the body part and to the stereotactic frame, such as in the window featuring the 3D models. The advanced planning system may use a model of a part of the robot arm with an appropriate tool or guide at its end effector to effect the procedure along the trajectory or trajectories. The model could be adjusted to display the arrangement of its links required to provide a given orientation for the tool T, i.e., the robot arm configuration. In an embodiment, the advanced planning system may add to the display a model of part of the robot arm. By performing such a simulation, whether a display of the robot arm is provided or not, the advanced planning system can detect any encroachment, a.k.a., collision or interference, such as between part of the robot arm and the stereotactic frame, or another hardware component based on the nature of the surgical procedure. Accordingly, the user may be informed of any encroachment, such as between the outer envelope of the robot arm and the stereotactic frame. In the absence of encroachment, the user may be provided with a clearance distance value, i.e., the shortest measured distance between part of the robot arm (including any tool it may support, or the outer surface or envelope of the robot arm) and the stereotactic frame, or other hardware component. In a variant, the user or the advanced planning system may have entered or provided predefined clearance distance values, so as to then indicate when the minimum clearance distance value has not been respected.
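A minimal sketch of the clearance and encroachment checks described above, assuming the objects are approximated as point clouds sampled on their surfaces (the actual system may use solid models and more efficient distance queries):

```python
import numpy as np

def minimum_clearance(points_a, points_b):
    """Shortest measured distance between two objects approximated as
    surface point clouds (e.g., robot arm envelope vs. stereotactic
    frame). Brute force for clarity; a KD-tree would be used at scale."""
    a = np.asarray(points_a, float)[:, None, :]   # (Na, 1, 3)
    b = np.asarray(points_b, float)[None, :, :]   # (1, Nb, 3)
    return float(np.sqrt(((a - b) ** 2).sum(axis=2)).min())

def clearance_respected(points_a, points_b, min_clearance_mm=10.0):
    """Flag whether a predefined minimum clearance distance value
    (illustrative default) is respected between the two objects."""
    return minimum_clearance(points_a, points_b) >= min_clearance_mm
```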
[0060] As a result thereof, the user may thus take various corrective actions, such as using a different stereotactic frame model (in the case of stereotactic neurosurgery), repositioning the stereotactic frame relative to the skull, finding another trajectory(ies), or suggesting a different robot arm configuration (i.e., orientation between links of the robot arm) if such an option is available.
[0061] The advanced planning system may achieve a plan in which there is no encroachment and/or in which the minimum clearance values are respected. The advanced planning system may then identify the corresponding robot arm configuration. In a variant, the robot arm configuration includes relative joint orientations by which the end effector is properly positioned and/or oriented for the trajectory to be defined between the end effector and the target location. The robot arm configuration may include multiple joint orientations for multiple trajectories for multiple target locations. The plan may also include the specific details pertaining to the hardware component, such as, in the case of the stereotactic frame system F, the stereotactic frame model, the stereotactic frame arrangement of adjustable components, the relative position and/or orientation of the stereotactic frame relative to the patient’s head (such as in the form of images), etc. The plan may be in different forms, such as instructions for driving the robot arm 20A of the RAS system 10.
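The plan output could take a form along the lines of the sketch below; the field names are illustrative assumptions about one possible structure, not a defined format of the RAS system 10:

```python
from dataclasses import dataclass, field

@dataclass
class TrajectoryPlan:
    """One planned trajectory and the arm configuration realizing it."""
    target_location: tuple    # (x, y, z) of the anatomical target
    entry_point: tuple        # (x, y, z) entry along the trajectory
    joint_orientations: list  # relative joint orientations of the robot arm

@dataclass
class SurgicalPlan:
    """Plan output by the advanced planning system, usable both to
    drive the robot arm and to document the hardware setup."""
    trajectories: list = field(default_factory=list)  # [TrajectoryPlan]
    frame_model: str = ""                             # selected frame model
    frame_adjustments: dict = field(default_factory=dict)  # e.g., post heights
```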
[0062] Referring to Fig. 5, an exemplary method for planning an end effector positioning in robotic-assisted surgery is generally shown at 200. The method 200 may be performed for example using the RAS system 10, with the surgical robot 20, during neurosurgery using a stereotactic frame system, such as that shown in Figs. 3 and 4. The method 200 may be performed with other types of RAS systems, in other types of surgery. However, for simplicity, reference is made to components of the RAS system 10. For example, the method 200 may be embodied as computer-readable program instructions in the non-transitory computer-readable memory 52, executable by the processing unit 51 communicatively coupled to the memory 52.
[0063] According to 201, a model of at least a portion of a robot arm of the surgical robot is obtained. The surgical robot may be the surgical robot 20 described herein, for example.
[0064] According to 202, a target location of an anatomical feature to be accessed is obtained. This may include multiple target locations. For example, the target location(s) is obtained in the form of coordinates from the use of the GUI display 100 of Fig. 3. The target location may be a target location of a brain within a skull, as an example. Obtaining the location of the anatomical feature to be accessed may include obtaining at least one trajectory to access the location of the anatomical feature.
[0065] According to 203, a model of a body part including the anatomical feature and a model of at least one hardware component in an environment of the body part are obtained, with the model of the body part optionally being a real image of the body part, or a composite of a model and real images. In an embodiment, the hardware component is a stereotactic frame. Obtaining the model or real images of the body part including the anatomical feature and the model of the at least one hardware component may include adjusting a relative position and/or location of the hardware component relative to the body part as a function of user input. Obtaining the model of the body part including the anatomical feature and the model of at least one hardware component may include switching the model of the at least one hardware component as a function of a selection of hardware component by a user. Obtaining the model of the body part including the anatomical feature and the model of at least one hardware component may include obtaining the model of a skull and of a stereotactic frame system.
[0066] According to 204, using the models, a configuration of the robot arm is identified, which configuration avoids the hardware component(s) and defines a trajectory between an end effector of the robot arm and the target location of the anatomical feature. If there is an encroachment, the system may modify a link arrangement of the robot arm dynamically. Identifying a configuration of the robot arm avoiding the hardware component(s) includes simulating a configuration of the robot arm relative to the hardware component, simulating movements between links of the robot arm, and identifying an encroachment.
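A minimal sketch of step 204, reusing the minimum_clearance helper from the earlier sketch: candidate link arrangements (e.g., alternate inverse kinematics solutions for the same trajectory) are simulated and the first one free of encroachment is kept. The helper names are assumptions for illustration:

```python
def find_clear_configuration(candidate_configurations, arm_points_for,
                             hardware_points, min_clearance_mm=10.0):
    """Scan candidate robot arm configurations reaching the trajectory
    and return the first one whose simulated envelope avoids the
    hardware component by at least the required clearance.

    candidate_configurations: iterable of joint-angle vectors.
    arm_points_for: function mapping joint angles to sampled surface
    points of the arm model (the simulated configuration).
    hardware_points: sampled surface points of the hardware component.
    """
    for q in candidate_configurations:
        arm_points = arm_points_for(q)  # simulate this link arrangement
        if minimum_clearance(arm_points, hardware_points) >= min_clearance_mm:
            return q  # configuration with no encroachment
    return None  # none accessible: replan the trajectory or hardware setup
```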
[0067] According to 205, the configuration of the robot arm may be output as a plan for driving the robot. Accordingly, the RAS system 10 may be driven as a function of the plan.
[0068] The planning performed by the method and/or system described herein simulates surgery elements. As an example, it could consist of proposing a model of the hardware component, such as a stereotactic frame, to simulate or emulate the one that will be used during the surgical procedure. In the context of stereotactic neurosurgery, different types of frames can be used intraoperatively (e.g., Leksell, CRW, Mayfield, Inomed). Models of the hardware components may be obtained in 3D during the planning phase to allow the surgeon to simulate their position and/or arrangement relative to a model of the patient’s head, and adapt the positioning of selected trajectories accordingly. In a variant, though the hardware component models provided may not be exactly and fully faithful to the hardware component used in surgery, the simulations and steps described herein allow planning to be performed to avoid encroachments during surgery. Thus, the advanced planning system may provide suitably precise guidance at a pre-operative stage to allow an operator such as a surgeon to plan and optimize the time and comfort of the surgery on the day.
[0069] As another example, the advanced planning system could be used to simulate the position of the bone fiducials based on the different planned trajectories. Moreover, the advanced planning system may allow an improvement of the trajectory accessibility verification by simulating the position of a tool and/or instrument holder at the end effector of a robot arm, and may thus permit modifications to a position and/or orientation of the tool and/or instrument holder at the end effector, so that a setup is optimized, notably by the absence of encroachment, or by having suitable clearance distances between the tool and/or instrument holder as well as the robot arm and the patient environment intra-operatively.
[0070] Accordingly, the advanced planning system may allow a reduction of risk of redefining some trajectories during the surgery (and subsequent complications). Thus, the advanced planning system in some instances may reduce the surgery time. The use of the advanced planning system may also allow an increase in ergonomics during the surgical procedure, as the surgeon has more information on a planned surgery, and can anticipate how to position the stereotactic frame for the surgery.
[0071] The advanced planning system may essentially be described as adding to the planning of robot positioning elements, the planning of the surgery elements. For each robot positioning element, the plan may feature data on the end effector instrument and the planning of the surgeon’s instruments. In the example of neurosurgery, robot positioning elements are trajectories, and a surgery element can be the stereotactic frame (or other head or skull fixing frame). For each trajectory, the tool or instrument holder orientation and surgeon’s instruments are planned.
[0072] The advanced planning system may also perform an accessibility assessment. To achieve this, a 3D representation of the head frame is obtained, as well as like models of surgical instruments and part of the robot arm. Model representations can be schematic, can be the result of a conversion of the mechanical drawings of those objects, and/or can be generated as a 3D model from a scan of the robot arm. To be able to perform an accessibility analysis, one important point is that the dimensions of the 3D models must be as accurate as possible. Then, the 3D models are used by the advanced planning system described above to translate and rotate these 3D objects relative to one another, to allow an operator to visualize relative position and/or orientation in addition to planning trajectories. The planning of the trajectories is realized by the advanced planning system via the visual representation (2D or 3D) of patient exams, upon which a 2D or 3D representation of the trajectories may be overlaid.
[0073] In accordance with another aspect, the system described herein is for planning an end effector positioning in robotic-assisted surgery, and may include a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: obtaining a model of at least a portion of a robot arm of the surgical robot, obtaining a target location of an anatomical feature to be accessed, obtaining a model of a body part including the anatomical feature, using the models, identifying a configuration of the robot arm defining a trajectory between an end effector of the robot arm and the target location of the anatomical feature, and outputting the configuration of the robot arm as a plan for driving the robot. Having a hardware component in an environment of the body part is not mandatory. For example, even without any hardware component, it is relevant for the surgeon to visualize the posture of the robot at the target location. Because this posture could hinder the surgeon in performing the surgical act, the surgeon could change the final posture of the robot to find one that is more convenient.
[0074] The advanced planning system may achieve a complete accessibility analysis, may measure distances, compute the shortest distance between objects (i.e., minimum clearance as described above), and detect encroachments between objects, for example by giving shapes to the objects with a customizable error margin.
Claims
1. A system for planning an end effector positioning in robotic-assisted surgery, comprising: a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: obtaining a model of at least a portion of a robot arm of the surgical robot, obtaining a target location of an anatomical feature to be accessed, obtaining a model of a body part including the anatomical feature and a model of at least one hardware component in an environment of the body part, using the models, identifying a configuration of the robot arm avoiding the at least one hardware component and enabling access to the target location of the anatomical feature by an end effector of the robot arm, and outputting the configuration of the robot arm as a plan for driving the robot arm.
2. The system according to claim 1, wherein the computer-readable program instructions executable by the processing unit whereby: obtaining the location of the anatomical feature to be accessed includes obtaining at least one trajectory to access the target location of the anatomical feature.
3. The system according to any one of claims 1 and 2, wherein the computer- readable program instructions executable by the processing unit whereby: obtaining the model of the body part including the anatomical feature and the model of at least one hardware component includes adjusting a relative position and/or location of the hardware component relative to the body part as a function of user input.
4. The system according to any one of claims 1 to 3, wherein the computer- readable program instructions executable by the processing unit whereby: obtaining the model of the body part including the anatomical feature and the model of at least one hardware component includes switching the model of the at least one hardware component as a function of a selection of hardware component by a user.
5. The system according to claim 1, wherein the computer-readable program instructions executable by the processing unit whereby: obtaining the model of the body part including the anatomical feature and the model of at least one hardware component includes obtaining the model of a skull and of a stereotactic frame system.
6. The system according to claim 5, wherein the computer-readable program instructions executable by the processing unit whereby: obtaining the model of the skull and of the stereotactic frame system includes adjusting a relative position and/or location of the stereotactic frame system relative to the skull as a function of user input.
7. The system according to claim 6, wherein the computer-readable program instructions executable by the processing unit whereby: adjusting a relative position and/or location of the stereotactic frame system relative to the skull as a function of the user input includes displaying in real-time the adjusted relative position and/or location of the stereotactic frame system relative to the skull.
8. The system according to claim 7, wherein the computer-readable program instructions executable by the processing unit whereby: outputting the adjusted relative position and/or location of the stereotactic frame system relative to the skull.
9. The system according to any one of claims 5 to 8, wherein the computer- readable program instructions executable by the processing unit whereby: obtaining the model of the skull and of the stereotactic frame system includes adjusting a relative position and/or location of at least one fixation post of the stereotactic frame system relative to a frame of the stereotactic frame system as a function of a user input.
10. The system according to claim 9, wherein the computer-readable program instructions are executable by the processing unit for: displaying in real-time the adjusted relative position and/or location of the at least one fixation post of the stereotactic frame system relative to the frame of the stereotactic frame system.
11. The system according to claim 10, wherein the computer-readable program instructions are executable by the processing unit for: outputting the configuration of the stereotactic frame system.
12. The system according to any one of claims 5 to 11, including the stereotactic frame system.
13. The system according to any one of claims 1 to 12, wherein the computer-readable program instructions are executable by the processing unit whereby: identifying a configuration of the robot arm avoiding the at least one hardware component includes modifying a link arrangement of the robot arm dynamically.
14. The system according to any one of claims 1 to 13, wherein the computer-readable program instructions are executable by the processing unit for: displaying the models of the body part, of the at least one hardware component and of the robot arm on a graphical-user interface.
15. The system according to any one of claims 1 to 14, wherein the computer-readable program instructions are executable by the processing unit whereby: identifying a configuration of the robot arm avoiding the at least one hardware component includes simulating a configuration of the robot arm relative to the hardware component, and identifying an encroachment.
16. The system according to any one of claims 1 to 15, wherein the computer-readable program instructions are executable by the processing unit for: driving the robot arm as a function of the configuration of the robot arm.
17. The system according to any one of claims 1 to 16, including the robot arm.
18. A system for planning an end effector positioning in robotic-assisted surgery, comprising: a processing unit; and a non-transitory computer-readable memory communicatively coupled to the processing unit and comprising computer-readable program instructions executable by the processing unit for: obtaining a model of at least a portion of a robot arm of a surgical robot, obtaining a target location of an anatomical feature to be accessed, obtaining a model of a body part including the anatomical feature, using the models, identifying a configuration of the robot arm defining a trajectory between an end effector of the robot arm and the target location of the anatomical feature, and outputting the configuration of the robot arm as a plan for driving the robot arm.
19. The system according to claim 18, wherein the computer-readable program instructions are executable by the processing unit whereby: obtaining the target location of the anatomical feature to be accessed includes obtaining at least one trajectory to access the target location of the anatomical feature.
20. The system according to any one of claims 18 and 19, wherein the computer-readable program instructions are executable by the processing unit whereby: identifying a configuration of the robot arm includes modifying a link arrangement of the robot arm dynamically.
21. The system according to any one of claims 18 to 20, wherein the computer-readable program instructions are executable by the processing unit for: displaying the models of the body part and of the robot arm on a graphical-user interface.
22. The system according to any one of claims 18 to 21, wherein the computer-readable program instructions are executable by the processing unit for: driving the robot arm as a function of the configuration of the robot arm.
23. The system according to any one of claims 18 to 22, including the robot arm.
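The claims recite planning logic without binding it to a specific algorithm. Purely as a hedged illustration of what claims 1 and 15 describe — searching for an arm configuration that reaches the target while a simulated check flags encroachment on hardware models — the following Python sketch reduces the arm to a planar linkage and each hardware component to a bounding sphere; every name in it (`fk`, `encroaches`, `plan_configuration`, the tolerances) is a hypothetical choice, not disclosed in the patent.

```python
# Illustrative sketch only, not the patented method: sample joint
# configurations and keep the first that reaches the target without
# encroaching on hardware modelled as bounding spheres (claims 1, 15).
import numpy as np

def fk(joint_angles, link_lengths):
    """Planar forward kinematics: position of each joint, base to end effector."""
    pts = [np.zeros(2)]
    theta = 0.0
    for q, l in zip(joint_angles, link_lengths):
        theta += q
        pts.append(pts[-1] + l * np.array([np.cos(theta), np.sin(theta)]))
    return np.array(pts)

def segment_to_point(p0, p1, c):
    """Minimum distance from link segment p0-p1 to a sphere centre c."""
    d = p1 - p0
    t = np.clip(np.dot(c - p0, d) / np.dot(d, d), 0.0, 1.0)
    return np.linalg.norm(c - (p0 + t * d))

def encroaches(pts, spheres, clearance):
    """Claim 15-style test: does any link pass within `clearance` of a sphere?"""
    return any(
        segment_to_point(p0, p1, np.asarray(c, float)) < r + clearance
        for p0, p1 in zip(pts[:-1], pts[1:])
        for c, r in spheres
    )

def plan_configuration(target, link_lengths, spheres,
                       clearance=0.01, tol=0.02, n_samples=20000, seed=0):
    """Claim 1-style search: output a reachable, non-encroaching configuration
    as the plan for driving the arm, or None if the target is inaccessible."""
    rng = np.random.default_rng(seed)
    target = np.asarray(target, float)
    for _ in range(n_samples):
        q = rng.uniform(-np.pi, np.pi, size=len(link_lengths))
        pts = fk(q, link_lengths)
        if np.linalg.norm(pts[-1] - target) < tol and not encroaches(pts, spheres, clearance):
            return q
    return None
```

A `None` result is the planning-stage outcome the claims are aimed at surfacing: the user learns before the procedure that the chosen hardware layout blocks the approach.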
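Claims 13 and 20 recite modifying a link arrangement of the robot arm dynamically. On a kinematically redundant arm, one conventional way to realize this (assumed here for illustration; the patent does not commit to it) is null-space motion: the joints move along a clearance-improving direction projected into the null space of the Jacobian, so the end effector holds its pose while the elbow swings away from the hardware.

```python
# Hedged sketch of dynamic link re-arrangement via null-space motion;
# `gradient` is an assumed clearance-improving joint-space direction,
# e.g. a numerical gradient of the minimum distance to hardware.
import numpy as np

def jacobian(q, lengths):
    """Position Jacobian (2 x n) of a planar serial arm's end effector."""
    n = len(q)
    theta = np.cumsum(q)
    J = np.zeros((2, n))
    for i in range(n):
        for j in range(i, n):
            J[0, i] -= lengths[j] * np.sin(theta[j])
            J[1, i] += lengths[j] * np.cos(theta[j])
    return J

def nullspace_step(q, lengths, gradient, step=0.05):
    """One re-arrangement step: move along `gradient` projected into the
    Jacobian null space, leaving the end effector fixed to first order."""
    J = jacobian(q, lengths)
    N = np.eye(len(q)) - np.linalg.pinv(J) @ J   # null-space projector
    return np.asarray(q, float) + step * (N @ np.asarray(gradient, float))
```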
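Claims 2, 18 and 19 tie the planned configuration to a trajectory reaching the target. A common representation (an assumption of this sketch, not a disclosure of the patent) is the line from an entry point to the target, converted into a desired end-effector pose whose tool axis, taken here as local +Z, lies along that line:

```python
# Hedged sketch: build a 4x4 end-effector pose aligned with the
# entry-to-target trajectory; `standoff` (metres) is an assumed parameter.
import numpy as np

def trajectory_pose(entry, target, standoff=0.05):
    entry = np.asarray(entry, float)
    target = np.asarray(target, float)
    z = target - entry
    z /= np.linalg.norm(z)                       # tool axis along the trajectory
    ref = np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = np.cross(ref, z)
    x /= np.linalg.norm(x)                       # any axis perpendicular to z
    y = np.cross(z, x)                           # completes a right-handed frame
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2] = x, y, z
    T[:3, 3] = entry - standoff * z              # hover behind the entry point
    return T
```

An inverse-kinematics solver would then look for a joint configuration realizing `T`, which is the "configuration of the robot arm defining a trajectory" that claim 18 outputs as the plan.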
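Claims 6, 7 and 9 describe adjusting the stereotactic frame system, or its fixation posts, relative to the skull as a function of user input, with the adjusted pose displayed in real time. A minimal sketch, assuming slider-style input mapped to a rigid transform; the parameter set and function names are hypothetical:

```python
# Hedged sketch: user sliders (translation + yaw about Z) become a rigid
# transform that repositions the frame model in the skull's coordinate frame;
# a UI loop would re-render the transformed vertices on every change.
import numpy as np

def user_transform(tx, ty, tz, yaw_deg):
    """4x4 rigid transform from user slider values."""
    c, s = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    T = np.eye(4)
    T[:3, :3] = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    T[:3, 3] = (tx, ty, tz)
    return T

def reposition(frame_vertices, T):
    """Apply T to an (n, 3) array of frame-model vertices."""
    homo = np.hstack([frame_vertices, np.ones((len(frame_vertices), 1))])
    return (homo @ T.T)[:, :3]
```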
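Finally, claims 16 and 22 recite driving the robot arm as a function of the planned configuration. As an illustration only (a real controller enforces velocity, acceleration and safety envelopes well beyond this), a joint-space ramp toward the planned configuration could look like:

```python
# Hedged sketch: stream joint set-points from the current configuration to
# the planned one, moving each joint at most `max_step` rad per tick.
import numpy as np

def joint_ramp(q_now, q_plan, max_step=0.02):
    q = np.asarray(q_now, float).copy()
    q_plan = np.asarray(q_plan, float)
    while np.any(np.abs(q_plan - q) > 1e-6):
        q += np.clip(q_plan - q, -max_step, max_step)
        yield q.copy()
```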
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463637686P | 2024-04-23 | 2024-04-23 | |
| US63/637,686 | 2024-04-23 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025222291A1 (en) | 2025-10-30 |
Family
ID=97489087
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CA2025/050583 (pending) | Advanced planning for accessibility assessment in robotic assisted surgery | 2024-04-23 | 2025-04-23 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025222291A1 (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2004058049A2 (en) * | 2002-12-17 | 2004-07-15 | Kenneth Lipow | Surgical robot and robotic controller |
| US20180049825A1 (en) * | 2016-08-16 | 2018-02-22 | Koh Young Technology Inc. | Surgical robot for stereotactic surgery and method for controlling stereotactic surgery robot |
| US20200297357A1 (en) * | 2019-03-22 | 2020-09-24 | Globus Medical, Inc. | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
| US20230346484A1 (en) * | 2022-04-28 | 2023-11-02 | Orthosoft Ulc | Robotic surgery system with user interfacing |
Similar Documents
| Publication | Title |
|---|---|
| US12369984B2 (en) | Surgical systems and methods for providing surgical guidance with a head-mounted device |
| US11844577B2 (en) | System and method for verifying calibration of a surgical system |
| EP4054468B1 (en) | Robotic positioning of a device |
| CN113229938B (en) | Surgical robots for positioning surgery |
| US20230165649A1 (en) | A collaborative surgical robotic platform for autonomous task execution |
| CN113940755A (en) | A surgical-image-integrated surgical planning and navigation method |
| JP6894466B2 (en) | Systems and methods related to robotic guidance in surgery |
| CN112370159A (en) | System for guiding a user to position a robot |
| CN112043382A (en) | Surgical navigation system and use method thereof |
| KR20150127032A (en) | System for arranging objects in an operating room in preparation for surgical procedures |
| US20230346484A1 (en) | Robotic surgery system with user interfacing |
| Korb et al. | Development and first patient trial of a surgical robot for complex trajectory milling |
| WO2019135805A1 (en) | Interactive anatomical positioner and a robotic system therewith |
| EP3733112A1 (en) | System for robotic trajectory guidance for navigated biopsy needle |
| US20220338886A1 (en) | System and method to position a tracking system field-of-view |
| WO2025222291A1 (en) | Advanced planning for accessibility assessment in robotic assisted surgery |
| KR20180044241A (en) | Surgical robot system for stereotactic surgery and method for controlling a stereotactic surgery robot |
| Wörn | Computer- and robot-aided head surgery |
| US20250134606A1 (en) | Robotic surgery system with user interfacing |
| US20240374329A1 (en) | Robotic system with force monitoring for computer-assisted surgery system |
| US20250134594A1 (en) | Intraoperative interfacing method for computer-assisted surgery system |
| EP4454597A2 (en) | Surgical robotic arm with proximity sensing |
| WO2025189285A1 (en) | Movable surgical tracker |
| KR20180100514A (en) | Surgical robot system for stereotactic surgery |
| WO2025006323A1 (en) | Locating features at pre-defined locations relative to a bone cut surface |