WO2025178732A1 - Robotic surgical system with dynamic resection combination - Google Patents
Robotic surgical system with dynamic resection combination
- Publication number
- WO2025178732A1 (PCT/US2025/013495)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cutting stage
- unified
- cutting
- tool
- bone
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/32—Surgical robots operating autonomously
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/76—Manipulators having means for providing feel, e.g. force or tactile feedback
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
- A61B2034/254—User interfaces for surgical systems being adapted depending on the stage of the surgical procedure
Definitions
- FIG. 2 is an illustration of a surgical system, according to an exemplary embodiment.
- FIG. 5 is an illustration of a robotic device, according to an exemplary embodiment.
- FIG. 6 is a flowchart of a process of controlling a surgical robot, according to some embodiments.
- FIG. 7 is a flowchart of a process of controlling a surgical robot, according to some embodiments.
- FIG. 9 is a flowchart of a process of controlling a surgical robot, according to some embodiments.
- the computer is also programmed to control the robot to provide, absent selection of the option, the first cutting stage and the second cutting stage sequentially.
- the computer is also programmed to control the robot to provide, in response to user selection of the option, a unified cutting stage based on the first cutting stage and the second cutting stage.
- Another implementation of the present disclosure is at least one non-transitory computer-readable medium storing program instructions that, when executed by one or more processors, cause the one or more processors to perform operations (e.g., such instructions can be stored on one or more media, memory devices, etc., for execution to perform the operations described herein).
- the operations include generating an option to combine a first cutting stage of a surgical plan with a second cutting stage of the surgical plan in response to determining that the first cutting stage is eligible for combination with the second cutting stage by comparing first characteristics of the first cutting stage with second characteristics of the second cutting stage.
- the operations also include causing a robot to operate in accordance with selection or rejection of the option by generating, in response to the rejection of the option, a first control boundary or path for the first cutting stage and a second control boundary or path for the second cutting stage, the first control boundary or path distinct from the second control boundary or path and generating, in response to the selection of the option, a unified control boundary or path for unified execution of the first cutting stage and the second cutting stage.
- Another implementation of the present disclosure is a method for robotically-assisted surgery.
- the method includes generating an option to combine a first cutting stage of a surgical plan with a second cutting stage of the surgical plan in response to determining that the first cutting stage is eligible for combination with the second cutting stage by comparing first characteristics of the first cutting stage with second characteristics of the second cutting stage.
- the method also includes causing a robot to operate in accordance with selection or rejection of the option by generating, in response to the rejection of the option, a first control boundary or path for the first cutting stage and a second control boundary or path for the second cutting stage, the first control boundary or path distinct from the second control boundary or path and generating, in response to the selection of the option, a unified control boundary or path for unified execution of the first cutting stage and the second cutting stage.
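The publication leaves open which characteristics are compared when deciding eligibility. As a minimal illustration only, the Python sketch below tests eligibility using two invented characteristics, a shared tool type and an adjacency threshold; all field names and values are hypothetical, not the claimed method.

```python
from dataclasses import dataclass

@dataclass
class CuttingStage:
    """Hypothetical per-stage characteristics used for the comparison."""
    tool_type: str            # e.g. "sagittal_saw", "burr", "drill"
    target_bone: str          # e.g. "femur", "tibia"
    max_gap_mm: float = 10.0  # illustrative adjacency threshold

def eligible_for_combination(a: CuttingStage, b: CuttingStage, gap_mm: float) -> bool:
    """Stages are combinable here only if they use the same tool and their
    planned resection regions are close enough to cut in one pass."""
    return a.tool_type == b.tool_type and gap_mm <= min(a.max_gap_mm, b.max_gap_mm)

distal = CuttingStage("sagittal_saw", "femur")
chamfer = CuttingStage("sagittal_saw", "femur")
print(eligible_for_combination(distal, chamfer, gap_mm=4.0))  # True
```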
- a femur 101 as modified during a knee arthroplasty procedure is shown, according to an exemplary embodiment.
- the femur 101 has been modified with multiple planar cuts.
- the femur 101 has been modified by five substantially planar cuts to create five substantially planar surfaces, namely distal surface 102, posterior chamfer surface 104, posterior surface 106, anterior surface 108, and anterior chamfer surface 110.
- the planar surfaces may be achieved using a sagittal saw or other surgical device, for example a surgical device coupled to a robotic device as in the examples described below.
- the planar surfaces 102-110 are created such that the planar surfaces 102-110 will mate with corresponding surfaces of a femoral implant component.
- the positions and angular orientations of the planar surfaces 102-110 may determine the alignment and positioning of the implant component. Accordingly, operating a surgical device to create the planar surfaces 102-110 with a high degree of accuracy may improve the outcome of a joint replacement procedure.
- the femur 101 has also been modified to have a pair of pilot holes 120.
- the pilot holes 120 extend into the femur 101 and are created such that the pilot holes 120 can receive a screw, a projection extending from a surface of an implant component, or other structure configured to facilitate coupling of an implant component to the femur 101.
- the pilot holes 120 may be created using a drill, spherical burr, or other surgical device as described herein.
- the pilot holes 120 may have a pre-planned position, orientation, and depth, which facilitates secure coupling of the implant component to the bone in a desired position and orientation.
- the pilot holes 120 are planned to intersect with higher-density areas of a bone and/or to avoid other implant components and/or sensitive anatomical features. Accordingly, operating a surgical device to create the pilot holes 120 with a high degree of accuracy may improve the outcome of a joint replacement procedure.
- a tibia may also be modified during a joint replacement procedure.
- a planar surface may be created on the tibia at the knee joint to prepare the tibia to mate with a tibial implant component.
- one or more pilot holes 120 or other recess may also be created in the tibia to facilitate secure coupling of an implant component to the bone.
- the systems and methods described herein provide robotic assistance for creating the planar surfaces 102-110 and the pilot holes 120 at the femur 101, and/or a planar surface and/or pilot holes 120 or other recess on a tibia. It should be understood that the creation of five planar cuts and two cylindrical pilot holes as shown in FIG. 1 is an example only, and that the systems and methods described herein may be adapted to plan and facilitate creation of any number of planar or non-planar cuts, any number of pilot holes, any combination thereof, etc., for preparation of any bone and/or joint in various embodiments.
- planar surfaces 102-110, pilot holes 120, and any other surfaces or recesses created on bones of the knee joint can affect how well implant components mate to the bone as well as the resulting biomechanics for the patient after completion of the surgery. Tension on soft tissue can also be affected. Accordingly, systems and methods for planning the cuts which create these surfaces, facilitating intraoperative adjustments to the surgical plan, and providing robotic-assistance or other guidance for facilitating accurate creation of the planar surfaces 102-110, other surfaces, pilot holes 120, or other recesses can make surgical procedures easier and more efficient for healthcare providers and improve surgical outcomes.
- a surgical system 200 for orthopedic surgery is shown, according to an exemplary embodiment.
- the surgical system 200 is configured to facilitate the planning and execution of a surgical plan, for example to facilitate a joint-related procedure.
- the surgical system 200 is set up to treat a leg 202 of a patient 204 sitting or lying on table 205.
- the leg 202 includes femur 206 (e.g., femur 101 of FIG. 1) and tibia 208, between which a prosthetic knee implant is to be implanted in a total knee arthroplasty procedure.
- the robotic device 220 is configured to modify a patient’s anatomy (e.g., femur 206 of patient 204) under the control of the computing system 224.
- One embodiment of the robotic device 220 is a haptic device.
- “Haptic” refers to a sense of touch, and the field of haptics relates to, among other things, human interactive devices that provide feedback to an operator. Feedback may include tactile sensations such as, for example, vibration. Feedback may also include providing force to a user, such as a positive force or a resistance to movement.
- one use of haptics is to provide a user of the device with guidance or limits for manipulation of that device.
- a haptic device may be coupled to a surgical device, which can be manipulated by a surgeon to perform a surgical procedure.
- the surgeon's manipulation of the surgical device can be guided or limited through the use of haptics to provide feedback to the surgeon during manipulation of the surgical device.
- Another embodiment of the robotic device 220 is an autonomous or semi-autonomous robot.
- “Autonomous” refers to a robotic device’s ability to act independently or semi-independently of human control by gathering information about its situation, determining a course of action, and automatically carrying out that course of action.
- the robotic device 220, in communication with the tracking system 222 and the computing system 224, may autonomously complete the series of femoral cuts mentioned above without direct human intervention.
- the robotic device 220 includes a base 230, a robotic arm 232, and a surgical device 234, and is communicably coupled to the computing system 224 and the tracking system 222.
- the base 230 provides a moveable foundation for the robotic arm 232, allowing the robotic arm 232 and the surgical device 234 to be repositioned as needed relative to the patient 204 and the table 205.
- the base 230 may also contain power systems, computing elements, motors, and other electronic or mechanical systems necessary for the functions of the robotic arm 232 and the surgical device 234 described below.
- the robotic arm 232 thereby allows a surgeon to have full control over the surgical device 234 within a control object while providing force feedback along a boundary of that object (e.g., a vibration, a force preventing or resisting penetration of the boundary).
- the robotic arm 232 is configured to move the surgical device to a new pose automatically without direct user manipulation, as instructed by computing system 224, in order to position the robotic arm 232 as needed and/or complete certain surgical tasks, including, for example, cuts in a femur 206.
- the tracking system 222 can also be used to collect biomechanical measurements relating to the patient’s anatomy, assess joint gap distances, identify a hip center point, assess native or corrected joint deformities, or otherwise collect information relating to the relative poses of anatomical features. More particularly, the tracking system 222 determines a position and orientation (e.g., pose) of objects (e.g., surgical device 234, femur 206) with respect to a coordinate frame of reference and tracks (e.g., continuously determines) the pose of the objects during a surgical procedure.
- the tracking system 222 includes an optical tracking system. Accordingly, tracking system 222 includes a first fiducial tree 240 coupled to the tibia 208, a second fiducial tree 241 coupled to the femur 206, a third fiducial tree 242 coupled to the base 230, one or more fiducials attachable to surgical device 234, and a detection device 246 configured to detect the three-dimensional position of fiducials (e.g., markers on fiducial trees 240-242). Fiducial trees 240, 241 may be coupled to other bones as suitable for various procedures (e.g., pelvis and femur in a hip arthroplasty procedure).
- Each fiducial has a geometric relationship to a corresponding object, such that tracking of the fiducials allows for the tracking of the object (e.g., tracking the second fiducial tree 241 allows the tracking system 222 to track the femur 206), and the tracking system 222 may be configured to carry out a registration process to determine or verify this geometric relationship.
- Unique arrangements of the fiducials in the fiducial trees 240-242 (e.g., the fiducials in the first fiducial tree 240 are arranged in a different geometry than fiducials in the second fiducial tree 241) allow for distinguishing the fiducial trees, and therefore the objects being tracked, from one another.
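The registration techniques themselves are only cited here, not reproduced. For illustration, a standard point-based rigid registration, the textbook Kabsch/SVD solution, maps tracked points onto corresponding model points; this sketch is generic material, not the cited patents' specific method.

```python
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (R, t) mapping src onto dst, where
    src and dst are Nx3 arrays of corresponding points (Kabsch/SVD)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    return R, c_dst - R @ c_src

# Illustrative check: recover a known rotation and translation exactly.
rng = np.random.default_rng(0)
pts = rng.normal(size=(10, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
R, t = rigid_register(pts, pts @ R_true.T + np.array([1.0, 2.0, 3.0]))
print(np.allclose(R, R_true))  # True
```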
- the preoperative surgical plan includes the desired cuts, holes, surfaces, burrs, or other modifications to a patient's anatomy to be made using the surgical system 200.
- the preoperative plan may include the cuts necessary to form, on a femur, a distal surface, a posterior chamfer surface, a posterior surface, an anterior surface, and an anterior chamfer surface in relative orientations and positions suitable to be mated to corresponding surfaces of the prosthetic to be joined to the femur during the surgical procedure, as well as cuts necessary to form, on the tibia, surface(s) suitable to mate to the prosthetic to be joined to the tibia during the surgical procedure.
- the preoperative plan may include the modifications necessary to create holes (e.g., pilot holes 120) in a bone.
- the surgical plan may include the burr necessary to form one or more surfaces on the acetabular region of the pelvis to receive a cup and, in suitable cases, an implant augment.
- the processing circuit 260 may receive, access, and/or store a model of the prosthetic to facilitate the generation of surgical plans.
- the processing circuit facilitates intraoperative modifications to the preoperative plan.
- the processing circuit 260 is further configured to generate one or more haptic objects based on the preoperative surgical plan to assist the surgeon during implementation of the surgical plan by enabling constraint of the surgical device 234 during the surgical procedure.
- a haptic object may be formed in one, two, or three dimensions.
- a haptic object can be a line, a plane, or a three-dimensional volume.
- a haptic object may be curved with curved surfaces and/or have flat surfaces, and can be any shape, for example a funnel shape.
- Haptic objects can be created to represent a variety of desired outcomes for movement of the surgical device 234 during the surgical procedure.
- One or more of the boundaries of a three-dimensional haptic object may represent one or more modifications, such as cuts, to be created on the surface of a bone.
- a planar haptic object may represent a modification, such as a cut, to be created on the surface of a bone.
- a curved haptic object may represent a resulting surface of a bone as modified to receive a cup implant and/or implant augment.
- a line haptic object may correspond to a pilot hole to be made in a bone to prepare the bone to receive a screw or other projection.
- the processing circuit 260 is further configured to generate a virtual tool representation of the surgical device 234.
- the virtual tool includes one or more haptic interaction points (HIPs), which represent and are associated with locations on the surgical device 234.
- where the surgical device 234 is a spherical burr (e.g., as shown in FIG. 2), a HIP may represent the center of the spherical burr.
- the virtual representation of the sagittal saw may include numerous HIPs.
- Using multiple HIPs to generate haptic forces (e.g. positive force feedback or resistance to movement) on a surgical device is described in U.S. application Ser. No. 13/339,369, titled “System and Method for Providing Substantially Stable Haptics,” filed Dec. 28, 2011, and hereby incorporated by reference herein in its entirety.
- a virtual tool representing a sagittal saw includes eleven HIPs.
- references to an “HIP” are deemed to also include references to “one or more HIPs.”
- relationships between HIPs and haptic objects enable the surgical system 200 to constrain the surgical device 234.
- the patient's anatomy (e.g., femur 206) is registered to the virtual bone model of the patient's anatomy by any known registration technique.
- One possible registration technique is point-based registration, as described in U.S. Pat. No. 8,010,180, titled “Haptic Guidance System and Method,” granted Aug. 30, 2011, and hereby incorporated by reference herein in its entirety.
- registration may be accomplished by 2D/3D registration utilizing a hand-held radiographic imaging device, as described in U.S. application Ser. No. 13/562,163, titled “Radiographic Imaging Device,” filed Jul. 30, 2012, and hereby incorporated by reference herein in its entirety.
- the surgical system 200 includes a clamp or brace to substantially immobilize the femur 206 to minimize the need to track and process motion of the femur 206.
- the form of constraint imposed on surgical device 234 depends on the form of the relevant haptic object.
- a haptic object may be formed in any desirable shape or configuration.
- three exemplary embodiments include a line, plane, or three-dimensional volume.
- the surgical device 234 is constrained because a HIP of surgical device 234 is restricted to movement along a linear haptic object.
- the haptic object is a three-dimensional volume and the surgical device 234 may be constrained by substantially preventing movement of the HIP outside of the volume enclosed by the walls of the three-dimensional haptic object.
- the surgical device 234 is constrained because a planar haptic object substantially prevents movement of the HIP outside of the plane and outside of the boundaries of the planar haptic object.
- the processing circuit 260 can establish a planar haptic object corresponding to a planned planar distal cut needed to create a distal surface on the femur 206 in order to confine the surgical device 234 substantially to the plane needed to carry out the planned distal cut.
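The publication does not give the feedback law. One common scheme, shown as a hedged sketch below, treats a planar haptic object as a virtual spring that resists HIP motion off the cut plane; the stiffness value is invented for illustration.

```python
import numpy as np

def planar_haptic_force(hip, plane_point, plane_normal, k=500.0):
    """Penalty-style constraint: if the HIP leaves the cut plane, return a
    spring force pulling it back along the plane normal. k is an
    illustrative stiffness in N/m."""
    n = plane_normal / np.linalg.norm(plane_normal)
    penetration = np.dot(hip - plane_point, n)   # signed distance off the plane
    return -k * penetration * n                  # restoring force toward the plane

hip = np.array([0.0, 0.0, 0.002])                # HIP 2 mm above the plane
print(planar_haptic_force(hip, np.zeros(3), np.array([0.0, 0.0, 1.0])))
# -> [ 0.  0. -1.]  i.e. 1 N pushing the HIP back onto the cut plane
```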
- the surgical system 200 is configured to autonomously move and operate the surgical device 234 in accordance with the control object.
- the control object may define areas relative to the femur 206 for which a cut should be made.
- one or more motors, actuators, and/or other mechanisms of the robotic arm 232 and the surgical device 234 are controllable to cause the surgical device 234 to move and operate as necessary within the control object to make a planned cut, for example using tracking data from the tracking system 222 to allow for closed-loop control.
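As a hedged illustration of such closed-loop control, the toy sketch below applies a proportional correction that drives the commanded tool position toward a planned target using tracking feedback; the gain and the perfect-actuation assumption are illustrative only.

```python
import numpy as np

def closed_loop_step(target, measured, kp=0.5):
    """One proportional control update: move the commanded tool position
    toward the planned target using the latest tracking measurement."""
    command = measured + kp * (target - measured)
    return command, np.linalg.norm(target - command)

position = np.zeros(3)
target = np.array([0.010, 0.0, 0.0])    # planned cut point 10 mm away along x
for _ in range(5):
    # Assume perfect actuation: the tracker reads back the last command.
    position, err = closed_loop_step(target, position)
print(round(err * 1000, 3), "mm remaining")  # error shrinks every cycle
```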
- a user option for selecting whether to provide the unified cutting stage as an automated cutting stage or as a haptic cutting stage is provided.
- the user option can be provided in a graphical user interface, for example via display 264 as in FIG. 2 and/or a graphical user interface presented on a personal computing device (e.g., laptop or desktop computer, etc.).
- the graphical user interface can include the time estimates generated in step 704, such that the graphical user interface shows a first time estimation for automating the unified cutting stage and a second time estimation for providing a haptic cutting stage.
- in step 708, a determination is made as to whether the user selected the option for automated cutting. If the user selected automation of the unified cutting stage (“Yes” at step 708), process 700 proceeds to step 710, where a unified tool path for unified completion of the first and second cutting stages is generated.
- the unified tool path can be generated using various path planning algorithms in various embodiments, for example to maximize cut quality, cut speed, and/or various other parameters, to complete resection of a region associated with the unified cutting stage.
- Path planning can be provided independent of any consideration of association of the first or second cutting stage with different portions of a unified cutting region, i.e., such that the path planning is agnostic of the fact that the unified cutting path is based on unification of first and second cutting stages. Accordingly, in some scenarios, the unified tool path will repeatedly switch between intersecting a region associated with the first cutting stage and intersecting a region associated with the second cutting stage, as may be optimal for completing resection of the combination of such regions.
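No specific planner is disclosed. The sketch below only illustrates the stage-agnostic idea with a simple boustrophedon (back-and-forth) sweep over one combined region, never distinguishing which original stage owns a waypoint; the geometry and step size are invented.

```python
import numpy as np

def boustrophedon_path(min_xy, max_xy, step):
    """Back-and-forth sweep covering one combined (unified) region; the
    planner never asks which original cutting stage a point belonged to."""
    xs = np.arange(min_xy[0], max_xy[0] + step, step)
    path = []
    for i, x in enumerate(xs):
        ys = (min_xy[1], max_xy[1]) if i % 2 == 0 else (max_xy[1], min_xy[1])
        path += [(x, ys[0]), (x, ys[1])]
    return path

# Union of two adjacent stage regions treated as one cutting region (mm).
region_a = ((0.0, 0.0), (10.0, 8.0))
region_b = ((10.0, 0.0), (18.0, 8.0))
lo = (min(region_a[0][0], region_b[0][0]), min(region_a[0][1], region_b[0][1]))
hi = (max(region_a[1][0], region_b[1][0]), max(region_a[1][1], region_b[1][1]))
print(len(boustrophedon_path(lo, hi, step=2.0)), "waypoints")
```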
- a robot (e.g., robotic device 500) is controlled to move a cutting tool along the unified tool path determined in step 710.
- the unified cutting stage is thereby executed in step 712.
- in step 802, real-time positions of a first bone and a second bone are obtained.
- a surgical plan includes a first cutting stage for modifying the first bone and a second cutting stage for modifying the second bone, with the first cutting stage and the second cutting stage selected for unification and automated execution, for example via process 600 and/or process 700.
- the real-time positions of the first bone and the second bone can be obtained by optically tracking at least one first marker coupled to the first bone and at least one second marker coupled to the second bone (e.g., fiducial trees 240 and 241 as in FIG. 2), for example using an optical tracking system 222 as in FIG. 2, and/or otherwise tracked using another tracking modality (e.g., mechanical tracking, electromagnetic tracking, image processing, etc.).
- Step 802 can include dynamically updating relative positions of virtual models of the first bone and the second bone in a virtual modeling coordinate system based on tracked relative movement of the first bone and the second bone.
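A minimal sketch of that bookkeeping, assuming the tracker delivers a 4x4 homogeneous pose per fiducial tree; the class and frame names are hypothetical.

```python
import numpy as np

class BoneModel:
    """Virtual bone model positioned by its tracked fiducial tree."""
    def __init__(self, vertices):
        self.vertices = vertices               # Nx3, in the bone's local frame

    def world_vertices(self, pose):
        """Apply the latest tracked 4x4 pose to place the model in the
        shared virtual modeling coordinate system."""
        homo = np.c_[self.vertices, np.ones(len(self.vertices))]
        return (homo @ pose.T)[:, :3]

femur = BoneModel(np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 0.1]]))
pose = np.eye(4)
pose[:3, 3] = [0.0, 0.02, 0.0]                 # tracked 20 mm shift of the bone
print(femur.world_vertices(pose))              # model follows the tracked bone
```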
- in step 904, a determination is made as to whether separate haptic objects or a unified haptic object is predicted to provide faster computation time for robot control.
- online haptic control is executed by control circuitry (e.g., a computer, controller, etc.). Faster computation speeds in such control can provide higher responsiveness to movement of the end effector, i.e., control which feels immediate, continuous, etc. to the user, while slower computation speeds can cause a lag between end effector movement and force feedback computation and control outputs.
- step 904 predicts which of multiple approaches for providing unified haptic control will provide the faster computation time for online robot control.
- in step 904, both use of separate haptic objects and use of a unified haptic object are assessed.
- where separate haptic objects are used, the control system may repeatedly calculate distances between a haptic interaction point on the surgical tool of the robotic device and each of the separate haptic objects, and determine multiple forces accordingly while combining or otherwise handling different force feedback associated with the different haptic objects.
- where a unified haptic object is used for online robot control (e.g., generated as in step 908 below), the control system may only need to calculate distances between the haptic interaction point and the one, unified haptic object, thereby potentially saving computational time during online control as compared to a scenario where separate haptic objects are used.
- Step 904 can accordingly include predicting (estimating, forecasting, etc.) computation times associated with the separate haptic objects approach and the unified haptic object approach and/or assessing a proxy for such computation times.
- Such prediction can be based on density of surface meshes used to represent the various haptic objects which would be in use (e.g., where flatter or otherwise more-regular geometries can be represented by lower-density meshes while complex, irregular geometries require more points in a surface mesh to represent such irregularities) and which influences computational complexity.
- Such prediction can also be based on a degree of overlap or similarity between the separate haptic objects, i.e., an assessment as to whether substantially redundant calculations would be provided for calculations associated with separate haptic objects that substantially align in certain regions.
- Various simulation techniques can be used in step 904 to determine computational times or other variable(s) representative of computational complexity, computational load, etc. associated with the different approaches. Step 904 can thereby culminate in a determination as to the computationally faster, for online robot control, between different approaches for providing haptic control which unifies the first cutting stage and the second cutting stage.
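The prediction mechanism is left open. One crude proxy consistent with the factors named above scores each approach by total mesh face count, inflated by redundant overlap between separate objects; the formula is invented for illustration.

```python
def predicted_cost(face_counts, overlap_fraction=0.0):
    """Crude per-cycle cost proxy: distance queries scale with total mesh
    faces; overlapping (redundant) geometry between separate haptic objects
    inflates the separate-object cost without adding useful constraint."""
    return sum(face_counts) * (1.0 + overlap_fraction)

separate = predicted_cost([1200, 900], overlap_fraction=0.3)  # two meshes
unified = predicted_cost([1600])                              # one merged mesh
print("use unified object" if unified < separate else "use separate objects")
```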
- in step 906, haptic feedback is provided based on simultaneously comparing a tracked position of a surgical tool to the multiple, separate haptic objects.
- step 906 can include comparing a position of a surgical tool, robotic end effector, etc. to a first haptic object associated with the first cutting stage and a second haptic object associated with the second cutting stage (and, in some embodiments, to one or more additional haptic objects associated with unifying the cutting stages, providing a path between the haptic objects, etc.).
- Step 906 can include automatically determining, during online control, points at which a first haptic object overrides the second haptic object, portions of the first haptic object or second haptic object to be deactivated to allow movement between the haptic objects, determining combined force feedback to be provided based on interactions with both the first haptic object and the second haptic object, and/or other computations for determining force feedback in a manner which enables unified execution of the unified cutting stage.
- step 906 can include automatically moving the first haptic object relative to the second haptic object in response to tracked movement of the first bone relative to the second bone.
- the unified cutting stage can thus be carried out by a surgeon manipulating the surgical tool through the first and second haptic objects while the first and second haptic objects are both used online to control the robot (e.g., robotic device 500) to provide force feedback that constrains the surgical tool to the first and second haptic objects, including in scenarios where the haptic objects are associated with different bones that move relative to one another during execution of the unified cutting stage.
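A hedged sketch of this separate-objects branch: each control cycle evaluates both haptic objects and, under one hypothetical policy, lets the nearest object's restoring force win. The planes, gains, and override rule are all illustrative, not the disclosed combination scheme.

```python
import numpy as np

def object_force(hip, point, normal, k=500.0):
    """Spring force against one planar haptic object plus the HIP's
    distance from it (stiffness k illustrative)."""
    n = normal / np.linalg.norm(normal)
    d = np.dot(hip - point, n)
    return -k * d * n, abs(d)

def combined_force(hip, objects):
    """Evaluate every active haptic object each control cycle; here the
    closest object overrides the others (one hypothetical policy among
    the combination schemes the text mentions)."""
    forces = [object_force(hip, p, n) for p, n in objects]
    return min(forces, key=lambda fd: fd[1])[0]

objs = [(np.zeros(3), np.array([0.0, 0.0, 1.0])),                   # stage one
        (np.array([0.0, 0.0, 0.005]), np.array([0.0, 1.0, 0.0]))]   # stage two
print(combined_force(np.array([0.0, 0.001, 0.002]), objs))
```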
- process 900 proceeds to step 908 where a new, unified haptic object is generated based on the separate haptic objects.
- Step 908 can provide a unified (e.g., continuous) virtual geometry (surface, etc.) which bounds positions to be reached by a surgical tool during performance of the unified cutting stage (e.g., based on a combination of positions to be reached in first and second stages selected for unification).
- Step 908 can include melding a first haptic object associated with a first cutting stage together with a second haptic object associated with the second cutting stage, for example by stitching together the haptic objects along lines of intersection and then removing any segments which are thereby rendered internal to the unified haptic object.
- Step 908 can include generating a surface mesh (e.g., polygonal mesh) which defines the unified haptic object based on geometries associated with the first cutting stage and the second cutting stage.
- Various computer modeling approaches, computer-aided-design techniques, model visualization techniques, etc. can be executed in step 908 to provide the unified haptic object.
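The melding operation is described only abstractly. Assuming the open-source trimesh library (with a boolean backend such as manifold3d installed), a union of two overlapping stand-in volumes stitches their surfaces along the lines of intersection and discards faces rendered internal, as sketched below.

```python
import trimesh

# Two overlapping boxes standing in for the first- and second-stage haptic
# volumes (dimensions illustrative, in meters).
first = trimesh.creation.box(extents=(0.04, 0.03, 0.01))
second = trimesh.creation.box(
    extents=(0.04, 0.03, 0.01),
    transform=trimesh.transformations.translation_matrix((0.03, 0.0, 0.0)))

# Boolean union stitches the surfaces along their intersection and drops
# faces that end up inside the combined volume.
unified = trimesh.boolean.union([first, second])
print(unified.is_watertight, len(unified.faces))
```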
- in step 910, in a scenario where the unified haptic object spans two bones to be modified in the unified cutting stage, the unified haptic object is morphed in response to movement of the first bone relative to the second bone.
- Morphing the unified haptic object can include stretching, compressing, twisting, etc. a region of the unified haptic object which extends across a space between the first and second bones (e.g., which provides a path between the first and second bones) to account for relative movement of the first and second bones.
- Morphing the unified haptic object can be performed without interrupting operation of the surgical tool, in some embodiments.
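As a hedged sketch of such morphing, the bridge region's vertices can be linearly blended between the two tracked bone frames so the span stretches or compresses with relative bone motion; the weights and poses are invented.

```python
import numpy as np

def morph_bridge(vertices, weights, pose_a, pose_b):
    """Blend each bridge vertex between the first-bone and second-bone
    frames: weight 0 follows bone A rigidly, weight 1 follows bone B, and
    intermediate weights stretch/compress the span between the bones."""
    homo = np.c_[vertices, np.ones(len(vertices))]
    va = (homo @ pose_a.T)[:, :3]
    vb = (homo @ pose_b.T)[:, :3]
    w = weights[:, None]
    return (1.0 - w) * va + w * vb

verts = np.array([[0.00, 0.0, 0.0], [0.01, 0.0, 0.0], [0.02, 0.0, 0.0]])
w = np.array([0.0, 0.5, 1.0])
pose_a = np.eye(4)
pose_b = np.eye(4)
pose_b[:3, 3] = [0.0, 0.004, 0.0]        # bone B tracked moving 4 mm
print(morph_bridge(verts, w, pose_a, pose_b))  # middle vertex moves 2 mm
```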
- in step 912, haptic feedback is provided by controlling a robot (e.g., robotic device 500) based on comparing a tracked position of a surgical tool (e.g., saw blade 516, other surgical end effector of the robotic device 500) to the morphed, unified haptic object.
- the surgical tool can be moved by a surgeon to complete the unified cutting stage while force feedback is applied by the robot based on the unified haptic object (e.g., to constrain the surgical tool to the unified haptic object).
- a unified, haptic cutting stage is thereby provided in step 912.
- process 900 can thereby provide a unified haptic cutting stage using different techniques for providing haptic feedback associated with geometries associated with different cutting stages based on which technique may be best suited for online use in robot control for a particular surgical plan, particular patient, particular cutting operation, etc., according to various embodiments.
- in step 1004, tools compatible with the planned resection are determined.
- the tools can be determined from a set of available tools, for example including saws, reamers, burrs, drill bits, etc. of different sizes, shapes, etc.
- the set of available tools may all be useable with the robotic device 500, e.g., as different end effectors attached to the robotic device 500.
- Step 1004 can include determining which of the tools are compatible with the planned resection, for example based on a shape and/or size of the planned resection.
- step 1004 can include determining that a set of tools is compatible with the planned resection as a group, i.e., that the planned resection can be completed by using a first tool to complete a first portion of the planned resection and a second tool to complete a second portion of the planned resection.
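One plausible reading of this compatibility test is sketched below, with the resection reduced to two invented parameters: whether a planar cut is needed and the smallest concave feature size. The rules are illustrative, not the disclosed criteria.

```python
from dataclasses import dataclass

@dataclass
class Tool:
    name: str
    kind: str            # "saw", "burr", "drill", ...
    diameter_mm: float

def compatible_tools(tools, min_feature_mm, needs_planar):
    """Keep only tools that fit the planned resection: planar cuts are
    assumed here to need a saw, and a burr must not be wider than the
    smallest concave feature (both rules illustrative)."""
    keep = []
    for t in tools:
        if needs_planar and t.kind != "saw":
            continue
        if t.kind == "burr" and t.diameter_mm > min_feature_mm:
            continue
        keep.append(t)
    return keep

available = [Tool("saw-90", "saw", 90.0),
             Tool("burr-6", "burr", 6.0),
             Tool("burr-2", "burr", 2.0)]
print([t.name for t in compatible_tools(available, min_feature_mm=4.0,
                                        needs_planar=False)])
# -> ['saw-90', 'burr-2']
```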
- Other patient characteristics may be used to determine cutting rates as additional or alternative variables in such functions, for example bone type (e.g., cortical, cancellous, trabecular), resection geometry (e.g., smooth or flat surface may have higher cutting rate than irregular surfaces), body mass index or other characteristics relating to surgical access (e.g., fatty tissue of certain patients may restrict access by certain tools and thus reduce cutting rates, potentially different for different types of tools having different geometries), bone defects (e.g., certain bone defects, conditions, diseases, etc.), and/or various other patient characteristics in various embodiments.
- functions are preprogrammed and stored for each possible cutting tool, which can be called in step 1006 and applied to the at least one patient characteristic(s) to calculate cutting rates for the eligible tools.
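The per-tool functions are not disclosed; the sketch below only illustrates the store-then-call pattern described here, with every coefficient invented for illustration.

```python
# Preprogrammed, per-tool cutting-rate models (all coefficients invented;
# real models would be empirically calibrated per tool and bone type).
RATE_MODELS = {
    "sagittal_saw": lambda density: 12.0 / density,   # mm^3/s vs. bone density
    "burr_6mm":     lambda density: 5.0 / density,
}

def cutting_rate(tool_name, bone_density_g_cm3):
    """Call the stored function for an eligible tool against the patient
    characteristic(s), as described for step 1006."""
    return RATE_MODELS[tool_name](bone_density_g_cm3)

for tool in ("sagittal_saw", "burr_6mm"):
    print(tool, round(cutting_rate(tool, bone_density_g_cm3=1.2), 2), "mm^3/s")
```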
- in step 1014, a control program is generated and executed based on the tool or combination of tools selected in step 1012.
- Step 1014 can include path planning to generate a path for automated resection based on the selected tool(s) and/or generating one or more haptic objects for constraining the selected tool(s) for providing haptic feedback during execution of the planned resection.
- Step 1014 can be executed based on characteristics of the selected tool(s), for example because a larger tool may need to pass through fewer positions of an autonomous path to resect the same bone as a smaller tool or because a larger tool may need to be constrained to fewer positions to be constrained to a planned resection as compared to a smaller tool.
- where “coupled” or variations thereof are modified by an additional term (e.g., “directly coupled”), the generic definition of “coupled” provided above is modified by the plain language meaning of the additional term (e.g., “directly coupled” means the joining of two members without any separate intervening member), resulting in a narrower definition than the generic definition of “coupled” provided above.
- Such coupling may be mechanical, electrical, magnetic, or fluidic.
- the hardware and data processing components used to implement the various processes, operations, illustrative logics, logical blocks, modules and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
- a general purpose processor may be a microprocessor, or, any conventional processor, controller, microcontroller, or state machine.
- a processor also may be implemented as a combination of computing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
- particular processes and methods may be performed by circuitry that is specific to a given function.
- the memory (e.g., memory unit, storage device) may be or include volatile memory or non-volatile memory, and may include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present disclosure.
- the memory is communicably connected to the processor via a processing circuit and includes computer code for executing (e.g., by the processing circuit or the processor) the one or more processes described herein.
- the present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing various operations, for example non-transitory computer-readable media.
- the embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system.
- Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon.
- Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor.
- machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of machine-readable media.
- Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Robotics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Human Computer Interaction (AREA)
- Prostheses (AREA)
Abstract
A robotically-assisted surgical system includes a robot and a computer. The computer is programmed to provide a user with an option to combine a first cutting stage of a surgical plan with a second cutting stage of the surgical plan in response to determining that the first cutting stage is eligible for combination with the second cutting stage by comparing first characteristics of the first cutting stage with second characteristics of the second cutting stage. The computer is also programmed to control the robot to provide, absent selection of the option, the first cutting stage and the second cutting stage sequentially and, in response to user selection of the option, to control the robot to provide a unified cutting stage based on the first cutting stage and the second cutting stage.
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463555152P | 2024-02-19 | 2024-02-19 | |
| US63/555,152 | 2024-02-19 | ||
| US19/028,202 US20250261961A1 (en) | 2024-02-19 | 2025-01-17 | Robotic surgical system with dynamic resection combination |
| US19/028,202 | 2025-01-17 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025178732A1 (fr) | 2025-08-28 |
Family
ID=94732906
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2025/013495 Pending WO2025178732A1 (fr) | 2025-01-29 | Robotic surgical system with dynamic resection combination |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025178732A1 (fr) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8010180B2 (en) | 2002-03-06 | 2011-08-30 | Mako Surgical Corp. | Haptic guidance system and method |
| US9289264B2 (en) | 2011-12-29 | 2016-03-22 | Mako Surgical Corp. | Systems and methods for guiding an instrument using haptic object with collapsing geometry |
| US20220071720A1 (en) * | 2019-05-20 | 2022-03-10 | Icahn School Of Medicine At Mount Sinai | System and method for interaction and definition of tool pathways for a robotic cutting tool |
| EP4070752A1 (fr) * | 2021-04-09 | 2022-10-12 | MinMaxMedical | Système de chirurgie assistée par ordinateur |
- 2025-01-29: WO PCT/US2025/013495 patent WO2025178732A1 (fr), active, Pending
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8010180B2 (en) | 2002-03-06 | 2011-08-30 | Mako Surgical Corp. | Haptic guidance system and method |
| US9289264B2 (en) | 2011-12-29 | 2016-03-22 | Mako Surgical Corp. | Systems and methods for guiding an instrument using haptic object with collapsing geometry |
| US20220071720A1 (en) * | 2019-05-20 | 2022-03-10 | Icahn School Of Medicine At Mount Sinai | System and method for interaction and definition of tool pathways for a robotic cutting tool |
| EP4070752A1 (fr) * | 2021-04-09 | 2022-10-12 | MinMaxMedical | Système de chirurgie assistée par ordinateur |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12214501B2 (en) | Robotic surgical system with recovery alignment | |
| US20220183767A1 (en) | Dynamic gap capture and flexion widget | |
| US10383638B2 (en) | System and method for bone preparation for an implant | |
| KR102470649B1 (ko) | System and method for generating customized haptic boundaries | |
| JP2022546381A (ja) | Robotic surgical system for augmented hip arthroplasty procedures | |
| US20160051334A1 (en) | Method of determining a contour of an anatomical structure and selecting an orthopaedic implant to replicate the anatomical structure | |
| JP2025523348A (ja) | Systems and methods for navigated reaming of the acetabulum | |
| WO2025170802A1 (fr) | Robotic surgical system with context haptics | |
| US20250026014A1 (en) | Systems and methods for providing haptic guidance | |
| US20250261961A1 (en) | Robotic surgical system with dynamic resection combination | |
| WO2025178732A1 (fr) | Robotic surgical system with dynamic resection combination | |
| US20250255685A1 (en) | Robotic surgical system with context haptics | |
| WO2024044169A1 (fr) | Method of determining an optimal bone resection | |
| WO2024173736A1 (fr) | Determining a position for an implant relative to a bone based on bone remodeling data |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25707248; Country of ref document: EP; Kind code of ref document: A1 |