WO2025072418A1 - Aligning an instrument supported by a computer-assisted system - Google Patents

Info

Publication number
WO2025072418A1
Authority
WO
WIPO (PCT)
Prior art keywords
instrument
control system
target
computer
imaging device
Prior art date
Legal status
Pending
Application number
PCT/US2024/048529
Other languages
French (fr)
Inventor
Pavel Chtcheprov
Daniel NISSENBAUM
Trevor PIER
Zhuoqun XU
Current Assignee
Intuitive Surgical Operations Inc
Original Assignee
Intuitive Surgical Operations Inc
Priority date
Filing date
Publication date
Application filed by Intuitive Surgical Operations Inc
Publication of WO2025072418A1

Classifications

    • All classifications fall under A (HUMAN NECESSITIES) > A61 (MEDICAL OR VETERINARY SCIENCE; HYGIENE) > A61B (DIAGNOSIS; SURGERY; IDENTIFICATION):
    • A61B 34/20 Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30 Surgical robots
    • A61B 34/37 Leader-follower robots
    • A61B 90/361 Image-producing devices, e.g. surgical cameras
    • A61B 2034/107 Visualisation of planned trajectories or target regions
    • A61B 2034/2051 Tracking techniques: electromagnetic tracking systems
    • A61B 2034/2059 Tracking techniques: mechanical position encoders
    • A61B 2034/2065 Tracking techniques: tracking using image or pattern recognition
    • A61B 2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A61B 2090/371 Surgical systems with images on a monitor during operation with simultaneous use of two cameras

Definitions

  • the present disclosure relates generally to computer-assisted systems and more particularly to aligning instruments supported by computer-assisted systems.
  • Some computer-assisted systems include one or more instruments that are articulated when performing various procedures.
  • the computer-assisted system can be automated, semiautomated, teleoperated, etc.
  • a human operator manipulates one or more leader input controls to command motion of one or more follower devices located in a workspace.
  • Example follower devices include instruments, repositionable structures configured to support instruments, etc.
  • the teleoperated system is configured to support an instrument such as a catheter, electrocautery device, cutting device, grasping device, stapler, etc.
  • the computer-assisted system moves the instrument within a workspace to perform a task at a worksite, such as to manipulate specific tissue within the interior anatomy of a patient in a medical example.
  • Inserting the instrument into the workspace and aligning the instrument for performing tasks at the worksite can be complicated or inefficient. For example, repositionable structures or instruments with more complex kinematic designs, more massive materials, or higher inertia components can be harder to manipulate or control. As another example, workspaces with smaller entry sites (such as minimally invasive applications) can require more precise alignment.
  • control system may be configured to (1) determine a target in a workspace for positioning the first instrument supported by the first repositionable structure; (2) define a first target axis based on the target and an entry site location; (3) determine a configuration of the first plurality of joints that aligns the first instrument with the first target axis such that the first instrument is able to be advanced along the first target axis; and (4) command the first plurality of joints based on the determined configuration.
  • the method includes (1) determining, via the control system, a target in a workspace for positioning the first instrument supported by the first repositionable structure; (2) defining, via the control system, a first target axis based on the target and an entry site location; (3) determining, via the control system, a configuration of the first plurality of joints that aligns the first instrument with the first target axis such that the first instrument is able to be advanced along the first target axis; and (4) commanding, via the control system, the first plurality of joints based on the determined configuration.
  • one or more non-transitory machine-readable media include a plurality of machine-readable instructions which when executed by a processor system are adapted to cause the processor system to perform any of the methods described herein.
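As an illustrative sketch of how these four targeting steps might be organized in software (all helper names here are hypothetical; the disclosure does not specify an API):

```python
# Minimal sketch of the four-step targeting flow, assuming a target and entry
# site are already known and hypothetical solve_ik/command_joints callables.
import numpy as np

def target_instrument(target, entry_site, solve_ik, command_joints):
    """(2) Define a target axis through the entry site and target, (3) solve
    for a joint configuration aligning the instrument shaft with that axis,
    and (4) command the joints toward the determined configuration."""
    direction = np.asarray(target, float) - np.asarray(entry_site, float)
    axis = direction / np.linalg.norm(direction)   # unit vector along target axis
    joint_config = solve_ik(entry_site, axis)      # hypothetical IK solver
    command_joints(joint_config)                   # drive the plurality of joints
    return axis, joint_config
```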
  • Figure 1 is a diagram of a computer-assisted system in accordance with one or more embodiments.
  • Figure 2 is an example depiction for defining target axes and/or target positions using a computer-assisted system, in accordance with one or more embodiments.
  • Figure 3 is an example graphical user interface depicting visual indications of target axes defined using a computer-assisted system, in accordance with one or more embodiments.
  • Figures 4A-4D illustrate an example process of the computer-assisted system aligning a follower device with the target axes, in accordance with one or more embodiments.
  • Figure 5 is a flow diagram of method steps for manipulating an imaging device when inserting an instrument in a computer-assisted system in accordance with one or more embodiments.
  • spatially relative terms, such as “beneath”, “below”, “lower”, “above”, “upper”, “proximal”, “distal”, and the like, may be used to describe the relation of one element or feature to another element or feature as illustrated in the figures.
  • These spatially relative terms are intended to encompass different positions (i.e., locations) and orientations (i.e., rotational placements) of the elements or their operation in addition to the position and orientation shown in the figures. For example, if the content of one of the figures is turned over, elements described as “below” or “beneath” other elements or features would then be “above” or “over” the other elements or features.
  • a device may be otherwise oriented and the spatially relative descriptors used herein interpreted accordingly. Likewise, descriptions of movement along and around various axes include various special element positions and orientations.
  • the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context indicates otherwise.
  • the terms “comprises”, “comprising”, “includes”, and the like specify the presence of stated features, steps, operations, elements, and/or components but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups. Components described as coupled may be electrically or mechanically directly coupled, or they may be indirectly coupled via one or more intermediate components.
  • position refers to the location of an element or a portion of an element (e.g., three degrees of translational freedom in a three-dimensional space, such as along Cartesian x-, y-, and z- coordinates).
  • orientation refers to the rotational placement of an element or a portion of an element (e.g., three degrees of rotational freedom in three-dimensional space, such as about roll, pitch, and yaw axes, represented in angle-axis, rotation matrix, quaternion representation, and/or the like).
  • a pose refers to the multi-degree of freedom (DOF) spatial position and orientation of a coordinate system of interest attached to a rigid body.
  • a pose includes a pose variable for each of the DOFs in the pose.
  • a full 6-DOF pose for a rigid body in three-dimensional space would include 6 pose variables corresponding to the 3 positional DOFs (e.g., x, y, and z) and the 3 orientational DOFs (e.g., roll, pitch, and yaw).
  • a 3- DOF position only pose would include only pose variables for the 3 positional DOFs.
  • a 3-DOF orientation only pose would include only pose variables for the 3 rotational DOFs.
  • a velocity of the pose captures the change in pose over time (e.g., a first derivative of the pose).
  • the velocity would include 3 translational velocities and 3 rotational velocities. Poses with other numbers of DOFs would have a corresponding number of translational and/or rotational velocities.
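For illustration only (this data structure is not part of the disclosure), a full 6-DOF pose and its velocity could be represented as:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose6DOF:
    """Full 6-DOF pose: 3 positional and 3 orientational pose variables."""
    position: np.ndarray     # (3,) x, y, z
    orientation: np.ndarray  # (3,) roll, pitch, yaw (one possible representation)

@dataclass
class PoseVelocity6DOF:
    """First derivative of a full 6-DOF pose over time."""
    linear: np.ndarray       # (3,) translational velocities
    angular: np.ndarray      # (3,) rotational velocities
```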
  • aspects of this disclosure are described in reference to computer-assisted systems, which can include devices that are teleoperated, externally manipulated, autonomous, semiautonomous, and/or the like. Further, aspects of this disclosure are described in terms of an implementation using a teleoperated surgical system, such as the da Vinci® Surgical System commercialized by Intuitive Surgical, Inc. of Sunnyvale, California. Knowledgeable persons will understand, however, that inventive aspects disclosed herein may be embodied and implemented in various ways, including teleoperated and non-teleoperated, and medical and non-medical embodiments and implementations. Implementations on da Vinci® Surgical Systems are merely exemplary and are not to be considered as limiting the scope of the inventive aspects disclosed herein.
  • the instruments, systems, and methods described herein may be used for humans, animals, portions of human or animal anatomy, industrial systems, general robotic, or teleoperated systems.
  • the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, sensing or manipulating non-tissue work pieces, cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, setting up or taking down systems, training medical or non-medical personnel, and/or the like.
  • Additional example applications include use for procedures on tissue removed from human or animal anatomies (with or without return to a human or animal anatomy) and for procedures on human or animal cadavers.
  • these techniques can also be used for medical treatment or diagnosis procedures that include, or do not include, surgical aspects.
  • FIG. 1 is a simplified diagram of an example computer-assisted system 100, according to various embodiments.
  • the computer-assisted system 100 is a teleoperated system.
  • the computer-assisted system 100 can be a teleoperated medical system such as a surgical system.
  • the computer-assisted system 100 includes a follower device 104 that can be teleoperated by being controlled by one or more leader devices (also called “leader input devices” when designed to accept external input), described in greater detail below.
  • Leader-follower systems are also sometimes referred to as master-slave systems.
  • an input system that includes a workstation 102 (e.g., a console), and in various embodiments the input system can be in any appropriate form and may or may not include the workstation 102.
  • the workstation 102 includes one or more leader input devices 106 that are designed to be contacted and manipulated by an operator 108.
  • the workstation 102 may comprise one or more leader input devices 106 for use by the hands, the head, or some other body part(s) of operator 108.
  • the leader input devices 106 in this example are supported by the workstation 102 and can be mechanically grounded.
  • an ergonomic support 110 (e.g., a forearm rest) can be provided on which the operator 108 can rest his or her forearms.
  • the operator 108 can perform tasks at a worksite within a workspace near the follower device 104 during a procedure, by commanding the follower device 104 using the leader input devices 106.
  • the worksite may be a surgical worksite associated with a patient.
  • a display device 112 is also included in the workstation 102.
  • the display device 112 may be configured to display images for viewing by the operator 108.
  • the display device 112 can be moved in various DOFs to accommodate the viewing position of the operator 108 and/or to provide control functions.
  • the leader input devices 106 may include the display device 112.
  • displayed images may depict a worksite at which the operator 108 is performing various tasks by manipulating the leader input devices 106 and/or the display device 112.
  • images displayed by display device 112 may be received by the workstation 102 from one or more imaging devices arranged at a worksite.
  • the images displayed by the display device 112 may be generated by the display device 112 (or by a different connected device or system), such as for virtual representations of tools, the worksite, or for user interface components.
  • the operator 108 can sit in a chair or other support in front of the workstation 102, position his or her eyes in front of the display device 112, manipulate the leader input devices 106, and rest his or her forearms on the ergonomic support 110 as desired.
  • the operator 108 can stand at the workstation or assume other poses, and the display device 112 and the leader input devices 106 can be adjusted in position (height, depth, etc.) to accommodate the pose of the operator 108.
  • the one or more leader input devices 106 can be ungrounded (ungrounded leader input devices being not kinematically grounded, such as leader input devices 106 held by the hands of the operator 108 without additional physical support). Such ungrounded leader input devices 106 can be used in conjunction with the display device 112.
  • the display device 112 is positioned near the worksite such that the operator 108 can manually operate instruments at the worksite, such as a medical instrument in a medical example, while viewing images displayed by the display device 112.
  • the computer-assisted system 100 also includes a follower device 104 that can be commanded by the workstation 102.
  • the follower device 104 can be located near an operating table (e.g., a table, bed, or other support) on which a patient can be positioned.
  • the workspace is provided on an operating table, e.g., on or in a patient, simulated patient, or model, training dummy, etc. (not shown).
  • the follower device 104 may include a plurality of repositionable structures 120 (sometimes referred to as “manipulator arms” in robotic embodiments).
  • the repositionable structures 120 may include a plurality of links that are rigid members and joints that can be individually actuated as part of a kinematic series. Additionally, each of the repositionable structures 120 is configured to couple to an instrument 122. While Figure 1 illustrates a follower device 104 that has four repositionable structures 120, in other embodiments, the follower device 104 may include one, two, three, four, five, six, or additional or fewer repositionable structures 120.
  • the instrument 122 can include, for example, a working portion 126 and one or more structures for supporting and/or driving the working portion 126.
  • Example working portions 126 include end effectors that physically contact or manipulate material, energy application elements that apply electrical, RF, ultrasonic, or other types of energy, sensors that detect characteristics of the workspace environment (such as temperature sensors, imaging devices, etc.), and the like.
  • examples of instruments 122 include, without limitation, a sealing instrument, a cutting instrument, a sealing-and-cutting instrument, an energy instrument for applying energy, a gripping instrument (e.g., clamps, jaws), a stapler, an imaging instrument such as one using optical, RF, or ultrasonic imaging modalities, a sensing instrument, an irrigation instrument, a suction instrument, and/or the like.
  • the instrument 122 may include a transmission mechanism 128 that can be coupled to a drive assembly 130 of the respective repositionable structure 120.
  • the drive assembly 130 may include a drive and/or other mechanisms controllable from workstation 102 that transmit forces to the transmission mechanism 128 to articulate or otherwise actuate the instrument 122.
  • This technique enables the computer-assisted system 100 to automatically align instruments 122 supported by the repositionable structures 120 for insertion of their respective working portions 126 to the worksite prior to the performance of, in some examples, a procedure.
  • the disclosed techniques reduce the likelihood of the instruments (or the repositionable structures supporting the instruments) colliding with one another and/or an avoidance region (e.g., in a medical example, a subject’s anatomy).
  • the control system 140 may operate in an alignment mode in which the control system 140 determines a target (e.g., an operator-identified and/or system-identified position and/or line) to use for determining a target axis along which the second instrument 122b is to be inserted into the workspace.
  • Aligning the second instrument 122b to the target axis can comprise aligning an insertion axis of the second instrument 122b with the target axis.
  • the workstation 102 and/or the display device 112 may include a user interface that enables the operator 108 to indicate a position associated with the imaging instrument 122a comprising the imaging device 126a.
  • the workstation 102 and/or the input devices 106 may include a button, switch, toggle or other user-interactable element via which the operator 108 is able to indicate that the operator-identified position is to be determined.
  • the control system 140 may analyze kinematic data associated with the imaging instrument 122a and/or the image data generated by the imaging device 126a thereof to determine the operator-identified position.
  • the control system 140 utilizes the kinematic information associated with the imaging instrument 122a to set the operator-identified position at a tip of the imaging instrument 122a, or at an origin point for a field of view (FOV) of the imaging device 126a included in the imaging instrument 122a.
  • the control system 140 sets the operator-identified position as being offset from the imaging instrument 122a along a defined direction.
  • the defined direction is along a longitudinal axis of the imaging instrument 122a in the direction towards the worksite, or along a direction of view of the imaging instrument 122a.
  • the offset is a predetermined offset.
  • the direction of view of an imaging device is a direction about which the FOV is centered.
  • an endoscope’s direction of view can be described as a central axis extending from the origin of the endoscope’s FOV toward a far extent of the FOV.
  • the direction of view extends from the origin of the FOV toward the center of the base of the cone or frustum.
  • the direction of view can be defined for each of the two cameras of the endoscope, or a combination or average of the directions of views or FOVs of the two cameras (e.g., an average of the two directions of views of the two cameras; a central axis of a union of the FOVs of the two cameras, an intersection of the FOVs if the FOVs intersect, etc.)
  • the direction of view for A-mode ultrasound is the direction outwards from the ultrasonic sensor along the single dimension of the A-mode ultrasound image
  • the direction of view for B-mode ultrasound is a central axis extending from the base of the ultrasound image toward a center of the extent of the ultrasound image in two-dimensional space.
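As a sketch of the averaging option mentioned above for a stereoscopic endoscope (assuming each camera's direction of view is available as a unit vector):

```python
import numpy as np

def combined_direction_of_view(dir_left, dir_right):
    """Combine a stereo endoscope's two directions of view by averaging the
    two unit vectors and re-normalizing the result."""
    avg = (np.asarray(dir_left, float) + np.asarray(dir_right, float)) / 2.0
    return avg / np.linalg.norm(avg)
```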
  • the operator-identified position may be more precisely indicated by the operator.
  • the display device 112 may include a graphical user interface (GUI) that depicts the image data generated by the imaging instrument comprising the imaging device 126a.
  • the operator 108 may indicate a particular position by interacting with the GUI (e.g., by moving a cursor object to the location of the target, by touching the display device 112, and/or other known means for interacting with a GUI).
  • the GUI may enable the operator 108 to input a position, size, and/or shape of a volume for the target into one or more parameter entry fields.
  • the operator may instead identify a line via the GUI (e.g., by identifying multiple positions which the control system 140 uses to define the line, by manipulating a virtual line indicator, etc.). Accordingly, the term “operator-identified position” also refers to embodiments in which the operator identifies a line.
  • control system 140 may use pre-configured data indicative of the worksite.
  • a user may label pre-operative image data of the worksite with an indication of a desired position for the instruments 122 to be positioned.
  • the user may label a target lesion (or other anatomical feature of interest) in the pre-operative image data.
  • the control system 140 may be configured to process the image data generated by the imaging instrument 122a comprising the imaging device 126a to detect when the anatomical feature is included in the FOV. Upon detecting the presence of the anatomical feature, the control system 140 may automatically define a position associated with the anatomical feature from which the target is to be derived.
  • the control system 140 may then define a target axis for aligning the instrument 122b. More particularly, the control system 140 may define a target axis that coincides with a straight line defined by the operator-identified and/or system-defined position and the entry site associated with instrument 122b.
  • the control system 140 may define a goal position along the target axis at which the instrument 122b is to be inserted for performing the procedure.
  • the control system 140 may define a volume centered at the operator-identified and/or system-defined position.
  • the volume has a spherical and/or ellipsoidal shape.
  • the control system 140 may then define a point on a surface of the target volume closest to the entry site location as the goal position.
  • the control system 140 may then define the target axis as coinciding with a line intersecting the goal position and the entry site location.
  • the point on the surface of the target volume is the point on the surface closest to the entry site via which the instrument 122b is inserted.
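For a spherical target volume, the geometry above reduces to a few lines; the following sketch (units and variable names are assumptions) computes the goal position as the surface point nearest the entry site and the target axis through both points:

```python
import numpy as np

def goal_and_axis(target_center, radius, entry_site):
    """Return (goal position, unit target axis) for a spherical target volume.
    The goal is the point on the volume's surface closest to the entry site;
    the target axis coincides with the line through the entry site and goal."""
    to_center = np.asarray(target_center, float) - np.asarray(entry_site, float)
    axis = to_center / np.linalg.norm(to_center)
    goal = np.asarray(target_center, float) - radius * axis
    return goal, axis

# Example: target 10 cm below an entry site, 2 cm target-volume radius.
goal, axis = goal_and_axis(np.array([0.0, 0.0, -0.10]), 0.02, np.zeros(3))
```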
  • the target position finding problem is constrained to only one solution.
  • this solution may result in a target insertion axis that causes the instrument 122b to pass through a restricted area (e.g., in a medical context, a portion of a patient anatomy and/or a proximity thereto) upon insertion, that results in a collision between repositionable structures 120 and/or the instruments 122, or that is simply outside of the range of motion for the instrument 122 when supported by the repositionable structure 120.
  • the target position problem may be redefined to permit solutions at any point on the surface of the target volume.
  • the control system 140 may introduce additional constraints such that the new set of potential solutions can be reduced to a single solution.
  • the additional constraints may include the avoidance of the restricted areas and/or collisions.
  • additional sensor data (such as line image recognition from the imaging instrument 122a) is used to determine the constraints.
  • the control system 140 may redefine the target axis for the instruments 122 based upon the determined solution to the alternate target position problem.
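One way to realize this relaxed, constrained search (a sketch only; the constraint predicates are hypothetical stand-ins for the restricted-area and collision checks described above) is to sample candidate goal positions on the target volume's surface and keep the feasible one nearest the entry site:

```python
import numpy as np

def feasible_goal(target_center, radius, entry_site, constraints, n_samples=500):
    """Pick a goal position on a spherical target volume's surface that
    satisfies all constraint predicates, preferring the point closest to
    the entry site. Returns None if no sampled candidate is feasible."""
    rng = np.random.default_rng(0)
    v = rng.normal(size=(n_samples, 3))                    # random directions
    pts = np.asarray(target_center, float) + radius * v / np.linalg.norm(
        v, axis=1, keepdims=True)
    ok = [p for p in pts if all(check(p) for check in constraints)]
    if not ok:
        return None
    return min(ok, key=lambda p: np.linalg.norm(p - entry_site))
```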
  • control system 140 may perform similar techniques to determine the target axes and target positions for any of the instruments 122 supported by respective repositionable structures 120 that are ready for targeting (e.g., ready to accept a target axis for alignment therewith).
  • the control system 140 may verify one or more of (i) the repositionable structure 120 has an instrument 122 mounted thereto, (ii) the repositionable structure 120 is physically coupled with a cannula supported by the cannula mount 124, (iii) the instrument 122 is not inserted past a threshold distance from an uninserted position or remote center, (iv) a distal portion or working portion of the instrument 122 is not inserted beyond a threshold distance from an uninserted position, from a remote center, from an entry site location, from a feature of a cannula, and/or (v) a distal portion or working portion of the instrument 122 is not inserted beyond a tip of a cannula corresponding to the entry site location.
  • the position of a distal portion or working portion of the instrument 122, or of some other part of the instrument 122, can be determined through any appropriate technique, such as by image analysis to identify and locate known features of the instrument 122 to locate and orient the instrument, and/or by detecting with sensors kinematic information about the repositionable structure 120 and the instrument 122 and applying forward kinematics, etc.
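Collected into a single predicate, the readiness checks might look like the following sketch (every attribute name is a hypothetical illustration of the listed criteria):

```python
def ready_for_targeting(structure, insertion_threshold):
    """Return True if the repositionable structure satisfies the readiness
    criteria (i)-(v) listed above; attribute names are illustrative only."""
    return all((
        structure.instrument_mounted,                        # (i)
        structure.cannula_coupled,                           # (ii)
        structure.insertion_depth <= insertion_threshold,    # (iii)/(iv)
        not structure.tip_past_cannula,                      # (v)
    ))
```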
  • the control system 140 may determine a configuration of the instruments 122 and/or the joint(s) of the repositionable structures 120 supporting the instruments 122 that are ready for targeting such that the respective instruments 122 are aligned for insertion along their respective target axes.
  • the control system 140 may analyze one or more kinematic models of the instruments 122 and/or the repositionable structures 120 supporting the instruments 122.
  • the control system 140 may analyze the kinematic models to determine a configuration of the joints of the instruments 122 and/or the repositionable structures 120 supporting the instruments 122 such that the instruments 122 are aligned with the respective target axes.
  • the instruments 122 may be inserted through their respective entry site locations to reach their respective goal positions.
  • the control system 140 may then command the instruments 122 and/or the repositionable structures 120 based on the determined configuration.
  • Techniques for commanding an instrument 122 and/or a repositionable structure 120 to pivot with respect to an entry site location are described in U.S. Patent No. 8,823,308, the entire disclosure of which is hereby incorporated by reference.
  • the control system 140 commands the instruments 122 and/or repositionable structures 120 toward the determined configuration in parallel.
  • the control system 140 sequentially commands the instruments 122 and/or repositionable structures 120 toward the determined configuration.
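The parallel and sequential commanding options can be sketched as follows (the `solve_ik`, `command_toward`, and `wait_until_settled` calls are assumed interfaces, not ones named by the disclosure):

```python
def align_all(structures, target_axes, solve_ik, sequential=True):
    """Determine a configuration aligning each instrument shaft with its
    target axis, then command the structures sequentially or in parallel."""
    configs = [solve_ik(s, axis) for s, axis in zip(structures, target_axes)]
    for s, q in zip(structures, configs):
        s.command_toward(q)          # issue joint commands for this structure
        if sequential:
            s.wait_until_settled()   # finish this move before starting the next
```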
  • the FOV of the imaging device 126a is typically not oriented in an optimal manner for performing the procedure.
  • the operator 108 may adjust the digital zoom level and/or pose of the imaging instrument 122a in a particular manner to indicate the operator-identified position.
  • control system 140 may also command the imaging instrument 122a and/or repositionable structure 120a to adjust the pose (e.g., by retracting the imaging instrument 122a along the insertion axis) and/or the zoom level (e.g., by digitally zooming out) of the imaging device 126a such that the FOV of the imaging device 126a provides the operator a better view of the workspace.
  • control system 140 may command the repositionable structure 120a to adjust the imaging instrument 122a prior to aligning the repositionable structures 120 with the target axes, simultaneously with commanding other repositionable structures 120, or after the other repositionable structures 120 have reached the determined configuration.
  • control system 140 may be configured to advance the instruments 122 along their respective target axes to their respective goal positions such that the working portions 126 are located proximate to the worksite.
  • the operator 108 interfaces with the workstation 102 and/or the input devices 106 to advance the instruments 122 along their respective target axes to their respective goal positions.
  • the control system 140 may provide feedback to the operator 108 when the instrument 122 reaches the goal position.
  • the control system 140 may provide haptic feedback by changing the resistance on the instrument 122 as the instrument 122 approaches the goal position and/or lock the pose of the instrument 122 and/or the repositionable structure 120 that supports the instrument 122 when the instrument 122 reaches the goal position. It should be appreciated that the improved avoidance of collisions and the more efficient targeting and alignment processes do not require the instruments 122 to be inserted to achieve the described benefits.
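As one possible realization of the described haptic feedback (the ramp distance and scaling are assumptions), resistance could increase as the instrument nears the goal position and saturate at arrival, at which point the pose could be locked:

```python
def insertion_resistance(distance_to_goal, ramp_start=0.02, max_resistance=1.0):
    """Return a resistance scale in [0, max_resistance]: zero while farther
    than ramp_start (meters) from the goal, ramping linearly to the maximum
    as the instrument reaches the goal position."""
    if distance_to_goal >= ramp_start:
        return 0.0
    return max_resistance * (1.0 - distance_to_goal / ramp_start)
```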
  • Figure 2 depicts an example diagram 200 for defining target axes 236 for aligning instruments 222 (such as the instruments 122 of Figure 1) for entry to a workspace 242.
  • the techniques for defining the target axes 236 are performed by the control system 140 of Figure 1.
  • An operator may control the pose of a repositionable structure (such as the repositionable structure 120a of Figure 1) that supports an imaging instrument 222a (such as the imaging instrument 122a of Figure 1) such that the FOV of the imaging device 226a (such as the imaging device 126a of Figure 1) includes a position 232 to utilize for targeting the other instruments 222.
  • the control system may analyze image data generated by the imaging device 226a to detect the presence of an anatomical feature 244 to automatically define the position 232.
  • the operator may interact with an operator interface (e.g., a physical button or a GUI element) to identify the position 232 to the control system.
  • the control system may define target axes 236 for the repositionable structures that are ready for targeting.
  • Each of the repositionable structures may be associated with a respective entry site 230. That is, the imaging instrument 222a may be inserted through the entry site 230a, a second instrument 222b may be inserted through the entry site 230b, and a third instrument 222c may be inserted through the entry site 230c.
  • the control system may define an axis that connects the position 232 with the entry site 230b.
  • the control system may define an axis that connects the position 232 with the entry site 230c. The control system may then determine a configuration of the repositionable structures such that a shaft of the instrument 222b is aligned with the target axis 236b and a shaft of the instrument 222c is aligned with the target axis 236c.
  • the control system may define goal positions 238 along the target axes 236 at which the instruments 222 are to be positioned prior to the procedure. Accordingly, the control system may define a target volume 234 centered at the target position 232. It should be appreciated that while Figure 2 depicts a spherical target volume 234, in other embodiments, the target volume 234 may have an ellipsoidal shape, a polygonal shape, a shape having convex or concave surfaces, a shape having a convex surface facing the entry site, or any other appropriate shape.
  • control system may instead truncate the target volume to ensure that the target volume does not include the patient anatomy (or, in some embodiments, approach within a threshold distance of the patient anatomy).
  • the control system may define the goal positions 238 as the intersection points between the target axes 236 and a surface of the target volume 234 (i.e., the closest point on the target volume 234 to the respective entry site 230). Accordingly, the control system may define the goal position 238b of the instrument 222b to be the intersection of the target axis 236b and the surface of the target volume 234 and the goal position 238c of the instrument 222c to be the intersection of the target axis 236c and the surface of the target volume 234.
  • control system may generate multiple target volumes 234 that have different radii and/or shapes based upon the particular instrument associated with the goal position 238 being defined.
  • control system is able to maximize the working space associated with the workspace 242, thereby providing improved working conditions for performing the procedure.
  • control system may be configured to support a guided tool change mode, in which the control system facilitates the “changing” of an instrument previously mounted to a repositionable structure.
  • This guided tool “change” mode does not require that the instrument actually be changed. For example, an instrument can be removed for cleaning and remounted, an instrument can be removed to reload the instrument with staples, clips, sutures, or other material and then remounted, and/or the like.
  • Techniques related to guided tool change executed by a computer-assisted system are described in U.S. Patent No. 6,645,196 B1, filed June 16, 2000, and entitled “GUIDED TOOL CHANGE,” which is incorporated herein by reference.
  • the control system may analyze an indication of the mounted instrument to determine whether the goal position 238 should be updated. If the indication indicates that an instrument of a different type was mounted to the repositionable structure, the control system may re-define the goal position 238 using a target volume 234 for the new type of instrument to ensure that the new tool is re-inserted to the appropriate depth.
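A sketch of such a goal-position update on instrument remount (the per-type radii are invented placeholders; the text says only that target volumes may differ by instrument):

```python
import numpy as np

# Hypothetical per-instrument-type target volume radii, in meters.
TARGET_RADIUS_BY_TYPE = {"stapler": 0.03, "grasper": 0.015, "cutter": 0.02}

def updated_goal(target_center, entry_site, old_type, new_type):
    """Re-define the goal position when a different instrument type is
    mounted, so the new tool is re-inserted to the appropriate depth."""
    if new_type == old_type:
        return None  # same type: the existing goal position remains valid
    axis = np.asarray(target_center, float) - np.asarray(entry_site, float)
    axis = axis / np.linalg.norm(axis)
    # Nearest point on the new instrument type's target volume surface.
    return np.asarray(target_center, float) - TARGET_RADIUS_BY_TYPE[new_type] * axis
```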
  • Figure 3 illustrates an example GUI 300 provided to an operator (such as the operator 108 of Figure 1) to facilitate a procedure using a computer-assisted system (such as the computer-assisted system 100 of Figure 1).
  • the GUI 300 is presented via a display device associated with a workstation (such as the display device 112 of the workstation 102 of Figure 1).
  • a control system (such as the control system 140 of Figure 1), the workstation, and/or a combination thereof may generate the GUI 300 and/or the data displayed thereby.
  • the GUI 300 enables a user to visualize image data 325 generated by an imaging instrument (such as the imaging instruments 122a, 222a) and/or other imaging devices.
  • the follower device that supports the imaging instrument may include multiple imaging instruments mounted to repositionable structures.
  • the GUI 300 may include an indication 320 that identifies the repositionable structure and/or the imaging instrument associated with the displayed image data 325.
  • the GUI includes visual indicators 336 representative of target axes of instruments.
  • the control system may access kinematic data to determine a pose of the imaging instrument to identify a portion of the coordinate space within the FOV of the imaging instrument. If the stored data associated with the target axes is included within the portion of the coordinate space, the control system may generate visual indicators 336b representative of the target axes in the image data 325. As a result, the operator is able to visually see the insertion path of the instruments.
  • this may enable the operator to confirm that the automatically-defined target axes are appropriate (e.g., do not collide with patient anatomy, align with operator preferences, etc.) before the control system commands the follower device toward a configuration that aligns the instrument shafts with the target axes. If the operator is satisfied with the target axes, the operator may interface with a GUI element 346 to confirm the target axes and cause the control system to command the follower device towards the configuration.
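Rendering the visual indicators 336 amounts to projecting sampled points of each 3D target axis into the imaging device's image; a sketch using a pinhole camera model (the kinematically derived camera pose and the intrinsic matrix are assumed inputs):

```python
import numpy as np

def project_axis_points(points_world, cam_from_world, intrinsics):
    """Project 3D points on a target axis into pixel coordinates.
    cam_from_world: 4x4 transform from the world frame to the camera frame
    (e.g., derived from kinematic data); intrinsics: 3x3 camera matrix."""
    pts = np.hstack([points_world, np.ones((len(points_world), 1))])
    pts_cam = (cam_from_world @ pts.T)[:3]        # world -> camera coordinates
    in_front = pts_cam[2] > 0                     # keep points the camera can see
    uv = intrinsics @ pts_cam[:, in_front]
    return (uv[:2] / uv[2]).T                     # (M, 2) pixel coordinates
```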
  • while Figure 3 depicts the GUI element 346 as a button, any suitable GUI element may be utilized in other embodiments.
  • one or more hardware user interface elements (e.g., physical buttons, dials, foot pedals, etc.) may be used instead of, in addition to, or concurrently with GUI element 346 to command the follower device towards the configuration.
  • Figures 4A-4D illustrate an example process of a computer-assisted system 400 (such as the computer-assisted system 100 of Figure 1) aligning a follower device 404 (such as the follower device 104 of Figure 1) with target axes that were defined in accordance with techniques described elsewhere herein. It should be understood that the examples of Figures 4A-4D are not restrictive, and that other repositionable structures, instruments, behaviors, and/or the like depicted in Figures 4A-4D may be different for other computer- assisted systems, different repositionable structures, different imaging devices, different instruments, different DOFs, different procedures, and/or the like.
  • Figures 4A-4D depict four repositionable structures 420 (such as the repositionable structures 120 of Figure 1) all mechanically grounded to a common kinematic base.
  • the repositionable structures 420 do not share such a common kinematic base, and are mounted to two or more separate carts, configured to be mounted to walls, tables, ceilings, floors independently of each other, etc.
  • a control system (such as the control system 140 of Figure 1) may employ registration techniques, applied with appropriate geometric models and reference frame transformations, to locate the repositionable structures 420, and components such as instrument 422 (such as an instrument 122 of Figure 1) supported by the repositionable structures 420, relative to each other.
  • the repositionable structures 420 exhibit an initial and/or default pose. As such, the instrument shafts of the instruments 422 are not aligned with their corresponding target axes.
  • the repositionable structure 420b is mounted with an imaging instrument 422b (such as the imaging instruments 122a, 222a), and the repositionable structures 420a, 420c, and 420d are mounted with respective instruments 422a, 422c, and 422d and, in some embodiments, have been verified to be ready for targeting.
  • the control system may have previously determined a configuration at which the instrument shafts of the instruments 422 are aligned with their respective target axes.
  • the control system may determine the configuration by using inverse kinematics of the repositionable structures 420 and/or the instruments 422. More particularly, inverse kinematics may be applied to determine joint commands for the repositionable structures 420 and/or the instruments 422, such as to align the instrument shafts of the instrument with their respective target axes.
  • the control system may also apply image processing, geometric modelling, pre-stored parameters, real-time data regarding the environment or the system, and/or the like to determine the appropriate joint commands.
  • the control system may sequentially command the instruments 422 and/or the repositionable structures 420 toward the determined configurations (e.g., by issuing the joint commands associated with the respective repositionable structure 420). To reduce the likelihood of collision, the control system may command the instruments and/or repositionable structures further from the center of the cluster of repositionable structures 420 (e.g., the repositionable structures 420a and 420d) prior to commanding the instruments and/or repositionable structures closer to the center of the cluster (e.g., the repositionable structure 420c).
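The collision-reducing ordering, commanding outermost structures before those nearer the cluster's center, can be expressed as a sort by distance from the cluster centroid (a sketch; structure base positions are assumed to be known from registration):

```python
import numpy as np

def command_order(base_positions):
    """Return structure indices ordered farthest-first from the centroid of
    the cluster of repositionable structures, so outer structures move first."""
    bases = np.asarray(base_positions, dtype=float)   # (N, 3) base positions
    dists = np.linalg.norm(bases - bases.mean(axis=0), axis=1)
    return list(np.argsort(-dists))
```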

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Robotics (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Surgical Instruments (AREA)

Abstract

Computer-assisted systems and methods for performing instrument targeting therewith are provided. The computer-assisted system includes a first repositionable structure configured to support a first instrument, where the first repositionable structure comprises a first plurality of joints, and a control system comprising one or more processors. The control system may be communicatively coupled to the first repositionable structure. To perform the instrument targeting, the control system (1) determines a target in a workspace for positioning the first instrument supported by the first repositionable structure; (2) defines a first target axis based on the target and an entry site location; (3) determines a configuration of the first plurality of joints that aligns the first instrument with the first target axis such that the first instrument is able to be advanced towards the target along the first target axis; and (4) commands the first plurality of joints based on the determined configuration.

Description

ALIGNING AN INSTRUMENT SUPPORTED BY A COMPUTER-ASSISTED SYSTEM
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to and the benefit of the filing date of provisional U.S. Patent Application No. 63/585,912 entitled “ALIGNING AN INSTRUMENT SUPPORTED BY A COMPUTER-ASSISTED SYSTEM,” filed on September 27, 2023. The entire contents of the provisional application are hereby expressly incorporated herein by reference.
TECHNICAL FIELD
[0002] The present disclosure relates generally to computer-assisted systems and more particularly to aligning instruments supported by computer-assisted systems.
BACKGROUND
[0003] Some computer-assisted systems include one or more instruments that are articulated when performing various procedures. The computer-assisted system can be automated, semiautomated, teleoperated, etc. In a teleoperated example, a human operator manipulates one or more leader input controls to command motion of one or more follower devices located in a workspace. Example follower devices include instruments, repositionable structures configured to support instruments, etc. In some examples, the teleoperated system is configured to support an instrument such as a catheter, electrocautery device, cutting device, grasping device, stapler, etc. In some instances, the computer-assisted system moves the instrument within a workspace to perform a task at a worksite, such as to manipulate specific tissue within the interior anatomy of a patient in a medical example.
[0004] Inserting the instrument into the workspace and aligning the instrument for performing tasks at the worksite can be complicated or inefficient. For example, repositionable structures or instruments with more complex kinematic designs, more massive materials, or higher inertia components can be harder to manipulate or control. As another example, workspaces with smaller entry sites (such as minimally invasive applications) can require more precise alignment.
[0005] Accordingly, improved techniques to align the instrument, such as for advancement towards a workspace, are desirable. Such techniques can allow instrument alignment to be performed more quickly and more efficiently.
SUMMARY
[0006] The following presents a simplified summary of various examples described herein and is not intended to identify key or critical elements or to delineate the scope of the claims.
[0007] In one embodiment, a computer-assisted system is provided. The computer-assisted system may include (i) a first repositionable structure configured to support a first instrument, where the first repositionable structure supporting the first instrument comprises a first plurality of joints; and (ii) a control system comprising one or more processors, where the control system is communicatively coupled to the first repositionable structure. To perform instrument targeting, the control system may be configured to (1) determine a target in a workspace for positioning the first instrument supported by the first repositionable structure; (2) define a first target axis based on the target and an entry site location; (3) determine a configuration of the first plurality of joints that aligns the first instrument with the first target axis such that the first instrument is able to be advanced along the first target axis; and (4) command the first plurality of joints based on the determined configuration.
[0008] In another embodiment, a method of performing instrument targeting using a computer-assisted system is provided, the computer-assisted system including (i) a first repositionable structure configured to support a first instrument, wherein the first repositionable structure supporting the first instrument comprises a first plurality of joints; and (ii) a control system comprising one or more processors, the control system communicatively coupled to the first repositionable structure. The method includes (1) determining, via the control system, a target in a workspace for positioning the first instrument supported by the first repositionable structure; (2) defining, via the control system, a first target axis based on the target and an entry site location; (3) determining, via the control system, a configuration of the first plurality of joints that aligns the first instrument with the first target axis such that the first instrument is able to be advanced along the first target axis; and (4) commanding, via the control system, the first plurality of joints based on the determined configuration.
[0009] In a further embodiment, one or more non-transitory machine-readable media include a plurality of machine-readable instructions which when executed by a processor system are adapted to cause the processor system to perform any of the methods described herein.
[0010] It is to be understood that both the foregoing general description and the following detailed description are illustrative and explanatory in nature and are intended to provide an understanding of the present disclosure without limiting the scope of the present disclosure. In that regard, additional aspects, features, and advantages of the present disclosure will be apparent to one skilled in the art from the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] Figure 1 is a diagram of a computer-assisted system in accordance with one or more embodiments.
[0012] Figure 2 is an example depiction for defining target axes and/or target positions using a computer-assisted system, in accordance with one or more embodiments.
[0014] Figure 3 is an example graphical user interface depicting visual indications of target axes defined using a computer-assisted system, in accordance with one or more embodiments.
[0015] Figures 4A-4D illustrate an example process of the computer-assisted system aligning a follower device with the target axes, in accordance with one or more embodiments.
[0016] Figure 5 is a flow diagram of method steps for manipulating an imaging device when inserting an instrument in a computer-assisted system in accordance with one or more embodiments.
[0017] Examples of the present disclosure and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures, wherein showings therein are for purposes of illustrating examples of the present disclosure and not for purposes of limiting the same.
DETAILED DESCRIPTION
[0018] In this description, specific details are set forth describing some embodiments consistent with the present disclosure. Numerous specific details are set forth in order to provide a thorough understanding of the embodiments. It will be apparent, however, to one skilled in the art that some embodiments may be practiced without some or all of these specific details. The specific embodiments disclosed herein are meant to be illustrative but not limiting. One skilled in the art may realize other elements that, although not specifically described here, are within the scope and the spirit of this disclosure. In addition, to avoid unnecessary repetition, one or more features shown and described in association with one embodiment may be incorporated into other embodiments unless specifically described otherwise or if the one or more features would make an embodiment non-functional.
[0019] Further, the terminology in this description is not intended to limit the invention. For example, spatially relative terms, such as “beneath”, “below”, “lower”, “above”, “upper”, “proximal”, “distal”, and the like, may be used to describe the relation of one element or feature to another element or feature as illustrated in the figures. These spatially relative terms are intended to encompass different positions (i.e., locations) and orientations (i.e., rotational placements) of the elements or their operation in addition to the position and orientation shown in the figures. For example, if the content of one of the figures is turned over, elements described as “below” or “beneath” other elements or features would then be “above” or “over” the other elements or features. A device may be otherwise oriented and the spatially relative descriptors used herein interpreted accordingly. Likewise, descriptions of movement along and around various axes include various special element positions and orientations. In addition, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context indicates otherwise. Additionally, the terms “comprises”, “comprising”, “includes”, and the like specify the presence of stated features, steps, operations, elements, and/or components but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups. Components described as coupled may be electrically or mechanically directly coupled, or they may be indirectly coupled via one or more intermediate components.
[0020] Elements described in detail with reference to one embodiment, implementation, system, or module may, whenever practical, be included in other embodiments, implementations, systems, or modules in which they are not specifically shown or described. For example, if an element is described in detail with reference to one embodiment and is not described with reference to a second embodiment, the element may nevertheless be claimed as included in the second embodiment. Thus, to avoid unnecessary repetition in the following description, one or more elements shown and described in association with one embodiment, implementation, or application may be incorporated into other embodiments, implementations, or aspects unless specifically described otherwise, unless the one or more elements would make an embodiment or implementation non-functional, or unless two or more of the elements provide conflicting functions.
[0021] In some instances, well known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
[0022] This disclosure describes various devices, elements, and portions of computer-assisted systems and elements in terms of their state in three-dimensional space. As used herein, the term “position” refers to the location of an element or a portion of an element (e.g., three degrees of translational freedom in a three-dimensional space, such as along Cartesian x-, y-, and z- coordinates). As used herein, the term “orientation” refers to the rotational placement of an element or a portion of an element (e.g., three degrees of rotational freedom in three-dimensional space, such as about roll, pitch, and yaw axes, represented in angle-axis, rotation matrix, quaternion representation, and/or the like). As used herein, and for a device with a kinematic series, such as with a repositionable structure with a plurality of links coupled by one or more joints, the term “proximal” refers to a direction toward a base of the kinematic series, and “distal” refers to a direction away from the base along the kinematic series.
[0023] As used herein, the term “pose” refers to the multi-degree of freedom (DOF) spatial position and orientation of a coordinate system of interest attached to a rigid body. In general, a pose includes a pose variable for each of the DOFs in the pose. For example, a full 6-DOF pose for a rigid body in three-dimensional space would include 6 pose variables corresponding to the 3 positional DOFs (e.g., x, y, and z) and the 3 orientational DOFs (e.g., roll, pitch, and yaw). A 3-DOF position only pose would include only pose variables for the 3 positional DOFs. Similarly, a 3-DOF orientation only pose would include only pose variables for the 3 rotational DOFs. Further, a velocity of the pose captures the change in pose over time (e.g., a first derivative of the pose). For a full 6-DOF pose of a rigid body in three-dimensional space, the velocity would include 3 translational velocities and 3 rotational velocities. Poses with other numbers of DOFs would have a corresponding number of translational and/or rotational velocities.
[0024] Aspects of this disclosure are described in reference to computer-assisted systems, which can include devices that are teleoperated, externally manipulated, autonomous, semiautonomous, and/or the like. Further, aspects of this disclosure are described in terms of an implementation using a teleoperated surgical system, such as the da Vinci® Surgical System commercialized by Intuitive Surgical, Inc. of Sunnyvale, California. Knowledgeable persons will understand, however, that inventive aspects disclosed herein may be embodied and implemented in various ways, including teleoperated and non-teleoperated, and medical and non-medical embodiments and implementations. Implementations on da Vinci® Surgical Systems are merely exemplary and are not to be considered as limiting the scope of the inventive aspects disclosed herein. For example, techniques described with reference to surgical instruments and surgical methods may be used in other contexts. Thus, the instruments, systems, and methods described herein may be used for humans, animals, portions of human or animal anatomy, industrial systems, general robotic systems, or teleoperated systems. As further examples, the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, sensing or manipulating non-tissue work pieces, cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, setting up or taking down systems, training medical or non-medical personnel, and/or the like. Additional example applications include use for procedures on tissue removed from human or animal anatomies (with or without return to a human or animal anatomy) and for procedures on human or animal cadavers. Further, these techniques can also be used for medical treatment or diagnosis procedures that include, or do not include, surgical aspects.
[0025] Figure 1 is a simplified diagram of an example computer-assisted system 100, according to various embodiments. In some examples, the computer-assisted system 100 is a teleoperated system. In medical examples, the computer-assisted system 100 can be a teleoperated medical system such as a surgical system. As shown, the computer-assisted system 100 includes a follower device 104 that can be teleoperated by being controlled by one or more leader devices (also called “leader input devices” when designed to accept external input), described in greater detail below. Systems that include a leader device and a follower device are referred to as leader-follower systems, and also sometimes referred to as master-slave systems. Also shown in Figure 1 is an input system that includes a workstation 102 (e.g., a console), and in various embodiments the input system can be in any appropriate form and may or may not include the workstation 102.
[0026] In the example of Figure 1, the workstation 102 includes one or more leader input devices 106 that are designed to be contacted and manipulated by an operator 108. For example, the workstation 102 may comprise one or more leader input devices 106 for use by the hands, the head, or some other body part(s) of operator 108. The leader input devices 106 in this example are supported by the workstation 102 and can be mechanically grounded. In some embodiments, an ergonomic support 110 (e.g., forearm rest) can be provided on which the operator 108 can rest his or her forearms. In some examples, the operator 108 can perform tasks at a worksite within a workspace near the follower device 104 during a procedure, by commanding the follower device 104 using the leader input devices 106. In a medical example, the worksite may be a surgical worksite associated with a patient.
[0027] A display device 112 is also included in the workstation 102. The display device 112 may be configured to display images for viewing by the operator 108. The display device 112 can be moved in various DOFs to accommodate the viewing position of the operator 108 and/or to provide control functions. In embodiments where the display device 112 provides control functions, the leader input devices 106 may include the display device 112. In the example of the computer-assisted system 100, displayed images may depict a worksite at which the operator 108 is performing various tasks by manipulating the leader input devices 106 and/or the display device 112. In some examples, images displayed by display device 112 may be received by the workstation 102 from one or more imaging devices arranged at a worksite. In other examples, the images displayed by the display device 112 may be generated by the display device 112 (or by a different connected device or system), such as for virtual representations of tools, the worksite, or for user interface components.
[0028] When using the workstation 102, the operator 108 can sit in a chair or other support in front of the workstation 102, position his or her eyes in front of the display device 112, manipulate the leader input devices 106, and rest his or her forearms on the ergonomic support 110 as desired. In some embodiments, the operator 108 can stand at the workstation or assume other poses, and the display device 112 and the leader input devices 106 can be adjusted in position (height, depth, etc.) to accommodate the pose of the operator 108.
[0029] In some embodiments, the one or more leader input devices 106 can be ungrounded (ungrounded leader input devices being not kinematically grounded, such as leader input devices 106 held by the hands of the operator 108 without additional physical support). Such ungrounded leader input devices 106 can be used in conjunction with the display device 112. In some embodiments, the display device 112 is positioned near the worksite such that the operator 108 can manually operate instruments at the worksite, such as a medical instrument in a medical example, while viewing images displayed by the display device 112.
[0030] As illustrated, the computer-assisted system 100 also includes a follower device 104 that can be commanded by the workstation 102. In a medical example, the follower device 104 can be located near an operating table (e.g., a table, bed, or other support) on which a patient can be positioned. In some medical examples, the workspace is provided on an operating table, e.g., on or in a patient, simulated patient or model, training dummy, etc. (not shown). As illustrated, the follower device 104 may include a plurality of repositionable structures 120 (sometimes referred to as “manipulator arms” in robotic embodiments). In some embodiments, the repositionable structures 120 may include a plurality of links that are rigid members and joints that can be individually actuated as part of a kinematic series. Additionally, each of the repositionable structures 120 is configured to couple to an instrument 122. While Figure 1 illustrates a follower device 104 that has four repositionable structures 120, in other embodiments the follower device 104 may include fewer or additional repositionable structures 120.
[0031] The instrument 122 can include, for example, a working portion 126 and one or more structures for supporting and/or driving the working portion 126. Example working portions 126 include end effectors that physically contact or manipulate material, energy application elements that apply electrical, RF, ultrasonic, or other types of energy, sensors that detect characteristics of the workspace environment (such as temperature sensors, imaging devices, etc.), and the like. In various embodiments, examples of instruments 122 include, without limitation, a sealing instrument, a cutting instrument, a sealing-and-cutting instrument, an energy instrument for applying energy, a gripping instrument (e.g., clamps, jaws), a stapler, an imaging instrument such as one using optical, RF, or ultrasonic imaging modalities, a sensing instrument, an irrigation instrument, a suction instrument, and/or the like. In addition, the instrument 122 may include a transmission mechanism 128 that can be coupled to a drive assembly 130 of the respective repositionable structure 120. The drive assembly 130 may include a drive and/or other mechanisms controllable from the workstation 102 that transmit forces to the transmission mechanism 128 to articulate or otherwise actuate the instrument 122.

[0032] As illustrated, each instrument 122 may be mounted to a portion of a respective repositionable structure 120. In Figure 1, this is shown with the drive assembly 130 physically coupled to the transmission mechanism 128. The distal portion of each repositionable structure 120 further includes a cannula mount 124 to which a cannula (not shown) is mounted. When a cannula is mounted to the cannula mount 124, a shaft of the instrument 122 passes through the cannula and into a workspace.
[0033] In various embodiments, one or more of the working portions 126 of the instruments 122 may include an imaging device for capturing images. The imaging device may include any sensing technology capable of acquiring an image. Example imaging instruments include an optical endoscope, a hyperspectral camera, an ultrasonic sensor, etc. Imaging instruments may comprise monoscopic imagers, stereoscopic imagers, and/or the like. Imaging devices based on radiofrequency domains may capture images in any frequency spectrum, including visible light, infrared light, ultraviolet light, and/or the like. The imaging device may include an illumination source to light the region being imaged. In embodiments where the working portions 126 of one or more of the instruments 122 include an imaging device, the instrument 122 may be configured to capture images of a portion of the workspace for display via the display device 112.
[0034] In some embodiments, the repositionable structures 120 and/or instrument 122 can be controlled to move the working portion 126 in response to manipulation of the leader input devices 106 by the operator 108. Accordingly, the repositionable structures 120 and/or instrument 122 may be said to “follow” the leader input devices 106 through teleoperation. This enables the operator 108 to perform tasks at the worksite using the repositionable structures 120 and/or instrument 122. For a surgical example, the operator 108 can direct the repositionable structures 120 of the follower device 104 to move the working portions 126 as part of a surgical procedure performed at an internal surgical site that is entered via one or more minimally invasive apertures or natural orifices.
[0035] In some embodiments, a repositionable structure 120a of the computer-assisted system 100 may be configured to support a working portion 126a that includes an imaging device (also referred to herein as an “imaging device 126a”). For convenience, an instrument 122 that includes an imaging device is also referred to as an “imaging instrument” herein. The control system 140 may be configured to command the repositionable structure 120a and/or the imaging instrument 122a comprising the imaging device 126a to automatically position and/or orient (“pose”) the field of view (FOV) of the imaging device 126a to provide images of the workspace and/or other instruments 122.
[0036] In the illustrated embodiment, a control system 140 is communicatively coupled to the workstation 102. In other embodiments, the control system 140 may be provided as a component of the workstation 102 and/or the follower device 104. During teleoperation, as the operator 108 moves the leader input device(s) 106, one or more sensors configured to detect the leader input device(s) 106 generate spatial and/or orientation movement data that is provided to the control system 140. The control system 140 may interpret the spatial and/or orientation information to determine and/or provide control signals to the follower device 104 to control the movement of the repositionable structures 120, instrument 122, and/or working portions 126. In one embodiment, the control system 140 supports one or more wired communication protocols (e.g., Ethernet, USB, and/or the like) and/or one or more wireless communication protocols (e.g., Bluetooth, IrDA, HomeRF, IEEE 802.11, DECT, Wireless Telemetry, and/or the like) for communications between the control system 140 and other components of the computer-assisted system 100, such as the workstation 102 and/or the follower device 104.
[0037] In some embodiments, the control system 140 may be implemented at one or more computing systems. For example, one or more computing systems may be used to control follower device 104. As another example, one or more computing systems may be used to control components of workstation 102, such as movement of a display device 112.
[0038] As illustrated, the control system 140 includes a processor system 150 and a memory 160. The memory 160 may store a control module 170. The processor system 150 may include one or more processors having different processing architectures for processing instructions. For example, the one or more processors may be one or more cores or micro-cores of a multi-core processor, a central processing unit (CPU), a microprocessor, a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a graphics processing unit (GPU), a tensor processing unit (TPU), and/or the like.
[0039] In some embodiments, the processor system 150 includes circuitry to support one or more communication interfaces (e.g., Bluetooth interface, infrared interface, network interface, optical interface, etc.). Additionally, a communication interface of the control system 140 may include an integrated circuit for connecting the control system 140 to a network (not shown) (e.g., a local area network (LAN), a wide area network (WAN) such as the Internet, mobile network, or any other type of network) and/or to another device, such as the workstation 102 and/or the follower device 104.
[0040] Additionally, the memory 160 may include non-persistent storage (e.g., volatile memory, such as random access memory (RAM), cache memory) and persistent storage (e.g., a hard disk, an optical drive such as a compact disk (CD) drive or digital versatile disk (DVD) drive, a flash memory, a floppy disk, a flexible disk, a magnetic tape, any other magnetic medium, any other optical medium, programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), a FLASH-EPROM, and/or any other memory chip or cartridge). The non-persistent storage and persistent storage are examples of non-transitory, tangible machine-readable media that can store executable code that, when run by one or more processors (e.g., processor system 150), can cause the one or more processors to perform one or more of the techniques and/or methods disclosed herein.
[0041] Additionally, the control system 140 may also include one or more input devices (such as a touchscreen, keyboard, mouse, microphone, touchpad, electronic pen, or any other type of input device) and/or output devices (such as a display device, a speaker, external storage, a printer, or any other output device). In some embodiments, the control system 140 may be implemented on a particular node of a distributed computing system (e.g., a cloud computing system). As another example, different functionalities associated with the control system 140 may be implemented on different nodes of the distributed computing system. Further, one or more elements of the aforementioned control system 140 may be located at a remote location and connected to the other elements over a network.
[0042] In an endoscopic surgery example, the imaging instrument comprising the imaging device 126a may be inserted into the patient prior to the other instruments 122, including a second instrument 122b comprising a second working portion 126b. The instrument 122b can include any appropriate working portion 126b, and can even include a second imaging device. Accordingly, the imaging device 126a may be maneuvered into position to identify a target to use to define an insertion axis (also referred to herein as a “target axis”) along which the other instruments 122 are to traverse to reach the worksite. Based on this position, the control system 140 may automatically command the corresponding repositionable structures 120 to align a shaft that supports the working portions 126 with their respective insertion axes.
[0043] This technique enables the computer-assisted system 100 to automatically align instruments 122 supported by the repositionable structures 120 for insertion of their respective working portions 126 to the worksite prior to the performance of, in some examples, a procedure. By automatically aligning the instruments 122, the disclosed techniques reduce the likelihood of the instruments (or the repositionable structures supporting the instruments) colliding with one another and/or an avoidance region (e.g., in a medical example, a subject’s anatomy). As a result, when the operator 108 is ready to perform the procedure, the operator 108 may simply command the follower device 104 to insert the instruments 122 along the insertion axes, or another operator (e.g., an operator who can physically contact the instrument 122 or the follower device 104) can simply push the instrument 122 along the insertion axis. The system facilitating such a technique can make the procedure setup process easier for human operators, reduce workflow disruption, reduce time spent for instrument alignment, and increase the overall efficiency of the medical procedure. Such techniques can also improve the efficiency of operating the computer-assisted system or instrument, simplify user control of the computer-assisted system or instrument, and improve the accuracy of the alignment of the instrument. Further, although a surgical example is shown, the disclosed techniques provide an improvement to the computer-assisted system 100 in the non-surgical aspects of the procedure, and can be used to improve computer-assisted systems applied in non-medical contexts.
[0044] As an example workflow, the control system 140 may receive an indication that a second instrument 122b mounted to a second repositionable structure 120b is to be inserted into the workspace. In a medical example, this second instrument 122b can be inserted through an entry site, such as through an incision or a natural orifice in a patient. An accessory can be inserted in the entry site to provide an opening for instrument insertion. For example, an instrument can be inserted through an entry site by being inserted through a port or cannula already inserted in the entry site.
[0045] In response to receiving the indication, the control system 140 may operate in an alignment mode in which the control system 140 determines a target (e.g., an operator-identified and/or system-identified position and/or line) to use for determining a target axis along which the second instrument 122b is to be inserted into the workspace. Aligning the second instrument 122b to the target axis can comprise aligning an insertion axis of the second instrument 122b with the target axis.
[0046] In some embodiments where the target is derived from an operator identification (e.g., an operator-identified position), the workstation 102 and/or the display device 112 may include a user interface that enables the operator 108 to indicate a position associated with the imaging instrument 122a comprising the imaging device 126a. For example, the workstation 102 and/or the input devices 106 may include a button, switch, toggle, or other user-interactable element via which the operator 108 is able to indicate that the operator-identified position is to be determined. In response to detecting the input, the control system 140 may analyze kinematic data associated with the imaging instrument 122a and/or the image data generated by the imaging device 126a thereof to determine the operator-identified position. In some embodiments, the control system 140 utilizes the kinematic information associated with the imaging instrument 122a to set the operator-identified position at a tip of the imaging instrument 122a, or at an origin point for a field of view (FOV) of the imaging device 126a included in the imaging instrument 122a. In other embodiments, the control system 140 sets the operator-identified position as being offset from the imaging instrument 122a along a defined direction. In some examples, the defined direction is along a longitudinal axis of the imaging instrument 122a in the direction towards the worksite, or along a direction of view of the imaging instrument 122a. In some examples, the offset is a predetermined offset. In some other examples, the offset is a dynamically determined offset such that the operator-identified position coincides with a physical structure (e.g., an object such as a portion of the patient anatomy) along the defined direction. Some example predetermined offsets include a focal distance of the imaging device 126a, a typical offset distance for the instant procedure, etc.
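As a non-authoritative sketch of the offset placement described above, the following assumes the tip position and direction of view of the imaging instrument are already available from kinematic data; the function name and signature are hypothetical:

```python
import numpy as np

def operator_identified_position(tip_position, direction_of_view, offset=0.0):
    """Place the operator-identified position at `offset` along the direction
    of view, measured from the imaging instrument tip. An offset of 0 places
    the position at the tip itself; a predetermined offset could be, e.g.,
    the imaging device's focal distance."""
    d = np.asarray(direction_of_view, dtype=float)
    d = d / np.linalg.norm(d)  # unit vector along the defined direction
    return np.asarray(tip_position, dtype=float) + offset * d
```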
[0047] The direction of view of an imaging device is a direction about which the FOV is centered. For example, an endoscope’s direction of view can be described as a central axis extending from the origin of the endoscope’s FOV toward a far extent of the FOV. For a monoscopic endoscope with an FOV having a truncated conical or frustum shape, the direction of view extends from the origin of the FOV toward the center of the base of the cone or frustum. For a stereoscopic endoscope, the direction of view can be defined for each of the two cameras of the endoscope, or as a combination or average of the directions of view or FOVs of the two cameras (e.g., an average of the two directions of view of the two cameras; a central axis of a union of the FOVs of the two cameras; an intersection of the FOVs if the FOVs intersect; etc.). As another example, the direction of view for A-mode ultrasound is the direction outwards from the ultrasonic sensor along the single dimension of the A-mode ultrasound image, and the direction of view for B-mode ultrasound is a central axis extending from the base of the ultrasound image toward a center of the extent of the ultrasound image in two-dimensional space.
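For the stereoscopic case, one of the combination options mentioned above (averaging the two directions of view) might look like the following sketch; the naming is illustrative:

```python
import numpy as np

def combined_direction_of_view(dov_left, dov_right):
    """Approximate a stereoscopic endoscope's direction of view as the
    normalized average of the two cameras' direction-of-view vectors."""
    avg = (np.asarray(dov_left, dtype=float) + np.asarray(dov_right, dtype=float)) / 2.0
    return avg / np.linalg.norm(avg)
```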
[0048] In other embodiments, the operator-identified position may be more precisely indicated by the operator. As another example, the display device 112 may include a graphical user interface (GUI) that depicts the image data generated by the imaging instrument comprising the imaging device 126a. In this example, the operator 108 may indicate a particular position by interacting with the GUI (e.g., by moving a cursor object to the location of the target, by touching the display device 112, and/or other known means for interacting with a GUI). As another example, the GUI may enable the operator 108 to input a position, size, and/or shape of a volume for the target into one or more parameter entry fields. It should be appreciated that although the term “operator-identified position” is used herein, in some embodiments, the operator may instead identify a line via the GUI (e.g., by identifying multiple positions which the control system 140 uses to define the line, by manipulating a virtual line indicator, etc.). Accordingly, the term “operator-identified position” also refers to embodiments in which the operator identifies a line.
[0049] In embodiments where the control system 140 defines a position for the target, the control system 140 may use pre-configured data indicative of the worksite. For a medical example, a user may label pre-operative image data of the worksite with an indication of a desired position for the instruments 122 to be positioned. In this example, the user may label a target lesion (or other anatomical feature of interest) in the pre-operative image data. Accordingly, in this example, the control system 140 may be configured to process the image data generated by the imaging instrument 122a comprising the imaging device 126a to detect when the anatomical feature is included in the FOV. Upon detecting the presence of the anatomical feature, the control system 140 may automatically define a position associated with the anatomical feature from which the target is to be derived.
[0050] Regardless of whether the position is operator-identified and/or system-defined, the control system 140 may then define a target axis for aligning the instrument 122b. More particularly, the control system 140 may define a target axis that coincides with a straight line defined by the operator-identified and/or system-defined position and the entry site associated with the instrument 122b.
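A minimal sketch of this construction, assuming the target position and entry site location are expressed in a common reference frame (the helper name is hypothetical):

```python
import numpy as np

def define_target_axis(target_position, entry_site):
    """Return the target axis as (origin, unit_direction): the straight line
    through the entry site and the target, with the direction pointing from
    the entry site toward the target (the insertion direction)."""
    origin = np.asarray(entry_site, dtype=float)
    direction = np.asarray(target_position, dtype=float) - origin
    return origin, direction / np.linalg.norm(direction)
```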
[0051] Additionally, in some embodiments, the control system 140 may define a goal position along the target axis at which the instrument 122b is to be inserted for performing the procedure. For example, the control system 140 may define a volume centered at the operator-identified and/or system-defined position. In some embodiments, the volume has a spherical and/or ellipsoidal shape. The control system 140 may then define a point on a surface of the target volume closest to the entry site location as the goal position. The control system 140 may then define the target axis as coinciding with a line intersecting the goal position and the entry site location.
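For a spherical target volume, the surface point closest to the entry site has a simple closed form, sketched below under that assumption; names are illustrative:

```python
import numpy as np

def goal_position_on_sphere(target_center, radius, entry_site):
    """Goal position: the point on the spherical target volume's surface
    closest to the entry site. The target axis then coincides with the line
    through this goal position and the entry site."""
    center = np.asarray(target_center, dtype=float)
    toward_entry = np.asarray(entry_site, dtype=float) - center
    return center + radius * toward_entry / np.linalg.norm(toward_entry)
```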
[0052] In various embodiments, the point on the surface of the target volume is the point on the surface closest to the entry site via which the instrument 122b is inserted. Thus, in most cases, the target position finding problem is constrained to only one solution. However, in some scenarios, this solution results in a target insertion axis that would cause the instrument 122b to pass through a restricted area (e.g., in a medical context, a portion of a patient anatomy and/or a proximity thereto) upon insertion, result in a collision between repositionable structures 120 and/or the instruments 122, or simply fall outside of the range of motion for the instrument 122 when supported by the repositionable structure 120. Thus, in these scenarios, the target position problem may be redefined to permit solutions at any point on the surface of the target volume. Accordingly, the control system 140 may introduce additional constraints such that the new set of potential solutions can be reduced to a single solution. For example, the additional constraints may include the avoidance of the restricted areas and/or collisions. In some embodiments, additional sensor data (such as image recognition from the imaging instrument 122a) is used to determine the constraints. In these embodiments, the control system 140 may redefine the target axis for the instruments 122 based upon the determined solution to the alternate target position problem.
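One plausible way to solve the relaxed problem is to sample candidate points on the target volume's surface and keep the feasible candidate nearest the entry site; the sketch below assumes a hypothetical `is_feasible` predicate standing in for the restricted-area, collision, and range-of-motion checks, which are system-specific:

```python
import numpy as np

def constrained_goal_position(target_center, radius, entry_site, is_feasible,
                              n_samples=500, seed=0):
    """Pick the feasible point on a spherical target volume's surface that is
    closest to the entry site; returns None if no sampled point satisfies the
    additional constraints."""
    rng = np.random.default_rng(seed)
    center = np.asarray(target_center, dtype=float)
    entry = np.asarray(entry_site, dtype=float)
    # Sample directions uniformly on the unit sphere, then scale to the surface.
    dirs = rng.normal(size=(n_samples, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    candidates = center + radius * dirs
    feasible = [p for p in candidates if is_feasible(p)]
    if not feasible:
        return None
    return min(feasible, key=lambda p: np.linalg.norm(p - entry))
```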
[0053] While the foregoing describes the control system 140 determining the target axis and the target position for the instrument 122b, the control system 140 may perform similar techniques to determine the target axes and target positions for any of the instruments 122 supported by respective repositionable structures 120 that are ready for targeting (e.g., ready to accept a target axis for alignment therewith). To determine that an instrument 122 supported by a repositionable structure 120 is ready for targeting, the control system 140 may verify one or more of: (i) the repositionable structure 120 has an instrument 122 mounted thereto, (ii) the repositionable structure 120 is physically coupled with a cannula supported by the cannula mount 124, (iii) the instrument 122 is not inserted past a threshold distance from an uninserted position or remote center, (iv) a distal portion or working portion of the instrument 122 is not inserted beyond a threshold distance from an uninserted position, from a remote center, from an entry site location, or from a feature of a cannula, and/or (v) a distal portion or working portion of the instrument 122 is not inserted beyond a tip of a cannula corresponding to the entry site location. The position of a distal portion or working portion of the instrument 122, or some other part of the instrument 122, can be determined through any appropriate technique, such as by image analysis to identify and locate known features of the instrument 122 to locate and orient the instrument, and/or by detecting with sensors kinematic information about the repositionable structure 120 and the instrument 122 and applying forward kinematics, etc.
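Purely for illustration, the verifications above might be collected into a single readiness predicate; the object and attribute names are hypothetical and not drawn from any actual API:

```python
def ready_for_targeting(structure):
    """Mirror the readiness verifications (i)-(v) described above as a single
    boolean check on a hypothetical repositionable-structure object."""
    return (
        structure.instrument_mounted                                     # (i)
        and structure.cannula_coupled                                    # (ii)
        and structure.insertion_depth <= structure.insertion_threshold  # (iii)/(iv)
        and not structure.distal_portion_past_cannula_tip               # (v)
    )
```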
[0054] The control system 140 may store data associated with the target axes and/or the target positions in a memory. For example, the data may include position data and/or orientation data defined with respect to a coordinate system associated with the kinematic models of the follower device 104. Accordingly, the control system may then access the stored data when, for example, determining a configuration of the joint(s) of the repositionable structures 120 and/or presenting a GUI.
[0055] In this example, the control system 140 may determine a configuration of the instruments 122 and/or the joint(s) of the repositionable structures 120 supporting the instruments 122 that are ready for targeting such that the respective instruments 122 are aligned for insertion along their respective target axes. To determine the configuration, the control system 140 may analyze one or more kinematic models of the instruments 122 and/or the repositionable structures 120 supporting the instruments 122. For example, the control system 140 may analyze the kinematic models to determine a configuration of the joints of the instruments 122 and/or the repositionable structures 120 supporting the instruments 122 such that the instruments 122 are aligned with the respective target axes. In such a pose, the instruments 122 may be inserted through their respective entry site locations to reach their respective goal positions.

[0056] The control system 140 may then command the instruments 122 and/or the repositionable structures 120 based on the determined configuration. Techniques for commanding an instrument 122 and/or a repositionable structure 120 to pivot with respect to an entry site location (sometimes referred to as “software centering”) are described in U.S. Patent No. 8,823,308, the entire disclosure of which is hereby incorporated by reference. In some embodiments, the control system 140 commands the instruments 122 and/or repositionable structures 120 toward the determined configuration in parallel. In other embodiments, the control system 140 sequentially commands the instruments 122 and/or repositionable structures 120 toward the determined configuration. For example, if the configuration requires the control system 140 to rotate multiple instruments 122 and/or repositionable structures 120 in the same direction, the current pose of a first repositionable structure 120 (typically the outermost repositionable structure along the axis of rotation) may inhibit the ability of a second instrument 122 and/or repositionable structure 120 to reach the determined configuration without collision. Accordingly, in this example, the control system 140 may command the first instrument 122 and/or repositionable structure 120 toward the configuration and, after the first instrument 122 and/or repositionable structure 120 reaches the pose associated with the configuration, command the second instrument 122 and/or repositionable structure 120 toward the configuration.
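The outermost-first sequential strategy might be sketched as follows; `base_position` and `command_fn` are illustrative placeholders for system-specific state and a blocking call that issues a structure's joint commands:

```python
import numpy as np

def command_sequentially(structures, command_fn):
    """Command repositionable structures one at a time, outermost first
    relative to the center of the cluster, to reduce the chance of collisions
    while moving toward the determined configuration."""
    positions = np.array([s.base_position for s in structures], dtype=float)
    center = positions.mean(axis=0)  # approximate center of the cluster
    outermost_first = sorted(
        structures,
        key=lambda s: -np.linalg.norm(np.asarray(s.base_position, dtype=float) - center))
    for structure in outermost_first:
        command_fn(structure)  # returns once the structure reaches its pose
```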
[0057] In embodiments where the imaging instrument 122a is used to indicate an operator-identified position for targeting, the FOV of the imaging device 126a is typically not oriented in an optimal manner for performing the procedure. For example, the operator 108 may adjust the digital zoom level and/or pose of the imaging instrument 122a in a particular manner to indicate the operator-identified position. Accordingly, the control system 140 may also command the imaging instrument 122a and/or repositionable structure 120a to adjust the pose (e.g., by retracting the imaging instrument 122a along the insertion axis) and/or the zoom level (e.g., by digitally zooming out) of the imaging device 126a such that the FOV of the imaging device 126a provides the operator a better view of the workspace. In various embodiments, the control system 140 may command the repositionable structure 120a to adjust the imaging instrument 122a prior to aligning the repositionable structures 120 with the target axes, simultaneously with commanding other repositionable structures 120, or after the other repositionable structures 120 have reached the determined configuration.

[0058] In some embodiments, the control system 140 may be configured to advance the instruments 122 along their respective target axes to their respective goal positions such that the working portions 126 are located proximate to the worksite. In other embodiments, the operator 108 interfaces with the workstation 102 and/or the input devices 106 to advance the instruments 122 along their respective target axes to their respective goal positions. In the operator-advancement scenario, the control system 140 may provide feedback to the operator 108 when the instrument 122 reaches the goal position. For example, the control system 140 may provide haptic feedback by changing the resistance on the instrument 122 as the instrument 122 approaches the goal position and/or lock the pose of the instrument 122 and/or the repositionable structure 120 that supports the instrument 122 when the instrument 122 reaches the goal position. It should be appreciated that the improved avoidance of collisions and the more efficient targeting and alignment processes do not require the instruments 122 to be inserted to achieve the described benefits.
[0059] Figure 2 depicts an example diagram 200 for defining target axes 236 for aligning instruments 122 (such as the instruments 122 of Figure 1) for entry to a workspace 242. In some embodiments, the techniques for defining the target axes 236 are performed by the control system 140 of Figure 1.
[0060] An operator (such as the operator 108 of Figure 1) may control the pose of a repositionable structure (such as the repositionable structure 120a of Figure 1) that supports an imaging instrument 222a (such as the imaging instrument 122a of Figure 1) such that the FOV of the imaging device 226a (such as the imaging device 126a of Figure 1) includes a position 232 to utilize for targeting the other instruments 222. As described above, in some embodiments, the control system may analyze image data generated by the imaging device 226a to detect the presence of an anatomical feature 244 to automatically define the position 232. In other embodiments, the operator may interact with an operator interface (e.g., a physical button or a GUI element) to identify the position 232 to the control system.
[0061] After the control system automatically defines the position 232 and/or the operator identifies the position 232, the control system may define target axes 236 for the repositionable structures that are ready for targeting. In the depicted scenario, there are two repositionable structures ready for targeting. Each of the repositionable structures may be associated with a respective entry site 230. That is, the imaging instrument 222a may be inserted through the entry site 230a, a second instrument 222b may be inserted through the entry site 230b, and a third instrument 222c may be inserted through the entry site 230c. To define a target axis 236b for the second instrument 222b, the control system may define an axis that connects the position 232 with the entry site 230b. Similarly, to define a target axis 236c for the third instrument 222c, the control system may define an axis that connects the position 232 with the entry site 230c. The control system may then determine a configuration of the repositionable structures such that a shaft of the instrument 222b is aligned with the target axis 236b and a shaft of the instrument 222c is aligned with the target axis 236c.
[0062] Additionally, in some embodiments, the control system may define goal positions 238 along the target axes 236 at which the instruments 222 are to be positioned prior to the procedure. Accordingly, the control system may define a target volume 234 centered at the target position 232. It should be appreciated that while Figure 2 depicts a spherical target volume 234, in other embodiments, the target volume 234 may have an ellipsoidal shape, a polygonal shape, a shape having convex or concave surfaces, a shape having a convex surface facing the entry site, or any other appropriate shape. For example, in alternate scenarios where a target volume 234 would include portions of the patient anatomy within its volume if a spherical and/or ellipsoidal target volume is defined, the control system may instead truncate the target volume to ensure that the target volume does not include the patient anatomy (or, in some embodiments, approach a threshold distance of the patient anatomy).
[0063] Regardless, the control system may define the goal positions 238 as the intersection points between the target axes 236 and a surface of the target volume 234 (i.e., the closest point on the target volume 234 to the respective entry site 230). Accordingly, the control system may define the goal position 238b of the instrument 222b to be the intersection of the target axis 236b and the surface of the target volume 234 and the goal position 238c of the instrument 222c to be the intersection of the target axis 236c and the surface of the target volume 234.
[0064] It should be appreciated that in the depicted scenario, the control system uses the same target volume 234 for both the instrument 222b and the instrument 222c. However, in other embodiments, the target volume 234 may be different for the instrument 222b and the instrument 222c. For instance, the operator may prefer that different types of instruments be positioned at different distances from the position 232. As one example, instruments with larger working portions may be positioned further away from the position 232 to provide additional space for other instruments to maneuver within the workspace 242. As another example, instruments that are utilized later in the procedure may be positioned further from the position 232 to maximize the space within the workspace 242 for the instruments that are utilized earlier in the procedure. Accordingly, in these embodiments, the control system may generate multiple target volumes 234 that have different radii and/or shapes based upon the particular instrument associated with the goal position 238 being defined. As a result, the control system is able to maximize the working space associated with the workspace 242 thereby providing improved working conditions for performing the procedure.
[0065] Additionally, in some embodiments, the control system may be configured to support a guided tool change mode, in which the control system facilitates the “changing” of an instrument previously mounted to a repositionable structure. This guided tool “change” mode does not require that the instrument actually be changed. For example, an instrument can be removed for cleaning and remounted, an instrument can be removed to reload the instrument with staples, clips, sutures, or other material and then remounted, and/or the like. Techniques related to guided tool change executed by a computer-assisted system are described in U.S. Patent No. 6,645,196 B1, filed June 16, 2000, and entitled “GUIDED TOOL CHANGE,” which is incorporated herein by reference. Accordingly, in embodiments that implement the guided tool change mode, when the control system detects that an instrument (either a new instrument or the prior instrument) has been mounted to a repositionable structure, the control system may analyze an indication of the mounted instrument to determine whether the goal position 238 should be updated. If the indication indicates that an instrument of a different type was mounted to the repositionable structure, the control system may re-define the goal position 238 using a target volume 234 for the new type of instrument to ensure that the new tool is re-inserted to the appropriate depth.
[0066] Figure 3 illustrates an example GUI 300 provided to an operator (such as the operator 108 of Figure 1) to facilitate a procedure using a computer-assisted system (such as the computer-assisted system 100 of Figure 1). In some embodiments, the GUI 300 is presented via a display device associated with a workstation (such as the display device 112 of the workstation 102 of Figure 1). It should be appreciated that a control system (such as the control system 140 of Figure 1), the workstation, and/or a combination thereof may generate the GUI 300 and/or the data displayed thereby.
[0067] As illustrated, the GUI 300 enables a user to visualize image data 325 generated by an imaging instrument (such as the imaging instruments 122a, 222a) and/or other imaging devices. In some scenarios, the follower device that supports the imaging instrument may include multiple imaging instruments mounted to repositionable structures. Accordingly, the GUI 300 may include an indication 320 that identifies the repositionable structure and/or the imaging instrument associated with the displayed image data 325.
[0068] In some embodiments, the GUI includes visual indicators 336 representative of target axes of instruments. For example, the control system may access kinematic data to determine a pose of the imaging instrument to identify a portion of the coordinate space within the FOV of the imaging instrument. If the stored data associated with the target axes is included within the portion of the coordinate space, the control system may generate visual indicators 336b representative of the target axes in the image data 325. As a result, the operator is able to visually see the insertion path of the instruments.
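As a sketch of how a stored target axis might be rendered into the displayed image, the following projects two points on the axis through a standard pinhole camera model; the intrinsics matrix and world-to-camera transform are assumed to come from calibration and the kinematic pose of the imaging instrument:

```python
import numpy as np

def project_axis_to_image(p0, p1, K, T_cam_world):
    """Project two 3-D points on a target axis (world frame) to pixel
    coordinates, so the axis can be drawn as a 2-D line segment overlay.
    K: 3x3 camera intrinsics; T_cam_world: 4x4 world-to-camera transform.
    A real implementation would also clip against the image bounds and FOV."""
    pts = np.array([np.append(p0, 1.0), np.append(p1, 1.0)], dtype=float)
    cam = (T_cam_world @ pts.T).T[:, :3]   # points in the camera frame
    pix = (K @ cam.T).T                    # apply intrinsics
    return pix[:, :2] / pix[:, 2:3]        # perspective divide -> (u, v) pairs
```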
[0069] In some embodiments, this may enable the operator to confirm that the automatically-defined target axes are appropriate (e.g., do not collide with patient anatomy, align with operator preferences, etc.) before the control system commands the follower device toward a configuration that aligns the instrument shafts with the target axes. If the operator is satisfied with the target axes, the operator may interface with a GUI element 346 to confirm the target axes and cause the control system to command the follower device towards the configuration. Although Figure 3 depicts the GUI element 346 as a button, any suitable GUI element may be utilized in other embodiments. In certain embodiments, one or more hardware user interface elements (e.g., physical buttons, dials, foot pedals, etc.) may be used instead of, in addition to, or concurrently with the GUI element 346 to command the follower device towards the configuration.
[0070] Figures 4A-4D illustrate an example process of a computer-assisted system 400 (such as the computer-assisted system 100 of Figure 1) aligning a follower device 404 (such as the follower device 104 of Figure 1) with target axes that were defined in accordance with techniques described elsewhere herein. It should be understood that the examples of Figures 4A-4D are not restrictive, and that other repositionable structures, instruments, behaviors, and/or the like depicted in Figures 4A-4D may be different for other computer- assisted systems, different repositionable structures, different imaging devices, different instruments, different DOFs, different procedures, and/or the like.
[0071] Figures 4A-4D depict four repositionable structures 420 (such as the repositionable structures 120 of Figure 1) all mechanically grounded to a common kinematic base. In some other examples, the repositionable structures 420 do not share such a common kinematic base, and are mounted to two or more separate carts, configured to be mounted to walls, tables, ceilings, floors independently of each other, etc. As appropriate, a control system (such as the control system 140 of Figure 1) may employ registration techniques, applied with appropriate geometric models and reference frame transformations, to locate the repositionable structures 420, and components such as instrument 422 (such as an instrument 122 of Figure 1) supported by the repositionable structures 420, relative to each other. Techniques related to registration and frame transforms include those described in U.S. Pat. 9,259,289, filed January 27, 2012 and titled “Estimation of a position and orientation of a frame used in controlling movement of a tool,” U.S. Pat. 11,534,252, filed November 13, 2018 and titled “Master/slave registration and control for teleoperation,” and U.S. Pat. Publication 2023/0028689, filed January 1, 2021 and titled “System and method for interarm registration.”
[0072] Starting with Figure 4A, the repositionable structures 420 exhibit an initial and/or default pose. As such, the instrument shafts of the instruments 422 are not aligned with their corresponding target axes. In the illustrated example, the repositionable structure 420b is mounted with an imaging instrument 422b (such as the imaging instruments 122a, 222a), and the repositionable structures 420a, 420c, and 420d are mounted with respective instruments 422a, 422c, and 422d that, in some embodiments, have been verified to be ready for targeting.
[0073] Accordingly, the control system may have previously determined a configuration at which the instrument shafts of the instruments 422 are aligned with their respective target axes. For example, the control system may determine the configuration by using inverse kinematics of the repositionable structures 420 and/or the instruments 422. More particularly, inverse kinematics may be applied to determine joint commands for the repositionable structures 420 and/or the instruments 422, such as to align the instrument shafts of the instrument with their respective target axes. In addition to using inverse kinematics, the control system may also apply image processing, geometric modelling, pre-stored parameters, real-time data regarding the environment or the system, and/or the like to determine the appropriate joint commands.
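A common generic way to turn such kinematic models into joint commands is an iterative damped least-squares (Levenberg-Marquardt style) update; the sketch below shows a single step and assumes the task-space alignment error and Jacobian are computed elsewhere. This is a textbook technique offered for illustration, not necessarily the method used by any particular system:

```python
import numpy as np

def ik_step(jacobian, task_error, damping=0.05):
    """One damped least-squares IK update:
        dq = J^T (J J^T + lambda^2 I)^(-1) e
    Iterating dq until the alignment error between the instrument shaft and
    its target axis is small yields the joint configuration to command."""
    J = np.asarray(jacobian, dtype=float)
    e = np.asarray(task_error, dtype=float)
    JJt = J @ J.T
    dq = J.T @ np.linalg.solve(JJt + (damping ** 2) * np.eye(JJt.shape[0]), e)
    return dq  # joint-space increment to apply to the current joint positions
```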
[0074] As described herein, in some embodiments the control system may sequentially command the instruments 422 and/or the repositionable structures 420 toward the determined configurations (e.g., by issuing the joint commands associated with the respective repositionable structure 420). To reduce the likelihood of collision, the control system may command the instruments and/or repositionable structures further from the center of the cluster of repositionable structures 420 (e.g., the repositionable structures 420a, 420d) prior to commanding the instruments and/or repositionable structures closer to the center of the cluster (e.g., the repositionable structure 420c).
[0075] Turning to Figure 4B, the control system has issued the joint commands to cause the instrument 422a and the repositionable structure 420a to move toward the determined configuration. Accordingly, the instrument shaft of the instrument 422a is now aligned with the target axis for the instrument 422a. The other three repositionable structures 420b, 420c, and 420d remain in the configurations shown in Figure 4A.
[0076] Turning to Figure 4C, the control system has issued the joint commands to cause the instrument 422d and the repositionable structure 420d to move toward the determined configuration. Accordingly, the instrument shaft of the instrument 422d is now also aligned with the target axis for the instrument 422d. The other three repositionable structures 420a, 420b, and 420c remain in the configurations shown in Figure 4B.
[0077] Turning to Figure 4D, the control system has issued the joint commands to cause the instrument 422c and the repositionable structure 420c to move toward the determined configuration. Accordingly, the instrument shaft of the instrument 422c is now aligned with the target axis for the instrument 422c. Thus, each of the instruments 422a, 422c, and 422d and the repositionable structures 420a, 420c, and 420d now exhibits a pose consistent with the determined configuration.
[0078] It should be appreciated that in the illustrated scenario, the imaging device of the instrument 422b was utilized to indicate an operator-identified and/or automatically defined position (such as the position 232 of Figure 2) used to define the target axes. Accordingly, the control system also issued commands to change the pose of the instrument 422b and/or the repositionable structure 420b such that a FOV of the imaging instrument mounted thereto provides a wider view of the workspace, thereby enabling the operator to see the ends of the cannulas disposed proximate to the workspace. More particularly, in the illustrated scenario, the control system issued commands to retract the instrument shaft along the insertion axis of the instrument 422b. In other embodiments, the control system may instead command a change of a zoom level of the imaging instrument to achieve a wider view of the workspace than the prior configuration. The zoom mechanism may include an optical zoom mechanism, a digital zoom mechanism, physical movement of the imaging device, and/or the like.
[0079] Figure 5 is a flow diagram of an example method 500 for performing instrument targeting in a computer-assisted system 100 in accordance with one or more embodiments. The method 500 may be performed by one or more processors executing instructions stored in one or more computer-readable media (e.g., non-volatile memory), for example, the processor system 150 of the control system 140 of Figure 1.
[0080] As described herein, the control system may be communicatively coupled to a first repositionable structure (such as the repositionable structures 120, 220, 420) configured to support a first instrument (such as the instruments 122, 222). The first repositionable structure may include a first plurality of joints.
[0081] The method 500 may begin at block 502 when the control system determines a target (such as the position 232 of Figure 2) in a workspace (such as the workspaces 242, 342) for positioning the first instrument supported by the first repositionable structure. For example, in some embodiments, the computer-assisted system may also include a second repositionable structure (such as the repositionable structures 120a, 420b) configured to support a second instrument (such as the instruments 122a, 222a), wherein the second instrument includes an imaging device. In these embodiments, the control system may determine the target using kinematic information associated with the imaging device.
[0082] As one example, the control system may determine the target by detecting an operator interaction with an input device. The input device may be a physical input device (such as a button or toggle associated with the input devices 106 of Figure 1) or a virtual input device (such as a graphical user interface element presented on a display device, such as the display device 112 of Figure 1). In this example, the control system may determine a position of the second instrument in response to detecting the operator interaction with the input device (e.g., by detecting an operator interaction with an image displayed on the display device). In some embodiments, the position is a position of a portion of the second instrument (such as a tip of the second instrument or an origin point for a field of view (FOV) of the imaging device). In other embodiments, the system locates the position of the target at a longitudinal distance along a direction of view of the second instrument (e.g., “in front of” the second instrument, such as in the example depicted in Figure 2). In still other embodiments, the position may be based on an anatomical feature (such as the anatomical feature 244 of Figure 2) within the FOV of the imaging device. In any of these embodiments, a position of the target may be determined based on the position of the second instrument.
[0083] As another example, the control system may automatically define the target based on an analysis of the image data generated by the imaging device. For example, the control system may detect an anatomical feature within the FOV of the imaging device and define the target to be at a position associated with the anatomical feature (e.g., on a surface of the anatomical feature, at a particular subfeature of the anatomical feature (such as a lesion), at an offset from the anatomical feature, etc.). In some embodiments, the control system may automatically define multiple potential targets from which the target can be selected (e.g., by detecting an operator interaction with an input device or a display device). Accordingly, depending on the example, the target may be an operator-identified or automatically defined position.
[0084] At block 504, the control system defines a first target axis (such as the target axis 236b of Figure 2) based on the target and an entry site location (such as the entry site 230b of Figure 2). For example, the control system may define the first target axis based on a line connecting the target to the entry site.
[0085] In some embodiments, the computer-assisted system includes a display device (such as the display device 112 of Figure 1). In these embodiments, the display device may be configured to display an image (such as the image data 325 of Figure 3) based upon the image data generated by the imaging device. In these embodiments, the control system may render the image to include a representation of the first target axis (such as one of the visual indicators 336 of Figure 3) within the image.
[0086] At block 506, the control system may determine a configuration of the first plurality of joints that aligns the first instrument with the first target axis such that the first instrument is able to be advanced along the first target axis. As described herein, the control system may be configured to apply kinematic and/or geometric models of the plurality of joints, imaging data, preconfigured parameters, and/or sensed real-time data to determine the configuration of the first plurality of joints.
[0087] At block 508, the control system may then command the first plurality of joints based on the determined configuration. In some embodiments, the control system may implement collision avoidance techniques when commanding the first plurality of joints. For example, the control system may determine that a current position of the first repositionable structure inhibits a second repositionable structure from aligning a second instrument supported by the second repositionable structure with a second target axis and command the first plurality of joints toward the configuration prior to commanding a plurality of joints of the second repositionable structure toward the configuration. This example is reflected by the scenario represented by Figures 4A-4D, where the control system commanded the plurality of joints of the repositionable structure 420d prior to commanding the plurality of joints of the repositionable structure 420c.
[0088] As described herein, in embodiments where the imaging device is used to identify the target, the FOV may not provide a sufficiently wide view of the workspace. Accordingly, in some embodiments, the control system may receive an indication of the first plurality of joints reaching the configuration; and in response to receiving the indication, command a zooming out of the field of view of the imaging device by causing at least one action selected from the group consisting of: commanding the second plurality of joints to retract the second instrument, commanding the second plurality of joints to move the imaging device further away from the workspace, causing an imaging sensor of the imaging device to decrease a magnification level, and causing a decrease in digital magnification of an image captured by the imaging device.
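To illustrate choosing among the zoom-out actions enumerated above, a hypothetical dispatch might look like the following; the capability flags and methods are assumptions, not an actual device API:

```python
def zoom_out_field_of_view(imaging):
    """Widen the imaging device's field of view using whichever mechanism is
    available, in a simple illustrative preference order."""
    if getattr(imaging, "supports_optical_zoom", False):
        imaging.decrease_magnification()          # optical zoom out
    elif getattr(imaging, "digital_zoom_level", 1.0) > 1.0:
        imaging.set_digital_zoom(1.0)             # remove digital magnification
    else:
        imaging.retract_along_insertion_axis()    # physically back the device out
```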
[0089] In some embodiments, the control system may perform one or more verifications prior to performing the method 500. For example, the control system may perform: a first verification to verify that the first instrument satisfies an insertion condition, the insertion condition comprising at least one criterion selected from the group consisting of the first instrument not being inserted past a threshold distance from an uninserted position and a distal portion of the first instrument not being inserted beyond a tip of a cannula corresponding to the entry site location; a second verification to verify that the first repositionable structure is physically coupled with a cannula; and/or a third verification to verify that the first instrument is mounted to the first repositionable structure. In this example, the control system may only perform the instrument targeting in response to the first instrument and/or the first repositionable structure satisfying the verifications. Additionally, in some embodiments where the operator controls the second instrument to determine the target, the control system may wait until receiving an indication that a positioning of the second instrument has completed before performing the instrument targeting.
[0090] In some embodiments, the control system may additionally determine a goal position along the first target axis indicative of a depth of insertion for the first instrument prior to performing the procedure. Accordingly, in these embodiments, the control system may identify a position on a surface defined at least partially by the operator-identified or automatically-defined position. For example, the control system may define a spherical shape centered at the target and identify a position on a surface of the spherical shape based upon the entry site location. In some embodiments, the control system determines the size and/or the shape of the surface based upon an instrument type associated with the first instrument (e.g., a typical working distance for the first instrument during the instant procedure).
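The spherical-surface construction of paragraph [0090] reduces to simple vector geometry: the point on a sphere of radius r centered at the target that is closest to the entry site lies along the line from the target toward the entry site. A minimal Python sketch, with purely illustrative coordinates:

```python
import numpy as np

def goal_on_sphere(target, entry, radius):
    """Closest point to the entry site on a sphere centered at the target."""
    direction = entry - target
    return target + radius * direction / np.linalg.norm(direction)

# Illustrative numbers: a 10 mm working distance (radius) places the goal
# 10 mm from the target, on the target-to-entry line.
goal = goal_on_sphere(target=np.array([0.0, 0.0, 0.0]),
                      entry=np.array([0.0, 0.10, 0.05]),
                      radius=0.01)
```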
[0091] In many scenarios, the control system defines a first candidate for the goal position as the position on the surface closest to the entry site location. However, in some scenarios, the control system may determine that setting the goal position to the first candidate would result in the first instrument being commanded to pass through a restricted area, or to move outside of a motion limit of the first instrument, when inserting the first instrument into the workspace. Accordingly, in these scenarios the control system may identify a second candidate as the position on the surface such that setting the goal position to the second candidate would result in the first instrument not being commanded to pass through the restricted area or to move outside of the motion limit when inserting the first instrument into the workspace. As another example, the computer-assisted system may include a third repositionable structure configured to support a third instrument. Accordingly, the control system may identify the second candidate on the surface based on detecting a potential collision involving the first target axis, the detection based on at least one parameter selected from the group consisting of: (i) a physical configuration of the third repositionable structure, (ii) an insertion axis of the third instrument, and (iii) a target axis of the third instrument.

[0092] In some embodiments, the control system may be further configured to control the first repositionable structure to advance the first instrument towards the target along the first target axis. For example, the control system may be configured to advance the first instrument until the first instrument reaches the target position. In other embodiments, the control system may be configured to provide feedback to an operator that is manually and/or semi-automatically advancing the first instrument to indicate when the first instrument has reached the target position.
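For illustration, the first-candidate/second-candidate logic of paragraph [0091] can be sketched as a feasibility-ordered search over candidate goal positions; the violates predicate below is a hypothetical stand-in for the restricted-area, motion-limit, and collision checks described above.

```python
import numpy as np

def choose_goal(candidates, entry, violates):
    """Pick the nearest feasible candidate goal position (sketch).

    violates(p) is a hypothetical predicate: True when setting the goal
    to p would route the instrument through a restricted area, past a
    motion limit, or into a potential collision.
    """
    for p in sorted(candidates, key=lambda p: float(np.linalg.norm(p - entry))):
        if not violates(p):
            return p  # closest feasible point; later ones are fallbacks
    return None  # no feasible goal on the surface
```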
[0093] One or more components of the examples discussed in this disclosure, such as control system 140, may be implemented in software for execution on one or more processors of a computer system. The software may include code that, when executed by the one or more processors, configures the one or more processors to perform various functionalities as discussed herein. The code may be stored in a non-transitory computer readable storage medium (e.g., a memory, magnetic storage, optical storage, solid-state storage, etc.). The computer readable storage medium may be part of a computer readable storage device, such as an electronic circuit, a semiconductor device, a semiconductor memory device, a read only memory (ROM), a flash memory, an erasable programmable read only memory (EPROM), a floppy diskette, a CD-ROM, an optical disk, a hard disk, or other storage device. The code may be downloaded via computer networks such as the Internet, an intranet, etc. for storage on the computer readable storage medium. The code may be executed by any of a wide variety of centralized or distributed data processing architectures. The programmed instructions of the code may be implemented as a number of separate programs or subroutines, or they may be integrated into a number of other aspects of the systems described herein. The components of the computing systems discussed herein may be connected using wired and/or wireless connections. In some examples, the wireless connections may use wireless communication protocols such as Bluetooth, near-field communication (NFC), Infrared Data Association (IrDA), home radio frequency (HomeRF), IEEE 802.11, Digital Enhanced Cordless Telecommunications (DECT), and wireless medical telemetry service (WMTS).
[0094] Various general-purpose computer systems may be used to perform one or more processes, methods, or functionalities described herein. Additionally or alternatively, various specialized computer systems may be used to perform one or more processes, methods, or functionalities described herein. In addition, a variety of programming languages may be used to implement one or more of the processes, methods, or functionalities described herein.

[0095] While certain embodiments and examples have been described above and shown in the accompanying drawings, it is to be understood that such embodiments and examples are merely illustrative, and that the disclosure is not limited to the specific constructions and arrangements shown and described, since various other alternatives, modifications, and equivalents will be appreciated by those with ordinary skill in the art.

Claims

WHAT IS CLAIMED IS:
1. A computer-assisted system comprising: a first repositionable structure configured to support a first instrument, wherein the first repositionable structure supporting the first instrument comprises a first plurality of joints; and a control system comprising one or more processors, the control system communicatively coupled to the first repositionable structure; wherein the control system is configured to perform instrument targeting by: determining a target in a workspace for positioning the first instrument supported by the first repositionable structure; defining a first target axis based on the target and an entry site location; determining a configuration of the first plurality of joints that aligns the first instrument with the first target axis such that the first instrument is able to be advanced along the first target axis; and commanding the first plurality of joints based on the determined configuration.
2. The computer-assisted system of claim 1, wherein: the computer-assisted system further comprises: a second repositionable structure configured to support a second instrument, wherein the second instrument includes an imaging device; and determining the target comprises: using kinematic information associated with the imaging device.
3. The computer-assisted system of claim 1, wherein: the computer-assisted system further comprises: a second repositionable structure configured to support a second instrument, wherein the second instrument includes an imaging device; and determining the target comprises: detecting an operator interaction with an input device; determining, in response to the operator interaction, a position of the second instrument; and determining the target based on the position of the second instrument.
4. The computer-assisted system of claim 3, wherein determining the position of the second instrument comprises: determining a position of a tip of the second instrument, or of an origin point for a field of view (FOV) of the imaging device.
5. The computer-assisted system of claim 3, wherein determining the target based on the position of the second instrument comprises: locating the target at a longitudinal distance along a direction of view of the second instrument.
6. The computer-assisted system of claim 1, wherein: the computer-assisted system further comprises: a second repositionable structure configured to support a second instrument, wherein the second instrument includes an imaging device; and determining the target comprises analyzing image data generated by the imaging device to automatically determine the target, or to automatically determine multiple potential targets from which the target can be selected.
7. The computer-assisted system of claim 6, wherein determining the target or multiple potential targets comprises: identifying an anatomical feature based on the image data.
8. The computer-assisted system of claim 1, wherein: the computer-assisted system further comprises: a display device configured to display an image based upon image data generated by an imaging device; and determining the target comprises: detecting an operator interaction with the image displayed on the display device; and determining the target based on the operator interaction.
9. The computer-assisted system of claim 1, wherein defining the first target axis based on the target and an entry site location comprises: defining a surface or volume based on the target; identifying a goal position on the surface or in the volume; and defining the first target axis to intersect the goal position and the entry site location.
10. The computer-assisted system of claim 9, wherein a shape of the surface or volume, or a size of the surface or volume, is determined based upon an instrument type associated with the first instrument.
11. The computer-assisted system of claim 9, wherein identifying the goal position comprises: identifying a first candidate as a position on the surface closest to the entry site location; making a determination that setting the goal position to the first candidate would result in: the first instrument being commanded to pass through a restricted area or to move outside of a motion limit of the first instrument, when inserting the first instrument into the workspace; identifying, in response to making the determination, a second candidate on the surface, wherein setting the goal position to the second candidate would result in: the first instrument not being commanded to pass through the restricted area or to move outside of the motion limit, when inserting the first instrument into the workspace; and setting the goal position to the second candidate.
12. The computer-assisted system of claim 1, wherein defining the first target axis based on the target and the entry site location comprises: defining a spherical shape centered at the target; identifying a goal position on a surface of the spherical shape based upon the entry site location; and defining the first target axis to intersect the goal position and the entry site location.
13. The computer-assisted system of claim 12, wherein identifying the goal position comprises: defining the goal position as a position on the surface closest to the entry site location.
14. The computer-assisted system of any of claims 1 to 13, wherein: the computer-assisted system further comprises: a second repositionable structure configured to support a second instrument, wherein the second instrument includes an imaging device; and the control system is configured to perform the instrument targeting in response to receiving an indication that a positioning of the second instrument has completed.
15. The computer-assisted system of any of claims 1 to 13, wherein: the computer-assisted system further comprises: a second repositionable structure configured to support a second instrument, wherein the second instrument includes an imaging device, and wherein the second repositionable structure supporting the second instrument comprises a second plurality of joints; and the control system is configured to perform instrument targeting further by: receiving an indication of the first plurality of joints reaching the determined configuration; and causing, in response to receiving the indication, a zooming out of a field of view of the imaging device by at least one action selected from the group consisting of: commanding the second plurality of joints to retract the second instrument, commanding the second plurality of joints to move the imaging device further away from the workspace, causing an imaging sensor of the imaging device to decrease a magnification level, and causing a decrease in digital magnification of an image captured by the imaging device.
16. The computer-assisted system of any of claims 1 to 13, further comprising: a second repositionable structure configured to support a second instrument, wherein the second instrument includes an imaging device; and a display device configured to display an image based upon image data captured by the imaging device, wherein the control system is further configured to: render the image to include a representation of the first target axis.
17. The computer-assisted system of any of claims 1 to 13, further comprising: a third repositionable structure configured to support a third instrument, wherein defining the first target axis comprises: defining the first target axis based on at least one parameter selected from the group consisting of: (i) a physical configuration of the third repositionable structure, (ii) an insertion axis of the third instrument, and (iii) a target axis of the third instrument.
18. The computer-assisted system of any of claims 1 to 13, wherein the control system is further configured to: perform a first verification to verify that the first instrument satisfies an insertion condition, the insertion condition comprising at least one criterion selected from the group consisting of: the first instrument not being inserted past a threshold distance from an uninserted position, and a distal portion of the first instrument not being inserted beyond a tip of a cannula corresponding to the entry site location; and perform the instrument targeting based on the first verification.
19. The computer-assisted system of any of claims 1 to 13, wherein the control system is further configured to: perform a second verification to verify that the first repositionable structure is physically coupled with a cannula; and perform the instrument targeting based on the second verification.
20. The computer-assisted system of any of claims 1 to 13, wherein the control system is further configured to: perform a third verification to verify that the first instrument is mounted to the first repositionable structure; and perform the instrument targeting based on the third verification.
21. The computer-assisted system of any of claims 1 to 13, wherein commanding the first plurality of joints comprises: determining that a current configuration of the first repositionable structure inhibits a second repositionable structure from aligning a second instrument supported by the second repositionable structure with a second target axis; and commanding the first plurality of joints toward the configuration prior to commanding an alignment of the second instrument with the second target axis.
22. The computer-assisted system of any of claims 1 to 13, wherein the control system is configured to perform the instrument targeting further by: controlling the first repositionable structure to advance the first instrument along the first target axis.
23. A method of performing instrument targeting using a computer-assisted system that includes (i) a first repositionable structure configured to support a first instrument, wherein the first repositionable structure supporting the first instrument comprises a first plurality of joints; and (ii) a control system comprising one or more processors, the control system communicatively coupled to the first repositionable structure, wherein the method comprises: determining, via the control system, a target in a workspace for positioning the first instrument supported by the first repositionable structure; defining, via the control system, a first target axis based on the target and an entry site location; determining, via the control system, a configuration of the first plurality of joints that aligns the first instrument with the first target axis such that the first instrument is able to be advanced along the first target axis; and commanding, via the control system, the first plurality of joints based on the determined configuration.
24. The method of claim 23, wherein the computer-assisted system further includes a second repositionable structure configured to support a second instrument, wherein the second instrument includes an imaging device, and determining the target comprises: using kinematic information associated with the imaging device.
25. The method of claim 23, wherein the computer-assisted system further includes a second repositionable structure configured to support a second instrument, wherein the second instrument includes an imaging device, and determining the target comprises: detecting, via the control system, an operator interaction with an input device; determining, in response to the operator interaction, a position of the second instrument; and determining, via the control system, the target based on the position of the second instrument.
26. The method of claim 25, wherein determining the position of the second instrument comprises: determining, via the control system, a position of a tip of the second instrument, or of an origin point for a field of view (FOV) of the imaging device.
27. The method of claim 25, wherein determining the target based on the position of the second instrument comprises: locating, via the control system, the target at a longitudinal distance along a direction of view of the second instrument.
28. The method of claim 23, wherein the computer-assisted system further includes a second repositionable structure configured to support a second instrument, wherein the second instrument includes an imaging device, and determining the target comprises: analyzing, via the control system, image data generated by the imaging device to automatically determine the target, or to automatically determine multiple potential targets from which the target can be selected.
29. The method of claim 28, wherein determining the target or multiple potential targets comprises: identifying, via the control system, an anatomical feature based on the image data.
30. The method of claim 23, wherein the computer-assisted system further comprises: a display device configured to display an image based upon image data generated by an imaging device, and determining the target comprises: detecting, via the control system, an operator interaction with the image displayed on the display device; and determining, via the control system, the target based on the operator interaction.
31. The method of claim 23, wherein defining the first target axis based on the target and an entry site location comprises: defining, via the control system, a surface or volume based on the target; identifying, via the control system, a goal position on the surface or in the volume; and defining, via the control system, the first target axis to intersect the goal position and the entry site location.
32. The method of claim 31, wherein a shape of the surface or volume, or a size of the surface or volume, is determined based upon an instrument type associated with the first instrument.
33. The method of claim 31, wherein identifying the goal position comprises: identifying, via the control system, a first candidate as a position on the surface closest to the entry site location; making a determination, via the control system, that setting the goal position to the first candidate would result in: the first instrument being commanded to pass through a restricted area or to move outside of a motion limit of the first instrument, when inserting the first instrument into the workspace; identifying, in response to making the determination, a second candidate on the surface, wherein setting the goal position to the second candidate would result in: the first instrument not being commanded to pass through the restricted area or to move outside of the motion limit, when inserting the first instrument into the workspace; and setting, via the control system, the goal position to the second candidate.
34. The method of claim 23, wherein defining the first target axis based on the target and the entry site location comprises: defining, via the control system, a spherical shape centered at the target; identifying, via the control system, a goal position on a surface of the spherical shape based upon the entry site location; and defining, via the control system, the first target axis to intersect the goal position and the entry site location.
35. The method of claim 34, wherein identifying the goal position comprises: defining, via the control system, the goal position as a position on the surface closest to the entry site location.
36. The method of any of claims 23 to 35, wherein the computer-assisted system further comprises a second repositionable structure configured to support a second instrument, wherein the second instrument includes an imaging device, and the method comprises: performing, via the control system, the instrument targeting in response to receiving an indication that a positioning of the second instrument has completed.
37. The method of any of claims 23 to 35, wherein the computer-assisted system further comprises a second repositionable structure configured to support a second instrument, wherein the second instrument includes an imaging device, and wherein the second repositionable structure supporting the second instrument comprises a second plurality of joints; and the method further comprises: receiving, via the control system, an indication of the first plurality of joints reaching the determined configuration; and causing, in response to receiving the indication, a zooming out of a field of view of the imaging device by at least one action selected from the group consisting of: commanding the second plurality of joints to retract the second instrument, commanding the second plurality of joints to move the imaging device further away from the workspace, causing an imaging sensor of the imaging device to decrease a magnification level, and causing a decrease in digital magnification of an image captured by the imaging device.
38. The method of any of claims 23 to 35, wherein the computer-assisted system further comprises (i) a second repositionable structure configured to support a second instrument, wherein the second instrument includes an imaging device, and (ii) a display device configured to display an image based upon image data captured by the imaging device, wherein the method further comprises: rendering, via the control system, the image to include a representation of the first target axis.
39. The method of any of claims 23 to 35, wherein the computer-assisted system further comprises a third repositionable structure configured to support a third instrument, and defining the first target axis comprises: defining, via the control system, the first target axis based on at least one parameter selected from the group consisting of: (i) a physical configuration of the third repositionable structure, (ii) an insertion axis of the third instrument, and (iii) a target axis of the third instrument.
40. The method of any of claims 23 to 35, further comprising: performing, via the control system, a first verification to verify that the first instrument satisfies an insertion condition, the insertion condition comprising at least one criterion selected from the group consisting of: the first instrument not being inserted past a threshold distance from an uninserted position, and a distal portion of the first instrument not being inserted beyond a tip of a cannula corresponding to the entry site location; and performing, via the control system, the instrument targeting based on the first verification.
41. The method of any of claims 23 to 35, further comprising: performing, via the control system, a second verification to verify that the first repositionable structure is physically coupled with a cannula; and performing, via the control system, the instrument targeting based on the second verification.
42. The method of any of claims 23 to 35, further comprising: performing, via the control system, a third verification to verify that the first instrument is mounted to the first repositionable structure; and performing, via the control system, the instrument targeting based on the third verification.
43. The method of any of claims 23 to 35, wherein commanding the first plurality of joints comprises: determining, via the control system, that a current configuration of the first repositionable structure inhibits a second repositionable structure from aligning a second instrument supported by the second repositionable structure with a second target axis; and commanding, via the control system, the first plurality of joints toward the configuration prior to commanding an alignment of the second instrument with the second target axis.
44. The method of any of claims 23 to 35, further comprising: controlling, via the control system, the first repositionable structure to advance the first instrument along the first target axis.
45. One or more non-transitory machine-readable media comprising a plurality of machine-readable instructions which, when executed by a control system associated with a computer-assisted system, are adapted to cause the control system to perform the method of any of claims 23 to 44.

Applications Claiming Priority (2)

US 202363585912P, priority date 2023-09-27, filing date 2023-09-27
US 63/585,912, priority date 2023-09-27

Publications (1)

WO2025072418A1 (en), published 2025-04-03

Family

Family ID: 93333624
Family application: PCT/US2024/048529 (pending), published as WO2025072418A1 (en), "Aligning an instrument supported by a computer-assisted system"
Country of publication: WO



Legal Events

121 EP: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 24799348; Country of ref document: EP; Kind code of ref document: A1)