
WO2025076175A1 - System and method for collision detection using detected joint displacement - Google Patents

System and method for collision detection using detected joint displacement

Info

Publication number
WO2025076175A1
WO2025076175A1 · PCT/US2024/049732 · US2024049732W
Authority
WO
WIPO (PCT)
Prior art keywords
usage situation
cart
drivable
computer
joint
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/049732
Other languages
English (en)
Inventor
Pavel Chtcheprov
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intuitive Surgical Operations Inc
Original Assignee
Intuitive Surgical Operations Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intuitive Surgical Operations Inc filed Critical Intuitive Surgical Operations Inc
Publication of WO2025076175A1
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B34/37 Leader-follower robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/03 Automatic limiting or abutting means, e.g. for safety
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1674 Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676 Avoiding collision or forbidden zones
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1689 Teleoperation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2059 Mechanical position encoders
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06 Measuring instruments not otherwise provided for
    • A61B2090/064 Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06 Measuring instruments not otherwise provided for
    • A61B2090/064 Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
    • A61B2090/066 Measuring instruments not otherwise provided for for measuring torque
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/371 Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39082 Collision, real time collision avoidance

Definitions

  • FIG. 2 depicts a drivable cart in accordance with one or more embodiments.
  • FIG. 3 depicts a primary control interface for a drivable cart in accordance with one or more embodiments.
  • ordinal numbers (e.g., first, second, third, etc.) may be used as an adjective for an element (i.e., any noun in the application).
  • the use of ordinal numbers is not to imply or create any particular ordering of the elements nor to limit any element to being only a single element unless expressly disclosed, such as using the terms “before,” “after,” “single,” and other such terminology. Rather, the use of ordinal numbers is to distinguish between the elements.
  • a first element is distinct from a second element, and the first element can encompass more than one element and succeed (or precede) the second element in an ordering of elements.
  • any component described with regard to a figure in various embodiments disclosed herein may be equivalent to one or more like-named components described with regard to any other figure.
  • descriptions of these components are not necessarily repeated with regard to each figure.
  • each and every embodiment of the components of each figure is incorporated by reference and assumed to be optionally present within every other figure having one or more like-named components.
  • any description of the components of a figure is to be interpreted as an optional embodiment which may be implemented in addition to, in conjunction with, or in place of the embodiments described with regard to a corresponding like-named component in any other figure.
  • the instruments, systems, and methods described herein may be used for humans, animals, portions of human or animal anatomy, industrial systems, general robotic, or teleoperated systems.
  • the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, sensing or manipulating non-tissue work pieces, cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, setting up or taking down systems, training medical or non-medical personnel, and/or the like.
  • Additional example applications include use for procedures on tissue removed from human or animal anatomies (with or without return to a human or animal anatomy) and for procedures on human or animal cadavers.
  • An image of the worksite may be obtained by an imaging instrument including an imaging device (e.g., an endoscope, an optical camera, an ultrasonic probe, etc. in a medical example).
  • the imaging instrument can be used for imaging the worksite, and may be manipulated by a manipulator arm (e.g., first manipulator arm (112)) of the robotic manipulation system (102) so as to position and orient the imaging instrument.
  • the auxiliary system (106) may process the captured images in a variety of ways prior to any subsequent display on a display monitor (121).
  • the auxiliary system (106) may overlay the captured images with a virtual control interface prior to displaying the combined images to the operator (e.g., first clinician (101)) via the user control system (104) or other display systems located locally or remotely from the procedure.
  • One or more separate displays may also be coupled with a computing system (e.g., a control system) and/or the auxiliary system (106) for local and/or remote display of images, such as images of the procedure site, or other related images.
  • the computer-assisted system (100) may include one or more control systems (142).
  • a control system (142) may be used to process input provided by the user control system (104) from an operator (e.g., first clinician (101)).
  • a control system (142) may further be used to provide an output, e.g., a video image for display.
  • One or more control systems (142) may further be used to control aspects of the at least one manipulator arm, such as movement between kinematic configurations, braking or stopping of the at least one manipulator arm, and collision detection.
  • the control system (142) may be disposed within, or otherwise considered part of, the robotic manipulation system (102), the user control system (104), and/or the auxiliary system (106).
  • a control system (142) may be connected to or be a part of a network.
  • the network may include multiple nodes. Each node may correspond to a control system, or a group of nodes.
  • embodiments of the disclosure may be implemented on a node of a distributed system that is connected to other nodes.
  • embodiments of the disclosure may be implemented on a distributed computing system having multiple nodes, where each portion of the disclosure may be located on a different node within the distributed computing system.
  • one or more elements of the aforementioned control system(s) (142) may be located at a remote location and connected to the other elements over a network.
  • the primary control interface (111) can alter the speed of the drivable cart (202), when moved, according to the sensed pressure or displacement.
  • the speed of the drivable cart (202) is determined based on the sensed pressure or displacement of the one or more drive switches.
  • the drivable cart further includes a speedometer that measures the frequency of rotation of at least one wheel included in the drivable cart (202). As such, the speed of the drivable cart (202) may be determined given the frequency of rotation of the at least one wheel along with knowledge of the diameter of the at least one wheel.
  • An upper speed limit may be placed on the speed of the drivable cart (202).
  • the upper speed limit may originate from a physical limitation of the drive mechanism of the drivable cart (202) (e.g., motor limitations) or may be governed according to a speed threshold.
  • the speed threshold may be configured according to a usage situation of the robotic manipulation system (102).
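  • To make the speed computation and limiting concrete, consider the following Python sketch (a minimal illustration; the function names and numeric speed limits are assumptions, not values from this disclosure), which derives cart speed from wheel rotation and clamps it to a usage-situation speed threshold:

```python
import math

# Assumed per-usage-situation speed limits (m/s); the disclosure leaves
# these configurable and does not specify numeric values.
SPEED_LIMITS = {
    "transport": 1.0,  # first usage situation: between-room transport
    "setup": 0.5,      # second usage situation: in-room setup
    "docking": 0.1,    # third usage situation: docking approach
}

def cart_speed(rotation_hz: float, wheel_diameter_m: float) -> float:
    """Speed from wheel rotation frequency and wheel diameter: v = f * pi * d."""
    return rotation_hz * math.pi * wheel_diameter_m

def commanded_speed(requested_mps: float, usage_situation: str) -> float:
    """Clamp a requested speed to the threshold for the current usage situation."""
    return min(requested_mps, SPEED_LIMITS[usage_situation])

# Example: a 0.2 m wheel turning at 1.5 Hz gives ~0.94 m/s, which the
# docking usage situation caps at 0.1 m/s.
print(commanded_speed(cart_speed(1.5, 0.2), "docking"))  # 0.1
```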
  • the primary control interface (111) can include a touchscreen (310).
  • the touchscreen (310) can display a menu (not shown) providing additional control functionality of the drivable cart (202) to a user.
  • the touchscreen (310) may be used to, among other things, select a usage situation of the robotic manipulation system (102).
  • the touchscreen (310) may provide visual feedback to a user.
  • the primary control interface (111) may provide other suitable output mechanisms such as speakers, actuators, LEDs, buzzers, etc., for visual, audio, and/or other types of feedback (e.g., audible feedback, haptic feedback, etc.).
  • operation of the manipulator arm (400) includes control of the position of the distal end (401) and/or a tool and end effector attached thereon relative to the base (402).
  • position encompasses both location and orientation (e.g., translation and rotation).
  • an instrument or tool may be positioned and manipulated through an entry location, so that a kinematic remote center is maintained at the entry location.
  • non-driven joints do not have a motor and are positioned manually such as through physical contact by an operator. That said, non-driven joints may be configured for gravity compensation and may have an actuator. Additionally, in some embodiments, non-driven joints include a joint sensor to determine the state of the joint (e.g., position, velocity, acceleration) and a braking system (or, more simply, a “brake”) to maintain a stationary position of the joint relative to the base (402) or relative to a link proximal to the joint. An example brake of a non-driven joint is described later in the instant disclosure. Driven joints, in contrast, generally include a control board and a joint actuator, the joint actuator including all necessary components for operating and stopping the joint.
  • a joint actuator can include a motor, an encoder, gears, and/or a brake.
  • non-driven joints are manipulated to adjust a position of the kinematic remote center. Once a kinematic remote center is established, non-driven joints are fixed in place (e.g., through use of their brakes) and the driven joints are controlled to manipulate the distal end (401) of the manipulator arm (400) and/or end effector (403) while maintaining the kinematic remote center.
  • the first, second, third, and fourth joints (J1, J2, J3, and J4) of the example manipulator arm (400) of FIG. 4A may be considered non-driven joints while the remaining joints are driven joints.
  • the kinematic configuration of the manipulator arm (400) is completely specified through the state of the joints in view of the relative positions and lengths/sizes of the intervening links, if any.
  • the kinematic configuration of the manipulator arm (400) may further include state information for position components external to the manipulator arm such as a boom (113) and support column (204). Operation of the manipulator arm (400), which primarily consists of controlling the manipulator arm (400) through a continuous space of joint states, is governed by a control system (142).
  • the control system (142) may perform at least some of the calculations to determine the kinematic configuration of the manipulator arm (400) and the sequence and timing of joint states that should be undertaken to achieve a desired kinematic configuration. Often, there will be many potential sequences of joint states that can achieve a desired kinematic configuration such that the control system (142) may also apply a constrained optimization routine to select a sequence of joint states (or select a path of traversal through a space of joint states) that is safe (i.e., avoids collisions between joints and links of the manipulator arm (400)) while optimizing some predefined criterion/criteria (e.g., a minimum length path through the space of joint states, maximizing the range of motion of all available joints upon achieving the desired kinematic configuration, etc.).
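  • The path-selection logic can be pictured with a small sketch. The following Python fragment (an illustrative simplification under assumed names, not the control system's actual optimization routine) scores candidate joint-state sequences, discards any that pass through a colliding state, and picks the shortest safe path:

```python
def path_length(path):
    """Total joint-space distance along a sequence of joint-state tuples."""
    return sum(
        sum(abs(b - a) for a, b in zip(p, q))
        for p, q in zip(path, path[1:])
    )

def select_path(candidates, collides):
    """Pick the minimum-length candidate whose states are all collision-free.

    `candidates` is an iterable of joint-state sequences; `collides` is a
    predicate over a single joint state, assumed to be supplied by a
    separate collision checker.
    """
    safe = [p for p in candidates if not any(collides(s) for s in p)]
    if not safe:
        raise RuntimeError("no collision-free path among candidates")
    return min(safe, key=path_length)

# Toy example with two joints and an obstacle around (0.5, 0.5): the direct
# path passes through a colliding state, so a detour is selected instead.
collides = lambda s: 0.4 < s[0] < 0.6 and 0.4 < s[1] < 0.6
candidates = [
    [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)],  # move joint 1 first (safe)
    [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0)],  # move joint 2 first (safe)
    [(0.0, 0.0), (0.5, 0.5), (1.0, 1.0)],  # direct, passes colliding state
]
print(select_path(candidates, collides))
```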
  • FIG. 4B depicts a simplified schematic view (450) of the example manipulator arm (400) depicted in FIG. 4A.
  • in the simplified schematic view, rotary joints are represented as cylinders, prismatic joints are represented as dampers, and links are represented as solid lines with increased line thickness.
  • Each joint defines a joint axis.
  • the simplified schematic view (450) illustrates a joint axis for each joint, whether an axis of rotation (e.g., first axis of rotation (451)) or an axis of translation (e.g., first axis of translation (452)).
  • the first joint (J1) is directly connected to a base (402) (e.g., the distal end of the boom (113)).
  • the second joint (J2), which is an axial or prismatic joint, has a proximal end in direct contact with the first joint (J1) and is coupled to the proximal end of the third joint (J3), through the first link (L1), on its distal end.
  • the first link (L1) alters the direction of the kinematic chain of joints and links of the manipulator arm (400).
  • the second link (L2) extends distally from the third joint (J3) and is connected to the fourth joint (J4) on its distal end.
  • the joints (e.g., non-driven joints) of a manipulator arm may include a brake system (“brake”).
  • the present disclosure places no limitation on the type or configuration of a brake so long as a variable braking force may be applied.
  • braking force is used to describe the general holding force applied by a brake system to an associated joint to prevent movement of said joint (rotation or translation), regardless of whether the joint is rotary or prismatic.
  • applied torques or moments may be referred to simply as applied forces.
  • the rotor (504) resides between the armature (508) and the plate (510), and, when no braking force is applied or when a sufficient force is applied to overcome the braking force, can rotate relative to the stator (502) about a first rotational axis (516). Without power, the spring (512) of the spring actuated electromagnetic brake system (500) pushes the armature (508) into the rotor (504) and the rotor (504) is consequently pushed into the plate (510).
  • the medical facility includes a reception area (602), a hallway (604), a first operating room (606), a second operating room (608), and two auxiliary rooms (610) (e.g., storage areas, offices, etc.).
  • the first usage situation consists of navigating the robotic manipulation system (102) from a first position (102-1) to a second position (102-2) along a first path (614), where the first path (614) resides in, or traverses, at least partially, an area outside of an operating room (e.g., hallway (604)).
  • the robotic manipulation system (102) is being transported from the first operating room (606) to the second operating room (608).
  • FIG. 6 also depicts a person (612) present in the hallway (604).
  • the second usage situation, or setup movement of the robotic manipulation system (102), may generally be defined as the positioning of the robotic manipulation system (102) within an operating room (700) in preparation for an upcoming procedure or operation.
  • the setup movement of the robotic manipulation system (102) may include transporting the robotic manipulation system (102) from a third position (102-3), in the operating room (700), to a fourth position (102-4) approaching the operating table (119).
  • FIG. 7 depicts the transportation of the robotic manipulation system (102) along a second path (702) that avoids the various obstacles in the operating room (700).
  • the setup movement of the robotic manipulation system (102) may further include draping one or more components (e.g., manipulator arm) of the robotic manipulation system (102).
  • Draping, in general, consists of wrapping or enclosing portions of the robotic manipulation system (102) with one or more drapes to provide a sterile interface between components of the robotic manipulation system (102), such as a manipulator arm, and a patient. After the draping, the manipulator arms may remain in an extended configuration or may be returned to a stowed (folded) configuration.
  • FIG. 8 depicts a third usage situation for the robotic manipulation system (102).
  • the third usage situation may be described as a pre-procedure movement or docking approach of the robotic manipulation system (102).
  • the third usage situation may include transportation of the robotic manipulation system (102) from the fourth position (102-4), established during the second usage situation, to a fifth position (102-5), where the fifth position (102-5) is proximate the operating table (119).
  • the fifth position (102-5) is such that, once positioned, a computer-assisted medical procedure may be performed on a patient with use of the robotic manipulation system (102) (e.g., by the at least one manipulator arm and associated instrument, if any) without further movement of the robotic manipulation system (102) as a whole.
  • the robotic manipulation system (102) may traverse a third path (802) towards the operating table (119).
  • the pre-procedure movement or docking approach may further include draping one or more components (e.g., manipulator arm) of the drivable cart (e.g., when draping is not performed during the second usage situation).
  • the pre-procedure movement of the robotic manipulation system may further include an initial or approximate positioning of the at least one manipulator arm including extension of the support column (204), extension of the boom (113), and manipulation of one or more joints (e.g., non-driven joints) of a manipulator arm such that a meaningful kinematic remote center can be obtained and maintained for the procedure to be performed.
  • the computer-assisted system may be said to include at least one sensor, where the at least one sensor may be a camera (of an included camera system), a drape sensor, a LiDAR sensor (of an included LiDAR system), or an ultrasonic sensor (of an included ultrasonic system).
  • the sensor data (e.g., images) may be used for environment classification, object detection and classification, object avoidance (e.g., object proximity determination), and collision prediction.
  • environment classification consists of selecting a class, from a predefined set of classes, that describes the external surroundings or environment of the robotic manipulation system (102) of the computer-assisted system (100).
  • the pre-defined set of classes includes an “operating room environment” (i.e., within an operating room) and a “non-operating room environment” (e.g., hallway, storage room, etc.).
  • the pre-defined set of classes includes an “operating room environment,” a “non-operating room environment,” and “an unknown or other environment.”
  • a benefit to the latter pre-defined set of classes is that ambiguous environments can be indicated without forcing an assignment to either an operating room environment or non-operating room environment.
  • Object detection and classification includes both detecting the presence of an object in space (i.e., object localization) and assigning a class to the object where the class describes the object.
  • herein, the term “object” is broadly defined to include items conventionally considered objects (“generic objects”), such as walls, tables, chairs, plants, etc., but also people.
  • the object detection and classification task can further classify people as “patient” or “non-patient.”
  • object avoidance encompasses the simultaneous self-locating and mapping of the surrounding environment of the robotic manipulation system (102), a process known as simultaneous localization and mapping (SLAM).
  • SLAM is a method used principally by autonomous vehicles to map (spatially orient surrounding objects) and localize the vehicle in that map at the same time.
  • a SLAM algorithm may be used to plan a path for the robotic manipulation system (102) to travel in order to safely arrive at a desired destination (e.g., in an operating room).
  • the SLAM algorithm can run in real-time or near real-time. Methods used by the SLAM algorithm may include, but are not limited to, particle filters, extended Kalman filters, and covariance intersection.
  • Object proximity determination includes determining, or at least approximating, the distance of one or more objects from the robotic manipulation system (102).
  • object proximity determination is performed through the joint use of an object segmentation process or model and a depth estimator.
  • a depth estimator, given an image, returns a depth field for the given image.
  • the depth field indicates the distance, or depth, of every pixel in the image from the acquiring camera.
  • the object segmentation process similarly processes a given image and segments the image into one or more objects.
  • a segmented object can be assigned a depth (or distance) by aggregating the depth field of the segmented object, where aggregation may include an average, minimum, or maximum operator applied pixelwise over the pixels enclosed by the segmented object.
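  • As a concrete illustration of this aggregation step, the following numpy sketch (illustrative only; the array names and the choice of a minimum-depth default are assumptions, not specified in this disclosure) assigns a segmented object a single distance by aggregating its per-pixel depths:

```python
import numpy as np

def object_depth(depth_field: np.ndarray,
                 object_mask: np.ndarray,
                 reduce: str = "min") -> float:
    """Aggregate per-pixel depth over a segmented object's mask.

    depth_field: HxW array of per-pixel distances from the acquiring camera.
    object_mask: HxW boolean array marking the segmented object's pixels.
    reduce: 'min' (closest point, conservative), 'mean', or 'max'.
    """
    pixels = depth_field[object_mask]
    ops = {"min": np.min, "mean": np.mean, "max": np.max}
    return float(ops[reduce](pixels))

# Toy 3x3 example: the object occupies the left column at 1.2-1.4 m,
# in front of a background at 5.0 m.
depth = np.array([[1.2, 5.0, 5.0],
                  [1.3, 5.0, 5.0],
                  [1.4, 5.0, 5.0]])
mask = np.array([[True, False, False],
                 [True, False, False],
                 [True, False, False]])
print(object_depth(depth, mask))          # 1.2 (closest point)
print(object_depth(depth, mask, "mean"))  # 1.3
```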
  • object proximity determination is performed using an ultrasonic system or a LiDAR system, where at least the source and receiver of such systems are disposed on the robotic manipulation system (102).
  • object proximity detection is performed using a stereoscopic technique with two or more cameras, where the two or more cameras may be disposed on the robotic manipulation system (102) or within a surrounding environment in view of the robotic manipulation system (102).
  • Collision prediction includes determining whether or not the robotic manipulation system (102) will collide with an object (e.g., wall, table, person: patient, nonpatient) given a current trajectory of the robotic manipulation system (102) and/or a given planned path that the robotic manipulation system (102) is traversing or is to traverse. For example, if the robotic manipulation system (102) is known to be proceeding toward an object, the distance to the object (i.e., object proximity determination) and the speed of the drivable cart (e.g., determined using the pressure sensor or speedometer) can be used to calculate a time to collision. If the time to collision is less than a pre-defined collision time threshold, then a collision may be predicted to be imminent. In such a case, the drivable cart may be automatically stopped, or slowed, to avoid collision.
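  • A minimal sketch of this time-to-collision check (the threshold value and function names are assumptions; the disclosure leaves the collision time threshold configurable) follows:

```python
COLLISION_TIME_THRESHOLD_S = 2.0  # assumed value for illustration

def collision_imminent(distance_m: float, speed_mps: float) -> bool:
    """Predict an imminent collision from object distance and cart speed."""
    if speed_mps <= 0.0:
        return False  # cart stationary: no collision predicted on this basis
    time_to_collision_s = distance_m / speed_mps
    return time_to_collision_s < COLLISION_TIME_THRESHOLD_S

# Example: an object 1.5 m ahead with the cart moving at 1.0 m/s gives a
# 1.5 s time to collision, under the 2.0 s threshold, so the drivable cart
# would be automatically stopped or slowed.
if collision_imminent(1.5, 1.0):
    print("imminent collision predicted: stop or slow the drivable cart")
```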
  • Embodiments of the disclosure control one or more of the brakes, for example, of the non-driven joint(s) of a manipulator arm (e.g., manipulator arm (400)) to facilitate detecting a collision between the manipulator arm and an object (e.g., a wall, a person, an equipment item).
  • a collision may occur, for example when the entire robotic manipulation system (102) is moved, but also under other conditions.
  • Embodiments disclosed herein relate to a system and method for modulating the braking force of a brake in a non-driven joint dynamically, and in real-time, based on at least one of: a usage situation, a kinematic configuration of the manipulator arm (including the configuration of the boom (113) and support column (204) of the robotic manipulation system (102)), and a speed of the drivable cart (e.g., determined using a pressure sensor associated with a drive switch of the primary control interface (111)).
  • modulation of the braking force may further depend on the detection of a patient and/or a determined proximity of the robotic manipulation system (102), or any of its components (e.g., manipulator arm), to the patient, if detected.
  • the movement threshold of a non-driven joint is dependent on the location of the non-driven joint on the manipulator arm as well as the kinematic configuration of the manipulator arm. For example, a relatively proximal non-driven joint may have a smaller movement threshold than a more distal non-driven joint.
  • the movement threshold of a non-driven joint may be set to zero, or otherwise the smallest sensed joint movement as defined by the joint sensor.
  • in some embodiments, the brake of a non-driven joint is set to a braking force below the braking force that is generated when the brake is fully engaged. However, it may be desirable to set the braking force to at least a prespecified braking force such that displacement of the non-driven joint does not occur under normal movement of the robotic manipulation system (102), or of the associated manipulator arm of the non-driven joint, due to inertial loads, for example, while moving and stopping the drivable cart (202).
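  • To make the modulation idea concrete, here is a simplified Python sketch (the linear form, scaling factors, names, and the assumed joint interface are all illustrative; the disclosure describes the dependence only qualitatively) that derives a per-joint force/torque threshold from the usage situation, the joint's position along the arm, and the cart speed:

```python
# Assumed baseline thresholds (normalized units) per usage situation:
# faster, less delicate situations tolerate larger forces before yielding.
BASELINE_THRESHOLD = {"transport": 1.0, "setup": 0.6, "docking": 0.3}

def force_torque_threshold(usage: str,
                           joint_index: int,
                           n_joints: int,
                           cart_speed: float,
                           speed_limit: float) -> float:
    """Illustrative threshold model: more distal joints yield more easily,
    and a faster cart raises the threshold to mask inertial loads."""
    distal_factor = 1.0 - 0.5 * (joint_index / max(n_joints - 1, 1))
    speed_factor = 1.0 + 0.5 * min(cart_speed / speed_limit, 1.0)
    return BASELINE_THRESHOLD[usage] * distal_factor * speed_factor

def modulate_brakes(joints, usage, cart_speed, speed_limit):
    """Apply the current threshold as each non-driven joint's braking force.

    `joints` is assumed to expose set_braking_force(); this interface is a
    placeholder, not an API from the disclosure.
    """
    for i, joint in enumerate(joints):
        joint.set_braking_force(
            force_torque_threshold(usage, i, len(joints), cart_speed, speed_limit)
        )
```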
  • a second usage situation may cover more limited movement of the drivable cart (202) of the robotic manipulation system (102).
  • the second usage situation may apply, for example, inside an operating room (e.g., operating room (700)). Movement of the drivable cart (202) within an operating room may require the avoidance of equipment (e.g., other components of the computer-assisted system (100)), one or more users or medical staff, and/or the patient (if present).
  • the speed of the drivable cart may be more restricted.
  • the speed of the cart may be restricted to a second speed where the second speed is strictly less than the first speed.
  • the motion of the drivable cart (202) may still continue, although with the speed being limited.
  • a third usage situation may cover an even more limited movement of the drivable cart (202) of the robotic manipulation system (102).
  • the third usage situation may apply, for example, when incremental movements are performed prior to a surgical procedure (i.e., a pre-procedure movement or docking approach).
  • the docking approach is designed to reach a mechanical configuration of the robotic manipulation system (102) that allows coupling of a cannula that passes through an entry location (e.g., a natural orifice such as the throat or anus, or through an incision) to an instrument holder at the distal end of a manipulator arm (112), for subsequent insertion of the instrument through the cannula.
  • the usage situation is manually input by a user of the robotic manipulation system (102), for example, through selection of the usage situation using the touchscreen (310).
  • Usage situations may have certain characteristics that may be detectable in data. Accordingly, various sensors may be used to collect different types of data that may be used to determine the usage situation. When a usage situation is determined through the use of sensor data, the usage situation, and all accompanying settings (e.g., speed restrictions, automatic stopping of the drivable cart (202), etc.), may be automatically applied to the robotic manipulation system (102). Or, in instances where a usage situation is input manually by a user, if a different usage situation is detected, the detected usage situation may override the previously manually input usage situation.
  • a detection may be made, for example, using the camera. Classification of the image data may result in the detection of a hallway and/or other non-operating room environment. The detection, thus, indicates that the system is in a common, not particularly delicate environment. In response to the detection, the brakes may be tuned as previously discussed.
  • the brakes may be tuned to avoid arm movement that is merely based on inertia, thereby avoiding false positives in the detection of collisions.
  • objects detected in the surrounding environment may also be indicative of the current usage situation, thereby affecting the tuning of the brake(s). For example, the presence of a patient suggests that the surrounding environment is an OR, thereby ruling out the first operating scenario.
  • the patient on the operating table being within a certain distance may be indicative of the third operating scenario.
  • a functional relationship between the force/torque threshold applied to a non-driven joint given the usage situation (which may be detected), the kinematic configuration of the associated manipulator arm, and the speed of the cart (as well as known speed limits given the usage situation), may be parameterized by a set of calibration parameters, where the set of calibration parameters includes at least one calibration parameter.
  • each non-driven joint may have its own set of calibration parameters.
  • the set of calibration parameters for each non-driven joint is periodically determined through a sequence of physical tests defining a calibration procedure. Consequently, the set of calibration parameters for each non-driven joint may be updated with the evolution of time to account for changing brake system behavior (e.g., burnishing of the brake(s) with time).
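  • A sketch of how such per-joint calibration might be stored and applied follows (the structure and field names are illustrative assumptions; the disclosure does not define a data format):

```python
from dataclasses import dataclass

@dataclass
class BrakeCalibration:
    """Assumed per-joint calibration record for the threshold function."""
    gain: float           # scales the nominal threshold for this joint's brake
    offset: float         # compensates systematic drift (e.g., burnishing)
    last_calibrated: str  # date of the most recent physical calibration test

# One calibration set per non-driven joint, refreshed by periodic tests.
calibrations = {
    "J1": BrakeCalibration(gain=1.00, offset=0.00, last_calibrated="2024-09-01"),
    "J2": BrakeCalibration(gain=0.97, offset=0.02, last_calibrated="2024-09-01"),
}

def calibrated_threshold(joint_name: str, nominal_threshold: float) -> float:
    """Adjust a nominal force/torque threshold by the joint's calibration."""
    cal = calibrations[joint_name]
    return cal.gain * nominal_threshold + cal.offset
```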
  • FIGs. 9A and 9B depict fictional force and relative joint movement plots, respectively. While fictional (i.e., not based on observed data), these plots are illustrative of the dynamic modulation of a brake of a non-driven joint and the detection of a collision through sensed joint movement.
  • the ordinate axis of the force plot of FIG. 9A represents a value of applied force imposed on a given non-driven joint of a manipulator arm. In the case of FIG. 9A, the applied force is given in normalized arbitrary units (a.u.).
  • the abscissa axis represents time (e.g., seconds).
  • FIG. 9A depicts an example force curve (901) indicating the applied force imposed on the given non-driven joint with respect to time.
  • the force plot of FIG. 9A is also partitioned into six regions, referenced herein as the first region (902), second region (904), third region (906), fourth region (908), fifth region (910), and sixth region (912).
  • the ordinate axis of the relative joint movement plot of FIG. 9B represents a value of joint movement sensed on the given non-driven joint using its joint sensor (e.g., position encoder). Similar to FIG. 9A, the relative joint movement of FIG. 9B is displayed in arbitrary units (a.u.). For example, the units of the relative joint movement could be degrees for a rotary joint.
  • the abscissa axis of FIG. 9B is identical to that of FIG. 9A and represents time (e.g., seconds).
  • FIG. 9B depicts an example joint movement curve (907) indicating the sensed joint movement of the given non-driven joint relative to a datum with respect to time.
  • the relative joint movement plot of FIG. 9B is partitioned into the same regions as the force plot of FIG. 9A.
  • an operator instructs or otherwise commands the drivable cart (202) to move (e.g., by depressing a drive switch).
  • the non-driven joint experiences a temporary peak in applied force.
  • the applied force in the second region (904) does not exceed the first force/torque threshold (903) and, consequently, the non-driven joint does not experience any joint movement (FIG. 9B).
  • In FIG. 10, a flowchart in accordance with one or more embodiments is shown.
  • the flowchart of FIG. 10 depicts a method (1000) for computer-assisted systems.
  • the method (1000) may be used to detect a collision distal to a non-driven joint of a manipulator arm of a computer-assisted system.
  • the method may be executed on one or more processors, e.g., of the one or more control systems of the computer-assisted system.
  • in Step 1002, a usage situation for a robotic manipulation system of the computer-assisted system is determined.
  • the robotic manipulation system includes a manipulator arm including at least one non-driven joint.
  • the usage situation is determined through reception of a manual input from a user or operator of the drivable cart.
  • the robotic manipulation system may have an integrated primary control interface by which a user or operator of the robotic manipulation system may indicate the usage situation of the robotic manipulation system.
  • the computer-assisted system can include one or more sensors, such as a camera or ultrasonic sensor (and associated ultrasonic or acoustic source).
  • the one or more sensors can be disposed on any component of the computer-assisted system (e.g., the robotic manipulation system) or can be proximate to, or in view of, the robotic manipulation system (e.g., disposed within the surroundings of the robotic manipulation system).
  • Sensor data are collected from the one or more sensors.
  • the speed of the drivable cart may be used to determine the usage situation.
  • a first usage scenario (e.g., as illustrated in FIG. 6) is typically associated with higher speeds of the drivable cart, whereas a third usage scenario (e.g., as illustrated in FIG. 8) is typically associated with lower speeds. A second usage scenario (e.g., as illustrated in FIG. 7) may be associated with intermediate speeds of the drivable cart. Accordingly, the actual speed of the drivable cart may serve as a meaningful predictor for the current usage situation.
  • determination of the usage situation from the sensor data (e.g., images, drape sensor data) can occur automatically and be performed by one or more control systems of the computer-assisted system, each control system including at least one computer processor. A sketch of such automatic determination is given below.
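  • The following sketch (a deliberately crude speed-based classifier; the thresholds and labels are assumptions for illustration) maps cart speed to a usage situation:

```python
def infer_usage_situation(cart_speed_mps: float) -> str:
    """Crude speed-based classifier for the current usage situation.

    Threshold values are placeholders: in practice the determination would
    also weigh camera, LiDAR, ultrasonic, and drape-sensor data.
    """
    if cart_speed_mps > 0.6:
        return "transport"  # first usage situation (FIG. 6)
    if cart_speed_mps > 0.2:
        return "setup"      # second usage situation (FIG. 7)
    return "docking"        # third usage situation (FIG. 8)

assert infer_usage_situation(0.9) == "transport"
assert infer_usage_situation(0.4) == "setup"
assert infer_usage_situation(0.05) == "docking"
```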
  • in Step 1004, the kinematic configuration of the manipulator arm is determined.
  • the kinematic configuration of the manipulator arm is completely specified through the state of the joints in view of the relative positions and lengths/sizes of the intervening links, if any.
  • the kinematic configuration of the manipulator arm may further include state information for position components external to the manipulator arm such as a boom and support column of the robotic manipulation system.
  • the state of a joint is determined using, or based on, a joint sensor (e.g., of non-driven joints) and/or a position encoder (e.g., included in the joint actuator of a driven joint).
  • the joint sensor of a non-driven joint may be a position encoder.
  • one or more control systems of the computer-assisted system may perform at least some of the calculations to determine the kinematic configuration of the manipulator arm.
  • in Step 1006, a movement threshold for each non-driven joint of the manipulator arm is obtained.
  • the movement threshold specifies an allowable movement for a given joint, where absolute joint movements within the movement threshold do not result in a detected collision.
  • the movement threshold is set to prevent inertial effects from being detected as collisions. Inertial effects may be a result of, for example, the drivable cart encountering a bump on the floor, causing some shaking.
  • the movement threshold is further set to allow detection of even relatively minor collisions at joints and links of the manipulator arm. In other words, the movement threshold may be a tradeoff between allowing movement associated with collisions (even minor collisions) at joints and links, while preventing movement associated with inertial effects.
  • the movement threshold may be tunable.
  • the movement threshold of a given joint is set to zero, or otherwise not used, such that a joint movement exceeding the minimum measurable (or sensed) joint movement by the joint sensor of the given non-driven joint is indicative of a collision.
  • in Step 1008, a force or torque threshold is determined for each non-driven joint.
  • the force/torque threshold(s) are based on, at least, the usage situation and the kinematic configuration. In some instances, the force/torque threshold(s) may further be based on sensor data (e.g., speed of cart, drape sensor, proximity to a detected patient, etc.). Additionally, in Step 1008, the force/torque threshold(s) are adjusted dynamically, in real-time, as dictated by changes in the usage situation, the kinematic configuration, and all other factors used in the determination of the force/torque threshold(s).
  • the force/torque threshold of a given non-driven joint may further be configured by a set of calibration parameters.
  • the force/torque threshold is set such that the previously described inertial effects do not result in movement at the non-driven joint. While an actual collision of the drivable cart (as opposed to the encountering of a bump by the drivable cart) would have the potential to cause movement at the non-driven joint as a result of the significant inertial effects, the force/torque threshold does not need to be set to accommodate such collisions because a collision of the drivable cart would stop further movement of the drivable cart.
  • in Step 1010, the braking force of each non-driven joint of the manipulator arm is set to its respective force/torque threshold as determined in Step 1008.
  • the braking force for a given non-driven joint is applied by a braking system (e.g., a spring actuated electromagnetic braking system) where the braking force is proportionate to a supplied input variable (e.g., voltage) and the supplied input variable is continuous (e.g., analog) or highly discretized (e.g., a bit depth of 12).
  • Changes in force/torque threshold(s) determined in Step 1008 are applied in real-time, or modulated, in Step 1010.
  • upon detecting a collision (i.e., a collision that applies a force/torque to at least one non-driven joint that exceeds its associated force/torque threshold), a collision procedure is applied.
  • a default collision procedure involves automatically stopping the drivable cart or slowing down the drivable cart and raising one or more audiovisual alarms or notifications. Alarms or notifications can include an audible sound, flashing lights on the affected manipulator arm, and a displayed text message or graphic.
  • if, in Step 1014, none of the non-driven joints has experienced joint movement in excess of its associated movement threshold, no collision is detected, and the flowchart returns to Step 1008.
  • the manipulator arm of the drivable cart is continually monitored for collisions, where collision sensitivity is adapted through application of variable braking forces at the non-driven joints, with the variable braking forces based on, at least, the usage situation and kinematic configuration. A condensed sketch of this monitoring loop follows.
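  • The sketch below condenses Steps 1008-1014 into a single loop (the joint and cart interfaces are placeholders assumed for illustration, not APIs from this disclosure):

```python
import time

def monitor_collisions(joints, cart, get_usage, get_kinematics,
                       threshold_fn, movement_threshold_fn, period_s=0.01):
    """Continually modulate brakes and watch for joint displacement.

    Each joint is assumed to expose set_braking_force() and displacement();
    the getter callbacks supply the current usage situation and kinematic
    configuration; threshold_fn and movement_threshold_fn compute the
    per-joint force/torque and movement thresholds.
    """
    while True:
        usage, kinematics = get_usage(), get_kinematics()
        for joint in joints:
            # Steps 1008/1010: recompute and apply the force/torque threshold.
            joint.set_braking_force(threshold_fn(joint, usage, kinematics, cart))
        for joint in joints:
            # Step 1014: displacement beyond the movement threshold means a
            # collision distal to this non-driven joint.
            if abs(joint.displacement()) > movement_threshold_fn(joint, kinematics):
                collision_procedure(cart)
                return
        time.sleep(period_s)  # then loop back to Step 1008

def collision_procedure(cart):
    """Default collision procedure: stop the cart and raise notifications."""
    cart.stop()
    cart.notify("collision detected on manipulator arm")  # audiovisual alarm
```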

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

A computer-assisted system includes a robotic manipulation system and a control system. The robotic manipulation system includes a manipulator arm comprising a non-driven joint, including a joint sensor configured to detect joint displacement of the non-driven joint, and a brake that applies a braking force to the non-driven joint. The control system is configured to modulate the braking force of the brake based on a current usage situation of the computer-assisted system such that a force or torque at the non-driven joint causes displacement of the non-driven joint, the force or torque being caused by a collision of the manipulator arm distal to the non-driven joint and exceeding a force or torque threshold set based on the current usage situation, and to determine whether a collision has occurred according to whether displacement of the non-driven joint is detected by the joint sensor.
PCT/US2024/049732 2023-10-06 2024-10-03 System and method for collision detection using detected joint displacement Pending WO2025076175A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363542974P 2023-10-06 2023-10-06
US63/542,974 2023-10-06

Publications (1)

Publication Number Publication Date
WO2025076175A1 (fr) 2025-04-10

Family

ID=93214152

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/049732 Pending WO2025076175A1 (fr) 2023-10-06 2024-10-03 System and method for collision detection using detected joint displacement

Country Status (1)

Country Link
WO (1) WO2025076175A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180361578A1 (en) * 2015-12-01 2018-12-20 Kawasaki Jukogyo Kabushiki Kaisha Monitoring device of robot system
WO2022144640A1 (fr) * 2020-12-30 2022-07-07 Auris Health, Inc. Systèmes et procédés de détection de contact et de réaction de contact de bras robotiques
US20220241969A1 (en) * 2019-10-30 2022-08-04 Neuromeka Method for automatically setting collision sensitivity of collaborative robot
US20230255704A1 (en) * 2014-03-17 2023-08-17 Intuitive Surgical Operations, Inc. System and method for breakaway clutching in an articulated arm


Similar Documents

Publication Publication Date Title
US12390290B2 (en) System and method for rapid halt and recovery of motion deviations in repositionable arms
US11207099B2 (en) Intelligent positioning system and methods therefor
US9827054B2 (en) Intelligent positioning system and methods therefore
US11998296B2 (en) Hard stop protection system and method
KR20170140179A (ko) Hyperdexterous system user interface
US12220195B2 (en) Systems, methods, and devices for defining a path for a robotic arm
CN113286682A (zh) Reducing energy buildup in servo control
CN115279292A (zh) Surgeon disengagement detection during teleoperation termination
CN113613582B (zh) System and method for bifurcated navigation control of a manipulator cart included within a computer-assisted medical system
US20240382216A1 (en) Systems and methods for monitoring a surgical tool
CA2948719A1 (fr) Systeme de positionnement intelligent et procedes y relatifs
WO2025076175A1 (fr) System and method for collision detection using detected joint displacement
US12447618B2 (en) Techniques for constraining motion of a drivable assembly
WO2024226849A1 (fr) Assisted positioning of a repositionable structure
WO2025230898A1 (fr) System for optimizing the position of a joint in response to movement of a distal link
US20250366927A1 (en) Visual guidance for repositioning a computer-assisted system
US20250162157A1 (en) Temporal non-overlap of teleoperation and headrest adjustment in a computer-assisted teleoperation system
US20250143819A1 (en) Techniques for repositioning a computer-assisted system with motion partitioning
WO2024236488A1 (fr) Systèmes et procédés de surveillance d'un outil chirurgical
WO2023244636A1 (fr) Visual guidance for repositioning a computer-assisted system
WO2024261752A1 (fr) Systems for real-time detection of object collision and/or object movement
WO2025207554A1 (fr) Instrument-to-instrument collision avoidance for computer-assisted systems

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 24794296

Country of ref document: EP

Kind code of ref document: A1