WO2025179100A1 - System and method for controlling teleoperation based on hand presence and input device drift - Google Patents
- Publication number
- WO2025179100A1 PCT/US2025/016722
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input device
- hand
- instrument
- computer
- metric
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/35—Surgical robots for telesurgery
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/74—Manipulators with manual electric input means
Definitions
- the present invention generally provides improved computer-assisted devices, systems, and methods.
- Computer-assisted systems can be used to perform tasks at worksites.
- a computer-assisted system may include a handheld, motorized tool assembly.
- a computer-assisted system may include a manipulator system that may include one or more manipulator arms to manipulate instruments for performing the task.
- Example computer-assisted systems include industrial and recreational manipulator systems.
- Example computer-assisted systems also include medical manipulator systems used in procedures for diagnosis, non-surgical treatment, surgical treatment, etc.
- Some computer-assisted systems include user input systems, each user input system comprising one or more input devices. These input devices allow users to command and control movement of various aspects of the computer-assisted systems, such as manipulator systems. Generally, operation of a hand-based input device by a user is performed through contact and manipulation of the input device by a hand of the user.
- the input devices can control functions of various types of mechanisms and instruments. Examples include instruments supported by computer-assisted systems that are articulated to perform various procedures. For example, a computer-assisted system can use various types of medical instruments to perform minimally invasive surgical procedures, where the medical instruments are teleoperated by a user manipulating the control input devices.
- one or more embodiments relate to a computer-assisted system including an input device configured to be manipulated by a user and one or more sensors configured to provide sensor data indicative of an absence of a hand of the user at the input device.
- the computer-assisted system further includes a control system with one or more processors, the control system communicatively coupled to the input device and the one or more sensors.
- the control system is configured to command motion of an instrument in response to input received at the input device and determine, based on the sensor data, a hand presence metric.
- the control system is further configured to determine a drift metric of the input device based on a movement of the input device and the hand presence metric and disable teleoperational control of the instrument by the input device in response to a determination that the drift metric exceeds a drift threshold.
- one or more embodiments relate to a non-transitory machine-readable medium comprising a plurality of machine-readable instructions executed by one or more processors associated with a computer-assisted system.
- the computer-assisted system includes an instrument and an input device, where the input device is configured to be manipulated by a user.
- the computer-assisted system further includes one or more sensors configured to provide sensor data indicative of an absence of a hand of the user at the input device.
- the plurality of machine-readable instructions, when executed, cause the one or more processors to perform a method.
- the method includes commanding motion of the instrument in response to input received at the input device.
- the method further includes determining, based on the sensor data, a hand presence metric.
- the method further includes determining a drift metric of the input device based on a movement of the input device and the hand presence metric.
- the method further includes disabling teleoperational control of the instrument by the input device in response to a determination that the drift metric exceeds a drift threshold.
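The claimed control loop — derive a hand presence metric from sensor data, accumulate a drift metric from device movement weighted by hand absence, and disable teleoperational control past a threshold — can be sketched as follows. All names, units, and the specific metric formulas are illustrative assumptions, not identifiers or algorithms from the application.

```python
# Illustrative sketch of the claimed control flow: disable teleoperational
# control when the input device moves while the user's hand appears absent.
# Threshold value, metric formulas, and all names are assumptions.

DRIFT_THRESHOLD = 5.0  # assumed units of unattended input-device travel

def hand_presence_metric(sensor_data):
    """Map raw hand-sensor readings to [0, 1]; 0 means hand absent."""
    # Assumed: average normalized detector confidence across hand sensors.
    return sum(sensor_data) / len(sensor_data)

def update_drift_metric(drift, movement, presence):
    """Accumulate movement as drift only to the extent the hand seems absent."""
    return drift + movement * (1.0 - presence)

def control_step(state, sensor_data, movement):
    presence = hand_presence_metric(sensor_data)
    state["drift"] = update_drift_metric(state["drift"], movement, presence)
    if state["drift"] > DRIFT_THRESHOLD:
        state["teleoperation_enabled"] = False  # stop commanding the instrument
    return state

state = {"drift": 0.0, "teleoperation_enabled": True}
# Hand fully absent (all sensors read 0) while the device moves 3 units per step:
for _ in range(2):
    state = control_step(state, sensor_data=[0.0, 0.0], movement=3.0)
# 2 steps x 3.0 units of unattended movement = 6.0 > 5.0, so control is disabled.
```

Weighting movement by `(1 - presence)` is one plausible reading of "based on a movement of the input device and the hand presence metric"; the application leaves the exact combination open.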
- FIG. 1 shows an example computer-assisted system, in accordance with one or more embodiments.
- FIG. 3 shows an example input system, in accordance with one or more embodiments.
- FIG. 5 depicts an example leader-follower control system architecture, in accordance with one or more embodiments.
- FIG. 6 shows a flowchart, in accordance with one or more embodiments.
- FIG. 7A depicts an example movement of an input device with respect to two degrees of freedom.
- FIG. 7B depicts an example drift threshold, in accordance with one or more embodiments.
- FIG. 7C depicts two example drift thresholds, the example drift thresholds dependent on another aspect of a computer-assisted system.
- FIG. 8 shows various example scenarios, in accordance with one or more embodiments.
- ordinal numbers (e.g., first, second, third, etc.) may be used as an adjective for an element (i.e., any noun in the application).
- the use of ordinal numbers is not to imply or create any particular ordering of the elements, and is not to limit any element to being only a single element unless expressly disclosed, such as by the use of the terms “before”, “after”, “single”, and other such terminology. Rather, the use of ordinal numbers is to distinguish between the elements.
- a first element is distinct from a second element, and the first element may encompass more than one element and succeed (or precede) the second element in an ordering of elements.
- proximal refers to a direction toward a base of the kinematic series
- distal refers to a direction away from the base along the kinematic series.
- pose refers to the multi-degree of freedom (DOF) spatial position and orientation of a coordinate system of interest attached to a rigid body.
- aspects of this disclosure are described in reference to computer-assisted systems, which can include devices that are teleoperated, externally manipulated, autonomous, semiautonomous, and/or the like. Further, aspects of this disclosure are described in terms of an implementation using a teleoperated surgical system, such as the da Vinci® Surgical System commercialized by Intuitive Surgical, Inc. of Sunnyvale, California. Knowledgeable persons will understand, however, that inventive aspects disclosed herein may be embodied and implemented in various ways, including teleoperated and nonteleoperated, and medical and non-medical embodiments and implementations. Implementations on da Vinci® Surgical Systems are merely exemplary and are not to be considered as limiting the scope of the inventive aspects disclosed herein.
- the instruments, systems, and methods described herein may be used for humans, animals, portions of human or animal anatomy, industrial systems, general robotic, or teleoperated systems.
- the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, sensing or manipulating non-tissue work pieces, cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, setting up or taking down systems, training medical or non-medical personnel, and/or the like.
- Additional example applications include use for procedures on tissue removed from human or animal anatomies (with or without return to a human or animal anatomy) and for procedures on human or animal cadavers.
- FIG. 1 depicts an overhead view of an example computer- assisted system (100) as a medical system, in accordance with one or more embodiments.
- although FIG. 1 shows the computer-assisted system (100) as a medical system, the following description is applicable to other scenarios and systems, e.g., non-surgical medical scenarios or systems, non-medical scenarios or computer-assisted systems, etc.
- a diagnostic or therapeutic medical procedure is performed on a patient (190) on an operating table (110).
- the computer-assisted system (100) may include a robotic manipulating system (130) (e.g., a patient-side robotic device in a medical example).
- the robotic manipulating system (130) may include at least one manipulator arm (150A, 150B, 150C, 150D), each of which may support a removably coupled instrument (160) (also called tool (160)).
- a manipulator assembly may be said to include a manipulator arm (150A, 150B, 150C, 150D) and an instrument (160).
- instruments (160) not attached to a manipulator arm are depicted on an auxiliary table or cart.
- the instrument (160) may enter the workspace through an entry location (e.g., enter the body of the patient (190) through a natural orifice such as the throat or anus, or through an incision), while an operator (192) (e.g., a clinician such as a surgeon) views the worksite (e.g., a surgical site in the surgical scenario) through a user input system (“input system”) (120).
- the manipulator assembly may include a common proximal repositionable structure and one or more distal repositionable structures attached to the proximal repositionable structure, and the one or more distal repositionable structures may each be configured to support one or more instruments.
- each distal repositionable structure may include a prismatic joint, that when driven, linearly moves an instrument carriage and any instrument(s) coupled to the carriage along an insertion axis.
- the repositionable structure may include a single instrument manipulator and no serial coupling of manipulators.
- the repositionable structure may include a single instrument manipulator coupled to a single base manipulator.
- the computer-assisted system may include a movable base that is cart-mounted or mounted to an operating table, and one or multiple manipulators mounted to the movable base.
- An image of the worksite may be obtained by an imaging instrument (160) comprising an imaging device (e.g., an endoscope, an optical camera, an ultrasonic probe, etc. in a medical example).
- the imaging instrument (160) can be used for imaging the worksite, and may be manipulated by one of manipulator arms (150A, 150B, 150C, 150D) of the robotic manipulating system (130) so as to position and orient the imaging instrument (160).
- the auxiliary system (140) may process the captured images in a variety of ways prior to any subsequent display. For example, the auxiliary system (140) may overlay the captured images with a virtual control interface prior to displaying the combined images to the operator via the input system (120) or other display systems located locally or remotely from the procedure.
- One or more separate displays may also be coupled with a control system and/or the auxiliary system (140) for local and/or remote display of images, such as images of the procedure site, or other related images.
- the number of instruments (160) used at one time generally depends on the task and space constraints, among other factors. If it is appropriate to change, clean, inspect, or reload one or more of the instruments (160) being used during a procedure, an assistant (194A, 194B, 194C) may remove the instrument (160) from the manipulator arm (150A, 150B, 150C, 150D), and replace it with the same instrument (160) or another instrument (160).
- FIG. 2 is a block diagram (200) depicting select components of a computer-assisted system (100), such as the medical system of FIG. 1, and their relationships and interactions.
- the illustration, partitioning, organization, and interaction of the components and/or modules of the computer-assisted system (100) in the block diagram (200) of FIG. 2 are intended to promote clear discussion and should not be considered fixed or limiting.
- FIG. 2 depicts a display (244) as an independent entity, however, it is well understood that in practice the display (244) may be implemented as part of, for example, the auxiliary system (140).
- the computer-assisted system (100) may include a control system (242).
- the control system (242) may be used to process input provided by the user input system (120) from an operator (192), such as to control the computer-assisted system (100) (e.g., direct movement of the manipulator arm (150A, 150B, 150C, 150D)).
- the control system (242) may also be used to process signals from other devices, from sensors, from any networks to which the control system (242) connects, etc.
- Example sensors include those associated with actuators or joints of the computer-assisted system, such as motor encoders, rotary or linear joint encoders, torque sensors, current sensors, accelerometers, force sensors, inertial measurement units, optical or ultrasonic sensors or imagers, RF sensors, etc.
- the control system (242) may further be used to provide an output, e.g., a video image for display by the display (244).
- the control system (242) may further be used to control the robotic manipulating system (130).
- the control system (242) may include one or more computer processors, non-persistent storage (e.g., volatile memory, such as random access memory (RAM), cache memory), persistent storage (e.g., a hard disk, an optical drive such as a compact disk (CD) drive or digital versatile disk (DVD) drive, a flash memory, etc.), a communication interface (e.g., Bluetooth interface, infrared interface, network interface, optical interface, etc.), and numerous other elements and functionalities.
- a computer processor of the control system (242) may be part or all of an integrated circuit for processing instructions.
- the computer processor may be one or more cores or micro-cores of a processor.
- the control system (242) may also communicate with one or more auxiliary input devices (which may be separate from the user input system (120) and one or more input devices of the user input system (120)), such as a touchscreen, keyboard, mouse, microphone, touchpad, electronic pen, or any other type of input device.
- a communication interface of the control system (242) may include an integrated circuit for connecting the control system (242) to a network (not shown) (e.g., a local area network (LAN), a wide area network (WAN) such as the Internet, mobile network, or any other type of network) and/or to another device, such as another control system (242).
- the control system (242) may communicate with one or more output devices, such as a display device (244) (e.g., a liquid crystal display (LCD), a plasma display, touchscreen, organic LED display (OLED), projector, or other display device), a printer, a speaker, external storage, or any other output device.
- Software instructions in the form of computer readable program code to perform embodiments of the disclosure may be stored, in whole or in part, temporarily or permanently, on a non-transitory computer readable medium such as a CD, DVD, storage device, a diskette, a tape, flash memory, physical memory, or any other computer readable storage medium.
- the software instructions may correspond to computer readable program code that, when executed by a processor(s), is configured to perform one or more embodiments of the invention.
- a control system (242) may be connected to or be a part of a network.
- the network may include multiple nodes. Each node may correspond to a computing system, or a group of nodes.
- embodiments of the disclosure may be implemented on a node of a distributed system that is connected to other nodes.
- embodiments of the invention may be implemented on a distributed computing system having multiple nodes, where each portion of the disclosure may be located on a different node within the distributed computing system.
- one or more elements of the aforementioned control system (242) may be located at a remote location and connected to the other elements over a network.
- a manipulator assembly may be said to include a manipulator arm (e.g., 150A, 150B, 150C, 150D) and an instrument (160), where, as described above, the manipulator arm generally supports the instrument (160) extending distally from the manipulator arm, and effects movements of the instrument (160).
- an instrument (160) may be positioned and manipulated through an entry location, so that a kinematic remote center is maintained at the entry location.
- Images of the worksite, taken by an imaging device of an imaging instrument such as an optical camera, may include images of the distal ends of the instruments (160) when the instruments (160) are positioned within the field-of-view of an imaging device.
- a distal instrument holder facilitates removal and replacement of the mounted instrument.
- a manipulator arm (e.g., 150A, 150B, 150C, 150D) is proximally mounted to the robotic manipulating system (130), or a base of the robotic manipulating system (130).
- manipulator arms or manipulator assemblies may be mounted to separate bases that may be independently movable, e.g., by the manipulator arms or manipulator assemblies being mounted to single-manipulator-arm carts, being provided with mounting clamps that allow mounting of the manipulator arms directly or indirectly to the operating table at various locations, etc.
- An example manipulator arm includes a plurality of links and joints extending between the proximal base and the distal instrument holder.
- a manipulator arm includes multiple joints (e.g., revolute joints, prismatic joints, etc.) and links.
- the joints of the manipulator arm, in combination, may or may not have redundant degrees of freedom.
- the instrument (160) is releasably mounted on an instrument holder.
- the instrument holder may translate along a linear guide formed by, for example, a prismatic joint and a distal link.
- the instrument holder may provide in/out movement of the instrument (160) along an insertion axis.
- the distal link may further support a cannula through which the instrument shaft of the instrument (160) extends.
- the cannula may be mechanically supported by another component of the manipulator arm or may not be supported at all.
- Actuation of the instrument (160) in one or more degrees of freedom may be provided by actuators of the manipulator assembly. These actuators may be integrated in the instrument holder, or may drive drivetrains in the instrument holder, and may actuate the instrument (160) via a transmission assembly.
- FIG. 3 depicts an example user input system (“input system”) (120) as described above in reference to FIG. 1.
- the input system (120) includes a viewer (313), where an image of a worksite can be displayed during a procedure using the computer-assisted system (100).
- images depicting a surgical site can be displayed during a surgical procedure.
- the viewer (313) can be positioned within a viewing recess (311) in which a user can position their head to view images displayed by the viewer (313).
- the user (e.g., operator (192)) can sit in a chair in front of the user input system (120) and position their head within the viewing recess (311) such that their eyes are positioned in front of the viewer (313).
- the input system (120) includes a user presence system including one or more sensors positioned at one or more locations of the input system (120) to detect the presence of a user proximate the input system (120).
- the user presence system can include one or more hand sensors and/or one or more head sensors.
- a head sensor is disposed within the viewing recess (311) and oriented to sense a presence of a user's head within the viewing recess (311).
- the head sensor is an optical sensor that includes an emitter and a detector.
- a beam of infrared or other wavelength of light can be emitted from an emitter disposed on one side of the viewing recess (311) and the emitted light can be detected on the other side of the viewing recess (311) by a detector.
- an interruption of the beam as sensed by the detector is indicative of an object within the viewing recess (311) (e.g., a user's head blocking the beam).
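The beam-break arrangement above reduces to a simple presence test: the detector reading drops when an object blocks the emitted beam. A minimal sketch, with a hypothetical normalized detector signal and an assumed interruption threshold:

```python
# Minimal sketch of beam-break head detection: an emitter drives a beam of
# known intensity across the viewing recess, and a drop at the detector
# implies an object (e.g., a head) blocking the beam. The normalized signal
# scale and threshold value are illustrative assumptions.

INTERRUPT_THRESHOLD = 0.5  # below this, treat the beam as blocked

def head_present(detector_reading):
    """True when the beam across the viewing recess appears interrupted."""
    return detector_reading < INTERRUPT_THRESHOLD

print(head_present(0.95))  # beam intact -> False (no head detected)
print(head_present(0.10))  # beam blocked -> True (head in recess)
```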
- Additional or alternative types of sensors can be used in various implementations.
- a user presence system includes one or more hand sensors to sense the presence (or, in complementary fashion, the absence) of a hand of a user proximate portions of the input system (120) contactable by the hand of the user (e.g., an input device of the input system).
- a user input system (120) can include one or more input devices.
- a computer-assisted system (100) can include a control system (242), where the control system (242) processes input received through the input system (120) from an operator (192) to control aspects of the computer-assisted system (100) (e.g., direct movement of the manipulator arm (150A, 150B, 150C, 150D)).
- the handle (402) can include grip members (406) that are rotationally coupled to the central portion (407) that extends along the central axis (412).
- the central portion (407) is configured to be positioned between at least two fingers of a hand during grip of the handle by the hand.
- FIG. 4 depicts an index finger (450) and a thumb (460) of the hand of the user being used to contact the grip members (406).
- the grip members (406) can be positioned on opposite sides of the central portion (407) of the handle (402), and the grip members (406) can be grasped, held, or otherwise contacted by a user's fingers.
- a finger loop is attached to each respective grip member (406) and can be used to secure a user's fingers to the associated grip member (406).
- the leader device (502) generates control signals C1 to Cx indicating positions, states, and/or changes of one or more input devices in their degrees of freedom, the control signals C1 to Cx received by Control System A (510).
- the leader device (502) can also generate control signals (not shown) to Control System A (510) indicating selection of physical buttons and other manipulations by the user.
- the leader device (502) can also generate control signals to Control System A (510) including sensor data associated with detection of user presence by a user presence system, e.g., a head sensor and/or one or more hand sensors of the leader device (502) as described below (e.g., indication of hand detection, detection parameters including distance, direction, and/or velocity of detected objects, etc.).
- Control System A can include general components such as a processor (512), memory (514), and leader interface hardware (516) and follower interface hardware (518) for communication with the leader device (502) and follower device (504), respectively.
- the processor (512) can execute program code.
- the processor (512) need not be a singular processor but can include one or more processors of various types, including microprocessors, application specific integrated circuits (ASICs), and other electronic circuits.
- the memory (514) can store instructions for execution by the processor and can include any suitable processor-readable storage medium, e.g., random access memory (RAM), read-only memory (ROM), Electrically Erasable Read-Only Memory (EEPROM), Flash memory, etc.
- Various other devices can also be coupled to Control System A (510), e.g., display(s) (520) such as the viewer (313) of the input system (120) and/or display (244) of FIG. 2.
- One or more other sensors of a user presence system can provide signals to Control System A (510) indicating detection of user presence and/or parameters related to such detection, e.g., the one or more hand sensors as depicted in FIG. 3.
- Control System A (510), given sensor data from one or more hand sensors, can determine a drift metric (522).
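One plausible way a control system could derive a drift metric from noisy hand-sensor data is to low-pass filter the presence signal, accumulate device motion while filtered presence is low, and clear the accumulator once the hand is confidently back. This is a sketch under assumed filter constants and thresholds, not the application's algorithm:

```python
# Sketch: smooth noisy hand-sensor readings with an exponential filter, then
# accumulate input-device motion as "drift" while filtered presence is low,
# and reset the accumulator once presence is confidently restored.
# The filter constant and the two presence thresholds are assumptions.

ALPHA = 0.5            # smoothing factor for the presence filter
ABSENT_BELOW = 0.2     # filtered presence below this counts as "hand absent"
PRESENT_ABOVE = 0.8    # filtered presence above this resets accumulated drift

class DriftEstimator:
    def __init__(self):
        self.presence = 1.0  # assume hand present at start of teleoperation
        self.drift = 0.0

    def update(self, raw_presence, device_motion):
        # The low-pass filter rejects single-sample sensor glitches.
        self.presence = ALPHA * raw_presence + (1 - ALPHA) * self.presence
        if self.presence < ABSENT_BELOW:
            self.drift += abs(device_motion)   # motion with no hand -> drift
        elif self.presence > PRESENT_ABOVE:
            self.drift = 0.0                   # hand is back; clear drift
        return self.drift

est = DriftEstimator()
est.update(0.0, 1.0)   # one dropout sample: presence 0.5, not yet "absent"
est.update(0.0, 1.0)   # presence 0.25, still above ABSENT_BELOW
est.update(0.0, 1.0)   # presence 0.125 -> motion now accumulates as drift
```

The hysteresis band between `ABSENT_BELOW` and `PRESENT_ABOVE` keeps a borderline reading from rapidly toggling between accumulating and clearing drift.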
- Control System A (510) includes a mode control module (540), a teleoperation control module (550), and a non-teleoperation control module (560).
- Other implementations can use other modules, e.g., a force output control module, sensor input signal module, etc.
- the mode control module (540) can determine when a teleoperation mode or a non-teleoperation mode of the computer-assisted system (100) is initiated by a user or otherwise triggered (e.g., by user selection of controls, sensing a presence of a user at a user control system or control input device, sensing required manipulation of an input device, etc.).
- the mode control module (540) can set a teleoperation mode or a non-teleoperation mode of Control System A (510) based on one or more control signals C1 to Cx.
- modes of the control system (242) can be categorized as either a teleoperation mode or a non-teleoperation mode, as indicated by the teleoperation control module (550) and the non-teleoperation control module (560) depicted in FIG. 5.
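The mode control module's role can be pictured as a small state machine over the two mode categories. The trigger names (`user_selected_teleop`, `user_present`) are hypothetical stand-ins for the control signals and presence sensing described above:

```python
# Sketch of a mode control module that switches between a teleoperation mode
# and a non-teleoperation mode based on control signals. Trigger names are
# hypothetical, not identifiers from the application.

from enum import Enum

class Mode(Enum):
    TELEOPERATION = "teleoperation"
    NON_TELEOPERATION = "non_teleoperation"

def next_mode(current, user_selected_teleop, user_present):
    """Enter teleoperation only when selected AND the user is sensed present;
    drop to non-teleoperation whenever presence is lost."""
    if not user_present:
        return Mode.NON_TELEOPERATION
    if user_selected_teleop:
        return Mode.TELEOPERATION
    return current

mode = Mode.NON_TELEOPERATION
mode = next_mode(mode, user_selected_teleop=True, user_present=True)   # -> TELEOPERATION
mode = next_mode(mode, user_selected_teleop=False, user_present=True)  # stays TELEOPERATION
mode = next_mode(mode, user_selected_teleop=False, user_present=False) # -> NON_TELEOPERATION
```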
- in a teleoperation mode, one or more components of the computer-assisted system (100) (e.g., a manipulator assembly) can be controlled through interaction of a user with an input device of the input system (120).
- in a non-teleoperation mode, manipulation of the input system (120) does not cause an associated movement (and possibly a change in function) in other components of the computer-assisted system (100).
- contactable input devices of the input system (120) are locked as to prevent movement (e.g., translation of an input device) and/or commands associated with the movement of an input device are not transmitted (e.g., signal corresponding to the depression of a button), by the control system (242), to other components of the computer-assisted system (100).
- “teleoperational control” dictates control of one or more components of the computer-assisted system (100) (e.g., a manipulator assembly) through interaction of a user with an input device of the input system (120).
- teleoperational control can be enabled without a specific designation of a mode (e.g., a teleoperation mode).
- teleoperational control is enabled by a user meeting one or more conditions. For example, a condition may consist of a user selecting a teleoperation mode (e.g., using a graphical user interface, depressing a button, etc.).
- the teleoperation entry requirement may require that a user first indicate (e.g., through a selection mechanism) a desire to initiate teleoperation (e.g., selection of a teleoperation mode), followed by the user-directed movement of an input device through a specified movement (e.g., moving a grip or roll axis through its degree(s) of freedom), and then moving the input device to be aligned, according to some mapping, with an associated manipulator assembly controlled by the input device.
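The staged entry requirement described above (select teleoperation, exercise the device through a specified movement, then align it with the follower) maps naturally onto an ordered checklist. A sketch with assumed step names:

```python
# Sketch of a staged teleoperation entry sequence: mode selected, then the
# input device exercised through its grip/roll degree(s) of freedom, then
# aligned with the associated manipulator assembly. Steps must complete in
# order; the step names are illustrative assumptions.

ENTRY_STEPS = ("mode_selected", "device_exercised", "device_aligned")

def entry_progress(completed):
    """Return how many consecutive entry steps are done, in the required order."""
    done = 0
    for step in ENTRY_STEPS:
        if step in completed:
            done += 1
        else:
            break  # out-of-order completion does not count
    return done

def teleoperation_enabled(completed):
    return entry_progress(completed) == len(ENTRY_STEPS)

print(teleoperation_enabled({"mode_selected", "device_exercised"}))  # False
print(teleoperation_enabled({"mode_selected", "device_exercised",
                             "device_aligned"}))                     # True
# Alignment alone, without the earlier steps, does not enable teleoperation:
print(teleoperation_enabled({"device_aligned"}))                     # False
```

This ordering also fits the re-enable behavior noted below: if the teleoperation mode persists after a disablement, the `mode_selected` step may already be satisfied when the user re-enters.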
- a disablement (or exit) of teleoperational control (i.e., teleoperational control no longer enabled) need not be associated with, or necessitate, an exit from the teleoperation mode. That is, in some implementations, the teleoperation mode persists even with disablement of teleoperational control. In some implementations, teleoperational control can be re-enabled by meeting the teleoperation entry conditions where selection of the teleoperation mode may, in some instances, already be met. In other implementations, a disablement (or exit) of teleoperational control triggers an exit or switch from a teleoperation mode to a non-teleoperation mode.
- Distinctions in operation of the control system (242) between a state of enabled and disabled (or not enabled) teleoperational control can be dependent on a type of instrument(s) (160) controlled by the computer-assisted system (100), the availability and use of auxiliary functions provided and controlled by the computer-assisted system (100), and/or the state (e.g., current state) of a procedure or process performed with the computer-assisted system (100).
- various medical instruments (160) can be attached to and manipulated by the robotic manipulating system (130).
- medical instruments (160) can include, but are not limited to, clip appliers, needle drivers, suction and/or irrigation tools, graspers, scissors, staplers, endoscopes, and energy-emitting devices (e.g., electrocautery or ablation instruments).
- movement of all instruments can be disabled while maintaining any active functions of the instruments (e.g., irrigation instruments remain active) such as to control the computer-assisted system (100).
- movement of all instruments can be disabled while maintaining control (e.g., teleoperational control) of functions of the instruments (e.g., irrigation can be turned “on” or “off,” clip appliers can be “fired,” etc.).
- enabled teleoperational control may allow for the control of all or most controllable parameters of the computer-assisted system (100) through interaction of a user with the input system (120).
- disabled teleoperational control may inhibit, at least, movement of the robotic manipulating system (130) even in the event of contact at an input device of the input system (120) designated to manipulate the robotic manipulating system (130).
- one or more teleoperation and non-teleoperation modes may be associated with the teleoperational control of a computer-assisted system (100), but this need not be the case.
- various behaviors of a computer-assisted system (100) are defined with respect to enabled and disabled states of teleoperational control. Additionally, differences in behavior of a computer-assisted system (100) between states of enabled or disabled teleoperational control can be dependent (or dynamically altered) based on factors such as the type and state of instruments (160) employed in the computer-assisted system (100), the state of a procedure enacted with use of the computer-assisted system (100), and/or the state of a function provided by an instrument (e.g., an energy-emitting instrument can have a state of “energy on” and a state of “energy off”).
- Disabling or inhibiting movement of the robotic manipulating system (130) can be provided through, for example, inhibiting movement of the one or more manipulator arms (e.g., 150A, 150B, 150C, 150D) (e.g., back-driving the actuators of the one or more manipulator arms to prevent/oppose movement of the one or more manipulator arms, locking the joints of one or more manipulator arms, putting the actuators of one or more manipulator arms into a gravity-compensation mode wherein the manipulator arm maintains a position but can otherwise move subject to an external force, etc.), inhibiting movement of the one or more input devices, etc.
- disablement of teleoperational control can selectively disable or maintain functions of certain instruments (e.g., energy to electrical cortical stimulation instruments turned off while irrigation instruments remain active).
- the teleoperation control module (550) can also be used to control forces on an input device of the leader device (502) (e.g., input system (120)), such as forces output on one or more components of the input device (e.g., grip members) using one or more control signals D1 to Dx output to actuator(s) used to apply forces to the components (e.g., to the grip members of the input device, in a rotary degree of freedom of the input device, on arm links coupled to the input device, etc.).
- control signals D1 to Dx can be used to provide force feedback, gravity compensation, etc.
- disabled teleoperational control can allow movement of the input device to control a display provided from cameras, or movement of cameras, that may not be included in the follower device (504).
- the control signals C1 to Cx can be used by the non-teleoperation control module (560) to control such elements (e.g., cursor, views, etc.) and control signals D1 to Dx can be determined by the non-teleoperation control module to cause output of forces on one or more input devices of the input system (120) during disabled teleoperational control, e.g., to indicate to the user interactions or events occurring during such modes.
- leader-follower control scheme (also called a “master-slave” control scheme)
- movement of components of a computer-assisted system (100) such as a manipulator arm (e.g., 150A, 150B, 150C, 150D), an instrument (160) supported by the manipulator arm, and/or a working portion of the instrument (160) in one or more degrees of freedom corresponds to movement in one or more degrees of freedom of an associated input device (of an input system) operated by a user.
- the input system can be used within a room (e.g., an operating room) containing the controllable components of the computer-assisted system (100) (e.g., robotic manipulating system (130)) or can be positioned more remotely, e.g., in a different room, building, city, or country location than, for example, the robotic manipulating system (130).
- Some implementations of the computer-assisted system (100) can provide enabled and disabled states of teleoperational control (and/or teleoperation and non-teleoperation modes) through the control system (242).
- When teleoperational control is disabled, input (e.g., movement, button depression, etc.) at an input device of the input system (120) does not cause the control system (242) to command motion of the robotic manipulating system (130). In other words, when teleoperational control is disabled, movement of an input device does not cause a movement in the robotic manipulating system (130).
- the control system (242) commands motion of the robotic manipulating system (130) in response to input received at one or more input devices of the user input system (120). For example, in a leader-follower control scheme, movement of the input device(s) of the input system (120) causes the control system (242) to command similar motion of the robotic manipulating system (130).
- a user controls or directs manipulation of the robotic manipulating system (130) through at least one input device manipulated by a hand of the user.
- the input system also includes one or more hand sensors that are configured to detect the presence of a user's hand operating the input device. It is noted that the complement of “hand presence” (or presence of a hand) is “hand absence” (or absence of a hand) such that knowledge or measurement of hand presence is sufficient to determine hand absence and vice versa.
- the hand presence metric may be represented as a ratio of the probability of hand presence to the probability of hand absence, or vice versa.
- the hand presence metric may be determined as a log ratio of the probability of hand presence to the probability of hand absence, or vice versa.
- An overall hand presence metric determined for the user input device, which may be configured with a plurality of hand sensors, can be a cumulative function (e.g., cumulative product) of the individual hand presence metrics determined for the individual hand sensors.
- a hand sensor can be configured to detect a hand of a user near an input device and output sensor data that is, or can be made to be, a categorical variable, such as a binary categorical variable, x, where x can either be “hand present” or “hand absent.”
- a hand sensor regardless of its configuration (e.g., time of flight sensor, computer vision-based classifier, etc.), is said to output (or return) sensor data where the sensor data can be used to determine a hand presence metric.
- the hand presence metric can be probabilistic (i.e., a probability that a hand is absent or a probability that a hand is present at an input device) or categorical (e.g., “hand present” or “hand absent”).
- Other hand presence metrics, for example a determined distance of a hand to a hand sensor, can be readily determined and applied without departing from this disclosure.
- Hand sensors can be employed by the computer-assisted system (100) to determine a hand presence metric for an input device of the computer- assisted system (100).
- Hand sensors can include, but are not limited to, optical time of flight sensors, capacitance sensors, optical sensors, resistive sensors, and a camera with an associated detection algorithm.
- computer vision, machine learning, or other algorithms can be used to determine the presence and/or location of a hand in one or more images acquired by the camera or other sensors.
- embodiments disclosed herein relate to a method for disabling teleoperational control of a computer-assisted system (100) in response to a determination that a drift metric exceeds a drift threshold, the drift metric based on a hand presence metric and movement of an input device.
- the term “hand presence metric” is used as a general representation for a measurement of whether a hand is absent (or present) at the input device.
- the hand presence metric is a continuous-valued probability that a hand is absent (or present) at the input device (i.e., having a value in the range [0,1]).
- the hand presence metric indicates, e.g., categorically, whether a hand is absent at the input device (e.g., “hand is absent” or “hand is present”).
- the hand presence metric is an odds ratio, for example, the ratio of the probability of hand absence to the probability of hand presence.
- the hand presence metric may be determined based on various hand detection techniques.
- the hand presence metric is estimated (e.g., with a probability) using sensor data from one or more hand sensors.
- the hand presence metric is determined as the product of one or more odds ratios, where each of the one or more odds ratios corresponds to a distinct hand sensor of the input device.
- the hand sensors may be complementary (i.e., have different methods for detecting a hand, exhibit different sensitivities or accuracies under different operating conditions, etc.).
- an infrared detector and an optical detector each determine the presence/absence of a hand with a unique and complementary modality. For example, when two or more hand sensors are in use and each provides an independent measurement of hand absence (or presence) represented as an odds ratio, then the overall odds of hand absence is the product of the individual odds.
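The odds-product combination described above can be sketched as follows (function names are illustrative, not from the source):

```python
def combine_odds(odds_ratios):
    """Multiply independent per-sensor odds of hand absence into overall odds."""
    overall = 1.0
    for odds in odds_ratios:
        overall *= odds
    return overall


def odds_to_probability(odds):
    """Convert an odds ratio (absence:presence) to a probability of absence."""
    return odds / (1.0 + odds)


# Two independent sensors each reporting 3:1 odds of hand absence
# combine to 9:1 overall odds, i.e., a 0.9 probability of absence.
overall = combine_odds([3.0, 3.0])
p_absent = odds_to_probability(overall)
```

This reflects the multiplicative rule stated in the text for independent measurements; correlated sensors would require a different combination.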
- a hand sensor returns sensor data, where the sensor data may be processed by the control system (242) to determine a hand presence metric.
- the sensor data returned by a hand sensor is a continuous-valued probability measurement that a hand is absent (or, complementarily, present) or a categorical variable indicating the state (e.g., present or absent) of the hand at an associated input device.
- sensor data returned by a hand sensor is processed by the control system (242) to convert the sensor data to a desired representation such as a probabilistic representation (which may include an odds representation) or categorical representation.
- sensor data from the one or more hand sensors may need to be combined to form a hand presence metric for the input device.
- Various methods for combining sensor data from two or more hand sensors of an input device are discussed below for instances where the hand sensor data of the hand sensor is (or can be made to be) probabilistic (including an odds ratio) or categorical. These methods are provided as examples and should not be considered limiting.
- Other methods for combining sensor data of two or more hand sensors to form a hand presence metric can be readily applied. For example, in instances where two hand sensors return sensor data in the form of a distance of detected object from the respective hand sensors, the distances may be averaged and a hand presence metric can be formed by comparing the average distance to a predefined distance threshold.
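The distance-averaging example above can be sketched as follows (the 50 mm threshold is an assumed value for illustration only):

```python
def hand_presence_from_distances(distances_mm, threshold_mm=50.0):
    """Average the distances reported by two or more hand sensors and
    compare the average against a predefined distance threshold."""
    average = sum(distances_mm) / len(distances_mm)
    return "hand present" if average <= threshold_mm else "hand absent"
```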
- Sensor data returned by a given hand sensor can be referenced as h, or as h_i when more than one hand sensor needs to be referenced, where each hand sensor is indexed by i.
- the following discussion relates to instances where the sensor data of a hand sensor is, or can be made to be, categorical or probabilistic (with probabilities of either hand absence or hand presence represented in the range [0,1]).
- to distinguish probabilistic sensor data from categorical sensor data, a subscript of p or c can be added to the measurement, respectively (i.e., h_p for probabilistic sensor data and h_c for categorical sensor data).
- the predefined segmentation variable can be adjusted based on, or otherwise dependent on, a type of instrument (160) manipulated with the input device, a type of procedure being performed by the computer-assisted system (100), and/or the current state of the procedure being performed by the computer-assisted system (100).
- the sensitivity in determining whether a hand is absent or a hand is present at the input device, when initially measured probabilistically (or sensor data made probabilistic) with a hand sensor, can be tuned according to factors such as the type of instrument (160) controlled by the input device.
- sensor data determined using a hand sensor can be represented by the ratio of the probability of hand absence over probability of hand presence, or vice versa (i.e., an odds ratio).
- a categorical sensor data from a hand sensor can be made probabilistic by specifying a numerical value for each category.
- EQ. 3: hand presence metric = Σ_{i=1}^{N} a_i · h_{p,i}, where N is the number of hand sensors (N ≥ 2), a_i is the weight applied to the i-th probabilistic representation of sensor data, h_{p,i}, determined using the i-th hand sensor.
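A minimal sketch of this weighted aggregation, assuming the weights are normalized so the result remains a probability (the normalization step is an assumption, not stated in the source):

```python
def aggregate_hand_presence(probabilities, weights):
    """Weighted aggregation of N >= 2 probabilistic sensor measurements
    h_p,i, each weighted by a_i, normalized by the total weight (assumed)."""
    if len(probabilities) != len(weights) or len(probabilities) < 2:
        raise ValueError("need N >= 2 matching measurements and weights")
    total = sum(weights)
    return sum(a * h for a, h in zip(weights, probabilities)) / total
```

With equal weights this reduces to a simple average; unequal weights let one sensing modality dominate, as discussed below for the predefined weights.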
- a hand presence metric is determined as an aggregation of sensor data from two or more hand sensors, where the two or more hand sensors each return an independent odds ratio (e.g., probability of absence to probability of presence).
- the hand presence metric is itself an odds ratio determined as a product of individual odds ratios, each individual odds ratio obtained using an individual hand sensor. Indexing the two or more hand sensors with the index i, and further designating that the hand presence metric is an overall odds ratio, the hand presence metric can be computed according to: hand presence metric = ∏_i odds_i, i.e., the product over i of the individual odds ratios.
- the weights are predefined.
- The predefined values for the weights a_i can be adjusted based on, or otherwise dependent on, a type of instrument (160) manipulated with the input device, a type of procedure being performed by the computer-assisted system (100), and/or the current state of the procedure being performed by the computer-assisted system (100). For example, consider an input device with two hand sensors, the hand sensors having different sensing modalities. The measurements of the two hand sensors can be referenced as a first hand sensor and a second hand sensor, each returning a probabilistic hand absence measurement (i.e., p_a).
- the relative importance of various hand sensors can be based on factors like a type of instrument (160) manipulated with the input device, a current state of an instrument (160) manipulated with the input device (e.g., whether energy is being applied by an energy-emitting instrument, whether a grasping instrument is grasping, etc.), a type of procedure being performed by the computer-assisted system (100), the current state of the procedure being performed by the computer-assisted system (100), the current movement of the input device (e.g., input device moving to insert an associated instrument controlled by the input device deeper into a worksite), and/or a distance and direction of a hand of the user relative to the input device (if such information is available from one or more hand absence sensors).
- the sensor data returned by a hand sensor is categorical (or can be readily made categorical).
- the sensor data of two or more such hand sensors can still be aggregated (e.g., using EQ. 3) and then the hand presence metric subsequently made categorical through application of a relationship that maps a numeric (or, in some instances, more strictly, probabilistic) hand presence metric to a categorical hand presence metric.
- a probabilistic hand presence metric can be made categorical using the relationship: categorical metric = “hand absent” if the probabilistic metric (probability of hand absence) is at least a predefined segmentation value, and “hand present” otherwise.
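A sketch of this probabilistic-to-categorical mapping, with the segmentation variable left as a parameter (the default of 0.5 is an assumed value):

```python
def to_categorical(hpm_p, segmentation=0.5):
    """Map a probabilistic hand presence metric (probability of hand
    absence) to a categorical metric via a predefined segmentation value."""
    return "hand absent" if hpm_p >= segmentation else "hand present"
```

Raising the segmentation value makes the system more reluctant to declare the hand absent, consistent with tuning sensitivity per instrument type as described above.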
- a hand presence metric is computed for each input device of an input system (120).
- the hand presence metric is numeric (e.g., distance, probabilistic, odds, log odds, etc.) or categorical.
- the hand presence metric for a given input device is determined using one or more hand sensors associated with the input device.
- Hand sensors provide sensor data.
- sensor data of a hand sensor can be probabilistic (including odds ratios) or categorical (or readily made probabilistic or categorical).
- an input device can include one or more joints, each joint with a joint sensor (e.g., encoder) that determines a position, velocity, and/or acceleration of the joint.
- position, velocity, and acceleration can be determined from each other through integration and differentiation.
- position, velocity, and/or acceleration of a joint, or any similar component is determined using a joint sensor (e.g., encoder), where based on the configuration of the joint sensor the joint sensor may measure one or more of position, velocity, and acceleration.
- “movement” or the position and orientation, and changes in the position and orientation, of an input device are measurable and quantifiable.
- the drift metric of an input device is based, at least in part, on movement of the input device.
- the input device consists of a kinematic series, such as a repositionable structure with a plurality of links coupled by one or more joints (e.g., see FIG. 4).
- sensors are coupled to the components or joints of an input device to detect the position (and/or velocity, acceleration) of the components or joints throughout their respective degrees of freedom (e.g., grip members have grip degrees of freedom).
- the “pose” of an input device can be defined by the position (and/or orientation) of each of its components or joints along with a knowledge of the geometry and disposition of any interconnecting links between joints of the input device.
- the pose of the input device (or information regarding the position and orientation of each of its components and/or joints (e.g., a value for each degree of freedom)) can be stored in a variety of mathematical or computational data structures such as a tensor.
- a pose of an input device can be represented as a data point in an operational multidimensional space spanned by the degrees of freedom associated with the input device.
- a drift threshold can be represented as a surface (or hypersurface) in the operational multidimensional space that bounds (e.g., encloses) a volume (or hypervolume) representative of allowed movement (i.e., allowed values of the input device degrees of freedom) of the input device without the computer-assisted system (100) disabling teleoperational control.
- such a surface allows for accounting for movements of an input device over different degrees of freedom, where the degrees of freedom are represented with different units (e.g., millimeters, radians).
- Various example movements of an input device with respect to one or more degrees of freedom of the input device and associated drift thresholds are discussed in greater detail later in the instant disclosure with respect to FIGS. 7A-7C.
- teleoperational control of the computer-assisted system (100) is disabled when the drift metric exceeds a predefined drift threshold.
- Various implementations for determining the drift metric and setting an associated drift threshold are described below. In general, implementation of the drift metric and/or drift threshold can depend on factors of the computer-assisted system (100) such as a type of instrument (160) manipulated with the input device, a type of procedure being performed by the computer-assisted system (100), and/or the current state of the procedure being performed by the computer-assisted system (100).
- the drift metric and/or the drift threshold can be informed by environmental factors such as the configuration of the worksite in which the computer-assisted system (100) is used to perform a procedure.
- the worksite can include an entry location (e.g., entry into the body of the patient (190) through a natural orifice such as the throat or anus, or through an incision).
- Environmental factors, such as the configuration of the worksite, can be determined in real time (or near-real time) using, for example, cameras associated with (i.e., proximate) the computer-assisted system (100) or controlled by the computer-assisted system (100) (e.g., an endoscopic instrument).
- environmental factors such as the configuration of the worksite are determined using data collected before enacting a procedure with the computer-assisted system (100) (e.g., pre-operative images of a patient).
- the control system (242) is configured to begin monitoring movement of the input device in response to an indication or determination of a hand not being present on/at the input device. For example, in response to an indication or determination of a hand not being present on/at the input device (e.g., a probability hand presence metric (representing hand absence) exceeding a predefined threshold, an odds ratio hand presence metric (with odds given as probability of absence to probability of presence) exceeding a predefined threshold, etc.), a reference position and/or orientation of the input device may be recorded and the control system (242) begins monitoring a current position and/or orientation of the input device. Thus, the movement can be determined based on a comparison of the reference position and/or orientation against the monitored current position and/or orientation of the input device.
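The reference-recording behavior above can be sketched as follows (class and parameter names are illustrative; the 0.5 absence threshold is an assumed value):

```python
import math


class DriftMonitor:
    """Record a reference pose when hand absence is indicated, then report
    the displacement of the current pose from that reference."""

    def __init__(self, absence_threshold=0.5):
        self.absence_threshold = absence_threshold
        self.reference = None

    def update(self, pose, p_absent):
        if p_absent >= self.absence_threshold:
            if self.reference is None:
                self.reference = list(pose)  # begin monitoring movement
            return math.dist(self.reference, pose)
        self.reference = None  # hand present again: stop monitoring
        return 0.0
```

The returned displacement would then be compared against a drift threshold by the control system; that comparison is omitted here.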
- the drift metric is ∫ v(τ)·hpm_p(τ) dτ, evaluated up to the current time t (i.e., the drift metric is computed in real time), where v(τ) is the velocity of the input device at time τ and hpm_p(τ) (the probabilistic hand presence metric) is the probability that the hand is absent at the input device at time τ.
- the drift metric is a weighted path distance.
- the drift metric is set to a value of zero and the integral is started anew upon the probability that the hand is absent exceeding, again, the hand absence threshold.
- the drift threshold is a scalar value.
- the hand presence metric is a continuous-valued probabilistic measure (i.e., probabilistic hand presence metric) that the hand is absent at the input device and the drift metric is an integral of a velocity of the input device weighted by the probabilistic measure that the hand is absent over a rolling temporal window of a predefined duration.
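A discrete-time sketch of this windowed, probability-weighted velocity integral (the sampling interval and window length are illustrative parameters):

```python
from collections import deque


def windowed_drift_metric(samples, window_s, dt):
    """Approximate the integral of v(t) * hpm_p(t) over a rolling temporal
    window. samples is a sequence of (velocity, p_absent) pairs taken
    every dt seconds; window_s is the window duration in seconds."""
    window = deque(maxlen=max(1, int(window_s / dt)))
    for velocity, p_absent in samples:
        window.append(velocity * p_absent * dt)  # rectangle-rule increment
    return sum(window)
```

The bounded deque drops contributions older than the window, so sustained movement while the hand is likely absent accumulates, while brief bumps decay out of the metric.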
- the drift threshold may depend on an estimated distance of a hand relative to the input device.
- a Euclidean distance drift threshold may decrease in value as the distance of the hand relative to the input device increases.
- the drift threshold can be dependent on a direction of the movement of the input device.
- a spatial distance drift threshold can be smaller in a first direction of movement of the input device relative to a second direction of movement of the input device, the first and second directions having opposing directions.
- the drift threshold can depend on a characteristic of the movement of the input device (e.g., a direction of movement, a velocity, etc.).
- Disabling teleoperational control in response to meeting a drift metric that is based both on movement of the input device and a hand presence metric (and further conditions in some instances) is beneficial, at least, because it allows for intra-operative hand adjustments (i.e., temporarily removing the hand from the control input device) without delaying the procedure by disabling teleoperational control and then re-enabling teleoperational control, for example, by meeting anew one or more teleoperation entry conditions.
- an example of intra-operative hand adjustment can be discussed with reference to FIG. 4. As seen in FIG. 4, an example input device (400) is contacted by a hand of a user where, as depicted, the index finger (450) contacts an upper grip member (406), and the thumb (460) contacts a lower grip member (406).
- the user may wish to adjust the position of their hand, for example, such that the thumb (460) contacts the upper grip member (406), and the index finger (450) contacts the lower grip member (406).
- Such an adjustment may be desirable to improve dexterity during certain operations of a procedure and/or to prevent hand fatigue.
- a computer-assisted system (100) may enter or enable a state of teleoperational control.
- this process consists of a user selecting a teleoperation mode (e.g., using a graphical user interface, depressing a button, etc.).
- the process of enabling teleoperational control of the computer-assisted system (100) can further require a user to meet one or more conditions, such as moving the input device in a prescribed direction or movement and ensuring that an input device and its associated controllable component (e.g., manipulator assembly) are synchronized and/or properly aligned according to a mapping relating their movements.
- the teleoperation entry requirement must be satisfied, where the teleoperation entry requirement can consist of many related (and possibly sequential) processes.
- the teleoperation entry requirement may require that a user first indicate (e.g., through a selection mechanism) a desire to initiate teleoperation, followed by the user-directed movement of an input device through a specified movement (e.g., moving a grip or roll axis through its degree(s) of freedom), and then moving the input device to be aligned, according to some mapping, with an associated manipulator assembly controlled by the input device.
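The sequential nature of such an entry requirement can be sketched as an ordered checklist (the step names are hypothetical, not from the source):

```python
# Ordered teleoperation entry steps; all must be completed, in order.
ENTRY_STEPS = ("intent_indicated", "specified_movement_done", "device_aligned")


def entry_requirement_satisfied(completed_steps):
    """True only when every entry step was completed in the prescribed order."""
    return tuple(completed_steps) == ENTRY_STEPS
```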
- a computer-assisted system (100) may require that all conditions of teleoperation entry requirement be re-satisfied (re-performed) in order to enable, anew, teleoperational control of the computer-assisted system (100).
- when disabling teleoperational control was not the desired intent of a user (for example, when the user removes a hand from an input device merely to re-orient the hand, e.g., to improve dexterity and/or adjust hand position based on the state of the procedure), the unintended disabling of teleoperational control can result in procedural delays, especially in instances where a teleoperation entry requirement must be re-met in order to re-enable teleoperational control.
- a drift metric that is based on both movement of the input device and a hand presence metric reduces the number of false positives that may be associated with drift detection, where drift is the unintended movement of an input device (e.g., bumped by a user, affected by gravity when not manipulated by a user, etc.).
- advantages of embodiments disclosed herein further include the fact that the drift metric and/or drift threshold can be dynamic and dependent on other aspects of the computer- assisted system (100) as previously discussed (e.g., instrument in use, state of the procedure, etc.).
- a drift metric of the input device is determined.
- the drift metric is based on a movement of the input device (if any) and the hand presence metric.
- the drift metric is an absolute displacement of the input device relative to a reference position, where the reference position is established when the hand presence metric indicates that the hand is absent from the input device.
- the drift metric is a representation of the movement (e.g., drift) of the input device whenever, according to the hand presence metric, the hand is absent from the input device. Accordingly, the drift metric may be said to “reset” whenever the hand presence metric indicates that the hand is present at the input device.
- FIG. 7A depicts a situation in which, upon establishing the reference position (702), the input device is moved (or experiences movement).
- a current position (704) of the input device, as a result of the movement of the input device, is depicted in FIG. 7A.
- FIG. 7A further depicts a path travelled (706) by the input device during the movement of the input device from the reference position (702) to the current position (704).
- a distance between the reference position (702) and the current position (704) is computed according to a distance metric, for example, d = √( Σ_{i=1}^{M} ( x_i^{(cp)} − x_i^{(rp)} )² ), where d is the computed distance between the current position (704) and the reference position (702), M is the number of degrees of freedom (M ≥ 1), x_i^{(cp)} is the value of the i-th degree of freedom at the current position (704), and x_i^{(rp)} is the value of the i-th degree of freedom at the reference position (702).
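Using a Euclidean distance metric over the M degrees of freedom, the computation sketches as (function name is illustrative):

```python
import math


def dof_distance(current, reference):
    """Euclidean distance between two input-device poses across the M
    degrees of freedom of the input device (M >= 1)."""
    if len(current) != len(reference) or not current:
        raise ValueError("poses must be non-empty and of equal length")
    return math.sqrt(sum((c - r) ** 2 for c, r in zip(current, reference)))
```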
- the distance metric is computed with respect to the movement of a component of the computer-assisted system (100) controlled by the input device (e.g., an instrument (160) supported by a manipulator arm (150)).
- the drift metric can also be given with respect to the component of the computer-assisted system (100) controlled by the input device.
- FIG. 7B depicts an example drift threshold (710) as a surface enclosing the reference position (702).
- the drift threshold (710) is determined (e.g., positioned) based on the reference position (702).
- the example of FIG. 7B further designates a first direction (712) and a second direction (714), the first direction (712) and the second direction (714) directed in an opposing manner and aligned with the first degree of freedom (703).
- the drift threshold (710) is an asymmetric boundary encompassing the reference position (702). In the example of FIG. 7B, the drift threshold (710) is closer to the reference position (702) in the first direction (712) than in the second direction (714).
- the input device is allowed to move further in the second direction (714) from the reference position (702) relative to the first direction (712).
- FIG. 7B depicts an example where the drift threshold (710) depends on the movement of the input device (at least with respect to one degree of freedom).
- when the drift threshold is composed of individual drift thresholds, one for each degree of freedom of the input device, and these individual drift thresholds are independent, the surface representation of the drift threshold encloses a hypercuboid.
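A sketch of such independent per-degree-of-freedom limits, where each limit pair can be asymmetric (tighter in one direction than the other, as in the FIG. 7B example); names and values are illustrative:

```python
def within_drift_threshold(current, reference, limits):
    """Return True while the current pose stays inside the hypercuboid of
    allowed drift around the reference pose. limits[i] is a pair
    (negative_extent, positive_extent) for the i-th degree of freedom."""
    for c, r, (neg, pos) in zip(current, reference, limits):
        delta = c - r
        if delta < -neg or delta > pos:
            return False
    return True
```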
- the drift metric, and similarly the drift threshold, can be associated with a single joint movement. That is, the drift metric and drift threshold can be based on a single degree of freedom of the input device, where the degree of freedom may be linear, angular, etc.
- FIG. 8 depicts various example sequences of events (800) that can occur while using a computer-assisted system (100).
- the sequences of events (800) specifically dictate that some events occur, such as a determination that a user’s hand is not present at an input device of the computer-assisted system (100) (e.g., according to the hand presence metric). This is done in order to demonstrate behaviors of the computer-assisted system (100), in accordance with one or more embodiments disclosed herein. That said, events in the example sequences of events (800) need not occur when using a computer-assisted system (100).
- FIG. 8 depicts three example scenarios, namely, scenario A, scenario B, and scenario C.
- In Block 802, the computer-assisted system (100) is assumed to, at first, have teleoperational control disabled. While teleoperational control is disabled, in Block 804, a user’s hand is determined to be present with respect to (w.r.t.) an input device of the computer-assisted system. The determination can be made according to a hand presence metric.
- enabling teleoperational control of the computer-assisted system (100) can require the satisfaction of one or more teleoperation entry requirements, such as selecting a teleoperation mode, indicating user presence at an input system, and/or aligning an input device with a component (e.g., an instrument) of the computer-assisted system (100) the input device is intended to control.
- teleoperation entry requirement(s) consists of all the steps and/or conditions that must be undertaken and/or satisfied (sometimes in a strict order) to enable teleoperational control of the computer-assisted system (100).
- In Block 806, the teleoperation entry requirement(s) are satisfied.
- teleoperational control of the computer-assisted system (100) is enabled, and continues to be enabled, unless stated otherwise.
- Block 810 specifies an event where the user’s hand is again determined to be present with respect to (w.r.t.) the input device.
- In Block 818, it is explicitly stated that the computer-assisted system (100) continues with teleoperational control enabled, at least with respect to (w.r.t.) the input device.
- FIG. 8 depicts examples of sequences of events. These examples are provided to illustrate the behavior of a computer-assisted system (100) according to embodiments of the instant disclosure. That said, these events, or their sequence, need not occur as depicted in practice.
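The kind of event sequence described for FIG. 8 can be sketched as a minimal state machine. This is an illustrative sketch under assumptions: the class, method, and event names are hypothetical and do not appear in the disclosure, and a real entry-requirement check would involve more conditions than shown here.

```python
class TeleopController:
    """Minimal sketch of teleoperation enable/disable behavior driven by
    hand presence, entry requirements, and drift events."""

    def __init__(self):
        self.teleop_enabled = False
        self.hand_present = False

    def on_hand_presence(self, present):
        # E.g., Block 804/810: hand presence determined at the input device.
        self.hand_present = present

    def on_entry_requirements(self, satisfied):
        # E.g., Block 806: teleoperation entry requirement(s) satisfied.
        if satisfied and self.hand_present:
            self.teleop_enabled = True

    def on_drift_exceeded(self):
        # Drift beyond the drift threshold while the hand is judged absent
        # disables teleoperational control of the instrument.
        self.teleop_enabled = False


ctrl = TeleopController()
ctrl.on_hand_presence(True)       # hand present at the input device
ctrl.on_entry_requirements(True)  # entry requirements satisfied
# Teleoperational control is now enabled and remains enabled
# (cf. Blocks 810/818) until, e.g., a drift event disables it.
```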
Abstract
A computer-assisted system including an input device configured to be manipulated by a user and one or more sensors configured to provide sensor data indicative of an absence of a hand of the user at the input device. The computer-assisted system further includes a control system communicatively coupled to the input device and the one or more sensors. The control system is configured to command motion of an instrument in response to input received at the input device and determine, based on the sensor data, a hand presence metric. The control system is further configured to determine, based on the hand presence metric and a movement of the input device, whether to disable teleoperational control of the instrument by the input device. The control system is further configured to disable teleoperational control of the instrument by the input device in response to the determination to disable teleoperational control.
Description
SYSTEM AND METHOD FOR CONTROLLING TELEOPERATION
BASED ON HAND PRESENCE AND INPUT DEVICE DRIFT
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to and the benefit of the filing date of U.S. Provisional Patent Application 63/557,008 filed on February 23, 2024, which is hereby incorporated by reference herein in its entirety.
BACKGROUND
Field of Invention
[0002] The present invention generally provides improved computer-assisted devices, systems, and methods.
Overview
[0003] Computer-assisted systems can be used to perform tasks at worksites. For example, a computer-assisted system may include a handheld, motorized tool assembly. As another example, a computer-assisted system may include a manipulator system that may include one or more manipulator arms to manipulate instruments for performing the task.
[0004] Example computer-assisted systems include industrial and recreational manipulator systems. Example computer-assisted systems also include medical manipulator systems used in procedures for diagnosis, non-surgical treatment, surgical treatment, etc.
[0005] Some computer-assisted systems include user input systems, each user input system comprising one or more input devices. These input devices allow users to command and control movement of various aspects of the computer-assisted systems, such as manipulator systems. Generally, operation of a hand-based input device by a user is performed through contact and manipulation of the input device by a hand of the user.
[0006] The input devices can control functions of various types of mechanisms and instruments. Examples include instruments supported by computer-assisted systems that are articulated to perform various procedures. For example, a computer-assisted system can use various types of medical instruments to perform minimally invasive surgical procedures, where the medical instruments are teleoperated by a user manipulating the control input devices.
Thus, it is desirable to improve teleoperation for computer-assisted systems.
SUMMARY
[0007] It should be appreciated that the foregoing concepts, and additional concepts discussed below, may be arranged in any suitable combination, as the present disclosure is not limited in this respect. Further, other advantages and novel features of the present disclosure will become apparent from the following detailed description of various nonlimiting embodiments when considered in conjunction with the accompanying figures.
[0008] In general, in one aspect, one or more embodiments relate to a computer-assisted system including an input device configured to be manipulated by a user and one or more sensors configured to provide sensor data indicative of an absence of a hand of the user at the input device. The computer-assisted system further includes a control system with one or more processors, the control system communicatively coupled to the input device and the one or more sensors. The control system is configured to command motion of an instrument in response to input received at the input device and determine, based on the sensor data, a hand presence metric. The control system is further configured to determine a drift metric of the input device based on a movement of the input device and the hand presence metric and disable teleoperational control of the instrument by the input device in response to a determination that the drift metric exceeds a drift threshold.
[0009] In general, in one aspect, one or more embodiments relate to a method for controlling a computer-assisted system, where the method is performed by a control system of the computer-assisted system. The method includes commanding motion of an instrument in response to input received at an input device, where the input device is configured to be manipulated by a user and where the control system is communicatively coupled to the input device and one or more sensors that are configured to provide sensor data indicative of an absence of a hand of the user at the input device. The method further includes determining, based on the sensor data, a hand presence metric and determining a drift metric of the input device based on a movement of the input device and the hand presence metric. The method further includes disabling teleoperational control of the instrument by the input device in response to a determination that the drift metric exceeds a drift threshold.
[0010] In general, in one aspect, one or more embodiments relate to a non-transitory machine-readable medium comprising a plurality of machine-readable instructions executed by one or more processors associated with a computer-assisted system. The computer-assisted system includes an instrument and an input device, where the input device is configured to be manipulated by a user. The computer-assisted system further includes one or more sensors configured to provide sensor data indicative of an absence of a hand of the user at the input device. The plurality of machine-readable instructions cause the one or more processors to perform a method. The method includes commanding motion of the instrument in response to input received at the input device. The method further includes determining, based on the sensor data, a hand presence metric. The method further includes determining a drift metric of the input device based on a movement of the input device and the hand presence metric. The method further includes disabling teleoperational control of the instrument by the input device in response to a determination that the drift metric exceeds a drift threshold.
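The method recited in the aspects above can be sketched as a short control-flow fragment. This is a minimal sketch under stated assumptions: the disclosure does not specify here how the hand presence metric or drift metric is computed, so simple placeholder formulas are used, and all function names are hypothetical.

```python
def hand_presence_metric(sensor_samples):
    # Assumed placeholder: fraction of recent sensor samples indicating
    # hand contact at the input device (1 = contact, 0 = no contact).
    return sum(sensor_samples) / len(sensor_samples)


def drift_metric(displacement, presence):
    # Assumed placeholder: displacement of the input device counts as
    # drift only when the hand presence metric indicates the hand is
    # likely absent (here, below an assumed 0.5 cutoff).
    return abs(displacement) if presence < 0.5 else 0.0


def should_disable_teleop(displacement, sensor_samples, drift_threshold):
    """Disable teleoperational control when the drift metric, computed
    from input device movement and the hand presence metric, exceeds
    the drift threshold."""
    presence = hand_presence_metric(sensor_samples)
    return drift_metric(displacement, presence) > drift_threshold
```

For instance, a 0.02 m displacement with no recent contact samples would exceed a 0.01 m drift threshold, while the same displacement with the hand judged present would not count as drift at all.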
[0011] Other aspects of the invention will be apparent from the following description and the appended claims.
BRIEF DESCRIPTION OF DRAWINGS
[0012] The accompanying drawings are not intended to be drawn to scale. In the drawings, each identical or nearly identical component that is illustrated in various figures may be represented by a like numeral. For purposes of clarity, not every component may be labeled in every drawing. In the drawings:
[0013] FIG. 1 shows an example computer-assisted system, in accordance with one or more embodiments.
[0014] FIG. 2 depicts a block diagram of an example computer-assisted system, in accordance with one or more embodiments.
[0015] FIG. 3 shows an example input system, in accordance with one or more embodiments.
[0016] FIG. 4 shows an example input device, in accordance with one or more embodiments.
[0017] FIG. 5 depicts an example leader-follower control system architecture, in accordance with one or more embodiments.
[0018] FIG. 6 shows a flowchart, in accordance with one or more embodiments.
[0019] FIG. 7A depicts an example movement of an input device with respect to two degrees of freedom.
[0020] FIG. 7B depicts an example drift threshold, in accordance with one or more embodiments.
[0021] FIG. 7C depicts two example drift thresholds, the example drift thresholds dependent on another aspect of a computer-assisted system.
[0022] FIG. 8 depicts various example scenarios, in accordance with one or more embodiments.
DETAILED DESCRIPTION
[0023] Specific embodiments of the disclosure will now be described in detail with reference to the accompanying figures. Like elements in the various figures are denoted by like reference numerals for consistency.
[0024] In the following detailed description of embodiments of the disclosure, numerous specific details are set forth in order to provide a more thorough understanding of the invention. However, it will be apparent to one of ordinary skill in the art that the invention may be practiced without these specific details. In other instances, well-known features have not been described in detail to avoid unnecessarily complicating the description.
[0025] Throughout the application, ordinal numbers (e.g., first, second, third, etc.) may be used as an adjective for an element (i.e., any noun in the application). The use of ordinal numbers is not to imply or create any particular ordering of the elements, and is not to limit any element to being only a single element unless expressly disclosed, such as by the use of the terms “before”, “after”, “single”, and other such terminology. Rather, the use of ordinal numbers is to distinguish between the elements. By way of an example, a first element is distinct from a second element, and the first element may encompass more than one element and succeed (or precede) the second element in an ordering of elements.
[0026] This disclosure describes various devices, elements, and portions of computer-assisted systems and elements in terms of their state in three-dimensional space. As used herein, the term “position” refers to the location of an element or a portion of an element (e.g., three degrees of translational freedom in a three-dimensional space, such as along Cartesian x-, y-, and z-coordinates). As used herein, the term “orientation” refers to the rotational placement of an element or a portion of an element (e.g., three degrees of rotational freedom in three-dimensional space, such as about roll, pitch, and yaw axes, represented in angle-axis, rotation matrix, quaternion representation, and/or the like). As used herein, and for a device with a kinematic series, such as with a repositionable structure with a plurality of links coupled by one or more joints, the term “proximal” refers to a direction toward a base of the kinematic series, and “distal” refers to a direction away from the base along the kinematic series.
[0027] As used herein, the term “pose” refers to the multi-degree of freedom (DOF) spatial position and orientation of a coordinate system of interest attached to a rigid body. In general, a pose includes a pose variable for each of the DOFs in the pose. For example, a full 6-DOF pose for a rigid body in three-dimensional space would include 6 pose variables corresponding to the 3 positional DOFs (e.g., x, y, and z) and the 3 orientational DOFs (e.g., roll, pitch, and yaw). A 3-DOF position only pose would include only pose variables for the 3 positional DOFs. Similarly, a 3-DOF orientation only pose would include only pose variables for the 3 rotational DOFs. Further, a velocity of the pose captures the change in pose over time (e.g., a first derivative of the pose). For a full 6-DOF pose of a rigid body in three-dimensional space, the velocity would include 3 translational velocities and 3 rotational velocities. Poses with other numbers of DOFs would have a corresponding number of translational and/or rotational velocities.
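The 6-DOF pose and its velocity described above can be sketched as follows. This is illustrative only: the field names and the finite-difference velocity estimate are assumptions for illustration, not a representation taken from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class Pose6DOF:
    """A full 6-DOF pose: 3 positional and 3 orientational pose variables."""
    x: float      # positional DOFs
    y: float
    z: float
    roll: float   # orientational DOFs
    pitch: float
    yaw: float


def pose_velocity(p0, p1, dt):
    """Approximate the 6 velocities (3 translational, 3 rotational) as a
    first difference of the pose variables over the time interval dt."""
    fields = ("x", "y", "z", "roll", "pitch", "yaw")
    return tuple((getattr(p1, f) - getattr(p0, f)) / dt for f in fields)
```

A 3-DOF position-only pose would carry just the first three fields, with a correspondingly shorter velocity tuple.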
[0028] Aspects of this disclosure are described in reference to computer-assisted systems, which can include devices that are teleoperated, externally manipulated, autonomous, semiautonomous, and/or the like. Further, aspects of this disclosure are described in terms of an implementation using a teleoperated surgical system, such as the da Vinci® Surgical System commercialized by Intuitive Surgical, Inc. of Sunnyvale, California. Knowledgeable persons will understand, however, that inventive aspects disclosed herein may be embodied and implemented in various ways, including teleoperated and nonteleoperated, and medical and non-medical embodiments and implementations. Implementations on da Vinci® Surgical Systems are merely exemplary and are not to be considered as limiting the scope of the inventive aspects disclosed herein. For example, techniques described with reference to surgical instruments and surgical methods may be used in other contexts. Thus, the instruments, systems, and methods described herein may be used for humans, animals, portions of human or animal anatomy, industrial systems, general robotic, or teleoperated systems. As further examples, the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, sensing or manipulating non-tissue work pieces, cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, setting up or taking down systems, training medical or non-medical personnel, and/or the like. Additional example applications include use for procedures on tissue removed from human or animal anatomies (with or without return to a human or animal anatomy) and for procedures on human or animal cadavers. Further, these techniques can also be used for medical treatment or diagnosis procedures that include, or do not include, surgical aspects.
[0029] Referring now to the drawings, in which like reference numerals represent like parts throughout the several views, FIG. 1 depicts an overhead view of an example computer-assisted system (100) as a medical system, in accordance with one or more embodiments.
[0030] While FIG. 1 shows computer-assisted system (100) as a medical system, the following description is applicable to other scenarios and systems, e.g., medical scenarios or systems that are non-surgical, non-medical scenarios or computer-assisted systems, etc.
[0031] In the example, a diagnostic or therapeutic medical procedure is performed on a patient (190) on an operating table (110). The computer-assisted system (100) may include a robotic manipulating system (130) (e.g., a patient-side robotic device in a medical example). The robotic manipulating system (130) may include at least one manipulator arm (150A, 150B, 150C, 150D), each of which may support a removably coupled instrument (160) (also called tool (160)). A manipulator assembly may be said to include a manipulator arm (150A, 150B, 150C, 150D) and an instrument (160). In FIG. 1, instruments (160) not attached to a manipulator arm are depicted on an auxiliary table or cart. In the illustrated procedure, the instrument (160) may enter the workspace through an entry location (e.g., enter the body of the patient (190) through a natural orifice such as the throat or anus, or through an incision), while an operator (192) (e.g., a clinician such as a surgeon) views the worksite (e.g., a surgical site in the surgical scenario) through a user input system (“input system”) (120). In other examples, the manipulator assembly may include a common proximal repositionable structure and one or more distal repositionable structures attached to the proximal repositionable structure. The one or more distal repositionable structures may each be configured to support one or more instruments.
For instance, each distal repositionable structure may include a prismatic joint that, when driven, linearly moves an instrument carriage and any instrument(s) coupled to the carriage along an insertion axis. In more examples, the repositionable structure may include a single instrument manipulator and no serial coupling of manipulators. In additional examples, the repositionable structure may include a single instrument manipulator coupled to a single base manipulator. In yet another example, the computer-assisted system may include a movable base that is cart-mounted or mounted to an operating table, and one or multiple manipulators mounted to the movable base.
[0032] An image of the worksite may be obtained by an imaging instrument (160) comprising an imaging device (e.g., an endoscope, an optical camera, an ultrasonic probe, etc. in a medical example). The imaging instrument (160) can be used for imaging the worksite, and may be manipulated by one of the manipulator arms (150A, 150B, 150C, 150D) of
the robotic manipulating system (130) so as to position and orient the imaging instrument (160). The auxiliary system (140) may process the captured images in a variety of ways prior to any subsequent display. For example, the auxiliary system (140) may overlay the captured images with a virtual control interface prior to displaying the combined images to the operator via the input system (120) or other display systems located locally or remotely from the procedure. One or more separate displays (not shown) may also be coupled with a control system and/or the auxiliary system (140) for local and/or remote display of images, such as images of the procedure site, or other related images.
[0033] The number of instruments (160) used at one time generally depends on the task and space constraints, among other factors. If it is appropriate to change, clean, inspect, or reload one or more of the instruments (160) being used during a procedure, an assistant (194A, 194B, 194C) may remove the instrument (160) from the manipulator arm (150A, 150B, 150C, 150D), and replace it with the same instrument (160) or another instrument (160).
[0034] Turning to FIG. 2, FIG. 2 is a block diagram (200) depicting select components of a computer-assisted system (100), such as the medical system of FIG. 1, and their relationships and interactions. The illustration, partitioning, organization, and interaction of the components and/or modules of the computer-assisted system (100) in the block diagram (200) of FIG. 2 are intended to promote clear discussion and should not be considered fixed or limiting. For example, FIG. 2 depicts a display (244) as an independent entity; however, it is well understood that in practice the display (244) may be implemented as part of, for example, the auxiliary system (140).
[0035] In general, the computer-assisted system (100) may include a control system (242). The control system (242) may be used to process input provided by the user input system (120) from an operator (192), such as to control the computer-assisted system (100) (e.g., direct movement of the manipulator arm (150A, 150B, 150C, 150D)). The control system (242) may also be used to process signals from other devices, from sensors, from any networks to which the control system (242) connects, etc. Example sensors include those associated with actuators or joints of the computer-assisted system, such as motor encoders, rotary or linear joint encoders, torque sensors, current sensors, accelerometers, force sensors, inertial measurement units, optical or ultrasonic sensors or imagers, RF sensors, etc. The control system (242) may further be used to provide an output, e.g., a video image for display by the display (244). The control system (242) may further be used to control the robotic manipulating system (130).
[0036] The control system (242) may include one or more computer processors, non-persistent storage (e.g., volatile memory, such as random access memory (RAM), cache memory), persistent storage (e.g., a hard disk, an optical drive such as a compact disk (CD) drive or digital versatile disk (DVD) drive, a flash memory, etc.), a communication interface (e.g., Bluetooth interface, infrared interface, network interface, optical interface, etc.), and numerous other elements and functionalities.
[0037] A computer processor of the control system (242) may be part or all of an integrated circuit for processing instructions. For example, the computer processor may be one or more cores or micro-cores of a processor. The control system (242) may also communicate with one or more auxiliary input devices (which may be separate from the user input system (120) and one or more input devices of the user input system (120)), such as a touchscreen, keyboard, mouse, microphone, touchpad, electronic pen, or any other type of input device.
[0038] A communication interface of the control system (242) may include an integrated circuit for connecting the control system (242) to a network (not shown) (e.g., a local area network (LAN), a wide area network (WAN) such as the Internet, mobile network, or any other type of network) and/or to another device, such as another control system (242).
[0039] Further, the control system (242) may communicate with one or more output devices, such as a display device (244) (e.g., a liquid crystal display (LCD), a plasma display, touchscreen, organic LED display (OLED), projector, or other display device), a printer, a speaker, external storage, or any other output device. One or more of the output devices may be the same or different from the auxiliary input device(s). Many different types of control systems exist, and the aforementioned input and output device(s) may take other forms.
[0040] Software instructions in the form of computer readable program code to perform embodiments of the disclosure may be stored, in whole or in part, temporarily or permanently, on a non-transitory computer readable medium such as a CD, DVD, storage device, a diskette, a tape, flash memory, physical memory, or any other computer readable storage medium. Specifically, the software instructions may correspond to computer readable program code that, when executed by a processor(s), is configured to perform one or more embodiments of the invention.
[0041] A control system (242) may be connected to or be a part of a network. The network may include multiple nodes. Each node may correspond to a computing system, or a group of nodes. By way of an example, embodiments of the disclosure may be implemented on a node of a distributed system that is connected to other nodes. By way of another
example, embodiments of the invention may be implemented on a distributed computing system having multiple nodes, where each portion of the disclosure may be located on a different node within the distributed computing system. Further, one or more elements of the aforementioned control system (242) may be located at a remote location and connected to the other elements over a network.
[0042] As stated, the computer-assisted system (100) may include a robotic manipulating system (130) (e.g., a patient-side robotic device in a medical example). The robotic manipulating system (130) may include at least one manipulator arm (150A, 150B, 150C, 150D), each of which may support a removably coupled instrument (160) (also called tool (160)). A manipulator assembly may be said to include a manipulator arm (e.g., 150A, 150B, 150C, 150D) and an instrument (160), where, as described above, the manipulator arm generally supports the instrument (160) extending distally from the manipulator arm, and effects movements of the instrument (160).
In minimally invasive scenarios, an instrument (160) may be positioned and manipulated through an entry location, so that a kinematic remote center is maintained at the entry location. Images of the worksite, taken by an imaging device of an imaging instrument such as an optical camera, may include images of the distal ends of the instruments (160) when the instruments (160) are positioned within the field-of-view of an imaging device. In some implementations, a distal instrument holder facilitates removal and replacement of the mounted instrument.
[0043] As may be understood with reference to FIG. 1, a manipulator arm (e.g., 150A, 150B, 150C, 150D) is proximally mounted to the robotic manipulating system (130), or a base of the robotic manipulating system (130). Alternatively, manipulator arms or manipulator assemblies may be mounted to separate bases that may be independently movable, e.g., by the manipulator arms or manipulator assemblies being mounted to single-manipulator-arm carts, being provided with mounting clamps that allow mounting of the manipulator arms directly or indirectly to the operating table at various locations, etc. An example manipulator arm includes a plurality of links and joints extending between the proximal base and the distal instrument holder.
[0044] In an example, a manipulator arm includes multiple joints (e.g., revolute joints, prismatic joints, etc.) and links. The joints of the manipulator arm, in combination, may or may not have redundant degrees of freedom.
[0045] In some implementations, the instrument (160) is releasably mounted on an instrument holder. The instrument holder may translate along a linear guide formed by, for
example, a prismatic joint and a distal link. Thus, the instrument holder may provide in/out movement of the instrument (160) along an insertion axis. The distal link may further support a cannula through which the instrument shaft of the instrument (160) extends. Alternatively, the cannula may be mechanically supported by another component of the manipulator arm or may not be supported at all.
[0046] Actuation of the instrument (160) in one or more degrees of freedom may be provided by actuators of the manipulator assembly. These actuators may be integrated in the instrument holder, or drive drivetrains in the instrument holder, and may actuate the instrument (160) via a transmission assembly.
[0047] FIG. 3 depicts an example user input system (“input system”) (120) as described above in reference to FIG. 1. In the example of FIG. 3, the input system (120) includes a viewer (313), where an image of a worksite can be displayed during a procedure using the computer-assisted system (100). For example, images depicting a surgical site can be displayed during a surgical procedure. The viewer (313) can be positioned within a viewing recess (311) in which a user can position their head to view images displayed by the viewer (313). When using the input system (120), the user (e.g., operator (192)) can sit in a chair in front of the user input system (120) and position their head within the viewing recess (311) such that their eyes are positioned in front of the viewer (313).
[0048] In some implementations, the input system (120) includes a user presence system including one or more sensors positioned at one or more locations of the input system (120) to detect the presence of a user proximate the input system (120). For example, the user presence system can include one or more hand sensors and/or one or more head sensors. Referring to FIG. 3, in some implementations, a head sensor is disposed within the viewing recess (311) and oriented to sense a presence of a user's head within the viewing recess (311). In some examples, the head sensor is an optical sensor that includes an emitter and a detector. In such examples, a beam of infrared or other wavelength of light can be emitted from an emitter disposed on one side of the viewing recess (311) and the emitted light can be detected on the other side of the viewing recess (311) by a detector. Thus, an interruption of the beam as sensed by the detector is indicative of an object within the viewing recess (311) (e.g., a user's head blocking the beam). Additional or alternative types of sensors can be used in various implementations. In particular, and as will be described below, a user presence system includes one or more hand sensors to sense the presence (or, in complementary fashion, the absence) of a hand of a user proximate portions of the input system (120) contactable by the hand of the user (e.g., an input device of the input system).
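The break-beam interpretation described above can be sketched briefly. This is a sketch under assumptions: the normalization of the detector reading and the 50% interruption cutoff are illustrative choices, not values from the disclosure.

```python
def head_present(detector_reading, beam_intact_level=1.0, threshold_ratio=0.5):
    """Interpret a beam interruption as presence of an object (e.g., the
    user's head) in the viewing recess.

    detector_reading: measured intensity at the detector.
    beam_intact_level: assumed reading when the emitted beam is unblocked.
    threshold_ratio: assumed fraction of the intact level below which the
        beam is considered interrupted.
    """
    return detector_reading < threshold_ratio * beam_intact_level
```

With these assumed values, a reading near the intact level (beam uninterrupted) indicates no head present, while a strongly attenuated reading indicates the beam is blocked.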
[0049] A user input system (120) can include one or more input devices. As stated, a computer-assisted system (100) can include a control system (242), where the control system (242) processes input received through the input system (120) from an operator (192) to control aspects of the computer-assisted system (100) (e.g., direct movement of the manipulator arm (150A, 150B, 150C, 150D)). For example, an input device of the input system (120) can be used to control a manipulator assembly to position, orient, and use an instrument (160). In general, the control system (242) can provide and/or facilitate a mapping of movements and other input signals (e.g., depression of a button) between an input device and components of the computer-assisted system (100) such as the robotic manipulating system (130). Further, in some implementations, behavior of the computer-assisted system (100) can be affected through one or more modes of the control system (242) and/or the disablement and enablement of teleoperational control of portions of the computer-assisted system (100) (e.g., robotic manipulating system (130)). For example, the one or more modes can configure, at least, the mapping between input devices of the input system and other components of the computer-assisted system (100) such as the robotic manipulating system (130). Greater details surrounding the modes of the control system (242), such as one or more teleoperation modes and one or more non-teleoperation modes, are described later in the instant disclosure.
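One common form of the input-to-instrument mapping just described is a scaled leader-follower mapping, where input device (leader) motion increments are scaled onto instrument (follower) motion increments. The following sketch is hypothetical: the scale factor and function name are assumptions for illustration and are not specified by the disclosure.

```python
def map_leader_to_follower(leader_delta, motion_scale=0.25):
    """Scale each leader (input device) DOF increment onto the follower
    (instrument). A scale factor below 1 lets comparatively large hand
    motions command correspondingly fine instrument motions."""
    return [d * motion_scale for d in leader_delta]


# Example: a 4 cm hand motion along x commands a 1 cm instrument motion.
follower_delta = map_leader_to_follower([0.04, -0.08, 0.0])
```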
[0050] In the example of FIG. 3, two input devices (310) are provided for user manipulation. In some implementations, each input device (310) can be configured to control motion and functions of an associated manipulator assembly of the robotic manipulating system (130). For example, an input device (310) can be moved in a plurality of degrees of freedom to move a corresponding instrument (or, in some instances, specifically the distal end of an instrument) of the robotic manipulating system (130) in corresponding degrees of freedom. In some implementations, the input devices (310) are manual input devices that can be moved in six Cartesian degrees of freedom. The depicted input devices (310) are positioned in a workspace (301) inwardly beyond a support (306). As such, in this example implementation, a user can rest their forearms on the support (306) while gripping the two input devices (310), one with each hand.
[0051] As depicted in FIG. 3, some implementations of the user input system (120) can include one or more foot controls (320) positioned below the input devices (310). The foot controls (320) can be depressed, slid, and/or otherwise manipulated by a user's feet to input various commands to the computer-assisted system (100) while the user is sitting at the user input system (120).
[0052] FIG. 4 depicts a side elevational view of an example input device (400) including example absence sensors. In some implementations, the depicted example input device (400) can be used as one or more input devices (310) as described above with respect to FIG. 3, or can be included in a different input device.
[0053] The example input device (400) includes a handle (402) coupled to a base member (408). The handle (402) of the example input device (400) includes a first end (proximal end) (404), a second end (distal end) (405) opposite the first end (404), and a central axis (412) defined between the first end (404) and the second end (405). A central portion (407) can extend between the first end (404) and the second end (405). The handle (402) (e.g., a roll member) can be rotated about the central axis (412) in a roll degree of freedom with respect to the base member (408).
[0054] In some implementations, the handle (402) can include grip members (406) that are rotationally coupled to the central portion (407) that extends along the central axis (412). In the example of FIG. 4, the central portion (407) is configured to be positioned between at least two fingers of a hand during grip of the handle by the hand. In particular, FIG. 4 depicts an index finger (450) and a thumb (460) of the hand of the user being used to contact the grip members (406). The grip members (406) can be positioned on opposite sides of the central portion (407) of the handle (402), and the grip members (406) can be grasped, held, or otherwise contacted by a user's fingers. In the depicted example, a finger loop is attached to each respective grip member (406) and can be used to secure a user's fingers to the associated grip member (406).
[0055] In some implementations, each grip member (406) can be moved in an associated degree of freedom (i.e., a “grip degree of freedom”). In some examples, the grip members (406) are each coupled to the central portion (407) of the handle (402) at respective rotational couplings, allowing rotational movement of the grip members about associated grip axes, with respect to the central portion (407). As such, each grip member (406) can be moved in an associated grip degree of freedom by a user contacting the grip members (406). In various implementations, a single grip member and finger loop can be provided, or only one of the grip members can be moved in the corresponding grip degree of freedom while the other grip member can be fixed with reference to the handle (402). For example, the positions of grip members (406) in their degrees of freedom can control corresponding rotational positions of an instrument or component thereof.
[0056] One or more grip sensors (not shown) can be coupled to the handle (402) and/or other components of the input device and can detect the positions of the grip members (406)
in their respective grip degrees of freedom. The grip sensors can send signals describing sensed positions and/or motions to the control system (242) of the computer-assisted system (100). In some modes (e.g., a teleoperation mode with enabled teleoperational control, described later in the instant disclosure) or implementations, the control system (242) can provide control signals to a device manipulated by the computer-assisted system (100) (e.g., a manipulator assembly). For example, the positions of the grip members (406) in their respective grip degrees of freedom can be used to control any of various degrees of freedom of an instrument (or, in some instances, the distal end of an instrument) controlled by the robotic manipulating system (130).
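By way of illustration only (not part of the disclosure), the relationship between a sensed grip-member position and a commanded instrument degree of freedom can be sketched as a simple linear mapping; the function name, ranges, and linearity below are assumptions chosen for the example:

```python
def map_grip_to_jaw_angle(grip_position: float,
                          grip_min: float = 0.0,
                          grip_max: float = 30.0,
                          jaw_min: float = 0.0,
                          jaw_max: float = 60.0) -> float:
    """Linearly map a grip-member angle (degrees) to an instrument jaw angle.

    The specific ranges here are illustrative assumptions, not values
    taken from the disclosure.
    """
    # Clamp the sensed position to the valid grip range.
    clamped = max(grip_min, min(grip_max, grip_position))
    # Normalize to [0, 1] and scale into the instrument's jaw range.
    fraction = (clamped - grip_min) / (grip_max - grip_min)
    return jaw_min + fraction * (jaw_max - jaw_min)
```

A real control system would apply such a mapping (possibly nonlinear, filtered, and scaled) inside the control loop that drives the instrument actuators.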
[0057] Various implementations of an input device, such as that depicted in FIG. 4, can provide one or more active actuators (e.g., motors, voice coils, etc.) to output active forces on the grip members (406) in the grip degrees of freedom. For example, a sensor and/or actuator can be housed in central portion (407) or in another portion of the input device and coupled to the grip members (406) by a transmission. Some implementations can provide one or more passive actuators (e.g., brakes) or springs between the grip members (406) and the central portion (407) of the handle (402) to provide resistance in particular directions of the grips (e.g., movement in directions toward each other in their grip degrees of freedom).
[0058] The example input device (400) can include one or more control input sensors (e.g., roll sensors) coupled to the example input device (400) to detect the roll (rotary) orientation of the handle (402) about the central axis (412). The roll sensors can send signals describing sensed orientations and/or motion to the control system (242) of the computer-assisted system (100). In some implementations, an actuator (e.g., motor) can be used to drive rotation of the handle (402) about the central axis (412). That is, in some implementations, aspects of the input device may be actuated and/or motorized.
[0059] As seen in FIG. 4, the base member (408) is rotationally coupled to the handle (402), allowing the handle (402) to rotate about the central axis (412) with respect to the base member (408). The base member (408) can have a variety of shapes and can include portions or extensions in various configurations. In an example implementation, the base member (408) is mechanically coupled to a ground such that the handle (402) is mechanically grounded, e.g., via one or more links.
[0060] In the example of FIG. 4, the base member (408) includes a first base portion (420), a second base portion (421), and a third base portion (422). The first base portion (420) is rotatably coupled to the handle (402). The second base portion (421) extends from
the first base portion (420). The third base portion (422) extends from the second base portion (421).
[0061] In various implementations, the base member (408) can be provided with additional degrees of freedom. For example, a rotational degree of freedom about a yaw axis (413) can be provided to the handle (402). In this example, the yaw axis (413) intersects and is orthogonal to the central axis (412) (where the central axis may also be described as a roll axis). Additional degrees of freedom can similarly be provided. In some examples, the handle (402) can be moved within the workspace (301) of the input system (120) with a plurality of degrees of freedom, e.g., six degrees of freedom including three rotational degrees of freedom and three translational degrees of freedom. One or more additional degrees of freedom can be sensed by associated input sensors and/or actuated by actuators (motors, etc.) similarly as described above for the grip degrees of freedom. In various implementations, sensors can sense positions of the handle (402) in a degree of freedom, or sense orientations of the handle in a degree of freedom, or sense positions and orientations of the handle in multiple degrees of freedom. For example, positions in a translational degree of freedom and orientations in a rotational degree of freedom (e.g., roll, yaw, etc.) can be sensed by one or more associated input sensors. In some examples, a position in a translational degree of freedom and/or orientation in a rotational degree of freedom can be derived from rotations of components (e.g., links of a linkage) coupled to the handle (402) as sensed by rotational sensors. Some implementations can include linear sensors that can directly sense translational motion of one or more components coupled to the handle (402). In some implementations, each additional degree of freedom of the handle (402) can control a different degree of freedom (or other motion) in the robotic manipulating system (130).
[0062] In some implementations, the input device includes one or more control switches (not shown). For example, two control switches can be positioned on opposite sides of the central axis (412), and/or additional control switches can be provided. In some examples, a control switch has a portion that can slide parallel to the central axis (412), e.g., as directed by a user's finger, or the control switch portion can be depressed. In some implementations, the control switch can be moved to various positions to provide particular command signals, e.g., to select functions, options, or modes of the input system (120) and/or input device. In some implementations, one or more of the control switches can be implemented as a button (e.g., depressed in a direction, such as perpendicular to the central axis (412) or other direction), a rotary dial, a switch that moves parallel to the central axis (412), or other type of input control. Control switches can use electromagnetic sensors,
mechanical switches, magnetic sensors, or other types of sensors to detect positions of the switch.
[0063] In the depiction of FIG. 4, a distal end (424) of the third base portion (422) includes one or more hand sensors (430). The one or more hand sensors (430) can sense objects (e.g., a hand) in one or more sensing fields in space. Herein, a “sensing field” can include multiple individual sensing fields, e.g., each individual sensing field provided by a corresponding one of multiple hand sensors. FIG. 4 depicts at least one sensing field (432). In some implementations, the one or more hand sensors (430) detect a presence of an object in the sensing field (432). For example, in some implementations, a hand sensor can detect electromagnetic radiation (or ultrasonic wave, as described below) that is directed through space to the hand sensor by a presence of an object in the sensing field (432) of the hand sensor, such as a hand. The one or more hand sensors (430) generate sensor data that is processed by the control system (242). For example, in some implementations the sensor data can include a parameter, e.g., a value that indicates the detection of an object and/or corresponds to a variable distance between the object (e.g., hand) and a corresponding hand sensor (or other reference location). The parameter can also or alternatively indicate other characteristics, e.g., velocity of the object.
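As a hedged sketch of how such a distance parameter might be classified into presence or absence (the threshold value, the use of None for "no detection," and the function name are assumptions for illustration, not taken from the disclosure):

```python
def hand_present(distance_mm, threshold_mm=80.0):
    """Classify a sensed distance parameter as hand presence or absence.

    `distance_mm` is None when the sensor reports no detection; otherwise
    it is the sensed distance between the object and the hand sensor.
    The 80 mm threshold is an illustrative assumption.
    """
    return distance_mm is not None and distance_mm <= threshold_mm
```

In practice such a decision would typically also incorporate filtering over time and the other sensed characteristics (e.g., object velocity) mentioned above.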
[0064] In the example of FIG. 4, the one or more hand sensors (430) are positioned at the distal end (424) of the third base portion (422). In some implementations, a user presence system of the input system (120) can include one or more sensors (e.g., hand sensors) at one or more other locations of the example input device (400).
[0065] In some implementations, when more than one hand sensor is provided each with its own sensing field, the sensing fields can be oriented to overlap (and thus provide measurement redundancy and/or uncertainty estimates), partially overlap, or be nonoverlapping. In some implementations, a single hand sensor can provide multiple individual sensing fields. In some implementations, a sensing field can be a combination of multiple individual sensing fields.
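One hypothetical way to exploit overlapping sensing fields for measurement redundancy and a crude uncertainty estimate (the fusion rule below is an assumption for illustration; the disclosure does not specify one):

```python
from statistics import mean, pstdev

def fuse_sensing_fields(readings):
    """Fuse distance readings (mm) from overlapping sensing fields.

    Readings of None (no detection in that field) are dropped. Returns a
    (fused_distance, spread) pair, where the spread across redundant
    fields serves as a rough uncertainty estimate.
    """
    valid = [r for r in readings if r is not None]
    if not valid:
        return None, None  # no field detected an object
    return mean(valid), pstdev(valid)
```

A large spread between redundant fields could be used to flag an unreliable detection, whereas agreement between fields increases confidence in the fused distance.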
[0066] Fingers of a hand operating the handle (402) may contact grip members (406) as shown (i.e., with index finger (450) and thumb (460)), such that the operating hand is present in at least one of the sensing fields (e.g., 432). For example, the sensing field (432) is positioned such that the hand is included in the sensing field in response to one or more fingers of the hand touching either of the two grip members (406). Thus, the position of the sensing field (432) (and other possible sensing fields not shown) allows for the detection of the user's hand while the hand operates the input device.
[0067] In some implementations, hand sensors can be provided on the left and right sides of the handle (402) relative to the perspective of a user to provide more robust sensing than a hand sensor that is centered in the third base portion (422). For example, a disadvantage of some implementations of centered hand sensors is that it may be possible, in some hand grip configurations, for a centered sensing field to be in a gap between the user’s fingers and miss detection of the hand. Further, a left and right placement of the hand sensors allows the hand sensors to detect the regions to the sides of the grip members (406), e.g., without detecting the second end (405). In some implementations, a single hand sensor pointed towards the second end (405) can be used, e.g., if the sensing field of the sensor is sufficiently wide to detect the hand in various possible hand grip configurations.
[0068] In some implementations, as shown in FIG. 4, at least a portion of the sensing field (432) is located in an approach region or path of a hand when the hand moves toward the handle (402) prior to operating the handle (402). For example, the hand enters one or more of the sensing fields as the hand approaches the handle (402) with user intent to operate the handle (402). In some implementations, the sensing field (432) has an orientation and/or size such that an object, such as a hand, can be sensed within as well as outside a particular designated region, e.g., sensed within or greater than a designated threshold distance as described herein.
[0069] The orientation, size, and/or shape of sensing fields can be based on the type of the one or more hand sensors (430) that are used to detect a hand of a user (where detection may include recognition of a presence or, in complementary fashion, an absence of a hand of a user).
[0070] In some implementations, sensing fields (e.g., 432) can be shaped as a cone. For example, the sensing field (432) can have a particular width at the hand sensor (e.g., 430) and increase in width in a direction away from the hand sensor. Herein, the term “cone” or “conical” refers to an approximate cone shape, which does not necessitate an exact conical geometry, e.g., manufacturing tolerances, interference patterns, warps due to obstructions such as the handle (402), or other allowances can be included in the conical sensing field. Furthermore, this term can refer to cones having circular cross sections, as well as or alternatively cross sections of other shapes, e.g., ellipses, ovals, rectangles, squares, triangles, etc.
[0071] Examples of one or more hand sensors (430) that can be used as depicted in the example input device (400) of FIG. 4 include electromagnetic sensors, each of which includes an emitter and a detector, the emitter configured to emit a first electromagnetic signal in the
sensing field and the detector configured to detect the first electromagnetic signal reflected from the hand in the sensing field. In some implementations, a hand sensor after the manner of FIG. 4 includes an optical time-of-flight sensor that generates a signal comprising a value that corresponds to a variable distance between the hand in the sensing field and the hand sensor. In some implementations, the hand sensor includes a thermopile sensor or thermal imaging camera that includes a detector configured to detect infrared radiation emitted by the hand in the sensing field. Other types of hand sensors can also be used, e.g., ultrasonic sensor, etc. Regardless of the type of hand sensor(s) used, the hand sensor(s) may be said to output (or return) sensor data, collected by the control system (242), the sensor data indicative, in some manner, of an absence and/or presence of a hand. For example, sensor data may include time of flight data, distance data (from a detected object to the hand sensor), probabilistic data (i.e., a probability of an object being present/absent in a sensing field), a binary detection result (e.g., “hand/object present” or “hand/object absent”), among other types of data.
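For the optical time-of-flight case, the relationship between the measured round-trip time and the one-way distance follows directly from the speed of light; this arithmetic sketch is an illustration, not the disclosed implementation:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_to_distance_mm(round_trip_time_s):
    """Convert a round-trip optical time of flight to a one-way distance in mm.

    The emitted signal travels to the object and back, so the one-way
    distance is half the round-trip path length.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0 * 1000.0
```

A round trip of about 1 ns corresponds to roughly 150 mm, on the order of the hand-to-sensor distances relevant to an input device.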
[0072] While FIG. 4 depicts one or more hand sensors (430) disposed on the example input device (400) (specifically, in the distal end (424) of the third base portion (422)), hand sensors can be employed by the computer-assisted system (100) in other locations to determine a presence and/or absence of a hand near an associated input device. Hand sensors can include, but are not limited to, optical time of flight sensors, capacitance sensors, optical sensors, resistive sensors, and a camera with an associated detection algorithm.
[0073] For example, when using a camera, the field of view of the camera contains the input device and the expected location of a hand during operation of the input device. As such, the camera can be used to acquire one or more images (or a stream of images known as a video), and the one or more images can be analyzed to detect a hand. For example, machine learning object detection algorithms can be employed for object detection, where object detection includes the tasks of classifying an object (e.g., a hand) within an image and further locating (e.g., with a bounding box or pixel-wise segmentation map) the object within the image. Machine-learning algorithms can further be used to estimate the pose (a task commonly known as “pose estimation”) of a detected hand in an image. In such cases, pose information can be used to alter a drift metric (described later in the instant disclosure) through an estimation of intent, by the user, to interact with the input device. Machine-learned models for object detection and/or pose estimation may include, but are not limited to, convolutional neural networks (CNN) and vision transformers (ViT).
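As a hedged sketch of how a camera-based detection result might be consumed (the detector interface, the "hand" label, and the score threshold are hypothetical placeholders; a real system would plug in a trained CNN or ViT detector):

```python
def hand_in_frame(frame, detector, score_threshold=0.5):
    """Report whether a hand is detected in a camera frame.

    `detector(frame)` is assumed, for this sketch, to return a list of
    (label, score, bbox) detections; the bounding box is ignored here.
    """
    return any(label == "hand" and score >= score_threshold
               for label, score, _bbox in detector(frame))
```

The same detection list could instead be fed to a pose-estimation stage when intent estimation (as described above) is desired.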
[0074] As previously stated, the computer-assisted system (100) can be a teleoperated system in which the input system (120) is, or is included in, a “leader” (also known as a “master”) device that controls at least a portion of the robotic manipulating system (130) (which in literature describing teleoperated systems may be known as a “follower” or “slave” device).
[0075] FIG. 5 depicts a control system, namely Control System A (510), that can be considered as an example of the control system (242) of the computer-assisted system (100) and used with one or more features described herein. Specifically, Control System A (510) implements a “leader-follower” architecture wherein a “leader” device can command or otherwise control, through the Control System A (510), a “follower” device. It is noted that such control systems may also be described with a nomenclature where the “leader” is known as the “master” and the “follower” is known as the “slave” (i.e., a so-called “master-slave” architecture).
[0076] Continuing with FIG. 5, Control System A (510) interfaces with a leader device (502) (e.g., input system (120)) and a follower device (504) (e.g., robotic manipulating system (130)), where, through Control System A (510), the leader device (502) can control the follower device (504). In some implementations, the leader device (502) can be, or can be included in, the input system (120). In some implementations, the follower device (504) can be, or can be included in, the robotic manipulating system (130). More generally, the leader device (502) can be any type of device including an input device (e.g., as depicted in FIG. 4) that can be physically manipulated by a user. The leader device (502) generates control signals C1 to Cx indicating positions, states, and/or changes of one or more input devices in their degrees of freedom, the control signals C1 to Cx received by Control System A (510). The leader device (502) can also generate control signals (not shown) to Control System A (510) indicating selection of physical buttons and other manipulations by the user. The leader device (502) can also generate control signals to Control System A (510) including sensor data associated with detection of user presence by a user presence system, e.g., a head sensor and/or one or more hand sensors of the leader device (502) as described below (e.g., indication of hand detection, detection parameters including distance, direction, and/or velocity of detected objects, etc.).
[0077] Control System A (510) can be included in the leader device (502), in the follower device (504), or in a separate device, e.g., an intermediary device between the leader device (502) and the follower device (504). In some implementations, the Control System A (510) can be distributed among multiple of these devices. In general, Control System A
(510) receives control signals C1 to Cx and generates actuation signals A1 to Ay that are sent to the follower device (504). Control System A (510) can also receive sensor signals B1 to By from the follower device (504) that indicate positions, orientations, states, and/or changes of various follower components (e.g., manipulator arm elements). Control System A (510) can include general components such as a processor (512), memory (514), and leader interface hardware (516) and follower interface hardware (518) for communication with the leader device (502) and follower device (504), respectively. The processor (512) can execute program code. The processor (512) need not be a singular processor but can include one or more processors of various types, including microprocessors, application specific integrated circuits (ASICs), and other electronic circuits. The memory (514) can store instructions for execution by the processor and can include any suitable processor-readable storage medium, e.g., random access memory (RAM), read-only memory (ROM), electrically erasable read-only memory (EEPROM), Flash memory, etc. Various other devices can also be coupled to Control System A (510), e.g., display(s) (520) such as the viewer (313) of the input system (120) and/or display (244) of FIG. 2. One or more other sensors of a user presence system can provide signals to Control System A (510) indicating detection of user presence and/or parameters related to such detection, e.g., the one or more hand sensors as depicted in FIG. 3. In some implementations, Control System A (510), given sensor data from one or more hand sensors, can determine a drift metric (522).
[0078] In the example depicted in FIG. 5, the Control System A (510) includes a mode control module (540), a teleoperation control module (550), and a non-teleoperation control module (560). Other implementations can use other modules, e.g., a force output control module, sensor input signal module, etc. The mode control module (540) can determine when a teleoperation mode or a non-teleoperation mode of the computer-assisted system (100) is initiated by a user or otherwise triggered (e.g., by user selection of controls, sensing a presence of a user at a user control system or control input device, sensing required manipulation of an input device, etc.). The mode control module (540) can set a teleoperation mode or a non-teleoperation mode of Control System A (510) based on one or more control signals C1 to Cx.
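A minimal sketch of a mode control module's switching behavior (the signal names and the two-state machine are assumptions for illustration; a real module would react to the richer triggers listed above):

```python
class ModeController:
    """Toggles between a teleoperation mode and a non-teleoperation mode."""

    TELEOPERATION = "teleoperation"
    NON_TELEOPERATION = "non-teleoperation"

    def __init__(self):
        # Start conservatively in the non-teleoperation mode.
        self.mode = self.NON_TELEOPERATION

    def on_control_signal(self, signal):
        # A user selection or presence event can trigger a mode change;
        # unrecognized signals leave the current mode unchanged.
        if signal == "enter_teleoperation":
            self.mode = self.TELEOPERATION
        elif signal == "exit_teleoperation":
            self.mode = self.NON_TELEOPERATION
        return self.mode
```
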
[0079] At a high-level, modes of the control system (242) can be categorized as either a teleoperation mode or a non-teleoperation mode as indicated by the teleoperation control module (550) and the non-teleoperation control module (560) depicted in FIG. 5. In general, in a teleoperation mode, one or more components of the computer-assisted system (100) (e.g., a manipulator assembly) can be controlled through interaction of a user with an input device
of the input system (120). In contrast, in a non-teleoperation mode, manipulation of the input system (120) does not cause an associated movement (and possibly a change in function) in other components of the computer-assisted system (100). In some implementations, in the non-teleoperation mode contactable input devices of the input system (120) are locked so as to prevent movement (e.g., translation of an input device) and/or commands associated with the movement of an input device are not transmitted (e.g., a signal corresponding to the depression of a button), by the control system (242), to other components of the computer-assisted system (100).
[0080] In some implementations, “teleoperational control” dictates control of one or more components of the computer-assisted system (100) (e.g., a manipulator assembly) through interaction of a user with an input device of the input system (120). In some implementations, teleoperational control can be enabled without a specific designation of a mode (e.g., a teleoperation mode). In some implementations, teleoperational control is enabled by a user meeting one or more conditions. For example, a condition may consist of a user selecting a teleoperation mode (e.g., using a graphical user interface, depressing a button, etc.). In some instances, the process of enabling teleoperational control of the computer-assisted system (100) may require a user to meet additional conditions, such as moving the input device in a prescribed direction or through a prescribed movement and ensuring that an input device and its associated controllable component (e.g., manipulator assembly (300)) are synchronized and/or properly aligned according to a mapping relating their movements.
Thus, in some implementations, it may be said that to enable teleoperational control of the computer-assisted system (100) a teleoperation entry requirement must be satisfied, where the teleoperation entry requirement can consist of many related (and possibly sequential) processes. For example, the teleoperation entry requirement may require that a user first indicate (e.g., through a selection mechanism) a desire to initiate teleoperation (e.g., selection of a teleoperation mode), followed by the user-directed movement of an input device through a specified movement (e.g., moving a grip or roll axis through its degree(s) of freedom), and then moving the input device to be aligned, according to some mapping, with an associated manipulator assembly controlled by the input device. In this example, where selection of a teleoperation mode is a condition for teleoperational control, a disablement (or exit) of teleoperational control (i.e., teleoperational control no longer enabled) need not be associated with, or necessitate, an exit from the teleoperation mode. That is, in some implementations, the teleoperation mode persists even with disablement of teleoperational control. In some implementations, teleoperational control can be re-enabled by meeting the teleoperation entry
conditions where selection of the teleoperation mode may, in some instances, already be met. In other implementations, a disablement (or exit) of teleoperational control triggers an exit or switch from a teleoperation mode to a non-teleoperation mode. In some implementations, when teleoperational control is disabled (or not enabled), contactable input devices of the input system (120) are locked so as to prevent movement (e.g., translation of an input device) and/or commands associated with the movement of an input device are not transmitted (e.g., a signal corresponding to the depression of a button), by the control system (242), to other components of the computer-assisted system (100).
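The sequential character of such a teleoperation entry requirement can be sketched as a small ordered checklist (the step names are assumptions drawn from the example above, not a specification of the disclosed requirement):

```python
ENTRY_STEPS = ("select_teleoperation_mode",
               "perform_prescribed_movement",
               "align_with_follower")

class TeleoperationEntry:
    """Tracks an ordered teleoperation entry requirement."""

    def __init__(self):
        self.completed = 0  # number of steps satisfied so far

    def satisfy(self, step):
        # Steps must be satisfied in order; out-of-order events are ignored.
        if (self.completed < len(ENTRY_STEPS)
                and step == ENTRY_STEPS[self.completed]):
            self.completed += 1
        # Teleoperational control is enabled once every step is met.
        return self.completed == len(ENTRY_STEPS)
```

This also illustrates the re-enablement case described above: if the mode-selection condition remains satisfied after a disablement, only the remaining steps would need to be repeated.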
[0081] Distinctions in operation of the control system (242) between a state of enabled and disabled (or not enabled) teleoperational control (where disablement of teleoperational control may still retain the computer-assisted system (100), or control system (242), in a teleoperation mode) can be dependent on the type of instrument(s) (160) controlled by the computer-assisted system (100), the availability and use of auxiliary functions provided and controlled by the computer-assisted system (100), and/or the state (e.g., current state) of a procedure or process performed with the computer-assisted system (100). For example, in the context of a computer-assisted system (100) as a medical system, various medical instruments (160) can be attached to and manipulated by the robotic manipulating system (130). Examples of such medical instruments (160) can include, but are not limited to, clip appliers, needle drivers, suction and/or irrigation tools, graspers, scissors, staplers, endoscopes, and energy-emitting devices (e.g., electrocautery or ablation instruments). As such, for example, when disabling teleoperational control, dependent on the type of instrument(s) (160) operated with the computer-assisted system (100) and/or the type or state of a procedure, the behavior and/or control of some instruments may be altered while others are unaffected. For example, in a state of disabled teleoperational control and/or a non-teleoperation mode, movement of all instruments can be disabled while maintaining any active functions of the instruments (e.g., irrigation instruments remain active) such as to control the computer-assisted system (100). Further, in a state of disabled teleoperational control and/or a non-teleoperation mode, movement of all instruments can be disabled while maintaining control (e.g., teleoperational control) of functions of the instruments (e.g., irrigation can be turned “on” or “off,” clip appliers can be “fired,” etc.).
[0082] In general, enabled teleoperational control (or, in instances without ambiguity, teleoperational control) may allow for the control of all or most controllable parameters of the computer-assisted system (100) though interaction of a user with the input system (120). Likewise, in general, disabled teleoperational control may inhibit, at least, movement of the
robotic manipulating system (130) even in the event of contact at an input device of the input system (120) designated to manipulate the robotic manipulating system (130). As described above, one or more teleoperation and non-teleoperation modes may be associated with the teleoperational control of a computer-assisted system (100), but this need not be the case. As such, for concision, various behaviors of a computer-assisted system (100) are defined with respect to enabled and disabled states of teleoperational control. Additionally, differences in behavior of a computer-assisted system (100) between states of enabled or disabled teleoperational control can be dependent (or dynamically altered) based on factors such as the type and state of instruments (160) employed in the computer-assisted system (100), the state of a procedure enacted with use of the computer-assisted system (100), and/or the state of a function provided by an instrument (e.g., an energy-emitting instrument can have a state of “energy on” and a state of “energy off”). Disabling or inhibiting movement of the robotic manipulating system (130) can be provided through, for example, inhibiting movement of the one or more manipulator arms (e.g., 150A, 150B, 150C, 150D) (e.g., back-driving the actuators of the one or more manipulator arms to prevent/oppose movement of the one or more manipulator arms, locking the joints of one or more manipulator arms, putting the actuators of one or more manipulator arms into a gravity-compensation mode wherein the manipulator arm maintains a position but can otherwise move subject to an external force, etc.), inhibiting movement of the one or more input devices (e.g., FIG. 4) of the input system (120) (e.g., by back-driving actuators to prevent/oppose movement of one or more input devices), not propagating signals/movement associated with the input system (120) to the corresponding components of the robotic manipulating system (130), or any combination thereof.
Further, disablement of teleoperational control (and/or a switch to a non- teleoperation mode) can selectively disable or maintain functions of certain instruments (e.g., energy to electrical cortical stimulation instruments turned off while irrigation instruments remain active).
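The selective behavior described above can be sketched as a per-instrument policy applied on disablement (the state representation and function name are assumptions for illustration):

```python
def disable_teleoperational_control(instruments):
    """Inhibit motion for every instrument while leaving functions untouched.

    `instruments` maps an instrument name to a mutable state dict with
    "motion_enabled" and "function_active" flags (an assumed representation).
    """
    for state in instruments.values():
        state["motion_enabled"] = False  # all instrument movement inhibited
        # "function_active" is deliberately left unchanged so that, e.g.,
        # an irrigation instrument remains active while motion is disabled.
    return instruments
```

A policy that also turns off selected functions (e.g., de-energizing a stimulation instrument while keeping irrigation active) would extend this loop with per-instrument-type rules.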
[0083] Generally, the teleoperation control module (550) can also be used to control forces on an input device of the leader device (502) (e.g., input system (120)), such as forces output on one or more components of the input device (e.g., grip members) using one or more control signals D1 to Dx output to actuator(s) used to apply forces to the components (e.g., to the grip members of the input device, in a rotary degree of freedom of the input device, on arm links coupled to the input device, etc.). In some examples, control signals D1 to Dx can be used to provide force feedback, gravity compensation, etc.
[0084] In some implementations, disabled teleoperational control can allow movement of the input device to control a display provided from cameras, or movement of cameras, that may not be included in the follower device (504). The control signals C1 to Cx can be used by the non-teleoperation control module (560) to control such elements (e.g., cursor, views, etc.) and control signals D1 to Dx can be determined by the non-teleoperation control module to cause output of forces on one or more input devices of the input system (120) during disabled teleoperational control, e.g., to indicate to the user interactions or events occurring during such modes.
[0085] Using the described "leader-follower" control scheme (also called a "master-slave" control scheme), movement of components of a computer-assisted system (100) such as a manipulator arm (e.g., 150A, 150B, 150C, 150D), an instrument (160) supported by the manipulator arm, and/or a working portion of the instrument (160) in one or more degrees of freedom corresponds to movement in one or more degrees of freedom of an associated input device (of an input system) operated by a user. The input system can be used within a room (e.g., an operating room) containing the controllable components of the computer-assisted system (100) (e.g., robotic manipulating system (130)) or can be positioned more remotely, e.g., in a different room, building, city, or country location than, for example, the robotic manipulating system (130).
[0086] Some implementations of the computer-assisted system (100) can provide enabled and disabled states of teleoperational control (and/or teleoperation and non-teleoperation modes) through the control system (242). In some examples, in a disabled teleoperational control state of the computer-assisted system (100), input (e.g., movement, button depression, etc.) to the input devices is not propagated, or otherwise mapped, to the robotic manipulating system (130). In other words, when teleoperational control is disabled, movement of an input device does not cause a movement in the robotic manipulating system (130). When teleoperational control is enabled (and/or in a teleoperation mode), the control system (242) commands motion of the robotic manipulating system (130) in response to input received at one or more input devices of the user input system (120). For example, in a leader-follower control scheme, movement of the input device(s) of the input system (120) causes the control system (242) to command similar motion of the robotic manipulating system (130).
[0087] In some examples, a user (e.g., a surgeon or other clinician or a non-medical person) controls or directs manipulation of the robotic manipulating system (130) through at least one input device manipulated by a hand of the user. In such examples, the input system
also includes one or more hand sensors that are configured to detect the presence of a user's hand operating the input device. It is noted that the complement of "hand presence" (or presence of a hand) is "hand absence" (or absence of a hand), such that knowledge or measurement of hand presence is sufficient to determine hand absence and vice versa. For example, a hand sensor can be configured to provide sensor data inferring an absence of a hand of the user at the input device, and a hand presence metric (which may be a continuous-valued variable), $p_h$, may be determined based on the sensor data. In some implementations, the hand presence metric is representative of the probability of hand presence, where $0 \le p_h \le 1$, $p_h = 0$ indicates a 0% probability of a hand being present, and $p_h = 1$ indicates a 100% probability of a hand being present. In such a case, a complementary continuous-valued variable representative of the probability of hand absence, $p_a$, is simply $p_a = 1 - p_h$. Furthermore, the hand presence metric may be represented as a ratio between the probability of hand presence and the probability of hand absence, or vice versa. As another example, the hand presence metric may be determined as a log ratio between the probability of hand presence and the probability of hand absence, or vice versa. An overall hand presence metric determined for the user input device, which may be configured with a plurality of hand sensors, can be a cumulative function (e.g., a cumulative product) of the individual hand presence metrics determined for the individual hand sensors. Some benefits of constructing sensor data and/or the hand presence metric in this manner are described later in the instant disclosure.
Similarly, in some situations, a hand sensor can be configured to detect a hand of a user near an input device and output sensor data that is, or can be made to be, a categorical variable, such as a binary categorical variable, $x$, where $x$ can be either "hand present" or "hand absent." Hereafter, for concision, a hand sensor, regardless of its configuration (e.g., time-of-flight sensor, computer vision-based classifier, etc.), is said to output (or return) sensor data, where the sensor data can be used to determine a hand presence metric. As will be described below, in some implementations, the hand presence metric can be probabilistic (e.g., a probability that a hand is absent, or a probability that a hand is present, at an input device) or categorical (e.g., "hand present" or "hand absent"). Other hand presence metrics, for example, a determined distance of a hand to a hand sensor, can be readily determined and applied without departure from this disclosure.
[0088] Further, hereafter, embodiments of the instant disclosure are described with respect to a single input device operable with a single hand of a user. However, it is noted that this choice is made for clarity and concision and does not impose a limitation on the instant disclosure. It is emphasized that embodiments disclosed herein are readily applicable to input
systems (120) that include more than one input device, where each input device is operable with a hand of a user, and input devices that are operable with more than one hand.
[0089] Various types of hand sensors can be employed by the computer-assisted system (100) to determine a hand presence metric for an input device of the computer-assisted system (100). Hand sensors can include, but are not limited to, optical time-of-flight sensors, capacitance sensors, optical sensors, resistive sensors, and a camera with an associated detection algorithm. In a camera example, a computer vision, machine learning, or other algorithm is used to determine the presence and/or location of a hand in one or more images acquired by the camera or other sensors.
[0090] In general, embodiments disclosed herein relate to a method for disabling teleoperational control of a computer-assisted system (100) in response to a determination that a drift metric exceeds a drift threshold, the drift metric based on a hand presence metric and movement of an input device. As previously stated, the term "hand presence metric" is used as a general representation for a measurement of whether a hand is absent (or present) at the input device. In some implementations, the hand presence metric is a continuous-valued probability that a hand is absent (or present) at the input device (i.e., having a value in the range [0,1]). In other implementations, the hand presence metric indicates, e.g., categorically, whether a hand is absent at the input device (e.g., "hand is absent" or "hand is present"). In other implementations, the hand presence metric is an odds ratio, for example, the ratio of the probability of hand absence to the probability of hand presence. For a given input device, the hand presence metric may be determined based on various hand detection techniques. In one example, the hand presence metric is estimated (e.g., with a probability) using sensor data from one or more hand sensors. In another example, the hand presence metric is determined as the product of one or more odds ratios, where each of the one or more odds ratios corresponds to a distinct hand sensor of the input device. Where two or more hand sensors are in use, the hand sensors may be complementary (i.e., have different methods for detecting a hand, exhibit different sensitivities or accuracies under different operating conditions, etc.). For example, an infrared detector and an optical detector (e.g., a camera) can be used simultaneously to detect a hand at an input device. In some implementations, an infrared detector and an optical detector each determine the presence/absence of a hand with a unique and complementary modality.
For example, when two or more hand sensors are in use and each provides an independent measurement of hand absence (or presence) represented as an odds ratio, then the overall odds of hand absence is the product of the individual odds. In this case, a hand sensor returning an odds ratio with a value of 1 is of interest because it indicates that the hand sensor, with its associated particular sensing modality, is unable to provide any information to discriminate between hand absence and hand presence. As a result, if a particular sensor reports an odds of 1, then it makes no contribution to the final, overall odds (i.e., it is simply a multiplication by 1). In some implementations, the hand presence metric is determined using sensor data from two or more hand sensors operating with identical modalities (e.g., two or more hand sensors of the same type) to determine the uncertainty associated with the sensor data. In such implementations, the two or more hand sensors may be positioned or oriented to have overlapping, non-overlapping, or partially overlapping sensing fields.
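As an illustration only (not part of the disclosed system), the odds-product aggregation just described can be sketched in Python, including the uninformative case of a sensor reporting an odds ratio of 1:

```python
from math import prod

def overall_absence_odds(sensor_odds):
    """Combine independent per-sensor odds ratios (hand absence : hand
    presence) into an overall odds of hand absence by multiplication."""
    return prod(sensor_odds)

# A sensor reporting odds of 1 is uninformative: it cannot discriminate
# between hand absence and presence, so it leaves the overall odds unchanged.
informative = overall_absence_odds([4.0, 0.5])              # 2.0
with_uninformative = overall_absence_odds([4.0, 0.5, 1.0])  # still 2.0
```

The multiplicative identity makes "no information" a natural neutral value, which is the property highlighted in the paragraph above.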
[0091] In general, a hand sensor returns sensor data, where the sensor data may be processed by the control system (242) to determine a hand presence metric. In some implementations, the sensor data returned by a hand sensor is a continuous-valued probability measurement that a hand is absent (or, complementarily, present) or a categorical variable indicating the state (e.g., present or absent) of the hand at an associated input device. In some implementations, sensor data returned by a hand sensor is processed by the control system (242) to convert the sensor data to a desired representation, such as a probabilistic representation (which may include an odds representation) or a categorical representation. In instances where an input device contains more than one hand sensor, sensor data from the two or more hand sensors may need to be combined to form a hand presence metric for the input device. Various methods for combining sensor data from two or more hand sensors of an input device are discussed below for instances where the sensor data of each hand sensor is (or can be made to be) probabilistic (including an odds ratio) or categorical. These methods are provided as examples and should not be considered limiting. Other methods for combining sensor data of two or more hand sensors to form a hand presence metric can be readily applied. For example, in instances where two hand sensors return sensor data in the form of a distance of a detected object from the respective hand sensors, the distances may be averaged and a hand presence metric can be formed by comparing the average distance to a predefined distance threshold.
[0092] Sensor data returned by a given hand sensor can be referenced as $h$, or as $h^{(i)}$ when more than one hand sensor needs to be referenced, where each hand sensor is indexed by $i$. As stated, the following discussion relates to instances where the sensor data of a hand sensor is, or can be made to be, categorical or probabilistic (with probabilities of either hand absence or hand presence represented in the range [0,1]). To further distinguish the sensor data returned by a hand sensor (or processed by the control system (242)) as categorical or probabilistic, a subscript of $c$ or $p$ can be added to the measurement, respectively. That is, for an $i$th hand sensor of an input device, if the $i$th hand sensor returns a categorical measurement of hand absence/presence, then this measurement may be referenced as $h_c^{(i)}$.

[0093] In some instances, probabilistic sensor data (i.e., $h_p$) determined using a hand sensor can be made categorical (i.e., $h_c$) by assigning a category to mutually exclusive and exhaustive segments over the range of probabilities (i.e., [0,1]). For example, probabilistic sensor data can be converted to a binary categorical representation according to

$$h_c = \begin{cases} \text{hand is absent}, & h_p > \theta \\ \text{hand is present}, & \text{otherwise,} \end{cases} \qquad \text{(EQ. 1)}$$

where $0 \le h_p \le 1$ and $\theta$ is a predefined segmentation variable, $0 < \theta < 1$. In some implementations, $\theta = 0.5$. The predefined segmentation variable can be adjusted based on, or otherwise dependent on, a type of instrument (160) manipulated with the input device, a type of procedure being performed by the computer-assisted system (100), and/or the current state of the procedure being performed by the computer-assisted system (100). As such, the sensitivity in determining whether a hand is absent or present at the input device, when initially measured probabilistically (or when sensor data is made probabilistic) with a hand sensor, can be tuned according to factors such as the type of instrument (160) controlled by the input device. For example, consider the case where $h_p$ represents the probability that a hand is absent at the input device. Then, for an irrigation instrument, $\theta$ may be set to 0.8, and for an energy-emitting instrument, $\theta$ may be set to 0.2. In this example, a hand sensor with associated probabilistic sensor data must have greater confidence in the hand being absent (i.e., a larger value of $h_p$) for the irrigation instrument before the categorical representation assumes the category of "hand is absent."
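The probabilistic-to-categorical conversion of EQ. 1, together with the instrument-dependent segmentation values from the example above, can be sketched as follows; the function name and threshold table are illustrative assumptions, not part of the disclosure:

```python
def categorize(h_p, theta=0.5):
    """EQ. 1: convert a probabilistic hand-absence measurement h_p in [0, 1]
    to a binary category using segmentation variable theta, 0 < theta < 1."""
    return "hand is absent" if h_p > theta else "hand is present"

# Illustrative instrument-dependent thresholds echoing the example above:
# an irrigation instrument demands greater confidence in hand absence than
# an energy-emitting instrument before declaring "hand is absent".
theta_by_instrument = {"irrigation": 0.8, "energy": 0.2}

categorize(0.6, theta_by_instrument["irrigation"])  # "hand is present"
categorize(0.6, theta_by_instrument["energy"])      # "hand is absent"
```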
[0094] In some instances, sensor data determined using a hand sensor can be represented by the ratio of the probability of hand absence over probability of hand presence, or vice versa (i.e., an odds ratio).
[0095] In a similar manner, categorical sensor data from a hand sensor can be made probabilistic by specifying a numerical value for each category. For example, a categorical representation can be converted to a probabilistic representation (with the probability corresponding to the probability of hand absence) according to

$$h_p = \begin{cases} 1, & \text{hand is absent} \\ 0, & \text{hand is present.} \end{cases} \qquad \text{(EQ. 2)}$$

[0096] As stated, a given input device can be configured with two or more hand sensors. As such, in some implementations, a hand presence metric is determined as an aggregation of sensor data from the two or more hand sensors. For example, in some implementations, the hand presence metric is a weighted sum of hand absence probabilities, each hand absence probability obtained using an individual hand sensor. Indexing the two or more hand sensors with the index $i$, and further designating that the hand presence metric is numeric, the hand presence metric can be computed as a weighted sum according to

$$\text{hand presence metric} = \sum_{i=1}^{N} a_i\, h_p^{(i)}, \qquad \text{(EQ. 3)}$$

where $N$ is the number of hand sensors ($N \ge 2$) and $a_i$ is the weight applied to the $i$th probabilistic representation of sensor data, $h_p^{(i)}$, determined using the $i$th hand sensor. In some implementations, EQ. 3 is further constrained by the relationship $\sum_{i=1}^{N} a_i = 1$ such that the hand presence metric is itself probabilistic (as opposed to simply numeric).
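As an illustrative sketch only, the weighted-sum aggregation of EQ. 3 (with the optional constraint that the weights sum to 1, so the result is itself a probability) might be implemented as:

```python
def hand_presence_metric(probs, weights):
    """EQ. 3: weighted sum of per-sensor hand-absence probabilities.
    If the weights sum to 1, the result is itself a probability."""
    if len(probs) != len(weights):
        raise ValueError("one weight per sensor required")
    return sum(a * h for a, h in zip(weights, probs))

# Two sensors, weights constrained to sum to 1:
hand_presence_metric([0.9, 0.7], [0.5, 0.5])  # ≈ 0.8
```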
[0097] In some implementations, a hand presence metric is determined as an aggregation of sensor data from two or more hand sensors, where the two or more hand sensors each return an independent odds ratio (e.g., probability of absence to probability of presence). For example, in some implementations, the hand presence metric is itself an odds ratio determined as a product of individual odds ratios, each individual odds ratio obtained using an individual hand sensor. Indexing the two or more hand sensors with the index $i$, and further designating that the hand presence metric is an overall odds ratio, the hand presence metric can be computed according to

$$\text{hand presence metric} = \prod_{i=1}^{N} o^{(i)}, \qquad \text{(EQ. 4)}$$

[0098] where $N$ is the number of hand sensors ($N \ge 2$) and $o^{(i)}$ is the odds ratio representation of the sensor data of the $i$th hand sensor. As stated above, sensor data of a hand sensor can be, or can be made to be, categorical. In such cases, where it is desired to apply EQ. 3 or EQ. 4, categorical sensor data (i.e., $h_c$) are first made probabilistic, e.g., using EQ. 2. The weights of EQ. 3 are predefined; in some implementations, the weights are equal. The predefined values for the weights can be adjusted based on, or otherwise dependent on, a type of instrument (160) manipulated with the input device, a type of procedure being performed by the computer-assisted system (100), and/or the current state of the procedure being performed by the computer-assisted system (100). For example, consider an input device with two hand sensors, the hand sensors having different sensing modalities. The two hand sensors can be referenced as a first hand sensor and a second hand sensor, each returning a probabilistic hand absence measurement (i.e., $p_a$). As an example, for a first instrument the associated weights may be set to $a_1 = 0$ and $a_2 = 1$, and for a second instrument the associated weights may be set to $a_1 = 1$ and $a_2 = 0$. Thus, in this example, for the first instrument the hand presence metric is determined using only the second hand sensor, and for the second instrument the hand presence metric is determined using only the first hand sensor. As such, the relative importance of various hand sensors can be based on factors like a type of instrument (160) manipulated with the input device, a current state of an instrument (160) manipulated with the input device (e.g., whether energy is being applied by an energy-emitting instrument, whether a grasping instrument is grasping, etc.), a type of procedure being performed by the computer-assisted system (100), the current state of the procedure being performed by the computer-assisted system (100), the current movement of the input device (e.g., the input device moving to insert an associated instrument controlled by the input device deeper into a worksite), and/or a distance and direction of a hand of the user relative to the input device (if such information is available from one or more hand absence sensors).
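The instrument-dependent weighting example above (weights of 0 and 1 selecting a single sensor) can be sketched as follows; the instrument names and the weight table are hypothetical, chosen only to echo the $a_1$/$a_2$ example:

```python
# Hypothetical weight tables keyed by instrument type; names and values are
# illustrative only, mirroring the a_1 = 0, a_2 = 1 example above.
WEIGHTS_BY_INSTRUMENT = {
    "first_instrument":  (0.0, 1.0),   # rely solely on the second sensor
    "second_instrument": (1.0, 0.0),   # rely solely on the first sensor
}

def metric_for_instrument(instrument, sensor_probs):
    """Apply EQ. 3 with weights selected by the instrument in use."""
    weights = WEIGHTS_BY_INSTRUMENT[instrument]
    return sum(a * h for a, h in zip(weights, sensor_probs))

metric_for_instrument("first_instrument", (0.2, 0.9))   # 0.9
metric_for_instrument("second_instrument", (0.2, 0.9))  # 0.2
```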
[0099] In some implementations, the sensor data returned by a hand sensor is categorical (or can readily be made categorical). In these instances, the sensor data of two or more such hand sensors can still be aggregated (e.g., using EQ. 3) and the hand presence metric subsequently made categorical through application of a relationship that maps a numeric (or, in some instances, more strictly, probabilistic) hand presence metric to a categorical hand presence metric. For example, a probabilistic hand presence metric can be made categorical using the relationship

$$\text{categorical hand presence metric} = \begin{cases} \text{hand is absent}, & hpm_p > \phi \\ \text{hand is present}, & \text{otherwise,} \end{cases} \qquad \text{(EQ. 5)}$$

where $hpm_p$ represents the probabilistic hand presence metric and $\phi$ is a predefined segmentation variable, $0 < \phi < 1$.
[00100] To summarize the preceding discussion of the hand presence metric, a hand presence metric is computed for each input device of an input system (120). In some implementations, the hand presence metric is numeric (e.g., distance, probabilistic, odds, log odds, etc.) or categorical. Further, the hand presence metric for a given input device is determined using one or more hand sensors associated with the input device. Hand sensors provide sensor data. In some instances, sensor data of a hand sensor can be probabilistic (including odds ratios) or categorical (or readily made probabilistic or categorical). EQS. 1 and 2, above, provide examples of relationships for transforming between probabilistic and categorical sensor data. The transformation between probabilistic and categorical sensor data may be achieved through other relationships. EQ. 3, according to one example, is an equation for aggregating multiple probabilistic representations of sensor data (i.e., one from each hand sensor of an input device). EQ. 4, according to one example, is an equation for aggregating multiple odds representations of sensor data (i.e., one from each hand sensor of an input device). Other aggregation methods, such as identifying the maximum or minimum value, taking a median, etc., can be employed without departing from the scope of this disclosure. Use of EQS. 1-5 allows for the determination of either a numeric or categorical hand presence metric given any combination of sensor data, from one or more hand sensors, where the sensor data are either probabilistic (including odds ratios) or categorical. Other equations, or combinations of equations, can be applied to determine a hand presence metric from other types and combinations of sensor data (e.g., a distance measurement of a hand/object from a respective sensor, the pose of a hand, etc.), such that EQS. 1-5 are provided solely as examples and do not impose a limitation on the instant disclosure.
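To make the summary concrete, the following illustrative-only sketch chains EQ. 2 (categorical to probabilistic), EQ. 3 (weighted aggregation), and EQ. 5 (back to categorical) for an input device with two sensors of mixed output types; all names and values are assumptions:

```python
def to_probabilistic(h_c):
    """EQ. 2: map a categorical measurement to a hand-absence probability."""
    return 1.0 if h_c == "hand is absent" else 0.0

def aggregate(probs, weights):
    """EQ. 3: weighted sum of per-sensor hand-absence probabilities."""
    return sum(a * h for a, h in zip(weights, probs))

def to_categorical(hpm_p, phi=0.5):
    """EQ. 5: segment the probabilistic hand presence metric into a category."""
    return "hand is absent" if hpm_p > phi else "hand is present"

# Mixed sensor outputs: one categorical, one probabilistic.
readings = [to_probabilistic("hand is absent"), 0.4]
hpm = aggregate(readings, [0.5, 0.5])   # ≈ 0.7
to_categorical(hpm)                     # "hand is absent"
```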
[00101] Returning to the concept of the computer-assisted system (100) disabling teleoperational control using a drift metric, the drift metric is based on the hand presence metric and movement of the input device. The movement of an input device, such as the examples previously described with respect to FIG. 4, is sensed using sensors (e.g., encoders) of the input device. For example, sensors can sense positions of a handle of an input device in a degree of freedom, or sense orientations of the handle in a degree of freedom, or sense positions and orientations of the handle in multiple degrees of freedom. For example, positions in a translational degree of freedom and orientations in a rotational degree of freedom can be sensed by one or more associated input sensors. In some examples, a position in a translational degree of freedom and/or orientation in a rotational degree of freedom (e.g., roll, yaw, etc.) can be derived from rotations of components (e.g., links of a linkage) coupled to the handle as sensed by rotational sensors. Some implementations can
include linear sensors that can directly sense translational motion of one or more components coupled to an input device. Similarly, one or more grip sensors can be coupled to the handle of an input device and/or other components of the input device to detect the positions of, for example, grip members in their respective grip degrees of freedom. Input sensors of an input device can send signals describing sensed positions and/or motions of the input device to the control system (242) of the computer-assisted system (100). Concisely, it may be said that an input device can include one or more joints, each joint with a joint sensor (e.g., an encoder) that determines a position, velocity, and/or acceleration of the joint. It is noted that, in general, position, velocity, and acceleration can be determined from each other through integration and differentiation. Thus, position, velocity, and/or acceleration of a joint, or any similar component, is determined using a joint sensor (e.g., an encoder), where, based on the configuration of the joint sensor, the joint sensor may measure one or more of position, velocity, and acceleration. As such, "movement," or the position and orientation, and changes in the position and orientation, of an input device is measurable and quantifiable.
[00102] As stated, the drift metric of an input device is based, at least in part, on movement of the input device. In some implementations, the input device consists of a kinematic series, such as with a repositionable structure with a plurality of links coupled by one or more joints (e.g., See FIG. 4). In general, sensors (e.g., encoders) are coupled to the components or joints of an input device to detect the position (and/or velocity, acceleration) of the components or joints throughout their respective degrees of freedom (e.g., grip members have grip degrees of freedom). Thus, the “pose” of an input device can be defined by the position (and/or orientation) of each of its components or joints along with a knowledge of the geometry and disposition of any interconnecting links between joints of the input device. The pose of the input device (or information regarding the position and orientation of each of its components and/or joints (e.g., a value for each degree of freedom)) can be stored in a variety of mathematical or computational data structures such as a tensor. [00103] As such, in some implementations, a pose of an input device can be represented as a data point in an operational multidimensional space spanned by the degrees of freedom associated with the input device. In these implementations, a drift threshold can be represented as a surface (or hypersurface) in the operational multidimensional space that bounds (e.g., encloses) a volume (or hypervolume) representative of allowed movement (i.e., allowed values of the input device degrees of freedom) of the input device without the computer-assisted system (100) disabling teleoperational control. Such a surface can account for interdependencies among the degrees of freedom of the input device (i.e., the drift
threshold can change based on the position and orientation of the input device) and can further depend on other factors such as the state of a procedure being enacted with the aid of the computer-assisted system (100) and the type and/or state of the instrument (e.g., a jawed instrument with jaws currently open) controlled by the input device. Additionally, such a surface allows for drift thresholds to be established where an input device can be moved (or have movement) to the surface-based drift threshold without physical translation of the input device (e.g., movement of the input device occurs only over rotational degrees of freedom such as the roll axis (312)). Moreover, such a surface allows for accounting for movements of an input device over different degrees of freedom, where the degrees of freedom are represented with different units (e.g., millimeters, radians). Various example movements of an input device with respect to one or more degrees of freedom of the input device and associated drift thresholds are discussed in greater detail later in the instant disclosure with respect to FIGS. 7A-7C.
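One simple, illustrative way to realize such a surface is an axis-aligned ellipsoid in the operational space, in which each degree of freedom is normalized by its own allowance so that heterogeneous units (e.g., millimeters and radians) can be combined into a single dimensionless test; the allowances below are assumed values, not taken from this disclosure:

```python
def exceeds_drift_surface(deltas, scales):
    """Check a pose displacement against an axis-aligned ellipsoidal drift
    threshold. Each degree of freedom is divided by its own allowance
    (e.g., millimeters for translations, radians for rotations), so the
    surface is crossed when the normalized quadratic form exceeds 1."""
    return sum((d / s) ** 2 for d, s in zip(deltas, scales)) > 1.0

# Three translational DOFs (mm) and one roll DOF (rad); allowances are
# purely illustrative.
scales = (20.0, 20.0, 20.0, 0.5)
exceeds_drift_surface((5.0, 0.0, 0.0, 0.1), scales)  # False: inside surface
exceeds_drift_surface((0.0, 0.0, 0.0, 0.6), scales)  # True: roll alone crosses
```

Note that the second call crosses the surface purely through rotation, matching the point above that a drift threshold can be reached without physical translation of the input device.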
[00104] In accordance with one or more embodiments, teleoperational control of the computer-assisted system (100) is disabled when the drift metric exceeds a predefined drift threshold. Various implementations for determining the drift metric and setting an associated drift threshold are described below. In general, implementation of the drift metric and/or drift threshold can depend on factors of the computer-assisted system (100) such as a type of instrument (160) manipulated with the input device, a type of procedure being performed by the computer-assisted system (100), and/or the current state of the procedure being performed by the computer-assisted system (100). Additionally, the drift metric and/or the drift threshold can be informed by environmental factors such as the configuration of the worksite in which the computer-assisted system (100) is used to perform a procedure. For example, in a surgical context, the worksite can include an entry location (e.g., entry into the body of the patient (190) through a natural orifice such as the throat or anus, or through an incision). Environmental factors, such as the configuration of the worksite, can be determined in real time (or near-real time) using, for example, cameras associated with (i.e., proximate to the computer-assisted system (100)) or controlled by the computer-assisted system (100) (e.g., an endoscopic instrument). In some implementations, environmental factors such as the configuration of the worksite are determined using data collected before enacting a procedure with the computer-assisted system (100) (e.g., pre-operative images of a patient).
[00105] In some implementations, the control system (242) is configured to begin monitoring movement of the input device in response to an indication or determination of a hand not being present on/at the input device. For example, in response to an indication or
determination of a hand not being present on/at the input device (e.g., a probabilistic hand presence metric (representing hand absence) exceeding a predefined threshold, an odds ratio hand presence metric (with odds given as probability of absence to probability of presence) exceeding a predefined threshold, etc.), a reference position and/or orientation of the input device may be recorded and the control system (242) begins monitoring a current position and/or orientation of the input device. Thus, the movement can be determined based on a comparison of the reference position and/or orientation against the monitored current position and/or orientation of the input device.
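The reference-recording behavior described above might be sketched as follows; the class name, threshold value, and three-dimensional positions are illustrative assumptions:

```python
class DriftMonitor:
    """Sketch of the monitoring behavior described above: when the hand
    presence metric first indicates absence, record a reference position;
    while absence persists, compare the current position to the reference."""

    def __init__(self, absence_threshold=0.5):
        self.absence_threshold = absence_threshold
        self.reference = None

    def update(self, hand_absence_prob, position):
        """Return the displacement from the reference position, if any."""
        if hand_absence_prob > self.absence_threshold:
            if self.reference is None:
                self.reference = position          # start monitoring
            return tuple(c - r for c, r in zip(position, self.reference))
        self.reference = None                      # hand returned; reset
        return (0.0, 0.0, 0.0)

m = DriftMonitor()
m.update(0.9, (1.0, 2.0, 3.0))   # reference recorded -> (0.0, 0.0, 0.0)
m.update(0.9, (1.0, 2.0, 4.0))   # displacement -> (0.0, 0.0, 1.0)
```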
[00106] In one or more implementations, the hand presence metric indicates whether the hand is absent at the input device (i.e., a categorical hand presence metric) and the drift metric is a Euclidean distance travelled by the input device from a reference position established at the start of a duration when the hand is determined to be absent at the input device (i.e., as indicated by the hand presence metric). In other words, in these implementations, the drift metric is the distance of a current position of the input device from a reference position of the input device established when the hand presence metric indicates that a user's hand is first absent from the input device. As such, the drift threshold is a maximum allowable distance (e.g., a distance in the world frame, a Euclidean distance, etc.) of the input device from the reference position.
[00107] In one or more implementations, the hand presence metric indicates whether the hand is absent at the input device (i.e., a categorical hand presence metric) and the drift metric is a path distance travelled by the input device from a reference position established at the start of a duration when the hand is determined to be absent at the input device (i.e., as indicated by the hand presence metric). That is, in these implementations, the drift metric is determined as the length of a path traversed by the input device, the path originating at the reference position of the input device when the hand presence metric first indicates that the user’s hand is absent from the input device. In these implementations, the drift threshold is a maximum allowable path distance of the input device from the reference position.
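The straight-line (Euclidean) and path-length drift metrics of the two preceding implementations can be contrasted with a short sketch (illustrative only):

```python
from math import dist  # Python 3.8+: Euclidean distance between points

def euclidean_drift(reference, current):
    """Drift as straight-line distance from the reference position."""
    return dist(reference, current)

def path_drift(samples):
    """Drift as the length of the path traversed since the reference
    position (the first sample), summing segment lengths."""
    return sum(dist(a, b) for a, b in zip(samples, samples[1:]))

# A back-and-forth motion: small Euclidean drift, larger path drift.
samples = [(0.0, 0.0, 0.0), (0.0, 0.0, 3.0), (0.0, 0.0, 1.0)]
euclidean_drift(samples[0], samples[-1])  # 1.0
path_drift(samples)                       # 5.0
```

The contrast shows why the two metrics can warrant different thresholds: oscillating motion accumulates path distance without increasing straight-line distance.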
[00108] In one or more implementations, the hand presence metric is a continuous-valued probabilistic measure that the hand is absent at the input device, and the drift metric is an integral of a velocity of the input device weighted by the probabilistic measure that the hand is absent, taken over a duration when the probabilistic measure that the hand is absent at the input device is above a hand absence threshold. That is, given a predefined hand absence threshold, once the hand presence metric (indicating hand absence) exceeds the hand absence threshold, the velocity of the input device is multiplied by the hand presence metric and this product is integrated temporally to form the drift metric. Defining the start of the duration as $T_0$, the drift metric is

$$\text{drift metric} = \int_{T_0}^{t} v(\tau)\, hpm_p(\tau)\, d\tau,$$

where $t$ is the current time (i.e., the drift metric is computed in real time), $v(\tau)$ is the velocity of the input device at time $\tau$, and $hpm_p(\tau)$ (the probabilistic hand presence metric) is the probability that the hand is absent at the input device at time $\tau$. Thus, the drift metric is a weighted path distance. If, while accumulating the drift metric, the hand presence metric value drops below the predefined hand absence threshold before the drift metric exceeds the drift threshold, the drift metric is set to a value of zero and the integral is started anew upon the probability that the hand is absent exceeding, again, the hand absence threshold. In these implementations, the drift threshold is a scalar value.

[00109] In one or more implementations, the hand presence metric is a continuous-valued probabilistic measure (i.e., a probabilistic hand presence metric) that the hand is absent at the input device, and the drift metric is an integral of a velocity of the input device weighted by the probabilistic measure that the hand is absent over a rolling temporal window of a predefined duration. That is, given a predefined window duration of $T$, the drift metric is

$$\text{drift metric} = \int_{t-T}^{t} v(\tau)\, hpm_p(\tau)\, d\tau,$$

where $t$ is the current time (i.e., the drift metric is computed in real time), $v(\tau)$ is the velocity of the input device at time $\tau$, and $hpm_p(\tau)$ is the probabilistic measure that the hand is absent at the input device at time $\tau$. In these implementations, the drift threshold is a scalar value.
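A discrete-time sketch of the integral drift metric, including the reset behavior described above, is given below; the sampling period, threshold, and sample data are illustrative assumptions:

```python
def weighted_path_drift(samples, dt, absence_threshold=0.5):
    """Discrete sketch of the integral drift metric: accumulate
    speed * hand-absence probability * dt while the probability stays
    above the threshold; reset to zero when it drops below.
    `samples` is a time-ordered sequence of (speed, absence_probability)."""
    drift = 0.0
    for speed, p_absent in samples:
        if p_absent > absence_threshold:
            drift += speed * p_absent * dt   # v(tau) * hpm_p(tau) * dtau
        else:
            drift = 0.0                      # hand likely returned: restart
    return drift

samples = [(2.0, 0.9), (2.0, 0.9), (1.0, 0.2), (4.0, 1.0)]
weighted_path_drift(samples, dt=0.1)  # resets at the third sample
```

A rolling-window variant would instead sum only the most recent $T/dt$ samples, implementing the second integral above.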
[00110] In one or more implementations, the hand presence metric is an overall odds ratio that the hand is absent at the input device (i.e., ratio of a probability of absence to a probability of presence) and the drift metric is an integral of a velocity of the input device weighted by the overall odds ratio over a duration when the overall odds ratio is above a hand absence threshold. In one or more implementations, given a predefined hand absence odds threshold, once the hand absence odds metric exceeds the hand absence threshold, the velocity of the input device is multiplied by the hand absence odds metric and this product is integrated temporally to form the drift metric.
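The odds-ratio variant of paragraph [00110] differs only in the integrand weight. A minimal sketch, with invented function names and no reset logic, assuming a fixed sampling step:

```python
def absence_odds(p_absent: float, eps: float = 1e-9) -> float:
    """Overall odds ratio of hand absence: P(absent) / P(present).
    eps guards against division by zero as P(absent) approaches 1."""
    return p_absent / max(1.0 - p_absent, eps)

def odds_weighted_drift(samples, odds_threshold: float, dt: float) -> float:
    """samples: iterable of (speed, p_absent) pairs at fixed step dt.
    Accumulates speed * odds * dt only while the odds ratio exceeds
    the hand-absence odds threshold."""
    drift = 0.0
    for speed, p_absent in samples:
        odds = absence_odds(p_absent)
        if odds > odds_threshold:
            drift += speed * odds * dt
    return drift
```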
[00111] In one or more implementations, the hand presence metric indicates whether the hand is absent at the input device, the drift metric is a spatial location of the input device, and the drift threshold is a spatial surface surrounding the input device from a reference position established at the start of a duration when the hand is determined to be absent at the input device (i.e., as indicated by the hand presence metric). In these implementations, the drift threshold is exceeded by the drift metric crossing the spatial surface. In some examples, the spatial surface may be defined in relation to the input device (e.g., in the input device
reference frame). For instance, the spatial surface may be defined in the input device reference frame as a spatial surface surrounding a reference position of the input device established when the hand presence metric indicates that a user’s hand is first absent from the input device. In other examples, the spatial surface may be defined in relation to the manipulator arm (e.g., 150A, 150B, 150C, 150D) and/or instrument (160) being teleoperationally controlled by the input device (e.g., in a work site reference frame, in a manipulator arm reference frame, in an instrument reference frame, etc.). For instance, the spatial surface may be defined in the work site reference frame as a spatial volume surrounding a control point on the instrument being teleoperationally controlled by the input device. In yet additional examples, the spatial surface may be defined in a combination of the work site reference frame and the input device reference frame. For instance, given a mapping between movements of the input device and the manipulator assembly, a spatial surface surrounding a control point in the manipulator assembly and/or the instrument can be mapped (or inverse-mapped) to form an associated spatial surface surrounding the reference position of the input device. In any of the above examples, the spatial surface can be thought of as a virtual boundary that, if crossed by the input device (and/or the control point on the instrument, for example), results in the drift metric exceeding the drift threshold, causing the computer-assisted system (100) to disable teleoperational control (e.g., exit a teleoperation mode). As discussed, the spatial surface can correspond to a virtual boundary associated with the manipulator assembly.
As such, the drift metric can be said to exceed the drift threshold when a movement of the input device results (or will result) in the manipulator assembly (or, for example, an instrument (160)) moving into a region outside of its predefined boundary. In some implementations, the spatial surface is determined based on information relating to the work site and/or the environment of the manipulator assembly (e.g., using pre-operative surgical images, real-time endoscopic images captured by an imaging device of the computer-assisted system, intra-operative scans, etc.). For instance, the spatial surface may be defined based on locations of anatomical structures or features within the work site (e.g., as recognized using endoscopic images, etc.).
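The spatial-surface drift threshold of paragraph [00111] can be sketched with the simplest possible surface, a sphere around the reference position captured when the hand first goes absent. This is an illustrative simplification: real systems could use asymmetric, anatomy-derived, or mapped surfaces, and the function name and radius here are assumptions.

```python
import math

def crosses_boundary(position, reference, radius: float) -> bool:
    """True if the input device position has crossed a spherical virtual
    boundary of the given radius centered on the reference position.
    position/reference: (x, y, z) tuples in the same frame."""
    return math.dist(position, reference) > radius
```

In this sketch, `crosses_boundary` returning `True` corresponds to the drift metric (the spatial location) exceeding the drift threshold (the surface), which would trigger disabling of teleoperational control.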
[00112] The drift metric and/or the drift threshold can be dependent on an instrument type of instrument (160) (e.g., non-jawed instrument, jawed instrument, etc.) manipulated with the input device (e.g., the working size of the instrument may dictate a spatial surface drift threshold), a current state of an instrument (160) manipulated with the input device (e.g., whether energy is being applied by an energy-emitting instrument, whether a grasping instrument is grasping, etc.), a type of procedure being performed by the computer-assisted
system (100), the current state of the procedure being performed by the computer-assisted system (100), a characteristic of the movement of the input device (e.g., input device moving to insert an associated instrument controlled by the input device deeper into a worksite, input device moving in a certain direction, etc.), and/or a distance and direction of a hand of the user relative to the input device (if such information is available from one or more hand absence sensors). For example, for a computer-assisted system (100) used to perform a procedure in a surgical setting where one or more instruments are manipulated by a robotic manipulating system (130) of the computer-assisted system (100) to interact with a patient (e.g., patient (190)), a spatial surface drift threshold can be established based on, but not limited to, the following factors: an instrument type (e.g., an energy-emitting instrument); the current state of the procedure (e.g., the energy-emitting instrument is in close proximity to the patient (190)); the current state of the instrument (e.g., whether energy is activated at the energy-emitting instrument); and knowledge of regions where an instrument is expected to operate (i.e., a predefined worksite “volume,” where in some instances the volume is established based on pre-operative images (e.g., computed tomography) of the patient (190); in other instances the volume is established based on an endoscopic view and image analysis that identifies anatomical structure locations in the endoscopic view). As another example, the drift threshold may depend on an estimated distance of a hand relative to the input device. In this instance, for example, a Euclidean distance drift threshold may decrease in value as the distance of the hand relative to the input device increases. As yet another example, the drift threshold can be dependent on a direction of the movement of the input device.
In this instance, for example, a spatial distance drift threshold can be smaller in a first direction of movement of the input device relative to a second direction of movement of the input device, the first and second directions having opposing directions. In general, the drift threshold can depend on a characteristic of the movement of the input device (e.g., a direction of movement, a velocity, etc.).
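The direction-dependent drift threshold described above can be sketched as an asymmetric limit along one degree of freedom, with a smaller allowance in one direction than in its opposite. The function name and the limit values are illustrative assumptions only.

```python
def directional_threshold_exceeded(displacement: float,
                                   forward_limit: float = 5.0,
                                   backward_limit: float = 15.0) -> bool:
    """displacement: signed drift along one axis (e.g., mm), where
    positive is the 'first' direction (e.g., deeper insertion) and
    negative is the opposing direction. Returns True if the
    direction-specific drift threshold is exceeded."""
    if displacement >= 0.0:
        # Tighter limit in the first (e.g., insertion) direction.
        return displacement > forward_limit
    # More permissive limit in the opposing direction.
    return -displacement > backward_limit
```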
[00113] Disabling teleoperational control in response to a drift metric, based both on movement of the input device and a hand presence metric (and further conditions in some instances), exceeding a drift threshold is beneficial, at least, because it allows for intra-operative hand adjustments (i.e., temporarily removing the hand from the control input device) without the procedural delay of disabling teleoperational control and then re-enabling teleoperational control, for example, by meeting anew one or more teleoperation entry conditions. An example of an intra-operative hand adjustment can be discussed with reference to FIG. 4. As seen in FIG. 4, an example input device (400) is contacted by a hand
of a user where, as depicted, the index finger (450) contacts an upper grip member (406), and the thumb (460) contacts a lower grip member (406). During a procedure performed using the computer-assisted system (100), the user may wish to adjust the position of their hand, for example, such that the thumb (460) contacts the upper grip member (406), and the index finger (450) contacts the lower grip member (406). Such an adjustment may be desirable to improve dexterity during certain operations of a procedure and/or to prevent hand fatigue. [00114] Generally, a computer-assisted system (100) may enter or enable a state of teleoperational control. In one or more implementations, at a minimum, this process consists of a user selecting a teleoperation mode (e.g., using a graphical user interface, depressing a button, etc.). In some instances, the process of enabling teleoperational control of the computer-assisted system (100) can further require a user to meet one or more conditions, such as moving the input device in a prescribed direction or movement and ensuring that an input device and its associated controllable component (e.g., manipulator assembly) are synchronized and/or properly aligned according to a mapping relating their movements. Thus, it may be said that to enable teleoperational control of the computer-assisted system (100) a teleoperation entry requirement must be satisfied, where the teleoperation entry requirement can consist of many related (and possibly sequential) processes. For example, the teleoperation entry requirement may require that a user first indicate (e.g., through a selection mechanism) a desire to initiate teleoperation, followed by the user-directed movement of an input device through a specified movement (e.g., moving a grip or roll axis through its degree(s) of freedom), and then moving the input device to be aligned, according to some mapping, with an associated manipulator assembly controlled by the input device. 
[00115] Conventionally, upon disabling teleoperational control (e.g., engaging a non-teleoperation mode), a computer-assisted system (100) may require that all conditions of the teleoperation entry requirement be re-satisfied (re-performed) in order to enable, anew, teleoperational control of the computer-assisted system (100). In instances where disabling teleoperational control was not the desired intent of a user, for example, when removing a hand from an input device to re-orient the hand (e.g., to improve dexterity and/or adjust hand position based on the state of the procedure), the unintended disabling of teleoperational control can result in procedural delays, especially in instances where a teleoperation entry requirement must be re-met in order to re-enable teleoperational control.
[00116] Additionally, the use of a drift metric that is based on both movement of the input device and a hand presence metric reduces the number of false positives that may be associated with drift detection, where drift is the unintended movement of an input device
(e.g., bumped by a user, affected by gravity when not manipulated by a user, etc.). Finally, advantages of embodiments disclosed herein further include the fact that the drift metric and/or drift threshold can be dynamic and dependent on other aspects of the computer- assisted system (100) as previously discussed (e.g., instrument in use, state of the procedure, etc.). The determined movement of an input device can be measured relative to the input device (with a dynamically determined reference position and/or time) or a portion of the controlled device, e.g., an end effector of an instrument attached to a manipulator arm. [00117] FIG. 6 depicts a method of determining and using a drift metric and for determining whether the teleoperational control of a computer-assisted system (100) should be disabled, in accordance with one or more embodiments. As such, throughout the blocks depicted in FIG. 6, it is given that teleoperational control of the computer-assisted system (100) is enabled. In Block 602, sensor data is acquired using one or more hand sensors (“sensors”) monitoring an input device of the computer-assisted system (100), the input device capable of controlling one or more other components of the computer-assisted system (100) (e.g., a manipulator assembly), at least while the computer-assisted system (100) is in a state of enabled teleoperational control. The sensor data is used to determine a hand presence metric. Various examples of how a hand presence metric can be determined from sensor data have been described above.
[00118] In Block 604, a drift metric of the input device is determined. The drift metric is based on a movement of the input device (if any) and the hand presence metric. For example, in one implementation, the drift metric is an absolute displacement of the input device relative to a reference position, where the reference position is established when the hand presence metric indicates that the hand is absent from the input device. Thus, in this implementation, the drift metric is a representation of the movement (e.g., drift) of the input device whenever, according to the hand presence metric, the hand is absent from the input device. Accordingly, the drift metric may be said to “reset” whenever the hand presence metric indicates that the hand is present at the input device. As such, a new reference position, from which displacement of the input device is determined as the drift metric, is established for each instance where the hand presence metric switches from an indication that the hand is present to an indication that the hand is absent at the input device. Other implementations of determining the drift metric are described above.
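The Block 604 implementation described above, displacement from a reference position that is recaptured each time the hand presence metric flips from present to absent, can be sketched as follows. This is an illustrative sketch; the class name is an assumption, and the example is limited to translational position.

```python
import math

class DisplacementDrift:
    """Drift metric as absolute displacement of the input device from a
    reference position established when the hand goes absent; the metric
    'resets' whenever the hand is present again."""

    def __init__(self):
        self.reference = None  # no reference while the hand is present

    def update(self, position, hand_present: bool) -> float:
        """position: current input device position as an (x, y) or
        (x, y, z) tuple. Returns the current drift metric."""
        if hand_present:
            # Hand present: drift is zero and the reference is discarded,
            # so a new one is captured at the next hand-absent sample.
            self.reference = None
            return 0.0
        if self.reference is None:
            # Hand just went absent: establish a new reference position.
            self.reference = position
        return math.dist(position, self.reference)
```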
[00119] In Block 606, the drift metric is compared to a drift threshold, where the drift threshold can be dependent on the implementation of the drift metric and/or other factors such as a type of instrument (160) manipulated by a manipulator arm (150) controlled, when
teleoperational control of the computer-assisted system (100) is enabled, by the input device. For example, in implementations where the drift metric is an absolute displacement of the input device relative to a reference position, where the reference position is established when the hand presence metric indicates that the hand is absent from the input device, the drift threshold can be a maximum allowable absolute displacement of the input device from the reference position. As such, in Block 606, a determination as to whether the drift metric exceeds the drift threshold is made. If, in Block 606, it is determined that the drift metric exceeds the drift threshold, teleoperational control of the computer-assisted system (100) is disabled.
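The comparison of Block 606 reduces to a small decision step in the control loop. A minimal sketch (function name invented for illustration) that returns the new teleoperation-enabled state:

```python
def teleoperation_step(drift_metric: float,
                       drift_threshold: float,
                       teleop_enabled: bool) -> bool:
    """One Block 606 evaluation: if teleoperational control is enabled
    and the drift metric exceeds the drift threshold, disable it.
    Returns the resulting teleoperation-enabled state."""
    if teleop_enabled and drift_metric > drift_threshold:
        return False  # drift exceeded: teleoperational control disabled
    return teleop_enabled
```

In practice this step would run each control cycle, fed by a drift metric computed per Block 604 and a threshold that may itself depend on instrument type, procedure state, and other factors discussed above.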
[00120] One or more of the Blocks in FIG. 6 may be performed by various components of systems, previously described with reference to FIGS. 1-4. Further, blocks of the flowchart of FIG. 6 may be executed on one or more processors, e.g., of the control system (242) of the computer-assisted system (100).
[00121] In accordance with one or more embodiments, in response to disabling teleoperational control, the behavior of the computer-assisted system (100) is altered. Responses to disabling teleoperational control include, but are not limited to: generating an alert or alarm to a user, where the alert or alarm can be any combination of audio, visual, and haptic feedback (e.g., display of a text notification, emission of an audible tone, vibration delivered through a component of the input device, etc.); interrupting control or otherwise discontinuing operation of a robotic manipulating system (130) (e.g., back-driving the actuators of the one or more manipulator arms to prevent/oppose movement of the one or more manipulator arms, locking the joints of one or more manipulator arms, putting the actuators of one or more manipulator arms into a gravity-compensation mode wherein the manipulator arm maintains a position but can otherwise move subject to an external force, etc.); inhibiting movement of the one or more input devices (e.g., FIG. 4) of the input system (120) (e.g., by back-driving actuators to prevent/oppose movement of one or more input devices); not propagating signals/movement associated with the input system (120) to the corresponding components of the robotic manipulating system (130); or any combination thereof. Further, disablement of teleoperational control (and/or a switch to a non-teleoperation mode) can selectively disable or maintain functions of certain instruments (e.g., energy to electrical cortical stimulation instruments turned off while irrigation instruments remain active).
[00122] In addition, or as an alternative, to disabling teleoperational control in response to a determination that the drift metric of an input device exceeds its associated drift threshold, the control system (242) is configured to cause the input device to be held in place (e.g., by activating one or more actuators coupled to the input device) and/or to provide user feedback (e.g., haptic feedback, force feedback, etc.) to the user via the input device. That is, teleoperational control remains enabled but with imposed constraints on aspects of the computer-assisted system (100).
[00123] It is noted that, for concision and clarity, the preceding discussion has been made with reference to a single input device. However, in practice, an input system (120) (or, more generally, the computer-assisted system (100)) can have more than one input device. In instances where the computer-assisted system (100) has more than one input device, the one or more input devices can be part of the user input system (120) or distributed across multiple components of the computer-assisted system (100). Embodiments disclosed herein, relating to, at least, the determination of a drift metric and whether teleoperational control is disabled can be applied to one or more input devices of a computer-assisted system independently. For example, consider a computer-assisted system (100) with two input devices, namely, a first input device and a second input device. The computer-assisted system can enable teleoperational control, independently, for the first input device and the second input device. For example, teleoperational control of the computer-assisted system (100) associated with the first input device may be referred to as a first teleoperational control. Similarly, teleoperational control of the computer-assisted system (100) associated with the second input device may be referred to as a second teleoperational control.
Continuing with the example, the first input device, with enabled first teleoperational control, can be configured to manipulate a first manipulator arm (e.g., 150A) and/or a first instrument supported by the first manipulator arm, while the second input device, with enabled second teleoperational control, can be configured to manipulate a second manipulator arm (e.g., 150B) and/or a second instrument supported by the second manipulator arm; the first and second manipulator arms (150A, 150B) are part of a robotic manipulating system (130). Further, the first and second input devices can each have their own associated drift metric and drift threshold (where these are, for example, dependent on an associated instrument type). As such, as an example, the first input device, in response to its drift metric exceeding its drift threshold, can have the first teleoperational control disabled while the second input device retains an enabled state for the second teleoperational control. In summary, embodiments of the instant disclosure are not limited to a single input device or computer-assisted systems (100) with only one input device. One of ordinary skill in the art will readily appreciate that the aforementioned description of a drift metric and drift threshold to determine disablement of teleoperational control can be readily applied to multiple input devices, including computer-assisted systems (100) with more than one input device.
[00124] FIGS. 7A-7C illustrate example movements of an input device and associated drift thresholds. FIGS. 7A-7C depict a two-dimensional space that can be used to visualize the “movement” of an input system. The use of a two-dimensional space is to aid in visualization and to promote understanding of embodiments disclosed herein and therefore should not be considered a limitation on the instant disclosure. In the examples of FIGS. 7A- 7C, the two-dimensional space is depicted with respect to two orthogonal axes (abscissa axis and ordinate axis). The abscissa axis indicates the value of a first degree of freedom (703) of the input device. Similarly, the ordinate axis indicates the value of a second degree of freedom (705). The first degree of freedom (703) and the second degree of freedom (705) can, for example, correspond to translational motion of the input device such that the two- dimensional space can be referenced with respect to a spatial coordinate system.
[00125] FIGS. 7A-7C each consider an implementation of the drift metric where the drift metric is determined with respect to a reference position, the reference position established when the hand presence metric (associated with the input device) indicates that a hand is absent from the input device. Thus, in accordance with this implementation, a reference position (702) is established in response to a determination that a hand is absent from the input device. An example reference position (702) is depicted in FIGS. 7A-C.
[00126] FIG. 7A depicts a situation in which, upon establishing the reference position (702), the input device is moved (or experiences movement). A current position (704) of the input device, as a result of the movement of the input device, is depicted in FIG. 7A. FIG. 7A further depicts a path travelled (706) by the input device during the movement of the input device from the reference position (702) to the current position (704). In some implementations, a distance between the reference position (702) and the current position (704) is computed according to a distance metric. Examples of distance metrics can include a Euclidean distance (also known as an L2 norm) (708), a Manhattan distance (also known as an L1 norm), a length of a line traced out by the path travelled (706), or any other known distance metric (e.g., L3 norm, etc.). The distance metric can further account for differences in units used by each of the degrees of freedom (e.g., radians, millimeters, etc.). For example, the distance between the reference position (702) and the current position (704) can be computed using a generalized Euclidean distance metric as
d = √( Σᵢ₌₁ᴹ cᵢ (xᵢ(cp) − xᵢ(rp))² )  (EQ. 6)

where d is the computed distance between the current position (704) and the reference position (702), M is the number of degrees of freedom (M ≥ 1), xᵢ(cp) is the value of the i-th degree of freedom at the current position (704), xᵢ(rp) is the value of the i-th degree of freedom at the reference position (702), and cᵢ is a constant factor multiplied by the squared difference between the current position (704) and the reference position (702) for the i-th degree of freedom. The constant factors cᵢ, i ∈ {1, …, M}, can be used to make units of the degrees of freedom consistent (e.g., convert from radians to arc length) or serve as arbitrary factors to generate a distance metric from degrees of freedom with disparate units. In some implementations, only the degrees of freedom of the input device that result in a translation of the input device are considered. For example, rotation about the central axis (412), rotation about the yaw axis (413), and actuation of one or more grip members (406) may not result in a translation (either in part or “in bulk”) of the input device and thus may not be considered in a distance metric. In some implementations, the distance metric is computed with respect to the movement of a component of the computer-assisted system (100) controlled by the input device (e.g., an instrument (160) supported by a manipulator arm (150)). In these implementations, the drift metric can also be given with respect to the component of the computer-assisted system (100) controlled by the input device.
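The generalized Euclidean distance of EQ. 6 can be sketched directly, with the per-degree-of-freedom constant factors cᵢ reconciling disparate units (e.g., radians vs. millimeters). The function name is an illustrative assumption.

```python
import math

def generalized_distance(current, reference, factors) -> float:
    """EQ. 6 sketch: sqrt of the factor-weighted sum of squared
    per-DOF differences between current and reference positions.
    current, reference, factors: sequences of length M (one entry
    per degree of freedom)."""
    return math.sqrt(sum(c * (cp - rp) ** 2
                         for c, cp, rp in zip(factors, current, reference)))
```

With unit factors this reduces to the ordinary Euclidean (L2) distance; a non-unit factor can, for example, convert an angular difference in radians into an equivalent arc length before it is combined with translational terms.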
[00127] FIG. 7B depicts an example drift threshold (710) as a surface enclosing the reference position (702). In some implementations, the drift threshold (e.g., the depicted surface) is independent from the reference position (702). In other implementations, the drift threshold (710) is determined (e.g., positioned) based on the reference position (702). The example of FIG. 7B further designates a first direction (712) and a second direction (714), the first direction (712) and the second direction (714) directed in an opposing manner and aligned with the first degree of freedom (703). In the example of FIG. 7B, the drift threshold (710) is an asymmetric boundary encompassing the reference position (702). In the example of FIG. 7B, the drift threshold (710) is closer to the reference position (702) in the first direction (712) than in the second direction (714). Thus, in this example, the input device is allowed to move further in the second direction (714) from the reference position (702) relative to the first direction (712). As such, FIG. 7B depicts an example where the drift
threshold (710) depends on the movement of the input device (at least with respect to one degree of freedom).
[00128] In some implementations, the drift metric includes a singular drift metric for each degree of freedom of the input device. In these implementations, the drift threshold includes a singular drift threshold for each singular drift metric. For example, the first degree of freedom (703) can be specified in units of millimeters while the second degree of freedom (705) can be specified in units of radians (e.g., roll axis). A benefit of representing the drift threshold (710) as a surface (as depicted, for example, in FIG. 7B) is that the drift threshold (710) can be defined without excess consideration for disparate units used by the degrees of freedom of the input device. Further, if the drift threshold is composed of individual drift thresholds, one for each degree of freedom of the input device, and these singular drift thresholds are independent, the surface representation of the drift threshold encloses a hypercuboid. Further, in some implementations, the drift metric, and similarly the drift threshold, are associated with a single joint movement. That is, the drift metric and drift threshold can be based on a single degree of freedom of the input device, where the degree of freedom may be linear, angular, etc.
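The per-degree-of-freedom case above, one singular drift metric and one singular threshold per DOF, whose independent combination encloses a hypercuboid, can be sketched as follows (function name invented for illustration):

```python
def exceeds_any_dof(displacements, thresholds) -> bool:
    """displacements: per-DOF signed drift values (possibly in mixed
    units, e.g., mm and radians); thresholds: per-DOF singular drift
    thresholds in the matching units. Returns True if any singular
    drift metric exceeds its singular drift threshold, i.e., the
    input device has left the hypercuboid around the reference."""
    return any(abs(d) > t for d, t in zip(displacements, thresholds))
```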
[00129] FIG. 7C depicts another example of drift thresholds, namely, a drift threshold A (716) and a drift threshold B (718), each centered on a visualized reference position (702). FIG. 7C depicts a case where the use of drift threshold A (716) or drift threshold B (718) depends on a state of the input device. Consider an example where the pose of the input device is completely described using three degrees of freedom. In this example, the movement (or, at least, the distance computed using a distance metric) of the input device is determined using only two degrees of freedom, such as the first degree of freedom (703) and the second degree of freedom (705). The first degree of freedom (703) and the second degree of freedom (705), in this example, may directly correlate with orthogonal and translational movements of the input device. The third degree of freedom, in this example, may correspond to a rotational movement of parts of the input device without translating the input device (e.g., roll axis (312)). Referring to FIG. 7C, the drift threshold, defined and represented in terms of the first degree of freedom (703) and the second degree of freedom (705), can depend on the state (or value) of the third degree of freedom. As an example, consider the case where the third degree of freedom corresponds to a grip degree of freedom, and the grip degree of freedom can further be categorized as “open” or “closed.” In this case, the drift threshold A (716) can be used when the grip degree of freedom is open and the drift threshold B (718) can be used when the grip degree of freedom is closed. As such, in some implementations, the drift metric and the drift threshold can each be determined relative to a first subset of the degrees of freedom associated with an input device. Further, the drift metric and drift threshold can be dependent on one or more degrees of freedom in a second subset of the degrees of freedom of the input device, the second subset being complementary to the first subset.
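The grip-state-dependent selection between drift threshold A (716) and drift threshold B (718) in FIG. 7C reduces to choosing the threshold applied to the translational DOFs based on a separate DOF's state. A minimal sketch; the function name and the threshold values are illustrative assumptions only:

```python
def select_drift_threshold(grip_open: bool,
                           threshold_a: float = 10.0,
                           threshold_b: float = 4.0) -> float:
    """Returns the drift threshold applied to the translational
    degrees of freedom, selected by the state of the grip DOF:
    threshold A when the grip is open, threshold B when closed."""
    return threshold_a if grip_open else threshold_b
```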
[00130] FIG. 8 depicts various example sequences of events (800) that can occur while using a computer-assisted system (100). The sequences of events (800) specifically dictate that some events occur, such as a determination that a user’s hand is not present at an input device of the computer-assisted system (100) (e.g., according to the hand presence metric). This is done in order to demonstrate behaviors of the computer-assisted system (100), in accordance with one or more embodiments disclosed herein. That said, events in the example sequences of events (800) need not occur when using a computer-assisted system (100). [00131] In particular, FIG. 8 depicts three example scenarios, namely, scenario A, scenario B, and scenario C. The first few events of each of these scenarios are identical. As depicted in FIG. 8, in Block 802, the computer-assisted system (100) is assumed to, at first, have teleoperational control disabled. While teleoperational control is disabled, in Block 804, a user’s hand is determined to be present with respect to (w.r.t.) an input device of the computer-assisted system. The determination can be made according to a hand presence metric. As previously discussed, in some implementations, enabling teleoperational control of the computer-assisted system (100) can require the satisfaction of one or more teleoperation entry requirements, such as selecting a teleoperation mode, indicating user presence at an input system, and/or aligning an input device with a component (e.g., an instrument) of the computer-assisted system (100) the input device is intended to control. As such, the teleoperation entry requirement(s) consist of all the steps and/or conditions that must be undertaken and/or satisfied (sometimes in a strict order) to enable teleoperational control of the computer-assisted system (100). In Block 806, it is stated that the teleoperation entry requirement(s) are satisfied.
As such, proceeding from Block 806, teleoperational control of the computer-assisted system (100) is enabled, and continues to be enabled, unless stated otherwise.
[00132] In the examples of FIG. 8, it may be implied that the user’s hand is in contact with the input device in order to satisfy, at least in part, the teleoperation entry requirement(s) of the computer-assisted system (100). However, other implementations may not require that a user’s hand be in contact with the input device to satisfy, at least in part, the teleoperation
entry requirement(s) of the computer-assisted system (100). In such cases, FIG. 8 can be adjusted such that Block 804 occurs after Block 806.
[00133] Block 808 specifies that the control system (242) of the computer-assisted system (100), the teleoperation entry requirement(s) having been satisfied, has enabled teleoperational control for the input device. That is, for a computer-assisted system (100) with more than one input device, the designation of a teleoperational control as enabled or disabled can be applied to each input device independently. While teleoperational control is enabled, the control system (242) of the computer-assisted system (100) can determine a drift metric for the input device. For example, the drift metric may be determined upon a determination, based on the hand presence metric, that the user’s hand is absent with respect to (w.r.t.) the input device. Further, the drift metric can be determined in real time and may be “reset,” or otherwise altered, throughout a period of enabled teleoperational control according to subsequent events.
[00134] Block 810 specifies an event where the user’s hand is determined to not be present with respect to (w.r.t.) the input device (e.g., according to the hand presence metric). Determination that the user’s hand is not present with respect to (w.r.t.) the input device may be the result of a user removing their hand from the input device. Removal of the user’s hand from the input device can be caused by an intra-operative movement or for some other reason (e.g., the user is acknowledging an itch on their person). Following Block 810, three different example scenarios: scenario A; scenario B; and scenario C, are described.
[00135] In scenario A, after the determination that the user’s hand is not present with respect to (w.r.t.) the input device (Block 810), there is no movement of the input device, as stated in Block 812. Consequently, as seen in Block 814, without movement in the input device, the drift metric cannot exceed a drift threshold of the input device. Block 816 specifies an event where the user’s hand is determined to be present, again, with respect to (w.r.t.) the input device. In Block 818, it is explicitly stated that the computer-assisted system (100) continues with teleoperational control enabled, at least with respect to (w.r.t.) the input device.
[00136] In scenario B, after the determination that the user’s hand is not present with respect to (w.r.t.) the input device (Block 810), there is movement in the input device, as stated in Block 820. The movement of the input device is used, along with a hand presence metric, to determine the drift metric (where, in some implementations, the drift metric is determined with respect to a reference position established when, according to the hand presence metric, the user’s hand was determined not to be present with respect to the input
device (Block 810)). In Block 822, for scenario B, it is specified that the drift metric does not exceed the drift threshold. Block 824 specifies an event where the user’s hand is determined to be present, again, with respect to (w.r.t.) the input device. In Block 826, it is explicitly stated that the computer-assisted system continues with teleoperational control enabled, at least with respect to (w.r.t.) the input device.
[00137] In scenario C, after the determination that the user’s hand is not present with respect to (w.r.t.) the input device (Block 810), there is movement in the input device, as stated in Block 828. The movement of the input device is used, along with a hand presence metric, to determine the drift metric. In Block 830, for scenario C, it is specified that the drift metric does exceed the drift threshold. In response to the drift metric exceeding the drift threshold, teleoperational control of the computer-assisted system (100) is disabled (at least with respect to the input device), as stated in Block 832. As previously discussed, in response to disabling teleoperational control, various alterations in the behavior of the computer-assisted system (100) can occur, including locking the input device and providing an audiovisual signal to the user. In Block 834, an event is stated where the user re-satisfies the teleoperation entry requirement(s) (i.e., returns to Block 806) to enable, again, teleoperational control of the computer-assisted system (100) for the given input device. In some implementations, the teleoperation entry requirement(s) to “re-enable” teleoperational control are reduced. For example, re-enabling teleoperational control may not require the user to select a teleoperation mode of the computer-assisted system (100). That is, a selection of a teleoperation mode may have been made to originally satisfy the teleoperation entry requirements, but an exit from the teleoperation mode was not triggered upon disabling teleoperational control, such that selection of the teleoperation mode is not required (i.e., already satisfied) when evaluating the teleoperation entry requirement(s) to re-enable teleoperational control.
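The behavior across scenarios A, B, and C can be summarized as one evaluation cycle of a drift check. The sketch below is an illustrative assumption of how such a cycle might look; the function name, the state representation, and the threshold value are hypothetical, not taken from the patent.

```python
import math

def teleop_step(hand_present, device_pos, reference, drift_threshold=0.05):
    """One evaluation cycle of the drift check illustrated by FIG. 8.
    `reference` is the position captured when the hand left the input
    device (None while the hand is present). Returns the updated
    reference and whether teleoperational control remains enabled.
    Names and threshold are illustrative assumptions."""
    if hand_present:
        # Scenarios A and B: the hand returns before the threshold is
        # exceeded, so control stays enabled and the reference is cleared.
        return None, True
    if reference is None:
        reference = device_pos           # capture pose at hand-absent event
    drift = math.dist(device_pos, reference)
    if drift > drift_threshold:
        return reference, False          # scenario C: disable teleoperation
    return reference, True

reference, enabled = None, True
events = [
    (True, (0.0, 0.0)),    # hand present, teleoperation enabled
    (False, (0.0, 0.0)),   # hand leaves, no movement (scenario A)
    (False, (0.02, 0.0)),  # movement below the threshold (scenario B)
    (False, (0.10, 0.0)),  # movement exceeds the threshold (scenario C)
]
for hand, pos in events:
    reference, enabled = teleop_step(hand, pos, reference)
print(enabled)  # → False: control disabled after scenario C
```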
[00138] Again, it is emphasized that FIG. 8 depicts examples of sequences of events. These examples are provided to illustrate the behavior of a computer-assisted system (100) according to embodiments of the instant disclosure. That said, these events, or their sequence, need not occur as depicted in practice.
[00139] Although only a few example embodiments have been described in detail above, those skilled in the art will readily appreciate that many modifications are possible in the example embodiments without materially departing from this invention. Accordingly, all such modifications are intended to be included within the scope of this disclosure as defined in the following claims.
Claims
1. A computer-assisted system comprising: an input device configured to be manipulated by a user; one or more sensors configured to provide sensor data indicative of an absence of a hand of the user at the input device; and a control system comprising one or more processors, the control system communicatively coupled to the input device and the one or more sensors, the control system configured to: command motion of an instrument in response to input received at the input device; determine, based on the sensor data, a hand presence metric, determine a movement of the input device, and disable teleoperational control of the instrument by the input device based on the hand presence metric and the movement of the input device.
2. The computer-assisted system of claim 1, further comprising: another input device configured to be manipulated by the user; wherein the sensor data is further indicative of an absence of a hand of the user at the another input device; and wherein the control system is communicatively coupled to the another input device and further configured to: command motion of another instrument in response to input received at the another input device, determine, based on the sensor data, another hand presence metric, the another hand presence metric corresponding to the another input device, determine a movement of the another input device, and disable teleoperational control of the another instrument by the another input device based on the another hand presence metric and the movement of the another input device.
3. The computer-assisted system of claim 1, wherein disabling teleoperational control of the instrument by the input device comprises exiting a teleoperation mode.
4. The computer-assisted system of claim 1, wherein disabling teleoperational control of the instrument by the input device is performed without exiting a teleoperation mode.
5. The computer-assisted system of claim 1, wherein disabling teleoperational control of the instrument by the input device comprises disabling a movement of the instrument without disabling teleoperational control of a function of the instrument.
6. The computer-assisted system of claim 1, wherein: the movement of the input device comprises a distance travelled by the input device from a reference position, the control system is further configured to compare the distance travelled by the input device from the reference position to a drift threshold, the drift threshold defines a maximum distance of the input device from the reference position, disabling teleoperational control of the instrument by the input device is executed in response to a determination that the distance exceeds the drift threshold.
7. The computer-assisted system of claim 1, wherein: the control system is further configured to: determine a spatial location of the input device, and determine a spatial surface surrounding the input device from a reference position; disabling teleoperational control of the instrument by the input device is executed in response to a determination that the spatial location resides outside the spatial surface.
8. The computer-assisted system of claim 7, wherein the spatial surface is based on a pre-operative image of a worksite associated with the instrument.
9. The computer-assisted system of claim 1, wherein the one or more sensors comprises a first hand sensor, wherein the sensor data comprises a first output of the first hand sensor, wherein the one or more sensors further comprise a second hand sensor, wherein the sensor data comprises a second output of the second hand sensor,
wherein the hand presence metric is an aggregation of the first output and the second output of the sensor data.
10. The computer-assisted system of claim 1, wherein disabling teleoperational control with the input device comprises inhibiting motion of the input device.
11. The computer-assisted system of claim 1, wherein disabling teleoperational control comprises configuring the control system not to command motion of the instrument in response to another movement of the input device.
12. The computer-assisted system of claim 1, wherein the control system is further configured to: provide a user-detectable output in response to disabling teleoperational control.
13. The computer-assisted system of claim 1, wherein: the control system is further configured to determine a drift metric based on the movement of the input device and the hand presence metric, disabling teleoperational control of the instrument by the input device is executed in response to a determination that the drift metric exceeds a drift threshold.
14. The computer-assisted system of claim 13, wherein at least one parameter selected from the group consisting of: the drift metric and the drift threshold, is based on an instrument type of the instrument.
15. The computer-assisted system of claim 13, wherein at least one parameter selected from the group consisting of: the drift metric and the drift threshold is based on a current state of the instrument.
16. The computer-assisted system of claim 13, wherein at least one parameter selected from the group consisting of: the drift metric and the drift threshold is based on a procedure type or a current state of a procedure being performed with the computer-assisted system.
17. The computer-assisted system of claim 13,
wherein at least one parameter selected from the group consisting of: the drift metric and the drift threshold is based on a characteristic of the movement of the input device.
18. The computer-assisted system of claim 13, wherein the sensor data comprises a distance of a hand of the user from the input device or a direction of hand movement relative to the input device, wherein the drift metric and the drift threshold are based on the distance or the direction of the hand movement.
19. A method for controlling a computer-assisted system, the method performed by a control system of the computer-assisted system and comprising: commanding motion of an instrument in response to input received at an input device, wherein the input device is configured to be manipulated by a user and wherein the control system is communicatively coupled to the input device and one or more sensors that are configured to provide sensor data indicative of an absence of a hand of the user at the input device; determining, based on the sensor data, a hand presence metric; determining a movement of the input device; and disabling teleoperational control of the instrument by the input device based on the hand presence metric and the movement of the input device.
20. The method of claim 19: wherein the computer-assisted system further comprises another input device configured to be manipulated by the user and another instrument; wherein the sensor data is further indicative of an absence of a hand of the user at the another input device; and wherein the control system is communicatively coupled to the another input device and the method further comprises: commanding motion of the another instrument in response to input received at the another input device, determining, based on the sensor data, another hand presence metric, the another hand presence metric corresponding to the another input device, determining a movement of the another input device, and
disabling teleoperational control of the another instrument by the another input device based on the another hand presence metric and the movement of the another input device.
21. The method of claim 19, wherein disabling teleoperational control of the instrument by the input device comprises exiting a teleoperation mode.
22. The method of claim 19, wherein disabling teleoperational control of the instrument by the input device is performed without exiting a teleoperation mode.
23. The method of claim 19, wherein disabling teleoperational control of the instrument by the input device comprises disabling a movement of the instrument without disabling teleoperational control of a function of the instrument.
24. The method of claim 19, wherein: the movement of the input device comprises a distance travelled by the input device from a reference position, the method further comprises comparing the distance travelled by the input device from the reference position to a drift threshold, the drift threshold defines a maximum distance of the input device from the reference position, disabling teleoperational control of the instrument by the input device is executed in response to a determination that the distance exceeds the drift threshold.
25. The method of claim 19, further comprising: determining a spatial location of the input device; and determining a spatial surface surrounding the input device from a reference position, wherein disabling teleoperational control of the instrument by the input device is executed in response to a determination that the spatial location resides outside the spatial surface.
26. The method of claim 25, wherein the spatial surface is based on a pre-operative image of a worksite associated with the instrument.
27. The method of claim 19,
wherein the one or more sensors comprises a first hand sensor, wherein the sensor data comprises a first output of the first hand sensor, wherein the one or more sensors further comprise a second hand sensor, wherein the sensor data comprises a second output of the second hand sensor, wherein the hand presence metric is an aggregation of the first output and the second output of the sensor data.
28. The method of claim 19, wherein disabling teleoperational control with the input device comprises inhibiting motion of the input device.
29. The method of claim 19, wherein disabling teleoperational control comprises configuring the control system not to command motion of the instrument in response to another movement of the input device.
30. The method of claim 19, further comprising: providing a user-detectable output in response to disabling teleoperational control.
31. The method of claim 19, further comprising: determining a drift metric based on the movement of the input device and the hand presence metric, wherein disabling teleoperational control of the instrument by the input device is executed in response to a determination that the drift metric exceeds a drift threshold.
32. The method of claim 31, wherein at least one parameter selected from the group consisting of: the drift metric and the drift threshold, is based on an instrument type of the instrument.
33. The method of claim 31, wherein at least one parameter selected from the group consisting of: the drift metric and the drift threshold is based on a current state of the instrument.
34. The method of claim 31,
wherein at least one parameter selected from the group consisting of: the drift metric and the drift threshold is based on a procedure type or a current state of a procedure being performed with the computer-assisted system.
35. The method of claim 31, wherein at least one parameter selected from the group consisting of: the drift metric and the drift threshold is based on a characteristic of the movement of the input device.
36. The method of claim 31, wherein the sensor data comprises a distance of a hand of the user from the input device or a direction of hand movement relative to the input device, wherein the drift metric and the drift threshold are based on the distance or the direction of the hand movement.
37. A non-transitory machine-readable medium comprising a plurality of machine-readable instructions executed by one or more processors associated with a computer-assisted system, the plurality of machine-readable instructions causing the one or more processors to perform the method of any of claims 19 to 36.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463557008P | 2024-02-23 | 2024-02-23 | |
| US63/557,008 | 2024-02-23 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025179100A1 (en) | 2025-08-28 |
Family
ID=94924979
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2025/016722 (WO2025179100A1, pending) | System and method for controlling teleoperation based on hand presence and input device drift | 2024-02-23 | 2025-02-21 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025179100A1 (en) |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019083886A1 (en) * | 2017-10-25 | 2019-05-02 | Intuitive Surgical Operations, Inc. | System and method for repositioning input control devices |
| WO2019099504A1 (en) * | 2017-11-15 | 2019-05-23 | Intuitive Surgical Operations, Inc. | Master control device with multi-finger grip and methods therefor |
| WO2019220409A1 (en) * | 2018-05-17 | 2019-11-21 | Medical Microinstruments S.p.A. | Sterile console for robotic surgery |
| WO2020236195A1 (en) * | 2019-05-17 | 2020-11-26 | Verb Surgical Inc. | Methods for determining if teleoperation should be disengaged based on the user's gaze |
| WO2021071933A1 (en) * | 2019-10-08 | 2021-04-15 | Intuitive Surgical Operations, Inc. | Hand presence sensing at control input device |
| WO2021188127A1 (en) * | 2020-03-17 | 2021-09-23 | Verb Surgical Inc. | Drop detection of ungrounded master controller for a surgical robot |
| WO2022081908A2 (en) * | 2020-10-15 | 2022-04-21 | Intuitive Surgical Operations, Inc. | Detection and mitigation of predicted collisions of objects with user control system |
| WO2023175527A1 (en) * | 2022-03-17 | 2023-09-21 | Auris Health, Inc. | Continuous teleoperation with assistive master control |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12048505B2 (en) | Master control device and methods therefor | |
| JP6543742B2 (en) | Collision avoidance between controlled movements of an image capture device and an operable device movable arm | |
| JP5982542B2 (en) | Method and system for detecting the presence of a hand in a minimally invasive surgical system | |
| JP2015107377A (en) | Master finger tracking device and method for use in a minimally invasive surgical system | |
| WO2018165047A1 (en) | Systems and methods for entering and exiting a teleoperational state | |
| US11703952B2 (en) | System and method for assisting operator engagement with input devices | |
| JP7427815B2 (en) | User interface device with grip links | |
| US11457986B2 (en) | Force-feedback gloves for a surgical robotic system | |
| Bihlmaier | Endoscope robots and automated camera guidance | |
| US20240004369A1 (en) | Haptic profiles for input controls of a computer-assisted device | |
| WO2019083886A1 (en) | System and method for repositioning input control devices | |
| WO2025179100A1 (en) | System and method for controlling teleoperation based on hand presence and input device drift | |
| US20250162157A1 (en) | Temporal non-overlap of teleoperation and headrest adjustment in a computer-assisted teleoperation system | |
| WO2024049942A1 (en) | Techniques for updating a registration transform between an extended-reality system and a computer-assisted device | |
| WO2025184368A1 (en) | Anatomy based force feedback and instrument guidance | |
| WO2025194022A1 (en) | Force feedback reduction damper for computer-assisted system | |
| WO2025151641A1 (en) | Techniques for updating a shared extended reality anchor based on operator interactions with a computer-assisted device | |
| CN119907646A (en) | Change the instrument's operating mode based on gesture detection |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25710709; Country of ref document: EP; Kind code of ref document: A1 |