WO2025007141A1 - Systems and methods for inductive pulse frequency modulated position sensing - Google Patents
- Publication number
- WO2025007141A1 (PCT/US2024/036427)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- filter circuit
- robotic
- frequency
- controller
- response
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D5/00—Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable
- G01D5/12—Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable using electric or magnetic means
- G01D5/14—Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable using electric or magnetic means influencing the magnitude of a current or voltage
- G01D5/20—Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable using electric or magnetic means influencing the magnitude of a current or voltage by varying inductance, e.g. by a movable armature
- G01D5/2006—Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable using electric or magnetic means influencing the magnitude of a current or voltage by varying inductance, e.g. by a movable armature by influencing the self-induction of one or more coils
- G01D5/202—Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable using electric or magnetic means influencing the magnitude of a current or voltage by varying inductance, e.g. by a movable armature by influencing the self-induction of one or more coils by movable a non-ferromagnetic conductive element
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/088—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D5/00—Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable
- G01D5/12—Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable using electric or magnetic means
- G01D5/243—Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable using electric or magnetic means influencing the phase or frequency of AC
Definitions
- Surgical robotic systems permit a user (also described herein as an “operator”) to perform an operation using robotically-controlled instruments to perform tasks and functions during a procedure.
- Position sensors are used within one or more robotic arms of the surgical robotic system to output signals to a processor that can be used to resolve the position of the one or more arms within a cavity of a patient during a surgical procedure.
- Position sensors allow a user to determine the position of the robotic arm joints.
- One type of position sensor is known as a Hall effect sensor.
- A system for determining an angular position of a rotary joint is presented.
- the system comprises a filter circuit comprising a capacitor and a variable inductor, and a controller coupled to the filter circuit.
- the variable inductor can comprise a moveable target, and at least one sensing coil affixed to the rotary joint.
- the controller can be programmed or configured to transmit an input signal of varying frequency to the filter circuit.
- the controller can be further programmed or configured to detect a signal corresponding to a response of the filter circuit to the input signal.
- the controller can be further programmed or configured to characterize a frequency response of the filter circuit based at least in part on a determination that the signal corresponding to the response of the filter circuit is corrupted.
- the controller can be further programmed or configured to determine an inductance value of the variable inductor based at least in part on the frequency response of the filter circuit.
- the controller can be further programmed or configured to determine the angular position of the rotary joint based at least in part on the inductance value which is indicative of a positional relationship between the moveable target and the at least one sensing coil.
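The final determination steps recited above can be illustrated with a short numerical sketch. It assumes, as a simplifying model not taken from the patent, that the capacitor and variable inductor form a low-pass filter with cutoff frequency f_c = 1/(2π√(LC)), and that inductance maps roughly linearly onto joint angle over the sensing range; the function names and the linear calibration are hypothetical.

```python
import math

def inductance_from_cutoff(f_cutoff_hz: float, c_farads: float) -> float:
    """Invert f_c = 1 / (2*pi*sqrt(L*C)) to recover the inductance L
    (assumes a simple second-order low-pass LC model)."""
    return 1.0 / (c_farads * (2.0 * math.pi * f_cutoff_hz) ** 2)

def angle_from_inductance(l_henries: float, l_min: float, l_max: float,
                          angle_range_deg: float = 360.0) -> float:
    """Map inductance linearly onto joint angle. This linear map is an
    illustrative calibration model; a real system would use a measured
    calibration curve relating target position to inductance."""
    frac = (l_henries - l_min) / (l_max - l_min)
    return frac * angle_range_deg
```

With a hypothetical 1 nF capacitor and a characterized 5 MHz cutoff, `inductance_from_cutoff(5e6, 1e-9)` gives roughly 1.01 µH; the calibration map then converts that inductance to a joint angle.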
- FIG. 1 schematically depicts an example surgical robotic system in accordance with some embodiments.
- FIG. 2A is an example perspective view of a patient cart including a robotic support system coupled to a robotic subsystem of the surgical robotic system in accordance with some embodiments.
- FIG. 2B is an example perspective view of an example operator console of a surgical robotic system of the present disclosure in accordance with some embodiments.
- FIG. 3A schematically depicts an example side view of a surgical robotic system performing a surgery within an internal cavity of a subject in accordance with some embodiments.
- FIG. 3B schematically depicts an example top view of the surgical robotic system performing the surgery within the internal cavity of the subject of FIG. 3A in accordance with some embodiments.
- FIG. 4A is an example perspective view of a single robotic arm subsystem in accordance with some embodiments.
- FIG. 4B is an example perspective side view of a single robotic arm of the single robotic arm subsystem of FIG. 4A in accordance with some embodiments.
- FIG. 5 is an example perspective front view of a camera assembly and a robotic arm assembly in accordance with some embodiments.
- FIG. 6A is an example perspective view of a left hand controller for use in an operator console of a surgical robotic system in accordance with some embodiments.
- FIG. 6B is an example perspective view of a right hand controller for use in an operator console of a surgical robotic system in accordance with some embodiments.
- FIG. 7A is an example perspective view of a left hand controller for use in an operator console of a surgical robotic system in accordance with some embodiments.
- FIG. 7B is an example perspective view of a right hand controller for use in an operator console of a surgical robotic system in accordance with some embodiments.
- FIG. 8A is an example perspective view of a left hand controller for use in an operator console of a surgical robotic system in accordance with some embodiments.
- FIG. 8B is an example perspective view of a right hand controller for use in an operator console of a surgical robotic system in accordance with some embodiments.
- FIG. 9 is a side view of an example position sensor with a target, coil sensors, and background in a robotic joint, in accordance with some embodiments.
- FIG. 10 is an example shape of a target as well as sensor coils used in a position sensor, in accordance with some embodiments.
- FIG. 11 is a schematic of a filter circuit coupled to a controller that is used to determine a position of a robotic arm in response to a signal that is input to the filter circuit, in accordance with some embodiments.
- FIG. 12A is an example timing diagram illustrating a comparison of the shape of a signal modulated at a selected frequency that is input to a filter circuit and the shape of a corresponding output signal from the filter circuit, in accordance with some embodiments.
- FIG. 12B is an example timing diagram illustrating a comparison of the shape of a signal modulated at a selected frequency that is input to a filter circuit and the shape of a corresponding output signal from the filter circuit, in accordance with some embodiments.
- FIG. 12C is an example timing diagram illustrating a comparison of the shape of a signal modulated at a selected frequency that is input to a filter circuit and the shape of a corresponding output signal from the filter circuit, in accordance with some embodiments.
- FIG. 13A is an example 4 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
- FIG. 13B is an example 10 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
- FIG. 13C is an example 20 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
- FIG. 13D is an example 40 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
- FIG. 13E is an example 60 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
- FIG. 14 is an example flowchart corresponding to determining a position of a robotic joint using an inductive sensing method, in accordance with some embodiments.
- FIG. 15 schematically depicts an example computing module of the surgical robotic system in accordance with some embodiments.
- the frequency modulated position sensing technology disclosed herein can also be applied in the areas of small personal consumer electronic devices with a dial.
- volume knobs or dials that are used to tune a radio to a specific frequency could be an area where frequency modulated position sensing technology is applicable.
- the electronics used to sense the joint angles need to fit into small spaces within the robotic arms while remaining flexible enough to bend around the contours of the space available.
- a filter circuit, including a capacitor, a target, and sense coils whose inductance changes as the target moves over or near the sense coils, receives signals of varying frequencies from a controller and outputs signals of varying amplitude and frequency in response to the movement of the target and the input signals.
- the controller generates these signals of varying frequency by modulating the frequency of a signal whose periodicity is equivalent to that of the controller's clock.
- each of the frequency modulated signals corresponds to a unique frequency by which the clock-rate signal is modulated.
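A minimal sketch of generating such a pulse train at a selectable modulation frequency, sampled at the controller's clock rate. The NumPy sample-based model, the 480 MHz clock value, and the parameter names are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def modulated_square_wave(mod_freq_hz: float, clock_hz: float,
                          n_samples: int) -> np.ndarray:
    """0/1 pulse train toggling at mod_freq_hz, sampled at the clock rate."""
    t = np.arange(n_samples) / clock_hz          # sample instants
    return (np.floor(2.0 * mod_freq_hz * t) % 2).astype(int)

# Sweep a set of modulation frequencies, e.g. those shown in FIGS. 13A-13E,
# using a hypothetical 480 MHz controller clock:
sweep = {f: modulated_square_wave(f, 480e6, 1024)
         for f in (4e6, 10e6, 20e6, 40e6, 60e6)}
```

Each entry in `sweep` is one candidate input waveform; the controller would transmit these in turn and examine the filter circuit's response to each.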
- as the controller sweeps through a range of frequencies by which to modulate the signal input to the filter circuit, the filter circuit generates an output signal that changes in shape once the frequency exceeds a certain value.
- from this, the controller can determine the frequency response of the circuit, which can be used to determine the inductance value of the filter circuit; the inductance is directly related to the position of the rotary joint as the target comes within proximity of the sense coils. As a result, the position of the rotary joint can quickly be determined using a minimal number of components that fit within the confines of the robotic arms.
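The sweep-and-detect logic described above can be sketched as follows, modeling the filter's return signal with a first-order low-pass magnitude response and flagging the first sweep frequency whose gain drops below -3 dB. The first-order model, the threshold, and the function names are assumptions for illustration, not the patent's detection criterion.

```python
import math

def lowpass_gain(f_hz: float, f_cutoff_hz: float) -> float:
    """Magnitude response |H(f)| of a first-order low-pass filter."""
    return 1.0 / math.sqrt(1.0 + (f_hz / f_cutoff_hz) ** 2)

def find_cutoff_by_sweep(frequencies, measure_gain,
                         threshold: float = 1.0 / math.sqrt(2)):
    """Return the first swept frequency whose measured gain falls below the
    -3 dB threshold, i.e. where the return signal starts to look corrupted."""
    for f in frequencies:
        if measure_gain(f) < threshold:
            return f
    return None  # cutoff lies above the swept range
```

For the sweep frequencies shown in FIGS. 13A-13E (4, 10, 20, 40, and 60 MHz) and a 5 MHz cutoff, this sweep flags 10 MHz as the first visibly attenuated frequency; a finer sweep around that point would localize the cutoff, and hence the inductance, more precisely.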
- the term “controller” may refer to a hardware device that includes a memory and a processor and is specifically programmed to execute the processes described herein in accordance with some embodiments.
- the memory is configured to store the modules and the processor is specifically configured to execute said modules to perform one or more processes which are described further below.
- multiple different controllers, or multiple different types of controllers, may be employed in performing one or more processes.
- different controllers may be implemented in different portions of a surgical robotic system.
- Some embodiments disclosed herein are implemented on, employ, or are incorporated into a surgical robotic system that includes a camera assembly having at least three articulating degrees of freedom and two or more robotic arms each having at least six articulating degrees of freedom and an additional degree of freedom corresponding to the movement of an associated end-effector (e.g., grasper, manipulator, and the like).
- the camera assembly when mounted within a subject (e.g., a patient) can be moved or rotated in a pitch or yaw direction about 180 degrees such that the camera assembly can view rearwardly back towards the insertion site.
- the camera assembly and the robotic arms can view and operate dexterously forward (e.g., away from the insertion site), to each side, in an upward or downward direction, as well as in the rearward direction to view backwards towards the insertion site.
- the robotic arms and the camera assembly can also move in the roll, pitch and yaw directions.
- the large number of degrees of freedom in some surgical robotic systems described herein in comparison to some conventional surgical robotic systems, enables movements of a robotic arm assembly and orientations of a robotic arm assembly not possible with some conventional surgical robotic arms and enables movements of a camera of a robotic camera assembly not possible in cameras for some conventional robotic surgical systems.
- many conventional surgical robotic systems having two robotic arms and fewer degrees of freedom per arm may not be able to change a position or an orientation of a virtual chest of the robotic arms assembly while keeping instrument tips of end effectors of the robotic arms stationary.
- cameras of many conventional surgical robotic systems may only have degrees of freedom associated with movement of a support for the camera extending through a trocar and may have no independent degrees of freedom for movement relative to the support.
- Some embodiments described herein provide methods and systems employing multiple different control modes, which may be described as a plurality of control modes herein, for controlling a surgical robotic system before, during or after a robotic arms assembly of the surgical robotic system is disposed within an internal body cavity of a subject.
- the robotic arms assembly includes at least two robotic arms, which may be described as a “robotic arm assembly” or “arm assembly” herein.
- the robotic arms assembly also includes a camera assembly, which may also be referred to as a “surgical camera assembly” or “robotic camera assembly” herein.
- Each control mode uses sensed movement of one or more hand controllers, and may also use input from one or more foot pedals, to control the robotic arm assembly and/or the camera assembly.
- a control mode may be changed from a current control mode to a different selected control mode based on operator input (e.g., provided via one or more hand controllers and/or a foot pedal of the surgical robotic system). In different control modes, the same movements of the hand controllers may result in different motions of the surgical robotic assembly.
- an orientation or a direction of view of a “camera assembly” or a “camera” is referring to an orientation or a direction of a component or group of components of the surgical robotic arms assembly that includes one or more cameras or other imaging devices that can collectively change orientation with respect to the robotic arm assembly and provide image data to be displayed.
- the one or more cameras or other imaging devices may all be disposed in a same housing whose orientation can be changed relative to a support (e.g., support tube or support shaft) for the camera assembly.
- a system for robotic surgery may include a robotic subsystem.
- the robotic subsystem includes at least a portion, which may also be referred to herein as a robotic arms assembly that can be inserted into a patient via a trocar through a single incision point or site.
- the portion inserted into the patient via a trocar is small enough to be deployed in vivo at the surgical site and is sufficiently maneuverable when inserted to be able to move within the body to perform various surgical procedures at multiple different points or sites.
- the portion inserted into the body that performs functional tasks may be referred to as a surgical robotic module or a robotic arms assembly herein.
- the surgical robotic module can include multiple different submodules or parts that may be inserted into the trocar separately.
- the surgical robotic module or robotic arms assembly can include multiple separate robotic arms that are deployable within the patient along different or separate axes. These multiple separate robotic arms may be collectively referred to as a robotic arm assembly herein.
- a surgical camera assembly can also be deployed along a separate axis.
- the surgical robotic module or robotic arms assembly may also include the surgical camera assembly.
- the surgical robotic module or robotic arms assembly employs multiple different components, such as a pair of robotic arms and a surgical or robotic camera assembly, each of which is deployable along a different axis and is separately manipulatable, maneuverable, and movable.
- this arrangement of the robotic arms and the camera assembly along separate, manipulatable axes is referred to herein as the Split Arm (SA) architecture.
- SA architecture is designed to simplify and increase efficiency of the insertion of robotic surgical instruments through a single trocar at a single insertion site, while concomitantly assisting with deployment of the surgical instruments into a surgical ready state as well as the subsequent removal of the surgical instruments through the trocar.
- a surgical instrument can be inserted through the trocar to access and perform an operation in vivo in the abdominal cavity of a patient.
- various surgical instruments may be used or employed, including but not limited to robotic surgical instruments, as well as other surgical instruments known in the art.
- the surgical robotic module that forms part of the present invention can form part of a surgical robotic system that includes a user workstation that includes appropriate sensors and displays, and a robot support system (RSS) for interacting with and supporting the robotic subsystem of the present invention in some embodiments.
- the robotic subsystem includes a motor and a surgical robotic module that includes one or more robotic arms and one or more camera assemblies in some embodiments.
- the robotic arms and camera assembly can form part of a single support axis robotic system, can form part of the split arm (SA) architecture robotic system, or can have another arrangement.
- the robot support system can provide multiple degrees of freedom such that the robotic module can be maneuvered within the patient into a single position or multiple different positions.
- the robot support system can be directly mounted to a surgical table or to the floor or ceiling within an operating room. In another embodiment, the mounting is achieved by various fastening means, including but not limited to, clamps, screws, or a combination thereof. In other embodiments, the structure may be free standing.
- the robot support system can mount a motor assembly that is coupled to the surgical robotic module, which includes the robotic arms and the camera assembly.
- the motor assembly can include gears, motors, drivetrains, electronics, and the like, for powering the components of the surgical robotic module.
- the robotic arms and the camera assembly are capable of multiple degrees of freedom of movement. According to some embodiments, when the robotic arms and the camera assembly are inserted into a patient through the trocar, they are capable of movement in at least the axial, yaw, pitch, and roll directions.
- the robotic arms are designed to incorporate and employ a multi-degree of freedom of movement robotic arm with an end effector mounted at a distal end thereof that corresponds to a wrist area or joint of the user.
- the working end (e.g., the end effector end) of the robotic arm is designed to incorporate and use or employ other robotic surgical instruments, such as for example the surgical instruments set forth in U.S. Publ. No. 2018/0221102, the entire contents of which are herein incorporated by reference.
- FIG. 1 is a schematic illustration of an example surgical robotic system 10 in which aspects of the present disclosure can be employed in accordance with some embodiments of the present disclosure.
- the surgical robotic system 10 includes an operator console 11 and a robotic subsystem 20 in accordance with some embodiments.
- the operator console 11 includes a display 12, an image computing module 14, which may be a three-dimensional (3D) computing module, hand controllers 17 having a sensing and tracking module 16, and a computing module 18. Additionally, the operator console 11 may include a foot pedal array 19 including a plurality of pedals.
- the image computing module 14 can include a graphical user interface 39.
- the graphical user interface 39, the controller 26 or the image renderer 30, or both, may render one or more images or one or more graphical user interface elements on the graphical user interface 39.
- a pillar box associated with a mode of operating the surgical robotic system 10, or any of the various components of the surgical robotic system 10 can be rendered on the graphical user interface 39.
- live video footage captured by a camera assembly 44 can also be rendered by the controller 26 or the image renderer 30 on the graphical user interface 39.
- the operator console 11 can include a visualization system 9 that includes a display 12 which may be any selected type of display for displaying information, images or video generated by the image computing module 14, the computing module 18, and/or the robotic subsystem 20.
- the display 12 can include or form part of, for example, a head-mounted display (HMD), an augmented reality (AR) display (e.g., an AR display, or AR glasses in combination with a screen or display), a screen or a display, a two-dimensional (2D) screen or display, a three-dimensional (3D) screen or display, and the like.
- the display 12 can also include an optional sensing and tracking module 16A.
- the display 12 can include an image display for outputting an image from a camera assembly 44 of the robotic subsystem 20.
- the hand controllers 17 are configured to sense a movement of the operator’s hands and/or arms to manipulate the surgical robotic system 10.
- the hand controllers 17 can include the sensing and tracking module 16, circuitry, and/or other hardware.
- the sensing and tracking module 16 can include one or more sensors or detectors that sense movements of the operator’s hands.
- the one or more sensors or detectors that sense movements of the operator’s hands are disposed in the hand controllers 17 that are grasped by or engaged by hands of the operator.
- the one or more sensors or detectors that sense movements of the operator’s hands are coupled to the hands and/or arms of the operator.
- the sensors of the sensing and tracking module 16 can be coupled to a region of the hand and/or the arm, such as the fingers, the wrist region, the elbow region, and/or the shoulder region. Additional sensors can also be coupled to a head and/or neck region of the operator in some embodiments.
- the sensing and tracking module 16 can be external and coupled to the hand controllers 17 via electrical components and/or mounting hardware.
- the optional sensor and tracking module 16A may sense and track movement of one or more of an operator’s head, of at least a portion of an operator’s head, an operator’s eyes or an operator’s neck based, at least in part, on imaging of the operator in addition to or instead of by a sensor or sensors attached to the operator’s body.
- the sensing and tracking module 16 can employ sensors coupled to the torso of the operator or any other body part.
- the sensing and tracking module 16 can employ, in addition to the sensors, an Inertial Momentum Unit (IMU) having, for example, an accelerometer, a gyroscope, a magnetometer, and a motion processor. The addition of a magnetometer allows for a reduction in sensor drift about a vertical axis.
- the sensing and tracking module 16 can also include sensors placed in surgical material such as gloves, surgical scrubs, or a surgical gown.
- the sensors can be reusable or disposable.
- sensors can be disposed external of the operator, such as at fixed locations in a room, such as an operating room.
- the external sensors 37 can generate external data 36 that can be processed by the computing module 18 and hence employed by the surgical robotic system 10.
- the sensors generate position and/or orientation data indicative of the position and/or orientation of the operator’s hands and/or arms.
- the sensing and tracking modules 16 and/or 16A can be utilized to control movement (e.g., changing a position and/or an orientation) of the camera assembly 44 and robotic arms 42 of the robotic subsystem 20.
- the tracking and position data 34 generated by the sensing and tracking module 16 can be conveyed to the computing module 18 for processing by at least one processor 22.
- the computing module 18 can determine or calculate, from the tracking and position data 34 and 34A, the position and/or orientation of the operator’s hands or arms, and in some embodiments of the operator’s head as well, and convey the tracking and position data 34 and 34A to the robotic subsystem 20.
- the tracking and position data 34, 34A can be processed by the processor 22 and can be stored for example in the storage 24.
- the tracking and position data 34 and 34A can also be used by the controller 26, which in response can generate control signals for controlling movement of the robotic arms 42 and/or the camera assembly 44.
- the controller 26 can change a position and/or an orientation of at least a portion of the camera assembly 44, of at least a portion of the robotic arms 42, or both.
- the controller 26 can also adjust the pan and tilt of the camera assembly 44 to follow the movement of the operator’s head.
- the robotic subsystem 20 can include a robot support system (RSS) 46 having a motor 40 and a trocar 50 or trocar mount, the robotic arms 42, and the camera assembly 44.
- the robotic arms 42 and the camera assembly 44 can form part of a single support axis robot system, such as that disclosed and described in U.S. Patent No. 10,285,765, or can form part of a split arm (SA) architecture robot system, such as that disclosed and described in PCT Patent Application No. PCT/US2020/039203, both of which are incorporated herein by reference in their entirety.
- the robotic subsystem 20 can employ multiple different robotic arms that are deployable along different or separate axes.
- the camera assembly 44 which can employ multiple different camera elements, can also be deployed along a common separate axis.
- the surgical robotic system 10 can employ multiple different components, such as a pair of separate robotic arms and the camera assembly 44, which are deployable along different axes.
- the robotic arms assembly 42 and the camera assembly 44 are separately manipulatable, maneuverable, and movable.
- the robotic subsystem 20, which includes the robotic arms 42 and the camera assembly 44, is disposable along separate manipulatable axes, and is referred to herein as an SA architecture.
- the SA architecture is designed to simplify and increase efficiency of the insertion of robotic surgical instruments through a single trocar at a single insertion point or site, while concomitantly assisting with deployment of the surgical instruments into a surgical ready state, as well as the subsequent removal of the surgical instruments through a trocar 50 as further described below.
- the RSS 46 can include the motor 40 and the trocar 50 or a trocar mount.
- the RSS 46 can further include a support member that supports the motor 40 coupled to a distal end thereof.
- the motor 40 in turn can be coupled to the camera assembly 44 and to each of the robotic arms assembly 42.
- the support member can be configured and controlled to move linearly, or in any other selected direction or orientation, one or more components of the robotic subsystem 20.
- the RSS 46 can be free standing.
- the RSS 46 can include the motor 40 that is coupled to the robotic subsystem 20 at one end and to an adjustable support member or element at an opposed end.
- the motor 40 can receive the control signals generated by the controller 26.
- the motor 40 can include gears, one or more motors, drivetrains, electronics, and the like, for powering and driving the robotic arms 42 and the camera assembly 44 separately or together.
- the motor 40 can also provide mechanical power, electrical power, mechanical communication, and electrical communication to the robotic arms 42, the camera assembly 44, and/or other components of the RSS 46 and robotic subsystem 20.
- the motor 40 can be controlled by the computing module 18.
- the motor 40 can thus generate signals for controlling one or more motors that in turn can control and drive the robotic arms 42, including for example the position and orientation of each robot joint of each robotic arm, as well as the camera assembly 44.
- the motor 40 can further provide for a translational or linear degree of freedom that is first utilized to insert and remove each component of the robotic subsystem 20 through a trocar 50.
- the motor 40 can also be employed to adjust the inserted depth of each robotic arm 42 when inserted into the patient 100 through the trocar 50.
- the trocar 50 is a medical device that can be made up of an awl (which may be a metal or plastic sharpened or non-bladed tip), a cannula (essentially a hollow tube), and a seal in some embodiments.
- the trocar 50 can be used to place at least a portion of the robotic subsystem 20 in an interior cavity of a subject (e.g., a patient) and can withdraw gas and/or fluid from a body cavity.
- the robotic subsystem 20 can be inserted through the trocar 50 to access and perform an operation in vivo in a body cavity of a patient.
- the robotic subsystem 20 can be supported, at least in part, by the trocar 50 or a trocar mount with multiple degrees of freedom such that the robotic arms 42 and the camera assembly 44 can be maneuvered within the patient into a single position or multiple different positions.
- the robotic arms 42 and camera assembly 44 can be moved with respect to the trocar 50 or a trocar mount with multiple different degrees of freedom such that the robotic arms 42 and the camera assembly 44 can be maneuvered within the patient into a single position or multiple different positions.
- the RSS 46 can further include an optional controller for processing input data from one or more of the system components (e.g., the display 12, the sensing and tracking module 16, the robotic arms 42, the camera assembly 44, and the like), and for generating control signals in response thereto.
- the motor 40 can also include a storage element for storing data in some embodiments.
- the robotic arms 42 can be controlled to follow the scaled-down movement or motion of the operator’s arms and/or hands as sensed by the associated sensors in some embodiments and in some modes of operation.
- the robotic arms 42 include a first robotic arm including a first end effector disposed at a distal end of the first robotic arm, and a second robotic arm including a second end effector disposed at a distal end of the second robotic arm.
- the robotic arms 42 can have portions or regions that can be associated with movements associated with the shoulder, elbow, and wrist joints as well as the fingers of the operator.
- the robotic elbow joint can follow the position and orientation of the human elbow
- the robotic wrist joint can follow the position and orientation of the human wrist.
- the robotic arms 42 can also have associated therewith end regions that can terminate in end-effectors that follow the movement of one or more fingers of the operator in some embodiments, such as for example the index finger as the user pinches together the index finger and thumb.
- the robotic arms 42 may follow movement of the arms of the operator in some modes of control while a virtual chest of the robotic arms assembly may remain stationary (e.g., in an instrument control mode).
- the position and orientation of the torso of the operator are subtracted from the position and orientation of the operator’s arms and/or hands. This subtraction allows the operator to move his or her torso without the robotic arms moving. Further disclosure regarding control of movement of individual arms of a robotic arm assembly is provided in International Patent Application Publications WO 2022/094000 A1 and WO 2021/231402 A1, each of which is incorporated by reference herein in its entirety.
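The torso-subtraction step described above can be sketched in a few lines. This is an illustrative Python sketch, not the patented implementation; the function name, the yaw-only rotation, and the coordinate conventions are assumptions made for clarity.

```python
import math

def relative_to_torso(hand_xyz, torso_xyz, torso_yaw_rad):
    """Express a tracked hand position in the operator's torso frame:
    the torso translation is subtracted, then the offset is rotated by
    the inverse torso yaw. A yaw-only rotation keeps the sketch simple;
    a full implementation would use the complete torso orientation."""
    dx = hand_xyz[0] - torso_xyz[0]
    dy = hand_xyz[1] - torso_xyz[1]
    dz = hand_xyz[2] - torso_xyz[2]
    c, s = math.cos(-torso_yaw_rad), math.sin(-torso_yaw_rad)
    return (c * dx - s * dy, s * dx + c * dy, dz)
```

Because only the relative pose is used, translating the torso and hand together yields the same output, so torso motion alone produces no arm command.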
- the camera assembly 44 is configured to provide the operator with image data 48, such as for example a live video feed of an operation or surgical site, as well as enable the operator to actuate and control the cameras forming part of the camera assembly 44.
- the camera assembly 44 can include one or more cameras (e.g., a pair of cameras), the optical axes of which are axially spaced apart by a selected distance, known as the inter-camera distance, to provide a stereoscopic view or image of the surgical site.
- the operator can control the movement of the cameras via movement of the hands via sensors coupled to the hands of the operator or via hand controllers 17 grasped or held by hands of the operator, thus enabling the operator to obtain a desired view of an operation site in an intuitive and natural manner.
- the operator can additionally control the movement of the camera via movement of the operator’s head.
- the camera assembly 44 is movable in multiple directions, including for example in yaw, pitch and roll directions relative to a direction of view.
- the components of the stereoscopic cameras can be configured to provide a user experience that feels natural and comfortable.
- the interaxial distance between the cameras can be modified to adjust the depth of the operation site perceived by the operator.
- the image or video data 48 generated by the camera assembly 44 can be displayed on the display 12.
- the display 12 includes an HMD
- the display can include the built-in sensing and tracking module 16A that obtains raw orientation data for the yaw, pitch and roll directions of the HMD as well as positional data in Cartesian space (x, y, z) of the HMD.
- positional and orientation data regarding an operator’s head may be provided via a separate head-tracking module.
- the sensing and tracking module 16A may be used to provide supplementary position and orientation tracking data of the display in lieu of or in addition to the built-in tracking system of the HMD.
- no head tracking of the operator is used or employed.
- images of the operator may be used by the sensing and tracking module 16A for tracking at least a portion of the operator’s head.
- FIG. 2A depicts an example robotic arms assembly 20, which is also referred to herein as a robotic subsystem, of a surgical robotic system 10 incorporated into or mounted onto a mobile patient cart in accordance with some embodiments.
- the robotic arms assembly 20 includes the RSS 46, which, in turn, includes the motor 40, the robotic arm assembly 42 having end-effectors 45, the camera assembly 44 having one or more cameras 47, and may also include the trocar 50 or a trocar mount.
- FIG. 2B depicts an example of an operator console 11 of the surgical robotic system 10 of the present disclosure in accordance with some embodiments.
- the operator console 11 includes a display 12, hand controllers 17, and also includes one or more additional controllers, such as a foot pedal array 19 for control of the robotic arms 42, for control of the camera assembly 44, and for control of other aspects of the system.
- FIG. 2B also depicts the left hand controller subsystem 23A and the right hand controller subsystem 23B of the operator console.
- the left hand controller subsystem 23A includes and supports the left hand controller 17A and the right hand controller subsystem 23B includes and supports the right hand controller 17B.
- the left hand controller subsystem 23A may releasably connect to or engage the left hand controller 17A, and the right hand controller subsystem 23B may releasably connect to or engage the right hand controller 17B.
- connections may be both physical and electronic so that the left hand controller subsystem 23A and the right hand controller subsystem 23B may receive signals from the left hand controller 17A and the right hand controller 17B, respectively, including signals that convey inputs received from a user selection on a button or touch input device of the left hand controller 17A or the right hand controller 17B.
- Each of the left hand controller subsystem 23A and the right hand controller subsystem 23B may include components that enable a range of motion of the respective left hand controller 17A and right hand controller 17B, so that the left hand controller 17A and right hand controller 17B may be translated or displaced in three dimensions and may additionally move in the roll, pitch, and yaw directions. Additionally, each of the left hand controller subsystem 23A and the right hand controller subsystem 23B may register movement of the respective left hand controller 17A and right hand controller 17B in each of the foregoing directions and may send a signal providing such movement information to a processor (not shown) of the surgical robotic system.
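The movement information such a subsystem might send could be modeled as a simple 6-DoF sample. The class below is a hypothetical sketch (the name `ControllerMotion`, the field names, and the scaling rule are all assumptions), illustrating how translational motion could also be scaled down before being applied to a robotic arm, as mentioned earlier.

```python
from dataclasses import dataclass

@dataclass
class ControllerMotion:
    """Hypothetical 6-DoF motion sample a hand controller subsystem
    might report to the system processor: translation deltas in x, y, z
    plus roll, pitch, yaw deltas since the previous sample."""
    dx: float
    dy: float
    dz: float
    droll: float
    dpitch: float
    dyaw: float

    def scaled(self, k: float) -> "ControllerMotion":
        # Translations are scaled (e.g., motion scaling for fine work);
        # rotational deltas pass through unchanged in this sketch.
        return ControllerMotion(self.dx * k, self.dy * k, self.dz * k,
                                self.droll, self.dpitch, self.dyaw)
```

A processor could apply `scaled(0.5)` to halve hand translations while preserving wrist orientation changes.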
- each of the left hand controller subsystem 23A and the right hand controller subsystem 23B may be configured to receive and connect to or engage different hand controllers (not shown).
- hand controllers with different configurations of buttons and touch input devices may be provided.
- hand controllers with a different shape may be provided. The hand controllers may be selected for compatibility with a particular surgical robotic system or a particular surgical robotic procedure or selected based upon preference of an operator with respect to the buttons and input devices or with respect to the shape of the hand controller in order to provide greater comfort and ease for the operator.
- FIG. 3A schematically depicts a side view of the surgical robotic system 10 performing a surgery within an internal cavity 104 of a subject 100 in accordance with some embodiments and for some surgical procedures.
- FIG. 3B schematically depicts a top view of the surgical robotic system 10 performing the surgery within the internal cavity 104 of the subject 100.
- the subject 100 (e.g., a patient) is positioned on an operation table 102 (e.g., a surgical table).
- an incision is made in the patient 100 to gain access to the internal cavity 104.
- the trocar 50 is then inserted into the patient 100 at a selected location to provide access to the internal cavity 104 or operation site.
- the RSS 46 can then be maneuvered into position over the patient 100 and the trocar 50.
- the RSS 46 includes a trocar mount that attaches to the trocar 50.
- the robotic arms assembly 20 can be coupled to the motor 40 and at least a portion of the robotic arms assembly can be inserted into the trocar 50 and hence into the internal cavity 104 of the patient 100.
- the camera assembly 44 and the robotic arm assembly 42 can be inserted individually and sequentially into the patient 100 through the trocar 50.
- references to insertion of the robotic arm assembly 42 and/or the camera assembly into an internal cavity of a subject and disposing the robotic arm assembly 42 and/or the camera assembly 44 in the internal cavity of the subject are referring to the portions of the robotic arm assembly 42 and the camera assembly 44 that are intended to be in the internal cavity of the subject during use.
- the sequential insertion method has the advantage of supporting smaller trocars, so that smaller incisions can be made in the patient 100, thus reducing the trauma experienced by the patient 100.
- the camera assembly 44 and the robotic arm assembly 42 can be inserted in any order or in a specific order.
- the camera assembly 44 can be followed by a first robotic arm of the robotic arm assembly 42 and then followed by a second robotic arm of the robotic arm assembly 42 all of which can be inserted into the trocar 50 and hence into the internal cavity 104.
- the RSS 46 can move the robotic arm assembly 42 and the camera assembly 44 to an operation site manually or automatically controlled by the operator console 11.
- FIG. 4A is a perspective view of a robotic arm subassembly 21 in accordance with some embodiments.
- the robotic arm subassembly 21 includes a robotic arm 42A, the end-effector 45 having an instrument tip 120 (e.g., monopolar scissors, needle driver/holder, bipolar grasper, or any other appropriate tool), and a shaft 122 supporting the robotic arm 42A.
- a distal end of the shaft 122 is coupled to the robotic arm 42A, and a proximal end of the shaft 122 is coupled to a housing 124 of the motor 40 (as shown in FIG. 2A). At least a portion of the shaft 122 can be external to the internal cavity 104 (as shown in FIGS. 3A and 3B). At least a portion of the shaft 122 can be inserted into the internal cavity 104 (as shown in FIGS. 3A and 3B).
- FIG. 4B is a side view of the robotic arm assembly 42.
- the robotic arm assembly 42 includes a virtual shoulder 126, a virtual elbow 128 having a position sensor 132 (e.g., inductive sensing coil and oscillator circuit), a virtual wrist 130, and the end-effector 45 in accordance with some embodiments.
- the virtual shoulder 126, the virtual elbow 128, and the virtual wrist 130 can include a series of hinge and rotary joints to provide each arm with seven positionable degrees of freedom, along with one additional grasping degree of freedom for the end-effector 45 in some embodiments.
- FIG. 5 illustrates a perspective front view of a portion of the robotic arms assembly 20 configured for insertion into an internal body cavity of a patient.
- the robotic arms assembly 20 includes a robotic arm 42A and a robotic arm 42B.
- the two robotic arms 42A and 42B can define, or at least partially define, a virtual chest 140 of the robotic arms assembly 20 in some embodiments.
- the virtual chest 140 (depicted as a triangle with dotted lines) can be defined by a chest plane extending between a first pivot point 142A of a most proximal joint of the robotic arm 42A (e.g., a shoulder joint 126), a second pivot point 142B of a most proximal joint of the robotic arm 42B, and a camera imaging center point 144 of the camera(s) 47.
- a pivot center 146 of the virtual chest 140 lies in the middle of the virtual chest.
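The geometry of the virtual chest described above, a plane through the two shoulder pivot points and the camera imaging center, can be illustrated with a short sketch. The function name and conventions are assumptions; the plane normal is computed with a cross product and the pivot center is taken here as the centroid of the three points.

```python
def chest_plane(p_a, p_b, p_cam):
    """Given the two shoulder pivot points and the camera imaging center
    (3D tuples), return the chest centroid and a (non-normalized) plane
    normal via the cross product of two edge vectors. Illustrative only."""
    centroid = tuple((p_a[i] + p_b[i] + p_cam[i]) / 3.0 for i in range(3))
    u = tuple(p_b[i] - p_a[i] for i in range(3))    # pivot A -> pivot B
    v = tuple(p_cam[i] - p_a[i] for i in range(3))  # pivot A -> camera center
    normal = (u[1] * v[2] - u[2] * v[1],
              u[2] * v[0] - u[0] * v[2],
              u[0] * v[1] - u[1] * v[0])
    return centroid, normal
```

The normal gives the chest plane's facing direction, which a controller could use to reorient the virtual chest relative to the instrument tips.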
- sensors in one or both of the robotic arm 42A and the robotic arm 42B can be used by the system to determine a change in location in three-dimensional space of at least a portion of the robotic arm.
- sensors in one or both of the first robotic arm and second robotic arm can be used by the system to determine a location in three-dimensional space of at least a portion of one robotic arm relative to a location in three-dimensional space of at least a portion of the other robotic arm.
- a camera assembly 44 is configured to obtain images from which the system can determine relative locations in three-dimensional space.
- the camera assembly may include multiple cameras, at least two of which are laterally displaced from each other relative to an imaging axis, and the system may be configured to determine a distance to features within the internal body cavity.
- a surgical robotic system including camera assembly and associated system for determining a distance to features may be found in International Patent Application Publication No. WO 2021/159409, entitled “System and Method for Determining Depth Perception In Vivo in a Surgical Robotic System,” and published August 12, 2021, which is incorporated by reference herein in its entirety.
- Information about the distance to features and information regarding optical properties of the cameras may be used by a system to determine relative locations in three-dimensional space.
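One common way a system with two laterally displaced cameras could turn disparity into distance is the pinhole stereo relation depth = f · B / d. The sketch below is illustrative only (names and units are assumptions) and is not asserted to be the method of the referenced publication.

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Classic pinhole stereo relation: depth = f * B / d.
    focal_px: focal length in pixels; baseline_mm: inter-camera
    (interaxial) distance; disparity_px: horizontal pixel disparity
    of a feature between the two views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_mm / disparity_px
```

Note the inverse relationship: widening the interaxial baseline increases disparity for the same feature, which is consistent with the earlier statement that adjusting the inter-camera distance changes the perceived depth of the operation site.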
- Hand controllers for a surgical robotic system as described herein can be employed with any of the surgical robotic systems described above or any other suitable surgical robotic system. Further, some embodiments of hand controllers described herein may be employed with semi-robotic endoscopic surgical systems that are only robotic in part.
- As explained above, controllers for a surgical robotic system may desirably feature sufficient inputs to provide control of the system, an ergonomic design, and a “natural” feel in use.
- which robotic arm is considered a left robotic arm and which is considered a right robotic arm may change due to a configuration of the robotic arms and the camera assembly being adjusted such that the second robotic arm corresponds to a left robotic arm with respect to a view provided by the camera assembly and the first robotic arm corresponds to a right robotic arm with respect to the view provided by the camera assembly.
- the surgical robotic system changes which robotic arm is identified as corresponding to the left hand controller and which robotic arm is identified as corresponding to the right hand controller during use.
- at least one hand controller includes one or more operator input devices to provide one or more inputs for additional control of a robotic assembly.
- the one or more operator input devices receive one or more operator inputs for at least one of: engaging a scanning mode; resetting a camera assembly orientation and position to align a view of the camera assembly to the instrument tips and to the chest; displaying a menu, traversing a menu or highlighting options or items for selection, and selecting an item or option; selecting and adjusting an elbow position; and engaging a clutch associated with an individual hand controller.
- additional functions may be accessed via the menu, for example, selecting a level of a grasper force (e.g., high/low), selecting an insertion mode, an extraction mode, or an exchange mode, adjusting a focus, lighting, or a gain, camera cleaning, motion scaling, rotation of camera to enable looking down, etc.
- FIG. 6A depicts a left hand controller 201 and FIG. 6B depicts a right hand controller 202 in accordance with some embodiments.
- the left hand controller 201 and the right hand controller 202 each include a contoured housing 210, 211, respectively.
- Each contoured housing 210, 211 includes an upper surface 212a, 213a, an inside side surface 212b, 213b adjacent the upper surface, an outside side surface (not visible in these views) facing away from the inside side surface 212b, 213b, and a lower surface (not visible in these views) facing away from the upper surface 212a, 213a.
- each hand controller 201, 202 includes a mounting assembly 215, 216, respectively.
- the mounting assembly 215, 216 may be used to attach, either directly or indirectly, the respective hand controller 201, 202 to a user console of a surgical robotic system.
- the mounting assembly 215 defines holes 217, which may be countersunk holes, configured to receive a screw or bolt to connect the left hand controller 201 to a user console.
- the hand controller includes two control levers, three buttons, and one touch input device. As will be explained herein, embodiments may feature other combinations of touch input devices, buttons, and levers, or a subset thereof.
- the embodiment shown as the left hand controller 201 features a first control lever 221 and a second control lever 222.
- right hand controller 202 includes a first control lever 223 and a second control lever 224.
- first control lever 221 is engaged with the second control lever 222 via one or more gears (not shown) so that a user depressing the first control lever 221 causes a reciprocal movement in the second control lever 222 and vice versa.
- first control lever 221 and second control lever 222 may be configured to operate independently.
- a hand controller may employ only one signal indicating a deflection of the first lever and the second lever.
- a hand controller may employ a first signal indicating a deflection of the first control lever and a second signal indicating a deflection of the second control lever.
- the first control lever 221, 223 and the second control lever 222, 224 may be contoured to receive a thumb and/or finger of a user.
- the first control lever 221, 223 extends from or extends beyond the outside side surface of the respective contoured housing 210, 211, and the second control lever 222, 224 extends from or extends beyond the inside side surface 212b, 213b of the respective contoured housing.
- for each hand controller 201, 202, deflection or depression of the first control lever 221, 223 and the second control lever 222, 224 is configured to produce a signal that the surgical robotic system uses as an input to control a tool or an instrument tip (e.g., opening/closing an aperture of graspers/jaws of an instrument tip) at a distal end of a robotic arm of the surgical robotic system.
- depressing the first control lever and the second control lever may change an angle of jaws of a grasper at a distal end of the respective robotic arm.
- end effectors, tools or instruments are used to pull tissue apart, drive a needle driver, grab an item (e.g., a mesh, suture, needle) or pick up such an item in the body cavity when it is dropped, deliver energy via an electrosurgical unit (ESU) (e.g., to cut or to coagulate).
- a housing of a hand controller may be contoured.
- the contoured housing 210, 211 includes a rounded shape.
- a housing may be shaped to have a contour to match a contour of at least a portion of a thumb of a user’s hand.
- the contoured housing 210, 211, the first control lever 221, 223, and the second control lever 222, 224 may each be shaped to comfortably and ergonomically receive a respective hand of a user.
- a housing of the hand controller, a lever or levers of a hand controller, buttons of a hand controller and/or one or more touch input devices may have shapes and/or positions on the hand controller for fitting different palm sizes and finger lengths.
- Left hand controller 201 also includes a first button 231, a second button 232, and a third button 233.
- right hand controller 202 also includes a first button 234, a second button 235 and a third button 236.
- each button may provide one or more inputs that may be mapped to a variety of different functions of the surgical robotic device to control the surgical robotic system including a camera assembly and a robotic arm assembly.
- input received via the first button 231 of the left hand controller 201 and input received via the first button 234 of the right hand controller 202 may control a clutch feature.
- a clutch is activated enabling movement of the respective left hand controller 201 or right hand controller 202 by the operator without causing any movement of a robotic arms assembly (e.g., a first robotic arm, a second robotic arm, and a camera assembly) of the surgical robotic system.
- when the clutch is activated for a hand controller, movement of the respective right hand controller or left hand controller is not translated to movement of the robotic assembly.
- an operator engaging a hand controller input (e.g., tapping or pressing a button) activates the clutch, and the operator engaging the input again (e.g., tapping or pressing the button again) turns off the clutch or exits a clutch mode.
- an operator engaging a hand controller input activates the clutch, and the clutch stays active for as long as the input is active and exits when the operator is no longer engaging the hand controller input (e.g., releasing the button).
- Activating the clutch or entering the clutch mode for a hand controller enables the operator to reposition the respective hand controller (e.g., re-position the left hand controller 201 within the range of motion of the left hand controller 201 and/or re-position the right hand controller 202 within a range of motion of the right hand controller 202) without causing movement of the robotic arms assembly itself.
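The two clutch behaviors described above, toggle-on-tap and hold-to-engage, can be captured in a small state machine. This is an illustrative sketch; the class and method names are assumptions.

```python
class Clutch:
    """Minimal sketch of two clutch behaviors: 'toggle' (press once to
    enter, press again to exit) and 'hold' (active only while the input
    is engaged). While engaged, controller motion is suppressed rather
    than forwarded to the robotic assembly."""
    def __init__(self, mode="toggle"):
        self.mode = mode
        self.engaged = False

    def press(self):
        if self.mode == "toggle":
            self.engaged = not self.engaged
        else:  # hold mode: engage while the input is held
            self.engaged = True

    def release(self):
        if self.mode == "hold":
            self.engaged = False

    def forward_motion(self, motion):
        # Return None (suppress) while the clutch is engaged.
        return None if self.engaged else motion
```

In practice this lets the operator re-center a hand controller within its workspace without dragging the robotic arm along.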
- the second button 232 of the left hand controller 201 may provide an input that controls a pivot function of the surgical robotic device.
- An operator engaging (e.g., pressing and holding) the second button 232 of the left hand controller 201 may engage a pivot function or a pivot mode that reorients the robotic arms assembly chest to center the camera on the midpoint between the instrument tips.
- the pivot function can be activated with a brief tap or held down to continuously track the instrument tips as they move, in accordance with some embodiments.
- the second button 235 of the right hand controller 202 may provide input for entering a menu mode in which a menu is displayed on the graphical user interface 39 of the surgical robotic system and exiting a menu mode.
- the operator may activate a menu mode by pressing the second button 235 a first time and disengage the menu function by pressing the second button 235 a second time.
- the operator may be able to select options within the menu by navigating the menu using the left hand controller and/or the right hand controller when the menu mode is engaged.
- the first touch input device 242 of the right hand controller 202 may be used to navigate the menu and to select a menu item in some embodiments.
- While in a menu mode, movement of the robotic assembly in response to movement of the left hand controller 201 or the right hand controller 202 may be suspended.
- the menu mode and the selection of menu options are discussed in more detail below.
- the third button 233 of the left hand controller and the third button of the right hand controller may provide an input that engages or disengages an instrument control mode of the surgical robotic system in some embodiments.
- a movement of at least one of the one or more hand controllers when in the instrument mode causes a corresponding movement in a corresponding robotic arm of the robotic assembly.
- the instrument control mode will be described in more detail below.
- the left hand controller 201 further includes a touch input device 241.
- the right hand controller 202 further includes a touch input device 242.
- the touch input device 241, 242 may be a scroll wheel, as shown in FIGS. 6A and 6B.
- Other touch input devices that may be employed include, but are not limited to, rocker buttons, joy sticks, pointing sticks, touch pads, track balls, track point nubs, etc.
- the touch input device 241, 242 may be able to receive input through several different forms of engagement by the operator.
- the operator may be able to push or click the first touch input device 241, 242, scroll the first touch input device 241, 242 backward or forward, or both.
- scrolling the first touch input device 241 of the left hand controller 201 forward may activate a zoom in function to magnify a view provided by the camera assembly of the surgical robotic system and displayed to the operator, and scrolling backward with the first touch input device 241 may provide a zoom out function to reduce the view provided by the camera assembly of the surgical robotic device and displayed to the operator, or vice versa.
- the zoom function may be mechanical or digital.
- the zoom function may be mechanical in part and digital in part (e.g., a mechanical zoom over one zoom range, and a mechanical zoom plus a digital zoom over another zoom range).
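The scroll-to-zoom mapping described above can be sketched as a clamped linear update. The step size and zoom limits below are illustrative assumptions, not values from the disclosure.

```python
def apply_scroll_zoom(current_zoom, scroll_steps, step=0.1,
                      min_zoom=1.0, max_zoom=4.0):
    """Map scroll-wheel steps to a clamped zoom factor: forward
    (positive steps) zooms in, backward (negative) zooms out.
    Step size and limits are illustrative assumptions."""
    new_zoom = current_zoom + step * scroll_steps
    return max(min_zoom, min(max_zoom, new_zoom))
```

Clamping keeps the requested zoom inside the range the optics (mechanical) or image pipeline (digital) can actually deliver; a hybrid system could split the clamped range between the two stages.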
- clicking or depressing first touch input device 241 may engage a scan mode of the surgical robotic system.
- a movement of at least one of the left hand controller 201 or the right hand controller 202 causes a corresponding change in an orientation of a camera assembly of the robotic arms assembly without changing a position or orientation of either robotic arm of the surgical robotic system.
- pressing and holding the first touch input device 241 may activate the scan mode and releasing the first touch input device 241 may end the scan mode of the surgical robotic system.
- releasing the scan mode returns the camera to the orientation it was in upon entering scan mode.
- a function may be provided for locking the orientation upon exiting the scan mode (e.g., to change the “horizon” line).
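The save-and-restore behavior of the scan mode described above, including the optional lock on exit to change the "horizon" line, can be sketched as follows. The class name and the tuple representation of orientation are assumptions.

```python
class ScanMode:
    """Sketch of scan-mode behavior: the camera orientation is saved on
    entry, freely adjusted while the mode is active, and restored on
    release unless the operator locks in the new orientation."""
    def __init__(self, camera_orientation):
        self.orientation = camera_orientation  # e.g., (yaw, pitch, roll)
        self._saved = None

    def enter(self):
        self._saved = self.orientation

    def adjust(self, new_orientation):
        self.orientation = new_orientation

    def exit(self, lock=False):
        # Restore the saved orientation unless the operator locks it.
        if not lock and self._saved is not None:
            self.orientation = self._saved
        self._saved = None
```

Releasing the input thus snaps the view back to where it was on entry, while `exit(lock=True)` keeps the scanned orientation.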
- the first touch input device 241 of the left hand controller 201 may be used for selection of a direction and degree of left elbow bias.
- elbow bias refers to the extent to which the virtual elbow of the robotic arm is above or below a neutral or default position.
- an operator may be able to select options within the menu by navigating the menu using the left hand controller and/or the right hand controller.
- the touch input device 242 (e.g., scroll wheel) of the right hand controller provides a set of inputs for traversing a displayed menu and selecting an item in a displayed menu.
- touch input device 242 may be used to control right elbow bias when a right elbow bias menu item has been selected.
- functions of the various buttons and the touch input device described above with respect to the left hand controller may instead be assigned to the right hand controller, and functions of various buttons and the touch input device described above with respect to the right hand controller may instead be assigned to the left hand controller in some embodiments.
- FIG. 6A also shows a schematic depiction 203 of a first foot pedal 251 and a second foot pedal 252 for receiving operator input.
- the first foot pedal 251 engages a camera control mode, also described herein as a view control mode, an image framing control mode, or a camera framing control mode of the surgical robotic system and the second foot pedal 252 engages a travel control mode of the surgical robotic system.
- movement of the left hand controller 201 and/or the right hand controller 202 by the operator may provide input that is interpreted by the system to control a movement of and an orientation of a camera assembly of the surgical robotic system while keeping positions of instrument tips of robotic arms of the robotic arms assembly constant.
- the left hand controller 201 and the right hand controller 202 may be used to move the robotic arm assembly of the surgical robotic system in a manner in which distal tips of the robotic arms direct or lead movement of a chest of the robotic arms assembly through an internal body cavity.
- a position and orientation of the camera assembly, of the chest, or of both is automatically adjusted to maintain the view of the camera assembly directed at the tips (e.g., at a point between a tip or tips of a distal end of the first robotic arm and a tip or tips of a distal end of the second robotic arm). This may be described as the camera assembly being pinned to the chest of the robotic arms assembly and automatically following the tips. Further detail regarding the travel control mode is provided below.
- FIGS. 8A and 8B depict another embodiment according to the present disclosure featuring a left hand controller 1001 and a right hand controller 1002.
- the left hand controller 1001 includes a contoured housing 1010
- the right hand controller 1002 includes a contoured housing 1011.
- Each contoured housing 1010, 1011 includes an upper surface 1012a, 1013a, an inside side surface 1012b, 1013b adjacent the upper surface, an outside side surface (not visible in these views) facing away from the inside side surface 1012b, 1013b, and a lower surface (not visible in these views) facing away from the upper surface 1012a, 1013a.
- Each hand controller 1001, 1002 includes a mounting assembly 1015, 1016, respectively.
- the mounting assembly 1015, 1016 may be used to attach, either directly or indirectly, each of the respective hand controllers 1001, 1002 to a surgeon console of a surgical robotic system.
- the mounting assembly 1015 includes an aperture 1017 and the mounting assembly 1016 defines an aperture 1018.
- the apertures 1017, 1018 may be countersunk apertures, configured to receive a screw or bolt to connect the respective hand controller 1001, 1002 to a surgeon console.
- the mounting assembly 1015 includes a button 1004 and the mounting assembly 1016 includes a button 1005.
- the buttons 1004, 1005 provide an input to toggle between insertion and extraction of one or more robotic arm assemblies 42A, 42B as well as the camera assembly 44.
- the button 1004 can be used to insert or extract a first robotic arm 42A and the button 1005 can be used to insert or extract a second robotic arm 42B.
- Each of the left hand controller 1001 and the right hand controller 1002 also includes a first button 1031, 1034, a second button 1032, 1035, and a touch input device 1041, 1042 (e.g., a joystick or scroll wheel), respectively.
- on each hand controller 1001, 1002, a lever (not visible in this view) extends from the respective outside side surface (not visible in this view).
- a different mechanism may be used for a grasping input on a hand controller.
- a hand controller may include at least one “pistol trigger” type button that can be pulled back to close and released to open, instead of or in addition to a lever or levers.
- the left hand controller 1001 includes a first paddle 1021 and a second paddle 1022.
- right hand controller 1002 includes a first paddle 1023 and a second paddle 1024.
- the first paddle 1021, 1023 is engaged with the second paddle 1022, 1024 of each hand controller 1001, 1002 via one or more gears (not shown) so that a user depressing the first paddle 1021, 1023 causes a reciprocal movement in the second paddle 1022, 1024.
- the first paddle 1021, 1023 and the second paddle 1022, 1024 of each hand controller may be configured to operate independently.
- the hand controller 1001, 1002 may employ some form of a signal or other indicator indicating a deflection of the first paddle 1021, 1023 and the second paddle 1022, 1024.
- the hand controller 1001, 1002 may employ a first signal or other indicator indicating a deflection of the first paddle 1021, 1023 and a second signal or other indicator indicating a deflection of the second paddle 1022, 1024.
- the first paddle 1021, 1023 and the second paddle 1022, 1024 may be contoured to receive a thumb and/or finger of a user.
- the first paddle 1021, 1023 extends from or extends beyond the outside side surface of the respective contoured housing 1010, 1011, and the second paddle 1022, 1024 extends from or extends beyond the inside side surface 1012b, 1013b of the respective contoured housing.
- for each hand controller 1001, 1002, deflection or depression of the first paddle 1021, 1023 and the second paddle 1022, 1024 is configured to trigger a signal that the surgical robotic system uses as an input to control a tool or an instrument tip (e.g., opening/closing an aperture of graspers/jaws of an instrument tip) at a distal end of a robotic arm of the surgical robotic system.
- depressing the first paddle 1021, 1023 and the second paddle 1022, 1024 may change an angle of jaws of a grasper at a distal end of the respective robotic arm.
- end effectors, tools or instruments are used to pull tissue apart, drive a needle driver, grab an item (e.g., a mesh, suture, needle) or pick up such an item in the body cavity when it is dropped, deliver energy via an electrosurgical unit (ESU) (e.g., to cut or to coagulate).
- each of the first paddle 1021, 1023 and the second paddle 1022, 1024 can have a loop to receive a thumb and/or finger of a user, as further described with respect to FIGS. 9A and 9B.
- parameters of each finger loop (e.g., length, angle, finger ergonomics, and the like) can be adjusted.
- the contoured housing 1010, 1011 may be configured to comfortably and ergonomically mate with a corresponding hand of the operator.
- the operator may engage with the respective hand controller 1001, 1002 by placing the thumb of the respective hand on the second paddle 1022, 1024, positioning the pointer finger or middle finger of the respective hand on or over the projecting portion of the upper surface 1012a, 1013a on which the first button 1031, 1034, the second button 1032, 1035, and the touch input device 1041, 1042 are disposed, and by positioning at least the middle finger or ring finger of the respective hand on or over the first paddle 1021, 1023.
- certain functions are described herein as assigned to certain buttons and to certain touch input devices
- which functions are ascribed to which buttons and touch input devices may be different in different embodiments.
- additional functions not explicitly described herein may be assigned to some buttons and some touch input devices in some embodiments.
- one or more functions may be assigned to a foot pedal of a surgical robotic system that includes one or more hand controllers as described herein.
- pressing or pressing and holding the first button 1004 may trigger a signal used to engage an insertion or extraction for a left robotic arm assembly and/or a camera assembly of the surgical robotic system.
- Pressing or pressing and holding the first button 1031 may trigger a signal used to control a clutch function for the left hand controller of the surgical robotic system.
- Pressing or pressing and holding the second button 1032 may trigger a signal used to engage or disengage a camera control mode of the surgical robotic system.
- Scrolling the touch input device 1041 forward may activate a zoom in function to magnify a view provided by the camera assembly of the surgical robotic system and displayed to the operator, and scrolling backward with first touch input device 1041 may provide a zoom out function to reduce the view provided by the camera assembly of the surgical robotic device and displayed to the operator, or vice versa.
- Scrolling the touch input device 1041 may trigger a signal used to select left elbow bias when an elbow bias function is activated using a menu.
- pressing or pressing and holding the first button 1005 may trigger a signal used to engage an insertion or extraction for a right robotic arm assembly and/or a camera assembly of the surgical robotic system.
- Pressing or pressing and holding the first button 1034 may trigger a signal used to control a clutch function for the right hand controller of the surgical robotic system.
- Clicking or depressing the second button 1035 may engage a scan mode of the surgical robotic system. When in a scan mode, a movement of at least one of the left hand controller 1001 or the right hand controller 1002 causes a corresponding change in an orientation of a camera assembly of the robotic assembly without changing a position or orientation of either robotic arm of the surgical robotic system.
- pressing and holding the second button 1035 may activate the scan mode and releasing the second button 1035 may end the scan mode of the surgical robotic system.
- releasing the scan mode returns the camera to the orientation it was in upon entering the scan mode.
- a function may be provided for locking the orientation upon exiting the scan mode (e.g., to change the “horizon” line).
- Scrolling the touch input device 1042 may trigger a signal used to traverse a menu or highlight a portion of the menu when the menu is displayed or a menu mode is active. Pressing the touch input device 1042 may trigger a signal used to select a highlighted portion of the menu or a feature on the menu when the menu is displayed.
- Scrolling the touch input device 1042 may produce a signal used to select right elbow bias when the elbow bias function is activated using the menu. Scrolling forward on the touch input device 1042 may move up the menu and scrolling backwards with the touch input device 1042 may move down the menu, or vice versa. Clicking the touch input device 1042 may make a selection within a menu.
- FIGS. 9A and 9B depict another embodiment according to the present disclosure featuring a left hand controller 1001’ and a right hand controller 1002’.
- some buttons of the hand controllers 1001’, 1002’ have the same button type but different functions.
- the second button 1035’ of the right hand controller 1002’ may trigger a signal used to turn on or turn off a menu.
- some buttons of the hand controllers 1001’, 1002’ may have a different button type and/or different functions.
- touch input device 1041’ for the left hand controller 1001’ may have a three-way switch button type. Switching or holding the touch input device 1041’ to the center may trigger a signal used to engage or disengage a scan mode of the surgical robotic system. Switching the touch input device 1041’ forward may activate a zoom in function to magnify a view provided by the camera assembly of the surgical robotic system and displayed to the operator, and switching backward with the touch input device 1041’ may provide a zoom out function to reduce the view provided by the camera assembly of the surgical robotic device and displayed to the operator, or vice versa. Switching the touch input device 1041’ upward may trigger a signal used to traverse a menu when the menu is displayed or a menu mode is active.
- Touch input device 1042’ for the right hand controller 1002’ may have a three-way switch button type. Switching the touch input device 1042’ may trigger a signal used to traverse a menu or highlight a portion of the menu when the menu is displayed or a menu mode is active. Switching forward on touch input device 1042’ may move up the menu and switching backwards with touch input device 1042’ may move down the menu, or vice versa. Clicking the touch input device 1042’ may trigger a signal used to select a highlighted portion of the menu or a feature on the menu when the menu is displayed. In some embodiments, switching the touch input device 1042’ may trigger a signal used to select right elbow bias when the elbow bias function is activated using the menu.
- the hand controllers 1001’, 1002’ may have the first paddles 1021’, 1023’ and second paddles 1022’, 1024’ coupled to finger loops 1061, 1062, 1063, 1064, respectively.
- Each finger loop can be a Velcro type. In some embodiments (not illustrated), each finger loop can be a hook type.
- Deflection or depression of the first paddle 1021’, 1023’, and the second paddle 1022’, 1024’ is configured to trigger a signal to control a tool or an instrument tip (e.g., opening/closing an aperture of graspers/jaws of an instrument tip) at a distal end of a robotic arm of the surgical robotic system.
- depressing the first paddle 1021’, 1023’ and the second paddle 1022’, 1024’ may change an angle of jaws of a grasper at a distal end of the respective robotic arm.
- end effectors, tools or instruments are used to pull tissue apart, drive a needle driver, grab an item (e.g., a mesh, suture, needle) or pick up such an item in the body cavity when it is dropped, deliver energy via an electrosurgical unit (ESU) (e.g., to cut or to coagulate).
- first buttons 1031’, 1034’ may have a slider button type. Sliding the first button 1031’, 1034’ may trigger a signal used to control a clutch function for the corresponding hand controller of the surgical robotic system.
- FIG. 9 is a side view of an example position sensor with a target 702, coil sensors 704a and 704b, and background in a robotic joint, in accordance with some embodiments.
- a position sensor 132 includes a target 702, a portion of which is in the vicinity of a sensing coil 704a, and another portion of which is completely covered by or underneath a sensing coil 704b.
- the combination of the target 702 and coil sensors 704a and 704b forms a variable inductor.
- FIG. 10 is a top down view of the position sensor 132 with the target 702, and the sensing coils 704a and 704b in the robotic arm.
- FIG. 8 shows a sensor stack comprising two coils separated by a distance of 0.5 mm, between which the target 702 travels as the robotic arm 42A or 42B rotates.
- FIG. 10 illustrates an example shape of the target 702 as well as the sensor coils 704a and 704b used in the position sensor 132, in accordance with some embodiments.
- the position sensor 132 can include the target 702 and the sensing coils 704a and 704b.
- the target 702 can be ring shaped with varying thickness around the circumference of the ring.
- the target 702 can be made of a material that has a high permeability while having a low conductivity such as shielding films that may be punched into a pattern that is useful as an encoder target.
- the sensing coils 704a and 704b can have a high conductivity and low permeability.
- the target 702 and the sensing coils 704a and 704b can form a variable inductor.
- the target 702 can be a copper trace that moves as a joint rotates.
- position sensor 132 can determine the position of the rotary joints in the virtual elbow 128 as the target of the position sensor 132 rotates and comes within proximity of the corresponding sensing coils.
- the sensing coils 704a and 704b can be a set of fixed location copper traces (not shown) which feed back to the rest of a filtering circuit as shown in FIG. 11.
- the sensing coils 704a and 704b can be represented schematically as seen in FIG. 11 as part of a variable inductor 903.
- FIG. 11 is a schematic of the filter circuit 900 electrically coupled to a controller 26 that is used to determine a position of a robotic arm in response to a signal that is input to the filter circuit 900, in accordance with some embodiments.
- the filter circuit includes a fixed value capacitor 902 and a variable inductor 903.
- the variable inductor 903 is a circuit element corresponding to the target 702 and the sensing coils 704a and 704b.
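The relationship between the fixed capacitor 902, the variable inductance, and the filter's cutoff can be sketched with the standard LC resonance formula. The component values below are illustrative assumptions only; they are not values stated in this disclosure:

```python
import math

def cutoff_frequency(L, C):
    """Resonant (cutoff) frequency in Hz of an LC filter: 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

def inductance_from_cutoff(f_c, C):
    """Invert the relation to recover the inductance from a measured cutoff."""
    return 1.0 / ((2.0 * math.pi * f_c) ** 2 * C)

# Illustrative (assumed) component values: a 100 pF fixed capacitor with a
# 10 uH coil yields a cutoff near 5 MHz, matching the exemplary 5 MHz cutoff
# discussed for FIGS. 13A-13E.
C = 100e-12
L = 10e-6
f_c = cutoff_frequency(L, C)             # about 5.03e6 Hz
L_back = inductance_from_cutoff(f_c, C)  # recovers ~10 uH
```

Because the capacitor is fixed, a shift in the measured cutoff maps directly to a change in the variable inductance, which is the quantity the controller ultimately needs.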
- the controller 26 can execute instructions in accordance with a Universal Asynchronous Receiver Transmitter (UART).
- the UART converts parallel data from a controlling device, such as the processor 22, into serial data, and transmits the serial data using the transmitter to a receiver of a UART.
- the serial data is transmitted by a transmitter of the UART through a filter circuit, and it is received by the receiver of the UART. The UART then converts the serial data back into parallel data which is further processed by a processor.
- Data is transferred from the processor 22 via a data bus to the UART in parallel form.
- once the UART gets the parallel data from the data bus, it adds a start bit, a parity bit, and a stop bit, creating a data packet.
- the data packet is output serially, bit by bit at the transmitter of the UART, and then the receiver of the UART reads the data packet bit by bit.
- the receiver of the UART then converts the data packet from serial form into parallel form, and then removes the start bit, parity bit, and stop bit.
- the receiver of the UART then transfers the bits on the data bus to the processor 22.
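The framing steps above (start bit, LSB-first data bits, parity bit, stop bit) can be sketched as follows. This is a minimal illustration of UART packet assembly, not the actual firmware of the controller 26:

```python
def frame_byte(data, parity="even"):
    """Frame one data byte as UART bits: a start bit (0), eight data bits
    LSB-first, a parity bit, and a stop bit (1)."""
    data_bits = [(data >> i) & 1 for i in range(8)]
    ones = sum(data_bits)
    parity_bit = ones % 2 if parity == "even" else 1 - (ones % 2)
    return [0] + data_bits + [parity_bit, 1]

def unframe(bits, parity="even"):
    """Receiver side: check framing and parity, strip the overhead bits,
    and rebuild the data byte."""
    assert bits[0] == 0 and bits[-1] == 1, "bad start/stop bit"
    data_bits = bits[1:9]
    expected = sum(data_bits) % 2 if parity == "even" else 1 - sum(data_bits) % 2
    assert bits[9] == expected, "parity error"
    return sum(b << i for i, b in enumerate(data_bits))

# Round trip: the receiver recovers the byte the transmitter framed.
byte_back = unframe(frame_byte(0xA5))   # 0xA5
```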
- the controller 26 is electrically coupled to an input of the filter circuit 900, via a transmit pin or output, for example, UART TX 26a and an output of the filter circuit 900 is electrically coupled to a receive pin or input of the controller 26, for example, UART RX 26b of the controller.
- the filter circuit 900 is made of minimal components, including the fixed value capacitor 902 and the target 702, which, in combination with the sensing coils 704a and 704b (in some embodiments incorporated into a flexible printed circuit board (PCB)), together form an inductor of variable value depending on the positional relationship of the target 702 and the sensing coils 704a and 704b.
- as the joint rotates, the thickness or the width, or both, of the portion of the target 702 passing over, under, or between the sensor coils 704a and 704b changes, thereby creating an inductance that varies relative to the portion of the target 702 passing by the sensor coils 704a and 704b.
- the transmitter, of the UART, UART TX 26a can transmit a square wave signal of varying frequencies depending on the underlying capabilities of the controller, and the receiver of the UART, UART RX 26b, can receive the same square wave signal or an altered version of the square wave signal based on the filter circuit 900 between the transmitter and receiver of the UART.
- the controller can control the frequency at which the square wave signal is transmitted by updating the clock configuration of the UART, so that the frequency of the bits transmitted is at a specific arbitrary value. Alternatively, the controller can transmit a square wave signal of varying frequencies while maintaining the UART at a specific transmission rate, by modulating a packet containing data so that the effective frequency is lowered.
- a further way to transmit a square wave signal with varying frequencies is by combining the two aforementioned ways of transmitting the signal using varying frequencies. Using this frequency variation, the transmitter can perform a sweep across different frequencies as the input to the filter circuit 900.
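A sweep of this kind can be sketched as below. The `transmit` and `receive` hooks are hypothetical placeholders for a UART driver; they are not part of any API described in this disclosure:

```python
def sweep_filter(transmit, receive, pattern, frequencies):
    """Sweep a known test pattern across transmit frequencies and record,
    per frequency, whether the filter returned the pattern intact.

    `transmit(pattern, freq)` and `receive()` are hypothetical hooks into
    a UART driver, named here for illustration only."""
    results = {}
    for f in frequencies:
        transmit(pattern, f)
        results[f] = (receive() == pattern)
    return results

def corruption_threshold(results):
    """Lowest swept frequency whose returned data no longer matches the
    transmitted pattern (None if the pattern always survived)."""
    for f in sorted(results):
        if not results[f]:
            return f
    return None
```

The threshold found by `corruption_threshold` is the observable from which the filter's cutoff, and hence the inductance, can be inferred.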
- the filter circuit 900 may have a variable frequency response based on the present capacitor value, which can be a fixed value, and an inductance value which is variable.
- Lower frequency signals may pass through the filter circuit 900 unattenuated, but higher frequency signals may be modulated or attenuated depending on the inductance value.
- as the frequency of the input signal increases, the output of the filter circuit 900 may begin to degrade and the output signal from the filter circuit 900 may be a corrupted version of the input signal. Based on the threshold at which the corruption becomes more pronounced and the characteristics of the corrupted data, the frequency response of the filter circuit 900 can be characterized.
- the sensed inductance value can be mapped to a corresponding rotary joint angle by utilizing an external tool to determine the actual angle values of the rotary joint at different turn radii, and mapping the sensed inductance value to the different turn radii and thus to different rotary joint angles.
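The calibration mapping described above can be sketched as a linear interpolation over externally measured (inductance, angle) pairs. The calibration points in the usage example are invented for illustration only:

```python
from bisect import bisect_left

def make_angle_lookup(calib):
    """Build an inductance -> angle interpolator from calibration pairs.

    `calib` is a list of (inductance, angle) pairs measured with an external
    reference tool at known joint angles."""
    calib = sorted(calib)
    inds = [L for L, _ in calib]
    angs = [a for _, a in calib]

    def lookup(L):
        # Clamp to the calibrated range, then interpolate linearly.
        if L <= inds[0]:
            return angs[0]
        if L >= inds[-1]:
            return angs[-1]
        i = bisect_left(inds, L)
        t = (L - inds[i - 1]) / (inds[i] - inds[i - 1])
        return angs[i - 1] + t * (angs[i] - angs[i - 1])

    return lookup

# Hypothetical calibration: 10 uH at 0 degrees, 20 uH at 90 degrees.
lookup = make_angle_lookup([(10e-6, 0.0), (20e-6, 90.0)])
angle = lookup(15e-6)   # roughly 45 degrees
```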
- FIG. 12A is an example timing diagram illustrating a comparison of the shape of a signal modulated at a frequency of 1kHz that is input to the filter circuit 900 and a corresponding output signal from the filter circuit 900, corresponding to the frequency response of the filter circuit 900.
- When the UART TX 26a transmits an input signal, filter input signal 1065a, to the filter circuit 900, the filter circuit outputs an output signal, filter output signal 1065b, that is received by the UART RX 26b.
- the digital representations of the filter input and filter output signals are the same.
- the filter input signal can be a packet of bits that are transmitted serially from the UART TX 26a that can be expressed in a hexadecimal format interpretable by a controller.
- the controller 26 can read in data expressed in a hexadecimal format to be transmitted in a packet and convert the hexadecimal value into a parallel stream of bits that will be transferred to the UART TX 26a via a bus.
- the data that is expressed in the hexadecimal format can be referred to as a controller data representation.
- the controller data representation 1001 of the filter input signal 1065a for a signal that is modulated using a 1kHz frequency can be 0x00FF 00FF 00FF
- the corresponding controller data representation 1001 of the filter output signal 1065b can be 0x00FF 00FF 00FF.
- input signals that are modulated using lower frequencies can pass through the filter circuit 900 unattenuated, as is the case for the input signal that is modulated using a frequency of 1kHz.
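The hexadecimal representations above can be reproduced by sampling an ideal square wave at the UART bit clock. The 16 kbit/s bit rate below is an assumed value chosen so that one 1 kHz period spans exactly 16 bits; the disclosure does not specify the actual bit clock:

```python
def square_wave_bits(freq_hz, bit_rate_hz, n_bits):
    """Sample an ideal square wave (low half-period first) once per UART bit."""
    period_bits = bit_rate_hz / freq_hz          # bits per square-wave period
    return [1 if (i % period_bits) >= period_bits / 2 else 0
            for i in range(n_bits)]

def bits_to_hex(bits):
    """Pack bits MSB-first into bytes and render them as a hex string."""
    out = []
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(f"{byte:02X}")
    return "".join(out)

# With the assumed 16 kbit/s bit clock, a 1 kHz square wave spans 16 bits per
# period: eight 0s followed by eight 1s, i.e. the bytes 0x00, 0xFF repeating.
hex_repr = bits_to_hex(square_wave_bits(1_000, 16_000, 48))  # "00FF00FF00FF"
```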
- FIG. 12B is an example timing diagram illustrating a comparison of the shape of a signal modulated at a frequency of 1333Hz that is input to the filter circuit 900 and a corresponding output signal from the filter circuit 900, corresponding to the frequency response of the filter circuit 900.
- UART TX 26a transmits an input signal, filter input signal 1065a, to the filter circuit 900
- the filter circuit outputs an output signal, filter output signal 1065b, that is received by the UART RX 26b.
- the digital representation of the input and output data is not the same.
- the corresponding controller data representation 1001 of the filter output signal 1065b can be 0x10F0 1F01 F01F.
- input signals that are modulated using frequencies above a certain threshold have a tendency to be slightly attenuated or altered as they pass through the filter circuit 900, as is the case for the input signal that is modulated using a frequency of 1333Hz.
- FIG. 12C is an example timing diagram illustrating a comparison of the shape of a signal modulated at a frequency of 2kHz that is input to the filter circuit 900 and a corresponding output signal from the filter circuit 900, corresponding to the frequency response of the filter circuit 900.
- When the UART TX 26a transmits an input signal, filter input signal 1065a, to the filter circuit 900, the filter circuit outputs an output signal, filter output signal 1065b, that is received by the UART RX 26b.
- the digital representation of the input and output data is not the same.
- the corresponding controller data representation 1001 of the filter output signal 1065b can be 0x060E 078C 1F07.
- input signals that are modulated using frequencies above a certain threshold can be severely attenuated or altered as they pass through the filter circuit 900.
- the input signal that is modulated using a frequency of 2kHz can be severely attenuated or altered as it passes through the filter circuit 900.
- FIG. 13A is an example 4 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
- Controller 26 can generate an output signal 1101 waveform oscillating at 4 MHz to the filter circuit 900, which can in turn cause the filter circuit 900 to generate a frequency response and output a return signal 1102 that is received by the controller 26.
- a first threshold (e.g., threshold high 1103)
- a second threshold (e.g., threshold low 1104)
- the distortion happens as a result of the frequency of the output signal 1101 increasing, thereby causing the filter circuit 900 to corrupt the output signal 1101 and generate the return signal 1102 that has a different shape.
- An exemplary frequency cutoff of the filter circuit 900 can be 5 MHz, which is greater than the 4 MHz output signal 1101 as shown in FIG. 13A.
- the return signal 1102 closely approximates the shape of the output signal 1101, and the area beneath both curves is approximately the same.
- FIG. 13B is an example 10 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
- Controller 26 can generate an output signal 1101 waveform oscillating at 10 MHz to the filter circuit 900, which can in turn cause the filter circuit 900 to generate a frequency response and output a return signal 1102 that is received by the controller 26. Because the exemplary cutoff frequency is 5 MHz, which is less than the 10 MHz output signal 1101 as shown in FIG. 13B, the return signal 1102 does not approximate the shape of the output signal 1101 as closely as when the output signal 1101 is oscillating at a frequency of 4 MHz.
- the difference between the areas beneath the two curves is greater than the corresponding difference when the output signal 1101 is oscillating at a frequency of 4 MHz.
- FIG. 13C is an example 20 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
- Controller 26 can generate an output signal 1101 waveform oscillating at 20 MHz to the filter circuit 900, which can in turn cause the filter circuit 900 to generate a frequency response and output a return signal 1102 that is received by the controller 26.
- Because the exemplary cutoff frequency is 5 MHz, which is less than the 20 MHz output signal 1101 as shown in FIG. 13C, the return signal 1102 does not approximate the shape of the output signal 1101 as closely as when the output signal 1101 is oscillating at a frequency of 4 or 10 MHz.
- the difference between the areas beneath the two curves is greater than the corresponding difference when the output signal 1101 is oscillating at a frequency of 4 or 10 MHz.
- FIG. 13D is an example 40 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
- Controller 26 can generate an output signal 1101 waveform oscillating at 40 MHz to the filter circuit 900, which can in turn cause the filter circuit 900 to generate a frequency response and output a return signal 1102 that is received by the controller 26.
- Because the exemplary cutoff frequency is 5 MHz, which is less than the 40 MHz output signal 1101 as shown in FIG. 13D, the return signal 1102 does not approximate the shape of the output signal 1101 as closely as when the output signal 1101 is oscillating at a frequency of 4, 10, or 20 MHz.
- the difference between the areas beneath the two curves is greater than the corresponding difference when the output signal 1101 is oscillating at a frequency of 4, 10, or 20 MHz.
- FIG. 13E is an example 60 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
- Controller 26 can generate an output signal 1101 waveform oscillating at 60 MHz to the filter circuit 900, which can in turn cause the filter circuit 900 to generate a frequency response and output a return signal 1102 that is received by the controller 26.
- Because the exemplary cutoff frequency is 5 MHz, which is less than the 60 MHz output signal 1101 as shown in FIG. 13E, the return signal 1102 does not approximate the shape of the output signal 1101 as closely as when the output signal 1101 is oscillating at a frequency of 4, 10, 20, or 40 MHz.
- the difference between the areas beneath the two curves is greater than the corresponding difference when the output signal 1101 is oscillating at a frequency of 4, 10, 20, or 40 MHz.
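One way to quantify the comparison of the areas beneath the curves in FIGS. 13A-13E is a rectangle-rule area mismatch. This is a sketch of the comparison idea only, not a method stated in this disclosure; the sample values are invented:

```python
def area(samples, dt):
    """Approximate the area beneath a sampled waveform (rectangle rule)."""
    return sum(samples) * dt

def area_mismatch(tx_samples, rx_samples, dt):
    """Absolute difference between the areas beneath the output and return
    signals; a larger mismatch indicates stronger attenuation by the filter."""
    return abs(area(tx_samples, dt) - area(rx_samples, dt))

# Illustrative samples (not measured data): the more the return signal 1102
# is attenuated relative to the output signal 1101, the larger the mismatch.
tx = [1.0] * 10                                  # idealized output samples
mild = area_mismatch(tx, [0.9] * 10, dt=1.0)     # light attenuation
severe = area_mismatch(tx, [0.5] * 10, dt=1.0)   # heavy attenuation
```

A monotonic metric of this kind is what makes the return signal usable as a proxy for how far the drive frequency sits above the filter's cutoff.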
- the controller 26 can map the return signal 1102 values to a corresponding angle for a rotary joint.
- FIG. 14 is an example flowchart 1200 corresponding to determining a position of a robotic joint using an inductive sensing method, in accordance with some embodiments.
- the controller 26 can transmit an input signal of varying frequency to the filter circuit 900.
- the controller 26 can detect a signal corresponding to a response of the filter circuit 900 to the input signal.
- the controller 26 can characterize a frequency response of the filter circuit 900 based on the signal corresponding to the response of the filter circuit 900 being corrupted relative to the input signal.
- the controller 26 can determine an inductance value of the variable inductor 903 based at least in part on the frequency response of the filter circuit 900.
- the controller 26 can determine the angular position of the rotary joint based at least in part on the inductance value, which is indicative of a positional relationship between the moveable target 702 and the fixed sensing coils 704a and 704b.
- FIG. 15 schematically depicts an example network environment 1300 that the surgical robotic system can be connected to in accordance with some embodiments.
- Computing module 18 can be used to perform one or more steps of the methods provided by example embodiments.
- the computing module 18 includes one or more non-transitory computer-readable media for storing one or more computer-executable instructions or software for implementing example embodiments.
- the non-transitory computer-readable media can include, but are not limited to, one or more types of hardware memory, non-transitory tangible media (for example, one or more magnetic storage disks, one or more optical disks, one or more USB flash drives), and the like.
- memory 1306 included in the computing module 18 can store computer-readable and computer-executable instructions or software for implementing example embodiments.
- the computing module 18 also includes the processor 22 and associated core 1304, for executing computer-readable and computer-executable instructions or software stored in the memory 1306 and other programs for controlling system hardware.
- the processor 22 can be a single core processor or multiple core (1304) processor.
- Memory 1306 can include a computer system memory or random access memory, such as DRAM, SRAM, EDO RAM, and the like.
- the memory 1306 can include other types of memory as well, or combinations thereof.
- a user can interact with the computing module 18 through the display 12, such as a touch screen display or computer monitor, which can display the graphical user interface (GUI) 39.
- the display 12 can also display other aspects, transducers and/or information or data associated with example embodiments.
- the computing module 18 can include other I/O devices for receiving input from a user, for example, a keyboard or any suitable multi-point touch interface 1308, or a pointing device 1310 (e.g., a pen, stylus, mouse, or trackpad).
- the keyboard 1308 and the pointing device 1310 can be coupled to the visual display device 12.
- the computing module 18 can include other suitable conventional I/O peripherals.
- the computing module 18 can also include one or more storage devices 24, such as a hard-drive, CD-ROM, or other computer-readable media, for storing data and computer-readable instructions, applications, and/or software that implements example operations/steps of the surgical robotic system 10 as described herein, or portions thereof, which can be executed to generate GUI 39 on display 12.
- Example storage devices 24 can also store one or more databases for storing any suitable information required to implement example embodiments. The databases can be updated by a user or automatically at any suitable time to add, delete or update one or more items in the databases.
- Example storage device 24 can store one or more databases 1326 for storing provisioned data, and other data/information used to implement example embodiments of the systems and methods described herein.
- the computing module 18 can include a network interface 1312 configured to interface via one or more network devices 1320 with one or more networks, for example, Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (for example, 802.11, T1, T3, 56kb, X.25), broadband connections (for example, ISDN, Frame Relay, ATM), wireless connections, controller area network (CAN), or some combination of any or all of the above.
- the network interface 1312 can include a built-in network adapter, network interface card, PCMCIA network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the computing module 18 to any type of network capable of communication and performing the operations described herein.
- the computing module 18 can be any computer system, such as a workstation, desktop computer, server, laptop, handheld computer, tablet computer (e.g., the iPad® tablet computer), mobile computing or communication device (e.g., the iPhone® communication device), or other form of computing or telecommunications device that is capable of communication and that has sufficient processor power and memory capacity to perform the operations described herein.
- the computing module 18 can run any operating system 1316, such as any of the versions of the Microsoft® Windows® operating systems, the different releases of the Unix and Linux operating systems, any version of the MacOS® for Macintosh computers, any embedded operating system, any real-time operating system, any open source operating system, any proprietary operating system, any operating systems for mobile computing devices, or any other operating system capable of running on the computing device and performing the operations described herein.
- the operating system 1316 can be run in native mode or emulated mode.
- the operating system 1316 can be run on one or more cloud machine instances.
- the computing module 18 can also include an antenna 1330, where the antenna 1330 can transmit wireless transmissions to a radio frequency (RF) front end and receive wireless transmissions from the RF front end.
- the computing module can also include the UART TX 26a and UART RX 26b.
Abstract
Systems and methods are disclosed for determining an angular position of a rotary joint using a filter circuit comprising a capacitor, a variable inductor comprising a moveable target, and at least one sensing coil affixed to the rotary joint. A controller coupled to the filter circuit is disclosed. The controller transmits an input signal of varying frequency to the filter circuit, detects a signal corresponding to a response of the filter circuit to the input signal, and characterizes a frequency response of the filter circuit based on a determination that the signal corresponding to the response is corrupted. The controller determines an inductance value of the variable inductor based on the frequency response of the filter circuit, and the angular position of the rotary joint based on the inductance value which is indicative of a positional relationship between the moveable target and the at least one sensing coil.
Description
SYSTEMS AND METHODS FOR INDUCTIVE PULSE FREQUENCY MODULATED POSITION SENSING
Cross-Reference to Related Applications
[0001] This application claims the benefit of U.S. Provisional Patent Application No. 63/524,220, filed June 30, 2023, the entire contents of which are incorporated herein by reference.
Background of the Disclosure
[0002] Surgical robotic systems permit a user (also described herein as an “operator”) to perform an operation using robotically-controlled instruments to perform tasks and functions during a procedure. Position sensors are used within one or more robotic arms of the surgical robotic system to output signals to a processor that can be used to resolve the position of the one or more arms within a cavity of a patient during a surgical procedure.
[0003] Position sensors allow a user to determine the position of the robotic arm joints. One type of position sensor is known as a Hall effect sensor.
[0004] Conventional hardware that is used to determine the position or angles of rotary joints includes flexible printed circuit boards (PCBs) that bend within the confines of robotic arms, and includes a set of four very small Hall effect sensors per joint to determine the joint angles.
Summary
[0005] A system for determining an angular position of a rotary joint is presented. The system comprises a filter circuit comprising a capacitor and a variable inductor, and a controller coupled to the filter circuit. The variable inductor can comprise a moveable target, and at least one sensing coil affixed to the rotary joint. The controller can be programmed or configured to transmit an input signal of varying frequency to the filter circuit. The controller can be further programmed or configured to detect a signal corresponding to a response of the filter circuit to the input signal. The controller can be further programmed or configured to characterize a frequency response of the filter circuit based at least in part on a determination that the signal corresponding to the response of the filter circuit is corrupted. The controller can be further programmed or configured to determine an inductance value of the variable inductor based at least in part on the frequency response of the filter circuit. The controller can be further programmed or
configured to determine the angular position of the rotary joint based at least in part on the inductance value which is indicative of a positional relationship between the moveable target and the at least one sensing coil.
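The controller behavior summarized above can be sketched in outline as follows. This is a hedged illustration only: the function names, the amplitude-based corruption test, the 100 pF capacitance, and the LC cutoff relationship are assumptions chosen for exposition, not the claimed implementation.

```python
import math

C_FILTER = 100e-12  # assumed filter capacitance (farads), illustrative only


def is_corrupted(input_wave, output_wave, threshold=0.7):
    """Treat the filter response as 'corrupted' once its peak amplitude
    falls below a fraction of the input's peak (i.e., it is attenuated)."""
    return max(output_wave) < threshold * max(input_wave)


def characterize_cutoff(sweep):
    """sweep: iterable of (frequency_hz, input_wave, output_wave) tuples in
    ascending frequency. Returns the first frequency whose response is
    corrupted, as an estimate of the filter's cutoff frequency."""
    for freq, vin, vout in sweep:
        if is_corrupted(vin, vout):
            return freq
    return None


def inductance_from_cutoff(f_c, c=C_FILTER):
    """For an LC filter with f_c = 1 / (2*pi*sqrt(L*C)),
    invert to L = 1 / ((2*pi*f_c)**2 * C)."""
    return 1.0 / ((2.0 * math.pi * f_c) ** 2 * c)


def angle_from_inductance(l_henries, l_min=1e-6, l_max=2e-6, span_deg=180.0):
    """Map inductance linearly onto joint angle (an assumed calibration;
    a real mapping would come from characterizing the target and coils)."""
    frac = (l_henries - l_min) / (l_max - l_min)
    return max(0.0, min(1.0, frac)) * span_deg
```

For instance, under these assumptions, if the first attenuated response appears at 10 MHz, `inductance_from_cutoff(10e6)` yields roughly 2.5 µH, which `angle_from_inductance` would then convert to a joint angle via the assumed calibration.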
Brief Description of the Drawings
[0006] These and other features and advantages of the present invention will be more fully understood by reference to the following detailed description in conjunction with the attached drawings in which like reference numerals refer to like elements throughout the different views. The drawings illustrate principles of the invention and, although not to scale, show relative dimensions.
[0007] FIG. 1 schematically depicts an example surgical robotic system in accordance with some embodiments.
[0008] FIG. 2A is an example perspective view of a patient cart including a robotic support system coupled to a robotic subsystem of the surgical robotic system in accordance with some embodiments.
[0009] FIG. 2B is an example perspective view of an example operator console of a surgical robotic system of the present disclosure in accordance with some embodiments.
[0010] FIG. 3A schematically depicts an example side view of a surgical robotic system performing a surgery within an internal cavity of a subject in accordance with some embodiments.
[0011] FIG. 3B schematically depicts an example top view of the surgical robotic system performing the surgery within the internal cavity of the subject of FIG. 3A in accordance with some embodiments.
[0012] FIG. 4A is an example perspective view of a single robotic arm subsystem in accordance with some embodiments.
[0013] FIG. 4B is an example perspective side view of a single robotic arm of the single robotic arm subsystem of FIG. 4A in accordance with some embodiments.
[0014] FIG. 5 is an example perspective front view of a camera assembly and a robotic arm assembly in accordance with some embodiments.
[0015] FIG. 6A is an example perspective view of a left hand controller for use in an operator console of a surgical robotic system in accordance with some embodiments.
[0016] FIG. 6B is an example perspective view of a right hand controller for use in an operator console of a surgical robotic system in accordance with some embodiments.
[0017] FIG. 7A is an example perspective view of a left hand controller for use in an operator console of a surgical robotic system in accordance with some embodiments.
[0018] FIG. 7B is an example perspective view of a right hand controller for use in an operator console of a surgical robotic system in accordance with some embodiments.
[0019] FIG. 8A is an example perspective view of a left hand controller for use in an operator console of a surgical robotic system in accordance with some embodiments.
[0020] FIG. 8B is an example perspective view of a right hand controller for use in an operator console of a surgical robotic system in accordance with some embodiments.
[0021] FIG. 9 is a side view of an example position sensor with a target, coil sensors, and background in a robotic joint, in accordance with some embodiments.
[0022] FIG. 10 is an example shape of a target as well as sensor coils used in a position sensor, in accordance with some embodiments.
[0023] FIG. 11 is a schematic of a filter circuit coupled to a controller that is used to determine a position of a robotic arm in response to a signal that is input to the filter circuit, in accordance with some embodiments.
[0024] FIG. 12A is an example timing diagram illustrating a comparison of the shape of a signal modulated at a selected frequency that is input to a filter circuit and the shape of a corresponding output signal from the filter circuit, in accordance with some embodiments.
[0025] FIG. 12B is an example timing diagram illustrating a comparison of the shape of a signal modulated at a selected frequency that is input to a filter circuit and the shape of a corresponding output signal from the filter circuit, in accordance with some embodiments.
[0026] FIG. 12C is an example timing diagram illustrating a comparison of the shape of a signal modulated at a selected frequency that is input to a filter circuit and the shape of a corresponding output signal from the filter circuit, in accordance with some embodiments.
[0027] FIG. 13A is an example 4 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
[0028] FIG. 13B is an example 10 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
[0029] FIG. 13C is an example 20 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
[0030] FIG. 13D is an example 40 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
[0031] FIG. 13E is an example 60 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments.
[0032] FIG. 14 is an example flowchart corresponding to determining a position of a robotic joint using an inductive sensing method, in accordance with some embodiments.
[0033] FIG. 15 schematically depicts an example computing module of the surgical robotic system in accordance with some embodiments.
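As context for the waveforms of FIGS. 13A-13E, the progressive change in the return signal as the input frequency passes the 5 MHz cutoff can be illustrated with a simple first-order low-pass model. This model is an illustrative stand-in; the actual filter circuit of the embodiments may have a different order and response shape.

```python
import math

# Simplified first-order low-pass model with a 5 MHz cutoff (an
# illustrative assumption, not the embodiments' actual filter circuit).
F_CUTOFF = 5e6


def lowpass_gain(f_hz, f_c=F_CUTOFF):
    # Magnitude response |H(f)| = 1 / sqrt(1 + (f / f_c)^2)
    return 1.0 / math.sqrt(1.0 + (f_hz / f_c) ** 2)


# Input frequencies from FIGS. 13A-13E: the return signal's amplitude
# falls off progressively as the input frequency exceeds the cutoff.
for f in (4e6, 10e6, 20e6, 40e6, 60e6):
    print(f"{f / 1e6:4.0f} MHz: |H| = {lowpass_gain(f):.3f}")
```

Under this model, the 4 MHz input (below cutoff) passes with only mild attenuation, while the 60 MHz input is attenuated by more than an order of magnitude, which is the kind of shape change the controller can use to detect a corrupted response.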
Detailed Description
[0034] In order to facilitate explanation of the inductive frequency modulated position sensing technology disclosed herein, an application in robotics is disclosed; however, the frequency modulated position sensing technology disclosed herein can also be applied to small personal consumer electronic devices with a dial. For instance, volume knobs or dials that are used to tune a radio to a specific frequency could be an area where frequency modulated position sensing technology is applicable. In order to overcome the difficulties imposed by size constraints, and in particular the need to fit extremely small instruments within robotic arms to sense joint angles as the arms move, the electronics used to sense the joint angles need to fit into small spaces within the robotic arms while remaining flexible enough to bend around the contours of the space available. A filtering circuit, including a capacitor, a target, and sense coils whose inductance changes as the target moves over or near the sense coils, receives signals of varying frequencies from a controller and outputs signals of varying amplitude and frequency in response to the movement of the target and the input signals. The controller generates these signals of varying frequency by modulating the frequency of a signal that has a periodicity equivalent to that of the clock of the controller. Each of the frequency modulated signals corresponds to a unique frequency by which the signal oscillating at the clock frequency is modulated. As the controller sweeps through a range of frequencies by which to modulate the signal input to the filter circuit, the filter circuit generates an output signal that changes in shape as the frequency exceeds a certain value.
The controller can determine the frequency response of the circuit which can be used to determine the inductance value of the filter circuit which is directly related to the position of the rotary joint as the target comes within
proximity of the sense coils. As a result, the position of the rotary joint can quickly be determined using a minimal number of components that fit within the confines of the robotic arms.
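The frequency sweep described above can be approximated in a discrete sketch by deriving each probe frequency from an integer divider of a fixed controller clock. The 120 MHz clock rate and the divider scheme below are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch of deriving swept probe frequencies from a fixed
# controller clock. The 120 MHz clock and integer-divider scheme are
# illustrative assumptions only.
CLOCK_HZ = 120e6


def probe_frequencies(dividers):
    """Square-wave frequencies produced by toggling an output every
    `d` clock cycles: f = clock / (2 * d)."""
    return [CLOCK_HZ / (2 * d) for d in dividers]


# Sweeping the divider from small to large walks the probe frequency
# from high to low, spanning the filter circuit's cutoff region.
sweep = probe_frequencies(range(1, 16))
```

With these assumed values, a divider of 1 yields a 60 MHz probe, 3 yields 20 MHz, and 15 yields 4 MHz, so a single divider sweep covers frequencies on both sides of a cutoff in the low-megahertz range.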
[0035] While various embodiments of the invention have been shown and described herein, it will be clear to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions may occur to those skilled in the art without departing from the invention. It may be understood that various alternatives to the embodiments of the invention described herein may be employed.
[0036] Unless specifically stated or obvious from context, as used herein, the term “about” is understood as within a range of normal tolerance in the art, for example within 2 standard deviations of the mean. “About” can be understood as within 10%, 9%, 8%, 7%, 6%, 5%, 4%, 3%, 2%, 1%, 0.5%, 0.1%, 0.05%, or 0.01% of the stated value. Unless otherwise clear from the context, all numerical values provided herein are modified by the term “about.”
[0037] Although some example embodiments may be described herein or in documents incorporated by reference as employing a plurality of units to perform example processes, it is understood that example processes may also be performed by one or a plurality of modules. Additionally, it is understood that the term controller may refer to a hardware device that includes a memory and a processor and is specifically programmed to execute the processes described herein in accordance with some embodiments. In some embodiments, the memory is configured to store the modules and the processor is specifically configured to execute said modules to perform one or more processes which are described further below. In some embodiments, multiple different controllers or multiple different types of controllers may be employed in performing one or more processes. In some embodiments, different controllers may be implemented in different portions of a surgical robotic system.
[0038] Some embodiments disclosed herein are implemented on, employ, or are incorporated into a surgical robotic system that includes a camera assembly having at least three articulating degrees of freedom and two or more robotic arms each having at least six articulating degrees of freedom and an additional degree of freedom corresponding to the movement of an associated end-effector (e.g., grasper, manipulator, and the like). In some embodiments, the camera assembly when mounted within a subject (e.g., a patient) can be moved or rotated in a pitch or yaw direction about 180 degrees such that the camera assembly can view rearwardly back towards the insertion site. As such, the camera assembly and the robotic arms can view and operate dexterously forward (e.g., away from the insertion site), to
each side, in an upward or downward direction, as well as in the rearward direction to view backwards towards the insertion site. The robotic arms and the camera assembly can also move in the roll, pitch and yaw directions.
[0039] The large number of degrees of freedom in some surgical robotic systems described herein, in comparison to some conventional surgical robotic systems, enables movements of a robotic arm assembly and orientations of a robotic arm assembly not possible with some conventional surgical robotic arms and enables movements of a camera of a robotic camera assembly not possible in cameras for some conventional robotic surgical systems. For example, many conventional surgical robotic systems having two robotic arms and fewer degrees of freedom per arm may not be able to change a position or an orientation of a virtual chest of the robotic arms assembly while keeping instrument tips of end effectors of the robotic arms stationary. As another example, cameras of many conventional surgical robotic systems may only have degrees of freedom associated with movement of a support for the camera extending through a trocar and may have no independent degrees of freedom for movement relative to the support.
[0040] Some embodiments described herein provide methods and systems employing multiple different control modes, which may be described as a plurality of control modes herein, for controlling a surgical robotic system before, during or after a robotic arms assembly of the surgical robotic system is disposed within an internal body cavity of a subject. In some embodiments, the robotic arms assembly includes at least two robotic arms, which may be described as a “robotic arm assembly” or “arm assembly” herein. In some embodiments, the robotic arms assembly also includes a camera assembly, which may also be referred to as a “surgical camera assembly” or “robotic camera assembly” herein. Each control mode uses sensed movement of one or more hand controllers, and may also use input from one or more foot pedals, to control the robotic arm assembly and/or the camera assembly. A control mode may be changed from a current control mode to a different selected control mode based on operator input (e.g., provided via one or more hand controllers and/or a foot pedal of the surgical robotic system). In different control modes, the same movements of the hand controllers may result in different motions of the surgical robotic assembly.
[0041] When describing the control modes, a reference, an orientation or a direction of view of a “camera assembly” or a “camera” is referring to an orientation or a direction of a component or group of components of the surgical robotic arms assembly that includes one or more cameras or other imaging devices that can collectively change orientation with respect
to the robotic arm assembly and provide image data to be displayed. For example, in some embodiments, the one or more cameras or other imaging devices may all be disposed in a same housing whose orientation can be changed relative to a support (e.g., support tube or support shaft) for the camera assembly.
[0042] Some embodiments may be employed with a surgical robotic system. A system for robotic surgery may include a robotic subsystem. The robotic subsystem includes at least a portion, which may also be referred to herein as a robotic arms assembly, that can be inserted into a patient via a trocar through a single incision point or site. The portion inserted into the patient via a trocar is small enough to be deployed in vivo at the surgical site and is sufficiently maneuverable when inserted to be able to move within the body to perform various surgical procedures at multiple different points or sites. The portion inserted into the body that performs functional tasks may be referred to as a surgical robotic module or a robotic arms assembly herein. The surgical robotic module can include multiple different submodules or parts that may be inserted into the trocar separately. The surgical robotic module or robotic arms assembly can include multiple separate robotic arms that are deployable within the patient along different or separate axes. These multiple separate robotic arms may be collectively referred to as a robotic arm assembly herein. Further, a surgical camera assembly can also be deployed along a separate axis. The surgical robotic module or robotic arms assembly may also include the surgical camera assembly. Thus, the surgical robotic module or robotic arms assembly employs multiple different components, such as a pair of robotic arms and a surgical or robotic camera assembly, each of which are deployable along different axes and are separately manipulatable, maneuverable, and movable. The arrangement of robotic arms and camera assembly disposable along separate and manipulatable axes is referred to herein as the Split Arm (SA) architecture.
The SA architecture is designed to simplify and increase efficiency of the insertion of robotic surgical instruments through a single trocar at a single insertion site, while concomitantly assisting with deployment of the surgical instruments into a surgical ready state as well as the subsequent removal of the surgical instruments through the trocar. By way of example, a surgical instrument can be inserted through the trocar to access and perform an operation in vivo in the abdominal cavity of a patient. In some embodiments, various surgical instruments may be used or employed, including but not limited to robotic surgical instruments, as well as other surgical instruments known in the art.
[0043] The systems, devices, and methods disclosed herein can be incorporated into and/or used with a robotic surgical device and associated system disclosed for example in United States Patent No. 10,285,765 and in PCT patent application Serial No. PCT/US2020/39203, and/or with the camera assembly and system disclosed in United States Publication No. 2019/0076199, and/or the systems and methods of exchanging surgical tools in an implantable surgical robotic system disclosed in PCT patent application Serial No. PCT/US2021/058820, where the content and teachings of all of the foregoing patents, patent applications and publications are incorporated herein by reference in their entirety. The surgical robotic module that forms part of the present invention can form part of a surgical robotic system that includes a user workstation that includes appropriate sensors and displays, and a robot support system (RSS) for interacting with and supporting the robotic subsystem of the present invention in some embodiments. The robotic subsystem includes a motor and a surgical robotic module that includes one or more robotic arms and one or more camera assemblies in some embodiments. The robotic arms and camera assembly can form part of a single support axis robotic system, can form part of the split arm (SA) architecture robotic system, or can have another arrangement. The robot support system can provide multiple degrees of freedom such that the robotic module can be maneuvered within the patient into a single position or multiple different positions. In one embodiment, the robot support system can be directly mounted to a surgical table or to the floor or ceiling within an operating room. In another embodiment, the mounting is achieved by various fastening means, including but not limited to, clamps, screws, or a combination thereof. In other embodiments, the structure may be free standing.
The robot support system can mount a motor assembly that is coupled to the surgical robotic module, which includes the robotic arms and the camera assembly. The motor assembly can include gears, motors, drivetrains, electronics, and the like, for powering the components of the surgical robotic module.
[0044] The robotic arms and the camera assembly are capable of multiple degrees of freedom of movement. According to some embodiments, when the robotic arms and the camera assembly are inserted into a patient through the trocar, they are capable of movement in at least the axial, yaw, pitch, and roll directions. The robotic arms are designed to incorporate and employ a multi-degree of freedom of movement robotic arm with an end effector mounted at a distal end thereof that corresponds to a wrist area or joint of the user. In other embodiments, the working end (e.g., the end effector end) of the robotic arm is designed to incorporate and use or employ other robotic surgical instruments, such as for example the
surgical instruments set forth in U.S. Publ. No. 2018/0221102, the entire contents of which are herein incorporated by reference.
[0045] Like numerical identifiers are used throughout the figures to refer to the same elements.
[0046] FIG. 1 is a schematic illustration of an example surgical robotic system 10 in which aspects of the present disclosure can be employed in accordance with some embodiments of the present disclosure. The surgical robotic system 10 includes an operator console 11 and a robotic subsystem 20 in accordance with some embodiments.
[0047] The operator console 11 includes a display 12, an image computing module 14, which may be a three-dimensional (3D) computing module, hand controllers 17 having a sensing and tracking module 16, and a computing module 18. Additionally, the operator console 11 may include a foot pedal array 19 including a plurality of pedals. The image computing module 14 can include a graphical user interface 39. The graphical user interface 39, the controller 26 or the image renderer 30, or both, may render one or more images or one or more graphical user interface elements on the graphical user interface 39. For example, a pillar box associated with a mode of operating the surgical robotic system 10, or any of the various components of the surgical robotic system 10, can be rendered on the graphical user interface 39. Also live video footage captured by a camera assembly 44 can be rendered by the controller 26 or the image renderer 30 on the graphical user interface 39.
[0048] The operator console 11 can include a visualization system 9 that includes a display 12 which may be any selected type of display for displaying information, images or video generated by the image computing module 14, the computing module 18, and/or the robotic subsystem 20. The display 12 can include or form part of, for example, a head-mounted display (HMD), an augmented reality (AR) display (e.g., an AR display, or AR glasses in combination with a screen or display), a screen or a display, a two-dimensional (2D) screen or display, a three-dimensional (3D) screen or display, and the like. The display 12 can also include an optional sensing and tracking module 16A. In some embodiments, the display 12 can include an image display for outputting an image from a camera assembly 44 of the robotic subsystem 20.
[0049] The hand controllers 17 are configured to sense a movement of the operator’s hands and/or arms to manipulate the surgical robotic system 10. The hand controllers 17 can include the sensing and tracking module 16, circuity, and/or other hardware. The sensing and tracking module 16 can include one or more sensors or detectors that sense movements of the operator’s hands. In some embodiments, the one or more sensors or detectors that sense
movements of the operator’s hands are disposed in the hand controllers 17 that are grasped by or engaged by hands of the operator. In some embodiments, the one or more sensors or detectors that sense movements of the operator’s hands are coupled to the hands and/or arms of the operator. For example, the sensors of the sensing and tracking module 16 can be coupled to a region of the hand and/or the arm, such as the fingers, the wrist region, the elbow region, and/or the shoulder region. Additional sensors can also be coupled to a head and/or neck region of the operator in some embodiments. In some embodiments, the sensing and tracking module 16 can be external and coupled to the hand controllers 17 via electrical components and/or mounting hardware. In some embodiments, the optional sensor and tracking module 16A may sense and track movement of one or more of an operator’s head, of at least a portion of an operator’s head, an operator’s eyes or an operator’s neck based, at least in part, on imaging of the operator in addition to or instead of by a sensor or sensors attached to the operator’s body.
[0050] In some embodiments, the sensing and tracking module 16 can employ sensors coupled to the torso of the operator or any other body part. In some embodiments, the sensing and tracking module 16 can employ, in addition to the sensors, an Inertial Momentum Unit (IMU) having for example an accelerometer, gyroscope, magnetometer, and a motion processor. The addition of a magnetometer allows for reduction in sensor drift about a vertical axis. In some embodiments, the sensing and tracking module 16 also includes sensors placed in surgical material such as gloves, surgical scrubs, or a surgical gown. The sensors can be reusable or disposable. In some embodiments, sensors can be disposed external of the operator, such as at fixed locations in a room, such as an operating room. The external sensors 37 can generate external data 36 that can be processed by the computing module 18 and hence employed by the surgical robotic system 10.
[0051] The sensors generate position and/or orientation data indicative of the position and/or orientation of the operator’s hands and/or arms. The sensing and tracking modules 16 and/or 16A can be utilized to control movement (e.g., changing a position and/or an orientation) of the camera assembly 44 and robotic arms 42 of the robotic subsystem 20. The tracking and position data 34 generated by the sensing and tracking module 16 can be conveyed to the computing module 18 for processing by at least one processor 22.
[0052] The computing module 18 can determine or calculate, from the tracking and position data 34 and 34A, the position and/or orientation of the operator’s hands or arms, and in some embodiments of the operator’s head as well, and convey the tracking and position data 34 and 34A to the robotic subsystem 20. The tracking and position data 34, 34A can be processed
by the processor 22 and can be stored for example in the storage 24. The tracking and position data 34 and 34A can also be used by the controller 26, which in response can generate control signals for controlling movement of the robotic arms 42 and/or the camera assembly 44. For example, the controller 26 can change a position and/or an orientation of at least a portion of the camera assembly 44, of at least a portion of the robotic arms 42, or both. In some embodiments, the controller 26 can also adjust the pan and tilt of the camera assembly 44 to follow the movement of the operator’s head.
[0053] The robotic subsystem 20 can include a robot support system (RSS) 46 having a motor 40 and a trocar 50 or trocar mount, the robotic arms 42, and the camera assembly 44. The robotic arms 42 and the camera assembly 44 can form part of a single support axis robot system, such as that disclosed and described in U.S. Patent No. 10,285,765, or can form part of a split arm (SA) architecture robot system, such as that disclosed and described in PCT Patent Application No. PCT/US2020/039203, both of which are incorporated herein by reference in their entirety.
[0054] The robotic subsystem 20 can employ multiple different robotic arms that are deployable along different or separate axes. In some embodiments, the camera assembly 44, which can employ multiple different camera elements, can also be deployed along a common separate axis. Thus, the surgical robotic system 10 can employ multiple different components, such as a pair of separate robotic arms and the camera assembly 44, which are deployable along different axes. In some embodiments, the robotic arms assembly 42 and the camera assembly 44 are separately manipulatable, maneuverable, and movable. The robotic subsystem 20, which includes the robotic arms 42 and the camera assembly 44, is disposable along separate manipulatable axes, and is referred to herein as an SA architecture. The SA architecture is designed to simplify and increase efficiency of the insertion of robotic surgical instruments through a single trocar at a single insertion point or site, while concomitantly assisting with deployment of the surgical instruments into a surgical ready state, as well as the subsequent removal of the surgical instruments through a trocar 50 as further described below.
[0055] The RSS 46 can include the motor 40 and the trocar 50 or a trocar mount. The RSS 46 can further include a support member that supports the motor 40 coupled to a distal end thereof. The motor 40 in turn can be coupled to the camera assembly 44 and to each of the robotic arms assembly 42. The support member can be configured and controlled to move linearly, or in any other selected direction or orientation, one or more components of the robotic subsystem 20. In some embodiments, the RSS 46 can be free standing. In some
embodiments, the RSS 46 can include the motor 40 that is coupled to the robotic subsystem 20 at one end and to an adjustable support member or element at an opposed end.
[0056] The motor 40 can receive the control signals generated by the controller 26. The motor 40 can include gears, one or more motors, drivetrains, electronics, and the like, for powering and driving the robotic arms 42 and the camera assembly 44 separately or together. The motor 40 can also provide mechanical power, electrical power, mechanical communication, and electrical communication to the robotic arms 42, the camera assembly 44, and/or other components of the RSS 46 and robotic subsystem 20. The motor 40 can be controlled by the computing module 18. The motor 40 can thus generate signals for controlling one or more motors that in turn can control and drive the robotic arms 42, including for example the position and orientation of each robot joint of each robotic arm, as well as the camera assembly 44. The motor 40 can further provide for a translational or linear degree of freedom that is first utilized to insert and remove each component of the robotic subsystem 20 through a trocar 50. The motor 40 can also be employed to adjust the inserted depth of each robotic arm 42 when inserted into the patient 100 through the trocar 50.
[0057] The trocar 50 is a medical device that can be made up of an awl (which may be a metal or plastic sharpened or non-bladed tip), a cannula (essentially a hollow tube), and a seal in some embodiments. The trocar 50 can be used to place at least a portion of the robotic subsystem 20 in an interior cavity of a subject (e.g., a patient) and can withdraw gas and/or fluid from a body cavity. The robotic subsystem 20 can be inserted through the trocar 50 to access and perform an operation in vivo in a body cavity of a patient. In some embodiments, the robotic subsystem 20 can be supported, at least in part, by the trocar 50 or a trocar mount with multiple degrees of freedom such that the robotic arms 42 and the camera assembly 44 can be maneuvered within the patient into a single position or multiple different positions. In some embodiments, the robotic arms 42 and camera assembly 44 can be moved with respect to the trocar 50 or a trocar mount with multiple different degrees of freedom such that the robotic arms 42 and the camera assembly 44 can be maneuvered within the patient into a single position or multiple different positions.
[0058] In some embodiments, the RSS 46 can further include an optional controller for processing input data from one or more of the system components (e.g., the display 12, the sensing and tracking module 16, the robotic arms 42, the camera assembly 44, and the like), and for generating control signals in response thereto. The motor 40 can also include a storage element for storing data in some embodiments.
[0059] The robotic arms 42 can be controlled to follow the scaled-down movement or motion of the operator’s arms and/or hands as sensed by the associated sensors in some embodiments and in some modes of operation. The robotic arms 42 include a first robotic arm including a first end effector disposed at a distal end of the first robotic arm, and a second robotic arm including a second end effector disposed at a distal end of the second robotic arm. In some embodiments, the robotic arms 42 can have portions or regions that can be associated with movements of the shoulder, elbow, and wrist joints as well as the fingers of the operator. For example, the robotic elbow joint can follow the position and orientation of the human elbow, and the robotic wrist joint can follow the position and orientation of the human wrist. The robotic arms 42 can also have associated therewith end regions that can terminate in end-effectors that follow the movement of one or more fingers of the operator in some embodiments, such as for example the index finger as the user pinches together the index finger and thumb. In some embodiments, the robotic arms 42 may follow movement of the arms of the operator in some modes of control while a virtual chest of the robotic arms assembly remains stationary (e.g., in an instrument control mode). In some embodiments, the position and orientation of the torso of the operator are subtracted from the position and orientation of the operator’s arms and/or hands. This subtraction allows the operator to move his or her torso without the robotic arms moving. Further disclosure regarding control of movement of individual arms of a robotic arm assembly is provided in International Patent Application Publications WO 2022/094000 A1 and WO 2021/231402 A1, each of which is incorporated by reference herein in its entirety.
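The torso-subtraction idea described above can be illustrated with a minimal sketch (not from the disclosure; function and variable names are hypothetical): the hand position is expressed in a torso-fixed frame, so moving the torso together with the hands yields no change in the commanded arm position.

```python
import math

# Illustrative torso-subtraction sketch: express the operator's hand position
# in a torso-fixed frame (2D for brevity), so that torso motion with hands
# held fixed relative to the torso produces no commanded arm motion.
# All names are assumptions made for this example.

def hand_in_torso_frame(hand_xy, torso_xy, torso_yaw_rad):
    """Translate and un-rotate a hand position into the torso frame."""
    dx = hand_xy[0] - torso_xy[0]
    dy = hand_xy[1] - torso_xy[1]
    c, s = math.cos(-torso_yaw_rad), math.sin(-torso_yaw_rad)
    return (c * dx - s * dy, s * dx + c * dy)
```

If the torso and hand translate together (e.g., both shift by one unit), the torso-frame result is unchanged, which is the behavior the paragraph describes.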
[0060] The camera assembly 44 is configured to provide the operator with image data 48, such as for example a live video feed of an operation or surgical site, as well as enable the operator to actuate and control the cameras forming part of the camera assembly 44. In some embodiments, the camera assembly 44 can include one or more cameras (e.g., a pair of cameras), the optical axes of which are axially spaced apart by a selected distance, known as the inter-camera distance, to provide a stereoscopic view or image of the surgical site. In some embodiments, the operator can control the movement of the cameras via movement of the hands via sensors coupled to the hands of the operator or via hand controllers 17 grasped or held by hands of the operator, thus enabling the operator to obtain a desired view of an operation site in an intuitive and natural manner. In some embodiments, the operator can additionally control the movement of the camera via movement of the operator’s head. The camera assembly 44 is movable in multiple directions, including for example in yaw, pitch and roll directions relative to a direction of view. In some embodiments, the components of
the stereoscopic cameras can be configured to provide a user experience that feels natural and comfortable. In some embodiments, the interaxial distance between the cameras can be modified to adjust the depth of the operation site perceived by the operator.
[0061] The image or video data 48 generated by the camera assembly 44 can be displayed on the display 12. In embodiments in which the display 12 includes an HMD, the display can include the built-in sensing and tracking module 16A that obtains raw orientation data for the yaw, pitch and roll directions of the HMD as well as positional data in Cartesian space (x, y, z) of the HMD. In some embodiments, positional and orientation data regarding an operator’s head may be provided via a separate head-tracking module. In some embodiments, the sensing and tracking module 16A may be used to provide supplementary position and orientation tracking data of the display in lieu of or in addition to the built-in tracking system of the HMD. In some embodiments, no head tracking of the operator is used or employed. In some embodiments, images of the operator may be used by the sensing and tracking module 16A for tracking at least a portion of the operator’s head.
[0062] FIG. 2A depicts an example robotic arms assembly 20, which is also referred to herein as a robotic subsystem, of a surgical robotic system 10 incorporated into or mounted onto a mobile patient cart in accordance with some embodiments. In some embodiments, the robotic arms assembly 20 includes the RSS 46, which, in turn includes the motor 40, the robotic arm assembly 42 having end-effectors 45, the camera assembly 44 having one or more cameras 47, and may also include the trocar 50 or a trocar mount.
[0063] FIG. 2B depicts an example of an operator console 11 of the surgical robotic system 10 of the present disclosure in accordance with some embodiments. The operator console 11 includes a display 12, hand controllers 17, and also includes one or more additional controllers, such as a foot pedal array 19 for control of the robotic arms 42, for control of the camera assembly 44, and for control of other aspects of the system.
[0064] FIG. 2B also depicts the left hand controller subsystem 23A and the right hand controller subsystem 23B of the operator console. The left hand controller subsystem 23A includes and supports the left hand controller 17A and the right hand controller subsystem 23B includes and supports the right hand controller 17B. In some embodiments, the left hand controller subsystem 23A may releasably connect to or engage the left hand controller 17A, and the right hand controller subsystem 23B may releasably connect to or engage the right hand controller 17B. In some embodiments, the connections may be both physical and electronic so that the left hand controller subsystem 23A and the right hand controller subsystem 23B may receive signals from the left hand controller 17A and the right hand controller 17B,
respectively, including signals that convey inputs received from a user selection on a button or touch input device of the left hand controller 17A or the right hand controller 17B.
[0065] Each of the left hand controller subsystem 23A and the right hand controller subsystem 23B may include components that enable a range of motion of the respective left hand controller 17A and right hand controller 17B, so that the left hand controller 17A and right hand controller 17B may be translated or displaced in three dimensions and may additionally move in the roll, pitch, and yaw directions. Additionally, each of the left hand controller subsystem 23A and the right hand controller subsystem 23B may register movement of the respective left hand controller 17A and right hand controller 17B in each of the foregoing directions and may send a signal providing such movement information to a processor (not shown) of the surgical robotic system.
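Purely as an illustration of the movement signal described above (the field and function names are hypothetical, not from the disclosure), each sample from a hand controller can be modeled as a six-degree-of-freedom pose, with the processor consuming per-axis deltas between successive samples:

```python
from dataclasses import dataclass

# Illustrative shape of a hand-controller movement sample: translation in
# three dimensions plus roll/pitch/yaw. A delta between successive samples
# is what a downstream processor would consume. Names are assumptions.

@dataclass
class ControllerPose:
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float

def pose_delta(prev, curr):
    """Per-axis change between two controller pose samples."""
    return ControllerPose(
        curr.x - prev.x, curr.y - prev.y, curr.z - prev.z,
        curr.roll - prev.roll, curr.pitch - prev.pitch, curr.yaw - prev.yaw)
```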
[0066] In some embodiments, each of the left hand controller subsystem 23A and the right hand controller subsystem 23B may be configured to receive and connect to or engage different hand controllers (not shown). For example, hand controllers with different configurations of buttons and touch input devices may be provided. Additionally, hand controllers with a different shape may be provided. The hand controllers may be selected for compatibility with a particular surgical robotic system or a particular surgical robotic procedure or selected based upon preference of an operator with respect to the buttons and input devices or with respect to the shape of the hand controller in order to provide greater comfort and ease for the operator.
[0067] FIG. 3A schematically depicts a side view of the surgical robotic system 10 performing a surgery within an internal cavity 104 of a subject 100 in accordance with some embodiments and for some surgical procedures. FIG. 3B schematically depicts a top view of the surgical robotic system 10 performing the surgery within the internal cavity 104 of the subject 100. The subject 100 (e.g., a patient) is placed on an operation table 102 (e.g., a surgical table 102). In some embodiments, and for some surgical procedures, an incision is made in the patient 100 to gain access to the internal cavity 104. The trocar 50 is then inserted into the patient 100 at a selected location to provide access to the internal cavity 104 or operation site. The RSS 46 can then be maneuvered into position over the patient 100 and the trocar 50. In some embodiments, the RSS 46 includes a trocar mount that attaches to the trocar 50. The robotic arms assembly 20 can be coupled to the motor 40 and at least a portion of the robotic arms assembly can be inserted into the trocar 50 and hence into the internal cavity 104 of the patient 100. For example, the camera assembly 44 and the robotic arm assembly 42 can be inserted individually and sequentially into the patient 100 through the
trocar 50. Although the camera assembly and the robotic arm assembly may include some portions that remain external to the subject’s body in use, references to insertion of the robotic arm assembly 42 and/or the camera assembly into an internal cavity of a subject and disposing the robotic arm assembly 42 and/or the camera assembly 44 in the internal cavity of the subject are referring to the portions of the robotic arm assembly 42 and the camera assembly 44 that are intended to be in the internal cavity of the subject during use. The sequential insertion method has the advantage of supporting smaller trocars, and thus smaller incisions can be made in the patient 100, reducing the trauma experienced by the patient 100. In some embodiments, the camera assembly 44 and the robotic arm assembly 42 can be inserted in any order or in a specific order. In some embodiments, the camera assembly 44 can be followed by a first robotic arm of the robotic arm assembly 42 and then followed by a second robotic arm of the robotic arm assembly 42, all of which can be inserted into the trocar 50 and hence into the internal cavity 104. Once inserted into the patient 100, the RSS 46 can move the robotic arm assembly 42 and the camera assembly 44 to an operation site manually or automatically controlled by the operator console 11.
[0068] Further disclosure regarding control of movement of individual arms of a robotic arm assembly is provided in International Patent Application Publications WO 2022/094000 A1 and WO 2021/231402 A1, each of which is incorporated by reference herein in its entirety. [0069] FIG. 4A is a perspective view of a robotic arm subassembly 21 in accordance with some embodiments. The robotic arm subassembly 21 includes a robotic arm 42A, the end-effector 45 having an instrument tip 120 (e.g., monopolar scissors, needle driver/holder, bipolar grasper, or any other appropriate tool), and a shaft 122 supporting the robotic arm 42A. A distal end of the shaft 122 is coupled to the robotic arm 42A, and a proximal end of the shaft 122 is coupled to a housing 124 of the motor 40 (as shown in FIG. 2A). At least a portion of the shaft 122 can be external to the internal cavity 104 (as shown in FIGS. 3A and 3B). At least a portion of the shaft 122 can be inserted into the internal cavity 104 (as shown in FIGS. 3A and 3B).
[0070] FIG. 4B is a side view of the robotic arm assembly 42. The robotic arm assembly 42 includes a virtual shoulder 126, a virtual elbow 128 having a position sensor 132 (e.g., an inductive sensing coil and oscillator circuit), a virtual wrist 130, and the end-effector 45 in accordance with some embodiments. The virtual shoulder 126, the virtual elbow 128, and the virtual wrist 130 can include a series of hinge and rotary joints to provide each arm with seven positionable degrees of freedom, along with one additional grasping degree of freedom for the end-effector 45 in some embodiments.
[0071] FIG. 5 illustrates a perspective front view of a portion of the robotic arms assembly 20 configured for insertion into an internal body cavity of a patient. The robotic arms assembly 20 includes a robotic arm 42A and a robotic arm 42B. The two robotic arms 42A and 42B can define, or at least partially define, a virtual chest 140 of the robotic arms assembly 20 in some embodiments. In some embodiments, the virtual chest 140 (depicted as a triangle with dotted lines) can be defined by a chest plane extending between a first pivot point 142A of a most proximal joint of the robotic arm 42A (e.g., a shoulder joint 126), a second pivot point 142B of a most proximal joint of the robotic arm 42B, and a camera imaging center point 144 of the camera(s) 47. A pivot center 146 of the virtual chest 140 lies in the middle of the virtual chest.
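The geometry of the virtual chest described above can be sketched as follows (an illustration only; the disclosure does not specify this computation, and interpreting the pivot center as the centroid of the three points is an assumption of this sketch):

```python
# Sketch: construct the virtual chest plane from the three points described
# above (two shoulder pivot points and the camera imaging center point).
# The pivot center is taken here as the centroid of the three points, and
# the plane is characterized by a normal vector from a cross product.
# Pure-Python 3-vectors; all names are illustrative.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def chest_plane(p_pivot_a, p_pivot_b, p_camera):
    """Return (centroid, normal) of the plane through the three points."""
    centroid = tuple(sum(c) / 3.0 for c in zip(p_pivot_a, p_pivot_b, p_camera))
    normal = cross(sub(p_pivot_b, p_pivot_a), sub(p_camera, p_pivot_a))
    return centroid, normal
```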
[0072] In some embodiments, sensors in one or both of the robotic arm 42A and the robotic arm 42B can be used by the system to determine a change in location in three-dimensional space of at least a portion of the robotic arm. In some embodiments, sensors in one or both of the first robotic arm and second robotic arm can be used by the system to determine a location in three-dimensional space of at least a portion of one robotic arm relative to a location in three-dimensional space of at least a portion of the other robotic arm.
[0073] In some embodiments, a camera assembly 44 is configured to obtain images from which the system can determine relative locations in three-dimensional space. For example, the camera assembly may include multiple cameras, at least two of which are laterally displaced from each other relative to an imaging axis, and the system may be configured to determine a distance to features within the internal body cavity. Further disclosure regarding a surgical robotic system including camera assembly and associated system for determining a distance to features may be found in International Patent Application Publication No. WO 2021/159409, entitled “System and Method for Determining Depth Perception In Vivo in a Surgical Robotic System,” and published August 12, 2021, which is incorporated by reference herein in its entirety. Information about the distance to features and information regarding optical properties of the cameras may be used by a system to determine relative locations in three-dimensional space.
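The distance determination alluded to above follows, in the standard pinhole-stereo model, from the disparity of a feature between two laterally displaced cameras. The relation below is the textbook formula, not a formula taken from the cited publication, and the calibration values in the test are made up.

```python
# Sketch of the standard pinhole-stereo relation: for two cameras separated
# by a known baseline, distance to a feature = focal_length * baseline /
# disparity. This is the generic textbook model, offered only to illustrate
# the idea of determining distance from laterally displaced cameras.

def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Distance (mm) to a feature given its disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_mm / disparity_px
```

For instance, with an assumed 700-pixel focal length and 4 mm baseline, a 14-pixel disparity corresponds to a feature 200 mm away.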
[0074] Hand controllers for a surgical robotic system as described herein can be employed with any of the surgical robotic systems described above or any other suitable surgical robotic system. Further, some embodiments of hand controllers described herein may be employed with semi-robotic endoscopic surgical systems that are only robotic in part.
[0075] As explained above, controllers for a surgical robotic system may desirably feature sufficient inputs to provide control of the system, an ergonomic design, and a “natural” feel in use.
[0076] In some embodiments described herein, reference is made to a left hand controller and a corresponding left robotic arm, which may be a first robotic arm, and to a right hand controller and a corresponding right robotic arm, which may be a second robotic arm. In some embodiments, a robotic arm considered a left robotic arm and a robotic arm considered a right robotic arm may change due to a configuration of the robotic arms and the camera assembly being adjusted such that the second robotic arm corresponds to a left robotic arm with respect to a view provided by the camera assembly and the first robotic arm corresponds to a right robotic arm with respect to a view provided by the camera assembly. In some embodiments, the surgical robotic system changes which robotic arm is identified as corresponding to the left hand controller and which robotic arm is identified as corresponding to the right hand controller during use. In some embodiments, at least one hand controller includes one or more operator input devices to provide one or more inputs for additional control of a robotic assembly. In some embodiments, the one or more operator input devices receive one or more operator inputs for at least one of: engaging a scanning mode; resetting a camera assembly orientation and position to align a view of the camera assembly to the instrument tips and to the chest; displaying a menu, traversing a menu or highlighting options or items for selection, and selecting an item or option; selecting and adjusting an elbow position; and engaging a clutch associated with an individual hand controller. In some embodiments, additional functions may be accessed via the menu, for example, selecting a level of a grasper force (e.g., high/low), selecting an insertion mode, an extraction mode, or an exchange mode, adjusting a focus, lighting, or a gain, camera cleaning, motion scaling, rotation of camera to enable looking down, etc.
[0077] FIG. 6A depicts a left hand controller 201 and FIG. 6B depicts a right hand controller 202 in accordance with some embodiments. The left hand controller 201 and the right hand controller 202 each include a contoured housing 210, 211, respectively. Each contoured housing 210, 211 includes an upper surface 212a, 213a, an inside side surface 212b, 213b adjacent the upper surface, an outside side surface (not visible in these views) facing away from the inside side surface 212b, 213b, and a lower surface (not visible in these views) facing away from the upper surface 212a, 213a.
[0078] In some embodiments, each hand controller 201, 202 includes a mounting assembly 215, 216, respectively. The mounting assembly 215, 216 may be used to attach, either
directly or indirectly, the respective hand controller 201, 202 to a user console of a surgical robotic system. In some embodiments, the mounting assembly 215 defines holes 217, which may be countersunk holes, configured to receive a screw or bolt to connect the left hand controller 201 to a user console.
[0079] In some embodiments of the present disclosure, such as that depicted in FIGS. 6A and 6B, the hand controller includes two control levers, three buttons, and one touch input device. As will be explained herein, embodiments may feature other combinations of touch input devices, buttons, and levers, or a subset thereof. The embodiment shown as the left hand controller 201 features a first control lever 221 and a second control lever 222. Similarly, right hand controller 202 includes a first control lever 223 and a second control lever 224. In some embodiments, first control lever 221 is engaged with the second control lever 222 via one or more gears (not shown) so that a user depressing the first control lever 221 causes a reciprocal movement in the second control lever 222 and vice versa. Further description regarding a geared engagement between a first control lever and a second control lever is provided below with respect to FIGS. 11A and 11B. In another embodiment, first control lever 221 and second control lever 222 may be configured to operate independently. In embodiments employing reciprocal movement of the first and second control lever, a hand controller may employ only one signal indicating a deflection of the first lever and the second lever. In embodiments in which the first control lever and second control lever operate independently, a hand controller may employ a first signal indicating a deflection of the first control lever and a second signal indicating a deflection of the second control lever.
[0080] In some embodiments, the first control lever 221, 223 and the second control lever 222, 224 may be contoured to receive a thumb and/or finger of a user. In some embodiments, the first control lever 221, 223 extends from or extends beyond the outside side surface of the respective contoured housing 210, 211, and the second control lever 222, 224 extends from or extends beyond the inside side surface 212b, 213b of the respective contoured housing. For each hand controller 201, 202, deflection or depression of the first control lever 221, 223, and the second control lever 222, 224, is configured to produce a signal that the surgical robotic system uses as an input to control a tool or an instrument tip (e.g., opening/closing an aperture of graspers/jaws of an instrument tip) at a distal end of a robotic arm of the surgical robotic system. For example, depressing the first control lever and the second control lever may change an angle of jaws of a grasper at a distal end of the respective robotic arm. In some embodiments, end effectors, tools or instruments are used to pull tissue apart, drive a needle driver, grab an item (e.g., a mesh, suture, needle) or pick up such an item in the body cavity
when it is dropped, deliver energy via an electrosurgical unit (ESU) (e.g., to cut or to coagulate).
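The lever-to-jaw mapping described above can be illustrated with a minimal sketch (not part of the disclosure; the linear mapping, angle limits, and names are assumptions). With geared, reciprocal levers, a single deflection signal suffices:

```python
# Illustrative mapping from a single normalized lever-deflection signal to a
# grasper jaw angle, as described for geared (reciprocal) levers. The linear
# relation and the angle limits are assumptions made for this sketch.

JAW_OPEN_DEG = 60.0    # assumed jaw angle at zero deflection (fully open)
JAW_CLOSED_DEG = 0.0   # assumed jaw angle at full deflection (fully closed)

def jaw_angle(deflection):
    """Map a normalized lever deflection in [0, 1] to a jaw angle in degrees."""
    d = min(1.0, max(0.0, deflection))  # clamp out-of-range sensor values
    return JAW_OPEN_DEG + d * (JAW_CLOSED_DEG - JAW_OPEN_DEG)
```

In the independent-lever variant described above, the same mapping would simply be applied to each lever's signal separately.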
[0081] In some embodiments, a housing of a hand controller may be contoured. For example, in FIGS. 6 A and 6B, the contoured housing 210, 211 includes a rounded shape. In some embodiments, a housing may be shaped to have a contour to match a contour of at least a portion of a thumb of a user’s hand. In some embodiments, the contoured housing 210, 211, the first control lever 221, 223, and the second control lever 222, 224, may each be shaped to comfortably and ergonomically receive a respective hand of a user. In some embodiments, a housing of the hand controller, a lever or levers of a hand controller, buttons of a hand controller and/or one or more touch input devices may have shapes and/or positions on the hand controller for fitting different palm sizes and finger lengths.
[0082] Left hand controller 201 also includes a first button 231, a second button 232, and a third button 233. Similarly, right hand controller 202 also includes a first button 234, a second button 235 and a third button 236. As will be described herein, each button may provide one or more inputs that may be mapped to a variety of different functions of the surgical robotic device to control the surgical robotic system including a camera assembly and a robotic arm assembly. In an embodiment, input received via the first button 231 of the left hand controller 201 and input received via the first button 234 of the right hand controller 202 may control a clutch feature. For example, by engaging the first button 231, 234, a clutch is activated enabling movement of the respective left hand controller 201 or right hand controller 202 by the operator without causing any movement of a robotic arms assembly (e.g., a first robotic arm, a second robotic arm, and a camera assembly) of the surgical robotic system. When the clutch is activated for a hand controller, movement of the respective right hand controller or left hand controller is not translated to movement of the robotic assembly. In some embodiments, an operator engaging a hand controller input (e.g., tapping or pressing a button) activates the clutch and the operator engaging again (e.g., tapping or pressing the button again) turns off the clutch or exits a clutch mode. In some embodiments, an operator engaging a hand controller input (e.g., pressing a button and holding the button) activates the clutch and the clutch stays active for as long as the input is active and exits the clutch when the operator is no longer engaging the hand controller input (e.g., releasing the button).
Activating the clutch or entering the clutch mode for a hand controller enables the operator to reposition the respective hand controller (e.g., re-position the left controller 201 within the range of motion of the left hand controller 201 and/or re-position
the right hand controller 202 within a range of motion of the right hand controller 202) without causing movement of the robotic arms assembly itself.
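The two clutch behaviors described above (tap-to-toggle and hold-to-activate) can be sketched as a small state machine; this is an illustration only, and the class and method names are hypothetical:

```python
# Sketch of the two clutch behaviors described above: a toggle style (tap to
# enter, tap again to exit) and a hold style (active only while the button is
# held). While the clutch is active, hand-controller motion is not forwarded
# to the robotic arms assembly. Names are illustrative assumptions.

class Clutch:
    def __init__(self, hold_mode=False):
        self.hold_mode = hold_mode
        self.active = False

    def button_down(self):
        if self.hold_mode:
            self.active = True
        else:
            self.active = not self.active  # toggle on each press

    def button_up(self):
        if self.hold_mode:
            self.active = False  # releasing the button exits the clutch

    def forward_motion(self, delta):
        """Return the motion to apply to the robot (zero while clutched)."""
        return (0.0, 0.0, 0.0) if self.active else delta
```

Either way, motion made while the clutch is active repositions only the hand controller within its range of motion, matching the behavior described above.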
[0083] The second button 232 of the left hand controller 201 may provide an input that controls a pivot function of the surgical robotic device. An operator engaging (e.g., pressing and holding) the second button 232 of the left hand controller 201 may engage a pivot function or a pivot mode that reorients the robotic arms assembly chest to center the camera on the midpoint between the instrument tips. The pivot function can be activated with a brief tap or held down to continuously track the instrument tips as they move, in accordance with some embodiments.
[0084] The second button 235 of the right hand controller 202 may provide input for entering a menu mode, in which a menu is displayed on the graphical user interface 39 of the surgical robotic system, and for exiting the menu mode. The operator may activate a menu mode by pressing the second button 235 a first time and disengage the menu function by pressing the second button 235 a second time. The operator may be able to select options within the menu by navigating the menu using the left hand controller and/or the right hand controller when the menu mode is engaged. For example, the first touch input device 242 of the right hand controller 202 may be used to navigate the menu and to select a menu item in some embodiments. While in a menu mode, movement of the robotic assembly in response to movement of the left hand controller 201 or the right hand controller 202 may be suspended. The menu mode and the selection of menu options are discussed in more detail below. The third button 233 of the left hand controller and the third button 236 of the right hand controller may provide an input that engages or disengages an instrument control mode of the surgical robotic system in some embodiments. A movement of at least one of the one or more hand controllers when in the instrument mode causes a corresponding movement in a corresponding robotic arm of the robotic assembly. The instrument control mode will be described in more detail below.
[0085] The left hand controller 201 further includes a touch input device 241. Similarly, the right hand controller 202 further includes a touch input device 242. In an embodiment, the touch input device 241, 242 may be a scroll wheel, as shown in FIGS. 6A and 6B. Other touch input devices that may be employed include, but are not limited to, rocker buttons, joy sticks, pointing sticks, touch pads, track balls, track point nubs, etc.
[0086] The touch input device 241, 242 may be able to receive input through several different forms of engagement by the operator. For example, where the touch input device 241, 242 is a scroll wheel, the operator may be able to push or click the first touch input device 241, 242, scroll the first touch input device 241, 242 backward or forward, or both.
[0087] In some embodiments, scrolling the first touch input device 241 of the left hand controller 201 forward may activate a zoom in function to magnify a view provided by the camera assembly of the surgical robotic system and displayed to the operator, and scrolling backward with the first touch input device 241 may provide a zoom out function to reduce the view provided by the camera assembly of the surgical robotic device and displayed to the operator, or vice versa. In some embodiments, the zoom function may be mechanical or digital. In some embodiments, the zoom function may be mechanical in part and digital in part (e.g., a mechanical zoom over one zoom range, and a mechanical zoom plus a digital zoom over another zoom range).
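The part-mechanical, part-digital zoom mentioned above can be illustrated by splitting a requested magnification into an optical factor and a residual digital factor. The split point, limits, and names below are assumptions made for this sketch, not values from the disclosure.

```python
# Sketch of a hybrid zoom: requests up to an assumed mechanical (optical)
# limit are served optically; beyond it, the remainder is applied as digital
# magnification. The constants are illustrative, not from the disclosure.

MECHANICAL_MAX = 2.0   # assumed optical magnification limit
TOTAL_MAX = 4.0        # assumed overall magnification limit

def split_zoom(requested):
    """Split a requested magnification into (mechanical, digital) factors."""
    requested = min(TOTAL_MAX, max(1.0, requested))
    mechanical = min(requested, MECHANICAL_MAX)
    digital = requested / mechanical  # residual handled digitally
    return mechanical, digital
```

Under these assumed limits, a 1.5x request is purely optical, while a 3x request becomes 2x optical times 1.5x digital.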
[0088] In some embodiments, clicking or depressing first touch input device 241 may engage a scan mode of the surgical robotic system. When in a scan mode, a movement of at least one of the left hand controller 201 or the right hand controller 202 causes a corresponding change in an orientation of a camera assembly of the robotic arms assembly without changing a position or orientation of either robotic arm of the surgical robotic system. In another embodiment, pressing and holding the first touch input device 241 may activate the scan mode and releasing the first touch input device 241 may end the scan mode of the surgical robotic system. In some embodiments, releasing the scan mode returns the camera to the orientation it was in upon entering scan mode. In some embodiments, a function may be provided for locking the orientation upon exiting the scan mode (e.g., to change the “horizon” line).
[0089] In some embodiments, when in a menu mode and a left elbow menu item is selected, the first touch input device 241 of the left hand controller 201 may be used for selection of a direction and degree of left elbow bias. As used herein, elbow bias refers to the extent by which the virtual elbow of the robotic arm is above or below a neutral or default position. [0090] In some embodiments, when in a menu mode, an operator may be able to select options within the menu by navigating the menu using the left hand controller and/or the right hand controller. For example, when in the menu mode, the touch input device 242 (e.g., scroll wheel) of the right hand controller provides a set of inputs for traversing a displayed menu and selecting an item in a displayed menu. For example, by scrolling forward on touch input device 242 the operator may move up the menu and by scrolling backwards with touch input device 242 the user may move down the menu, or vice versa. In an embodiment, by clicking the first touch input device 242 the operator may make a selection within a menu. Use of the touch input device 242 and the menu mode are discussed in more detail below.
[0091] In some embodiments, the touch input device 242 of the right hand controller 202 may be used to control right elbow bias when a right elbow bias menu item has been selected. [0092] Functions of various buttons and the touch input device described above with respect to the left hand controller may instead be assigned to the right hand controller, and functions of various buttons and the touch input device described above with respect to the right hand controller may instead be assigned to the left hand controller in some embodiments.
[0093] Fig. 6A also shows a schematic depiction 203 of a first foot pedal 251 and second foot pedal 252 for receiving operator input. As shown in Fig. 6A, in some embodiments the first foot pedal 251 engages a camera control mode, also described herein as a view control mode, an image framing control mode, or a camera framing control mode of the surgical robotic system and the second foot pedal 252 engages a travel control mode of the surgical robotic system.
[0094] In some embodiments, when the camera control mode is activated, e.g., using the foot pedal 251, movement of the left hand controller 201 and/or the right hand controller 202 by the operator may provide input that is interpreted by the system to control a movement of and an orientation of a camera assembly of the surgical robotic system while keeping positions of instrument tips of robotic arms of the robotic arms assembly constant.
[0095] In some embodiments, when the travel control mode is activated, e.g., using the foot pedal 252, the left hand controller 201 and the right hand controller 202 may be used to move the robotic arm assembly of the surgical robotic system in a manner in which distal tips of the robotic arms direct or lead movement of a chest of the robotic arms assembly through an internal body cavity. In the travel control mode, a position and orientation of the camera assembly, of the chest, or of both is automatically adjusted to maintain the view of the camera assembly directed at the tips (e.g., at a point between a tip or tips of a distal end of the first robotic arm and a tip or tips of a distal end of the second robotic arm). This may be described as the camera assembly being pinned to the chest of the robotic arms assembly and automatically following the tips. Further detail regarding the travel control mode is provided below.
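The tip-following behavior described above can be sketched with a simple geometric model. The sketch below is illustrative only (the function names and the midpoint rule are assumptions, not the system's actual control law): the camera view is kept directed at a point between the two distal tips, here taken as their midpoint.

```python
import math

def camera_target(tip_a, tip_b):
    """Point midway between the two instrument tips that the camera
    view is kept directed at in the travel control mode (assumed rule)."""
    return tuple((a + b) / 2.0 for a, b in zip(tip_a, tip_b))

def look_direction(camera_pos, target):
    """Unit vector from the camera position toward the target point."""
    d = [t - c for c, t in zip(camera_pos, target)]
    n = math.sqrt(sum(x * x for x in d))
    return tuple(x / n for x in d)

# Two tips 2 units apart at the same depth; the camera aims at the midpoint.
target = camera_target((1.0, 0.0, 5.0), (3.0, 0.0, 5.0))   # -> (2.0, 0.0, 5.0)
print(look_direction((2.0, 0.0, 0.0), target))             # -> (0.0, 0.0, 1.0)
```

As the tips move, recomputing this target each control cycle yields the "pinned and automatically following" behavior described above.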
[0096] FIGS. 8A and 8B depict another embodiment according to the present disclosure featuring a left hand controller 1001 and a right hand controller 1002. The left hand controller 1001 includes a contoured housing 1010, and the right hand controller 1002 includes a contoured housing 1011. Each contoured housing 1010, 1011, includes an upper surface 1012a, 1013a, an inside side surface 1012b, 1013b adjacent the upper surface, an
outside side surface (not visible in these views) facing away from the inside side surface 1012b, 1013b, and a lower surface (not visible in these views) facing away from the upper surface 1012a, 1013a.
[0097] Each hand controller 1001, 1002 includes a mounting assembly 1015, 1016, respectively. The mounting assembly 1015, 1016 may be used to attach, either directly or indirectly, each of the respective hand controllers 1001, 1002 to a surgeon console of a surgical robotic system. The mounting assembly 1015 defines an aperture 1017 and the mounting assembly 1016 defines an aperture 1018. The apertures 1017, 1018 may be countersunk apertures configured to receive a screw or bolt to connect the respective hand controller 1001, 1002 to a surgeon console. The mounting assembly 1015 includes a button 1004 and the mounting assembly 1016 includes a button 1005. The buttons 1004, 1005 provide an input to toggle between insertion and extraction of one or more robotic arm assemblies 42A, 42B as well as the camera assembly 44. For example, the button 1004 can be used to insert or extract a first robotic arm 42A and the button 1005 can be used to insert or extract a second robotic arm 42B.
[0098] Each of the left hand controller 1001 and the right hand controller 1002 also includes a first button 1031, 1034, a second button 1032, 1035, and a touch input device 1041, 1042 (e.g., a joy stick or scroll wheel), respectively. In each hand controller 1001, 1002, the first button 1031, 1034, the second button 1032, 1035, and the touch input device 1041, 1042 are disposed on or at an upper surface 1012a, 1013a of the housing 1010, 1011, respectively. In some embodiments, the first button 1031, 1034, the second button 1032, 1035, and the touch input device 1041, 1042 are disposed on or at a portion of the upper surface 1012a, 1013a that projects from the upper surface. For each hand controller 1001, 1002, a lever (not visible in this view) extends from the respective outside side surface (not visible in this view). In some embodiments, a different mechanism may be used for a grasping input on a hand controller. For example, in some embodiments a hand controller may include at least one “pistol trigger” type button that can be pulled back to close and released to open instead of or in addition to a lever or levers.
[0099] The left hand controller 1001 includes a first paddle 1021 and a second paddle 1022. Similarly, right hand controller 1002 includes a first paddle 1023 and a second paddle 1024. In some embodiments, first paddle 1021, 1023 is engaged with the second paddle 1022, 1024 of each hand controller 1001, 1002 via one or more gears (not shown) so that a user depressing the first paddle 1021, 1023 causes a reciprocal movement in the second paddle
1022, 1024 and vice versa, respectively. In another embodiment, the first paddle 1021, 1023
and the second paddle 1022, 1024 of each hand controller may be configured to operate independently. In embodiments employing reciprocal movement of the first and second paddles, the hand controller 1001, 1002 may employ some form of a signal or other indicator indicating a deflection of the first paddle 1021, 1023 and the second paddle 1022, 1024. In embodiments in which the first paddle and second paddle operate independently, the hand controller 1001, 1002 may employ a first signal or other indicator indicating a deflection of the first paddle 1021, 1023 and a second signal or other indicator indicating a deflection of the second paddle 1022, 1024.
[0100] In some embodiments, the first paddle 1021, 1023 and the second paddle 1022, 1024 may be contoured to receive a thumb and/or finger of a user. In some embodiments, the first paddle 1021, 1023 extends from or extends beyond the outside side surface of the respective contoured housing 1010, 1011 and the second paddle 1022, 1024 extends from or extends beyond the inside side surface 1012b, 1013b of the respective contoured housing. For each hand controller 1001, 1002, deflection or depression of the first paddle 1021, 1023, and the second paddle 1022, 1024, is configured to trigger a signal that the surgical robotic system uses as an input to control a tool or an instrument tip (e.g., opening/closing an aperture of graspers/jaws of an instrument tip) at a distal end of a robotic arm of the surgical robotic system. For example, depressing the first paddle 1021, 1023 and the second paddle 1022, 1024 may change an angle of jaws of a grasper at a distal end of the respective robotic arm. In some embodiments, end effectors, tools or instruments are used to pull tissue apart, drive a needle driver, grab an item (e.g., a mesh, suture, needle) or pick up such an item in the body cavity when it is dropped, or deliver energy via an electrosurgical unit (ESU) (e.g., to cut or to coagulate).
[0101] In some embodiments, each of the first paddle 1021, 1023 and the second paddle 1022, 1024 can have a loop to receive a thumb and/or finger of a user, as further described with respect to FIGS. 9A and 9B. In some embodiments, parameters (e.g., length, angle, finger ergonomics, and the like) of each of the first paddle 1021, 1023 and the second paddle 1022, 1024 can be adjusted.
[0102] The contoured housing 1010, 1011 may be configured to comfortably and ergonomically mate with a corresponding hand of the operator. The operator may engage with the respective hand controller 1001, 1002 by placing the thumb of the respective hand on the second paddle 1022, 1024, positioning the pointer finger or middle finger of the respective hand on or over the projecting portion of the upper surface 1012a, 1013a on which the first button 1031, 1034, the second button 1032, 1035, and the touch input device 1041, 1042 are disposed, and by positioning at least the middle finger or ring finger of the respective hand on or over the first paddle 1021, 1023.
[0103] Although various example embodiments described herein assign certain functions to certain buttons and to certain touch input devices, one of ordinary skill of the art in view of the present disclosure will appreciate that which functions are ascribed to which buttons and touch input devices may be different in different embodiments. Further, one of ordinary skill of the art in view of the present disclosure will appreciate that additional functions not explicitly described herein may be assigned to some buttons and some touch input devices in some embodiments. In some embodiments, one or more functions may be assigned to a foot pedal of a surgical robotic system that includes one or more hand controllers as described herein.
[0104] By way of example, a set of functions that may be controlled by the left hand controller 1001 and the right hand controller 1002 for some embodiments of the present technology will now be described.
[0105] For left hand controller 1001, pressing or pressing and holding the first button 1004 may trigger a signal used to engage an insertion or extraction for a left robotic arm assembly and/or a camera assembly of the surgical robotic system. Pressing or pressing and holding the first button 1031 may trigger a signal used to control a clutch function for the left hand controller of the surgical robotic system. Pressing or pressing and holding the second button 1032 may trigger a signal used to engage or disengage a camera control mode of the surgical robotic system. Scrolling the touch input device 1041 forward may activate a zoom in function to magnify a view provided by the camera assembly of the surgical robotic system and displayed to the operator, and scrolling backward with first touch input device 1041 may provide a zoom out function to reduce the view provided by the camera assembly of the surgical robotic device and displayed to the operator, or vice versa. Scrolling the touch input device 1041 may trigger a signal used to select left elbow bias when an elbow bias function is activated using a menu.
[0106] For right hand controller 1002, pressing or pressing and holding the first button 1005 may trigger a signal used to engage an insertion or extraction for a right robotic arm assembly and/or a camera assembly of the surgical robotic system. Pressing or pressing and holding the first button 1034 may trigger a signal used to control a clutch function for the right hand controller of the surgical robotic system. Clicking or depressing the second button 1035 may engage a scan mode of the surgical robotic system. When in a scan mode, a movement of at least one of the left hand controller 1001 or the right hand controller 1002 causes a
corresponding change in an orientation of a camera assembly of the robotic assembly without changing a position or orientation of either robotic arm of the surgical robotic system. In another embodiment, pressing and holding the second button 1035 may activate the scan mode and releasing the second button 1035 may end the scan mode of the surgical robotic system. In some embodiments, releasing the scan mode returns the camera to the orientation it was in upon entering the scan mode. In some embodiments, a function may be provided for locking the orientation upon exiting the scan mode (e.g., to change the “horizon” line). Scrolling the touch input device 1042 may trigger a signal used to traverse a menu or highlight a portion of the menu when the menu is displayed or a menu mode is active. Pressing the touch input device 1042 may trigger a signal used to select a highlighted portion of the menu or a feature on the menu when the menu is displayed. Scrolling the touch input device 1042 may produce a signal used to select right elbow bias when the elbow bias function is activated using the menu. Scrolling forward on touch input device 1042 may move up the menu and scrolling backwards with touch input device 1042 may move down the menu, or vice versa. Clicking the touch input device 1042 may make a selection within a menu.
[0107] FIGS. 9A and 9B depict another embodiment according to the present disclosure featuring a left hand controller 1001’ and a right hand controller 1002’. Compared with the hand controllers 1001, 1002 in FIGS. 8A and 8B, some buttons of the hand controllers 1001’, 1002’ have the same button type but different functions. For example, the second button 1035’ of the right hand controller 1002’ may trigger a signal used to turn on or turn off a menu. Compared with the hand controllers 1001, 1002 in FIGS. 8A and 8B, some buttons of the hand controllers 1001’, 1002’ may have a different button type and/or different functions. For example, touch input device 1041’ for the left hand controller 1001’ may have a three-way switch button type. Switching or holding the touch input device 1041’ to the center may trigger a signal used to engage or disengage a scan mode of the surgical robotic system. Switching the touch input device 1041’ forward may activate a zoom in function to magnify a view provided by the camera assembly of the surgical robotic system and displayed to the operator, and switching backward with the touch input device 1041’ may provide a zoom out function to reduce the view provided by the camera assembly of the surgical robotic device and displayed to the operator, or vice versa. Touch input device 1042’ for the right hand controller 1002’ may also have a three-way switch button type. Switching the touch input device 1042’ may trigger a signal used to traverse a menu or highlight a portion of the menu when the menu is displayed or a menu mode is active (e.g., after the menu is turned on by pressing the second button 1035’). Switching forward on touch input device 1042’ may move up the menu and switching backwards with touch input device 1042’ may move down the menu, or vice versa. Clicking the touch input device 1042’ may trigger a signal used to select a highlighted portion of the menu or a feature on the menu when the menu is displayed. In some embodiments, switching the touch input device 1042’ may trigger a signal used to select right elbow bias when the elbow bias function is activated using the menu. Compared with the hand controllers 1001, 1002 in FIGS. 8A and 8B, the hand controllers 1001’, 1002’ may have the first paddles 1021’, 1023’ and second paddles 1022’, 1024’ coupled to finger loops 1061, 1062, 1063, 1064, respectively. Each finger loop can be a Velcro type. In some embodiments (not illustrated), each finger loop can be a hook type. Deflection or depression of the first paddle 1021’, 1023’, and the second paddle 1022’, 1024’, is configured to trigger a signal to control a tool or an instrument tip (e.g., opening/closing an aperture of graspers/jaws of an instrument tip) at a distal end of a robotic arm of the surgical robotic system. For example, depressing the first paddle 1021’, 1023’ and the second paddle 1022’, 1024’ may change an angle of jaws of a grasper at a distal end of the respective robotic arm. In some embodiments, end effectors, tools or instruments are used to pull tissue apart, drive a needle driver, grab an item (e.g., a mesh, suture, needle) or pick up such an item in the body cavity when it is dropped, or deliver energy via an electrosurgical unit (ESU) (e.g., to cut or to coagulate). Compared with the hand controllers 1001, 1002 in FIGS. 8A and 8B, first buttons 1031’, 1034’ may have a slider button type.
Sliding the first button 1031’, 1034’ may trigger a signal used to control a clutch function for the corresponding hand controller of the surgical robotic system.
[0108] FIG. 9 is a side view of an example position sensor with a target 702, coil sensors 704a and 704b, and background in a robotic joint, in accordance with some embodiments. As shown in FIG. 9, a position sensor 132 includes a target 702, a portion of which is in the vicinity of a sensing coil 704a and another portion of which is completely covered by or underneath a sensing coil 704b. The combination of the target 702 and the coil sensors 704a and 704b forms a variable inductor. As the thicker section of the target 702 passes the sensing coils 704a and 704b, the effective inductance value of the sensing coils 704a and 704b increases, and as the thinner section of the target 702 passes the sensing coils 704a and 704b, the inductance value of the sensing coils 704a and 704b decreases. In some embodiments, a single coil sensor 704 is used.
[0109] FIG. 10 is a top down view of the position sensor 132 with the target 702 and the sensing coils 704a and 704b in the robotic arm. FIG. 10 shows a sensor stack comprising two coils separated by a distance of 0.5 mm, between which the target 702 travels as the robotic arm 42A or 42B rotates.
[0110] FIG. 10 illustrates an example shape of the target 702 as well as the sensor coils 704a and 704b used in the position sensor 132, in accordance with some embodiments. In some embodiments, the position sensor 132 can include the target 702 and the sensing coils 704a and 704b. The target 702 can be ring shaped with varying thickness around the circumference of the ring. The target 702 can be made of a material that has a high permeability while having a low conductivity, such as shielding films that may be punched into a pattern that is useful as an encoder target. The sensing coils 704a and 704b can have a high conductivity and low permeability. The target 702 and the sensing coils 704a and 704b can form a variable inductor. The target 702 can be a copper trace that moves as a joint rotates. For example, the position sensor 132 can determine the position of the rotary joints in the virtual elbow 128 as the target of the position sensor 132 rotates and comes within proximity of the corresponding sensing coils. The sensing coils 704a and 704b can be a set of fixed-location copper traces (not shown) which feed back to the rest of a filtering circuit as shown in FIG. 11. The sensing coils 704a and 704b can be represented schematically, as seen in FIG. 11, as part of a variable inductor 903.
[0111] FIG. 11 is a schematic of the filter circuit 900 electrically coupled to a controller 26 that is used to determine a position of a robotic arm in response to a signal that is input to the filter circuit 900, in accordance with some embodiments. The filter circuit includes a fixed value capacitor 902 and a variable inductor 903. The variable inductor 903 is a circuit element corresponding to the target 702 and the sensing coils 704a and 704b.
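The low-pass behavior of such an LC filter can be illustrated numerically. In the sketch below, the component values are hypothetical (the disclosure does not specify them); they are chosen so that the cutoff of an ideal LC low-pass filter, f_c = 1/(2π√(LC)), lands near the 5 MHz exemplary cutoff discussed later for FIGS. 13A-13E.

```python
import math

def lc_cutoff_hz(inductance_h: float, capacitance_f: float) -> float:
    """Cutoff (resonant) frequency of an ideal LC low-pass filter:
    f_c = 1 / (2 * pi * sqrt(L * C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Hypothetical values: a fixed 1 nF capacitor (like capacitor 902) and a
# variable inductance (like inductor 903) that changes as the target
# passes the sensing coils.
C = 1e-9
for L in (0.5e-6, 1.0e-6, 2.0e-6):
    print(f"L = {L * 1e6:.1f} uH -> cutoff = {lc_cutoff_hz(L, C) / 1e6:.2f} MHz")
```

As the sketch shows, a changing inductance shifts the cutoff frequency, which is what allows the controller to infer the inductance (and hence the joint position) from where in a frequency sweep the signal begins to corrupt.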
[0112] In some embodiments, the controller 26 can execute instructions in accordance with a Universal Asynchronous Receiver Transmitter (UART). In UART communication, the UART converts parallel data from a controlling device, such as the processor 22, into serial data, and transmits the serial data using the transmitter to a receiver of a UART. In some embodiments, the serial data is transmitted by a transmitter of the UART through a filter circuit, and it is received by the receiver of the UART. The UART then converts the serial data back into parallel data which is further processed by a processor.
[0113] Data is transferred from the processor 22 via a data bus to the UART in parallel form. After the UART gets the parallel data from the data bus, it adds a start bit, a parity bit, and a stop bit, creating a data packet. Next, the data packet is output serially, bit by bit at the
transmitter of the UART, and then the receiver of the UART reads the data packet bit by bit. The receiver of the UART then converts the data packet from serial form into parallel form, and then removes the start bit, parity bit, and stop bit. The receiver of the UART then transfers the bits on the data bus to the processor 22.
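The framing steps above can be sketched as follows. This is a simplified model assuming 8 data bits transmitted LSB-first, even parity, and a single stop bit; an actual UART's configuration may differ.

```python
def uart_frame(byte: int) -> list:
    """Build a UART frame: start bit (0), 8 data bits LSB-first,
    even parity bit, stop bit (1)."""
    data = [(byte >> i) & 1 for i in range(8)]   # LSB-first serialization
    parity = sum(data) % 2                       # even parity over the data bits
    return [0] + data + [parity] + [1]

def uart_unframe(bits: list) -> int:
    """Recover the data byte, discarding the start, parity, and stop bits."""
    data = bits[1:9]
    return sum(b << i for i, b in enumerate(data))

frame = uart_frame(0xAA)
print(frame)                    # -> [0, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
print(hex(uart_unframe(frame))) # -> 0xaa
```

The round trip mirrors the transmit-then-receive path through the filter circuit described above, minus any corruption the filter introduces.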
[0114] The controller 26 is electrically coupled to an input of the filter circuit 900 via a transmit pin or output, for example, UART TX 26a, and an output of the filter circuit 900 is electrically coupled to a receive pin or input of the controller 26, for example, UART RX 26b of the controller. The filter circuit 900 is made of minimal components, including the fixed value capacitor 902 and the target 702, which, in combination with the sensing coils 704a and 704b (in some embodiments incorporated into a flexible printed circuit board (PCB)), together form an inductor of variable value depending on the positional relationship of the target 702 and the sensing coils 704a and 704b.
[0115] For example, as the rotational angle of the rotary joint changes, the thickness or the width or both of the target 702 passing over, under or between the sensor coils 704a and 704b changes thereby creating an inductance that varies relative to the portion of the target 702 passing by the sensor coils 704a and 704b.
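A hypothetical model of this relationship (the linear form and the minimum/maximum inductance values below are illustrative assumptions, not measured values) might look like:

```python
def effective_inductance(overlap_fraction: float,
                         l_min: float = 0.5e-6,
                         l_max: float = 2.0e-6) -> float:
    """Hypothetical linear model of the variable inductor: the effective
    inductance rises as more of the thick target section overlaps the
    sensing coils (0.0 = only the thin section, 1.0 = only the thick)."""
    overlap_fraction = max(0.0, min(1.0, overlap_fraction))
    return l_min + (l_max - l_min) * overlap_fraction

print(effective_inductance(0.0))  # thin section under the coils
print(effective_inductance(1.0))  # thick section under the coils
```

Whatever the true profile, the key property is that inductance is a monotonic, repeatable function of the target position within each encoder segment, so a sensed inductance can be inverted to a rotary joint angle.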
[0116] The transmitter of the UART, UART TX 26a, can transmit a square wave signal of varying frequencies depending on the underlying capabilities of the controller, and the receiver of the UART, UART RX 26b, can receive the same square wave signal or an altered version of the square wave signal based on the filter circuit 900 between the transmitter and receiver of the UART. The controller can control the frequency at which the square wave signal is transmitted by updating the clock configuration of the UART, so that the frequency of the bits transmitted is at a specific arbitrary value. Another way of transmitting the signal with varying frequencies, while maintaining the UART at a specific transmission rate, is by modulating a packet containing data so that it lowers the frequency. For example, transmitting the byte 0xAA (0b10101010) would produce a square wave at the specified transmission rate, whereas 0xCC (0b11001100) would produce a square wave of half the specified transmission rate. By using longer sequences of bytes, one can subdivide into more granular transmission rates. A further way to transmit a square wave signal with varying frequencies is by combining the two aforementioned ways of transmitting the signal using varying frequencies. Using this frequency variation, the transmitter can perform a sweep across different frequencies as the input to the filter circuit 900. The filter circuit 900 may have a variable frequency response based on the capacitor value, which can be a fixed value, and an inductance value which is variable. Lower frequency signals may pass through the filter circuit 900 unattenuated, but higher frequency signals may be modulated or attenuated depending on the inductance value. Thus, as the controller 26 sweeps through the different frequencies, the output of the filter circuit 900 may begin to degrade and the output signal from the filter circuit 900 may be a corrupted version of the input signal. Based on the threshold of where the corruption becomes more pronounced and the characteristics of the corrupted data, the frequency response of the filter circuit 900 can be characterized. The sensed inductance value can be mapped to a corresponding rotary joint angle by utilizing an external tool that can be used to determine the actual angle values of the rotary joint at different turn radii, and mapping the sensed inductance value to the different turn radii and thus different rotary joint angles.
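The byte-pattern approach to frequency subdivision can be sketched as follows. This is an idealized model (the helper name and the 8000-baud figure are illustrative assumptions): each run of identical bits forms half a square-wave period, so repeating 0xAA toggles every bit while 0xCC toggles every two bits, halving the frequency.

```python
def square_wave_freq_hz(pattern_bits: str, baud: float) -> float:
    """Effective square-wave frequency when a repeating bit pattern is
    transmitted at the given baud rate.  A run of N identical bits is
    half a period, so f = baud / (2 * N)."""
    run = 1
    for a, b in zip(pattern_bits, pattern_bits[1:]):
        if a == b:
            run += 1
        else:
            break
    return baud / (2 * run)

baud = 8000.0
print(square_wave_freq_hz("10101010", baud))  # 0xAA -> 4000.0 Hz
print(square_wave_freq_hz("11001100", baud))  # 0xCC -> 2000.0 Hz
```

Sweeping either the UART clock, the bit pattern, or both lets the controller step the effective input frequency through the filter's pass and stop bands without any dedicated signal-generation hardware.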
[0117] FIG. 12A is an example timing diagram illustrating a comparison of the shape of a signal modulated at a frequency of 1 kHz that is input to the filter circuit 900 and a corresponding output signal from the filter circuit 900, corresponding to the frequency response of the filter circuit 900. When the UART TX 26a transmits an input signal, filter input signal 1065a, to the filter circuit 900, the filter circuit outputs an output signal, filter output signal 1065b, that is received by the UART RX 26b. The digital representation of the filter input and filter output signal is the same. The filter input signal can be a packet of bits that are transmitted serially from the UART TX 26a that can be expressed in a hexadecimal format interpretable by a controller. For example, the controller 26 can read in data expressed in a hexadecimal format to be transmitted in a packet and convert the hexadecimal value into a parallel stream of bits that will be transferred to the UART TX 26a via a bus. The data that is expressed in the hexadecimal format can be referred to as a controller data representation. [0118] The controller data representation 1001 of the filter input signal 1065a for a signal that is modulated using a 1 kHz frequency can be 0x00FF 00FF 00FF, and the corresponding controller data representation 1001 of the filter output signal 1065b can be 0x00FF 00FF 00FF. As noted above, input signals that are modulated using lower frequencies can pass through the filter circuit 900 unattenuated, as is the case for the input signal that is modulated using a frequency of 1 kHz.
[0119] FIG. 12B is an example timing diagram illustrating a comparison of the shape of a signal modulated at a frequency of 1333Hz that is input to the filter circuit 900 and a corresponding output signal from the filter circuit 900, corresponding to the frequency response of the filter circuit 900. When UART TX 26a transmits an input signal, filter input signal 1065a, to the filter circuit 900, the filter circuit outputs an output signal, filter output signal 1065b, that is received by the UART RX 26b. The digital representation of the input
and output data is not the same. That is, for a controller data representation 1001 of 0x03F0 3F03 F03F, corresponding to the filter input signal 1065a, the corresponding controller data representation 1001 of the filter output signal 1065b can be 0x10F0 1F01 F01F. As noted above, input signals that are modulated using frequencies above a certain threshold have a tendency to be slightly attenuated or altered as they pass through the filter circuit 900, as is the case for the input signal that is modulated using a frequency of 1333 Hz.
[0120] FIG. 12C is an example timing diagram illustrating a comparison of the shape of a signal modulated at a frequency of 2 kHz that is input to the filter circuit 900 and a corresponding output signal from the filter circuit 900, corresponding to the frequency response of the filter circuit 900. When the UART TX 26a transmits an input signal, filter input signal 1065a, to the filter circuit 900, the filter circuit outputs an output signal, filter output signal 1065b, that is received by the UART RX 26b. The digital representation of the input and output data is not the same. That is, for a controller data representation 1001 of 0x0F0F 0F0F 0F0F, corresponding to the filter input 1065a, the corresponding controller data representation 1001 of the filter output 1065b can be 0x060E 078C 1F07. As noted above, input signals that are modulated using frequencies above a certain threshold can be severely attenuated or altered as they pass through the filter circuit 900, as is the case for the input signal that is modulated using a frequency of 2 kHz.
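One way to quantify the corruption illustrated in FIGS. 12A-12C is to count differing bits between the transmitted and received controller data representations. The sketch below is an assumption about how such a comparison might be implemented; the disclosure does not specify a particular metric.

```python
def bit_error_count(tx: int, rx: int) -> int:
    """Number of bits that differ between the transmitted and received
    controller data representations (Hamming distance)."""
    return bin(tx ^ rx).count("1")

# 1 kHz case: the output word matches the input word, so no bit errors.
print(bit_error_count(0x00FF00FF00FF, 0x00FF00FF00FF))  # -> 0
# 2 kHz case: severe attenuation alters the received word.
print(bit_error_count(0x0F0F0F0F0F0F, 0x060E078C1F07))  # -> 9
```

Tracking how this error count grows as the sweep frequency rises gives the corruption threshold from which the filter's frequency response, and hence the inductance, can be characterized.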
[0121] FIG. 13A is an example 4 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments. Controller 26 can generate an output signal 1101 waveform oscillating at 4 MHz to the filter circuit 900, which can in turn cause the filter circuit 900 to generate a frequency response and output a return signal 1102 that is received by the controller 26. A first threshold (e.g., threshold high 1103) and a second threshold (e.g., threshold low 1104) are thresholds that are used to determine when the amplitude of the corresponding return signal 1102 begins to distort beyond a predetermined level. The distortion happens as a result of the frequency of the output signal 1101 increasing, thereby causing the filter circuit 900 to corrupt the output signal 1101 and generate the return signal 1102 that has a different shape. An exemplary frequency cutoff of the filter circuit 900 can be 5 MHz, which is greater than the 4 MHz output signal 1101 as shown in FIG. 13A. As a result the return signal 1102 closely approximates the shape of the output signal 1101, and the area beneath both curves is approximately the same.
[0122] FIG. 13B is an example 10 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5
MHz cutoff, in accordance with some embodiments. Controller 26 can generate an output signal 1101 waveform oscillating at 10 MHz to the filter circuit 900, which can in turn cause the filter circuit 900 to generate a frequency response and output a return signal 1102 that is received by the controller 26. Because the exemplary cutoff frequency is 5 MHz, which is less than the 10 MHz output signal 1101 as shown in FIG. 13B, the return signal 1102 does not approximate the shape of the output signal 1101 as closely as when the output signal 1101 is oscillating at a frequency of 4 MHz. As a result the difference between the area beneath the two curves is greater than the difference between the area of the two curves when the output signal 1101 is oscillating at a frequency of 4 MHz. This happens as a result of the output signal 1101 oscillating at a frequency exceeding the cutoff frequency of the filter circuit 900, which in turn causes the filter circuit 900 to corrupt the output signal 1101 more than when the output signal 1101 is oscillating at 4 MHz.
[0123] FIG. 13C is an example 20 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments. Controller 26 can generate an output signal 1101 waveform oscillating at 20 MHz to the filter circuit 900, which can in turn cause the filter circuit 900 to generate a frequency response and output a return signal 1102 that is received by the controller 26. Because the exemplary cutoff frequency is 5 MHz, which is less than the 20 MHz output signal 1101 as shown in FIG. 13C, the return signal 1102 does not approximate the shape of the output signal 1101 as closely as when the output signal 1101 is oscillating at a frequency of 4 or 10 MHz. As a result the difference between the area beneath the two curves is greater than the difference between the area of the two curves when the output signal 1101 is oscillating at a frequency of 4 or 10 MHz. This happens as a result of the output signal 1101 oscillating at a frequency exceeding the cutoff frequency of the filter circuit 900, which in turn causes the filter circuit 900 to corrupt the output signal 1101 more than when the output signal 1101 is oscillating at 4 or 10 MHz.
[0124] FIG. 13D is an example 40 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments. Controller 26 can generate an output signal 1101 waveform oscillating at 40 MHz to the filter circuit 900, which can in turn cause the filter circuit 900 to generate a frequency response and output a return signal 1102 that is received by the controller 26. Because the exemplary cutoff frequency is 5 MHz, which is less than the 40 MHz output signal 1101 as shown in FIG. 13D, the return signal 1102 does not approximate the shape of the output signal 1101 as closely as when the output signal 1101 is oscillating at a frequency of 4, 10, or 20 MHz. As a result the difference between the area beneath the two curves is greater than the difference between the area of the two curves when the output signal 1101 is oscillating at a frequency of 4, 10, or 20 MHz. This happens as a result of the output signal 1101 oscillating at a frequency exceeding the cutoff frequency of the filter circuit 900, which in turn causes the filter circuit 900 to corrupt the output signal 1101 more than when the output signal 1101 is oscillating at 4, 10, or 20 MHz.
[0125] FIG. 13E is an example 60 megahertz (MHz) signal waveform output from a controller into a filter circuit and a corresponding return signal from the filter circuit with a 5 MHz cutoff, in accordance with some embodiments. Controller 26 can generate an output signal 1101 waveform oscillating at 60 MHz to the filter circuit 900, which can in turn cause the filter circuit 900 to generate a frequency response and output a return signal 1102 that is received by the controller 26. Because the exemplary cutoff frequency is 5 MHz, which is less than the 60 MHz output signal 1101 as shown in FIG. 13E, the return signal 1102 does not approximate the shape of the output signal 1101 as closely as when the output signal 1101 is oscillating at a frequency of 4, 10, 20, or 40 MHz. As a result, the difference between the areas beneath the two curves is greater than the corresponding difference when the output signal 1101 is oscillating at a frequency of 4, 10, 20, or 40 MHz. This happens because the output signal 1101 oscillates at a frequency exceeding the cutoff frequency of the filter circuit 900, which in turn causes the filter circuit 900 to corrupt the output signal 1101 more than when the output signal 1101 is oscillating at 4, 10, 20, or 40 MHz. When the amplitude of the return signal 1102 is below the threshold high 1103 and the threshold low 1104 as shown in FIG. 13E, the controller 26 can map the return signal 1102 values to a corresponding angle for a rotary joint.
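The trend described across FIGS. 13A-13E, in which a drive frequency farther above the 5 MHz cutoff produces a return signal that diverges more from the output signal, can be sketched with a simple first-order low-pass simulation. The filter model, step counts, and area-difference metric below are illustrative stand-ins rather than the exact behavior of filter circuit 900:

```python
import numpy as np

def lowpass_corruption(f_sig_hz, f_cutoff_hz=5e6, n_cycles=20, steps_per_cycle=2000):
    """Drive a first-order low-pass filter with a unit square wave and
    return the mean absolute difference between input and output, a
    stand-in for the 'area between the two curves' used in the text."""
    tau = 1.0 / (2.0 * np.pi * f_cutoff_hz)           # filter time constant
    dt = 1.0 / (f_sig_hz * steps_per_cycle)           # simulation time step
    t = np.arange(n_cycles * steps_per_cycle) * dt
    x = np.where((t * f_sig_hz) % 1.0 < 0.5, 1.0, 0.0)  # square-wave drive
    y = np.zeros_like(x)
    a = dt / (tau + dt)                               # discrete smoothing factor
    for i in range(1, len(x)):
        y[i] = y[i - 1] + a * (x[i] - y[i - 1])       # first-order IIR low-pass
    return float(np.mean(np.abs(x - y)))

# Corruption grows as the drive frequency moves further above the 5 MHz cutoff
for f in (4e6, 10e6, 20e6, 40e6, 60e6):
    print(f"{f/1e6:4.0f} MHz drive -> corruption metric {lowpass_corruption(f):.3f}")
```

The printed metric increases monotonically with drive frequency, mirroring the growing area difference between output signal 1101 and return signal 1102 in the figures.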
[0126] FIG. 14 is an example flowchart 1200 corresponding to determining a position of a robotic joint using an inductive sensing method, in accordance with some embodiments. At block 1202 the controller 26 can transmit an input signal of varying frequency to the filter circuit 900. At block 1204 the controller 26 can detect a signal corresponding to a response of the filter circuit 900 to the input signal. At block 1206, the controller 26 can characterize a frequency response of the filter circuit 900 based on the signal corresponding to the response of the filter circuit 900 being corrupted relative to the input signal. At block 1208, the controller 26 can determine an inductance value of the variable inductor 903 based at least in part on the frequency response of the filter circuit 900. At block 1210, the controller 26 can
determine the angular position of the rotary joint based at least in part on the inductance value which is indicative of a positional relationship between the moveable target 802 and the fixed sensing coils 804a and 804b.
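The determinations in blocks 1208 and 1210 can be sketched as inverting the cutoff frequency of an LC low-pass filter to recover inductance, then applying a calibration map from inductance to joint angle. The capacitance, inductance range, and linear calibration below are hypothetical values chosen for illustration, not parameters of filter circuit 900:

```python
import math

C_FARADS = 100e-12                  # hypothetical fixed filter capacitance
L_MIN, L_MAX = 5e-6, 15e-6          # hypothetical inductance range over joint travel
ANGLE_MIN, ANGLE_MAX = 0.0, 180.0   # hypothetical joint range in degrees

def inductance_from_cutoff(f_cutoff_hz, c_farads=C_FARADS):
    """Invert the LC low-pass cutoff f_c = 1 / (2*pi*sqrt(L*C)) for L."""
    return 1.0 / ((2.0 * math.pi * f_cutoff_hz) ** 2 * c_farads)

def angle_from_inductance(l_henries):
    """Linearly map inductance to joint angle (stand-in calibration)."""
    frac = (l_henries - L_MIN) / (L_MAX - L_MIN)
    return ANGLE_MIN + frac * (ANGLE_MAX - ANGLE_MIN)

# Example: a measured 5 MHz cutoff with C = 100 pF implies L ~ 10.1 uH
L = inductance_from_cutoff(5e6)
print(f"L = {L*1e6:.2f} uH -> angle = {angle_from_inductance(L):.1f} deg")
```

In practice the calibration from inductance to angle would come from the measured positional relationship between the moveable target 802 and the sensing coils 804a and 804b rather than a simple linear map.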
[0127] FIG. 15 schematically depicts an example network environment 1300 that the surgical robotic system can be connected to, in accordance with some embodiments. Computing module 18 can be used to perform one or more steps of the methods provided by example embodiments. The computing module 18 includes one or more non-transitory computer-readable media for storing one or more computer-executable instructions or software for implementing example embodiments. The non-transitory computer-readable media can include, but are not limited to, one or more types of hardware memory, non-transitory tangible media (for example, one or more magnetic storage disks, one or more optical disks, one or more USB flash drives), and the like. For example, memory 1306 included in the computing module 18 can store computer-readable and computer-executable instructions or software for implementing example embodiments. The computing module 18 also includes the processor 22 and associated core 1304, for executing computer-readable and computer-executable instructions or software stored in the memory 1306 and other programs for controlling system hardware. The processor 22 can be a single-core processor or a multiple-core (1304) processor.
[0128] Memory 1306 can include a computer system memory or random access memory, such as DRAM, SRAM, EDO RAM, and the like. The memory 1306 can include other types of memory as well, or combinations thereof. A user can interact with the computing module 18 through the display 12, such as a touch screen display or computer monitor, which can display the graphical user interface (GUI) 39. The display 12 can also display other aspects, elements, and/or information or data associated with example embodiments. The computing module 18 can include other I/O devices for receiving input from a user, for example, a keyboard or any suitable multi-point touch interface 1308, and a pointing device 1310 (e.g., a pen, stylus, mouse, or trackpad). The keyboard 1308 and the pointing device 1310 can be coupled to the visual display device 12. The computing module 18 can include other suitable conventional I/O peripherals.
[0129] The computing module 18 can also include one or more storage devices 24, such as a hard drive, CD-ROM, or other computer readable media, for storing data and computer-readable instructions, applications, and/or software that implements example operations/steps of the surgical robotic system 10 as described herein, or portions thereof, which can be executed to generate GUI 39 on display 12. Example storage devices 24 can also store one or more databases for storing any suitable information required to implement example embodiments. The databases can be updated by a user or automatically at any suitable time to add, delete or update one or more items in the databases. Example storage device 24 can store one or more databases 1326 for storing provisioned data, and other data/information used to implement example embodiments of the systems and methods described herein.
[0130] The computing module 18 can include a network interface 1312 configured to interface via one or more network devices 1320 with one or more networks, for example, a Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (for example, 802.11, T1, T3, 56kb, X.25), broadband connections (for example, ISDN, Frame Relay, ATM), wireless connections, controller area network (CAN), or some combination of any or all of the above. The network interface 1312 can include a built-in network adapter, network interface card, PCMCIA network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the computing module 18 to any type of network capable of communication and performing the operations described herein. Moreover, the computing module 18 can be any computer system, such as a workstation, desktop computer, server, laptop, handheld computer, tablet computer (e.g., the iPad® tablet computer), mobile computing or communication device (e.g., the iPhone® communication device), or other form of computing or telecommunications device that is capable of communication and that has sufficient processor power and memory capacity to perform the operations described herein.
[0131] The computing module 18 can run any operating system 1316, such as any of the versions of the Microsoft® Windows® operating systems, the different releases of the Unix and Linux operating systems, any version of the MacOS® for Macintosh computers, any embedded operating system, any real-time operating system, any open source operating system, any proprietary operating system, any operating systems for mobile computing devices, or any other operating system capable of running on the computing device and performing the operations described herein. In some embodiments, the operating system 1316 can be run in native mode or emulated mode. In some embodiments, the operating system 1316 can be run on one or more cloud machine instances.
[0132] The computing module 18 can also include an antenna 1330, where the antenna 1330 can transmit wireless transmissions to a radio frequency (RF) front end and receive wireless transmissions from the RF front end.
[0133] The computing module 18 can also include the UART TX 26a and UART RX 26b.
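One way a UART transmitter and receiver pair such as UART TX 26a and UART RX 26b could realize the frequency sweep is by sending an alternating bit pattern whose toggle rate is set by the baud rate, then flagging the sweep step at which the echoed byte no longer matches the transmitted one. This is a hypothetical sketch of that idea, with made-up echo values, not the patented implementation:

```python
def baud_for_pulse_frequency(f_pulse_hz):
    """Baud rate at which a 0x55 payload toggles the line at f_pulse_hz.

    With payload 0x55 sent LSB-first in an 8N1 frame, the line level
    alternates on every bit, so the fundamental square-wave frequency
    is half the baud rate.
    """
    return int(2 * f_pulse_hz)

def first_corrupted_step(tx_byte, echoed_bytes):
    """Index of the first sweep step whose echoed byte differs from the
    transmitted byte, marking where the filter attenuates the pulses
    below the receiver's logic thresholds (None if never corrupted)."""
    for i, rx in enumerate(echoed_bytes):
        if rx != tx_byte:
            return i
    return None

# Example sweep over the drive frequencies discussed in FIGS. 13A-13E
sweep_hz = [4e6, 10e6, 20e6, 40e6, 60e6]
bauds = [baud_for_pulse_frequency(f) for f in sweep_hz]
print(bauds)  # [8000000, 20000000, 40000000, 80000000, 120000000]

# Hypothetical echoes: a 5 MHz filter corrupts everything above 4 MHz
echoes = [0x55, 0x54, 0x00, 0x00, 0x00]
print(first_corrupted_step(0x55, echoes))  # 1 -> first step above the cutoff
```

The index of the first corrupted echo localizes the filter's cutoff within the sweep, which is the quantity the controller maps to inductance and joint angle.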
Claims
1. A system for determining an angular position of a rotary joint, the system comprising:
   a filter circuit comprising:
      a capacitor;
      a variable inductor comprising a moveable target, and at least one sensing coil affixed to the rotary joint; and
   a controller coupled to the filter circuit, the controller programmed or configured to:
      transmit an input signal of varying frequency to the filter circuit;
      detect a signal corresponding to a response of the filter circuit to the input signal;
      characterize a frequency response of the filter circuit based at least in part on a determination that the signal corresponding to the response of the filter circuit is corrupted;
      determine:
         an inductance value of the variable inductor based at least in part on the frequency response of the filter circuit, and
         the angular position of the rotary joint based at least in part on the inductance value which is indicative of a positional relationship between the moveable target and the at least one sensing coil.
2. The system of claim 1, wherein the input signal is frequency modulated using the varying frequency.
3. The system of claim 1, wherein the determination that the signal corresponding to the response of the filter circuit is corrupted is based at least in part on the input signal being modulated by a frequency above a certain threshold.
4. The system of claim 3, wherein the controller comprises a Universal Asynchronous Receiver Transmitter (UART) receiver and a Universal Asynchronous Receiver Transmitter (UART) transmitter.
5. The system of claim 1, wherein the filter circuit further comprises a fixed value capacitor.
6. The system of claim 1, wherein the rotary joint is within a robotic arm.
7. A non-transitory computer-readable medium storing computer-executable instructions therein, which when executed by at least one processor in a controller, cause the at least one processor to perform the operations of:
   transmitting an input signal of varying frequency to a filter circuit;
   detecting a signal corresponding to a response of the filter circuit to the input signal;
   characterizing a frequency response of the filter circuit based at least in part on a determination that the signal corresponding to the response of the filter circuit is corrupted;
   determining:
      an inductance value of a variable inductor based at least in part on the frequency response of the filter circuit, wherein the variable inductor comprises a moveable target, and at least one sensing coil affixed to a rotary joint, and
      an angular position of the rotary joint based at least in part on the inductance value which is indicative of a positional relationship between the moveable target and the at least one sensing coil.
8. The non-transitory computer-readable medium of claim 7, wherein the input signal is frequency modulated using the varying frequency.
9. The non-transitory computer-readable medium of claim 7, wherein the determination that the signal corresponding to the response of the filter circuit is corrupted is based at least in part on the input signal being modulated by a frequency above a certain threshold.
10. The non-transitory computer-readable medium of claim 7, wherein the controller comprises a Universal Asynchronous Receiver Transmitter (UART) receiver and a Universal Asynchronous Receiver Transmitter (UART) transmitter.
11. The non-transitory computer-readable medium of claim 8, wherein the filter circuit further comprises a fixed value capacitor.
12. The non-transitory computer-readable medium of claim 7, wherein the rotary joint is within a robotic arm.
13. The non-transitory computer-readable medium of claim 7, wherein the angular position of the rotary joint changes based at least in part on a thickness or a width of the moveable target.
14. A method of determining an angular position of a rotary joint, the method comprising:
   transmitting an input signal of varying frequency to a filter circuit;
   detecting a signal corresponding to a response of the filter circuit to the input signal;
   characterizing a frequency response of the filter circuit based at least in part on a determination that the signal corresponding to the response of the filter circuit is corrupted;
   determining:
      an inductance value of a variable inductor based at least in part on the frequency response of the filter circuit, wherein the variable inductor comprises a moveable target, and at least one sensing coil affixed to a rotary joint, and
      an angular position of the rotary joint based at least in part on the inductance value which is indicative of a positional relationship between the moveable target and the at least one sensing coil.
15. The method of claim 14, wherein the input signal is frequency modulated using the varying frequency.
16. The method of claim 14, wherein the determination that the signal corresponding to the response of the filter circuit is corrupted is based at least in part on the input signal being modulated by a frequency above a certain threshold.
17. The method of claim 14, wherein the controller comprises a Universal Asynchronous Receiver Transmitter (UART) receiver and a Universal Asynchronous Receiver Transmitter (UART) transmitter.
18. The method of claim 14, wherein the filter circuit further comprises a fixed value capacitor.
19. The method of claim 14, wherein the rotary joint is within a robotic arm.
20. The method of claim 14, wherein the angular position of the rotary joint changes based at least in part on a thickness or a width of the moveable target.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363524220P | 2023-06-30 | 2023-06-30 | |
| US63/524,220 | 2023-06-30 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025007141A1 (en) | 2025-01-02 |
Family
ID=91950068
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/036427 (WO2025007141A1, Pending) | Systems and methods for inductive pulse frequency modulated position sensing | 2023-06-30 | 2024-07-01 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025007141A1 (en) |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102016206773A1 (en) * | 2016-04-21 | 2017-10-26 | Robert Bosch Gmbh | Motor control electronics for a brushless DC motor |
| US20190076199A1 (en) | 2017-09-14 | 2019-03-14 | Vicarious Surgical Inc. | Virtual reality surgical camera system |
| US10285765B2 (en) | 2014-05-05 | 2019-05-14 | Vicarious Surgical Inc. | Virtual reality surgical device |
| DE102018121560A1 (en) * | 2018-09-04 | 2020-03-05 | Thyssenkrupp Ag | High-resolution induction / frequency measurement with a slow microcontroller |
| WO2021159409A1 (en) | 2020-02-13 | 2021-08-19 | Oppo广东移动通信有限公司 | Power control method and apparatus, and terminal |
| WO2021231402A1 (en) | 2020-05-11 | 2021-11-18 | Vicarious Surgical Inc. | System and method for reversing orientation and view of selected components of a miniaturized surgical robotic unit in vivo |
| WO2022094000A1 (en) | 2020-10-28 | 2022-05-05 | Vicarious Surgical Inc. | Laparoscopic surgical robotic system with internal degrees of freedom of articulation |
| US20220272272A1 (en) * | 2021-02-24 | 2022-08-25 | Vicarious Surgical Inc. | System and method for autofocusing of a camera assembly of a surgical robotic system |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12048505B2 (en) | Master control device and methods therefor | |
| US11197731B2 (en) | Auxiliary image display and manipulation on a computer display in a medical robotic system | |
| KR102549728B1 (en) | Systems and methods for onscreen menus in a teleoperational medical system | |
| US6522906B1 (en) | Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure | |
| EP2467082B1 (en) | Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument | |
| US9788909B2 (en) | Synthetic representation of a surgical instrument | |
| AU2021240407B2 (en) | Virtual console for controlling a surgical robot | |
| EP1973021A2 (en) | Master-slave manipulator system | |
| KR20140110685A (en) | Method for controlling of single port surgical robot | |
| KR20120098342A (en) | Master robot, surgical robot system using thereof, control method thereof, and recording medium thereof | |
| JP2023010762A (en) | Camera control | |
| US20220378528A1 (en) | Systems and methods for controlling a surgical robotic assembly in an internal body cavity | |
| WO2025007141A1 (en) | Systems and methods for inductive pulse frequency modulated position sensing | |
| KR101825929B1 (en) | A system of non - restraint three - dimensional hand movement motion recognition for surgical robot maneuvering, method, computer program, and computer - readable recording media using the same | |
| WO2024207024A1 (en) | Systems and methods for a low-conductivity and high permeability based target inductive encoding | |
| WO2024137772A1 (en) | Systems and methods for inserting a robotic assembly into an internal body cavity | |
| WO2024097162A1 (en) | Systems including a graphical user interface for a surgical robotic system | |
| EP4593748A1 (en) | Hand controllers, systems, and control methods for surgical robotic systems | |
| WO2024145552A9 (en) | Needle driver with suture cutting function | |
| WO2024129771A1 (en) | Controller with a touchpad user interface for operating robotically actuated devices | |
| WO2025007133A1 (en) | Rounded triangular cannula trocar | |
| CN119097423A (en) | Control system for surgical robot and surgical robot system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24743236; Country of ref document: EP; Kind code of ref document: A1 |