US20220331032A1 - Camera control - Google Patents
- Publication number: US20220331032A1 (application US 17/811,098)
- Authority: US (United States)
- Prior art keywords: input, end effector, input device, operator, movement
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A61B34/37 — Leader-follower robots (computer-aided surgery; surgical robots)
- A61B1/00149 — Holding or positioning arrangements using articulated arms (endoscopes)
- A61B1/045 — Control of endoscopes combined with photographic or television appliances
- A61B34/30 — Surgical robots
- A61B34/74 — Manipulators with manual electric input means
- B25J13/02 — Hand grip control means
- B25J13/025 — Hand grip control means comprising haptic means
- B25J3/04 — Manipulators of leader-follower type (both controlling unit and controlled unit perform corresponding spatial movements) involving servo mechanisms
- A61B2034/301 — Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
- A61B2034/302 — Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities
- A61B2090/373 — Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
- A61B34/35 — Surgical robots for telesurgery
- A61B90/361 — Image-producing devices, e.g. surgical cameras
- A61B90/50 — Supports for surgical instruments, e.g. articulated arms
Definitions
- in robotic master-slave systems, an operator controls the movements of a number of teleoperated tools using one or more input devices. Consequently, the operator can control the tools without needing to be in the same location as the worksite in which the tools are being manipulated.
- Image capture devices, such as cameras, and a display for displaying a view of the captured images may be included to provide the operator with a view of the worksite.
- Some applications of robotic master-slave systems include assembly, maintenance or exploration in hazardous environments (e.g. underwater, nuclear or chemical plant), and minimally invasive surgical systems.
- the operator(s), who may be a surgeon, an assistant, a student or a combination thereof, can remotely manipulate a number of remote tools (such as surgical instruments) supported by a number of robotic manipulators by manipulating a number of input devices at an operator's console (e.g. a surgeon's console).
- the robotic manipulators may support image capture devices, such as endoscopes, to provide the operator with a view of the remote surgical site on a display at the surgeon's console.
- the surgeon manipulates an input device to control the movement of the surgical instrument at the surgical site.
- an endoscope used in the procedure is controlled by a camera assistant who manipulates the endoscope in response to verbal instructions received from the surgeon to provide the surgeon with a desired view of the surgical site.
- the surgeon may use the same input device being used to manipulate the surgical instrument to control the endoscope by entering an endoscope mode, in which operation of the surgical instrument is halted and the instrument is locked in place; the input device is then disassociated from the instrument and associated with the endoscope to control the movement of the endoscope.
- when the endoscope mode described above is exited, the input device is disassociated from the endoscope and re-associated with the instrument so that the surgeon can resume operation of the instrument.
- the present invention provides a master-slave system comprising: a first manipulator supporting a first end effector; a second manipulator supporting a second end effector; an input device configured to concurrently receive from a hand of an operator a first movement command to effect a desired movement of the first end effector and a second movement command to effect a desired movement of the second end effector; and a processor configured to determine a desired movement of the first and the second end effectors in response to the first and second movement commands received from the input device.
- the input device may further comprise a body for being held by an operator.
- the body may be configured to be gripped by a single hand of the operator.
- the input device may further comprise a first input for receiving the first movement command to effect a desired movement of the first end effector.
- the processor may be configured to operate in a plurality of modes including: an engaged mode, wherein the processor operatively couples the input device with the second end effector; a disengaged mode, wherein the processor operatively decouples the input device from the second end effector; an operational mode, wherein the processor operatively couples the first input with the first end effector; and a selection mode, wherein the processor operatively decouples the first input from the first end effector.
- the processor may further be configured to operate in the disengaged mode in response to a signal received from a clutch input.
- the processor may further be configured to operate in the operational mode in response to receiving a signal from a second input.
- the processor may further be configured to concurrently operate in the disengaged mode and the selection mode when the processor enters the disengaged mode.
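The modes and transitions described above can be pictured as a small state machine. The following is a minimal sketch assuming a flag-style mode register; the class and member names are illustrative, not from the patent:

```python
from enum import Flag, auto

class Mode(Flag):
    ENGAGED = auto()      # input device coupled to the second end effector
    DISENGAGED = auto()   # input device decoupled from the second end effector
    OPERATIONAL = auto()  # first input coupled to the first end effector
    SELECTION = auto()    # first input decoupled, used to select an end effector

class ModeController:
    def __init__(self):
        # Start fully coupled: device drives one end effector, first input the other.
        self.mode = Mode.ENGAGED | Mode.OPERATIONAL

    def on_clutch(self):
        # Clutch signal: enter the disengaged mode and, concurrently, the
        # selection mode, as the description states.
        self.mode = Mode.DISENGAGED | Mode.SELECTION

    def on_second_input(self):
        # Signal from the second input: return the first input to
        # operational control of the first end effector.
        self.mode = (self.mode & ~Mode.SELECTION) | Mode.OPERATIONAL
```

This treats engaged/disengaged and operational/selection as two independent couplings, which is one plausible reading of the four modes.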
- the input device may further be configured to receive the second movement command to effect a desired movement of the second end effector when the processor is operating in the engaged mode.
- the first input may be configured to receive the first movement command to effect a desired movement of the first end effector when the processor is operating in the operational mode.
- the first input may further be configured to select a third end effector for association with the input device when the processor is operating in the selection mode.
- the input device may further be configured to receive a third movement command to effect a desired movement of the third end effector in response to selection of the third end effector by the first input.
- the input device may further comprise a drive mechanism coupled to the first input, the drive mechanism being configured to apply a force to the first input for providing force feedback to the operator.
- the input device may further comprise a force sensor configured to sense a force applied to the first input for determining that a command is being received at the first input from the operator.
- the input device may further comprise a capacitive sensor configured to sense when the operator is touching the first input for determining that a command is being received at the first input from the operator.
- the input device may further comprise a position sensor configured to sense the position of the first input relative to the body for determining that a command is being received at the first input from the operator.
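Taken together, the force, capacitive and position sensing options amount to a gating check before a deflection of the first input is treated as a command. A hypothetical sketch, with thresholds and names invented for illustration:

```python
FORCE_THRESHOLD_N = 0.5     # illustrative: minimum force to count as deliberate
POSITION_THRESHOLD = 0.05   # illustrative: minimum deflection from centre

def command_received(force_n: float, touch_detected: bool, deflection: float) -> bool:
    """Treat the first input as commanding only when the operator's thumb is
    present (capacitive sensor) and the input is being pushed or deflected
    beyond a small threshold (force or position sensor)."""
    deliberate = force_n > FORCE_THRESHOLD_N or abs(deflection) > POSITION_THRESHOLD
    return touch_detected and deliberate
```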
- the processor may further be configured to effect movement of the first end effector in accordance with the determined desired movement of the first end effector.
- the processor may further be configured to effect movement of the second end effector in accordance with the determined desired movement of the second end effector.
- the first end effector may have multiple degrees of freedom and the processor may be configured to selectively associate at least one of the degrees of freedom of the first end effector with the first input and to effect movement of the at least one of the degrees of freedom of the first end effector in response to movement of the first input.
- the first end effector may comprise an image capture device for capturing an image of a worksite.
- the image capture device may be an endoscope.
- the master-slave system may further comprise a display for displaying the captured image.
- the processor may further be configured to effect movement of the first end effector according to movement of the first input with respect to an operator selected frame of reference.
- the operator selected frame of reference may be fixed to the input device.
- the operator selected frame of reference may be fixed to the display.
- the processor may further be configured to continuously update a mapping between an input device frame of reference and a second end effector frame of reference and to continuously update a mapping between a first input frame of reference and a first end effector frame of reference such that movements of the first and second end effectors as displayed on the display correspond to the movements of the first input and the input device respectively.
- FIG. 1 shows a master-slave system;
- FIG. 2 shows a patient undergoing tele-surgery using a plurality of robotic manipulators each supporting a tool;
- FIG. 3 shows two examples of the robotic manipulators of FIG. 2, each supporting a different tool;
- FIG. 4 shows an operator console including a display and an input device;
- FIG. 5 a shows a top view of an input device including an input and a clutch input;
- FIG. 5 b shows a perspective view of the input device of FIG. 5 a;
- FIG. 6 shows a schematic representation of the system of FIG. 1;
- FIG. 7 shows an operator using the operator console of FIG. 4;
- FIG. 8 a shows a view of the input of FIG. 5 a with its operator selected frame of reference fixed to the input device of FIG. 4;
- FIG. 8 b shows an alternative view of the input of FIG. 5 a with its operator selected frame of reference fixed to the display of FIG. 4.
- FIG. 1 shows a master-slave system 10 in an exemplary configuration as a surgical system.
- the system 10 includes a patient table 50 , a cart 70 , and an operator console 20 .
- the operator console 20 allows an operator O (shown in FIG. 7 ) to carry out a surgical procedure on patient P by remotely manipulating one or more robotic manipulator(s) 80 a, 80 b, 80 c (shown in FIG. 2 ), each of which supports a tool 40 a, 40 b, 40 c, using one or more input device(s) 21 a, 21 b.
- the operator O may, for example, be a surgeon, an assistant, or a student.
- the system 10 has one or more robotic manipulators 80 a, 80 b, 80 c for supporting a variety of tools 40 a, 40 b, 40 c.
- the robotic manipulator(s) 80 a, 80 b, 80 c may be as shown in FIGS. 2, 3 and 6 .
- the robotic manipulators 80 a, 80 b, 80 c of FIGS. 2, 3 and 6 each have an arm 81 which extends from a base 90 (shown in FIG. 6 ).
- the arm 81 is articulated by a series of revolute joints 82 , 83 , 84 along its length.
- the end effector 41 a, 41 b may be a surgical instrument such as, but not limited to, smooth jaws, serrated jaws, a gripper, a pair of shears, a needle for suturing, a laser, a knife, a stapler, a cauteriser, a suctioner, pliers, a scalpel, a cautery electrode, or the like.
- the end effector 41 c may alternatively be an image capture device, such as an endoscope.
- the end effector 41 a, 41 b, 41 c can be driven to move by a drive motor 86 at the distal end of the arm 81 .
- the drive motor 86 may be coupled to the end effector 41 a, 41 b, 41 c by cables extending along the interior of the instrument's shaft.
- FIG. 3 shows a robotic manipulator 80 c supporting an endoscope 41 c and a robotic manipulator 80 a supporting an instrument end effector 41 a, it will be appreciated that each of the robotic manipulators 80 a, 80 b, 80 c can support any of the end effectors.
- the operator console 20 includes one or more input devices 21 a, 21 b coupled to a linkage assembly 26 for controlling the robotic manipulator(s) 80 a, 80 b, 80 c .
- the operator console 20 also includes a display 30 for providing the operator O with a view of the remote surgical worksite.
- the input device(s) 21 a, 21 b of FIGS. 4, 5 a and 5 b are designed to be held in a user's hand for providing three-dimensional motion input to the system 10 . In some cases the input device(s) 21 a, 21 b may also allow the user to provide functional input to the system 10 .
- the robotic system may comprise a plurality of display devices, or screens.
- the screens are suitably configured to display the view of the remote worksite as a two-dimensional image and/or as a three-dimensional image.
- the screens can be provided on a single operator console 20 , or two or more consoles can comprise at least one screen each. This permits additional viewing screens which can be useful for allowing people other than the console operator to view the surgical worksite, for example for training.
- the X axis is orthogonal to the forward/rearward axis of the input device 21 b.
- the Y axis is orthogonal to the left/right axis of the input device 21 b.
- the Z axis is parallel to the up/down axis of the input device 21 b and orthogonal to the X and the Y axes. In other words, the Z axis of the input device 21 b is a line going through the body of the input device 21 b from its top to its bottom surface.
- the example input device 21 b of FIGS. 5 a and 5 b is intended to be grasped in the right hand.
- a mirror image input device 21 a could be intended for the left hand.
- the body 27 of the input device 21 b has a head 23 and a grip 24 .
- the grip 24 is configured to sit in the palm of the user's hand. This allows the user to place their first/index finger on a clutch input 22 .
- the user's thumb can be placed on the opposite side of the head 23 to the clutch input 22 or alternatively the user's thumb can be placed on an input 25 on the head 23 .
- User acceptance testing of devices of this type has established that many users naturally pick up and hold such a device in this manner, with their wrist in a neutral position and their thumb opposing their finger. This has the result that for three-dimensional motion input the user's wrist is free to move in flexion, extension, adduction and abduction.
- the clutch input 22 is shown as being a trigger lever that can rotate relative to the head 23 about an axis (not shown).
- the clutch input 22 could alternatively be termed the pince or pinch input.
- the input 25 is shown as being a thumbstick that pivots on a base and reports its position in relation to a default centre position to a control unit 60 .
- This position signal can then be used as a further motion input and, optionally, as a functional input in the system 10 .
- the input 25 of FIGS. 5 a and 5 b is designed to be manipulated with the user's thumb and typically provides a two-dimensional input.
- Input 25 could be articulated in several ways.
- the input 25 could pivot in a forward/rearward direction A about the X axis of input device 21 b.
- the input 25 could also pivot in a left/right direction B about the Y axis of input device 21 b.
- the input 25 could also behave as a push-button to provide an up/down movement of the input 25 along the Z axis of input device 21 b.
- the input 25 is positioned on the input device 21 a, 21 b such that it can be used in a comfortable and natural way by the user but not inadvertently actuated.
- the input 25 may be in an inoperable state by default, in which state any input received by the system from input 25 is ignored.
- the input 25 may only become operable once it has been detected that it is being engaged by a user.
- a touch sensitive panel (not shown) could be provided on the surface of input 25 which could be engaged by the user's thumb.
- the touch-sensitive panel could include a capacitive sensor (not shown) to detect the presence of the user's thumb on the input 25 when the user's thumb is touching the input 25 .
- haptic sensing could be provided for input 25 to detect any shaking or vibrating resulting from the user's thumb touching the input 25 , and thereby detect the presence of the user's thumb on the input 25 .
- the input 25 is in an operable state by default.
- a neutral zone could be defined within the articulation range of the input 25 which will not produce any output.
- the input 25 may need to be pivoted in a forward/rearward direction about the X axis of input device 21 a, 21 b beyond a certain threshold value before an output signal is actioned and any resulting movement of an associated slave is generated in the system.
- Visual cues may be provided on the display 30 and/or on the input device 21 a, 21 b to indicate the band of input values to the input 25 which will not produce any output to the system.
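The neutral zone described above is the familiar joystick dead-zone pattern. A minimal sketch, assuming the input deflection is normalised to [-1, 1]; the threshold value is illustrative:

```python
def apply_neutral_zone(value: float, threshold: float = 0.1) -> float:
    """Return 0.0 while the input sits inside the neutral zone; outside it,
    rescale so the output rises from 0 at the threshold to 1 at full
    deflection, avoiding a jump at the zone boundary."""
    if abs(value) <= threshold:
        return 0.0
    sign = 1.0 if value > 0 else -1.0
    return sign * (abs(value) - threshold) / (1.0 - threshold)
```

Small accidental deflections of the thumbstick then produce no endoscope motion at all, while deliberate deflections are actioned.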
- a position sensor may be provided within the head 23 to sense the position of the input 25 relative to the body 27 .
- the position sensor may be able to discriminate a range of positions of the input 25 , allowing the extent to which the input 25 is pivoted or is pressed to be identified more precisely.
- the position sensor may be a rotational encoder or a potentiometer arranged about the rotation axis of the input 25 .
- the input device 21 b may comprise a force and/or torque sensor for detecting the force applied to the input 25 and/or the torque applied to the input 25 about its pivot axis.
- Additional user input interfaces may be provided on the head 23 .
- an example robotic manipulator 80 a is governed by a control unit 60 .
- the control unit 60 receives inputs from an input device 21 a, 21 b and from other sources such as the linkage assembly 26 and the robotic manipulator position/force sensors 85 .
- the linkage assembly 26 is an articulated linkage which supports the input device 21 a, 21 b and permits it to be moved with six degrees of freedom.
- the configuration of the linkage assembly 26 can be detected by sensors on the linkage and passed to the control unit 60 . In that way movement of the input device 21 a, 21 b can be used to control the movement of the robotic manipulator 80 a.
- the input device 21 a, 21 b may be equipped with accelerometers which permit its position and orientation to be estimated.
- the control unit 60 comprises a processor 61 which executes code stored in a non-transient form in a memory 62 . On executing the code, the processor 61 determines a set of signals for commanding movement of the joints 82 , 83 , and 84 of the robotic manipulator 80 a, and for moving the end effector 41 a of the instrument 40 a in dependence on the inputs from the input device 21 a, 21 b, the linkage assembly 26 , and the manipulator position/force sensors 85 .
- the code is configured so that the motion of the robotic manipulator 80 a is essentially dictated by the inputs from the input device 21 a, 21 b and the linkage assembly 26 .
- (i) the attitude of the end effector 41 a may be set by the attitude of the input device 21 a, 21 b about its rotational degrees of freedom; (ii) the position of the end effector 41 a may be set by the position of the input device 21 a, 21 b about its translational degrees of freedom; and (iii) the configuration of the jaws of the end effector 41 a may be set by the position of the clutch input 22 or input 25 relative to the body 27 .
- clutch input 22 can be used to operate the jaws of a surgical instrument end effector.
- the processor 61 determines a set of signals for commanding movement of the joints 82 , 83 , and 84 of the robotic manipulator 80 c, and for moving the endoscope 41 c in dependence on the inputs from the input 25 and the position/force sensors of the input device 21 a, 21 b.
- the up/down position of the distal end of the endoscope 41 c may be set by the position of the input 25 of a first input device 21 a about its default position in a forward/rearward direction A;
- the left/right position of the distal end of the endoscope 41 c may be set by the position of the input 25 of the first input device 21 a about its default position in a left/right direction B;
- the zoom level of the endoscope 41 c may be set by the position of the input 25 of a second input device 21 b about its default position in a forward/rearward direction A;
- the rotation of the distal end of the endoscope 41 c may be set by the position of the input 25 of the second input device 21 b about its default position in a left/right direction B.
- the input devices 21 a and 21 b could be mapped conversely to that described above so that the first input device 21 a controls the rotation and zoom level of the endoscope 41 c and the second input device 21 b controls the up/down and left/right movement of the distal end of the endoscope 41 c.
- a single input device 21 a or 21 b is used to control all of the above described degrees of freedom of the endoscope 41 c .
- the up/down position of the distal end of the endoscope 41 c may be set by the position of the input 25 of a first input device 21 a about its default position in a forward/rearward direction A;
- the left/right position of the distal end of the endoscope 41 c may be set by the position of the input 25 of the first input device 21 a about its default position in a left/right direction B.
- the operator O may press the input 25 to switch from controlling the up/down and left/right directions of motion of the distal end of the endoscope 41 c to controlling the rotation and zoom level of the endoscope 41 c.
- the zoom level of the endoscope 41 c may be set by the position of the input 25 of the first input device 21 a about its default position in a forward/rearward direction A; and (iv) the rotation of the endoscope 41 c may be set by the position of the input 25 of the first input device 21 a about its default position in a left/right direction B.
- the operator O may choose the mapping between the forward/rearward and left/right movements of the input 25 and the up/down, left/right, rotation and zoom level of the endoscope 41 c according to their own preference. In other words, the operator O can select which degrees of freedom of the input 25 to map onto which degrees of freedom of the endoscope 41 c.
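One way to picture the operator-selectable mapping is as a lookup table from (input device, thumbstick axis) to an endoscope degree of freedom. The sketch below follows the two-device arrangement described above; the string keys and the converse-mapping helper are illustrative, not from the patent:

```python
# Default mapping: first device 21a drives up/down and left/right of the
# endoscope's distal end; second device 21b drives zoom and rotation.
DEFAULT_MAPPING = {
    ("21a", "forward_rearward"): "up_down",
    ("21a", "left_right"): "left_right",
    ("21b", "forward_rearward"): "zoom",
    ("21b", "left_right"): "rotation",
}

# The converse mapping mentioned in the text: swap the roles of the devices.
SWAPPED_MAPPING = {(("21b" if dev == "21a" else "21a"), axis): dof
                   for (dev, axis), dof in DEFAULT_MAPPING.items()}

def endoscope_command(mapping, device, axis, deflection):
    """Translate one thumbstick axis deflection into (endoscope DOF, amount)."""
    return mapping[(device, axis)], deflection
```

An operator preference would then simply select which table (or a custom one) is active.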
- the operator O of system 10 is able to use a single input device 21 a, 21 b to concurrently control the instrument end effector 41 a, 41 b and the endoscope 41 c without the need to halt either the manipulation of the instrument end effector 41 a, 41 b or the endoscope 41 c.
- the processor 61 may determine the set of signals for moving the endoscope 41 c in dependence on the inputs from the input 25 and the position/force sensors of the input device 21 a, 21 b using velocity control, such that the speed with which the endoscope 41 c is moved in a desired direction may also be controlled.
- the input 25 may be configured such that it is biased to return to a neutral position.
- the speed with which the distal end of the endoscope 41 c is moved in an up/down direction may be set by the position of the input 25 of a first input device 21 a about its default neutral position in the forward/rearward direction A;
- the speed with which the distal end of the endoscope 41 c is moved in a left/right direction may be set by the position of the input 25 of the first input device 21 a about its default neutral position in a left/right direction B;
- the speed with which the zoom level of the endoscope 41 c is changed may be set by the position of the input 25 of a second input device 21 b about its default neutral position in a forward/rearward direction A; and
- the speed with which the rotation of the distal end of the endoscope 41 c is effected may be set by the position of the input 25 of the second input device 21 b about its default neutral position in a left/right direction B.
- when the operator O wishes to move the endoscope 41 c in a desired way, they push the input 25 of the relevant input device 21 a, 21 b in one of the appropriate directions described above. How far they push the input 25 away from the neutral position determines how fast the endoscope 41 c moves in the desired way. Once the operator O releases the input 25 , the endoscope 41 c stops moving and the input 25 returns to its neutral position.
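Under this velocity-control scheme, each control cycle advances the mapped degree of freedom at a speed proportional to the stick deflection, so a released (neutral) stick freezes the endoscope in place. A minimal sketch with an invented gain and units:

```python
MAX_SPEED = 10.0  # illustrative gain: full deflection speed, e.g. mm/s or deg/s

def step_endoscope_dof(position: float, deflection: float, dt: float) -> float:
    """Advance one mapped endoscope degree of freedom by one control step:
    deflection in [-1, 1] commands a proportional speed; zero deflection
    (stick back at neutral) leaves the position unchanged."""
    return position + MAX_SPEED * deflection * dt
```

Contrast this with position control, where the stick's displacement sets the endoscope's position directly rather than its speed.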
- the input devices 21 a and 21 b could be mapped conversely to that described above so that the first input device 21 a controls the rotation and zoom level of the endoscope 41 c and the second input device 21 b controls the up/down and left/right movement of the distal end of the endoscope 41 c.
- a single input device 21 a or 21 b is used to control all of the above described degrees of freedom of the endoscope 41 c using velocity control.
- the input 25 is configured such that it is biased to return to a neutral position; (i) the speed with which the distal end of the endoscope 41 c is moved in the up/down direction may be set by the position of the input 25 of a first input device 21 a about its default neutral position in a forward/rearward direction A; (ii) the speed with which the distal end of the endoscope 41 c is moved in the left/right direction may be set by the position of the input 25 of the first input device 21 a about its default neutral position in a left/right direction B.
- the operator O may press the input 25 to switch from controlling the up/down and left/right directions of motion of the distal end of the endoscope 41 c to controlling the rotation and zoom level of the endoscope 41 c.
- the speed with which the zoom level of the endoscope 41 c is changed may be set by the position of the input 25 of the first input device 21 a about its default neutral position in a forward/rearward direction A; and (iv) the speed with which the rotation of the endoscope 41 c is effected may be set by the position of the input 25 of the first input device 21 a about its default neutral position in a left/right direction B.
- the operator O may choose the mapping between the forward/rearward and left/right movements of the input 25 and the up/down, left/right, rotation and zoom level of the endoscope 41 c according to their own preference. In other words, the operator O can select which degrees of freedom of the input 25 to map onto which degrees of freedom of the endoscope 41 c. This will be described in more detail below.
- the operator O of system 10 is able to use a single input device 21 a, 21 b to concurrently control the instrument end effector 41 a, 41 b and the endoscope 41 c without the need to halt either the manipulation of the instrument end effector 41 a, 41 b or the endoscope 41 c.
- the operator O of system 10 can reposition the endoscope 41 c to capture a different view of the worksite during a procedure without needing to stop the procedure.
- the endoscope 41 c can thus be regarded as another slave end effector.
- the operator O carrying out a procedure using the operator console 20 uses the input device(s) 21 a, 21 b (the masters) to control the movements of the one or more remote tools 40 a, 40 b, 40 c (the slaves).
- the operator O can view the worksite via the display 30 .
- Instrument end effectors 41 a and 41 b supported on robotic manipulators 80 a and 80 b are manipulated to effect positional and orientational movements in response to movement and functional inputs on their respective associated input device(s) 21 a, 21 b.
- images of the instrument end effectors 41 a, 41 b together with the surgical site are captured by the endoscope 41 c supported on robotic manipulator 80 c and are displayed on the display 30 so that the operator O can see the responsive movements and actions of the instrument end effectors 41 a, 41 b as they control such movements and actions using the input device(s) 21 a, 21 b.
- control unit 60 establishes a relationship between input device 21 a, 21 b and the instrument end effector 41 a, 41 b associated with it as viewed in the captured image being displayed on display 30 , in which relationship the orientation and position of the instrument end effector 41 a or 41 b as displayed to the operator O in the image follows and corresponds to the orientation and position of the associated input device 21 a or 21 b as manipulated by the operator's hand.
- by mapping the orientation and position of the associated instrument end effector 41 a or 41 b onto the input device's orientation and position, control of the movement of the instrument end effector 41 a or 41 b can be performed in a more intuitive manner than in the case where the movement of the instrument end effector 41 a or 41 b as displayed in the image is not mapped onto the movement of the operator's hand.
- control unit 60 establishes a relationship between input 25 and the endoscope 41 c associated with it, in which relationship the orientation and position of the endoscope 41 c follows and corresponds to the orientation and position of the associated input 25 as manipulated by the operator's hand.
- the control unit 60 effects control between an input device 21 a, 21 b and an instrument end effector 41 a, 41 b by mapping the input device's position and orientation in an input device Cartesian coordinate reference system (input frame) with the position and orientation of the instrument end effector 41 a or 41 b in a camera Cartesian coordinate reference system (camera frame). Accordingly, during operation, the instrument end effector's position and orientation within the camera frame is mapped to the position and orientation of the input device 21 a, 21 b in the input frame and the instrument end effector 41 a or 41 b is effected to move to a new position/orientation in the camera frame which corresponds to the current position and orientation of the input device 21 a, 21 b in the input frame.
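A minimal sketch of this position/orientation mapping is given below, under simplifying assumptions the text above does not specify: the input frame and camera frame axes are taken as aligned at engagement, orientations are rotation matrices, and a scalar motion-scaling factor is applied:

```python
import numpy as np

def end_effector_target(input_pos, input_rot, ref_pos, ref_rot,
                        ee_ref_pos, ee_ref_rot, scale=1.0):
    """Map a displacement of the input device in the input frame to a target
    pose of the end effector in the camera frame.

    input_pos/input_rot: current input-device pose in the input frame.
    ref_pos/ref_rot: input-device pose recorded when the coupling was engaged.
    ee_ref_pos/ee_ref_rot: end-effector pose in the camera frame at engagement.
    scale: motion-scaling factor (an assumption; not specified in the text).
    """
    # Translation: scaled displacement of the master since engagement.
    target_pos = ee_ref_pos + scale * (input_pos - ref_pos)
    # Orientation: apply the master's relative rotation to the slave's
    # reference orientation.
    rel_rot = input_rot @ ref_rot.T
    target_rot = rel_rot @ ee_ref_rot
    return target_pos, target_rot
```

With this convention, re-engaging the coupling simply resets `ref_pos`/`ref_rot` and `ee_ref_pos`/`ee_ref_rot`, which is one way the mapping update after a clutched repositioning could be realised.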
- control unit 60 effects control between input 25 and the endoscope 41 c by mapping the input's position and orientation with respect to the input device 21 a, 21 b in an input Cartesian coordinate reference system (input frame) with the position and orientation of the endoscope 41 c in the camera frame. Accordingly, during operation the endoscope 41 c is effected to move to a new position/orientation in the camera frame which corresponds to the current position and orientation of the input 25 in the input frame.
- control unit 60 effects control between input 25 and the endoscope 41 c using velocity control by mapping the input's position about its default neutral position with respect to the input device 21 a, 21 b in an input frame with the direction and speed of motion of the endoscope 41 c in the camera frame. Accordingly, during operation, the endoscope 41 c is effected to move in a new direction and at a desired speed in the camera frame which corresponds to the position of the input 25 about its default neutral position in the input frame.
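The velocity-control scheme could be sketched as follows; the deadzone width and maximum speed are illustrative values chosen for the example, and the neutral band mirrors the neutral zone of non-actuating input values described elsewhere in the document:

```python
import numpy as np

DEADZONE = 0.1    # neutral band around the stick's centre (assumed width)
MAX_SPEED = 0.02  # endoscope tip speed at full deflection (assumed units)

def endoscope_velocity(deflection):
    """Map a 2-D thumbstick deflection about its default neutral position to
    an endoscope tip velocity in the camera frame (velocity control).

    deflection: array-like in [-1, 1]^2; (0, 0) is the neutral position.
    """
    d = np.asarray(deflection, dtype=float)
    mag = np.linalg.norm(d)
    if mag < DEADZONE:           # inside the neutral zone: no motion commanded
        return np.zeros(2)
    # Rescale so speed rises smoothly from zero at the deadzone edge.
    speed = MAX_SPEED * (mag - DEADZONE) / (1.0 - DEADZONE)
    return speed * d / mag       # direction follows the deflection
```

The direction of motion follows the direction of the deflection and the speed grows with the distance from the neutral position, which is the essential behaviour described above.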
- the user can disengage the input device 21 a, 21 b and move it to a more ergonomic position and/or orientation.
- the mapping between the input frame and the camera frame is updated.
- the master frame of reference can be changed and also vary between the input device 21 a and the input device 21 b.
- this update of the mapping between the master (input device 21 a, 21 b ) and the slave (end effector 41 a, 41 b, 41 c ) results in the movement commands given via the input device 21 a, 21 b producing the expected motion of the slave as displayed to the operator O on display 30 .
- control unit 60 determines and effects a desired position and orientation of the distal end of the endoscope 41 c based on the position and orientation of the input 25, expressed via the input frame relative to a frame of reference of the operator's preference. This concept is illustrated in FIGS. 8 a and 8 b and is described below.
- FIG. 8 a illustrates an example in which the operator O is holding the input device 21 a such that it is pointed squarely at the screen of display 30 .
- the operator O can choose to set their preferred frame of reference to be the same as that of the input device 21 a (i.e. the input frame).
- the operator O can see the instrument end effector 41 a as shown on display 30 .
- the distal end of the endoscope 41 c is also moved leftwards, causing the displayed image of the instrument end effector 41 a to shift to the right side of the captured image as shown on display 30 ′.
- FIG. 8 b illustrates another example in which the input device 21 a is being held by the operator O such that it is pointed 90 degrees to the right of the screen of display 30 .
- if the operator O chooses to set their preferred frame of reference to be the same as the input frame once more, manipulating the input 25 to move left with respect to the body of the input device 21 a, in the direction indicated by letter L in FIG. 8 b , would result in a leftward movement of the distal end of the endoscope 41 c, causing the displayed image of the instrument end effector 41 a to shift to the right side of the captured image as shown on display 30 ′.
- operator O could choose to set their preferred frame of reference to be with respect to the screen.
- moving the input 25 in the direction indicated by letter L in FIG. 8 b would result in a forwards movement of the distal end of the endoscope 41 c, causing the displayed image of the instrument end effector 41 a to appear larger as shown on display 30 ″.
- the operator O may choose another desired frame of reference.
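The effect of the operator-selected frame of reference in FIGS. 8a and 8b could be sketched as a planar rotation of the input-25 command; the sign conventions and the yaw parameterisation below are assumptions made for illustration:

```python
import numpy as np

def command_in_camera_frame(stick_xy, device_yaw, frame="input"):
    """Re-express a 2-D input-25 deflection according to the operator-selected
    frame of reference (names and conventions are illustrative).

    stick_xy: deflection relative to the input-device body (x = device-left).
    device_yaw: rotation of the device relative to the display, in radians.
    frame: "input"  -> interpret the deflection in the device's own frame;
           "screen" -> rotate it by the device's yaw so it is screen-relative.
    """
    v = np.asarray(stick_xy, dtype=float)
    if frame == "input":
        return v                       # input-frame preference: pass through
    c, s = np.cos(device_yaw), np.sin(device_yaw)
    R = np.array([[c, -s], [s, c]])    # planar rotation by the device yaw
    return R @ v                       # screen-frame preference
```

Under this convention, with the device rotated a quarter turn relative to the screen, a deflection that is "left" relative to the device body becomes a forward command in the screen frame, matching the FIG. 8b behaviour, while the input-frame preference leaves the command unchanged.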
- the examples above describe the use of the input 25 for providing three-dimensional motion of the image capture device (e.g. endoscope 41 c ).
- the following examples describe the use of the input 25 for providing functional input to the system 10 .
- the system 10 provides for selectively permitting operative association between the input device(s) 21 a, 21 b and any of the instrument end effectors 41 a, 41 b.
- the input device 21 a, 21 b is usable to control a plurality of instrument end effectors 41 a, 41 b.
- the input device 21 a, 21 b is usable to control one instrument end effector 41 a, 41 b when operatively coupled to the one instrument end effector 41 a, 41 b, and is usable to control another instrument end effector 41 a, 41 b when operatively coupled to that other instrument.
- an input device 21 a, 21 b is usable to control the instrument end effector 41 a, 41 b to which it is operatively coupled in an engaged mode.
- the coupling between input device 21 a, 21 b and instrument end effector 41 a, 41 b is changeable.
- One instrument end effector 41 a can be controlled by a plurality of input devices 21 a, 21 b.
- the instrument end effector 41 a is couple-able to one input device 21 a to permit control of the instrument end effector 41 a by the one input device 21 a.
- the instrument end effector 41 a is couple-able to another input device 21 b to permit control of the instrument end effector 41 a by the other input device 21 b.
- more than one instrument end effector 41 a, 41 b can be operatively associated with a single input device 21 .
- the input devices 21 a, 21 b are associable with, or operatively coupleable to, one instrument end effector 41 a, 41 b at a time. This association or coupling is, in one example, effected in software control.
- the operator O selects the desired coupling, for example by manipulating the respective input device 21 a, 21 b to select the respective instrument end effector 41 a , 41 b, such as by using the input 25 to select an option from a menu in an interface.
- once a suitable instrument end effector 41 a, 41 b has been selected, it is operably engaged/coupled with the input device 21 a, 21 b to provide the operator O with control over that chosen instrument end effector. This can be done using the clutch input 22 of the input device 21 .
- the input device 21 a, 21 b itself need not be used to make this engagement.
- Another button or control such as a foot pedal or switch on the operator console 20 , can be used instead of, or as well as, the input device 21 a, 21 b.
- another button or control on the input device 21 a, 21 b other than clutch input 22 could be provided as an engagement input for operably engaging/coupling a suitable instrument end effector 41 a, 41 b with the input device 21 a, 21 b to provide the operator O with control over that chosen instrument end effector. This provides flexibility in the selection of the instrument end effector 41 a, 41 b.
- the instrument end effector 41 a, 41 b that is operatively coupled to the input device 21 a, 21 b can be termed an engaged instrument end effector.
- Instrument end effectors 41 a, 41 b that are not operatively coupled to an input device 21 a, 21 b can be termed disengaged instrument end effectors.
- a disengaged instrument end effector is not being actively controlled by an input device 21 a, 21 b.
- the processor 61 is configured to determine which of the instrument end effectors 41 a, 41 b is an engaged instrument.
- the processor 61 can operate in a number of modes, namely in an engaged mode, in a disengaged mode, in an operational mode, and in a selection mode.
- in the engaged mode, the input device 21 a, 21 b is operatively coupled with an instrument end effector 41 a, 41 b.
- in the disengaged mode, the input device 21 a, 21 b is operatively decoupled from an instrument end effector 41 a, 41 b.
- the same mechanism described above to operatively couple an instrument end effector to an input device 21 a, 21 b can be used to operatively decouple that instrument end effector from that input device 21 a, 21 b.
- the clutch input 22 of the input device 21 a, 21 b can be used to operatively decouple the associated instrument end effector 41 a, 41 b.
- the input device 21 a, 21 b itself need not be used to effect this disengagement.
- Another button or control such as a foot pedal or switch on the operator console 20 , can be used instead of, or as well as, the input device 21 a, 21 b.
- another button or control on the input device 21 a, 21 b other than clutch input 22 could alternatively be provided as a disengagement input for operably disengaging/decoupling the associated instrument end effector 41 a, 41 b from the input device 21 a, 21 b.
- a different instrument end effector can be selected and operatively coupled to the input device 21 a, 21 b to provide the operator O with control over that chosen instrument.
- the processor 61 could further be configured such that upon disengagement of an instrument 40 a, 40 b from an input device 21 , the input 25 of that input device 21 a, 21 b ceases to act as a three-dimensional input and automatically becomes an instrument end effector selection device to allow the operator O to select another instrument end effector for association with the input device 21 a, 21 b.
- disengaging one of the input devices 21 a, 21 b results only in the disengagement of the associated instrument end effector of that input device 21 a, 21 b with the input 25 of the disengaged input device 21 a, 21 b turning into a selection device.
- the processor can concurrently operate in the disengaged and the selection modes.
- disengaging one of the input devices 21 a, 21 b results in disengagement of all the associated instrument end effectors, and the inputs 25 of all the input devices 21 a, 21 b automatically turn into selection devices.
- the clutch input 22 of the input device 21 a, 21 b can be used to cause the processor 61 to enter the operational mode and provide the operator O with control over the endoscope's required degrees of freedom.
- another engagement input of the input device 21 a, 21 b as described above or another button or control, such as a foot pedal or switch on the operator console 20 can be used instead of or as well as the clutch input 22 to cause the processor 61 to enter the operational mode.
- both inputs 25 of the input devices 21 a and 21 b are disengaged from controlling the endoscope's degrees of freedom by the processor 61 and both inputs 25 become instrument selection devices.
- inputs 25 can be adapted to provide other functional input to the system 10 (e.g. the selection of a number of other available modes that may be accessible to the operator via a menu-type interface).
- when the system 10 enters these other functional modalities, the processor 61 is configured to disengage all input devices 21 a, 21 b from their respective associated instrument end effectors 41 a, 41 b. Furthermore, the processor 61 is configured to disengage all inputs 25 from the endoscope 41 c.
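The four processor modes described above could be modelled as a small state machine per input device; the class and method names below are illustrative, not part of the disclosure:

```python
class InputDeviceModes:
    """Sketch of the processor-mode bookkeeping for one input device:
    engaged/disengaged governs the device-to-end-effector coupling, while
    operational/selection governs what the thumbstick (input 25) does.
    """

    def __init__(self):
        self.engaged = False       # device coupled to an instrument end effector?
        self.operational = False   # thumbstick driving the endoscope?
        self.coupled_effector = None

    def clutch_engage(self, effector):
        """Couple the device to the selected end effector (engaged mode)."""
        self.engaged = True
        self.coupled_effector = effector

    def clutch_disengage(self):
        """Decouple the device; the thumbstick automatically becomes a
        selection device (disengaged and selection modes concurrently)."""
        self.engaged = False
        self.coupled_effector = None
        self.operational = False   # selection mode: thumbstick picks instruments

    def enter_operational(self):
        """Give the thumbstick control of the endoscope's degrees of freedom."""
        self.operational = True

    def thumbstick_role(self):
        return "endoscope control" if self.operational else "instrument selection"
```

Note how disengaging automatically drops the thumbstick back to instrument selection, which is one way the concurrent disengaged/selection behaviour described above could be realised.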
- the robotic system may be an industrial system or another type of system.
- the robot could be an industrial robot or a robot for another function.
- the instruments could be an industrial or other form of tool.
- the body of the instrument could comprise a coupling for releasable attachment to a robot and an elongate shaft running between the coupling and the end effector.
- control processes described herein could be performed by one or more physical processing units executing software that causes the unit(s) to perform the control processes.
- the or each physical processing unit could be any suitable processor, such as a CPU or fixed function or programmable hardware.
Description
- This application is a continuation of U.S. patent application Ser. No. 16/281,194 filed on Feb. 21, 2019, which claims the benefit under 35 U.S.C. § 119 of United Kingdom Patent Application No. 1802992.6 filed on Feb. 23, 2018. Each application referenced above is hereby incorporated by reference in its entirety for all purposes.
- Typically, in robotic master-slave systems, an operator controls the movements of a number of teleoperated tools using one or more input devices. Consequently, the operator can control the tools without needing to be in the same location as the worksite in which the tools are being manipulated. Image capture devices, such as cameras, and a display for displaying a view of the captured images may be included to provide the operator with a view of the worksite. Some applications of robotic master-slave systems include assembly, maintenance or exploration in hazardous environments (e.g. underwater, nuclear or chemical plant), and minimally invasive surgical systems.
- In surgical systems in particular, the operator(s) who may be a surgeon, an assistant, a student or a combination thereof can remotely manipulate a number of remote tools (such as surgical instruments) supported by a number of robotic manipulators by manipulating a number of input devices at an operator's console (e.g. a surgeon's console). As well as surgical instruments, the robotic manipulators may support image capture devices, such as endoscopes, to provide the operator with a view of the remote surgical site on a display at the surgeon's console.
- During surgery, the surgeon manipulates an input device to control the movement of the surgical instrument at the surgical site. Typically, an endoscope used in the procedure is controlled by a camera assistant who manipulates the endoscope in response to verbal instructions received from the surgeon to provide the surgeon with a desired view of the surgical site. Alternatively, the surgeon may use the same input device being used to manipulate the surgical instrument to control the endoscope by entering an endoscope mode in which operation of the surgical instrument is halted and the instrument is locked in place; the input device is then disassociated from the instrument and is associated with the endoscope to control the movement of the endoscope. Once the endoscope has been manipulated and the desired viewpoint is obtained, the endoscope mode described above is exited, and the input device is disassociated from the endoscope and re-associated with the instrument so that the surgeon can resume operation of the instrument.
- However, such methods of controlling the endoscope are inefficient at best. Specifically, the method in which a camera assistant controls the endoscope can lead to delays and errors (e.g. it is possible for the surgeon to become disorientated by the movement of the endoscope frame of reference with respect to the surgeon's position). In the method in which the surgeon uses the input device to control the endoscope, the surgeon needs to stop the surgical procedure before the surgeon can manipulate the endoscope to change his/her viewpoint of the surgical site. In contrast, during normal open surgery the surgeon can move their head in response to a need or stimulus to change their viewpoint of the surgical site while carrying on with the procedure at the same time.
- There is, therefore, a need for an improved master-slave system which would allow a surgeon to more intuitively and comfortably change their view of the surgical site while at the same time being able to carry on with the procedure at hand.
- Accordingly, the present invention provides a master-slave system comprising: a first manipulator supporting a first end effector; a second manipulator supporting a second end effector; an input device configured to concurrently receive from a hand of an operator a first movement command to effect a desired movement of the first end effector and a second movement command to effect a desired movement of the second end effector; and a processor configured to determine a desired movement of the first and the second end effectors in response to the first and second movement commands received from the input device.
- The input device may further comprise a body for being held by an operator. The body may be configured to be gripped by a single hand of the operator.
- The input device may further comprise a first input for receiving the first movement command to effect a desired movement of the first end effector.
- The processor may be configured to operate in a plurality of modes including: an engaged mode, wherein the processor operatively couples the input device with the second end effector; a disengaged mode, wherein the processor operatively decouples the input device from the second end effector; an operational mode, wherein the processor operatively couples the first input with the first end effector; and a selection mode, wherein the processor operatively decouples the first input from the first end effector.
- The processor may further be configured to operate in the disengaged mode in response to a signal received from a clutch input. The processor may further be configured to operate in the operational mode in response to receiving a signal from a second input. The processor may further be configured to concurrently operate in the disengaged mode and the selection mode when the processor enters the disengaged mode.
- The input device may further be configured to receive the second movement command to effect a desired movement of the second end effector when the processor is operating in the engaged mode.
- The first input may be configured to receive the first movement command to effect a desired movement of the first end effector when the processor is operating in the operational mode. The first input may further be configured to select a third end effector for association with the input device when the processor is operating in the selection mode. The input device may further be configured to receive a third movement command to effect a desired movement of the third end effector in response to selection of the third end effector by the first input.
- The input device may further comprise a drive mechanism coupled to the first input, the drive mechanism being configured to apply a force to the first input for providing force feedback to the operator. The input device may further comprise a force sensor configured to sense a force applied to the first input for determining that a command is being received at the first input from the operator. The first input device may further comprise a capacitive sensor configured to sense when the operator is touching the first input for determining that a command is being received at the first input from the operator. The input device may further comprise a position sensor configured to sense the position of the first input relative to the body for determining that a command is being received at the first input from the operator.
- The processor may further be configured to effect movement of the first end effector in accordance with the determined desired movement of the first end effector. The processor may further be configured to effect movement of the second end effector in accordance with the determined desired movement of the second end effector.
- The first end effector may have multiple degrees of freedom and the processor may be configured to selectively associate at least one of the degrees of freedom of the first end effector with the first input and to effect movement of the at least one of the degrees of freedom of the first end effector in response to movement of the first input.
- The first end effector may comprise an image capture device for capturing an image of a worksite. The image capture device may be an endoscope. The master-slave system may further comprise a display for displaying the captured image.
- The processor may further be configured to effect movement of the first end effector according to movement of the first input with respect to an operator selected frame of reference. The operator selected frame of reference may be fixed to the input device. The operator selected frame of reference may be fixed to the display. The processor may further be configured to continuously update a mapping between an input device frame of reference and a second end effector frame of reference and to continuously update a mapping between a first input frame of reference and a first end effector frame of reference such that movements of the first and second end effectors as displayed on the display correspond to the movements of the first input and the input device respectively.
- The present invention will now be described by way of example with reference to the accompanying drawings in which:
- FIG. 1 shows a master-slave system;
- FIG. 2 shows a patient undergoing tele-surgery using a plurality of robotic manipulators each supporting a tool;
- FIG. 3 shows two examples of the robotic manipulators of FIG. 2 each supporting a different tool;
- FIG. 4 shows an operator console including a display and an input device;
- FIG. 5a shows a top view of an input device including an input and a clutch input;
- FIG. 5b shows a perspective view of the input device of FIG. 5a;
- FIG. 6 shows a schematic representation of the system of FIG. 1;
- FIG. 7 shows an operator using the operator console of FIG. 4;
- FIG. 8a shows a view of the input of FIG. 5a with its operator selected frame of reference fixed to the input device of FIG. 4; and
- FIG. 8b shows an alternative view of the input of FIG. 5a with its operator selected frame of reference fixed to the display of FIG. 4.
- FIG. 1 shows a master-slave system 10 in an exemplary configuration as a surgical system. The system 10 includes a patient table 50, a cart 70, and an operator console 20. The operator console 20 allows an operator O (shown in FIG. 7) to carry out a surgical procedure on patient P by remotely manipulating one or more robotic manipulator(s) 80 a, 80 b, 80 c (shown in FIG. 2), each of which supports a tool 40 a, 40 b, 40 c, using one or more input device(s) 21 a, 21 b. The operator O may, for example, be a surgeon, an assistant, or a student.
- Referring now to FIG. 2, the system 10 has one or more robotic manipulators 80 a, 80 b, 80 c for supporting a variety of tools 40 a, 40 b, 40 c. The robotic manipulator(s) 80 a, 80 b, 80 c may be as shown in FIGS. 2, 3 and 6. The robotic manipulators 80 a, 80 b, 80 c of FIGS. 2, 3 and 6 each have an arm 81 which extends from a base 90 (shown in FIG. 6). The arm 81 is articulated by a series of revolute joints 82, 83, 84 along its length. At the distal end of the arm 81 is a tool 40 a, 40 b, 40 c which terminates in an end effector 41 a, 41 b, 41 c. The end effector 41 a, 41 b may be a surgical instrument such as, but not limited to, smooth jaws, serrated jaws, a gripper, a pair of shears, a needle for suturing, a laser, a knife, a stapler, a cauteriser, a suctioner, pliers, a scalpel, a cautery electrode, or the like. The end effector 41 c may alternatively be an image capture device, such as an endoscope. The end effector 41 a, 41 b, 41 c can be driven to move by a drive motor 86 at the distal end of the arm 81. The drive motor 86 may be coupled to the end effector 41 a, 41 b, 41 c by cables extending along the interior of the instrument's shaft.
- Whilst FIG. 3 shows a robotic manipulator 80 c supporting an endoscope 41 c and a robotic manipulator 80 a supporting an instrument end effector 41 a, it will be appreciated that each of the robotic manipulators 80 a, 80 b, 80 c can support any of the end effectors.
- Referring now to FIG. 4, the operator console 20 includes one or more input devices 21 a, 21 b coupled to a linkage assembly 26 for controlling the robotic manipulator(s) 80 a, 80 b, 80 c. The operator console 20 also includes a display 30 for providing the operator O with a view of the remote surgical worksite. The input device(s) 21 a, 21 b of FIGS. 4, 5a and 5b are designed to be held in a user's hand for providing three-dimensional motion input to the system 10. In some cases the input device(s) 21 a, 21 b may also allow the user to provide functional input to the system 10.
- Whilst the above description refers to a single display device, in other examples the robotic system may comprise a plurality of display devices, or screens. The screens are suitably configured to display the view of the remote worksite as a two-dimensional image and/or as a three-dimensional image. The screens can be provided on a
single operator console 20, or two or more consoles can comprise at least one screen each. This permits additional viewing screens which can be useful for allowing people other than the console operator to view the surgical worksite, for example for training. - For ease of explanation an
example input device 21 b will be described with reference to mutually orthogonal X, Y and Z axes indicated in FIGS. 5a and 5b. The X axis is orthogonal to the forward/rearward axis of the input device 21 b. The Y axis is orthogonal to the left/right axis of the input device 21 b. The Z axis is parallel to the up/down axis of the input device 21 b and orthogonal to the X and the Y axes. In other words, the Z axis of the input device 21 b is a line going through the body of the input device 21 b from its top to its bottom surface.
- The example input device 21 b of FIGS. 5a and 5b is intended to be grasped in the right hand. A mirror-image input device 21 a could be intended for the left hand. The body 27 of the input device 21 b has a head 23 and a grip 24. The grip 24 is configured to sit in the palm of the user's hand. This allows the user to place their first/index finger on a clutch input 22. The user's thumb can be placed on the opposite side of the head 23 to the clutch input 22, or alternatively the user's thumb can be placed on an input 25 on the head 23. User acceptance testing of devices of this type has established that many users naturally pick up and hold such a device in this manner, with their wrist in a neutral position and their thumb opposing their finger. This has the result that for three-dimensional motion input the user's wrist is free to move in flexion, extension, adduction and abduction.
- In the input device 21 b of FIGS. 5a and 5b, the clutch input 22 is shown as being a trigger lever that can rotate relative to the head 23 about an axis (not shown). The clutch input 22 could variably be termed the pince or pinch input. For a more detailed description of such a clutch input, refer to the Applicant's GB Patent Application No. GB1616086.3, entitled "Hand Controller Grip". Furthermore, in the input device 21 b of FIGS. 5a and 5b, the input 25 is shown as being a thumbstick that pivots on a base and reports its position in relation to a default centre position to a control unit 60. This position signal can then be used as a further motion input and, optionally, as a functional input in the system 10. The input 25 of FIGS. 5a and 5b is designed to be manipulated with the user's thumb and typically provides a two-dimensional input.
- Input 25 could be articulated in several ways. For example, the input 25 could pivot in a forward/rearward direction A about the X axis of input device 21 b. The input 25 could also pivot in a left/right direction B about the Y axis of input device 21 b. The input 25 could also behave as a push-button to provide an up/down movement of the input 25 along the Z axis of input device 21 b.
- The input 25 is positioned on the input device 21 a, 21 b such that it can be used in a comfortable and natural way by the user but not inadvertently actuated. The input 25 may be in an inoperable state by default, in which state any input received by the system from input 25 is ignored. The input 25 may only become operable once it has been detected that it is being engaged by a user. To do this a touch-sensitive panel (not shown) could be provided on the surface of input 25 which could be engaged by the user's thumb. The touch-sensitive panel could include a capacitive sensor (not shown) to detect the presence of the user's thumb on the input 25 when the user's thumb is touching the input 25. Alternatively or additionally, haptic feedback could be provided for input 25 to detect any shaking or vibrating resulting from the user's thumb touching the input 25, in order to detect the presence of the user's thumb on the input 25.
- Preferably, the input 25 is in an operable state by default. In this scenario, to stop any inadvertent actuation of the input 25 by the user, a neutral zone could be defined within the articulation range of the input 25 which will not produce any output. For example, the input 25 may need to be pivoted in a forward/rearward direction about the X axis of input device 21 a, 21 b beyond a certain threshold value before an output signal is actioned and any resulting movement of an associated slave is generated in the system. Visual cues may be provided on the display 30 and/or on the input device 21 a, 21 b to indicate the band of input values to the input 25 which will not produce any output to the system.
- A position sensor may be provided within the head 23 to sense the position of the input 25 relative to the body 27. The position sensor may be able to discriminate a range of positions of the input 25, allowing the extent to which the input 25 is pivoted or is pressed to be identified more precisely. The position sensor may be a rotational encoder or a potentiometer arranged about the rotation axis of the input 25. In addition to, or instead of, a position sensor, the input device 21 b may comprise a force and/or torque sensor for detecting the force applied to the input 25 and/or the torque applied to the input 25 about its pivot axis.
- Additional user input interface may be provided on the
head 23. For example, there may be one or more push-buttons, rotational knobs, joysticks, rocker switches or the like. - Referring now to
FIG. 6 , an examplerobotic manipulator 80 a is governed by acontrol unit 60. Thecontrol unit 60 receives inputs from an 21 a, 21 b and from other sources such as theinput device linkage assembly 26 and the robotic manipulator position/force sensors 85. - The
linkage assembly 26 is an articulated linkage which supports the input device 21a, 21b and permits it to be moved with six degrees of freedom. The configuration of the linkage assembly 26 can be detected by sensors on the linkage and passed to the control unit 60. In that way movement of the input device 21a, 21b can be used to control the movement of the robotic manipulator 80a. Instead of the linkage assembly 26, the input device 21a, 21b may be equipped with accelerometers which permit its position and orientation to be estimated. - The
control unit 60 comprises a processor 61 which executes code stored in a non-transient form in a memory 62. On executing the code, the processor 61 determines a set of signals for commanding movement of the joints 82, 83, and 84 of the robotic manipulator 80a, and for moving the end effector 41a of the instrument 40a in dependence on the inputs from the input device 21a, 21b, the linkage assembly 26, and the manipulator position/force sensors 85. The code is configured so that the motion of the robotic manipulator 80a is essentially dictated by the inputs from the input device 21a, 21b and the linkage assembly 26. For example, in a normal operating mode (i) the attitude of the end effector 41a may be set by the attitude of the input device 21a, 21b about its rotational degrees of freedom; (ii) the position of the end effector 41a may be set by the position of the input device 21a, 21b about its translational degrees of freedom; and (iii) the configuration of the jaws of the end effector 41a may be set by the position of the clutch input 22 or input 25 relative to the body 27. For example, the clutch input 22 can be used to operate the jaws of a surgical instrument end effector. - Furthermore, when the
robotic manipulator 80c is supporting an image capture device such as an endoscope 41c, on executing the code, the processor 61 determines a set of signals for commanding movement of the joints 82, 83, and 84 of the robotic manipulator 80c, and for moving the endoscope 41c in dependence on the inputs from the input 25 and the position/force sensors of the input device 21a, 21b. For example, in a normal operating mode (i) the up/down position of the distal end of the endoscope 41c may be set by the position of the input 25 of a first input device 21a about its default position in a forward/rearward direction A; (ii) the left/right position of the distal end of the endoscope 41c may be set by the position of the input 25 of the first input device 21a about its default position in a left/right direction B; (iii) the zoom level of the endoscope 41c may be set by the position of the input 25 of a second input device 21b about its default position in a forward/rearward direction A; and (iv) the rotation of the distal end of the endoscope 41c may be set by the position of the input 25 of the second input device 21b about its default position in a left/right direction B. It is to be noted that the input devices 21a and 21b could be mapped conversely to that described above, so that the first input device 21a controls the rotation and zoom level of the endoscope 41c and the second input device 21b controls the up/down and left/right movement of the distal end of the endoscope 41c. Alternatively, a single input device 21a or 21b is used to control all of the above-described degrees of freedom of the endoscope 41c.
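The two-device mapping of items (i) to (iv) above can be sketched in code. The following is a minimal illustrative reconstruction, not the patent's implementation; the function name, the normalization of the pivot displacements to [-1, 1], and the scale factor are all assumptions:

```python
# Hypothetical sketch of the two-device position mapping: each input 25 is
# read as a pair of normalized pivot displacements (forward/rearward
# direction A, left/right direction B), and the two pairs drive the four
# endoscope degrees of freedom. Names and scaling are illustrative.

def map_inputs_to_endoscope_pose(first_input, second_input, scale=1.0):
    """first_input/second_input: (forward_rearward, left_right) displacements.

    Returns endoscope DOF set-points: up/down and left/right from the
    first input device 21a, zoom and rotation from the second device 21b.
    """
    a1, b1 = first_input   # input 25 of the first input device 21a
    a2, b2 = second_input  # input 25 of the second input device 21b
    return {
        "up_down": scale * a1,     # (i) forward/rearward of device 21a
        "left_right": scale * b1,  # (ii) left/right of device 21a
        "zoom": scale * a2,        # (iii) forward/rearward of device 21b
        "rotation": scale * b2,    # (iv) left/right of device 21b
    }
```

The converse mapping mentioned in the text would simply swap which device's pair feeds which DOFs.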
As before, (i) the up/down position of the distal end of the endoscope 41c may be set by the position of the input 25 of a first input device 21a about its default position in a forward/rearward direction A; (ii) the left/right position of the distal end of the endoscope 41c may be set by the position of the input 25 of the first input device 21a about its default position in a left/right direction B. At this point the operator O may press the input 25 to switch from controlling the up/down and left/right directions of motion of the distal end of the endoscope 41c to controlling the rotation and zoom level of the endoscope 41c. Once the switch has been made, (iii) the zoom level of the endoscope 41c may be set by the position of the input 25 of the first input device 21a about its default position in a forward/rearward direction A; and (iv) the rotation of the endoscope 41c may be set by the position of the input 25 of the first input device 21a about its default position in a left/right direction B. Alternatively, the operator O may choose the mapping between the forward/rearward and left/right movements of the input 25 and the up/down, left/right, rotation and zoom level of the endoscope 41c according to their own preference. In other words, the operator O can select which degrees of freedom of the input 25 to map onto which degrees of freedom of the endoscope 41c. This will be described in more detail below. Thus, the operator O of system 10 is able to use a single input device 21a, 21b to concurrently control the instrument end effector 41a, 41b and the endoscope 41c without the need to halt either the manipulation of the instrument end effector 41a, 41b or the endoscope 41c. - When the
robotic manipulator 80c is supporting an image capture device such as an endoscope 41c, on executing the code, the processor 61 may determine the set of signals for moving the endoscope 41c in dependence on the inputs from the input 25 and the position/force sensors of the input device 21a, 21b using velocity control, such that the speed with which the endoscope 41c is moved in a desired direction may also be controlled. For example, the input 25 may be configured such that it is biased to return to a neutral position. In a normal operating mode (i) the speed with which the distal end of the endoscope 41c is moved in an up/down direction may be set by the position of the input 25 of a first input device 21a about its default neutral position in the forward/rearward direction A; (ii) the speed with which the distal end of the endoscope 41c is moved in a left/right direction may be set by the position of the input 25 of the first input device 21a about its default neutral position in a left/right direction B; (iii) the speed with which the zoom level of the endoscope 41c is changed may be set by the position of the input 25 of a second input device 21b about its default neutral position in a forward/rearward direction A; and (iv) the speed with which the rotation of the distal end of the endoscope 41c is effected may be set by the position of the input 25 of the second input device 21b about its default neutral position in a left/right direction B. Put another way, when the operator O wishes to move the endoscope 41c in a desired way, they push the input 25 of the relevant input device 21a, 21b in one of the appropriate directions described above. How far they push the input 25 in the appropriate direction away from the neutral position determines how fast the endoscope 41c moves in the desired way. Once the operator O releases the input 25, the endoscope 41c stops moving and the input 25 returns to its neutral position.
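The velocity-control behaviour described above, combined with the neutral zone discussed earlier, can be sketched as follows. This is an illustrative sketch under assumed values: the dead-band width, the maximum speed, and the normalization of the input 25 displacement to [-1, 1] are not specified in the text:

```python
# Hypothetical sketch of velocity control with a neutral zone: displacement
# of the input 25 from its neutral position sets the *speed* of the
# corresponding endoscope motion, and a dead band around the default
# position suppresses inadvertent actuation. Values are assumptions.

def displacement_to_velocity(displacement, dead_band=0.1, max_speed=5.0):
    """Map a normalized displacement in [-1, 1] to a signed speed.

    Within the dead band no output is produced; beyond it, speed grows
    linearly with how far the operator pushes the input 25.
    """
    if abs(displacement) <= dead_band:
        return 0.0  # neutral zone: no movement commanded
    # Ramp from 0 at the dead-band edge to max_speed at full deflection.
    sign = 1.0 if displacement > 0 else -1.0
    magnitude = (abs(displacement) - dead_band) / (1.0 - dead_band)
    return sign * max_speed * magnitude
```

Releasing the input corresponds to a zero displacement, which yields a zero speed and so stops the endoscope.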
It is to be noted that the input devices 21a and 21b could be mapped conversely to that described above, so that the first input device 21a controls the rotation and zoom level of the endoscope 41c and the second input device 21b controls the up/down and left/right movement of the distal end of the endoscope 41c. - Alternatively, a
single input device 21a or 21b is used to control all of the above-described degrees of freedom of the endoscope 41c using velocity control. As before, the input 25 is configured such that it is biased to return to a neutral position: (i) the speed with which the distal end of the endoscope 41c is moved in the up/down direction may be set by the position of the input 25 of a first input device 21a about its default neutral position in a forward/rearward direction A; (ii) the speed with which the distal end of the endoscope 41c is moved in the left/right direction may be set by the position of the input 25 of the first input device 21a about its default neutral position in a left/right direction B. At this point the operator O may press the input 25 to switch from controlling the up/down and left/right directions of motion of the distal end of the endoscope 41c to controlling the rotation and zoom level of the endoscope 41c. Once the switch has been made, (iii) the speed with which the zoom level of the endoscope 41c is changed may be set by the position of the input 25 of the first input device 21a about its default neutral position in a forward/rearward direction A; and (iv) the speed with which the rotation of the endoscope 41c is effected may be set by the position of the input 25 of the first input device 21a about its default neutral position in a left/right direction B. Alternatively, the operator O may choose the mapping between the forward/rearward and left/right movements of the input 25 and the up/down, left/right, rotation and zoom level of the endoscope 41c according to their own preference. In other words, the operator O can select which degrees of freedom of the input 25 to map onto which degrees of freedom of the endoscope 41c. This will be described in more detail below.
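The press-to-switch behaviour of the single-device scheme above can be sketched as a small state machine; the class name, mode names, and return format are illustrative assumptions, not part of the patent:

```python
# Hypothetical sketch of the single-device scheme: pressing the input 25
# toggles which pair of endoscope degrees of freedom the two pivot
# directions (A and B) currently drive. Names are illustrative.

class SingleDeviceEndoscopeControl:
    def __init__(self):
        # Start by driving up/down (direction A) and left/right (direction B).
        self.mode = "pan"

    def press(self):
        """Operator presses the input 25: switch the controlled DOF pair."""
        self.mode = "zoom_roll" if self.mode == "pan" else "pan"

    def command(self, forward_rearward, left_right):
        """Map the two pivot displacements to the currently selected DOFs."""
        if self.mode == "pan":
            return {"up_down": forward_rearward, "left_right": left_right}
        return {"zoom": forward_rearward, "rotation": left_right}
```

An operator-chosen mapping, as the text allows, would amount to making the dictionary keys in `command` configurable rather than fixed.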
Thus, the operator O of system 10 is able to use a single input device 21a, 21b to concurrently control the instrument end effector 41a, 41b and the endoscope 41c without the need to halt either the manipulation of the instrument end effector 41a, 41b or the endoscope 41c. - In this way, the operator O of
system 10 can reposition the endoscope 41c to capture a different view of the worksite during a procedure without needing to stop the procedure. The endoscope 41c can thus be regarded as another slave end effector. - Referring to
FIG. 7, the operator O carrying out a procedure using the operator console 20 uses the input device(s) 21a, 21b (the masters) to control the movements of the one or more remote tools 40a, 40b, 40c (the slaves). The operator O can view the worksite via the display 30. Instrument end effectors 41a and 41b supported on robotic manipulators 80a and 80b are manipulated to effect positional and orientational movements in response to movement and functional inputs on their respective associated input device(s) 21a, 21b. During the procedure, images of the instrument end effectors 41a, 41b together with the surgical site are captured by the endoscope 41c supported on robotic manipulator 80c and are displayed on the display 30 so that the operator O can see the responsive movements and actions of the instrument end effectors 41a, 41b as they control such movements and actions using the input device(s) 21a, 21b. - Advantageously, the
control unit 60 establishes a relationship between an input device 21a, 21b and the instrument end effector 41a, 41b associated with it as viewed in the captured image being displayed on display 30, in which relationship the orientation and position of the instrument end effector 41a or 41b as displayed to the operator O in the image follows and corresponds to the orientation and position of the associated input device 21a or 21b as manipulated by the operator's hand. By mapping the orientation and position of the associated instrument end effector 41a or 41b onto the input device's orientation and position, controlling the movement of the instrument end effector 41a or 41b can be performed in a more intuitive manner than in the case where the movement of the instrument end effector 41a or 41b as displayed in the image were not mapped onto the movement of the operator's hand. - Additionally, the
control unit 60 establishes a relationship between the input 25 and the endoscope 41c associated with it, in which relationship the orientation and position of the endoscope 41c follows and corresponds to the orientation and position of the associated input 25 as manipulated by the operator's hand. By thus mapping the orientation and position of the associated endoscope 41c onto the input's orientation and position, controlling the movement of the endoscope 41c can be performed in a more intuitive manner than in the case where the movement of the endoscope 41c were not mapped onto the movement of the input 25. - The
control unit 60 effects control between an input device 21a, 21b and an instrument end effector 41a, 41b by mapping the input device's position and orientation in an input device Cartesian coordinate reference system (input frame) to the position and orientation of the instrument end effector 41a or 41b in a camera Cartesian coordinate reference system (camera frame). Accordingly, during operation, the instrument end effector's position and orientation within the camera frame is mapped to the position and orientation of the input device 21a, 21b in the input frame, and the instrument end effector 41a or 41b is effected to move to a new position/orientation in the camera frame which corresponds to the current position and orientation of the input device 21a, 21b in the input frame. - In some cases, the
control unit 60 effects control between the input 25 and the endoscope 41c by mapping the input's position and orientation with respect to the input device 21a, 21b in an input Cartesian coordinate reference system (input frame) to the position and orientation of the endoscope 41c in the camera frame. Accordingly, during operation the endoscope 41c is effected to move to a new position/orientation in the camera frame which corresponds to the current position and orientation of the input 25 in the input frame. - Alternatively, the
control unit 60 effects control between the input 25 and the endoscope 41c using velocity control by mapping the input's position about its default neutral position with respect to the input device 21a, 21b in the input frame to the direction and speed of motion of the endoscope 41c in the camera frame. Accordingly, during operation, the endoscope 41c is effected to move in a new direction and at a desired speed in the camera frame which corresponds to the position of the input 25 about its default neutral position in the input frame. - Advantageously, the user can disengage the
input device 21a, 21b and move it to a more ergonomic position and/or orientation. Upon re-engagement of the input device 21a, 21b, the mapping between the input frame and the camera frame is updated. Thus, the master frame of reference can be changed and can also vary between the input device 21a and the input device 21b. Additionally, this update of the mapping between the master (input device 21a, 21b) and the slave (end effector 41a, 41b, 41c) results in the movement commands given via the input device 21a, 21b producing the expected motion of the slave as displayed to the operator O on display 30. - It will be appreciated that, should the image capture device's position change, the orientation and position of the
instrument end effector 41a or 41b in the viewed image could also change. Consequently, and advantageously, the relationship in which the movement of the instrument end effector 41a, 41b is mapped onto the movement of the input device 21a, 21b is re-established after such a positional change by the image capture device. This enhances the operator's intuitive "feel" when performing a procedure, as this continuous update of the mapping between the master and the slave results in the movement commands given via the input device 21a, 21b producing the expected motion of the slave as displayed to the operator O. - Additionally, to provide the operator O with an improved and more intuitive response between the movement of the
input 25 and the movement of the distal end of the endoscope 41c, the control unit 60 determines and effects a desired position and orientation of the distal end of the endoscope 41c based on the position and orientation of the input 25 by means of the input frame relative to a frame of reference of the operator's preference. This concept is illustrated in FIGS. 8a and 8b and is described below. -
FIG. 8a illustrates an example in which the operator O is holding the input device 21a such that it is pointed squarely at the screen of display 30. The operator O can choose to set their preferred frame of reference to be the same as that of the input device 21a (i.e. the input frame). In this example, before the input 25 is manipulated, the operator O can see the instrument end effector 41a as shown on display 30. However, once the input 25 is moved left with respect to the body of the input device 21a, in the direction indicated by letter L in FIG. 8a, the distal end of the endoscope 41c is also moved leftwards, causing the displayed image of the instrument end effector 41a to shift to the right side of the captured image as shown on display 30′. -
FIG. 8b illustrates another example in which the input device 21a is being held by the operator O such that it is pointed 90 degrees to the right of the screen of display 30. If the operator O chooses to set their preferred frame of reference to be the same as the input frame once more, manipulating the input 25 to move left with respect to the body of the input device 21a, in the direction indicated by letter L in FIG. 8b, would result in a leftward movement of the distal end of the endoscope 41c, causing the displayed image of the instrument end effector 41a to shift to the right side of the captured image as shown on display 30′. - However, the operator O could choose to set their preferred frame of reference to be with respect to the screen. In this example, moving the
input 25 in the direction indicated by letter L in FIG. 8b would result in a forward movement of the distal end of the endoscope 41c, causing the displayed image of the instrument end effector 41a to appear larger, as shown on display 30″. Alternatively, the operator O may choose another desired frame of reference. - The examples above describe the use of the
input 25 for providing three-dimensional motion of the image capture device (e.g. endoscope 41c). The following examples describe the use of the input 25 for providing functional input to the system 10. - The
system 10 provides for selectively permitting operative association between the input device(s) 21a, 21b and any of the instrument end effectors 41a, 41b. In many surgical procedures, it is desirable for the operator to be able to select from among two or more instrument end effectors 41a, 41b. For example, the input device 21a, 21b is usable to control a plurality of instrument end effectors 41a, 41b. The input device 21a, 21b is usable to control one instrument end effector 41a, 41b when operatively coupled to that one instrument end effector 41a, 41b, and is usable to control another instrument end effector 41a, 41b when operatively coupled to that other instrument. In other words, an input device 21a, 21b is usable to control the instrument 40a, 40b to which it is operatively coupled in an engaged mode. The coupling between input device 21a, 21b and instrument end effector 41a, 41b is changeable. One instrument end effector 41a can be controlled by a plurality of input devices 21a, 21b. The instrument end effector 41a is couple-able to one input device 21a to permit control of the instrument end effector 41a by the one input device 21a. Once decoupled from the one input device 21a, the instrument end effector 41a is couple-able to another input device 21b to permit control of the instrument end effector 41a by the other input device 21b. Thus, more than one instrument end effector 41a, 41b can be operatively associated with a single input device 21. - The
input devices 21a, 21b are associable with, or operatively coupleable to, one instrument end effector 41a, 41b at a time. This association or coupling is, in one example, effected in software control. The operator O selects the desired coupling, for example by manipulating the respective input device 21a, 21b to select the respective instrument end effector 41a, 41b, such as by using the input 25 to select an option from a menu in an interface. Once a suitable instrument end effector 41a, 41b has been selected, it is operably engaged/coupled with the input device 21a, 21b to provide the operator O with control over that chosen instrument end effector. This can be done using the clutch input 22 of the input device 21. However, the input device 21a, 21b itself need not be used to make this engagement. Another button or control, such as a foot pedal or switch on the operator console 20, can be used instead of, or as well as, the input device 21a, 21b. Alternatively, another button or control on the input device 21a, 21b other than clutch input 22 could be provided as an engagement input for operably engaging/coupling a suitable instrument end effector 41a, 41b with the input device 21a, 21b to provide the operator O with control over that chosen instrument end effector. This provides flexibility in the selection of the instrument end effector 41a, 41b. The instrument end effector 41a, 41b that is operatively coupled to the input device 21a, 21b can be termed an engaged instrument end effector. Instrument end effectors 41a, 41b that are not operatively coupled to an input device 21a, 21b can be termed disengaged instrument end effectors. A disengaged instrument end effector is not being actively controlled by an input device 21a, 21b. Suitably the processor 61 is configured to determine which of the instrument end effectors 41a, 41b is an engaged instrument end effector.
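The engage/disengage bookkeeping described above (at most one engaged instrument end effector per input device at a time, with the coupling changeable so an end effector can be handed over between devices) can be sketched as follows; the class name, identifiers, and error handling are illustrative assumptions:

```python
# Hypothetical sketch of coupling management between input devices
# (masters) and instrument end effectors (slaves). Names are illustrative.

class CouplingManager:
    def __init__(self):
        self.coupling = {}  # input device id -> engaged end effector id

    def engage(self, device, effector):
        """Operatively couple one end effector to one input device."""
        if device in self.coupling:
            raise ValueError(f"{device} is already coupled to {self.coupling[device]}")
        if effector in self.coupling.values():
            raise ValueError(f"{effector} is already engaged by another device")
        self.coupling[device] = effector

    def disengage(self, device):
        """Decouple the device; its end effector becomes disengaged."""
        self.coupling.pop(device, None)

    def engaged_effector(self, device):
        """Return the engaged end effector for a device, or None if disengaged."""
        return self.coupling.get(device)
```

Handing an end effector from one device to another is then a disengage on the first device followed by an engage on the second.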
- Thus, the processor 61 can operate in a number of modes, namely in an engaged mode, in a disengaged mode, in an operational mode, and in a selection mode. In the engaged mode, the
input device 21a, 21b is operatively coupled with an instrument end effector 41a, 41b. In the disengaged mode, the input device 21a, 21b is operatively decoupled from an instrument end effector 41a, 41b. The same mechanism described above to operatively couple an instrument end effector to an input device 21a, 21b can be used to operatively decouple that instrument end effector from that input device 21a, 21b. For example, the clutch input 22 of the input device 21a, 21b can be used to operatively decouple the associated instrument end effector 41a, 41b. Again, the input device 21a, 21b itself need not be used to effect this disengagement. Another button or control, such as a foot pedal or switch on the operator console 20, can be used instead of, or as well as, the input device 21a, 21b. Again, another button or control on the input device 21a, 21b other than clutch input 22 could alternatively be provided as a disengagement input for operably disengaging/decoupling the associated instrument end effector 41a, 41b from the input device 21a, 21b. Once the previous instrument end effector has been operatively decoupled from the input device 21a, 21b, a different instrument end effector can be selected and operatively coupled to the input device 21a, 21b to provide the operator O with control over that chosen instrument. - In the selection mode, the processor 61 could further be configured such that upon disengagement of an
instrument 40a, 40b from an input device 21, the input 25 of that input device 21a, 21b ceases to act as a three-dimensional input and automatically becomes an instrument end effector selection device to allow the operator O to select another instrument end effector for association with the input device 21a, 21b. In one example, when more than one input device 21a, 21b is being used, disengaging one of the input devices 21a, 21b results only in the disengagement of the associated instrument end effector of that input device 21a, 21b, with the input 25 of the disengaged input device 21a, 21b turning into a selection device. Thus, the processor can concurrently operate in the disengaged and the selection modes. In another example, when more than one input device 21a, 21b is being used, disengaging one of the input devices 21a, 21b results in disengagement of all the associated instrument end effectors, and the inputs 25 of all the input devices 21a, 21b automatically turn into selection devices. - As described earlier, the
input 25 of one or more input devices 21a, 21b could also be used to control the various degrees of freedom (up/down, left/right, zoom in/out and roll) of the image capture device (e.g. endoscope 41c). This may occur in the operational mode, where, for example, the input 25 of a first input device 21a is operatively coupled to the endoscope 41c to control its up/down and left/right movements and the input 25 of a second input device 21b can be operatively coupled to the endoscope 41c to control the zoom in/out and roll of the endoscope 41c. The clutch input 22 of the input device 21a, 21b can be used to cause the processor 61 to enter the operational mode and provide the operator O with control over the endoscope's required degrees of freedom. Alternatively, another engagement input of the input device 21a, 21b as described above, or another button or control such as a foot pedal or switch on the operator console 20, can be used instead of or as well as the clutch input 22 to cause the processor 61 to enter the operational mode. In one example, when either one of the input devices 21a, 21b is disengaged from its associated instrument end effector 41a, 41b while the other remains engaged to its associated instrument end effector 41a, 41b, then both inputs 25 of the input devices 21a and 21b are disengaged from controlling the endoscope's degrees of freedom by the processor 61 and both inputs 25 become instrument selection devices. - As well as providing the operator with an instrument selection device,
inputs 25 can be adapted to provide other functional input to the system 10 (e.g. the selection of a number of other available modes that may be accessible to the operator via a menu-type interface). In one example, when the system 10 enters these other functional modalities, the processor 61 is configured to disengage all input devices 21a, 21b from their respective associated instrument end effectors 41a, 41b. Furthermore, the processor 61 is configured to disengage all inputs 25 from the endoscope 41c. - While the present invention has been described in the context of a surgical robotic system, it must be appreciated that different aspects of this invention equally apply to other master-slave robotic systems. Thus, the robotic system may be an industrial system or another type of system. The robot could be an industrial robot or a robot for another function. The instruments could be an industrial or other form of tool. The body of the instrument could comprise a coupling for releasable attachment to a robot and an elongate shaft running between the coupling and the end effector.
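The disengagement behaviour described above for the other functional modalities (all input devices released from their instrument end effectors, and all inputs 25 released from the endoscope) can be sketched as follows; the class and attribute names are illustrative assumptions, not part of the patent:

```python
# Hypothetical sketch: entering another functional modality releases every
# master-slave coupling so the inputs 25 can serve as menu/selection
# devices. Names and data layout are illustrative.

class FunctionalModeSupervisor:
    def __init__(self, coupling):
        # coupling: input device id -> engaged instrument end effector id
        self.instrument_coupling = dict(coupling)
        self.endoscope_inputs = set(coupling)  # inputs 25 driving the endoscope
        self.mode = "operational"

    def enter_functional_mode(self, mode_name):
        # Disengage every input device from its instrument end effector...
        self.instrument_coupling = {device: None for device in self.instrument_coupling}
        # ...and every input 25 from the endoscope.
        self.endoscope_inputs.clear()
        self.mode = mode_name
```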
- The control processes described herein could be performed by one or more physical processing units executing software that causes the unit(s) to perform the control processes. The or each physical processing unit could be any suitable processor, such as a CPU or fixed function or programmable hardware.
- The applicant hereby discloses in isolation each individual feature described herein and any combination of two or more such features, to the extent that such features or combinations are capable of being carried out based on the present specification as a whole in the light of the common general knowledge of a person skilled in the art, irrespective of whether such features or combinations of features solve any problems disclosed herein, and without limitation to the scope of the claims. The applicant indicates that aspects of the present invention may consist of any such individual feature or combination of features. In view of the foregoing description it will be evident to a person skilled in the art that various modifications may be made within the scope of the invention.
Claims (20)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/811,098 US20220331032A1 (en) | 2018-02-23 | 2022-07-07 | Camera control |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1802992.6A GB2571319B (en) | 2018-02-23 | 2018-02-23 | Concurrent control of an end effector in a master-slave robotic system using multiple input devices |
| GB1802992.6 | 2018-02-23 | ||
| US16/281,194 US11406463B2 (en) | 2018-02-23 | 2019-02-21 | Camera control |
| US17/811,098 US20220331032A1 (en) | 2018-02-23 | 2022-07-07 | Camera control |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/281,194 Continuation US11406463B2 (en) | 2018-02-23 | 2019-02-21 | Camera control |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20220331032A1 true US20220331032A1 (en) | 2022-10-20 |
Family
ID=61903219
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/281,194 Active 2041-02-26 US11406463B2 (en) | 2018-02-23 | 2019-02-21 | Camera control |
| US17/811,099 Pending US20220331033A1 (en) | 2018-02-23 | 2022-07-07 | Camera control |
| US17/811,098 Pending US20220331032A1 (en) | 2018-02-23 | 2022-07-07 | Camera control |
Family Applications Before (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/281,194 Active 2041-02-26 US11406463B2 (en) | 2018-02-23 | 2019-02-21 | Camera control |
| US17/811,099 Pending US20220331033A1 (en) | 2018-02-23 | 2022-07-07 | Camera control |
Country Status (8)
| Country | Link |
|---|---|
| US (3) | US11406463B2 (en) |
| EP (2) | EP4541304A3 (en) |
| JP (3) | JP2021527449A (en) |
| CN (1) | CN111818872B (en) |
| AU (3) | AU2019224647B2 (en) |
| BR (1) | BR112020016361A2 (en) |
| GB (1) | GB2571319B (en) |
| WO (1) | WO2019162660A1 (en) |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12415284B2 (en) | 2019-08-23 | 2025-09-16 | Intuitive Surgical Operations, Inc. | Moveable display system |
| CN114423366B (en) * | 2019-09-14 | 2025-06-06 | 旋转外科股份有限公司 | Hybrid, direct-control, and robotic-assisted surgical systems |
| EP4259030A4 (en) | 2021-01-14 | 2025-01-01 | Corindus, Inc. | SYSTEMS AND METHODS FOR A CONTROL STATION FOR ROBOTIC INTERVENTIONAL PROCEDURES USING A PLURALITY OF ELONGATED MEDICAL DEVICES |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20110306986A1 (en) * | 2009-03-24 | 2011-12-15 | Min Kyu Lee | Surgical robot system using augmented reality, and method for controlling same |
| US20130035697A1 (en) * | 2011-08-04 | 2013-02-07 | Olympus Corporation | Medical manipulator and method of controllling the same |
| US20140114481A1 (en) * | 2011-07-07 | 2014-04-24 | Olympus Corporation | Medical master slave manipulator system |
| US20140336669A1 (en) * | 2013-05-08 | 2014-11-13 | Samsung Electronics Co., Ltd. | Haptic gloves and surgical robot systems |
| US20160128790A1 (en) * | 2013-07-26 | 2016-05-12 | Olympus Corporation | Medical system and control method therefor |
| US20160135909A1 (en) * | 2013-07-26 | 2016-05-19 | Olympus Corporation | Medical system and method for controlling the same |
| US20170312047A1 (en) * | 2014-10-27 | 2017-11-02 | Intuitive Surgical Operations, Inc. | Medical device with active brake release control |
| US20180042680A1 (en) * | 2005-06-06 | 2018-02-15 | Intuitive Surgical Operations, Inc. | Interactive user interfaces for minimally invasive telesurgical systems |
Family Cites Families (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8527094B2 (en) * | 1998-11-20 | 2013-09-03 | Intuitive Surgical Operations, Inc. | Multi-user medical robotic system for collaboration or training in minimally invasive surgical procedures |
| US6951535B2 (en) * | 2002-01-16 | 2005-10-04 | Intuitive Surgical, Inc. | Tele-medicine system that transmits an entire state of a subsystem |
| EP1148807B1 (en) * | 1998-12-08 | 2010-03-10 | Intuitive Surgical, Inc. | Image shifting telerobotic system |
| US9492235B2 (en) * | 1999-09-17 | 2016-11-15 | Intuitive Surgical Operations, Inc. | Manipulator arm-to-patient collision avoidance using a null-space |
| EP1887961B1 (en) * | 2005-06-06 | 2012-01-11 | Intuitive Surgical Operations, Inc. | Laparoscopic ultrasound robotic surgical system |
| WO2009049654A1 (en) * | 2007-10-19 | 2009-04-23 | Force Dimension S.A.R.L. | Device for movement between an input member and an output member |
| KR101038417B1 (en) * | 2009-02-11 | 2011-06-01 | 주식회사 이턴 | Surgical Robot System and Its Control Method |
| US8465476B2 (en) * | 2009-09-23 | 2013-06-18 | Intuitive Surgical Operations, Inc. | Cannula mounting fixture |
| US8521331B2 (en) * | 2009-11-13 | 2013-08-27 | Intuitive Surgical Operations, Inc. | Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument |
| JP6021353B2 (en) * | 2011-08-04 | 2016-11-09 | オリンパス株式会社 | Surgery support device |
| US9586323B2 (en) * | 2012-02-15 | 2017-03-07 | Intuitive Surgical Operations, Inc. | User selection of robotic system operating modes using mode distinguishing operator actions |
| WO2013181522A1 (en) * | 2012-06-01 | 2013-12-05 | Intuitive Surgical Operations, Inc. | Redundant axis and degree of freedom for hardware-constrained remote center robotic manipulator |
| KR102167359B1 (en) * | 2012-06-01 | 2020-10-19 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | Systems and methods for commanded reconfiguration of a surgical manipulator using the null-space |
| GB201504787D0 (en) * | 2015-03-20 | 2015-05-06 | Cambridge Medical Robotics Ltd | User interface for a robot |
| US11351001B2 (en) * | 2015-08-17 | 2022-06-07 | Intuitive Surgical Operations, Inc. | Ungrounded master control devices and methods of use |
| JP6871929B2 (en) * | 2015-09-29 | 2021-05-19 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Instrument controller for robot-assisted minimally invasive surgery |
| US11209954B2 (en) * | 2015-12-10 | 2021-12-28 | Cmr Surgical Limited | Surgical robotic system using dynamically generated icons to represent orientations of instruments |
| US10219868B2 (en) * | 2016-01-06 | 2019-03-05 | Ethicon Llc | Methods, systems, and devices for controlling movement of a robotic surgical system |
2018
- 2018-02-23 GB GB1802992.6A patent/GB2571319B/en not_active Expired - Fee Related

2019
- 2019-02-20 AU AU2019224647A patent/AU2019224647B2/en not_active Ceased
- 2019-02-20 CN CN201980014257.3A patent/CN111818872B/en active Active
- 2019-02-20 JP JP2020544397A patent/JP2021527449A/en active Pending
- 2019-02-20 WO PCT/GB2019/050455 patent/WO2019162660A1/en not_active Ceased
- 2019-02-20 EP EP25154065.4A patent/EP4541304A3/en active Pending
- 2019-02-20 BR BR112020016361-9A patent/BR112020016361A2/en active Search and Examination
- 2019-02-20 EP EP19708619.2A patent/EP3755260B1/en active Active
- 2019-02-21 US US16/281,194 patent/US11406463B2/en active Active

2022
- 2022-07-07 US US17/811,099 patent/US20220331033A1/en active Pending
- 2022-07-07 US US17/811,098 patent/US20220331032A1/en active Pending
- 2022-08-31 AU AU2022224785A patent/AU2022224785B2/en not_active Ceased
- 2022-08-31 AU AU2022224765A patent/AU2022224765B2/en not_active Ceased
- 2022-11-08 JP JP2022178714A patent/JP2023010762A/en active Pending
- 2022-11-08 JP JP2022178713A patent/JP2023010761A/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| WO2019162660A9 (en) | 2020-09-17 |
| AU2022224785B2 (en) | 2024-01-11 |
| EP3755260B1 (en) | 2025-03-26 |
| AU2019224647B2 (en) | 2022-07-07 |
| US20190262089A1 (en) | 2019-08-29 |
| AU2019224647A1 (en) | 2020-08-27 |
| CN111818872B (en) | 2024-09-10 |
| JP2023010761A (en) | 2023-01-20 |
| US11406463B2 (en) | 2022-08-09 |
| GB2571319B (en) | 2022-11-23 |
| GB2571319A (en) | 2019-08-28 |
| EP4541304A3 (en) | 2025-07-16 |
| WO2019162660A1 (en) | 2019-08-29 |
| AU2022224785A1 (en) | 2022-09-22 |
| EP4541304A2 (en) | 2025-04-23 |
| EP3755260A1 (en) | 2020-12-30 |
| JP2021527449A (en) | 2021-10-14 |
| JP2023010762A (en) | 2023-01-20 |
| BR112020016361A2 (en) | 2020-12-15 |
| CN111818872A (en) | 2020-10-23 |
| GB201802992D0 (en) | 2018-04-11 |
| US20220331033A1 (en) | 2022-10-20 |
| AU2022224765B2 (en) | 2024-01-11 |
| AU2022224765A1 (en) | 2022-09-22 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12048505B2 (en) | Master control device and methods therefor | |
| US12023122B2 (en) | Ungrounded master control devices and methods of use | |
| US12419715B2 (en) | Master control device with multi-finger grip and methods therefor | |
| CN110799144B (en) | Systems and methods for haptic feedback for selection of menu items in a remote control system | |
| AU2022224785B2 (en) | Camera Control | |
| US12167901B2 (en) | Telementoring control assemblies for robotic surgical systems | |
| US20230064265A1 (en) | Moveable display system | |
| CN117279589A (en) | Apparatus, computer-implemented method, and computer program | |
| GB2606672A (en) | Camera control | |
| GB2605090A (en) | Camera control | |
| GB2605091A (en) | Camera control |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: CMR SURGICAL LIMITED, UNITED KINGDOM. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: MARSHALL, KEITH; CUTHBERTSON, REBECCA ANNE; DEANE, GORDON THOMAS; SIGNING DATES FROM 20190205 TO 20190214; REEL/FRAME: 060428/0907 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| 2025-03-24 | AS | Assignment | Owner name: TRINITY CAPITAL INC., AS AGENT, ARIZONA. Free format text: SECURITY INTEREST; ASSIGNOR: CMR SURGICAL LIMITED; REEL/FRAME: 070629/0172. Effective date: 20250324 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |