WO2024224268A1 - Method for keeping a surgical instrument of a robotic surgery system, during its movement control, within the field of view of a viewing system, and associated robotic surgery system - Google Patents
- Publication number
- WO2024224268A1 (PCT application PCT/IB2024/053901)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- viewing space
- viewing
- space
- surgical instrument
- fov1
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/37—Leader-follower robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/371—Surgical systems with images on a monitor during operation with simultaneous use of two cameras
Definitions
- the present invention relates to a method and system for controlling a robotic system for medical or surgical teleoperation.
- the invention relates to a method (and a related robotic surgery system) for keeping a surgical instrument, teleoperated and controlled in movement, within the field of view of a viewing system with which the robotic surgery system is provided.
- the field of view (FOV) provided by any viewing system associated therewith e.g., microscope, exoscope, endoscope
- the workspace of the slave device, also defined as a "slave workspace"
- the field of view FOV is a subspace, i.e., it represents a subset, of the workspace of the joints of the slave device.
- a movement of an instrument controlled by the master device can be mapped inside the slave workspace (i.e., inside the space of the slave joints) but outside the effective field of view FOV and therefore is not carried out under the complete control of the operator who, in a robotic teleoperation system, closes the control loop of each movement through his own sight mediated by the viewing system.
- executions of surgical gestures such as pulling a suture filament during the passage of the needle in tissues or making a knot, as well as the retraction of an organ or tissue, or a gripping motion of an object at the limits of the field of view FOV, can lead to movements outside the field of view FOV of one or more surgical instruments of the slave device (or, in particular, of articulated terminals, also called “end-effectors", of such surgical instruments).
- a first solution for overcoming the above drawbacks is to control the movements of the surgical instruments so as to prevent them from exiting the field of view FOV, for example by locking the instruments or constraining them in motion so that they remain within the field of view (even when the command of the master device, in the absence of such a safety control, would determine the exit of the slave device, and therefore of the surgical instrument, from the field of view).
- the need remains to combine the requirement of safety (i.e., avoiding the risk of causing damage to the patient when the surgical instrument is not visible) and the requirement of easy usability, reducing, as much as possible, the constraints imposed by rigidly confining the movement of the surgical instrument within a field of view originally defined by the viewing system.
- Such an object is achieved by a method according to claim 1.
- FIG. 1A, 1B, 2A and 2B show a robotic system, in accordance with the invention, according to some possible embodiments
- FIG. 2C diagrammatically shows a detail of a robotic system, according to an embodiment of the present invention
- FIG. 3 diagrammatically shows an image acquisition device with a viewing space including a surgical site and a slave surgical instrument, according to an embodiment of the present invention
- FIG. 8A, 8B and 8C are block diagrams schematically showing a robotic system, according to some embodiments of the present invention.
- FIG. 9 is a block diagram showing some possible steps of a control method, according to some embodiments of the present invention.
- FIG. 12A, 12B, 12C, 12D, 13 and 14 are views of a screen displaying images of two viewing spaces, according to some possible embodiments of the present invention.
- FIG. 15A, 15B, 16A and 16B are diagrammatic views of an image acquisition device, according to some embodiments of the present invention.
- FIG. 17A and 17B diagrammatically show some possible configurations of a viewing space, according to some embodiments of the present invention.
- FIG. 18A, 18B and 18C diagrammatically show a sequence of some possible steps of a control method, according to an embodiment of the present invention
- FIG. 19A and 19B diagrammatically show a sequence of some possible steps of a control method, according to an embodiment of the present invention
- FIG. 20A, 20B and 20C diagrammatically show a sequence of some steps of a control method, according to an embodiment of the invention
- FIG. 21A-21D diagrammatically show a sequence of some steps of a control method, according to an embodiment of the present invention
- FIG. 21 E is a diagram of some possible steps of a control method, according to an embodiment of the present invention.
- the robotic system comprises at least one surgical instrument 170, adapted to operate in teleoperation, and further comprises viewing means 120, 130 configured to display to an operator 150 images and/or videos of at least one viewing space associated with a teleoperation area in which the surgical instrument 170 operates.
- the method first comprises the step of determining whether a position of the surgical instrument 170 is within an allowed space correlated to a first viewing space FOV1, in which the first viewing space FOV1 is used as the current viewing space for the teleoperation, with respect to which a current visualization is provided to the operator.
- the method then comprises, if or when it is determined that the aforesaid position of the surgical instrument 170 is not within the aforesaid allowed space correlated to the first viewing space FOV1, the further step of automatically providing, by the viewing means, a second visualization defining a second viewing space FOV2 having a surface or field of view greater than the aforesaid first viewing space FOV1 and containing, or partially containing, the aforesaid first viewing space FOV1.
- Said second visualization comprises a combination and/or superimposition of the aforesaid second viewing space FOV2 and first viewing space FOV1, and/or a switching from the aforesaid first viewing space FOV1 to the aforesaid second viewing space FOV2, in which such a switching is performed, without mechanical movements, by controlling optical and/or digital parameters of the viewing means 120, 130.
- the aforesaid step of automatically providing a second visualization comprises displaying to the operator 150 a combination and/or superimposition of the aforesaid second viewing space FOV2 and of the aforesaid first viewing space FOV1, for example in images and/or videos, by display means 130 included in the aforesaid viewing means.
- the aforesaid step of automatically providing a second visualization comprises displaying to the operator 150 the second viewing space FOV2, through a switching from the aforesaid first viewing space FOV1 to the aforesaid second viewing space FOV2, by controlling optical and/or digital parameters of image acquisition means 120 comprised in the aforesaid viewing means, in which such control and such switching are performed without carrying out mechanical movements.
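The core decision described above can be sketched as a simple containment test followed by a visualization choice. The following is a minimal illustrative sketch, not the patent's implementation: the 2D axis-aligned `AllowedSpace` and the names `select_visualization`, `contains` are invented for illustration.

```python
# Hypothetical sketch of the visualization-selection logic described above.
# The 2D rectangular allowed space is an assumed simplification; the patent
# speaks more generally of an allowed space correlated to a viewing space.
from dataclasses import dataclass

@dataclass
class AllowedSpace:
    """Axis-aligned 2D region correlated to a viewing space (FOV)."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def select_visualization(instrument_pos, allowed_fov1):
    """Return which visualization to present to the operator.

    If the instrument lies inside the allowed space correlated to FOV1,
    the current (first) visualization is kept; otherwise the wider second
    visualization (FOV2) is provided automatically, without mechanical
    movement, e.g. by acting on optical/digital parameters.
    """
    x, y = instrument_pos
    return "FOV1" if allowed_fov1.contains(x, y) else "FOV2"
```

In use, the test would be re-evaluated at each control cycle against the instrument position determined from the master command or from the images.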
- the aforesaid optical parameters comprise, for example, electronic control of zoom and/or exposure and/or focus and/or depth of focus
- the aforesaid digital parameters comprise, for example, digital scale factor and/or magnification and/or region of interest and/or digital zoom or other digital image processing.
- the aforesaid step of determining a position of the surgical instrument 170 comprises determining a current position of the surgical instrument 170 with respect to the allowed space correlated to the first viewing space FOV1, and/or the presence of the surgical instrument 170 in the allowed space correlated to the viewing space.
- the aforesaid step of automatically providing a second visualization comprises automatically providing a second visualization when the current determined position of the surgical instrument 170 is on the boundary and/or outside the allowed space correlated to the first viewing space FOV1.
- the aforesaid current position of the surgical instrument 170 is detected based on digital data, for example digital images, provided in real time by image acquisition means 120 included in the viewing means.
- the method is applied in a robotic system comprising at least one master device 110 adapted to be moved by an operator 150, and further comprising the aforesaid at least one surgical instrument 170 adapted to be controlled by the master device 110.
- said step of determining a position of the surgical instrument 170 comprises determining a position of the surgical instrument 170 with respect to the first viewing space FOV1, as imposed by the master device 110.
- the aforesaid step of automatically providing a second visualization comprises automatically providing a second visualization when the controlled position of the surgical instrument 170, as was determined, is on the boundary and/or outside the aforesaid allowed space correlated to the first viewing space FOV1.
- the aforesaid step of automatically providing a second visualization comprises ensuring that the surgical instrument is displayed in/from the second visualization provided to the operator, during the movement of the surgical instrument 170 as controlled by the master device 110, even when the surgical instrument moves outside the allowed space correlated to the first viewing space FOV1, during teleoperation or preparation for teleoperation.
- the aforesaid second viewing space FOV2 is a physical viewing space, detectable and displayable by the viewing means in addition to the first viewing space FOV1.
- Such a second viewing space FOV2 is wider than the first viewing space FOV1 and/or contains the first viewing space FOV1 (i.e., in other words, the first viewing space FOV1 is included in the second viewing space FOV2).
- the aforesaid second viewing space FOV2 is a virtual viewing space, extractable or extrapolable from digital images and/or videos previously detected and/or displayable by the viewing means in addition to the first viewing space FOV1.
- Such a second viewing space FOV2 is wider than the first viewing space FOV1 and/or contains the first viewing space FOV1 (i.e., in other words, the first viewing space FOV1 is included in the second viewing space FOV2).
- the method comprises the further steps of acquiring, by the viewing means, both a first digital video image of a magnified operating scenario, defined in the aforesaid first viewing space FOV1, and a second digital video image of the enlarged operating scenario, defined in the aforesaid second viewing space FOV2, wider and/or at a lower zoom with respect to the first viewing space FOV1, which includes the first viewing space FOV1; the surgical instrument is displayed in the second viewing space FOV2.
- the aforesaid step of automatically providing a second visualization when the determined position of the surgical instrument is on the boundary or outside the allowed space correlated to the first viewing space FOV1, comprises displaying both the first viewing space FOV1 and the second viewing space FOV2.
- the step of displaying both the first viewing space FOV1 and the second viewing space FOV2 comprises displaying the second viewing space FOV2, in which the surgical instrument is visible, in overlay on a screen portion 130, covering a part of the first viewing space.
- the screen portion in which the second viewing space FOV2 is in overlay comprises an area located at one of the four corners of the screen 130.
- the screen portion 130 in which the second viewing space FOV2 is in overlay comprises a side box, arranged on the side of the first viewing space FOV1 from which the surgical instrument exited the space correlated to the first viewing space.
- the aforesaid step of displaying both the first viewing space FOV1 and the second viewing space FOV2 comprises displaying the second viewing space FOV2, in which the surgical instrument is visible, in full screen, and displaying the first viewing space FOV1 in overlay on a screen portion, covering a part of the second viewing space in which the surgical instrument is not present.
- the method further comprises highlighting the second viewing space FOV2, by means of increased brightness or with a colored or bright edging, at the moment of transition between the first visualization and the second visualization.
- the aforesaid step of displaying both the first viewing space FOV1 and the second viewing space FOV2 comprises displaying the first viewing space FOV1 on a first screen and displaying the second viewing space FOV2 on a second screen.
- the method comprises the further steps of acquiring, by the viewing means, both a first digital video image of a magnified operating scenario, defined in the aforesaid first viewing space FOV1, and a second digital video image of the enlarged operating scenario, defined in the aforesaid second viewing space FOV2 (wider and/or at a lower zoom with respect to the first viewing space FOV1), which includes the first viewing space FOV1; the surgical instrument is displayed in the second viewing space FOV2.
- the aforesaid step of automatically providing a second visualization comprises processing both the first acquired digital video image and the second acquired digital video image, by image processing techniques, so as to perform a pixel-by-pixel merging of the two digital images and thus obtain, as a second visualization provided to the operator, a combined digital image of the operating scenario, based on the aforesaid image merging.
- the aforesaid combined digital image has a higher resolution and/or a greater field depth with respect to the aforesaid first acquired digital video image and second acquired digital video image.
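The pixel-by-pixel merging step can be illustrated with a minimal sketch. This is an assumed simplification: the two images are taken as already registered (aligned) and of equal size, represented as plain nested lists of grayscale intensities; `merge_images` and the weighting scheme are invented for illustration, not taken from the patent.

```python
# Illustrative pixel-by-pixel merge of two registered grayscale images,
# standing in for the image-merging step described above. A real system
# would first register the magnified FOV1 image inside the wider FOV2
# image; here alignment is assumed.

def merge_images(img_fov1, img_fov2, w1=0.5):
    """Blend two equally-sized 2D images pixel by pixel.

    img_fov1, img_fov2: lists of rows of intensity values.
    w1: weight given to the magnified FOV1 image (FOV2 gets 1 - w1).
    Returns the combined image as a new list of rows.
    """
    w2 = 1.0 - w1
    return [
        [w1 * a + w2 * b for a, b in zip(row1, row2)]
        for row1, row2 in zip(img_fov1, img_fov2)
    ]
```

A weighted average is only one possible merging rule; techniques aimed at higher resolution or greater depth of field (as the text mentions) would combine the sharper or better-exposed pixel from each source instead.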
- the aforesaid steps of acquiring a first digital video image and a second digital video image are carried out simultaneously, continuously and in real time, by means of two image sensors or cameras, included in the viewing means, having the same viewpoint or two different respective viewpoints.
- said two or more sensors comprise two sensors adapted to provide two two-dimensional viewing spaces FOV, or two sensors adapted to provide one three-dimensional viewing space FOV, or four sensors adapted to provide two three-dimensional viewing spaces FOV.
- the aforesaid two image sensors comprise two cameras having as viewing space the first viewing space FOV1 and the second viewing space FOV2, respectively, aligned with each other and/or coaxial.
- the aforesaid steps of acquiring a first digital video image and a second digital video image are carried out alternatively, by means of a single image sensor or camera.
- such a camera detects the first digital video image associated with the first viewing space FOV1 by adjusting the zoom to a first zoom value, and detects the second digital video image associated with the second viewing space FOV2 by adjusting the zoom to a second zoom value.
- switching from the first digital video image to the second digital video image, or vice versa, involves, in addition to switching from the first zoom value to the second zoom value, or vice versa, also an automatic controlled variation and/or automatic adjustment of other optical parameters, inter alia exposure/brightness and/or focus.
- the step of providing the second visualization comprises switching from a first digital video image of the operating scenario, taken at a first zoom value, to a second digital image of the operating scenario characterized by a second zoom value less than the first zoom value, and thus to a second viewing space FOV2, which is wider than the first viewing space FOV1 and in which the surgical instrument is displayed.
- the aforesaid switching step is carried out by the viewing means by controlling optical and/or digital parameters, and not by controlling movement parameters of the viewing means.
- the method includes providing and using a plurality of larger second viewing spaces FOV2₁-FOV2ₙ, characterized by a respective plurality of gradually lower second zoom values.
- the aforesaid step of automatically providing a second visualization when the determined position of the surgical instrument is on the boundary of or outside an allowed space correlated to a second current viewing space FOV2ⱼ, comprises displaying the next wider second viewing space FOV2ⱼ₊₁, having a respective lower second zoom value, and such as to display the surgical instrument.
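The "ladder" of progressively wider viewing spaces can be sketched as follows. The square, concentric viewing spaces and the `half_widths` parameterization are assumptions made for illustration; the patent only requires a sequence of wider spaces at gradually lower zoom.

```python
# Hedged sketch of escalating through viewing spaces FOV2_1 ... FOV2_n:
# when the instrument reaches the boundary of the current space, the
# next wider space (lower zoom) is selected, if one is available.

def next_viewing_space(instrument_pos, current_idx, half_widths):
    """half_widths: increasing half-sizes of square, concentric viewing
    spaces centered at the origin, index 0 being FOV1.
    Returns the index of the space that should be displayed."""
    x, y = instrument_pos
    idx = current_idx
    # Escalate while the instrument is on/outside the current boundary
    # and a wider space is still available.
    while idx + 1 < len(half_widths) and max(abs(x), abs(y)) >= half_widths[idx]:
        idx += 1
    return idx
```

When the widest space is reached and the instrument is still at its boundary, the system would fall back to the exit-teleoperation or limited-teleoperation behaviors described below in the text.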
- the method envisages that the robotic system remains in teleoperation as long as the surgical instrument is displayed within the aforesaid second visualization.
- the method provides for the robotic system exiting the teleoperation state, if the surgical instrument reaches the boundaries of the second field of view FOV2, and the second field of view FOV2 cannot be further enlarged.
- the method provides for the robotic system remaining in a limited teleoperation state, if the surgical instrument reaches the boundaries of the second field of view FOV2, and the second field of view FOV2 cannot be further enlarged.
- the movement of the surgical instrument is only allowed if the surgical instrument is located within an allowed space correlated to the second viewing space and the movement of the surgical instrument, if allowed, is still confined within such a second viewing space.
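The three behaviors just described (remaining in teleoperation, exiting it, or degrading to a limited mode) can be summarized as a small state-selection function. The state names and the `limited_mode` flag are invented labels for the two variants described in the text, not terminology from the patent.

```python
# Illustrative state logic for the behaviors described above: the system
# stays in teleoperation while the instrument is visible in the second
# visualization, and either exits teleoperation or enters a limited mode
# (movement confined within FOV2) when FOV2 cannot be widened further.

def teleop_state(instrument_visible, at_fov2_boundary, fov2_can_widen,
                 limited_mode=False):
    """Return the teleoperation state for the current control cycle."""
    if at_fov2_boundary and fov2_can_widen:
        return "TELEOPERATION"           # FOV2 is enlarged instead
    if at_fov2_boundary and not fov2_can_widen:
        # Two variants are described in the text: exit teleoperation,
        # or remain in a limited state confined within FOV2.
        return "LIMITED_TELEOPERATION" if limited_mode else "EXIT_TELEOPERATION"
    if instrument_visible:
        return "TELEOPERATION"
    return "EXIT_TELEOPERATION"
```

Which variant applies (exit versus limited mode) would be a configuration choice of the robotic system rather than something decided at run time.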
- the aforesaid allowed space correlated to the viewing space corresponds to the viewing space.
- the aforesaid allowed space correlated to the viewing space comprises a subset of the viewing space, obtained by removing from the viewing space an inner margin extending with a spatial tolerance e inside the boundaries of the viewing space.
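Both implementation options (allowed space equal to the viewing space, or shrunk by a tolerance e) reduce to one containment test. The 2D rectangular viewing space below is an assumed simplification for illustration; the text speaks of an area or volume.

```python
# Sketch of the allowed space obtained by shrinking the viewing space
# by a spatial tolerance e inside its boundaries. With e = 0 the
# allowed space coincides with the viewing space itself, matching the
# first implementation option above.

def in_allowed_space(pos, fov_min, fov_max, e=0.0):
    """True if pos = (x, y) lies inside the viewing space
    [fov_min, fov_max] shrunk inward by margin e on every side."""
    x, y = pos
    (xmin, ymin), (xmax, ymax) = fov_min, fov_max
    return (xmin + e <= x <= xmax - e) and (ymin + e <= y <= ymax - e)
```

The tolerance e gives the system time to react (e.g., to widen the view) before the instrument actually crosses the visible boundary.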
- the aforesaid viewing space is defined by a field of view (FOV) of the viewing means.
- Such an implementation option refers to a robotic system having viewing means, or a generic viewing system (comprising digital image/video acquisition means), capable of capturing a portion of the world observed through appropriate lens or light guide systems.
- said viewing space is defined by a predefined subset of the field of view (FOV) of the viewing means.
- the boundaries of the area or volume defining the viewing space of interest do not necessarily coincide with the field of view; such boundaries can be constructed on a sub-volume of the field of view where the view is optimal and/or in such a way to have a particular geometric shape and/or specially chosen to promote the mobility of the slave device therein and/or for any other reason.
- the aforesaid viewing space is defined by a field-of-view workspace, consisting of a geometric volume, in a reference coordinate system of the robotic system, associated with the aforesaid field of view.
- Such a field-of-view workspace can for example correspond to a volume, e.g., a trapezoidal volume which extends from the lens to infinity, centered on the main axis of the optical system, which is capable of representing the field of view of a digital viewing system, for example for lenses with "Fields of View" FOV less than 180 degrees.
- the aforesaid viewing space is defined by geometric limits of the field of view, consisting of a boundary surface of the aforesaid viewing workspace, in the reference coordinate system of the robotic system.
- the field-of-view workspace is constructed with respect to the trapezoid originating in the camera image plane of the viewing system. It is possible to build therefrom simplified geometries, called "field-of-view workspace limits" (FOV Workspace Limits), which are imposed to limit the movement of the slave device. Such geometries can be defined as planes orthogonal to the viewing system or as curved surfaces which are in any case defined inside the field-of-view workspace.
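A membership test against such a workspace can be sketched geometrically. The pyramidal volume with a fixed half-angle along the optical axis, and the near/far planes standing in for the "workspace limits", are illustrative assumptions; the actual geometry would follow the camera's optics.

```python
# Hedged geometric sketch: a field-of-view workspace modeled as a
# pyramidal volume opening from the camera along the optical axis (+z),
# with "workspace limits" as planes orthogonal to the viewing direction.
# The half-angle and depth limits below are example values.
import math

def in_fov_workspace(point, half_angle_deg=30.0, z_near=0.0, z_far=math.inf):
    """True if `point` (x, y, z in camera coordinates) lies inside the
    field-of-view workspace, clipped by near/far limit planes."""
    x, y, z = point
    if not (z_near < z < z_far):
        return False            # outside the imposed workspace limits
    # Lateral extent grows linearly with depth, as in the trapezoidal
    # (frustum-like) volume described above.
    max_lateral = z * math.tan(math.radians(half_angle_deg))
    return abs(x) <= max_lateral and abs(y) <= max_lateral
```

Curved limit surfaces, also mentioned in the text, would replace the planar `z_near`/`z_far` checks with a distance test against the chosen surface.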
- the aforesaid viewing means comprise at least one camera 120 or comprise an endoscope and/or a laparoscope and/or a microscope and/or an exoscope.
- the viewing means comprise a stereoscopic viewing system comprising two cameras, each of which defines a respective "FOV Workspace” 175, for example right "FOV Workspace” and left "FOV Workspace”.
- the intersection of the aforesaid two field-of-view workspaces of right and left camera produces a “common field-of-view workspace” which ensures the maximum visibility of the objects in the scene.
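The common field-of-view workspace of a stereoscopic pair can be illustrated as a volume intersection. Approximating each camera workspace by an axis-aligned box is a deliberate simplification for this sketch (the real workspaces are frustum-like); `common_workspace` is an invented name.

```python
# Illustrative computation of the "common field-of-view workspace" as the
# intersection of the right and left camera workspaces, here approximated
# by axis-aligned boxes (a simplification of the actual geometry).

def common_workspace(box_left, box_right):
    """Each box is ((xmin, ymin, zmin), (xmax, ymax, zmax)).
    Returns the intersection box, or None if the boxes do not overlap."""
    lo = tuple(max(a, b) for a, b in zip(box_left[0], box_right[0]))
    hi = tuple(min(a, b) for a, b in zip(box_left[1], box_right[1]))
    if any(l >= h for l, h in zip(lo, hi)):
        return None             # no common visible volume
    return (lo, hi)
```

Confining the instrument to this intersection guarantees it is visible to both cameras, i.e., that stereoscopic (3D) viewing is preserved.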
- the aforesaid step of determining a position of the surgical instrument 170 with respect to the allowed space correlated to the viewing space comprises determining a current position of the surgical instrument 170 and/or the presence of the surgical instrument 170 in the allowed space correlated to the viewing space, based on digital data deriving from the viewing means.
- the aforesaid step of determining a position of the surgical instrument 170 with respect to the allowed space correlated to the viewing space comprises:
- the aforesaid step of determining a position of the surgical instrument 170 with respect to the allowed space correlated to the viewing space comprises calculating and/or determining the position of a real point belonging to the surgical instrument or the position of a virtual point integral with the surgical instrument 170, based on images provided by said viewing system.
- the aforesaid step of determining a position of the surgical instrument 170 comprises determining the position of a virtual control point 600 of the slave device (for example placed between the tips 171, 172 or jaws 171, 172 of the surgical instrument 170).
- the aforesaid step of determining a position of the surgical instrument 170 comprises determining the position of at least one of the tips 171, 172 of the surgical instrument 170.
- the aforesaid step of determining a position of the surgical instrument 170 comprises determining the position of at least one of the links of a hinged wrist (or "end-effector") 177 included in the surgical instrument 170.
- the aforesaid step of determining a position of the surgical instrument 170 comprises determining the position of a distal portion of a positioning shaft near the hinged wrist of the surgical instrument 170.
- the method comprises the further step of dynamically adjusting/varying the viewing space, by controlling the viewing means, e.g., by changing the zoom or adjusting the point of view, so as to improve or restore the view of the surgical instrument through the viewing means.
- the robotic system coupled to the viewing system is capable of acting autonomously on the zoom by widening the viewing space (e.g., FOV) when an instrument reaches the limits of the field of view, thereby preventing a maneuver of the surgeon, possibly involuntary, from causing the exit of the instrument from the field of view.
- the self-adjustment of the zoom upon reaching the limits imposed by the viewing space can vary between the aforesaid first zoom value and second zoom value and the related first FOV and second FOV.
- said varying zoom self-adjustment is either an intermediate variation between the two values, calculated and evaluated based on the target position of the instrument or upon reaching the limits, generating intermediate viewing spaces contained between the first and second viewing spaces; or it is one of the two zoom values, passing from one to the other when the instrument is outside or inside said first viewing space (e.g., FOV).
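The continuous variant of this self-adjustment can be sketched as an interpolation between the two zoom values. The linear ramp over a distance `ramp_width` from the FOV1 boundary is an assumption for illustration; the text only requires intermediate values between the two extremes.

```python
# Sketch of the intermediate zoom self-adjustment: the zoom is varied
# continuously between the first (high) and second (low) zoom values as
# the instrument approaches the boundary of the first viewing space.

def self_adjust_zoom(dist_to_boundary, ramp_width, zoom1, zoom2):
    """dist_to_boundary: signed distance of the instrument from the FOV1
    boundary (positive = inside FOV1). Returns a zoom in [zoom2, zoom1]."""
    if dist_to_boundary >= ramp_width:
        return zoom1                     # well inside FOV1: full zoom
    if dist_to_boundary <= 0:
        return zoom2                     # on/outside the boundary: widest view
    t = dist_to_boundary / ramp_width    # 0..1 inside the ramp region
    return zoom2 + t * (zoom1 - zoom2)
```

The discrete variant mentioned in the text corresponds to `ramp_width` approaching zero: the zoom snaps between `zoom1` and `zoom2` as the instrument crosses the boundary.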
- the switching between the first zoom value and the second zoom value or vice versa does not include any mechanical movement of joints, lenses or microscope but only digital processing.
- Such a robotic system comprises at least one surgical instrument 170, adapted to operate in teleoperation; viewing means 120, 130 configured to display to the operator 150 images and/or videos of at least one viewing space associated with a teleoperation area in which the surgical instrument 170 operates; and lastly a control unit configured to control the surgical instrument 170.
- the control unit is further configured to carry out the following actions: - determining whether a position of the surgical instrument 170 is within an allowed space correlated to a first viewing space FOV1, in which such a first viewing space FOV1 is used as the current viewing space for the teleoperation, with respect to which a current visualization is provided to the operator;
- the aforesaid second visualization defines a second viewing space FOV2 having a greater surface or a greater field of view than the aforesaid first viewing space FOV1 and containing, or partially containing, the first viewing space FOV1.
- the aforesaid second visualization comprises a combination and/or superimposition of the aforesaid second viewing space FOV2 and first viewing space FOV1, and/or a switching from the first viewing space FOV1 to the second viewing space FOV2, in which such a switching is performed, without including mechanical movements, by controlling optical and/or digital parameters of the viewing means 120, 130.
- the aforesaid second visualization is representative of a combination and/or superimposition of the second viewing space FOV2 and the first viewing space FOV1.
- the robotic system further comprises at least one master device 110, adapted to be moved by an operator 150, and at least one slave device comprising the aforesaid surgical instrument 170 adapted to be controlled by the master device.
- the master device 110 is preferably a "groundless"-type master device, without force feedback, for mono-lateral teleoperation.
- the master device can be a master mechanically constrained to an operating console and at the same time be of the “groundless”-type without force feedback, for mono-lateral teleoperation.
- the master device 110 is preferably a master device of a type which is mechanically unconstrained to the operating console.
- the viewing means 120, 130 comprise image acquisition means 120, configured to acquire a digital image or video associated with the aforesaid first viewing space FOV1 and/or second viewing space FOV2; and further comprise display means 130, configured to display to the operator 150 the first viewing space FOV1 or the second viewing space FOV2, and/or a combination and/or superimposition of the aforesaid second viewing space FOV2 and first viewing space FOV1.
- the viewing means comprise at least one image acquisition device 120, such as at least one camera 120 and/or at least one endoscope, and a screen 130 or display 130, in operative communication with each other, for example through a vision processing unit.
- the viewing means are further configured to acquire both a first digital video image of a magnified operating scenario, defined in the first viewing space FOV1 , and a second digital video image of the enlarged operating scenario, defined in the second viewing space FOV2, wider and/or at a lower zoom with respect to the first viewing space FOV1 , which includes the first viewing space FOV1 ; the surgical instrument is displayed in such a second viewing space FOV2.
- the aforesaid action of automatically providing a second visualization when the determined position of the surgical instrument is on the boundary or outside the allowed space correlated to the first viewing space FOV1 , comprises displaying both the first viewing space FOV1 and the second viewing space FOV2.
- the viewing means are further configured to acquire both a first digital video image of a magnified operating scenario, defined in the first viewing space FOV1, and a second digital video image of the enlarged operating scenario, defined in the second viewing space FOV2, wider and/or at a lower zoom with respect to the first viewing space FOV1, which includes the first viewing space FOV1 and in which the surgical instrument is displayed.
- control means, to perform the aforesaid step of automatically providing a second visualization when the determined position of the surgical instrument is on the boundary or outside the allowed space correlated to the first viewing space FOV1, are further configured to process both the first acquired digital video image and the second acquired digital video image, by image processing techniques, so as to perform a pixel-by-pixel merging of the two digital images and thus obtain, as a second visualization provided to the operator, a combined digital image of the operating scenario, based on the aforesaid image merging.
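A minimal illustration of such a pixel-by-pixel merging, assuming the simplest geometry in which FOV1 maps to the central region of FOV2, grayscale images given as 2D lists, and nearest-neighbour resampling (all names and conventions are illustrative assumptions, not the claimed implementation):

```python
def fuse_images(fov2_img, fov1_img, zoom_ratio):
    """Merge the high-zoom FOV1 image into the center of the wide FOV2 image.

    fov2_img, fov1_img: 2D lists (rows of pixel values), same resolution.
    zoom_ratio: zoom of FOV1 relative to FOV2 (e.g. 2 means FOV1 spans the
    central 1/2 x 1/2 portion of FOV2). Centered geometry is assumed.
    Returns a new fused image; the inputs are left untouched.
    """
    h2, w2 = len(fov2_img), len(fov2_img[0])
    h1, w1 = len(fov1_img), len(fov1_img[0])
    # Size, in FOV2 pixels, of the region that FOV1 covers.
    rh, rw = int(h2 / zoom_ratio), int(w2 / zoom_ratio)
    top, left = (h2 - rh) // 2, (w2 - rw) // 2
    fused = [row[:] for row in fov2_img]
    for r in range(rh):
        for c in range(rw):
            # Nearest-neighbour sample of the higher-resolution FOV1 image.
            sr = min(h1 - 1, int(r * h1 / rh))
            sc = min(w1 - 1, int(c * w1 / rw))
            fused[top + r][left + c] = fov1_img[sr][sc]
    return fused
```

In a real system the central patch would typically be blended or feathered rather than overwritten, but the overwrite keeps the geometric relationship between the two viewing spaces explicit.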
- the image acquisition means 120 comprise two or more image sensors or cameras, having the same or two different respective viewpoints, configured to perform the aforesaid actions of acquiring a first digital video image and a second digital video image simultaneously, continuously and in real time.
- the aforesaid two or more image sensors or cameras comprise two sensors or cameras adapted to provide two two-dimensional viewing spaces FOV, or two sensors or cameras adapted to provide one three-dimensional viewing space FOV, or four sensors or cameras adapted to provide two three-dimensional viewing spaces FOV.
- the aforesaid two image sensors comprise two cameras having as viewing space the aforesaid first viewing space FOV1 and the aforesaid second viewing space FOV2, respectively, aligned with each other and/or coaxial.
- the image acquisition means 120 comprise a single image sensor or camera, configured to alternately perform the aforesaid actions of acquiring a first digital video image and a second digital video image.
- the aforesaid image sensor or camera is configured to detect the first digital video image associated with the first viewing space FOV1 by adjusting the zoom to a first zoom value, and to detect the second digital video image associated with the second viewing space FOV2 by adjusting the zoom to a second zoom value.
- the aforesaid image sensor or camera is configured to perform, upon switching from the first digital video image to the second digital image, or vice versa, in addition to switching from the first zoom value to the second zoom value, or vice versa, also an automatic controlled variation and/or automatic adjustment of other optical parameters, including exposure/brightness and/or focusing.
- the image acquisition means 120 comprise an exoscope or a microscope or an endoscope.
- the display means 130 comprise at least one electronic screen or monitor.
- the robotic system further comprises processing means configured to process the digital images or videos acquired by the image acquisition means in order to detect the presence or absence of the surgical instrument 170 in the allowed space correlated to the first viewing space FOV1.
- the control unit is configured to carry out a method for controlling a robotic system according to any one of the embodiments of the method illustrated in this description.
- "Zoom" (i.e., "zoom ratio") is the amount of scaling applied to the "Field of View", obtained through an optical or digital modification.
- "Magnification" (i.e., "magnification ratio") is the ratio between an entity of 1 mm placed at a distance equal to the "Working Distance" and its representation on a screen or monitor, and is also proportional to the "Zoom".
- the viewing space (or field of view, FOV) can be imagined as the truncated pyramid with the minor plane on the camera lens and the other plane at infinity.
- the notation "FOV2 ⊇ FOV1" indicates that each point of FOV1 is included in FOV2.
- a FOV2 which has the same origin and main axis as FOV1 is also defined with a dedicated indication.
- the "Working Area WA" is a rectangle obtained by the intersection of the FOV with a plane placed at a certain working distance. A "Working Area WA" can be defined for each distance from the lens, but typically it is intended at the "Working Distance", corresponding to the distance at which the camera focus is placed.
- a "Slave Working Area, SWA” is defined here as the "Working Area WA” defined by the distance of a chosen point of the surgical instrument 170, i.e., the control point thereof along the main axis, for example:
- dF is the focal length of the viewing system
- dS is the distance of the instrument from the origin (camera)
- WA(dF) is the Working Area at dF
- WA(dS) is the Working Area at dS
- rF(dF) is the depth of field at dF.
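Under the truncated-pyramid FOV model, the Working Area at a given distance follows directly from the aperture angles of the camera; a minimal Python sketch (the aperture-angle parametrization and all names are illustrative assumptions, not values from this description):

```python
import math

def working_area(d, h_aperture_deg, v_aperture_deg):
    """Working Area WA(d): the rectangle cut from the pyramidal FOV by a
    plane placed at distance d from the lens.

    h_aperture_deg / v_aperture_deg: hypothetical horizontal and vertical
    aperture angles of the camera FOV.
    Returns (width, height) in the same unit as d.
    """
    w = 2.0 * d * math.tan(math.radians(h_aperture_deg) / 2.0)
    h = 2.0 * d * math.tan(math.radians(v_aperture_deg) / 2.0)
    return w, h
```

At a distance dS closer than dF the rectangle WA(dS) is proportionally smaller than WA(dF), which is consistent with the proportionality between magnification and working distance recalled above.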
- the operator-controlled surgical instrument will work at a distance dS in the range between dV and dF, where dV is for example 100 mm from the exoscopic head, and dF is instead at the center of the surgical site 180 where the focus is placed, for example 250 mm.
- the distance dS is in the region of the focus range for the focal length dF: [dF - rF/2, dF + rF/2].
- a functional of distance is further defined between the slave device (and the relative surgical instrument) and the limits of the viewing space FOV so that it is positive for points inside the FOV and negative for external points.
- Such a distance can be expressed either in visual coordinates (pixels), in metric coordinates (mm) or in normalized coordinates.
- An example of a distance is the Euclidean one in three-dimensional space between a chosen point of the slave device and the closest point of the contour surface of the FOV.
- Another purely planar distance can be obtained by estimating the distance of the slave device from the image edge in pixels or in normalized coordinates.
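The planar variant of such a distance functional, positive for points inside the image and negative for external points, can be sketched as follows (pixel-coordinate conventions and names are illustrative assumptions):

```python
def signed_edge_distance(u, v, width, height, normalized=False):
    """Signed planar distance of an instrument point (u, v) from the image edge.

    Positive inside the image, negative outside; the magnitude is the
    distance to the closest of the four edges, in pixels, or normalized by
    the shorter image side if requested. A simplified planar reading of the
    distance functional described above.
    """
    d = min(u, width - u, v, height - v)  # closest of the four edges
    if normalized:
        d /= min(width, height)
    return d
```

A point well inside the image gives a large positive value; as the instrument approaches an edge the value shrinks toward zero, and it becomes negative once the point leaves the frame, which is the property the threshold comparisons below rely on.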
- a viewing system which has two controllable parameters: "Zoom” K, from 1x to Mx, and "Working Distance” F from A to B in millimeters (for example, from 200mm to 550mm).
- the "Working Distance” parameter controls the focusing.
- a given "Working Distance" is not obtainable at every Zoom level and, in general, it is possible to identify a functional defining such a range ("WD_span") for each Zoom level.
- the "WD_span” range is maximum for 1x Zoom and decreases for high Zooms.
- the "Depth Of Field, DoF" parameter is a function of WD: the higher WD, the higher the DoF.
- the focus region is a length range DoF(WD) such that the focus region is [WD - DoF(WD)/2, WD + DoF(WD)/2].
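This focus-region relation can be written as a small helper; the DoF(WD) function is left pluggable, since the text only states that DoF grows with WD (an illustrative sketch, not the claimed implementation):

```python
def focus_region(wd, dof_of_wd):
    """Focus region [WD - DoF(WD)/2, WD + DoF(WD)/2] for a working distance WD.

    dof_of_wd: callable returning the depth of field for a given WD; any
    monotonically increasing model can be plugged in.
    """
    dof = dof_of_wd(wd)
    return wd - dof / 2.0, wd + dof / 2.0

def in_focus(ds, wd, dof_of_wd):
    """True if a point at distance ds from the camera lies in the focus region."""
    lo, hi = focus_region(wd, dof_of_wd)
    return lo <= ds <= hi
```

For example, with a hypothetical DoF model of 20% of WD and WD = 250 mm, the focus region spans 225 mm to 275 mm, so an instrument tip at dS = 260 mm is in focus while one at 300 mm is not.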
- an embodiment comprising a robotic system comprising a master-slave teleoperated robot, at least one associated surgical instrument, at least one master controller, at least one control and calculation unit CPU, and a slave workspace within which the surgical instruments can move.
- the teleoperated robot further comprises, or is associated with, an optical or digital viewing system (e.g., microscope, exoscope, or endoscope) defining a viewing space FOV.
- the robotic system is connected to the viewing system and is capable of commanding real-time changes of one or more parameters of the viewing system itself (such as zoom level, and/or focal length, and/or brightness, and/or exposure, and/or color filtering, and/or other parameters) based on events or states of the robotic system and/or conditions identified by the overall system consisting of robotic surgical system and viewing system.
- the method provides for the aforesaid overall system digitally using and processing the images and data of the viewing system in real time to identify the position or presence or non-presence of one or more surgical instruments (and/or of the respective terminal elements of the slave device, and/or hinged wrist and/or control point) in the viewing space FOV.
- the method further provides that, when in a teleoperation state in which at least one surgical instrument (i.e., the related slave device) is in the viewing space FOV, if the position commanded to the slave device has a distance from the limits of the viewing space FOV (indicated in the figures as "Outer Limit") below a threshold value "EPS", or is outside the workspace FOV, then at least one new FOV (second viewing space FOV2), or a representation thereof, containing the at least one surgical instrument is provided to the user, and/or the instrument is always kept inside the second viewing space FOV2 during its movement by automatically modulating the optical/digital parameters of the viewing system (without any physical movement).
- the method includes commanding the viewing system to reduce the optical and/or digital zoom in real time, therefore enlarging the field of view, so that one or more instruments are made visible or constantly kept inside the new FOV called FOV2, in which:
- the controlled pose of the slave device in the second viewing space FOV2 is at a distance greater than the threshold "eps”.
- "eps" is at most about 1/5 of the size of the FOV, and preferably 1/10;
- after widening the field of view (reaching the second viewing space FOV2), if the instrument returns inside the previous viewing space FOV1, the system is capable of controlling the zoom of the viewing system so as to return to a level which again identifies the starting FOV1. This is obtained by using a functional of distance which, if greater than a given threshold, triggers the transition from the second viewing space FOV2 back to the first viewing space FOV1.
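The FOV1/FOV2 switching driven by a distance functional and thresholds can be sketched as a small hysteresis state machine (an illustrative reading of the behavior described above; the threshold names, their values, and the two-threshold hysteresis model are assumptions):

```python
class FovSwitcher:
    """Hysteresis between FOV1 (zoomed) and FOV2 (wide).

    'dist' is the signed distance of the commanded instrument position from
    the limits of the FOV1 region (positive inside, negative outside).
    eps_out triggers the widening transition, eps_in the narrowing one;
    eps_in > eps_out avoids oscillating at the boundary.
    """
    def __init__(self, eps_out, eps_in):
        assert eps_in > eps_out  # re-entry threshold deeper inside: hysteresis
        self.eps_out = eps_out
        self.eps_in = eps_in
        self.current = "FOV1"

    def update(self, dist_from_fov1_limits):
        if self.current == "FOV1" and dist_from_fov1_limits <= self.eps_out:
            self.current = "FOV2"  # widen: instrument near or over the edge
        elif self.current == "FOV2" and dist_from_fov1_limits >= self.eps_in:
            self.current = "FOV1"  # instrument safely back inside FOV1
        return self.current
```

The gap between the two thresholds plays the role of the asymmetric transition areas discussed later for exit and re-entry: an instrument hovering exactly at the FOV1 boundary does not cause repeated zoom changes.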
- the first viewing space FOV1 is defined and stored at a considered start time, for example at the start of teleoperation of the robotic system.
- a plurality of "n" FOVs (FOVn) is defined and progressively modified and suggested in real time, where the "n" FOVs are in a linear zoom relationship.
- FOV2 is the smallest viewing space (in the camera space) capable of containing at least one surgical instrument of the slave device.
- the viewing space FOV must always contain at least two slave devices (i.e., two surgical instruments).
- the operator can, in a preoperative step or during the operation, define and/or save two different zoom values, and thus two viewing spaces (FOV1, FOV2), within which the system can move autonomously when the instruments are exiting or entering with respect to the aforesaid first viewing space FOV1.
- the viewing system will autonomously switch to the second viewing space FOV2, keeping the instruments in the field of view.
- while the surgical instruments are in FOV2, bringing the surgical instruments back towards the center of FOV2 and within the volume covered by FOV1, the viewing system autonomously passes from FOV2 to FOV1.
- FOV2 > FOV1, i.e., FOV2 has a lower zoom than FOV1 and is capable of framing a larger portion of the surgical area within which the surgical instruments move.
- the aim is to keep the moving slave device in focus starting from the step of reaching the edge.
- the focus is adjusted on the slave device, i.e., the "depth of field" region is brought to contain both the slave device and the initial work plane.
- the slave workspace contains the FOV1.
- if the available zoom is not sufficient, or the instrument reaches the limits of FOV2, the system exits teleoperation.
- the system does not exit teleoperation and the operator cannot exceed the limit of FOV3.
- if the viewing system has reached the minimum allowed zoom, or fails to return or keep at least one instrument in the FOV, the system exits teleoperation, or prevents exiting the FOV by means of appropriate constraints.
- the teleoperated system applies a scaled master-slave movement, with a scale factor possibly in the range between 5 and 20 times.
- the scale factor can be used to change the aforesaid "eps" threshold, also taking into account the maximum speed of the slave device.
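An illustrative way to let the scale factor's effect on slave speed widen the "eps" threshold (the linear reaction-time model, the parameter names, and the 1/5 cap, echoing the fraction mentioned above, are all assumptions for illustration):

```python
def eps_from_motion(base_eps, v_max_slave, reaction_time, fov_size):
    """Widen the 'eps' margin so that it covers the distance the slave tip
    can travel during the system reaction time, capped at 1/5 of the FOV
    size. All parameter names and the linear model are illustrative.
    """
    return min(fov_size / 5.0, base_eps + v_max_slave * reaction_time)
```

A larger master-slave scale factor implies faster slave motion for the same master gesture, so v_max_slave grows and the margin widens accordingly, up to the cap.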
- This mode is applicable to an analogue viewing system ("operating microscope”) provided with a camera, or to a digital viewing system (exoscope) connected to a screen.
- an appropriate calculation unit analyzes the camera image and decides on commands to send to the viewing system.
- the aforesaid Control and Calculation Unit (CPU) can be located on the robot, on the viewing system, on an independent third element, on a representation system such as a screen.
- the viewing means further comprise a screen or monitor on which the digital images provided by the viewing system and processed by an appropriate calculation unit are represented
- in the same operating session, the viewing system simultaneously and continuously acquires in real time both the anatomical image of the magnified operating scenario FOV1 and an anatomical image of the operating scenario in an "unzoomed" or lower-zoom mode having FOV2, where FOV2 ⊇ FOV1.
- the FOV2 is represented so as to cover a smaller section of the screen than the FOV1.
- the aforesaid overlay is displayed, for example, in an area less than half the area of the screen and/or, in an area preferably located at one of the four corners of the screen.
- the FOV2 is represented with a higher brightness, or FOV2 is represented with a colored or bright boundary.
- the second viewing space FOV2 is arranged on the screen in overlay from the side from which the instrument exited the first viewing space FOV1.
- FOV2 is shown on the whole screen, instead of the magnified image FOV1.
- the FOV2 occupies the entire screen, while the FOV1 is displayed in overlay, in a smaller-size representation with respect to the entire screen.
- the viewing means further comprise a secondary screen and the second viewing space FOV2 is represented on such a secondary screen.
- such a secondary screen is integral with the control console of the robotic system; in particular, it is integral with a seating structure and extends from the armrest of the console.
- the robotic system exits teleoperation if the instrument also exits the field of view FOV2 of the anatomical image of the operating scenario in the "unzoomed" or lower zoom mode.
- the viewing system and the robot are connected so that movements of the viewing system or modification of FOV parameters such as zoom are identified, monitored and recorded.
- the viewing system has a digital image acquisition sensor and is capable of acquiring, in real time and continuously, both a "zoomed" portion of the sensor (FOV1) of the anatomical area and the entire sensor, or a larger "unzoomed" portion thereof (FOV2).
- the viewing system is provided with two distinct and separate vision sensors or cameras, one defining FOV1 and one defining FOV2. In such a case, the point of view can be different.
- a FOV zoom change corresponds to a possible and automatic adjustment of exposure/brightness and/or focus.
- certain machine states and/or machine input and/or teleoperation system image analysis and/or teleoperation commands can trigger changes to viewing system parameters such as optical/digital zoom, focus, brightness, exposure, color filter.
- the aforesaid second visualization and first visualization are defined herein, respectively, as “visualization b" (having a lower zoom value, thus with a larger field of view) and “visualization a” (having a higher zoom value, thus with a smaller field of view).
- "visualization b" is distorted as if a lens were applied to "visualization a" (lens effect); this can occur by hiding an edge region between the two visualizations or by compressing an edge region between the two visualizations (as shown for example in figure 23).
- the two-dimensional or three-dimensional nature of the image is taken into account.
- the image of digital microscopes or exoscopes is three-dimensional, and therefore how the stereoscopic signal is treated must be considered.
- this involves a two-dimensional "visualization b" and a three-dimensional "visualization a”, in the aforesaid cases (i) and (ii), while three-dimensional visualizations are considered for case (iii).
- in accordance with an embodiment ("fusion"), a viewing system, a calculation unit and a screen on which the operator observes the operating field are available.
- the typical condition is to have the image taken by the viewing system on the screen with a given viewing space FOV.
- starting from a first viewing space FOV1, the robotic system selects a second FOV2.
- Such a second FOV2 is obtained by changing the parameters of the single camera system, or by means of a second camera system aligned with the first.
- the "fusion" mode visually superimposes the results of the two viewing spaces FOV through image combination techniques so as to provide a higher resolution than that of the single view, i.e., a greater field depth.
- the geometric relationship between FOV1 and FOV2 is known and, in the simplest case, the image provided by FOV1 is in the center of that of FOV2.
- the fusion system can render the transition dynamically and seamlessly.
- the fusion occurs per eye.
- the recognition of the surgical instruments in the aforesaid digital images occurs by applying an appropriate deep neural network trained to identify either the entire surgical instrument as an oriented rectangle, or a part thereof, or to separately identify all the parts thereof.
- a result can be obtained in the image space (pixel or normalized) or in the metric space if the camera is calibrated and the model of the identified objects is known.
- the recognition performed with neural networks can be combined with position tracking techniques based on probabilistic models or even networks. This distinction between recognition and tracking is computationally useful since recognition is typically slower.
- the distance thereof from the camera can be estimated and thus the distance from the FOV evaluated.
- the viewing system is mounted on an articulated arm with sensors at the joints which are capable of identifying movements of the position thereof in space.
- the robotic system is capable of reconstructing the FOV and/or the intersection of FOV1 and FOV2 by calculating the possible movement of the viewing system and the parameters thereof such as zoom and distance.
- within the same operating session, the robotic system stores, in a preparatory moment t1, the anatomical image of the operating scenario in an "unzoomed" mode or at a lower zoom than that of the teleoperation session later performed (at time t2) with higher zoom, FOV1 being smaller than FOV2.
- the prerecorded "static" anatomical image of the operating scenario in "unzoomed” or lower zoom mode is shown to the user in overlay on a screen portion where the magnified FOV is also represented or on a secondary screen. Such a representation only occurs if the viewing system has not been moved between t1 and t2.
- such representations of FOV1 and/or evaluations of the workspace limits thereof are inhibited if the system identifies the movement of the viewing system between t1 and t2 or after t2.
- the viewing system is mounted on an articulated arm with sensors at the joints or active controllable robotic arm, or the FOV is a magnified small subset of the actual field of view of a digital viewing system.
- the viewing system moves to keep the instruments within a central FOV area.
- the image acquisition device 120 can be adapted to acquire images according to two points of view and/or according to two different settings, thereby defining said viewing spaces FOV1 and FOV2.
- the image of a viewing space FOV2 can be acquired during a preparatory step, to be included in advance, and stored in a memory of the vision processing unit (e.g., in the case of a panoramic view of the surgical site 180).
- the on-screen 130 visualization of the surgical device 170, when outside the first viewing space FOV1 and inside the second viewing space FOV2, can be processed by the system based on data from the motors of the slave surgical device 170 (thereby producing a virtual on-screen 130 image of the slave surgical instrument when outside the first viewing space FOV1).
- the robotic system 100 comprises a pair of master control devices 110 which control a respective pair of slave robotic manipulators 171L, 171R, in which a slave surgical instrument 170 is connected to each robotic manipulator.
- the image acquisition device 120 (for example a camera 120) can be arranged between the robotic manipulators 171L, 171R of the pair, so that the field of view FOV1, FOV2 can include both the slave surgical instruments 170 of the pair, if necessary.
- the image acquisition device 120 can be mounted on an articulated arm 126, which for example is a robotic arm 126 (shown in figure 1B).
- Computing units 125 can be provided, arranged on the data connections 128, 129 between the image acquisition device 120 and the screen 130 as well as between the image acquisition device 120 and the master control device 110 as well as between the image acquisition device 120 and the slave device 170.
- two screens 130 can be provided, which can be used to display two different superimposed viewing spaces FOV1 , FOV2.
- a screen 130 can be mounted to a chair 116 of a master control console.
- a transition area (indicated in figure 4 with "eps") can be defined, within the first viewing space FOV1 , wherein, when the surgical instrument is located in said transition area eps, a transition to or from another viewing space FOV2 occurs.
- a portion FOVT of the viewing space FOV1 can thus be defined in which a viewing space transition does not occur, and a transition area eps, inside the boundaries of the first viewing space FOV1, in which the transition to the wide field of view FOV2 occurs, when the surgical instrument 170 moves towards the outside of the field of view FOV1.
- a transition region eps can also be provided when approaching (re-entering) the first narrow field of view FOV1.
- the width of the transition region eps upon reentry into the narrow-field viewing space FOV1 can be different from (in this example, larger than) the transition area eps for exiting the narrow-field viewing space FOV1.
- an image 131 of the first viewing space FOV1, including an image 137L, 137R of a single slave surgical instrument 170, can be displayed on the screen or display 130, while an image 132 of the second viewing space FOV2, including images 137L, 137R of both slave surgical instruments of the pair, is displayed as an image 135 in overlay.
- An indicator 134 for indicating the position of the other slave surgical instrument of the pair can be displayed in overlay on the image 131 of the first narrow-field viewing space FOV1.
- the image 135 displayed in overlay can be the image 131 of the first viewing space FOV1 with higher zoom, as shown for example in figure 13, or, vice versa, can be the image 132 of the second wide-field viewing space FOV2.
- an image which is the digital fusion of the images 131 and 132 related to the respective viewing spaces FOV1 , FOV2 can be shown on the screen 130.
- the image acquisition device 120 can comprise a single digital sensor 121 capable of acquiring a wide viewing space FOV2, and an image processing unit 125 generating a digital magnification (zoom) of a portion of said wide viewing space, thereby defining the viewing space FOV1, as shown for example in figure 16A. This avoids the need to provide optical systems to be moved.
- the digital sensor 121 can be associated with an optical unit 123 (which can comprise one or more lenses) which determines the magnification (zoom), and an automatic movement system of the optical unit 123 can be provided.
- two coordinated sensors 121, 122 can be provided to generate a three-dimensional (3D) image, for example by means of the image processing unit 125.
- the viewing spaces FOV1 and FOV2 can be acquired by different sensors 121, 122 with different points of view.
- the narrow and wide viewing spaces FOV1, FOV2 can both be included in the slave workspace 175 of the slave device 170, for example the space of the joints of the manipulators 171L, 171R, or the slave workspace 175 can be included entirely in the wide field of view FOV2.
- the system can automatically switch to the wide second viewing space FOV2.
- the detection of the slave surgical instrument 170 within the spatial tolerance area eps can occur by means of digital visual identification (computer vision), to determine the transition to the second viewing space FOV2, and/or by means of control over the input commands provided by the master control device 110 (i.e., by the operator 150).
- the system initiates digital visual identification (Computer vision) to manage the transition times towards a wide second viewing space FOV2.
- the system can be configured to automatically switch to the narrow first field of view FOV1.
- a plurality of gradually wider fields of view can be provided (as shown for example in figures 21 A to 21 E).
- the image processing unit 125 can be configured to optimize the zoom level to keep the slave surgical instrument 170 always within the current viewing space (e.g., FOV1 , FOV2, FOV3, and/or FOV4, as shown for example in figures 21A-21E and 22A-22C).
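A minimal sketch of such a zoom-level optimization over gradually wider viewing spaces (the concentric geometry, normalized coordinates, and margin parameter are illustrative assumptions, not the claimed implementation):

```python
def select_fov_level(point, fovs, margin):
    """Pick the most-zoomed of gradually wider, concentric viewing spaces
    that still contains the instrument point with a safety margin.

    fovs: list of FOV half-widths in the widest-FOV frame, sorted from
    narrowest (highest zoom, e.g. FOV1) to widest (e.g. FOV4).
    point: (x, y) instrument offset from the common FOV center.
    Returns the index of the selected FOV (0 = narrowest).
    """
    x, y = point
    for i, half in enumerate(fovs):
        if abs(x) <= half - margin and abs(y) <= half - margin:
            return i  # first (most-zoomed) FOV that safely contains the point
    return len(fovs) - 1  # fall back to the widest available FOV
```

An instrument near the center keeps the highest zoom; as it drifts outward, successively wider viewing spaces are selected, mirroring the FOV1 to FOV4 progression shown in the figures.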
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2024263143A AU2024263143A1 (en) | 2023-04-26 | 2024-04-22 | Method for keeping a surgical instrument of a robotic surgery system, during its movement control, within the field of view of a viewing system and related robotic surgery system |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IT102023000008145 | 2023-04-26 | ||
| IT202300008145 | 2023-04-26 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024224268A1 true WO2024224268A1 (fr) | 2024-10-31 |
Family
ID=87889958
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2024/053901 Pending WO2024224268A1 (fr) | 2023-04-26 | 2024-04-22 | Procédé de maintien d'un instrument chirurgical d'un système de chirurgie robotique, pendant sa commande de mouvement, dans le champ de vision d'un système de visualisation et système de chirurgie robotique associé |
Country Status (2)
| Country | Link |
|---|---|
| AU (1) | AU2024263143A1 (fr) |
| WO (1) | WO2024224268A1 (fr) |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080108873A1 (en) * | 2006-11-03 | 2008-05-08 | Abhishek Gattani | System and method for the automated zooming of a surgical camera |
| US20090248036A1 (en) * | 2008-03-28 | 2009-10-01 | Intuitive Surgical, Inc. | Controlling a robotic surgical tool with a display monitor |
| US20130331644A1 (en) * | 2010-12-10 | 2013-12-12 | Abhilash Pandya | Intelligent autonomous camera control for robotics with medical, military, and space applications |
| US20170210012A1 (en) * | 2006-06-29 | 2017-07-27 | Intuitive Surgical Operations, Inc. | Tool position and identification indicator displayed in a boundary area of a computer display screen |
| US20180296280A1 (en) * | 2015-12-24 | 2018-10-18 | Olympus Corporation | Medical manipulator system and image display method therefor |
| US20200261160A1 (en) * | 2017-09-05 | 2020-08-20 | Covidien Lp | Robotic surgical systems and methods and computer-readable media for controlling them |
| US20210030497A1 (en) * | 2019-07-31 | 2021-02-04 | Auris Health, Inc. | Apparatus, systems, and methods to facilitate instrument visualization |
| JP2023008313A (ja) * | 2021-07-05 | 2023-01-19 | 国立大学法人神戸大学 | 手術システム、表示方法およびプログラム |
Also Published As
| Publication number | Publication date |
|---|---|
| AU2024263143A1 (en) | 2025-11-06 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24730065; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: AU2024263143; Country of ref document: AU |
| | REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112025023194; Country of ref document: BR |
| | ENP | Entry into the national phase | Ref document number: 2024263143; Country of ref document: AU; Date of ref document: 20240422; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 2024730065; Country of ref document: EP |