WO2023237105A1 - Method for displaying a virtual surgical instrument on a surgeon console, and surgeon console
- Publication number
- WO2023237105A1 (application PCT/CN2023/099487, CN2023099487W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- control arm
- main control
- virtual
- instrument
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/37—Leader-follower robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/74—Manipulators with manual electric input means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B2034/301—Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/74—Manipulators with manual electric input means
- A61B2034/742—Joysticks
Definitions
- This specification relates to the technical field of endoscopic surgery, and in particular to a method for displaying virtual surgical instruments on a doctor's console and a doctor's console.
- In existing systems, the doctor uses the main control arm and foot pedals in the doctor's console to control the surgical instruments in the endoscopic surgical environment, which requires the doctor to coordinate hand movements and foot pedal movements.
- Images of the surgical instruments in the endoscopic surgical environment are displayed on a stereoscopic monitor in the doctor's console.
- When operating, doctors can rely only on training or on muscle memory from past experience, so misoperation occurs easily, affecting the quality and success rate of the operation.
- Embodiments of this specification provide a method for displaying virtual surgical instruments on a doctor's console, and a doctor's console, to solve the prior-art problem that misoperation occurs easily when surgery is performed relying only on the endoscopic images shown on a stereoscopic monitor.
- Embodiments of this specification provide a method for displaying virtual surgical instruments on a doctor's console, including: acquiring an image of the main control arm of the doctor's console; determining posture data of the main control arm based on the image of the main control arm; and generating virtual posture data of the surgical instrument based on the posture data of the main control arm and a three-dimensional model of the surgical instrument, so as to display a virtual instruction instrument image on the stereoscopic monitor of the doctor's console.
- Embodiments of this specification also provide a doctor's console, including: an image acquisition device for collecting images of the main control arm of the doctor's console; an image processing device for determining posture data of the main control arm based on the image of the main control arm, and for generating virtual posture data of the surgical instrument based on the posture data of the main control arm and the three-dimensional model of the surgical instrument; and a stereoscopic monitor for displaying a virtual instruction instrument image according to the virtual posture data.
- Embodiments of this specification also provide a medical device, including a processor and a memory storing instructions executable by the processor.
- When the processor executes the instructions, it implements the steps of the method for displaying virtual surgical instruments on a doctor's console described in any of the above embodiments.
- Embodiments of this specification also provide a computer-readable storage medium on which computer instructions are stored. When the instructions are executed, the steps of the method for displaying virtual surgical instruments on a doctor's console described in any of the above embodiments are implemented.
- In the embodiments of this specification, a method for displaying virtual surgical instruments on a doctor's console is provided.
- An image of the main control arm of the doctor's console can be obtained, and the posture data of the main control arm can be determined based on the obtained image. Virtual posture data of the surgical instrument is then generated based on the posture data of the main control arm and the three-dimensional model of the surgical instrument, so as to display a virtual instruction instrument image on the stereoscopic monitor of the doctor's console.
- In this solution, the posture data of the main control arm is first determined based on the image of the main control arm; since there is a master-slave relationship between the main control arm and the surgical instruments, the posture data of the main control arm and the three-dimensional model of the surgical instruments can then be used to generate the virtual posture data of the surgical instruments.
- Figure 1 shows a schematic structural diagram of an endoscopic surgery system according to an embodiment of this specification
- Figure 2 shows a schematic diagram of a doctor's console in an embodiment of this specification
- FIG. 3 shows a schematic diagram of the image acquisition device in the embodiment of this specification
- Figure 4 shows a schematic diagram of a patient surgical platform in an embodiment of the present specification
- Figure 5 shows a schematic structural diagram of the image platform in the embodiment of this specification
- Figure 6 shows a flow chart of a method for displaying virtual surgical instruments on a doctor's console in an embodiment of the present specification
- Figure 7 shows a structural block diagram of the doctor's console in the embodiment of this specification.
- Figure 8 shows the enhanced display control logic block diagram in the embodiment of this specification
- Figure 9 shows a flow chart of virtual command instrument image display in the embodiment of this specification.
- Figure 10 shows a flow chart of the association between the main hand posture and the virtual instrument posture in the embodiment of this specification
- Figure 11 shows a schematic diagram of the camera coordinate system and the world coordinate system in the embodiment of this specification
- Figure 12 shows a schematic diagram of the transformation of the world coordinate system and the camera coordinate system in the embodiment of this specification
- Figure 13 shows a schematic diagram of the virtual foot position and foot action image display in the embodiment of this specification
- Figure 14 shows a schematic diagram of the prompt for excessive position following deviation of the surgical instrument in the embodiment of this specification
- Figure 15 shows a schematic diagram of the position display of the foot pedal panel in the embodiment of this specification
- Figure 16 shows a schematic diagram of foot action display and effective pedaling display in the embodiment of this specification
- Figure 17 shows a schematic diagram of the operation flow of the endoscopic surgical robot in the embodiment of this specification.
- Figure 18 shows a schematic flow chart of virtual instruction device acquisition and display in the embodiment of this specification
- Figure 19 shows a schematic flow chart of virtual foot position and foot movement display in the embodiment of this specification.
- Figure 20 shows a schematic structural diagram of the medical equipment in the embodiment of this specification.
- Embodiments of this specification provide a method for displaying virtual surgical instruments on a doctor's console and a doctor's console, which can be applied to endoscopic surgery systems.
- Figure 1 shows a schematic structural diagram of the endoscopic surgery system.
- the endoscopic surgery system may consist of a doctor's console, a patient surgical platform (including instruments and endoscopic instruments), and an image platform.
- the doctor can operate at the doctor's console and provide motion control instructions to the instruments on the patient's surgical platform.
- the endoscopic images are processed by the image platform and displayed on the image platform and on the stereoscopic monitor of the doctor's console respectively.
- FIG. 2 shows a schematic diagram of a doctor's console in an embodiment of this specification.
- the doctor's console is the control center of the endoscopic surgery system, which can provide doctors with clear images and provide control signal input necessary for surgical operations.
- the main parts of the doctor's console may include: a stereoscopic monitor 201, a main operating hand (or main control arm) 202, a foot panel 203, and an image acquisition device (not shown in Figure 2).
- the stereoscopic monitor can present the same image or video information through the left and right screens respectively.
- the left eye observes the left screen and the right eye observes the right screen.
- the brain automatically synthesizes images into stereoscopic vision.
- the main functions of the stereoscopic monitor may include: displaying endoscopic visual images, displaying virtual command instrument images, and displaying virtual foot pedal positions and foot movements.
- the foot pedal panel provides auxiliary operation functions, operating in conjunction with the main operating hand/main control arm to provide control signals to surgical instruments.
- the main functions of the foot panel may include: operating surgical instruments, operating endoscopes, and other auxiliary functions.
- the doctor's console can be equipped with two main operating hands/main control arms, which receive operating signals from the operator's left and right hands respectively.
- the main control arm can monitor the operator's hand movement information and is the main motion control input of the entire system.
- the operator can control the movement of the tool arm by operating the control handle at the end of the main control arm, thereby controlling the operation of the endoscope and surgical instruments.
- FIG. 3 shows a schematic diagram of the image acquisition device in the embodiment of this specification.
- the image acquisition device may include one or more binocular cameras. A binocular camera can use the difference in the pixel positions of feature points in the images taken by its two lenses to obtain object depth information. Using the 3D technology employed in VR, the position and posture of objects can then be obtained.
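As an illustration of the disparity-to-depth relation described above (a sketch, not part of the patent itself), the pinhole stereo model gives the distance of a feature point as Z = f·B/d. The focal length, baseline, and pixel values below are invented for the example.

```python
# Depth from binocular disparity under the pinhole stereo model: Z = f * B / d.
# f: focal length in pixels; B: baseline between the two lenses in metres;
# d: disparity, i.e. the pixel-column difference of the same feature point
# in the left-eye and right-eye images. All numbers are illustrative.

def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance of a feature point from the camera, in metres."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return f_px * baseline_m / disparity_px

# A feature point seen at column 640 in the left image and column 620 in the
# right image has a disparity of 20 px; with f = 800 px and B = 0.06 m:
z = depth_from_disparity(800.0, 0.06, 640.0 - 620.0)  # 2.4 m
```

Nearer points produce larger disparities, which is why stereo depth resolution degrades with distance.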
- the image acquisition device mainly monitors the position of the operator's hand, foot panel and foot movements.
- the image collection device may include a binocular camera and an infrared rangefinder. By setting up an additional infrared rangefinder, it is easier to obtain the depth information of the object and then obtain the attitude data of the main control arm.
- the image capture device may include an RGB-D (RGB-Depth) camera.
- the depth map collected by the RGB-D camera contains an image channel carrying information about the distance from the viewpoint to the surfaces of objects in the scene.
- the channel itself is similar to a grayscale image, except that each pixel value is the actual distance from the sensor to the object. From this, the object's depth and position information can be obtained, and then the posture data of the main control arm.
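To illustrate how a depth-map pixel yields position information (a sketch, not the patent's implementation; the intrinsic parameters are invented), each pixel can be back-projected into a 3D point in the camera frame:

```python
# Back-projecting one pixel (u, v) of an RGB-D depth map into a 3D point in
# the camera coordinate frame, using illustrative pinhole intrinsics
# (focal lengths fx, fy and principal point cx, cy, all in pixels).

def pixel_to_point(u, v, depth_m, fx, fy, cx, cy):
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel at (400, 300) whose depth channel reads 1.5 m:
p = pixel_to_point(400, 300, 1.5, 600.0, 600.0, 320.0, 240.0)
```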
- the image collection device may include multiple monocular cameras, for example two monocular cameras arranged vertically.
- by processing the images collected by the multiple monocular cameras, the depth information of the object can be obtained, and then the posture data of the main control arm.
- the patient surgical platform is the operating platform of the endoscopic surgery system, located next to the patient's operating bed. It consists of three units: a surgical trolley, an adjustment arm, and a tool arm (including an image arm). The instruments and endoscope installed on the tool arm serve as slave mechanisms and accept motion control commands from the doctor's console.
- the image platform is the visual feedback subsystem of the endoscopic surgery system, and may include three stand-alone units: a three-dimensional electronic endoscope, an endoscope image processing host, and an imaging cart. It can provide energy, endoscopic vision processing, image display and other functions for the instruments.
- the image, synchronized with the doctor's stereoscopic monitor, is mainly for viewing by personnel other than the operating doctor.
- Figure 6 shows a flowchart of a method for displaying virtual surgical instruments on a doctor's console in an embodiment of this specification.
- although this specification provides method operation steps or device structures as shown in the following embodiments or drawings, more or fewer operation steps or module units may be included in the method or device based on routine practice or without creative effort.
- for steps or structures that have no necessary causal relationship in logic, the execution order of these steps or the module structure of the device is not limited to the execution order or module structure described in the embodiments of this specification and shown in the drawings.
- when the described methods or module structures are applied in actual devices or terminal products, they can be executed sequentially or in parallel according to the methods or module structures shown in the embodiments or drawings (for example, in a parallel-processor or multi-threaded processing environment, or even a distributed processing environment).
- a method for displaying virtual surgical instruments on a doctor's console may include the following steps:
- Step S601 Obtain an image of the main control arm of the doctor's console.
- the methods in the embodiments of this specification can be applied to image processing devices.
- the image processing device may acquire an image of the main control arm of the doctor's console.
- the main control arm may include a left main control arm and a right main control arm.
- images of the main control arm during endoscopic surgery may be acquired by an image acquisition device.
- the image processing device may acquire the image of the main control arm from the image acquisition device.
- the image of the main control arm may include an image of the left main control arm and an image of the right main control arm at one or more moments in the endoscopic surgical environment.
- the image acquisition device can acquire images of the main control arm during endoscopic surgery in real time.
- the image processing device may acquire images of the main control arm from the image acquisition device every preset time period.
- Step S602 Determine the posture data of the main control arm based on the image of the main control arm.
- the image processing device may determine the attitude data of the main control arm based on the image of the main control arm.
- the attitude data of the main control arm may include attitude data of multiple feature points on the main control arm.
- the attitude of the main control arm can be characterized by the attitude data of multiple feature points.
- the posture data of each feature point may include at least one of the following: position coordinates, displacement direction, displacement amount, rotation angle, rotation direction and other various data.
- the image processing device can determine the posture data of multiple feature points on the main control arm based on the images of the main control arm at two adjacent times, and then determine the posture data of the main control arm.
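One common way to turn feature-point positions at two adjacent times into displacement and rotation data — offered here as a hedged sketch, since the patent does not specify the algorithm — is the Kabsch/SVD rigid-alignment method:

```python
# Estimating the rigid motion (rotation R, translation t) of the main
# control arm between two adjacent frames from the 3D positions of its
# feature points, via the Kabsch/SVD method. The point data is illustrative.
import numpy as np

def rigid_motion(pts_prev: np.ndarray, pts_curr: np.ndarray):
    """Return R, t such that pts_curr[i] ~= R @ pts_prev[i] + t."""
    c_prev, c_curr = pts_prev.mean(axis=0), pts_curr.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (pts_prev - c_prev).T @ (pts_curr - c_curr)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guarantees a proper rotation (det = +1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_curr - R @ c_prev
    return R, t

# Three non-collinear feature points on the arm, translated by (0.01, 0, 0):
prev = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
curr = prev + np.array([0.01, 0.0, 0.0])
R, t = rigid_motion(prev, curr)  # R ~ identity, t ~ (0.01, 0, 0)
```

With at least three non-collinear feature points, the rotation is uniquely determined, consistent with treating the arm as a rigid body.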
- Step S603 Generate virtual posture data of the surgical instrument based on the posture data of the main control arm and the three-dimensional model of the surgical instrument to display a virtual instruction instrument image on the stereoscopic monitor of the doctor's console.
- the posture data of the surgical instrument can be generated based on the posture data of the main control arm and the three-dimensional model of the surgical instrument.
- the virtual posture data of the surgical instrument may include posture data of the surgical instrument corresponding to the posture data of the main control arm.
- the correspondence between the attitude data of the main control arm and the attitude data of the surgical instrument may be stored in the image processing device.
- the posture data of the surgical instrument may also include posture data of multiple instrument feature points.
- the posture data of each instrument feature point may include at least one of the following: position coordinates, displacement direction, displacement amount, rotation angle, rotation direction and other various data.
- the image processing device can generate virtual posture data of the surgical instrument based on the posture data of the main control arm and the three-dimensional model of the surgical instrument.
- the three-dimensional model of the surgical instrument may be pre-established model data of the surgical instrument in a three-dimensional coordinate system.
- the virtual posture data of the surgical instrument refers to the posture data of the surgical instrument under the operation of the main control arm.
- the virtual instruction instrument image can be displayed on the stereoscopic monitor of the doctor's console according to the virtual posture data of the surgical instrument.
- the virtual instruction instrument image and the actual instrument image collected by the endoscope can be displayed simultaneously on the stereoscopic monitor.
- the posture data of the main control arm can be determined first based on the image of the main control arm. Since there is a master-slave relationship between the main control arm and the surgical instrument, the posture data of the main control arm and the three-dimensional model of the surgical instrument can be used to generate virtual posture data of the surgical instrument, and a virtual instruction instrument image can then be displayed on the stereoscopic monitor based on that virtual posture data. By displaying virtual instruction instrument images on the stereoscopic monitor, it is easier for doctors to detect misoperations and instrument failures in time, thereby reducing misoperations, improving surgical quality and success rate, and improving the surgical experience of doctors and patients.
- the method may further include: acquiring a foot pedal image of the doctor's console and/or a foot image of the operator; generating foot pedal position data based on the foot pedal image, and/or generating foot position data based on the foot image, so as to display the footrest area and/or the foot area on the stereoscopic monitor.
- the image acquisition device can collect foot pedal images of the doctor's console and foot images of the operator performing pedal operations within the pedal area.
- the image processing device can acquire the foot pedal images and foot images collected by the image acquisition device.
- the image processing device may generate foot pedal position data based on the foot pedal image.
- the footrest position data may include position data and size data for each of a plurality of footrests in the physician console.
- the image processing device can display the footrest area on the stereoscopic monitor based on the footrest position data.
- the footrest area may include multiple footrest sub-areas, and each footrest sub-area corresponds to one footrest.
- the image processing device may also generate foot position data based on the foot image.
- the foot position data may include position data and size data of the operator's left and/or right foot.
- the image processing device can display the foot area on the stereoscopic monitor based on the foot position data.
- the foot area may include a left foot area corresponding to the left foot and a right foot area corresponding to the right foot.
- the method may further include: when one or more foot pedals of the doctor's console are effectively stepped on, highlighting, on the stereoscopic monitor, the one or more pedals in the pedal area that are effectively stepped on.
- a pressure sensor is provided below each of the plurality of foot pedals of the doctor's console.
- the pressure sensor corresponding to each pedal can be connected to an image processing device.
- the image processing device can determine whether the pressure value detected by each pressure sensor is greater than a preset pressure threshold, and judge the pedal corresponding to any pressure sensor whose pressure value is greater than the preset pressure threshold as effectively stepped on. Based on the determination result, the image processing device highlights the one or more pedals that are effectively stepped on in the pedal area on the stereoscopic monitor.
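The threshold test described above can be sketched as follows; the pedal names, forces, and threshold value are invented for illustration.

```python
# Deciding which pedals are "effectively stepped on": a pedal qualifies when
# its pressure sensor reads above a preset threshold. Values are illustrative.

PRESSURE_THRESHOLD_N = 20.0  # assumed preset pressure threshold, in newtons

def pedals_to_highlight(sensor_readings: dict) -> list:
    """Names of pedals whose sensor reading exceeds the threshold."""
    return [pedal for pedal, force in sensor_readings.items()
            if force > PRESSURE_THRESHOLD_N]

readings = {"coagulation": 35.2, "cutting": 3.1, "clutch": 22.8}
highlighted = pedals_to_highlight(readings)  # ["coagulation", "clutch"]
```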
- the image of the main control arm may include a left-eye image and a right-eye image of the main control arm collected by a binocular camera.
- the image acquisition device may include a binocular camera.
- the captured image of the main control arm may include the left eye image and the right eye image of the main control arm captured by the binocular camera.
- the image processing device can obtain depth information based on the left eye image and the right eye image, so that the instrument can be displayed stereoscopically.
- determining the posture data of the main control arm based on the image of the main control arm may include: obtaining the spatial position data of the left-eye camera and the right-eye camera of the binocular camera; determining the spatial position data corresponding to each feature point based on the cameras' spatial position data and the position parameters of each of the plurality of feature points of the main control arm in the left-eye image and the right-eye image; and determining the posture data of the main control arm according to the spatial position data corresponding to each feature point at multiple times.
- the main control arm can be regarded as a rigid body.
- the position or attitude of three points that are not on a straight line can determine the position or attitude of the rigid body. Therefore, the attitude of the main control arm can be determined through the attitude of at least three characteristic points on the main control arm.
- the spatial position data of the left-eye camera and the right-eye camera in the binocular camera can be obtained first. Then, based on the spatial position data of the two cameras and the position parameters of at least three feature points on the main control arm, the spatial position data corresponding to each feature point can be determined, that is, the spatial position data of each feature point in the world coordinate system.
- the posture data of the main control arm can then be determined based on the spatial position data corresponding to each feature point. Through the above method, the posture data of the main control arm can be determined from the acquired image of the main control arm.
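A standard way to recover each feature point's spatial position from its pixel coordinates in the left-eye and right-eye images is linear (DLT) triangulation. The projection matrices below are invented for the example; the patent only states that the cameras' spatial position data and the points' image positions are used.

```python
# Linear (DLT) triangulation of one feature point from its pixel positions
# in the left-eye and right-eye images, given each camera's 3x4 projection
# matrix (intrinsics plus spatial position). Matrices here are illustrative.
import numpy as np

def triangulate(P_left, P_right, uv_left, uv_right):
    """Least-squares 3D position of a point seen by both cameras."""
    rows = []
    for P, (u, v) in ((P_left, uv_left), (P_right, uv_right)):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.array(rows)
    _, _, Vt = np.linalg.svd(A)        # null vector = homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

# Two axis-aligned cameras 0.06 m apart (focal length 800 px, principal
# point at the image origin for simplicity):
K = np.array([[800.0, 0.0, 0.0], [0.0, 800.0, 0.0], [0.0, 0.0, 1.0]])
P_l = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_r = K @ np.hstack([np.eye(3), np.array([[-0.06], [0.0], [0.0]])])
point = triangulate(P_l, P_r, (20.0, 0.0), (0.0, 0.0))  # ~ (0.06, 0, 2.4)
```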
- generating virtual posture data of the surgical instrument based on the posture data of the main control arm and the three-dimensional model of the surgical instrument may include: generating control instruction information of the main control arm based on the posture data of the main control arm; and applying the control instruction information of the main control arm to the three-dimensional model of the surgical instrument to generate the virtual posture data of the surgical instrument.
- the image processing device may generate the control instruction information of the main control arm based on the posture data of the main control arm.
- the control instruction information may include an operation instruction for the robotic arm, and the operation instruction may include but is not limited to at least one of the following: a specified translation distance, a specified rotation angle, a rotation direction, a rotation axis, a rotation position, a specified opening angle, etc.
- the image processing device may store a correspondence between the posture data of the main control arm and the control instructions, or a conversion formula between the two. Based on the stored correspondence or conversion formula, the control instruction information of the main control arm can be generated from its posture data.
- the control instruction corresponding to the control instruction information is applied to the three-dimensional model of the surgical instrument, thereby obtaining the virtual posture data of the surgical instrument.
- the three-dimensional model of the surgical instrument can be operated according to the operating instructions in the control instruction to obtain the virtual posture data of the surgical instrument.
- virtual posture data of the surgical instrument can be generated based on the posture data of the control arm and the three-dimensional model of the surgical instrument.
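As a minimal sketch of applying one control instruction to the three-dimensional model — the instruction format here (a rotation about the z-axis followed by a translation) is an assumption, since the patent allows several instruction types:

```python
# Applying one control instruction (rotation about the z-axis by angle_rad,
# then a translation) to the vertices of the instrument's 3D model, yielding
# its virtual posture. Model points and instruction values are illustrative.
import math

def apply_instruction(points, angle_rad, translation):
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    tx, ty, tz = translation
    return [(c * x - s * y + tx, s * x + c * y + ty, z + tz)
            for x, y, z in points]

model = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
# Rotate 90 degrees about z, then lift by 0.1 along z:
posed = apply_instruction(model, math.pi / 2, (0.0, 0.0, 0.1))
```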
- generating virtual posture data of the surgical instrument based on the posture data of the main control arm and the three-dimensional model of the surgical instrument may include: obtaining a target machine learning model, where the target machine learning model is pre-established based on the three-dimensional model of the surgical instrument; and inputting the posture data of the main control arm into the target machine learning model to obtain the virtual posture data of the surgical instrument.
- a preset control algorithm can be used to generate the virtual posture data of the surgical instrument.
- the preset control algorithm may be implemented as a machine learning model containing multiple operators.
- after the posture data of the main control arm is input, the virtual posture data of the surgical instrument can be output.
- the machine learning model can be pre-built based on a three-dimensional model of the surgical instrument.
- before obtaining the target machine learning model, the method may also include: constructing a machine learning model based on the three-dimensional model of the surgical instrument; obtaining the posture data of the main control arm during endoscopic surgery and the posture data of the corresponding surgical instruments, and constructing a training sample set based on them; and training the machine learning model with the constructed training sample set to obtain the target machine learning model.
- a machine learning model can be built based on the three-dimensional model of the surgical instrument.
- the posture data of the control arm and the posture data of the corresponding surgical instruments can be obtained.
- a training sample set is constructed.
- the attitude data of the control arm is the input data
- the attitude data of the surgical instrument is the label.
- the pre-established machine learning model is trained using the constructed training sample set to obtain a target machine learning model.
- the target machine learning model can determine the posture data of the surgical instrument based on the posture data of the control arm for a specific surgical instrument.
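A toy stand-in for this training procedure, with a plain least-squares linear model in place of the unspecified machine learning model and synthetic 6-component pose vectors in place of real surgical data:

```python
# Fitting a mapping from main-control-arm posture data (inputs) to
# instrument posture data (labels) on a training sample set, then
# predicting the instrument pose for a new arm pose. Everything here is
# synthetic: a real system would use recorded surgical data and a richer
# model than ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
arm_poses = rng.normal(size=(100, 6))     # training inputs (arm postures)
true_W = np.eye(6) * 0.5                  # pretend master-slave scaling
instrument_poses = arm_poses @ true_W     # training labels (instrument postures)

# Least-squares fit of instrument_pose ~= arm_pose @ W.
W, *_ = np.linalg.lstsq(arm_poses, instrument_poses, rcond=None)

new_arm_pose = np.ones(6)
predicted = new_arm_pose @ W              # ~0.5 for each pose component
```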
- the actual instrument image of the surgical instrument collected by the endoscope is also displayed on the stereoscopic monitor. After the virtual instruction instrument image is displayed on the stereoscopic monitor of the doctor's console, the method may also include: comparing the actual instrument image with the virtual instruction instrument image; determining, based on the comparison result, the target portion of the virtual instruction instrument image whose difference from the actual instrument image exceeds a preset range; and annotating and displaying the target portion on the stereoscopic monitor.
- the actual instrument image of the surgical instrument collected by the endoscope and the virtual instruction instrument image determined based on the posture data of the main control arm can be displayed simultaneously on the stereoscopic monitor.
- the image processing device can compare the actual instrument image with the virtual command instrument image.
- the image processing device can divide the surgical instrument into multiple parts and compare each part in the virtual instruction instrument image with the corresponding part in the actual instrument image. According to the comparison results, the target portion of the virtual instruction instrument image whose difference from the actual instrument image exceeds a preset range can be determined.
- the determined target part can be marked and displayed on the stereo monitor.
- For example, if the positional deviation of a certain part in the virtual instruction instrument image relative to the corresponding part in the actual instrument image is greater than a preset distance, the part is determined to be the target part. For another example, if the rotation angle of a certain part in the virtual instruction instrument image about a fixed axis relative to the corresponding part in the actual instrument image is greater than the preset angle, the part is determined to be the target part. For another example, if the rotation angle of a certain part in the virtual instruction instrument image around a fixed point relative to the corresponding part in the actual instrument image is greater than the preset angle, the part is determined to be the target part.
- the image processing device may control the label display of the target part in the stereoscopic monitor.
- the annotation display may display the target part in different colors, for example, it may be highlighted in red, highlighted in yellow, etc.
- the annotation display may be a flashing display of the target part.
- the actual instrument image and the virtual instruction instrument image can be displayed in the same coordinate system and at the same scale, which makes it easier for the operator to observe the target part in time, reduces misoperation, allows instrument failure to be detected promptly, and improves the quality and success rate of the surgery.
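The per-part comparison against preset ranges might look like the following sketch; the part names, deviation measures, and threshold values are all illustrative assumptions.

```python
# Flagging "target parts" whose virtual pose deviates from the actual pose
# beyond preset ranges. Each part is summarised here by its translational
# and rotational deviation; names and thresholds are illustrative.

MAX_TRANSLATION_MM = 2.0  # assumed preset distance threshold
MAX_ROTATION_DEG = 3.0    # assumed preset angle threshold

def find_target_parts(parts):
    """parts: list of (name, translation_error_mm, rotation_error_deg)."""
    return [name for name, d_mm, d_deg in parts
            if d_mm > MAX_TRANSLATION_MM or d_deg > MAX_ROTATION_DEG]

observed = [("shaft", 0.4, 0.5), ("wrist", 2.6, 1.0), ("jaw", 0.3, 4.2)]
targets = find_target_parts(observed)  # the parts to annotate on screen
```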
- the method may further include: determining position data of the target part in the surgical instrument; generating early warning information based on the position data; and providing an early warning to the operator in a preset manner based on the early warning information.
- the image processing device divides the surgical instrument into multiple parts and determines the target part.
- the position of the target part in the surgical instrument can be obtained according to the division rules.
- the position data can include the position of the target part in the surgical instrument, the name of the specific component where it is located, and other such data.
- early warning information can be generated based on the location data. An early warning can be issued to the operator in a preset manner based on the early warning information. For example, it can be displayed in a stereoscopic monitor in the form of voice broadcast or text.
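A minimal sketch of the early-warning step, assuming the position data is a part name plus its location on the instrument; the message format and the delivery modes are illustrative assumptions, not the patent's implementation:

```python
def make_warning(part_name, location):
    """Compose early-warning text from the target part's position data."""
    return f"Deviation exceeds preset range: '{part_name}' at {location}"

def issue_warning(message, mode="text"):
    """Deliver the warning in a preset manner: on-screen text or voice.
    (Voice broadcast is stubbed; a real console would call a TTS engine.)"""
    if mode == "text":
        return f"[STEREO MONITOR] {message}"
    return f"[VOICE] {message}"
```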
- the embodiments of this specification also provide a doctor console, as described in the following embodiments. Since the principle by which the doctor console solves the problem is similar to that of the method for displaying virtual surgical instruments on the doctor console, the implementation of the doctor console can refer to the implementation of that method, and repeated description is omitted.
- the term "unit" or "module" may refer to a combination of software and/or hardware that implements predetermined functions.
- the apparatus described in the following embodiments is preferably implemented in software, but implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
- Figure 7 is a structural block diagram of the doctor's console in the embodiment of this specification. As shown in Figure 7, it includes: an image acquisition device 701, an image processing device 702 and a stereoscopic monitor 703. This structure will be described below.
- the image acquisition device 701 is used to acquire images of the main control arm of the doctor's console.
- the image processing device 702 is configured to determine the posture data of the main control arm according to the image of the main control arm, and is also used to generate virtual posture data of the surgical instrument based on the posture data of the main control arm and a three-dimensional model of the surgical instrument.
- the stereoscopic monitor 703 is used to display a virtual instruction instrument image according to the virtual posture data.
- the image acquisition device is also used to collect the pedal image of the doctor console and the operator's foot image; the image processing device is also used to generate pedal position data based on the pedal image and/or foot position data based on the foot image; the stereoscopic monitor is also used to display the pedal area according to the pedal position data and/or the foot area according to the foot position data.
- the stereoscopic monitor is also used to highlight one or more pedals that are effectively stepped on in the pedal area.
- the image acquisition device includes a binocular camera, and the binocular camera is used to collect left-eye images and right-eye images of the main control arm.
- the image processing device is specifically configured to: obtain the spatial position data of the left-eye camera and the right-eye camera in the binocular camera; determine the spatial position data corresponding to each feature point based on the camera spatial position data and the position parameters of each of the multiple feature points of the main control arm in the left-eye image and the right-eye image; and determine the attitude data of the main control arm based on the spatial position data corresponding to each feature point at multiple times.
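For a rectified stereo pair, the spatial position of each feature point can be recovered from its pixel coordinates in the two views with the standard disparity model. This is a sketch of the general principle, assuming a known focal length and baseline, not the patent's specific algorithm:

```python
def triangulate(f_px, baseline_m, left_xy, right_xy):
    """Recover a feature point's camera-frame 3-D position from its pixel
    coordinates in a rectified stereo pair (standard disparity model).

    f_px       -- focal length in pixels
    baseline_m -- distance between the left and right camera centers (m)
    left_xy, right_xy -- pixel coordinates of the point in each image
    """
    (xl, yl), (xr, _) = left_xy, right_xy
    disparity = xl - xr               # horizontal pixel offset between views
    z = f_px * baseline_m / disparity # depth along the optical axis
    x = xl * z / f_px                 # back-project to metric coordinates
    y = yl * z / f_px
    return (x, y, z)
```

Tracking these per-point positions over multiple times yields the motion trajectories from which the main control arm's attitude is derived.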
- the image processing device is specifically configured to: generate control instruction information of the main control arm based on the posture data of the main control arm; and apply the control instruction information of the main control arm to the three-dimensional model of the surgical instrument to generate virtual posture data of the surgical instrument.
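The master-slave mapping can be sketched as applying the main control arm's motion increment to the instrument's 3-D model pose. The motion-scaling factor below is a hypothetical value: surgical teleoperation systems commonly scale hand motion down, but the patent does not specify a factor.

```python
MOTION_SCALE = 0.4  # hypothetical master-slave motion-scaling factor

def arm_pose_to_instrument_pose(arm_delta, instrument_pose):
    """Apply a main-control-arm motion increment (per-axis deltas) to the
    instrument model's pose under a master-slave scaling relationship."""
    return tuple(p + MOTION_SCALE * d
                 for p, d in zip(instrument_pose, arm_delta))
```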
- the image processing device is specifically configured to: use a preset control algorithm to generate virtual posture data of the surgical instrument based on the posture data of the main control arm and the three-dimensional model of the surgical instrument.
- the stereoscopic monitor may also display the actual instrument image of the surgical instrument collected by the endoscope; the image processing device may also be used to compare the actual instrument image with the virtual instruction instrument image and, according to the comparison results, determine the target part of the virtual instruction instrument image that differs from the actual instrument image beyond a preset range; the stereoscopic monitor is also used to display a label on the target part.
- an augmented reality doctor console is provided.
- the doctor's console may include a stereoscopic monitor, a main control arm and a foot panel, and an image acquisition device.
- the augmented display mainly realizes the combined display of the endoscopic visual image, the virtual instruction instrument image, the pedal position image, and the foot motion image in the stereoscopic monitor.
- This real-time fusion display can provide prompts to the operator and reduce misoperations.
- Figure 8 shows an enhanced display control logic block diagram.
- augmented reality can include: virtual instruction instrument image display, and virtual foot position and foot motion image display.
- FIG. 9 shows a flow chart of virtual command instrument image display.
- the depth information is used to obtain the main hand operating posture, and then fused with the main hand control instructions to obtain the virtual command instrument posture.
- the display of the virtual instruction instrument image can be turned on or off by a switch on the operating platform. When an excessive-deviation fault occurs, a red prompt is displayed on the part of the instrument where the deviation is excessive.
- Figure 10 shows a flow chart of the association between the main hand posture and the virtual instrument posture.
- binocular vision is initiated.
- the positions of multiple target feature points on the main control arm within the field of view are obtained through the left eye camera (camera 1).
- the positions of multiple target feature points on the main control arm within the field of view are obtained through the right eye camera (camera 2).
- the spatial position of each target feature point is calculated.
- the continuous motion trajectory of each feature point is associated with the posture of the 3D model of the virtual instrument to obtain the posture data of the virtual instrument.
- Figure 11 shows a schematic diagram of the camera coordinate system and the world coordinate system.
- XcYcZc is the camera coordinate system
- XwYwZw is the world coordinate system
- XY is the imaging plane coordinate system.
- Figure 12 shows a schematic diagram of the transformation of the world coordinate system and the camera coordinate system.
- the transformation from the world coordinate system to the camera coordinate system is a rigid body transformation, that is, the object will not deform and only needs to be rotated and translated.
- R represents the rotation matrix.
- S represents the translation vector.
- the transformation formula is as follows: (x, y, z)ᵀ = R · (x_w, y_w, z_w)ᵀ + S
- (x, y, z) are the coordinates in the camera coordinate system
- (x_w, y_w, z_w) are the coordinates in the world coordinate system
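In code, the rigid-body transform from world to camera coordinates is simply the rotation R followed by the translation S; this is a generic sketch of the standard transform:

```python
import numpy as np

def world_to_camera(R, S, p_world):
    """Rigid-body transform from world to camera coordinates:
    p_camera = R @ p_world + S (rotation, then translation)."""
    return np.asarray(R) @ np.asarray(p_world) + np.asarray(S)
```

Because the transform is rigid, distances between points are preserved; only the frame in which they are expressed changes.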
- Figure 13 shows a schematic diagram of virtual foot position and foot action image display.
- the depth information is used to obtain foot position and foot movement information.
- the virtual pedal position is displayed on the stereoscopic monitor to guide the doctor when switching pedals.
- the virtual foot movements are displayed on the stereo monitor.
- for collection and display, the calculation takes the center of the lower-left pedal as the origin of the X, Y, Z coordinate system and obtains the real-time displacement of the foot in the X, Y, and Z directions through the depth camera.
- the displacement is displayed in a specified area in the stereoscopic monitor according to a certain proportion.
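The display mapping can be sketched as scaling the measured displacement into a designated region of the monitor; the scale factor and region origin below are hypothetical values chosen for illustration:

```python
def map_to_display(displacement_mm, scale=0.5, region_origin=(1600, 900)):
    """Map the depth camera's XY foot displacement (mm, relative to the
    lower-left pedal center) into a designated monitor region, scaled by
    a fixed proportion."""
    dx, dy = displacement_mm
    ox, oy = region_origin
    # Screen Y grows downward, so an upward foot motion moves the marker up.
    return (ox + dx * scale, oy - dy * scale)
```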
- Figure 14 shows a schematic diagram of a prompt for excessive position following deviation of a surgical instrument.
- a fault will be reported, and there will be corresponding fault location indication and text prompts in the stereo monitor.
- parts with excessive deviations can be displayed in red (marked in dark gray in the figure). This enhanced display can intuitively and effectively point out the fault point and provide prompts such as fault solutions.
- FIG 15 shows a schematic diagram of the foot panel position display.
- the foot pedal position and foot movements are displayed in real time.
- the purpose is to give the operator visual feedback during the switching operation and reduce the probability of operation errors.
- the foot movements are displayed in real time.
- the real-time display image of the foot gives the doctor feedback on whether the foot is moving in the correct direction toward the target pedal.
- color highlighting indicates whether a pedal function has been effectively depressed.
- the above two points can assist the doctor in the smooth operation of switching pedals.
- Figure 16 shows a schematic diagram of the foot motion display and the effective pedaling display. As shown in Figure 16, the pedal under the foot is effectively stepped on.
- Figure 17 shows a schematic diagram of the operation flow of the endoscopic surgical robot.
- the main process includes the following. Operation: at the doctor console, the main hand and foot pedal operations are coordinated to give control instructions to the instruments on the surgical platform. Monitoring: at the doctor console, information on the main hand motion and foot pedal motion is obtained through the image acquisition device to derive the virtual instruction instrument posture; this posture and the actual control instructions are mutually checked, and an alarm is raised when the deviation is too large. Display: the endoscope image is shown on the stereoscopic monitor, and the virtual instruction instrument, pedal position, and foot motion are also displayed on the monitor in real time as feedback information for the doctor's operation.
- Figure 18 shows a schematic flow chart of virtual instruction device acquisition and display.
- the main process includes: displaying the image-acquisition-based virtual instruction instrument on the stereoscopic monitor; when an excessive-deviation fault occurs, the fault location is shown with color highlighting and flashing prompts.
- Figure 19 shows a schematic flow chart of virtual pedal position and foot movement display.
- the main process includes: based on the image acquisition device, obtaining the pedal position image, displaying it in the stereoscopic monitor, and highlighting the effectively depressed pedal for confirmation; and, based on the image acquisition device, obtaining the real-time motion image of the foot and displaying it in the stereoscopic monitor to give visual feedback on movement toward the target position.
- an image acquisition module is installed on the doctor's console to monitor the main hand and foot pedal parts of the doctor's console.
- the action posture of the main hand and the action posture of the feet are recognized.
- VR display technology is used to virtualize and display the posture of the command device in the doctor's stereoscopic monitor.
- VR technology is used to virtually display the foot position and foot movements on the stereoscopic monitor to assist the doctor's operation.
- VR technology is used to display operation prompts on the doctor's stereoscopic monitor in real time to facilitate the doctor's operation, for example, prompting which pedal the foot should be placed on.
- the pedal position and foot movement are displayed in real time.
- the doctor can clearly see the foot motion process and related prompts, such as which pedal position a foot needs to move to next. This avoids misoperation and improves surgical efficiency, and the prompts during the operation assist the doctor in operating.
- the embodiments of this specification achieve the following technical effects: the posture data of the main control arm is first determined from the image of the main control arm; since there is a master-slave relationship between the main control arm and the surgical instrument, virtual posture data of the surgical instrument can be generated based on the posture data of the main control arm and the three-dimensional model of the surgical instrument, and a virtual instruction instrument image is then displayed on the stereoscopic monitor based on the virtual posture data of the surgical instrument.
- because a virtual instruction instrument image is displayed on the stereoscopic monitor, doctors can detect misoperations and instrument failures in a timely manner, which assists doctors in operating, improves surgical efficiency, quality, and success rate, and improves the surgical experience of doctors and patients.
- the embodiment of this specification also provides a medical device.
- for details of the medical device, please refer to the schematic structural diagram, shown in Figure 20, of the medical device based on the method for displaying virtual surgical instruments on a doctor console provided by the embodiments of this specification.
- specifically, the medical device may include an input device 21, a processor 22, and a memory 23.
- the memory 23 is used to store instructions executable by the processor.
- when the processor 22 executes the instructions, the steps of the method for displaying virtual surgical instruments on the doctor console described in any of the above embodiments are implemented.
- An embodiment of this specification also provides an endoscopic medical robot, including the doctor console described in any of the above embodiments.
- the input device may be one of the main devices for exchanging information between the user and the computer system.
- the input device may include a keyboard, mouse, camera, scanner, light pen, handwriting input pad, voice input device, etc.; the input device is used to input raw data, and the programs that process those data, into the computer.
- the input device can also obtain and receive data transmitted from other modules, units, and devices.
- the processor may be implemented in any suitable manner.
- a processor may take the form of, for example, a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, a logic gate, a switch, an application-specific integrated circuit (ASIC), a programmable logic controller, an embedded microcontroller, and so on.
- the memory may specifically be a memory device used to store information in modern information technology. The memory can include multiple levels.
- anything that can store binary data can be a memory; in an integrated circuit, a circuit with a storage function that has no physical form of its own is also called a memory, such as RAM or a FIFO; in a system, storage devices in physical form are also called memories, such as memory sticks and TF cards.
- the embodiment of this specification also provides a computer storage medium based on a method for displaying virtual surgical instruments on a doctor's console.
- the computer storage medium stores computer program instructions which, when executed, implement the steps of the method for displaying virtual surgical instruments on a doctor console described in any of the above embodiments.
- the above-mentioned storage media include, but are not limited to, random access memory (RAM), read-only memory (ROM), cache, hard disk drive (HDD), and memory card.
- the memory may be used to store computer program instructions.
- the network communication unit may be an interface configured in accordance with the standards specified by the communication protocol and used for network connection communication.
- each module or step of the above embodiments of this specification can be implemented by a general-purpose computing device; they can be concentrated on a single computing device or distributed over a network formed by multiple computing devices; optionally, they may be implemented in program code executable by a computing device, so that they may be stored in a storage device for execution by the computing device; in some cases, the steps shown or described may be performed in an order different from that described herein, or they may be separately made into individual integrated circuit modules, or multiple modules or steps among them may be made into a single integrated circuit module. As such, embodiments of this specification are not limited to any specific combination of hardware and software.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Robotics (AREA)
- Biomedical Technology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Endoscopes (AREA)
Abstract
The present invention relates to the technical field of endoscopic surgery. Disclosed in particular are a method for displaying a virtual surgical instrument on a surgeon console, and the surgeon console. The method comprises: acquiring an image of a main control arm of the surgeon console; determining posture data of the main control arm according to the image of the main control arm; and generating, on the basis of the posture data of the main control arm and a three-dimensional model of a surgical instrument, virtual posture data of the surgical instrument so as to display a virtual instruction instrument image on a stereoscopic monitor of the surgeon console. With the above system, the virtual instruction instrument image is displayed on the stereoscopic monitor, so that a surgeon can readily detect a misoperation in time, spot an instrument failure in time, and be assisted during the operation, thereby improving surgical efficiency, surgical quality and success rate, and the surgical experience of surgeons and patients.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202210652288.5A CN115068114A (zh) | 2022-06-10 | 2022-06-10 | 用于在医生控制台显示虚拟手术器械的方法及医生控制台 |
| CN202210652288.5 | 2022-06-10 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023237105A1 true WO2023237105A1 (fr) | 2023-12-14 |
Family
ID=83250819
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2023/099487 Ceased WO2023237105A1 (fr) | 2022-06-10 | 2023-06-09 | Procédé d'affichage d'un instrument chirurgical virtuel sur une console de chirurgien, et console de chirurgien |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN115068114A (fr) |
| WO (1) | WO2023237105A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN119214802A (zh) * | 2024-09-29 | 2024-12-31 | 哈尔滨思哲睿智能医疗设备股份有限公司 | 手术机器人医生控制台视野调节方法、装置、设备、介质和产品 |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115068114A (zh) * | 2022-06-10 | 2022-09-20 | 上海微创医疗机器人(集团)股份有限公司 | 用于在医生控制台显示虚拟手术器械的方法及医生控制台 |
| CN116076984A (zh) * | 2023-03-03 | 2023-05-09 | 上海微创医疗机器人(集团)股份有限公司 | 内窥镜视野调整方法、控制系统及可读存储介质 |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080004603A1 (en) * | 2006-06-29 | 2008-01-03 | Intuitive Surgical Inc. | Tool position and identification indicator displayed in a boundary area of a computer display screen |
| CN102341046A (zh) * | 2009-03-24 | 2012-02-01 | 伊顿株式会社 | 利用增强现实技术的手术机器人系统及其控制方法 |
| KR101825929B1 (ko) * | 2017-11-24 | 2018-02-06 | 재단법인 구미전자정보기술원 | 수술로봇 조종을 위한 무구속의 3차원 손동작 모션 인식 시스템, 방법, 컴퓨터 프로그램 및 컴퓨터 판독 가능한 기록 매체 |
| US20180280099A1 (en) * | 2017-03-31 | 2018-10-04 | Verb Surgical Inc. | Multi-functional foot pedal assembly for controlling a robotic surgical system |
| CN109806002A (zh) * | 2019-01-14 | 2019-05-28 | 微创(上海)医疗机器人有限公司 | 一种用于手术机器人的成像系统及手术机器人 |
| CN111991084A (zh) * | 2020-10-08 | 2020-11-27 | 深圳市精锋医疗科技有限公司 | 手术机器人及其虚拟成像控制方法、虚拟成像控制装置 |
| WO2022019318A2 (fr) * | 2020-07-20 | 2022-01-27 | Sony Group Corporation | Système de commande de bras médical, procédé de commande de bras médical, simulateur de bras médical, modèle d'apprentissage de bras médical et programmes associés |
| CN115068114A (zh) * | 2022-06-10 | 2022-09-20 | 上海微创医疗机器人(集团)股份有限公司 | 用于在医生控制台显示虚拟手术器械的方法及医生控制台 |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3415071A4 (fr) * | 2016-02-10 | 2019-11-20 | Olympus Corporation | Système d'endoscope |
| EP3445048B1 (fr) * | 2017-08-15 | 2025-09-17 | Holo Surgical Inc. | Interface utilisateur graphique pour un système de navigation chirurgical pour fournir une image de réalité augmentée pendant le fonctionnement |
| JP6898285B2 (ja) * | 2018-09-25 | 2021-07-07 | 株式会社メディカロイド | 手術システムおよび表示方法 |
| CN113660913B (zh) * | 2019-02-05 | 2025-07-15 | 史密夫和内修有限公司 | 用于改进机器人手术系统的方法及其装置 |
| JP2022045236A (ja) * | 2020-09-08 | 2022-03-18 | ソニーグループ株式会社 | 医療用撮像装置、学習モデル生成方法および学習モデル生成プログラム |
| CN113633387B (zh) * | 2021-06-21 | 2024-01-26 | 安徽理工大学 | 术野追踪的扶持腹腔镜微创机器人触力交互方法和系统 |
| CN114176785B (zh) * | 2022-01-13 | 2024-05-24 | 北京歌锐科技有限公司 | 一种控制机器人的系统及方法 |
- 2022
  - 2022-06-10 CN CN202210652288.5A patent/CN115068114A/zh active Pending
- 2023
  - 2023-06-09 WO PCT/CN2023/099487 patent/WO2023237105A1/fr not_active Ceased
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080004603A1 (en) * | 2006-06-29 | 2008-01-03 | Intuitive Surgical Inc. | Tool position and identification indicator displayed in a boundary area of a computer display screen |
| CN102341046A (zh) * | 2009-03-24 | 2012-02-01 | 伊顿株式会社 | 利用增强现实技术的手术机器人系统及其控制方法 |
| US20180280099A1 (en) * | 2017-03-31 | 2018-10-04 | Verb Surgical Inc. | Multi-functional foot pedal assembly for controlling a robotic surgical system |
| KR101825929B1 (ko) * | 2017-11-24 | 2018-02-06 | 재단법인 구미전자정보기술원 | 수술로봇 조종을 위한 무구속의 3차원 손동작 모션 인식 시스템, 방법, 컴퓨터 프로그램 및 컴퓨터 판독 가능한 기록 매체 |
| CN109806002A (zh) * | 2019-01-14 | 2019-05-28 | 微创(上海)医疗机器人有限公司 | 一种用于手术机器人的成像系统及手术机器人 |
| WO2022019318A2 (fr) * | 2020-07-20 | 2022-01-27 | Sony Group Corporation | Système de commande de bras médical, procédé de commande de bras médical, simulateur de bras médical, modèle d'apprentissage de bras médical et programmes associés |
| CN111991084A (zh) * | 2020-10-08 | 2020-11-27 | 深圳市精锋医疗科技有限公司 | 手术机器人及其虚拟成像控制方法、虚拟成像控制装置 |
| CN115068114A (zh) * | 2022-06-10 | 2022-09-20 | 上海微创医疗机器人(集团)股份有限公司 | 用于在医生控制台显示虚拟手术器械的方法及医生控制台 |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN119214802A (zh) * | 2024-09-29 | 2024-12-31 | 哈尔滨思哲睿智能医疗设备股份有限公司 | 手术机器人医生控制台视野调节方法、装置、设备、介质和产品 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN115068114A (zh) | 2022-09-20 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11977678B2 (en) | Robotic system providing user selectable actions associated with gaze tracking | |
| US11941734B2 (en) | Rendering tool information as graphic overlays on displayed images of tools | |
| CN112618026B (zh) | 远程手术数据融合交互显示系统及方法 | |
| WO2023237105A1 (fr) | Procédé d'affichage d'un instrument chirurgical virtuel sur une console de chirurgien, et console de chirurgien | |
| US10715720B2 (en) | Intelligent manual adjustment of an image control element | |
| JP7756079B2 (ja) | 外科手術仮想現実ユーザインターフェース | |
| US12277267B2 (en) | Two-way communication between head-mounted display and electroanatomic system | |
| EP2548495B1 (fr) | Système et programme pour le support d'une observation endoscopique | |
| EP3977406B1 (fr) | Systèmes et procédés d'imagerie médicale composite | |
| CN102821671B (zh) | 内窥镜观察支持系统和设备 | |
| JP2019523663A (ja) | ロボット外科手術装置および視聴者適合型の立体視ディスプレイの態様を制御するためのシステム、方法、およびコンピュータ可読記憶媒体 | |
| US20210220078A1 (en) | Systems and methods for measuring a distance using a stereoscopic endoscope | |
| WO2008006180A1 (fr) | Système de vision endoscopique | |
| US12266040B2 (en) | Rendering tool information as graphic overlays on displayed images of tools | |
| JP7345866B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
| KR20110049703A (ko) | 수술 로봇 시스템 및 그 복강경 조작 방법 | |
| US10854005B2 (en) | Visualization of ultrasound images in physical space | |
| KR101114234B1 (ko) | 수술 로봇 시스템 및 그 복강경 조작 방법 | |
| CN112397189A (zh) | 一种医用导引装置及其使用方法 | |
| EP3690609A1 (fr) | Procédé et système de commande de machines dentaires | |
| CN119214801A (zh) | 一种双屏立体监视器系统、信息展示方法和腔镜手术机器人 | |
| JP2025174992A (ja) | ロボット操作手術器具の位置を追跡システムおよび方法 | |
| WO2025019594A1 (fr) | Systèmes et procédés de mise en œuvre d'une fonction de zoom associée à un dispositif d'imagerie dans un espace d'imagerie | |
| WO2025230949A1 (fr) | Systèmes et procédés pour adapter le placement en profondeur d'éléments d'interface utilisateur graphique avec imagerie chirurgicale | |
| JP2025537073A (ja) | X線取得のために患者の少なくとも1つの身体部分の配置を支援するための装置、コンピュータプログラム及び方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23819268 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 23819268 Country of ref document: EP Kind code of ref document: A1 |