WO2025078950A1 - Surgical robotic system and method for integrated control of 3D model data - Google Patents
Surgical robotic system and method for integrated control of 3D model data
- Publication number
- WO2025078950A1 (PCT/IB2024/059844)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- model
- controller
- surgical
- instrument
- video feed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/37—Leader-follower robots
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2068—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
Definitions
- a surgical robotic system includes a robotic arm having an instrument and a surgeon console including a display screen and a handle controller for receiving input to move the instrument via the robotic arm.
- the system further includes a laparoscopic camera for imaging a surgical site and the instrument and generating a video feed.
- the 3D model may be based on a plurality of preoperative images.
- the controller may receive calibration data for the laparoscopic camera.
- the controller may also render the 3D model based on the calibration data to match the video feed.
- the controller may also receive kinematic data pertaining to the laparoscopic camera and the robotic arm.
- the controller may further track the 3D model on the video feed by tracking position and orientation of the laparoscopic camera based on the kinematic data.
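The rendering and tracking steps above amount to a standard pinhole-projection pipeline. A minimal sketch in Python/NumPy, assuming the calibration data yields an intrinsic matrix `K` and the kinematic data yields a world-to-camera transform (the function and variable names here are illustrative, not from the disclosure):

```python
import numpy as np

def project_model(vertices, K, T_cam_world):
    """Project (N, 3) world-frame model vertices to pixel coordinates.

    vertices    : model points in the robot base/world frame
    K           : (3, 3) camera intrinsic matrix from calibration data
    T_cam_world : (4, 4) world-to-camera transform from kinematic data
    """
    pts_h = np.hstack([vertices, np.ones((len(vertices), 1))])  # homogeneous
    pts_cam = (T_cam_world @ pts_h.T).T[:, :3]                  # camera frame
    uv = (K @ pts_cam.T).T                                      # pinhole projection
    return uv[:, :2] / uv[:, 2:3]
```

Because the overlay is projected with the same intrinsics that produced the video feed, the rendered model stays aligned with the live image as the camera moves.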
- the controller may additionally generate a dense depth map from the video feed and a point cloud based on the dense depth map.
- the controller may also track the instrument contact with one or more anatomical landmarks using the point cloud.
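A hedged sketch of those two steps, back-projecting the dense depth map into a camera-frame point cloud and testing instrument contact by proximity (the tolerance and all names are assumptions for illustration):

```python
import numpy as np

def depth_to_point_cloud(depth, K):
    """Convert an HxW depth map (meters) into an (H*W, 3) camera-frame cloud."""
    h, w = depth.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

def in_contact(tip_xyz, landmark_xyz, tol_mm=2.0):
    """Declare contact when the tracked tip is within tol_mm of a landmark."""
    return np.linalg.norm(tip_xyz - landmark_xyz) * 1000.0 <= tol_mm
```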
- the system further includes a controller for: receiving a 3D model of a portion of the surgical site, the 3D model including a plurality of model landmarks; registering the 3D model to the surgical site based on association of the plurality of model landmarks with a plurality of anatomical landmarks of the surgical site; displaying the registered 3D model as an augmented reality overlay in the video feed on the display screen; receiving manipulation input through the handle controller to manipulate the 3D model relative to the video feed; modifying the 3D model based on the manipulation input; and displaying the modified 3D model as the augmented reality overlay in the video feed on the display screen.
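Registering model landmarks to touched anatomical landmarks is, at its core, a paired-point rigid alignment. A minimal Kabsch/Umeyama-style sketch, assuming each model landmark has already been matched to its anatomical counterpart (this illustrates the general technique, not the patented method):

```python
import numpy as np

def register_landmarks(model_pts, anat_pts):
    """Return rotation R (3x3) and translation t (3,) mapping model onto anatomy."""
    mu_m, mu_a = model_pts.mean(axis=0), anat_pts.mean(axis=0)
    H = (model_pts - mu_m).T @ (anat_pts - mu_a)     # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_a - R @ mu_m
    return R, t
```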
- FIG. 3 is a perspective view of a mobile cart having a setup arm with the surgical robotic arm of the surgical robotic system of FIG. 1 according to an embodiment of the present disclosure
- FIG. 7 is a perspective view of a handle controller according to one embodiment of the present disclosure.
- FIG. 8 is a flow chart of a method for integrated control of 3D model data according to one embodiment of the present disclosure
- the system also includes an electrosurgical generator 57 configured to output electrosurgical (e.g., monopolar or bipolar) or ultrasonic energy in a variety of operating modes, such as coagulation, cutting, sealing, etc.
- Suitable generators include a Valleylab™ FT10 Energy Platform available from Medtronic of Minneapolis, MN.
- the surgeon console 30 includes a first, i.e., surgeon, screen 32, which displays a video feed of the surgical site provided by camera 51 of the surgical instrument 50 disposed on the robotic arm 40, and a second screen 34, which displays a user interface for controlling the surgical robotic system 10.
- the first screen 32 and second screen 34 may be touchscreens allowing for displaying various graphical user inputs.
- the first screen 32 may be a 3D screen.
- the surgeon console 30 also includes a plurality of user interface devices, such as foot pedals 36 and a pair of hand controllers 38a and 38b which are used by a user to remotely control robotic arms 40.
- the surgeon console further includes an armrest 33 used to support the clinician's arms while operating the hand controllers 38a and 38b.
- Wireless communication may be achieved via one or more wireless configurations, e.g., radio frequency, optical, Wi-Fi, Bluetooth (an open wireless protocol for exchanging data over short distances, using short-wavelength radio waves, from fixed and mobile devices, creating personal area networks (PANs)), and ZigBee® (a specification for a suite of high-level communication protocols using small, low-power digital radios based on the IEEE 802.15.4-2003 standard for wireless personal area networks (WPANs)).
- the computers 21, 31, 41 may include any suitable processor (not shown) operably connected to a memory (not shown), which may include one or more of volatile, non-volatile, magnetic, optical, or electrical media, such as read-only memory (ROM), random access memory (RAM), electrically erasable programmable ROM (EEPROM), non-volatile RAM (NVRAM), or flash memory.
- the processor may be any suitable processor (e.g., control circuit) adapted to perform the operations, calculations, and/or set of instructions described in the present disclosure including, but not limited to, a hardware processor, a field programmable gate array (FPGA), a digital signal processor (DSP), a central processing unit (CPU), a microprocessor, and combinations thereof.
- the setup arm 61 includes a first link 62a, a second link 62b, and a third link 62c, which provide for lateral maneuverability of the robotic arm 40.
- the links 62a, 62b, 62c are interconnected at joints 63a and 63b, each of which may include an actuator (not shown) for rotating the links 62a and 62b relative to each other and the link 62c.
- the links 62a, 62b, 62c are movable in their corresponding lateral planes that are parallel to each other, thereby allowing for extension of the robotic arm 40 relative to the patient (e.g., surgical table).
- the robotic arm 40 may be coupled to the surgical table (not shown).
- the setup arm 61 includes controls 65 for adjusting movement of the links 62a, 62b, 62c as well as the lift 67.
- the setup arm 61 may include any type and/or number of joints.
- the instrument 50 may be inserted through a laparoscopic access port 55 (FIG. 3) held by the holder 46.
- the holder 46 also includes a port latch 46c for securing the access port 55 to the holder 46 (FIG. 2).
- the robotic arm 40 also includes a plurality of manual override buttons 53 (FIG. 1) disposed on the IDU 52 and the setup arm 61, which may be used in a manual mode. The user may press one or more of the buttons 53 to move the component associated with the button 53.
- each of the computers 21, 31, 41 of the surgical robotic system 10 may include a plurality of controllers, which may be embodied in hardware and/or software.
- the computer 21 of the control tower 20 includes a controller 21a and safety observer 21b.
- the controller 21a receives data from the computer 31 of the surgeon console 30 about the current position and/or orientation of the hand controllers 38a and 38b and the state of the foot pedals 36 and other buttons.
- the controller 21a processes these input positions to determine desired drive commands for each joint of the robotic arm 40 and/or the IDU 52 and communicates these to the computer 41 of the robotic arm 40.
- the controller 21a also receives the actual joint angles measured by encoders of the actuators 48a and 48b and uses this information to determine force feedback commands that are transmitted back to the computer 31 of the surgeon console 30 to provide haptic feedback through the hand controllers 38a and 38b.
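One simple way such force feedback commands could be derived, shown purely as a hedged illustration (the spring gain and saturation limit are assumed values, not system parameters):

```python
import numpy as np

def force_feedback(q_desired, q_actual, k_gain=0.5, f_max=2.0):
    """Spring-like feedback per joint from the commanded/measured angle error,
    saturated so the haptic cue stays within a safe range."""
    f = k_gain * (np.asarray(q_desired) - np.asarray(q_actual))
    return np.clip(f, -f_max, f_max)
```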
- the safety observer 21b performs validity checks on the data going into and out of the controller 21a and notifies a system fault handler if errors in the data transmission are detected to place the computer 21 and/or the surgical robotic system 10 into a safe state.
- the robotic arm controller 41c controls each joint 44a and 44b of the robotic arm 40 and calculates desired motor torques required for gravity compensation, friction compensation, and closed loop position control of the robotic arm 40.
- the robotic arm controller 41c calculates a movement command based on the calculated torque.
- the calculated motor commands are then communicated to one or more of the actuators 48a and 48b in the robotic arm 40.
- the actual joint positions are then transmitted by the actuators 48a and 48b back to the robotic arm controller 41c.
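A compact sketch of the torque computation described above, combining gravity compensation, a simple Coulomb friction term, and PD position control; `gravity_fn` and the gains stand in for the arm's real dynamic model and are assumptions here:

```python
import numpy as np

def joint_torque(q, qd, q_ref, Kp, Kd, gravity_fn, friction_coeff):
    tau_gravity = gravity_fn(q)                  # g(q): gravity compensation
    tau_friction = friction_coeff * np.sign(qd)  # Coulomb friction compensation
    tau_pd = Kp * (q_ref - q) - Kd * qd          # closed-loop position control
    return tau_gravity + tau_friction + tau_pd
```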
- the IDU controller 41d receives desired joint angles for the surgical instrument 50, such as wrist and jaw angles, and computes desired currents for the motors in the IDU 52.
- the IDU controller 41d calculates actual angles based on the motor positions and transmits the actual angles back to the main cart controller 41a.
- System 10 includes a data reception system 305 that collects surgical data, including the video data and surgical instrumentation data.
- the data reception system 305 can include one or more devices (e.g., one or more user devices and/or servers) located within and/or associated with a surgical operating room and/or control center.
- the data reception system 305 can receive surgical data in real-time, i.e., as the surgical procedure is being performed.
- the procedural tracking data structure 355 may be a graph that includes a set of nodes and a set of edges, with each node corresponding to a potential phase.
- the edges may provide directional connections between nodes that indicate (via the direction) an expected order during which the phases will be encountered throughout an iteration of the surgical procedure.
- the procedural tracking data structure 355 may include one or more branching nodes that feed to multiple next nodes and/or may include one or more points of divergence and/or convergence between the nodes.
- a phase indicates a procedural action (e.g., surgical action) that is being performed or has been performed and/or indicates a combination of actions that have been performed.
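The procedural tracking structure can be pictured as a small directed graph. A minimal sketch with hypothetical phase names (the real node set would come from the procedure definition):

```python
# Nodes are candidate phases; edge lists encode the expected order,
# including a branching node and a point of convergence.
phase_graph = {
    "port_placement":   ["dissection"],
    "dissection":       ["vessel_sealing", "stapling"],  # branching node
    "vessel_sealing":   ["specimen_removal"],
    "stapling":         ["specimen_removal"],            # convergence
    "specimen_removal": [],
}

def valid_transition(current_phase, next_phase):
    """True when next_phase is an expected successor of current_phase."""
    return next_phase in phase_graph.get(current_phase, [])
```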
- a 3D model 500 of the tissue is constructed from preoperative images, which may be done by obtaining a plurality of 2D images and reconstructing a 3D volumetric image therefrom.
- preoperative images may be provided to any computing device (e.g., outside the operating room) to perform the image processing steps described herein.
- the 3D model 500 may be a wire mesh model based on the preoperative image as shown in FIG. 9.
- the 3D model 500 may include a plurality of points or vertices interconnected by line segments based on the segmentations and include a surface texture over the vertices and segments.
- the 3D model 500 may also be saved in the PACS.
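A hedged sketch of how such a wire-mesh model might be extracted from a segmented preoperative volume, using marching cubes via scikit-image (an assumed dependency; the disclosure does not name a specific library):

```python
from skimage import measure

def volume_to_mesh(volume, iso_level=0.5, voxel_spacing=(1.0, 1.0, 1.0)):
    """Return surface vertices and triangular faces from a 3D volume.

    volume        : 3D array, e.g., a binary segmentation from CT/MRI slices
    iso_level     : surface threshold within the volume
    voxel_spacing : physical voxel size, so vertices come out in real units
    """
    verts, faces, _normals, _values = measure.marching_cubes(
        volume, level=iso_level, spacing=voxel_spacing)
    return verts, faces
```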
- the 3D model 500 is analyzed to identify a plurality of landmarks, which include any unique tissue structure, e.g., blood vessel.
- a computer vision algorithm may be used to identify the plurality of landmarks.
- the algorithm may be a machine learning/artificial intelligence (ML/AI) algorithm trained on a dataset including various landmarks and corresponding images and 3D models of surgical sites.
- the algorithm may support generalization of landmarks to include "weak" constraints (i.e., less 3D-like features) such as ridges and contours, and "strong" constraints (i.e., more 3D-like features) such as distinct structures and geometries.
- the identified landmarks may be further modified by a user to keep well-detected landmarks, adjust poorly detected landmarks, add missed landmarks, etc.
- the landmarks are then tagged or annotated on the 3D model.
- the annotated 3D model is then saved in the PACS using the Digital Imaging and Communications in Medicine (DICOM) standard.
- Saved configurations may include the location on the screen 32 where the 3D model is displayed and the transparency level for each anatomy, or a global transparency level for all anatomies in the model.
- the 3D model may also highlight different sized vessels in different shades/saturation values to highlight recommended instrument selection, e.g., 7+ mm vessels - stapler, 6-7 mm vessels - vessel sealing, 5 mm vessels - ultrasonic dissection.
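A literal-minded sketch of that vessel-size mapping; the cutoffs mirror the text above while the shade values are illustrative assumptions:

```python
def recommend_instrument(vessel_diameter_mm):
    """Map vessel diameter to a recommended instrument and a display shade."""
    if vessel_diameter_mm >= 7.0:
        return "stapler", 1.0             # darkest shade/saturation
    if vessel_diameter_mm >= 6.0:
        return "vessel sealing", 0.7
    if vessel_diameter_mm >= 5.0:
        return "ultrasonic dissection", 0.4
    return None, 0.0                      # no specific recommendation
```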
- additional annotation of the 3D model may be performed at this step by approving, adjusting, or removing anatomical landmarks.
- the identification of the landmarks may be performed at this step as well allowing the user to load the 3D model and annotate the same.
- the system 10 loads preoperative images (e.g., CT/MRI DICOM from the PACS) and displays the 3D model in a picture-in-picture tiled display or on a secondary screen 34 of the console 30.
- the display of the 3D model in this manner allows for easier configuration without cluttering the main screen 32.
- the user may adjust contrast, brightness, windowing, and other display parameters using the hand controllers 38a and 38b.
- the hand controllers 38a and 38b may be used to move through slices of the 3D model in axial, coronal, and sagittal planes.
- manual registration may also include a semi-automatic alignment interface that allows users to successively refine an initial rigid registration in a coarse-to-fine fashion via a "branching strategy" of selecting one of several renderings of plausible automatic registrations at each stage of refinement.
- This deformation refinement could be guided by refinement along principal modes of variation similar to a supervised simulated annealing in a “shape space” representation.
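In a shape-space representation, each candidate deformation is the mean shape plus a weighted sum of principal modes, so the refinement search runs over a handful of mode weights rather than every vertex. A minimal sketch (all names are illustrative):

```python
import numpy as np

def deform_shape(mean_shape, modes, weights):
    """mean_shape: (N, 3) vertices; modes: (K, N, 3) principal modes of
    variation; weights: (K,) coefficients chosen during refinement."""
    return mean_shape + np.tensordot(weights, modes, axes=1)
```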
- the controller 21a registers the contact based on torque or other input provided by the system 10, or on computer vision analysis of the video feed by the image processing device 56, to determine when the instrument 50 has contacted tissue.
- a real-time dense depth map 504 is created as shown in FIG. 11.
- the depth map may be generated using stereo reconstruction from calibrated stereo cameras with conventional depth map generation algorithms, such as the pyramid stereo matching with double cost volume network (PSMDCNet) algorithm, the hierarchical iterative tile refinement network (HitNet), and the like.
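As one concrete example of the conventional stereo route, a dense depth map can be computed with OpenCV's semi-global block matcher; the matcher parameters, focal length, and baseline below are assumed values:

```python
import cv2
import numpy as np

matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)

def dense_depth(left_gray, right_gray, f_px, baseline_m):
    """Depth (meters) from a rectified, calibrated stereo pair."""
    disp = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disp[disp <= 0] = np.nan            # mask invalid matches
    return f_px * baseline_m / disp     # depth = f * B / disparity
```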
- the image processing device 56 also tracks the instrument 50 in 3D using point cloud data.
- the image processing device 56 stores 3D models of the instrument 50 and loads the 3D model based on the instrument ID.
- the image processing device 56 then loads the instrument 3D model, which may be a mesh with embedded keypoints and a pre-trained mapping of keypoints to mesh vertices.
- the image processing device 56 detects instrument keypoints in both camera images of the stereo pair, and the instrument mesh is automatically registered with the detected keypoints and the point cloud.
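With keypoints pre-mapped to mesh vertices, the registration step reduces to a perspective-n-point solve for the instrument pose in the camera frame. A hedged sketch (keypoint detection itself is out of scope here):

```python
import cv2
import numpy as np

def instrument_pose(mesh_keypoints_3d, detected_keypoints_2d, K):
    """mesh_keypoints_3d: (N, 3) model points; detected_keypoints_2d: (N, 2)."""
    ok, rvec, tvec = cv2.solvePnP(
        mesh_keypoints_3d.astype(np.float64),
        detected_keypoints_2d.astype(np.float64),
        K, None)                        # None: assume undistorted images
    return (rvec, tvec) if ok else None
```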
- the surgeon may confirm the landmark by pressing buttons 375b on the hand controller 38a or via another input method, e.g., via GUI.
- the controller 21a notes the 3D point cloud location of the touched intra-operative landmark location and associates the pre-operative 3D model landmark with the intra-operative landmark.
- the system 10 enables the hand controllers 38a and 38b to manipulate the AR overlay of the 3D model as shown in FIGS. 9 and 10.
- the user can further move the vertices of the individual meshes to impose strong anatomy deformation constraints, and the controller 21a then updates the deformable registration based on the surgeon's inputs.
- the user may also adjust the display properties of the 3D model to visualize the anatomy under the model and internal meshes obscured by external meshes (e.g., critical structures inside the organ capsule as shown in FIG. 9). To accomplish this, the user may change the transparency of each 3D model or mesh, which may be adjusted by changing the dithering of the 3D model overlay.
- Force feedback may also be provided through the handle controllers 38a and 38b to guide the user in the optimal direction of registration.
- the controller 21a continuously computes registration of the 3D model with the intra-operative imaging and, as the user moves the 3D model for alignment, applies force feedback through the handle controllers 38a and 38b to guide the user, i.e., less feedback to make movement easier when the 3D model is moved in the right direction and more feedback to make movement harder when it is moved in the wrong direction.
- the controller 21a may also provide visual AR overlay arrows to show the direction of optimal alignment.
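A hedged sketch of that direction-dependent guidance: resistance drops when the user's motion agrees with the computed direction of optimal alignment and rises when it opposes it (the scaling is an illustrative assumption):

```python
import numpy as np

def alignment_feedback(user_motion, optimal_direction, base_resistance=1.0):
    """Return a resistance scale from the agreement between motion and goal."""
    u = user_motion / (np.linalg.norm(user_motion) + 1e-9)
    g = optimal_direction / (np.linalg.norm(optimal_direction) + 1e-9)
    agreement = float(np.dot(u, g))                   # +1 aligned, -1 opposed
    return base_resistance * (1.0 - 0.5 * agreement)  # easier when aligned
```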
- Example 7 The surgical robotic system according to Example 1, wherein the controller generates a dense depth map from the video feed and a point cloud based on the dense depth map.
- Example 8 The surgical robotic system according to Example 7, wherein the controller tracks the instrument contact with the at least one anatomical landmark of the plurality of anatomical landmarks using the point cloud.
- Example 11 The method according to Example 9, further comprising receiving calibration data for the laparoscopic camera.
- Example 13 The method according to Example 9, further comprising receiving kinematic data pertaining to the laparoscopic camera and the robotic arm.
- Example 14 The method according to Example 13, further comprising tracking the 3D model on the video feed by tracking position and orientation of the laparoscopic camera based on the kinematic data.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Robotics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Pathology (AREA)
- Gynecology & Obstetrics (AREA)
- Radiology & Medical Imaging (AREA)
- Manipulator (AREA)
Abstract
A surgical robotic system displays a 3D computer model of the surgical site as an augmented reality overlay on a video feed, provided by a stereoscopic laparoscopic camera, on a 3D surgeon display. The system receives and displays the 3D model and provides initial manual registration of a few 3D model landmarks to corresponding anatomical landmarks of the surgical site based on user input consisting of touching the surgical site landmarks with an instrument. The system then performs automatic registration based on the manual confirmation of the landmarks. The system also renders the 3D model based on calibration data of the stereoscopic camera such that the 3D model is rendered using the same parameters as the video feed.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363590077P | 2023-10-13 | 2023-10-13 | |
| US63/590,077 | 2023-10-13 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025078950A1 (fr) | 2025-04-17 |
Family
ID=93378223
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2024/059844 (Pending) | Surgical robotic system and method for integrated control of 3D model data | 2023-10-13 | 2024-10-08 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025078950A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20120289825A1 (en) * | 2011-05-11 | 2012-11-15 | Broncus, Technologies, Inc. | Fluoroscopy-based surgical device tracking method and system |
| US20200315729A1 (en) | 2016-06-03 | 2020-10-08 | Covidien Lp | Control arm assemblies for robotic surgical systems |
| US20210145523A1 (en) * | 2019-11-15 | 2021-05-20 | Verily Life Sciences Llc | Robotic surgery depth detection and modeling |
| US20210196398A1 (en) * | 2019-12-31 | 2021-07-01 | Auris Health, Inc. | Anatomical feature identification and targeting |
| US20220218420A1 (en) * | 2021-01-13 | 2022-07-14 | MediVis, Inc. | Instrument-based registration and alignment for augmented reality environments |
- 2024-10-08: WO PCT/IB2024/059844 patent/WO2025078950A1/fr, active, Pending
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP4150603B1 | | Surgical simulation object rectification system |
| KR102839599B1 | | Handheld user interface device for a surgical robot |
| Falk et al. | | Cardio navigation: planning, simulation, and augmented reality in robotic assisted endoscopic bypass grafting |
| EP2442744B1 | | Virtual measurement tool for minimally invasive surgery |
| EP1937176B1 | | Auxiliary image display and manipulation on a computer display of a medical robotic system |
| US20100169815A1 | | Visual force feedback in a minimally invasive surgical procedure |
| JP2012529970A | | Virtual measurement tool for minimally invasive surgery |
| KR101114232B1 | | Surgical robot system and method for restricting its operation |
| WO2024238729A2 | | Surgical robotic system and method for digital twin generation |
| EP3463163A1 | | Robotic surgical system with integrated imager |
| WO2024042468A1 | | Surgical robotic system and method for intraoperative fusion of different imaging modalities |
| WO2025078950A1 | | Surgical robotic system and method for integrated control of 3D model data |
| WO2024201216A1 | | Surgical robotic system and method for preventing instrument collision |
| EP4543351A1 | | Assisted port placement for robot-assisted or minimally invasive surgery |
| US11786315B2 | | Surgical robotic system having grip-dependent control |
| EP4322814A1 | | Robust surgical scene depth estimation using endoscopy |
| US20250127580A1 | | Surgical robotic system and method for instrument tracking and visualization in near infra-red (NIR) imaging mode using NIR reflection |
| EP4654912A1 | | Surgical robotic system and method for assisted access port placement |
| EP4509087A1 | | Surgical robotic system and method for input scaling compensation for teleoperative latency |
| EP4649371A1 | | Surgical robotic system and method for communication between a surgeon console and a bedside assistant |
| WO2025181641A1 | | Surgical robotic system for non-obstructive overlay display of intraoperative ultrasound images |
| US20250152276A1 | | Surgical robotic system and method for access port size identification |
| WO2025088453A1 | | Surgical robotic system for instrument tracking and visualization in near infra-red (NIR) imaging mode using NIR reflection |
| WO2025052266A1 | | System and method for displacement magnification in white-light laparoscopic video for verification of surgical clamping |
| US20210298854A1 | | Robotically-assisted surgical device, robotically-assisted surgical method, and system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24801340 Country of ref document: EP Kind code of ref document: A1 |