WO2025209993A1 - Robotically coordinated surgical visualization - Google Patents
Robotically coordinated surgical visualization
- Publication number
- WO2025209993A1 (PCT/EP2025/058755)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- surgical
- robotic
- user
- display screen
- anatomy
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/20—Surgical microscopes characterised by non-optical aspects
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/32—Surgical robots operating autonomously
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00216—Electrical control of surgical instruments with eye tracking or head position tracking control
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/371—Surgical systems with images on a monitor during operation with simultaneous use of two cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
Definitions
- the disclosed technology relates generally to medical systems and methods. More particularly, the disclosed technology relates to surgical robots and methods and to apparatus for augmenting actual and virtual visualization of a robotic workspace.
- Enhanced imaging techniques, such as virtual reality and/or augmented reality (VR/AR) capabilities, have been incorporated into surgery in general and robotic surgery in particular.
- VR/AR and other enhanced imaging tools have been suggested, for example, to help users navigate surgical tools through anatomies which lack direct line of sight and where it is difficult to orient the tool and/or distinguish anatomical features.
- Image guidance and visualization can be based, in whole or in part, on preoperative images and intraoperative imaging, including but not limited to computed tomography (CT) imaging, magnetic resonance imaging (MRI), X-ray imaging, fluoroscopic imaging, ultrasound imaging, and the like.
- a tracked tool position can be superimposed on a preoperative or real-time anatomical image which can be shown on a display or projected onto a patient’s skin.
- VR/AR and other enhanced imaging systems intended for robotic surgery have significant drawbacks.
- available and proposed enhanced surgical robotic imaging systems are often passive, e.g. a display or other system component must be positioned and repositioned in the surgical field by the user during the robotic surgery.
- VR/AR and other imaging capabilities are often poorly integrated in surgical robots, typically added onto a navigation or other pre-existing camera. As navigation cameras are often distant from the surgical field, the quality and alignment of the virtual and actual images can be compromised.
- Many present VR/AR displays are inconvenient to deploy and difficult to watch.
- it would be desirable to provide improved imaging and display capabilities suitable for use with surgical robots and robotic surgeries.
- it would be further desirable if the imaging and display capabilities were at least partially integrated with existing robotic technology and/or navigation technology to facilitate implementation and use.
- each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
- FIG. 1 shows a representative embodiment of a surgical robotic system 10 having first, second, and third surgical robotic arms 12, 14, and 16 mounted on a chassis 18.
- the chassis 18 typically comprises a single, rigid frame which provides a base or platform for the three surgical robotic arms 12, 14, and 16, where the surgical robotic arms are placed relatively far apart on the chassis on opposite longitudinal ends thereof, typically approximately one meter apart, thus allowing for desirable attributes such as reachability, maneuverability, and an ability to apply significant force.
- surgical robotic arms 12 and 16 are on a first end 18a of the chassis 18 and surgical robotic arm 14 is on a second end 18b of the chassis.
- the chassis 18 may be mobile, e.g., being in the form of a mobile cart as described in commonly owned WO2022/195460, previously incorporated herein by reference.
- the surgical arms 12, 14, and 16 can be mounted on a base or other structure of a surgical table.
- the surgical robotic arms 12, 14, and 16 may be located on a stable platform that allows the arms to be moved within a common robotic coordinate system under the control of a surgical robotic controller, typically an on-board controller 20, which typically includes a user interface, such as a touch screen, a track pad, a mouse, a joystick, a roller ball, or the like (not shown).
- the chassis 18 of the surgical robotic system 10 may be configured to be temporarily placed under a surgical table 26 when performing the robotic surgical procedure, allowing the first and third surgical robotic arms 12 and 16 to be located on a first lateral side of the surgical table 26 and the second surgical robotic arm 14 to be located on a second lateral side.
- the robotic arms 12, 14, and 16 may optionally be configured to be retracted into the chassis 18 of the robotic surgical system, allowing the system to be moved into or out of the surgical field in a compact configuration.
- a virtual image of the target anatomy VTA can be presented on the screen in apparent alignment with the actual target anatomy TA.
- the robotic controller can determine the alignment of the virtual target anatomy VTA based upon a preoperative image or scan which has been provided to the robotic controller and registered to the surgical robotic coordinate space.
- the controller 20 can then position the display screen 30 along the field-of-view FOV or line-of-sight by kinematically positioning surgical robot arm 14 to locate the display screen along the line-of-sight at a location which does not interfere with other robot functions, such as manipulation of the surgical tool by surgical robot arm 12, and/or the user’s access to the robot and/or the patient (a minimal geometric sketch of this placement appears after this list).
- controller 20 is configured to kinematically and/or optically track the locations of all robotic components as well as tracking the patient P and the user U. Using such real-time locational information, the controller 20 can automatically adjust the positions of the display screen 30, the surgical tool 38, and/or the camera 34, as the user moves or changes her/his line-of-sight or the patient anatomy changes position.
- the controller 72 can be configured to automatically reposition the display screen 66 as the user positions and repositions the microscope 62.
- the user may reposition the microscope 62 through an interface (not shown), causing the controller 72 to instruct the surgical arm 64 to move, as previously described; in other instances the microscope and surgical robotic arm may be configured to be manually repositioned (with the user grasping and physically moving the microscope and robotic arm), with the controller kinematically and/or optically tracking the actual position in the surgical robotic coordinate space.
- the controller 72 may automatically reposition the display screen 66 based upon any one or more of numerous criteria, including but not limited to convenience and “intuitiveness” of the alignment of the display screen with the microscope, the patient, and the target anatomy TA.
- This technique can allow the user, for example, to request that the robotic system position one robotic arm, which holds the virtual/augmented reality screen, perpendicular to a tool that a second robotic arm is holding in relation to a desired location in the anatomical region (see the sketch after this list). This can provide the user with very valuable orienting visualization, minimal discomfort, and unprecedented automation and efficiency.
- a virtual and/or augmented reality screen may incorporate another camera/sensor that detects the user’s eyes/gaze. Accordingly, the robotic arm can actively position the screen not only at the optimal position and angulation towards the patient and relevant anatomy but also at the optimal position and angulation towards the user. This represents a significant optimization of virtual and/or augmented reality in surgery while leaving the user free of the burden and hassle usually associated with the use of virtual and/or augmented reality technology.
- multiple miniature markers 101, 102 may be placed on the relevant anatomy 103, 104 of a spinal surgery patient 105 during a surgical procedure by the physician.
- the miniature markers may optionally be placed with the assistance of preoperative imaging (e.g., CT or MRI), and additionally with the assistance of preoperative planning modalities.
- the markers may be active or passive and may optionally be placed on, for example, several aspects of several vertebrae in the patient’s spine that require surgical intervention.
- the anatomy target(s) and markers can then be acquired and registered by intraoperative imaging (e.g., intraoperative CT); a minimal point-based registration sketch appears after this list.
- several robotic navigation cameras 106, 107, 108 are used that are, in turn, mounted on a corresponding number of robotic arms 109, 110, and 111 that are affixed to a single chassis 112 with a control unit 113.
- a virtual and/or augmented reality screen may also be deployed using said robotic arms.
- the control unit 113 coordinates the movement of the multiple robotic arms and/or the navigation cameras toward the anatomy target, creating a closed feedback loop.
- the navigation camera held at a conventional distance can visualize the entire surgical field and assist in the placement of the other close-in navigation cameras adjacent to their anatomical regions of interest (e.g., adjacent vertebrae with markers already placed on them).
- This closed feedback loop can then be used to guide the deployment of a surgical tool and/or virtual/augmented reality screen that may be robotically brought to the surgical field as an end effector on a robotic arm (a simplified sketch of such a loop appears after this list).
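To illustrate the display-placement behaviour described above (locating the display screen 30 along the user's line-of-sight and facing the user), the following minimal Python/NumPy sketch computes a candidate screen pose from a tracked eye position and gaze direction. The function name, the standoff distance, and the example values are illustrative assumptions and are not part of the disclosure; in a full system the resulting pose would still be handed to the robot's inverse-kinematics and collision-avoidance logic so that the non-interference constraints mentioned above are respected.

```python
import numpy as np

def screen_pose_along_gaze(eye_pos, gaze_dir, standoff=0.6, world_up=(0.0, 0.0, 1.0)):
    """Place a display on the user's line-of-sight, facing back toward the user.

    eye_pos  -- tracked position of the user's eyes in the robotic coordinate space (m)
    gaze_dir -- direction from the eyes toward the target anatomy
    standoff -- assumed comfortable viewing distance at which to hold the screen (m)
    Returns a 4x4 homogeneous pose (screen centre, z-axis pointing at the user).
    """
    gaze = np.asarray(gaze_dir, dtype=float)
    gaze /= np.linalg.norm(gaze)
    centre = np.asarray(eye_pos, dtype=float) + standoff * gaze

    z_axis = -gaze                                # screen normal points back at the user
    up = np.asarray(world_up, dtype=float)
    x_axis = np.cross(up, z_axis)
    x_axis /= np.linalg.norm(x_axis)              # assumes the gaze is not exactly vertical
    y_axis = np.cross(z_axis, x_axis)

    pose = np.eye(4)
    pose[:3, 0], pose[:3, 1], pose[:3, 2] = x_axis, y_axis, z_axis
    pose[:3, 3] = centre
    return pose

# Example: user at the table, looking forward and down toward the target anatomy.
print(np.round(screen_pose_along_gaze([0.0, -0.8, 1.6], [0.0, 0.6, -0.8]), 3))
```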
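The perpendicular-screen constraint mentioned above, in which one robotic arm holds the virtual/augmented reality screen so that it faces along the axis of a tool held by a second robotic arm near the desired anatomical location, can be sketched in the same way. Again, the function and parameter names and the 0.35 m offset are assumptions made only for illustration.

```python
import numpy as np

def screen_pose_facing_tool(tool_tip, tool_axis, offset=0.35, world_up=(0.0, 0.0, 1.0)):
    """Pose a display so its face is perpendicular to the axis of a tool held by another arm.

    tool_tip  -- tool tip position near the desired anatomical location (robot frame, m)
    tool_axis -- direction along the tool shaft, pointing away from the anatomy
    offset    -- assumed distance back along the tool axis at which to hold the screen (m)
    """
    axis = np.asarray(tool_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    centre = np.asarray(tool_tip, dtype=float) + offset * axis   # back away from the anatomy

    z_axis = axis                                  # screen normal parallel to the tool axis
    x_axis = np.cross(np.asarray(world_up, dtype=float), z_axis)
    if np.linalg.norm(x_axis) < 1e-6:              # tool is (nearly) vertical: use another reference
        x_axis = np.cross([1.0, 0.0, 0.0], z_axis)
    x_axis /= np.linalg.norm(x_axis)
    y_axis = np.cross(z_axis, x_axis)

    pose = np.eye(4)
    pose[:3, :3] = np.column_stack([x_axis, y_axis, z_axis])
    pose[:3, 3] = centre
    return pose

# Example: tool pointing straight down into the anatomy; screen held 0.35 m above the tip.
print(np.round(screen_pose_facing_tool([0.10, 0.20, 0.05], [0.0, 0.0, 1.0]), 3))
```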
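The marker acquisition and registration step referenced above can be illustrated with a standard point-based rigid registration (the Kabsch/SVD method), which estimates the transform between marker coordinates segmented from the intraoperative image and the same markers as localized in the surgical robotic coordinate space. This is a generic sketch of that well-known technique, not the specific algorithm of the disclosure; all names and example coordinates are assumptions.

```python
import numpy as np

def register_markers(markers_image, markers_robot):
    """Rigid (rotation + translation) registration between two marker point sets.

    markers_image -- (N, 3) marker coordinates segmented from the intraoperative scan
    markers_robot -- (N, 3) the same markers as localized by the navigation camera(s)
    Returns a 4x4 transform mapping image coordinates into robot coordinates.
    """
    A = np.asarray(markers_image, dtype=float)
    B = np.asarray(markers_robot, dtype=float)
    a0, b0 = A.mean(axis=0), B.mean(axis=0)

    H = (A - a0).T @ (B - b0)                                     # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # guard against reflections
    R = Vt.T @ D @ U.T
    t = b0 - R @ a0

    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Hypothetical example: three fiducial markers on adjacent vertebrae (mm).
img = np.array([[0.0, 0.0, 0.0], [30.0, 5.0, 2.0], [15.0, 40.0, -3.0]])
rob = (img @ np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]).T) + [100.0, 50.0, 20.0]
T_image_to_robot = register_markers(img, rob)

# Any preoperative/intraoperative anatomy point can now be mapped into robot space,
# which is how a virtual target anatomy can be shown in apparent alignment with the real one.
print(T_image_to_robot @ np.array([10.0, 10.0, 0.0, 1.0]))
```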
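Finally, the closed feedback loop referred to above, in which the control unit 113 coordinates the arm-mounted navigation cameras toward the marked anatomy, can be reduced to a very simple proportional update for illustration. The hovering standoff, the gain, and all names are illustrative assumptions; a real controller would add the inverse kinematics, collision avoidance, and sensor fusion that the disclosure implies.

```python
import numpy as np

def feedback_step(camera_positions, marker_positions, standoff=0.25, gain=0.5):
    """One iteration of a (highly simplified) closed feedback loop.

    camera_positions -- current close-in camera positions in the robot frame (m)
    marker_positions -- marker locations reported by the wide-view navigation camera (m)
    Returns updated camera position targets, each nudged toward a point hovering
    `standoff` metres above its assigned marker.
    """
    targets = []
    for cam, marker in zip(camera_positions, marker_positions):
        cam = np.asarray(cam, dtype=float)
        goal = np.asarray(marker, dtype=float) + np.array([0.0, 0.0, standoff])
        targets.append(cam + gain * (goal - cam))     # proportional step toward the goal
    return targets

# Example: two close-in cameras converging on two marked vertebrae.
cams = [[0.3, 0.0, 0.6], [-0.3, 0.1, 0.6]]
markers = [[0.05, 0.0, 0.1], [-0.05, 0.02, 0.1]]
for _ in range(5):                                    # the loop would run continuously,
    cams = feedback_step(cams, markers)               # re-reading marker positions each cycle
print(np.round(cams, 3))
```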
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Robotics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Pathology (AREA)
- Manipulator (AREA)
Abstract
Coordinated, robotically controlled surgical navigation systems include display devices which may have virtual and/or augmented reality capabilities. The multi-arm robotic systems used include cameras, tools, and virtual and/or augmented reality screens. The robotic arms are deployed on a chassis incorporating a control unit. Multiple robotic elements may be attached to the single base, may be controlled by the single control unit, and may be used in a coordinated manner to deploy and/or be associated with trackers, cameras, virtual and/or augmented reality screens, and surgical instruments as part of a robot-assisted surgical procedure, which may optionally be a robot-assisted spinal surgical procedure.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/628,142 | 2024-04-05 | | |
| US18/628,142 US20240268919A1 (en) | 2021-10-21 | 2024-04-05 | Robotically coordinated surgical visualization |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025209993A1 (fr) | 2025-10-09 |
Family
ID=95309987
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2025/058755 (WO2025209993A1, pending) | Robotically coordinated surgical visualization | 2024-04-05 | 2025-03-31 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025209993A1 (fr) |
Citations (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160191887A1 (en) * | 2014-12-30 | 2016-06-30 | Carlos Quiles Casas | Image-guided surgery with surface reconstruction and augmented reality visualization |
| US9918681B2 (en) | 2011-09-16 | 2018-03-20 | Auris Surgical Robotics, Inc. | System and method for virtually tracking a surgical tool on a movable display |
| EP3445048A1 (fr) * | 2017-08-15 | 2019-02-20 | Holo Surgical Inc. | Interface utilisateur graphique pour un système de navigation chirurgical pour fournir une image de réalité augmentée pendant le fonctionnement |
| US20190088162A1 (en) | 2016-03-04 | 2019-03-21 | Covidien Lp | Virtual and/or augmented reality to provide physical interaction training with a surgical robot |
| WO2020084625A1 (fr) | 2018-10-25 | 2020-04-30 | Beyeonics Surgical Ltd. | Iu destinée à un système de visiocasque |
| US20210093404A1 (en) | 2019-09-27 | 2021-04-01 | Globus Medical, Inc. | Surgical robot with passive end effector |
| WO2021250577A1 (fr) * | 2020-06-09 | 2021-12-16 | Stryker Leibinger Gmbh & Co. Kg | Dispositifs d'affichage sensibles à l'espace pour interventions assistées par ordinateur |
| WO2022195460A1 (fr) | 2021-03-16 | 2022-09-22 | Lem Surgical Ag | Système robotisé chirurgical bilatéral |
| WO2022197550A1 (fr) * | 2021-03-15 | 2022-09-22 | Relievant Medsystems, Inc. | Systèmes de colonne vertébrale robotique et procédés assistés par robot pour modulation de tissu |
| WO2023067415A1 (fr) | 2021-10-21 | 2023-04-27 | Lem Surgical Ag | Réalité virtuelle ou augmentée coordonnée robotisée |
| WO2023118985A1 (fr) | 2021-12-20 | 2023-06-29 | Lem Surgical Ag | Endoscopie spinale robotique bilatérale |
| WO2023118984A1 (fr) | 2021-12-20 | 2023-06-29 | Lem Surgical Ag | Fraisage d'os robotisé synchronisé |
| WO2023144602A1 (fr) | 2022-01-25 | 2023-08-03 | Lem Surgical Ag | Étalonnage robotisé peropératoire et dimensionnement d'outils chirurgicaux |
| WO2023152561A1 (fr) | 2022-02-10 | 2023-08-17 | Lem Surgical Ag | Système mobile pour alimentation d'outil robotique bilatéral |
| WO2023223215A1 (fr) | 2022-05-16 | 2023-11-23 | Lem Surgical Ag | Dispositif de préhension d'outil à obturateur concentrique intégré |
| WO2023237922A1 (fr) | 2022-06-06 | 2023-12-14 | Lem Surgical Ag | Système robotisé d'évitement de collision active et dynamique |
- 2025-03-31: WO application PCT/EP2025/058755 published as WO2025209993A1 (fr); status: active, pending
Patent Citations (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9918681B2 (en) | 2011-09-16 | 2018-03-20 | Auris Surgical Robotics, Inc. | System and method for virtually tracking a surgical tool on a movable display |
| US20210289188A1 (en) | 2014-12-30 | 2021-09-16 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery |
| US20160191887A1 (en) * | 2014-12-30 | 2016-06-30 | Carlos Quiles Casas | Image-guided surgery with surface reconstruction and augmented reality visualization |
| US20190088162A1 (en) | 2016-03-04 | 2019-03-21 | Covidien Lp | Virtual and/or augmented reality to provide physical interaction training with a surgical robot |
| EP3445048A1 (fr) * | 2017-08-15 | 2019-02-20 | Holo Surgical Inc. | Interface utilisateur graphique pour un système de navigation chirurgical pour fournir une image de réalité augmentée pendant le fonctionnement |
| WO2020084625A1 (fr) | 2018-10-25 | 2020-04-30 | Beyeonics Surgical Ltd. | Iu destinée à un système de visiocasque |
| US20210093404A1 (en) | 2019-09-27 | 2021-04-01 | Globus Medical, Inc. | Surgical robot with passive end effector |
| WO2021250577A1 (fr) * | 2020-06-09 | 2021-12-16 | Stryker Leibinger Gmbh & Co. Kg | Dispositifs d'affichage sensibles à l'espace pour interventions assistées par ordinateur |
| WO2022197550A1 (fr) * | 2021-03-15 | 2022-09-22 | Relievant Medsystems, Inc. | Systèmes de colonne vertébrale robotique et procédés assistés par robot pour modulation de tissu |
| WO2022195460A1 (fr) | 2021-03-16 | 2022-09-22 | Lem Surgical Ag | Système robotisé chirurgical bilatéral |
| WO2023067415A1 (fr) | 2021-10-21 | 2023-04-27 | Lem Surgical Ag | Réalité virtuelle ou augmentée coordonnée robotisée |
| WO2023118985A1 (fr) | 2021-12-20 | 2023-06-29 | Lem Surgical Ag | Endoscopie spinale robotique bilatérale |
| WO2023118984A1 (fr) | 2021-12-20 | 2023-06-29 | Lem Surgical Ag | Fraisage d'os robotisé synchronisé |
| WO2023144602A1 (fr) | 2022-01-25 | 2023-08-03 | Lem Surgical Ag | Étalonnage robotisé peropératoire et dimensionnement d'outils chirurgicaux |
| WO2023152561A1 (fr) | 2022-02-10 | 2023-08-17 | Lem Surgical Ag | Système mobile pour alimentation d'outil robotique bilatéral |
| WO2023223215A1 (fr) | 2022-05-16 | 2023-11-23 | Lem Surgical Ag | Dispositif de préhension d'outil à obturateur concentrique intégré |
| WO2023237922A1 (fr) | 2022-06-06 | 2023-12-14 | Lem Surgical Ag | Système robotisé d'évitement de collision active et dynamique |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20240268919A1 (en) | Robotically coordinated surgical visualization | |
| CN113274128B (zh) | 外科手术系统 | |
| CN113243990B (zh) | 手术系统 | |
| US20230263586A1 (en) | Systems and methods for surgical navigation, including image-guided navigation of a patient's head | |
| US20250228624A1 (en) | Extended reality systems with three-dimensional visualizations of medical image scan slices | |
| EP4054468B1 (fr) | Positionnement robotique d'un dispositif | |
| JP7662627B2 (ja) | Ent処置の可視化システム及び方法 | |
| EP3720334B1 (fr) | Système et procédé d'assistance à la visualisation durant une procédure | |
| JP2022133440A (ja) | ナビゲーション手術における拡張現実ディスプレイのためのシステム及び方法 | |
| US5572999A (en) | Robotic system for positioning a surgical instrument relative to a patient's body | |
| JP2019532693A5 (fr) | ||
| EP4079247B1 (fr) | Système de navigation chirurgicale assistée par ordinateur pour des interventions sur la colonne vertébrale | |
| WO2010067267A1 (fr) | Caméra sans fil montée sur la tête et unité d'affichage | |
| EP4452092A1 (fr) | Endoscopie spinale robotique bilatérale | |
| JP2021122743A (ja) | ナビゲートされたロボット外科手術のためのエクステンデッドリアリティ器具相互作用ゾーン | |
| US20250049515A1 (en) | Surgical robot system and control method | |
| CN212490140U (zh) | 手术导航系统 | |
| WO2025209993A1 (fr) | Visualisation chirurgicale coordonnée de manière robotisée | |
| EP3936080A1 (fr) | Imagerie médicale dirigée | |
| EP4487803B1 (fr) | Technique d'assistance du personnel clinique avec de multiples vues de navigation | |
| WO2024180546A1 (fr) | Déplacement automatisé d'un dispositif de localisation optique pour une ligne de visée optimale avec des suiveurs optiques | |
| Vilsmeier et al. | Introduction of the Passive Marker Neuronavigation System VectorVision |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25717172; Country of ref document: EP; Kind code of ref document: A1 |