
WO2024236564A1 - Systems for positioning a surgical robot - Google Patents

Systems for positioning a surgical robot

Info

Publication number
WO2024236564A1
WO2024236564A1 (application PCT/IL2024/050466)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
location
desired location
processor
robotic arm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IL2024/050466
Other languages
English (en)
Inventor
Dany JUNIO
Ran Shaham
Yonathan WEISS
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mazor Robotics Ltd
Original Assignee
Mazor Robotics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mazor Robotics Ltd filed Critical Mazor Robotics Ltd
Priority to CN202480032086.8A (CN121127197A)
Publication of WO2024236564A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1615 Programme controls characterised by special kind of manipulator, e.g. planar, scara, gantry, cantilever, space, closed chain, passive/active joints and tendon driven manipulators
    • B25J 9/162 Mobile manipulator, movable base with manipulator arm mounted on it
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods
    • A61B 17/16 Instruments for performing osteoclasis; Drills or chisels for bones; Trepans
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/107 Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2059 Mechanical position encoders
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/08 Accessories or related features not otherwise provided for
    • A61B 2090/0807 Indication means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 34/32 Surgical robots operating autonomously
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 50/00 Containers, covers, furniture or holders specially adapted for surgical or diagnostic appliances or instruments, e.g. sterile covers
    • A61B 50/10 Furniture specially adapted for surgical or diagnostic appliances or instruments
    • A61B 50/13 Trolleys, e.g. carts

Definitions

  • the present disclosure is generally directed to surgical procedures, and relates more particularly to robotic surgical procedures.
  • Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure, or may complete one or more surgical procedures autonomously.
  • Providing controllable linked articulating members allows a surgical robot to reach areas of a patient anatomy during various medical procedures.
  • Example aspects of the present disclosure include:
  • a method comprises: determining a desired location in which a robot is to be positioned; receiving information about a current location of the robot; rendering, to a display, a heatmap depicting at least one of the current location of the robot and the desired location of the robot; determining a difference between the desired location and the current location; and providing, based on the difference, a recommended adjustment to a pose of the robot to move the robot toward the desired location depicted by the heatmap.
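  • To make the relationship among these steps concrete, the following is a minimal, illustrative sketch of the guidance computation; the function names, coordinate convention, and tolerance value are assumptions rather than part of the disclosure:

        import numpy as np

        def recommended_adjustment(desired, current):
            # Difference between the desired location and the current location.
            return np.asarray(desired, dtype=float) - np.asarray(current, dtype=float)

        def guidance_step(desired, current, tolerance=0.01):
            adjustment = recommended_adjustment(desired, current)
            if float(np.linalg.norm(adjustment)) <= tolerance:
                return None  # the robot is already at (or close enough to) the desired location
            # Vector pointing from the current location toward the desired location;
            # this is what would be presented alongside the heatmap as a recommendation.
            return adjustment

        # Example: the robot sits 0.5 m short of the desired location along x.
        print(guidance_step([1.0, 2.0, 0.0], [0.5, 2.0, 0.0]))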
  • any of the features herein further comprising: causing a robotic arm of the robot to move into a second pose, wherein the robotic arm, when in the second pose, interacts with a working volume when the robot is in the desired location.
  • any of the features herein further comprising: causing an imaging device to project a light field, wherein the light field illuminates the desired location or a path to the desired location.
  • the light field illuminates the desired location with a first intensity of light
  • the light field illuminates the current location of the robot with a second intensity of light different from the first intensity of light
  • any of the features herein further comprising: determining that the robot has moved from the current location to an intermediate location; generating, when the intermediate location is closer to the desired location than the current location, a first indicator; and generating, when the intermediate location is further away from the desired location than the current location, a second indicator.
  • any of the features herein, wherein at least one of the first indicator and the second indicator comprises at least one of an audio indicator, a visual indicator, and a haptic indicator.
  • the first indicator is generated at a first intensity when the intermediate location is within a threshold distance from the desired location, and wherein the first indicator is generated at a second intensity different from the first intensity when the intermediate location is outside of the threshold distance from the desired location.
  • a system comprises: a processor; and a memory coupled to the processor and storing data thereon that, when executed by the processor, enable the processor to: determine a desired location for positioning a robot; receive information describing a current location of the robot; render, to a display, a heatmap depicting at least one of the current location and the desired location; determine a difference between the desired location and the current location; and provide, based on the difference, a recommended adjustment to a pose of the robot to move the robot toward the desired location depicted by the heatmap.
  • the data further enable the processor to: cause a robotic arm of the robot to move into a second pose, wherein the robotic arm interacts with a working volume when the robot is in the desired location and when the robotic arm is in the second pose.
  • any of the features herein, wherein the data further enable the processor to: cause an imaging device to project a light field that illuminates the desired location or a path to the desired location with a first intensity of light.
  • any of the features herein, wherein the data further enable the processor to: determine that the robot has moved from the current location to an intermediate location; generate, when the intermediate location is closer to the desired location than the current location, a first indicator; and generate, when the intermediate location is further away from the desired location than the current location, a second indicator.
  • any of the features herein, wherein the data further enable the processor to: cause a surgical tool to emit one or more laser lines that illuminate the desired location.
  • a surgical system comprises: a robot including a robotic arm; a processor coupled to at least one of the robot and the robotic arm; and a memory coupled to the processor and storing data thereon that, when executed by the processor, enable the processor to: determine a desired location for positioning the robot; receive information associated with a current location of the robot; cause the robotic arm to move into a first pose, wherein the robotic arm interacts with a working volume when the robot is in the desired location and when the robotic arm is in the first pose; determine a difference between the desired location and the current location; and provide, based on the difference, a recommended adjustment to a second pose of the robot to move the robot toward the desired location.
  • any of the features herein, wherein the data further enable the processor to: render, to a display, a heatmap depicting at least one of the current location of the robot and the desired location of the robot.
  • any of the features herein, wherein the data further enable the processor to: determine that the robot has moved from the current location to an intermediate location; generate, when the intermediate location is closer to the desired location than the current location, a first indicator; and generate, when the intermediate location is further away from the desired location than the current location, a second indicator.
  • each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • each one of A, B, and C in the above expressions refers to an element, such as X, Y, and Z, or a class of elements, such as X1-Xn, Y1-Ym, and Z1-Zo.
  • the phrase is intended to refer to a single element selected from X, Y, and Z, a combination of elements selected from the same class (e.g., X1 and X2), as well as a combination of elements selected from two or more classes (e.g., Y1 and Zo).
  • Fig. 1 is a block diagram of aspects of a system according to at least one embodiment of the present disclosure.
  • Fig. 2A is a diagram of additional aspects of the system according to at least one embodiment of the present disclosure.
  • Fig. 2B is a diagram of a robotic arm moving according to at least one embodiment of the present disclosure.
  • Fig. 2C is a diagram of an imaging device projecting a light field according to at least one embodiment of the present disclosure.
  • Fig. 2D is a diagram of a robotic arm moving according to at least one embodiment of the present disclosure.
  • Fig. 2E is a diagram of a heatmap according to at least one embodiment of the present disclosure.
  • Fig. 3 is a flowchart according to at least one embodiment of the present disclosure.
  • Fig. 4 is a flowchart according to at least one embodiment of the present disclosure.
  • Fig. 5 is a flowchart according to at least one embodiment of the present disclosure.
  • Fig. 6 is a flowchart according to at least one embodiment of the present disclosure.
  • the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions).
  • Computer- readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
  • processors such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or A10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • the step of positioning a surgical robot is an additional step that requires additional time and effort on the part of the surgeons and operating room (OR) staff.
  • the surgeons and OR staff have a lot on their minds during the procedure, and the positioning of the surgical robot may require them to maintain a substantial amount of training on a large variety of clinical and technical aspects and devices from many different vendors and for different uses/procedures. This in turn drives a constant desire to reduce the complexity and time consumption of the setup process, of which robot positioning is one of the larger contributors.
  • a large part of the positioning step is understanding the distances and angulations needed to position the robot in an optimal or satisfactory location relative to the area of interest.
  • the positioning step may be at times hard to execute for the surgeons and OR staff and/or it may be difficult to provide the necessary training for the surgeons and OR staff.
  • Embodiments of the present disclosure may address the aforementioned issues by providing for faster and/or simpler robotic cart or device positioning next to a patient undergoing or about to undergo a surgical procedure.
  • one or more positioning techniques may be used in combination with one another to increase efficiency in positioning the robotic cart or device.
  • the robot may be physically positioned in such a manner that the user (e.g., a surgeon, a member of the OR staff, etc.) receives a physical indication of the distance and/or angulation of the robot relative to the area of interest.
  • the end effector of the robot may point at the optimal distance to enable the robot to have optimized reachability.
  • the robot joints or base may be angled in a way that indicates to the user how to angle the robot joints or base relative to the area of interest.
  • the robot may have a light field indicating the ideal location and angulation at which to position the robot.
  • the robot may indicate to the user by motion of one or more joints the direction in which to move to achieve the optimal position.
  • the motion of the one or more joints may indicate to the user how close the robot is to achieving the optimal position.
  • a heatmap may be rendered to a monitor.
  • the heatmap may depict an optimal location for the robot, such that the user can move the robot to the optimal position.
  • the heatmap may be updated on the monitor in real time.
  • the user may receive an audio or visual indicator as to how to position the robot.
  • one or more lights may be shone and/or one or more beeps may be emitted based on how close the robot is to the correct position.
  • one or more laser lines may be shone on the floor to show the user the correct position of the robot.
  • Embodiments of the present disclosure provide technical solutions to one or more of the problems of (1) incorrect positioning of a surgical robot and (2) prolonged robot positioning during a surgery or surgical procedure.
  • a block diagram of a system 100 according to at least one embodiment of the present disclosure is shown.
  • the system 100 may be or comprise a surgical system in some cases.
  • the system 100 may be used to provide guidance in aligning a robot and/or components thereof; to control, pose, and/or otherwise manipulate a surgical mount system, a surgical arm, and/or surgical tools attached thereto; and/or carry out one or more other aspects of one or more of the methods disclosed herein.
  • the system 100 comprises a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud 134.
  • Systems according to other embodiments of the present disclosure may comprise more or fewer components than the system 100.
  • the system 100 may not include one or more components of the computing device 102, the database 130, and/or the cloud 134.
  • the computing device 102 comprises a processor 104, a memory 106, a communication interface 108, and a user interface 110.
  • Computing devices according to other embodiments of the present disclosure may comprise more or fewer components than the computing device 102.
  • the processor 104 of the computing device 102 may be any processor described herein or any similar processor.
  • the processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud 134.
  • the memory 106 may be or comprise RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer- readable data and/or instructions.
  • the memory 106 may store information or data useful for completing, for example, any step of the methods 300, 400, 500, and/or 600 described herein, or of any other methods.
  • the memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114.
  • the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enable image processing 120, segmentation 122, transformation 124, and/or registration 128.
  • Such content may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines.
  • the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the processor 104 to carry out the various method and features described herein.
  • various contents of memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models.
  • the data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, and/or the cloud 134.
  • the computing device 102 may also comprise a communication interface 108.
  • the communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100), and/or for transmitting instructions, images, or other information to an external system or device
  • the communication interface 108 may comprise one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth).
  • the communication interface 108 may be useful for enabling the computing device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
  • the computing device 102 may also comprise one or more user interfaces 110.
  • the user interface 110 may be or comprise a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
  • the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100).
  • the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
  • the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102.
  • the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
  • the imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.).
  • image data refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
  • the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof.
  • the image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
  • a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
  • the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
  • the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
  • the imaging device 112 may be contained entirely within a single housing, or may comprise a transmitter/emitter and a receiver/ detector that are in separate housings or are otherwise physically separated.
  • the imaging device 112 may comprise more than one imaging device 112.
  • a first imaging device may provide first image data and/or a first image
  • a second imaging device may provide second image data and/or a second image.
  • the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
  • the imaging device 112 may be operable to generate a stream of image data.
  • the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
  • image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
  • the robot 114 may be any surgical robot or surgical robotic system.
  • the robot 114 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system.
  • the robot 114 may be configured to position the imaging device 112 at one or more precise position(s) and orientation(s), and/or to return the imaging device 112 to the same position(s) and orientation(s) at a later point in time.
  • the robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task.
  • the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure.
  • the robot 114 may comprise one or more robotic arms 116.
  • the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms.
  • one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112.
  • where the imaging device 112 comprises two or more physically separate components (e.g., a transmitter and receiver), one robotic arm 116 may hold one such component, and another robotic arm 116 may hold another such component.
  • Each robotic arm 116 may be positionable independently of the other robotic arm.
  • the robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
  • the robot 114 together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
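  • As an illustration of the pose concept described above (a position coupled with an orientation), the following minimal sketch pairs a 3D position with a quaternion orientation; the class name and quaternion convention are assumptions:

        from dataclasses import dataclass
        import numpy as np

        @dataclass
        class Pose:
            position: np.ndarray     # 3-vector (x, y, z) in the chosen coordinate space
            orientation: np.ndarray  # unit quaternion (w, x, y, z) describing the rotation

            def translational_distance_to(self, other: "Pose") -> float:
                # Translational distance only; angular distance would be handled separately.
                return float(np.linalg.norm(self.position - other.position))

        p1 = Pose(np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0, 0.0]))
        p2 = Pose(np.array([0.3, 0.4, 0.0]), np.array([1.0, 0.0, 0.0, 0.0]))
        print(p1.translational_distance_to(p2))  # 0.5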
  • the robotic arm(s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm 116 (as well as any object or element held by or secured to the robotic arm 116).
  • reference markers (e.g., navigation markers) may be placed on the robot 114 (including, e.g., on the robotic arm 116), the imaging device 112, or any other object in the surgical space.
  • the reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof.
  • the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras or other sensor(s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located.
  • the one or more cameras may be optical cameras, infrared cameras, or other cameras.
  • the navigation system 118 may comprise one or more electromagnetic sensors.
  • the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing).
  • the navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
  • the system 100 can operate without the use of the navigation system 118.
  • the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
  • the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system).
  • the database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the system 100); one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information.
  • the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 134.
  • the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
  • the cloud 134 may be or represent the Internet or any other wide area network.
  • the computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both.
  • the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
  • the system 100 or similar systems may be used, for example, to carry out one or more aspects of any of the methods 300, 400, 500, and/or 600 described herein.
  • the system 100 or similar systems may also be used for other purposes.
  • Fig. 2A is a diagram depicting additional aspects of the system 100 according to at least one embodiment of the present disclosure.
  • a patient 204 may be positioned on a bed 208 to undergo a surgery or surgical procedure, with the imaging device 112, the robot 114, and the robotic arm 116 positioned proximate the patient 204.
  • the patient 204 may be positioned on the bed 208.
  • the bed 208 may be any operating bed or table configured to support a patient during a surgical procedure.
  • the bed 208 may include any accessories mounted to or otherwise coupled to the bed 208 such as, for example, a bed rail, a bed rail adaptor, an arm rest, an extender, or the like.
  • the bed 208 may be stationary or may be operable to maneuver the patient 204 (e.g., the bed 208 may be able to move).
  • the bed 208 has two positioning degrees of freedom and one rotational degree of freedom, which allows positioning of the specific anatomy of the patient anywhere in space (within a volume defined by the limits of movement of the bed 208).
  • the bed 208 can slide forward and backward and from side to side, and can tilt (e.g., around an axis positioned between the head and foot of the bed 208 and extending from one side of the bed 208 to the other) and/or roll (e.g., around an axis positioned between the two sides of the bed 208 and extending from the head of the bed 208 to the foot thereof).
  • the bed 208 can bend at one or more areas (which bending may be possible due to, for example, the use of a flexible surface for the bed 208, or by physically separating one portion of the bed 208 from another portion of the bed 208 and moving the two portions independently).
  • the bed 208 may be manually moved or manipulated by, for example, a surgeon or other user, or the bed 208 may comprise one or more motors, actuators, and/or other mechanisms configured to enable movement and/or manipulation of the bed 208 by a processor such as the processor 104.
  • the patient 204 may have a working volume 212.
  • the working volume 212 may be or represent a volume relevant to the surgery or surgical procedure.
  • the working volume 212 may contain patient anatomy (e.g., anatomical elements) that are to be manipulated or operated on during the course of the surgery or surgical procedure.
  • the working volume 212 may represent the volume through which surgical tools, such as those attached to and driven by the robot 114 and/or the robotic arm 116, may move during the course of the surgery or surgical procedure.
  • the patient anatomy may comprise one or more vertebrae (e.g., lumbar vertebrae), such as when the surgical procedure comprises a spinal fusion.
  • the patient 204 may be positioned on the bed 208 in a prone position, with the working volume 212 defined as a volume that includes the one or more vertebrae.
  • Portions of the robot 114, an end effector of the robotic arm 116, and the like may move into, through, and/or out of the working volume 212 during the course of the surgery or surgical procedure.
  • a surgical drill or other tool coupled to the end effector of the robotic arm 116 may enter the working volume 212 to drill into one or more vertebrae.
  • the robot 114 and/or components thereof such as the robotic arm 116 may be positioned on a robot cart 216.
  • the robot cart 216 may be a mobile platform that enables the robot 114 to be positioned relative to the bed 208.
  • the robot cart 216 may comprise wheels that enable the robot cart 216 to roll or move relative to the bed 208.
  • the robot cart 216 may be detachable from the wheels, or the wheels may be lockable such that, once the robot cart 216 is positioned in a desired location 236 relative to the bed 208 and/or the working volume 212, the robot cart 216 will remain in the desired location 236.
  • the robot cart 216 may have a mechanism that enables the robot cart 216 to remain fixed relative to the bed 208 and/or the working volume 212.
  • the mechanism may better ensure that the robot 114 and/or any other components on the robot cart 216 do not move relative to the bed 208 and/or the working volume 212 due to the mobility of the robot cart 216 once the robot cart 216 has been positioned in the desired location 236.
  • the desired location 236 may more generally be a location in which the robotic arm 116 is correctly aligned for performing one or more tasks associated with the surgery or surgical procedure while the first location 232 may be a location in which the robotic arm 116 is initially positioned or a location other than the desired location 236.
  • the first location 232 may be a location where the robot cart 216 is positioned next to the bed 208 with the robotic arm 116 positioned within the working volume 212.
  • the desired location 236 may then be a location where the robotic arm 116 is rotated a first angle (e.g., 2 degrees, 5 degrees, 10 degrees, 15 degrees, 30 degrees, etc.) from the first location 232, such that the robotic arm 116 is more closely positioned within and/or has increased or optimized access to the working volume 212. Stated differently, the robotic arm 116 may be able to reach or access certain portions of the working volume 212 when in the desired location 236 that may be otherwise inaccessible when the robotic arm 116 is at the first location 232.
  • Fig. 2B is a diagram of the robotic arm 116 moving according to at least one embodiment of the present disclosure. While Fig. 2B depicts the robotic arm 116 moving, it is to be understood that the robot 114 or components thereof may perform any movement discussed herein with respect to the robotic arm 116.
  • the robotic arm 116 may move from a first pose 220A to a second pose 220B.
  • one or more portions of the robotic arm 116 (e.g., linkages, the end effector, the base, etc.) may move as the robotic arm 116 transitions from the first pose 220A to the second pose 220B.
  • the movement of the robotic arm 116 may indirectly indicate to a user (e.g., a surgeon, a member of surgical staff, etc.) the desired location 236 relative to the robotic arm 116.
  • the robotic arm 116 may be unable to interact with the working volume 212.
  • the robotic arm 116 may be able to interact with the working volume 212 when the robot cart 216 is positioned at the desired location 236.
  • the robotic arm 116 may be optimized to access the working volume 212 when in the second pose 220B as compared to when the robotic arm 116 is in the first pose 220A (e.g., one or more portions of the robotic arm 116, such as the end effector, is able to access a greater amount of volume in the working volume 212 when in the second pose 220B than in the first pose 220A).
  • the user may move the robot cart 216 such that the robotic arm 116 interacts with the working volume 212 while the robotic arm 116 is in the second pose 220B.
  • one or more components of the robot 114 may rotate.
  • the base of the robotic arm 116 may rotate by a first angle (e.g., 2 degrees, 5 degrees, 10 degrees, 30 degrees, 60 degrees, 90 degrees, 120 degrees, etc.) when the robotic arm 116 moves from the first pose 220A into the second pose 220B.
  • the change in the angle of the base of the robotic arm 116 may indicate to the user how the robotic arm 116 should be angled relative to the working volume 212.
  • the robotic arm 116 may rotate to indicate to the user that robot cart 216 should be positioned such that the rotated robotic arm 116 is positioned in the working volume 212 when the robot cart 216 is at the desired location 236.
  • the robotic arm 116 may rotate 90 degrees clockwise when moving from the first pose 220A to the second pose 220B.
  • the rotation of the robotic arm 116 may enable the robotic arm 116 to interact with the working volume 212 when the robot cart 216 carrying the robotic arm 116 is positioned on the right-hand side of the patient 204, but such rotation of the robotic arm 116 may prevent the robotic arm 116 from interacting with the working volume 212 when the robot cart 216 is positioned on the left-hand side of the patient 204.
  • the user may realize that the desired location 236 is on the right-hand side of the patient 204, and that the robot cart 216 should be positioned on the right-hand side of the patient 204 to enable the robotic arm 116 to interact with the working volume 212.
  • the robot cart 216 may be moved relative to the bed 208 based on the second pose 220B of the robotic arm 116.
  • the robotic arm 116 may interact with the working volume 212 while in the second pose 220B, so the robot cart 216 may then be rolled toward the bed 208, such that the robotic arm 116 enters into the working volume 212.
  • the processor 104 may generate a recommended movement of the robot cart 216 to move the robotic arm 116 into the desired location 236.
  • the processor 104 may generate one or more recommendations (which may be rendered to a display such as the user interface 110) indicating that the robot cart 216 should be moved to the right-hand side of the patient 204.
  • the robotic arm 116 may begin in the first pose 220A when the robot cart 216 is positioned at a first location 232.
  • the robotic arm 116 may be in the first pose 220A due to a prepositioning step performed by the user.
  • the surgical staff may predispose the robotic arm 116 in the first pose 220A or any other pose other than the second pose 220B, such as when the robot cart 216 is first brought into the operating room.
  • the robotic arm 116 may be caused (e.g., by a processor 104) to move into the second pose 220B.
  • the second pose 220B may be or comprise a predetermined pose provided in a surgical plan and accessed by the processor 104 through, for example, the database 130.
  • the predetermined pose may be determined by the processor 104 using, for example, one or more transformations 124 to transform coordinates associated with the robot cart 216 when in the desired location 236 to a corresponding pose of the robotic arm 116.
  • the processor 104 may determine the coordinates of the robot cart 216 when the robot cart 216 is at the desired location 236, and determine the second pose 220B based on the known coordinates of the robotic arm 116 relative to the robot cart 216. The processor 104 may then move the robotic arm 116 into the second pose 220B.
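  • One way such a transformation could be carried out is sketched below, assuming a fixed, known arm-to-cart offset and 4x4 homogeneous transforms; all numeric values are illustrative:

        import numpy as np

        def pose_matrix(rotation, translation):
            # Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation.
            T = np.eye(4)
            T[:3, :3] = rotation
            T[:3, 3] = translation
            return T

        # Cart pose when the cart sits at the desired location 236 (world frame, illustrative values).
        cart_at_desired = pose_matrix(np.eye(3), np.array([1.2, 0.4, 0.0]))
        # Fixed, known offset of the robotic arm base relative to the cart (cart frame).
        arm_in_cart = pose_matrix(np.eye(3), np.array([0.0, 0.1, 0.9]))
        # Target pose of the arm base in the world frame, from which a second pose 220B could be derived.
        arm_target_in_world = cart_at_desired @ arm_in_cart
        print(arm_target_in_world[:3, 3])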
  • Fig. 2C is a diagram of the imaging device 112 generating a light field 224 according to at least one embodiment of the present disclosure.
  • the light field 224 may be generated by the imaging device 112 and/or one or more other imaging devices.
  • Fig. 2C depicts a single imaging device 112 generating and emitting the light field 224
  • the light field 224 may be generated using a combination of other, non-depicted imaging devices.
  • the light field 224 may be projected by the robot 114 or the robotic arm 116.
  • the imaging device 112 may be held by the robot 114 and/or the robotic arm 116.
  • the navigation system 118 may track the position of the imaging device 112 and cause the imaging device 112 to emit the light field 224 once the imaging device 112 has been positioned in a first pose. While in the first pose, the imaging device 112 may be able to emit the light field 224 to illuminate the desired location 236 and/or a path from the first location 232 to the desired location 236.
  • the light field 224 illuminates one or more locations in the operating room.
  • the light field 224 illuminates the first location 232, which may be the location at which the robot cart 216 is currently positioned.
  • the first location 232 may be the location at which the robot cart 216 is initially positioned by the user. Additionally or alternatively, the first location 232 may be or represent any location of the robot cart 216 other than the desired location 236.
  • the light field 224 may also illuminate the desired location 236 in which the robotic arm 116 should be aligned.
  • the light field 224 may illuminate a volume into which the robot cart 216 should be moved to allow the robotic arm 116 to occupy the working volume 212 and/or to carry out one or more surgical tasks in accordance with the surgical plan.
  • the light field 224 may illuminate a path between the first location 232 and the desired location 236, such that the user can maneuver the robot cart 216 along the path to the desired location 236.
  • the processor 104 may determine (e.g., based on image processing 120 performed on one or more images captured by the imaging device 112) a path to the desired location 236 from the first location 232, and may cause the imaging device 112 to generate the light field 224 in a pattern that illuminates the path.
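  • A simplified illustration of such a path computation is shown below: a straight-line interpolation between the two locations that ignores obstacles; the function name and waypoint count are assumptions:

        import numpy as np

        def straight_line_path(current, desired, n_points=10):
            # Evenly spaced waypoints from the current location toward the desired location;
            # each waypoint is a candidate spot for the light field 224 to illuminate.
            t = np.linspace(0.0, 1.0, n_points)[:, None]
            return (1.0 - t) * np.asarray(current, dtype=float) + t * np.asarray(desired, dtype=float)

        waypoints = straight_line_path([0.0, 0.0], [1.5, 0.5])
        print(waypoints[0], waypoints[-1])  # first and last waypoints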
  • the light field 224 may have differing intensities depending on the location in the operating room subject to illumination by the light field 224.
  • the light field 224 may illuminate the desired location 236 with a first illumination 228A and illuminate the first location 232 with a second illumination 228B.
  • the first illumination 228A may have a different intensity of illumination than the second illumination 228B.
  • the first illumination 228A may be emitted at a greater intensity than the second illumination 228B (or vice versa).
  • the difference in intensities between the first illumination 228A and the second illumination 228B may enable the user to visually determine the difference in the first location 232 and the desired location 236, such as when, for example, the first location 232 and the desired location 236 are relatively close together and difficult to distinguish between absent different illumination intensities.
  • the light field 224 may illuminate the path to the desired location 236 with a different intensity of light than the first location 232 and/or the desired location 236.
  • the path to the desired location 236 may be illuminated with the second illumination 228B while the desired location 236 is illuminated with the first illumination 228A.
  • the desired location 236 and the path to the desired location 236 may be illuminated with the first illumination 228A, while the first location 232 is illuminated with the second illumination 228B.
  • the color of light in the first illumination 228A may differ from the second illumination 228B.
  • the first illumination 228A of the light field 224 may be projected in a first color (e.g., green) while the second illumination 228B of the light field 224 may be projected in a second color different from the first color (e.g., red).
  • the first illumination 228A may be modulated relative to the second illumination 228B (or vice versa).
  • the first illumination 228A of the light field 224 may be pulsed, while the second illumination 228B may be continuous (e.g., time invariant).
  • the shape of the first illumination 228A may differ from that of the second illumination 228B.
  • the first illumination 228A may illuminate the desired location 236 and/or the path to the desired location 236 with a first shape (e.g., a circle, a shape that outlines a path to the desired location 236, etc.) while the second illumination 228B may illuminate the first location 232 with a second shape (e.g., a square).
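  • The following hypothetical configuration illustrates how the intensity, color, modulation, and shape of the two illuminations might be parameterized; the field names and example values are assumptions rather than part of the disclosure:

        from dataclasses import dataclass

        @dataclass
        class Illumination:
            intensity: float  # relative brightness, 0.0-1.0
            color: str        # e.g., "green" for the desired location, "red" for the current one
            pulsed: bool      # True for a pulsed (time-varying) illumination
            shape: str        # e.g., "circle", "square", "path-outline"

        first_illumination = Illumination(intensity=1.0, color="green", pulsed=True, shape="circle")
        second_illumination = Illumination(intensity=0.5, color="red", pulsed=False, shape="square")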
  • Fig. 2D is a diagram of the robotic arm 116 moving according to at least one embodiment of the present disclosure.
  • the robot cart 216 (and by extension the robotic arm 116) may move from the first location 232 to a second location 240.
  • the movement of the robot cart 216 to the second location 240 may occur when a user (e.g., a surgeon, a member of surgical staff, etc.) is attempting to move the robotic arm 116 into place during a robot positioning step of the surgery or surgical procedure.
  • the movement of the robotic arm 116 may be tracked by the system 100 (e.g., using the navigation system 118 and the imaging device 112) as the robot cart 216 moves from the first location 232 to the second location 240.
  • one or more indicators may be generated (e.g., by the processor 104 based on the movement of the robotic arm 116 tracked by the navigation system 118).
  • the indicators may reflect whether or not the movement from the first location 232 to the second location 240 resulted in the robotic arm 116 moving closer to the desired location 236 or further away from the desired location 236.
  • the indicators may be provided to the user, such that the user knows whether or not the movement of the robot cart 216 to the second location 240 brought the robot cart 216 closer to the desired location 236.
  • the indicators may be expressed using one or more components (e.g., one or more components of the system 100) within the operating room.
  • the indicator may be an audio indicator (e.g., a beep, a siren, etc.) emitted from one or more speakers.
  • the audio indicators may vary in terms of audio pitch, recurrence (e.g., a greater number of indicators per unit time), or the like depending on whether or not the movement to second location 240 has positioned the robotic arm 116 closer to the desired location 236 than the robotic arm 116 was at when at the first location 232.
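  • A minimal sketch of how an audio indicator's pitch and recurrence could be mapped to the remaining distance to the desired location 236 follows; the constants and function name are assumptions:

        def audio_indicator(distance_m, max_distance_m=2.0):
            # Closer to the desired location -> higher pitch and faster beeping.
            closeness = max(0.0, 1.0 - min(distance_m, max_distance_m) / max_distance_m)
            pitch_hz = 400 + 800 * closeness      # 400 Hz when far away, up to 1200 Hz when adjacent
            beeps_per_second = 1 + 4 * closeness  # 1 beep/s when far away, up to 5 beeps/s when adjacent
            return pitch_hz, beeps_per_second

        print(audio_indicator(0.25))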
  • the indicator may be delivered visually, such as through illumination of one or more light emitting diodes (LEDs) in the operating room, through light emitted from the imaging device 112, through information rendered to a display (e.g., the user interface 110), and the like.
  • the indicator may be a haptic indicator delivered through vibration motors, such as vibration motors disposed within the robot cart 216.
  • the type, intensity, and/or frequency of the indicator may vary depending on whether the second location 240 has moved the robot cart 216 closer to or further away from the desired location 236.
  • for example, a first indicator (e.g., a beeping sound) may be generated when the second location 240 is closer to the desired location 236 than the first location 232, while a second indicator (e.g., an LED flashing) may be generated when the second location 240 is further away from the desired location 236.
  • Such a difference in indicators may allow the user to determine whether the movement of the robot cart 216 to the second location 240 moved the robot cart 216 closer to or further away from the desired location 236.
  • audio indicators may be generated when the second location 240 is closer to the desired location 236 than the first location 232, but not when the second location 240 is further away from the desired location 236.
  • the user may be able to tell that the robot cart 216 is being moved into the desired location 236 when the movement of the robot cart 216 generates the audio indicator, and that the robot cart 216 is not being moved to the desired location 236 when the movement of the robot cart 216 does not result in an audio indicator.
  • the indicators may be generated continuously throughout the movement of the robot cart 216.
  • the second location 240 may be one of a plurality of locations occupied by the robot cart 216 as the robot cart 216 is moved (e.g., by the user) relative to the desired location 236.
  • one or more indicators may be generated that indicate whether or not the movement to the location moved the robot cart 216 closer to the desired location 236 or not.
  • the type, intensity, and/or frequency of the indicators may change as the robot cart 216 moves progressively closer to the desired location 236.
  • the system 100 may generate a visual indicator in the form of a flashing LED light on the user interface 110 and an audio indicator in the form of a beep.
  • the first indicator may change such that the LED light flashes more frequently, and the beep becomes louder and more frequent.
  • the LED light may stop flashing (e.g., the LED may remain on or off) and the beep may switch to a single tone. Such a change in the indicator may indicate to the user that the robot cart 216 is positioned at the desired location 236.
  • the second location 240 may be within a threshold distance 244 from the desired location 236.
  • the threshold distance may be determined based on the surgery or surgical procedure, and may be stored in and/or accessed from the database 130.
  • the type, intensity, and/or frequency of the indicator may change from the previously generated indicators.
  • a first indicator may be an LED light on the user interface 110.
  • the LED light may be illuminated with a first intensity of light (e.g., at 50% power as compared to a normal illumination of the LED light).
  • the LED light may instead be illuminated with a second intensity of light different from the first intensity (e.g., at 80% power, at 90% power, at normal illumination, etc.).
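  • A minimal sketch of this threshold behavior, assuming a single threshold distance and example power levels like those above (all values illustrative):

        def led_intensity(distance_to_desired, threshold):
            # One intensity outside the threshold distance, another intensity inside it.
            return 1.0 if distance_to_desired <= threshold else 0.5

        print(led_intensity(0.6, threshold=0.25))  # outside the threshold -> 0.5 (reduced intensity)
        print(led_intensity(0.1, threshold=0.25))  # inside the threshold -> 1.0 (full intensity)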
  • the imaging device 112 may emit a laser 248.
  • the laser 248 may be projected on the desired location 236 to provide a visual indicator of the desired location 236.
  • the laser 248 may be emitted by the robot 114 and/or one or more components thereof such as the robotic arm 116.
  • the laser 248 may be pulsed (e.g., the laser 248 may switch between an “on state” and an “off state” a plurality of times over a unit amount of time). In other embodiments, the laser 248 may be continuously projected on the desired location 236.
  • Fig. 2E depicts a heatmap 252 according to at least one embodiment of the present disclosure.
  • the heatmap 252 may be rendered to a display (e.g., the user interface 110) and may depict the location of the robotic arm 116, the bed 208, the patient 204, and/or other components of the system 100.
  • the heatmap 252 may be divided into one or more regions.
  • the heatmap 252 may have a first region 256A, a second region 256B, and a third region 256C. Each of the different regions 256A-256C may represent different locations into which the robot 114 could be moved.
  • the first region 256A comprises the desired location 236, the second region 256B comprises the second location 240, and the third region 256C comprises the first location 232.
  • the different regions 256A-256C may be rendered in different colors. The different colors may indicate the relative closeness of the region to the desired location 236.
  • the first region 256A may be rendered in a first color (e.g., green) that indicates that the robot cart 216 is desired to be aligned within the first region 256A.
  • the second region 256B may be rendered in a second color (e.g., yellow) to indicate that the robot cart 216 will be closer to the desired location 236 when the robot cart 216 is aligned at the location depicted by the second region 256B (e.g., the second location 240).
  • the third region 256C may be rendered in a third color (e.g., red) that may indicate the current location of the robot cart 216.
  • the difference in color between the different regions 256A-256C may enable a user to visually determine the desired location 236 of the robot cart 216 (and by extension, the robot 114 and/or the robotic arm 116).
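  • Purely as a sketch of how regions of the heatmap 252 could be assigned colors according to their distance from the desired location 236, consider the code below; the grid spacing and the distance cut-offs are assumptions for illustration, not part of the disclosed heatmap.

```python
import math

def region_color(cell_xy, desired_xy, near_mm=150.0, mid_mm=400.0):
    """Return a display color for one heatmap cell based on how far the
    location it represents lies from the desired location.

    The two distance cut-offs are illustrative, not values from the disclosure.
    """
    d = math.dist(cell_xy, desired_xy)
    if d <= near_mm:
        return "green"   # region containing the desired location
    if d <= mid_mm:
        return "yellow"  # closer to the desired location, but not yet aligned
    return "red"         # e.g., the cart's current location

# Color a small grid of candidate cart positions around a desired location.
desired = (0.0, 0.0)
grid = [(x * 200.0, y * 200.0) for x in range(-2, 3) for y in range(-2, 3)]
heatmap = {cell: region_color(cell, desired) for cell in grid}
print(heatmap[(0.0, 0.0)], heatmap[(200.0, 200.0)], heatmap[(400.0, 400.0)])
```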
  • the processor 104 may provide one or more recommended adjustments to the robot cart 216, the robot 114, and/or the robotic arm 116 based on the tracked movement of any one or more of the foregoing by the system 100.
  • the movement of the robotic arm 116 may be tracked by the system 100 (e.g., using the navigation system 118 and the imaging device 112).
  • the processor 104 may provide one or more recommendations to move the robot cart 216 to arrive at the desired location 236. For example, when the robot cart 216 is at the first location 232, the robot 114 may be rendered in a first color (e.g., red) that indicates that the robot cart 216 is not positioned at the desired location 236. The processor 104 may then recommend an adjustment to the pose of the robot cart 216, the robot 114, and/or the robotic arm 116 to move the robot cart 216 to the desired location 236.
  • the processor 104 may receive information from the navigation system 118 related to the movement of the robot cart 216 and update the heatmap 252 to depict the robot cart 216 as having moved (e.g., into the second location 240, into the desired location 236, etc.).
  • the heatmap 252 may be updated continuously as the robot cart 216 is moved, such that a user can visually determine whether the movements of the robot cart 216 are resulting in the robot cart 216 moving closer to or into the desired location 236.
  • the one or more positioning techniques discussed herein may be used in combination with one another to increase efficiency in positioning the robotic cart or device.
  • Fig. 3 depicts a method 300 that may be used, for example, to provide recommended adjustments to a robot to position the robot at a desired location.
  • the method 300 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
  • a processor other than any processor described herein may also be used to execute the method 300.
  • the at least one processor may perform the method 300 by executing elements stored in a memory such as the memory 106.
  • the elements stored in memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 300.
  • One or more portions of the method 300 may be performed by the processor executing any of the contents of memory, such as image processing 120, segmentation 122, transformation 124, and/or registration 128.
  • the method 300 comprises determining a desired location in which a robot is to be positioned (step 304).
  • the robot may be similar to or the same as the robot 114 or one or more components thereof (e.g., the robotic arm 116) and the desired location may be similar to or the same as the desired location 236.
  • the robot 114 may be used during the course of a surgery or surgical procedure, and it may be desired that the robot 114 is moved to the desired location 236 to perform one or more tasks associated with the surgery or surgical procedure.
  • the desired location 236 may be determined based on the surgical plan, such as a surgical plan stored in and accessed from the database 130.
  • the method 300 also comprises receiving information about a current location of the robot (step 308).
  • the current location may be similar to or the same as the first location 232.
  • the current location may be determined based on images of the operating room captured by the imaging device 112.
  • the robot 114 may comprise one or more navigation markers that can be identified in the images captured by the imaging device 112.
  • the navigation system 118 may, using the processor 104, perform image processing 120 and segmentation 122 to identify the segments of the image containing the robot 114.
  • the navigation system 118 may then determine the current location based on the identified segments and the known pose in which the imaging device 112 was placed to capture the images.
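  • As a simplified, planar illustration of how a tracked point seen by the imaging device 112 could be expressed in room coordinates using the known pose in which the imaging device was placed, consider the rigid-transform sketch below; the helper name, pose values, and 2-D simplification are hypothetical.

```python
import math

def camera_to_room(point_cam, cam_pose):
    """Transform a point from the imaging device's frame into the room frame.

    cam_pose = (x, y, theta): the device's known position and heading in the
    room. A planar (2-D) case is shown purely for illustration.
    """
    x, y, theta = cam_pose
    px, py = point_cam
    rx = x + px * math.cos(theta) - py * math.sin(theta)
    ry = y + px * math.sin(theta) + py * math.cos(theta)
    return (rx, ry)

# A marker on the robot seen 1.2 m ahead and 0.3 m left of the camera,
# with the camera placed at (2.0, 1.0) and rotated 90 degrees in the room.
robot_in_room = camera_to_room((1.2, 0.3), (2.0, 1.0, math.pi / 2))
print(robot_in_room)  # approximately (1.7, 2.2)
```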
  • the current location may be determined based on a user input via, for example, the user interface 110.
  • the user may indicate that the robot 114 is in a known location in the operating room, such as in an area whose coordinates are known to the navigation system 118, by inputting such information into the user interface 110.
  • the method 300 also comprises rendering, to a display, a heatmap depicting at least one of the current location of the robot and the desired location of the robot (step 312).
  • the display may be similar to or the same as the user interface 110, and the heatmap may be similar to or the same as the heatmap 252.
  • the current location may be similar to or the same as the first location 232, while the desired location may be similar to or the same as the desired location 236.
  • the heatmap 252 may depict the current location of the robot in a first color (e.g., red) and may depict the location of the desired location 236 in a second color different from the first color (e.g., green).
  • the desired location 236 of the robot 114 may be located at a first region 256A of the heatmap 252 and the current location of the robot 114 may be located at a second region 256B or a third region 256C.
  • the method 300 also comprises determining a difference between the desired location and the current location (step 316).
  • the step 316 may use information about the desired location, such as information stored in and accessed from the database 130, as well as information about the current location to determine the difference therebetween.
  • the information about the current location may be determined by the navigation system 118 that tracks the position of the robot 114.
  • determining the difference may include determining the difference in coordinates between the current location and the desired location in a shared coordinate system.
  • the method 300 also comprises providing, based on the difference, a recommended adjustment to a pose of the robot to move the robot toward the desired location depicted by the heatmap (step 320).
  • the processor 104 may determine a movement of the robot 114 to reduce the difference between the coordinates associated with the current location and the coordinates associated with the desired location in the shared coordinate system.
  • the processor 104 may render such information to the same display that displays the heatmap 252.
  • the recommended adjustment may comprise a recommended movement of the robot cart 216, the robot 114, the robotic arm 116, and/or the like to move the robot 114 closer to the desired location 236.
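  • A minimal sketch of steps 316 and 320, assuming both locations are already expressed in a shared planar coordinate system; the millimeter units, the tolerance, and the wording of the recommendation are illustrative assumptions rather than the disclosed implementation.

```python
import math

def recommended_adjustment(current_xy, desired_xy, tolerance_mm=10.0):
    """Compute the translation that would move the robot cart from its current
    location to the desired location, and phrase it as a recommendation.
    Units and wording are illustrative assumptions."""
    dx = desired_xy[0] - current_xy[0]
    dy = desired_xy[1] - current_xy[1]
    distance = math.hypot(dx, dy)
    if distance <= tolerance_mm:
        return "Cart is at the desired location; no adjustment needed."
    heading_deg = math.degrees(math.atan2(dy, dx))
    return (f"Move the cart {distance:.0f} mm along a heading of "
            f"{heading_deg:.0f} degrees to reach the desired location.")

print(recommended_adjustment((1200.0, 300.0), (800.0, 900.0)))
```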
  • the present disclosure encompasses embodiments of the method 300 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • Fig. 4 depicts a method 400 that may be used, for example, to provide a recommended adjustment to move a robot to a desired location. In some embodiments, the method 400 may continue from the step 308 of the method 300.
  • the method 400 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
  • a processor other than any processor described herein may also be used to execute the method 400.
  • the at least one processor may perform the method 400 by executing elements stored in a memory such as the memory 106.
  • the elements stored in memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 400.
  • One or more portions of the method 400 may be performed by the processor executing any of the contents of memory, such as image processing 120, segmentation 122, transformation 124, and/or registration 128.
  • the method 400 comprises causing a robotic arm of the robot to move into a first pose, wherein the robotic arm, when in the first pose, interacts with a working volume when the robot is in the desired location (step 404).
  • the robotic arm of the robot may be similar to or the same as the robotic arm 116 of the robot 114.
  • the step 404 may continue from the step 308 of the method 300, where information about the current location of the robot is received.
  • the movement of the robotic arm 116 may be from the first pose 220A to the second pose 220B.
  • the robotic arm 116 may be able to interact with the working volume 212 associated with the surgery or surgical procedure when the robotic arm 116 is in the second pose 220B and the robotic arm 116 has been positioned at the desired location 236.
  • the method 400 also comprises determining a difference between the desired location and the current location (step 408).
  • the step 408 may be similar to the step 316 of the method 300.
  • the difference between the desired location of the robotic arm 116 and the current location of the robotic arm 116 may be determined based on coordinates associated with both the desired location and the current location of the robotic arm 116.
  • the processor 104 may determine the difference in such coordinates to determine a difference between the desired location and the current location.
  • the method 400 also comprises providing, based on the difference, a recommended adjustment to a second pose of the robot to move the robot toward the desired location (step 412).
  • the recommended adjustment may be similar to the recommended adjustment provided in the step 320 of the method 300.
  • the second pose of the robot 114 may be or comprise the pose of the robot 114 when at the current location.
  • the recommended adjustment to the robot 114 may include determining, based on the second pose 220B of the robotic arm 116, a movement to the robot 114 to move the robotic arm 116 into the working volume 212.
  • the processor 104 may provide a recommendation to move the robot cart 216 to a first end of the bed 208 to enable the robotic arm 116 to interact with the working volume 212.
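  • To illustrate the idea of recommending a cart position from which the robotic arm 116, in its planned pose, can interact with the working volume 212, the sketch below reduces reachability to a simple distance check against an assumed arm reach; the reach value, bed-end coordinates, and working-volume center are hypothetical.

```python
import math

ARM_REACH_MM = 850.0  # assumed maximum reach of the robotic arm

def can_reach(cart_xy, working_volume_center_xy, reach_mm=ARM_REACH_MM):
    """Very coarse reachability test: the arm base must lie within its
    reach of the working volume's center."""
    return math.dist(cart_xy, working_volume_center_xy) <= reach_mm

def recommend_cart_position(candidates, working_volume_center_xy):
    """Return the first candidate cart position (e.g., an end of the bed)
    from which the working volume is reachable, or None."""
    for name, xy in candidates:
        if can_reach(xy, working_volume_center_xy):
            return name
    return None

bed_head_end = ("head end of the bed", (0.0, 0.0))
bed_foot_end = ("foot end of the bed", (0.0, 1900.0))
working_volume = (300.0, 600.0)  # hypothetical center of the working volume

print(recommend_cart_position([bed_head_end, bed_foot_end], working_volume))
```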
  • the present disclosure encompasses embodiments of the method 400 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • Fig. 5 depicts a method 500 that may be used, for example, to provide an adjustment to a pose of a robot based on a light field and/or one or more laser lines. In some embodiments, the method 500 may continue from the step 304 of the method 300.
  • the method 500 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
  • a processor other than any processor described herein may also be used to execute the method 500.
  • the at least one processor may perform the method 500 by executing elements stored in a memory such as the memory 106.
  • the elements stored in the memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 500.
  • One or more portions of the method 500 may be performed by the processor executing any of the contents of memory, such as image processing 120, segmentation 122, transformation 124, and/or registration 128.
  • the method 500 comprises causing an imaging device to project a light field, wherein the light field illuminates the desired location or a path to the desired location (step 504).
  • the light field may be similar to or the same as the light field 224.
  • the step 504 may continue from the step 304 of the method 300, where a desired location of a robot is determined.
  • the light field 224 may illuminate the desired location, which may be similar to or the same as the desired location 236. In some embodiments, both the desired location 236 and a path to the desired location 236 may be illuminated by the light field 224.
  • the desired location 236 may be illuminated with a first illumination 228A, while other locations in the operating room, such as the first location 232 of the robotic arm 116, may be illuminated in a second illumination 228B different from the first illumination 228A.
  • the method 500 also comprises causing the imaging device to generate one or more laser lines that illuminate the desired location (step 508).
  • the one or more laser lines may be similar to or the same as the laser 248 emitted from the imaging device 112, the robot 114, and/or other instruments in the operating room.
  • the laser 248 may be pointed at and/or illuminate the desired location 236.
  • the processor 104 may control the laser 248, and may orient and cause the laser 248 to emit light based on the desired location 236.
  • the processor 104 may access the database 130 and/or use information about the desired location 236 obtained in the step 304 to determine where to shine the laser 248.
  • the processor 104 may, in conjunction with the navigation system 118, cause the device emitting the laser 248 (e.g., the imaging device 112) to move such that the laser 248 illuminates the desired location 236.
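  • As a geometric sketch of how the processor 104 might orient a laser emitter so that the laser 248 illuminates the desired location 236, the code below computes pan and tilt angles from an assumed emitter position to a target point; the coordinates and the simple two-angle emitter model are illustrative assumptions.

```python
import math

def aim_laser(emitter_xyz, target_xyz):
    """Compute pan (azimuth) and tilt (elevation) angles, in degrees, that
    point a laser from its emitter position at the target point."""
    dx = target_xyz[0] - emitter_xyz[0]
    dy = target_xyz[1] - emitter_xyz[1]
    dz = target_xyz[2] - emitter_xyz[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# Emitter mounted 2.2 m above the floor; desired location marked on the floor.
emitter = (0.0, 0.0, 2200.0)
desired_location = (1500.0, 1000.0, 0.0)
pan_deg, tilt_deg = aim_laser(emitter, desired_location)
print(f"pan {pan_deg:.1f} deg, tilt {tilt_deg:.1f} deg")
```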
  • the method 500 also comprises providing, based on the desired location, an adjustment to the pose of the robot to move the robot toward the desired location depicted by at least one of the light field and the one or more laser lines (step 512).
  • the processor 104 may provide (e.g., via a display) an adjustment, such as a recommended adjustment, that moves the robot 114 toward the desired location 236.
  • the robot 114 may be moved (e.g., by a user, by the processor 104 by controlling one or more motors in the robot 114, etc.) toward the desired location 236.
  • the present disclosure encompasses embodiments of the method 500 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • Fig. 6 depicts a method 600 that may be used, for example, to generate indicators based on a movement of a robot relative to a desired location. In some embodiments, the method 600 may continue from the step 308 of the method 300.
  • the method 600 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
  • a processor other than any processor described herein may also be used to execute the method 600.
  • the at least one processor may perform the method 600 by executing elements stored in a memory such as the memory 106.
  • the elements stored in the memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 600.
  • One or more portions of the method 600 may be performed by the processor executing any of the contents of memory, such as image processing 120, segmentation 122, transformation 124, and/or registration 128.
  • the method 600 comprises determining that the robot has moved from the current location to an intermediate location (step 604).
  • the robot may be similar to or the same as the robot 114, and may be disposed on the robot cart 216.
  • the current location may be similar to or the same as the first location 232 and the intermediate location may be similar to or the same as the second location 240.
  • the navigation system 118 may track the position of the robot 114 such that, when the robot 114 moves into the intermediate location, the navigation system 118 may generate an alert indicating that the robot 114 has moved.
  • the method 600 also comprises generating, when the intermediate location is closer to the desired location than the current location, a first indicator (step 608).
  • the first indicator may be similar to or the same as any indicator discussed herein.
  • the first indicator may be or comprise an audio, visual, and/or haptic indicator that enables the user to determine that the robot 114 has moved into the intermediate location and that the intermediate location is closer to the desired location 236 than the robot 114 was in the current location.
  • the first indicator may be generated at a first intensity when the intermediate location is within a threshold distance 244 from the desired location 236, and generated at a second intensity different from the first intensity when the intermediate location is outside the threshold distance 244 from the desired location 236.
  • the method 600 also comprises generating, when the intermediate location is further away from the desired location than the current location, a second indicator (step 612).
  • the second indicator may be similar to or the same as any indicator discussed herein.
  • the second indicator may be or comprise an audio, visual, and/or haptic indicator that enables the user to determine that the robot 114 has moved into the intermediate location and that the intermediate location is further away from the desired location 236 than the location in which the robot 114 was initially positioned.
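  • A compact sketch of the decision made in steps 608 and 612, assuming the relevant distances are available from the navigation system 118; the returned indicator payloads are illustrative placeholders rather than the indicators described above.

```python
import math

def movement_indicator(current_xy, intermediate_xy, desired_xy, threshold_mm=100.0):
    """Choose the first or second indicator depending on whether the move to
    the intermediate location brought the robot closer to the desired location,
    and pick an intensity based on the threshold distance."""
    before = math.dist(current_xy, desired_xy)
    after = math.dist(intermediate_xy, desired_xy)
    if after < before:
        intensity = "high" if after <= threshold_mm else "low"
        return {"indicator": "first", "meaning": "closer", "intensity": intensity}
    return {"indicator": "second", "meaning": "farther"}

print(movement_indicator((1000.0, 0.0), (400.0, 0.0), (0.0, 0.0)))  # first indicator
print(movement_indicator((400.0, 0.0), (700.0, 0.0), (0.0, 0.0)))   # second indicator
```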
  • the present disclosure encompasses embodiments of the method 600 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • the present disclosure encompasses methods with fewer than all of the steps identified in Figs. 3, 4, 5, and 6 (and the corresponding description of the methods 300, 400, 500, and 600), as well as methods that include additional steps beyond those identified in Figs. 3, 4, 5, and 6 (and the corresponding description of the methods 300, 400, 500, and 600).
  • the present disclosure also encompasses methods that comprise one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or comprise a registration or any other correlation.
  • Example 1 A method, comprising: determining a desired location in which a robot is to be positioned; receiving information about a current location of the robot; rendering, to a display, a heatmap depicting at least one of the current location of the robot and the desired location of the robot; determining a difference between the desired location and the current location; and providing, based on the difference, a recommended adjustment to a pose of the robot to move the robot toward the desired location depicted by the heatmap.
  • Example 2 The method of claim 1, wherein the current location is rendered in a first color on the heatmap, and wherein the desired location is rendered in a second color different from the first color on the heatmap.
  • Example 3 The method of claim 1, further comprising: causing a robotic arm of the robot to move into a second pose, wherein the robotic arm, when in the second pose, interacts with a working volume when the robot is in the desired location.
  • Example 4 The method of claim 3, wherein the recommended adjustment is based on a combination of the second pose of the robotic arm and the working volume.
  • Example 5 The method of claim 1, further comprising: causing an imaging device to project a light field, wherein the light field illuminates the desired location or a path to the desired location.
  • Example 6 The method of claim 5, wherein the light field illuminates the desired location with a first intensity of light, and wherein the light field illuminates the current location of the robot with a second intensity of light different from the first intensity of light.
  • Example 7 The method of claim 1, further comprising: determining that the robot has moved from the current location to an intermediate location; generating, when the intermediate location is closer to the desired location than the current location, a first indicator; and generating, when the intermediate location is further away from the desired location than the current location, a second indicator.
  • Example 8 The method of claim 7, wherein at least one of the first indicator and the second indicator comprises at least one of an audio indicator, a visual indicator, and a haptic indicator.
  • Example 9 The method of claim 7, wherein the first indicator is generated at a first intensity when the intermediate location is within a threshold distance from the desired location, and wherein the first indicator is generated at a second intensity different from the first intensity when the intermediate location is outside of the threshold distance from the desired location.
  • Example 10 The method of claim 1, further comprising:
  • Example 11 A system, comprising: a processor; and a memory coupled to the processor and storing data thereon that, when executed by the processor, enable the processor to: determine a desired location for positioning a robot; receive information describing a current location of the robot; render, to a display, a heatmap depicting at least one of the current location and the desired location; determine a difference between the desired location and the current location; and provide, based on the difference, a recommended adjustment to a pose of the robot to move the robot toward the desired location depicted by the heatmap.
  • Example 12 The system of claim 11, wherein the current location is rendered in a first color on the heatmap, and wherein the desired location is rendered in a second color different from the first color on the heatmap.
  • Example 13 The system of claim 11, wherein the data further enable the processor to: cause a robotic arm of the robot to move into a second pose, wherein the robotic arm interacts with a working volume when the robot is in the desired location and when the robotic arm is in the second pose.
  • Example 14 The system of claim 13, wherein the recommended adjustment is based on a combination of the second pose of the robotic arm and the working volume.
  • Example 15 The system of claim 11, wherein the data further enable the processor to: cause an imaging device to project a light field that illuminates the desired location or a path to the desired location with a first intensity of light.
  • Example 16 The system of claim 11, wherein the data further enable the processor to: determine that the robot has moved from the current location to an intermediate location; generate, when the intermediate location is closer to the desired location than the current location, a first indicator; and generate, when the intermediate location is further away from the desired location than the current location, a second indicator.
  • Example 17 The system of claim 11, wherein the data further enable the processor to: cause a surgical tool to emit one or more laser lines that illuminate the desired location.
  • Example 18 A surgical system, comprising: a robot including a robotic arm; a processor coupled to at least one of the robot and the robotic arm; and a memory coupled to the processor and storing data thereon that, when executed by the processor, enable the processor to: determine a desired location for positioning the robot; receive information associated with a current location of the robot; cause the robotic arm to move into a first pose, wherein the robotic arm interacts with a working volume when the robot is in the desired location and when the robotic arm is in the first pose; determine a difference between the desired location and the current location; and provide, based on the difference, a recommended adjustment to a second pose of the robot to move the robot toward the desired location.
  • Example 19 The surgical system of claim 18, wherein the data further enable the processor to: render, to a display, a heatmap depicting at least one of the current location of the robot and the desired location of the robot.
  • Example 20 The surgical system of claim 18, wherein the data further enable the processor to: determine that the robot has moved from the current location to an intermediate location; generate, when the intermediate location is closer to the desired location than the current location, a first indicator; and generate, when the intermediate location is further away from the desired location than the current location, a second indicator.
  • Example 21 A system (100), comprising: a processor (104); and a memory (106) coupled to the processor (104) and storing data thereon that, when executed by the processor (104), enable the processor (104) to: determine a desired location (236) for positioning a robot (114); receive information describing a current location (232) of the robot (114); render, to a display (110), a heatmap (252) depicting at least one of the current location (232) and the desired location (236); determine a difference between the desired location (236) and the current location (232); and provide, based on the difference, a recommended adjustment to a pose of the robot (114) to move the robot (114) toward the desired location (236) depicted by the heatmap (252).
  • Example 22 The system according to claim 21, wherein the current location (232) is rendered in a first color on the heatmap (252), and wherein the desired location (236) is rendered in a second color different from the first color on the heatmap (252).
  • Example 23 The system according to any of claims 21 to 22, wherein the data further enable the processor (104) to: cause a robotic arm (116) of the robot (114) to move into a second pose (220B), wherein the robotic arm (116) interacts with a working volume (212) when the robot (114) is in the desired location (236) and when the robotic arm (116) is in the second pose (220B).
  • Example 24 The system according to claim 23, wherein the recommended adjustment is based on a combination of the second pose (220B) of the robotic arm (116) and the working volume (212).
  • Example 25 The system according to any of claims 21 to 24, wherein the data further enable the processor (104) to: cause an imaging device (112) to project a light field (224) that illuminates the desired location (236) or a path to the desired location (236) with a first intensity of light.
  • Example 26 The system according to claim 25, wherein the light field (224) illuminates the desired location (236) with a first intensity of light, and wherein the light field (224) illuminates the current location (232) of the robot (114) with a second intensity of light different from the first intensity of light.
  • Example 27 The system according to any of claims 21 to 26, wherein the data further enable the processor (104) to: determine that the robot (114) has moved from the current location (232) to an intermediate location (240); generate, when the intermediate location (240) is closer to the desired location (236) than the current location (232), a first indicator; and generate, when the intermediate location (240) is further away from the desired location (236) than the current location (232), a second indicator.
  • Example 28 The system according to claim 27, wherein at least one of the first indicator and the second indicator comprises at least one of an audio indicator, a visual indicator, and a haptic indicator.
  • Example 29 The system according to any of claims 27 to 28, wherein the first indicator is generated at a first intensity when the intermediate location (240) is within a threshold distance (244) from the desired location (236), and wherein the first indicator is generated at a second intensity different from the first intensity when the intermediate location (240) is outside of the threshold distance (244) from the desired location (236).
  • Example 30 The system according to any of claims 21 to 29, wherein the data further enable the processor (104) to: cause a surgical tool to emit one or more laser lines (248) that illuminate the desired location (236).
  • Example 31 A surgical system (100), comprising: a robot (114) including a robotic arm (116); a processor (104) coupled to at least one of the robot (114) and the robotic arm (116); and a memory (106) coupled to the processor (104) and storing data thereon that, when executed by the processor (104), enable the processor (104) to: determine a desired location (236) for positioning the robot (114); receive information associated with a current location (232) of the robot (114); cause the robotic arm (116) to move into a first pose (220B), wherein the robotic arm interacts with a working volume (212) when the robot (114) is in the desired location (236) and when the robotic arm is in the first pose (220B); determine a difference between the desired location (236) and the current location (232); and provide, based on the difference, a recommended adjustment to a second pose of the robot (114) to move the robot (114) toward the desired location (236).
  • Example 32 The surgical system according to claim 31, wherein the data further enable the processor (104) to: render, to a display (110), a heatmap (252) depicting at least one of the current location (232) of the robot (114) and the desired location (236) of the robot (114).
  • Example 33 The surgical system according to any of claims 31 to 32, wherein the data further enable the processor (104) to: determine that the robot (114) has moved from the current location (232) to an intermediate location (240); generate, when the intermediate location (240) is closer to the desired location (236) than the current location (232), a first indicator; and generate, when the intermediate location (240) is further away from the desired location (236) than the current location (232), a second indicator.
  • Example 34 The surgical system according to any one of claims 31 to 33, wherein the data further enable the processor (104) to: cause an imaging device (112) to project a light field (224) that illuminates the desired location (236) or a path to the desired location (236) with a first intensity of light.
  • Example 35 The surgical system according to any one of claims 31 to 34, wherein the data further enable the processor (104) to: cause a surgical tool to emit one or more laser lines (248) that illuminate the desired location (236).

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Robotics (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Mechanical Engineering (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Manipulator (AREA)

Abstract

A method according to at least one embodiment of the present disclosure includes: determining a desired location in which a robot is to be positioned; receiving information about a current location of the robot; rendering, to a display, a heatmap depicting the current location of the robot and/or the desired location of the robot; determining a difference between the desired location and the current location; and providing, based on the difference, a recommended adjustment to a pose of the robot to move the robot toward the desired location depicted by the heatmap.
PCT/IL2024/050466 2023-05-15 2024-05-15 Systèmes de positionnement d'un robot chirurgical Pending WO2024236564A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202480032086.8A CN121127197A (zh) 2023-05-15 2024-05-15 用于定位外科手术机器人的系统

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363466635P 2023-05-15 2023-05-15
US63/466,635 2023-05-15

Publications (1)

Publication Number Publication Date
WO2024236564A1 true WO2024236564A1 (fr) 2024-11-21

Family

ID=91581061

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2024/050466 Pending WO2024236564A1 (fr) 2023-05-15 2024-05-15 Systèmes de positionnement d'un robot chirurgical

Country Status (2)

Country Link
CN (1) CN121127197A (fr)
WO (1) WO2024236564A1 (fr)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190069962A1 (en) * 2016-02-26 2019-03-07 Think Surgical, Inc. Method and system for guiding user positioning of a robot
US20210322115A1 (en) * 2016-09-19 2021-10-21 Intuitive Surgical Operations, Inc. Positioning indicator system for a remotely controllable arm and related methods
US20210228282A1 (en) * 2018-03-13 2021-07-29 Intuitive Surgical Operations Inc. Methods of guiding manual movement of medical systems
US20210153958A1 (en) * 2018-04-20 2021-05-27 Covidien Lp Systems and methods for surgical robotic cart placement
US20230096023A1 (en) * 2021-09-22 2023-03-30 Mazor Robotics Ltd. Systems and methods for work volume mapping to facilitate dynamic collision avoidance

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ABOLGHASEMI POOYA ET AL: "Real-time placement of a wheelchair-mounted robotic arm", 2016 25TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), IEEE, 26 August 2016 (2016-08-26), pages 1032 - 1037, XP033006226, DOI: 10.1109/ROMAN.2016.7745235 *

Also Published As

Publication number Publication date
CN121127197A (zh) 2025-12-12

Similar Documents

Publication Publication Date Title
US20250186152A1 (en) Systems, methods, and devices for defining a path for a robotic arm
US20220395342A1 (en) Multi-arm robotic systems and methods for monitoring a target or performing a surgical procedure
US20230389991A1 (en) Spinous process clamp registration and methods for using the same
US20250152262A1 (en) Path planning based on work volume mapping
US12295797B2 (en) Systems, methods, and devices for providing an augmented display
US20230270503A1 (en) Segemental tracking combining optical tracking and inertial measurements
US20240382265A1 (en) Hybrid localization for minimally invasive surgery and cervical spinal referencing, and methods for using the same
US12274513B2 (en) Devices, methods, and systems for robot-assisted surgery
WO2024236564A1 (fr) Systèmes de positionnement d'un robot chirurgical
US11847809B2 (en) Systems, devices, and methods for identifying and locating a region of interest
US20250235271A1 (en) Devices, methods, and systems for robot-assisted surgery
US12295683B2 (en) Systems and methods for robotic collision avoidance using medical imaging
WO2024229649A1 (fr) Dispositif de suivi de patient non invasif pour intervention chirurgicale
WO2023141800A1 (fr) Système de positionnement de rayons x mobile
US20230404692A1 (en) Cost effective robotic system architecture
US20230293244A1 (en) Systems and methods for hybrid motion planning
US20230278209A1 (en) Systems and methods for controlling a robotic arm
US20230240659A1 (en) Systems, methods, and devices for tracking one or more objects
US20230281869A1 (en) Systems, methods, and devices for reconstructing a three-dimensional representation
WO2024261752A1 (fr) Systèmes de détection en temps réel de collision d'objet et/ou de mouvement d'objet
WO2025120637A1 (fr) Systèmes et procédés de planification et de mise à jour de trajectoires pour dispositifs d'imagerie
CN121127200A (zh) 用于标识一个或多个跟踪设备的系统和方法
CN117320655A (zh) 用于机器人辅助手术的装置、方法和系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24733337

Country of ref document: EP

Kind code of ref document: A1