
WO2024229651A1 - Intelligent positioning of a robot arm cart - Google Patents

Intelligent positioning of a robot arm cart

Info

Publication number
WO2024229651A1
WO2024229651A1 (PCT/CN2023/092729)
Authority
WO
WIPO (PCT)
Prior art keywords
processor
target area
robot
imaging device
surgical target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/CN2023/092729
Other languages
English (en)
Inventor
Weijun Xu
Wei Tang
Fang GENG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mazor Robotics Ltd
Original Assignee
Mazor Robotics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mazor Robotics Ltd filed Critical Mazor Robotics Ltd
Priority to PCT/CN2023/092729 priority Critical patent/WO2024229651A1/fr
Priority to CN202380097898.6A priority patent/CN121174998A/zh
Publication of WO2024229651A1 publication Critical patent/WO2024229651A1/fr
Anticipated expiration legal-status Critical
Pending legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A61B34/37Leader-follower robots
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2051Electromagnetic tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/371Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/373Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
    • A61B2090/3735Optical coherence tomography [OCT]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/374NMR or MRI
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/3762Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/378Surgical systems with images on a monitor during operation using ultrasound

Definitions

  • the present disclosure is generally directed to surgical systems, and relates more particularly to robotic surgical devices.
  • Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure, or may complete one or more surgical procedures autonomously. Layout of the operating room during a surgical procedure is especially important to support successful use of surgical robots.
  • Example aspects of the present disclosure include:
  • a system including: a robot mounted to a movable base, the robot comprising one or more robotic arms; a processor; and memory coupled with the processor, the memory comprising data stored thereon that, when processed by the processor, enables the processor to: receive image data describing a position of the one or more robotic arms relative to a surgical target area; determine that a current position of the robot is sub-optimal for enabling the robot to access the surgical target area; and provide at least one of instructions and animations for moving the movable base from the current position to a new position.
  • the instructions cause the movable base to autonomously move from the current position to the new position.
  • the data when processed by the processor, further enables the processor to: determine a proposed path from the current position to the new position.
  • the proposed path avoids at least one obstacle and remains in a sterile area.
  • the image data comprises an image of one or more tracking objects positioned in proximity with the surgical target area.
  • the one or more tracking objects are mounted to at least one of an end effector and the one or more robotic arms.
  • the one or more tracking objects are mounted to a patient anatomy.
  • the one or more tracking objects are mounted to a surgical instrument.
  • the image data is obtained from an imaging device and wherein the imaging device comprises at least one of an infrared imaging device, an optical imaging device, and a video camera.
  • the instructions are provided for moving the movable base from the current position to a new position, and wherein the instructions include an indication of whether or not the movable base is located in the new position.
  • a navigation system includes: an imaging device; a processor; and memory coupled with the processor, the memory comprising data stored thereon that, when processed by the processor, enables the processor to: receive image data from the imaging device, wherein the image data comprises an image of a robotic arm and an image of a surgical target area; determine a position of the robotic arm; determine a position of the surgical target area; determine, based on the position of the robotic arm and the position of the surgical target area, that a cart supporting the robotic arm is sub-optimally placed relative to the surgical target area; and provide at least one of instructions and animations for moving the cart from a current position to a new position.
  • a tracking object is mounted on or held by the robotic arm, wherein the image data includes an image of the tracking object, and wherein the position of the robotic arm is determined by analyzing the image of the tracking object.
  • the data when processed by the processor, further enables the processor to: determine a proposed path from the current position to the new position; and display the proposed path via a user interface.
  • the data when processed by the processor, further enables the processor to: determine an obstacle is precluding the robotic arm from achieving a desired pose to enable an end effector to access the surgical target area; and determine that the new position enables the end effector to access the surgical target area.
  • the imaging device comprises at least one of an infrared imaging device, an optical imaging device, and a video camera.
  • a method includes: receiving image data from an imaging device, wherein the image data comprises an image of a robotic arm and an image of a surgical target area; determining a position of the robotic arm relative to a position of the surgical target area; determining, based on the position of the robotic arm relative to the position of the surgical target area, that a cart supporting the robotic arm is sub-optimally placed relative to the surgical target area; and providing at least one of instructions and animations for moving the cart from a current position to a new position.
  • the method further includes outputting an indication when the cart is co-located with the new position.
  • the method further includes determining a path from the current position to the new position.
  • the method further includes causing the cart to move autonomously along the path.
  • the path avoids at least one obstacle and remains in a sterile area.
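  • The method summarized in the preceding bullets could be sketched, in simplified form, as follows. The function name suggest_cart_reposition, the reach-based acceptability test, and the numeric defaults are assumptions introduced for illustration only; they are not part of the disclosed system.

```python
import numpy as np

def suggest_cart_reposition(base_position, target_position,
                            reach_mm=850.0, margin_mm=50.0):
    """Illustrative check of whether the cart supporting the robotic arm
    should be moved.

    base_position and target_position are 3-vectors (millimeters) assumed to
    have been derived from image data in a common navigation coordinate
    system.  Returns None when the current placement is acceptable, otherwise
    a suggested new base position.
    """
    base = np.asarray(base_position, dtype=float)
    target = np.asarray(target_position, dtype=float)

    # Distance from the cart/base to the surgical target area.
    distance = float(np.linalg.norm(target - base))

    # Target comfortably within the arm's assumed reach: keep the current position.
    if distance <= reach_mm - margin_mm:
        return None

    # Otherwise propose a new base position along the line toward the target,
    # leaving the desired reach margin.
    direction = (target - base) / distance
    return target - direction * (reach_mm - margin_mm)
```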
  • Fig. 1 is a block diagram of a system according to at least one implementation of the present disclosure
  • Fig. 2 illustrates additional details of a system according to at least one implementation of the present disclosure
  • Fig. 3 is a plan view of an environment in which a surgical robot may operate according to at least one implementation of the present disclosure
  • Fig. 4 is an example of a first process flow according to at least one implementation of the present disclosure
  • Fig. 5 is an example of a second process flow according to at least one implementation of the present disclosure.
  • Fig. 6 is an example of a third process flow according to at least one implementation of the present disclosure.
  • the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions) .
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer) .
  • Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or 10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • proximal and distal are used in this disclosure with their conventional medical meanings, proximal being closer to the operator or user of the system, and further from the region of surgical interest in or on the patient, and distal being closer to the region of surgical interest in or on the patient, and further from the operator or user of the system.
  • robotic surgical system implementations include operating a robotic system within proximity of a surgical table (e.g., a surgical bed, an operating table, etc. ) and a patient.
  • the term “robotic system” may also be referred to as a “robotic surgical system” herein.
  • movement of the robotic system relative to the surgical table and/or patient during a surgical operation may become necessary or desirable. This can be achieved by moving the robotic system, which may be mounted on a cart or other moveable device. This can alternatively or additionally be achieved by moving the surgical table.
  • aspects of the present disclosure support improving positioning of the robotic system relative to the surgical table, the patient, or a surgical target area.
  • One aspect of the present disclosure is to provide an approach for determining whether or not a robot and components thereof are in an appropriate or optimal position relative to a surgical target area. Such determinations may be made during initial setup of the operating room or during the surgical procedure. Aspects of the present disclosure also provide suggestions for improving a position of the robot relative to the surgical target area.
  • the robot may be mounted on a cart (e.g., a robot arm cart) , and suggestions, instructions, and/or animations for moving the cart can be provided to operating room personnel.
  • the cart may be provided with an ability to move autonomously or semi-autonomously, in which case the instructions for moving the cart can be provided directly to the cart, thereby enabling the cart to move according to a predetermined path within the operating room.
  • navigation components may be mounted at or near the surgical target area, on the robot end effector, on the robot arm, or combinations thereof.
  • a navigation system may be provided with an ability to determine a position of the robot, the robot end effector, and the robot arm relative to the surgical target area by tracking a position of the navigation components. Based on determined relative positions of the robot, the robot end effector, the robot arm, and the surgical target area, the navigation system may also determine if the robot is optimally placed relative to the surgical target area, if an obstacle is precluding the robot from fully accessing the surgical target area, and/or if the robot should be moved.
  • the system may also be capable of determining if the robot cart can be safely moved from its current position to an improved position (e.g., without impacting another obstacle, without impacting personnel, without leaving a sterile area, etc. ) .
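  • The safety determination described above (avoiding obstacles and remaining within a sterile area) could, for example, be approximated as sketched below. The circular obstacle model, the axis-aligned sterile-area rectangle, the sampling step, and the clearance value are illustrative assumptions rather than the disclosed implementation.

```python
import numpy as np

def path_is_safe(start, end, obstacles, sterile_min, sterile_max,
                 clearance_m=0.3, step_m=0.05):
    """Check a straight-line cart path on the floor plane (coordinates in meters).

    obstacles: list of (center_xy, radius_m) tuples for objects to avoid.
    sterile_min / sterile_max: opposite corners of an axis-aligned rectangle
    approximating the sterile area.  Returns True only if every sampled point
    stays inside the sterile area and keeps clearance_m from every obstacle.
    """
    start = np.asarray(start, dtype=float)
    end = np.asarray(end, dtype=float)
    sterile_min = np.asarray(sterile_min, dtype=float)
    sterile_max = np.asarray(sterile_max, dtype=float)

    length = float(np.linalg.norm(end - start))
    n_samples = max(2, int(length / step_m) + 1)

    for t in np.linspace(0.0, 1.0, n_samples):
        p = start + t * (end - start)

        # The path must remain within the sterile area.
        if np.any(p < sterile_min) or np.any(p > sterile_max):
            return False

        # The path must keep clearance from every known obstacle.
        for center, radius in obstacles:
            if np.linalg.norm(p - np.asarray(center, dtype=float)) < radius + clearance_m:
                return False

    return True
```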
  • Similar approaches may be applied to non-robotic arms. For instance, approaches described herein may be used to determine whether a mechanical arm (different from a surgical robot arm) is able to move into a desired location to support a surgical procedure.
  • Fig. 1 illustrates an example of a system 100 that supports aspects of the present disclosure.
  • the system 100 is illustrated to include a computing device 102, imaging devices 112, a robot 114, a navigation system 118, a table 126, a database 130, and/or a cloud network 134 (or other network) .
  • Systems according to other implementations of the present disclosure may include more or fewer components than the system 100.
  • the system 100 may omit and/or include additional instances of the computing device 102, imaging devices 112, the robot 114, the navigation system 118, measurement device 138, measurement device 140, the table 126, one or more components of the computing device 102, the database 130, and/or the cloud network 134.
  • the system 100 may support the implementation of one or more other aspects of one or more of the methods disclosed herein.
  • the computing device 102 includes a processor 104, a memory 106, a communication interface 108, and a user interface 110.
  • Computing devices may include more or fewer components than the computing device 102.
  • the computing device 102 may be, for example, a control device including electronic circuitry associated with controlling the imaging devices 112, the robot 114, the navigation system 118, and the table 126.
  • the computing device 102 may also be, for example, a control device for autonomously or semi-autonomously controlling a cart on which the robot 114 is provided.
  • the computing device 102 may also be, for example, a device which provides instructions, suggestions, and/or animations to operating room personnel (e.g., doctor, nurse, staff, etc. ) for moving a cart on which the robot 114 is provided.
  • the processor 104 of the computing device 102 may be any processor described herein or any similar processor.
  • the processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from imaging devices 112, the robot 114, the navigation system 118, the table 126, the database 130, and/or the cloud network 134.
  • the processor 104 may include one or multiple processors.
  • the memory 106 may be or include RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions.
  • the memory 106 may store information or data associated with completing, for example, any step of the method 400 described herein, or of any other methods.
  • the memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the computing device 102, the imaging devices 112, the robot 114, the navigation system 118, and/or the table 126.
  • the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enables image processing 120, segmentation 122, transformation 124, and/or registration 128.
  • Such content may, in some implementations, be organized into one or more applications, modules, packages, layers, or engines.
  • the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc. ) that can be processed by the processor 104 to carry out the various method and features described herein.
  • While various contents of memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models.
  • the data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging devices 112, the robot 114, the navigation system 118, the database 130, and/or the cloud network 134.
  • the computing device 102 may also include a communication interface 108.
  • the communication interface 108 may be used for receiving data or other information from an external source (e.g., the imaging devices 112, the robot 114, the navigation system 118, the database 130, the cloud network 134, and/or any other system or component separate from the system 100), and/or for transmitting instructions or data (e.g., image data provided by the imaging devices 112, measurement data provided by the measurement devices 138, 140, etc.) to an external destination.
  • the communication interface 108 may include one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth) .
  • the communication interface 108 may support communication between the device 102 and one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
  • the computing device 102 may also include one or more user interfaces 110.
  • the user interface 110 may be or include a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
  • the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100.
  • the user interface 110 may support user modification (e.g., by a surgeon, medical personnel, a patient, etc.
  • the user interface 110 may be used to display instructions and/or animations to operating room personnel regarding placement and/or positioning of the robot 114 relative to the table 126.
  • the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102.
  • the user interface 110 may be located proximate one or more other components of the computing device 102, while in other implementations, the user interface 110 may be located remotely from one or more other components of the computing device 102.
  • the imaging device 112 may be operable to image anatomical feature (s) (e.g., a bone, veins, tissue, etc. ) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc. ) .
  • the imaging device 112 may also be operable to capture discrete images or video images of an environment in which a surgical procedure is taking place.
  • the imaging device 112 may be configured to capture images of a patient, of the table 126 on which the patient is located, objects surrounding the table 126, and tracking objects positioned within a field of view of the imaging device 112. Examples of tracking objects are further described in U.S. Patent No.
  • Image data refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
  • the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof.
  • the image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
  • a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
  • the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
  • the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver) , an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine) , a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera) , a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae) , or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
  • the imaging device 112 may comprise more than one imaging device 112.
  • a first imaging device may provide first image data and/or a first image, and a second imaging device may provide second image data and/or a second image.
  • the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
  • the imaging device 112 may be operable to generate a stream of image data.
  • the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
  • image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
  • the robot 114 may be any surgical robot or surgical robotic system.
  • the robot 114 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system.
  • the robot 114 may be configured to position the imaging device 112 at one or more precise position (s) and orientation (s) , and/or to return the imaging device 112 to the same position (s) and orientation (s) at a later point in time.
  • the robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task.
  • the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure.
  • the robot 114 may be configured to operate or control aspects of one or multiple measurement devices 138, 140.
  • the robot 114 may comprise one or more robotic arms 116.
  • the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms.
  • one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112, a surgical instrument, or the like.
  • where the imaging device 112 comprises two or more physically separate components (e.g., a transmitter and a receiver), one robotic arm 116 may hold one such component and another robotic arm 116 may hold another such component.
  • Each robotic arm 116 may be positionable independently of the other robotic arm.
  • the robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
  • the robot 114 may have, for example, one, two, three, four, five, six, seven, or more Degrees of Freedom (DoF) .
  • the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point.
  • the pose includes a position and an orientation.
  • an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
  • the robotic arm (s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm) .
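  • As one illustration of how joint-sensor readings can be converted into a pose, a minimal planar forward-kinematics sketch is shown below; the two-link geometry and link lengths are purely hypothetical and do not reflect the actual kinematics of the robotic arm 116.

```python
import numpy as np

def planar_arm_pose(joint_angles_rad, link_lengths_m=(0.4, 0.4)):
    """Illustrative planar forward kinematics for a two-link arm.

    joint_angles_rad: (shoulder, elbow) angles reported by joint sensors.
    Returns the (x, y) position of the distal end and its absolute orientation.
    """
    theta1, theta2 = joint_angles_rad
    l1, l2 = link_lengths_m

    # Position of the elbow joint.
    elbow = np.array([l1 * np.cos(theta1), l1 * np.sin(theta1)])

    # Position of the distal end (where an end effector would attach).
    tip = elbow + np.array([l2 * np.cos(theta1 + theta2),
                            l2 * np.sin(theta1 + theta2)])

    return tip, theta1 + theta2
```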
  • the robotic arm (s) 116 may include an end effector (not illustrated) coupled to a distal end of the robotic arm (s) .
  • the end effector may support interaction of the robotic arm (s) with an environment.
  • reference markers (e.g., navigation markers, three-dimensional markers, tracking objects, etc.) may be placed on the robot 114 (including, e.g., on the robotic arm 116, on an end effector of the robot 114, etc.).
  • the reference markers or tracking objects may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof.
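  • The tracking result for a set of reference markers can be expressed as a rigid transform. The sketch below uses the common Kabsch (SVD) method to recover that transform from tracked marker positions; it is offered as a generic illustration, not as the specific tracking method used by the navigation system 118.

```python
import numpy as np

def rigid_transform_from_markers(model_points, observed_points):
    """Estimate rotation R and translation t such that R @ model + t ≈ observed.

    model_points: Nx3 marker positions in the tracked object's own frame.
    observed_points: Nx3 positions of the same markers (same ordering) as seen
    by the navigation camera.  Uses the Kabsch (SVD) method.
    """
    P = np.asarray(model_points, dtype=float)
    Q = np.asarray(observed_points, dtype=float)

    # Center both point sets.
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    Pc, Qc = P - p_mean, Q - q_mean

    # Optimal rotation via SVD of the cross-covariance matrix.
    H = Pc.T @ Qc
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T

    t = q_mean - R @ p_mean
    return R, t
```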
  • the navigation system 118 can be used to track other components of the system (e.g., imaging device 112) and the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example) .
  • the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras or other sensor (s) for tracking one or more reference markers, navigated trackers, tracking objects, or other objects within the operating room or other room in which some or all of the system 100 is located.
  • the one or more cameras may be optical cameras, RGB cameras, infrared cameras, or other cameras.
  • the navigation system 118 may comprise one or more electromagnetic sensors.
  • the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, the measurement device (s) 138, the measurement device (s) 140, and/or one or more surgical tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing) .
  • the system 100 may support alternative and/or additional implementations of coordinate measuring and coordinate tracking in association with the patient (e.g., an anatomical element of the patient, a surgical target area, or the like) using the measurement device (s) 138, 140, and/or image data.
  • the navigation system 118 may include a display (e.g., display 242 later described herein) for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
  • the system 100 can operate without the use of the navigation system 118.
  • the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, whether the robot 114 is positioned appropriately relative to a surgical target area, how to move the robot 114, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
  • the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system) .
  • the database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the system 100) ; one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information.
  • the database 130 may additionally or alternatively store, for example, location or coordinates of objects (e.g., anatomical elements of a patient, the robot 114, the table 126, etc. ) associated with the system 100.
  • the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud network 134.
  • the database 130 may include thresholds associated with movement of a patient, the robot 114, the measurement device (s) 138, the measurement device (s) 140, and/or the table 126.
  • the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS) , a health information system (HIS) , and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
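  • As a simple illustration of how a stored correlation between coordinate systems might be applied, the sketch below assumes the correlation is represented as a 4x4 homogeneous transform; that representation, and the function name, are assumptions for illustration only.

```python
import numpy as np

def map_point(transform_4x4, point_xyz):
    """Map a 3D point from one coordinate system to another.

    transform_4x4: homogeneous transform (e.g., navigation frame to robot
    frame) retrieved from storage.  point_xyz: 3-vector in the source frame.
    """
    T = np.asarray(transform_4x4, dtype=float)
    p = np.append(np.asarray(point_xyz, dtype=float), 1.0)  # homogeneous coordinates
    return (T @ p)[:3]

# Example: a stored navigation-to-robot transform applied to a target position.
# target_in_robot = map_point(nav_to_robot_T, target_in_navigation)
```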
  • the computing device 102 may communicate with a server (s) and/or a database (e.g., database 130) directly or indirectly over a communications network (e.g., the cloud network 134) .
  • the communications network may include any type of known communication medium or collection of communication media and may use any type of protocols to transport data between endpoints.
  • the communications network may include wired communications technologies, wireless communications technologies, or any combination thereof.
  • Wired communications technologies may include, for example, Ethernet-based wired local area network (LAN) connections using physical transmission mediums (e.g., coaxial cable, copper cable/wire, fiber-optic cable, etc. ) .
  • Wireless communications technologies may include, for example, cellular or cellular data connections and protocols (e.g., digital cellular, personal communications service (PCS), cellular digital packet data (CDPD), general packet radio service (GPRS), enhanced data rates for global system for mobile communications (GSM) evolution (EDGE), code division multiple access (CDMA), single-carrier radio transmission technology (1×RTT), evolution-data optimized (EVDO), high speed packet access (HSPA), universal mobile telecommunications service (UMTS), 3G, long term evolution (LTE), 4G, and/or 5G, etc.), low energy, Wi-Fi, radio, satellite, infrared connections, and/or communication protocols.
  • the Internet is an example of the communications network that constitutes an Internet Protocol (IP) network consisting of multiple computers, computing networks, and other communication devices located in multiple locations, and components in the communications network (e.g., computers, computing networks, communication devices) may be connected through one or more telephone systems and other means.
  • the communications network may include, without limitation, a standard Plain Old Telephone System (POTS) , an Integrated Services Digital Network (ISDN) , the Public Switched Telephone Network (PSTN) , a Local Area Network (LAN) , a Wide Area Network (WAN) , a wireless LAN (WLAN) , a Session Initiation Protocol (SIP) network, a Voice over Internet Protocol (VoIP) network, a cellular network, and any other type of packet-switched or circuit-switched network known in the art.
  • the communications network may include any combination of networks or network types.
  • the communications network may include any combination of communication mediums such as coaxial cable, copper cable/wire, fiber-optic cable, or antennas for communicating data.
  • the computing device 102 may be connected to the cloud network 134 via the communication interface 108, using a wired connection, a wireless connection, or both. In some implementations, the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud network 134.
  • the system 100 or similar systems may be used, for example, to carry out one or more aspects of the process flow 300 described herein.
  • the system 100 or similar systems may also be used for other purposes.
  • a system 200, which includes a robotic system 201, is shown according to example implementations of the present disclosure.
  • the robotic system 201 may be described in conjunction with a coordinate system 202.
  • the coordinate system 202 includes three dimensions comprising an X-axis, a Y-axis, and a Z-axis. Additionally or alternatively, the coordinate system 202 may be used to define planes (e.g., the XY-plane, the XZ-plane, and the YZ-plane) of the robotic system 201. These planes may be disposed orthogonal, or at 90 degrees, to one another.
  • While the origin of the coordinate system 202 may be placed at any point on or near the components of the robotic system 201, for the purposes of description the axes of the coordinate system 202 are always disposed along the same directions from figure to figure, whether the coordinate system 202 is shown or not. In some examples, reference may be made to dimensions, angles, directions, relative positions, and/or movements associated with one or more components of the robotic system 201 with respect to the coordinate system 202.
  • Fig. 2 illustrates an example system 200 that supports aspects of the present disclosure.
  • the robotic system 201 may include a robot 114 (e.g., electronic and mechanical components including robotic arm 216) mounted on or supported by a movable base 212.
  • the movable base 212 for the robot 114 may also be referred to as a robot cart or robot arm cart.
  • the system 200 further illustrates placement of the robot 114 and the moveable base 212 relative to a table 226.
  • the table 226 may correspond to an example of table 126, which can also be referred to as a surgical table, an operating table, a patient bed, etc.
  • the robotic system 201 may include examples of aspects of like elements described herein with reference to Fig. 1.
  • the robotic system 201 may be referred to as a workstation.
  • the robotic system 201 may include a display 242 and additional user interfaces (e.g., keyboard, mouse, controls, etc. ) for manipulating the robot 114.
  • Display 242 may correspond to an example of the user interface 110.
  • the robotic system 201 may include one or multiple robotic arms 216, 220, which may correspond to an example of robotic arm 116.
  • a tracking object 208 or optical navigation component may be secured onto or held by a robotic arm 216, 220.
  • a robotic arm 216, 220 may be moved near a surgical target area, thereby providing a physical proximity between the tracking object 208 and the surgical target area of the patient 204.
  • the tracking object 208 may be attached to an end effector 224 of the robotic arm 220. It should be appreciated that the tracking object 208 may alternatively or additionally be attached to an intermediate arm or link (e.g., elbow) .
  • the robotic system 201 may be configured to determine a distance (d1) between the patient 204 and the end of the table 226 and/or a distance (d2) between the patient 204 and the robot 114.
  • the distances (d1 or d2) may be determined using sensors provided on the robot 114 or by using optical navigation as described herein.
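  • As a trivial illustration, the distances d1 and d2 could be computed from tracked positions as follows; the function and variable names are hypothetical and the simple Euclidean distance is an assumption for illustration.

```python
import numpy as np

def tracked_distance(position_a, position_b):
    """Euclidean distance between two tracked positions (e.g., patient vs.
    the end of the table for d1, or patient vs. the robot base for d2),
    expressed in the shared coordinate system 202."""
    return float(np.linalg.norm(np.asarray(position_a, dtype=float) -
                                np.asarray(position_b, dtype=float)))

# d1 = tracked_distance(patient_position, table_end_position)
# d2 = tracked_distance(patient_position, robot_base_position)
```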
  • the navigation system 118 may be configured to determine a position of the tracking object 208 within the coordinate system 202, a position of the robot 114 within the coordinate system 202, and a pose of the robotic arms 216, 220 within the coordinate system 202.
  • Such information can be determined using image data and may be useful in determining whether or not the robot 114 is at an ideal location relative to the table 226.
  • the image data may also be useful in determining alternative or improved positions for the robot 114 and for suggesting such alternative or improved positions via the display 242.
  • aspects of the robotic system 201 may support monitoring of patient movement (e.g., as provided by a measurement device 238 and/or by image data) , monitoring of personnel movement (e.g., as provided by image data) , monitoring of surgical target areas, and the like.
  • the robotic system 201 may monitor patient movement (e.g., movement of an anatomical element) with respect to the robotic system 201.
  • the robotic system 201 may monitor patient movement with respect to the robot 114, the robotic arm 216, 220, and/or a surgical tool coupled to the robotic arm 216, 220.
  • the robotic system 201, possibly with support of information from the navigation system 118, may also be configured to determine if an obstacle has moved between the robot 114 and the surgical target area, as will be described in further detail herein.
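  • One simple way such a determination could be made is a proximity test between the line from the robot 114 to the surgical target area and a tracked obstacle, sketched below; the floor-plane approximation and circular obstacle model are illustrative assumptions.

```python
import numpy as np

def obstacle_blocks_access(robot_xy, target_xy, obstacle_center, obstacle_radius):
    """Return True if a circular obstacle overlaps the straight segment from
    the robot to the surgical target area (floor-plane approximation)."""
    a = np.asarray(robot_xy, dtype=float)
    b = np.asarray(target_xy, dtype=float)
    c = np.asarray(obstacle_center, dtype=float)

    ab = b - a
    denom = float(np.dot(ab, ab))
    if denom == 0.0:
        # Degenerate case: robot and target coincide.
        closest = a
    else:
        # Parameter of the closest point on segment AB to the obstacle center.
        t = float(np.clip(np.dot(c - a, ab) / denom, 0.0, 1.0))
        closest = a + t * ab

    return float(np.linalg.norm(c - closest)) <= obstacle_radius
```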
  • the environment 300 may include a surgical environment, such as an operating room.
  • the environment 300 may include a sterile area 316 and a non-sterile area. Objects contained within the sterile area 316 may be considered sterile or “safe” as compared to objects located outside the sterile area 316.
  • the table 226 and patient 204 may be provided within the sterile area 316 along with health care personnel 324 (e.g., doctors, surgeons, nurses, support staff, etc. ) . Some or all of the robot 114 may also be provided within the sterile area 316. As shown in Fig. 3, the robot 114 may initially be positioned at a first position (e.g., current cart position 304) relative to the table 226. The robot 114 may be utilized by personnel 324 during the surgical procedure to assist at or near the surgical target area 312.
  • an obstacle 320 may move into a location that obstructs the robot’s 114 access to the surgical target area 312.
  • the obstacle 320 may partially or completely block the robot’s 114 access to the surgical target area 312.
  • the obstacle 320 may block movement of the robotic arm 216, 220 to a preferred or desired position.
  • the obstacle 320 may preclude the end effector 224 from accessing the surgical target area 312.
  • an alternative or new proposed cart position 308 may be identified and suggested to personnel 324.
  • a proposed cart path 328 may also be determined and suggested to personnel 324.
  • the proposed cart path 328 may originate at the current cart position 304 and end at the proposed cart position 308.
  • the proposed cart path 328 may be required to remain within the sterile area 316 and may further be required to avoid obstacles 320 or objects within the environment 300. If a safe proposed cart path 328 cannot be achieved (e.g., due to potential impacts or due to the requirement of remaining within the sterile area 316), then the proposed cart position 308 may be determined to be unavailable or non-viable, in which case no new proposed cart positions 308 are provided to personnel 324.
  • the system 100 may be enabled to determine: (1) if a current cart position 304 can be improved to access the surgical target area 312 and (2) if a viable cart path 328 is available to move the robot 114 from the current cart position 304 to the proposed cart position 308. If conditions (1) and (2) cannot both be satisfied, then the system 100 may determine that the current cart position 304 is the best or optimal position for the robot 114.
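  • The two conditions above could be combined as in the following sketch; find_improved_position and find_safe_path are hypothetical callables standing in for the position and path determinations described herein.

```python
def recommend_cart_move(current_position, environment,
                        find_improved_position, find_safe_path):
    """Combine the two checks described above.

    find_improved_position(current, env) -> proposed position, or None.
    find_safe_path(current, proposed, env) -> path, or None.
    Both callables are hypothetical stand-ins for the determinations
    described in this disclosure.
    """
    proposed = find_improved_position(current_position, environment)   # condition (1)
    if proposed is None:
        return None          # current position is already the best available

    path = find_safe_path(current_position, proposed, environment)     # condition (2)
    if path is None:
        return None          # no safe path, so keep the current position

    return {"proposed_position": proposed, "proposed_path": path}
```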
  • locations of objects within the environment 300 may be determined, at least in part, by the navigation system 118 using image data captured by imaging device (s) 112.
  • Such information may be used to determine whether an initial layout of the environment 300 coincides with a defined layout, whether the initial layout of the environment 300 can be improved to support improved efficiencies in the surgical procedure, whether the layout of the environment 300 has changed such that a new position of the robot 114 is needed or desired, etc. Additional details regarding processes for determining such layouts and suggested improvements for the same will now be described with reference to Figs. 4-6.
  • Fig. 4 illustrates a first example of a process flow 400 in accordance with aspects of the present disclosure.
  • process flow 400 may implement aspects of a computing device 102, an imaging device 112, a robot 114, and/or a navigation system 118, described with reference to Figs. 1-3.
  • the process flow 400 begins by providing an initial system setup (step 404) .
  • the initial system setup may correspond to an initial layout of objects within the environment 300 and may correspond to an initial operating room layout.
  • the robot 114 may have a first position relative to the patient 204, the table 226, and the surgical target area 312.
  • the flow 400 continues by placing one or more tracking objects 208 at or near the surgical target area (step 408) .
  • the one or more tracking objects 208 may be placed on or mounted to the patient 204, the robot 114, an end effector 224 of the robot 114, a robotic arm 216, 220, a surgical instrument, personnel 324, an obstacle 320, or the like. It may be desirable to place the tracking object (s) 208 in a field of view of the imaging device (s) 112 to enable tracking, surgical navigation, robotic movements, etc.
  • the navigation system 118 may be used to determine a location of the robot 114, components of the robot 114, and/or obstacles 320 relative to a surgical target area 312 (step 412) .
  • the navigation system 118 may determine that at least one obstacle 320 is blocking the robot’s 114 access to the surgical target area 312 (step 416) .
  • the flow 400 may continue by determining a new proposed cart position 308 for the robot 114 (step 420) .
  • the location of the new proposed cart position 308 may correspond to a different position relative to the table 226 that would allow the robot 114 to better access the surgical target area 312.
  • the new proposed cart position 308 may also correspond to a position that is accessible from the current cart position 304 via a safe and acceptable proposed cart path 328 (step 424) . If no safe and accessible cart path is available, then the flow 400 may stop as a new proposed cart position 308 may not be available.
  • the flow 400 may proceed by providing feedback to personnel 324 regarding the current cart position 304 and the new proposed cart position 308 (step 428) .
  • the feedback provided to personnel 324 may include a display of the environment 300 and the layout of objects in the environment 300 (e.g., a map-type display) .
  • the feedback provided to personnel 324 may also include indications of whether or not the robot 114 has been moved to a position that coincides with the proposed cart position 308 (e.g., green lights indicating that the robot 114 has been properly moved, red lights indicating that the robot 114 has not been properly moved, etc. ) .
  • the feedback provided to personnel 324 may also include a depiction of an animation for moving the robot 114 from the current cart position 304 to the proposed cart position 308 (e.g., via the proposed cart path 328) .
  • the feedback provided to personnel 324 may also include an indication that the current cart position 304 is the optimal cart position.
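  • The indication of whether the robot 114 coincides with the proposed cart position 308 could be as simple as a tolerance check, as sketched below; the tolerance value and the color-coded result are illustrative assumptions.

```python
import numpy as np

def placement_feedback(cart_xy, proposed_xy, tolerance_m=0.05):
    """Return a simple color-coded status for display to operating room
    personnel: 'green' when the cart is co-located with the proposed position
    (within tolerance), 'red' otherwise."""
    error = float(np.linalg.norm(np.asarray(cart_xy, dtype=float) -
                                 np.asarray(proposed_xy, dtype=float)))
    return "green" if error <= tolerance_m else "red"
```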
  • the flow 400 may also include an optional step of enabling the robot 114 to autonomously or semi-autonomously move from the current cart position 304 to the proposed cart position 308 via the proposed cart path 328 (step 432) .
  • the movable base 212 may include automated motor control components that autonomously move the robot 114 within the environment. If such autonomous or semi-autonomous movements are enabled, then a controller of the movable base 212 may also be provided with collision avoidance capabilities to ensure that the robot 114 does not impact obstacles 320, personnel 324, or other objects in the environment 300.
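  • A minimal sketch of autonomous path following with a collision-avoidance stop is shown below; the waypoint list, the sensor and velocity-command callables, and the numeric thresholds are assumptions introduced for illustration, not the actual controller of the movable base 212.

```python
import time
import numpy as np

def follow_path(waypoints, get_cart_position, nearest_obstacle_distance,
                send_velocity, stop_distance_m=0.5, goal_tolerance_m=0.05,
                speed_m_s=0.1, period_s=0.1):
    """Drive the movable base along a list of (x, y) waypoints.

    get_cart_position() -> current (x, y); nearest_obstacle_distance() ->
    meters to the closest detected obstacle or person; send_velocity(vx, vy)
    commands the base.  All three callables are hypothetical interfaces.
    """
    for waypoint in waypoints:
        target = np.asarray(waypoint, dtype=float)
        while True:
            time.sleep(period_s)   # simple fixed control period

            # Collision avoidance: halt whenever anything is too close.
            if nearest_obstacle_distance() < stop_distance_m:
                send_velocity(0.0, 0.0)
                continue

            position = np.asarray(get_cart_position(), dtype=float)
            to_target = target - position
            distance = float(np.linalg.norm(to_target))
            if distance <= goal_tolerance_m:
                break   # waypoint reached, move on to the next one

            direction = to_target / distance
            send_velocity(*(speed_m_s * direction))

    send_velocity(0.0, 0.0)   # path complete
```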
  • process flow 500 may implement aspects of a computing device 102, an imaging device 112, a robot 114, and/or a navigation system 118, described with reference to Figs. 1-3.
  • the flow 500 may begin by determining a current cart position 304 is sub-optimal for enabling the robot 114 to access the surgical target area 312 (step 504) .
  • the determination of step 504 may be made in response to patient 204 movement, a change in position of the surgical target area 312, a movement of an obstacle 320, or any other changing condition that might occur during a surgical procedure.
  • the flow 500 may continue by determining an improved or optimal cart position that is different from the current cart position 304 (step 508) .
  • the flow 500 may then provide instructions and/or animations via the display 242 for placing the robot 114 at the improved or optimal position (step 512) .
  • personnel 324 may be presented with a display or animation showing the proposed cart path 328.
  • It may also be possible to provide instructions for moving the movable base 212 to a controller of the movable base 212.
  • the instructions may be executed by the controller of the movable base 212 to enable autonomous or semi-autonomous movement of the movable base 212 and robot 114.
  • process flow 600 may implement aspects of a computing device 102, an imaging device 112, a robot 114, and/or a navigation system 118, described with reference to Figs. 1-3.
  • the flow 600 may begin by determining an obstacle 320 is blocking a robotic arm 216, 220 from accessing a position or pose, thereby preventing the robot 114 from accessing a desired patient anatomy (step 604) .
  • the desired patient anatomy may correspond to or be co-located with the surgical target area 312, although the desired patient anatomy does not necessarily need to be co-located with the surgical target area 312.
  • the flow 600 may continue by determining a change in operating room layout to provide the robotic arm 216, 220 with better access to the desired patient anatomy (step 608) .
  • the flow 600 may then provide instructions and/or animations via the display 242 for adjusting the operating room layout (step 612) .
  • changes to the operating room layout may include changing a position of the robot 114, changing a position of the table 226, changing a position of the obstacle 320, or combinations thereof.
  • the phrases “at least one,” “one or more,” “or,” and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation.
  • each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C,” “A, B, and/or C,” and “A, B, or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • the term “a” or “an” entity refers to one or more of that entity.
  • the terms “a” (or “an”), “one or more,” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising,” “including,” and “having” can be used interchangeably.
  • automated refers to any process or operation, which is typically continuous or semi-continuous, done without material human input when the process or operation is performed.
  • a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before performance of the process or operation.
  • Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be “material. ”
  • aspects of the present disclosure may take the form of an implementation that is entirely hardware, an implementation that is entirely software (including firmware, resident software, micro-code, etc. ) or an implementation combining software and hardware aspects that may all generally be referred to herein as a “circuit, ” “module, ” or “system. ” Any combination of one or more computer-readable medium (s) may be utilized.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • a computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM) , a read-only memory (ROM) , an erasable programmable read-only memory (EPROM or Flash memory) , an optical fiber, a portable compact disc read-only memory (CD-ROM) , an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including, but not limited to, wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
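For readers who prefer code, the following is a minimal, illustrative Python sketch of the flow 600 described above (steps 604, 608, and 612). All function names, data structures, and the simple distance-based clearance check are hypothetical placeholders, not an API or algorithm disclosed in this application; a real system would instead rely on the image data, the navigation system 118, and the collision checking described elsewhere in the disclosure.

```python
import math
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical data containers; all names and fields below are illustrative
# placeholders and are not drawn from the disclosure.
@dataclass
class Pose:
    x: float
    y: float
    z: float

@dataclass
class LayoutChange:
    element: str        # e.g. "robot", "table", or "obstacle"
    new_position: Pose  # proposed new position for that element

def _distance(a: Pose, b: Pose) -> float:
    """Euclidean distance between two poses (positions only)."""
    return math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))

def is_pose_reachable(target: Pose, obstacles: List[Pose], clearance: float) -> bool:
    """Step 604 (simplified): treat the target pose as reachable when every
    obstacle stays at least `clearance` away from it."""
    return all(_distance(target, obstacle) > clearance for obstacle in obstacles)

def propose_layout_change(target: Pose, obstacles: List[Pose],
                          clearance: float) -> Optional[LayoutChange]:
    """Step 608 (simplified): propose moving the nearest blocking obstacle
    outside the clearance zone around the target pose."""
    blocking = [o for o in obstacles if _distance(target, o) <= clearance]
    if not blocking:
        return None
    nearest = min(blocking, key=lambda o: _distance(target, o))
    # Push the obstacle away from the target along one axis (illustrative only;
    # a real planner could also consider moving the robot 114 or the table 226).
    return LayoutChange("obstacle", Pose(target.x + 2 * clearance, nearest.y, nearest.z))

def run_flow_600(target: Pose, obstacles: List[Pose], clearance: float) -> Optional[str]:
    """Steps 604-612: detect the blockage, compute a layout change, and return
    the instruction text that a display could render."""
    if is_pose_reachable(target, obstacles, clearance):
        return None  # no obstacle blocks the arm; no layout change is needed
    change = propose_layout_change(target, obstacles, clearance)
    if change is None:
        return None
    return (
        f"Move the {change.element} to "
        f"({change.new_position.x:.2f}, {change.new_position.y:.2f}, {change.new_position.z:.2f}) "
        f"to restore access to the surgical target area."
    )

# Example usage with made-up coordinates (meters):
if __name__ == "__main__":
    target = Pose(0.0, 0.0, 0.0)
    obstacles = [Pose(0.1, 0.0, 0.0), Pose(3.0, 1.0, 0.0)]
    print(run_flow_600(target, obstacles, clearance=0.5))
```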

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Robotics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a surgical system, a navigation system, and a method. An illustrative system includes a robot mounted on a mobile base, the robot including one or more robotic arms, a processor, and a memory coupled to the processor. The memory includes data stored thereon that, when processed by the processor, enable the processor to: receive image data depicting a position of the one or more robotic arms relative to a surgical target area; determine that a current position of the robot is suboptimal for enabling the robot to access the surgical target area; and provide instructions and/or animations for moving the mobile base from the current position to a new position.
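Purely as an illustration of the processor behavior recited in the abstract, and not as part of the published disclosure, the sketch below strings the three recited operations together. The class name, method names, reach radius, and reduction of the image data to two 2D points are hypothetical simplifications introduced here for readability.

```python
from typing import Optional, Tuple

class CartPositioningProcessor:
    """Hypothetical wrapper around the three recited processor operations."""

    def __init__(self, reach_radius: float) -> None:
        # Assumed working envelope of the robotic arm, in meters.
        self.reach_radius = reach_radius

    def arm_to_target_distance(self, arm_xy: Tuple[float, float],
                               target_xy: Tuple[float, float]) -> float:
        """Operation 1: derive an arm-to-target distance from received image
        data (reduced here to two 2D points for brevity)."""
        dx = arm_xy[0] - target_xy[0]
        dy = arm_xy[1] - target_xy[1]
        return (dx * dx + dy * dy) ** 0.5

    def is_position_suboptimal(self, distance_to_target: float) -> bool:
        """Operation 2: the current cart position is suboptimal when the
        surgical target area lies outside the assumed reach radius."""
        return distance_to_target > self.reach_radius

    def instructions_to_move(self, arm_xy: Tuple[float, float],
                             target_xy: Tuple[float, float]) -> Optional[str]:
        """Operation 3: produce a human-readable move instruction, or None
        when the current position already affords access."""
        distance = self.arm_to_target_distance(arm_xy, target_xy)
        if not self.is_position_suboptimal(distance):
            return None
        return (f"Move the mobile base about {distance - self.reach_radius:.2f} m "
                f"toward the surgical target area.")

# Example usage with made-up numbers:
processor = CartPositioningProcessor(reach_radius=0.8)
print(processor.instructions_to_move((0.0, 0.0), (1.5, 0.0)))  # suggests moving ~0.70 m
```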
PCT/CN2023/092729 2023-05-08 2023-05-08 Intelligent positioning of a robot arm cart Pending WO2024229651A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2023/092729 WO2024229651A1 (fr) Intelligent positioning of a robot arm cart
CN202380097898.6A CN121174998A (zh) Intelligent positioning of a robot arm cart

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2023/092729 WO2024229651A1 (fr) Intelligent positioning of a robot arm cart

Publications (1)

Publication Number Publication Date
WO2024229651A1 true WO2024229651A1 (fr) 2024-11-14

Family

ID=93431838

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/092729 Pending WO2024229651A1 (fr) 2023-05-08 2023-05-08 Positionnement intelligent d'un chariot de bras de robot

Country Status (2)

Country Link
CN (1) CN121174998A (fr)
WO (1) WO2024229651A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190069962A1 (en) * 2016-02-26 2019-03-07 Think Surgical, Inc. Method and system for guiding user positioning of a robot
US20190105776A1 (en) * 2017-10-05 2019-04-11 Auris Health, Inc. Robotic system with indication of boundary for robotic arm
CN109806004A (zh) * 2019-03-18 2019-05-28 汪俊霞 Surgical robot system based on cloud data technology and operating method
US20210153958A1 (en) * 2018-04-20 2021-05-27 Covidien Lp Systems and methods for surgical robotic cart placement
US20210299877A1 (en) * 2020-03-30 2021-09-30 Depuy Ireland Unlimited Company Robotic surgical system with graphical user interface
CN114948003A (zh) * 2022-06-14 2022-08-30 苏州微创畅行机器人有限公司 Surgical device, readable storage medium, electronic device, and surgical robot system
CN115469656A (zh) * 2022-08-30 2022-12-13 北京长木谷医疗科技有限公司 Intelligent navigation and obstacle avoidance method, system and device for an orthopedic surgical robot
CN115542889A (zh) * 2021-06-30 2022-12-30 上海微觅医疗器械有限公司 Robot preoperative navigation method, system, storage medium and computer device

Also Published As

Publication number Publication date
CN121174998A (zh) 2025-12-19

Similar Documents

Publication Publication Date Title
US20220395342A1 (en) Multi-arm robotic systems and methods for monitoring a target or performing a surgical procedure
US20250152262A1 (en) Path planning based on work volume mapping
US20230346492A1 (en) Robotic surgical system with floating patient mount
WO2023062624A1 Systems for defining an object geometry using robotic arms
US20240382265A1 (en) Hybrid localization for minimally invasive surgery and cervical spinal referencing, and methods for using the same
WO2024229651A1 Intelligent positioning of a robot arm cart
US12094128B2 (en) Robot integrated segmental tracking
US20240156531A1 (en) Method for creating a surgical plan based on an ultrasound view
WO2024103286A1 Plug-and-play arm for spinal robotics
US20240382169A1 (en) Long image multi-field of view preview
US12295683B2 (en) Systems and methods for robotic collision avoidance using medical imaging
WO2023141800A1 Mobile x-ray positioning system
US20240398362A1 (en) Ultra-wide 2d scout images for field of view preview
US20240407745A1 (en) Touch and move anatomy localization
WO2025079075A1 Following navigation camera
WO2024238179A1 Long image multi-field of view preview
WO2024249025A1 Ultra-wide 2D scout images for field of view preview
WO2025141396A1 System and method for orienting the display of a probe for real-time navigation
WO2024254040A1 Touch and move anatomy localization
WO2024236440A1 Hybrid localization for minimally invasive surgery and cervical spinal referencing, and methods for using the same
WO2025120636A1 Systems and methods for determining motion of one or more anatomical elements
WO2025032446A1 Systems and methods for aligning imaging systems with target viewing locations of a stereotactic frame
WO2023286048A2 Systems, devices and methods for identifying and locating a region of interest
EP4472545A1 Systems and methods for controlling a robotic arm

Legal Events

  • 121 Ep: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 23935980; country of ref document: EP; kind code of ref document: A1)
  • WWE WIPO information: entry into national phase (ref document number: 2023935980; country of ref document: EP)
  • ENP Entry into the national phase (ref document number: 2023935980; country of ref document: EP; effective date: 20251208)