
WO2024229649A1 - Non-invasive patient tracking device for a surgical procedure - Google Patents


Info

Publication number
WO2024229649A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging device
tracking markers
image
images
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/CN2023/092720
Other languages
English (en)
Inventor
Weijun Xu
Wei Tang
Fang GENG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mazor Robotics Ltd
Original Assignee
Mazor Robotics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mazor Robotics Ltd filed Critical Mazor Robotics Ltd
Priority to CN202380097910.3A (published as CN121099962A)
Priority to PCT/CN2023/092720 (published as WO2024229649A1)
Publication of WO2024229649A1
Anticipated expiration
Legal status: Pending (current)


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30: Surgical robots
    • A61B 17/00: Surgical instruments, devices or methods
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 2017/00907: Material properties transparent or translucent for light
    • A61B 2034/2051: Tracking techniques using electromagnetic tracking systems
    • A61B 2034/2055: Tracking techniques using optical tracking systems
    • A61B 2034/2065: Tracking using image or pattern recognition
    • A61B 2090/363: Use of fiducial points
    • A61B 2090/371: Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • A61B 2090/3735: Optical coherence tomography [OCT]
    • A61B 2090/374: NMR or MRI
    • A61B 2090/376: Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B 2090/3762: Surgical systems using computed tomography systems [CT]
    • A61B 2090/378: Surgical systems with images on a monitor during operation using ultrasound
    • A61B 2090/3983: Reference marker arrangements for use with image guided surgery

Definitions

  • the present disclosure is generally directed to surgical systems, and relates more particularly to robotic surgical devices.
  • Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure, or may complete one or more surgical procedures autonomously. Layout of the operating room during a surgical procedure is especially important to support successful use of surgical robots.
  • Example aspects of the present disclosure include:
  • a method comprising: obtaining, via a first imaging device, a first image of an object in a surgical area, where the first image includes one or more first tracking markers arranged in a first position relative to the object; obtaining, via a second imaging device, a second image of the first imaging device, where the second image includes one or more second tracking markers arranged in a first position; obtaining, via the first imaging device, one or more additional first images of the object, where the one or more additional first images includes the one or more first tracking markers; obtaining, via the second imaging device, one or more additional second images of the first imaging device, where the one or more additional second images include the one or more second tracking markers; and determining, based on the one or more additional first images and the one or more additional second images, whether the object has moved and/or whether the first imaging device has moved relative to the object.
  • the object comprises soft tissue
  • the method further includes: determining, based on the one or more additional first images, that the object has moved beyond a predetermined object motion threshold; and in response to determining that the object has moved beyond the predetermined object motion threshold, reporting soft tissue motion to at least one of a surgeon and a robot.
  • the soft tissue motion is reported to the robot, and the method further includes: adjusting a robotic surgical plan based on the soft tissue motion.
  • the first imaging device is connected to the robot.
  • the first imaging device is connected to at least one of a robotic arm and an end effector of the robot.
  • the method further includes: determining, based on the one or more additional second images, that the first imaging device has moved relative to the object beyond a predetermined distance threshold; correlating motion of the first imaging device to motion of the robot; and in response to determining that the first imaging device has moved relative to the object beyond the predetermined distance threshold, reporting robot or patient movement to the surgeon.
  • the one or more first tracking markers are provided on a flexible material and the flexible material is attached to a patient at or near a surgical area.
  • the flexible material is attached to the patient using at least one of a staple, suture, and glue.
  • the one or more first tracking markers are sized according to a resolution of the first imaging device and are not visible to the second imaging device.
  • the one or more first tracking markers are printed on the flexible material.
  • the flexible material includes a center area that is substantially devoid of the one or more first tracking markers and the one or more first tracking markers are disposed radially around the center area.
  • the center area includes an area through which an incision is made.
  • the flexible material includes a transparent or semitransparent material.
  • the one or more second tracking markers are larger than the one or more first tracking markers and wherein the one or more second tracking markers are attached to the first imaging device.
  • the method further includes: determining, based on the one or more additional first images, that the one or more first tracking markers have moved to a second position relative to the object; determining, based on the one or more additional second images, that the one or more second tracking markers have moved to a second position indicating a motion of the first imaging device relative to the object; and reporting motion of the object and motion of the first imaging device.
  • At least one of the first image and the second image include one or more of an ultrasound image, a magnetic resonance image, a fluoroscopic image, an infrared image, a visible light image, a radiation image, a computed tomography image, a nuclear medicine image, and a positron-emission tomography image.
  • Another aspect includes a system, where the system includes: a first imaging device including a first field of view that captures one or more first tracking markers arranged on or around a surgical area; a second imaging device including a second field of view that captures the first imaging device; and a computing device, where the computing device includes: a processor; and computer memory coupled with the processor and having data stored therein that, when executed by the processor, enables the processor to: receive first image data from the first imaging device, where the first image data indicates a motion of the one or more first tracking markers relative to the surgical area; receive second image data from the second imaging device, where the second image data indicates a motion of the first imaging device relative to the surgical area; and determine, based on a combination of the first image data and the second image data, whether an object in the surgical area has moved and/or whether the first imaging device has moved relative to the object.
  • one or more second tracking markers are attached to the first imaging device, where the one or more first tracking markers are smaller than the one or more second tracking markers, where the second field of view includes the surgical area, and where the one or more first tracking markers are too small to be recognized in the second image data.
  • the first imaging device is connected to a robot that is supporting a surgical procedure at the surgical area and the first imaging device includes a 3D camera.
  • a surgical system that includes: a robot configured to execute a surgical plan; a first imaging device attached to the robot and configured to capture images of one or more first tracking markers arranged on or around a surgical area, wherein the first imaging device comprises one or more second tracking markers attached thereto; a second imaging device configured to capture images of the first imaging device and the one or more second tracking markers attached thereto; and a computing device including: a processor; and computer memory coupled with the processor and having data stored therein that, when executed by the processor, enables the processor to: receive first image data from the first imaging device, where the first image data indicates a motion of the one or more first tracking markers relative to the surgical area; receive second image data from the second imaging device, where the second image data indicates a motion of the first imaging device relative to the surgical area; determine, based on a combination of the first image data and the second image data, that an object in the surgical area has moved and/or the first imaging device has moved relative to the object; and in response to determining that the object in the surgical area has moved and/
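
As a concrete illustration of the two-camera aspects listed above, the following Python sketch compares tracking-marker positions between an initial image and a later image and maps the measured displacements to the reporting and plan-adjustment behaviour described in those aspects. The function names, array layouts, and threshold values are illustrative assumptions rather than part of the disclosed implementation.

```python
import numpy as np

def mean_displacement_mm(initial_xy: np.ndarray, current_xy: np.ndarray) -> float:
    """Mean distance each tracking marker has moved between two images."""
    return float(np.linalg.norm(current_xy - initial_xy, axis=1).mean())

def evaluate_motion(object_shift_mm: float, device_shift_mm: float,
                    object_threshold_mm: float = 2.0, device_threshold_mm: float = 5.0) -> str:
    """Map displacements of the first markers (object) and second markers (imaging device)
    to the kind of reporting / plan-adjustment behaviour outlined above."""
    if object_shift_mm > object_threshold_mm and device_shift_mm > device_threshold_mm:
        return "Report soft tissue motion and robot/patient movement; adjust surgical plan."
    if object_shift_mm > object_threshold_mm:
        return "Report soft tissue motion to the surgeon and/or robot; adjust robotic surgical plan."
    if device_shift_mm > device_threshold_mm:
        return "Report robot or patient movement to the surgeon."
    return "No motion beyond thresholds; continue as planned."

# First tracking markers (around the object) seen by the first imaging device ...
object_t0 = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
object_t1 = object_t0 + np.array([0.5, 2.4])          # simulated soft-tissue drift
# ... and second tracking markers (on the first imaging device) seen by the second device.
device_t0 = np.array([[100.0, 50.0], [120.0, 50.0]])
device_t1 = device_t0                                  # imaging device held still

print(evaluate_motion(mean_displacement_mm(object_t0, object_t1),
                      mean_displacement_mm(device_t0, device_t1)))
```
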
  • Fig. 1 is a block diagram of a system according to at least one implementation of the present disclosure
  • Fig. 2 illustrates a non-invasive patient tracker according to at least one implementation of the present disclosure
  • Fig. 3A illustrates a first configuration of a flexible material with respect to a patient according to at least one embodiment of the present disclosure
  • Fig. 3B illustrates a second configuration of a patient having more than one patch of flexible material applied thereto according to at least one embodiment of the present disclosure
  • Fig. 4 is a plan view of an environment in which a surgical robot may operate according to at least one implementation of the present disclosure
  • Fig. 5 is an example of a first process flow according to at least one implementation of the present disclosure.
  • Fig. 6 is an example of a second process flow according to at least one implementation of the present disclosure.
  • the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions) .
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer) .
  • processors such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or 10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • proximal and distal are used in this disclosure with their conventional medical meanings, proximal being closer to the operator or user of the system, and further from the region of surgical interest in or on the patient, and distal being closer to the region of surgical interest in or on the patient, and further from the operator or user of the system.
  • robotic surgical system implementations include operating a robotic system in proximity to a surgical table (e.g., a surgical bed, an operating table, etc.) and a patient.
  • the term “robotic system” may also be referred to as a “robotic surgical system” herein.
  • in some robotic system implementations (e.g., in which the robotic system or a robotic arm is not table-mounted or patient-mounted), movement of the robotic system relative to the surgical table and/or patient during a surgical operation may become necessary or desirable.
  • Improved robot performance can be achieved when the robotic system and its components are tracked more accurately, especially relative to a surgical area. Aspects of the present disclosure support improving positioning and tracking of the robotic system relative to the surgical table, the patient, or a surgical area.
  • navigation components may be mounted at or near the surgical area, on the robot end effector, on the robot arm, on a camera, or combinations thereof.
  • a navigation system may be provided with an ability to determine a position of the robot, the camera mounted on the robot, the robot end effector, and the robot arm relative to the surgical area by tracking a position of the navigation components.
  • the camera mounted to the robot may further track motion of the tracking markers placed at or near the surgical area and the camera may provide images of the surgical area to a navigation system.
  • the navigation system may utilize its view of the tracking markers and objects in the operating room along with information received from the robot-mounted camera to control the robotic surgical procedure.
  • the navigation system may also determine if updates to a surgical plan should be made, if the robot should be maneuvered in a particular way, etc.
  • Embodiments of the present disclosure provide technical solutions to one or more of the problems of (1) tracking soft tissue changes intraoperatively, (2) tracking soft tissue changes postoperatively, (3) reporting on soft tissue movement in real-time or within a clinically meaningful amount of time from soft tissue movement, and/or (4) controlling robotic navigation in response to soft tissue movement.
  • the system 100 may be used to track soft tissue movement, assist robotic navigation in response to soft tissue movement, report on soft tissue movement, and/or carry out one or more other aspects of one or more of the methods disclosed herein.
  • the system 100 illustratively includes a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud or other network 128.
  • Systems according to other embodiments of the present disclosure may comprise more or fewer components than the system 100.
  • the system 100 may not include the imaging device 112, the robot 114, the navigation system 118, one or more components of the computing device 102, the database 130, and/or the cloud 128.
  • the computing device 102 comprises a processor 104, a memory 106, a communication interface 108, and a user interface 110.
  • Computing devices according to other embodiments of the present disclosure may comprise more or fewer components than the computing device 102.
  • the processor 104 of the computing device 102 may be any processor described herein, a collection of processors, or any similar processor.
  • the processor 104 may be configured to execute instructions or neural networks stored in the memory 106 (e.g., data) , which may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud 128.
  • the memory 106 may be or comprise RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions.
  • the memory 106 may store information or data useful for completing, for example, any step of the methods 400, 500, 600, 700, and 800 described herein, or of any other methods.
  • the memory 106 may store, for example, one or more image processing algorithms or neural networks 120, an object identification process 122, a tissue motion tracking process 124, and a reporting and feedback process 126.
  • Such instructions, data, or algorithms may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines.
  • the algorithms, data, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, and/or the cloud 128.
  • the memory 106 is shown to include image processing instructions 120, object identification instructions 122, tissue motion tracking instructions 124, and reporting/feedback instructions 126.
  • Any one or more of the image processing instructions 120, the object identification instructions 122, the tissue motion tracking instructions 124, and the reporting/feedback instructions 126 may be implemented as a neural network, artificial neural network, or machine learning model without departing from the scope of the present disclosure.
  • the image processing 120 when executed by the processor 104, may enable the computing device 102 to cooperate with the imaging device (s) 112, robot 114, and/or navigation system 118 to obtain and use patient images.
  • the image processing 120 may be configured to receive patient images (e.g., preoperative patient images, intraoperative patient images, and/or postoperative patient images) , receive object images, receive environmental images (e.g., images of a surgical room) and prepare the images for processing by other components of the computing device 102.
  • the image processing 120 may be configured to receive images and format such images for consumption by the object identification 122 and/or tissue motion tracking 124.
  • the image processing 120 may be configured to transform an image or image data into a different consumable format by converting the image from one digital format to another digital format, by performing pixel analysis to determine locations of objects, by identifying locations of fiducials in an image, by compressing an image, by decompressing an image, by overlaying object models on an image, and/or any other task associated with preparing an image for consumption by the object identification 122 and/or tissue motion tracking 124.
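
The image-preparation functions described above can include locating fiducials by pixel analysis. A minimal sketch of that idea, assuming bright markers on a darker background and using scipy.ndimage for connected-component labelling (the threshold value and helper name are hypothetical), might look like this:

```python
import numpy as np
from scipy import ndimage

def locate_fiducials(gray_image: np.ndarray, intensity_threshold: int = 200):
    """Return (row, col) centroids of bright, fiducial-like blobs in a grayscale image."""
    mask = gray_image >= intensity_threshold          # pixel analysis: keep bright marker pixels
    labels, n_blobs = ndimage.label(mask)             # group pixels into candidate markers
    return ndimage.center_of_mass(mask, labels, range(1, n_blobs + 1))

# Tiny synthetic frame with two bright "markers" on a dark background.
frame = np.zeros((40, 40), dtype=np.uint8)
frame[5:8, 5:8] = 255
frame[30:33, 20:23] = 255
print(locate_fiducials(frame))
```
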
  • the object identification 122 when executed by the processor 104, may enable the computing device 102 to identify one or more objects in an image, identify a pose (e.g., position and/or orientation) of objects in an image and in space, identify a pose of an object relative to another object, identify a pose of objects by analyzing locations of one or more fiducials, identify patient anatomy from pixel and/or gradient analysis of images, identify a pose of non-patient objects (e.g., a robotic arm 116) relative to a patient object (e.g., an organ, nerve, muscle, bone, etc. ) , label objects within an image, and any other combination of tasks associated with identifying an object, locating an object, and/or differentiating one object from another object.
  • the tissue motion tracking 124 when executed by the processor 104, may enable the computing device 102 to track a motion of one or more objects within a field of view of an imaging device 112.
  • the tissue motion tracking 124 may be configured to identify one or more objects of interest and track a motion of the object (s) of interest intraoperatively and/or postoperatively.
  • the tissue motion tracking 124 may be configured to track an absolute motion of one or more objects (e.g., by identifying a movement of the object in space or relative to a fixed coordinate system) and/or track a motion of one object relative to another object.
  • the tissue motion tracking 124 may be configured to locate and track object (s) by identifying object (s) in an image via pixel analysis and then monitoring a movement of the object (s) based on an analysis of subsequent images.
  • the tissue motion tracking 124 may be configured to track a position and/or motion of tissue (e.g., soft tissue) by identifying and tracking a location of fiducials in one or more images.
  • tissue motion tracking 124 may be configured to correlate a motion of one or more fiducials to a motion of one or more objects, where the objects may include soft tissue objects such as an organ, nerve, muscle, tumor growth, skin, etc.
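
One way to correlate fiducial motion with motion of the underlying object, as described above, is to fit a rigid transform to the marker correspondences and apply it to a point of interest. The following sketch assumes 2D marker coordinates and a least-squares (Kabsch-style) fit; the names and sample values are illustrative only:

```python
import numpy as np

def estimate_rigid_2d(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation R and translation t such that dst ~ src @ R.T + t."""
    src_c, dst_c = src - src.mean(axis=0), dst - dst.mean(axis=0)
    u, _, vt = np.linalg.svd(src_c.T @ dst_c)
    r = vt.T @ u.T
    if np.linalg.det(r) < 0:                  # guard against reflections
        vt[-1] *= -1
        r = vt.T @ u.T
    t = dst.mean(axis=0) - src.mean(axis=0) @ r.T
    return r, t

markers_t0 = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
markers_t1 = markers_t0 + np.array([1.5, -0.5])       # simulated drift of the flexible material
r, t = estimate_rigid_2d(markers_t0, markers_t1)
point_of_interest_t0 = np.array([5.0, 5.0])           # e.g., a tracked soft-tissue landmark
print("estimated new landmark position:", point_of_interest_t0 @ r.T + t)
```
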
  • the reporting and feedback 126 when executed by the processor 104, may enable the computing device 102 to provide information related to object identification and/or object motion to a user of the computing device 102.
  • the reporting and feedback 126 may also be configured to provide feedback to the navigation system 118 to assist with the navigation of the robot 114.
  • the reporting and feedback 126 may provide the navigation system 118 and/or a surgeon with information related to object motion, soft tissue motion, fiducial motion, or combinations thereof.
  • the reporting and feedback 126 may alternatively or additionally provide one or more alerts if an object or multiple objects are determined to have moved (absolutely or relatively) beyond a predetermined movement threshold.
  • the reporting and feedback 126 may issue an alert or cause the navigation system 118 to update a navigation plan for the robot 114.
  • Updated navigation plans prepared in response to detecting soft tissue movement may allow the robot 114 to avoid cutting, puncturing, or otherwise damaging an object comprising soft tissue during a surgical procedure.
  • the computing device 102 may also comprise a communication interface 108.
  • the communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 128, and/or any other system or component not part of the system 100) , and/or for transmitting instructions, images, or other information to an external system or device (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 128, and/or any other system or component not part of the system 100) .
  • the communication interface 108 may comprise one or more wired interfaces (e.g., a USB port, an ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth) .
  • the communication interface 108 may be useful for enabling the device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
  • the computing device 102 may also comprise one or more user interfaces 110.
  • the user interface 110 may be or comprise a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
  • the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100.
  • the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
  • the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102.
  • the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computer device 102.
  • the user interface 110 may include a user input device, a user output device, and a combination user input/user output device (e.g., a touch-sensitive display device) .
  • the imaging device 112 may be operable to image anatomical feature (s) (e.g., a bone, veins, soft tissue, etc. ) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, soft tissue, etc. ) .
  • image data refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
  • the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof.
  • the image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
  • a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
  • the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
  • the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver) , an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine) , a magnetic resonance imaging (MRI) scanner, an ultrasound imaging device, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera) , a radar or LIDAR system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae) , or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
  • the imaging device 112 may be contained entirely within a single housing, or may comprise a transmitter/emitter and a receiver/detector that are in separate housings or are otherwise physically separated.
  • the imaging device 112 may comprise more than one imaging device 112.
  • for example, a first imaging device may provide first image data and/or a first image, and a second imaging device may provide second image data and/or a second image.
  • the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
  • the imaging device 112 may be operable to generate a stream of image data.
  • the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
  • image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
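
As a small illustration of the two-frames-per-second notion above, a stream of frame timestamps could be checked as follows; the helper name and timestamp format are assumptions:

```python
def is_continuous_stream(frame_timestamps_s, min_fps: float = 2.0) -> bool:
    """True if the timestamps (in seconds) imply at least min_fps frames per second."""
    if len(frame_timestamps_s) < 2:
        return False
    duration = frame_timestamps_s[-1] - frame_timestamps_s[0]
    fps = (len(frame_timestamps_s) - 1) / duration if duration > 0 else float("inf")
    return fps >= min_fps

print(is_continuous_stream([0.0, 0.4, 0.9, 1.3]))   # roughly 2.3 fps, so True
```
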
  • an imaging device 112 may be mounted to a robot 114 or robotic arm 116 and may be configured to obtain images of tracking markers that are also being imaged by other imaging devices 112 not mounted to the robot 114.
  • images obtained from the imaging device 112 mounted to the robot 114 may be used to supplement image data at the navigation system 118 obtained from other imaging devices 112. It may also be possible to track a motion of the imaging device 112 that is mounted to the robot 114 to determine a motion and pose of the robot 114 and the imaging device 112 relative to the surgical area.
  • the robot 114 may be any surgical robot or surgical robotic system.
  • the robot 114 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system.
  • the robot 114 may be configured to position the imaging device 112 at one or more precise position (s) and orientation (s) , and/or to return the imaging device 112 to the same position (s) and orientation (s) at a later point in time.
  • the robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task.
  • the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure.
  • the robot 114 may comprise one or more robotic arms 116.
  • the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms. In some embodiments, one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112. In embodiments where the imaging device 112 comprises two or more physically separate components (e.g., a transmitter and receiver) , one robotic arm 116 may hold one such component, and another robotic arm 116 may hold another such component. Each robotic arm 116 may be positionable independently of the other robotic arm. The robotic arms may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
  • the robot 114 may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
  • the robotic arm (s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm) .
  • reference markers (i.e., navigation markers) may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof.
  • the navigation system 118 can be used to track other components of the system (e.g., imaging device 112) and the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example) .
  • the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras or other sensor (s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located.
  • the one or more cameras may be optical cameras, infrared cameras, or other cameras.
  • the navigation system may comprise one or more electromagnetic sensors.
  • the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing) .
  • the navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
  • the system 100 can operate without the use of the navigation system 118.
  • the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
  • the system 100 or similar systems may be used, for example, to carry out one or more aspects of any of the methods 500 and 600 described herein.
  • the system 100 or similar systems may also be used for other purposes. Any of the methods depicted and described herein may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor (s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118) .
  • a processor other than any processor described herein may also be used to execute the methods 500 and/or 600.
  • the at least one processor may perform a method (e.g., method 500 or 600) by executing instructions stored in a memory such as the memory 106.
  • One or more tracking markers 202 may be provided on a flexible material 204 and the flexible material 204 may be configured for attachment to an object or patient 300 (e.g., on skin, an organ, or other soft tissue of the patient 300) to assist with the tracking of soft tissue motion intraoperatively and/or postoperatively.
  • a flexible material 204 may include an array of markers 200 provided thereon, where each tracking marker 202 in the array of markers 200 is distributed in a known pattern across the flexible material 204.
  • the tracking markers 202 may be radially disposed relative to a center area 208 of the array 200.
  • the center area 208 may correspond to a portion of the array 200 in which an incision is made.
  • the center area 208 may or may not have a tracking marker 202 provided thereon.
  • the tracking markers 202 are positioned around the center area 208 in a pattern that will enable a determination of motion relative to the center area 208. Because the center area 208 may correspond to a location where an incision is made during a surgical procedure, the surrounding tracking markers 202 can be tracked with an imaging device 112 to determine a motion of the soft tissue around the surgical area (e.g., the area inside the center area 208) .
  • the tracking markers 202 are provided on a ribbon that surrounds the center area 208.
  • the tracking markers 202 may be integrated into the flexible material 204 or affixed to the flexible material 204 such that motion of the flexible material 204 translates to a motion of the tracking markers 202. More specifically, in one example, the tracking markers 202 may be printed into the flexible material 204 such that the profile of the array 200 is substantially constant across the entirety of the array 200. In other words, the tracking markers 202 may be integrated into the flexible material as printed dots, squares, circles, or the like, which means that motion of the flexible material 204 is immediately translated to motion in a tracking marker 202.
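
By way of illustration, a known radial pattern of tracking markers 202 around a clear center area 208 could be described by nominal coordinates such as the following; the marker count, ring radius, and function name are assumptions rather than values taken from the disclosure:

```python
import numpy as np

def radial_marker_layout(n_markers: int = 12, ring_radius_mm: float = 25.0,
                         center=(0.0, 0.0)) -> np.ndarray:
    """Nominal (x, y) positions of markers distributed radially around a clear center area."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_markers, endpoint=False)
    cx, cy = center
    return np.column_stack((cx + ring_radius_mm * np.cos(angles),
                            cy + ring_radius_mm * np.sin(angles)))

layout = radial_marker_layout()
print(layout.round(1))   # the known pattern later used to detect deformation around the incision
```
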
  • the flexible material 204 may be configured for attachment to a patient 300 or an object of interest (e.g., a soft tissue element belonging to a patient 300 such as an organ, nerve, muscle, tumor growth, skin, bone, etc. ) .
  • multiple patches of flexible material 204 may be attached to a patient 300 or one or more anatomical elements of a patient 300.
  • the object identification 122 and/or tissue motion tracking 124 of the computing device 102 may be configured to identify and track motion of soft tissue elements belonging to a patient 300 intraoperatively and/or postoperatively.
  • the flexible material 204 may include one or more tracking markers 202
  • the position and motion of tracking markers 202 can be correlated to a motion of the flexible material 204, which is correlated to a motion of an object to which the flexible material 204 is attached.
  • the flexible material 204 may be attached or connected to a patient 300 or object of a patient 300 in a number of ways.
  • the flexible material 204 may be attached using one or more attachment mechanisms (e.g., connectors, staples, sutures, glue, etc.).
  • the flexible material 204 may be provided with integrated attachment mechanisms, which enable the flexible material 204 to releasably connect with or attach to the patient 300 or anatomical element of the patient 300.
  • the flexible material 204 may correspond to a 3D printed elastic material, rubber, thin paper, a woven material, a non-woven material, or the like.
  • the tracking markers 202 may be created using the same technique used to create the flexible material 204, but a color of the tracking markers 202 may vary from other portions of the flexible material 204 that are not intended to correspond to tracking markers 202.
  • the flexible material 204 may correspond to a mesh or film that is configured to wrap around an object and accept a shape of the object once attached thereto.
  • the flexible material 204 may correspond to a membrane that connects to a soft tissue element of a patient 300.
  • the flexible material 204 may be attachable to a nerve, spinal cord, stent, or blood vessel that should not be cut during a surgical procedure. In this way, the flexible material 204 may be used to track an object that should not be cut or move in response to contact by the robot 114. Alternatively or additionally, the flexible material 204 may be attachable to an object that should be cut during a surgical procedure.
  • Such objects include, without limitation, tumors, growths, organs, foreign objects (e.g., non-human objects) , etc.
  • embodiments hereof may additionally or alternatively be used to track hard tissue as well, including by attaching a flexible material 204 comprising an array of markers 200 to a bone or other hard tissue of a patient.
  • the flexible material 204 may be configured for removal from the patient 300 when the surgical procedure is complete or the flexible material 204 may be configured to be left in the patient 300 postoperatively.
  • the flexible material 204 may include a bioabsorbable mesh that is configured to dissolve after being left in the patient 300 for more than a predetermined amount of time.
  • the flexible material 204 may include an optically transparent or semitransparent material that does not obstruct a view of an object to which the flexible material 204 is attached.
  • the flexible material 204 may be configured to be cut during surgery, but the cut flexible material 204 still retains some degree of attachment to the object when the flexible material 204 and object are cut at the same time.
  • the tracking markers 202 may correspond to active and/or passive fiducials. Tracking markers 202 may vary according to whether they are active or inactive and whether they respond to certain imaging techniques (e.g., visible light, ultrasound, magnetic resonance, radiation, LIDAR, etc.). As an example, some tracking markers 202 may correspond to fluorescent ink, visible ink, heat-activated ink, or the like. In some embodiments, the tracking markers 202 may correspond to fiducials that are relatively small in size (e.g., between 2mm and 4mm) so that they are visible to the human eye, but do not hinder a visualization of the object to which the flexible material 204 is connected.
  • a space between adjacent tracking markers 202 may be larger than a size of the tracking markers 202, but such a configuration is not required.
  • the size of the tracking markers 202 and/or the size of space between tracking markers 202 may depend upon the resolution of the imaging device 112 used to track the tracking markers 202.
  • a 3D camera may be connected to a robot 114 and may be used to obtain close-up images of the tracking markers 202 because the 3D camera is closer to the surgical area than other imaging devices 112.
  • the camera may have a resolution of 0.1mm, which means that tracking markers 202 of 1mm or 2mm in size and/or spaces between tracking markers 202 of 1mm or 2mm in size would be feasible.
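
As a rough illustration of how marker size might be matched to imaging resolution, the sketch below assumes a marker should span some minimum number of pixels to be reliably detected; the pixel count is a hypothetical parameter, not a value from the disclosure:

```python
def min_marker_size_mm(camera_resolution_mm_per_px: float, pixels_across_marker: int = 10) -> float:
    """Smallest marker that still spans the assumed number of pixels at the camera's resolution."""
    return camera_resolution_mm_per_px * pixels_across_marker

# e.g., a robot-mounted camera resolving 0.1mm per pixel could track markers of about 1mm.
print(min_marker_size_mm(0.1))
```
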
  • any one of the tracking markers 202 may be identified and registered by the object identification 122 and/or tissue motion tracking 124 of the computing device 102 to help with the identification of objects and tracking of motion of objects during and/or after a surgical procedure.
  • each of the tracking markers 202 may be specifically identified once attached to the patient 300 or anatomical element of the patient 300.
  • the tissue motion tracking 124 may be configured to specifically monitor a position and movement of each individual tracking markers 202 during the surgical procedure. The monitoring may occur continuously or periodically during the surgical procedure to track a motion of soft tissue elements and to help ensure that a navigation and/or surgical plan is followed and/or determine whether a navigation and/or surgical plan should be adjusted.
  • Fig. 4 illustrates additional details of an environment 400 in which the system may be implemented according to at least some embodiments of the present disclosure.
  • the environment 400 in which a robotic system may operate will be described in accordance with at least some embodiments of the present disclosure.
  • the environment 400 may include a surgical environment, such as an operating room.
  • the environment 400 may include a sterile area 404 and a non-sterile area. Objects contained within the sterile area 404 may be considered sterile or “safe” as compared to objects located outside the sterile area 404.
  • the patient 300 may be positioned on a table 416 inside the sterile area 404 along with health care personnel 408 (e.g., doctors, surgeons, nurses, support staff, etc.).
  • Some or all of the robot 114 may also be provided within the sterile area 404.
  • a first imaging device 112a may be provided on, held by, or connected to some portion of the robot 114.
  • the first imaging device 112a may be connected to a robotic arm 116 or end effector of the robot 114.
  • the first imaging device 112a may correspond to a visible light camera.
  • the first imaging device 112a may correspond to a 3D camera having a resolution of 1mm per pixel.
  • the first imaging device 112a may be relatively close to a surgical area on which the array of markers 200 are provided.
  • the first imaging device 112a may be configured to have its field of view focused on the patient 300, thereby excluding other objects in the environment 400.
  • the first imaging device 112a may have its field of view focused on the surgical area around which the array of markers 200 are provided.
  • the first imaging device 112a is also shown to have multiple tracking markers 412 attached thereto.
  • the tracking markers 412 may correspond to larger fiducials than the tracking markers 202 on the array 200.
  • the tracking markers 412 may have a size of 10mm to 100mm in diameter and may correspond to larger balls or globe-shaped objects that are connected to the first imaging device 112a at one or more predetermined locations.
  • the navigation system 118 may have access to one or more additional imaging devices 112, such as a second imaging device 112b.
  • the second imaging device 112b may be of the same type as the first imaging device 112a, or it may be of a different type.
  • the second imaging device 112b may correspond to an ultrasonic imaging device, an infrared imaging device, a magnetic resonance imaging device, a 2D camera, or a 3D camera.
  • the second imaging device 112b may not be as close to the patient 300 as the first imaging device 112a, but may have a field of view that captures the patient 300 as well as the first imaging device 112a. In this way, images obtained from the second imaging device 112b may be used to determine a position of the first imaging device 112a relative to the patient 300 and to the surgical area.
  • the navigation system 118 may be configured to receive image data from the first imaging device 112a and the second imaging device 112b to determine motions of the patient 300 (e.g., breathing, heaving, etc.) and to determine positions of the robot 114 relative to the patient 300. Such information can be used to support the position and pose of the robot 114 during a surgical procedure.
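
A simplified sketch of how images from the second imaging device 112b might be used to estimate where the first imaging device 112a sits relative to the surgical area is shown below. It assumes both marker sets have already been detected as 2D centroids in the second camera's image; all names and coordinates are illustrative:

```python
import numpy as np

def device_offset_from_patient(device_marker_xy: np.ndarray, patient_marker_xy: np.ndarray) -> np.ndarray:
    """Offset, in the second camera's image coordinates, between the centroid of the markers
    on the first imaging device and the centroid of the patient-attached marker array."""
    return device_marker_xy.mean(axis=0) - patient_marker_xy.mean(axis=0)

device_markers = np.array([[320.0, 80.0], [340.0, 80.0], [330.0, 60.0]])      # markers 412
patient_markers = np.array([[300.0, 240.0], [360.0, 240.0], [330.0, 300.0]])  # array 200 region
print(device_offset_from_patient(device_markers, patient_markers))            # centroid offset in pixels
```
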
  • a surgical method 500 will be described according to at least one embodiment of the present disclosure.
  • the method 500 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor (s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118) .
  • a processor other than any processor described herein may also be used to execute the method 500.
  • the at least one processor may perform the method 500 by executing data stored in a memory such as the memory 106.
  • the data (e.g., instructions or neural networks) may cause the processor to execute one or more of the image processing 120, object identification 122, tissue motion tracking 124, and/or reporting and feedback 126.
  • the method 500 begins by selecting one or more flexible materials 204 for use during a surgical procedure (step 504) .
  • the selected flexible materials 204 may be of the same or different types (e.g., mesh, film, membrane, etc. ) .
  • the selected flexible material (s) 204 may include one or many tracking markers 202 of the same or different types.
  • the flexible material (s) 204 may be selected based on an object to which the flexible material 204 will be attached. For example, the size, shape, or type of object to which the flexible material 204 will be attached may determine the type of flexible material 204 selected for the object.
  • the flexible material 204 may also be selected based on the type (s) of tracking markers 202 provided thereon and/or capabilities of an imaging device 112 that will track a location of the tracking markers 202.
  • the method 500 may continue by attaching the flexible material (s) 204 to the patient 300 and/or anatomical element of the patient 300 (step 508) .
  • This step may further include attaching an imaging device 112 to a robot 114 that is located in proximity of the surgical area.
  • the flexible material (s) 204 may be attached to one or multiple objects and may be configured to conform to the shape of the object (s) once attached thereto.
  • the attachment mechanism (s) used to attach the flexible material (s) 204 to an object may depend upon the expected amount of motion the object will undergo during surgery.
  • if the object is expected to move only minimally during surgery, a first type of attachment mechanism may be used (e.g., nano-hooks), whereas if the object is expected to move significantly (e.g., more than a few millimeters or centimeters) during surgery, then a second type of attachment mechanism may be used (e.g., staples, sutures, glue, etc.).
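
A toy illustration of choosing an attachment mechanism from the expected amount of motion is shown below; the threshold is an assumption, not a value from the disclosure:

```python
def select_attachment_mechanism(expected_motion_mm: float) -> str:
    """Illustrative mapping from expected intraoperative motion to an attachment type."""
    return "nano-hooks" if expected_motion_mm < 5.0 else "staples/sutures/glue"

print(select_attachment_mechanism(2.0), select_attachment_mechanism(20.0))
```
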
  • the flexible material (s) 204 may be attached during a minimally invasive procedure, or during an open procedure. The flexible material (s) 204 may be attached prior to commencing a surgical plan that includes monitoring a pose of soft tissue while one or more other surgical steps are completed.
  • the flexible material (s) 204 may be attached intraoperatively, but before soft tissue movement monitoring is needed.
  • the imaging device 112 may be attached to the robot 114 in a way that enables the imaging device 112 to capture images of the tracking markers 202.
  • the imaging device 112 may be connected to the robot 114 such that the imaging device’s 112 field of view captures the surgical area and the tracking markers 202 surrounding the surgical area.
  • the imaging device 112 connected to the robot 114 may have one or more tracking markers 412 attached thereto that enable tracking of the imaging device 112 via another imaging device 112.
  • the method 500 may then continue by capturing one or more images of the patient 300, objects of the patient 300, the imaging device 112 connected to the robot 114, and/or tracking markers 202, 412 (step 512) .
  • the images may include any type of image or image data generated with machine vision techniques.
  • Non-limiting examples of imaging techniques that may be used to capture the one or more images include ultrasound imaging, magnetic resonance imaging, fluoroscopic imaging, infrared imaging, visible light imaging, radiation imaging, computed tomography imaging, nuclear medicine imaging, positron-emission tomography, combinations thereof, and the like.
  • the images may include preoperative images, in which case the images may not necessarily include the tracking markers 202, 412 (if, for example, the flexible material (s) 204 have not yet been attached to the patient) .
  • the images may include intraoperative images, which may or may not include tracking markers 202, 412.
  • the images may include postoperative images, which may or may not include tracking markers 202, 412.
  • the images may be received directly from an imaging device 112 or from a computer device that stored an electronic copy of the image.
  • the image (s) received in step 512 may include pixel-based images and/or model-based images.
  • the step 512 may comprise generating a two-dimensional or three-dimensional model from one or more images captured during the step 406, using known model generation techniques.
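  • one known model generation technique is to back-project depth pixels into a three-dimensional point set using a pinhole camera model; the sketch below assumes illustrative intrinsics and depth values and is only one of many possible techniques:

```python
import numpy as np

# Back-project (u, v, depth) pixels into camera-frame 3-D points:
# X = depth * K^-1 @ [u, v, 1]^T for a pinhole camera with intrinsics K.
def depth_to_points(depth_mm: np.ndarray, K: np.ndarray) -> np.ndarray:
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pixels = np.stack([u.ravel(), v.ravel(), np.ones(u.size)])  # 3 x N
    rays = np.linalg.inv(K) @ pixels                            # 3 x N
    points = rays * depth_mm.ravel()                            # scale by depth
    return points.T                                             # N x 3, in mm


K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])                          # assumed intrinsics
cloud = depth_to_points(np.full((480, 640), 500.0), K)   # flat surface 0.5 m away
print(cloud.shape)  # (307200, 3)
```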
  • the method 500 may then continue with the processor 104 executing the object identification 122 and/or tissue motion tracking 124 to determine an initial relative position of the tracking markers 202, 412 (step 516) .
  • image data from the imaging device 112 connected to the robot 114 may identify a position of tracking markers 202 while image data from another imaging device 112 not connected to the robot 114 may identify a position of the tracking markers 412.
  • the combination of image data from these multiple imaging devices 112 may help determine a relative position of the robot 114 and the robotic arm 116 relative to the surgical area and the patient 300.
  • the initial relative position of the tracking markers 202, 412 may be used to determine an initial position (absolute or relative) of one or more objects in the image.
  • the position may be determined relative to other objects, relative to an arbitrary coordinate origin, or combinations thereof.
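  • the distinction between a position relative to another object and a position relative to an arbitrary coordinate origin can be illustrated with a short sketch (the marker coordinates below are made up):

```python
import numpy as np

# Express marker positions relative to (a) an arbitrary origin and
# (b) a chosen reference marker; the two differ only by a translation.
markers_world = np.array([[10.0, 20.0, 5.0],
                          [12.0, 21.0, 5.5],
                          [15.0, 18.0, 4.0]])   # mm, illustrative values

origin = np.zeros(3)                     # arbitrary coordinate origin
relative_to_origin = markers_world - origin

reference = markers_world[0]             # e.g., a marker on rigid anatomy
relative_to_reference = markers_world - reference

print(relative_to_reference)
```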
  • the method 500 may then include capturing one or more additional images of the patient 300, of the anatomical element of the patient 300, and/or of the tracking markers 202, 412 during the surgical procedure (step 520) .
  • the image (s) captured intraoperatively may utilize the same or different type of imaging technique as was used to capture images in step 512.
  • the image (s) may correspond to still images and/or video images.
  • the image (s) may be captured continuously during the surgical procedure or periodically (e.g., according to predetermined intervals) .
  • the step 520 may comprise registering the tracking markers 202, 412 as depicted in the one or more additional images to the tracking markers 202, 412 as depicted in the one or more images captured during the step 512.
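  • such registration may, for example, be performed with a rigid point-set fit such as the Kabsch/Procrustes algorithm; the sketch below assumes marker correspondences are already known and is only one possible approach, not the one mandated by this disclosure:

```python
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (R, t) mapping src points onto dst points.

    src, dst: N x 3 arrays of corresponding marker positions.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t


# Illustrative check: markers from the first images vs. the additional images.
first = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0], [0.0, 0.0, 10.0]])
rot = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])  # 90 deg about z
later = first @ rot.T + np.array([1.0, 2.0, 3.0])
R, t = rigid_register(first, later)
print(np.allclose(first @ R.T + t, later))  # True
```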
  • the method 500 may then continue with the processor 104 executing the object identification 122 and/or tissue motion tracking 124 to determine a second position of the tracking markers 202, 412 (step 524) .
  • the second position may correspond to an absolute position in space or a position relative to at least one other object or adjacent tracking markers 202, 412.
  • a motion of a soft tissue element having a flexible material 204 attached thereto may result in one or more tracking markers 202, 412 moving from a first position to a second position between images.
  • the second position of tracking markers 202, 412 may be compared to previous positions of the same tracking markers 202, 412 and/or to current position (s) of other tracking markers 202, 412 (step 528) to determine whether or not the soft tissue element has moved during the surgical procedure (step 532) .
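  • as a non-limiting illustration, the comparison may flag movement when markers displace from their previous positions, or relative to neighboring markers, by more than a tolerance; the 2 mm tolerance and the coordinates below are assumptions for illustration:

```python
import numpy as np

# Flag soft tissue movement when markers on the flexible material displace
# (a) from their previous positions or (b) relative to neighboring markers
# by more than an assumed tolerance (2 mm here, purely illustrative).
def tissue_moved(prev: np.ndarray, curr: np.ndarray, tol_mm: float = 2.0) -> bool:
    per_marker = np.linalg.norm(curr - prev, axis=1)           # absolute motion
    prev_gaps = np.linalg.norm(np.diff(prev, axis=0), axis=1)  # inter-marker spacing
    curr_gaps = np.linalg.norm(np.diff(curr, axis=0), axis=1)
    relative = np.abs(curr_gaps - prev_gaps)                   # local deformation
    return bool(per_marker.max() > tol_mm or relative.max() > tol_mm)


prev = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
curr = prev + np.array([0.0, 3.0, 0.0])   # whole patch shifted 3 mm
print(tissue_moved(prev, curr))           # True
```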
  • the processor 104 may execute the object identification 122 and/or tissue motion tracking 124 to perform steps 528 and/or 532.
  • the tissue motion tracking 124 may invoke the reporting and feedback 126 to report soft tissue movement (step 536) .
  • the soft tissue movement may be reported to a motion controller of the robot 114 and/or to a surgeon operating the robot 114.
  • feedback generated in step 536 may include electronic feedback for a motion controller and/or audible/visible feedback for a human user.
  • the feedback generated in step 536 may be used to optionally adjust a manual or robotic navigation and/or surgical plan that accounts for the soft tissue motion (step 540) .
  • the adjustment to the navigation plan may include adjustments to avoid cutting the soft tissue after movement, ensure cutting the soft tissue after movement, or combinations thereof.
  • the feedback generated in the step 536 may additionally or alternatively comprise a recommended modification to a navigation and/or surgical plan, which recommended modification may be generated automatically.
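  • one hypothetical form of such a recommended modification is sketched below: planned waypoints are translated by the measured tissue displacement and any waypoint that falls within an assumed safety margin of the moved tissue is flagged for review; none of the values or rules are taken from this disclosure:

```python
import numpy as np

# One possible (hypothetical) adjustment: translate planned waypoints by the
# measured soft tissue displacement and flag waypoints that now fall within a
# safety margin of the moved tissue, so the plan can avoid (or ensure) the cut.
def adjust_plan(waypoints: np.ndarray,
                tissue_points: np.ndarray,
                displacement: np.ndarray,
                margin_mm: float = 3.0):
    moved_tissue = tissue_points + displacement
    adjusted = waypoints + displacement            # follow the tissue motion
    dists = np.linalg.norm(adjusted[:, None, :] - moved_tissue[None, :, :], axis=2)
    too_close = dists.min(axis=1) < margin_mm
    return adjusted, too_close


plan = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 10.0]])
tissue = np.array([[1.0, 0.0, 9.0]])
adjusted, flags = adjust_plan(plan, tissue, displacement=np.array([0.0, 3.0, 0.0]))
print(adjusted)
print(flags)  # second waypoint lies within the margin of the moved tissue
```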
  • the present disclosure encompasses embodiments of the method 500 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • with reference to FIG. 6, additional details of a second method 600 will be described in accordance with at least some embodiments of the present disclosure. Operations of the method 600 may be combined with operations of the method 500 without departing from the scope of the present disclosure.
  • the method 600 begins by obtaining image (s) of the flexible material 204 using a first imaging device 112a connected to a robot 114 (step 604) .
  • the first imaging device 112a may correspond to a visible light camera.
  • the first imaging device 112a may include a 3D camera having a resolution of 1mm per pixel.
  • the first imaging device 112a may be connected to a robotic arm 116, an end effector, or some other component of the robot 114 that is supporting the surgical procedure.
  • by attaching the first imaging device 112a to the robot 114, the first imaging device 112a may be in a position to capture better images of the flexible material 204 than other imaging devices 112 in the operating room because the first imaging device 112a is the closest imaging device 112 to the surgical area.
  • the method 600 may also include providing the first imaging device 112a with one or more tracking markers 412 (step 608) .
  • the tracking markers 412 may be connected to the first imaging device 112a such that motion of the first imaging device 112a translates to motion of the tracking markers 412, which can be tracked by other imaging devices in the operating room.
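  • because the tracking markers 412 move rigidly with the first imaging device 112a, a pose of the first imaging device 112a estimated from the second imaging device 112b can be chained with observations made by the first imaging device 112a; the homogeneous-transform sketch below uses illustrative values:

```python
import numpy as np

# Chain transforms: p_room = T_room_cam1 @ p_cam1 (all values illustrative).
def to_homogeneous(points: np.ndarray) -> np.ndarray:
    return np.hstack([points, np.ones((points.shape[0], 1))])


T_room_cam1 = np.eye(4)                      # pose of the first imaging device
T_room_cam1[:3, 3] = [100.0, 50.0, 300.0]    # translation in mm, illustrative

markers_cam1 = np.array([[0.0, 0.0, 150.0],  # tracking markers 202 as seen
                         [5.0, 0.0, 150.0]]) # by the first imaging device
markers_room = (T_room_cam1 @ to_homogeneous(markers_cam1).T).T[:, :3]
print(markers_room)
```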
  • images obtained from a second imaging device 112b may be used to track a motion and/or pose of the first imaging device 112a (step 612) .
  • the motion and pose of the first imaging device 112a may be tracked relative to the surgical area, the flexible material 204, and the tracking markers 202 provided on/in the flexible material 204.
  • the method 600 may continue by capturing additional images of the flexible material 204 using the first imaging device 112a (step 616) .
  • the additional images captured by the first imaging device 112a may enable tracking of the motion of the tracking markers 202 attached to the flexible material 204, whereas images captured by other imaging devices 112 may lack the ability or resolution required to track the motion of the tracking markers 202.
  • the images captured by the first imaging device 112a may be used to determine body position of the patient 300, movement of the patient 300, and/or chest motion of the patient 300 caused by breathing (step 620) .
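  • one simple, illustrative way to estimate breathing from such images is to track the vertical displacement of chest markers over time and take the dominant frequency of that signal; the frame rate and synthetic signal below are assumptions for illustration:

```python
import numpy as np

# Illustrative estimate of breathing rate from the vertical displacement of
# chest markers over time: take the dominant frequency of the mean signal.
def breathing_rate_bpm(z_mm: np.ndarray, fs_hz: float) -> float:
    z = z_mm - z_mm.mean()
    spectrum = np.abs(np.fft.rfft(z))
    freqs = np.fft.rfftfreq(z.size, d=1.0 / fs_hz)
    dominant = freqs[1:][np.argmax(spectrum[1:])]   # skip the DC bin
    return dominant * 60.0


fs = 10.0                                         # assumed camera frame rate (Hz)
t = np.arange(0.0, 60.0, 1.0 / fs)
chest_z = 3.0 * np.sin(2.0 * np.pi * 0.25 * t)    # synthetic 15 breaths/minute
print(round(breathing_rate_bpm(chest_z, fs), 1))  # ~15.0
```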
  • the images from the first imaging device 112a and the second imaging device 112b may be provided to the navigation system 118 for processing (step 624) .
  • the navigation system 118 may be configured to process the image data from the first imaging device 112a and other imaging devices 112 (e.g., the second imaging device 112b) to determine whether motion of the patient 300 and/or other objects should be used to support surgical navigation. For instance, the determinations at step 620 can be used to initiate actions as described in operations 536 and/or 540.
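  • a hypothetical decision rule for this determination is sketched below; the thresholds and the three-way outcome are illustrative assumptions rather than requirements of this disclosure:

```python
# Hypothetical decision rule for the navigation system: small, periodic
# breathing motion may simply be compensated, while larger or non-periodic
# motion triggers reporting (step 536) and plan adjustment (step 540).
def navigation_action(displacement_mm: float, is_periodic: bool) -> str:
    if displacement_mm < 1.0:
        return "ignore"
    if is_periodic and displacement_mm < 5.0:
        return "compensate"     # e.g., gate or model the breathing cycle
    return "report-and-adjust"  # hand off to reporting/feedback and plan update


for case in [(0.5, True), (3.0, True), (8.0, False)]:
    print(case, "->", navigation_action(*case))
```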
  • phrases “at least one, ” “one or more, ” “or, ” and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation.
  • each of the expressions “at least one of A, B and C, ” “at least one of A, B, or C, ” “one or more of A, B, and C, ” “one or more of A, B, or C, ” “A, B, and/or C, ” and “A, B, or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • the term “a” or “an” entity refers to one or more of that entity.
  • the terms “a” (or “an” ) , “one or more, ” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising, ” “including, ” and “having” can be used interchangeably.
  • automated refers to any process or operation, which is typically continuous or semi-continuous, done without material human input when the process or operation is performed.
  • a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before performance of the process or operation.
  • Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be “material. ”
  • aspects of the present disclosure may take the form of an implementation that is entirely hardware, an implementation that is entirely software (including firmware, resident software, micro-code, etc. ) or an implementation combining software and hardware aspects that may all generally be referred to herein as a “circuit, ” “module, ” or “system. ” Any combination of one or more computer-readable medium (s) may be utilized.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • a computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM) , a read-only memory (ROM) , an erasable programmable read-only memory (EPROM or Flash memory) , an optical fiber, a portable compact disc read-only memory (CD-ROM) , an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including, but not limited to, wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.

Abstract

A surgical system (100), a navigation system (118), and a method are disclosed. The method includes obtaining, via a first imaging device (112a), a first image of an object in a surgical area; and obtaining, via a second imaging device (112b), a second image of the first imaging device (112a), the second image including one or more second tracking markers (412) arranged in a first position. The method further includes obtaining, via the first imaging device (112a), one or more additional first images of the object; obtaining, via the second imaging device (112b), one or more additional second images of the first imaging device (112a); and determining, based on the one or more additional first images and the one or more additional second images, whether the object has moved and/or whether the first imaging device (112a) has moved relative to the object.
PCT/CN2023/092720 2023-05-08 2023-05-08 Dispositif de suivi de patient non invasif pour intervention chirurgicale Pending WO2024229649A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202380097910.3A CN121099962A (zh) 2023-05-08 2023-05-08 用于外科规程的非侵入式患者跟踪器
PCT/CN2023/092720 WO2024229649A1 (fr) 2023-05-08 2023-05-08 Dispositif de suivi de patient non invasif pour intervention chirurgicale

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2023/092720 WO2024229649A1 (fr) 2023-05-08 2023-05-08 Dispositif de suivi de patient non invasif pour intervention chirurgicale

Publications (1)

Publication Number Publication Date
WO2024229649A1 true WO2024229649A1 (fr) 2024-11-14

Family

ID=93431810

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/092720 Pending WO2024229649A1 (fr) 2023-05-08 2023-05-08 Dispositif de suivi de patient non invasif pour intervention chirurgicale

Country Status (2)

Country Link
CN (1) CN121099962A (fr)
WO (1) WO2024229649A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2298215A1 (fr) * 2009-09-21 2011-03-23 Stryker Leibinger GmbH & Co. KG Technique d'enregistrement de données d'image d'un objet
CN109152615A (zh) * 2016-05-23 2019-01-04 马科外科公司 在机器人手术过程期间识别和跟踪物理对象的系统和方法
CN110177518A (zh) * 2017-05-25 2019-08-27 柯惠Lp公司 用于在图像捕获装置的视场内检测物体的系统和方法
CN112087981A (zh) * 2018-03-01 2020-12-15 奥瑞斯健康公司 用于标测和导航的方法和系统
CN115944391A (zh) * 2022-09-01 2023-04-11 杭州三坛医疗科技有限公司 一种手术机器人导航定位方法、装置及系统

Also Published As

Publication number Publication date
CN121099962A (zh) 2025-12-09

Legal Events

  • 121: Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 23935978; Country of ref document: EP; Kind code of ref document: A1)
  • WWE: Wipo information: entry into national phase (Ref document number: 2023935978; Country of ref document: EP)
  • ENP: Entry into the national phase (Ref document number: 2023935978; Country of ref document: EP; Effective date: 20251208)