WO2024236563A1 - Systems and methods for generating and updating a surgical plan - Google Patents

Info

Publication number
WO2024236563A1
WO2024236563A1 (PCT/IL2024/050465)
Authority
WO
WIPO (PCT)
Prior art keywords
surgical plan
patient
updating
processor
incision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IL2024/050465
Other languages
English (en)
Inventor
Elad Rotman
Ido Zucker
Adi Ess
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mazor Robotics Ltd
Original Assignee
Mazor Robotics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mazor Robotics Ltd
Priority to CN202480032085.3A (published as CN121127193A)
Publication of WO2024236563A1

Classifications

    • A: HUMAN NECESSITIES
        • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
                • A61B34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
                    • A61B34/10: Computer-aided planning, simulation or modelling of surgical operations
                        • A61B2034/101: Computer-aided simulation of surgical operations
                            • A61B2034/102: Modelling of surgical devices, implants or prosthesis
                            • A61B2034/105: Modelling of the patient, e.g. for ligaments or bones
                        • A61B2034/107: Visualisation of planned trajectories or target regions
                    • A61B34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
                    • A61B34/30: Surgical robots
                • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
                    • A61B90/36: Image-producing devices or illumination devices not otherwise provided for
                        • A61B2090/364: Correlation of different images or relation of image positions in respect to the body
                            • A61B2090/366: Correlation of different images or relation of image positions in respect to the body using projection of images directly onto the body
                        • A61B90/37: Surgical systems with images on a monitor during operation
                            • A61B2090/371: Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • G: PHYSICS
        • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
            • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
                • G16H20/00: ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
                    • G16H20/40: ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
                • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
                    • G16H50/70: ICT specially adapted for medical diagnosis, medical simulation or medical data mining for mining of medical data, e.g. analysing previous cases of other patients

Definitions

  • The present disclosure is generally directed to surgical planning, and relates more particularly to generating and updating a surgical plan.
  • Surgical robots may assist a surgeon or other medical provider in carrying out a surgical procedure, or may complete one or more surgical procedures autonomously.
  • a surgical plan based on preoperative patient anatomy may be used during the surgical procedure. Patient anatomy and, in particular, a position of the patient anatomy may change between preoperative imaging and intraoperative imaging.
  • Example aspects of the present disclosure include:
  • a system for updating a surgical plan comprises a processor; and a memory storing data for processing by the processor, the data, when processed, causing the processor to: receive a surgical plan comprising a plurality of implant positions and information about an incision; obtain a patient position; and update the surgical plan based on the patient position and the plurality of implant positions.
  • the memory stores further data for processing by the processor that, when processed, causes the processor to register the patient position to the surgical plan.
  • updating the surgical plan includes updating one or more implant positions of the plurality of implant positions.
  • updating the surgical plan includes optimizing the plurality of implant positions to enable a single incision.
  • updating the surgical plan includes averaging a corresponding implant entrance point of each of the plurality of implant positions to a single line.
  • updating the surgical plan includes updating the information about the incision.
  • in some embodiments, the system further comprises at least one imaging device, and the at least one imaging device comprises at least one LiDAR camera.
  • a system for updating a surgical plan comprises a processor; and a memory storing data for processing by the processor, the data, when processed, causing the processor to: receive a surgical plan comprising a plurality of implant positions and information about a skin incision; obtain a current patient position; and update the surgical plan based on the current patient position and the plurality of implant positions, wherein updating the surgical plan includes updating the information about the skin incision.
  • the memory stores further data for processing by the processor that, when processed, causes the processor to register the current patient position to the surgical plan.
  • updating the surgical plan includes updating one or more implant positions of the plurality of implant positions.
  • updating the surgical plan includes optimizing the plurality of implant positions to enable a single skin incision.
  • updating the surgical plan includes averaging a corresponding implant entrance point of each of the plurality of implant positions to a single line.
  • updating the surgical plan is based on at least one of whether a port is being used, symmetry of the plurality of implant positions, an entrance point of one or more implants, or reducing a number and size of incisions.
  • in some embodiments, the system further comprises at least one imaging device, and the at least one imaging device comprises at least one LiDAR camera.
  • a system for updating a surgical plan comprises at least one imaging device; a processor; and a memory storing data for processing by the processor, the data, when processed, causing the processor to: receive a surgical plan comprising a plurality of implant positions and information about a skin incision; obtain a current patient position from the at least one imaging device; and update the information about the skin incision based on the current patient position and the plurality of implant positions.
  • the at least one imaging device comprises at least one LiDAR camera.
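  • By way of a non-limiting illustration only, the receive/obtain/update flow recited above can be sketched in a few lines of Python. This is a hypothetical sketch: the data structures, the 4x4 registration transform T_patient, and the 5 mm margin are assumptions, not the disclosed implementation.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class SurgicalPlan:
    implant_entry_points: np.ndarray  # (N, 3) planned skin-entry points
    incision_center: np.ndarray       # (3,) planned incision midpoint
    incision_length: float            # planned incision length (mm)


def update_surgical_plan(plan: SurgicalPlan, T_patient: np.ndarray,
                         margin: float = 5.0) -> SurgicalPlan:
    """Map the planned entry points into the current patient frame via the
    registration transform T_patient (4x4), then recompute a single incision
    spanning all mapped entry points plus a margin."""
    pts = plan.implant_entry_points
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
    mapped = (T_patient @ pts_h.T).T[:, :3]           # entry points, current frame
    center = mapped.mean(axis=0)                      # single-incision midpoint
    span = np.linalg.norm(mapped.max(axis=0) - mapped.min(axis=0))
    return SurgicalPlan(mapped, center, span + 2 * margin)
```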
  • each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • each one of A, B, and C in the above expressions refers to an element, such as X, Y, and Z, or a class of elements, such as X1-Xn, Y1-Ym, and Z1-Zo.
  • the phrase is intended to refer to a single element selected from X, Y, and Z, a combination of elements selected from the same class (e.g., X1 and X2), as well as a combination of elements selected from two or more classes (e.g., Y1 and Zo).
  • FIG. 1 is a block diagram of a system according to at least one embodiment of the present disclosure
  • Fig. 2A is an image of one or more planned incisions and one or more updated incisions according to at least one embodiment of the present disclosure
  • Fig. 2B is an image of one or more planned incisions and one or more updated incisions according to at least one embodiment of the present disclosure
  • Fig. 2C is an image of one or more planned incisions and one or more updated incisions according to at least one embodiment of the present disclosure
  • Fig. 3A is a schematic image of a planned incision according to at least one embodiment of the present disclosure.
  • Fig. 3B is a schematic image of a planned incision according to at least one embodiment of the present disclosure
  • Fig. 4A is a flowchart according to at least one embodiment of the present disclosure.
  • Fig. 4B is a flowchart according to at least one embodiment of the present disclosure.
  • Fig. 5 is a flowchart according to at least one embodiment of the present disclosure.
  • Fig. 6 is a flowchart according to at least one embodiment of the present disclosure.
  • Fig. 7 is a flowchart according to at least one embodiment of the present disclosure.
  • the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions).
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
  • the instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or A10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • The terms proximal and distal are used in this disclosure with their conventional medical meanings: proximal being closer to the operator or user of the system, and further from the region of surgical interest in or on the patient; and distal being closer to the region of surgical interest in or on the patient, and further from the operator or user of the system.
  • Generating a surgical plan for planning a surgical procedure typically uses a preoperative 3D image such as a CT scan or an MRI scan.
  • a position of the patient on a surgical table may change a positioning of the patient’s soft tissue relative to the preoperative 3D image.
  • one or more parameters of the surgical plan may not align with or match the actual position of the patient and the patient’s soft tissue.
  • the patient position (and thus, the position of the patient’s soft tissue) may be determined prior to a start of the surgical procedure or during the surgical procedure.
  • the patient position and the position of the patient’s soft tissue may be registered to one or more parameters or components of the surgical plan such as, for example, a plurality of implant positions and/or trajectories.
  • the plurality of implant positions and/or trajectories can be projected onto the patient, and a user such as a surgeon can change the patient’s positioning or update the surgical plan in real time.
  • the surgical plan can also be updated automatically by, for example, a processor.
  • the surgical plan can be updated by, for example: uniting a cluster of implant skin-entry points into a single hole, which may improve patient recovery because fewer incisions need to heal; fitting all skin-entry points within a certain dimension to allow the use of a port, which likewise leaves fewer incisions to heal; selecting a skin incision that averages all implant skin-level entry points to a single line (as sketched below), which may improve the aesthetics of the surgical outcome; and/or planning symmetrical incisions.
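  • The third strategy, averaging all implant skin-level entry points to a single line, amounts to a least-squares line fit. A minimal hypothetical sketch, here computed as the principal axis of the centred points:

```python
import numpy as np


def incision_line(entry_points: np.ndarray):
    """Fit one straight incision line through the implant skin-entry points:
    the least-squares line passes through the centroid along the principal
    axis of the centred points (first right singular vector)."""
    centroid = entry_points.mean(axis=0)
    _, _, vt = np.linalg.svd(entry_points - centroid)
    return centroid, vt[0]   # a point on the line, and its unit direction


# Example: three roughly collinear skin-entry points (mm)
pts = np.array([[0.0, 0.0, 0.0], [10.0, 1.0, 0.2], [20.0, -0.5, 0.1]])
centroid, direction = incision_line(pts)
```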
  • Embodiments of the present disclosure provide technical solutions to one or more of the problems of (1) reducing incisions during a surgical procedure, (2) improving patient recovery time, (3) enabling adjustments to a surgical plan during a surgical procedure, and (4) improving patient safety.
  • a block diagram of a system 100 according to at least one embodiment of the present disclosure is shown.
  • the system 100 may be used to generate a surgical plan, update the surgical plan during a surgical procedure, and/or carry out one or more other aspects of one or more of the methods disclosed herein.
  • the system 100 comprises a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud or other network 134.
  • Systems according to other embodiments of the present disclosure may comprise more or fewer components than the system 100.
  • the system 100 may not include the imaging device 112, the robot 114, the navigation system 118, one or more components of the computing device 102, the database 130, and/or the cloud 134.
  • the computing device 102 comprises a processor 104, a memory 106, a communication interface 108, and a user interface 110.
  • Computing devices according to other embodiments of the present disclosure may comprise more or fewer components than the computing device 102.
  • the processor 104 of the computing device 102 may be any processor described herein or any similar processor.
  • the processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud 134.
  • the memory 106 may be or comprise RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions.
  • the memory 106 may store information or data useful for completing, for example, any step of the methods 500, 600, and/or 700 described herein, or of any other methods.
  • the memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the robot 114.
  • the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enables an image processing 120, a registration 122, a planning model 124, and/or an update planning model 128.
  • the image processing 120 enables the processor 104 to process image data of an image (received from, for example, the imaging device 112, an imaging device of the navigation system 118, or any imaging device) for the purpose of, for example, identifying a position of a patient (e.g., patient position), a position of the patient’s soft tissue, and/or a position of patient anatomy (e.g., vertebrae, soft tissue, hard tissue, etc.) and more specifically, an actual position of a portion of the patient on which one or more incisions are planned to be placed and one or more implants are planned to be inserted.
  • the patient position obtained from the image processing 120 may enable the processor 104 to update one or more parameters and/or steps of a surgical plan 132.
  • the registration 122 enables the processor 104 to correlate an image with another image.
  • the registration 122 may enable the processor 104 to also correlate identified anatomical elements and/or individual objects in one image with identified anatomical elements and/or individual objects in another image.
  • the registration 122 may enable information about the anatomical elements and the individual objects to be obtained and measured. For example, a potential incision site on the patient in a first image (e.g., a preoperative image) and the potential incision site on the patient in a second image (e.g., an intraoperative image) can be determined by the processor 104. This information (among other inputs) can be used in determining and/or updating one or more parameters of the surgical plan, as illustrated below.
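  • For instance, once the registration yields a rigid transform between the two images, the potential incision site can be carried from one image frame to the other. A hypothetical sketch; the 4x4 transform and the 12 mm example translation are assumptions:

```python
import numpy as np


def map_point(T_pre_to_intra: np.ndarray, p_pre: np.ndarray) -> np.ndarray:
    """Apply a 4x4 rigid registration transform to carry a point (e.g., a
    potential incision site) from the preoperative image frame into the
    intraoperative image frame."""
    return (T_pre_to_intra @ np.append(p_pre, 1.0))[:3]


# Example: the patient shifted 12 mm along x between the two images
T = np.eye(4)
T[0, 3] = 12.0
site_intra = map_point(T, np.array([30.0, -4.0, 55.0]))  # -> [42., -4., 55.]
```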
  • the planning model 124 enables the processor 104 to receive one or more inputs such as one or more inputs 404 (shown in Fig. 4A) about a surgical procedure and a patient and generate a surgical plan such as the surgical plan 132.
  • the one or more inputs may include, for example, surgeon inputs, available surgical instruments, a 3D scan of the patient (e.g., an MRI scan and/or a CT scan), a plurality of implant positions and/or orientations, a desired incision (e.g., a minimally invasive surgical (MIS) incision, a single MIS incision, a single MIS regression line), whether a port will be used, and/or information about the port (e.g., size, shape, etc.).
  • the implants may include, for example, one or more spine screws, one or more rods, one or more cages, etc.
  • the planning model 124 can generate, for example, one or more steps of the surgical plan 132, information about at least one planned incision 200 (shown in Figs. 2A-2C), information about a rod and rod insertion, a rod entrance point, a position for each of a plurality of implants and/or a trajectory for each of the plurality of implants based on the one or more inputs.
  • the information about the at least one planned incision 200 may include, for example, a length and position of the incision on the patient’s soft tissue.
  • the planning model 124 may be used during the surgical procedure. More specifically, an actual or current patient position 418 (shown in Figs. 4A and 4B) may be obtained or received during or prior to a start of a surgical procedure. The actual or current patient position 418 may be used with one or more inputs or parameters (e.g., type of procedure, surgical instruments to be used during the surgical procedure, skin elasticity, etc.) to plan an incision on the patient. In such embodiments, the incision may not be planned prior to the surgical procedure, but rather is planned during the surgical procedure.
  • the update planning model 128 enables the processor 104 to update the surgical plan 132 or at least a portion of the surgical plan 132 based on, for example, information about the patient during or prior to a start of the surgical procedure.
  • the information may include a skin elasticity of the patient (which may affect, for example, a width of an incision) and/or an actual position of the patient 408 (shown in Fig. 4A) and/or patient anatomy during the surgical procedure (obtained from, for example, one or more imaging devices 112).
  • the patient anatomy obtained from, for example, a first set of image data from two or more LiDAR cameras and/or an imaging device of the navigation system 118 may include, for example, a position of the soft tissue (e.g., skin) of the patient.
  • the patient anatomy obtained from, for example, a second set of image data from an X-ray imaging device may include, for example, a position of hard tissue (e.g., bone).
  • the first set of image data and the second set of image data can be used to create or update a patient’s position and an orientation and depth of the patient’s skin at, for example, a target skin incision site.
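  • One way such a combination could work is sketched below under assumed inputs: a LiDAR-derived skin point cloud and an X-ray-derived bone target. The k-nearest-neighbour plane fit is an illustrative assumption, not the disclosed method:

```python
import numpy as np


def skin_patch_at_target(skin_points: np.ndarray, bone_target: np.ndarray,
                         k: int = 8):
    """Find the skin point (from the LiDAR cloud) nearest an X-ray-derived
    bone target, estimate the local skin orientation from its k nearest
    neighbours, and report the skin depth above the target."""
    d = np.linalg.norm(skin_points - bone_target, axis=1)
    patch = skin_points[np.argsort(d)[:k]]
    centroid = patch.mean(axis=0)
    # smallest principal axis of the local patch approximates the surface normal
    _, _, vt = np.linalg.svd(patch - centroid)
    normal = vt[-1]
    depth = float(np.linalg.norm(centroid - bone_target))
    return centroid, normal, depth
```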
  • the update planning model 128 can update at least a portion of the surgical plan 132 to generate an updated surgical plan 412 (shown in Fig. 4A).
  • the updated surgical plan 412 may include, for example, updated information about the incision (e.g., updated incision) 202 (shown in Figs. 2A-2C and 3A-3B), an updated plurality of implant positions 302 (shown in Fig. 3A), and/or updated trajectories of each of the plurality of implants 300 (shown in Fig. 3B).
  • the information about the incision may include instructions for performing multiple incisions and the instructions may be updated to include instructions to perform a single incision.
  • the update planning model 128 may update the information about the incision based on, for example, optimizing the plurality of implant positions and/or entry points to enable a single skin incision, averaging the corresponding skin-level entry point of each implant to a single line, whether a port is used and the dimension(s) of the port (see the sketch below), symmetry (or lack thereof) between two or more incisions, and/or a skin elasticity of the patient. It will be appreciated that the update planning model 128 can be used during any portion of the surgical procedure.
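  • The port-based criterion can be illustrated with a simple geometric test. This is a conservative sketch using a circle centred on the centroid; a rigorous version would use the minimal enclosing circle:

```python
import numpy as np


def fits_single_port(entry_points: np.ndarray, port_diameter: float) -> bool:
    """Check whether all skin-entry points fit within one port of the given
    diameter, i.e., whether a single skin incision can serve every implant."""
    xy = entry_points[:, :2]                          # skin-plane coordinates
    radii = np.linalg.norm(xy - xy.mean(axis=0), axis=1)
    return bool(radii.max() <= port_diameter / 2.0)
```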
  • Such content described above may, in some embodiments, be organized into one or more applications, modules, packages, layers, or engines.
  • the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the processor 104 to carry out the various method and features described herein.
  • Although various contents of the memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models.
  • the data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging device 112, the robot 114, the database 130, and/or the cloud 134.
  • the memory 106 may also store the surgical plan 132.
  • the surgical plan 132 may comprise, for example, one or more steps for performing a surgical procedure, information about the incision, information about a rod and rod insertion, a position for each of the plurality of implants, and/or a trajectory for each of the plurality of implants.
  • the surgical procedure may be a spinal procedure (e.g., a spinal alignment, installing implants, osteotomy, fusion, and/or any other spinal procedure) to correct a spinal deformity.
  • the surgical plan 132 may comprise one or more surgical steps for inserting a plurality of implants and a rod to move a plurality of vertebrae to a predetermined alignment.
  • the surgical plan 132 may also be stored in the database 130 and/or the cloud 134.
  • the computing device 102 may also comprise a communication interface 108.
  • the communication interface 108 may be used for receiving image data or other information from an external source (such as the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100), and/or for transmitting instructions, images, or other information to an external system or device (e.g., another computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, the cloud 134, and/or any other system or component not part of the system 100).
  • the communication interface 108 may comprise one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth).
  • the communication interface 108 may be useful for enabling the device 102 to communicate with one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
  • the computing device 102 may also comprise one or more user interfaces 110.
  • the user interface 110 may be or comprise a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
  • the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100.
  • the user interface 110 may be useful to allow a surgeon or other user to modify instructions to be executed by the processor 104 according to one or more embodiments of the present disclosure, and/or to modify or adjust a setting of other information displayed on the user interface 110 or corresponding thereto.
  • the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102.
  • the user interface 110 may be located proximate one or more other components of the computing device 102, while in other embodiments, the user interface 110 may be located remotely from one or more other components of the computing device 102.
  • the imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.).
  • “Image data” as used herein refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
  • the image data may comprise data corresponding to an anatomical feature of a patient, or to a portion thereof.
  • the image data may be or comprise a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
  • a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time
  • a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
  • the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
  • the imaging device 112 may be or comprise, for example, an ultrasound scanner (which may comprise, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a LiDAR camera, a depth camera, a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may comprise, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
  • the imaging device 112 may be contained entirely within a single housing, or may comprise a transmitter/emitter and a receiver that are physically separate from each other.
  • the imaging device 112 may comprise more than one imaging device 112.
  • a first imaging device may provide first image data and/or a first image
  • a second imaging device may provide second image data and/or a second image.
  • image data from one set may be used to detect hard tissue such as, for example, bone
  • image data from another set may be used to detect soft tissue such as, for example, skin.
  • the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
  • the imaging device 112 may be operable to generate a stream of image data.
  • the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
  • image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
  • the robot 114 may be any surgical robot or surgical robotic system.
  • the robot 114 may be or comprise, for example, the Mazor X™ Stealth Edition robotic guidance system.
  • the robot 114 may be configured to position the imaging device 112 at one or more precise position(s) and orientation(s), and/or to return the imaging device 112 to the same position(s) and orientation(s) at a later point in time.
  • the robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task.
  • the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure.
  • the robot 114 may comprise one or more robotic arms 116.
  • the robotic arm 116 may comprise a first robotic arm and a second robotic arm, though the robot 114 may comprise more than two robotic arms.
  • one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112.
  • where the imaging device 112 comprises two or more physically separate components (e.g., a transmitter and receiver), one robotic arm 116 may hold one such component, and another robotic arm 116 may hold another such component.
  • Each robotic arm 116 may be positionable independently of the other robotic arm.
  • the robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
  • the robot 114 together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
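  • A pose combining position and orientation is commonly represented as a 4x4 homogeneous transform. A brief illustrative sketch, not specific to the robot 114:

```python
import numpy as np


def make_pose(position: np.ndarray, rotation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous pose (position + orientation), e.g., for a
    robotic-arm end effector or an object held by the robotic arm."""
    T = np.eye(4)
    T[:3, :3] = rotation      # 3x3 rotation matrix (orientation)
    T[:3, 3] = position       # translation vector (position)
    return T


# Example: a tool 400 mm above the coordinate-space origin, axis-aligned
pose = make_pose(np.array([0.0, 0.0, 400.0]), np.eye(3))
```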
  • the robotic arm(s) 116 may comprise one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm).
  • in some embodiments, reference markers (e.g., navigation markers) may be placed on the robot 114, the robotic arm 116, and/or any other object in the surgical space.
  • the reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof.
  • the navigation system 118 can be used to track other components of the system (e.g., imaging device 112) and the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example).
  • the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during a procedure.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras or other sensor(s) for tracking one or more reference markers, navigated trackers, or other objects within the operating room or other room in which some or all of the system 100 is located.
  • the one or more cameras may be optical cameras, infrared cameras, or other cameras.
  • the navigation system 118 may comprise one or more electromagnetic sensors.
  • the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing).
  • the navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
  • the system 100 can operate without the use of the navigation system 118.
  • the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
  • the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems to a patient coordinate system and/or to a navigation coordinate system).
  • the database 130 may additionally or alternatively store, for example, one or more surgical plans such as the surgical plan 132 (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the navigation system 118, and/or a user of the computing device 102 or of the system 100); one or more images useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information.
  • the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud 134.
  • the database 130 may be or comprise part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
  • the cloud 134 may be or represent the Internet or any other wide area network.
  • the computing device 102 may be connected to the cloud 134 via the communication interface 108, using a wired connection, a wireless connection, or both.
  • the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud 134.
  • the system 100 or similar systems may be used, for example, to carry out one or more aspects of any of the methods 500, 600, and/or 700 described herein.
  • the system 100 or similar systems may also be used for other purposes.
  • Figs. 2A - 2C and 3A - 3B illustrate processes for planning and updating the surgical plan 132, and more specifically, surgical incisions based on one or more factors. More specifically, Figs. 2A - 2C illustrate a first image, a second image, and a third image, respectively, of example planned incision(s) 200 and updated incision(s) 202. It will be appreciated that the planned incision(s) 200 and/or the updated incision(s) 202 may be displayed or projected on the patient or may be displayed on a model and/or images of the patient on the user interface 110. The planned incision(s) 200 may be part of the surgical plan 132 and may be generated by, for example, the planning model 124.
  • the planned incision(s) 200 can be mapped on the actual patient or shown on a 3D model or 2D image of the patient based on the patient position 408, which can be obtained by, for example, the imaging device 112 or an imaging device of the navigation system 118.
  • the patient position 408 can be inputted into the update planning model 128 to generate the updated incision(s) 202.
  • the updated incision(s) 202 may be generated by a user such as, for example, a surgeon or other medical provider based on viewing the planned incision(s) 200 on the patient or on the user interface 110.
  • the planned incision(s) 200 include two incisions 200A, 200B spaced apart from each other and also each spaced a distance from a center line 206 of the patient. As shown, the two incisions 200A, 200B are not symmetrical with respect to the center line 206 of the patient.
  • the update planning model 128 (or the user) may output an updated incision 200B’ in which the position of 200B is moved such that 200B’ is symmetrical with 200A with respect to the center line 206 of the patient.
  • Such updated incision 200B’ may be aesthetically desirable.
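  • Restoring symmetry as in Fig. 2A reduces, in the simplest case, to reflecting one incision's midpoint across the patient's center line. A hypothetical sketch; the midline is assumed to run along x = midline_x:

```python
import numpy as np


def mirror_incision(center: np.ndarray, midline_x: float) -> np.ndarray:
    """Reflect an incision midpoint across the patient's center line to
    obtain a symmetrical counterpart."""
    mirrored = center.copy()
    mirrored[0] = 2.0 * midline_x - mirrored[0]
    return mirrored


# Example: incision 200A sits at x = -12 mm; the updated 200B' mirrors it
center_200A = np.array([-12.0, 40.0, 0.0])
center_200B_prime = mirror_incision(center_200A, 0.0)  # -> [12., 40., 0.]
```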
  • a set of planned incision(s) 200 may include multiple incisions positioned near each other.
  • the update planning model 128 may average the set of planned incision(s) 200 so as to align the incisions together along, for example, a common line 204, and/or combine two or more of the incisions 200 together.
  • the updated incisions 202 include combining the incisions into two incisions, which may reduce healing time (as fewer incisions are healing) and may improve the aesthetics of the surgical site.
  • a set of planned incision(s) 200 may include three incisions positioned near each other.
  • the update planning model 128 can generate an updated incision 202 (shown in solid line for clarity) to fit within the desired port.
  • the planned incision(s) 200 may result in rupture between the incisions due to the proximity of the incisions to one another. Thus, optimizing the incisions and forming one incision eliminates the risk of rupture between multiple incisions (a merge of this kind is sketched below).
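  • Merging incisions that sit too close together, as in Figs. 2B and 2C, can be sketched as interval merging along the common line 204. The 10 mm minimum skin bridge is an assumed parameter:

```python
def merge_close_incisions(intervals, min_gap: float = 10.0):
    """Merge planned incisions (1D intervals along a common line) whenever
    the skin bridge between neighbours is thinner than min_gap, avoiding
    rupture between adjacent incisions."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start - merged[-1][1] < min_gap:
            merged[-1][1] = max(merged[-1][1], end)  # widen previous incision
        else:
            merged.append([start, end])
    return merged


# Example: three 15 mm incisions with 5 mm bridges collapse into one
print(merge_close_incisions([(0, 15), (20, 35), (40, 55)]))  # -> [[0, 55]]
```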
  • Figs. 3A and 3B illustrate a top schematic view and a side schematic view, respectively, of an updated incision such as the updated incision 202.
  • the updated incision 202 includes one incision to accommodate three trajectories 300 for three implants to be positioned at three positions 302.
  • absent such updating, three separate incisions would be formed for the three implants (stated differently, one incision may be formed for each implant), which may result in increased healing time and/or a risk of ruptures between two or more incisions.
  • the updated incision 202 comprising a single incision may be generated, thereby reducing a number of incisions on the patient, which may reduce a healing time of the patient and improve the aesthetics of the surgical site. It will be appreciated that in other embodiments, the updated incision 202 may be determined by a user such as, for example, a surgeon or other medical provider.
  • Turning to Fig. 4A, an example of a model architecture 400 that supports methods and systems (e.g., Artificial Intelligence (AI)-based methods and/or systems) for generating the surgical plan 132 and updating the surgical plan 132 to provide an updated surgical plan 412 is shown.
  • One or more inputs 404 may be used by a processor such as the processor 104 as input for the planning model 124.
  • the planning model 124 may output the surgical plan 132.
  • the one or more inputs 404 may include, for example, surgeon inputs (e.g., desired type of surgery (open or MIS), desired incision position, etc.), available surgical instruments, a 3D scan of the patient (e.g., an MRI scan and/or a CT scan), a plurality of implant positions and/or orientations, a desired incision (e.g., a minimally invasive surgical (MIS) incision, a single MIS incision, a single MIS regression line), whether a port will be used, and/or information about the port (e.g., size, shape, etc.).
  • the 3D scan of the one or more inputs 404 may be received from an imaging device such as the imaging device 112, an imaging device of a navigation system such as the navigation system 118, a memory such as the memory 106, a database such as the database 130, a cloud such as the cloud 134, or any other imaging device or component of a system such as the system 100.
  • the planning model 124 - using the one or more inputs 404 - can generate, for example, one or more steps of the surgical plan 132, information about at least one planned incision, information about a rod and rod insertion, and/or a position and/or a trajectory for each of the plurality of implants.
  • the planning model 124 may be trained using historical inputs such as, for example, historical image data, historical incision data, historical patient data, historical rod information, historical port information, historical surgical plans, and/or historical implant positions and trajectories. In other embodiments, the planning model 124 may be trained using the one or more inputs 404. In such embodiments, the planning model 124 may be trained prior to inputting the one or more inputs 404 into the planning model 124 or may be trained in parallel with inputting the one or more inputs 404 into the planning model 124.
  • the planning model 124 may output the surgical plan 132 and a planned patient position 408.
  • the surgical plan 132 and the planned patient position 408 may be used to position a patient to yield an actual patient position 418.
  • the actual position of the patient may be used to update the surgical plan 132 and yield the updated surgical plan 412 using, for example, the update planning model 128. More specifically, the patient may be positioned for a surgical procedure and the actual or current position of the patient 418 may be obtained by, for example, one or more imaging devices such as the imaging devices 112.
  • the actual or current patient position 418 may include a position of soft tissue on the patient upon which the planned incision(s) are to be placed.
  • the imaging device used to obtain a position of the soft tissue (e.g., the patient’s skin) may be, for example, one or more LiDAR cameras.
  • the patient position may be obtained from a combination of two image data sets.
  • a first set of image data from an X-ray imaging device may include, for example, a position of hard tissue (e.g., bone) and a second set of image data from two or more LiDAR cameras may include, for example, a position of the soft tissue (e.g., the patient’s skin) of the patient.
  • the first set of image data and the second set of image data can be used to update a patient’s position and an orientation and depth of the patient’s skin at, for example, a target skin incision site.
  • the surgical plan 132 and a patient position 418 may be used by the processor 104 as input for the update planning model 128.
  • the update planning model 128 may output an updated surgical plan 412 which may include, for example, updating at least one additional portion or parameter of the surgical plan 132.
  • the updated surgical plan 412 may include, for example, updated information about the incision 202, an updated plurality of implant positions 302, and/or an updated trajectory for each of the plurality of implants 300.
  • the update planning model 128 may be trained using historical surgical plans and/or historical patient positions. In other embodiments, the update planning model 128 may be trained using the surgical plan 132 and/or the patient position 408. In such embodiments, the update planning model 128 may be trained prior to inputting the surgical plan 132 and/or the patient position 408 into the update planning model 128 or may be trained in parallel with inputting the surgical plan 132 and/or the patient position 408 into the update planning model 128.
  • Turning to Fig. 4B, an example of a model architecture 416 that supports methods and systems (e.g., Artificial Intelligence (AI)-based methods and/or systems) for generating the surgical plan 132 during a surgical procedure is provided.
  • the model architecture 416 is similar to the model architecture 400 except that the planning model 124 is executed to generate the surgical plan 132 during or prior to a start of the surgical procedure using an actual position of the patient (among other inputs).
  • the patient may be positioned for a surgical procedure and the actual or current position of the patient 418 may be obtained by, for example, one or more imaging devices such as the imaging devices 112.
  • the actual or current patient position 418 may include a position of soft tissue on the patient upon which the planned incision(s) are to be placed.
  • the imaging device used to obtain a position of the soft tissue (e.g., the patient’s skin) may be, for example, one or more LiDAR cameras.
  • the patient position may be obtained from a combination of two image data sets.
  • a first set of image data from an X-ray imaging device may include, for example, a position of hard tissue (e.g., bone) and a second set of image data from two or more LiDAR cameras may include, for example, a position of the soft tissue (e.g., the patient’s skin) of the patient.
  • the first set of image data and the second set of image data can be used to update a patient’s position and an orientation and depth of the patient’s skin at, for example, a target skin incision site.
  • the actual or current patient position 418 and one or more inputs 404 may be used by a processor such as the processor 104 as input for the planning model 124.
  • the planning model 124 may output the surgical plan 132.
  • the one or more inputs 404 may include, for example, surgeon inputs (e.g., desired type of surgery (open or MIS), desired incision position, etc.), a plurality of desired implant positions and/or orientations, a desired incision (e.g., a minimally invasive surgical (MIS) incision, a single MIS incision, a single MIS regression line), available surgical instruments (e.g., a tubular retractor, a port, etc.), and/or information about the available surgical instruments (e.g., size, shape, etc.).
  • the planning model 124 - using the one or more inputs 404 and the actual or current patient position 418 - can generate, for example, one or more steps of the surgical plan 132, information about at least one planned incision, information about a rod and rod insertion, and/or a position and/or a trajectory for each of the plurality of implants.
  • the surgical plan 132 can be used to perform the surgical procedure 424. It will be appreciated that in some embodiments, the surgical plan 132 can be updated using, for example, the update planning model 128 at any time during the surgical procedure.
  • Fig. 5 depicts a method 500 that may be used, for example, for generating one or more models.
  • the method 500 comprises generating a model (step 504).
  • the model may be the planning model 124 and/or the update planning model 128.
  • a processor such as the processor 104 may generate the model.
  • the model may be generated to facilitate and enable, for example, generating a surgical plan such as the surgical plan 132 and/or updating the surgical plan 132 to output an updated surgical plan such as the updated surgical plan 412.
  • the method 500 also comprises training the model (step 508).
  • the model may be trained using historical data from a number of patients and/or inputs such as, for example, historical image data, historical surgical plans, etc.
  • the historical data may be obtained from patients that have similar patient data to a patient on which a surgical procedure is to be performed. In other embodiments, the historical data may be obtained from any patient.
  • the model may be trained in parallel with use of another model.
  • Training in parallel may, in some embodiments, comprise training a model using input received, for example, during or prior to a surgical procedure, while also using a separate model to receive and act upon the same input. Such input may be specific to a patient undergoing the surgical procedure.
  • when the model being trained exceeds the model in use (whether in efficiency, accuracy, or otherwise), the model being trained may replace the model in use.
  • Such parallel training may be useful, for example, in situations where a model is continuously in use (for example, when an input (such as, for example, an image) is continuously updated) and a corresponding model may be trained in parallel for further improvements.
  • the model trained using historical data may be initially used as a primary model at a start of a surgical procedure.
  • a training model may also be trained in parallel with the primary model using patient-specific input until the training model is sufficiently trained.
  • the primary model may then be replaced by the training model.
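  • A minimal sketch of this primary/trainee hand-off follows; the predict/train_step interface and the quality metric are hypothetical, not the disclosed training procedure:

```python
def run_with_parallel_training(primary, trainee, input_stream, metric):
    """Use the primary model on each input while training a second model on
    the same (e.g., patient-specific) input; swap the two once the trainee
    scores higher on the chosen quality metric."""
    for sample in input_stream:
        output = primary.predict(sample)        # model in use acts on the input
        trainee.train_step(sample)              # trainee learns from the same input
        if metric(trainee) > metric(primary):   # trainee now exceeds the primary
            primary, trainee = trainee, primary
        yield output
```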
  • the method 500 also comprises storing the model (step 512).
  • the model may be stored in memory such as the memory 106 and/or a database such as the database 130 for later use.
  • the model is stored in the memory when the model is sufficiently trained.
  • the model may be sufficiently trained when the model produces an output that meets a predetermined threshold, which may be determined by, for example, a user, or may be automatically determined by a processor such as the processor 104.
  • the present disclosure encompasses embodiments of the method 500 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • Fig. 6 depicts a method 600 that may be used, for example, for generating and updating a surgical plan.
  • the method 600 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
  • a processor other than any processor described herein may also be used to execute the method 600.
  • the at least one processor may perform the method 600 by executing elements stored in a memory such as the memory 106.
  • the elements stored in memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 600.
  • One or more portions of the method 600 may be performed by the processor executing any of the contents of memory, such as the image processing 120, the registration 122, the planning model 124, and/or the update planning model 128.
  • the method 600 comprises generating a surgical plan (step 604).
  • the surgical plan may be the same as or similar to the surgical plan 132.
  • the surgical plan may be generated by, for example, a processor such as the processor 104 inputting one or more inputs such as the one or more inputs 404 into a planning model such as the planning model 124.
  • the planning model 124 may generate the surgical plan and may store the surgical plan in memory such as the memory 106, a database such as the database 130, and/or a cloud such as the cloud 134.
  • the one or more inputs may include, for example, surgeon inputs (e.g., desired type of surgery (open or MIS), desired incision position, etc.), available surgical instruments, a 3D scan of the patient (e.g., a CT scan, an MRI scan, etc.), a plurality of implant positions and/or orientations, a desired incision (e.g., a minimally invasive surgical (MIS) incision, a single MIS incision, a single MIS regression line), whether a port will be used, and/or information about the port (e.g., size, shape, etc.).
  • the 3D scan of the one or more inputs may be obtained from an imaging device such as the imaging device 112, an imaging device of a navigation system such as the navigation system 118, a memory such as the memory 106, a database such as the database 130, a cloud such as the cloud 134, or any other imaging device or component of a system such as the system 100.
  • the planning model - using the one or more inputs - can generate, for example, one or more steps of the surgical plan, which may include information about at least one planned incision such as the planned incision(s) 200, information about a rod and rod insertion, a rod entrance point, and/or a position and/or a trajectory for each of the plurality of implants; a minimal sketch of such plan generation appears below.
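  • For illustration, the sketch below shows one assumed way to package the listed inputs and pass them to a planning model. The PlanningInputs fields and the model's predict() interface are hypothetical names for this sketch, not an API defined in the disclosure.

```python
# Hypothetical sketch only: PlanningInputs fields and the model's
# predict() interface are assumed for illustration.
from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class PlanningInputs:
    surgery_type: str                 # e.g., "open" or "MIS"
    desired_incision: str             # e.g., "single MIS incision"
    instruments: list                 # available surgical instruments
    scan_3d: Any                      # CT or MRI volume of the patient
    implant_poses: list               # implant positions and/or orientations
    port_info: Optional[dict] = None  # port size/shape, or None if unused

def generate_surgical_plan(planning_model, inputs: PlanningInputs) -> dict:
    """Feed the one or more inputs to the planning model and return a plan
    with planned incision(s), rod information, and implant trajectories."""
    raw = planning_model.predict(inputs)
    return {
        "incisions": raw["incisions"],               # planned incision(s)
        "rod": raw["rod"],                           # rod entrance point and insertion info
        "implant_trajectories": raw["trajectories"], # position/trajectory per implant
    }
```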
  • the method 600 also comprises receiving the surgical plan (step 608).
  • the surgical plan may be received from, for example, the memory, the database, the cloud, and/or any component of the system.
  • the surgical plan may also include (in addition to the one or more steps, information about at least one planned incision, etc.) information about a patient, image data of the patient (e.g., CT scan(s), MRI scan(s), 3D scan(s), etc.), planned surgical instruments and/or tools available, or the like.
  • the method 600 also comprises obtaining a current patient position (step 612).
  • the patient position may be the same as the patient position 418 and may be obtained during or prior to a start of a surgical procedure by, for example, the imaging device and/or by an imaging device of the navigation system.
  • the patient position may include a position of soft tissue on the patient upon which the planned incision(s) are to be placed.
  • the imaging device may be one or more LiDAR cameras.
  • the image data obtained from the imaging device may be processed by the processor using an image processing such as the image processing 120.
  • the patient position may be obtained from reference markers placed on the patient and tracked by the navigation system.
  • the patient position may be obtained from a combination of two image data sets.
  • a first set of image data from an X-ray imaging device may include, for example, a position of hard tissue (e.g., bone).
  • a second set of image data from two or more LiDAR cameras and/or an imaging device of the navigation system may include, for example, a position of the soft tissue (e.g., skin) of the patient.
  • the first set of image data and the second set of image data can be used to create or update a patient’s position and an orientation and depth of the patient’s skin at, for example, a target skin incision site; one way such a combination might be computed is sketched below.
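  • As a hedged illustration only, the sketch below fits a local plane to LiDAR skin points near the target site to estimate the skin's orientation, then measures the depth to an X-ray-derived hard-tissue target along that normal. The plane-fit approach and the 15 mm neighborhood radius are assumptions, not a method prescribed by the disclosure.

```python
# Hypothetical sketch only: a local plane fit is one assumed way to combine
# hard-tissue (X-ray) and soft-tissue (LiDAR) data at an incision site.
import numpy as np

def skin_pose_at_site(skin_points, bone_target, site, radius_mm=15.0):
    """Estimate the skin's orientation (normal) and the skin-to-bone depth
    at a target skin incision site from LiDAR skin points (N x 3, mm) and
    an X-ray-derived hard-tissue target point (3,)."""
    near = skin_points[np.linalg.norm(skin_points - site, axis=1) < radius_mm]
    centroid = near.mean(axis=0)
    # The singular vector with the smallest singular value of the centered
    # points approximates the local skin-surface normal.
    _, _, vt = np.linalg.svd(near - centroid)
    normal = vt[-1] / np.linalg.norm(vt[-1])
    # Depth of the bone target along the (sign-ambiguous) surface normal.
    depth = abs(float(np.dot(bone_target - centroid, normal)))
    return normal, depth
```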
  • the image processing may enable the processor to process the image so as to determine an actual position of the patient and more specifically, at least a portion of the patient on which one or more incisions are planned to be placed.
  • the patient position obtained from the image processing may enable the processor to update one or more parameters and/or steps of the surgical plan.
  • the method 600 also comprises registering the surgical plan and the patient position (step 616).
  • the registering may include the processor using a registration such as the registration 122 to register the surgical plan and the patient position.
  • the registration enables the processor to correlate an image with another image.
  • the registration may enable the processor to also correlate identified anatomical elements and/or individual objects in one image with identified anatomical elements and/or individual objects in another image.
  • the registration may enable information about the anatomical elements and the individual objects to be obtained and measured. For example, a potential incision site on the patient in a first image (e.g., a preoperative image) and the potential incision site on the patient in a second image (e.g., an intraoperative image) can be determined by the processor.
  • the image data set(s) (whether the first set of image data, the second set of image data, and/or a combination of the first and second sets of image data) corresponding to the patient position may be registered with, for example, a 3D scan (e.g., MRI scan or CT scan) of the patient obtained from the surgical plan.
  • the 3D scan may be obtained preoperatively, and the image data corresponding to the patient position may be obtained prior to a start of the surgical procedure or intraoperatively; a sketch of one possible rigid registration between the two appears below.
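  • By way of illustration, the sketch below computes a standard rigid (Kabsch/SVD) alignment between intraoperative patient-position points and corresponding points in the preoperative 3D scan; the assumption that point correspondences are already known (e.g., from tracked reference markers or surface matching) is made for this sketch and is not specified by the disclosure.

```python
# Hypothetical sketch only: rigid (Kabsch/SVD) registration assuming known
# point correspondences between intraoperative and preoperative data.
import numpy as np

def register_rigid(intraop_pts, preop_pts):
    """Return rotation R and translation t mapping intraoperative points
    (N x 3) onto corresponding preoperative scan points (N x 3)."""
    ci, cp = intraop_pts.mean(axis=0), preop_pts.mean(axis=0)
    H = (intraop_pts - ci).T @ (preop_pts - cp)  # 3 x 3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cp - R @ ci
    return R, t
```

  • The Kabsch solution above is simply the standard closed form for the rigid case; a deformable registration would require a different formulation.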
  • the method 600 also comprises reviewing at least a portion of the surgical plan on the patient (step 620).
  • in some instances, the portion of the surgical plan may be projected onto the patient; in other instances, the portion of the surgical plan may be displayed on a 3D model, 3D scan, or 2D image of the patient on a user interface such as the user interface 110.
  • the portion of the surgical plan that may be displayed or projected may include, for example, the planned incision(s) and/or the positions of the plurality of implants.
  • a surgeon can provide additional input to, for example, an update planning model such as the update planning model 128 to optimize and/or update the planned incision(s).
  • the surgeon may adjust the planned incision(s) based on the displayed or projected planned incision(s) and/or positions of the plurality of implants.
  • the method 600 also comprises updating at least a portion of the surgical plan (step 624).
  • the surgical plan, or at least a portion of the surgical plan, may be updated by the processor inputting the surgical plan (received in, for example, the step 608) and the patient position (obtained in, for example, the step 612) into an update planning model such as the update planning model 128.
  • the update planning model may then output an updated surgical plan which may include, for example, at least an updated portion or an updated parameter of the surgical plan.
  • the updated surgical plan may include, for example, an updated rod entrance point, an updated rod trajectory, updated information about the incision such as the updated incision(s) 202, an updated plurality of implant positions such as the updated plurality of implant positions 302, and/or an updated trajectory for each of the plurality of implants such as an updated trajectory 300 for each of the plurality of implants; a sketch of one way the implant entrance points might be averaged to a single line appears below.
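  • As an illustrative sketch only, the code below fits a least-squares 3D line through the implant entrance points (playing the role of the single MIS regression line) and tests whether every entrance point lies within an assumed skin-elasticity tolerance of that line. The elasticity test is a stand-in for a skin elasticity factor and is not a formula disclosed herein.

```python
# Hypothetical sketch only: averaging implant entrance points to a single
# (regression) line, with an assumed skin-elasticity tolerance.
import numpy as np

def entrance_line(entry_points):
    """Fit a least-squares 3D line through the implant entrance points
    (N x 3); returns a point on the line and a unit direction."""
    centroid = entry_points.mean(axis=0)
    _, _, vt = np.linalg.svd(entry_points - centroid)
    return centroid, vt[0]  # first principal direction

def single_incision_feasible(entry_points, elasticity_mm):
    """Feasible for a single incision if every entrance point lies within
    the distance the skin can be stretched (a skin elasticity factor)."""
    c, d = entrance_line(entry_points)
    v = entry_points - c
    perp = v - np.outer(v @ d, d)  # components perpendicular to the line
    return bool(np.all(np.linalg.norm(perp, axis=1) <= elasticity_mm))
```

  • In such a sketch, the elasticity tolerance would be chosen clinically; a larger tolerance allows more widely spread entrance points to share a single skin incision.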
  • in some embodiments, at least a portion of the surgical plan may be updated by a user such as, for example, a surgeon or other medical provider.
  • the surgical plan may be updated automatically by the processor using input from the user.
  • the method 600 also comprises reviewing the updated surgical plan (step 628).
  • the updated surgical plan may be reviewed by a user such as, for example, a surgeon or other medical provider.
  • the updated surgical plan may need approval from the user prior to implementation of the updated surgical plan.
  • reviewing the updated surgical plan includes projecting updated incision(s), updated implant position(s), and/or an updated rod entry point and/or position onto the patient.
  • reviewing the updated surgical plan includes displaying the updated incision(s), updated implant position(s), and/or an updated rod entry point and/or position on a 3D model, 3D scan, or 2D images of the patient on the user interface.
  • the method 600 may not include the step 628. In such instances, the updated surgical plan may be automatically implemented.
  • steps of the method 600 may be repeated throughout a surgical procedure or throughout a surgical step.
  • any step of the method 600 may be automatically executed or may need approval by a user.
  • the steps 612-624 may be automatically executed after a surgical step, or may be executed based on input from the user; a sketch of such a repeat-and-approve loop appears below.
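  • For illustration only, the sketch below repeats steps 612-624 after each surgical step, with an optional user-approval gate corresponding to step 628. The step objects and callables are assumed names used to show the control flow, not interfaces defined in the disclosure.

```python
# Hypothetical sketch only: repeating steps 612-624 after each surgical
# step, with an optional approval gate (step 628) before the update applies.
def run_update_loop(plan, surgical_steps, get_patient_position,
                    register, update_model, require_approval, ask_user):
    for step in surgical_steps:
        step.execute(plan)
        position = get_patient_position()              # step 612
        register(plan, position)                       # step 616
        updated = update_model.update(plan, position)  # step 624
        # Step 628 may be omitted; the updated plan is then implemented
        # automatically without user review.
        if (not require_approval) or ask_user(updated):
            plan = updated
    return plan
```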
  • the present disclosure encompasses embodiments of the method 600 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • Fig. 7 depicts a method 700 that may be used, for example, for generating a surgical plan during a surgical procedure.
  • the method 700 (and/or one or more steps thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
  • a processor other than any processor described herein may also be used to execute the method 700.
  • the at least one processor may perform the method 700 by executing elements stored in a memory such as the memory 106.
  • the elements stored in memory and executed by the processor may cause the processor to execute one or more steps of a function as shown in method 700.
  • One or more portions of the method 700 may be performed by the processor executing any of the contents of memory, such as an image processing 120, a registration 122, a planning model 124, and/or an update planning model 128.
  • the method 700 comprises receiving an actual or current patient position (step 704).
  • the patient position may be the same as the patient position 418 and may be obtained during or prior to a start of a surgical procedure by, for example, an imaging device such as the imaging device 112 and/or by an imaging device of a navigation system such as the navigation system 118.
  • the patient position may include a position of soft tissue on the patient upon which the planned incision(s) are to be placed.
  • the imaging device may be one or more LiDAR cameras.
  • the image data obtained from the imaging device may be processed by the processor using an image processing such as the image processing 120.
  • the patient position may be obtained from reference markers placed on the patient and tracked by the navigation system.
  • the patient position may be obtained from a combination of two image data sets.
  • a first set of image data from an X-ray imaging device may include, for example, a position of hard tissue (e.g., bone).
  • a second set of image data from two or more LiDAR cameras and/or an imaging device of the navigation system may include, for example, a position of the soft tissue (e.g., skin) of the patient.
  • the first set of image data and the second set of image data can be used to create or update a patient’s position and an orientation and depth of the patient’s skin at, for example, a target skin incision site.
  • the image processing may enable the processor to process the image so as to determine an actual position of the patient and more specifically, at least a portion of the patient on which one or more incisions are planned to be placed.
  • the patient position obtained from the image processing may enable the processor to generate one or more parameters and/or steps of the surgical plan.
  • the method 700 also comprises receiving one or more inputs (step 708).
  • the one or more inputs may be the same as or similar to the one or more inputs 404 and may include, for example, surgeon inputs (e.g., desired type of surgery (open or MIS), desired incision position, etc.), available surgical instruments (e.g., tubular retractors, ports, etc.), a plurality of desired implant positions and/or orientations, a desired incision (e.g., a minimally invasive surgical (MIS) incision, a single MIS incision, a single MIS regression line), whether a port will be used, and/or information about the port (e.g., size, shape, etc.).
  • the one or more inputs may be received and/or stored in a memory such as the memory 106, a database such as the database 130, and/or a cloud such as the cloud 134.
  • the method 700 also comprises generating a surgical plan (step 712).
  • the surgical plan may be the same as or similar to the surgical plan 132.
  • the surgical plan may be generated by, for example, a processor such as the processor 104 inputting the actual or current patient position (received in, for example, the step 704) and the one or more inputs (received in, for example, the step 708) into a planning model such as the planning model 124.
  • the planning model 124 may generate the surgical plan and may store the surgical plan in memory such as the memory 106, a database such as the database 130, and/or a cloud such as the cloud 134. It will be appreciated that the step(s) of the surgical plan and/or the parameters of the surgical plan may be constrained by the actual patient position and/or the input(s) and/or parameters specified for the surgical procedure; one way such a constraint check might look is sketched below.
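  • As a hedged illustration only, the sketch below constrains generated plan parameters by the actual patient position, flagging trajectories whose skin entry point falls too far from the observed skin surface for replanning. The trajectory dictionary layout, the entry_point key, and the 5 mm tolerance are assumptions for this sketch.

```python
# Hypothetical sketch only: one assumed way to constrain generated plan
# parameters by the actual (observed) patient position.
import numpy as np

def constrain_plan(trajectories, skin_points, tol_mm=5.0):
    """Split trajectories into (feasible, infeasible) by the distance of
    each entry point from the observed skin surface (N x 3 points, mm)."""
    feasible, infeasible = [], []
    for traj in trajectories:
        entry = np.asarray(traj["entry_point"], dtype=float)
        dist = float(np.linalg.norm(skin_points - entry, axis=1).min())
        (feasible if dist <= tol_mm else infeasible).append(traj)
    return feasible, infeasible
```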
  • the method 700 also comprises reviewing the surgical plan (step 716).
  • the surgical plan may be reviewed by a user such as, for example, a surgeon or other medical provider.
  • the surgical plan may need approval from the user prior to implementation of the surgical plan.
  • reviewing the surgical plan includes projecting the planned incision(s), planned implant position(s), and/or a planned rod entry point and/or position onto the patient.
  • reviewing the surgical plan includes displaying the planned incision(s), planned implant position(s), and/or a planned rod entry point and/or position on a 3D model, 3D scan, or 2D images of the patient on the user interface.
  • the present disclosure encompasses embodiments of the method 700 that comprise more or fewer steps than those described above, and/or one or more steps that are different than the steps described above.
  • the present disclosure encompasses methods with fewer than all of the steps identified in Figs. 5, 6, and 7 (and the corresponding description of the methods 500, 600, and 700), as well as methods that include additional steps beyond those identified in Figs. 5, 6, and 7 (and the corresponding description of the methods 500, 600, and 700).
  • the present disclosure also encompasses methods that comprise one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or comprise a registration or any other correlation.
  • Example 1 A system for updating a surgical plan comprising: a processor; and a memory storing data for processing by the processor, the data, when processed, causes the processor to: receive a surgical plan comprising a plurality of implant positions and information about an incision; obtain a patient position; and update the surgical plan based on the patient position and the plurality of implant positions.
  • Example 2 The system of claim 1, wherein the memory stores further data for processing by the processor that, when processed, causes the processor to register the patient position to the surgical plan.
  • Example 3 The system of claim 1, wherein updating the surgical plan includes updating one or more implant positions of the plurality of implant positions.
  • Example 4 The system of claim 1, wherein updating the surgical plan includes optimizing the plurality of implant positions to enable a single incision.
  • Example 5 The system of claim 1, wherein updating the surgical plan includes averaging a corresponding implant entrance point of each of the plurality of implant positions to a single line.
  • Example 6 The system of claim 1, wherein updating the surgical plan is based on a skin elasticity factor.
  • Example 7 The system of claim 1, wherein updating the surgical plan includes updating the information about the incision.
  • Example 8 The system of claim 1, further comprising at least one imaging device configured to obtain the patient position.
  • Example 9 The system of claim 8, wherein the at least one imaging device comprises at least one LiDAR camera.
  • Example 10 A system for updating a surgical plan comprising: a processor; and a memory storing data for processing by the processor, the data, when processed, causes the processor to: receive a surgical plan comprising a plurality of implant positions and information about a skin incision; obtain a current patient position; and update the surgical plan based on the current patient position and the plurality of implant positions, wherein updating the surgical plan includes updating the information about the skin incision.
  • Example 11 The system of claim 10, wherein the memory stores further data for processing by the processor that, when processed, causes the processor to register the current patient position to the surgical plan.
  • Example 12 The system of claim 10, wherein updating the surgical plan includes updating one or more implant positions of the plurality of implant positions.
  • Example 13 The system of claim 10, wherein updating the surgical plan includes optimizing the plurality of implant positions to enable a single skin incision.
  • Example 14 The system of claim 10, wherein updating the surgical plan includes averaging a corresponding implant entrance point of each of the plurality of implant positions to a single line.
  • Example 15 The system of claim 10, wherein updating the surgical plan is based on a skin elasticity factor.
  • Example 16 The system of claim 10, wherein updating the surgical plan is based on at least one of whether a port is being used, symmetry of the plurality of implant positions, an entrance point of one or more implants, or reducing a number and size of incisions.
  • Example 17 The system of claim 10, further comprising at least one imaging device configured to provide the current patient position.
  • Example 18 The system of claim 17, wherein the at least one imaging device comprises at least one LiDAR camera.
  • Example 19 A system for updating a surgical plan comprising: at least one imaging device; a processor; and a memory storing data for processing by the processor, the data, when processed, causes the processor to: receive a surgical plan comprising a plurality of implant positions and information about a skin incision; obtain a current patient position from the at least one imaging device; and update the information about the skin incision based on the current patient position and the plurality of implant positions.
  • Example 20 The system of claim 19, wherein the at least one imaging device comprises at least one LiDAR camera.

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Robotics (AREA)
  • Urology & Nephrology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

Systems and methods for generating and updating a surgical plan are disclosed. A surgical plan comprising a plurality of implant positions and information about an incision may be received. A patient position may also be received or obtained. The surgical plan, including the information about the incision, may be updated based on the patient position and the plurality of implant positions.
PCT/IL2024/050465 2023-05-15 2024-05-15 Systems and methods for generating and updating a surgical plan Pending WO2024236563A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202480032085.3A 2023-05-15 2024-05-15 Systems and methods for generating and updating a surgical plan

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363466593P 2023-05-15 2023-05-15
US63/466,593 2023-05-15

Publications (1)

Publication Number Publication Date
WO2024236563A1 (fr)

Family

ID=91585340

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2024/050465 Systems and methods for generating and updating a surgical plan Pending

Country Status (2)

Country Link
CN (1) CN121127193A (fr)
WO (1) WO2024236563A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060015030A1 (en) * 2002-08-26 2006-01-19 Orthosoft Inc. Method for placing multiple implants during a surgery using a computer aided surgery system
US20190076195A1 (en) * 2015-11-11 2019-03-14 Think Surgical, Inc. Articulating laser incision indication system
US20220192701A1 (en) * 2020-12-21 2022-06-23 Mazor Robotics Ltd. Systems and methods for surgical port positioning
US20220241032A1 (en) * 2021-02-01 2022-08-04 Mazor Robotics Ltd. Multi-arm robotic systems and methods for identifying a target
US20230056596A1 (en) * 2021-08-12 2023-02-23 Smith & Nephew, Inc. System and method for implant surface matching for joint revision surgery

Also Published As

Publication number Publication date
CN121127193A (zh) 2025-12-12

Similar Documents

Publication Publication Date Title
US20230135286A1 (en) Systems, devices, and methods for tracking one or more surgical landmarks
US20250152262A1 (en) Path planning based on work volume mapping
US20250152261A1 (en) Systems and methods for registering one or more anatomical elements
US20250325379A1 (en) Systems and methods for training and using an implant plan evaluation model
US12067653B2 (en) Systems, methods, and devices for generating a corrected image
US20230240753A1 (en) Systems and methods for tracking movement of an anatomical element
WO2024236563A1 (fr) Systems and methods for generating and updating a surgical plan
US12004821B2 (en) Systems, methods, and devices for generating a hybrid image
US11847809B2 (en) Systems, devices, and methods for identifying and locating a region of interest
US20230278209A1 (en) Systems and methods for controlling a robotic arm
US20250009464A1 (en) Systems and methods for generating a corrected image
WO2025120637A1 (fr) Systems and methods for planning and updating trajectories for imaging devices
WO2024180545A1 (fr) Systems and methods for registering a target anatomical element
WO2025109596A1 (fr) Systems and methods for registration using one or more markers
WO2025120636A1 (fr) Systems and methods for determining movement of one or more anatomical elements
WO2025186761A1 (fr) Systems and methods for determining a position of an object relative to an imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24734146

Country of ref document: EP

Kind code of ref document: A1