
WO2025019758A1 - System for automated surface contour matching for surgical navigation - Google Patents


Info

Publication number
WO2025019758A1
WO2025019758A1 (PCT/US2024/038708)
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
data
sensor data
patch
array
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/038708
Other languages
French (fr)
Inventor
Rafi Avitsian
Pablo RECINOS
Sean NAGEL
Efstathios KONDYLIS
Stephen Jones
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cleveland Clinic Foundation
Original Assignee
Cleveland Clinic Foundation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cleveland Clinic Foundation filed Critical Cleveland Clinic Foundation
Publication of WO2025019758A1
Legal status: Pending


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/683 Means for maintaining contact with the body
    • A61B 5/6832 Means for maintaining contact with the body using adhesives
    • A61B 5/6833 Adhesive patches
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/102 Modelling of surgical devices, implants or prosthesis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2061 Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365 Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B 2090/3983 Reference marker arrangements for use with image guided surgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B 2090/3991 Markers, e.g. radio-opaque or breast lesions markers having specific anchoring means to fixate the marker to the tissue, e.g. hooks
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/50 Supports for surgical instruments, e.g. articulated arms
    • A61B 2090/502 Headgear, e.g. helmet, spectacles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B 2562/04 Arrangements of multiple sensors of the same type
    • A61B 2562/046 Arrangements of multiple sensors of the same type in a matrix array
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B 2562/16 Details of sensor housings or probes; Details of structural supports for sensors
    • A61B 2562/164 Details of sensor housings or probes; Details of structural supports for sensors the sensor is mounted in or on a conformable substrate or carrier
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/06 Measuring instruments not otherwise provided for

Definitions

  • Intraoperative navigation is used in surgical procedures to accurately identify surgical field anatomy. Such navigation can be straightforward in open procedures where visual access to the surgical site is (or can become) available to the naked eye. In contrast, during other procedures where an operation is conducted from outside the patient’s body via tools inserted through one or more small incisions, accurate navigation can be more challenging.
  • In one aspect, a device includes a flexible substrate configured to conform to a contour of a surface.
  • The device also includes a sensor array associated with the flexible substrate.
  • The sensor array has one or more sensors, each sensor configured to measure an amount of deformation of the sensor.
  • The device includes a controller configured to collect sensor signals from the one or more sensors of the sensor array and to output sensor data indicative of a deformation of the flexible substrate.
  • a system for intraoperative navigation includes a processor coupled to a memory storing computer-executable instructions.
  • The instructions, when executed by the processor, configure the processor to acquire sensor data from a patch device having a flexible substrate applied to a portion of a body of a patient, the sensor data indicative of a deformation of the flexible substrate.
  • the instructions further configure the processor to generate a surface contour of the patch device based on the sensor data.
  • the surface contour corresponds to a surface contour of the portion of the body of the patient on which the patch device is applied.
  • the instructions configure the processor to register the surface contour to image data of the patient.
  • the system includes a contouring reference array having a sensor array coupled with a flexible material configured to conform to a contour of a surface, the sensor array having one or more sensors, each sensor configured to measure deformation of the flexible material.
  • the system also includes a computing device communicatively coupled with the contouring reference array.
  • the computing device includes a processor configured to: receive sensor data from the contouring reference array, the contouring reference array being applied to a body of a patient; match a contour measured by the contouring reference array corresponding to a surface contour of a portion of the body of the patient on which the contouring reference array is applied to image data of the patient; and output registration information between the contour of the contouring reference array and the image data to an intraoperative tracking system.
  • the intraoperative tracking system is configured to track at least a reference on the contouring reference array.
  • the intraoperative tracking system further configured to register tracking information associated with the contouring reference array with at least one of the image data or the contour of the contouring reference array.
  • FIG. 1 illustrates a schematic diagram of an exemplary, non-limiting implementation of an intraoperative navigation system according to various aspects.
  • Fig. 2 illustrates a schematic diagram of an exemplary, non-limiting implementation of a patch device for the intraoperative navigation system in accordance with one or more aspects.
  • Fig. 3 illustrates a schematic diagram of an exemplary, non-limiting implementation of a patch device for the intraoperative navigation system in accordance with one or more aspects.
  • Fig. 4 illustrates a flow chart of an exemplary, non-limiting implementation for registering a surface contour measured by a patch device with image data according to various aspects.
  • Figs. 5 and 6 illustrate a schematic diagram and flow chart of the exemplary, non-limiting implementation of Fig. 4.
  • Fig. 7 illustrates a flow chart of an exemplary, non-limiting implementation for determining a transform for a patch device and image data according to various aspects.
  • Fig. 8 illustrates a schematic diagram and flow chart of the exemplary, non-limiting implementation of Fig. 7.
  • Fig. 9 is an exemplary, non-limiting illustration of a patch device applied to a patient according to one or more aspects.
  • Figs. 10 and 11 illustrate exemplary, non-limiting implementations of patch devices according to various aspects.
  • Fig. 12 illustrates an exemplary, non-limiting implementation of a patch device in accordance with an aspect.
  • Fig. 13 illustrates a schematic diagram of a non-limiting example of a surface contour based on image data according to one or more aspects.
  • Fig. 14 illustrates a schematic diagram of a non-limiting example of a surface contour based on a patch device according to one or more aspects.
  • Fig. 15 illustrates an exemplary, non-limiting registration of surface contours of Figs 13 and 14 according to various aspects.
  • Fig. 16 illustrates an exemplary, non-limiting implementation of an intraoperative navigation system in accordance with one or more aspects.
  • FIG. 17 illustrates an exemplary, non-limiting implementation of an intraoperative navigation system in accordance with one or more aspects.
  • Figs. 18 and 19 illustrate an exemplary, non-limiting implementation of a patch device according to various aspects.
  • Fig. 20 illustrates an exemplary, non-limiting implementation of a patch device according to various aspects.
  • Figs. 21-25 illustrate various exemplary, non-limiting implementations of patch devices in accordance with one or more aspects.
  • intraoperative navigation may be challenging for minimally invasive procedures (e.g. image-guided procedures, laparoscopic procedures, etc.).
  • one such neurosurgical procedure places a ventricular drain within the brain through the skull in order to drain cerebrospinal fluid and relieve intracranial pressure.
  • stereotactic navigation equipment can be used during this procedure (as well as other, similar operations) to visualize the internal anatomy and guide insertion and placement of the drain catheter within the brain ventricle.
  • stereotactic navigation equipment is expensive and bulky, and using it can be tedious.
  • One available stereotactic navigation system utilizes expensive surgical reference markers that must be placed at various locations on the patient (e.g., on the head) to be scanned for mapping the surgical site.
  • a surface-mounted transducer device can be affixed to the outer surface of the patient (i.e. on the patient’s skin). The device generates signals corresponding to the contour of that surface. The resulting measured surface contour then can be compared to a preprocedure imaging study (such as an MRI or CT scan) and registered with a corresponding surface region from the scan having the corresponding contour.
  • a surgical procedure can be guided based on the positional and orientational relationships between the patient’s underlying anatomy and the registered surface contour.
  • surgical instruments can be visually synchronized to the location and orientation of the patch, and thereby the patient’s underlying anatomy, in a virtual environment, which can be displayed on a display screen or as an augmented-reality image.
  • the surface-mounted transducer device (or patch device) is described herein as an improvement on stereotactic navigation systems, it is to be appreciated that the device herein may be used in other applications. For instance, navigation or planning of radiosurgery procedures can be improved using the device herein. Further, it is to be appreciated that while examples herein describe a device affixed to an outer surface of the patient, the device may include anchors to facilitate attachment to bone.
  • the term “register” refers to the process of determining a transform between two objects to bring those objects into alignment or correspondence.
  • registering first image data to second image data may mean determining a transform for the image data such that like features of the first image data correspond to like features of the second image data.
  • The objects may be said to be "matched" or "correlated".
  • The term "match" refers to identifying like features in different data sets, or the state in which like features have been identified in different data sets.
  • system 100 may be utilized to guide a medical procedure.
  • system 100 includes a patch device 110, which, in some implementations, is a device having a flexible material or substrate and a sensor array with one or more sensors.
  • The patch device 110 is configured to be applied to a portion of a patient’s body. More particularly, the patch device 110 is applied to a surface of the patient’s body with, for example, an adhesive. When applied, the flexible material or substrate of the patch device 110 enables the device 110 to conform to an underlying surface contour of the patient.
  • the one or more sensors of the sensor array output signals indicative of an amount of deformation (e.g. stretch, deflection, bend, rotation, etc.) of the respective sensor.
  • When the patch device 110 is in a predetermined neutral position (e.g. flat), the sensors output predetermined signals.
  • the patch device 110 deflects from the neutral position. Different portions of the patch device 110 experience different degrees of deformation to conform to the underlying surface. Accordingly, the signals from the sensors indicate this deformation.
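By way of a brief, non-limiting illustration (not part of the disclosure), the conversion from raw sensor signals to deformation values relative to a flat neutral baseline might be sketched as follows; the ADC counts, calibration constant, and function name are assumptions for illustration only:

```python
import numpy as np

# Illustrative sketch only. NEUTRAL_BASELINE and COUNTS_PER_DEGREE are
# assumed calibration values, not taken from the disclosure.
NEUTRAL_BASELINE = np.array([512, 512, 512, 512])  # ADC counts when flat
COUNTS_PER_DEGREE = 4.0                            # assumed sensitivity

def deformation_degrees(raw_counts):
    """Map raw readings from each sensor of the array to a bend angle
    (degrees) relative to the predetermined neutral (flat) position."""
    raw = np.asarray(raw_counts, dtype=float)
    return (raw - NEUTRAL_BASELINE) / COUNTS_PER_DEGREE
```

A flat patch thus yields all-zero deformation, while a conforming patch reports per-sensor bend angles proportional to each sensor's deviation from its neutral signal.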
  • the signals from the sensors of the patch device 110 may be output to a computing device 120.
  • Computing device 120 may be a controller, a microcontroller, a system-on-a-chip (SoC), a computer processor, a mobile device, a server computer, a laptop, a desktop computer, or substantially any other computer system having a processor and configured to execute computer-executable instructions for carrying out various features described herein.
  • Computing device 120 may be any device having a processor coupled to a memory (e.g., computer memory, such as a device or system that is used to store information for use in a computer or related computer hardware and digital electronic devices, including short- and long-term memory, temporary and permanent memory, and the like).
  • the memory stores executable instructions for a software application.
  • The software application, when executed by the processor, acquires sensor data from the patch device 110 and evaluates the data to determine a surface contour of the patient at the location on which the patch device 110 is applied.
  • system 100 may include an imaging system 130.
  • The imaging system 130 may include an imaging device (e.g. a medical imaging device such as a computed tomography (CT) device, a magnetic resonance imaging (MRI) device, a nuclear device such as a positron emission tomography (PET) device or a single-photon emission computed tomography (SPECT) device, an x-ray device, or another imaging device suitable to acquire medical images for medical diagnostic and/or navigation applications).
  • In some examples, the imaging system 130 may include a picture archiving and communication system (PACS).
  • the computing device 120 acquires image data of the patient from the imaging system 130.
  • the image data may be pre-operative images acquired prior to a medical procedure.
  • the image data may be intra-operative image data.
  • the computing device 120 registers the image data with the surface contour determined based on sensor data from the patch device 110. Once registered, a correspondence between the patch device 110 and underlying anatomy from the image data is generated. The correspondence may be maintained and updated if the patient moves. For instance, new sensor data is streamed and the registration is (re-)computed. With the correspondence, a medical procedure can be conducted utilizing navigation information derived from the patch device 110 matched with the image data (i.e. anatomy).
  • tracking system 140 may be an optical tracking system configured to detect, locate, and track objects in an environment.
  • the objects may include markers or references to enable tracking.
  • The tracking system 140 may be an electromagnetic tracking system, a LIDAR tracking system, a laser time-of-flight-based tracking system, or the like.
  • Objects tracked may include medical instruments, including patch device 110 and tools.
  • tracking system 140 may include an augmented reality (AR) device such as a wearable AR device (e.g. headset, AR glasses, etc.).
  • The tracking system 140 may include multiple wearable AR devices worn by several users. The multiple wearable AR devices may jointly provide tracking and coordinate with each other.
  • Unlike conventional tracking systems, the tracking system 140 may be a non-stationary system.
  • With the patch device 110 being trackable with the tracking system 140, a further correspondence can be generated between the location and shape of the patch device 110, the image data (e.g. anatomy), and the patient and/or patch device 110 as tracked by the tracking system 140.
  • While Fig. 1 depicts system 100 having one patch device 110, system 100 may include a second patch device or a plurality of patch devices.
  • the second patch device (or several additional patch devices) may be placed at a different location to provide navigation assistance at multiple sites.
  • multiple patch devices may be placed at separate, but proximal, locations.
  • the sensor data from the multiple patch devices can be combined (e.g. processed together) to improve reliability of and/or reduce errors with the registration.
  • the patch device 110 may be utilized to augment standard registration procedures.
  • The patch device 110 may operate to orient the tracking system 140 or other optical tracker. This augmentation improves the speed and accuracy of standard registration procedures.
  • image data from imaging system 130 may not yet be available.
  • the patch device 110 may be placed on the patient prior to imaging.
  • the surface contour measured by the patch device 110 is generated and compared to image data from the imaging system 130 once available.
  • This immediate registration would enable the patient to be taken directly for a procedure (e.g. an external ventricular drain, etc.).
  • Patch device 110 may include a sensor array 210 having one or more sensors.
  • the sensors of the sensor array 210 may be flex sensors, bend sensors, stretch sensors, or substantially any type of sensor configured to measure a degree of deflection of the sensor.
  • the one or more sensors of the sensor array 210 may be arranged to have a known correspondence.
  • the sensors of the sensor array 210 may be positioned to provide relative constraints between sensors to improve analysis of sensor signals.
  • sensors may be placed tip-to-tail, orthogonally, transversely, or the like in a known or predetermined arrangement.
  • sensor signals are interpreted based on physical constraints of the sensors, the patch device 110, and/or the patient.
  • the sensors of the sensor array 210 may be capacitive, resistive, etc., such that an output signal thereof measures and is indicative of an amount of deflection.
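Because the disclosure leaves the transduction details open, one hedged sketch of a resistive readout (a voltage divider with an assumed linear resistance-vs-angle model; all component values and names are hypothetical) is:

```python
# Hypothetical voltage-divider readout for a resistive flex sensor.
# All values below are assumed for illustration, not from the disclosure.
V_SUPPLY = 3.3           # supply voltage (volts)
R_FIXED = 47_000.0       # fixed divider resistor (ohms)
R_FLAT = 25_000.0        # sensor resistance at zero bend (ohms)
OHMS_PER_DEGREE = 500.0  # assumed linear sensitivity

def bend_angle_from_voltage(v_out):
    """Invert V_out = V_SUPPLY * R_sensor / (R_sensor + R_FIXED),
    then map the recovered resistance to a bend angle in degrees."""
    r_sensor = R_FIXED * v_out / (V_SUPPLY - v_out)
    return (r_sensor - R_FLAT) / OHMS_PER_DEGREE
```

A capacitive sensor would follow the same pattern, with a capacitance-to-angle model in place of the resistance model.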
  • the signal(s) may be output to a controller 220, which may be a computer processor, a microcontroller, an SoC, etc.
  • the controller 220 may be on-board the patch device 110.
  • controller 220 may process the sensor signals from sensor array 210.
  • The processing may be low-level processing such as signal conditioning.
  • the processing may be more involved and include evaluating signals to provide sensor data indicative of the surface contour and/or output of the surface contour itself.
  • the signals and/or data may be output by controller 220 to a communication interface 230 for communication or transmission to computing device 120, for example.
  • Communication interface 230 can be a wired or wireless interface including, but not limited to, a WiFi interface, an Ethernet interface, a Bluetooth interface, a fiber optic interface, a cellular radio interface, a satellite interface, etc.
  • A component interface 310 is provided, which may be a wired or wireless connection between the sensor array 210 and a controller and communication module 320.
  • The controller and communication module 320 may be a separate device that is plugged into or otherwise coupled to the patch device 110 via the component interface 310.
  • Patch device 110 includes the sensor array 210 and an interface via which sensor signals can be communicated. Processing, conditioning, and/or communication of signals to other remote systems (e.g. computing device 120) is performed by the controller and communication module 320.
  • Patch device 110 may be a disposable part (e.g. single-use) while the controller and communication module 320 may be re-used.
  • FIG. 4 illustrates a flowchart of a method for registering a surface contour measured by a patch device with image data.
  • The method of Fig. 4, in some implementations, may be performed by system 100, computing device 120, and/or another computing device or controller associated with or in communication with system 100.
  • the method can begin at reference numeral 402 wherein sensor data from sensor signal(s) output from one or more sensors of a sensor array of a patch device are acquired.
  • the sensor data may be unified sensor data aggregated or otherwise packaged from all sensors of the sensor array.
  • the sensor data may include a set of sensor data from individual sensors of the sensor array.
  • Sensors of the sensor array may provide sensor data indicative of other sensors. For instance, in addition to data indicative of its own position and/or bend, a sensor may sense and report the positions and bends of other (e.g. nearby) sensors relative to itself.
  • A sensor may also process data from nearby sensors before sending the data onward for additional processing.
  • the sensor data is transmitted to a computing device for further processing.
  • The computing device, in some examples, may be separate from the patch device. It is to be appreciated that the transmission to the computing device may be raw sensor signals streamed from the sensor array. In other implementations, transmission to the computing device may be optional. In this case, on-board processing circuitry of the patch device performs the further steps of the method of Fig. 4.
  • geometry data for the patch device is determined based on the sensor data.
  • Geometry data may be displacement data or position data associated with sensors of the sensor array.
  • geometry data may include angle data indicating bend angle or rotation angles of sensors of the sensor array.
  • geometry data may include both displacement data and angle data.
  • point cloud data is generated based on the geometry data.
  • the point cloud data includes coordinates of a plurality of points.
  • the plurality of points define a surface contour of the patch device and, by extension, the surface contour of the surface on which the patch device is applied.
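As a non-limiting sketch of the geometry-data-to-point-cloud step for a single row of tip-to-tail sensors, per-segment bend angles can be integrated into coordinates. The segment length, the planar link model, and the function name are illustrative assumptions, not the patented method:

```python
import numpy as np

SEGMENT_LENGTH_MM = 10.0  # assumed physical length of each sensor segment

def row_point_cloud(bend_angles_deg):
    """Integrate per-segment bend angles into (x, z) point coordinates (mm).
    Each segment is treated as a straight link whose heading rotates by that
    segment's bend angle relative to the previous link."""
    points = [(0.0, 0.0)]
    heading = 0.0  # radians; 0 points along +x
    x, z = 0.0, 0.0
    for angle in bend_angles_deg:
        heading += np.radians(angle)
        x += SEGMENT_LENGTH_MM * np.cos(heading)
        z += SEGMENT_LENGTH_MM * np.sin(heading)
        points.append((x, z))
    return np.array(points)
```

Repeating this per row, and combining rows using the known sensor arrangement, would yield the three-dimensional point cloud describing the patch surface contour.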
  • the point cloud data is registered with image data of a patient. Once registered, a mapping transform or correspondence between the surface contour (e.g. patch device) and patient anatomy indicated in image data is generated. The registration of patch device and image data enables intraoperative navigation and/or guidance for a medical procedure.
  • the data collection and registration process may be repeated.
  • the process may be continuous or periodic according to a predetermined period.
  • A repetition may be triggered by some event such as a potential loss of registration due to movement of the patient, a position change of the patch device, or substantially any other event that may change the measured surface contour and/or its alignment to anatomy.
  • the patch device may also be referred to as a contouring reference array. That is, the patch device includes an array that provides reference data or points indicative of a contour on which the device is placed.
  • the array may be a sensor array as described herein.
  • The array may be an array of markers or references that may be identified and tracked by a tracking system (e.g. tracking system 140, an AR device, a medical navigation system, or the like). Through analysis of the reference array as observed by the tracking system, the surface contour may be determined.
  • Figs. 5 and 6 illustrate a schematic diagram and flow chart of the exemplary, non-limiting implementation of Fig. 4.
  • different types of sensor data may be acquired or utilized based on a type of sensor utilized by a patch device 500.
  • displacement data may indicate a deflection of a position of a sensor and may be measured in a length measurement unit such as millimeters.
  • Flex data may indicate a bending of the sensor. The bend may be longitudinal (e.g. along a length of a sensor) or rotational (e.g. twist or roll). The bend may be measured in an angle measurement unit such as radians or degrees.
  • a decision is made regarding what input type is available or selected. This selection may be determined by a configuration of patch device 500 or may be a user-selectable option.
  • displacement data is acquired from sensors of patch device 500 at 502.
  • sensor signals may be collected from individual sensors.
  • the signals may be aggregated (e.g. unified) or maintained as separate, individual signals.
  • the signals may be conditioned with a low-pass filter 504 before transmission at 506.
  • flex data is acquired from sensors of patch device 500 at 514.
  • sensor signals may be collected from individual sensors.
  • the signals may be aggregated (e.g. unified) or maintained as separate, individual signals.
  • the signals may be conditioned with a low-pass filter 516 before transmission at 518.
  • displacement data and flex data is acquired from sensors of patch device 500 at 508.
  • sensor signals may be collected from individual sensors.
  • the signals may be aggregated (e.g. unified) or maintained as separate, individual signals.
  • the signals may be conditioned with a low-pass filter 510 before transmission at 512.
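The low-pass conditioning applied by filters 504, 510, and 516 is unspecified in the disclosure; a minimal sketch, assuming a first-order exponential moving average with an assumed smoothing constant, is:

```python
# Assumed first-order low-pass filter; the actual filter design and
# smoothing factor are not specified in the disclosure.
ALPHA = 0.2  # smoothing factor in (0, 1]; smaller values smooth more heavily

def low_pass(samples, alpha=ALPHA):
    """Smooth a stream of raw sensor samples before transmission."""
    filtered = []
    state = None
    for s in samples:
        state = s if state is None else alpha * s + (1 - alpha) * state
        filtered.append(state)
    return filtered
```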
  • the data from the patch device 500 may be received by a computing device 600.
  • deflection data indicating a deflection of positions of sensors is determined from the sensor data at 602.
  • the deflection data is utilized to calculate a point cloud at 604.
  • the point cloud provides coordinates (e.g. three-dimensional coordinates) for points of the patch device 500.
  • the measured point cloud 606 indicates a surface contour of the patch device 500 and, subsequently, a surface contour of a surface on which the patch device 500 is applied (e.g. a surface contour of a location on a patient).
  • angle data is determined at 616 based on the sensor data.
  • Angle data may include angles determined based on a bend relative to a known sensor length and/or rotation data determined based on patch geometry.
  • a point cloud is calculated at 618.
  • the point cloud 620 provides coordinates (e.g. three- dimensional coordinates) for points of the patch device 500.
  • the measured point cloud 620 indicates a surface contour of the patch device 500 and, subsequently, a surface contour of a surface on which the patch device 500 is applied (e.g. a surface contour of a location on a patient).
  • deflection data and angle data are both determined based on respective displacement data and flex data at 608.
  • a point cloud is calculated at 610 and the combined data is matched at 612 to provide a measured point cloud 614.
  • the measured point cloud is registered with image data at 622. More particularly, registration 622 may branch based on whether a priori information is available. When a priori information is available, correspondence-based registration is performed at 624. When a priori information is not available, a best-fit registration is performed at 626.
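For the correspondence-based branch at 624, one well-known closed-form option is the Kabsch/Procrustes solution sketched below. The choice of this particular algorithm is our illustration, not something the description specifies; the best-fit branch at 626 would typically wrap such a step in an iterative closest-point loop that re-estimates correspondences.

```python
import numpy as np

def kabsch_register(measured: np.ndarray, reference: np.ndarray):
    """Best rigid transform (R, t) minimizing ||R @ p + t - q|| over
    corresponding rows p, q of two N x 3 point arrays. Closed-form
    solution via SVD of the 3x3 cross-covariance matrix."""
    mu_m = measured.mean(axis=0)
    mu_r = reference.mean(axis=0)
    H = (measured - mu_m).T @ (reference - mu_r)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_r - R @ mu_m
    return R, t
```

The reflection guard (the determinant sign) keeps the result a proper rotation rather than a mirror image, which matters when the measured contour is nearly planar.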
  • Fig. 7 illustrates a flow chart of an exemplary, non-limiting implementation for determining a transform for a patch device and image data.
  • the method can begin at reference numeral 702 where sensor data from sensor signal(s) output from one or more sensors of a sensor array of a patch device are acquired.
  • the sensor data may be unified sensor data aggregated or otherwise packaged from all sensors of the sensor array.
  • the sensor data may include a set of sensor data from individual sensors of the sensor array.
  • the sensor data is transmitted to a computing device for further processing.
  • the computing device, in some examples, may be separate from the patch device. It is to be appreciated that transmission to the computing device may be raw sensor signals streamed from the sensor array.
  • a transform is determined based on image data and a patch simulation model.
  • a patch device may be virtually applied to a location derived from the image data and predicted sensor data is generated.
  • the predicted sensor data indicates sensor data expected, based on the model, from the patch device at the location.
  • Locations may be iteratively evaluated to find a best match to measured sensor data. For example, a location corresponding to a minimum error may be identified.
  • the transform can be determined that brings the patch device and image data into correspondence.
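The iterative evaluation described above might be sketched as a simple minimum-error search; the `predict` callable below stands in for the patch simulation model, whose details are not given here, and the placement representation is deliberately opaque:

```python
def best_location(candidates, measured, predict):
    """Evaluate candidate placements of a virtual patch and return the
    one whose model-predicted sensor data best matches measurement.

    candidates : iterable of placement hypotheses (opaque here)
    measured   : sequence of measured sensor values
    predict    : callable mapping a placement to predicted sensor
                 values (stands in for the patch simulation model)
    """
    best, best_err = None, float("inf")
    for loc in candidates:
        predicted = predict(loc)
        # sum-of-squares error between predicted and measured readings
        err = sum((p - m) ** 2 for p, m in zip(predicted, measured))
        if err < best_err:               # keep the minimum-error placement
            best, best_err = loc, err
    return best, best_err
```

The winning placement directly yields the transform bringing patch and image data into correspondence; a practical implementation would refine this exhaustive search with a local optimizer.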
  • Fig. 8 depicts a schematic diagram and flow chart of the method of Fig. 7.
  • the flow of Fig. 5 may occur as described above.
  • a transform is determined based on a model at 802, 806, and 812 to generate transforms 804, 810, and 824, respectively.
  • computing device 800 may further match combined data at 808 to generate transform 810.
  • the transducer is a patch that can be in the form of a pliable film, sticker, etc. that can be conformed to the contour of the organ or surgical site where attached. More specifically, when the patch is applied to a multiplanar (semi) rigid surface of an object, the patch will conform to that surface and represent its underlying superficial contour without causing distortion (e.g., warping that would substantially reconfigure the superficial shape of the object).
  • a patch 902 can be placed on the patient’s head (e.g., external skin) in this example, and conformed to the shape of the underlying skull 902.
  • the patch may be placed at any location on the body, and even on/within internal organs that has/have been imaged pre-procedure, in order to register the patch with the underlying physiologic structures.
  • the patch may be manufactured in a predetermined shape. For example, if the patch is to be placed on the skin covering portions of the nasal bone and the supraorbital ridge, then the patch may be produced in an “L” shape to match the general contour of the known location without significant alterations/modifications.
  • the patch may alternatively be produced with a general shape (e.g., square, rectangle, circle, triangle, etc.) and subsequently modified/altered (e.g., via cutting) to fit in a particular location of interest.
  • the patch may include an adhesive backing to affix the patch to the desired location. Further, the patch may be moved from one location to another, for example during an operative procedure, if warranted.
  • the patch 1000 or 1100 includes a sensor array 1010, 1120 embedded therein or on its surface.
  • the individual sensors 1012, 1112 of the array 1010, 1110 may be arranged in a pattern such that the position (i.e., coordinate location) of each sensor on the patch is fixed and known relative to that of each other sensor.
  • the sensor array can include bend- or flex sensors as known in the art, which are effectively strain gauges that can detect a degree of bending deflection and generate an electrical signal based on the degree of deflection.
  • the electrical signals from all the flex sensors in the array can be sent to a computer, such as a microcomputer, which can calculate therefrom the three-dimensional contour of the overall patch.
  • the computer can construct a three-dimensional model of the patch in a virtual space corresponding to the real-world three-dimensional contour of the actual patch applied to the patient and conformed to her surface anatomy.
  • the flex sensors measure the deflection of the patch via mechanical deformation that is converted into electrical signal(s) that can be sent to the computer.
  • the flex sensors may be manufactured from conductive material (e.g., carbon or conductive polymers) that is embedded within or disposed atop a substrate (e.g., a non-conductive substrate, or one containing carbon particles and/or conductive material).
  • Such sensors may take the shape of an elongated strip that is bendable/flexible along its length. It is to be understood that the sensors may all have the same shape, or may have distinct shapes (e.g., differing in length, width, geometry, etc.).
  • the sensor array may comprise any number and distribution of sensors on the patch sufficient to provide a requisite degree of contour-mapping resolution for a given application, to ensure that the resulting virtual contour map generated by the computer can be accurately registered with the pre-procedure imaging study for intraoperative navigation.
  • the conductive material of each sensor is evenly distributed within the sensor so that the overall sensor yields a particular, predetermined baseline electrical resistance.
  • the resistance can be measured by connecting the sensor to an electrical circuit in communication with the computer mentioned above. Specifically, the circuit applies a known voltage or current across each sensor and measures the resulting voltage or current passing therethrough. When the sensor bends, the conductive material experiences a change in its physical orientation, which yields a corresponding change in its electrical resistance. By analyzing the changes in the measured resistance values and comparing them to the baseline resistance, the amount of deflection can be determined for each sensor. By summing the measured deflections within the sensor array, the three-dimensional conformation of the patch can be discerned.
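The measurement chain in this passage — apply a known voltage, read the output of a divider, and convert the resistance change into a bend angle — can be sketched as follows. The voltage-divider topology, the linear resistance-to-angle model, and the sensitivity constant are illustrative assumptions; real flex sensors require per-sensor calibration.

```python
def sensor_resistance(v_supply: float, v_out: float, r_fixed: float) -> float:
    """Flex-sensor resistance from a simple voltage divider in which
    v_out is read across the fixed resistor:
    v_out = v_supply * r_fixed / (r_fixed + r_sensor)."""
    return r_fixed * (v_supply - v_out) / v_out

def deflection_deg(r_measured: float, r_baseline: float,
                   degrees_per_ohm: float) -> float:
    """Bend angle from the change in resistance relative to the flat
    (baseline) value, using an assumed linear model."""
    return (r_measured - r_baseline) * degrees_per_ohm
```

For example, with a 5 V supply, a 10 kΩ fixed resistor, and 2.5 V measured across it, the sensor must itself be at 10 kΩ, i.e. at its (hypothetical) flat baseline.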
  • the above-discussed flex sensor is only one example of a conformational sensor that may be employed in the patch. It is contemplated that the sensors of the sensor array are all the same. Alternatively, the sensor array may collectively comprise distinct types of bend/flex sensors for determining an amount of deflection of the patch.
  • a patch 1200 further includes at least one optical reference mark 1202 provided at a known location.
  • the optical reference mark 1202 is identifiable by an optical tracking system, as will be discussed further below.
  • the reference mark 1202 can be provided on the top surface of the patch 1200 (optionally removably via magnetic attachment) such that the optical tracking system can have direct line-of-sight therewith.
  • the reference mark 1202 may be embedded within the patch 1200, such that the optical tracking system identifies and locates the reference mark 1202 via non line-of-sight means, for example via a passive RFID or NFC signal.
  • the patch is used during an operative procedure to establish a reference point for the navigation system.
  • a computer can execute a software algorithm to correlate a three-dimensional model of that patch in virtual space with a matching surface-contour region of the patient as previously acquired from the pre-procedure imaging study.
  • the software algorithm can be used to register the three-dimensional model of the patch to the matching contour of the patient’s anatomy in a virtual space, which can be visualized on a display - such as a computer screen or using an augmented-reality headset, for example.
  • this algorithm allows the underlying anatomy (e.g., the brain structure, including the target ventricle in a ventricular catheterization procedure) to be displayed and registered in relation to the patch.
  • the patch may include a resident (i.e., on-board) control unit (e.g., a printed circuit board) for controlling the sensor array and/or processing the information derived therefrom.
  • the control unit may be embedded within the patch and electrically connected to the sensor array.
  • the control unit may be disposed on the top surface of the patch, or even completely separate from the patch, such as integrated in an external computer executing the algorithm discussed above, with the patch connected thereto via a cable or via wireless communication (such as Bluetooth, wifi, NFC communication, etc.).
  • the information acquired from the sensor array may be streamed wirelessly (e.g., via Bluetooth) or via a wired connection to the external computer executing the algorithm or other device (e.g., phone, tablet, etc.).
  • the patch-resident control unit processing sensor data can establish a secure connection with the corresponding external computer or other device through encryption/authentication pairing using a link key.
  • the associated area of the patient (encompassing the surgical site) will be first imaged or scanned via conventional means (e.g., MRI or CT scans) to generate a (virtual) 3D reconstruction of that area.
  • Imaging software is then used to analyze the rendered reconstruction and determine known surface curvatures/geometries of the scanned area, in relation to the underlying anatomy.
  • Fig. 13 schematically illustrates such surface curvatures as ascertained via the imaging software.
  • the patch is fitted (e.g., affixed) to the constrained region (as shown in FIG. 14).
  • prior to or during the fitting, the sensor array can be activated.
  • the patch may include visual (e.g., lights), acoustic (speakers), and/or vibration indicators to indicate the status of the sensor array.
  • the sensor array measures the deflection of the patch at various positions.
  • the data from the sensor array is sent to a controller, and then optionally on to an external computer where software compares that information with the data previously acquired from the pre-operative imaging study to identify the exact placement of the patch on the patient (i.e., by matching known geometric contours from the scan with the measured deflection from the sensor array).
  • the patch is overlaid in a virtual space onto the virtual reconstruction from the preoperative imaging study to form a transformation (as shown in Fig. 15) so that a virtual representation of the patch is placed in registry with the geometry of the underlying patient surface in the virtual space.
  • the resulting virtual reconstruction now depicts the patch at its actual location on the patient in a virtual space.
  • An image of this virtual space can be outputted to a display that accurately depicts the patch in spatial relation to the surgical site, including all of the anatomy acquired from the pre-operative imaging study.
  • because the sensor array provides continuous measurements that are simultaneously compared to the known values from that imaging study, the resulting virtual reconstruction (and thus the outputted image) can be both autoregistering and updated in real-time to account for any movements of the patient or potential repositioning of the patch on the patient’s outer surface.
  • an optical tracking system can be synchronized to the one or more optical reference marks on the patch, whose location(s) also is/are predetermined and fixed relative to the sensor array.
  • the optical tracking system may be included in a wearable sensor (e.g., an augmented-reality headset, glasses, etc.).
  • the peripheral sensor may be a stand-alone sensor that is not worn.
  • the peripheral sensor also can detect surgical instruments and other implements, either via use of a camera or by detecting dedicated optical markers present on those instruments.
  • the surgical instruments may be detected through signals (optical, electromagnetic, etc.), which are sent and/or received relative to the patch.
  • Models of those instrument(s) then also can be rendered in the virtual environment and displayed on the display (e.g. display screen, augmented-reality headset, etc.) so that the position, orientation and advancement thereof within the patient’s anatomy all can be visualized on the display in real-time.
  • the resulting (virtual) reconstruction can be overlaid or virtually displayed such that the operator has visual access to the surgical site, and to the position and orientation of her tools therein, even though that site is inside the patient and obstructed from view.
  • various tools to be used during the operation can include sensors and/or reference markers such that those tools can be linked to the optical tracking system.
  • the operative algorithm can overlay renderings of particular tools or instruments (e.g., a wand or needle) on the virtual reconstruction, wherein the position of the wand is shown in real-time via sensed data and/or measuring distances of different references on the tool/instrument.
  • the transducer can be a sterile drape or covering, or incorporated in a sterile drape or covering, that performs a similar function to the above-noted patch.
  • the patch covers a smaller portion of the target anatomy, whereas the drape can cover a larger surface. Large surface coverage may be desirable in cases where the surface contour is relatively constant, so that a greater surface area will be useful to ensure sufficient differentiation between sensor sites in the associated sensor array, to ensure proper body-contour registration.
  • the drape example likewise includes a sensor array to determine the contour of the target anatomy where it is placed/affixed.
  • the sensor array for the drape differs from the patch sensor array in that units (i.e., each sensor) of the array can be connected end-to-end (i.e., no spacing therebetween) across the entire (or a specific) area of the drape, each unit’s position fixed in relation to all other sensors to determine the geometry of the target anatomy as explained above.
  • the drape self-registers to the contour of the surface on which it is applied and conformed through the aforementioned software algorithm, in the manner explained above.
  • the resulting (virtual) reconstruction can then be rendered in a virtual space and synchronized to one or more optical markers also on the drape, in order to achieve intraoperative navigation as described above.
  • the drape may have some units in the described array be electromagnetic sensors 1910 and some electromagnetic emitters 1920. Embedded electromagnetic reflectors/markers in the surgical tools will interact with the emitted electromagnetic field from the drape emitters, and reflect the emitted signals received by each individual electromagnetic unit in the drape. These reflected signals will then be detected by the electromagnetic sensors integrated in the drape. Based on each electromagnetic sensor’s relative strength of signal received, and since each unit’s relative position is predetermined (as described above), the surgical tools’ relative position to the drape can be established.
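As one rough illustration of combining relative signal strengths with the known unit positions, a strength-weighted estimate is sketched below; the weighting scheme is our assumption, and a practical system would use a calibrated signal-propagation model rather than a bare weighted centroid:

```python
def weighted_position(sensor_positions, strengths):
    """Crude tool-position estimate: a signal-strength-weighted
    centroid of the known (x, y, z) sensor positions. A stronger
    received signal implies a closer sensor, so it pulls the estimate
    harder toward that sensor's known location."""
    total = sum(strengths)
    return tuple(
        sum(w * p[i] for w, p in zip(strengths, sensor_positions)) / total
        for i in range(3)
    )
```

With equal strengths the estimate is simply the midpoint of the sensors; skewing the strengths pulls it toward the stronger-receiving unit.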
  • the transducer includes a plurality of fingers all extending from a common origin at known, fixed angles relative to one another.
  • Each finger includes a knuckle disposed intermediate a proximal portion that extends from the origin, and a distal portion extending from the knuckle and defining a fingertip at its distal end.
  • the distal portion of each finger is pivotable relative to the proximal portion thereof at the associated knuckle.
  • a flex sensor is provided at each knuckle to determine the angle of bending of the distal portion relative to the proximal portion.
  • the respective lengths of the proximal and distal portions of each finger are known and fixed. Accordingly, by determining the angle between the proximal and distal portions (via the flex sensor) of each finger, a coordinate location of each fingertip can be calculated relative to all the other fingertips in the transducer. In this manner, the transducer can generate a ‘point cloud’ having a plurality of (in the illustrated embodiment, five) points whose coordinate locations define a contour that can be registered to a surface of the patient in a virtual space similarly as explained above. Moreover, as in other embodiments, the finger-transducer also can include one or more optical markings that facilitate detection thereof using an optical tracking system that also facilitates integration of surgical tools into a virtual image of the surgical site for real-time intraoperative navigation.
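The fingertip calculation described above might be sketched as follows, under the simplifying assumptions (ours, not the description's) that the proximal segment lies in the base plane at a fixed azimuth from the origin and that the distal segment pivots out of that plane at the knuckle:

```python
import math

def fingertip(azimuth: float, l_prox: float, l_dist: float,
              knuckle_angle: float):
    """3D fingertip coordinate for one finger of the transducer.

    azimuth       : fixed angle of the finger in the base plane (radians)
    l_prox/l_dist : known proximal and distal segment lengths
    knuckle_angle : measured bend of the distal segment out of the base
                    plane (radians), from the flex sensor at the knuckle
    """
    # radial reach along the finger's direction, and height of the tip
    reach = l_prox + l_dist * math.cos(knuckle_angle)
    z = l_dist * math.sin(knuckle_angle)
    return (reach * math.cos(azimuth), reach * math.sin(azimuth), z)
```

Evaluating this for each finger yields the point cloud of fingertip coordinates, all expressed relative to the common origin, ready for co-registration with the pre-procedure image.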
  • the base of the transducer encompassing the origin can be placed at a known location (e.g., placed at the nasion as shown above in Fig. 20) and affixed via an adhesive material or other suitable technique.
  • the distal portions of the fingers are manipulated (i.e., pivoted about their respective knuckles) until the distal fingertips all contact the surface of the body part being measured (e.g., forehead).
  • the coordinate location for each fingertip is determined relative to one another.
  • the relative coordinate locations are then triangulated to generate a virtual point cloud that can be co-registered with a previously obtained image as discussed above.
  • the determined coordinate locations from the covering are compared to various points of a previous scan/image in 1:1 relation to (virtually) reconstruct its placement in reference to surgical field anatomy.
  • Figs. 21-25 illustrate various alternative and exemplary implementations for a patch device as described herein.
  • Fig. 21 illustrates a patch device 2100 having a flexible material or substrate 2102 on which one or more flex sensors 2104 are positioned.
  • Device 2100 further includes housing 2106 for circuitry.
  • the circuitry may be control or communication circuitry.
  • Fig. 22 illustrates a patch device 2200 having a flexible material or substrate 2202 on which one or more bend sensors 2204 are positioned.
  • Device 2200 further includes housing 2206 for circuitry.
  • the circuitry may be control or communication circuitry.
  • Fig. 23 illustrates a patch device 2300 having a flexible material 2302, bend sensors 2304, and circuitry housing 2306.
  • the bend sensors 2304 are arranged in an offset manner compared to the bend sensors 2204 in Fig. 22.
  • Fig. 24 illustrates a patch device 2400 having a sensor array 2404 embedded on a flexible substrate 2402.
  • the sensor array 2404 may be a bend sensor configured according to a shape of flexible substrate 2402.
  • the device 2400 includes a connector 2406 to transmit sensor signals to a processing device.
  • Fig. 25 illustrates a patch device 2500.
  • Patch device 2500 includes a central hub 2502 from which bend sensors 2504 fan out.
  • the bend sensors 2504 are rotatably mounted to the hub 2502 such that the relative angles between sensors 2504 are adjustable to facilitate mounting to various positions on a patient.
  • the above-described transducers provide accurate intraoperative surgical navigation that can be deployed quickly and displayed (e.g., on a screen, tablet, or peripheral head-set) in real-time.
  • the surgical field, otherwise obstructed from view, can thus be synchronized to optical markers, and thereby also synchronized to virtual models of surgical tools and instruments used during an operative procedure.
  • This facilitates real-time surgical navigation via a computer display or in an augmented/virtual reality environment, to provide the operator with the ability to ‘see’ where she is going rather than to infer from anatomical landmarks.
  • the transducers and the associated optical tracking systems and/or peripherals are relatively small, and thus can be transported to and deployed in emergency settings readily.
  • because the transducer generates a virtual surface contour or point cloud that is compared via an algorithm to patient surface contours to achieve registration, registration is automatic and is not dependent on the patient position. Indeed, if desired during a procedure the transducer can be removed and re-positioned (e.g. to provide access to another incision site), whereupon the software algorithm will be able to again perform the necessary comparisons and adjust registration of the transducer to the patient’s anatomy in the virtual space until they are matched once again. This is a significant benefit compared to conventional systems, which require immobilizing the patient for extended periods.
  • the word “exemplary” is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as advantageous over other aspects or designs. Rather, use of the word exemplary is intended to present concepts in a concrete fashion.
  • the term “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances.
  • “at least one of A and B” and/or the like generally means A or B or both A and B.
  • the articles “a” and “an” as used in this application and the appended claims may generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form.

Abstract

A surface-mounted device is applied to a surface of a patient. The device includes one or more sensors that generate signals indicative of a contour of that surface. The resulting measured surface contour then can be registered with pre-procedure medical image data. Once registered, a surgical procedure can be guided based on the positional and orientational relationships between the patient's underlying anatomy and the registered surface contour measured by the device.

Description

SYSTEM FOR AUTOMATED SURFACE CONTOUR MATCHING FOR SURGICAL NAVIGATION
BACKGROUND
[0001] Intraoperative navigation is used in surgical procedures to accurately identify surgical field anatomy. Such navigation can be straightforward in open procedures where visual access to the surgical site is (or can become) available to the naked eye. In contrast, during other procedures where an operation is conducted from outside the patient’s body via tools inserted through one or more small incisions, accurate navigation can be more challenging.
SUMMARY
[0002] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key factors or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
[0003] In one aspect, a device is provided. The device includes a flexible substrate configured to conform to a contour of a surface. The device also includes a sensor array associated with the flexible substrate. The sensor array has one or more sensors, each sensor being configured to measure an amount of deformation of the sensor. In addition, the device includes a controller configured to collect sensor signals from the one or more sensors of the sensor array and output sensor data, the sensor data being indicative of a deformation of the flexible substrate.
[0004] In another aspect, a system for intraoperative navigation is provided. The system includes a processor coupled to a memory storing computer-executable instructions. The instructions, when executed by the processor, configure the processor to acquire sensor data from a patch device having a flexible substrate applied to a portion of a body of a patient, the sensor data indicative of a deformation of the flexible substrate. The instructions further configure the processor to generate a surface contour of the patch device based on the sensor data. The surface contour corresponds to a surface contour of the portion of the body of the patient on which the patch device is applied. In addition, the instructions configure the processor to register the surface contour to image data of the patient.
[0005] In yet another aspect, a system for intraoperative navigation is provided. The system includes a contouring reference array having a sensor array coupled with a flexible material configured to conform to a contour of a surface, the sensor array having one or more sensors, each sensor configured to measure deformation of the flexible material. The system also includes a computing device communicatively coupled with the contouring reference array. The computing device includes a processor configured to: receive sensor data from the contouring reference array, the contouring reference array being applied to a body of a patient; match a contour measured by the contouring reference array corresponding to a surface contour of a portion of the body of the patient on which the contouring reference array is applied to image data of the patient; and output registration information between the contour of the contouring reference array and the image data to an intraoperative tracking system. The intraoperative tracking system is configured to track at least a reference on the contouring reference array.
The intraoperative tracking system further configured to register tracking information associated with the contouring reference array with at least one of the image data or the contour of the contouring reference array.
[0006] To the accomplishment of the foregoing and related ends, the following description and annexed drawings set forth certain illustrative aspects and implementations. These are indicative of but a few of the various ways in which one or more aspects may be employed. Other aspects, advantages and novel features of the disclosure will become apparent from the following detailed description when considered in conjunction with the annexed drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] Various non-limiting embodiments are further described in the detailed description given below with reference the accompanying drawings, which are incorporated in and constitute a part of the specification.
[0008] Fig. 1 illustrates a schematic diagram of an exemplary, non-limiting implementation of an intraoperative navigation system according to various aspects.
[0009] Fig. 2 illustrates a schematic diagram of an exemplary, non-limiting implementation of a patch device for the intraoperative navigation system in accordance with one or more aspects.
[0010] Fig. 3 illustrates a schematic diagram of an exemplary, non-limiting implementation of a patch device for the intraoperative navigation system in accordance with one or more aspects.
[0011] Fig. 4 illustrates a flow chart of an exemplary, non-limiting implementation for registering a surface contour measured by a patch device with image data according to various aspects.
[0012] Figs. 5 and 6 illustrate a schematic diagram and flow chart of the exemplary, nonlimiting implementation of Fig. 4.
[0013] Fig. 7 illustrates a flow chart of an exemplary, non-limiting implementation for determining a transform for a patch device and image data according to various aspects.
[0014] Fig. 8 illustrates a schematic diagram and flow chart of the exemplary, non-limiting implementation of Fig. 7.
[0015] Fig. 9 is an exemplary, non-limiting illustration of a patch device applied to a patient according to one or more aspects.
[0016] Figs. 10 and 11 illustrate exemplary, non-limiting implementations of patch devices according to various aspects.
[0017] Fig. 12 illustrates an exemplary, non-limiting implementation of a patch device in accordance with an aspect.
[0018] Fig. 13 illustrates a schematic diagram of a non-limiting example of a surface contour based on image data according to one or more aspects.
[0019] Fig. 14 illustrates a schematic diagram of a non-limiting example of a surface contour based on a patch device according to one or more aspects.
[0020] Fig. 15 illustrates an exemplary, non-limiting registration of surface contours of Figs 13 and 14 according to various aspects.
[0021] Fig. 16 illustrates an exemplary, non-limiting implementation of an intraoperative navigation system in accordance with one or more aspects.
[0022] Fig. 17 illustrates an exemplary, non-limiting implementation of an intraoperative navigation system in accordance with one or more aspects.
[0023] Figs. 18 and 19 illustrate an exemplary, non-limiting implementation of a patch device according to various aspects.
[0024] Fig. 20 illustrates an exemplary, non-limiting implementation of a patch device according to various aspects.
[0025] Figs. 21-25 illustrate various exemplary, non-limiting implementations of patch devices in accordance with one or more aspects.
DETAILED DESCRIPTION
[0026] As described above, intraoperative navigation may be challenging for minimally invasive procedures (e.g. image-guided procedures, laparoscopic procedures, etc.). For example, one such neurosurgical procedure places a ventricular drain within the brain through the skull in order to drain cerebrospinal fluid and relieve intracranial pressure. Conventionally, stereotactic navigation equipment can be used during this procedure (as well as other, similar operations) to visualize the internal anatomy and guide insertion and placement of the drain catheter within the brain ventricle. Such equipment, however, is expensive and bulky, and using it can be tedious. One available stereotactic navigation system utilizes expensive surgical reference markers that must be placed at various locations on the patient (e.g., on the head) to be scanned for mapping the surgical site. Sometimes these reference markers are surgically implanted. Moreover, even when such techniques are used the patient is confined and immobile. Even minor movements or changes in position of the patient can result in loss of registration, and consequent loss of efficient navigation.
[0027] Further still, existing stereotactic navigation systems are not suited to emergency settings given their bulk and the time-consuming nature of their usage. For these reasons, most ventricular catheterizations are performed using anatomical landmarks to guide the catheter into the ventricle. The imprecision of this technique can introduce increased risks.
[0028] As described herein, a surface-mounted transducer device can be affixed to the outer surface of the patient (i.e. on the patient’s skin). The device generates signals corresponding to the contour of that surface. The resulting measured surface contour then can be compared to a preprocedure imaging study (such as an MRI or CT scan) and registered with a corresponding surface region from the scan having the corresponding contour. Once the transducer-measured contour is registered with the pre-procedure imaging contour, a surgical procedure can be guided based on the positional and orientational relationships between the patient’s underlying anatomy and the registered surface contour. Specifically, surgical instruments can be visually synchronized to the location and orientation of the patch, and thereby the patient’s underlying anatomy, in a virtual environment, which can be displayed on a display screen or as an augmented-reality image.
[0029] While the surface-mounted transducer device (or patch device) is described herein as an improvement on stereotactic navigation systems, it is to be appreciated that the device herein may be used in other applications. For instance, navigation or planning of radiosurgery procedures can be improved using the device herein. Further, it is to be appreciated that while examples herein describe a device affixed to an outer surface of the patient, the device may include anchors to facilitate attachment to bone.
[0030] The claimed subject matter is now described with reference to the drawings, wherein like reference numerals are generally used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the claimed subject matter. It may be evident, however, that the claimed subject matter may be practiced without these specific details. In other instances, structures and devices are shown in block diagram form in order to facilitate describing the claimed subject matter.
[0031] As generally utilized herein, the term “register” refers to the process of determining a transform between two objects to bring those objects into alignment or correspondence. For example, registering first image data to second image data may mean determining a transform for the image data such that like features of the first image data correspond to like features of the second image data. After transformation, the objects may be said to be “matched”, or “correlated”. More generally, the term “match” refers to identifying like features in different data sets or the state in which like features have been identified in different data sets.
[0032] Referring initially to Fig. 1, an exemplary, non-limiting implementation of an intraoperative navigation system 100 is illustrated. System 100 may be utilized to guide a medical procedure. As shown in Fig. 1, system 100 includes a patch device 110, which, in some implementations, is a device having a flexible material or substrate and a sensor array with one or more sensors. The patch device 110 is configured to be applied to a portion of a patient’s body. More particularly, the patch device 110 is applied to a surface of the patient’s body with, for example, an adhesive. When applied, the flexible material or substrate of the patch device 110 enables the device 110 to conform to an underlying surface contour of the patient.
[0033] The one or more sensors of the sensor array output signals indicative of an amount of deformation (e.g. stretch, deflection, bend, rotation, etc.) of the respective sensor. In a predetermined neutral position (e.g. flat), the sensors output predetermined signals. When applied to the patient, the patch device 110 deflects from the neutral position. Different portions of the patch device 110 experience different degrees of deformation to conform to the underlying surface. Accordingly, the signals from the sensors indicate this deformation.

[0034] In one aspect, the signals from the sensors of the patch device 110 may be output to a computing device 120. Computing device 120 may be a controller, a microcontroller, a system-on-a-chip (SoC), a computer processor, a mobile device, a server computer, a laptop, a desktop computer, or substantially any other computer system having a processor and configured to execute computer-executable instructions for carrying out various features described herein. Generally, computing device 120 may be any device having a processor coupled to a memory (e.g., computer memory, such as a device or system that is used to store information for use in a computer or related computer hardware and digital electronic devices, including short- and long-term memory, temporary and permanent memory, and the like). The memory stores executable instructions for a software application. The software application, when executed by the processor, acquires sensor data from the patch device 110 and evaluates the data to determine a surface contour of the patient at the location on which the patch device 110 is applied.
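The neutral-baseline comparison described in paragraph [0033] can be illustrated with a minimal sketch (not from the source): each sensor's deflection is taken as the signed change of its raw reading from a calibrated flat-pose baseline. The function name and ADC-count values are illustrative assumptions.

```python
# Hypothetical sketch: each sensor reports a raw value; the deflection is the
# change relative to a calibrated neutral (flat) baseline.

def compute_deflections(raw_readings, neutral_baseline):
    """Return per-sensor deflection as the signed change from the flat pose."""
    if len(raw_readings) != len(neutral_baseline):
        raise ValueError("reading/baseline length mismatch")
    return [r - b for r, b in zip(raw_readings, neutral_baseline)]

# Example: three sensors; sensor 1 bends noticeably, sensor 2 slightly.
baseline = [512, 512, 512]   # assumed ADC counts in the flat pose
readings = [512, 540, 518]
deflections = compute_deflections(readings, baseline)   # -> [0, 28, 6]
```

In a real device the baseline would be captured during a calibration step before the patch is applied, so that manufacturing variation between sensors cancels out.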
[0035] In an aspect, system 100 may include an imaging system 130. According to one or more examples, the imaging system 130 may include an imaging device (e.g. a medical imaging device such as a computed tomography (CT) device, a magnetic resonance imaging (MRI) device, a nuclear device such as a positron emission tomography (PET) device, a single-photon emission computed tomography (SPECT) device, an x-ray device, or other imaging device suitable to acquire medical images for medical diagnostic and/or navigation applications). The imaging system 130 may also include a picture archiving and communication system (PACS).
[0036] The computing device 120 acquires image data of the patient from the imaging system 130. The image data may be pre-operative images acquired prior to a medical procedure. In another example, the image data may be intra-operative image data. The computing device 120 registers the image data with the surface contour determined based on sensor data from the patch device 110. Once registered, a correspondence between the patch device 110 and underlying anatomy from the image data is generated. The correspondence may be maintained and updated if the patient moves. For instance, new sensor data is streamed and the registration is (re-)computed. With the correspondence, a medical procedure can be conducted utilizing navigation information derived from the patch device 110 matched with the image data (i.e. anatomy).
[0037] More particularly, in an aspect, registration information of the patch device 110 and image data may be provided to a tracking system 140. Tracking system 140 may be an optical tracking system configured to detect, locate, and track objects in an environment. The objects may include markers or references to enable tracking. In other examples, the tracking system 140 may be an electromagnetic tracking system, a LIDAR tracking system, a laser time-of-flight tracking system, or the like. Objects tracked may include medical instruments, including patch device 110 and tools. In another example, tracking system 140 may include an augmented reality (AR) device such as a wearable AR device (e.g. headset, AR glasses, etc.). Still further, the tracking system 140 may include multiple wearable AR devices worn by several users. The multiple wearable AR devices may jointly provide tracking and coordinate with each other. Accordingly, it is appreciated that the tracking system 140 may be a non-stationary system unlike conventional tracking systems. With the patch device 110 being trackable with the tracking system 140, a further correspondence can be generated between the location and shape of the patch device 110, the image data (e.g. anatomy), and the patient and/or patch device 110 as tracked by the tracking system 140. Thus, position and orientation of medical tools or other medical devices relative to anatomy is readily available and can be provided to medical professionals.
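The chained correspondences described above (registration of the patch to the image data, plus tracking of the patch and a tool by the tracking system) can be sketched as a composition of rigid transforms. This is a hypothetical illustration assuming 4x4 homogeneous matrices; the frame names and helper functions are not from the source.

```python
import numpy as np

# Hypothetical sketch: the registration gives the patch pose in the image
# (anatomy) frame; the tracking system gives both the patch and a tool in the
# tracker frame. The tool pose in anatomy coordinates is then a composition.

def tool_in_image_frame(T_image_patch, T_tracker_patch, T_tracker_tool):
    """All inputs are 4x4 homogeneous transforms (destination <- source)."""
    T_patch_tracker = np.linalg.inv(T_tracker_patch)
    return T_image_patch @ T_patch_tracker @ T_tracker_tool

def translation(tx, ty, tz):
    T = np.eye(4)
    T[:3, 3] = [tx, ty, tz]
    return T

# Toy example with pure translations (no rotation):
T_image_patch = translation(10, 0, 0)    # patch sits at x=10 in image space
T_tracker_patch = translation(0, 5, 0)   # patch seen at y=5 by the tracker
T_tracker_tool = translation(0, 5, 2)    # tool seen 2 units above the patch
T = tool_in_image_frame(T_image_patch, T_tracker_patch, T_tracker_tool)
# Tool position in image space: patch origin (10, 0, 0) plus offset (0, 0, 2)
```

Keeping every pose as a transform relative to a named frame makes it straightforward to re-derive tool positions whenever the patch registration is recomputed after patient movement.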
[0038] While Fig. 1 depicts system 100 having one patch device 110, it is to be appreciated that system 100, and other embodiments described herein, may include a second patch device or a plurality of patch devices. The second patch device (or several additional patch devices) may be placed at a different location to provide navigation assistance at multiple sites. In another example, multiple patch devices may be placed at separate, but proximal, locations. The sensor data from the multiple patch devices can be combined (e.g. processed together) to improve reliability of and/or reduce errors with the registration.
[0039] In another example, the patch device 110 may be utilized to augment standard registration procedures. For example, the patch device 110 may operate to orient the tracking system 140 or other optical tracker. This augmentation improves the speed and accuracy of standard registration procedures.
[0040] In another example, for instance in emergency scenarios, image data from imaging system 130 may not yet be available. In such scenarios, the patch device 110 may be placed on the patient prior to imaging. The surface contour measured by the patch device 110 is generated and compared to image data from the imaging system 130 once available. In the emergency scenario, this immediate registration would enable the patient to be taken directly for a procedure (e.g. such as an external ventricular drain, etc.). Accordingly, while examples herein describe the image data being previously generated, it is to be appreciated that the data generated by the patch device 110 and image data from imaging system 130 may be acquired contemporaneously, or in any order.
[0041] Turning to Fig. 2, illustrated is a schematic diagram of an exemplary, non-limiting implementation of patch device 110. Patch device 110 may include a sensor array 210 having one or more sensors. The sensors of the sensor array 210 may be flex sensors, bend sensors, stretch sensors, or substantially any type of sensor configured to measure a degree of deflection of the sensor. The one or more sensors of the sensor array 210 may be arranged to have a known correspondence. For instance, the sensors of the sensor array 210 may be positioned to provide relative constraints between sensors to improve analysis of sensor signals. For example, sensors may be placed tip-to-tail, orthogonally, transversely, or the like in a known or predetermined arrangement. Accordingly, sensor signals are interpreted based on physical constraints of the sensors, the patch device 110, and/or the patient. The sensors of the sensor array 210 may be capacitive, resistive, etc., such that an output signal thereof measures and is indicative of an amount of deflection. The signal(s) may be output to a controller 220, which may be a computer processor, a microcontroller, an SoC, etc.
[0042] In Fig. 2, the controller 220 may be on-board the patch device 110. In this example, controller 220 may process the sensor signals from sensor array 210. The processing may be low- level processing such as signal conditioning. In other examples, the processing may be more involved and include evaluating signals to provide sensor data indicative of the surface contour and/or output of the surface contour itself. The signals and/or data may be output by controller 220 to a communication interface 230 for communication or transmission to computing device 120, for example. Communication interface 230 can be a wired or wireless interface including, but not limited to, a WiFi interface, an Ethernet interface, a Bluetooth interface, a fiber optic interface, a cellular radio interface, a satellite interface, etc.
[0043] Turning to Fig. 3, illustrated is a schematic diagram of another exemplary, non-limiting implementation of patch device 110. In Fig. 3, a component interface 310 is provided, which may be a wired or wireless connection between sensor array 210 and controller and communication module 320. The controller and communication module 320 may be a separate device that may be plugged into or otherwise coupled to the patch device 110 via the component interface 310. In this implementation, patch device 110 includes the sensor array 210 and an interface via which sensor signals can be communicated. Processing, conditioning, and/or communication of signals to other remote systems (e.g., computing device 120) is performed by controller and communication module 320. According to this implementation, patch device 110 may be a disposable part (e.g., single use) while controller and communication module 320 may be re-used.
[0044] Turning to Fig. 4, various features and operations of the systems and techniques described herein are illustrated with an exemplary flowchart. The example in this figure is illustrative of some features of system 100, patch device 110, and/or computing device 120. Fig. 4 illustrates a flowchart of a method for registering a surface contour measured by a patch device with image data. The method of Fig. 4, in some implementations, may be performed by system 100, computing device 120, and/or other computing device or controller associated with or in communication with system 100.
[0045] The method can begin at reference numeral 402 wherein sensor data from sensor signal(s) output from one or more sensors of a sensor array of a patch device are acquired. The sensor data may be unified sensor data aggregated or otherwise packaged from all sensors of the sensor array. In another example, the sensor data may include a set of sensor data from individual sensors of the sensor array. In yet another example, sensors of the sensor array may provide sensor data further indicative of other sensors. For instance, while a sensor may provide sensor data indicative of its own position and/or bend, it may also sense and report the positions and bends of other, nearby sensors. Accordingly, a sensor can process data from a nearby sensor before sending data for additional processing. At 404, the sensor data is transmitted to a computing device for further processing. The computing device, in some examples, may be separate from the patch device. It is to be appreciated that the transmission to the computing device may be raw sensor signals streamed from the sensor array. In other implementations, transmission to the computing device may be optional. In this case, on-board processing circuitry of the patch device performs the further steps of the method of Fig. 4.
[0046] At 406, geometry data for the patch device is determined based on the sensor data. Geometry data may be displacement data or position data associated with sensors of the sensor array. In other examples, geometry data may include angle data indicating bend angle or rotation angles of sensors of the sensor array. Further, geometry data may include both displacement data and angle data. At 408, point cloud data is generated based on the geometry data. The point cloud data includes coordinates of a plurality of points. The plurality of points define a surface contour of the patch device and, by extension, the surface contour of the surface on which the patch device is applied. At 410, the point cloud data is registered with image data of a patient. Once registered, a mapping transform or correspondence between the surface contour (e.g. patch device) and patient anatomy indicated in image data is generated. The registration of patch device and image data enables intraoperative navigation and/or guidance for a medical procedure.
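One way the geometry-to-point-cloud conversion (steps 406-408) could work for a single strip of flex sensors is simple forward kinematics: accumulate the per-joint bend angles and step segment by segment. This is a hedged sketch, not the claimed implementation; the segment length and angle values are illustrative, and a real patch would combine many strips and their known relative constraints.

```python
import math

# Hypothetical sketch: turn per-joint bend angles along one sensor strip into
# point coordinates by walking segment to segment in the strip's bending plane.

def strip_points(segment_length, bend_angles_rad):
    """Forward kinematics along one strip in its bending plane (x-z).

    Each entry in bend_angles_rad is the incremental bend at a joint; the
    running sum gives the segment heading relative to the flat (x) axis.
    """
    x, z, heading = 0.0, 0.0, 0.0
    points = [(0.0, 0.0)]
    for angle in bend_angles_rad:
        heading += angle
        x += segment_length * math.cos(heading)
        z += segment_length * math.sin(heading)
        points.append((x, z))
    return points

# Flat strip: all angles zero -> points lie on the x axis.
flat = strip_points(10.0, [0.0, 0.0, 0.0])
# Uniform curvature: the strip curls upward out of the plane.
curled = strip_points(10.0, [math.pi / 6] * 3)
```

Repeating this for each strip in the array, in the array's known layout, yields the set of three-dimensional coordinates that forms the point cloud registered at step 410.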
[0047] As shown in Fig. 4, the data collection and registration process may be repeated. For example, the process may be continuous or periodic according to a predetermined period. In addition, a repetition may be triggered by some event such as a potential loss of registration due to movement of the patient, a position change of the patch device, or substantially any other event that may change the measured surface contour and/or the alignment to anatomy.
[0048] The patch device may also be referred to as a contouring reference array. That is, the patch device includes an array that provides reference data or points indicative of a contour on which the device is placed. For instance, in one implementation, the array may be a sensor array as described herein. In another implementation, the array may be an array of markers or references that may be identified and tracked by a tracking system (e.g., tracking system 140, an AR device, a medical navigation system, or the like). Through analysis of the reference array as observed by the tracking system, the surface contour may be determined.
[0049] Figs. 5 and 6 illustrate a schematic diagram and flow chart of the exemplary, non-limiting implementation of Fig. 4. In an implementation, different types of sensor data may be acquired or utilized based on a type of sensor utilized by a patch device 500. For example, displacement data may indicate a deflection of a position of a sensor and may be measured in a length measurement unit such as millimeters. Flex data may indicate a bending of the sensor. The bend may be longitudinal (e.g., along a length of a sensor) or rotational (e.g., twist or roll). The bend may be measured in an angle measurement unit such as radians or degrees. At 501, a decision is made regarding what input type is available or selected. This selection may be determined by a configuration of patch device 500 or may be a user-selectable option.
[0050] When displacement data only is selected, displacement data is acquired from sensors of patch device 500 at 502. For example, sensor signals may be collected from individual sensors. The signals may be aggregated (e.g. unified) or maintained as separate, individual signals. The signals may be conditioned with a low-pass filter 504 before transmission at 506.

[0051] When flex data only is selected, flex data is acquired from sensors of patch device 500 at 514. For example, sensor signals may be collected from individual sensors. The signals may be aggregated (e.g. unified) or maintained as separate, individual signals. The signals may be conditioned with a low-pass filter 516 before transmission at 518.
[0052] When both types of data are selected, displacement data and flex data are acquired from sensors of patch device 500 at 508. For example, sensor signals may be collected from individual sensors. The signals may be aggregated (e.g. unified) or maintained as separate, individual signals. The signals may be conditioned with a low-pass filter 510 before transmission at 512.
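For the low-pass conditioning common to the three branches above (504, 510, 516), a first-order exponential filter is one plausible software realization. This is an assumption for illustration; the actual filter order, cutoff, and whether it is implemented in analog or digital hardware are not specified in the source.

```python
# Hypothetical sketch of low-pass signal conditioning: a first-order IIR
# (exponential) filter that smooths sample-to-sample noise before transmission.

def low_pass(samples, alpha=0.2):
    """First-order IIR low-pass: y[n] = y[n-1] + alpha * (x[n] - y[n-1])."""
    if not samples:
        return []
    y = [float(samples[0])]
    for x in samples[1:]:
        y.append(y[-1] + alpha * (x - y[-1]))
    return y

# A noisy step input: the filtered signal approaches the new level gradually.
filtered = low_pass([0, 0, 100, 100, 100], alpha=0.5)   # -> [0, 0, 50, 75, 87.5]
```

A smaller `alpha` rejects more noise at the cost of slower response, a trade-off that matters when the registration must track patient movement in near real time.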
[0053] Turning to Fig. 6, the data from the patch device 500 may be received by a computing device 600. For displacement data, deflection data indicating a deflection of positions of sensors is determined from the sensor data at 602. The deflection data is utilized to calculate a point cloud at 604. The point cloud provides coordinates (e.g. three-dimensional coordinates) for points of the patch device 500. The measured point cloud 606 indicates a surface contour of the patch device 500 and, subsequently, a surface contour of a surface on which the patch device 500 is applied (e.g. a surface contour of a location on a patient).
[0054] For flex data, angle data is determined at 616 based on the sensor data. Angle data may include angles determined based on a bend relative to a known sensor length and/or rotation data determined based on patch geometry. Using the angles and rotations, a point cloud is calculated at 618. Like measured point cloud 606, the point cloud 620 provides coordinates (e.g. three-dimensional coordinates) for points of the patch device 500. The measured point cloud 620 indicates a surface contour of the patch device 500 and, subsequently, a surface contour of a surface on which the patch device 500 is applied (e.g. a surface contour of a location on a patient).
[0055] For both types of data, deflection data and angle data are both determined based on respective displacement data and flex data at 608. Using both deflection data and angle data, a point cloud is calculated at 610 and the combined data is matched at 612 to provide a measured point cloud 614. The measured point cloud, determined via any path, is registered with image data at 622. More particularly, registration 622 may branch based on whether a-priori information is available. When a-priori information is available, correspondence-based registration is performed at 624. When a-priori information is not available, a best-fit registration is performed at 626.
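The correspondence-based branch (624) admits a closed-form solution when each measured point is paired a-priori with a model point, commonly via the Kabsch/Procrustes method sketched below; the best-fit branch (626) without a-priori pairings would instead iterate candidate correspondences (e.g., ICP-style). The specific algorithm choice here is an illustrative assumption, not stated in the source.

```python
import numpy as np

# Hypothetical sketch of correspondence-based rigid registration (Kabsch):
# given paired point sets, solve for the rotation R and translation t that
# minimize sum ||R @ p_i + t - q_i||^2 in closed form via an SVD.

def rigid_register(src, dst):
    """Return R (3x3) and t (3,) aligning paired points src onto dst."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Toy check: a translated point set registers back onto the original.
model = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
measured = [(p[0] + 5, p[1] - 2, p[2]) for p in model]
R, t = rigid_register(measured, model)   # R ~ identity, t ~ (-5, 2, 0)
```

The best-fit path can reuse this same solver inside a loop: guess correspondences by nearest neighbor, solve, re-pair, and repeat until the error stops improving.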
[0056] Turning now to Fig. 7, illustrated is a flow chart of an exemplary, non-limiting implementation for determining a transform for a patch device and image data. The method can begin at reference numeral 702 where sensor data from sensor signal(s) output from one or more sensors of a sensor array of a patch device are acquired. The sensor data may be unified sensor data aggregated or otherwise packaged from all sensors of the sensor array. In another example, the sensor data may include a set of sensor data from individual sensors of the sensor array. At 704, the sensor data is transmitted to a computing device for further processing. The computing device, in some examples, may be separate from the patch device. It is to be appreciated that the transmission to the computing device may be raw sensor signals streamed from the sensor array. In other implementations, transmission to the computing device may be optional. In this case, on-board processing circuitry of the patch device performs the further steps of the method of Fig. 7.

[0057] At 706, a transform is determined based on image data and a patch simulation model. Using the patch simulation model, a patch device may be virtually applied to a location derived from the image data and predicted sensor data is generated. The predicted sensor data indicates sensor data expected, based on the model, from the patch device at the location. Locations may be iteratively evaluated to find a best match to measured sensor data. For example, a location corresponding to a minimum error may be identified. When the location corresponding to measured sensor data is identified, the transform can be determined that brings the patch device and image data into correspondence.
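The iterative location search at 706 can be sketched as minimizing a prediction error over candidate placements. In this sketch, `predict_readings` and the toy curvature model are stand-ins for the (unspecified) patch simulation model, and the candidate list stands in for locations derived from the image data.

```python
# Hypothetical sketch of simulate-and-match: predict the readings the patch
# would produce at each candidate location, then pick the minimum-error one.

def sum_sq_error(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def best_location(candidates, predict_readings, measured):
    """candidates: opaque location descriptors; predict_readings: the model."""
    return min(candidates,
               key=lambda loc: sum_sq_error(predict_readings(loc), measured))

# Toy model (assumption): each "location" is just a curvature level, and the
# predicted reading of every sensor is proportional to that curvature.
def toy_model(curvature):
    return [curvature] * 4

measured = [2.1, 1.9, 2.0, 2.2]
chosen = best_location([0, 1, 2, 3], toy_model, measured)   # -> 2
```

In practice the candidate set could be refined coarse-to-fine over the imaged surface rather than enumerated exhaustively, and the winning location fixes the transform relating patch and image data.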
[0058] Fig. 8 depicts a schematic diagram and flow chart of the method of Fig. 7. The flow of Fig. 5 may occur as described above. Then, depending on the data type, a transform is determined based on a model at 802, 806, and 812 to generate transforms 804, 810, and 824, respectively. When both types of data are utilized, computing device 800 may further match combined data at 808 to generate transform 810.
[0059] With reference to Figs. 9-12, one example of a surface-mounted transducer for intraoperative navigation is shown. In the depicted example, the transducer is a patch that can be in the form of a pliable film, sticker, etc. that can be conformed to the contour of the organ or surgical site where attached. More specifically, when the patch is applied to a multiplanar (semi) rigid surface of an object, the patch will conform to that surface and represent its underlying superficial contour without causing distortion (e.g., warping that would substantially reconfigure the superficial shape of the object).
[0060] As shown in Fig. 9, a patch 902 can be placed on the patient’s head (e.g., external skin) in this example, and conformed to the shape of the underlying skull. The patch may be placed at any location on the body, and even on/within internal organs that has/have been pre-procedure imaged in order to register the patch with the underlying physiologic structures. The patch may be manufactured in a predetermined shape. For example, if the patch is to be placed on the skin covering portions of the nasal bone and the supraorbital ridge, then the patch may be produced in an “L” shape to match the general contour of the known location without significant alterations/modifications.
[0061] While the patch may be manufactured with a pre-determined shape, the patch may alternatively be produced with a general shape (e.g., square, rectangle, circle, triangle, etc.) and subsequently modified/altered (e.g., via cutting) to fit in a particular location of interest.
[0062] The patch may include an adhesive backing to affix the patch to the desired location. Further, the patch may be moved from one location to another, for example during an operative procedure, if warranted.
[0063] As shown in Figs. 10 and 11 (each depicting a separate patch), the patch 1000 or 1100 includes a sensor array 1010, 1110 embedded therein or on its surface. The individual sensors 1012, 1112 of the array 1010, 1110 may be arranged in a pattern such that the position (i.e., coordinate location) of each sensor on the patch is fixed and known relative to that of each other sensor.
[0064] The sensor array can include bend or flex sensors as known in the art, which are effectively strain gauges that can detect a degree of bending deflection and generate an electrical signal based on the degree of deflection. The electrical signals from all the flex sensors in the array can be sent to a computer, such as a microcomputer, which can calculate therefrom the three-dimensional contour of the overall patch. That is, knowing the overall shape of the patch, the precise relative position of each sensor on the patch, including the spatial location thereof with respect to each of the other sensors, and the specific degree of bending at each sensor location based on electrical signals from the respective sensors, the computer can construct a three-dimensional model of the patch in a virtual space corresponding to the real-world three-dimensional contour of the actual patch applied to the patient and conformed to her surface anatomy.
[0065] In one example, the flex sensors measure the deflection of the patch via mechanical deformation that is converted into electrical signal(s) that can be sent to the computer. More specifically, the flex sensors may be manufactured from conductive material (e.g., carbon or conductive polymers) that is embedded within or disposed atop a substrate (e.g., a non-conductive substrate, or one containing carbon particles and/or conductive material). Such sensors may take the shape of an elongated strip that is bendable/flexible along its length. It is to be understood that the sensors may all have the same shape, or may have distinct shapes (e.g., differing in length, width, geometry, etc.). Further, the sensor array may comprise any number and distribution of sensors on the patch sufficient to provide a requisite degree of contour-mapping resolution for a given application, to ensure that the resulting virtual contour map generated by the computer can be accurately registered with the pre-procedure imaging study for intraoperative navigation. Generally, the greater the number of sensors and the nearer their spacing to one another in the sensor array, the greater the degree of resolution.
[0066] In a neutral state where the patch is planar and the respective coordinate locations of the sensors are known (i.e., one coordinate along an X-axis and one coordinate along a Y-axis of an X-Y plane), the conductive material of each sensor is evenly distributed within the sensor so that the overall sensor yields a particular, predetermined baseline electrical resistance. The resistance can be measured by connecting the sensor to an electrical circuit in communication with the computer mentioned above. Specifically, the circuit applies a known voltage or current across each sensor and measures the resulting voltage or current passing therethrough. When the sensor bends, the conductive material experiences a change in its physical orientation, which yields a corresponding change in its electrical resistance. By analyzing the changes in the measured resistance values and comparing them to the baseline resistance, the amount of deflection can be determined for each sensor. By summing the measured deflections within the sensor array, the three-dimensional conformation of the patch can be discerned.
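The resistance measurement described above can be sketched as follows, assuming the flex sensor sits in a simple voltage divider with a known reference resistor and an (assumed) linear resistance-to-bend characteristic. All component values, function names, and the sensitivity constant are illustrative assumptions, not values from the source.

```python
# Hypothetical sketch: invert a voltage-divider reading to recover the flex
# sensor's resistance, then map the change from the flat-pose baseline
# resistance to a bend angle under an assumed linear characteristic.

def sensor_resistance(v_out, v_in=5.0, r_ref=10_000.0):
    """Invert the divider v_out = v_in * r_s / (r_s + r_ref) for r_s (ohms)."""
    if not 0 < v_out < v_in:
        raise ValueError("v_out must be between 0 and v_in")
    return r_ref * v_out / (v_in - v_out)

def bend_degrees(r_measured, r_baseline=10_000.0, ohms_per_degree=100.0):
    """Map the change from the flat-pose baseline resistance to a bend angle."""
    return (r_measured - r_baseline) / ohms_per_degree

# Flat sensor: the divider sits at v_in / 2 when r_s == r_ref -> zero bend.
r_flat = sensor_resistance(2.5)      # -> 10 kOhm, i.e. the baseline
r_bent = sensor_resistance(3.0)      # higher v_out -> 15 kOhm
angle = bend_degrees(r_bent)         # -> 50.0 degrees under the assumed model
```

Real flex sensors are often nonlinear over their bend range, so a production system would likely replace the linear `bend_degrees` mapping with a per-sensor calibration table.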
[0067] To be clear, the above-discussed flex sensor is only one example of a conformational sensor that may be employed in the patch. It is contemplated that the sensors of the sensor array are all the same. Alternatively, the sensor array may collectively comprise distinct types of bend/flex sensors for determining an amount of deflection of the patch.
[0068] With brief reference to Fig. 12, a patch 1200 further includes at least one optical reference mark 1202 provided at a known location. The optical reference mark 1202 is identifiable by an optical tracking system, as will be discussed further below. Notably, the reference mark 1202 can be provided on the top surface of the patch 1200 (optionally removably via magnetic attachment) such that the optical tracking system can have direct line-of-sight therewith. Alternatively, the reference mark 1202 may be embedded within the patch 1200, such that the optical tracking system identifies and locates the reference mark 1202 via non line-of-sight means, for example via a passive RFID or NFC signal.
[0069] According to various examples, the patch is used during an operative procedure to establish a reference point for the navigation system. Once the three-dimensional contour of the patch has been determined as explained above, a computer can execute a software algorithm to correlate a three-dimensional model of that patch in virtual space with a matching surface-contour region of the patient as previously acquired from the pre-procedure imaging study. Once this correlation has been identified, the software algorithm can be used to register the three-dimensional model of the patch to the matching contour of the patient’s anatomy in a virtual space, which can be visualized on a display - such as a computer screen or using an augmented-reality headset, for example. Because the underlying physiologic anatomy of the patient also is known in relation to the patient’s surface contour as reflected in the pre-procedure imaging study, this algorithm allows the underlying anatomy (e.g., the brain structure, including the target ventricle in a ventricular catheterization procedure) to be displayed and registered in relation to the patch.
[0070] The patch may include a resident (i.e., on-board) control unit (e.g., a printed circuit board) for controlling the sensor array and/or processing the information derived therefrom. Specifically, the control unit may be embedded within the patch and electrically connected to the sensor array. Alternatively, the control unit may be disposed on the top surface of the patch or even completely separate therefrom, such as integrated in an external computer executing the algorithm discussed above, with the patch connected thereto via a cable or via wireless communication (such as Bluetooth, Wi-Fi, NFC communication, etc.). The information acquired from the sensor array may be streamed wirelessly (e.g., via Bluetooth) or via a wired connection to the external computer executing the algorithm or other device (e.g., phone, tablet, etc.). Notably, the patch-resident control unit processing sensor data can establish a secure connection with the corresponding external computer or other device through encryption/authentication pairing using a link key.
[0071] As noted above, where the above-noted patch is employed for intraoperative navigation, the associated area of the patient (encompassing the surgical site) will be first imaged or scanned via conventional means (e.g., MRI or CT scans) to generate a (virtual) 3D reconstruction of that area. Imaging software is then used to analyze the rendered reconstruction and determine known surface curvatures/geometries of the scanned area, in relation to the underlying anatomy. For example, Fig. 13 schematically illustrates such surface curvatures as ascertained via the imaging software.
[0072] Next, the patch is fitted (e.g., affixed) to the constrained region (as shown in FIG. 14). Prior to or during the fitting, the sensor array can be activated. The patch may include visual (e.g., lights), acoustic (e.g., speakers), and/or vibration indicators to indicate the status of the sensor array.
[0073] As the patch is conformed to the outer surface of the patient, the sensor array measures the deflection of the patch at various positions. The data from the sensor array is sent to a controller, and then optionally on to an external computer where software compares that information with the data previously acquired from the pre-operative imaging study to identify the exact placement of the patch on the patient (i.e., by matching known geometric contours from the scan with the measured deflection from the sensor array).
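The placement-identification step can be illustrated, under the simplifying assumption of one-dimensional deflection profiles, as a sliding-window comparison of the measured sensor deflections against the contour profile extracted from the pre-operative scan. The function name and profile representation below are hypothetical, for illustration only:

```python
import numpy as np

def locate_patch(measured, surface_profile):
    """Slide the measured deflection profile along the scanned surface profile
    and return the offset with the smallest sum-of-squares mismatch."""
    n, m = len(surface_profile), len(measured)
    errors = [np.sum((surface_profile[i:i + m] - measured) ** 2)
              for i in range(n - m + 1)]
    return int(np.argmin(errors))
```

Real anatomy requires a two-dimensional search over the reconstructed surface, but the principle is the same: the candidate location whose predicted contour best matches the measured deflections is taken as the patch position.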
[0074] The patch is overlaid in a virtual space on the virtual reconstruction from the preoperative imaging study to form a transformation (as shown in Fig. 15), so that a virtual representation of the patch is placed in registry with the geometry of the underlying patient surface in the virtual space.
[0075] The resulting virtual reconstruction now depicts the patch at its actual location on the patient in a virtual space. An image of this virtual space can be outputted to a display that accurately depicts the patch in spatial relation to the surgical site, including all of the anatomy acquired from the pre-operative imaging study. Additionally, because the sensor array provides continuous measurements that are simultaneously compared to the known values from that imaging study, the resulting virtual reconstruction (and thus the outputted image) can be both autoregistering and updated in real-time to account for any movements of the patient or potential repositioning of the patch on the patient's outer surface.
[0076] With respect to Fig. 16, once the patch and the patient's anatomy have been registered in a virtual space, an optical tracking system can be synchronized to the one or more optical reference marks on the patch, whose location(s) also is/are predetermined and fixed relative to the sensor array. The optical tracking system may be included in a wearable sensor (e.g., an augmented-reality headset, glasses, etc.). Notably, the peripheral sensor may be a stand-alone sensor that is not worn. Moreover, the peripheral sensor also can detect surgical instruments and other implements, either via use of a camera or by detecting dedicated optical markers present on those instruments. In another example, the surgical instruments may be detected through signals (optical, electromagnetic, etc.) that are sent and/or received relative to the patch. Models of those instrument(s) then also can be rendered in the virtual environment and displayed on the display (e.g., display screen, augmented-reality headset, etc.) so that the position, orientation, and advancement thereof within the patient's anatomy all can be visualized on the display in real-time. Because the optical tracking system is linked with the reference mark on the patch, the resulting (virtual) reconstruction can be overlaid or virtually displayed such that the operator has visual access to the surgical site, and to the position and orientation of her tools therein, even though that site is inside the patient and obstructed from view.
[0077] As shown in Fig. 17, various tools to be used during the operation can include sensors and/or reference markers such that those tools can be linked to the optical tracking system. Accordingly, the operative algorithm can overlay renderings of particular tools or instruments (e.g., a wand or needle) on the virtual reconstruction, wherein the position of the wand is shown in real-time via sensed data and/or measuring distances of different references on the tool/instrument.
[0078] With reference to FIG. 18, another example of a surface-mounted transducer for intraoperative navigation is shown. Specifically, the transducer can be a sterile drape or covering, or incorporated in a sterile drape or covering, that performs a similar function to the above-noted patch. One difference between the drape and the patch is that the latter covers a smaller portion of the target anatomy, whereas the former can cover a larger surface. Large surface coverage may be desirable in cases where the surface contour is relatively constant, such that a greater surface area helps ensure sufficient differentiation between sensor sites in the associated sensor array and thereby proper body-contour registration.
[0079] The drape example likewise includes a sensor array to determine the contour of the target anatomy where it is placed/affixed. The sensor array for the drape differs from the patch sensor array in that the units (i.e., each sensor) of the array can be connected end-to-end (i.e., no spacing therebetween) across the entire (or a specific) area of the drape, with each unit's position fixed in relation to all other sensors, to determine the geometry of the target anatomy as explained above.
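The end-to-end arrangement lends itself to a chained reconstruction: each sensor reports the change of direction between consecutive fixed-length segments, and integrating those headings traces the contour the conformed drape follows. A minimal two-dimensional sketch (the segment length, angle convention, and function name are illustrative assumptions, not the claimed method):

```python
import numpy as np

def reconstruct_contour(bend_angles_deg, segment_len=1.0):
    """Reconstruct a 2-D cross-sectional contour from a chain of bend sensors.

    Each sensor reports the heading change (degrees) between consecutive
    fixed-length segments; accumulating the headings yields the polyline
    that the conformed drape traces over the anatomy."""
    heading = 0.0
    x, y = 0.0, 0.0
    points = [(x, y)]
    for a in bend_angles_deg:
        heading += np.deg2rad(a)        # accumulate the heading change
        x += segment_len * np.cos(heading)
        y += segment_len * np.sin(heading)
        points.append((x, y))
    return np.array(points)
```

For example, four zero-degree readings reconstruct a straight run, while four 90-degree readings close a square, illustrating how the fixed inter-sensor geometry pins down the overall shape.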
[0080] Similar to the patch, the drape self-registers to the contour of the surface on which it is applied and conformed through the aforementioned software algorithm, in the manner explained above. The resulting (virtual) reconstruction can then be rendered in a virtual space and synchronized to one or more optical markers also on the drape, in order to achieve intraoperative navigation as described above. Additionally, with reference to FIG. 19, instead of direct optical markers for determining the position of surgical tools relative to the drape (as described above), some units of the described array in the drape may be electromagnetic sensors 1910 and others electromagnetic emitters 1920. Embedded electromagnetic reflectors/markers in the surgical tools will interact with the electromagnetic field emitted from the drape emitters and reflect the emitted signals. These reflected signals will then be detected by the electromagnetic sensors integrated in the drape. Based on the relative strength of the signal received by each electromagnetic sensor, and since each unit's relative position is predetermined (as described above), the surgical tools' position relative to the drape can be established.
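The signal-strength-based localization described for the drape can be sketched as a least-squares range estimate. The inverse-square falloff model, two-dimensional geometry, and function name below are simplifying assumptions for illustration; real electromagnetic field behavior is considerably more complex:

```python
import numpy as np
from scipy.optimize import least_squares

def localize_tool(sensor_xy, strengths, k=1.0):
    """Estimate tool position from per-sensor signal strength, assuming an
    inverse-square falloff s = k / d**2 (an illustrative simplification)."""
    dists = np.sqrt(k / np.asarray(strengths))   # invert the falloff model

    def residual(p):
        # mismatch between modeled and inferred sensor-to-tool distances
        return np.linalg.norm(sensor_xy - p, axis=1) - dists

    sol = least_squares(residual, x0=sensor_xy.mean(axis=0))
    return sol.x
```

With the drape's sensor positions fixed and known, each received strength constrains the tool to a circle around one sensor, and the least-squares solve picks the position most consistent with all of them.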
[0081] Now moving to Fig. 20, yet another example of a surface-mounted transducer for intraoperative navigation is shown. Here, the transducer includes a plurality of fingers all extending from a common origin at known, fixed angles relative to one another. Each finger includes a knuckle disposed intermediate a proximal portion that extends from the origin, and a distal portion extending from the knuckle and defining a fingertip at its distal end. The distal portion of each finger is pivotable relative to the proximal portion thereof at the associated knuckle. Moreover, a flex sensor is provided at each knuckle to determine the angle of bending of the distal portion relative to the proximal portion.
[0082] Notably, the respective lengths of the proximal and distal portions of each finger are known and fixed. Accordingly, by determining the angle between the proximal and distal portions (via the flex sensor) of each finger, a coordinate location of each fingertip can be calculated relative to all the other fingertips in the transducer. In this manner, the transducer can generate a ‘point cloud’ having a plurality of (in the illustrated embodiment, five) points whose coordinate locations define a contour that can be registered to a surface of the patient in a virtual space similarly as explained above. Moreover, as in other embodiments, the finger-transducer also can include one or more optical markings that facilitate detection thereof using an optical tracking system that also facilitates integration of surgical tools into a virtual image of the surgical site for real-time intraoperative navigation.
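The fingertip-coordinate calculation follows directly from the known segment lengths and the measured knuckle angles. A simplified forward-kinematics sketch follows; the azimuth convention, the assumption that each distal portion pivots out of the base plane, and the millimetre lengths are all illustrative choices, not the claimed geometry:

```python
import numpy as np

def fingertip_positions(finger_azimuths_deg, knuckle_angles_deg,
                        proximal_len=40.0, distal_len=30.0):
    """Compute 3-D fingertip coordinates for a multi-finger transducer.

    Each finger leaves the common origin at a known azimuth in the base
    plane; the flex sensor at its knuckle reports how far the distal
    portion pivots out of that plane. Lengths are illustrative (mm)."""
    tips = []
    for az, bend in zip(finger_azimuths_deg, knuckle_angles_deg):
        az, bend = np.deg2rad(az), np.deg2rad(bend)
        u = np.array([np.cos(az), np.sin(az), 0.0])   # finger direction
        knuckle = proximal_len * u
        # distal portion pivots toward the surface normal (z) by `bend`
        tip = knuckle + distal_len * (np.cos(bend) * u +
                                      np.sin(bend) * np.array([0.0, 0.0, 1.0]))
        tips.append(tip)
    return np.array(tips)
```

The returned array of fingertip coordinates is precisely the 'point cloud' described above, ready to be registered against the pre-operative surface.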
[0083] In use, the base of the transducer encompassing the origin can be placed at a known location (e.g., placed at the nasion as shown above in Fig. 20) and affixed via an adhesive material or other suitable technique. After the base is affixed, the distal portions of the fingers are manipulated (i.e., pivoted about their respective knuckles) until the distal fingertips all contact the surface of the body part being measured (e.g., forehead). Subsequently, the coordinate location for each fingertip is determined relative to one another. The relative coordinate locations are then triangulated to generate a virtual point cloud that can be co-registered with a previously obtained image as discussed above. In other words, the determined coordinate locations from the transducer are compared to various points of a previous scan/image in 1:1 relation to (virtually) reconstruct its placement in reference to surgical-field anatomy.
[0084] Figs. 21-25 illustrate various alternative and exemplary implementations for a patch device as described herein. Fig. 21 illustrates a patch device 2100 having a flexible material or substrate 2102 on which one or more flex sensors 2104 are positioned. Device 2100 further includes housing 2106 for circuitry. The circuitry may be control or communication circuitry.
[0085] Fig. 22 illustrates a patch device 2200 having a flexible material or substrate 2202 on which one or more bend sensors 2204 are positioned. Device 2200 further includes housing 2206 for circuitry. The circuitry may be control or communication circuitry. In an alternative implementation, Fig. 23 illustrates a patch device 2300 having a flexible material 2302, bend sensors 2304, and circuitry housing 2306. In Fig. 23, the bend sensors 2304 are arranged in an offset manner compared to the bend sensors 2204 in Fig. 22.
[0086] Fig. 24 illustrates a patch device 2400 having a sensor array 2404 embedded on a flexible substrate 2402. The sensor array 2404 may be a bend sensor configured according to a shape of flexible substrate 2402. The device 2400 includes a connector 2406 to transmit sensor signals to a processing device.
[0087] Fig. 25 illustrates a patch device 2500. Patch device 2500 includes a central hub 2502 from which bend sensors 2504 fan out. In this exemplary implementation, the bend sensors 2504 are rotatably mounted to the hub 2502 such that the relative angles between sensors 2504 are adjustable to facilitate mounting to various positions on a patient.
[0088] The above-described transducers for intraoperative navigation provide accurate intraoperative surgical navigation that can be deployed quickly and displayed (e.g., on a screen, tablet, or peripheral headset) in real-time. The surgical field, otherwise obstructed from view, can thus be synchronized to optical markers, and thereby also to virtual models of surgical tools and instruments used during an operative procedure. This facilitates real-time surgical navigation via a computer display or in an augmented/virtual-reality environment, providing the operator with the ability to ‘see’ where she is going rather than infer from anatomical landmarks. The transducers and the associated optical tracking systems and/or peripherals are relatively small, and thus can readily be transported to and deployed in emergency settings.
[0089] Moreover, because in each instance the transducer generates a virtual surface contour or point cloud that is compared via an algorithm to patient surface contours to achieve registration, registration is automatic and is not dependent on the patient position. Indeed, if desired during a procedure the transducer can be removed and re-positioned (e.g. to provide access to another incision site if desired), whereupon the software algorithm will be able to again perform the necessary comparisons and adjust registration of the transducer to the patient’s anatomy in the virtual space until they are matched once again. This is a significant benefit compared to conventional systems, which require immobilizing the patient for extended periods.
[0090] The word “exemplary” is used herein to mean serving as an example, instance or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as advantageous over other aspects or designs. Rather, use of the word exemplary is intended to present concepts in a concrete fashion. As used in this application, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances. Further, at least one of A and B and/or the like generally means A or B or both A and B. In addition, the articles “a” and “an” as used in this application and the appended claims may generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form.
[0091] Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. Of course, those skilled in the art will recognize many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter.
[0092] Also, although the disclosure has been shown and described with respect to one or more implementations, equivalent alterations and modifications will occur to others skilled in the art based upon a reading and understanding of this specification and the annexed drawings. The disclosure includes all such modifications and alterations and is limited only by the scope of the following claims. In particular regard to the various functions performed by the above described components (e.g., elements, resources, etc.), the terms used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., that is functionally equivalent), even though not structurally equivalent to the disclosed structure which performs the function in the herein illustrated exemplary implementations of the disclosure.
[0093] In addition, while a particular feature of the disclosure may have been disclosed with respect to only one of several implementations, such features may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Furthermore, to the extent that the terms “includes,” “having,” “has,” “with,” or variants thereof are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term “comprising.”
[0094] The implementations have been described, hereinabove. It will be apparent to those skilled in the art that the above methods and apparatuses may incorporate changes and modifications without departing from the general scope of this invention. It is intended to include all such modifications and alterations in so far as they come within the scope of the appended claims or the equivalents thereof.

Claims

WHAT IS CLAIMED IS:
1. A device, comprising: a flexible substrate configured to conform to a contour of a surface; a sensor array associated with the flexible substrate, the sensor array having one or more sensors, each sensor configured to measure an amount of deformation of the sensor; and a controller configured to collect sensor signals from the one or more sensors of the sensor array and output sensor data, the sensor data indicative of a deformation of the flexible substrate.
2. The device of claim 1, wherein the sensor data includes displacement data.
3. The device of claim 1, wherein the sensor data includes flex data.
4. The device of claim 1, wherein the sensor data includes displacement data and flex data.
5. The device of any of claims 1-4, wherein the sensor data is unified sensor data aggregated across the one or more sensors of the sensor array.
6. The device of any of claims 1-4, wherein the sensor data is a set of sensor data including individual sensor data for each of the one or more sensors of the sensor array.
7. The device of any of claims 1-6, further comprising a communication interface, wherein the controller is further configured to communicate the sensor data to a computing system via the communication interface.
8. The device of claim 7, wherein the communication interface is a wired communication interface.
9. The device of claim 7, wherein the communication interface is a wireless communication interface.
10. The device of claim 1, wherein the controller is removably attached to the flexible substrate.
11. The device of any of claims 1-10, wherein the one or more sensors of the sensor array have a correlated arrangement such that a deformation of a first sensor of the sensor array has a predetermined relationship to a deformation of a second sensor of the sensor array.
12. The device of any of claims 1-11, further comprising an optical marker on a surface of the flexible substrate, the optical marker being detectable by a tracking system.
13. A system for intraoperative navigation, comprising: a processor coupled to a memory storing computer-executable instructions that, when executed by the processor, configure the processor to: acquire sensor data from a patch device having a flexible substrate applied to a portion of a body of a patient, the sensor data indicative of a deformation of the flexible substrate; generate a surface contour of the patch device based on the sensor data, wherein the surface contour corresponds to a surface contour of the portion of the body of the patient on which the patch device is applied; and register the surface contour to image data of the patient.
14. The system of claim 13, wherein the sensor data includes unified sensor data from one or more sensors of a sensor array of the patch device.
15. The system of claim 13, wherein the sensor data includes a set of sensor data including individual sensor data for each of one or more sensors of a sensor array of the patch device.
16. The system of any of claims 13-15, wherein the sensor data includes displacement data.
17. The system of any of claims 13-15, wherein the sensor data includes flex data.
18. The system of any of claims 13-15, wherein the sensor data includes displacement data and flex data.
19. The system of any of claims 13-18, wherein the processor is further configured to generate deformation data based on the sensor data.
20. The system of claim 19, wherein the deformation data indicates at least one of deflection information or angle information.
21. The system of claims 19 or 20, wherein the processor is further configured to generate a point cloud from the deformation data.
22. The system of any of claims 19-21, wherein the processor is further configured to generate deformation data based on a predetermined correlation of one or more sensors of a sensor array on the patch device that generates the sensor data.
23. The system of claim 21, wherein the processor is further configured to generate a transform through registration of the point cloud to the image data of the patient.
24. The system of any of claims 13-18, wherein to generate the surface contour and to register the surface contour to the image data, the processor is further configured to: generate predicted sensor data based on a simulated model of the patch device; and identify a position of the patch device relative to the image data based on a comparison of the predicted sensor data and the sensor data acquired from the patch device.
25. The system of any of claims 13-24, wherein the processor is further configured to acquire tracking information of the patch device from a tracking system configured to identify and track a marker on the patch device.
26. The system of claim 25, wherein the processor is further configured to register the surface contour to image data of the patient based in part on the tracking information.
27. A system for intraoperative navigation, comprising: a contouring reference array having a sensor array coupled with a flexible material configured to conform to a contour of a surface, the sensor array having one or more sensors, each sensor configured to measure deformation of the flexible material; a computing device communicatively coupled with the contouring reference array, the computing device having a processor configured to: receive sensor data from the contouring reference array, the contouring reference array being applied to a body of a patient; match a contour measured by the contouring reference array corresponding to a surface contour of a portion of the body of the patient on which the contouring reference array is applied to image data of the patient; and output registration information between the contour of the contouring reference array and the image data to an intraoperative tracking system; and the intraoperative tracking system configured to track at least a reference on the contouring reference array, the intraoperative tracking system further configured to register tracking information associated with the contouring reference array with at least one of the image data or the contour of the contouring reference array.
28. The system of claim 27, wherein the computing device is further configured to generate a point cloud based on the sensor data, wherein the registration information is based on a registration of the point cloud to the image data.
29. The system of claim 27, wherein the computing device is further configured to generate predicted sensor data based on a simulated model of the contouring reference array and a given location on a patient selected from the image data.
30. The system of claim 29, wherein the computing device is further configured to identify a location on the patient having predicted sensor data corresponding to the sensor data received from the contouring reference array.
31. The system of claim 27, wherein the intraoperative tracking system includes one or more wearable augmented reality devices.
PCT/US2024/038708 2023-07-20 2024-07-19 System for automated surface contour matching for surgical navigation Pending WO2025019758A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363527968P 2023-07-20 2023-07-20
US63/527,968 2023-07-20

Publications (1)

Publication Number Publication Date
WO2025019758A1 true WO2025019758A1 (en) 2025-01-23

Family

ID=92258707

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/038708 Pending WO2025019758A1 (en) 2023-07-20 2024-07-19 System for automated surface contour matching for surgical navigation

Country Status (1)

Country Link
WO (1) WO2025019758A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140088377A1 (en) * 2011-06-10 2014-03-27 Koninklijke Philips N.V. Dynamic constraining with optical shape sensing
US20150327948A1 (en) * 2014-05-14 2015-11-19 Stryker European Holdings I, Llc Navigation System for and Method of Tracking the Position of a Work Target
US20170281281A1 (en) * 2014-09-08 2017-10-05 Koninklijke Philips N.V. Shape sensing for orthopedic navigation
WO2022125629A1 (en) * 2020-12-09 2022-06-16 Smith & Nephew, Inc. Fiber optic cable for less invasive bone tracking


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24752303

Country of ref document: EP

Kind code of ref document: A1